Siri stole the show in the Commons on Tuesday, “heckling” Conservative minister Gavin Williamson as he attempted to deliver a speech on Isis in the Middle East.
The Defence Secretary was updating MPs on the fight against the terror group when a voice coming from his jacket pocket interrupted proceedings.
In what the Mirror suggested was probably the first speech to Parliament by an artificial intelligence assistant, Siri could be heard saying: “Hi Gavin, I found something on the web for: ‘In Syria, democratic forces supported by…’”
The Speaker of the Commons, John Bercow, then interrupted the unlikely exchange, announcing to the House that this was “a rum business”.
Williamson replied: “I’m not sure what caused that intervention, but I do apologise for that.
“It is very rare that you’re heckled by your own mobile phone, but on this occasion it is a new parliamentary convention, without a doubt.”
Williamson added that he would continue his speech, “without the help and support of Siri”.
He later tweeted about the incident saying he had a new iPhone and “must ask my 13 year old daughter how to use it”.
While much was made of the humorous nature of Siri’s unexpected intervention, the BBC’s Laura Kuenssberg pointed out that Williamson, given his job, probably should have disabled Siri to avoid being hacked.
According to an article in The Hacker News last year, hackers can make calls, send text messages and even browse malicious websites by using Siri or Google Now.
It reported that a team of security researchers from China’s Zhejiang University had discovered a way of activating voice recognition systems without speaking a word, exploiting a security vulnerability apparently common to all major voice assistants.
The technique, The Hacker News reported, works by feeding AI assistants commands at ultrasonic frequencies, which are inaudible to humans but can be picked up by the microphones on smart devices.
“With this technique, cyber criminals can ‘silently’ whisper commands into your smartphones to hijack Siri and Alexa, and could force them to open malicious websites and even your door if you have a smart lock connected,” The Hacker News wrote.
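The attack the Zhejiang researchers described (dubbed “DolphinAttack”) essentially amplitude-modulates an ordinary voice command onto an ultrasonic carrier, which a phone’s microphone hardware then demodulates back into audible-range speech. Here is a minimal, illustrative Python sketch of that modulation step; the carrier frequency, sample rate, and the stand-in “command” tone are assumptions for demonstration, not the researchers’ actual parameters:

```python
import numpy as np

FS = 96_000          # sample rate high enough to represent ultrasound
CARRIER_HZ = 25_000  # ultrasonic carrier, above human hearing (~20 kHz)

def modulate_ultrasonic(command: np.ndarray) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice command' onto an ultrasonic carrier."""
    t = np.arange(len(command)) / FS
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    # Standard AM: the audible spectrum is shifted up around the carrier,
    # so the transmitted signal contains no audible-range energy.
    return (1.0 + command) * carrier

# Stand-in for a recorded voice command: a one-second 300 Hz tone.
t = np.arange(FS) / FS
fake_command = 0.5 * np.sin(2 * np.pi * 300 * t)
signal = modulate_ultrasonic(fake_command)

# The modulated signal's energy sits near the 25 kHz carrier.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak:.0f} Hz")
```

The reported exploit relies on non-linearity in a microphone’s amplifier circuitry recovering the baseband command from the ultrasonic signal, so the assistant “hears” speech while nearby humans hear nothing.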
Techworld also reported last month that hackers need only a short audio sample to synthesise or replay a human voice convincingly enough to trick people and security systems.
David Emm, principal security researcher at Kaspersky Lab, believes the central risk posed by a voice-activated device is that we forget that it’s there.
He told Techworld: “What it does offer is a way to get people when they’re less guarded.
“If somebody has access to a device that can listen to what’s going on, then they can scoop up lots and lots of information because in daily life we talk a lot - a lot more than we type on a keyboard - and therefore the potential for gathering information is so much greater.”
Tuesday’s interjection isn’t the first time Siri has become part of British political debate.
In April, Siri’s voice was heard coming from the pocket of Transport Minister Jo Johnson during an appearance on BBC Question Time.