A client recently asked for our thoughts on using Alexa in the operating room. Presumably, the tasks Alexa would be charged with would be basic. Turn the lights on. Turn the lights off. Make a call. And so on.

For those of you unfamiliar with Alexa, it is a voice-controlled virtual assistant sold by Amazon. It takes your verbal commands and turns them into action. Here’s Alexa at its most basic.

          Alexa, what’s on my calendar today?

          Alexa, remind me to check on my patient at 3 PM.

No surprise, you can buy stuff with Alexa.

          Alexa, buy a package of AA batteries.

          Alexa, order a package of size 7 ½ gloves.

And there are Internet of Things devices that will respond to verbal commands; for example, to turn on the lights.

Now for the downside.

Security.

Such devices already pose a risk in the general consumer space, even before one considers the more stringent security requirements an operating room would demand.

A recent article reported that “researchers can now send secret audio signals undetectable to the human ear to Apple’s Siri, Amazon’s Alexa, and Google’s Assistant.”

So, your dog may hear your device being hacked while you are blissfully unaware. (Actually, I have no idea if your dog would be able to hear the attacking frequency.)

This year, another group of Chinese and American researchers, from China’s Academy of Sciences and other institutions, demonstrated that they could control voice-activated devices with commands embedded in songs that can be broadcast over the radio or played on services like YouTube.

More recently, Nicholas Carlini and his colleagues at Berkeley have incorporated commands into audio recognized by Mozilla’s DeepSpeech voice-to-text translation software, an open-source platform. They were able to hide the command “O.K. Google, browse to evil.com” in a recording of the spoken phrase “Without the data set, the article is useless.” Humans cannot discern the command.

“Companies have to ensure user-friendliness of their devices, because that’s their major selling point,” said Tavish Vaidya, a researcher at Georgetown. He wrote one of the first papers on audio attacks, which he titled “Cocaine Noodles” because devices interpreted the phrase “cocaine noodles” as “O.K., Google.”

Mr. Carlini said he was confident that in time he and his colleagues could mount successful adversarial attacks against any smart device system on the market.

I do think we will reach a point where such devices are usable in the operating suite. But I would not be the first on the block to use them. Such devices will need to be made commercial grade, taking reasonable security concerns into account.

It is easy to imagine a hacked device recording operating room banter. Think that poses no risk? In 2015, a patient successfully sued an anesthesiologist for $500,000 over insults (about the patient) that were recorded during the procedure. In that case, the patient’s phone recorded all operating room conversations while stored in a clothes bag on the bottom of the gurney.

The patient said he had his phone set to record the post-operation instructions given to him before he was put to sleep. He forgot to turn off his phone during the procedure, though, and when he listened back to what the operating team had said about him, he was shocked.

So, my two cents’ worth.

“Alexa, stay out of the operating suite for now.”

What do you think? Weigh in using the comments box below. And if you haven’t already, subscribe to our newsletter for weekly content.

