Imagine you’re stuck in a hospital bed after having surgery. You can’t even close the window blinds without a nurse’s help. And you can forget about requesting a blanket to take off the chill or getting details on visiting hours when everyone’s busy handling more-pressing matters.
You feel powerless.
But what if you got what you needed just by saying it? You could instantly open the blinds, find out more about your doctor’s expertise or turn up the room temperature. Sounds great, right? All you’d need is one of today’s digital voice assistants that constantly listen for a request, send your query to the internet and either answer your question or complete a task.
Unfortunately, you can’t do that right now with the current crop of smart assistants — like Apple’s Siri, Amazon’s Alexa and Google’s Assistant — because they can’t satisfy hospitals’ privacy and security requirements. Yet according to Bret Greenstein, vice president of IBM’s Watson Internet of Things platform, some medical staff can spend nearly 10 percent of their time with patients answering questions about lunch, physician credentials and visiting hours. If a smart speaker can answer those questions, doctors and nurses could spend more time on patient care.
It’s why Thomas Jefferson University Hospitals in Philadelphia decided to work with audio giant Harman and IBM’s Watson artificial intelligence technology. Together, they developed smart speakers that will respond to about a dozen commands. When a patient says “Watson,” the speakers can, for instance, play calming sounds and adjust the room’s lighting, thermostat and blinds.
“This is a way for patients to get some simple comfort measures addressed just by speaking,” says Dr. Andrew Miller, associate chief medical officer at the Philadelphia hospital group. “How great is that?”
For the hospital, it’s just the beginning.
Watson, turn on the lights
Like Amazon’s popular Echo speaker, Harman’s JBL clock radio packs smarts that respond to spoken command words.
Jefferson Hospital experimented with Amazon’s popular Echo speaker, but found the hospital couldn’t simultaneously control multiple speakers from one management system. What’s more, the Echo couldn’t access the hospital’s secure Wi-Fi network, and it didn’t have the right “skills,” or capabilities, for a medical environment.
“It would have done simple things people are used to doing in the home, but not the things we wanted to do,” says Neil Gomes, the hospital’s chief digital officer.
So late last year, Jefferson Hospital started testing five prototype speakers that Harman made using the external casing of a regular JBL cylindrical speaker and components specially designed for artificial intelligence.
The initial trial tested two models. One required patients to press a button to wake up the device, getting around privacy concerns of an ever-listening microphone. The other woke when someone said “Watson,” the name of IBM’s AI technology that won the $1 million first-place prize on “Jeopardy” in 2011.
“The button gives a sense of privacy, but it proved to be very frustrating to users because they had to keep pushing it,” says Greenstein.
The newest speakers, now built into Harman’s round JBL clock radios, rely solely on voice commands. The hospital is testing about 40 of the new speakers, with IBM and Harman tweaking the smarts as they go. The speakers also tie into the hospital’s automated facilities management system, which lets administrators control things like heating, air conditioning and lighting online. That’s a convenience for everyone.
“When my father-in-law was in the hospital, we had to talk to the nurse about adjusting the thermostat,” says Kevin Hague, vice president of technology strategy at Harman. “It was absurd that we had to have an RN come in and figure out on the computer how to adjust the temperature.”
As of this writing, the hospital hadn’t decided if it would stick with “Watson” or go with some other wake-up word, like “Jefferson.”
It’s fair to say we’d rather have voice assistants do our bidding in a hotel room than in a hospital.
Some hotels are exploring that option — and finding that off-the-shelf digital assistants work just fine.
Marriott, for instance, has been testing Apple’s Siri and Amazon’s Alexa at an Aloft Hotel in Boston. The hotel installed iPad tablets and Echo speakers in 10 rooms, letting guests speak commands to control the TV and adjust the lighting. That sounds awfully tempting, considering how tough it can sometimes be to figure out which switch does what.
“The room would become an extension to your personal tech,” says Toni Stoeckl, Marriott global brand leader and vice president. “I don’t think we’re there quite yet.”
In the meantime, Jefferson Hospital, Harman and IBM are working on ways to teach their smart speaker to branch out beyond simple tasks. The possibilities are intriguing. Maybe Watson could follow you home to make sure you’re taking your medication correctly. Or it could prompt you to take a walk to speed your recovery, help you change pharmacies or arrange follow-up appointments.
Right now, the speakers don’t need regulatory approval, although that could change if they provide information about your diagnosis or explain your medications.
No matter how the hospital ends up using them, one thing is certain. It sucks being in a hospital. Having a little control over your environment could make it suck a little less.
This story appears in the summer 2017 edition of CNET Magazine.