Next big thing
Coming soon: EMRs that respond to voice commands and supply answers
By Rosie Lombardi
Improvements in healthcare systems rely on electronic data entered by doctors into intelligent devices and analytical systems. But many doctors hate the laborious process of typing and fussing with computers. Apple’s Siri, a voice command module for the iPhone, points the way to easier approaches in the future – and many technology companies are helping to lead this charge. Nuance Healthcare, maker of Dragon Medical 360 and other speech recognition and clinical language understanding products, is working with American and Canadian EMR vendors to make talking to EMRs a reality.
The concept of speaking commands to intelligent devices that analyze, react, and respond verbally goes by many names, says Jonathon Dreyer (pictured), director of mobile solutions marketing at Nuance Healthcare. “We lump all those concepts under the label of ‘intelligent voice interactions’.”
Nuance recently released a new version of its cloud-based speech engine – part of the company’s 360 Development Platform – and is allowing EMR developers to incorporate it in their product development, says Dreyer.
“Dozens of developers have already integrated our speech and understanding engines in their applications and their solutions are available in the market now. We have more than 250 developers in our program so we’ll continue to see a steady stream of new apps hit the market, and existing products evolve, as the integration of new capabilities starts to take shape over the next 12 to 18 months.”
What will those capabilities look like – or rather, sound like? In current systems, physicians have to type a multitude of terms for even basic searches and trawl through mountains of information to find what they want. Instead, they could simply speak their questions – a verbal Google search – and the system would return the results on-screen, with further verbal commands to refine them.
For example, a physician may ask the system, “What’s the treatment for Lyme disease?” or “What are the common drug interactions between medication A and B?”
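In software terms, this kind of spoken Q&A amounts to matching a recognized utterance against question patterns and routing it to a knowledge lookup. The sketch below is purely illustrative: the function names, patterns, and the tiny lookup table are invented for this example and are not part of Nuance's actual developer platform.

```python
import re

# Invented placeholder content standing in for a real clinical knowledge base.
KNOWLEDGE_BASE = {
    ("treatment", "lyme disease"): "guideline entry for Lyme disease treatment",
    ("interaction", "medication a", "medication b"): "interaction summary for A and B",
}

def answer(transcript: str) -> str:
    """Route a recognized utterance to a knowledge-base lookup via simple patterns."""
    text = transcript.lower().rstrip("?")
    m = re.match(r"what'?s the treatment for (.+)", text)
    if m:
        return KNOWLEDGE_BASE.get(("treatment", m.group(1)), "No entry found.")
    m = re.match(r"what are the common drug interactions between (.+) and (.+)", text)
    if m:
        return KNOWLEDGE_BASE.get(("interaction", m.group(1), m.group(2)), "No entry found.")
    return "Question not understood."

print(answer("What's the treatment for Lyme disease?"))
```

A production system would replace the regular expressions with a statistical language-understanding model, but the pipeline shape – transcribe, interpret, look up, respond – is the same.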
Moving beyond searches, Nuance has also developed functions for EMR interactions, ranging from simple questions like “What’s my schedule for today?” to more complicated ones that call up and act on information in patient records, says Dreyer.
“They could say, ‘Show labs for John Doe’ or ‘What are the patient’s vitals?’ Or they could ask something very free-form like, ‘The patient has pneumonia. What’s the recommended treatment?’ It could leverage the data that’s already available in that patient’s record and be able to do some intelligent follow-up actions like, ‘Schedule a follow-up appointment for next Friday for this patient.’”
For some actions, communication with the system would be interactive: the system responds with information pulled from databases, then verbally prompts the doctor for the next action.
“In the near future, this interaction might go like this: ‘Please schedule this exam for John Doe.’ It may respond: ‘There’s availability Friday at 3:00 and Tuesday at 2:00. Which one would you like to schedule?’ So instead of having to click through multiple screens, you can use voice to cut through the clutter.”
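The two-turn exchange Dreyer describes can be modelled as a small dialogue state machine: offer slots, remember what was offered, then match the doctor's reply against them. The following is a minimal sketch under that assumption – the class and its scheduler backend are hypothetical, not Nuance's interface.

```python
from dataclasses import dataclass, field

@dataclass
class SchedulingDialog:
    """Toy model of the offer-then-confirm scheduling exchange."""
    patient: str
    offered_slots: list = field(default_factory=list)

    def request_exam(self, open_slots: list) -> str:
        """First turn: 'Please schedule this exam for <patient>.'"""
        self.offered_slots = open_slots[:2]  # offer the first two open slots
        return (f"There's availability {self.offered_slots[0]} and "
                f"{self.offered_slots[1]}. Which one would you like to schedule?")

    def choose(self, reply: str) -> str:
        """Second turn: pick whichever offered slot the reply mentions."""
        for slot in self.offered_slots:
            if slot.split()[0].lower() in reply.lower():
                return f"Exam scheduled for {self.patient} on {slot}."
        return "Sorry, I didn't catch which slot you wanted."

dialog = SchedulingDialog("John Doe")
print(dialog.request_exam(["Friday at 3:00", "Tuesday at 2:00"]))
print(dialog.choose("Friday, please"))
```

Keeping the offered slots in dialogue state is what lets a terse reply like "Friday, please" resolve unambiguously, replacing the multiple screens of clicks Dreyer mentions.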
Dreyer emphasizes that much of this functionality is already available in the here and now. “We just recently released the capability for EMR developers to add voice control functionality to their software. But it might take a few months before the developers fully integrate that technology and then make it available to their end users.”
While consumer versions like Siri can handle everyday language, doctors may wonder how accurate Nuance’s system is with more complicated medical terminology, and also with the multitude of accents that exist across the English-speaking world.
According to Dreyer, Nuance’s engine is very different from the ones in use in the consumer space, as it has advanced algorithms to sniff out the medical meanings of ambiguous words based on the context. “We have medical language models which are tailored to clinical settings. For example, in a consumer setting, the word humorous probably means something funny, and not humerus, a piece of anatomy. So in the context of clinical formatting, we make the meanings of possible dictations more predictable in healthcare settings.”
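The humerus/humorous example boils down to using surrounding words as evidence for one sense of a homophone over another. Here is a deliberately toy version of that idea – a hand-picked set of clinical context words rather than the rich statistical language models Dreyer describes.

```python
# Illustrative only: real clinical language models weigh far more evidence
# than this hand-picked vocabulary set.
CLINICAL_CONTEXT = {"fracture", "bone", "x-ray", "proximal", "distal", "shaft"}

def disambiguate(words_before: list, words_after: list) -> str:
    """Pick 'humerus' if the surrounding words look clinical, else 'humorous'."""
    context = {w.lower() for w in words_before + words_after}
    return "humerus" if context & CLINICAL_CONTEXT else "humorous"

print(disambiguate(["fracture", "of", "the"], []))  # clinical context
print(disambiguate(["that", "was"], ["story"]))     # everyday context
```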
On the accent front, the company has been tackling this for over a decade with its Dragon Medical 360 dictation software, so this is not a new capability, he adds. “Even thick, heavy accents aren’t really a big issue for us anymore. In recent versions of Dragon Medical 360, there’s accent support for Southern U.S., Northern U.S., Asian American, Indian American, and so on. We have an array of different accent and dialect filters.”
As noted above, Dreyer says about 250 EMR and other healthcare system developers are participating in or evaluating Nuance’s development program for its voice capabilities. “There are a couple dozen partners that already have commercial products available on the market. We have live end-users at various healthcare facilities of all sizes, and in different specialties. So we have general EMR partners, but we also have dermatology, orthopedic and radiology providers. And then we have a number of partners that are trying some new, emerging technology in pilot programs right now.”
EMR vendors working on mobile applications are particularly keen to work with Nuance’s engine, as data entry is extremely difficult on these smaller next-generation devices, he adds. “What we’ve seen over the last year or so is that many physicians are reluctant to use mobile EMR systems because entering data into an iPhone, iPad, or Android tablet is a real barrier.
Doctors will use these devices to view information, but when it comes to creating documentation, they say they need something like our voice-enabled engine to really make mobile work.”
Posted October 11, 2012