IBM and Nuance Communications have announced a partnership in speech recognition to direct physician-dictated text into the structured fields of an EHR (electronic health record).
Nuance is the maker of the Dragon speech-recognition software, which is used in health care and other industries as well as in the White House and the Defense Department.
According to Peter Durlach, Nuance Communications’ senior vice president of marketing and product strategy for health care, the collaboration between IBM and Nuance will use IBM’s research in NLP (natural language processing) to enhance Nuance’s CLU (Clinical Language Understanding) software products.
Nuance’s CLU technology is a health-care-specific type of NLP that involves extracting specific data about a patient’s condition from the narrative text dictated by a physician or nurse. CLU is a core element of EHR workflows, according to Nuance.
“It will combine some of the work IBM has done in the natural language processing area with the work we’re already doing at Nuance to tackle that big problem in health care, which is, How do you get structured data out of the narrative part of the dictation?” Durlach explained to eWEEK.
“This new partnership will enable IBM Research to develop improved technology where the information extraction system can benefit from structured knowledge such as a medical ontology of symptoms to develop an understanding of the dictated text,” Salim Roukos, IBM’s senior manager of NLP Technologies, wrote in an e-mail to eWEEK.
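Neither company has published implementation details, but the ontology-grounded extraction Roukos describes can be sketched in miniature. In the toy Python below, every surface term, canonical concept and matching rule is a hypothetical stand-in for illustration, not IBM's or Nuance's actual technology:

```python
# Illustrative sketch only: a toy "ontology" of symptoms used to ground
# extraction from dictated text. Terms and concepts here are hypothetical.
import re

# Hypothetical mini-ontology: surface forms mapped to a canonical concept.
SYMPTOM_ONTOLOGY = {
    "shortness of breath": "dyspnea",
    "short of breath": "dyspnea",
    "chest pain": "chest pain",
    "swelling in the ankles": "peripheral edema",
}

def extract_symptoms(dictation: str) -> list[str]:
    """Return canonical symptom concepts mentioned in dictated text."""
    text = dictation.lower()
    found = set()
    for surface, concept in SYMPTOM_ONTOLOGY.items():
        # Word-boundary match so a term doesn't fire inside another word.
        if re.search(rf"\b{re.escape(surface)}\b", text):
            found.add(concept)
    return sorted(found)

note = ("Patient is a 67-year-old male complaining of shortness of breath "
        "and chest pain on exertion.")
print(extract_symptoms(note))  # ['chest pain', 'dyspnea']
```

The point of the ontology is that two different phrasings ("shortness of breath," "short of breath") land on one canonical concept, which is what makes the structured field usable downstream.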
According to Roukos, IBM has been developing new ways to extract discrete information from dictated material using advanced text mining.
The narrative part of a doctor’s dictation involves an unformatted, or unstructured, section of text that an EHR is unable to process into separate fields. “If you don’t get that structured data, all that narrative is just there as a blob of text in the database,” Durlach said. “It’s very hard to do decision making after the fact because you don’t have structured field level data.”
Durlach said the collaboration with IBM will allow the CLU application to “parse [data] out of that narrative blob of text and populate EHR fields so you can do data mining.” Physicians will not have to key in data to the EHR fields themselves.
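Nuance has not said how CLU performs this parsing, but the shape of the task can be illustrated with a minimal, assumption-laden sketch: pattern rules pick facts out of a narrative blob and drop them into named fields, which is what makes later data mining possible. The field names, patterns and sample note below are all hypothetical:

```python
# Minimal illustration, not Nuance's CLU: simple pattern rules that pull
# facts out of a narrative blob and populate structured EHR-style fields.
import re

FIELD_PATTERNS = {
    # Hypothetical field names and patterns, for demonstration only.
    "allergies": re.compile(r"allergic to ([\w ,]+?)(?:\.|$)", re.I),
    "medications": re.compile(r"(?:taking|prescribed) ([\w ,]+?)(?:\.|$)", re.I),
    "diagnosis": re.compile(r"diagnosed with ([\w ,]+?)(?:\.|$)", re.I),
}

def structure_note(narrative: str) -> dict:
    """Map a free-text note into named fields; unmatched fields stay None."""
    record = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = pattern.search(narrative)
        record[field] = match.group(1).strip() if match else None
    return record

note = ("Patient is allergic to penicillin. Currently taking lisinopril, "
        "metformin. Diagnosed with type 2 diabetes.")
print(structure_note(note))
# {'allergies': 'penicillin', 'medications': 'lisinopril, metformin',
#  'diagnosis': 'type 2 diabetes'}
```

A production system would rely on statistical language models rather than brittle regular expressions, but the input and output are the same in kind: narrative in, field-level data out.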
He compared the use of CLU to being able to extract mentions of specific companies from a radio or television broadcast on financial trends.
Nuance has posted a video demonstration of a doctor using CLU technology to dictate a patient’s diagnosis and treatment plan into an EHR.
Most of the approximately 2 billion medical reports dictated per year in the United States are produced by physicians using this narrative process, Durlach noted.
Physicians can keep dictating in the narrative format and let the CLU technology structure it, he explained.
“These EHRs, they require physicians to point and click through multiple screens. The physicians can’t stand it because it slows them down; they’re very awkward,” Durlach added.
“EMRs [electronic medical records, or EHRs] have a significant amount of freely dictated/written unstructured doctors’ notes and comments,” Roukos explained. “The CLU analyzes this unstructured text and extracts facts into structured tables such as any allergies a patient might have. The structured data can be used to automate any checks to improve on health care services.”
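The “checks” Roukos alludes to become possible only after extraction: once allergies live in a structured field, a new medication order can be screened against them automatically. The drug names and the conflict map in this sketch are hypothetical, and the logic is an assumption about how such a check might look, not a description of either company's software:

```python
# Hedged illustration of an automated check on structured allergy data.
# The allergy-to-drug conflict map below is hypothetical.
ALLERGY_CONFLICTS = {
    "penicillin": {"amoxicillin", "ampicillin", "penicillin"},
    "sulfa": {"sulfamethoxazole"},
}

def check_order(allergies: list, drug: str) -> list:
    """Return alert messages if an ordered drug conflicts with allergies."""
    alerts = []
    for allergy in allergies:
        if drug.lower() in ALLERGY_CONFLICTS.get(allergy.lower(), set()):
            alerts.append(f"ALERT: {drug} conflicts with documented "
                          f"{allergy} allergy")
    return alerts

# The allergy list would come from the structured field CLU populated.
print(check_order(["Penicillin"], "amoxicillin"))
```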
CLU can identify and pull data on medical problems, social history, allergies and medications from the narrative text, according to Nuance.
The CLU technology can also alert health care providers to previous information about a patient, Roukos said.
Companies such as 3M Health also offer dictation apps for health care.