Artificial intelligence
New wave of generative AI is appearing in healthcare
May 1, 2023
Why all the fuss about ChatGPT and other forms of ‘generative AI’, which are said to be a new wave of artificial intelligence?
“It’s an amazing evolution of AI,” commented Dr. David Rhew, chief medical officer and VP of Healthcare for Microsoft, in an interview with CHT. “Previously, with AI, really only data scientists could work with it. But ChatGPT offers an interface that everyday people can understand. Anyone can use ChatGPT to manipulate large data sets to obtain answers. It’s leading to the democratization of AI.”
Thanks to ultrafast computers and networks, generative AI systems can process massive stores of knowledge. Using this data, they can write essays and stories, produce songs and paintings, and even offer surprisingly good diagnoses in a medical setting. In a word, they can ‘generate’ new knowledge, hence the moniker generative AI.
Tests of ChatGPT have shown the system can even pass the U.S. medical licensing exams. Not only can it answer the exams’ multiple-choice questions, but it can also produce an accurate diagnosis when told the symptoms a mock patient is presenting with and their lab test results. That’s just the kind of information a doctor receives in a medical office, when he or she must figure out what ails the patient.
“GenAI can answer these questions better than a lot of clinicians can, but it’s not always perfect,” said Dr. Rhew. From time to time, GPT-4, the latest iteration of the model behind ChatGPT, “hallucinates”. That’s tech-speak for making up answers. And occasionally, it simply gets things wrong.
That’s why Dr. Rhew believes the systems should act as assistants, instead of replacing doctors or any sort of human expert. “There needs to be a human in the loop to verify the results,” he asserted. “We call this a co-pilot, and we always want to make sure a human is part of the process.”
Given the dramatic growth of medical knowledge, ChatGPT and other genAI systems could help doctors with assessments and therapies, combing the Internet for the latest advances and best practices. If something seems out of kilter, however, the doctor is always there to check, drawing on his or her own experience and knowledge.
Commercial systems that produce consistently accurate diagnoses and suggest the best therapies may still be some time off. On a related note, physicians have shown a degree of mistrust of AI systems, largely because they appear as “black boxes” that don’t tell us how they arrive at decisions.
Dr. Rhew, however, countered that genAI systems can be asked how they arrive at an answer, and they will supply an explanation. “They’re very good at quality assurance,” he said. “You can even ask it to respond in a way that a seventh-grade student would understand, and it will answer in this way.”
Generative AI systems are also already helping clinicians reduce their crushing loads of administrative tasks. Software solutions have appeared that can reduce the documentation that clinicians are currently required to do – documentation that’s leading to exhaustion and burnout for many.
For example, Abridge, a Pittsburgh-based leader in AI-powered medical documentation, this year announced a partnership with The University of Kansas Health System that it calls the most significant rollout to date of generative AI in healthcare.
Abridge said the new partnership has the potential to serve and support more than 1,500 practicing physicians across the University of Kansas Health System’s 140+ locations, as well as additional clinicians in a phased rollout.
Abridge’s technology identifies over 90 percent of the key points from provider-patient conversations and generates summaries in the formats preferred by clinicians. According to a company news release, Abridge keeps the provider in the loop, enhancing their productivity, but never replacing their judgment. The core technology acts as an intelligent co-pilot, producing organized drafts and providing interactive tools to accelerate the editing process, ensuring that providers get off to a running start as soon as a visit concludes.
The technology also integrates with healthcare software, including Epic, a widely adopted electronic health record system, to simplify and streamline documentation.
“With Abridge, we have found a powerful solution that addresses the biggest challenge facing our providers – excessive time spent on documentation, including during non-traditional hours,” said Dr. Gregory Ator, chief medical information officer and head and neck surgeon at The University of Kansas Health System.
“This cutting-edge technology will not only close the documentation cycle in real-time but also improve the overall quality and consistency of our clinical notes. Our partnership with Abridge represents a major step forward in reducing burnout, improving provider satisfaction, and ultimately enhancing the delivery of patient care.”
Abridge’s solution addresses these pain points, starting with a draft that’s generated within a minute of the conversation ending. Abridge’s AI-powered interactive editing tools then support the provider to expedite the remaining edits.
Mainstream electronic health record companies are also adopting generative AI in their solutions. Epic, for example, has been experimenting with GPT-4, the version of the large language model that underlies ChatGPT.
In March, Seth Hain, senior vice president of research and development at Epic, said the company sees promise in the new AI-based application and considers it to be “transformational” for the healthcare industry.
“We’ll use it to help physicians and nurses spend less time at the keyboard and to help them investigate data in more conversational, easy-to-use ways,” said Mr. Hain in a March 21 Microsoft press release.
For its part, Microsoft is a major investor in OpenAI, the company that released ChatGPT late last year, making it available to the public. (Microsoft invested $10 billion in the company in January 2023, building on earlier investments in 2019 and 2021.)
Microsoft has extensive plans for generative AI and intends to build it into a host of products, including flagship software such as Word and the rest of its Office suite, as well as Teams.
On the healthcare front, Microsoft’s Nuance division has been rolling out its DAX Express system in the United States, a software solution that uses AI and voice technology to record, understand and document the encounter between patients and physicians. Much like Abridge’s solution, it does this by monitoring the “ambient sound” in the doctor’s office, making sense of the language spoken between patients and clinicians, and automating the tedious task of charting.
While the DAX Express software fills out the charts, the clinician receives the final reports and checks to make sure that they’re accurate; if not, he or she can edit them before uploading them.
It’s been found that ambient voice systems of this sort can often chart more comprehensively than doctors. “If the doctor is tired, and is just trying to get through the day, he might not be charting everything. But the AI system will chart everything,” Dr. Rhew said.
Microsoft and Nuance are close to releasing DAX Express in Canada. “It’s coming soon to Canada,” he said.
Meanwhile, other companies have been devising their own versions of generative AI, including Google, which is using it in its search engine. It’s also planning to release a system for public use, called Bard. Meta, the parent company of Facebook, is also a major developer of generative AI.
Of course, the rise of such powerful systems has aroused fears that they could be misused. That fear recently led prominent figures in the technology industry, including Elon Musk, to call for a six-month moratorium on the development of systems beyond the capabilities of ChatGPT.
Dr. Rhew said there’s some sense in this, comparing the emergence of ChatGPT to the automobile in the age of the horse and buggy. “Now, with newer technology, we’ve introduced a car that takes us much faster. But we don’t yet have the stop signs and lines on the road that are needed. We need these rules to do things safely and to mitigate harm.”