Diagnostics
Healthcare lags other industries in Gen AI, but many use cases beckon
November 1, 2024
To the patient’s eye, a family doctor’s job begins and ends in the exam room. Medical professionals know that’s not true. According to the Ontario College of Family Physicians, doctors spend 19 hours a week on administrative tasks – reports, referrals, and more.
A full day’s slate of patients amounts to five hours of such paperwork, and 94 percent of Ontario family practitioners feel “overwhelmed” by the workload.
“Your family doctor, on average, usually has about 1,200 to 1,800 patients,” said Dr. Chandi Chandrasena, a family doctor and chief medical officer of OntarioMD, a subsidiary of the Ontario Medical Association charged with digital health initiatives in the community.
“When they see you in your office, that’s only a small part of the work that they do. A good bulk of the work is managing the documentation: writing in the notes, the forms, reading and managing the reports that come into their EMR, the letters, and the messages and requests that patients send.”
“Some of this administration work is expected and this is part of caring for our patients, but a large amount is the administrative burden, which is caused by the health system and the way we are expected to operate.”
Generative artificial intelligence (Gen AI) could ease that burden. In fact, medical scribe applications – which transcribe entire patient interactions, generate the corresponding paperwork and integrate with electronic health records – can save physicians up to four hours a week by reducing documentation workload by 71 percent, Dr. Chandrasena said.
Generative AI could also anticipate next steps – booking referrals, populating follow-up notes for future visits and ordering tests, for example.
Still, healthcare, particularly in Canada, is trailing other industries in the uptake of generative AI, according to a recent survey by data and AI firm SAS Institute and Coleman Parkes Research.
Even though Canadian healthcare respondents were more likely to say they have a good understanding of Gen AI (53 percent) than Canadian and global cross-industry respondents (51 percent and 48 percent, respectively), only 20 percent said they use Gen AI daily, compared with 29 percent of global respondents across industries.
Artificial intelligence – broadly, the intersection of disciplines including predictive analytics, machine learning, and speech and visual recognition – has played a business role for some years now, particularly in financial institutions.
In healthcare, it has been helping predict appointment no-shows, and in some radiology departments it’s assisting with the interpretation of images.
But generative or creative AI only made a commercial splash in late 2022, and it was aimed at consumers, not business. While it has been writing endless episodes of Seinfeld and producing convincing images of penguins playing bagpipes, it has largely been a solution in search of a business application.
“It took hold of our curiosity and of course, as individuals we’d love to play with it. What’s possible? What’s not?” said Jay Upchurch, SAS executive vice-president of technology and chief information officer. “(But) you’ve got to have a business strategy that’s being realized by the technical application of generative AI.”
Medical scribe applications, which are numerous, check those boxes. But there are other challenges to Gen AI adoption, especially in a community health setting.
Health system pushes family doctors into other unfamiliar roles: Unlike clinicians in hospitals, family doctors must acquire software and do their own tech support. And they must pay for it out of their own pockets, Dr. Chandrasena noted.
Onboarding, training, rollout, troubleshooting – hospitals have dedicated staff to deal with the technology.
“It’s a different resourcing structure in the community. It’s a different support structure in that there really isn’t anything, it’s just us,” she said.
The healthcare system pushes family doctors into roles that they’re not trained for, she added.
“We are trained as physicians, not administrators or IT or privacy officers; these are unfamiliar roles.
“As physicians, we want to learn about new technology and we want to innovate,” she said. “But what about legal issues, the contracts, the liability, the consent? Who’s responsible if there’s an information breach? We don’t have any protected time to learn about all of this. We’re trained to solve your problems in the medical sense, but we’re not trained to be administrators. We’re not trained to even run a clinic, and we’re certainly not trained to be privacy officers or procurement officers.”
Upchurch cautions against losing sight of the “Gen” in Gen AI, especially in healthcare.
“It’s generative. It’s trying to predict next best word (or to be) creative,” he said. “And in a lot of cases, businesses can’t be creative when trying to make business decisions, or healthcare decisions about treatment or drug development or something like that. That’s where other AI techniques come into play, and we need to make sure that we again apply Gen AI in a point-specific way that is effective to the business outcome or need.”
Enter synthetic data: One such application is the development of synthetic data. Gen AI can be used to generate masses of data statistically consistent with a smaller volume of actual data collected, providing a bigger dataset for theoretically more accurate analysis. It’s also inherently anonymous: synthetic records can’t be attributed to real individuals, because they describe people who don’t exist.
This is a guardrail for healthcare practitioners looking to share data for in-depth analysis of trends and training AI models.
“We’re always worried about privacy and data handling and governance and management of that data. In some cases, creating synthetic versions of that will allow us to move a little bit faster,” Upchurch said.
For example, he explains, researchers could model the impact of an event like 2020’s COVID pandemic. The data set might not yet exist. “Then it’s a question of, can I create a version of that data that would emulate things that start to suggest a pandemic? And then, can I model out what that would do to my supply chain and how I could handle that?”
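To make the synthetic-data idea concrete (this is a generic illustration, not any specific vendor’s product), here is a minimal Python sketch using NumPy: it fits a multivariate normal distribution to a small “real” dataset and samples a much larger synthetic one that preserves the aggregate statistics without containing any actual individual’s record. The column meanings (age, systolic blood pressure) and all numbers are invented for illustration.

```python
import numpy as np

def synthesize(real: np.ndarray, n_synthetic: int, seed: int = 0) -> np.ndarray:
    """Sample synthetic rows from a multivariate normal fitted to real data.

    Aggregate statistics (means, covariances) of the real data are preserved,
    but no output row corresponds to an actual individual.
    """
    rng = np.random.default_rng(seed)
    mean = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)  # feature-by-feature covariance matrix
    return rng.multivariate_normal(mean, cov, size=n_synthetic)

# Hypothetical example: 50 real records with two numeric fields
# (say, age and systolic blood pressure).
rng = np.random.default_rng(42)
real = rng.multivariate_normal([55.0, 130.0],
                               [[100.0, 30.0], [30.0, 225.0]], size=50)

synthetic = synthesize(real, n_synthetic=5000)
print(synthetic.shape)  # (5000, 2)
```

Real systems use far richer generative models (and add privacy guarantees such as differential privacy), but the principle is the same: learn the distribution, then sample from it rather than sharing the underlying records.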
It always starts with data: Most conversations about technology adoption in healthcare begin with the quality of the data, not the predictive or machine learning models, said Dr. Onyi Daniel, board member for the Sinai Health System in Chicago and former vice-president of data and analytics strategy for Highmark Health, an American national non-profit healthcare company based in Pittsburgh.
“It really starts upstream at the data collection, the data quality, and the data governance,” she said. [The emergence of generative AI] “has sparked a little bit more of investment and attention to that upstream data quality, data collection, and data governance, … so that we are including all the populations, specifically the target populations which cross all demographics, groups, etc., in the model building.”
Even with the challenges of Gen AI adoption, Dr. Chandrasena sees further upside for AI in healthcare.
“Could AI be used outside of just AI scribe? Could it populate forms? Can it auto-populate requisitions and referrals across different platforms? Could it auto-populate my electronic medical record (EMR) using proper terminology and coding?”
AI scribe, Dr. Chandrasena said, is just the beginning.
“We always talk about the incredible need for ‘one patient, one record,’” she said. “So, during these visits with your physicians, your nurse practitioners, your social workers or any health professionals, could AI convert the discussion into discrete data fields that then get exported and put into the ‘one record’ for your patient that travels with them from province to province? I believe so. We’re just not there yet.”