Feature Story
AI solutions making their way through DI departments
January 28, 2021
In the 1950 Stanley Cup playoffs, then-22-year-old Gordie Howe mistimed a check and fell headfirst into the boards. Removed unconscious on a stretcher before a hushed crowd, he was rushed to hospital. With Howe’s life on the line, doctors determined that intracranial pressure from a brain bleed would kill him. At a medical imaging conference in Toronto last year, Howe’s son Murray, a radiologist, described how neurosurgeons determined where to drill to relieve the pressure: “They guessed.”
Fortunately, they guessed right, and saved the young man who would become an ice hockey legend.
Brain imaging has become much more sophisticated since 1950. That’s fortunate, since the mortality rate of brain bleeds is almost 50 percent in patients who are not diagnosed within 24 hours. Still, the workflow was a source of frustration for Dr. Errol Colak, a staff radiologist at St. Michael’s Hospital in Toronto. Imaging files were being sent to him from the emergency room for analysis in chronological order, on a first-in, first-out basis.
“You don’t really know what you’ve got until you open (the file),” Colak says. The patients in most dramatic need of diagnosis weren’t necessarily higher up on the work list. Dr. Colak would get a sinking feeling when, hours and dozens of patients into his list, he would find a brain bleed. “I’d think, ‘I really hope this patient is still okay,’” he says.
For all the technology and networking capabilities, “computers are still pretty stupid,” Colak says. Having done his undergraduate work in computer programming, he wanted the machines to streamline the diagnosis process and make sure the highest priority cases got attention quickly. If the computers were dumb, he would make them smarter.
AI and signal processing: Enter Hojjat Salehinejad, who wanted to use his talents and education to help people. Salehinejad majored in machine learning and signal processing and is now pursuing his PhD in electrical and computer engineering at the University of Toronto while working as a senior data scientist at Unity Health Toronto, making him a natural fit for an artificial intelligence (AI) project to streamline the emergency radiology process.
“Images and videos are signals (for processing),” he says. And signal processing is critical to AI.
Artificial intelligence is not a technology, but a suite of intertwined technologies – natural language processing, machine learning, deep learning, computer vision – that allow computers to perform human-like decision-making activities. Rooted in the science of analytics, AI systems can “teach themselves” the relationships among the data collected and outcomes.
But first, they have to be trained.
Salehinejad and Colak spent a year feeding thousands of radiological images into an experimental system, tweaking algorithms and programming. The goal was to build a system that could process imagery from scanning devices and flag potential brain bleeds for the radiologist’s analysis.
Patients could now be referred to a neurologist in minutes instead of hours. The system was launched in the fall of 2020, and has an accuracy rate, verified against historical hospital data, of 93 percent. And, Salehinejad says, performance improves as the system collects more data and reports.
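The approach Salehinejad and Colak describe – train a model on labelled scans, then use its predictions to push likely bleeds to the top of the reading list – can be sketched in miniature. Everything below is invented for illustration (the synthetic 8x8 “scans,” the logistic-regression stand-in for a deep model, the patient names); the hospital’s actual system is far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for head-CT slices: 8x8 grayscale arrays.
# "Bleed" images get a bright (hyperdense) patch; labels are 1 / 0.
def make_scan(bleed):
    img = rng.normal(0.2, 0.05, (8, 8))
    if bleed:
        img[2:5, 2:5] += 0.6  # simulated hyperdense region
    return img.ravel()

X = np.array([make_scan(b) for b in [1] * 50 + [0] * 50])
y = np.array([1] * 50 + [0] * 50)

# Minimal logistic-regression "model" trained by gradient descent,
# standing in for the deep network a real system would use.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted bleed probability
    grad = p - y
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

# Re-order an incoming worklist so the likely bleed is read first,
# instead of first-in, first-out.
worklist = [("patient_A", make_scan(0)),
            ("patient_B", make_scan(1)),
            ("patient_C", make_scan(0))]
scores = {pid: 1 / (1 + np.exp(-(img @ w + b))) for pid, img in worklist}
prioritized = sorted(worklist, key=lambda case: -scores[case[0]])
print([pid for pid, _ in prioritized])  # likely bleed moves to the front
```

The point of the sketch is the last three lines: the model’s score replaces arrival time as the sort key for the radiologist’s queue.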
AI trends in medicine: If Dr. Colak is an “IT guy” turned radiologist, Greg Horne’s career arc is the reverse. Once an X-ray technician, he’s now the Global Principal for Healthcare at analytics and AI giant SAS Institute. AI is essentially an exercise in pattern matching and search optimization, sorting through assembled experience. Data can’t necessarily tell you who will turn up in an emergency room, but it can tell you why, and how many, will.
It can solve other pressing medical problems, too. For example, the Centre for Addiction and Mental Health (CAMH) crunched data to create a model that identifies patients who are likely candidates for alternate level of care (ALC) – those in acute care who no longer require hospitalization – with 80 percent accuracy, freeing up beds and streamlining processing.
(In one case, reconstructing data after the fact, largely customs and airline travel data, algorithms predicted down to the county level where the first U.S. case of Ebola would appear.)
Worldwide, Horne says, AI projects have modeled the viability of kidney transplants, detected the growth of liver metastases in colon cancer patients, and chosen the exact model of stent to match a patient condition. But there are impediments, regulatory and institutional, to the growth of AI in healthcare. On the institutional side, “the trend that still wins, over and above everything else, is ‘do nothing, watch and observe,’” Horne says.
“Healthcare traditionally has been a place where people don’t want to go first. They certainly don’t like to go last, either,” Horne says.
One of the stronger growth areas for AI in healthcare is in serving as an intelligent assistant, and acting as another set of eyes for physicians. One such application is an algorithm from GE that is very successful at identifying pneumothoraxes. Paired with X-ray technology, it is flagging possible cases of pneumothorax and helping radiologists and other doctors ensure that they’re not overlooked.
Radiology and innovation: AI is especially ripe for use in imaging, partly because radiology has thrived on innovation, says Dr. An Tang, a professor in the department of radiology, radiation oncology and nuclear medicine at Université de Montréal. Computed tomography scans (CT or CAT scans) were developed in the 1970s and became commonplace over the following decades. The early 2000s brought image-guided, interventional procedures. Dr. Tang says about 20 percent of radiological work is now interventional, including minimally invasive procedures to heat and burn tumours, precisely target chemotherapy, and cryogenically freeze kidney tumours.
“Our work has changed a lot over the last few decades,” he says.
Dr. Tang chairs the Canadian Association of Radiologists (CAR) AI Standing Committee. He is “deeply convinced” that AI married with radiology will render superior results for patient care.
Deep learning is a technological breakthrough that Dr. Tang thinks has enormous potential for radiology. Deep learning (DL) is a subset of machine learning (ML). ML uses algorithms to parse and learn from data and make decisions based on what it has learned. DL is a type of neural network structured in numerous layers (hence the adjective “deep”). DL learns imaging features and classification rules on its own. Its design is inspired by that of biological neurons. Nodes, mimicking neurons, turn on and off if data reaches a certain threshold in the algorithm, says Dr. Geoffrey Hinton, a vice-president and engineering fellow with Google Inc. and professor emeritus at the University of Toronto.
Unlike a decision tree model, which eliminates data as it moves through its branches, DL monitors the state of the millions of “neurons” in the system, using all of the data it has collected for decision-making. Data loss, says Hinton, is the enemy of deep learning.
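Hinton’s description – nodes that “fire” only when their weighted input crosses a threshold, stacked in many layers – can be shown in a few lines of code. The weights, biases and inputs below are arbitrary, chosen only to make the mechanics visible; a real network would learn them from data.

```python
import numpy as np

def layer(x, W, b):
    # Each "neuron" sums its weighted inputs; the bias acts as a
    # threshold. It outputs a positive signal only when the sum
    # crosses that threshold (the ReLU activation), otherwise 0.
    return np.maximum(0.0, W @ x + b)

# A tiny "deep" network: three stacked layers of threshold units.
W1, b1 = np.full((4, 3), 0.5), np.full(4, -0.1)
W2, b2 = np.full((2, 4), 0.5), np.full(2, -0.1)
W3, b3 = np.full((1, 2), 0.5), np.full(1, -0.1)

def forward(x):
    h1 = layer(x, W1, b1)   # first layer of neurons
    h2 = layer(h1, W2, b2)  # second layer feeds on the first
    return layer(h2, W3, b3)

print(forward(np.array([0.5, 1.2, 0.3])))    # strong input: neurons fire
print(forward(np.array([-0.5, -1.2, 0.3])))  # weak input: network stays silent
```

With a strong input, activity propagates through all three layers; with a weak one, the first layer never crosses its threshold and the output is zero, which is the on/off behaviour Hinton describes.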
De-identification: How much data is enough? For a deep learning model to exceed human judgment requires thousands, even hundreds of thousands of images to train on, Dr. Tang says. And there are a couple of obstacles to gathering that much information.
First, it’s unrealistic to expect radiologists to re-input reports that predate a project. Second, institutions will have to pool datasets and models (a federated model). Here, the privacy issue raises its head.
“The implementation of ML algorithms often requires sharing highly sensitive (personal health information) contained in medical imaging through collaborations between different sites or data transfer to a third party,” wrote the CAR’s AI Ethical and Legal Working Group, composed of more than a dozen radiologists, including Dr. Tang, and two lawyers, in a two-part White Paper published last fall. “Therefore, careful precautions are required to ensure there is no inadvertent transfer of information which could be used to identify a patient.”
Hospitals share sensitive data governed by the federal Personal Information Protection and Electronic Documents Act (PIPEDA), along with parallel, health-focused acts on a provincial level, like Ontario’s Personal Health Information Protection Act (PHIPA). As a result, when personal health information is used for secondary purposes such as research, care must be taken to ensure health information cannot be traced back to an individual.
The CAR White Paper on De-Identification of Medical Imaging outlines best practices in “data management, access to health care data, de-identification, and accountability practices.”
Movement of images is a challenge, says Dr. Tang, and whatever pooling or distribution models that are used must not remember or expose personal data. “That is a requirement,” he says.
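In miniature, a de-identification step of the kind the White Paper describes might look like the following. The record layout, field names and salt are invented for illustration; real pipelines follow the DICOM confidentiality profiles and the CAR guidance, and handle far more identifiers than this.

```python
import hashlib

# Hypothetical imaging-report record; the field names are invented.
record = {
    "patient_name": "Jane Doe",
    "health_card_no": "1234-567-890",
    "birth_date": "1950-03-31",
    "study": "CT head without contrast",
    "finding": "acute subdural hematoma",
}

DIRECT_IDENTIFIERS = {"patient_name", "health_card_no", "birth_date"}

def deidentify(rec, salt):
    # Replace direct identifiers with a one-way pseudonym, so records
    # from the same patient can still be linked across sites without
    # exposing who the patient is. The salt must be kept secret, or
    # the pseudonyms could be reversed by brute force.
    token = hashlib.sha256(
        (salt + rec["health_card_no"]).encode()
    ).hexdigest()[:12]
    clean = {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
    clean["pseudonym"] = token
    return clean

print(deidentify(record, salt="site-secret"))
```

The clinically useful fields survive; everything that points back to the individual is replaced by a token that only parties holding the salt can reproduce.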
ImageNet: Fortunately, training imagery from hospital networks can be supplemented by ImageNet, a carefully architected database of objects developed to enhance the performance of machine learning. The brainchild of computer science professor Fei-Fei Li, ImageNet was inspired by the work of George Miller, who developed a hierarchical architecture for the English language not by defining words but by mapping their associations – something that aligned with machine-readable logic.
Li’s project, though, was to catalogue images in a similar fashion. Beginning in 2007, the dataset had more than three million images by 2010, all labeled and segmented into more than 10,000 categories. In 2012, a team from the University of Toronto created AlexNet, a neural networking architecture that brought image recognition error rates below 25 percent. By 2017, that number was 2 percent.
ImageNet now catalogues more than 14 million hand-annotated images. Stanford University’s Medical ImageNet contains petabytes of de-identified radiology and pathology images, linked to genome and electronic health record data. And more data means better modeling.
Intervention: AI modeling can speed diagnosis in an emergency situation, as in Dr. Colak’s case at St. Mike’s. It’s also becoming important for interventional radiology.
However, guiding percutaneous or endovascular interventions often requires information from different imaging modalities (ultrasound, CT, MRI and PET-CT). Algorithms have been developed to merge images from modalities in which the target lesion is well delineated (CT, MRI or PET-CT) with a modality used to track the interventional device during the procedure (ultrasound, fluoroscopy, CT).
This process is called image fusion. Some limitations of the process relate to patient motion, respiration, or deformation of the tissue by the interventional device itself.
AI can be used to improve image fusion through elastic registration, by predicting how the image will deform during the intervention. This can be done using computer modeling of mechanical properties (finite element analysis, or FEA) assisted by artificial intelligence training.
For endovascular interventions, such as those for arterial occlusive disease or aortic and brain aneurysms, we can now simulate the flow in the target vessel using computational fluid dynamics (CFD) and enhance the prediction using AI algorithms. With superior modeling, AI can also be used for simulations, whether for training purposes or as a rehearsal for a specific procedure.
Dr. Gilles Soulez, principal scientist with the University of Montreal Health Centre (CHUM) Research Centre and director of the Laboratory of Clinical Imaging Processing at CRCHUM, now thinks of radiology as a five-part process: image acquisition; classification and risk stratification of the target lesion (is it benign or malignant? is the vascular lesion at risk of occlusion or rupture?); semantic segmentation (differentiating tumours or the target vessel from the surrounding organ); object segmentation (delineating individual tumours or vessels); and intervention planning and guidance. “Before working with data and computer scientists, I didn’t think in these terms,” said Dr. Soulez.