Australia deploys AI in radiology across the country
January 31, 2024
CHICAGO – An Australian diagnostic imaging organization has surged ahead with an enterprise-scale implementation of AI, deploying Annalise.ai to 250 sites and to over 400 radiologists across the country.
The group, called I-MED, provides reading and reporting services to clinics and public and private hospitals throughout Australia. Like other diagnostic imaging organizations around the world, its radiologists had been finding it difficult to keep up with the fast-growing demand for exams.
AI was seen as a solution for improving quality while maintaining throughput and reducing stress and burnout of radiologists.
By implementing artificial intelligence across the enterprise, I-MED has leapfrogged most other reading groups worldwide, most of which are still testing AI in small-scale projects.
“Australia is advanced, compared to the rest of the world, when it comes to deploying AI,” said Dr. Catherine Jones, a cardiothoracic radiologist at I-MED who helped lead the project. She is also a professor of clinical imaging at the University of Sydney and a clinical consultant with Annalise.ai.
Dr. Jones outlined how and why I-MED implemented AI software enterprise-wide at the annual RSNA radiology conference in late November. Addressing an audience in Chicago, she discussed the approach used by I-MED and explained how the organization handled change management and the use of metrics.
She said I-MED had to develop much of this methodology on its own: no one else had done an AI deployment of this scale, so there were no precedents to model its approach on.
The chest X-ray application, called Annalise CXR, was developed by I-MED in a joint venture with AI technology company Harrison.ai. The solution was deployed in Australia in 2021 and can detect more than 120 findings; it’s now used daily by I-MED’s 400+ radiologists.
In an initial test of the AI system for chest X-ray exams, I-MED set several targets for success. The software performed well, with the tool doing three times better than expected on several of the reported metrics.
I-MED had 11 of its radiologists pilot the software for six weeks. All of them used the Annalise.ai solution to assist them with their chest X-ray readings.
Some of the key aspects of the test:
- I-MED determined that if the AI tool caused the radiologists to change their reports 1 percent of the time, that would be a good outcome. It would also be enough of a quality improvement to justify rolling out the software to the full cohort of 400 radiologists across Australia. In the post-pilot survey, the 11 participating radiologists reported they had changed their reports 3.1 percent of the time – roughly three times the target.
- It was estimated that patient management would change 0.5 percent of the time because of the AI findings. In actuality, it changed 1.4 percent of the time.
- In the pilot, I-MED management didn’t want the radiologists asking for extra imaging more than 3 percent of the time because of the AI findings. “The last thing we wanted was to generate more demand for a lot more CTs,” said Jones. In the end, there was only a 1 percent increase in further imaging recommendations. (Almost all of them were to investigate suspected lung cancers or cases of osteoporosis.)
Moreover, the organization thought that if 50 percent or more of the radiologists felt the software positively impacted their work, it would be considered a success. However, after the six weeks passed, 90 percent of them reported the tool positively impacted their CXR reporting.
“The radiologists who were the most opposed to AI became its biggest supporters, after they had used it,” said Dr. Jones.
With 250 sites across a large geographic area, and with different types of sites – from clinics to large hospitals – and a range of users with various degrees of buy-in, a lot could have gone wrong, she said.
However, with good planning, the pilot project worked and showed excellent results. For that reason, the organization rolled out the chest X-ray AI tool and many others to its full cohort of radiologists across the continent.
“We planned out the [initial] deployment for about four months before commencing it,” said Dr. Jones. “This was paramount to making sure we got it right.”
She noted that the full rollout was done in phases, as Australia is such a big country and I-MED serves a wide range of customers. The organization operates three different PACS and three RIS, and also provides teleradiology services.
“With the wider rollout, we had to stage it. We couldn’t go from zero to hero in one step,” she quipped. The company also added many different AI tools.
In addition to assisting radiologists with reading exams, such as chest X-rays, Dr. Jones said I-MED wanted a tool that would help with triage. The organization found that at the end of the day, urgent cases were still sitting in the queue.
AI software could help with this, shifting urgent cases to the top of the stack and routing them to available and qualified physicians. Moreover, in future, the company wants to add AI report generation.
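The article doesn’t describe I-MED’s triage mechanism in detail, but the idea of shifting AI-flagged urgent cases to the top of the worklist maps naturally onto a priority queue. The sketch below is illustrative only – the urgency categories and study IDs are hypothetical, not I-MED’s or Annalise.ai’s:

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical urgency ranking; a real system would map specific AI
# findings (e.g. pneumothorax) to clinically validated priority levels.
URGENCY = {"critical": 0, "urgent": 1, "routine": 2}

@dataclass(order=True)
class Case:
    priority: int
    arrival_order: int
    study_id: str = field(compare=False)

class TriageQueue:
    """Min-heap worklist: most urgent case (lowest priority value) first."""
    def __init__(self):
        self._heap = []
        self._counter = 0  # preserves first-in-first-out order within a priority level

    def add(self, study_id, ai_flag):
        heapq.heappush(self._heap, Case(URGENCY[ai_flag], self._counter, study_id))
        self._counter += 1

    def next_case(self):
        return heapq.heappop(self._heap).study_id

q = TriageQueue()
q.add("CXR-001", "routine")
q.add("CXR-002", "critical")  # AI flags a suspected urgent finding
q.add("CXR-003", "urgent")
print(q.next_case())  # CXR-002 jumps the queue
```

Routing to “available and qualified physicians,” as described above, would add a second dimension (subspecialty and availability) on top of this ordering.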
But to get from zero to organizational acceptance of AI, Dr. Jones explained, several components had to be in place.
First, I-MED had to ensure that it had the in-house skills to manage the project, from AI tool selection to deployment of the system.
To get started, it decided to begin with the most commonly performed exam type, the chest X-ray. I-MED then selected Annalise.ai as its vendor. “We knew it had excellent accuracy, based on internal and external validation,” she said.
After deciding the direction to head in, I-MED had to educate its radiologists about the project and AI in general. It had to put together an education and training program.
Dr. Jones said early engagement with the users – the radiologists – was key. To help the project succeed, the team also had to make sure that essential stakeholders were on board.
“You must identify who those people are, including senior executives and influential radiologists, and you must have those people sponsoring the project. They will have to be supportive,” she said.
She said, moreover, that you won’t get anywhere without having top people from the IT department aligned with the team.
Next, she said, comes the discovery phase. What is it that’s needed at every site, in terms of infrastructure? And secondly, how do the radiologists interact with the technology? A solid base must be in place before deploying the AI, and the radiologists must be comfortable with all the underlying technologies in order to benefit from the new systems.
Finally, a change management piece must be deployed. Dr. Jones said constant and effective communication is required. “That means, no one at the end of this process could possibly say, ‘nobody told me this was coming’.”
She said that every week, radiologists received emails, memos and messages on Teams. Many of the messages came from the CEO, indicating that top management was backing the project. Moreover, short videos were sent out with messages, and webinars were available to radiologists who had time.
“We made all of the reference material available online,” said Dr. Jones. She added that all the key radiology leaders were made part of the project.
Importantly, to win over dubious radiologists, “we also had evidence that the tool would help us with our problems, because it’s very hard to rationally argue with evidence.” Still, she noted, “People will irrationally argue, but there’s not much you can do about that.”
As well, there must be feedback mechanisms. And she emphasized that you must show people that you are listening to them and responding to their feedback.
During rollouts of the technology, there were staff dedicated to gathering feedback available. Super-users were also trained, so they could assist others.
Metrics were of paramount importance. The organization had to decide what it wanted to measure during the trial, to prove the AI software was useful.
It turned out that the chest X-ray tool was even more useful than anticipated, which impressed the radiologists and turned skeptics into believers.
What especially won over the radiologists was the software’s ability to spot problems they had missed. “Other radiologists said they would have missed them too,” said Dr. Jones, adding that the findings – which included cancers – were later confirmed by CT and MRI.
It was gratifying for Dr. Jones and her colleagues when radiologists were asked if the trial changed their opinion of the CXR tool. “There was a very positive response,” she said. “They reported a much higher approval of AI, in general, because of the trial.”
After the chest X-ray tool was rolled out across Australia, I-MED conducted another survey. It found that 93 percent of radiologists were using the tool for every CXR they did.
Additionally, 75 percent said the AI tool did not slow them down, and 25 percent said it sped up their work.
And 90 percent said the AI tool had positively impacted their CXR reporting; the company expected this to be only 50 percent.
Dr. Jones said I-MED does regular surveys across the entire group and has metrics that are reported back every month.
The metrics include accuracy of the software and turnaround times of urgent cases. “We also look at how often the software fails to provide a result,” she said.
The organization also assesses the impact on workflow and on patient outcomes.
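The monthly metrics described here – the software’s failure-to-return-a-result rate and urgent-case turnaround times – are straightforward to compute from case logs. A minimal sketch, with hypothetical record fields that are not I-MED’s actual schema:

```python
# Hypothetical case records; field names are illustrative only.
cases = [
    {"urgent": True,  "ai_result": True,  "minutes_to_report": 18},
    {"urgent": False, "ai_result": True,  "minutes_to_report": 240},
    {"urgent": True,  "ai_result": False, "minutes_to_report": 35},  # AI returned no result
]

def failure_rate(cases):
    """Fraction of studies for which the AI software failed to provide a result."""
    return sum(not c["ai_result"] for c in cases) / len(cases)

def urgent_turnaround(cases):
    """Mean minutes from acquisition to report for urgent studies."""
    urgent = [c["minutes_to_report"] for c in cases if c["urgent"]]
    return sum(urgent) / len(urgent)

print(round(failure_rate(cases), 2))  # fraction of studies with no AI result
print(urgent_turnaround(cases))       # mean urgent turnaround, in minutes
```

Tracking these monthly, as Dr. Jones describes, turns the pilot’s one-off success criteria into an ongoing monitoring program.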
Dr. Jones said the AI deployment was challenging and that a lot could have gone wrong. However, in the end, “It actually went as well as it could,” she modestly commented.
But preparation and having a good framework were necessary prerequisites, she observed. Moreover, constant communication with the users and stakeholders underpinned the success of the deployment.
“Implementing AI tools can be done,” she said. “If we can do it in our very diverse network, almost anyone can. However, you need to have a framework and a project plan.”
In addition to communicating the plan, she said that “evidence is a powerful ally in convincing your colleagues and the IT department, as well as your executives, that there is going to be a successful outcome.
“When you’ve done all of this, the benefits are enormous.”