Peer review projects in radiology aim to improve quality of reporting

By Dianne Daniel

No matter what perspective you take, Canadian radiologists are poised to benefit from automated peer review. As long as the focus shifts from a culture of blame to one of continuous process improvement, that is.

That’s the consensus as efforts to implement peer review systems forge ahead across Canada. In Ontario, the Integrated Department of Diagnostic Services at Hamilton Health Sciences (HHS) and St. Joseph’s Healthcare Hamilton (SJHH) has deployed a city-wide peer review implementation that’s billed as the “first-ever cross-institutional, cross-system, prospective peer review platform.” And just a few months ago, Ontario’s health minister, Deb Matthews, announced that peer review for radiologists will, in the future, be conducted across the province.

In British Columbia, the first wave of a comprehensive, province-wide endeavour is set to go live this spring, involving six different PACS, over 100 hospitals, and more than 250 radiologists.

The approaches may differ, but the goal is the same: by giving radiologists automated tools designed to facilitate the review process – both in real time and historically – the projects expect a general increase in quality and a reduction in errors and discrepancies.

“The purpose of such a system should not be that heads will roll, but that minds will blossom,” says Dr. John Mathieson, medical director, medical imaging, Vancouver Island Health Authority, borrowing a quote from the authority’s chief quality officer. “If we can identify factors that lead to errors or lead to lower quality, less useful reports, and then identify methods by which those things can be improved, that’s when we’ll have long-term benefit to all patients,” he says.

Announced by the Ministry of Health in December, 2012, the B.C. project is implementing McKesson Corp.’s QICS for Radiologist Peer Review to conduct cross-facility peer review of diagnostic imaging reports. In an ideal scenario, says Mathieson, two radiologists would review the same study before issuing a report. At this juncture, however, that goal is proving to be a challenge for the B.C. project.

“One trade-off is that you risk delaying the report,” says Kirk Eaton, the Ministry of Health Services’ director of diagnostic imaging, noting that although it’s possible to filter automated sampling so urgent reports are excluded, the implementation is technically challenging due to the breadth and depth of the province-wide approach. “It’s something we’ll revisit down the road,” he says.

Instead, the B.C. peer review system will initially focus on automated, random sampling of recently completed reports. It will also enable radiologists to capture their own review efforts when they call up previous reports to compare with follow-up images and will maintain a record of instances when they sought a second opinion.

Looking at prior studies is something many radiologists do on a daily basis as part of their normal workflow. What’s lacking, says Dr. Mathieson, is a way to hold on to that work and use it to inform the quality improvement process moving forward. With B.C.’s automated peer review system, radiologists will be able to save historical comparisons, giving them the advantage of learning from hindsight.

The ultimate goal is to close the loop on peer review by studying cases for which the answers are known; in other words, cases that went on to surgery, pathology or autopsy with a specific result. “We can then show people how well they scored in finding those things and making the correct assumption,” says Dr. Mathieson. “… I would really like to know about all of the ones I missed because they were small and subtle so that I can figure out a way to learn.”

The strategy is one used by the airline industry and U.S. military, he adds. By applying root cause analysis to airplane crashes, for example, the airline industry discovered that the “up” and “down” switches on some aircraft were not only side by side, but also had a similar appearance. Altering them to look completely different eliminated the problem. “Rather than saying this is pilot error and leaving it at that, they made huge strides,” he says.

In Hamilton, HHS and SJHH are using DiaShare Quality from Real Time Medical, a context-aware, workflow management and quality assurance software platform designed from the ground up to support cross-system implementations. Initially, the pilot involved 11 radiologists, two separate PACS, two different radiology information systems (RIS) and PowerScribe, a voice recognition system, with the intent of increasing to 65 radiologists early this year.

The Hamilton project is the first cross-system deployment in North America or Europe to successfully perform peer review before diagnostic reports are finalized, a workflow process Real Time Medical calls prospective review, as opposed to retrospective review. Instead of looking back, prospective peer review looks forward. A sampling of cases, typically between 3 percent and 5 percent, is automatically submitted for peer review and the response comes back within hours, before a report is finalized and sent on to the ordering physician.

If there’s agreement over the findings, the report is finalized. If there’s a discrepancy, the originating radiologist can choose to amend the report, consult with the reviewer (maintaining anonymity), or forward the review to a third party for another opinion.
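In rough terms, that prospective workflow can be sketched in a few lines of Python. The function names, the 4 percent rate and the outcome labels below are illustrative assumptions, not Real Time Medical’s actual implementation:

```python
import random

# Sampling rate chosen from within the 3-5 percent range cited above.
SAMPLE_RATE = 0.04

def route_report(report_id, rng=random):
    """Decide whether a draft report is held for peer review
    before being finalized and sent to the ordering physician."""
    if rng.random() < SAMPLE_RATE:
        return "hold_for_peer_review"  # reviewed within hours, pre-finalization
    return "finalize"                  # released immediately

def resolve_review(agreement):
    """On review completion: agreement finalizes the report; a
    discrepancy leaves the originating radiologist with options."""
    if agreement:
        return "finalize"
    return ["amend_report", "consult_reviewer_anonymously",
            "forward_to_third_party"]
```

The key property is that sampling happens before the report is released, so a caught discrepancy can still change what the ordering physician receives.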

A retrospective review process, on the other hand, relies on comparing current exams with prior exams, which may have occurred months or even years earlier. Its main shortcomings are that it relies on the existence of prior exams and that, if discrepancies are found, it may be too late to take corrective action or make a difference for the patient.

While its system is capable of both prospective and retrospective peer review, Real Time Medical pioneered cross-system, prospective peer review as a means of catching errors before diagnostic reports are issued – in time to make a difference for the patients affected, while also serving as peer review feedback to radiologists. The learning is not merely after the fact; the sampled patients benefit too. “In this way all stakeholders benefit: physicians, patients and the health care system in general, since diagnostic results drive what happens next in the care continuum, including what resources are used and which will be the most effective for the patient,” says Ian Maynard, CEO and co-founder of Real Time Medical.

DiaShare, the company’s diagnostic sharing platform, also addresses other shortcomings of legacy peer review. Some legacy systems, for example, don’t support automated sampling; radiologists are free to decide which cases to send for peer review, which removes any objectivity. Nor are reporting radiologists and reviewers anonymous, leaving room for human bias to affect results.

With DiaShare and QICS, sampling is random and reporting radiologists and reviewers are anonymous. “For us, complete anonymity of participating physicians was a must,” says Maynard. “It allows radiologists to fully embrace peer review as an objective learning experience as opposed to a policing, non-objective exercise.”

Another problem identified by early approaches to peer review is that radiologists don’t always have access to a peer with the same sub-specialty, particularly in smaller centres or remote locations. Or, there are so few radiologists that it’s impossible to achieve an appropriate level of anonymity as people who work together become familiar with one another’s reporting styles.

Cross-platform solutions like DiaShare and QICS address that dilemma using intelligent rules-based engines to control workflow across geographic boundaries. Reviewers may be in the same hospital, at another site down the road, in another city or even in another province. As long as they have the gateway software connecting them to the intelligent platform, they can become part of the peer review network.
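A simplified sketch of that kind of rules-based matching follows; the data model and field names are assumptions for illustration, not the DiaShare or QICS schema:

```python
import random
from dataclasses import dataclass

@dataclass
class Radiologist:
    ident: str          # internal ID only; identities stay anonymous to peers
    subspecialty: str   # e.g. "neuro", "msk"
    site: str           # hospital or city; irrelevant to eligibility
    online: bool        # connected to the peer review network

def pick_reviewer(case_subspecialty, author, pool, rng=random):
    """Match a case to an available peer with the same subspecialty,
    anywhere on the network, excluding the report's author."""
    candidates = [r for r in pool
                  if r.online
                  and r.subspecialty == case_subspecialty
                  and r.ident != author.ident]
    return rng.choice(candidates) if candidates else None
```

Because the match ignores the reviewer’s site, a subspecialist down the road or in another province is as eligible as one in the same hospital, which is what solves the small-centre anonymity problem.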

“One of the key issues we recognized with legacy approaches is they did not address the need for geographic and emotional separation between reviewers and reviewees,” notes Maynard.

DiaShare was “born out of necessity,” he adds. Developed to support the company’s radiology collaboration service for participating sites, it needed to dynamically manage service levels in order to balance radiologists’ workloads while also ensuring rapid report turnaround and the fulfilment of service-level commitments. In particular, the system actively monitors and reassigns cases to ensure that urgent cases are reported quickly – particularly important when dealing with time-sensitive diagnoses such as stroke, with its 2-4 hour treatment window.
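The underlying idea – urgent studies jumping ahead of routine work in the queue – can be sketched with a simple priority queue. The priority labels and class are illustrative, not the product’s actual design:

```python
import heapq

URGENT, ROUTINE = 0, 1  # lower value = higher priority

class WorklistQueue:
    """Minimal service-level-aware worklist: urgent cases are always
    dispatched first; arrival order is preserved within a priority."""

    def __init__(self):
        self._heap = []
        self._seq = 0  # tiebreaker keeps FIFO order within a priority level

    def add(self, case_id, priority):
        heapq.heappush(self._heap, (priority, self._seq, case_id))
        self._seq += 1

    def next_case(self):
        return heapq.heappop(self._heap)[2] if self._heap else None
```

A real system would layer timers and reassignment on top of this, escalating a case’s priority as its service-level deadline approaches.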

Applied to the peer review process, this feature removes the fear of a report sitting in limbo by matching needs to availability in real time, enabling reviews to be completed in a matter of hours, subject to radiologist availability. The McKesson system in B.C. also uses an intelligent engine to check who’s logged in and available, speeding the review process and optimizing workflow, but it does not feature active workload balancing or service-level management of cases.

Dr. Jacques Lévesque, president of the Canadian Association of Radiologists (CAR), says there is a need to implement peer review across the country, in part because so much has changed over the past decade. It’s an important, although sensitive, subject area, but one he believes will ultimately lead to better patient outcomes.

“These kinds of tools should be used for continual professional development of radiologists; it shouldn’t be a punitive kind of thing,” says Dr. Lévesque. “If I look at the vision for the next two to three years, peer review will be widely accepted in Canada … As a national association with an orientation for quality guidelines and safety for the patient, it is clearly on the agenda.”

One of the key recommendations from CAR is that “if we implement peer review, it must be done in such a way as to improve care rather than track individual radiologists’ discrepancy rates.”

DiaShare, for example, can be customized per user, per exam, giving radiologists the ability to increase sampling rates for specific exam types of greatest concern.

The peer review system in B.C. isn’t designed as an outlier identification project, but it will be able to detect error trends that are outside the mainstream. Before jumping to conclusions, however, the first step will be to increase the particular radiologist’s sampling rate to ensure the trend is valid as opposed to a statistical variation, explains Dr. Mathieson.

If three out of 300 cases show a discrepancy, that’s a 1 percent error rate. But if those three cases happened to be sampled consecutively, that short run would look like a 100 percent error rate and could lead to inappropriate conclusions.
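Dr. Mathieson’s arithmetic can be checked directly. The point is that the same three discrepancies yield very different rates depending on the slice of cases you look at:

```python
# 300 sampled cases: 0 = agreement, 1 = discrepancy.
cases = [0] * 300
cases[150:153] = [1, 1, 1]  # three consecutive discrepancies

overall_rate = sum(cases) / len(cases)  # 3/300 = 1 percent

# A reviewer who only sees that short run observes a 100 percent rate.
window = cases[150:153]
window_rate = sum(window) / len(window)
```

This is why the B.C. project’s first response to an apparent trend is to enlarge the sample rather than act on a small, possibly unlucky, window.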

“Some jurisdictions are saying they will pull a licence of someone they sampled incorrectly. That’s astonishing and a good way to ensure nobody will ever tell the truth,” he says.

“We’ve observed things that happened in places that implemented this against people’s will, without buy-in, without anonymity and with some sense of a punitive nature towards it. What they ended up with was garbage in, garbage out. People didn’t like it and only paid cursory attention to it.”

Instead, the B.C. Ministry of Health is working hard to make opponents of the peer review process into proponents. “Rather than telling them they’re wrong and bulldozing over them, we’ve actually tried to bring them in and make them a part of the process,” he says, noting that their critical eye can lead to system improvements.

It is also fostering support by making the system easy to use. QICS will interface with whichever PACS is already in use at the sites included in the pilot. There’s no separate login required and the look and feel will be familiar. The same is true of Real Time Medical’s DiaShare Quality solution, which drives the local PACS viewer that radiologists are accustomed to using. As both projects continue to roll out, expectations are high that the improved collaboration will lead to improved outcomes.

“We’re only scratching the surface of this,” says Dr. Mathieson in B.C. “Right now radiologists view themselves as being in a goldfish bowl … but errors occur everywhere. We are lucky in that we can do things about quality improvement that others would find more difficult.” Dr. Greg Butler, chairman and co-founder of Real Time Medical, adds that, “Forward-looking radiologists are embracing the opportunity to do peer review right. It is possible to do peer review in a manner that benefits patients, physicians and the system and resources we all share, while addressing legacy issues that have, to date, prevented large-scale adoption of peer review.”
