Performance & Quality
Alberta to launch province-wide plan for DI peer review
February 3, 2016
Alberta Health Services (AHS) is putting the finishing touches on a province-wide peer review program and Marlene Stodgell-O’Grady, Director of Quality, Safety and Education, Diagnostic Imaging Services, expects to see positive results. “We’ve really focused our program on education, learning and improving outcomes as opposed to looking for outliers,” she says, noting that AHS performs about 2.85 million exams per year.
By March 2016, all general radiology reports and CT scans performed at public sector facilities in Alberta will be available for peer review. The underlying software used is McKesson Conserus, which integrates with AHS’s existing IMPAX picture archiving and communication system from Agfa HealthCare. This integration enables the reviews to be performed seamlessly within the normal daily workflow of the radiologists.
Each week, the fully integrated peer review system will randomly select studies from across the province and assign them to any one of AHS’s 300 radiologists, so that “the statistical likelihood of having a report style you recognize is very low,” explains Stodgell-O’Grady.
Only reports from the prior week are selected, so reviews are performed in a timely fashion, and radiologists receive only reports that fall within their normal scope of practice.
The system automatically assigns the same number of reports each week and any that aren’t reviewed simply fall out of the queue by week’s end. For general radiology, reviewers will be provided with seven reports each week and will be asked to report on five; for CT scans, reviewers are provided with two cases and asked to report on one.
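As a rough sketch of the weekly assignment logic described above, the following Python shows one way the random selection, self-review exclusion and quota caps could be modelled. The seven-and-five and two-and-one quotas come from AHS; the data structures, function names and everything else are illustrative assumptions, not the Conserus implementation.

```python
import random
from dataclasses import dataclass

# Weekly quotas quoted in the article: general radiology reviewers receive
# seven reports and are asked to report on five; CT reviewers receive two
# cases and are asked to report on one. Everything else below is illustrative.
QUOTAS = {
    "general": {"assigned": 7, "required": 5},
    "ct": {"assigned": 2, "required": 1},
}

@dataclass
class Study:
    study_id: str
    modality: str              # "general" or "ct"
    reporting_radiologist: str

def assign_weekly_reviews(prior_week_studies, radiologists, rng=None):
    """Randomly distribute last week's studies to reviewers across the province.

    A reviewer is never handed their own report, and no reviewer receives
    more than the weekly quota for a modality. Anything left unassigned or
    unreviewed simply falls out of the queue at week's end.
    """
    rng = rng or random.Random()
    assignments = {r: [] for r in radiologists}
    pool = list(prior_week_studies)
    rng.shuffle(pool)
    for study in pool:
        cap = QUOTAS[study.modality]["assigned"]
        candidates = [
            r for r in radiologists
            if r != study.reporting_radiologist
            and sum(s.modality == study.modality for s in assignments[r]) < cap
        ]
        if candidates:
            assignments[rng.choice(candidates)].append(study)
    return assignments
```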
Each facility is expected to carry out its assigned peer review load, based on volumes, and it’s up to each one to figure out how that will work, says Stodgell-O’Grady.
“Though it is an off-the-shelf product, we have done an awful lot of customization and worked very closely with McKesson Conserus project managers to make it work the way we want it to,” says Stodgell-O’Grady. “We want to make sure it’s not just about finding discrepancies; it’s also about acknowledging that some people do some exceptional work.”
Ratings in the peer review system range from “good call, most people would have missed it” to “concur” to “disagree – should usually be caught” to “disagree – should almost always be caught.” Reviewers can also indicate whether a discrepancy is clinically significant.
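For illustration, the rating scale and the clinical-significance flag could be represented along the following lines. The rating wording is taken from the article; the class and field names are assumptions.

```python
from dataclasses import dataclass
from enum import Enum

class Rating(Enum):
    # The four ratings quoted above; member names are illustrative.
    GOOD_CALL = "good call, most people would have missed it"
    CONCUR = "concur"
    DISAGREE_USUALLY = "disagree – should usually be caught"
    DISAGREE_ALMOST_ALWAYS = "disagree – should almost always be caught"

@dataclass
class Review:
    study_id: str
    reviewer: str
    rating: Rating
    clinically_significant: bool = False   # reviewer's judgment call
```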
Any discrepancies deemed clinically significant are automatically flagged to go through an adjudication process. A secondary reviewer, who must be a member of the interpretation and reporting subcommittee, either confirms or disagrees with the diagnosis. If the secondary reviewer disagrees, the report is returned to the reviewing radiologist with feedback explaining why.
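Building on the Rating and Review sketches above, the adjudication routing might look something like this. The adjudicate and notify_reviewer callables are hypothetical placeholders for the subcommittee reviewer and the feedback step, not Conserus functions.

```python
def route_review(review, adjudicate, notify_reviewer):
    """Route a completed review through the adjudication step described above.

    `adjudicate` stands in for the secondary reviewer from the interpretation
    and reporting subcommittee and returns True if the discrepancy is
    confirmed; `notify_reviewer` sends feedback back to the reviewing
    radiologist. Both callables are assumptions for illustration only.
    """
    is_discrepancy = review.rating in (
        Rating.DISAGREE_USUALLY,
        Rating.DISAGREE_ALMOST_ALWAYS,
    )
    if is_discrepancy and review.clinically_significant:
        if adjudicate(review):
            return "discrepancy confirmed"
        notify_reviewer(review, "adjudicator disagreed with the review")
        return "returned to reviewing radiologist with feedback"
    return "no adjudication required"
```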
Stodgell-O’Grady’s internal staff created dashboards using Tableau, an analytics platform, so that each radiologist can see his or her performance at a glance, including how many reviews they’ve conducted, how many reviews have been conducted on their reports, and how their personal ratings compare against the provincial average.
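A minimal sketch of the aggregation behind those at-a-glance figures, assuming a simple list of review records, could look like the following; the field names are hypothetical, and the actual dashboards themselves are built in Tableau.

```python
from collections import Counter

def dashboard_summary(radiologist, reviews):
    """Summarize one radiologist's figures: reviews they conducted, reviews
    conducted on their own reports, and their rating mix alongside the
    provincial mix. `reviews` is assumed to be a list of dicts with
    'reviewer', 'reported_by' and 'rating' keys.
    """
    conducted = [r for r in reviews if r["reviewer"] == radiologist]
    received = [r for r in reviews if r["reported_by"] == radiologist]
    return {
        "reviews_conducted": len(conducted),
        "reviews_on_own_reports": len(received),
        "personal_rating_mix": Counter(r["rating"] for r in received),
        "provincial_rating_mix": Counter(r["rating"] for r in reviews),
    }
```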
AHS is shying away from determining discrepancy rates at this point and focusing instead on learning. “We are very much trying to stay away from looking at the individual performance of anybody,” says Stodgell-O’Grady. “Instead we’re looking at the overarching trends and where we can provide feedback.”
AHS is using the information in its peer review database to create an anonymized teaching file so that exemplary cases are shared. “The reason we’ve been successful is we engaged the radiologists from day one and we let them tell us what works for them, what makes sense for them and how it would truly be an educational program,” she says.