MCC has just published an executive summary of its Risk Based Monitoring Usage Survey. We’ve also just released our Metrics Search Engine that allows you to find the right metrics for your situation and level of expertise. Check out our web site www.metricschampion.org for details. In April, we looked at eCRF data entry cycle time as a useful RBM metric. This month, let’s look at another metric that’s very important in a Risk-Based Monitoring (RBM) environment: Audit Findings per Site.
Why this metric is important: Audit findings are a good way to identify sites that may be experiencing quality problems – or may have experienced them in the past – and thus should be monitored more closely than other sites. Hence its value as an RBM metric. Audit findings can also be used to identify problems in the protocol, eCRF or site training that should be remedied – either for this protocol or for future protocols.
Definition: The metric can be calculated in two ways:
How to calculate this metric:
Fewer than 2 major/critical findings per site (on average) is a reasonable target for this metric.
Example: In the table below, we’ve listed five sites that were audited.

Site    Major/Critical Findings    Site Audit Score    Relative Site Audit Score
1       0                          0                   0.00
2       0                          0                   0.00
3       2                          1                   0.56
4       3                          4                   2.22
5       6                          4                   2.22

Sites 1 & 2 each had no major/critical audit findings, so each received a Site Audit Score of 0. Site 3 had 2 major/critical audit findings, so received a Site Audit Score of 1. Sites 4 & 5 had 3 and 6 major/critical audit findings respectively, so each received a Site Audit Score of 4.
The Average Audit Score for the 5 sites was calculated as 1.8 [(0+0+1+4+4)/5].
Finally, the Relative Site Audit Scores were calculated by dividing each Site Audit Score by the Average Audit Score. Sites 4 & 5 have Relative Site Audit Scores above 1.0, so should be looked at more closely.
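The calculation above can be sketched in a few lines of Python. Note that the bucketing of finding counts into Site Audit Scores (0 findings → 0, 1–2 findings → 1, 3 or more → 4) is inferred from the worked example, not an official MCC definition, so treat those thresholds as an illustrative assumption.

```python
# Sketch of the Site Audit Score calculation from the example above.
# The score buckets are inferred from the example (an assumption):
#   0 findings -> 0, 1-2 findings -> 1, 3+ findings -> 4.

def site_audit_score(major_critical_findings):
    """Map a site's count of major/critical audit findings to a score."""
    if major_critical_findings == 0:
        return 0
    if major_critical_findings <= 2:
        return 1
    return 4

def relative_site_audit_scores(findings_per_site):
    """Return (scores, average score, score/average for each site)."""
    scores = [site_audit_score(n) for n in findings_per_site]
    average = sum(scores) / len(scores)
    return scores, average, [s / average for s in scores]

# The five sites from the example had 0, 0, 2, 3 and 6 findings.
scores, average, relative = relative_site_audit_scores([0, 0, 2, 3, 6])
print(scores)                            # [0, 0, 1, 4, 4]
print(average)                           # 1.8
# Sites with a relative score above 1.0 warrant closer monitoring.
print([round(r, 2) for r in relative])   # [0.0, 0.0, 0.56, 2.22, 2.22]
```

The relative scores reproduce the table: Sites 4 & 5 come out well above 1.0 and would be flagged for closer monitoring.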
What you need in order to measure this: All you need to calculate this metric is the number of major/critical findings for each site audited.
What makes performance on this metric hard to achieve: Monitors must determine the root causes of the audit findings and remedy them. If the root causes are not remedied, the problems that led to the audit findings will likely recur.
Things that you can do to improve performance: Once you are tracking this metric, if problems are occurring at individual sites:
If problems are occurring across the study protocol (i.e., the Average Audit Score is high), then the same steps as above should be applied at the study level (rather than the site level).
Companion metrics: Other metrics that you should consider in tandem with this metric include: (1) the MCC Site Quality metric and its related tracking tool, (2) site query response times and (3) other RBM metrics such as those developed by TransCelerate BioPharma.
Dave Zuckerman, CEO, Metrics Champion Consortium, [email protected]
Linda Sullivan, COO, Metrics Champion Consortium, [email protected]