Achieving Optimal Patient Data Quality: All Hail the Auto-Queries

October 17, 2013

Applied Clinical Trials

Medidata

In previous reports, Medidata has presented a metric showing that, as an industry average, significantly less than 3% of all electronic case report form (eCRF) data is ever updated after it is initially entered by investigative sites.  This reveals the strikingly low impact, or "return on investment," that all of the data reviews conducted by the study team, including 100% source document verification (SDV), have on the quality of the data.  Further evidence to this point was recently published in the TransCelerate BioPharma position paper on risk-based monitoring.  According to TransCelerate's analysis, only 2.4% of all eCRF data queries were SDV-driven queries focused on critical data points (i.e., data supporting critical efficacy and/or safety analyses).

Here the analysis has been taken a step further to determine the relative contribution of different query sources to the total eCRF data cleaning effort, by assessing the percentage of data corrections driven by four standard query types:

  • Auto-queries (i.e., queries issued automatically based on programmed or configured electronic data capture (EDC) edit checks)

  • Queries issued by site monitors (typically through SDV-related data reviews)

  • Queries issued by data management personnel

  • Queries issued by study team stakeholders other than the three above (e.g., medical/safety reviewers, biostatisticians, etc.)

The accompanying graph is based on data from the Medidata Insights™ metrics warehouse, which currently includes operational metrics from over 6,000 studies contributed by more than 100 sponsor organizations. 

Almost half of all data corrections are driven by system-generated auto-queries.

Close to half (48.8%) of all data corrections are driven by system-generated auto-queries.  In other words, nearly half of all eCRF data cleaning is accomplished through the site's interaction with the EDC system, prior to any intervention from site monitors, data managers, or other study team data reviewers.  Most of the remaining data cleaning is split between data management and site monitor queries.

Many professionals in clinical research have questioned the value of configuring so many edit checks into the EDC system for each new study, noting that doing so takes significant effort and that many checks never "fire" as auto-queries.  Both points are true, but the result above makes it clear that the overall up-front investment in configuring EDC edit checks is well worth the downstream benefit.  Even the checks that never fire can be thought of as a very low-cost insurance policy supporting optimal data quality.
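To make the mechanism concrete, the sketch below shows one way a programmed edit check might be evaluated at data entry, firing an auto-query when a value fails validation. This is an illustrative simplification, not Medidata's implementation: the field names, check definitions, and query texts are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical sketch of an EDC edit-check engine. Field names,
# ranges, and query wording below are illustrative assumptions.

@dataclass
class EditCheck:
    field: str                        # eCRF field the check applies to
    predicate: Callable[[str], bool]  # returns True when the value passes
    query_text: str                   # auto-query text shown to the site

def run_edit_checks(record: Dict[str, str],
                    checks: List[EditCheck]) -> List[str]:
    """Return the auto-queries that 'fire' for one eCRF record."""
    fired = []
    for check in checks:
        value = record.get(check.field)
        if value is not None and not check.predicate(value):
            fired.append(f"{check.field}: {check.query_text}")
    return fired

# Two illustrative checks: a numeric range and a date format.
checks = [
    EditCheck("systolic_bp",
              lambda v: v.isdigit() and 60 <= int(v) <= 250,
              "Value outside expected range (60-250 mmHg); please verify."),
    EditCheck("visit_date",
              lambda v: len(v) == 10 and v[4] == "-" and v[7] == "-",
              "Date must be in YYYY-MM-DD format."),
]

record = {"systolic_bp": "400", "visit_date": "2013-10-17"}
queries = run_edit_checks(record, checks)
# Only the blood-pressure check fires for this record.
```

The point of the sketch is that the site sees and resolves such a query immediately at entry, before a monitor or data manager ever reviews the form, which is how auto-queries can account for nearly half of all data corrections.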

Medidata will continue to reveal trends and analyses conducted using the Medidata Insights tool and is, as always, very keen to hear from you on this topic.  Is this the result you expected?

—Medidata Solutions, www.mdsol.com