The quality and ultimate reliability of clinical data collected from investigative sites during a study depend on many factors. One area that may not get enough attention is the cycle time from subject visit through eCRF data capture to final review and cleaning of that data. Transcription of source data from subject visits into the sponsor-provided eCRF system by site personnel can introduce long delays, and those delays allow fading memory to compromise proper interpretation of information originally entered into the patient chart. Once the eCRF data has been captured, further delays in review and querying of that data by sponsor personnel (e.g., data managers and site monitors) only exacerbate the problem.
The Medidata Insights metrics warehouse includes a number of metrics focused on these data capture and cleaning cycle times. The graph shows the industry trend for two of these cycle times in particular: the time from subject visit to eCRF entry, and the time from eCRF entry to generation of data manager queries.
This month's graph presents industry median values for Phase II and III studies from 2009 through 2012. The overall data capture and cleaning cycle time shows a welcome downward trend over this period, from 36 days in 2009 to 33 days in 2012, and it actually dipped below 30 days in 2011 before rising again. This supports the notion that organizations have put increased focus on these processes in recent years; in particular, we are seeing a steady decline in the time from subject visit to eCRF entry by sites.
The trend in time from eCRF entry to generation of data manager queries is especially interesting because it tracks the overall capture and cleaning cycle time closely: it trends downward in the years prior to 2012 and then back up in 2012. While not yet conclusive, this 2012 "bounce" is a likely driver of the similar upward bounce in the overall cycle time metric. What continues to amaze is the sheer length of this cycle time (right-hand axis on the graph), where the annual industry benchmark ranges from 59 to 89 days. A lag of that magnitude between eCRF entry and data management queries back to the site cannot be seen as having a positive influence on data quality.
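The two cycle times discussed above are simple elapsed-time medians between milestone dates. Medidata does not publish the Insights metric definitions, so the following is only a minimal sketch of how such benchmarks could be computed; the record structure and field names (`visit`, `ecrf_entry`, `dm_query`) are assumptions for illustration, not the actual Insights schema.

```python
from datetime import date
from statistics import median

# Hypothetical per-visit milestone records (illustrative data only).
records = [
    {"visit": date(2012, 3, 1), "ecrf_entry": date(2012, 3, 9),
     "dm_query": date(2012, 5, 14)},
    {"visit": date(2012, 3, 5), "ecrf_entry": date(2012, 3, 12),
     "dm_query": date(2012, 6, 1)},
    {"visit": date(2012, 4, 2), "ecrf_entry": date(2012, 4, 20),
     "dm_query": date(2012, 7, 6)},
]

def median_days(records, start_field, end_field):
    """Median elapsed days between two milestone dates across records."""
    return median((r[end_field] - r[start_field]).days for r in records)

# Cycle time 1: subject visit -> eCRF entry by the site.
entry_lag = median_days(records, "visit", "ecrf_entry")
# Cycle time 2: eCRF entry -> generation of data manager queries.
query_lag = median_days(records, "ecrf_entry", "dm_query")
print(entry_lag, query_lag)
```

A real benchmark would aggregate such medians per study and per year across the warehouse, but the core calculation is just date subtraction and a median.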
This brings up an ongoing, and sometimes heated, debate we have seen in clinical organizations regarding the proper cadence of sponsor team data reviews. There are essentially two schools of thought: one insists that data managers wait until site monitors have completed their source document verification (SDV) review of the eCRF data before conducting their own reviews; the other insists that data managers conduct their reviews as soon as the data is available in the eCRF system, regardless of the status of site monitor review. The current industry benchmark suggests that most organizations are still tied to the "monitors first" approach.
We believe that parallel data reviews are the best way to shorten this cycle time. What do you think?
—Medidata Solutions, www.mdsol.com/