Distinct RBM Divisions: Source Document Verification and Source Data Review

Article

Applied Clinical Trials

Source data verification (SDV) remains a concept that many sponsors, CROs and sites take to heart.



Source data verification (SDV) remains a concept that many sponsors, CROs and sites take to heart. The 100% SDV approach, in fact, is believed to have evolved over the years into a de facto FDA requirement. FDA itself admitted in its August 2011 Draft Guidance for Oversight of Clinical Investigations — A Risk-Based Approach to Monitoring that a CTTI study found companies typically conduct on-site monitoring visits at four-to-eight-week intervals because of the perception that the frequent on-site monitoring model with 100% SDV is the FDA’s preferred way for sponsors to meet their monitoring obligations. The FDA traces that perception back to its 1988, now-withdrawn guidance on monitoring of clinical investigations, which stated: “The most effective way to monitor an investigation is to maintain personal contact between the monitor and the investigator throughout the clinical investigation.” But as the updated guidance notes, at that time sponsors had few means of meaningful communication with investigators other than on-site visits.

Now, 25 years later, with technological advances and analyses suggesting that 100% SDV may not be the most effective data-monitoring tool, and that on-site monitoring at a fixed four-to-eight-week interval regardless of need may not be either, many in the industry are backing the recently minted guidances from FDA, EMA, MHRA and others, arguing that this model is financially unsustainable and doesn’t provide value.

Jeff Kasher, Vice President of the Development Center of Excellence at Eli Lilly, and TransCelerate BioPharma Operations Committee member, told Applied Clinical Trials that many sponsor companies have used different monitoring approaches, even before the FDA, EMA and MHRA guidance papers, and even before the early June release of TransCelerate’s Risk-Based Monitoring methodology paper. He said, “Lilly’s approach to monitoring has evolved over the years, from a 100% source data verification historically, and then utilizing different sampling methodologies.” This is not uncommon; many companies have published or presented their approaches to data monitoring and how they handle the SDV process within their company.

The TransCelerate methodology specifically addresses on-site, centralized, and off-site monitoring through its tools, which include risk assessment, risk indicators, and an integrated quality management plan approach. Used together, these tools support adjusting monitoring activities based on the issues and risks identified throughout the study, without compromising data quality.

To understand SDV and its importance to data quality, the TransCelerate RBM work stream analyzed nine sample studies from six member companies to determine the rate of queries identified via SDV as compared to all queries for a study, and then further assessed that data to determine what percentage of SDV-generated queries were found in Critical Data. Despite differences in the ways companies manage their data review activities, all companies had a low rate of SDV-generated queries. On average, SDV-generated queries accounted for 7.8% of the total number of queries generated, and SDV queries on Critical Data accounted for 2.4% of the total.

Mike Luker, Senior Advisor in Clinical Development Innovation at Eli Lilly, and RBM work stream member says, “If you look at all queries generated across these trials, queries come from different places. They can come from site monitors. They can come from data reviewers. They can come from a central data review process. And what this data says is that of all queries generated, only 7.8% across all data arise from the process of source data verification at sites. The overwhelming majority come from data managers doing central data review or various forms of data review.”

“So just a small percentage come from source data verification to begin with. Then, when you break that down further and say, of those queries that are raised by site monitors doing source data verification, how many actually impact what we would characterize as critical data as defined per the TransCelerate methodology? That number drops to 2.4%. For an activity that we spend a tremendous amount of time and money on, very little substantive impact, actually, is realized in terms of querying the data,” explains Luker.
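To make the arithmetic behind those two figures concrete, here is a minimal sketch of how such percentages could be computed from a query log. The records, field names and query sources below are hypothetical illustrations, not TransCelerate’s actual dataset.

```python
# Hypothetical illustration of the two percentages discussed above:
# the share of all queries raised through SDV, and the share raised
# through SDV against Critical Data fields. Records are invented.

queries = [
    # (query source, field queried, field is Critical Data?)
    ("sdv", "informed_consent_date", True),
    ("sdv", "visit_weight_kg", False),
    ("central_data_review", "ae_onset_date", True),
    ("data_management", "concomitant_med_dose", False),
    ("central_data_review", "lab_result_flag", False),
]

total = len(queries)
sdv_queries = [q for q in queries if q[0] == "sdv"]
sdv_critical = [q for q in sdv_queries if q[2]]

print(f"SDV-generated queries: {100.0 * len(sdv_queries) / total:.1f}% of all queries")
print(f"SDV queries on Critical Data: {100.0 * len(sdv_critical) / total:.1f}% of all queries")
# In the TransCelerate sample studies, these averaged 7.8% and 2.4% respectively.
```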

TransCelerate’s paper makes a distinction between SDV and Source Data Review (SDR). SDV is the process by which data within the CRF or other data collection systems are compared to the original source of information (and vice versa) to confirm that the data were transcribed accurately.
SDR involves the review of source documentation to check quality, review protocol compliance, ensure that the Critical Processes and source documentation are adequate, check investigator involvement and site staff duties, and assess compliance with other areas (e.g., SOPs, ICH GCPs). SDR evaluates areas that are not associated with a CRF data field or other system.
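As a rough sketch of that distinction, the example below contrasts the two activities: SDV is a field-by-field comparison of collected values against their source, while SDR is a judgment-based review of the source documentation and site conduct with no corresponding CRF field. The field names and review items are hypothetical illustrations, not taken from the TransCelerate paper.

```python
# Hypothetical sketch contrasting SDV and SDR.

def sdv_check(crf_record: dict, source_record: dict) -> list[str]:
    """SDV: flag CRF fields whose values do not match the source document."""
    return [
        field
        for field, crf_value in crf_record.items()
        if source_record.get(field) != crf_value
    ]

# SDR has no CRF counterpart: it is a review of the source documents and
# site processes themselves. Example review items (illustrative only):
SDR_REVIEW_ITEMS = [
    "Source documentation is adequate and of sufficient quality",
    "Critical Processes were followed per protocol",
    "Investigator involvement and site staff duties are documented",
    "Conduct complies with SOPs and ICH GCP",
]

crf = {"systolic_bp": 128, "visit_date": "2013-10-02"}
source = {"systolic_bp": 138, "visit_date": "2013-10-02"}
print("SDV discrepancies:", sdv_check(crf, source))  # -> ['systolic_bp']
```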

TransCelerate maintains in the paper that the distinction between SDV and SDR lets companies prioritize the high-value task of compliance checking and de-prioritize the low-value task of checking for transcription errors. Based on risk level, as well as available technology, different levels of SDV and SDR may be proposed, facilitating a more focused response.
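One way to picture such risk-based tiering is a simple mapping from an assessed risk level to the extent of SDV and SDR planned for a site or study. The tiers and activity descriptions below are invented for illustration and do not reproduce the paper’s recommendations.

```python
# Hypothetical mapping from assessed risk level to SDV/SDR intensity.
# Tiers and activity descriptions are illustrative only.

MONITORING_PLAN = {
    "low":    {"sdv": "Critical Data fields only",
               "sdr": "periodic review, remote where possible"},
    "medium": {"sdv": "Critical Data plus a targeted sample of other fields",
               "sdr": "review at each on-site visit"},
    "high":   {"sdv": "broad sampling across all data",
               "sdr": "expanded review at each on-site visit"},
}

def monitoring_plan(risk_level: str) -> dict:
    """Return the SDV/SDR activities proposed for the given risk level."""
    return MONITORING_PLAN[risk_level]

print(monitoring_plan("medium"))
```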

Technology Requirements
The RBM methodology is now in the hands of eight member companies that are in the pilot implementation stage. Feedback on the overall methodology will be gathered after the pilots, and part of that feedback concerns the system requirements. Current and updated system requirements, however, will not name the specific software products or services that have begun, or are beginning, to pepper the market.

“We’ll get a better understanding of what technology capabilities are superior or what capabilities we need,” said Kasher. “As we go through the pilots, we’ll be working with a multitude of CROs who we partner with to run our trials.”

“There’s a lot of competition out there among CROs and other companies to bring the tools and the capabilities to help with centralized risk-based monitoring, both from the algorithm standpoint, and the visualization capabilities,” continued Kasher. “This is a good example of where we’re going to see competition and innovation that will lead to solutions. And those companies that can offer the best capabilities will gain a competitive advantage and be in the best position to win business from trial sponsors going forward.”

Luker said the table in section 8.1.6 of the paper outlines the system requirements and preferred system attributes that TransCelerate believes a sponsor needs to implement RBM effectively. “As we move through our pilot and get into the practical application of the methodology to trials, we’ll learn about what we really need and what we thought we needed that we don’t need. I think you’ll see that refined through an update to this table in version two of the paper.” Version two of the paper is due in early 2014.
