Comparing Risk-Based Monitoring and Remote Trial Management vs. SDV


Applied Clinical Trials

A new RBM method used in a PaxVax trial proves successful versus onsite source data verification for trial oversight.

Five years ago, the FDA and EMA released final guidance to change trial oversight methodology from onsite visits using source data verification (SDV), the gold standard for more than 30 years, to a risk-based monitoring (RBM) approach.1,2

Implementing this guidance created two daunting challenges to reconcile:

  • No standard definition of RBM or standard way to implement it exists; the myriad definitions and implementation approaches correspondingly represent different levels of effectiveness in identifying “errors that matter.”

  • No head-to-head comparisons of different trial oversight methods exist.

This lack of scientific data on trial oversight effectiveness is a critical unmet clinical research need. It affects more than 100,000 research participants per year and their health care providers. 

This paper presents a prospective analysis comparing the effectiveness of traditional SDV versus one method of RBM (i.e., the MANA Method). We specify the particular RBM method used herein because RBM definitions and implementation methods are inconsistent and the different RBM approaches vary in effectiveness.

Research methods

PaxVax conducted a Phase IV vaccine trial in approximately 500 subjects at nine U.S. sites.  

The study was conducted using Electronic Data Capture (EDC). The study was approved by an Institutional Review Board (IRB) and each subject signed an IRB-approved informed consent prior to participating. Subjects received one dose of study vaccine. Subjects recorded any changes in health for nine days in a paper diary aid and sites entered the results into the EDC. Each research site maintained its own informed consents and site regulatory binders.

Site monitors visited the research sites monthly and spent approximately 72 days onsite conducting SDV of the trial data. PaxVax’s senior management team (i.e., Medical Monitor, Sr. Director Biostatistics, Director of Data Management and Statistical Programming) reviewed the data monthly to identify trends or data errors that would be followed up by the site monitor.  

MANA RBM modified its risk-based monitoring and remote trial management system (i.e., the MANA Method) to initiate an RBM approach for this study, beginning after 5.5 months of trial conduct (approximately 500 subjects already enrolled). Full implementation of the MANA Method would also have included additional trial oversight and remote document review, including informed consents and site regulatory documents.

Note: MANA RBM’s risk-based monitoring approach is a patent-pending, data-driven, scientifically focused, systematic, remote approach to trial oversight called the MANA Method. This approach, conducted independently of the EDC, synthesizes data across data sets and data sources to review the protocol-specific high-risk data and processes identified during a proprietary risk assessment service. This review focuses on how analysis and safety data are collected (i.e., process) in addition to the actual data for analysis. Integrated remote subject review starts within days of subject visits and includes rapid trend analysis of site performance to identify and correct systematic errors quickly.

In this pilot study, MANA RBM independently and remotely reviewed the existing trial data available electronically to determine whether errors and trends could be identified faster and more comprehensively than with the traditional SDV method. Analysis of informed consents, regulatory documents, and source documents was not included in this pilot study because those documents were not available electronically.

MANA RBM first conducted a proprietary risk assessment based on the protocol. It then designed proprietary, study-specific reports and data visualizations to evaluate the high-risk data and processes identified during the risk assessment. The basic categories included: efficacy endpoints, safety assessments, investigational product (IP) management, and human subjects’ protection.

Trial data was imported from the EDC platform into JReview, hosted by Integrated Clinical Systems, Inc. MANA RBM designed its proprietary Subject Profile Analyzing Risk (SPAR) tool to provide an integrated visualization of the high-risk data for each subject over time and trained the remote monitors in its use. SPAR configuration is unique for each trial based on the critical issues identified during the risk assessment. Additional proprietary, custom reports were also developed to support protocol-specific analysis of high-risk data and processes and trends.

All review was performed independently of the EDC system and based on MANA RBM data analytics. Results of the review were captured in a separate, proprietary MANA RBM Site Tracker Analyzing Risk database (STAR); MANA RBM developed this tool to conduct study quality oversight. Subject review was documented in JReview.

MANA RBM conducted review using its remote quality management approach as shown in Figure 1. The MANA Method splits the review process into tiers. Remote site monitors focus on subject review and high-risk data and process oversight at the subject level. Central review focuses remote review on trend analysis by evaluating data across subjects at a site and across sites.

            

Figure 1. MANA Method for Remote Quality Review

The pilot study compared SDV versus the MANA Method in the following areas:

1) Identifying major deviations

2) Queries raised as a result of SDV

3) Identifying trends in data affecting trial conduct and/or results

4) Timing of the subject review

5) Resource use

 

Results

Risk assessment and development of protocol-specific reports
MANA RBM conducted the risk assessment and implemented the SPAR within two weeks of uploading the data into JReview. Additional custom reports were developed over eight weeks. These reports included customized, cross-database reports and trend analysis of high-risk data and processes.

Subject review
Once the SPAR was available, reviewers began reviewing the data immediately. MANA RBM split the subject review: an experienced monitor reviewed half of the subjects during the first month and a data reviewer, new to subject review, reviewed the other half during that month. The following month, the reviewers switched subject assignments to allow evaluation of oversight by remote monitors with different training and experience. The lead monitor performed quality control (QC) oversight of each remote monitor to provide immediate feedback on missed items or documentation corrections.

Identification of protocol deviations: MANA RBM’s remote site monitors identified critical deviations using the SPAR and accompanying high-risk reports. The MANA Method identified critical deviations not previously identified by the sponsor’s onsite monitors.

Speed of identification: Using remote methods, the monitoring team could have identified deviations faster and earlier than using SDV and onsite visits. Within six weeks, two rounds of review of all critical subject data were completed and all deviations for critical data were identified.

Categorization of deviations: Differences in classification of deviations as major or minor were identified between the MANA RBM remote monitoring team and the onsite monitors, which made comparing the total numbers of deviations challenging. Nevertheless, total numbers of deviations were similar, and there were no major deviations discovered by the sponsor’s onsite monitors that the MANA Method did not also identify remotely.

Source document review
This study was conducted with paper memory aids and transcription by the research sites. Since this was a pilot study, sites were not asked to convert the paper memory aids to certified copies, which would have allowed remote review.

To evaluate whether there were findings that the MANA Method would not have been able to identify without onsite visits or eDiaries, MANA RBM reviewed the queries related to subject diaries generated during the study. The MANA RBM team identified 300 queries associated with source document verification. Table 1 shows the distribution of the queries and illustrates how remote review would have identified all critical findings had eSource or certified copies of the diary aids been used. Important data are defined as data that would affect subject safety or analysis of efficacy.

 

Table 1.  Details of Source Document Review 

 

MANA RBM remotely reviewed the source-review query rates for important data and found that two sites had much higher query rates (i.e., 2-10 times the rates of the other sites), as shown in Table 2. This information, if known to the sponsor, would have allowed it to determine the need for continued onsite SDV and, if needed, focus SDV, additional training, or other strategic considerations on only two sites instead of all sites. A minimal sketch of this type of comparison follows Table 2.

Table 2. Variability in queries of important data identified by onsite monitors 
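The sketch below (Python, using pandas) illustrates the kind of per-site comparison described above; the column names and numbers are hypothetical, not the actual PaxVax data or the proprietary MANA RBM reports. It computes an important-data query rate per site and flags sites whose rate is at least twice the median, the pattern reported for the two outlier sites.

# Illustrative sketch only: hypothetical data, not the PaxVax trial data.
import pandas as pd

queries = pd.DataFrame({
    "site_id":           [11, 14, 19, 22, 27, 31, 35, 40, 44],
    "subjects_reviewed": [62, 55, 58, 60, 51, 57, 54, 50, 53],
    "important_queries": [ 4,  3,  5, 21,  3,  4, 18,  2,  3],
})

# Query rate for important data, per site.
queries["query_rate"] = queries["important_queries"] / queries["subjects_reviewed"]
median_rate = queries["query_rate"].median()

# Flag sites whose rate is at least twice the median; such sites could receive
# focused SDV, additional training, or other targeted follow-up.
flagged = queries[queries["query_rate"] >= 2 * median_rate]
print(flagged[["site_id", "query_rate"]])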

MANA Method central review and trending
Central review and trending were conducted in addition to subject review. This review occurred during the second and third months of the pilot study using proprietary reports designed specifically for the high-risk areas identified in the study’s protocol. From this review, MANA RBM central monitors recognized several trends that could have significantly impacted this study, as follows:

1) Deviation evaluation identified at least one trend that could have enabled more evaluable subjects. One site had a higher rate of out-of-window visits. While not usually considered a major deviation, the timing of this critical visit represented the collection point for primary efficacy data. The MANA Method would have identified and corrected this error sooner, leading to more evaluable subjects. Onsite monitors did not identify this issue. The PaxVax Senior Clinical Research Management Team identified this site deviation at its monthly review meeting, while the MANA RBM reviewers discovered it immediately upon performing central review.

2) Vital signs evaluation identified one site that had issues with collecting vital signs, specifically manual temperatures (Table 3, Site 27 highlighted). Analytics identified this issue using differences in mean values and a scattergram of actual values. This indicated a process issue that could have a significant impact on future studies, where immediate measures of temperature elevation after an IV injection could be under-reported. Only the MANA Method remote central monitoring approach identified this issue.

Table 3. Temperature Irregularities Identified by MANA Method

3) Incomplete dosing represented another area where performance varied across sites. Since sites “batch” their dosing for vaccine trials (i.e., enroll large groups of subjects over a few days), identifying this issue rapidly may have increased the number of subjects who took the complete dose. The PaxVax Senior Clinical Research Management Team noted this issue at its monthly meeting. The MANA Method central monitors noted it immediately upon review. Onsite SDV did not identify this issue.

4) Variability in reporting of adverse events of special interest occurred across sites. While it was not clear whether an issue existed, this was a trend that should have been evaluated to understand the processes by which this critical assessment was being conducted. Only the MANA Method central monitoring approach identified this finding. In Table 4, all sites with at least 15 subjects enrolled were compared on the severity of the adverse events of special interest using Z scores (i.e., the number of standard deviations from the mean). One site routinely ranked subjects either lowest or second lowest in severity across the eight reported events, while another site routinely ranked subjects at a higher severity. A sketch of this Z-score comparison follows Table 4.

 

Table 4. Variability in rates of reporting of Adverse Events of Special Interest (Solicited Events) across sites based on sites with at least 15 subjects.
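The Z-score comparison behind Table 4 can be illustrated with a short sketch (Python, using pandas). The site-level mean severity grades and the two events below are hypothetical; the actual study compared all eight adverse events of special interest across sites with at least 15 subjects. The sketch standardizes each site's mean severity within each event and averages the Z scores per site, so a site that consistently grades low (or high) stands out.

# Illustrative sketch only: hypothetical mean severity grades per site and event.
import pandas as pd

ae = pd.DataFrame({
    "site_id":  [11, 11, 14, 14, 22, 22, 27, 27, 35, 35],
    "event":    ["fever", "headache"] * 5,
    "severity": [1.1, 1.3, 0.4, 0.5, 1.0, 1.2, 1.9, 2.1, 1.2, 1.1],
})

# Z score of each site's mean severity relative to all sites, computed per event.
ae["z"] = ae.groupby("event")["severity"].transform(lambda s: (s - s.mean()) / s.std())

# Average Z score per site: a site consistently near the bottom (or top) across
# events is a trend worth investigating as a possible difference in grading process.
print(ae.groupby("site_id")["z"].mean().sort_values())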

 

 

Findings not requiring action

1) The early termination rate was higher at one site than at the others. The reasons for early termination were not different across sites. No action was recommended at that time.

Review timing
The MANA Method enabled remote, comprehensive subject review of the high-risk data and processes to begin within two weeks of starting the project. No minimum amount of data was required to begin review once a subject’s visit data were entered.

Central trend analysis began approximately two weeks after remote subject review and identified additional data errors that could be corrected quickly. This rapid review could have eliminated errors in several aspects of study conduct as follows:

  • Large number of out-of-window visits for critical assessments at one site

  • Large number of incomplete doses at two sites

  • Confusion about the definition of diarrhea versus loose stools across sites (approximately 85 queries)

  • Errors in manual temperature measures

Errors identified early facilitate site retraining, thus reducing the future workload for sites and study staff.

Resource use
The sponsor assigned eight months of resources to the study as follows:

  • 1.75 FTE data manager (DM), a lead DM, and a programmer (study conduct only)

  • 3 FTE monitors, including a lead monitor (72 days of onsite monitoring)

  • Senior management: four senior managers met monthly for four hours. Prep time for the meetings was estimated at 40 hours per meeting. A second monthly meeting reviewed deviations and took approximately 10 hours of senior management and data management resources.

MANA RBM used the following resources:
Design, build, and validate study-specific reports in JReview: 2 FTEs for two months.

The reviewer (“monitoring”) resources were much smaller than those used in a traditional trial, as follows:

  • 1 Data Reviewer (100 hours); review time averaged seven minutes/subject

  • 1 Monitor (100 hours); review time averaged seven minutes/subject

  • Central Monitor (analysis) (20 hours)

  • Quality Control (QC) of Monitor and DM performance (20 hours).

Time savings occurred in three areas

  • Onsite Monitoring: Comparing the time to conduct subject safety oversight (onsite SDV versus remote monitor review) showed a savings of at least 83% of monitoring time; only the onsite monitoring time was used for this comparison (worked arithmetic follows this list).

  • Data Management Review: Data oversight took 100 hours versus 1.75 FTE (630 hours for two months). When data are cleaner, errors are corrected earlier, and central oversight identifies the critical data trends, the time to raise and close queries decreases significantly. In addition, when central monitoring oversight was used, the time to create the materials for senior management review (40 hours per month) could have been significantly decreased.

  • Senior Management Review: The 20 hours of central monitor review would have saved senior management over 60 hours per month, or approximately 1.5 weeks of senior management time per month.
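The arithmetic behind these figures can be reproduced from the numbers reported above, under one stated assumption not made explicit in the article: that an onsite monitoring day is roughly eight hours. The short Python sketch below is illustrative only and uses the counts given in this article.

# Worked arithmetic for the reported savings, using the figures in this article.
# Assumption (not stated in the article): an onsite monitoring day is ~8 hours.
onsite_hours = 72 * 8            # 72 days of onsite SDV, approximately 576 hours
remote_monitor_hours = 100       # remote monitor subject review (onsite comparison only)

monitoring_savings = 1 - remote_monitor_hours / onsite_hours
print(f"Monitoring time savings: {monitoring_savings:.0%}")          # approximately 83%

# Data management comparison: 100 hours of remote data oversight versus
# 1.75 FTE for two months (630 hours, as reported above).
dm_hours_traditional = 630
dm_hours_remote = 100
print(f"Data management hours saved: {dm_hours_traditional - dm_hours_remote}")  # 530 hours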

 

Discussion
Increased quality, lower cost, and faster review times (including earlier detection of problems) represent the holy grail of trial oversight. The dogma was that you could only achieve two of the three. Using the MANA Method for remote trial oversight in this pilot study confirmed this is no longer true.  

1. Quality: The MANA Method identified issues not seen using SDV. Its review focused on “errors that matter” that could affect trial outcome, not just traditional SDV point-to-point checking or identifying only data that did not conform to expectations (e.g., out-of-range values). Central (cross-subject/cross-site) and remote subject review identified specific site actions that could be corrected rapidly, increasing the number of evaluable subjects and lowering the overall burden of trial management.

A second quality benefit of this RBM approach was the ability to perform and document QC remotely on each monitor’s and data reviewer’s performance. This provided enhanced oversight not possible when all or most activities occur at the research site. In 2013, MANA RBM reported on using remote review to perform a 10% QC review of informed consents on 788 subjects across 12 sites. This review took two days and required no travel.3

PaxVax senior management spent a significant amount of time evaluating trends, which the MANA Method identified faster and with fewer resources as part of routine monitoring and trial oversight. Many companies do not have the resources, or do not make the commitment PaxVax made, to oversee a trial at this level. These findings confirmed that the MANA Method provided a cost-effective alternative for allocating senior management resources efficiently.

Using the MANA Method, monitors/data managers understood the critical data and processes and how they should be evaluated based on the Data and Document Review Guidelines. Instead of reviewing the subject’s data in the eCRF (whether doing transcription checking or just reviewing the eCRF), the MANA Method allowed more comprehensive oversight of each subject’s data in context (i.e., across multiple data sets) and over time. This powerful approach identified errors in process that were not obvious when the review focused only on out of range values, transcription errors, or missing data.

2. Time: The MANA Method meets the RBM regulatory guidance for rapidly reviewing critical data. The main tool used for subject review, MANA RBM’s Subject Profile Analyzing Risk (SPAR), was built and deployed within two weeks of data upload into JReview, allowing comprehensive subject data review immediately after data entry.

While not possible in this pilot, when the MANA Method is implemented from the beginning of the trial, actual time to subject review and time to identification of major issues could be calculated, delivering oversight in days rather than waiting for an onsite visit.

This illustrates how overall monitoring time can be greatly decreased. Instead of selecting a subset of subjects or a subset of data for SDV, every subject’s critical data can now be reviewed without impacting overall study costs. Rapid, comprehensive review can also occur when new data are added, without significantly impacting costs. There is no “critical amount” of data needed to perform subject review; the data from a single subject visit are sufficient to start. These findings align with the data MANA RBM previously published on the speed of using the SPAR to conduct subject review.4

Once the complete set of MANA RBM protocol-specific reports was designed, developed, and validated, the actual review process was significantly shorter and was performed remotely. This offers tremendous potential savings for studies, such as oncology trials, that currently require onsite visits to review subject data, even for a single subject.

Time savings were not restricted only to monitoring time. Using the MANA Method, site monitoring savings were at least 83%, data management time savings could have exceeded 40 hours per month, and senior management time savings could have exceeded 60 hours per month.

3. Cost: This approach should be, at a minimum, cost neutral. Cost savings can be significant depending on how the entire study is designed and implemented.

Any cost comparison of methods should include total costs for trial oversight. With better oversight by the monitors, data are corrected faster, saving site time and increasing the number of evaluable subjects. In addition, this pilot demonstrated that internal senior management time can be saved when the MANA Method is used to ensure cleaner data and identify critical issues earlier.

Using an electronic Investigator Site File (eISF) and certified copies of informed consents and other source documents would have enabled complete remote review because all documents would have been available remotely. Clinical trial associates can perform many tasks to manage the regulatory binders (i.e., ensuring complete and correct documents) and informed consent review, adding to cost savings. While the eISF and remote informed consent review were not used in the pilot, these tools can save additional resources and enable more comprehensive remote review.

Employing ePRO/eDiary in this study would have also yielded significant cost savings as discussed below. If eDiaries had been used, with eConsent (or certified copies of paper informed consents and subject diaries) and eISF, the number of onsite visits could have been significantly decreased.

The importance of eSource and eConsent
eSource and eConsent provide several benefits for RBM and remote trial management. Most companies incorrectly assume a change to their EDC is necessary to add these tools. eSource can be implemented using EDC with direct data entry or with a system designed to be used on a tablet. The benefits include:

  • Meets the ICH E6(R2) and eSource ALCOA data requirements (i.e., Attributable, Legible, Contemporaneous, Original, Accurate)

  • Immediate access to data for review

  • Collecting the data needed to document study processes, not just the clinical data needed for analysis  

  • Identifying errors at the user level based on audit trail or documentation of who performed assessments, rather than just at the site level, allowing for more focused remediation

  • Providing immediate feedback to the person conducting the assessment through instructions and queries that identify data not conforming to expectations (e.g., an implausibly low height because the value was collected in inches but entered into a field expecting centimeters)

  • Using the audit trail to identify data not entered contemporaneously according to the protocol but instead entered post hoc (a sketch of such a check follows this list)

  • Providing a complete source record for each subject 

  • Allowing remote QC of monitor/data management performance because all subject data are available for review.
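As one concrete illustration of the audit-trail point above, the Python sketch below flags entries made well after the assessment date, i.e., not contemporaneously. The record layout, field names, and three-day threshold are assumptions for illustration, not the schema of any particular EDC or eSource system.

# Illustrative sketch: flag non-contemporaneous data entry from an audit-trail export.
from datetime import date

# Hypothetical audit-trail records: when an assessment was performed,
# when its value was entered, and by whom.
audit_trail = [
    {"subject": "101-001", "field": "temperature", "user": "coordinator_a",
     "assessed": date(2018, 3, 5), "entered": date(2018, 3, 5)},
    {"subject": "101-002", "field": "temperature", "user": "coordinator_b",
     "assessed": date(2018, 3, 5), "entered": date(2018, 3, 19)},
]

MAX_LAG_DAYS = 3  # assumed expectation for contemporaneous entry

# Entries whose entry date lags the assessment date by more than the threshold.
late = [r for r in audit_trail
        if (r["entered"] - r["assessed"]).days > MAX_LAG_DAYS]

# Summarizing by user (not just by site) supports the focused remediation
# described in the bullet above.
for r in late:
    lag = (r["entered"] - r["assessed"]).days
    print(r["subject"], r["field"], r["user"], f"{lag} days late")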

Using eSource provides significant cost savings. For the 500+ subjects in this study, using an eDiary would have eliminated the need for sites to enter 20,000 data points from memory aids (assuming 40 items/subject and five seconds of data entry per item), for monitors to visit the sites to review those 20,000 data points (five seconds per item), and for an estimated 500 queries (2.5% error rate, 15 minutes per query). This one change could have saved, conservatively, 179 hours of study staff time (over four weeks of work), not including costly monitor travel time or the increased frequency of visits required to review these critical data.
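The 179-hour estimate follows directly from the figures quoted above; the short Python calculation below reproduces it and yields roughly 180 hours, consistent with the conservative figure reported.

# Arithmetic for the eDiary savings estimate, using the figures quoted above.
subjects = 500
items_per_subject = 40
entry_seconds_per_item = 5        # site transcription time per diary item
review_seconds_per_item = 5       # onsite monitor review time per item
query_rate = 0.025                # 2.5% of items generate a query
minutes_per_query = 15

data_points = subjects * items_per_subject                        # 20,000 items
entry_hours = data_points * entry_seconds_per_item / 3600         # ~27.8 hours
review_hours = data_points * review_seconds_per_item / 3600       # ~27.8 hours
queries = data_points * query_rate                                # ~500 queries
query_hours = queries * minutes_per_query / 60                    # ~125 hours

total_hours = entry_hours + review_hours + query_hours
print(f"Estimated staff time saved: {total_hours:.0f} hours")     # ~181 hours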

For eConsent, additional benefits include:

  • Immediate access to the informed consent forms (ICF) for remote review

  • Assuring the correct ICF version was used

  • Importing the date/time of the ICF signature into the EDC/eSource system. This can be a triggering event to activate the EDC/eSource and assure that no assessments were done before completing the ICF. This feature is not available in all systems.

  • Eliminating many edit checks and queries based on determining the time of informed consent (if import of date/time into the EDC was used)

  • Providing additional documentation of the process of obtaining informed consent

  • Remote audits of informed consents

Using certified copies of paper informed consents and paper subject source data, such as diaries, provides an intermediate alternative to eConsent and eSource and facilitates rapid remote review.

 

The Importance of Central Review and Trending
While MANA RBM remote site monitors found important deviations using subject review, the central review process was invaluable in identifying the critical findings discussed in this paper. 

Reviewing trends allowed the MANA RBM team to identify sites having problems with scheduling patient visits, dosing according to the protocol, methods for collecting vital signs, and rating differences. While not necessarily critical findings in isolation, these issues can affect trial outcomes if left to compound over time. Investigating critical data and process findings represents the core of RBM principles.

Oversight should be focused on “errors that matter,” which include processes in addition to analysis data. Trend analysis is critical because trends indicate systemic issues with those data and processes. These types of issues cannot be identified by SDV or even remote eCRF review. Only through using more scientific, data-driven, systematic approaches can important findings be identified, evaluated, and corrected. 

Protocol-Specific Analysis
It is notable, for many reasons, that many RBM models incorporate SDV as their method for quality oversight, albeit with fewer fields reviewed than under the previous 100% SDV standard that preceded the FDA, EMA, and ICH guidances. One problem reported in the Kunzi et al. paper is echoed by others: monitors, although instructed to do less SDV, are concerned that they do not have a good grasp of the subjects when doing anything less than 100% SDV and will, therefore, perform 100% SDV regardless of the monitoring plan. This negated any anticipated RBM cost savings and required longer site visits.5

Our data conflict with the perceptions published by Kunzi et al., who reported that 58% of European monitors experienced in RBM thought important protocol violations were missed using RBM.5 The MANA Method remotely identified all critical deviations discovered by onsite monitors.

In addition, the MANA Method allowed the monitors to know exactly what the important data were and how to efficiently review all critical data in minutes, while providing more effective oversight than traditional SDV.

Sponsor opportunities
These data demonstrate the potential opportunities for enhanced trial oversight using remote, systematic, data-driven, analytic methods focused on the data that matter (i.e., data affecting trial analysis, subject safety, IP management, and human subject protection). These approaches use fewer resources at a lower cost and can be adopted without increasing study budgets; in many cases, study budgets are lower. More importantly, trial quality is improved and sponsors know immediately about the issues that can affect the study, study participants, and regulatory submissions.

Just as sound research methods are the hallmark of pharma, biotech, device, and vaccine discovery efforts, sponsors now have the opportunity and the responsibility to apply sound, quality-based, research methods and tools to the clinical research they conduct. As clinical research professionals, it is our responsibility to embrace improved methods for quality oversight and not be complacent and continue to perform trials “as we have always done them.” Regulators, patients, and their physicians are counting on us.

The MANA Method is a proprietary, study-specific RBM approach performed remotely, independent of the EDC system used, and adoptable at any time during trial conduct. It was shown to systematically identify errors in trial conduct, subject safety oversight, and GCP compliance. The MANA Method identified critical errors in trial data and study conduct trends, within and across sites, more effectively when compared with onsite SDV. This pilot study demonstrated that subject review could be started earlier, and overall resource use was less than with traditional SDV onsite monitoring.

 

References

1. US Department of Health and Human Services, Food and Drug Administration. Guidance for Industry: Electronic Source Data in Clinical Investigations. Washington, DC: US Department of Health and Human Services; 2013. http://www.fda.gov/downloads/drugs/guidancecomplianceregulatoryinformation/guidances/ucm328691.pdf

2. European Medicines Agency. Reflection Paper on Expectations for Electronic Source Data and Data Transcribed to Electronic Data Collection Tools in Clinical Trials. June 9, 2010. EMA/INS/GCP/454280/2010.

3. Manasco PK, Riley-Wagenmann C, Flack M. Remote Informed Consent Review: Results of Implementation in Phase III Trial. Drug Information Association; June 2013.

4. Manasco PK, McKee C, Dacpano G, Manasco G, Pallas M, Finlay C, Galang S, White T, Erwin R, Danzig L. Subject Profile Analyzing Risk Saves Time for Monitors. Applied Clinical Trials. December 2, 2016.

5. Kunzi C, Rollinger Y, Sigmund M, Kunert V, Breuer B. RBM: An Update of Experiences Among European CRAs. Applied Clinical Trials. October 20, 2017.

 

Penelope Manasco M.D. is CEO, MANA RBM.

Eric Herbel is president, Integrated Clinical Systems, Inc.

Sean Bennett M.D., Ph.D. is senior director, Clinical Development and Medical Affairs, PaxVax, Inc.

Michelle Pallas is director of Statistical Programming and Data Management, PaxVax, Inc.

Lisa Bedell MA is senior director, Biostatistics, PaxVax, Inc.

Deborah Thompson MPH is a consultant for MANA RBM.

Kevin Fielman Ph.D. is affiliated with MANA RBM.

Garrett Manasco is a consultant for MANA RBM. 

Charlene Kimmel is affiliated with MANA RBM. 

Everett Lambeth is a consultant with MANA RBM.

Lisa Danzig M.D. is chief medical officer, PaxVax, Inc.
