2020 Tufts CSDD – IBM Watson Health benchmarking study highlights need for new functionality from EDC solutions and providers.
The results of several studies published in Applied Clinical Trials and elsewhere have demonstrated the impact of growing clinical trial data volume and the increasing diversity of clinical data sources on study timelines and inefficiencies.1, 2 New benchmarks show that the increase in data volume and diversity is accelerating. The typical late-stage protocol, for example, now collects on average 3.6 million data points, three times the number collected 10 years ago.3
In collaboration with, and with funding from, IBM Watson Health, Tufts CSDD has now completed a study assessing how the clinical data management function—internal teams in collaboration with their electronic data capture (EDC) partners—is managing, collecting, organizing, storing, and analyzing clinical data. This article summarizes the results of this collaborative study, provides more granular cycle time measures, and offers insights into experience with EDC solutions as well as challenges and opportunities to optimize clinical data management practice.
A global survey was conducted between September and October 2020. In total, 194 verified unique individual responses were received and analyzed. Respondents had an average of 10.8 years of experience working in a variety of roles, including clinical operations, data management, executive leadership, and clinical development. The mean number of clinical trials performed per year was 54. In the discussion that follows, we've classified organizations into two primary subgroups: those conducting 50 or more trials per year are designated as “Larger” and those conducting fewer than 50 trials per year are designated as “Smaller.”
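The subgroup classification described above amounts to a simple threshold rule. The sketch below is illustrative only: the function name and sample values are hypothetical, and only the 50-trials-per-year cutoff comes from the study.

```python
def classify_org(trials_per_year: int) -> str:
    """Assign an organization to a size subgroup.

    The 50-trials/year cutoff is the one used in the survey;
    everything else here is an illustrative sketch.
    """
    return "Larger" if trials_per_year >= 50 else "Smaller"


# Hypothetical organizations, not actual survey responses
portfolio = {"Sponsor A": 12, "Sponsor B": 54, "Sponsor C": 50}
subgroups = {name: classify_org(n) for name, n in portfolio.items()}
print(subgroups)  # Sponsors B and C fall into the "Larger" subgroup
```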
Respondents identified more than a dozen different EDC solutions. Most organizations (73.9%) report using two or more, while the remaining 26.1% report using only one EDC solution.
Table 1 below shows the wide range of clinical data management pain points reported. Respondents noted an average of 3.3 different pain points associated with their primary EDC solution. Managing mid-study updates and their associated delays topped the list, followed by flexibility and customization challenges.
Nearly one-third of respondents reported challenges with database go-live delays and with the lack of integrated patient engagement and electronic clinical outcome assessment (eCOA) applications. Slightly more than 30% reported challenges with the cost of their primary EDC solution. The pain points reported least frequently included data incompatibility between platforms, customer support problems, and solution complexity (a slow user learning curve).
Cycle time benchmarks
Although a relatively high percentage of respondents cited database go-live delays as a key challenge, it is interesting to note that the cycle time between final protocol approval and the database go-live milestone has remained essentially unchanged over the past three years.4 Figure 1 below shows that the average duration from final protocol approval to database go-live has increased by only one day.
The results of this study also show that the average study close-out cycle time—from last patient last visit to database lock—remained unchanged between 2017 and 2020, with organizations reporting an average duration of 36.8 days. Mid-study updates represent a time-intensive challenge. Managing a mid-study update requires about half of the total time needed to achieve the initial go-live release of the database, as shown in Figure 2 below. This helps explain why planned and unplanned study updates are the top reported pain points.
Comparing smaller and larger organizations on each of these cycle times individually revealed some notable differences. Smaller organizations (<50 trials/year) are faster at early-stage data management processes, whereas larger organizations (≥50 trials/year) are faster in late-stage processes.
Despite the challenges noted, respondents report a high level of satisfaction with their EDC solutions, as shown in Table 2 below. Database design and data collection tasks received the highest satisfaction levels with over 90% satisfied. Database closeout, the overall end-to-end process, and data processing and cleaning abilities also received high levels of satisfaction. Managing mid-study updates and the overall study cycle time received the lowest relative satisfaction scores.
As noted in Table 3 below, EDC solutions received high marks, with 84.1% and 82.3% of respondents satisfied, respectively, with the solutions' ability to reduce the number of manual errors and to avoid costs in managing mid-study updates. Fewer respondents (75.8%) were satisfied with their EDC solutions' ability to help them reduce delays in managing mid-study updates.
EDC solutions also received high satisfaction ratings for electronic data capture, randomization, and endpoint adjudication (100% of respondents reported being satisfied). Over 90% were satisfied with medical coding, data integration, data management, and eConsent capabilities. Electronic patient-reported outcomes (ePRO), eCOA, and study-level reporting received the lowest satisfaction ratings; nonetheless, over 70% of respondents report being satisfied in these areas, as detailed in Table 4 below.
Table 5 below shows how respondents ranked aspects of their EDC solutions and solution providers that they like least. Lack of responsiveness and poor individualized attention were among the top criticisms of EDC providers.
Functionality was by far the top factor influencing respondents' decisions when selecting an EDC solution, as shown in Table 6 below. A distant second was ease of use, with 57.4% of respondents including this factor among their top three. The ability of a solution to handle clinical trial scope and complexity was also noted as a top factor influencing selection decisions. Customer support, pricing, and reputation were among the least influential factors.
Organizations look for their EDC solutions to support a wide range of data management tasks and provide a variety of functionality. This study found that—although cycle times are not improving—the majority of organizations are satisfied with their providers and EDC solutions on all counts.
Cycle times from final protocol approval to database go-live and from last patient last visit to database lock—which now average 69.4 and 36.1 days, respectively—represent opportunities for acceleration. Similarly, the time required to implement mid-study updates (28.5 days for planned and 29.9 days for unplanned updates), against a backdrop of frequent amendments and an increasing number of adaptive design studies, represents another major area requiring attention and improvement.
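The "about half" relationship between mid-study updates and the initial database build can be checked directly from the averages reported above. This is a back-of-the-envelope sketch; only the day counts come from the study, and the variable names are illustrative.

```python
# Average cycle times reported in the study (days)
go_live_days = 69.4           # final protocol approval to database go-live
planned_update_days = 28.5    # planned mid-study update
unplanned_update_days = 29.9  # unplanned mid-study update

# Each mid-study update consumes roughly 40-45% of the initial build time
planned_ratio = planned_update_days / go_live_days
unplanned_ratio = unplanned_update_days / go_live_days
print(f"planned: {planned_ratio:.0%}, unplanned: {unplanned_ratio:.0%}")
# planned: 41%, unplanned: 43%
```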
One issue highlighted by the seemingly contradictory relationship between reported challenges and high satisfaction rates is the close dependency of data management on other study tasks. Evaluating EDC solutions for efficiency or time savings remains impractical because of the many external confounders—such as the protocol approval process and amendment resolution—that affect data management cycle times.
How EDC solutions and providers can add or integrate new functionality, manage mid-study updates and reduce overall cycle times remains to be seen. This study serves as a baseline from which to measure future progress. We look forward to monitoring the progress in this highly important aspect of drug development.
The authors would like to thank Dilhan Weeraratne and Van Willis at IBM for their contributions to the project and their support in developing this manuscript.
Beth Harper, MBA, Clinical Performance Partners; Zachary Smith, MA, Tufts CSDD; Jane Snowdon, PhD, FAMIA, IBM Watson Health; Robert DiCicco, PharmD, IBM Watson Health; Rezzan Hekmat, MSH, IBM Watson Health; Ken Getz, MBA, Tufts CSDD