From Quality by Constraint to Quality and Choice


Historically, approaches to managing clinical trial quality have varied little. Until relatively recently, trials have been managed by limiting the options for researchers and patients, effectively applying stringent guardrails to reduce variability in tools, methods, and approaches. Beyond the advent of ePRO and eCOA technology, which captures data directly from participants, most of the work has been in the hands of sites steering trials via rigid protocols. Controlling quality at sites provided a streamlined approach because methods such as regular visits and 100% source data verification imposed an acceptable quality threshold. With the rise of decentralized approaches and a changing regulatory landscape, this framework is set to change for the better.

The industry is shifting from a paradigm of designing studies by narrowing optionality to a framework of leveraging choice to promote flexible participation in trials. Clinical trials are now being structured to ensure accessibility for patients and productivity for researchers. The pandemic has only served to highlight the limitations of the status quo and accelerate the adoption of next-generation digital innovations; over 90% of executives at a recent McKinsey Clinical Operations round table expect virtual trials to become a major component of their portfolio, up from 38% in 2019.1

Trials are increasingly scoped with technology to meet patients where they want to be, ultimately driving equity and efficiency in clinical research. These technologies—from wearables for endpoint collection to telemedicine visits to full-blown virtual metasites—generate an abundance of data faster than ever before, providing evidence for the next wave of adaptive trials and decentralized approaches. But as trials rapidly evolve, it is crucial that quality is not compromised.

Reimagining study management to preserve quality and choice

While we now have the technology, process, and regulatory momentum to create optionality for researchers and participants, parallel innovations are needed for the operators managing these studies. Maintaining safety and measuring efficacy are equally important and can be more complex in a study that offers choice to participants. Naturally, optionality may create more risks to quality, so we must adapt the way we manage trials accordingly. Study management must be more flexible, more targeted, and more data-driven.

Thankfully, some of the regulatory frameworks and processes are already in place. ICH Good Clinical Practice guidelines have laid out frameworks such as Risk-Based Monitoring2 to allow for more targeted oversight of trials. The FDA even recently released a request to flag “remote” and “onsite” datasets in oncology “to foster use of ‘decentralized’ aspects of clinical trials prospectively in the post-COVID era.”3 Other industry organizations, such as the Metrics Champion Consortium (MCC) and the newly formed Decentralized Trials Research Alliance (DTRA), have ongoing efforts to operationalize this guidance through common metrics and best practices.

On the technology side, remote monitoring solutions were introduced to support trial operations teams. These solutions—often composed of dashboards and workflows for communicating virtually with sites—helped teams move away from rigid protocols focused on in-person site visits and 100% source data verification. While uptake was relatively slow through 2019, the pandemic and resultant lockdowns have been a powerful accelerant. However, with the volume, velocity, and complexity of data associated with next-generation trials, maintaining quality today requires more than incremental point solutions. Instead, teams need powerful tools and analytics that keep up with the pace and complexity of novel trials. Clinical operations must be empowered to see their study health in real time and receive actionable, data-driven recommendations on how to proactively maintain quality.

Data-centricity as the key to quality by choice

Enter the growing need for a data strategy, supporting technologies, and related data competencies. By providing the right data to study operators at the right time, organizations can effectively ensure quality in any next-generation trial. There are key strategies that all study sponsors and their research partners must address today to future-proof study management. Some of the most impactful we have seen in our experience supporting industry innovation include:

Flexible data collection, but standardized data analysis: The average clinical trial already uses six different data sources, and that number is only expected to grow.4 Novel data sources will continue to emerge and should be embraced to match the specific needs of patients and researchers in any given trial. However, for many study teams, taking in data from so many new and different source systems requires arduous, manual reconciliation that distracts from their core focus on mitigating quality issues. Instead of this manual process, or restricting choice by limiting which systems they use, sponsors should consider creating a single source of truth: a data platform that captures and harmonizes key datapoints and metrics in real time. This platform should surface analysis and insights to the study team in an intuitive way, regardless of the source of the data. Ultimately, this approach allows for comparison across studies and lets the study team focus on what the data are saying rather than where they are coming from.
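To make the idea of harmonizing heterogeneous sources into a single schema concrete, here is a minimal sketch in Python. The record structures, field names, and values are entirely illustrative assumptions, not any real EDC or ePRO vendor's format; the point is that once every source is mapped onto one common schema, downstream quality metrics can be computed uniformly.

```python
from datetime import datetime

# Hypothetical raw records from two source systems with different schemas.
edc_record = {"subj": "001-004", "visit_dt": "2021-03-02", "missing_pages": 2}
epro_record = {"participant_id": "001-004", "submitted": "02/03/2021", "skipped_items": 1}

def harmonize_edc(rec):
    """Map an EDC-style record onto a common schema (illustrative field names)."""
    return {
        "subject_id": rec["subj"],
        "event_date": datetime.strptime(rec["visit_dt"], "%Y-%m-%d").date(),
        "missing_data_points": rec["missing_pages"],
        "source": "EDC",
    }

def harmonize_epro(rec):
    """Map an ePRO-style record onto the same common schema."""
    return {
        "subject_id": rec["participant_id"],
        "event_date": datetime.strptime(rec["submitted"], "%d/%m/%Y").date(),
        "missing_data_points": rec["skipped_items"],
        "source": "ePRO",
    }

unified = [harmonize_edc(edc_record), harmonize_epro(epro_record)]

# Because all records now share one schema, a quality metric such as
# missing-data counts per subject can be computed without caring about origin.
missing_by_subject = {}
for rec in unified:
    missing_by_subject[rec["subject_id"]] = (
        missing_by_subject.get(rec["subject_id"], 0) + rec["missing_data_points"]
    )
```

In a production platform the per-source mappings would be configuration-driven and run continuously, but the design choice is the same: normalize at ingestion so every analysis downstream sees one schema.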

Moving from reactive to proactive study management: In next-generation trials, it is no longer sufficient to look only at what happened in the past days, weeks, or months of your trial. The cadence of these trials and the data they capture is often too fast; study managers need to be ready for the quality issues that will emerge next. Part of the solution is leveraging historical data in a systematic way to better predict future outcomes. However, as mentioned earlier, the pace of data generated by new, more patient-centric trials is growing exponentially, so relying on historical data alone will leave study teams making decisions on dated information. While study teams should leverage historical data to inform choices from design to execution, the right solution will combine this information with incoming real-time data from their studies. A complete solution will translate these data sources into action, using powerful analytics to suggest improvements to study design and execution that eliminate current and future quality issues.
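A minimal sketch of this "historical baseline plus real-time signal" pattern: use past studies to set an expected range for a quality metric, then flag live values that fall outside it. The metric (query rate), the site identifiers, and all numbers here are hypothetical; real systems would use far richer models than a two-standard-deviation threshold.

```python
import statistics

# Hypothetical historical query rates (queries per 100 data points)
# drawn from comparable past studies.
historical_rates = [2.1, 1.8, 2.4, 2.0, 2.3, 1.9, 2.2]
baseline_mean = statistics.mean(historical_rates)
baseline_sd = statistics.stdev(historical_rates)
# Flag anything more than two standard deviations above the historical norm.
threshold = baseline_mean + 2 * baseline_sd

def flag_sites(live_rates):
    """Return site IDs whose current query rate exceeds the historical threshold."""
    return [site for site, rate in live_rates.items() if rate > threshold]

# Incoming real-time rates streamed from the current study (illustrative values).
current = {"site_101": 2.0, "site_102": 3.4, "site_103": 2.2}
print(flag_sites(current))  # → ['site_102']
```

The design point is the combination: the threshold comes from historical data, but it is applied to live data as it arrives, so the study team is alerted while the issue is still emerging rather than at the next periodic review.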

Deep partnership between clinical operations and data and analytics experts: The strategies above represent fundamental shifts in how many organizations think about managing studies—they are essential, but not easy. We have seen organizations achieve the most success and the most rapid impact when clinical and data experts partner closely from the very start. During scoping and change management, clinical teams provide critical input on the most important use cases and metrics to track or predict, while technology, data science, and analytics teams explore what is possible, often supplemented by specialized experts like Lokavant. During implementation, this partnership allows the data team to focus on key technology improvements while clinical operators change relevant processes and ultimately focus on high-value work. Post-implementation, these teams establish a strong feedback loop for continuous improvement, periodically surfacing, evaluating, and implementing new technologies and trial structures. This data framework not only benefits clinical and development teams; the gains are also realized at the site and patient level. Data exchange fuels decentralized trials and will continue to play an important role in trial flexibility, providing further options for patient participation.

So, what can your organization do now to prepare for an increasingly patient- and data-centric future? This is the first in a series of articles exploring how companies are harnessing operational and patient data to maintain quality and choice. Whether the focus is risk management, vendor oversight, or portfolio management, each article will introduce a problem, present a use case, and provide practical approaches that stakeholders can use on their data journey.

Craig Lipset, Advisor and Founder, Clinical Innovation Partners; Rohit Nambisan, President, Lokavant; Devin Solanki, Head of Growth, Lokavant

References

  1. https://www.mckinsey.com/industries/life-sciences/our-insights/no-place-like-home-stepping-up-the-decentralization-of-clinical-trials
  2. https://www.fda.gov/files/drugs/published/E6%28R2%29-Good-Clinical-Practice--Integrated-Addendum-to-ICH-E6%28R1%29.pdf
  3. https://www.fda.gov/about-fda/oncology-center-excellence/advancing-oncology-decentralized-trials
  4. https://www.contractpharma.com/contents/view_online-exclusives/2017-11-21/volume-and-diversity-of-clinical-trial-data-sources-expected-to-soar/