
Technology’s Role in Clinical Trials
How eSource, eTMF, and other cloud-based solutions are changing clinical trials
Even the most casual observer cannot help but notice the array of technological solutions bringing disruptive change to the clinical trials process. From a growing body of literature to webinars to multi-day conferences, the word is out that technology is key to making measurable improvements such as speeding study start-up, streamlining transmission of clinical trial data, and overhauling how studies are monitored. And it is no longer about individual point solutions to seemingly intractable problems; it is about sharing the real-time data those solutions capture so that collaborators can make strategic decisions about a study as it actually unfolds. This is a dramatic change from the traditional paper-based methods at the root of the industry’s slow and costly approach to global clinical research, in which evaluation of data quality relied on onsite monitoring or waited until near database lock, sometimes years after the data were first collected. These older approaches have, in no small part, contributed to the ever-growing timeframe for clinical development, now an estimated 6.8 years, up from six years in the 1999 to 2001 timeframe, a 13% increase.
This article offers a glimpse into some of the expansive solutions taking hold in today’s market and why the industry is finally ready for change. Only a decade ago, the clinical trials sector was just beginning to implement electronic data capture (EDC) on a large scale.
As described in a recent FDA webinar, the biopharmaceutical industry has been slow to adopt electronic solutions that could revolutionize clinical trials.
Why now?
In 2012, the Institute of Medicine (IOM) released Envisioning a Transformed Clinical Trials Enterprise in the United States: Establishing an Agenda for 2020, a report calling for a broad overhaul of the clinical trials enterprise.
And that’s where today’s technological advances fit in. The report takes a broad-based view of the overhaul, identifying a wide range of factors, including the need for improved data collection, management and analysis, data standards, and improved communication of results among stakeholders. Today’s cloud-based technologies are designed to enable these sought-after improvements.
The advent of cloud-based capability could not come at a better time. It coincides with an all-out push from the FDA to adopt technology, as evidenced by the release of two guidances and one draft guidance on this subject (see sidebar). In August 2013, FDA released a guidance encouraging the use of risk-based monitoring (RBM) where appropriate.
Ed Seguine, CEO of Clinical Ink, a provider of eSource solutions, comments, “What’s unique about these three guidances in quick succession is their tone. They use terms such as ‘promote’ and ‘encourage.’ They are wanting to make it absolutely clear that FDA wants to see use of innovative solutions, so there is no regulatory excuse not to implement them.”
Before the string of FDA guidances, the European Medicines Agency (EMA) stated its position on the use of electronic collection of source data in a 2010 Reflection Paper.
The EMA also released a Reflection Paper in 2013 on risk-based quality management.
This drive toward electronic solutions coupled with regulatory encouragement is destined to impact stakeholders faced with managing growing pipeline activity. Research indicates a rise in the number of drug candidates in global development, namely 646 Phase III drugs in 2012, up from 573 in 2010 and 547 in 2008. Similarly, Phase II drugs exceeded 2,100 in 2012, up from 1,812 two years earlier, a 17% increase (see figure).
With this growing cadre of drugs under development, there is a need for cloud-based capability to streamline clinical trial activity. As explained by Jennifer Goldsmith, Vice President of Veeva Vault at Veeva Systems, a cloud-based provider of eTMF solutions, “Cloud-based technology has matured to a point where it can be used in a highly regulated industry. When we started, there were tremendous questions around maturity of the cloud and ability to integrate with in-house systems, but there has been a 180-degree turnaround. Those who would never consider a cloud environment a few years ago are now mandating it.”
New solutions
With the cloud gaining acceptance, so, too, are the technologies it supports, such as eTMF, eSource, RBM, and next-generation clinical analytic interfaces built on virtual, on-demand data warehouses. A brief review of each of these technologies highlights why they are essential to supporting forward-moving clinical trials.
Electronic trial master file (eTMF)
The eTMF refers to the electronic version of the essential documentation needed to conduct a clinical trial and evaluate the quality of the data produced. Initially, when the TMF went electronic, it was generally maintained on an in-house client server and functioned mostly as an electronic file cabinet: paper documents were scanned and archived, but remained essentially static. Over time, this format has matured into a cloud-based solution housing massive amounts of interactive data that can be used for strategic planning of an ongoing trial. In particular, the eTMF gives stakeholders greater visibility into trial documents, keeps those documents audit ready at all times, and doubles as a business planning tool, all without the need to maintain IT infrastructure in-house.
Recently, Veeva Systems released results of a large eTMF survey of 252 TMF owners.
In interpreting these results, Goldsmith comments, “Requirements around the world for inspection readiness have become more stringent. Before, documents that were archived were acceptable. Today, we are seeing more in-process auditing and greater interest in remote auditing. There has also been a dramatic shift over the past 10 years in the amount of collaboration that goes on in a clinical trial. CROs, for example, are not only providers, they are actually strategic partners. For that reason, the ability to share the information housed in the eTMF among partners has become more important.”
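To make these ideas a little more concrete, below is a minimal Python sketch of the kind of per-document metadata and audit trail a cloud eTMF might maintain so that files stay inspection ready and shareable among partners. The field names, statuses, and roles are illustrative assumptions for this example, not any vendor's actual schema.

```python
# Illustrative sketch of per-document eTMF metadata supporting "always audit
# ready" files. Field names and values are hypothetical, not a vendor schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class AuditEvent:
    actor: str         # who touched the document
    action: str        # e.g., "uploaded", "approved", "superseded"
    timestamp: datetime

@dataclass
class TMFDocument:
    artifact: str      # e.g., "Signed Protocol", "IRB Approval"
    study_id: str
    site_id: str
    version: str
    status: str        # e.g., "draft", "final", "archived"
    audit_trail: List[AuditEvent] = field(default_factory=list)

    def record(self, actor: str, action: str) -> None:
        """Append an audit event; a continuously current trail is what makes
        in-process and remote inspection practical."""
        self.audit_trail.append(
            AuditEvent(actor, action, datetime.now(timezone.utc))
        )

# Example: a site uploads and a CRA approves an IRB approval letter.
doc = TMFDocument("IRB Approval", "STUDY-001", "SITE-12", "1.0", "draft")
doc.record("site.coordinator", "uploaded")
doc.record("cra.smith", "approved")
doc.status = "final"
```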
eSource
Another solution, eSource, is gaining momentum, especially since the September 2013 release of the FDA guidance on the subject, which encourages its use.8 As stated in the guidance, eSource is defined as data initially recorded in electronic format. Data obtained at a study visit can be entered directly into an electronic case report form (eCRF), a process that has the advantage of reducing errors by eliminating the step of first writing on paper and then transcribing (see sidebar).
eSource addresses one of the major issues of EDC: the need to transcribe source documents into CRFs. With eSource, clinical information flows directly into the study’s clinical trial database in real time or near real time, allowing for earlier visibility and a faster start to data analysis. Given this capability, one could argue that the CRF, a conduit for data collection, could actually be eliminated. Moreover, sponsors and CROs anxious to see clinical trial data no longer need to wait several weeks for monitoring visits to review data captured by EDC; with eSource, data entered during a subject’s visit can be reviewed in real time.
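As a rough illustration of that flow, the sketch below shows a vital-sign observation being created as an electronic record at the moment of capture, with a simple edit check firing at entry time rather than weeks later. The form name, fields, and range limits are assumptions made for the example, not taken from any particular eSource system.

```python
# A minimal sketch of the eSource idea: the observation is born electronic at
# the study visit and validated immediately, with no paper-to-CRF transcription.
from datetime import datetime, timezone

def capture_vital_sign(subject_id: str, systolic: int, diastolic: int) -> dict:
    """Create an electronic source record at the point of entry and run a
    simple range edit check before it flows to the study database."""
    record = {
        "subject_id": subject_id,
        "form": "VITALS",
        "systolic_mmHg": systolic,
        "diastolic_mmHg": diastolic,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "queries": [],
    }
    # Edit checks fire at entry time rather than at a later monitoring visit.
    if not 60 <= systolic <= 250:
        record["queries"].append("Systolic BP out of expected range")
    if not 30 <= diastolic <= 150:
        record["queries"].append("Diastolic BP out of expected range")
    return record

print(capture_vital_sign("SUBJ-1001", systolic=128, diastolic=82))
```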
Risk-based monitoring (RBM)
Capabilities offered by eSource are a natural lead-in to RBM. The concept behind RBM is to focus sponsor oversight activities on preventing or mitigating risks to data quality and to processes critical to human subject protection and trial integrity.7 Traditionally, companies have conducted 100% source data verification (SDV) via on-site monitoring, a notoriously labor-intensive practice. But the need to continue this practice has been called into question, especially since it is not mandated by FDA or EMA. Moreover, there is a growing awareness that 100% SDV does not automatically result in better clinical data quality.
Between the regulatory encouragement for RBM and the robust volume of literature touting its value, RBM is a hot, if complex, topic. Much of the discussion focuses on the hefty costs associated with 100% SDV. Medidata puts the cost of onsite monitoring for a large clinical trial at 28.7% of the study budget, clearly the largest study expense.
In moving to RBM, monitors can better spend their time reviewing critical study-related issues, protocol compliance, and patient safety. As described by Hunter Walker, Chief Technology Officer of Atlantic Research Group, “Despite the widely held belief that RBM is a money saving technique, it’s really about data quality. With RBM, monitors will still visit the sites, but instead of reviewing data transcription from source to eCRF, they can devote more time to process and compliance issues.”
RBM uses targeted approaches to remote or cloud-based monitoring of subsets of the clinical trial data, and can take the form of partial SDV and statistical, risk-based algorithms keyed to risks identified in the protocol. TransCelerate BioPharma, a non-profit focused on innovation in R&D, has provided further clarity on RBM through its position paper on the subject.
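As a toy illustration of the scoring idea, the sketch below assigns each site a weighted risk score from a few operational indicators and flags outliers for targeted review. The indicators, weights, and threshold are invented for the example and do not reflect TransCelerate's methodology or any regulatory guidance.

```python
# Toy risk-based monitoring triage: score sites on operational indicators and
# direct monitoring effort at the outliers. All figures are illustrative.
SITE_METRICS = {
    "SITE-01": {"query_rate": 0.04, "protocol_deviations": 1, "late_entry_days": 2.0},
    "SITE-02": {"query_rate": 0.15, "protocol_deviations": 6, "late_entry_days": 9.5},
    "SITE-03": {"query_rate": 0.07, "protocol_deviations": 2, "late_entry_days": 4.0},
}

WEIGHTS = {"query_rate": 10.0, "protocol_deviations": 0.5, "late_entry_days": 0.3}
THRESHOLD = 5.0  # above this, schedule a targeted on-site review

def risk_score(metrics: dict) -> float:
    """Weighted sum of risk indicators (higher = riskier)."""
    return sum(WEIGHTS[key] * value for key, value in metrics.items())

for site, metrics in sorted(SITE_METRICS.items()):
    score = risk_score(metrics)
    action = "targeted on-site visit" if score > THRESHOLD else "remote review only"
    print(f"{site}: score={score:.1f} -> {action}")
```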
Single interface across multiple systems
The new cloud-based analytics technologies are meant to improve the collection and transmission of clinical trial data from disparate sources. To make meaningful insights from the data, stakeholders need to be able to generate ad hoc reports and develop operational metrics that determine if milestones and timelines are being met, if there are safety signals related to the investigational product, and how sites are performing relative to others.
With a single interface, it is possible to collect study-related trial data housed in multiple systems from multiple vendors, such as EDC, interactive voice response systems (IVRS), and clinical trial management systems, and use it to answer key questions. With the help of a self-service visualization-based data discovery environment, stakeholders can visualize up-to-the-minute study trends, such as enrollment rates, timely entry of data, degree of data quality, abnormal lab values, etc. Without these reports, stakeholders have limited ability to determine the status of an ongoing trial.
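The sketch below illustrates the underlying idea, assuming three invented site-level extracts standing in for EDC, IVRS, and CTMS feeds: join them on a common site key and compute operational indicators that no single system can answer on its own. Column names and figures are made up for the example.

```python
# Unified reporting sketch: join records from systems that normally live apart
# (EDC, IVRS, CTMS) and compute cross-system status indicators.
import pandas as pd

edc = pd.DataFrame({
    "site_id": ["SITE-01", "SITE-02"],
    "open_queries": [3, 14],
    "forms_entered": [120, 95],
})
ivrs = pd.DataFrame({
    "site_id": ["SITE-01", "SITE-02"],
    "subjects_randomized": [18, 11],
})
ctms = pd.DataFrame({
    "site_id": ["SITE-01", "SITE-02"],
    "enrollment_target": [20, 20],
    "last_monitoring_visit": ["2014-06-02", "2014-03-15"],
})

# One joined view answers questions no single source system can.
status = edc.merge(ivrs, on="site_id").merge(ctms, on="site_id")
status["pct_of_target"] = 100 * status["subjects_randomized"] / status["enrollment_target"]
status["queries_per_form"] = status["open_queries"] / status["forms_entered"]

print(status[["site_id", "pct_of_target", "queries_per_form", "last_monitoring_visit"]])
```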
Anecdotal claims from Comprehend Systems, a provider of a cloud-based interface, suggest that use of this technology may lead to a 30% decrease in site monitoring visits by utilizing visualizations of key performance indicators such as recruitment pace and drop-out rates. Again, anecdotally, there have been 50% reductions in database lock time, and 3-4 days per month per trial saved in clinical data review time as a result of having actionable data available in a unified reporting interface.
Traditionally, the major way of analyzing volumes of data has been by accessing a user-controlled data warehouse, a methodology that is slow, costly, and inflexible and requires constant re-programming and re-validation as questions change. As part of that effort, stakeholders have faced the significant challenge of integrating clinical, operational, and safety data from different sources in different formats. Bringing those data into one location where they can be linked for analysis is all the more difficult because there is no standard way of collecting them. Because of these issues, companies that embark on building out a warehouse can spend years developing it, so using a provider may be a viable option.
With the advent of a hosted, cloud-based, on-demand clinical data warehouse, which is essentially how the interface operates, it is possible to ask the necessary questions without having to invest the resources needed to maintain a warehouse in-house. Validation is still needed, but programming is handled by the provider’s application. In addition, steps such as infrastructure analysis; extract, transform, and load (ETL); and performance tuning can be managed by the provider. This effort will be further facilitated once the industry develops standards for aggregating data in a data warehouse.
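For a sense of what the "transform" step involves when sources disagree on format, here is a small sketch that maps two invented vendor layouts for the same glucose result onto one common schema; the layouts, field names, and unit conversion are assumptions for illustration only.

```python
# Tiny ETL "transform" illustration: harmonize two hypothetical vendor layouts
# for a glucose result into one standard row (value in mmol/L).
RAW_VENDOR_A = {"subj": "1001", "test": "GLUC", "val": 5.4, "unit": "mmol/L"}
RAW_VENDOR_B = {"subject_id": "1002", "lab_code": "GLUCOSE", "result_mg_dl": 99}

def to_common_schema(record: dict) -> dict:
    """Map either vendor layout onto the common schema."""
    if "result_mg_dl" in record:   # vendor B reports mg/dL; convert to mmol/L
        return {
            "subject_id": record["subject_id"],
            "test": "GLUCOSE",
            "value_mmol_l": round(record["result_mg_dl"] / 18.0, 2),
        }
    return {                        # vendor A already reports mmol/L
        "subject_id": record["subj"],
        "test": "GLUCOSE",
        "value_mmol_l": record["val"],
    }

for raw in (RAW_VENDOR_A, RAW_VENDOR_B):
    print(to_common_schema(raw))
```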
Technology driving process change
With the implementation of the described cloud-based technologies, a change in the current business model is required. This model was developed decades ago, at a time when clinical trials were very different from today’s global, multi-site approach.2 Even the significant introduction of EDC, which brought huge strides in terms of edit checks, a better query process, and faster viewing of clinical trial data, was still rooted in the traditional business model, with legacy approaches to monitoring and validation. By comparison, other data-heavy industries have changed their underlying business models, and research suggests a similar change is in order in the clinical trials sector.2
Not surprisingly, the industry is in various stages of adoption, reflecting different sizes of companies and levels of sophistication among stakeholders. David Scott, President and CEO of Palm Beach Research Center in West Palm Beach, Florida, comments that use of technology is on the upswing, but notes that his site does not use eSource documents. “We have paper source documents, and by not making that transition, that is one huge amount of responsibility we don’t have to take on. At this point, we think of cloud-based solutions as best suited to business documents, not study documents.”
Regarding RBM, he explains that only a few of Palm Beach’s sponsors are using the technology, but more are trying to make that transition. Similarly, Laura Whitaker, Neurosciences Research Coordinator at Hoag Memorial Hospital in Newport Beach, California, comments that only one study is 100% RBM, and in that case, she speaks with the monitor by phone.
An industry veteran, Whitaker appreciates and welcomes the changes offered by technology, but feels that, at least to some degree, the human touch is getting lost. She says her site is starting to notice that streamlined processes and technologies are shortening the research phase by getting the data collated and meeting contracted timelines, “but we are losing the human element with RBM. The monitor can see the clinical trial data remotely, but they aren’t seeing documentation of the informed consent process or the actual signature of the investigator on various documents. Also, they aren’t seeing any notations, such as why the patient was out of window or why certain factors were not reportable to an IRB,” she explains. Other comments appear in the sidebar at right.
Going forward
As regulators, sponsors, CROs, sites, and other stakeholders come together to bring greater efficiencies to the beleaguered clinical trials industry, technology and process change have been highlighted as key. The days of paper-bound methodologies are diminishing as cloud-based solutions step in, but adoption of these new solutions ranges from modest to full-blown.
How technology adoption changes the business model will be a closely watched subject as the need to rein in costs, streamline operations, and adhere to regulatory guidelines steers the clinical trials industry toward more timely development of safe and effective therapies that benefit patients across the globe.
Rick Morrison is CEO of Comprehend Systems.
References
1.
2.
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.