As a principal investigator in the early 1980s at University Hospital in Peru, years before my time with Merck and Biogen Idec, I used to receive instructions and queries from the study team through a Telex machine. It was a teleprinter connected to a telegraph network. Study monitors used it to send investigators data clarifications and protocol updates. If you visit a museum of technology, you might find one.
By the time I joined the pharmaceutical industry, the fax machine had become the preferred method of communicating with sites. It wasn’t until the early 1990s that personal computers and wider internet access became common. Eventually e-mail became an accepted channel of communication, though with a number of restrictions on the sharing of regulatory documents.
Investigator Meetings were also interesting (though overwhelming). Attendees quickly learned to bring an empty suitcase to carry all the trial documents and binders home. Unfortunately these documents became outdated soon after they were distributed, and sharing information across sites was a real challenge that required follow-up meetings, calls or site visits.
Looking back, I can’t believe how labor-intensive these activities were. The amount of information that had to be updated and shared, making sure people were always aware of how to comply with changing protocol specs, and catching errors on a timely basis were massive challenges—as was trying to communicate. There were many intermediaries and moving parts with very little centralized control. It was impossible for anybody to keep up with everything and make sure people always had the current version of the documents. Everything took a lot of work, a lot of patience and a lot of time.
During the past two decades there have been numerous technological advances, mostly powered by the internet, to improve both the conduct of clinical trials and decision-making processes at all levels. Examples include clinical trial management systems (CTMS), remote and electronic data capture (RDC/EDC), interactive response systems (IXRS), eLearning and training platforms, payment technology, and more. The implementation and adoption of these systems has often been challenging, if not painful.
We have endured these inconveniences because, regardless of our different perspectives, those of us in the clinical research enterprise have always shared one common goal: we want to accelerate drug development and make new effective, safe drugs and vaccines available to patients.
Today there are increasingly massive amounts of information distributed and updated for every clinical trial. We need to comply with the protocol and manage the data generated in a timely manner, without a lot of overtime.
Study participants in a clinical trial place their trust in us. All stakeholders must respect our common mandate to preserve and improve their safety, manage processes efficiently, and ensure the integrity and quality of study data.
The days of the Telex machine are long behind us. Technology platforms that optimize clinical operations are rapidly gaining acceptance in our industry, which has traditionally been so reluctant to adopt new methods. This technology is helping stakeholders collaborate and align to conduct trials more efficiently and economically, with streamlined processes and with greater cooperation and transparency in every direction.
It certainly is an exciting time to be involved in clinical research, whether from the industry or site perspective. I am impressed by the new technologies and realize the benefits they can provide, not only in cost and cycle-time reductions but also in patient safety and site engagement—which can influence everything from activation and enrollment to retention, quality execution, and database lock.
Such innovative systems provide us with direct and persistent access to study news, critical documents, training materials, tools for recruitment and retention, visit support guides, and communication platforms. We are improving the way we use technology to collaborate in real time.
Never before has our industry been able to communicate so effectively and efficiently with sites. This essential ability takes on additional importance given the confluence of global trials utilizing sites with multiple concurrent studies, which requires both sponsors and CROs to engage PIs and study coordinators in order to keep their trial top of mind.
Understandably, the regulatory aspect inherent in our industry has historically slowed the embrace of new technologies. When considering any change to an existing process, we must focus first on how it affects our ability to ensure patient safety while also complying with a number of different laws and regulations on a country-by-country basis. At the same time, many of our sites may be wary of adopting new systems into their workflow because prior experiences with technology promised far more than it delivered.
Adoption of optimization technology has accelerated on the shoulders of successful data management system implementations and, perhaps more importantly, demand for efficient online tools similar to the personal sites and apps that individuals at sponsors, CROs and sites are accustomed to using at home.
As optimization technology has matured, regulatory concerns have abated and providers have designed interfaces and workflows that ease, rather than increase, the burden on sites. For example, it is possible to consolidate and integrate features into fewer platforms, enable single sign-on, and customize dashboard views to focus on the news, documents and information sites actually need.
The best and most consistent way to improve the overall conduct, efficiency and quality of clinical trials is to better support and manage the people actually running them at the sites. This involves keeping investigators and their staff constantly interested and motivated to work on your study, and providing tools that make it easy for them to do so.
I see the highest value today in site engagement and access to information on a real-time basis. The results of site engagement can be significant and powerful in terms of building a sense of ownership and commitment if sites have the tools they need to accomplish their work. Technology can help sponsors and sites run better trials than ever before. People are starting to notice and are using these systems on a more frequent basis.
For far too long, our goal has been to reduce the time it takes to identify and correct errors. With so many sites in so many countries all reporting data at different times and often through different systems, as well as oversight based primarily on periodic site visits by CRAs, it is no wonder that it can take weeks to notice an error and follow up on the issue. Reducing this time lapse from weeks to days or even hours may seem like a worthy accomplishment, but in my opinion—considering our capabilities and tools—it should be a given. Therefore, our goal should be the prevention or early detection of errors in the first place.
As much as we need further innovations in technology, it is absolutely necessary for the industry to embrace and successfully implement Quality by Design (QbD) in drug development. The Clinical Trials Transformation Initiative (CTTI) has defined QbD as the absence of errors that matter, i.e. the prevention of important errors that could undermine our ability to obtain meaningful information from a trial. This can only be achieved through a systematic approach that applies risk management principles to identify both scientific and operational critical-to-quality factors in the design and execution of clinical trials. Protocols that are very clear and have no room for error ensure sites know what to do at every patient visit. This is an essential first step that is enhanced when optimization systems are added to the equation.
Smart technology layered on top of a quality protocol can achieve this by analyzing data in real time for leading risk indicators. The system predicts potential errors or complications (for example, patients likely to drop study drug) and automatically notifies sites and the study team with the information and training needed to address the specific concern before it becomes a problem. This approach also supports advances in risk-based monitoring by identifying which sites require the most detailed onsite follow-up visits and which can be monitored remotely.
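The notification logic described above can be illustrated with a minimal sketch. This is not any vendor's actual system; the record fields, thresholds, and risk rules are hypothetical, chosen only to show the general pattern of scanning incoming data for leading indicators and grouping alerts by site:

```python
# Hypothetical leading-risk-indicator check. Field names and thresholds
# are illustrative assumptions, not a real platform's specification.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    patient_id: str
    site_id: str
    missed_visits: int    # consecutive missed scheduled visits
    adherence_pct: float  # reported study-drug adherence, 0-100

def flag_dropout_risk(records, max_missed=2, min_adherence=80.0):
    """Return patients whose indicators suggest they may drop study drug,
    grouped by site so the study team can notify the right coordinators."""
    alerts = {}
    for r in records:
        if r.missed_visits >= max_missed or r.adherence_pct < min_adherence:
            alerts.setdefault(r.site_id, []).append(r.patient_id)
    return alerts

records = [
    PatientRecord("P001", "SITE-01", 3, 95.0),  # too many missed visits
    PatientRecord("P002", "SITE-01", 0, 60.0),  # low adherence
    PatientRecord("P003", "SITE-02", 0, 92.0),  # no flags
]
print(flag_dropout_risk(records))  # {'SITE-01': ['P001', 'P002']}
```

In practice such rules would run continuously against live trial data, and the per-site grouping is what lets a system route targeted training or follow-up to exactly the sites that need it, which is the same logic that drives risk-based monitoring decisions.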
Optimization technology is quickly growing in scope and adoption, and for good reason. The word is spreading as more sponsors realize the benefits of consistently engaging sites, improving conduct efficiency, enabling real-time reporting, and proactively identifying and addressing potential issues.
Sooner or later the expectation will be for everyone to utilize Quality by Design in conjunction with optimization technology as well as risk-based monitoring. Only then will we truly change this incredibly inefficient monitoring process and empower sites.
Clinical operations technology can help the industry cut costs, streamline timelines, improve collaborative alignment and encourage more transparency as we put patient safety above everything.
Thankfully, clinical technologies have relegated the Telex—and the slow-moving, labor-intensive form of communication it represented—to history. From here our industry can go anywhere it chooses. It’s an exciting time and I look forward to what the future holds.
Jorge G. Guerra, MD, is an international expert in all aspects of clinical operations whose 30+ year career includes leadership roles at Merck & Co. and Biogen Idec. He currently consults with multiple organizations, serves on the Board of Directors of CIDAL and Rapid Pharmaceuticals, and is a member of the Board of Advisors of SAFE Biopharma and CISCRP. He is the author or co-author of 47 published articles and several book chapters. Dr. Guerra can be reached at [email protected].