When negative results arise, novel analysis packages help speed the decision to move forward or pull the plug on a new drug.
In recent years, pharmaceutical companies have been working toward two key goals for clinical trials: improving drug safety and reducing late-phase attrition by discovering the potential for adverse events at earlier stages, and streamlining the scope of trials to reduce costs and bring drugs to market earlier. The pressure is on to catch problems before they arise in costly registration trials or, even worse, turn into public health problems with the potential for market recalls. Yet it's also essential to gain a competitive advantage by speeding up the drug pipeline.
While these goals can work against one another, they don't have to. For example, in one common bottleneck—the occurrence of unexpected results from clinical trials data—new, highly interactive data analytic tools coupled with group problem-solving techniques can shave many weeks off the analysis process while improving safety monitoring. These tools and techniques can also be used to document and share insight acquired during issue resolution and negotiations with regulators, potentially helping to avoid costly retrials.
In a typical analysis scenario, after a study fails to meet the efficacy response or a safety issue crops up, a project team meeting is called to discuss what went wrong. Understanding and addressing unexpected results often requires more than pairing a clinician with a statistical analyst. Increasingly, there's a need for experts such as pharmacologists, biologists, and even regulatory experts to sit in to help sort through the potential indicators together. After identifying potential causes for the unexpected results, a statistician goes off for two to three weeks to work with the data management team to pull the data from the clinical database and guide specialized programmers to crunch the numbers in an analytical program. The project team members then meet again to study a wide variety of tables generated by the statistician. Attendees make comments and suggest new lines of inquiry, and the statistician goes off for additional days or weeks to generate new tables. Sometimes this cyclical process can go on for months.
The process is not only lengthy and disjointed, but it also requires that the statistical analyst have a deep understanding of a wide variety of clinical and scientific issues. The analyst depends on the group, yet the group is not around when he or she needs them most: when initially reviewing new lines of inquiry.
When the expertise is available during the meetings, the paper-oriented discussion framework is less than optimal for group dynamics. With paper reports, it is difficult to track multiple variables at once and get a sense of the big picture or complex relationships. Whenever a new set of relationships is discovered, a new chart needs to be created, which means another delay in generating tables and scheduling a meeting. With so many meetings, key personnel are sometimes absent, and for each new meeting, team members must re-immerse themselves in the issues.
At the core of the problem is the fact that traditional statistical tools and organizational strategies don't facilitate fast, effective decision-making. A growing number of pharmaceutical firms are tackling this problem by taking the same analytics and visualization tools that have helped to accelerate drug discovery applications and applying them to clinical trials data. Combined with new practices, these tools are greatly reducing the number of meetings and the time between meetings. Rather than sorting through stacks of paper reports, team members can explore much of the data together on screen and more quickly identify the sources of unexpected results.
Setting the Right Tone
Business analytics tools such as Spotfire DecisionSite, SAS Enterprise Miner, and Insightful's S-PLUS are much easier to learn than traditional statistical software. Because of their speed and flexibility, participants can work together with the data onscreen, using dynamic visualizations that quickly show trends among multiple variables.
With varying degrees of statistical sophistication, these packages can handle most analytic tasks in real time. This makes them perfect for performing "triage" on the data to more quickly narrow down the list of potential relationships that need to be explored in-depth with more sophisticated statistical modeling programs such as SAS. Multiple experts can work together, suggesting various pathways of analysis as new information emerges. The process makes it easier to change the course of the investigation on the fly as new factors emerge, and it quickly focuses attention on critical data points. As a result, more work can be accomplished before it's necessary to move to offline data crunching. It takes fewer meetings to arrive at answers, and because there's less work than with the typical process, the time between meetings can be shortened and the paperwork reduced.
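The triage step described above, quickly ranking candidate relationships before committing the short list to in-depth modeling, can be sketched in plain Python. This is an illustration only: the variable names and data are invented, and a real analysis would use a statistical package rather than a hand-rolled correlation.

```python
from math import sqrt

# Sketch of data "triage": rank candidate variables by how strongly they
# correlate with the endpoint, then hand only the top candidates to more
# sophisticated modeling. All names and values here are invented.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

endpoint = [1.0, 2.1, 2.9, 4.2, 5.1]
candidates = {
    "dose_mg": [10, 20, 30, 40, 50],     # tracks the endpoint closely
    "age_years": [61, 34, 55, 41, 48],   # essentially noise
}

# Rank candidates by absolute correlation with the endpoint.
ranked = sorted(candidates,
                key=lambda name: abs(pearson(candidates[name], endpoint)),
                reverse=True)
# The strongest candidate ("dose_mg" here) goes on to deeper modeling.
```

The point of the sketch is the workflow, not the statistic: a cheap, interactive pass narrows the field so that the expensive offline modeling effort is spent only where it is likely to pay off.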
Some data analytics tools are designed to easily integrate data from multiple sources. This allows the review team to broaden the selection of data to include a wide variety of clinical assessments, such as vital signs, assay data, genetic profiles, biomarkers, and other "omics" data. By slicing and dicing the data with the help of experts in an iterative, almost intuitive manner, problems from unforeseen sources often emerge more quickly.
These tools are particularly helpful in exploring multivariate responses to see how different endpoints correlate. Subsets of subjects can be explored for supporting or contradicting endpoints. A lack of overall efficacy response, for example, where subjects treated with the drug didn't appear to respond any differently from those on placebo, can be probed by subsetting the data in numerous ways to see who did respond. With some packages, changing parameters updates the visualizations dynamically in real time, so it's easier to explore the interrelations among multiple data sets and spot flaws and outliers. Without having to wait for a report, analysts can correlate identifiable problems with clinical endpoints and search for effective subgroups and aberrations based on subject characteristics or protocol design.
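To make the subgroup idea concrete, here is a minimal Python sketch: a drug that looks flat against placebo overall shows a clear response once the data are subset on a subject characteristic. The `metabolizer` flag and all values are invented for illustration and do not come from any real study.

```python
from statistics import mean

# Invented illustration: overall treated-vs-placebo comparison looks weak,
# but slicing on a hypothetical "metabolizer" characteristic reveals a
# subgroup that does respond to the drug.

subjects = [
    {"arm": "drug",    "metabolizer": "fast", "response": 4.1},
    {"arm": "drug",    "metabolizer": "slow", "response": 0.3},
    {"arm": "drug",    "metabolizer": "fast", "response": 3.8},
    {"arm": "drug",    "metabolizer": "slow", "response": 0.1},
    {"arm": "placebo", "metabolizer": "fast", "response": 0.4},
    {"arm": "placebo", "metabolizer": "slow", "response": 0.2},
]

def mean_response(rows, **filters):
    """Mean response over rows matching every keyword filter."""
    vals = [r["response"] for r in rows
            if all(r[k] == v for k, v in filters.items())]
    return mean(vals)

overall_diff = (mean_response(subjects, arm="drug")
                - mean_response(subjects, arm="placebo"))
fast_diff = (mean_response(subjects, arm="drug", metabolizer="fast")
             - mean_response(subjects, arm="placebo", metabolizer="fast"))
# fast_diff is much larger than overall_diff: the treatment effect is
# concentrated in the fast-metabolizer subgroup.
```

In an interactive tool this same slicing happens visually, with the team suggesting the next subset to try; the value is that each "what if" takes seconds rather than a new report cycle.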
As a result of this process, clinical trials analysis is no longer a solitary pursuit in which an analyst must guess what clinicians want. Instead, participants from a broad range of disciplines can share their expertise, helping the team to coalesce.
Specific changes can be made to the usual data management and analysis process to implement an efficient, proactive team resolution approach should issues arise. The first, and probably most critical, of these changes is quick access to the right data. Ideally, a standardized process should call for the routine creation of two exploratory datasets, one for efficacy and one for safety, at the close of each study. These would integrate data from the various sources (PK, genomics, etc.), although most data exploration tools allow the user to do this on the fly as long as there is a common primary identification variable. The data management group should determine standardized variables and formats so that these exploratory sets can easily be combined for critical analysis across studies, drugs, indications, and therapeutic areas. Keep in mind that a successful, positive study has exploration value when compared to a study in which issues have arisen.
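The on-the-fly merge on a common primary identification variable can be sketched in a few lines of Python. This is a minimal illustration, not any particular package's API; the field name `subject_id` and the sample values are invented.

```python
# Minimal sketch: combine per-source clinical records (vitals, assays,
# genomics, ...) into one flat exploratory dataset, joining on a shared
# primary identifier. "subject_id" is a hypothetical field name; a real
# study would use its own standardized key.

def build_exploratory_dataset(*sources):
    """Merge record lists from several sources into one row per subject."""
    merged = {}
    for records in sources:
        for rec in records:
            sid = rec["subject_id"]
            merged.setdefault(sid, {"subject_id": sid}).update(rec)
    return list(merged.values())

vitals = [{"subject_id": 1, "systolic_bp": 128},
          {"subject_id": 2, "systolic_bp": 141}]
assays = [{"subject_id": 1, "alt_u_per_l": 22},
          {"subject_id": 2, "alt_u_per_l": 87}]

dataset = build_exploratory_dataset(vitals, assays)
# Each subject now carries fields from every source in a single row,
# ready for interactive exploration.
```

This is also why standardized variable names and formats matter: if every study keys its exploratory sets the same way, the same merge works across studies, drugs, indications, and therapeutic areas.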
The natural users of the exploration tool are statisticians and biometricians. They have a strong relationship with the data management group and can facilitate the retrieval of data from the clinical database and the construction of the exploratory datasets. They are usually the project representatives for the data management and analysis effort, they have an established relationship with the project team, and they are typically comfortable with the data, the science, and analytical software tools.
The final component is the project team. This effort must be clearly supported by the clinician and project manager to ensure a productive critical review of the data. The power of the approach comes from the real-time brainstorming of the "what ifs."
Once an exploratory analysis is completed, data analytics software can also help make the case to regulatory agencies. It speeds up the preparation for discussions with a regulatory agency or an advisory committee review and lets analysts quickly select only those reports and graphics that the regulator requires. In many cases, the preparation process for a regulatory review is reduced by weeks.
The ability to visually isolate the impact of subgroups and place them in perspective can assist in negotiating alternatives. The more precisely one can indicate why a study failed, the better chance there is of negotiating the continued use of the results. Often, regulators are willing to be walked through analysis in real time, providing an even more convincing argument than possible with a canned presentation. Regulators are impressed when analysts can demonstrate that they truly understand their data, and the authority and integrity demonstrated by such efforts can pay off in future negotiations.
In summary, when analyzing unexpected study results to identify causes, working with a data analytics package in real time can provide a strong competitive advantage. The ability to explore complex data in a group context with experts from different disciplines cuts through the bureaucratic paperwork and frees analysts from time-consuming table generation. This shortens the time needed to push the drug on to the next phase, to send it back to the lab for more work, or to pull the plug on a seriously flawed drug before it wastes more money or threatens public safety.
Patricia Ruppel is president of Innovative Analytics, M-Tec at the Groves, 7107 Elm Valley Dr., Kalamazoo, MI 49009, www.IAnalytics.biz.