Lean Outsourcing Models for Clinical Trials, Part 2: Optimizing Technology

Denise Calaprice, PhD, Mitchell Katz, PhD

A key success factor for lean models is the possession of technology that can measure performance and manage risk in a well-organized and user-friendly manner. This article describes the results of an investigation to identify and describe the basic requirements for such technology solutions.

In an article published last month (http://www.appliedclinicaltrialsonline.com/lean-outsourcing-models-clinical-trials-factors-achieving-success), we discussed the recent trend toward “leaner,” more efficient models for the outsourcing of clinical trials, and summarized the findings of our recent research in this space. As described, The Avoca Group, working on behalf of Purdue Pharma, interviewed representatives from companies that execute lean models effectively, in order to gather information regarding strategies, decision models, tactics, and experiences, for the benefit of those exploring their use. As might be expected, efficient and effective oversight under such models typically requires the use of a well-designed set of metrics and associated thresholds, reports, and dashboards to measure performance and manage risk. A key success factor in lean models is therefore the possession of technology that can provide these in a well-organized and user-friendly manner, optimizing staff focus and productivity. For this reason, we conducted an investigation specifically to identify and describe the basic requirements for such technology solutions, as well as the factors considered by successful lean outsourcers in selecting and implementing the right solutions for their companies. Here we share the findings of that segment of our research.

According to our investigation, suitable technology solutions address five key requirements:

1) They provide the means to identify important and actionable trends in operational data by:

  • collating, presenting, and monitoring an appropriate, well-organized, and configurable selection of Key Performance Indicators (KPIs) and/or Key Risk Indicators (KRIs).

  • highlighting KPIs and KRIs that are failing or at risk of failing through configurable (and potentially dynamic) thresholds. Color-coded stoplight indicators (red, yellow, green) are the common illustration of such status for individual KRIs and KPIs, with separate views for trending over time.

  • providing configurable, proactive (“push”) alerts when KPI or KRI thresholds are met.

2) They provide support for detailed evaluation and root cause analysis of performance issues by:

  • offering trend, pattern, and outlier identification in a manner that is easy to visualize and interpret (ideally with the ability to select from a menu of pre-configured graphic visualizations and analyses).

  • offering filtering and drill-down capability, including from within a graphic visualization. 

3) They provide compliant audit trails and workflows for tracking reviews, observations, issue escalation, and issue resolution across all relevant roles and levels.

  • Ideally, they also support “push” notification of actions that are due or otherwise at risk of noncompliance. 

4) They provide a user experience suited to the type of resource accountable for operating the technology solution, with specific focus on a user-friendly interface for the clinical operations user-base:

  • intuitive navigation.

  • user-friendly screens to create, modify and respond to metric thresholds and other associated parameters.

  • reports, exports and alerts in customizable formats, aligned to stakeholder roles.

  • appropriate support and training, including self-service tools within the software.

5) They allow for external/third party access, and integrate easily with both third party systems and other industry-leading tools and practices. 
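To make the threshold-and-stoplight pattern in requirement 1 concrete, the sketch below shows one minimal way such logic might be configured and evaluated. The metric names (query rate, data-entry lag) and threshold values are purely illustrative assumptions, not part of any specific product.

```python
# Minimal sketch of configurable KPI/KRI stoplight thresholds.
# Metric names and threshold values are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Threshold:
    yellow: float  # at or above this value, the metric is "at risk"
    red: float     # at or above this value, the metric is "failing"

def stoplight(value: float, t: Threshold) -> str:
    """Classify a metric value as green, yellow, or red."""
    if value >= t.red:
        return "red"
    if value >= t.yellow:
        return "yellow"
    return "green"

# Hypothetical KRI configuration: open-query rate per 100 CRF pages,
# and days from visit to data entry (lower is better for both).
config = {
    "query_rate": Threshold(yellow=5.0, red=10.0),
    "entry_lag_days": Threshold(yellow=7.0, red=14.0),
}

observed = {"query_rate": 6.2, "entry_lag_days": 3.0}

statuses = {kri: stoplight(observed[kri], config[kri]) for kri in config}
print(statuses)  # {'query_rate': 'yellow', 'entry_lag_days': 'green'}
```

In a real solution the threshold table would be maintained through a user-friendly configuration screen, and a "red" classification would trigger the proactive ("push") alerts described above.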

Given these requirements, careful consideration of a number of questions drives the selection of specific solutions suitable for specific companies. These include:

  • A. Could the technology solution be used for data analysis and visualization beyond support of clinical operations? For example, could it also be used for risk-based monitoring (RBM), or other functions completely outside of clinical operations? 

Given the broad potential utility of a user-friendly data aggregation, analysis, and visualization tool, a sponsor company may wish to leverage such a solution to support a broad range of functions in a cost-effective manner, including functions such as RBM or safety monitoring that utilize clinical as well as operational data. If this is the case, the sponsor may want to selectively explore and choose solutions with broad aggregation, analysis, and visualization capabilities.

  • B. What works well within existing oversight processes and practices that must be supported by the system, and conversely, what key gaps and opportunities characterize current processes and practices that may be addressed by the system?

If changing from a more traditional to a lean outsourcing model presents a challenge for the sponsor, tools that allow for permission-based, configurable dashboards aligned to role-specific accountabilities may help focus staff on only those oversight areas for which they hold accountability, and only those metrics deemed important for the desired level of oversight. In this manner, “detail-level” operational metrics that are to be monitored and managed by staff at the CRO partner may be excluded from the first-line dashboard presented to sponsor managers. Similarly, if staff skills in trend or root-cause analysis are limited, a system that allows a central user to pre-program a library of logical reports (e.g., drill-down, filtering, time-series) triggered under defined data conditions may be desirable. If, on the other hand, staff skills in this area are sophisticated and advanced, it may be important to select a technology solution that allows any user to create custom trial- or function-specific algorithms and advanced visualizations, including box plots, funnel plots, etc.
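The drill-down reports mentioned above can be pictured as successive filters applied to a flat set of operational records. The sketch below is a minimal illustration of that idea; the field names, sites, and values are hypothetical.

```python
# Sketch of a pre-configured drill-down: start from study-level records
# and successively narrow by country, then by site. All field names and
# values are hypothetical illustrations.

records = [
    {"country": "US", "site": "101", "open_queries": 12},
    {"country": "US", "site": "102", "open_queries": 3},
    {"country": "DE", "site": "201", "open_queries": 7},
]

def drill_down(rows, **filters):
    """Return the rows matching every field=value filter supplied."""
    return [r for r in rows if all(r.get(k) == v for k, v in filters.items())]

us_rows = drill_down(records, country="US")              # country level
site_rows = drill_down(records, country="US", site="101")  # site level
print(len(us_rows), site_rows[0]["open_queries"])  # 2 12
```

A central user could pre-program a library of such filter chains so that less analytically experienced staff simply click through country, site, and metric levels rather than constructing queries themselves.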

As another example, technology solutions vary in the extent to which they directly integrate with, or support replication of, existing risk planning tools, such as the TransCelerate RACT tool. If such tools are being used to positive effect as a current practice or corporate requirement, solutions that fully support and even enhance their use would be good choices. Conversely, adoption of a solution that does not support such integration may require a process, or even an additional system, to reconcile data and actions tracked in different places. As a final example, many technology solutions maintain an electronic audit trail that identifies users and tracks navigation within the system as evidence that the data are being reviewed. However, not all solutions capture the processes that feed data and other information (e.g., thresholds) into the system, and/or capture the observations, actions, and statuses arising from data review as evidence of oversight. Consideration must be given to whether the functionality offered by a specific solution aligns with existing procedures for documenting evidence of review and action, and whether additional integration with other systems (e.g., CTMS) is necessary to avoid duplicate data and data reconciliation.

  • C. What is the existing technical infrastructure for housing source data for key performance and risk indicators? With how many solutions will integration be required? 

Significant consideration must be given to the ability of a solution to integrate with existing sponsor and CRO partner systems that collect and store source data for key performance and risk indicators. This integration must ultimately be seamless enough to support or enhance existing processes while preventing gaps and overlaps. The resources needed to design, test, and manage interfaces can be a key driver of the time and effort required to implement and support a technology solution. As an example, technologies that support RBM all rely on receiving data from other systems, and thus will require some level of technical integration, whether through direct feeds or through a data transfer/import process. The more systems (across multiple partners) that currently house such data, the more integration interfaces will be required. Many technology solutions offer off-the-shelf integration with established eClinical technologies, open APIs that facilitate integration with other tools, and/or ETL (extract, transform, and load) processes to pull data from source systems, but some do not.
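The ETL pattern mentioned above can be sketched very simply: extract rows from a partner export, transform them to a common schema, and load them into a store the dashboard layer reads. The example below assumes a hypothetical CSV export with invented column names; a production interface would of course use validated feeds rather than an inline string.

```python
# Minimal ETL sketch: extract rows from a (hypothetical) CRO CSV export,
# transform field names and types to a common schema, and load them into
# a store a dashboard layer could consume. All column names are assumptions.

import csv
import io

# Stand-in for a file exported by a partner system.
raw_export = """site_id,open_queries,crf_pages_entered
101,12,240
102,3,180
"""

def extract(text: str):
    """Parse the raw export into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Derive a per-site open-query rate per 100 CRF pages."""
    return [
        {
            "site": row["site_id"],
            "query_rate": 100 * int(row["open_queries"]) / int(row["crf_pages_entered"]),
        }
        for row in rows
    ]

def load(rows, store):
    """Append the normalized rows to the target store."""
    store.extend(rows)
    return store

warehouse = load(transform(extract(raw_export)), [])
print(warehouse[0])  # {'site': '101', 'query_rate': 5.0}
```

Each additional source system would require its own extract/transform pair mapped onto the same target schema, which is why the number of systems housing source data drives integration effort.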

  • D. How mature is the oversight model to be supported, and what foreseeable changes may need to be accommodated by the system in the future? 

The considerations above involve assessment of model details, current processes and gaps, staff skills, and technical infrastructure, but in many cases these may be in various stages of development. If the oversight model, corporate or partnership structure, volume of study data, or technical infrastructure is likely to be in flux over the expected timeframe for use of the technology solution, it will be critical to select a tool that is scalable and flexible enough to accommodate, and potentially take advantage of, such changes. For example, technology solutions may vary in their ability to add value with larger volumes of data and/or with greater sophistication in predictive modeling. Some offer not only simple user-defined thresholds, but also statistically derived thresholds and/or the ability to set multiple, varying thresholds based on factors such as sample size or site-specific enrollment. Likewise, some technology solutions incorporate “unsupervised,” exploratory methods, whereby statistical engines run across all available data, rather than just the data outlined by the plan, to surface potential risks and/or performance outliers. Fewer solutions offer such features than simply track pre-identified KPIs and KRIs against pre-defined thresholds. Thus, although the enhanced functionality may be useful, the anticipated maturity of the oversight, risk management, and RBM strategies over the expected period of use will determine whether such features would add enough value to compensate for narrowing the field of candidate solutions.
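The distinction between a fixed user-defined threshold and a statistically derived one can be illustrated with a short sketch: rather than flagging sites against a hand-picked cutoff, flag those lying more than two standard deviations above the cross-site mean. The site numbers and metric values below are invented for illustration.

```python
# Sketch of a statistically derived threshold: instead of a fixed,
# user-defined cutoff, flag sites whose metric lies more than two
# standard deviations above the cross-site mean. Data are invented.

import statistics

site_query_rates = {
    "101": 4.8, "102": 5.1, "103": 4.5, "104": 5.4,
    "105": 4.9, "106": 5.2, "107": 12.0,  # site 107 is an outlier
}

values = list(site_query_rates.values())
mean = statistics.mean(values)
sd = statistics.stdev(values)      # sample standard deviation
threshold = mean + 2 * sd          # derived, not hand-picked

flagged = [site for site, v in site_query_rates.items() if v > threshold]
print(flagged)  # ['107']
```

A production engine would apply more robust methods (and, in the "unsupervised" case, scan every available variable rather than one pre-identified KRI), but the principle is the same: the threshold adapts to the data rather than being fixed in advance.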

A successful assessment of technology solutions to support lean oversight models involves careful consideration of each of the above questions, followed by collection and documentation of the formal business requirements that emerge. These business requirements may then be translated into an RFI questionnaire for assessment of suitable technology solution providers. Technology solution providers that meet critical business requirements may then be invited to demonstrate their solutions, and a selected solution moved forward to pilot on a limited basis. 

Data analysis and visualization technologies that support clinical trial oversight and risk-based planning and monitoring are rapidly evolving as industry experience grows and the requirements of individual companies mature. When selected carefully, such technology configurations can be powerful tools to align actual practice with strategy, to prevent duplication of effort (both internally and externally) while ensuring that key oversight requirements are met, and even to elevate the ability of the companies’ staff to provide oversight in a more sophisticated, comprehensive, and efficient manner. 

Denise Calaprice, PhD, is Senior Consultant with The Avoca Group.

Mitchell Katz, PhD, is Head of Clinical Research & Drug Safety Operations at Purdue Pharma.