Acknowledging Cycle Time Impact from Protocol Amendments

Apr 01, 2016
Volume 25, Issue 4

Clinical trials with at least one substantial protocol amendment require several hundred thousand dollars in unplanned direct costs to implement. But perhaps the most expensive impact is the unplanned incremental cycle time tacked on to the study. A new analysis by the Tufts Center for the Study of Drug Development (Tufts CSDD), in collaboration with more than a dozen sponsors and contract research organizations (CROs), indicates that protocols with at least one substantial amendment take an average of three unplanned months longer to complete than those without an amendment. These findings shed new light on the importance of adopting new strategies to reduce select amendments.

Collecting amendment data

Between March and July 2015, Tufts CSDD and 15 pharmaceutical companies and CROs collected data from 836 Phase I–IIIb/IV protocols approved between 2010 and 2013. Protocols approved within the most recent 12-month period were excluded from the study, as these had the potential to continue accumulating amendments following the conclusion of clinical trial data collection. From the protocols reviewed, Tufts CSDD analyzed data from 984 amendments. Seven of the 15 participating companies also gathered direct cost data from 52 protocols for which substantial amendments had been identified between January and May 2015. This study was supported in part by an unrestricted grant from Medidata Solutions.

Only substantial protocol amendments were evaluated in this study to ensure a more consistent assessment of prevalence and impact. Substantial amendments were defined as any change to a protocol on a global level requiring internal approval followed by approval from the institutional or ethical review board or regulatory authority. Country-specific amendments that affected protocol designs for clinical trials within a given region alone were excluded.

High prevalence ... and avoidability

The majority of protocols (57%) had at least one substantial amendment. On average, across all phases, the typical protocol had 2.1 substantial amendments. Phase II and Phase III studies had the highest prevalence, with 77% and 66% of protocols amended, respectively. The average number of substantial amendments per Phase II protocol was 2.2; Phase III studies—typically the longest in duration and the costliest to conduct—had the highest mean number (2.3) of substantial amendments.

Sponsors report that the vast majority of changes made to an approved protocol originate internally. Only one in six (16%) stem from a regulatory agency request. The most common changes addressed by a substantial amendment are modifications and revisions to study volunteer demographics and eligibility criteria (53%). Nearly four out of 10 (38%) changes are related to modifications of safety assessment activity; 35% are related to typographical errors; 27% are associated with endpoint modifications.

In the Tufts CSDD study, sponsors and CROs reviewed their respective amendments and indicated the degree to which they could have been avoided. Nearly one out of four (23%) substantial amendments were considered “completely avoidable” and 22% were considered “somewhat avoidable.” Avoidable amendments included protocol design flaws, errors and inconsistencies in the protocol narrative, and infeasible execution instructions and eligibility criteria.

Approximately one-third (30%) of substantial amendments were deemed “somewhat unavoidable” and 25% were classified as “completely unavoidable.”  The causes of unavoidable amendments included manufacturing changes, the availability of new safety data, changes in standard of care, and regulatory agency requests to change the protocol design. 

The total median direct cost to implement a substantial protocol amendment for Phase II and III protocols was $141,000 and $535,000, respectively.

The magnitude of impact

No surprise—substantial protocol amendments significantly impact some study scope elements and the entire study conduct cycle. But the new Tufts CSDD study puts some real metrics on the table: Studies that had at least one substantial amendment saw a significantly higher reduction in the actual number of patients screened and enrolled relative to the original plan. This may have been due to sample size re-estimations and concrete steps taken to reduce patient screening and enrollment burden. In contrast, protocols that had no substantial amendments saw only a modest reduction in the actual number of patients screened relative to plan; and a modest increase in the actual number of patients enrolled relative to the original plan.

Substantial amendments significantly increased cycle times at individual time points and throughout the study duration, suggesting that the delays associated with amendment implementation are not recovered or reversed later in the study. Study initiation durations (i.e., protocol approved to first patient screened) were, on average, 18% longer for protocols with at least one substantial amendment compared to those without an amendment. This difference was not statistically significant, as expected, since the majority of substantial amendments are implemented once the study is underway.


For those protocols with at least one substantial amendment, the time points from protocol approval to last patient last visit (LPLV) and from first patient first visit (FPFV) to LPLV were significantly longer—at 90 days and 85 days, respectively—compared with those protocols without an amendment. 

A whopping 5.5-month increase in time was observed in the “first patient participation cycle” (i.e., from FPFV to first patient last visit [FPLV]), suggesting that the implementation of substantial amendments impacts study volunteers differently depending on when they are randomized and enrolled in the clinical trial.

Eyes on the prize

A large and growing number of sponsors and CROs recognize the incredible unplanned and unbudgeted toll that protocol amendments take on study budgets and timelines, and the major opportunity to improve clinical trial efficiency and performance. Companies are routinely gathering metrics to monitor their protocol amendment experience. A number of sponsors and CROs are leveraging new technologies and implementing new mechanisms, functions, and processes to optimize protocol design.  

Amgen, for example, has implemented a new Development Design Center to assist clinical teams in designing better studies before going to the protocol-authoring stage. The Center taps experts and data to facilitate decision-making and promote a deeper understanding of design-related trade-off decisions and their impact on executional feasibility.  

Pfizer and GlaxoSmithKline have implemented extensive internal review processes to improve protocol quality and reduce amendments.  GSK implemented a new governance mechanism several years ago. Pfizer recently revised its standard operating procedures (SOPs) to require that all protocols go through a detailed protocol and amendment review prior to implementation. The first step in this process calls for a review by a senior-level governance committee to achieve consensus on the design elements of the study, to ensure that the protocol is consistent with the overall development plan, and to challenge the executional feasibility of the protocol.

Eli Lilly has implemented three core initiatives throughout the organization to simplify and focus protocol design; to incorporate patient-centered approaches; and to streamline the drug development process. One approach to support these initiatives is to solicit input—before protocol approval—from patients and investigative site staff during a simulation of study execution and the participation experience. Lilly’s study teams observe these simulations to identify and address feasibility issues that could potentially trigger the need to amend the protocol.

EMD Serono routinely conducts patient advisory boards to solicit patient feedback on protocol design and the feasibility of the schedule of assessments. These boards are conducted globally, each among six to 10 patients in collaboration with patient advocacy groups.

Lastly, TransCelerate BioPharma has made protocol feasibility one of its top areas of focus in 2016. TransCelerate recently released a Common Protocol Template, offering a common structure and language to drive protocol design quality and identify areas of misalignment between protocol endpoints and their respective procedures. TransCelerate’s initiative is among several common authoring templates now available, including one developed a number of years ago by a community of global medical writers—the SPIRIT initiative.

Sponsors and CROs are rallying to reduce the number of avoidable amendments and ultimately improve protocol quality, executional feasibility, and efficiency. The anticipated improvements in study performance and cost could not come at a better time, given rapid growth in the scientific and executional complexity of protocol designs and growing interest in patient-centric drug development.


Kenneth A. Getz, MBA, is the Director of Sponsored Research at the Tufts CSDD and Chairman of CISCRP, both in Boston, MA, e-mail: [email protected]
