A call to sponsors to rethink the role of ECGs in drug development and the use of central core labs.
Cardiac safety has become the number one reason for drug withdrawals and labeling changes during the past several years. This has resulted in greater regulatory scrutiny for all new compounds and greater concern among pharmaceutical companies regarding the potential effect of their compounds on the heart.
One of the outcomes of this focus was the ICH E14 document, which provided guidance on the conduct of a dedicated ECG study that many term the "thorough QT (TQT) trial." These trials generally are conducted after proof of concept has been completed and employ digital and centralized ECGs as described in E14. However, although almost all TQT trials are centralized, only 20% to 25% of all other clinical research trials (Phases I through IV) use centralized ECG analyses.
If a TQT trial shows cardiac safety concerns, regulatory bodies generally require more robust or intense ECG collection in Phase III, and those trials are more commonly centralized and use digital ECGs. However, the majority of trials conducted worldwide still collect ECGs the old-fashioned way: on local ECG machines with varying algorithms, often only at the beginning and end of the trial, providing little to no data about the cardiac effects of the chemical entity being studied.
The reasons for this decentralized approach include a lack of regulatory guidance in this area and the perception that centralization must be more costly. I will discuss these and other issues to demonstrate that ECG centralization has an important place in the drug development process.
ECGs serve many roles in the research and development process. They generally are required as part of the screening, selection, and enrollment process for study participants; they assist in assessing the cardiac safety of the investigational drug; they provide efficacy data for cardiovascular-related compounds; and they provide indirect benefits to the subject and his or her physician in diagnosing undetected structural or electrical cardiac diseases, as well as detecting progression of cardiac disease during the course of the clinical trial.
Currently, most view the main role of ECGs as the tool that best assesses the effect of investigational drugs on the electrical functions of the heart. In particular, ECGs are used to gather information on changes in the QT interval, the best surrogate marker for the possibility that an investigational drug may cause Torsade de Pointes, a life-threatening arrhythmia. ECGs also gather information on heart rate effects, atrioventricular conduction, depolarization, and changes in ECG morphology that may indicate pathological cardiac changes.
This process begins with early phase studies in healthy volunteers and continues into Phase III and Phase IV studies, where subjects are most susceptible to the cardiac effects of agents and where effect modifiers exist that may enhance unknown or minimal cardiac actions of new drugs.
For those who do not centralize ECG analysis because of a lack of concern about cardiac safety when the drug has no early QT signal or other cardiac issues, it is important to understand that it is difficult to assess all the possible cardiac safety effects of investigational drugs in premarketing clinical trials. Most of the best known recent drug withdrawals and labeling changes have been for drugs that were already being marketed to the public. It would, therefore, make sense to get the best possible data during clinical trials before these drugs make their way to the market with unexpected effects.
The task of identifying potential cardiac safety effects in clinical trials is difficult enough due to the short duration, narrow patient populations, limited indications, and relatively small sizes of these studies. The difficulty is further compounded by the fact that very few clinical trials use centralized ECG readings to increase the probability of detecting cardiac effects. This increases the background "noise" in clinical trials due to the different ECG machines at different clinical sites; the different measurement algorithms for QT, QTc, QRS, and PR intervals; and the differing morphology interpretation programs within the ECG machines themselves.
Add to this the large daily variation in a subject's QT duration (about 60 to 75 msec) and the lack of consistency amongst cardiologists in ECG interpretation, and analyzing data obtained in this manner becomes extremely difficult.
Site-based ECG analysis, a decentralized process, is the most common method of ECG interpretation in clinical research, occurring in an estimated 75% to 80% of clinical trials (excluding TQT trials). Under this model, all queries, data reconciliation, and database issues are the responsibility of the sponsor.
Most clinical trial sponsors do not consider the consequences of ECG collection in their integrated summary of cardiac safety when filing for marketing approval until late in the process. Therefore, they incur many data problems that would generally be addressed throughout the research process by a central ECG laboratory. These processes include qualifying the ECG equipment at sites, training investigators, generating queries, reconciling data, and performing the actual management of the ECG database and database lock.
The sites are responsible for performing the ECG, interpreting it, and acting upon the interpretation, all of which must be documented in the case report form. This transfer of information can result in transcription errors and other unexpected results due to the investigator's inability to adequately interpret the ECG data.
From a purely clinical standpoint, Viskin et al.1 studied the competency of noncardiologists, cardiologists, electrophysiologists, and QT experts in assessing the QT interval in a small sample of ECGs. The findings were striking: Noncardiologists and cardiologists were essentially equal in their inability to assess the length of the QT interval, with accuracy rates of 21% and 22% respectively; electrophysiologists fared better at 62%; and QT experts were correct 96% of the time. For the majority of clinical trials being conducted today, whether the investigator interprets the ECG or has a local cardiologist do the work, it is highly unlikely that accurate clinical signals will be detected.
For clinical trial sponsors who have not considered centralizing ECG analysis, I raise the question: Why is it so common to use a central lab for blood analysis but not for ECGs? The concerns that led to blood laboratory centralization are the same for ECGs when it comes to data quality and integrity.
When a sponsor chooses to centralize the ECG process, an ECG core laboratory takes on much of the work that would normally be done by sponsors, CROs, and sites. The use of digital ECG collection allows for a process that expedites the ECG reading process and results in cleaner data that is not subject to interpretation of the handwriting at the investigator site.
The query process is also enhanced, as some core labs have systems that automatically check for changes in demography and/or missing visits. This will aid the data lock process as the study draws to a close. Additionally, most core labs provide equipment to the sites for the digital collection of ECGs.
The sites should be provided with equipment that has been tested to assure full function and programmed for the specific demography capture for the study. Importantly, the interpretation algorithm used for all machines in a given study should be the same. This ensures that all sites are working from a common platform and should reduce some of the spurious errors seen when sites use old or poorly constructed equipment.
Most investigator sites involved in the study of noncardiac drugs do not have the expertise to analyze ECGs. As stated earlier, since most studies are conducted in a decentralized fashion, generally the investigator sites just copy the interval duration measurements (IDMs) directly off the machine's paper printout. Often the machine's morphology interpretation is used with a local cardiologist over-read.
As each site will have different machines, the IDMs—which generally consist of the heart rate or R-R interval, the PR interval, the QRS interval, the QT interval, and the calculated QTc—will vary from site to site. In addition, some sites may use different machines on different visits, resulting in varying IDMs on the same subject. It is also important to note that not all machines calculate QTc the same way, making comparison across the entire study unscientific. Lastly, these measurement algorithms are known to be inaccurate, especially in ECG tracings with abnormal ST-T morphology, which is common in subjects with clinical conditions.
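To illustrate why differing QTc calculations make cross-site comparison unscientific, the short sketch below applies two widely used correction formulas, Bazett (QT divided by the square root of the R-R interval) and Fridericia (QT divided by the cube root of the R-R interval), to the same raw measurement. The numbers are hypothetical and for illustration only; the point is that the choice of formula alone shifts the reported QTc by well over 10 msec.

```python
# Illustrative sketch: two standard heart-rate corrections applied to the
# same QT measurement. Values are hypothetical, not from any study.

def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Bazett's formula: QTc = QT / sqrt(RR), with RR in seconds."""
    return qt_ms / (rr_s ** 0.5)

def qtc_fridericia(qt_ms: float, rr_s: float) -> float:
    """Fridericia's formula: QTc = QT / cbrt(RR), with RR in seconds."""
    return qt_ms / (rr_s ** (1.0 / 3.0))

# Same raw measurement: QT = 400 ms at a heart rate of 75 bpm (RR = 0.8 s)
qt, rr = 400.0, 0.8
print(round(qtc_bazett(qt, rr), 1))      # ~447.2 ms
print(round(qtc_fridericia(qt, rr), 1))  # ~430.9 ms
```

Two machines reporting these two QTc values for the identical tracing would differ by roughly 16 msec, a gap large enough to obscure or mimic a real drug effect; a central laboratory applying one formula consistently removes this source of noise.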
When a core ECG laboratory is used, the work of the CRAs and data management teams at the sponsor or CRO is diminished due to the query and data cleaning processes employed by the laboratory. In regard to data quality, every IDM is measured by a qualified person and each ECG is seen by a qualified cardiologist, thus ensuring a high degree of consistency.
Currently, there is no regulatory mandate to centralize ECGs for trials other than what is implied in the E14. However, with the advent of the ECG warehouse (which uses digitally acquired ECGs provided in XML format to allow regulators to more efficiently view and analyze the quality of ECG data submitted), regulators have been asking for most studies in all phases of research to be submitted to the warehouse.
This is not a prospective requirement at the current time in Phases I through III, but it is a request that trial sponsors are complying with and requires them to collect their ECG data in a centralized fashion for that trial. The submission of the data to the warehouse will serve to expedite regulatory review and will assist the regulatory agency in advancing their knowledge of cardiac safety issues, which could lead to improvements in the current process or to new methods of testing.
Perhaps the greatest difficulty in changing the current balance of decentralized to centralized ECG collection and interpretation is the concern over cost. Unfortunately, most clinical trial sponsors cannot quantify the number of ECGs they perform, and even more rarely are they able to estimate the true cost of decentralized versus centralized ECG collection. The general perception is that they are already paying the site to perform and read the ECGs and that the addition of a centralized ECG reading service is a further expense.
One approach sponsors can take is to decrease the ECG acquisition fee they pay, as that fee takes into account both the technician time and the use of the investigator's ECG machine. Digitally centralized ECGs require the use of centralized equipment; therefore, technically the investigator is not entitled to reimbursement related to the equipment provided.
Much of the work done by ECG core laboratories has little to do with the actual reading of the ECGs. Each core laboratory has its own method of measuring IDMs and interpreting ECGs. Clinical trial sponsors should endeavor to understand what approach is being taken before selecting a provider (see sidebar). Often overlooked in the selection process are the systems being used by the core laboratory and the procedures in place to simplify the ECG data acquisition process for the sponsor.
Homework for Sponsors
Sponsors need to assess the ECG core laboratory's ability to manage, ship, and program ECG equipment (logistics); to query data on an ongoing basis; to deliver clean data on time; and to supervise the process with a strong emphasis on project management. Although many sponsors feel that they already have these services "in house," they are certainly not free, and a good ECG core laboratory will serve to decrease the work of CRAs at clinical sites, decrease the work of the study's project manager, and decrease the work of the data management team being used by the sponsor, freeing all up for other work.
It has been a challenge for core labs to prove to sponsors that improvement in data quality and the reduction of effort on their part and the part of sites is of sufficient value to justify the expense of ECG centralization. However, at current pricing levels it is a challenge that can be met when sponsors consider all of the costs involved with decentralized ECG collection.
It is important for sponsor companies to recognize that the support each core laboratory provides may not be the same. They also need to understand what they are paying for to ensure that they are truly getting value for their investment.
1. Viskin et al., "Inaccurate Electrocardiographic Interpretation of Long QT: The Majority of Physicians Cannot Recognize a Long QT When They See One," Heart Rhythm, 2 (6) 569-574 (2005).
Jeffrey S. Litwin, MD, FACC, is executive vice president and chief medical officer at eResearchTechnology, 8th Floor, 30 S. 17th Street, Philadelphia, PA 19103, email: email@example.com. He is also a member of the Applied Clinical Trials Editorial Advisory Board.