A rundown for sponsors of the right questions to ask and facts to check before selecting a CRO.
Since clinical trials and noninterventional studies (NIS) represent major investments for pharmaceutical companies, and the choice of a CRO is a crucial success factor for any such project, it should be common practice to perform precontract audits at candidate CROs with which no cooperation has been established yet.1
In general, these audits are performed by experienced quality management (QM) employees of the sponsor, who do not always have a profound background in data management (DM), biostatistics/statistical programming (BIO), or medical writing (MW). Until now, these fields, which make up only part of the tasks to be subcontracted, have to some extent been neglected in such audits.
This article focuses on aspects to be checked with regard to data management facilities, biostatistics, and medical writing know-how of the CRO and may serve as a guide for audit activities in these fields.
A crucial question to begin with: Has the CRO developed, and kept up to date, its own standard operating procedures (SOPs) for DM, BIO, and MW workflows and the corresponding interfaces between these functions, and is it willing to provide these SOPs for review by the sponsor? The CRO should also be willing to perform the subcontracted activities according to the sponsor's own SOPs.
In order to assess capabilities and free capacity, the CRO should provide the number of clinical trials, by development phase, it is currently working on and has worked on in the last 12 months (plus noninterventional studies, if applicable).
The CRO should give concise figures to quantify which parts of their activities are subcontracted to third parties—for example, other CROs or freelancers in the fields of DM, BIO, statistical programming, and MW (see Figure 1).
Figure 1. Important questions to ask a CRO when inquiring about the extent to which it uses third parties.
The CRO should specify the number and qualifications of employees in the different functions. Curricula vitae as well as job descriptions should be available and up to date for all staff members. An important issue is the continuing training of staff, which should be documented by training certificates and/or training logs.
The sponsor should also receive an organization chart, and the CRO should provide information on staff turnover (e.g., the percentage of employees replaced) in the different functions during the last two years.
An important point is the hardware and software used in DM. The type of hardware (IBM PC-compatible, Mac, or UNIX/Solaris-based) should be stated and the architecture of servers and workstations explained. In addition, the database system (vendor, version, validation status) should be specified, and validation certificates for the database system and the corresponding structured query language (SQL) tools should be provided.2 Any additional software tools in use should also be declared.
This also applies to electronic data capture (EDC) systems if offered as an alternative to paper-based recording of data. Interactive voice response systems (IVRS) are now frequently used in randomized trials, and such systems need to be inspected with regard to validation status and corresponding certificates. In each case, it should be checked whether the latest version of each software tool is used and whether maintenance contracts with the corresponding vendors cover the duration of the project to be subcontracted.2
It is now common practice that, for each project, DM procedures are specified a priori in a DM plan (DMP). It must be defined in advance who will be responsible for preparing the DMP and who will approve it. It should be discussed with the CRO whether it can provide its own DMP template or whether the sponsor's template is to be used.
The CRO is to be audited with regard to the locking and tracking system used for case report forms (CRFs). The CRO should clearly state whether CRF tracking is done manually or with an electronic system, and whether it is performed at the investigator, patient, visit, and/or page level. A crucial question is whether cross-checks between the CRF tracking system and the database are performed on a regular basis.
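As an illustration of such a cross-check, the reconciliation can be reduced to a set comparison between the pages recorded in the tracking system and the pages actually present in the database. The data structures below (tuples of subject, visit, and page) are a hypothetical sketch, not any specific vendor system:

```python
# Hypothetical sketch: reconcile CRF tracking entries with database pages.
# Each record is identified by a (subject_id, visit, page) tuple.

def crf_database_crosscheck(tracked_pages, database_pages):
    """Return pages tracked but missing from the database, and vice versa."""
    tracked = set(tracked_pages)
    in_db = set(database_pages)
    return {
        "tracked_not_in_db": sorted(tracked - in_db),
        "in_db_not_tracked": sorted(in_db - tracked),
    }

tracked = [("001", "V1", 1), ("001", "V1", 2), ("002", "V1", 1)]
in_db = [("001", "V1", 1), ("002", "V1", 1), ("002", "V2", 1)]
result = crf_database_crosscheck(tracked, in_db)
```

Any entry in either output list represents a page that was logged but never entered, or entered without being logged, and should trigger a follow-up by the data manager.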
If an EDC system is offered, the CRO should describe the system and its experience with the design of electronic CRF (eCRF) screens. Edit checks to be implemented in the eCRF need to be described in a dedicated document, which may be part of the DMP. If such a system is to be used, a user acceptance test is mandatory, and special consideration must be given to adequate archiving of the eCRF.3
With regard to the study database, the CRO should make clear whether the database content is a one-to-one copy of the CRF.3 Electronic tracking of database changes is a must, and the CRO should clearly state whether this is implemented.
The CRO is required to have SOPs on the handling of database freezes and unfreezes, and these are to be provided for inspection. The database system should be compliant with the FDA's 21 Code of Federal Regulations Part 11,4 which is mandatory for registration studies to be filed in the United States. This leads directly to the following questions: Are audit trails available? At which step does the audit trail begin? What data are tracked (at least date, time, user name, subject ID/visit/item, old value, and new value)? And are the audit trail data available for review?
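The minimum audit trail content listed above can be sketched as an append-only log of immutable records. This is an illustrative data structure, not the schema of any particular 21 CFR Part 11-compliant system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: an audit record must never be edited
class AuditRecord:
    # Minimum fields named in the text: date/time, user, subject/visit/item,
    # old value, and new value.
    timestamp: str
    user: str
    subject_id: str
    visit: str
    item: str
    old_value: object
    new_value: object

audit_trail = []  # append-only; entries are never updated or deleted

def record_change(user, subject_id, visit, item, old_value, new_value):
    """Append one immutable entry for every change to a data point."""
    audit_trail.append(AuditRecord(
        timestamp=datetime.now(timezone.utc).isoformat(),
        user=user, subject_id=subject_id, visit=visit,
        item=item, old_value=old_value, new_value=new_value))

record_change("jdoe", "001", "V1", "sbp", "120", "125")
```

The frozen dataclass and the append-only list mirror the regulatory expectation that audit trail entries can be reviewed but not altered.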
The data entry staff involved need to undergo project-specific training. If double data entry is performed for a trial, it must be done by two different people, and any discrepancies found when comparing the two entry files are to be resolved by a third person.
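The comparison step of double data entry can be sketched as follows; the key structure (subject, visit, item) and the values are made up for illustration:

```python
def compare_entries(entry_a, entry_b):
    """Compare two independent entry files, keyed by (subject, visit, item).

    Matching values are accepted into the database; mismatches are queued
    for adjudication by a third person.
    """
    accepted, discrepancies = {}, []
    for key in sorted(set(entry_a) | set(entry_b)):
        a, b = entry_a.get(key), entry_b.get(key)
        if a == b:
            accepted[key] = a
        else:
            discrepancies.append((key, a, b))  # third-person review queue
    return accepted, discrepancies

entry_a = {("001", "V1", "hr"): "72", ("001", "V1", "sbp"): "120"}
entry_b = {("001", "V1", "hr"): "72", ("001", "V1", "sbp"): "130"}
accepted, discrepancies = compare_entries(entry_a, entry_b)
```

Only values entered identically by both operators are accepted automatically; everything else, including items missing from one file, lands in the discrepancy queue.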
Although free-text entries should be avoided in CRFs as much as possible, their handling should be discussed; that is, the procedures for first entering the text, rechecking it, and handling discrepancies (preferably by a third person) need to be addressed.
Outside of clinical trials (e.g., for NIS), single data entry may be sufficient. In that case, the CRO should specify what percentage of CRFs is cross-checked against database entries and what the maximum tolerated error rate is. The CRO should be capable of supporting data entry with plausibility checks and range checks as far as possible.4
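A range check of the kind mentioned above can be as simple as a per-item boundary lookup. The items and boundaries below are illustrative assumptions, not clinical reference ranges agreed for any real study:

```python
# Hypothetical range checks; boundaries are illustrative only.
RANGE_CHECKS = {
    "systolic_bp": (60, 250),   # mmHg
    "heart_rate": (30, 220),    # beats/min
    "age": (18, 99),            # years, per a hypothetical protocol
}

def range_check(item, value):
    """Return None if the value is plausible, else a query text for review."""
    low, high = RANGE_CHECKS[item]
    if low <= value <= high:
        return None
    return f"{item}={value} outside plausible range [{low}, {high}] - please verify"
```

Firing such checks at entry time lets the operator correct obvious transcription errors immediately instead of generating a data query weeks later.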
With regard to the data validation process, the software used should be specified. If validation is not performed with SQL-based checks implemented in the database system itself, the software package used for data validation should be specified and the corresponding interface must be validated.2 Whatever approach is used, it should be possible to confirm and flag verified but implausible data.
Coding is another critical DM process to be audited. The coding dictionaries5,6 and the versions available at the CRO should be stated; based on the sponsor's requirements, however, coding according to other dictionaries should also be possible. For manual coding steps, specifically trained employees should be available and their qualifications should be specified (see Figure 2).
Figure 2. Questions to ask a CRO on its coding capabilities.
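The split between automatic and manual coding can be illustrated as follows. The mini-dictionary is a made-up stand-in, not actual MedDRA content, and real auto-coders match on far more than exact strings:

```python
# Illustrative auto-coder: verbatim terms with an exact dictionary match are
# coded automatically; everything else is queued for a trained manual coder.
MINI_DICTIONARY = {
    "headache": "Headache",
    "nausea": "Nausea",
}

def code_terms(verbatim_terms):
    """Return auto-coded terms and the queue for manual coding."""
    coded, manual_queue = {}, []
    for term in verbatim_terms:
        key = term.strip().lower()
        if key in MINI_DICTIONARY:
            coded[term] = MINI_DICTIONARY[key]
        else:
            manual_queue.append(term)  # needs a qualified human coder
    return coded, manual_queue

coded, manual_queue = code_terms(["Headache", "pain in left knee"])
```

The size of the manual queue relative to the auto-coded portion is exactly where the qualifications of the CRO's coding staff become relevant.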
With respect to database security, access should be organized and restricted at the project level, and it should be described how access is controlled (e.g., passwords, user rights, timeouts).7 It must be documented who will have access to the database and what the individual access rights will be.
The CRO should specify how frequently back-ups of the database are made (daily, weekly, monthly, and/or quarterly) and where and for how long these back-ups are stored; for example, off-site, server-based, on a local drive, or on tape. A common procedure is to make incremental back-ups daily and a full back-up once a week.
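The common schedule just described can be expressed as a trivial rule; the choice of Sunday for the weekly full back-up is an assumption for illustration:

```python
import datetime

def backup_type(day: datetime.date) -> str:
    """Daily incremental back-ups, with a full back-up once a week.

    Sunday is assumed as the full-backup day for this sketch.
    """
    return "full" if day.weekday() == 6 else "incremental"
```

In practice this rule would live in the back-up tool's scheduler configuration rather than in application code, but the audit question is the same: which days are incremental, which are full, and where do the media go.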
The CRO should specify the number and academic education of biostatisticians and the number and qualification of statistical programmers. The experience of the biostatisticians and statistical programmers in the different indications and phases of drug development—for example, pharmacokinetics/pharmacodynamics, dose finding, Phase IV clinical trials, meta-analyses, NIS—should be made transparent during the assessment.
In clinical trials, biostatisticians are responsible for randomization.8 Sometimes this task is outsourced to a CRO, which then has to specify the vendor and version of the software used to generate randomization lists and sealed envelopes. If the software used for randomization is not part of the database system, the interface used to transfer randomization lists to the database must be validated and corresponding documentation must be available.2 The CRO's SOP on randomization also has to address related questions.
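For illustration, a common technique for generating such a list is permuted-block randomization, sketched below. The block size, arm labels, and fixed seed are assumptions chosen for the example, not recommendations for any real trial, where a validated system would be used:

```python
import random

def block_randomization_list(n_subjects, block_size=4, arms=("A", "B"), seed=12345):
    """Generate a balanced allocation sequence from shuffled permuted blocks."""
    rng = random.Random(seed)  # fixed seed makes the list reproducible
    per_arm = block_size // len(arms)
    sequence = []
    while len(sequence) < n_subjects:
        block = list(arms) * per_arm  # each block is balanced across arms
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n_subjects]
```

Because every complete block contains each arm equally often, the allocation stays close to balanced throughout recruitment, and the fixed seed means the documented list can be regenerated for inspection.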
Biostatistical analyses and the assignment of patients to analysis populations are to be defined in detail, before unblinding, in a statistical analysis plan (SAP). It should be discussed with the CRO whether it can provide its own SAP template or whether the sponsor's template is to be used. It must be defined in advance who will be responsible for the preparation and content of the SAP and the table shells, and who will approve them.
Vendors and versions of the statistical software packages licensed by the CRO should be named. Applications depending on these software packages, such as standardized macros and customized interfaces for data import from database and/or for exchange between different software packages, are to be specified. Validation of these applications is mandatory, and corresponding documentation should be provided.2,6
It is to be discussed with the CRO which parts of the analyses are validated by output checks, source code checks or independent double statistical programming.9,10 Responsibilities for and documentation of these validation steps should be discussed in advance.
The CRO should make transparent how many of the medical writers are native speakers and what their individual experience is in different indications and with integrated clinical study reports (iCSR),11 drug dossiers for health authorities, and preparation of publication manuscripts.12,13
An important question to ask is whether the CRO is able and willing to prepare an iCSR according to the sponsor's template, corresponding instruction documents, and style guides.
Responsibilities and corresponding documentation regarding internal reviews of the iCSR within the CRO should be made transparent.7
Based on the described assessments, the sponsor's management should select the CROs able to perform the project; these CROs are then invited to submit bids.
Unfortunately, this selection cannot be based on a simple scoring system, since the individual requirements of the project itself (such as the type of trial or noninterventional study, timelines, activities to be subcontracted, and budget) come into play and need to be weighed against each CRO's capabilities. These include the experience of its staff with the indication/substance, the availability of resources, its size, and its financial background.
To enable a detailed comparison of cost estimates, the structure of the bids needs to be specified in detail in advance. A spreadsheet with the individual items to be subcontracted, structured under predefined headings, may be provided to the CRO.
When the choice is finally made and a contract covering all aspects of the cooperation is signed, it is advisable to start documenting the progress of the project and the CRO's adherence to the agreed quality of the subcontracted activities and the corresponding timelines. We suggest that CRO performance be a regular agenda item at study team meetings.
Progress of the activities subcontracted to the CRO, and any problems in the fields of data management, biostatistics, medical writing, and quality management, are to be reported by study team members and documented in the study team meeting minutes. This includes actions to be taken by the responsible persons on the sponsor and CRO side, together with timelines for the resolution of problems. If deemed necessary during the cooperation, special audits, performed by the sponsor's QA staff, should be initiated by the study team in addition to the regular audits.
When the CRO has completed all tasks subcontracted, the study team should make a comprehensive assessment of the cooperation. For this purpose, all study meeting minutes, audits, and visit reports should be reviewed. A structured summary of all problems in the fields of data management, biostatistics, medical writing, and quality management and their resolutions should be prepared using a special template to facilitate comparison of different CROs.
The summary of the CRO's strengths and weaknesses in these different fields is to be provided to the sponsor's management for review. CRO assessments should be filed with quality management (and purchasing) and should form a growing database for the future analysis and selection of CROs for bids and, eventually, preferred partnerships.
This data acquisition process has been successfully implemented at Sanofi-Aventis in Germany. A first exploratory analysis of the predictive power of precontract assessments with regard to the final assessments—taking into account the project's requirements—is planned to be performed in about three years, since average runtime of projects is around two years.
Stefan Schinzel,* MSc, is manager of biostatistics and epidemiology, email: Stefan.Schinzel@Sanofi-Aventis.com, Ferdinand Hundt, MD, PhD, FFPM, is director of clinical operations/medical and public affairs, Karlheinz Theobald, MSc, is manager of biostatistics and epidemiology, Friedbert Theis is head of quality management, Andrea Buchmann is manager of quality management, and Marlis Herbold is head of biostatistics and epidemiology, all at Sanofi-Aventis Deutschland GmbH, Clinical Operations/Medical Public Affairs, Industriepark Hoechst, Bldg K703, D-65926, Frankfurt am Main, Germany.
* To whom all correspondence should be addressed.
1. D. Chase and J.H. Schmidt, "Audit 2000 III: Sponsor Audits at Contract Research Organizations (CRO)—Description of the Point of View of the CRO, Part 1 and Part 2," Pharm Ind, 62 (9) 662-665 (2000), and Pharm Ind, 62 (10) 744-750 (2000).
2. General Principles of Software Validation; Final Guidance for Industry and FDA Staff, January 11, 2002, http://www.fda.gov/cdrh/comp/guidance/938.html.
3. Y. Noda, F. Alaya, H. Watanabe, T. Suzuki, K. Suzuki, N. Nishimura, H. Miida, M. Takezewa, "Resolution of Issues in Clinical Trials that Sponsors Entrusted to Contract Research Organizations In and Outside Japan," Quality Assurance Journal, 8, 77-86 (2004).
4. Food and Drug Administration, "21 CFR Part 11, Electronic Records; Electronic Signatures; Final Rule," Federal Register, 62 (54) 13429 (20 March 1997).
5. The Medical Dictionary for Regulatory Activities (MedDRA) terminology is the international medical terminology developed under the auspices of the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). MedDRA is a registered trademark of the International Federation of Pharmaceutical Manufacturers and Associations (IFPMA).
6. J. Kübler, R. Vonk, S. Beimel, W. Gunselmann, M. Hommering, D. Nehrdich, J. Köster, K. Theobald, P. Voleske, "Adverse Event Analysis and MedDRA: Business as Usual or Challenge?" Drug Information Journal, 39, 63-72 (2005).
7. Department of Health and Human Services and the Food and Drug Administration, Guidance for Industry, Computerized Systems used in Clinical Trials (April 1999).
8. European Medicines Agency, ICH Topic E 9, Note for Guidance on Statistical Principles for Clinical Trials (CPMP/ICH/363/96), September 1998.
9. K.C. Benze, Risk-Based Approach to SAS Program Validation, Paper FC04 presented at PharmaSUG, Phoenix, Arizona, 2005.
10. N. Tsokanas, Validation of Programs Developed Using SAS, Paper IS02 presented at PhUSE, Lisbon, Portugal, 2007.
11. European Medicines Agency, ICH Topic E 3, Note for Guidance on Structure and Content of Clinical Study Reports (CPMP/ICH/137/95), July 1996.
12. D.G. Altman, K.F. Schulz, D. Moher, M. Egger, F. Davidoff, D. Elbourne, P.C. Gotzsche, T. Lang, "The Revised CONSORT Statement for Reporting Randomized Trials: Explanation and Elaboration," Annals of Internal Medicine, 134 (8) 663-694 (2001).
13. E. von Elm, D.G. Altman, M. Egger, S.J. Pocock, P.C. Gotzsche, J.P. Vandenbroucke, "The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: Guidelines for Reporting Observational Studies," Epidemiology, 18 (6) 800-804 (November 2007).