The Turning Point for Clinical Research: Global Data Standardization

Applied Clinical Trials

Clinical research is only as effective as its ability to have an impact on health. This impact comes when researchers find breakthroughs, discover new diagnostics or treatments, and identify critical pathways that lead to curing diseases. To maximize their utility, clinical research data should be traceable, accessible, interoperable, reproducible, and of good quality, allowing study findings to be communicated and shared in a clear and understandable way.1 Unfortunately, clinical research data today are often collected in a variety of formats, making it difficult to effectively share and compare the data under the terms allowed by study participants’ consent. This disconnect creates an evidence gap that slows scientific advances, which can result in ineffective and even harmful treatments and diagnostics continuing to be employed in clinical practice.2

Standardization of clinical research data will be a topic explored in depth at the 2019 Bridging Clinical Research & Clinical Health Care Collaborative on March 4-5 in Washington, D.C.

A significant issue that arises when working with research data is the inability to validate and reproduce findings to demonstrate that an experimental result is in fact true. A survey of over 1,500 researchers conducted by Nature in 2016 found that more than 70% of researchers have tried and failed to reproduce another scientist’s experiments, and more than half have failed to reproduce their own experiments.3 This effect is commonly caused by divergence from the protocol and the inability to retrace steps in the process.4 The landmark 2005 article by Ioannidis, “Why Most Published Research Findings Are False,” states: “The greater the flexibility in designs, definitions, outcomes, and analytical modes in a scientific field, the less likely the research findings are to be true.”5

While irreproducibility of research results in the field of genetics is encouraging greater transparency in methods and materials, along with the analytic code that underlies the conclusions, this does not appear to be the case for clinical trials. There are also efforts to leverage ‘big data’, which may provide information on trends, signals, or hypotheses to be tested further, but which generally do not provide results adequate to support regulatory submissions. Regulated clinical research has become increasingly global, particularly for areas such as rare diseases, for which a small population of patients is spread throughout the world. Efforts to streamline regulatory submissions for new product approvals have encouraged the development, largely through the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH),6 of a standardized and harmonized structure for these submissions, the electronic Common Technical Document (eCTD). Such standards are useful not only for sponsors who wish to submit in multiple regions simultaneously, but also for regulators seeking to facilitate reviews. ICH has also provided guidelines for global research on protocols, terminologies, and statistical analyses.

Currently, an estimated 85% of research studies do not translate to a meaningful clinical discovery.7 The causes of this low level of translation of promising research into meaningful insights and interventions for human health are multiple. One of many examples is the discovery of the relationship between infant sleeping position and Sudden Infant Death Syndrome (SIDS). Had it been possible to aggregate and systematically analyze all the evidence available by the year 1970, over 60,000 infant deaths worldwide could have been prevented.8 Differences in protocols among studies, small sample sizes, limited numbers of patients and families involved per study, and differences in how SIDS cases were compared with unaffected infants were among the factors that may have contributed to the delayed recognition of placing infants on their backs to sleep as a protective factor against SIDS. This is one of many cases where critical health findings were present, but hidden in the data.

Regulatory validation of clinical trial findings involves stringent requirements to ensure that regulators can adequately evaluate the safety and efficacy of the medicinal product. Within the flexibilities afforded by the US Federal Food, Drug, and Cosmetic Act, at least two adequate and well-controlled studies, each convincing on its own, are generally needed to establish effectiveness; a similar recommendation has been given by the European Medicines Agency (EMA).9,10,11 Review of trial data includes the “validation” needed to establish that the results have clinical meaning and that the findings are not due to chance alone. Furthermore, the need to provide adequate directions for the use of a drug in relevant subgroups requires an assessment of aggregated data from multiple trials. This regulatory review is facilitated by the use of standards for protocol information, outcome definitions, data terminology, and formats. Adoption of common standards in research becomes pertinent to the regulatory process as data from early discovery are translated into clinical benefit (e.g., biomarker discovery, mechanistic studies). The terminology standards used in regulatory submissions and healthcare (see below) can be similarly adopted in clinical research trials to facilitate this seamless integration of data.12,13

How to ensure meaningful exchange of information

Interoperability is “the ability of different information technology systems and software applications to communicate, exchange data, and use the information that has been exchanged.”14 ‘Semantic interoperability’ refers not only to the exchange of information, but also the exchange of meaning such that the recipient of the information can readily understand and interpret the information accurately in the manner intended by the data generator and/or sender.

Recently, FAIR has been cited as an acronym for four desiderata that a data publishing environment should provide for both machines and humans in order to support appropriate data sharing.15 These FAIR "Facets" are:

· Data should be Findable

· Data should be Accessible

· Data should be Interoperable

· Data should be Re-usable

One key to ensuring semantic interoperability and adherence to the FAIR principles or facets is for parties to use the same data standards and terminologies or ontologies. Clearly, the more parties who agree on the data standards and terminologies, the better. This is the rationale behind consensus-building for a robust standards development process.

To maximize the real-world impact of any research study, the data must be collected and analyzed in a common format. Standardization helps build efficient and interoperable research data networks capable of producing high-quality, more reliable data that can support healthcare decisions, detect safety and other signals, and generate new hypotheses and new knowledge. It also streamlines research activities by allowing data to be accrued more efficiently, and makes it possible to consolidate digital data from different sources to support further research and healthcare decisions.16

Data standards allow research teams to explicitly name and define the different elements and aspects of their studies. By using standard terms, researchers can precisely describe, manage, and share their data, allowing external research teams to understand what the researchers did, how they did it, how to interpret the results, and how to accurately reproduce those results in future studies. Standard terms also let researchers perform queries across diverse datasets, allowing data from different research studies to be consolidated into larger datasets for analysis. In addition to supporting collaboration among researchers, standardization ultimately leads to more organized evidence, which can be better understood by audiences with limited scientific literacy. This organization increases the ability of researchers and lay people alike to comprehend and share important findings.
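
The hypothetical sketch below (Python with pandas; all study identifiers and values are invented) illustrates this point: when two studies tabulate laboratory results with the same CDISC-style variable names (USUBJID, LBTESTCD, LBORRES, LBORRESU) and the same controlled test code, pooling and querying across them is a one-step operation, whereas data captured under study-specific column names and local test labels would first require a manual mapping exercise.

```python
import pandas as pd

# Hypothetical, simplified lab data from two separate studies, both tabulated
# with the same standard variable names and the controlled test code "GLUC".
study_a = pd.DataFrame({
    "STUDYID": ["A", "A", "A"],
    "USUBJID": ["A-001", "A-002", "A-003"],
    "LBTESTCD": ["GLUC", "GLUC", "GLUC"],
    "LBORRES": [5.4, 6.1, 7.2],            # invented results
    "LBORRESU": ["mmol/L", "mmol/L", "mmol/L"],
})
study_b = pd.DataFrame({
    "STUDYID": ["B", "B", "B"],
    "USUBJID": ["B-101", "B-102", "B-103"],
    "LBTESTCD": ["GLUC", "GLUC", "GLUC"],
    "LBORRES": [4.9, 8.0, 5.7],            # invented results
    "LBORRESU": ["mmol/L", "mmol/L", "mmol/L"],
})

# Because names, codes, and units agree, pooling is a single concatenation
# and a cross-study query needs no per-study translation logic.
pooled = pd.concat([study_a, study_b], ignore_index=True)
elevated = pooled[(pooled["LBTESTCD"] == "GLUC") & (pooled["LBORRES"] > 7.0)]
print(elevated[["STUDYID", "USUBJID", "LBORRES", "LBORRESU"]])
```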

There are several clinical research standards in use globally today, covering the different stages of clinical research. These include those from the Clinical Data Interchange Standards Consortium (CDISC) for clinical and translational research,17 Controlled Terminology published through the U.S. National Cancer Institute’s (NCI) Enterprise Vocabulary Services, MedDRA (Medical Dictionary for Regulatory Activities) for medical history in clinical trials and for adverse event reporting,18 Health Level Seven (HL7) for structured product labels and ECG waveforms, the International Organization for Standardization (ISO) for Identification of Medicinal Products (IDMP), LOINC (Logical Observation Identifiers Names and Codes) for clinical laboratory tests and observations,19 and ICH, as previously mentioned. There are also standards that exemplify collaboration among standards development organizations (SDOs) and other organizations. For example, the Biomedical Research Integrated Domain Group (BRIDG) Model is a joint CDISC, HL7, and ISO standard with NCI and FDA as key stakeholders.20

Over the past two decades, CDISC, a global, non-profit organization that develops data standards through a volunteer-driven, consensus-based process, has developed a global, open-access suite of clinical and translational research data standards. These standards support the entire research lifecycle (including pre-clinical research), from structured protocol information through data collection, exchange, tabulation, analysis, and reporting.21 Standards specific to certain therapeutic areas (TAs) have been developed collaboratively through the Coalition for Accelerating Standards and Therapies (CFAST), which has included the Critical Path Institute, CDISC, the U.S. Food and Drug Administration (FDA), NCI, and TransCelerate, along with medical experts and patient groups working in these therapeutic areas. Regulators from Europe and Japan have also contributed to the development of these TA standards. The TA User Guides specify how to use these standards to structure the data for research on a given disease or treatment, broadening the circle of collaboration with patient representative groups, research investigators, and public-private partnerships. FDA has published specifications for these TAs in its Study Data Technical Conformance Guide.22 Working with data in a common format with controlled terminology makes it easier, faster, and more efficient for pharmaceutical companies, clinical research organizations, academic organizations, regulators, and other government entities to collaborate on projects.23 These standards are utilized for both regulated and some non-regulated trials, including interventional and observational studies, nutrition, public health, epidemiology, medical device, and outcomes research. They have even been applied to data from studies on healthy birth, growth, and development.

Data from pharmaceutical, academic, public health, and healthcare enterprises vary in their level of standardization. This interdependent research continuum highlights the need for standards that translate across the evidence divide.7 Implementing standards from the protocol through the analysis stages can enhance the quality and efficiency of clinical research processes and facilitate traceability, particularly when the standards are implemented from the start. Many research teams have made impactful discoveries by applying data standards in later stages of the research process, but not without significant data transformation effort at the end of the process. For instance, a research team recently conducted a meta-analysis of chemotherapy in head and neck cancer (MACH-NC) by contacting the authors of several published studies and requesting individual patient data. Analyzing the combined data, which included patient and tumor characteristics, dates of failure and death, treatment details, and toxicities, the researchers demonstrated the superiority of concurrent chemotherapy in the treatment of certain cancers, validating the results of the published studies.24 Their work could have been simplified and enhanced substantially had the different datasets been standardized from the beginning of each individual study.

Standardization allows a significantly faster and less costly avenue for generating evidence and performing robust analyses by providing the data and processes employed in a common, predictable, and explicit format. A recent research project exploited open-access clinical trial data standardized using CDISC to answer important questions in prostate cancer, saving time and reducing the costs of the initiative.25 Data standards also offer great potential for semi-automation of the evidence generation process26 and for saving substantial human resources and time in the start-up of a clinical trial.21 If data collection standards are employed from the beginning, study start-up times can be reduced by 70% to 90%, since standard case report forms, edit checks, and validation documentation already exist and many can be re-used from study to study. Study teams can then focus on protocol-specific additions to the standards, which results in cost savings, faster delivery of results, and higher-quality data.27
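
As a hedged illustration of this reuse, the sketch below (Python; the variable names, records, and rules are invented for this example) shows how a small library of edit checks written once against standard variable names can be applied unchanged to the data of any study collected in that structure, so only protocol-specific checks need to be written anew.

```python
from datetime import date

# A hypothetical library of reusable edit checks keyed to standard variable names.
def check_required(record, fields=("USUBJID", "BRTHDTC", "SEX")):
    """Flag required fields that are missing or empty."""
    return [f"missing {field}" for field in fields if not record.get(field)]

def check_birthdate(record, today=date(2019, 3, 4)):
    """Flag birth dates recorded in the future (an impossible value)."""
    raw = record.get("BRTHDTC")
    if raw and date.fromisoformat(raw) > today:
        return [f"birth date {raw} is in the future"]
    return []

EDIT_CHECKS = [check_required, check_birthdate]

def validate(records):
    """Run every reusable check against every collected record."""
    return {record.get("USUBJID", "?"): sum((check(record) for check in EDIT_CHECKS), [])
            for record in records}

# Invented records from a new study that reuses the same checks unchanged.
records = [
    {"USUBJID": "X-001", "BRTHDTC": "1980-05-17", "SEX": "F"},
    {"USUBJID": "X-002", "BRTHDTC": "2045-01-01", "SEX": ""},  # two findings expected
]
print(validate(records))
```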

Data standards also facilitate community engagement, data sharing, and transparency. An open-data, crowdsourced project from Project Data Sphere identified predictors of survival in castration-resistant metastatic prostate cancer through prognostic models built by 50 independent teams using CDISC-standardized data from the comparator arms of four Phase III clinical trials.25 These teams developed a comprehensive set of benchmarked models that uncovered key prognostic variables and novel interactions among them. All predictions and code from this initiative are available for public use, increasing transparency and facilitating collaboration. Project Data Sphere participants noted that the data provided in a known standard format were easier to interpret and more useful than those submitted in proprietary formats.

Responses to epidemics and global public health emergencies, such as the Ebola and Zika outbreaks, benefit significantly from standards, which help ensure that decisions are based on the best available evidence. The earlier treatments can be evaluated, the faster outbreaks can be contained. In 2015, the World Health Organization (WHO) conducted a consultation on research data sharing during public health emergencies. A background briefing for this exercise mentioned multiple opportunities for improvement with regard to data sharing, including the “need to build databases where all data are entered in a uniform way, which can be populated when outbreaks occur and are available worldwide.”28 This solution requires that data standards be available prior to outbreaks. WHO convened a diverse group of stakeholders to discuss the development of global norms and standards for more rapid and transparent data-sharing during public health emergencies.29 Common research data standards have now been collaboratively developed for Ebola,30 malaria,31 and influenza,32 all of which can be leveraged for responding to new outbreaks.33,34

 

Improving drug regulation

Data standards in regulatory submissions supporting new product applications have enabled efficient review through automated validation of data quality. A suite of tools and services for clinical and nonclinical standardized data supports high-level analysis early in the review process.35,36 Transparency of the regulatory review process is enhanced through engagement in the standards development process and the availability of publicly shared standard analysis scripts.37,38 The incorporation of patient-reported outcome measures along with the TA standards could draw an even broader set of stakeholders into the process. These standards are freely available and could be adopted to enable the same transformation across all of clinical research. Downstream standards development efforts built on standardized data include harmonized research protocol templates and outcomes adapted for therapeutic areas. These efforts bring us closer to the possibility of even greater efficiency through master protocols for use in clinical trial networks. FDA and ICH developed a common protocol template concurrently with a similar effort by TransCelerate; these templates have now been harmonized and published as one.39 They are now being ‘technology-enabled’ based upon protocol standards developed previously and incorporated into the BRIDG model. This common protocol template has already proven useful in a) ensuring that the endpoints to be collected are aligned with protocol objectives; and b) allowing information from the protocol to be re-used across multiple downstream documents such as the statistical analysis plan, the clinical study report, and the product label. These efforts have now led to a new protocol project with ICH.

Exchange of ‘computable biomedical knowledge’ (CBK) is also being studied in academia as a way of providing research results back to practice, the final portion of a learning health system (LHS) cycle.40 The Learning Health Community41 has an initiative called Essential Standards to Enable Learning (ESTEL),42 which has published a white paper regarding a framework for LHS standards. These LHS-related efforts do not encourage the development of new standards, but rather leverage those that already exist and build upon them. The NIH has also recently invested in a Center for Data to Health (CD2H), with adoption of standards across NIH Clinical and Translational Science Award (CTSA) hubs as one goal.43 Another area ripe for standards adoption is electronic health records, which will be better leveraged for research purposes when their data can readily be shared in a standard format. FDA has recently issued guidance in this regard.44

For research studies intended for regulatory review, concerted efforts have been made to create global guidelines and standards for developing new therapies. ICH developed guidelines for good clinical practice and formats for new product submissions to regulators for review in Europe, the United States, and Japan. One key data standard output of ICH was MedDRA, a rich and highly specific standardized medical terminology created to facilitate the international sharing of regulatory information for medical products used by humans. Global data standards for regulated clinical research were collaboratively developed to complement the ICH work; for example, the Clinical Trial Registry (CTR) standard,45 which can be used to register clinical trials in the NIH/NLM ClinicalTrials.gov registry, the WHO International Clinical Trials Registry Platform (ICTRP),46 and EMA’s EudraCT.47 The European Innovative Medicines Initiative (IMI) also encourages the use of standards in the research studies it funds by offering a ‘Standards Starter Pack’ as a reference.36

Improving policymaking through research standards

Governmental authorities, international public health sponsors and advocates, biomedical research consortia, professional medical societies, and advisory committees charged with recommending ways to improve the efficacy and safety of medicines and other health technologies have promoted data sharing as a way to improve research. At the time of writing, the NIH is drafting guidelines to foster the development of scientific evidence with explicit, transparent, and consistently reported methods, allowing: 1) decisions to be traced to the underlying evidence; 2) additional analyses of the dataset that may be required for decision-making; 3) new knowledge and insights to be gained through the analysis of pooled data; and 4) routine updating of systematic reviews across studies as new evidence becomes available.48 The United States’ 21st Century Cures Act49 encourages FDA to develop ways to leverage real-world data (e.g., from EHRs and mobile devices) to augment clinical trial data, and it specifically references CDISC as a standards-setting body. The Patient-Centered Outcomes Research Institute (PCORI)50 has funded, through its Trust Fund, a cross-agency project led by FDA to facilitate the use of real-world data through the harmonization of the common data models (CDMs) that have been adopted by various research networks, including PCORnet, OHDSI/OMOP, and Sentinel. The Cures Act did not, however, mandate the use of standards for federally funded academic clinical trials.

More generally, funding agencies have also established data sharing policies, though few require the use of data standards over the course of conducting the funded research. While trials that meet the criteria for submission to electronic clinical trial registries require some degree of standardization of protocol descriptions or adverse events, aggregation and secondary use of full datasets are inhibited by the absence of a requirement that funded researchers utilize standards. As long as federal funding agencies do not have mandates or guidelines for standards similar to those of regulatory agencies, sharing of data between or among agencies is hindered. Some funding agencies have taken another approach: standardizing data from researchers into common structures and semantics. The U.S. National Institute of Allergy and Infectious Diseases (NIAID) has created a data warehouse that utilizes CDISC’s data collection and aggregation standards to model and standardize its funded clinical trial data from diverse sources;51,52 NIAID is also funding the development of a TA standard and implementing CDISC standards for global research studies. Similarly, NIAID’s ImmPort database,53 which aggregates information from diverse translational and clinical immunology studies, uses CDISC to structure data extracts to support secondary use.54 These platforms maximize the NIAID investment in research by providing sources of data that share common meaning. Their data can be readily utilized for meta-analyses with similar regulated trials, as FDA requires the use of CDISC standards for submissions, but adoption and use of a common standard within academic federal funding agencies’ systems is not yet common globally. Thus, policymakers have the opportunity to multiply the value of federally funded and regulated trials not only by making provision for data sharing, but also by requiring global clinical research standards.
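
The hypothetical fragment below (Python with pandas; all column names, codes, and values are invented) sketches the kind of source-to-standard transformation such a warehouse performs: study-specific field names are renamed to shared variables and local values are recoded to a common terminology before loading, after which every funded study can be queried with one vocabulary.

```python
import pandas as pd

# Invented export from one funded study, using its own column names and local codes.
source = pd.DataFrame({
    "subj": ["S1", "S2"],
    "gender": ["Female", "Male"],
    "glucose_mg_dl": [99, 187],
})

# Per-study mapping to hypothetical standard names and controlled terms.
COLUMN_MAP = {"subj": "USUBJID", "gender": "SEX", "glucose_mg_dl": "GLUC_MGDL"}
SEX_CODES = {"Female": "F", "Male": "M"}

standardized = source.rename(columns=COLUMN_MAP)
standardized["SEX"] = standardized["SEX"].map(SEX_CODES)

# Once every source has been transformed this way, the warehouse can pool and
# query studies from diverse sources using a single, shared vocabulary.
print(standardized)
```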

Contribute to research data standardization efforts

Getting from where we currently operate to a place where standardized research data around the world can truly talk to each other is a great challenge and an immense opportunity. We have a collective responsibility to contribute to this effort, and global stakeholders have different roles to play. Researchers and sponsors alike should become aware that the initial training and time required to implement data standards are more than worth the effort, since standards simplify the regulatory submission process while enabling the data to be repurposed within and outside their research teams. Furthermore, regulatory agencies could continue increasing the amount of information made available, publicly or via controlled access, from regulatory submissions, following the example of EMA, to allow examination by different parties and enable the wider scientific community to conduct research and answer more questions using the increasingly available data. Coupled with the use of standardized data, this openness should eventually lead to higher-quality submissions and regulatory reviews.55

National and international health policymakers have the responsibility to demand a broader evidence base to support their decisions and recommendations, as well as a more rigorous approach to the evidence synthesis presented to them or developed by their teams. As FDA and Japan’s Pharmaceuticals and Medical Devices Agency (PMDA) have done, national entities, such as the 27 different institutes and centers that comprise the NIH in the United States, should avoid unnecessary duplication of effort and coordinate around existing robust standards that are maintained by global standards development organizations. There are several examples of global standards used within NIH. The National Human Genome Research Institute (NHGRI) relies heavily on international standards to annotate genetic and phenomic data; without standards such as the Gene Ontology (GO) and the Human Phenotype Ontology (HPO), scientists would not be able to directly compare scientific results. Furthermore, as new discoveries are made, these same scientists contribute back to the ontologies to maintain the standards. Another example of NIH involvement with standards bodies is the Genetic and Rare Diseases (GARD) Information Center, which relies heavily on SNOMED, ICD, and Orphanet to find and share resources.

National policymakers should form teams of technical experts to evaluate the best avenues for implementing data standards, adopting and encouraging the use of existing international standards whenever possible, to pave the way for global data exchange. International policymakers, in turn, should promote the adoption of global data standards as a means of accelerating and enhancing collaborations among international partners for greater global impact of research. International policymakers are also responsible for providing technical support to countries in the progressive implementation of research data standards, so countries can make more informed national decisions and contribute to the global pool of standardized data. Entities that are part of the healthcare system should continue efforts to bridge the gap between clinical practice and research while implementing data standardization as well.

Imagine a world in which research data can be shared and aggregated seamlessly such that the power of that data can be maximized to accelerate collaborative learning and streamline the path to new therapies. We have an ethical imperative to adopt and leverage robust global data standards that will improve the way research is conducted to benefit all patients.

 

Authors:

Barbara Jauregui,* is an International Consultant, Pan American Health Organization/World Health Organization; Lynn D. Hudson* is Chief Science Officer, Critical Path Institute (C-Path); Lauren B. Becnel is Senior Director RWDnA & Data Strategy-Oncology Client Partner, Pfizer; Eileen Navarro Almario is Associate Director for Clinical Affairs in the Office of Computational Science, OTS, CDER, FDA; Ronald Fitzmartin is Data Standards Staff, Office of the Director, CBER, FDA; Frank Pétavy is Head of Biostatistics and Methodology Support, EMA; Nathalie Seigneuret is Senior Scientific Project Manager, Innovative Medicines Initiative; James K. Malone is Senior Medical Director, Eli Lilly and Company; Fang Liz Zhou is Director, Global Medical Evidence Generation, Sanofi-Aventis; Jose Galvez is Chief, Office of Biomedical Translational Research Informatics (BTRIS), NIH; Tammy Jackson is Senior Director, Clinical Innovation, PPD Inc.; Nicole Harmon is Chief of Staff, Clinical Data Interchange Standards Consortium (CDISC);  and Rebecca D. Kush is President, Catalysis Research; Scientific Innovation Officer, Elligo Health Research, and Fellow, Translational Research Center for Medical Innovation, Foundation for Biomedical Research and Innovation, Kobe, Japan.

Authors include representatives of PAHO/WHO, Critical Path Institute, FDA (CBER and CDER), EMA, NIH, IMI, ACRO, CDISC, Lilly (TransCelerate), Sanofi (DataSphere), Pfizer and Elligo Health Research.

*LDH and BJ contributed equally to this article.

Funding Information: No funding of any kind was received for the development and publication of this manuscript. This collaboration was funded in part by 1U01FD005855 awarded to the Critical Path Institute to develop therapeutic area data standards through the Coalition for Accelerating Standards and Therapies (CFAST).  This manuscript was supported in part by the Intramural Research Program of the NIH.

Conflicts of Interest: JKM holds stock in Eli Lilly and Company. FLZ is employed by and holds stock in Sanofi-Aventis. LBB is employed by Pfizer. All other authors have no conflicts of interest to declare.

Acknowledgments: Authors would also like to thank Enrica Alteri and Fergus Sweeney from EMA, and Paul Houston for their suggestions to improve the draft manuscript. Last but not least, the authors would like to acknowledge the thousands of individuals around the world who contribute to developing consensus-based standards for research and to enhance the connections between research and healthcare.

Disclaimer: The views expressed in this article are the personal views of the authors and may not be understood or quoted as being made on behalf of or reflecting the position of the US Food and Drug Administration, the position of the National Institutes of Health, the position of the Pan American Health Organization/World Health Organization, the position of the European Medicines Agency or one of its committees or working parties, or the position of the Innovative Medicines Initiative (IMI) nor the European Union, EFPIA, or any Associated Partners.

 

References

1. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016 Mar 15;3:160018.

2. Kush R, Goldman M. Fostering responsible data sharing through standards. N Engl J Med. 2014 Jun 05;370(23):2163-5.

3. Baker M. 1,500 scientists lift the lid on reproducibility. Nature. 2016.

4. Weissgerber TL, Garovic VD, Winham SJ, Milic NM, Prager EM. Transparent reporting for reproducible science. J Neurosci Res. [Editorial]. 2016 Oct;94(10):859-64.

5. Ioannidis JP. Why most published research findings are false. PLoS Med. 2005 Aug;2(8):e124.

6. International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). Homepage.   [28 November 2018]; Available from: https://www.ich.org/home.html.

7. Ioannidis JP. How to make more published research true. PLoS Med. [Research Support, Non-U.S. Gov't]. 2014 Oct;11(10):e1001747.

8. Gilbert R, Salanti G, Harden M, See S. Infant sleeping position and the sudden infant death syndrome: systematic review of observational studies and historical review of recommendations from 1940 to 2002. Int J Epidemiol. [Meta-Analysis Review]. 2005 Aug;34(4):874-87.

9. U.S. Federal Government. Drug Amendments Act of 1962, Public Law 87-781, 76 Stat. 780. 1962.

10. U.S. Department of Health and Human Services, Food and Drug Administration. Guidance for Industry: Providing Clinical Evidence of Effectiveness for Human Drug and Biological Products. 1998. Available from: https://www.fda.gov/ucm/groups/fdagov-public/@fdagov-drugs-gen/documents/document/ucm072008.pdf.

11. The European Agency for the Evaluation of Medicinal Products, Committee for Proprietary Medicinal Products (CPMP). Points to Consider on Application with 1. Meta-analyses; 2. One Pivotal Study. London; 2001.

12. National Cancer Institute. Common Terminology Criteria for Adverse Events v4.03. 2010. Available from: https://evs.nci.nih.gov/ftp1/CTCAE/CTCAE_4.03_2010-06-14_QuickReference_5x7.pdf.

13. Shamsuzzaman M, Patel T, Navarro Almario E, Wu C, Tesfaldet B, Fleg J, Csako G, et al. Abstract 18043: Identifying Predictors for All-Cause Mortality in Diabetic Patients in the ACCORD Trial Using Random Survival Forests. Circulation. 2017;136(Suppl 1).

14. HIMSS. Definition of Interoperability.   [28 November 2018]; Available from: https://www.himss.org/sites/himssorg/files/FileDownloads/HIMSS%20Interoperability%20Definition%20FINAL.pdf.

15. FORCE11. Guiding Principles for Findable, Accessible, Interoperable and Re-usable Data Publishing, Version B1.0. [28 November 2018]; Available from: https://www.force11.org/fairprinciples.

16. Califf RM, Robb MA, Bindman AB, Briggs JP, Collins FS, Conway PH, et al. Transforming Evidence Generation to Support Health and Health Care Decisions. N Engl J Med. 2016 Dec 15;375(24):2395-400.

17. CDISC. Clinical Data Interchange Standards Consortium (CDISC) Website Homepage.  2017 [December 1, 2017]; Available from: https://www.cdisc.org/.

18. Brown EG, Wood L, Wood S. The medical dictionary for regulatory activities (MedDRA). Drug Saf. 1999 Feb;20(2):109-17.

19. Regenstrief Institute. Logical Observation Identifiers Names and Codes (LOINC) Website Homepage. 2017 [December 1, 2017]; Available from: https://loinc.org.

20. BRIDG. Biomedical Research Integrated Domain Group (BRIDG) Model [28 November 2018]; Available from: https://bridgmodel.nci.nih.gov.

21. Rozwell C, Kush R, Helton E. CDISC Standards: Enabling Reuse without Rework. Applied Clinical Trials. 2006.

22. Center for Biologics Evaluation and Research (CBER). Study Data Technical Conformance Guide. [28 November 2018]; Available from: https://www.fda.gov/downloads/forindustry/datastandards/studydatastandards/ucm384744.pdf.

23. Rozwell C, Kush R, Helton E. Saving time and money. Applied Clinical Trials. 2007;16(6):70-4.

24. Eisbruch A. Meta-analyses and Systematic Reviews: Can They Solve Contentious Clinical Questions? Cancer J. 2017 Mar/Apr;23(2):84-5.

25. Guinney J, Wang T, Laajala TD, Winner KK, Bare JC, Neto EC, et al. Prediction of overall survival for patients with metastatic castration-resistant prostate cancer: development of a prognostic model through a crowdsourced challenge with open clinical trial data. Lancet Oncol. 2017 Jan;18(1):132-42.

26. Bastian H, Glasziou P, Chalmers I. Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med. 2010 Sep 21;7(9):e1000326.

27. CDISC. 2014 Business Case for Use of Standards. 2014. Available from: https://www.cdisc.org/2014-business-case-use-cdisc-standards.

28. Goldacre B HS, Mahtani KR, Heneghan C. WHO Consultation on Data and Results Sharing During Public Health Emergencies. Oxford: Centre for Evidence-Based Medicine; 2015.

29. World Health Organization. An R&D Blueprint for Action to Prevent Epidemics. Geneva: World Health Organization; 2016.

30. CDISC. CDISC Ebola Therapeutic Area User Guide. 2016. Available from: https://www.cdisc.org/standards/therapeutic-areas/ebola.

31. CDISC. CDISC Malaria Therapeutic Area User Guide v1.0. 2017. Available from: https://www.cdisc.org/standards/therapeutic-areas/malaria.

32. CDISC. CDISC Influenza Therapeutic Area User Guide v1.1. 2017. Available from: https://www.cdisc.org/standards/therapeutic-areas/influenza.

33. Infectious Diseases Data Observatory (IDDO). New CDISC data standard aids development of therapies for Ebola virus. IDDO; 2017.

34. WorldWide Antimalarial Resistance Network (WWARN). Launch of the first malaria therapeutic area data standard and new Case Record Form (CRF). WWARN; 2017.

35. Sakushima K, Sakaguchi H, Chen TJ. Electronic Study Data Submission for New Drug Application in Japan. 2017. Available from: www.phusewiki.org/docs/2017_CSS_US/PP15_Final.pdf.

36. Law D, Koussis P, Allard C, Sviglin H, Ho J, Li J, Kropp T, et al. JumpStarting Review: Highlights. FDA/PhUSE US Computational Science Conference Proceedings: FDA/PhUSE US CSS Symposium 2017, March 14-17 [serial on the Internet]. 2017. Available from: www.phusewiki.org/docs/CSS2015Presentations/PP21FINAL.pdf.

37. Nilsson M, et al. Standard Analyses and Displays for Common Data in Clinical Trials - The Journey Continues! FDA/PhUSE US Computational Science Conference Proceedings: FDA/PhUSE US CSS Symposium 2017, March 19-21 [serial on the Internet]. 2017. Available from: www.phusewiki.org/docs/2017_CSS_US/PP14_Final.pdf.

38. Tu H, Soukup M. Reducing Duplicated Effort on Standard Analyses through Collaboration - Roadmap for Standard Analyses and Code Sharing Working Group. FDA/PhUSE US Computational Science Conference Proceedings: FDA/PhUSE US CSS Symposium 2017, March 19-21 [serial on the Internet]. 2017. Available from: www.phusewiki.org/docs/2017_CSS_US/PP18_Final.pdf.

39. TransCelerate. TransCelerate BioPharma Common Protocol Template.   [28 November 2018]; Available from: https://transceleratebiopharmainc.com/assets/common-protocol-template.

40. Medical School University of Michigan. Mobilizing Computable Biomedical Knowledge.   [28 November 2018]; Available from: https://medicine.umich.edu/dept/lhs/service-outreach/mobilizing-computable-biomedical-knowledge.

41. Learning Health Community. Homepage. [28 November 2018]; Available from: http://www.learninghealth.org/.

42. Learning Health Community. Community Initiatives. [28 November 2018]; Available from: http://www.learninghealth.org/sample-initiative-1.

43. Clinical and Translational Science Awards (CTSA) Program, Center for Data to Health (CD2H). Homepage. [28 November 2018]; Available from: https://ctsa.ncats.nih.gov/cd2h/.

44. Food and Drug Administration. Use of Electronic Health Record Data in Clinical Investigations. Available from: https://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM501068.pdf.

45. CDISC. Clinical Trial Registry XML (CTR-XML).   [12 December 2018]; Available from: https://www.cdisc.org/standards/data-exchange/ctr-xml.

46. World Health Organization. WHO International Clinical Trials Registry Platform. Available from: https://www.who.int/ictrp/en/.

47. EudraCT. Homepage. [12 December 2018]; Available from: https://eudract.ema.europa.eu/.

48. Valkenhoef G, Tervonen T, Brock B, Hillege H. Deficiencies in the transfer and availability of clinical trials evidence: a review of existing systems and standards. BMC Med Inform Decis Mak. [Research Support, Non-U.S. Gov't Review]. 2012 Sep 04;12:95.

49. U.S. Food and Drug Administration. 21st Century Cures Act and FDA [12 December 2018]; Available from: https://www.fda.gov/RegulatoryInformation/LawsEnforcedbyFDA/SignificantAmendmentstotheFDCAct/21stCenturyCuresAct/default.htm.

50. Patient Centered Outcomes Research Institute. Homepage.   [28 November 2018]; Available from: www.pcori.org.

51. Singh J, Kandaswamy H, Otoo JD, Gumne P, Duvenhage M, Whalen CJ, Tartovsky M. CDASH and SDTM for Clinical Trials and Reporting. Proceedings of the 2017 National Institutes of Health Research Festival [serial on the Internet]. 2017. Available from: https://researchfestival.nih.gov/2017/posters/cdash-and-sdtm-clinical-trials-and-reporting.

52. Newell KO, Xiao J, Duvenhage M, Singh J, Gumne P, Kandaswamy H, Whalen C, et al. Using a data warehouse model based on CDISC SDTM for centralized data and safety reporting for clinical trials. National Institutes of Health 2016 Research Festival [serial on the Internet]. 2016. Available from: https://researchfestival.nih.gov/2016/posters/using-data-warehouse-model-based-cdisc-sdtm-centralized-data-and-safety.

53. Bhattacharya S, Andorf S, Gomes L, Dunn P, Schaefer H, Pontius J, et al. ImmPort: disseminating data to the public for the future of immunology. Immunol Res. 2014 May;58(2-3):234-9.

54. Shankar RD, Bhattacharya S, Jujjavarapu C, Andorf S, Wiser JA, Butte AJ. RImmPort: an R/Bioconductor package that enables ready-for-analysis immunology research data. Bioinformatics. 2017 Apr 1;33(7):1101-3.

55. Davis AL, Miller JD. The European Medicines Agency and Publication of Clinical Study Reports: A Challenge for the US FDA. JAMA. 2017 Mar 07;317(9):905-6.
