Why Data Integrity Should Be an Integral Part of the Pharma Biotech Industry Culture

Applied Clinical Trials

Data integrity is the essence of GMP, the cornerstone of how the industry operates, and it is vital that all organizations embrace it to survive the rapidly changing life sciences landscape.

Data integrity is not just another hot topic in the clinical trial supply industry. It is not a new requirement, limited to computerized systems, or an issue confined to geographical territories with a history of poor Good Manufacturing Practice (GMP). It is the essence of GMP, the cornerstone of how the industry operates, and it is vital that all organizations embrace it to survive the rapidly changing life sciences landscape. It affects companies across the globe; no one is exempt. Data integrity issues cause many problems throughout the clinical supply chain, including study delays, reworks, cost implications, and patient enrolment challenges. At the core of data integrity are patient-centric decisions, which ensure the quality, safety, and efficacy of the Investigational Medicinal Product reaching the patient. Essentially, without integrity, your data is unreliable.

The Regulatory Perspective

Data integrity and regulatory compliance have been a major focus for international regulatory agencies across the globe, such as the US Food and Drug Administration (FDA), the UK Medicines and Healthcare products Regulatory Agency (MHRA), and the World Health Organization (WHO). Data integrity issues have resulted in an alarming number of observations, warning letters, recalls, and import alerts, prompting the authorities to publish additional regulatory guidance globally. Regulators are now far more aware of data integrity and specifically trained to look for issues during inspections. They have reported a number of frequently occurring problems, including some that would also be considered fundamental GMP failures, such as incomplete or missing records, shared usernames and passwords, and deletion of original GMP records.1

Industry Changes

Long before the term "data integrity" came into use, organizations were aware of the importance of good documentation practices and the need to ensure all documentation was complete, consistent, and correct; not just so that batch records were complete, but to provide assurance in the product produced and, ultimately, patient safety. Yet regulators across the globe still felt compelled to issue additional guidance.

Introduced in 1997, 21 Code of Federal Regulations (CFR) Part 11 is arguably the most universally recognized regulation for life sciences companies. Its premise for electronic records is that ensuring data integrity by protecting original data from accidental or intentional modification, falsification, or even deletion is key to reliable and trustworthy records that will withstand scrutiny during regulatory inspections. However, the topic of data integrity is much wider than this.

Fast forward more than 20 years and the biggest question is: as an industry, have we complied with 21 CFR Part 11, or have we missed the point? Much has changed since 1997, with more complex clinical supply chains, numerous partners and vendors, increased globalization of clinical trials, and multifaceted, complicated systems. For example, a batch record review in 1997 would have been largely paper-based, with entries completed by hand and a manual Quality Assurance (QA) review in which cross-outs, errors, and re-issued pages were visible and demonstrated the record's history. In 2018, a similar batch manufacturing process would include automatic data entries, with electronic printouts forming part of the batch record. But how much has the QA review process changed to verify that data? Or are companies blindsided to the critical data, with limited oversight of the audit trails that would give them the necessary confidence in it?

It is a situation described as being 'data rich, but information poor,' and it is crucial to consider the impact this has on the business and the patient. Ultimately, the reason for 21 CFR Part 11 was to ensure product quality, patient safety, data integrity, and business continuity. Yet many Pharmaceutical Quality Systems (PQS) have not evolved at the same pace as industry advancements, and a PQS fit for purpose in 1997 may not be fit for purpose today. The PQS requires the same level of investment as other processes in order to stay up to date and meet regulatory expectations.

How to Comply

So, what do regulators expect, and how do companies ensure compliance at a time when there are increasing pressures from all angles? Thankfully, the guidance issued by the various regulatory bodies is fairly well aligned, and the essence is the same. Regulators are not expecting a forensic approach to data checking on a routine basis; in other words, they do not expect every single piece of data to be checked regularly. They do expect data review processes to be in place, commensurate with risk, that make data integrity an integral part of any project.

Regulatory requirements cover many areas, including the following (a brief illustrative sketch follows the list):

  • Control of proformas for recording raw data

  • Unique log-ins

  • Access control

  • Risk-based approaches

  • Accessibility of records

  • Periodic audits

  • Routine data reviews

  • Audit trails

  • Ability to reconstruct activities

  • Data governance

  • Organizational culture
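
To make items such as unique log-ins, audit trails, and the ability to reconstruct activities concrete, here is a minimal sketch in Python of an append-only, user-attributed audit trail in which each entry is hash-chained to the previous one, so that silent deletion or amendment becomes detectable. All names (AuditTrail, record, the example user and record IDs) are hypothetical and not drawn from any specific system; a real GMP system would also need validated time sources, secure storage, and access controls.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Hypothetical append-only audit trail: each entry records who did what,
    when, and to which record, and is chained to the previous entry."""

    def __init__(self):
        self.entries = []

    def record(self, user_id, action, record_id, detail):
        prev_hash = self.entries[-1]["hash"] if self.entries else ""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user_id": user_id,      # unique log-in, never a shared account
            "action": action,        # e.g., "create", "amend", "review"
            "record_id": record_id,
            "detail": detail,
            "prev_hash": prev_hash,  # chaining makes tampering detectable
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

trail = AuditTrail()
trail.record("jsmith", "amend", "BR-0042",
             "corrected weigh-out value; original value retained")
```

The original entry is never overwritten: an amendment is a new entry, so the full history remains available for review and the activity can be reconstructed.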

The MHRA published the final version of its "GxP Data Integrity Guidance and Definitions" document in March 2018, which explains the MHRA's position on data integrity and the minimum expectations for achieving compliance. The MHRA has specifically outlined the expectation for Data Integrity Risk Assessments, conducted to consider the risk of deleting, amending, or excluding data without the opportunity for detection. The requirement for risk assessment is certainly not new: it is clearly spelled out in ICH Q9, and the expectation is for Quality Risk Management (QRM) processes to assess the risk, implement controls, and ensure management oversight.2

The guidance aims to promote a risk-based approach to data management that includes data risk, criticality, and lifecycle. The MHRA indicates that users of the guidance need to understand their data processes (as a lifecycle) to identify data with the greatest GxP impact. From that, the identification of the most effective and efficient risk-based control and review of the data can be determined and implemented.
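One way such a risk-based prioritization could be sketched is shown below. The data items, the 1-5 scoring scales, and the review tiers are entirely hypothetical assumptions for illustration; a real assessment would use formal QRM tools in line with ICH Q9.

```python
# Hypothetical 1-5 scales: criticality = GxP impact of the data,
# vulnerability = ease of undetected deletion or amendment.
data_flows = [
    ("batch release results",    5, 4),
    ("label reconciliation",     4, 3),
    ("environmental monitoring", 3, 2),
    ("training records",         2, 2),
]

def risk_priority(criticality, vulnerability):
    """Simple risk priority number: higher scores are reviewed first."""
    return criticality * vulnerability

# Rank data flows by risk and assign an (illustrative) review tier.
for name, crit, vuln in sorted(data_flows,
                               key=lambda d: -risk_priority(d[1], d[2])):
    rpn = risk_priority(crit, vuln)
    tier = "routine audit-trail review" if rpn >= 12 else "periodic sample review"
    print(f"{name:26s} RPN={rpn:2d} -> {tier}")
```

The point is not the arithmetic but the principle the MHRA describes: understand the data lifecycle, score where the GxP impact and the opportunity for undetected change are greatest, and direct the most review effort there.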

Importance of Culture

The right environment is crucial to enable data integrity controls to be effective. Indeed, guidance from the MHRA states: "The impact of organizational culture, the behaviour driven by performance indicators, objectives and senior management behaviour on the success of data governance measures should not be underestimated. The data governance policy (or equivalent) should be endorsed at the highest levels of the organization."3

There is a general misconception that data integrity failures result only from acts of deliberate fraud. In fact, most issues relate to bad practice, poor organizational behaviour, and weak systems, all of which create opportunities for data to be manipulated; something that, if managed correctly from the outset, can be avoided. Ultimately, management must lead by example and address any pressure to falsify data, for example by setting realistic goals. To create a data governance framework, organizations need to understand the risks in their current processes, decide where to focus, and implement a remediation plan; for older systems, this includes assessing whether alternative controls are needed and proving that those controls are effective.

Conclusion

Knowledge of business processes and critical data is key, as is ensuring that the PQS is aligned with current business practices. Technical, procedural, and behavioural controls should be commensurate with risk, with the right culture ultimately underpinning success. Every person must ensure that the data they are responsible for adheres to the principles of data integrity, and must be able to report observed issues or practices that could impact the integrity of the data without fear of repercussions.

As data integrity continues to be a major focus for international regulatory agencies across the globe, it is imperative that organizations increase their scrutiny to ensure the quality and integrity of data over its entire lifecycle. Not only is data integrity fundamental to patient safety; if data is used wisely, it can also contribute to more efficient and effective data-driven business decisions at a time when many companies face greater financial and performance pressures.

References

  1. NSF International, Data Integrity: A Closer Look (May 2018)
  2. ICH Q9 Quality Risk Management. ICH Harmonised Tripartite Guideline, Step 4, Geneva (2005)
  3. Medicines and Healthcare products Regulatory Agency, 'GxP' Data Integrity Guidance and Definitions, Revision 1 (March 2018)

 

Olive McCormick, Head of Quality & QP, Almac Clinical 
