CRO oversight models are evolving to include more efficient and effective ways of managing and inspecting both individual clinical trials and study portfolios. Joe Pollarine, Head of GxP Systems Strategy at Janssen, recently spoke about CRO oversight models at ExL’s 8th Clinical Quality Oversight Forum, and expands on these models in this interview. ExL’s 7th CROWN Congress takes place January 23-25 in Philadelphia.
Moe Alsumidaie: What is the difference between Minimal and Strong sponsor oversight at Janssen?
Joe Pollarine: Traditional oversight typically consists of a sponsor meeting periodically with a CRO and letting the CRO drive the content of those meetings. In such cases, the sponsor typically answers questions, reviews and acknowledges the CRO’s performance metrics, and the CRO documents these interactions through meeting minutes. CROs also hold ad hoc meetings to escalate issues that need immediate attention or input from the sponsor. This model meets minimal expectations for sponsor oversight: the sponsor does its diligence by meeting periodically with the CRO to stay aware of study progress and to manage significant issues, but the CRO does most of the work of conducting the trial. A stronger oversight model, by contrast, requires the sponsor to be more engaged in the trial by conducting independent evaluations and driving proactive two-way discussion with the CRO, rather than reacting to what is brought to it. For example, Janssen trial teams review trial compliance data (e.g., monitoring visit compliance, protocol deviations, reported quality issues) quarterly and conduct a sponsor-led meeting with their CRO trial team in which cross-functional attendees from both companies participate (e.g., trial managers, physicians, data managers, quality managers). The Janssen team identifies areas of concern and brings them to the CRO team for discussion; where interpretations differ, the disagreement is worked through and actions are developed and documented as appropriate. This approach keeps the Janssen team engaged in the conduct of the study and provides ongoing opportunities to help the CRO understand Janssen’s expectations, which can optimize CRO performance over time.
MA: What analytical systems and methodologies support strong oversight?
JP: Most trial data is typically held in a CRO’s technology systems, and sponsors access it through portals that allow them to conduct their own assessments. Janssen requests that key data be transferred periodically and integrates it into our in-house environment. With access to these data, Janssen can use a proprietary dashboard built on a visualization platform to efficiently monitor key variables (e.g., volume of protocol deviations, number of patients enrolled at a site over time) across all sites in a trial, looking for potential site performance issues that are then researched to better understand the root cause. The insights gained through these reviews become intervention points for targeted discussions with CROs about why an investigator site’s (or even a study’s) performance is not as expected. Oftentimes, these insights lead Janssen teams to identify data irregularities, site performance issues, and possibly even CRO performance issues. Whether or not investigation identifies an underlying issue, these analytical data reviews are one way Janssen maintains strong oversight of outsourced trials, as they drive ongoing communication and, where needed, remediation with CRO trial teams.
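Janssen’s dashboard is proprietary, but the kind of site-level screening described above can be sketched in a few lines. The following is an illustrative example only; the site identifiers, metric fields, and the median-based cutoff are assumptions, not Janssen’s actual method. It flags sites whose per-patient protocol deviation rate is well above the typical rate for the trial.

```python
from statistics import median

# Hypothetical site-level metrics; field names and values are illustrative only.
sites = {
    "site_101": {"deviations": 2, "enrolled": 40},
    "site_102": {"deviations": 3, "enrolled": 35},
    "site_103": {"deviations": 14, "enrolled": 30},  # unusually high rate
    "site_104": {"deviations": 1, "enrolled": 38},
}

def flag_outlier_sites(sites, factor=2.0):
    """Flag sites whose per-patient deviation rate exceeds `factor` times
    the median rate across all sites in the trial."""
    rates = {s: m["deviations"] / m["enrolled"] for s, m in sites.items()}
    cutoff = factor * median(rates.values())
    return sorted(s for s, r in rates.items() if r > cutoff)

print(flag_outlier_sites(sites))  # flags site_103 for follow-up
```

A median-based cutoff is used here simply because it is robust to the very outliers being hunted; a flagged site would then be researched for root cause, as described above, rather than treated as a confirmed finding.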
MA: Define the inspection readiness process at Janssen. How and when is inspection readiness implemented?
JP: In the current environment, where more regulatory authorities are requesting access to a company’s systems, it is important for each person to conduct their role with the mindset that their activities could be reviewed at any time. Inspection readiness, therefore, is a mindset that results in the ability to be ready for an audit or inspection at any time with minimal preparation. Over time, Janssen trial teams have adopted this mindset: they complete their roles in executing trials while assuring that systems are updated and that documentation of completion is filed in the trial master file in a timely manner. They have also worked to instill this mindset in their CRO trial teams. In addition, Janssen has an internal team that simulates inspection-related activities using a risk-based approach that considers factors such as the type (e.g., NDA, BLA, sNDA), location, and timing of a regulatory submission. This team runs a hybrid-style mock inspection program, mixing the data-review techniques commonly employed by FDA with the procedural document and system review typical of European health authority inspections. The goal is to complete a high-level assessment of the key requirements of the trials in the submission, test Janssen’s inspection management logistics, and provide readiness training and coaching so the Janssen team can work confidently with a health authority if an inspection occurs. When a CRO is involved, these mock activities focus mainly on Janssen’s oversight, with emphasis on governance, communication, and performance of sponsor-accountable activities. The work this team does to support inspection preparation not only helps Janssen and CRO trial teams identify and address potential gaps, it also continuously reinforces a culture and mindset of ongoing inspection readiness.
MA: What are portfolio quality meetings and what do they accomplish?
JP: Portfolio quality meetings are quarterly meetings between Janssen and strategic alliance CRO quality leaders (operational quality and quality assurance) to discuss the overall quality of Janssen’s entire portfolio of studies. CROs are used to presenting their quality data on a trial-by-trial basis, but often do not look at quality across a sponsor’s full portfolio when they are conducting more than one trial. Janssen’s quality organization is working to move the compliance needle toward a more holistic approach by asking its strategic alliance partners to aggregate quality metrics across the entire portfolio and to begin thinking of the portfolio the way a sponsor would. Examples of aggregated quality metrics reviewed in these meetings include training metrics for all CRO personnel working on a Janssen trial; audit data from all audit types (clinical site and system); CAPA resolution metrics beyond on-time completion (e.g., number of extensions granted, number of failed CAPAs, efficacy of effectiveness checks); and the impact of regulatory inspection findings on all trials. This approach gives Janssen a more comprehensive picture of how the partners identify and address issues across trials, and it promotes engaged discussion on the overall impact of issues the partners report. For example, if CRA performance issues are reported for one trial, Janssen will ask whether that CRA is monitoring any other trials in the portfolio. Sharing aggregated quality metrics is prompting alliance partners’ quality professionals to think proactively about the potential impact of a seemingly localized issue and to determine whether a broader corrective action plan is needed. The result is better quality across Janssen’s portfolio, and the CRO quality teams can provide better services to all of their other sponsors once they adopt portfolio-level quality thinking.
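The CRA example above is essentially a cross-trial join, and the underlying logic can be sketched simply. This is an illustrative sketch only; the trial identifiers, record fields, and CRA names are invented, and real portfolio quality systems would draw on CTMS or quality-system data rather than hand-built dictionaries.

```python
# Hypothetical per-trial records; all names and fields are illustrative only.
trials = {
    "TRIAL-A": {"cras": {"J. Smith", "L. Chen"}, "cra_issues": {"J. Smith"}},
    "TRIAL-B": {"cras": {"J. Smith", "R. Patel"}, "cra_issues": set()},
    "TRIAL-C": {"cras": {"L. Chen"}, "cra_issues": set()},
}

def cross_trial_exposure(trials):
    """For each CRA with a reported performance issue on one trial,
    list the other trials in the portfolio that the same CRA monitors."""
    exposure = {}
    for trial, rec in trials.items():
        for cra in rec["cra_issues"]:
            others = sorted(t for t, r in trials.items()
                            if t != trial and cra in r["cras"])
            if others:
                exposure[cra] = others
    return exposure

print(cross_trial_exposure(trials))  # J. Smith's issue on TRIAL-A also touches TRIAL-B
```

The point of the sketch is the portfolio-level question itself: an issue reported on one trial is automatically checked for exposure everywhere else, rather than being closed out in isolation.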
Moe Alsumidaie, MBA, MSF, is Chief Data Scientist at Annex Clinical, and an Editorial Advisory Board member of, and regular contributor to, Applied Clinical Trials.