Varied Needs for Biological Sample Management

Article

Applied Clinical Trials
Volume 25, Issue 10, October 1, 2016

From basic blood draws to more involved samples, maintaining accurate tracking and records is crucial for trials.

Although it’s difficult now to remember a time before computers were readily available, prior to the 1970s, the tracking of biological samples and their annotations was tedious, time-consuming and error-prone. The growing presence of affordable computers and the desire to streamline the collection and reporting of data led some laboratories to develop their own management systems, while others saw profit in developing them for commercialization. Today, sample management systems range from manual processes aided by spreadsheets to sophisticated services that accession samples at the point of collection and track them and their unique data all the way into the biorepository.

What has changed between the earliest systems and today is that biological samples have become far more valuable. Samples that were once tested and discarded are now often carefully preserved in biorepositories as insurance against future regulatory inquiries, as well as to potentially serve as vital keys to some yet-unknown branch of research. With such valuable assets at stake, it is important to ensure that each and every sample from every clinical trial (untold billions in all) is known and that its entire history can be verified from the moment the sample is collected until it is deemed no longer needed.

A clinical laboratory sample management system should have the following features (a minimal data-model sketch follows the list):

  • Accessioning, where a sample is assigned a unique identity attached to attendant demographic data, either at the time of collection or upon entry into the laboratory environment

  • Anonymization, to protect the privacy of individuals

  • Tracking, which may include logistical tracking and tracking within the analytical workflow of the lab

  • Quality control of attendant processes

  • Analysis and storage of collected data

  • Storage or dispensation of the physical sample
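
The feature list above maps naturally onto a simple data model. Below is a minimal sketch, in Python, of how a sample record covering accessioning, anonymization, tracking, quality control and storage might be represented; the class and field names are illustrative assumptions rather than any particular system's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class CustodyEvent:
    """One tracking event in a sample's chain of custody."""
    timestamp: datetime
    location: str   # e.g., collection site, courier, lab bench, freezer
    handler: str    # person or system taking custody
    note: str = ""

@dataclass
class SampleRecord:
    """Minimal record covering the features listed above."""
    accession_id: str                        # unique identity assigned at accessioning
    subject_code: str                        # anonymized donor code (no direct identifiers)
    collected_at: datetime
    sample_type: str                         # e.g., "serum", "whole blood"
    custody: List[CustodyEvent] = field(default_factory=list)
    qc_passed: Optional[bool] = None         # quality control of attendant processes
    storage_location: Optional[str] = None   # freezer/rack/box/position once stored

    def add_custody_event(self, location: str, handler: str, note: str = "") -> None:
        """Append a tracking event, preserving the sample's full history."""
        self.custody.append(CustodyEvent(datetime.now(timezone.utc), location, handler, note))
```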

This article examines the key components in various sample management systems and relates them to the research type for which they are being used. These range from simple testing and storage within the same lab setting to complex protocols encompassing multiple sites in foreign and rural locations, with detailed cryologistics, chain-of-custody requirements and long-term biostorage.

 

Beyond spreadsheets: LIMS

When the needs of a laboratory grow beyond spreadsheets and off-the-shelf solutions, many initially turn to a laboratory information management system, or LIMS. Once prohibitively expensive, simple LIMS and software-as-a-service (SaaS) systems are now within reach of even small and virtual organizations.

At a basic level, a LIMS can be any piece of software that manages the information that is produced or consumed in a laboratory setting. Although a few foundational requirements seem to apply across all laboratory types, such as the ability to track and manage samples, the variation from one LIMS to the next can be dramatic.1 Most LIMS start from the premise that a process exists, e.g., samples are received, accessioned, bar-coded and processed according to protocol; data are analyzed; and samples are stored in a freezer. A different lab may follow a different process, and one of the defining differences between LIMS choices is the degree of flexibility in making the LIMS follow your processes, or vice versa.

The core function of LIMS has traditionally been the management of samples. This typically is initiated when a sample is received in the laboratory, at which point the sample will be registered in the LIMS. The registration process usually involves accessioning the sample along with clinical or phenotypic information and producing bar codes to affix to the sample container. The LIMS then tracks chain of custody as well as sample location, typically a particular freezer location.

Benefits from implementing a LIMS can be both qualitative and quantitative, but are highly dependent on the lab environment. In a pharmaceutical quality assurance lab, for example, quantitative benefits may include increased efficiency through integration of systems, automation of routine reports and streamlining of the review process. In a research laboratory, the benefits may center more on adaptable experiment design and workflow. In both instances, however, qualitative benefits would include reduction of transcription errors, adherence to regulatory requirements and easy accessibility of data.

Modern LIMS are extensively configurable, enabling them to adapt to individual laboratory environments. LIMS users may also need to comply with regulatory requirements such as CLIA, HIPAA, GLP and FDA specifications, which affect certain aspects of sample management. One key to compliance with many of these standards is audit logging of all changes to LIMS data; in some cases, a full electronic signature system is required for rigorous tracking of field-level changes to LIMS data.3
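
To make the audit-logging requirement concrete, the following sketch records field-level changes with a timestamp, user and a simple content hash. It is an illustrative assumption only; it does not represent any specific LIMS, and a real electronic signature system involves considerably more than a hash.

```python
import hashlib
from datetime import datetime, timezone

def log_field_change(audit_log: list, record_id: str, field_name: str,
                     old_value, new_value, user: str, reason: str) -> dict:
    """Record a field-level change to LIMS data as an append-only audit entry."""
    entry = {
        "record_id": record_id,
        "field": field_name,
        "old": old_value,
        "new": new_value,
        "user": user,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # A simple content hash stands in for a real electronic-signature mechanism.
    payload = "|".join(str(entry[k]) for k in sorted(entry))
    entry["signature"] = hashlib.sha256(payload.encode("utf-8")).hexdigest()
    audit_log.append(entry)
    return entry

# Example: changing a sample's storage location leaves a traceable entry.
audit_log = []
log_field_change(audit_log, "ACC-000123", "storage_location",
                 "Freezer 2/Rack A", "Freezer 5/Rack C",
                 user="jdoe", reason="freezer maintenance")
```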

In addition to configurable fields for special processes, LIMS capabilities typically include:

  • Audit management

  • Bar code handling

  • Chain of custody

  • Compliance tracking

  • Configurable annotation

  • Document management 

  • Electronic data entry and transfer

  • Instrument calibration and maintenance

  • Inventory and equipment management

  • Process management

  • Personnel and workload management

  • Quality assurance and control

  • Reporting

  • Search

  • Workflows


LIMS platforms

Whether part of a LIMS or as a standalone sample management system, deployment of the application can be on one of several different platforms.

A thick-client system typically has part of the software residing on the user’s computer or workstation, where the processing takes place, with the remainder installed on the user company’s servers that take care of data storage. Because the program is resident on computers within the user company, changes, upgrades and other modifications must of necessity happen on the client side. Thick-client systems have some advantage of speed, but require a robust computing environment and can only be accessed by those with network access. Pricing is typically based on an initial purchase covering a set number of licenses and ongoing technical support.


A thin-client, or SaaS system, offers functionality through a web browser. The software resides on a host server that processes information without saving it to the user’s hard drive. Upgrades and other modifications are handled by the hosting company, and the user’s only responsibility is maintenance of the integrity of the web browser. Advantages to a thin-client system include significantly lower cost of ownership and fewer network and client-side maintenance expenses, making it attractive to small and medium-sized laboratory enterprises. Disadvantages of SaaS include a need for increased network throughput, and some compromises in configurability and functionality. Pricing is typically based on licensing fees for each user on the system plus ongoing support.

A web-based architecture is a hybrid of the thick- and thin-client architectures. While much of the client-side work is done through a web browser, the system may also require the support of desktop software installed on the client device. Web-based architecture has the advantage of providing more functionality through a more user-friendly web interface. 
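
As an illustration of the thin-client model described above, the sketch below queries a hosted LIMS over HTTPS so that no sample data is stored locally. The endpoint, token and response fields are hypothetical assumptions, since each vendor defines its own API.

```python
import requests  # third-party HTTP client

# Hypothetical SaaS LIMS endpoint and credential; real systems define their own API.
BASE_URL = "https://lims.example.com/api/v1"
API_TOKEN = "replace-with-issued-token"

def fetch_sample(accession_id: str) -> dict:
    """Retrieve a sample record from the hosted LIMS via the web service."""
    response = requests.get(
        f"{BASE_URL}/samples/{accession_id}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Example (requires a real endpoint and token):
# print(fetch_sample("ACC-000123")["storage_location"])
```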

 

Beyond LIMS: Central laboratories

When the needs of development teams grow beyond what their own local lab can handle, they may contract with a central laboratory to handle sample management along with a wide range of associated services.

According to Dr. Francisco Leão, Jr., writing in Applied Clinical Trials,4 the central laboratory concept was developed in the early 1990s by laboratories delivering services to major pharmaceutical companies: 

“The goal was to consolidate the test results and data originating in different clinical sites, which was previously analyzed in local labs. Bringing the samples to one single laboratory would avoid consolidation of biased test results among different laboratories, all of which could be using different analytical platforms, kits, and reference values. This concept was first applied to clinical studies conducted in the United States. Soon after, the courier industry started offering solutions for biologic sample transportation, which allowed the central lab concept to be applied globally. Later, the concept of the affiliated laboratory was created. The affiliated laboratory covered geographic regions that had difficulties exporting biologic samples. As a consequence, the central laboratory became more global and started to build different types of associations with analytical laboratories in different parts of the world.”

A central lab is exclusively responsible for lab assessments and provides services ranging from conducting lab tests and compiling lab test reports to contracting courier services for delivering lab kits and biosamples to and from investigative sites.

Affiliated central labs enable large multi-country studies, even complex genomic or adaptive protocol trials, by ensuring compliant aggregation of the data. But it’s not necessarily easy. As Dr. Leão pointed out, clinical site staff may be responsible for logistical tasks, causing samples to reach the lab in a condition that doesn’t allow them to be properly analyzed. There are also difficulties in shipping lab materials to remote sites in developing countries that raise costs and create logistical constraints. While query resolution and clinical site support are usually best handled by local teams and staff, Dr. Leão warns that differences in language and time zones between central labs and clinical sites may be problematic. The key to running successful global studies through affiliated central labs is harmonization.

Harmonization is a process that must be carried out by affiliated central labs in order to integrate results of biological sample tests from different laboratories, and avoid any possible bias generated by technical differences among them. Depending on the degree and method of harmonization that a group of labs implements, the labs may reach a very close technical comparability and be considered a single entity by the trial sponsor, delivering the same service and results all over the world. Of course, there are different aspects and levels of harmonization.

Analytical platform harmonization. Analytical equipment, methodologies, kits and reagents used in laboratory testing, either in general or for the specific trial, are compared and, if necessary, addressed through correlation tests. These can show that the results coming from different equipment can be considered homogeneous and, therefore, can be consolidated in the study databank. Also to be considered is the IT platform, because the final product is the data, which will have to be generated, transmitted and stored in ways compatible and compliant with the study database.
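
A simple form of such a correlation test can be sketched as follows: paired results for the same specimens measured on two analyzers are compared by ordinary least-squares regression, and a slope near 1, an intercept near 0 and a high correlation support treating the platforms as comparable. This is only an illustrative assumption; real method-comparison studies typically use more rigorous statistics such as Deming or Passing-Bablok regression.

```python
import numpy as np

def compare_platforms(results_a, results_b):
    """Ordinary least-squares comparison of paired results from two analyzers."""
    x = np.asarray(results_a, dtype=float)
    y = np.asarray(results_b, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)   # y ≈ slope * x + intercept
    r = np.corrcoef(x, y)[0, 1]              # Pearson correlation
    return {"slope": slope, "intercept": intercept, "r": r}

# Example: glucose (mmol/L) measured on the same specimens by two instruments.
platform_a = [4.8, 5.6, 7.2, 9.1, 11.4, 13.0]
platform_b = [4.9, 5.5, 7.4, 9.0, 11.6, 13.2]
print(compare_platforms(platform_a, platform_b))
```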

Reference value harmonization. Although important for the data analysis and data management process, this can be challenging, depending on the population and tests involved. Safety test reference values are easily harmonized because most of them follow international standards, but more esoteric testing requires a higher level of scrutiny. 

Certification, accreditation and external QC programs. Laboratories involved in the same study are typically harmonized according to their national and international certifications, accreditations and participation in external quality control programs.

Laboratory routines and reports. To ensure harmonization, laboratory routines, not just the equipment, should go through a harmonization process. Calibration frequencies, preventive equipment maintenance and repeat thresholds should be comparable among participating laboratories.

 

Beyond central labs: Sample management as a service

Although central laboratories are able to network together to provide services, sample tracking is not necessarily the highest priority. For the most part, samples are not accessioned into the system until they reach the lab and are entered into the networked LIMS or clinical trial management system.

For high-value samples, especially those being obtained in developing regions of the world, some companies provide sample management as a service that begins with a detailed sample management plan to help control pre-analytic variables that could compromise sample integrity or otherwise alter research outcomes. The planning procedure looks at every detail, including what samples should be collected, how they should be handled and how they will be accessioned into a sample tracking system, as well as how they will be transported, analyzed and prepared for long-term storage. Under this scenario, every step in a sample’s life cycle is monitored, recorded and carried out through adherence to uniform standard operating procedures that are harmonized throughout the trial. From collection through cold chain transport, to central lab testing and biorepositories, everything must be standardized: collection tubes and shipping containers; laboratory equipment; and cryogenic freezers.


Specific considerations

When preparing a comprehensive sample management plan, sponsors should vet their providers to ensure they have the regional capability and capacity to carry out the program logistics. More remote regions, for example, will require the use of advanced dry vapor shipping dewars designed to minimize the risk of temperature excursions, capable of holding temperatures below -150°C for up to 10 days. Before the study begins, the investigator site list should be evaluated to determine whether specific locations should be subject to a logistics dry run, enabling the development of alternative logistics solutions.
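
One way to verify dry-shipper performance against such a threshold is to screen data-logger readings for excursions, as in the hedged sketch below. The -150°C limit comes from the text above; the data format and function names are illustrative assumptions.

```python
from datetime import datetime

def find_excursions(readings, limit_c=-150.0):
    """Return logger readings that rose above the allowed temperature limit.

    `readings` is a list of (timestamp, temperature_celsius) tuples from the
    shipper's data logger; anything warmer than `limit_c` is flagged.
    """
    return [(ts, temp) for ts, temp in readings if temp > limit_c]

# Example logger trace during a 10-day shipment (illustrative values).
trace = [
    (datetime(2016, 10, 1, 8, 0), -192.0),
    (datetime(2016, 10, 4, 8, 0), -180.5),
    (datetime(2016, 10, 9, 8, 0), -146.2),   # excursion: warmer than -150°C
]
for ts, temp in find_excursions(trace):
    print(f"Excursion at {ts.isoformat()}: {temp}°C")
```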

Another aspect to be considered is providing sites with appropriate tools. If the protocol calls for collecting blood and isolating peripheral blood mononuclear cells (PBMCs), each of the sites must have kits containing the right kind of collection tubes, labeling and bar-coding equipment as well as shipping containers that have been certified for a variety of conditions, including crush resistance and temperature maintenance.

Because researchers are looking for the best ways to leverage individual specimens to drive clinical research as well as translational and personalized medicine, they want complete datasets surrounding each specific sample. Unlike LIMS or many central lab workflows, high-end sample management accessions samples at the time of collection, using tools such as digital pens for compliant data capture, along with condition monitoring systems that record sample status at each stage. This enables individual samples to be tracked and monitored from collection through testing and during long-term storage, and also provides a 21 CFR Part 11-compliant audit trail for later reference.
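
As an illustration of stage-by-stage condition monitoring, the sketch below checks that a sample's recorded events cover each expected life-cycle stage in order from collection to long-term storage. The stage names and record layout are assumptions made for the example, not a prescribed standard.

```python
EXPECTED_STAGES = ["collection", "cold_chain_transport", "lab_receipt",
                   "testing", "long_term_storage"]

def missing_stages(events):
    """Return expected life-cycle stages with no condition-monitoring record.

    `events` is a list of dicts, each with at least a "stage" key plus any
    recorded conditions (temperature, handler, timestamp, and so on).
    """
    recorded = {event["stage"] for event in events}
    return [stage for stage in EXPECTED_STAGES if stage not in recorded]

events = [
    {"stage": "collection", "temp_c": 20.1},
    {"stage": "cold_chain_transport", "temp_c": -185.0},
    {"stage": "lab_receipt", "temp_c": -182.4},
]
print(missing_stages(events))  # ['testing', 'long_term_storage']
```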

New cellular and gene-based research and studies on immunology or cancer immunotherapies require additional care in sample management. With samples collected from each study participant at different intervals during the study, the biorepository may, for example, be required to extract DNA and RNA and make aliquots of each sample, or isolate PBMCs and cryopreserve them in liquid nitrogen. How development teams preserve samples for downstream testing becomes increasingly important as research moves toward translational medicine, personalized medicine and biomarker-driven companion diagnostics. As a facet of sample management planning, teams must imagine every possible use for samples as part of the protocol development discussion.
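
Tracking derivatives such as DNA, RNA or PBMC aliquots alongside the parent sample can be sketched as a simple parent-child relationship; the identifier scheme and storage assignments below are illustrative assumptions.

```python
def derive_aliquots(parent_id: str, derivative_types: list, count_each: int = 2) -> list:
    """Create child aliquot records linked back to the parent sample."""
    aliquots = []
    for dtype in derivative_types:
        for i in range(1, count_each + 1):
            aliquots.append({
                "aliquot_id": f"{parent_id}-{dtype}-{i:02d}",
                "parent_id": parent_id,
                "derivative": dtype,                         # e.g., "DNA", "RNA", "PBMC"
                "storage": "LN2" if dtype == "PBMC" else "-80C",  # assumed storage rule
            })
    return aliquots

# Example: one blood sample yields DNA, RNA and cryopreserved PBMC aliquots.
for a in derive_aliquots("ACC-000123", ["DNA", "RNA", "PBMC"]):
    print(a["aliquot_id"], a["storage"])
```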

Conclusion

As technology develops in each of these areas (collection tools, cryogenic logistics, condition monitoring, IT integration, biostorage), it is likely that a greater percentage of samples will be treated with more care and attention to their long-term viability and usefulness. In the meantime, even relatively simple systems can maintain the integrity required for most clinical and post-clinical applications.

Sample management runs the gamut from very rudimentary standalone systems, to sophisticated LIMS, to large central labs managed by pharmaceutical companies or CROs. Any one of these may be perfectly appropriate, depending on the size, scope and strategy of the development program. Routine assays and safety testing require only modest management, but as the focus of research shifts to preserving the integrity of scientific assets to support biomarker discovery projects, personalized medicine efforts and the development of other, yet-to-be-determined, genomic-based treatments, each individual sample takes on greater importance and value. In these cases, implementation of a robust sample storage management system, including a comprehensive sample plan, is necessary to ensure that samples collected during clinical trials will benefit both current and future R&D efforts.

 

Eric Hayashi, MBA, is President & CEO of LabConnect.

 

References

1. Laboratory Information Management: So What Is a LIMS? Sapio Sciences, July 28, 2010.

2. Joyce, John R., et al. Industry Insights: Examining the Risks, Benefits and Trade-offs of Today’s LIMS, Scientific Computing, March 30, 2010. http://www.scientificcomputing.com/article/2010/03/industry-insights-examining-risks-benefits-and-trade-offs-today%E2%80%99s-lims

3. Kent, Thomas. Managing Samples and Their Storage, Lab Manager, May 13, 2009.

4. Leão, Jr., Francisco. The Local Central Lab Model, Applied Clinical Trials, April 1, 2008.

 
