Applied Clinical Trials
Evolving out of disconnected technology solutions and working toward integrated processes.
Wayne R. Kubick
In a lot of ways, clinical research seemed a whole lot easier in the old days. Data was collected on paper CRFs and keyed into a Clinical Data Management System (CDMS) for cleaning and analysis. Sure, it was primitive, awkward, and slow, but it was also familiar, generally accepted, and predictable. At most, you might have to load an external lab file, and most of the data you needed would be in one place, where you could run your queries, review your listings, and fire off a complete set of SAS datasets to the statisticians for final analysis.
Of course, memories tend to gloss over some of the grimmer aspects of past realities. Paper also caused plenty of other problems, such as poor data quality, limited reusability, and lack of timeliness. Still, it seems those double data entry clerks were pretty fast, efficient, and cost effective at the time.
I also recall giving a talk at RAPS back in those days, describing my experiences as a global road warrior. My 007-like attaché held a profusion of communication and work devices: a laptop and its attachments, with a series of international electrical and modem telephone connectors; a Palm Pilot to manage my calendar and contacts; a clunky cell phone and a sleek pager to keep me in constant voice contact; and a Walkman to distract my brain from the pain of distance running while my digital chronograph tracked my training times. Add in spare batteries and power supplies and I was ready for the road. Obviously, this preceded my inevitable back surgery, which resulted from hauling all that gear around the globe.
Today, my kids pretty much manage to do everything they need with just a mobile phone (and a laptop for more complex and visual computing purposes). They don't even bother with a wristwatch anymore. Not only do their smart phones have most of the computing applications they need, they also hold their data: contacts, URLs, photos, music—basically everything they care about on a day-to-day basis. Being able to use one multipurpose device that covers your essential daily activities and information needs—that's what we mean by convergence.
Now, convergence of clinical applications didn't seem so essential in the old days, because everyone had distinct tasks and the tools they needed for their particular job. Sure, there were those who tried to turn a portable computing device into a Swiss army knife, but those devices proved deficient in too many respects and excelled at nothing. Meanwhile, clinical study data rolled along the assembly line without much cross-utilization by different functions. Back then, a CDMS served as an adequate data repository for clinical trials, when the primary purpose was to enter and process data for a regulatory submission. Once a study was locked, we'd just package up the individual SAS datasets and send them over to Statistics for reporting.
Oh, maybe the clinical people used some sort of Clinical Trial Management System too, which, more often than not, came down to a collection of spreadsheets and documents. And then there were those safety folks obsessing over the Serious Adverse Event reports—they inevitably used their own system as well. But that was okay—each group minded its own business most of the time.
Like most other computer systems of that era, none of these applications talked to each other very well. Once in a while, someone would buy an integrated suite of products whose high-gloss marketing literature claimed to bring everything together in a seamless package, but it turned out those components didn't talk to each other much better either. This surprised no one—and, as a result, such suites never quite swept the market—but that was tolerable as long as there was a good enough tool for each specific job.
But after years of psycho-dependency on my smart phone, I tend to travel a little lighter these days. And I've developed a keen appreciation for a smaller set of multipurpose tools that can access most of the information I need via the Internet.
I suspect a lot of others feel similarly and wonder why clinical applications haven't kept pace with the rest of technology. Times have changed: different types of data are coming in faster, in much greater volume, and in many different ways, and the old study CDMSs aren't necessarily up to the task anymore (especially as the world finally edges over the tipping point to EDC).
We are discovering many more important uses for data that traverse functional silos, along with the frustration of knowing it's not always easy to fit data together so we can pursue those uses. Some believe the answer is data standards. But, while standards have improved the situation in many ways, they don't always work as well on complex, real business scenarios as they appear to in demos.
And it usually seems the current standards are hobbled by functionality gaps, or are being supplanted by the next great standard before they ever get where we need them to be right now. Semantic interoperability is the next nirvana, but today we often still can't reliably exchange even the simplest types of data without manual intervention, much less ensure that meaning is conveyed unambiguously.
An integrated solution?
So far, the vendor community's answer to this need is a new generation of integrated product suites. It's possible that this time these really might be the answer—assuming they actually work as advertised and we have the opportunity to start anew with a clean slate. But how well are they working right now, alongside so many existing disconnected or loosely connected tools and legacy data structures? And aren't most of them still saddled with a few gaps or weak links in the chain?
Despite the increasing focus on such a suite approach, the client community still tends to purchase à la carte. Perhaps this is because they want to hedge their bets, have had mixed experiences with some of the individual components in the suite, or don't want to relinquish the one tool that really works for them now in exchange for the unknown. As a result, some sponsors make do with what they have, others plan for the future when the perfect solution will be available, and still others forge ahead to make it work now—using system integrators to glue together the various components and fill in the holes. And maybe some of these will actually get close to the ideal—at least until the versions of the products or standards change.
One noteworthy case study of a sponsor that needed to do it all itself is the National Cancer Institute (NCI), whose ambitious cancer Biomedical Informatics Grid (caBIG) initiative has been developing an integrated, open-source architecture (caGrid) that includes a set of tools for conducting clinical trials and exchanging data.
The caBIG program is heavily committed to using model-driven development along with CDISC and HL7 data standards to provide a comprehensive research infrastructure in which interoperability and integration are built in from the very start. Although it wasn't designed to support commercial product development, it provides a very interesting benchmark for what some sponsors may be looking for in their next-generation systems. Perhaps caBIG offers a glimpse of what the industry really needs in its ideal suite.
In addition to having integrated tools, we also know that sponsors—as well as regulators, researchers, and pretty much everyone—want all the data they need in one place (or at least available from a single access point). EDC systems do this for a single trial, but not so well for a full program or domain of interest. As a result, many sponsors are investing in clinical data warehouses.
Again, standards are critical because they should make it easier to bring the different types of data together. But it can be a lot of work to make existing data conform to these standards and, as we've said, the standards we have don't necessarily cover all the data we're using at this point.
So it's incumbent upon the vendor community to prove that an integrated suite of tools really can operate on the enterprise level and fit smoothly with standards-based data repositories. And vendors must also commit to standards-compatible suite components that can be switched in and out, because it's not realistic to expect sponsors who have invested heavily to implement a particular product—with which they may be fully satisfied—to replace it just to integrate with other related tools.
We now have some evidence that such a degree of convergence is possible. Perhaps we can actually achieve it this time, especially if sponsors and vendors work together more effectively to deliver solutions that work for most everyone. Perhaps, in our lifetime, the convergence our kids take for granted will extend to our professional lives as well—and the age of convergence will finally reach our world too.
Wayne R. Kubick is Senior Vice President and Chief Quality Officer at Lincoln Technologies, Inc., a Phase Forward company based in Waltham, MA. He can be reached at wayne.kubick@phaseforward.com