New ‘Research Data Mart’ To Help Academic Sites Track Trial Performance
By Deborah Borfitz
May 10, 2021 | In the competition for research resources, a university’s ability to conduct clinical trials with speed and efficiency has become a key consideration of the National Institutes of Health (NIH) when doling out grants, according to Leslie A. Lenert, M.D., chief research information officer for Medical University of South Carolina (MUSC) and director of its Biomedical Informatics Center. MUSC’s response was to create a digital platform called Research Integrated Network of Systems (RINS) to generate the kind of performance benchmarks that can help keep institutions in the running.
RINS is designed to facilitate information sharing about studies across six systems—SPARCRequest, an open-source research transaction management system; Epic, one of the leading electronic health record (EHR) systems; Click, an electronic institutional review board (IRB) system; CayuseSP, a system for grants award management; SmartStream, for expenditure tracking; and OnCore, an enterprise-wide clinical trials management system (CTMS)—to reveal which improvements to the clinical trial process are making a difference. Lenert likens the bioinformatics tool to “a medical record for a study.”
Lenert says he is hopeful that RINS will be widely adopted, at minimum among the 27 universities using MUSC-developed SPARCRequest, which serves as the foundation of the new “research data mart.” SPARCRequest was effectively repurposed for metric tracking across the disparate systems.
SPARCRequest users encompass 12 Clinical and Translational Science Awards (CTSA) and Clinical and Translational Research hubs around the country. MUSC does a lot of free process improvement consultations with sites associated with the National Center for Advancing Translational Sciences (NCATS), which runs the CTSA program, says Lenert.
Excepting Click, most of the systems being integrated are commonly used in academia, he notes. The software specially built for those systems, as well as the core of RINS, can easily be reused elsewhere.
The virtual data warehouse is described in an article recently published in the Journal of the American Medical Informatics Association (DOI: 10.1093/jamia/ocab023). A key component is a unique research master identifier (RMID) to both create the linkages between systems and distinguish individual studies, Lenert says.
In 2017, MUSC began requiring the use of RMIDs in informatics systems that support human subject studies. Over the next four years, 5,513 RMIDs were created. Some bridged the systems needed to evaluate research study performance; others linked to the EHR to enable patient-level reporting.
A web-based tool was developed to create the RMIDs, along with application programming interfaces (APIs) to synchronize research records and visualize linkages to protocols across systems.
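The kind of cross-system linkage the article describes can be sketched in a few lines of Python. This is purely illustrative: the record fields, the RMID format, and the merge logic below are invented for the example and are not drawn from RINS itself.

```python
from collections import defaultdict

# Toy exports from three of the integrated systems, each record carrying the
# shared research master identifier (RMID). Field names are assumptions.
sparc_requests = [{"rmid": "R-1001", "service": "budget development"}]
irb_records = [{"rmid": "R-1001", "irb_status": "approved"}]
ctms_records = [{"rmid": "R-1001", "first_patient_enrolled": "2021-02-03"}]

def link_by_rmid(*system_exports):
    """Merge records from each system into one per-study view, keyed by RMID."""
    studies = defaultdict(dict)
    for export in system_exports:
        for record in export:
            fields = {k: v for k, v in record.items() if k != "rmid"}
            studies[record["rmid"]].update(fields)
    return dict(studies)

linked = link_by_rmid(sparc_requests, irb_records, ctms_records)
print(linked["R-1001"]["irb_status"])  # approved
```

The point of the sketch is the join key: because every system tags its record with the same RMID, a per-study view can be assembled without moving the data into one monolithic store.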
“To manage your research processes, you need to be able to link together data in disparate systems,” Lenert says. “Unless you have a process for that, you are continually going to be groping in the dark.” RINS provides the necessary organizational structure “to tell the left hand what the right hand is doing.”
This is not the first time that a clinical data warehouse has been mined for the benefit of translational research, says Lenert. Other institutions have either employed CTSA-developed bioinformatics platforms, adapted proprietary clinical enterprise business intelligence tools, or created their own integration solutions.
The Stanford Translational Research Integrated Database Environment (STRIDE) takes a similar approach to RINS, except that it uses a patient rather than a study identifier to facilitate data integration, as reported in the journal. The Clinical Research Administration (CLARA) system at the University of Arkansas for Medical Sciences is another centralized platform that functions in much the same way, but it extracts data from fewer research systems and cannot trace a trial to its underlying grant award, a linkage that facilitates reporting to funding agencies.
While it would have been easier to amass all the data in one monolithic system, MUSC opted for the federated approach to integration so research administrators could continue using existing “best-of-breed” systems they are accustomed to using, continues Lenert. Questions posed by study teams get answered upon request by a data administrator running RINS in the Biomedical Informatics Center.
“The ability to do a query is pretty tightly controlled at this point,” he says. “This is not an open system any investigator could use.”
The rationale for maintaining the performance metrics is to demonstrate to NCATS that MUSC is accelerating clinical and translational research, Lenert adds. MUSC also looks to be a preferred site for industry-sponsored trials based on its track record for speed and cost-effectiveness.
One of the lessons in working with RINS is the importance of a clinical trial management system in making trials “faster, cheaper, and better,” he says. Up until late last year, MUSC was using Velos but switched to OnCore, a CTMS optimized for cancer trials.
The main difficulty in integrating the different systems was finding the unit of analysis, says Lenert, noting that in the university setting a human subject protocol might refer to a pilot project, NIH-funded trial, or follow-on study—often with a different name at various stages of its evolution. RMIDs are now required to prevent a single study from being variably titled in SPARCRequest, CayuseSP, and Click.
RMIDs denote the principal investigator, department, long title, short title, funding source, and study type. They are a bit like the unique identifiers mandated in the U.S. for medical devices, he says.
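A minimal record type holding the attributes an RMID denotes might look like the following. The field names, the ID format, and the sample values are assumptions made for illustration, not the actual RINS schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RMIDRecord:
    """One study's master-identifier record (attributes per the article)."""
    rmid: str                     # e.g. "R-1001"; the real format is not public
    principal_investigator: str
    department: str
    long_title: str
    short_title: str
    funding_source: str
    study_type: str

record = RMIDRecord(
    rmid="R-1001",
    principal_investigator="J. Doe",
    department="Cardiology",
    long_title="A Phase II Trial of an Example Therapy in Adults",
    short_title="Phase II Example Trial",
    funding_source="NIH",
    study_type="interventional",
)
```

Freezing the dataclass mirrors the identifier's role: once minted, the RMID itself should never change, even as the study's titles drift across systems.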
SPARCRequest answered the need for a place to store the linked identifiers and bring the data together, Lenert explains. It was already being used by investigators to request study-related services from the South Carolina Clinical and Translational Science Research Institute (SCTR, MUSC’s NCATS-funded CTSA) and the university’s Office of Clinical Research, such as help writing an IRB protocol or developing a budget. Thanks to RINS, MUSC can now see if it was money well spent (i.e., the protocol was ultimately approved).
As a management tool, RINS is being used to perform live queries of the integrated systems, says Lenert. The research data mart could be asked, for example, how the availability of biostatisticians impacts the number of grant applications generated and the timeframe for their submission.
Study-level data is generally being examined, such as time to first revenue for industry studies. Given the linkage to Epic, it would even be possible to learn if a specific research procedure resulted in any downstream patient complications, he adds.
Critical data gets extracted nightly to provide current institutional performance metrics in user-friendly dashboards created with the data visualization software Tableau, Lenert says. Among the featured summary data are study startup time, financial performance, and time to enroll the first patient.
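One of the dashboard metrics mentioned, study startup time, can be illustrated with a short sketch. Here it is defined as days from IRB approval to first enrollment; that definition, and the field names and dates, are assumptions for the example rather than RINS's actual formula.

```python
from datetime import date

# Invented per-study records for illustration.
studies = [
    {"rmid": "R-1001", "irb_approved": date(2021, 1, 4), "first_enrolled": date(2021, 2, 3)},
    {"rmid": "R-1002", "irb_approved": date(2021, 1, 11), "first_enrolled": date(2021, 3, 22)},
]

def startup_days(study):
    """Days elapsed between IRB approval and first patient enrolled."""
    return (study["first_enrolled"] - study["irb_approved"]).days

for s in studies:
    print(s["rmid"], startup_days(s))  # R-1001 30, R-1002 70
```

A nightly extract would compute numbers like these across all active studies and push the summaries to the dashboard layer.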
The single dashboard can be used for multiple reporting purposes, he says. Leaders of the SCTR can use it to assess the performance of the research enterprise as a whole and scout for improvement opportunities. Study managers can assess the performance of their specific programs, and their staff can drill down to look for underperforming studies in need of service support.
In terms of generating metrics, “the sky is the limit,” says Lenert. The tricky part is uneven compliance with the RMID mandate. Investigators will sometimes make up a new name for a study if they do not immediately find a previously named one in the database. “Occasionally we have to clean that out and merge the duplicate entities... and that’s done manually [by a dedicated analyst].”
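The manual cleanup Lenert describes could in principle be assisted by a naive duplicate-flagging pass, such as grouping entries by normalized title. This is a hypothetical sketch, not how MUSC's analyst actually works; the entries and matching rule are invented.

```python
from collections import defaultdict

entries = [
    {"rmid": "R-1001", "short_title": "Phase II Example Trial"},
    {"rmid": "R-2042", "short_title": "phase ii example trial "},  # likely duplicate
    {"rmid": "R-1003", "short_title": "Pilot Study"},
]

def candidate_duplicates(entries):
    """Group RMIDs whose titles match after trimming whitespace and case."""
    by_title = defaultdict(list)
    for e in entries:
        by_title[e["short_title"].strip().lower()].append(e["rmid"])
    return {title: rmids for title, rmids in by_title.items() if len(rmids) > 1}

print(candidate_duplicates(entries))  # {'phase ii example trial': ['R-1001', 'R-2042']}
```

A pass like this only surfaces candidates; deciding which entries truly refer to the same study, and merging them, still requires the human judgment the article describes.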
The next step with RINS is to directly link care services supporting basic sciences—e.g., gene sequencing, mouse models, and electron microscopy—with basic science grants, Lenert says.
“RINS has been a game-changer, allowing us to monitor our institutional metrics in real time to quickly assess our clinical and translational research pain points and the impact of our improvements rather than the months it took to collate the data in the past,” says Royce R. Sampson, chief operating officer for SCTR and one of the inventors of SPARCRequest. “In the future, we plan to provide access to key metrics for study teams to benchmark their studies against institutional goals.”