The Case For Tokenizing Data On Clinical Trial Participants
By Deborah Borfitz
March 29, 2022 | Based on a lively panel discussion at the recent Summit for Clinical Ops Executives (SCOPE), it appears enthusiasm is building for the “tokenization” of clinical research participants to link information from trials to real-world data (RWD) on the same patients. The rationale includes the ability to track individuals more easily over the long haul and contextualize adverse events detected in studies, says Tim Riely, vice president of clinical data analytics at IQVIA.
Unique, anonymized identifiers, or tokens, have been used for years now to link different sources of patient-level RWD while protecting patient privacy, he says. Processes and systems are in place to do the same on the clinical research side and join the two to find the overlap.
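As an illustration of the underlying mechanics only (the vendors represented on the panel do not publish their exact algorithms, so the field choices and key handling below are assumptions), a token is typically a one-way keyed hash of normalized identifying fields:

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this is managed by the
# tokenization vendor, not by the data holders themselves.
TOKEN_KEY = b"example-secret-key"

def make_token(first_name: str, last_name: str, dob: str, sex: str, zip3: str) -> str:
    """Derive a privacy-preserving token from normalized identifying fields.

    The same person yields the same token in any dataset, so records can
    be linked without ever exchanging the underlying identifiers.
    """
    # Normalize so trivial formatting differences don't break matching
    fields = [first_name.strip().lower(), last_name.strip().lower(),
              dob, sex.upper(), zip3]
    message = "|".join(fields).encode("utf-8")
    # Keyed one-way hash: the token cannot be reversed to recover the PII
    return hmac.new(TOKEN_KEY, message, hashlib.sha256).hexdigest()

# The same patient appearing in a trial database and in a claims feed
# produces an identical token, enabling linkage across the two.
print(make_token("Ada", "Lovelace", "1815-12-10", "F", "191"))
```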
Tokenization is a means to “future-proof” studies and entire research programs, according to Sandy Leonard, senior vice president of partnerships and RWD at HealthVerity. When done on the front end, the process can help study sponsors nimbly respond to the push-pull associated with regulatory approvals and evidence requirements.
As has been seen with the U.S. Food and Drug Administration (FDA) and other regulatory bodies, submission of real-world evidence as a complement to trial data can reduce if not remove post-marketing commitments, adds Kathleen Mandziuk, vice president of real-world solutions and tokenization at ICON. If companies already have a cohort of patients exposed to a drug, tokenization can also be useful for product positioning in the market.
Tokens provide a way to stitch together data fragments from trials and the real world, potentially in lieu of a phase 4 or long-term follow-up study, says Vera Mucaj, chief scientific officer at Datavant. “Three to five years is a long time [to devote to such studies].”
When AbbVie started its journey with tokens, the list of potential use cases easily grew to 40, says Kyle Holen, M.D., head of advanced analytics and data. The tokenizing of trial participants offers a “ton of value” to the company in terms of better understanding patients who sacrificed to be part of a trial.
Holen says he was frustrated by reports in the literature that Humira (an AbbVie drug originally approved in 2002 for the treatment of rheumatoid arthritis) was having a clinically meaningful effect in delaying the onset of Alzheimer’s disease. Longer-term, secondary impacts like this are undiscoverable in the absence of a 360-degree view of the thousands of participants responsible for a product’s market launch.
Decentralized clinical trials involving self-reported data and wearable technology are an “easy starting place” for tokenization, according to Craig Lipset, co-chair of the Decentralized Trials and Research Alliance. It would be “just another smart data stream within the flow.” RWD might come via a simple record retrieval step that fits the therapeutic culture, helping to ensure diversity and the quality of data relied on for decision-making.
The Holdups
Few companies intrigued by the idea of patient-identifying tokens have started implementing them in their research programs, as evidenced by the four hands raised in a room of 50 clinical operations executives participating in the dialogue. The “soft reasons,” offers Lipset, revolve around education and awareness. Tokenization is a “foreign and unfamiliar term that sounds like blockchain [to which it bears no relation].”
ICON built an offering over two years ago, says Mandziuk, and it has taken until now to understand all the components—including the necessary partnerships, regulatory concerns, consent and data collection processes, when and how to tokenize patients, and the privacy framework. Tokenization “hits all parts of our organization and takes a lot of time and validation… [and] education,” as people have been told their entire careers that working with direct patient identifiers can’t be done.
An eConsent tool can capture a patient’s name but not all the other required information, notes Riely, which could possibly be accessed via a site portal. Education of sites needs to happen up front since it is “harder to do downstream.”
Legal and compliance issues often get wrongly blamed for the holdup in implementing tokens, says Leonard. Organizational leaders need to see the vision, notably accelerating trials and reducing costs.
Tokenizing trials at scale can also “generate better science,” adds Mucaj, but it can’t be done by disrupting the current trial process. Datavant, for its part, has embedded its technology for tokenization into existing workflows and helped educate site staff.
Scaling The Concept
Availability of population-based data will determine how far the concept expands outside the U.S., where tokens will need to be adjusted, Mandziuk says. Over time, as data become increasingly accessible, the regulatory how-tos will also evolve. “The fact we’re able to do this in the U.S. … creates more opportunities outside the U.S.”
“We can’t let the lack of a common denominator [stall] progress,” says Lipset. During the wait time for a global framework to emerge, companies could “bring to life” different narrow use cases. “Staying still will be falling behind.”
Tokenized data carries the potential to re-identify individuals once different datasets are linked, says Holen, who proposes a “money in the bank” scenario that accommodates both risk mitigation and global expansion. He’d like to see tokens used as much as possible in preparation for their unknown but potentially significant post-study utility, including early detection of drug side effects.
Datavant has clients utilizing token technology now, Mucaj says, noting that the FDA uses RWD for regulatory decision-making. Data collected in routine healthcare can’t be expected to always capture the same endpoints as a trial, but a lot of research is underway to identify new and more powerful endpoints using RWD (e.g., surrogate endpoints, based on mortality data, in oncology trials).
By tokenizing patients, a study that today uses time to next treatment as an endpoint might instead look at claims data to see who stayed on an intervention and how they fared, Mucaj continues. That could make tokenization useful for positioning a drug in the market.
Tackling The Naysayers
White papers, blogs, and even regulatory guidance won’t by themselves boost adoption of tokens, says Leonard. “It’s a conversation… [about] how an identifier is generated and how [patient records are] matched and how you know what data to bring in and how to control who gets access to hashes. It’s not the same question for each use case.”
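A minimal sketch of the access-control point Leonard raises, assuming a re-keying scheme in which a master token is transformed into a recipient-specific token (the scheme is illustrative, not any vendor's documented design). Because each recipient's tokens are derived with a different key, two recipients cannot join their datasets against each other without going back through the controlling party:

```python
import hashlib
import hmac

def recipient_token(master_token: str, recipient_key: bytes) -> str:
    """Transform a master token into a token usable only by one recipient.

    Each recipient receives tokens derived with its own key, so tokens
    handed to one party cannot be matched against another party's data.
    """
    return hmac.new(recipient_key, master_token.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Hypothetical per-recipient keys held by the tokenization provider
sponsor_key = b"key-issued-to-sponsor"
payer_key = b"key-issued-to-payer"

master = "a3f1c9e7d2b845f6"  # hypothetical master token
print(recipient_token(master, sponsor_key))  # sponsor's view of the patient
print(recipient_token(master, payer_key))    # differs from the sponsor's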
In talking with different stakeholders—corporate legal compliance, IT, and clinical operations—find out why they would be willing to take the risk in the first place, Leonard advises. They may want to tokenize simply to better understand the comorbidities of a population, and that use case could create a comfort level in leveraging RWD. “Not everyone wants to jump in on the deep end.”
At AbbVie, convincing stakeholders to move forward with tokens was “a bear,” says Holen. Lawyers in the room were resistant to the idea, worried about possible discrepancies between clinical trial and real-world data. Sites also “freaked out,” which education helped remedy.
The word “token” tends to elicit an unfounded fear of losing would-be study volunteers, says Holen, pointing out that 18 of a consent form’s 19 pages talk about the possibility of death. “Do you really think this one word is a problem?” he rhetorically asks.
The good news, in terms of scaling tokenization of research, is that RWD has already been tokenized, says Mucaj. A clinical trial data token would serve as the bridge connecting the two types of information.
Linking Data
Analyzing linked data for insights requires safeguards to mitigate the risk of re-identifying clinical trial participants so privacy is maintained, Mandziuk says. Symphony datasets, owned by ICON, are “claims data but not deep data,” she adds, which can be linked with other datasets and tokens to create scale at the industry level.
HealthVerity, for its part, makes sure patient-level data from electronic health records and electronic data capture systems are being aggregated on the same individuals so tokens are highly accurate, says Leonard. It helps to start with highly curated clinical data that facilitates integration.
A variety of wraparound services exists to help with tokenization, but HealthVerity’s role is identity resolution and technology-enabled privacy and security, Leonard says. The company was involved in the COVID-19 vaccine rollout.
Referencing its 2021 merger with PRA Health Sciences, Mandziuk says ICON has run clinical trials utilizing tokens. Multiple tokens can be applied, together with analytics, to glue together an end-to-end service.
Any use case needs pipes to link two datasets together, says Mucaj, and Datavant’s tokenization team supports clients to make that a reality. Its wraparound services fall in the connect, control, and comply buckets.
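At their simplest, those pipes reduce to a join on the token column. A minimal pandas sketch, with made-up column names, of finding the overlap between a tokenized trial roster and a tokenized claims extract:

```python
import pandas as pd

# Hypothetical tokenized datasets; column names are illustrative only
trial = pd.DataFrame({
    "token": ["t1", "t2", "t3"],
    "arm": ["drug", "placebo", "drug"],
})
claims = pd.DataFrame({
    "token": ["t2", "t3", "t4"],
    "days_on_therapy": [120, 45, 300],
})

# Inner join on the token finds trial participants who also appear in
# the real-world claims feed -- the "overlap" the panelists describe
linked = trial.merge(claims, on="token", how="inner")
print(linked)
```

In practice the join happens inside a privacy-controlled environment rather than on an analyst's laptop, but the core operation is the same.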
Licensing Can Wait
Large pharma, of course, has entire data science teams to help with implementations, notes Mandziuk. Small biotechs want a turnkey solution for a single study.
Datasets are expensive to license and unaffordable for many startups, interjects Holen. “You can’t link data if you don’t have rights to the data.” The tokenization process, on the other hand, is relatively inexpensive. “We got the cost down to less than $10 a patient.”
Data licensing can wait, says Lipset. “Just do the token for now… move one step to the right, spread knowledge [and] share your expertise.”
Tokenizing trial participants is a show of respect for their time in the study, says Mucaj. Trial volunteers are generally very willing to altruistically share their data with researchers provided their personal identity is not revealed.