Date & Time: 16:00 - 17:00
Location: Webinar
Timezone: Europe/London
There has been significant innovation in the way clinical trials are designed and conducted in recent years, with the mainstreaming of decentralised clinical trials just one of many advances underway. The next level of innovation will come from the ability to combine clinical trial data with real-world data (RWD). Previously encumbered by privacy concerns and the risk of data being re-identified when combined, the ability to securely link data across the entire patient journey, from trial to real-world settings, will unlock multiple benefits for drug development: from enabling longitudinal drug safety studies to informing trial design with a previously unattainable granularity of patient-level data.
Tokenisation, a de-identification technology that replaces private patient information with an encrypted token that cannot be reverse engineered to reveal the patient, will unlock the power of data connectivity and enable the full potential of real-world data to be realised. The technology can create patient-specific tokens for any data set, allowing a patient’s data to be combined across multiple data sources without compromising privacy.
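For illustration only, the sketch below shows the core idea in Python: a keyed, one-way hash (HMAC-SHA256 is an assumption made for this example) over normalised patient identifiers yields the same token for the same patient in any data set, so records can be linked on the token alone without exposing the underlying identifiers. The key, the choice of fields and the normalisation rules here are hypothetical; commercial tokenisation providers use their own certified processes.

```python
# Illustrative sketch only: a simplified, hypothetical tokenisation scheme.
# It demonstrates the general principle of deriving a consistent,
# non-reversible token from patient identifiers, not any vendor's method.
import hmac
import hashlib
import unicodedata

# Secret key held by the tokenisation provider (hypothetical); without it,
# tokens cannot be regenerated or traced back to the identifiers.
SECRET_KEY = b"provider-held-secret-key"


def normalise(value: str) -> str:
    """Normalise an identifier so minor formatting differences do not change the token."""
    return unicodedata.normalize("NFKC", value).strip().lower()


def tokenise(first_name: str, last_name: str, date_of_birth: str, sex: str) -> str:
    """Derive a deterministic, non-reversible token from patient identifiers.

    The same patient produces the same token in any data set processed with
    the same key, so records can be linked without revealing who the patient is.
    """
    material = "|".join(normalise(v) for v in (first_name, last_name, date_of_birth, sex))
    return hmac.new(SECRET_KEY, material.encode("utf-8"), hashlib.sha256).hexdigest()


# Example: a trial record and a real-world claims record for the same patient
# produce identical tokens, so the two rows can be joined on the token alone.
trial_token = tokenise("Ada", "Lovelace", "1815-12-10", "F")
claims_token = tokenise("ada", "LOVELACE ", "1815-12-10", "f")
assert trial_token == claims_token
```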
This Financial Times Digital Dialogue, held in partnership with ICON, will bring together pioneering early adopters of clinical trial tokenisation and other industry experts to discuss the potentially transformative role of tokenisation.
Issues to be addressed include:
- Defining the concept: what is tokenisation, and what are its role, use cases and benefits in clinical trials?
- How does tokenisation work? Understanding the journey towards tokenisation in clinical trials: key steps and considerations. What kinds of trials are best suited to tokenisation and data linkage (use case, trial size)? When should tokenisation take place: at the beginning, during or at the end of the trial?
- What have been the experiences of early adopters of tokenisation in clinical trials?
- What are the barriers to greater uptake and scaling tokenisation in clinical trials, and how can they be addressed?
- Will patients place trust in tokenisation as a research tool? The importance of patient education and consent. How do you create a level of comfort for patients and manage the burden on sites and investigators?
- Future scoping: what are the potential future use cases of tokenisation? How might the data ecosystem develop? The impact of patients owning their data and the monetisation of patient data
Speakers:
Kathleen Mandziuk
Jason Lott
Henry Wei
Sidharth Jain
Lucinda S. Orsini
Moderator:
Sarah Neville