
Spotlight poster session - Sunday Group 2

Tracks
Track 2
Sunday, November 23, 2025
13:00 - 13:55

Speaker

Graduate Student (Master's Program) Sojeong Bae
Ewha Womans University

Use of surrogate endpoints and real-world evidence in recent oncology NICE submissions

Abstract

Introduction:
Surrogate endpoints are used in oncology to support faster regulatory and reimbursement decisions. Real-world evidence (RWE) has emerged as complementary data in health technology assessment (HTA).

Aims:
To evaluate the use and validity of surrogate endpoints and assess how RWE was utilized in oncology HTA.

Methods:
We reviewed all oncology-related NICE Technology Appraisals published between February 1, 2022 and January 15, 2025 (n=130). Terminated appraisals (n=27) were excluded, and three further appraisals were excluded: two addressing symptom management and one with insufficient evidence.
For each appraisal, we extracted information from the committee papers regarding: (1) the primary endpoint, (2) the maturity of overall survival (OS) data, (3) the level of surrogate validity, and (4) the use of RWE. Fisher's exact test was conducted to examine whether the use of RWE differed according to OS maturity and the use of surrogates.
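The association test can be illustrated with a short sketch: a Fisher's exact test on a 2×2 cross-tabulation of RWE use against surrogate endpoint use. The counts below are placeholders, not the study data.

```python
# Sketch: Fisher's exact test on a 2x2 table of RWE use vs surrogate use.
# The counts are placeholders for illustration only, not the study's data.
from scipy.stats import fisher_exact

#                surrogate used   surrogate not used
table = [[30, 18],   # RWE used
         [48,  4]]   # RWE not used

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")
```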

Results:
Of the 100 appraisals, 78% relied on surrogate endpoints. The most commonly used surrogates were progression-free survival (32%) and overall response rate (26%), with validation lacking in 76% and 95% of cases, respectively. In contrast, all uses of recurrence-free survival, disease-free survival, and event-free survival were supported by full validation (100%). RWE was used in 48% of appraisals, primarily to support clinical effectiveness (21%), as external control data (20%), or for OS validation (10%). Although the difference was not statistically significant (p = 0.39), RWE was less frequently used for clinical effectiveness when surrogate endpoints were used than when they were not. The use of RWE was not associated with OS maturity (OR = 0.95, p = 1.00).

Conclusions:
Surrogate endpoints are commonly used in HTA; however, their validity remains limited. No significant association was found between RWE use and either OS maturity or surrogate use, suggesting limited alignment between RWE application and trial uncertainty.

Keywords:
Surrogate endpoint, Health technology assessment, Real-world evidence

Biography

Sojeong Bae is a master's student in pharmacy at Ewha Womans University, South Korea. She earned her bachelor's degree in pharmacy from the same university, where she developed a strong interest in evidence-based medicine and health policy. Her current research focuses on the use of surrogate endpoints and real-world evidence (RWE) in health technology assessment (HTA), particularly for high-cost medicines such as treatments for cancer and rare diseases. She is conducting a study analyzing the relationship between surrogate endpoint validity and the use of RWE in regulatory and reimbursement decision-making, using NICE appraisal documents and Korean real-world data. Her academic interests include evidence interpretation under uncertainty and real-world data methods for HTA. She aims to contribute to policy-relevant pharmacoepidemiology in the Asia-Pacific region. This is her first time presenting at the ISPE ACPE conference.
Mr Marco Chau
London School of Hygiene & Tropical Medicine

Potential interactions between SSRIs and DOACs: population-based cohort and case-crossover study

Abstract

Introduction: Bleeding is a known side effect of direct oral anticoagulants (DOACs) and selective serotonin reuptake inhibitors (SSRIs). However, it is unknown whether their concomitant use would further exacerbate bleeding risk.

Aims: To compare the hazard of bleeding in patients with concomitant use of DOACs and SSRIs versus non-SSRI antidepressants.

Methods: We performed a population-based cohort and case-crossover study using primary care data from the UK Clinical Practice Research Datalink (CPRD) Aurum between 1/1/2011 and 31/3/2021. The outcomes were intracranial, gastrointestinal, and other bleeding. We used a cohort design to estimate hazard ratios (HRs) with propensity score weighting, comparing DOAC+SSRI and DOAC+non-SSRI antidepressant users. We also conducted a case-crossover analysis using a six-parameter model, comparing the odds of exposure to different drug initiation patterns in the hazard versus referent window within each individual to eliminate time-invariant confounding, and repeated the analysis using non-SSRI antidepressants as negative control precipitants.
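The weighted cohort analysis can be sketched as follows: a propensity score for SSRI (versus non-SSRI) co-prescription estimated by logistic regression, converted to inverse-probability-of-treatment weights, and applied in a Cox model. Column names are hypothetical placeholders, not the authors' code.

```python
# Sketch: IPTW-weighted Cox model comparing DOAC+SSRI vs DOAC+non-SSRI users.
# File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("doac_cohort.csv")                 # hypothetical extract
covariates = ["age", "sex", "prior_bleed", "ckd"]

# 1. Propensity score: probability of SSRI (vs non-SSRI) co-prescription
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["ssri"])
ps = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Inverse-probability-of-treatment weights
df["iptw"] = df["ssri"] / ps + (1 - df["ssri"]) / (1 - ps)

# 3. Weighted Cox model for time to gastrointestinal bleeding (99% CIs)
cph = CoxPHFitter(alpha=0.01)
cph.fit(df[["time_to_gi_bleed", "gi_bleed", "ssri", "iptw"]],
        duration_col="time_to_gi_bleed", event_col="gi_bleed",
        weights_col="iptw", robust=True)
cph.print_summary()
```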

Results: We included 35,782 DOAC users co-prescribed SSRIs and 39,745 co-prescribed non-SSRIs. There was no difference in the risk of bleeding outcomes in the cohort design (intracranial bleeding: HR 1.16, 99% confidence interval [CI] 0.62-2.20; gastrointestinal bleeding: HR 1.09, 99% CI 0.83-1.41; other bleeding: HR 1.01, 99% CI 0.78-1.29). In the case-crossover design, we observed an odds ratio (OR) of 1.64 (99% CI 1.14-2.35) for other bleeding associated with initiation of an SSRI while taking a DOAC, which was greater than that observed for SSRI monotherapy (OR 1.06; 99% CI 1.01-1.11; p for Wald test = 0.002); no corresponding increase was observed in patients who initiated a non-SSRI antidepressant while taking a DOAC (p for Wald test = 0.83).

Conclusions: We found no evidence of increased risk of intracranial and gastrointestinal bleeding when DOACs were used with SSRIs versus non-SSRIs. However, when analysing the specific order of exposures, we found a higher risk of other bleeding associated with initiating SSRIs while taking DOACs.

Keywords: direct oral anticoagulant, drug-drug interaction, selective serotonin reuptake inhibitor

Biography

Marco Chau is a registered pharmacist in Hong Kong. He received his Bachelor of Pharmacy and Master of Clinical Pharmacy degrees from the University of Hong Kong (HKU). Marco is a Board-Certified Pharmacotherapy Specialist (BCPS) and a Board-Certified Cardiology Pharmacist (BCCP). He also obtained a Master of Science in Epidemiology from the London School of Hygiene & Tropical Medicine (LSHTM). Currently, he works as a pharmacist in a public hospital under the Hospital Authority in Hong Kong, where he focuses on optimizing medication therapy and enhancing patient care.
Ms Cherry Chu
Biostatistician
Women's College Hospital

Effectiveness of academic detailing for type 2 diabetes care in Ontario, Canada

Abstract

Introduction: Academic detailing (AD) is a one-on-one evidence-based educational outreach for healthcare providers. AD has been effective in improving prescribing behavior in various contexts; however, its impact on diabetes care in Canada remains underexplored.
Aims: We aimed to compare diabetes prescribing and care patterns between physicians who received AD and those who did not.
Methods: We conducted a population-based matched cohort study using health administrative databases in Ontario, Canada. We included primary care physicians with active billing status from September 2020 to September 2022. Each AD physician was matched to a maximum of four controls based on index year, region, sex, years in practice, and proportion of patients with diabetes. We assessed patient clinical outcomes monthly from 12 months pre- until 18 months post-intervention using mixed-effects modelling to account for matching and repeated measures, and to adjust for physician and patient characteristics.
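The mixed-effects step can be sketched as below: a random intercept for each matched set accounts for matching, and the group-by-time interaction captures the post-intervention difference. Variable names are hypothetical placeholders, and statsmodels is used here purely for illustration.

```python
# Sketch: mixed-effects model for a monthly outcome (e.g., proportion of
# patients on biosimilar insulin), with a random intercept per matched set.
# File and variable names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("ad_monthly_panel.csv")   # one row per physician-month

model = smf.mixedlm(
    "biosimilar_pct ~ ad_group * months_since_index + physician_sex + years_in_practice",
    data=panel,
    groups=panel["matched_set_id"],           # accounts for matching
    re_formula="1",                           # random intercept
)
result = model.fit()
# The ad_group:months_since_index term is the time-by-group effect.
print(result.summary())
```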
Results: The cohort included 372 AD and 1,450 control physicians, with balanced demographics. AD physicians saw fewer patients (1,292 vs. 1,526) but delivered more appointments per patient (4.2 vs. 3.0). Both groups had 15% of patients with diabetes. Post-intervention, biosimilar insulin use increased more sharply in the AD group compared to controls (average 9.0% vs. 5.6% of patients monthly). Patients of AD physicians consistently had higher B12 testing among those using metformin (76.5% vs. 60.0%) and greater use of SGLT2 inhibitors or GLP-1 receptor agonists (40.1% vs. 31.5%). Patient A1C control (defined as <8%) remained similar across groups (~80%). Time × group differences were significant for biosimilar insulin and SGLT2/GLP-1 prescribing (p<0.001), but not for B12 testing (p=0.790) and A1C levels (p=0.815).
Conclusions: AD was associated with improved diabetes prescribing patterns but did not affect other outcomes of interest. AD’s success in improving diabetes care underscores the need to maximize physician engagement.
Keywords: diabetes, quality improvement, primary care

Biography

Cherry is a Biostatistician providing statistical support and oversight for numerous healthcare research studies. She received her Bachelor of Science Honours in Life Sciences at Queen’s University and a Master of Science in Epidemiology at McGill University in Canada. Her work has included leading and supporting studies to answer important research questions related to drug policy and drug utilization while leveraging population health data. She also advises various project teams on quantitative analytical methods in general, and has experience analyzing data from a variety of sources, ranging from survey and vendor data, to clinical records.
Dr. Gangadharappa HV
Associate Professor and Head
JSS College of Pharmacy, JSS Academy of Higher Education & Research

Digital Solutions in Materiovigilance for enhanced safe use of Medical Devices

Abstract

Introduction: Medical devices are essential tools in modern healthcare, playing a critical role in diagnosis, monitoring, and treatment. Additionally, to further advance materiovigilance, integration with electronic health systems and predictive technologies is necessary.
Aim: To develop and validate a machine learning–based tool for the prediction and structured analysis of medical device adverse events (MDAEs).
Methodology: A machine learning–powered tool was developed, with a Random Forest algorithm as the predictive model for detecting potential risks. To evaluate the tool's functioning, MDAE data were collected by active surveillance, after patient consent, from the intensive care unit and the departments of pulmonology, urology, nephrology, neurology, general medicine, surgery, and paediatrics of a tertiary care hospital. The collected data were uploaded to the tool and its functionality was assessed. MDAEs reported both manually and through the machine learning–powered tool were assessed for causality, severity, and device disposition according to Materiovigilance Programme of India guidelines version 1.2.
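A minimal sketch of the Random Forest prediction step, assuming a flat table of surveillance records with hypothetical feature names (this is not the authors' tool):

```python
# Sketch: Random Forest classifier flagging patients at risk of an MDAE.
# Features and the CSV layout are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

data = pd.read_csv("mdae_surveillance.csv")
features = ["age_group", "department", "device_type", "invasive", "single_use"]
X = pd.get_dummies(data[features])        # one-hot encode categorical fields
y = data["mdae"]                          # 1 = MDAE occurred, 0 = none

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```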
Results: During the study period, a total of 1,857 patients were reviewed, of whom 122 developed MDAEs during their hospital stay. The MDAEs reported were IV cannula-associated thrombophlebitis (46.72%), catheter-associated urinary tract infections (18.85%), and central line–associated bloodstream infections (14.75%). Most devices were single-use (83.6%), assistive (74.59%), and invasive (91.8%). Causality assessment classified 90.98% of cases as probable. Serious events accounted for 33.6%, and the recovery rate was 89.34%. Infants were identified as a statistically significant high-risk group for MDAEs (OR: 12.5, p = 0.0159). The developed ML tool accurately predicted risk patterns, aiding proactive MDAE monitoring and report generation. The Random Forest technique used in the current study successfully predicted and categorized MDAEs.
Conclusion: The findings support the integration of machine learning technology into active surveillance frameworks and call for increased awareness to improve reporting practices and strengthen regulatory mechanisms to enhance patient safety.
Keywords: Medical devices, MDI

Biography

Dr. Gangadharappa is a recognized research supervisor for M.Pharm and Ph.D. programs at JSSAHER, having successfully guided 8 Ph.D. scholars and 71 postgraduates, with several others currently under his mentorship. His research expertise lies in innovative drug delivery technologies, including microneedles, silk fibroin nanoparticles, graphene-based carriers, and polymeric systems for cancer, autoimmune disorders, and medical devices. His translational research contributions have led to five Indian patents. He has a prolific publication record, with 140 papers published in national and international peer-reviewed journals with a cumulative impact factor exceeding 230, and his recent works include advanced reviews and original studies on microneedles, silk fibroin, graphene nanoribbons, and carbon nanotubes for targeted therapy. He has also contributed to 10 book chapters, published by reputed publishers including Elsevier, Springer Nature, CRC Press, and Academic Press, covering areas such as carbon nanotube-based delivery, gene therapy materials, 3D-printed microneedles, and stem cell therapy for neurodegenerative disorders.
MSc Hotaka Maruyama
Reviewer
Pharmaceuticals and Medical Devices Agency

Cardiovascular risk of romosozumab versus teriparatide: cohort study using Japan’s national database

Abstract

Introduction:
Disproportionality analyses suggested a cardiovascular risk signal for romosozumab, while statistically significant associations were not found in real-world database studies in Japan. However, the main study populations in those studies did not include patients with a history of major adverse cardiovascular events (MACE), and the databases used captured only a small proportion of Japanese patients with osteoporosis. Therefore, a larger comparative study was necessary to examine this risk.

Aims:
This study aimed to compare the cardiovascular risks of romosozumab with those of teriparatide in the overall population and in groups with a history of MACE.

Methods:
A new user cohort study was conducted using Japan’s national claims database. Patients aged ≥40 years who initiated romosozumab or teriparatide between March 2019 and March 2023 were analyzed. A multivariable Cox proportional hazards model was used to estimate the adjusted hazard ratios (aHR) for MACE. Subgroup analyses were performed based on MACE history.
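The adjusted comparison can be sketched as a multivariable Cox model fitted overall and within each MACE-history subgroup; file and column names are hypothetical placeholders, not the authors' analysis code.

```python
# Sketch: multivariable Cox model for MACE, romosozumab vs teriparatide,
# fitted overall and within subgroups defined by MACE history.
# Column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("osteoporosis_new_users.csv")
covars = ["romosozumab", "age", "male", "diabetes", "hypertension"]

def fit_ahr(df):
    """Return the adjusted HR for romosozumab vs teriparatide."""
    cph = CoxPHFitter()
    cph.fit(df[["followup_days", "mace"] + covars],
            duration_col="followup_days", event_col="mace")
    return cph.hazard_ratios_["romosozumab"]

print("Overall aHR:", fit_ahr(cohort))
# Subgroups, e.g. no history / within 1 year of t0 / more than 1 year before t0
for label, sub in cohort.groupby("mace_history"):
    print(label, "aHR:", fit_ahr(sub))
```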

Results:
A total of 251,219 romosozumab and 500,445 teriparatide users were analyzed (most common age group was 80–89 years for both drugs; male: 9.33% for romosozumab and 14.14% for teriparatide). MACE occurred in 1,853 romosozumab and 3,427 teriparatide users, with incidence rates of 1.09 and 1.22 per 100 person-years, respectively. The aHR (95% confidence interval [CI]) for romosozumab compared to teriparatide was 1.00 (0.94–1.06). In subgroup analyses based on MACE history, the aHRs (95% CI) for no history, for the one-year period leading up to t0, and for more than one year before t0 were 1.01 (0.95–1.08), 0.93 (0.72–1.21), and 1.00 (0.85–1.18), respectively.

Conclusion:
No statistically significant difference in MACE risk was observed between romosozumab and teriparatide in Japan's national claims database, regardless of MACE history.

Keywords: Romosozumab, Cardiovascular risk, Nationwide observational study

Biography

Mr. Hotaka Maruyama is a pharmacoepidemiologist at the Pharmaceuticals and Medical Devices Agency (PMDA), Japan. In 2016, he started his career as a reviewer in the regulatory authority, mainly involved in assessing causal relationships between medicines, especially central nervous system drugs, and adverse reactions, based on case series reports, disproportionality analyses, and the scientific literature. He then joined the division of pharmacoepidemiology in 2019 to perform pharmacoepidemiological studies and review post-marketing safety study protocols as a pharmacoepidemiologist under the scientific supervision of Drs. Yoshiaki Uyama and Masao Iwagami.
Dr Lejin Mathew
Independent Researcher

Clinical Comparison of Magnesium versus traditional bone implants in fracture fixation

Abstract

Introduction: Magnesium-based orthopaedic implants are emerging as promising alternatives to traditional bone implants such as Titanium. These implants are biodegradable, offer the potential to reduce stress shielding, eliminate the need for implant removal surgery, and allow clearer magnetic resonance imaging (MRI). Traditional implants, though effective, are associated with surgical complications and MRI artefacts. This study explores whether Magnesium bone implants can match or outperform the clinical outcomes of traditional implants in real-world surgical settings.

Aims: This study aimed to evaluate the clinical performance of biodegradable Magnesium implants (screws) in comparison to Titanium implants, focusing on healing, complication rates, implant removals and imaging outcomes.

Methods: A qualitative case study approach was used to review two published randomised clinical trials involving Magnesium and Titanium screws for distal metatarsal osteotomies in hallux valgus correction. Both trials followed 26 patients, equally split between the two implant groups. The case studies were assessed based on reported healing times, pain and function scores, radiological findings, MRI clarity, and any need for implant removal. The goal was to interpret real-world clinical evidence in a controlled surgical setting.

Results: In both studies, patients with Magnesium and Titanium screws experienced similar healing within 6 months. No significant differences were observed in functional scores or pain relief. There were no implant-related complications in either group. By 3 years, Magnesium implants remained stable and gradually degraded without compromising bone integrity. MRI scans showed notably fewer artefacts in the Magnesium group, enhancing visibility of the surgical site. Importantly, none of the Magnesium screws required removal, unlike Titanium screws.

Conclusions: Magnesium screws showed similar short and mid-term clinical results compared to Titanium screws, with clear benefits in imaging and avoiding secondary removal surgeries. These findings support ongoing evaluation of their use in low-load orthopaedic procedures.

Keywords: Magnesium implants, biodegradable screws, case study review

Biography

Dr. Lejin Mathew is a PharmD graduate. His academic and professional interests lie in pharmacovigilance, regulatory science, and medication safety. He has collaborated on observational studies in oncology, conducted prescription audits in clinical settings, and contributed to regulatory feasibility research. His current work explores the feasibility and regulatory pathways for magnesium-based biomaterials as bone implants, with a focus on biocompatibility, degradation behaviour, and compliance with EU and FDA requirements. Dr. Mathew is also a member of TOPRA and ISPOR and is committed to advancing evidence-based approaches that also account for regulatory decision-making to help improve therapeutic outcomes. He aims to combine clinical insight with evolving regulatory standards to improve drug and device safety in healthcare environments.
Dr Manojkumar V
Research Scholar
JSS College of Pharmacy, Ooty

Study on prevalent CYP2C19 variant-driven sertraline PopPK in South Indian patients

Abstract

Introduction: Sertraline is principally metabolized by the CYP2C19 pathway, and South Indians exhibit an unusually high frequency of CYP2C19 loss-of-function alleles. Consequently, standard dosing may lead to overexposure and toxicity in this population. Population pharmacokinetic (PopPK) analysis enables quantification of such genetic effects at the bedside. Understanding this interaction is essential for precision dosing of antidepressants in resource-diverse settings.

Aim: This study aimed to quantify the prevalence of CYP2C19 poor-metabolizer genotypes in South Indian sertraline users and to determine how these variants affect sertraline PopPK parameters.

Methods: This study was approved by the IEC, JSSMC, Mysore. In this open-label study conducted at JSSMC, Mysore, 104 South Indian adults receiving steady-state sertraline provided serial plasma samples. Sertraline and desmethyl-sertraline concentrations were measured by validated HPLC, and CYP2C19 *2/*3 alleles were detected by RFLP genotyping. Sampling times, doses, demographics, and genotypes were modeled in NONMEM® using a one-compartment structure with first-order absorption; covariates were selected by forward-addition/backward-elimination (α = 0.05). Model stability was confirmed by bootstrap and VPC.
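The structural model can be illustrated with a short sketch: a one-compartment model with first-order absorption, applying the reported reduction in CL/F for poor metabolizers. The absorption rate constant and dose below are assumed placeholders, and the actual estimation was performed in NONMEM.

```python
# Sketch: one-compartment oral model with first-order absorption, with a
# fractional reduction in CL/F for CYP2C19 poor metabolizers.
# ka and dose are placeholder assumptions, not model estimates.
import numpy as np

def conc(t, dose, cl_f=76.8, v_f=1870.0, ka=1.0,
         poor_metabolizer=False, pm_cl_fraction=0.65):
    """Plasma concentration (mg/L) at time t (h) after a single oral dose (mg)."""
    if poor_metabolizer:
        cl_f = cl_f * pm_cl_fraction          # ~35% lower CL/F in PMs
    ke = cl_f / v_f                           # elimination rate constant (1/h)
    return dose * ka / (v_f * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0, 48, 200)
c_em = conc(t, dose=50.0)                          # extensive metabolizer
c_pm = conc(t, dose=50.0, poor_metabolizer=True)   # poor metabolizer
print(f"Cmax EM ~ {c_em.max():.4f} mg/L, Cmax PM ~ {c_pm.max():.4f} mg/L")
```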

Results: Poor-metabolizer genotypes (*2/*2 or *3/*3) were detected in 13 of 104 patients (12.5%). The final PopPK model estimated typical oral clearance (CL/F) at 76.8 L h⁻¹ (relative standard error 8%) and volume of distribution (V/F) at 1870 L. Age and CYP2C19 status significantly reduced clearance (ΔOFV = -37; p < 0.001), with poor metabolizers showing a 35% lower CL/F versus extensive metabolizers. Correspondingly, the parent-to-metabolite ratio at C_max fell by 42–48% in poor metabolizers (p = 0.002).

Conclusions: The notable 12.5% prevalence of CYP2C19 poor-metabolizer genotypes among South Indians meaningfully slows sertraline clearance, supporting a 25-50% empirical dose reduction, especially in elderly patients. The validated PopPK model offers a practical tool for genotype-guided sertraline dosing and underscores the value of pharmacogenomics in regional precision psychiatry.

Keywords: Sertraline, CYP2C19 polymorphism, Population pharmacokinetics

Biography

Dr. Manojkumar Venkatesan is a Pharm.D graduate and current Ph.D. scholar specializing in pharmacometrics, with a research focus on oncology precision medicine. His scientific journey began with a deep interest in pharmacokinetics during his undergraduate studies, which evolved into expertise in advanced modeling and simulation. He has contributed to multiple research projects involving dosage optimization and virtual patient simulations using platforms such as NONMEM, Pumas, and the Monolix Suite. Dr. Manojkumar has presented his work at national and international scientific forums and earned recognition, including first prize in a national review article competition and a best oral presentation award. With an expanding portfolio of academic contributions and publications, he is actively involved in integrating clinical insights with computational pharmacology. His goal is to enhance therapeutic outcomes by promoting data-driven, individualized treatment strategies through model-informed drug development and precision dosing.