Duke-NUS Innovates Global Regulatory Frameworks for Generative AI in Medical Imaging

Singapore Leads Push for Adaptive AI Oversight in Healthcare Devices

  • generative-ai
  • singapore-higher-education
  • research-publication-news
  • duke-nus
  • medical-imaging



The Rise of Generative AI in Medical Imaging: Opportunities and Challenges

Generative Artificial Intelligence (GenAI), powered by advanced models like diffusion models and large language models (LLMs), is transforming medical imaging. These technologies can synthesize realistic images from text prompts or incomplete data, aiding in tasks such as anomaly detection, image reconstruction, and virtual patient simulations. In radiology, for instance, GenAI enhances MRI scans by filling in gaps from motion artifacts or generates synthetic CT images to reduce radiation exposure. However, this rapid evolution raises critical questions about reliability, bias, and safety in clinical use.
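As a toy illustration of the reconstruction idea, the sketch below fills a masked region of a simulated scan by iterative neighbourhood averaging. This is only a stand-in for the pattern: a real GenAI system would use a trained generative prior such as a diffusion model, not this smoothing heuristic.

```python
import numpy as np

def inpaint_masked(image, mask, iters=50):
    """Fill masked pixels by iterative neighbourhood averaging.

    A toy stand-in for a learned generative prior: real GenAI
    reconstruction would use a trained diffusion model instead
    of this simple smoothing heuristic.
    """
    filled = np.where(mask, 0.0, image)  # zero out the missing pixels
    for _ in range(iters):
        # average of the 4-neighbours via shifted copies of the image
        neigh = (np.roll(filled, 1, 0) + np.roll(filled, -1, 0)
                 + np.roll(filled, 1, 1) + np.roll(filled, -1, 1)) / 4.0
        filled = np.where(mask, neigh, image)  # only update masked pixels
    return filled

# Simulated uniform scan with a small motion-artifact gap
scan = np.ones((8, 8))
mask = np.zeros((8, 8), dtype=bool)
mask[3:5, 3:5] = True            # the missing region
result = inpaint_masked(scan, mask)
```

The masked pixels converge toward the surrounding intensity while all observed pixels are left untouched, which is the same constraint a learned inpainting model must respect.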

Singapore, a hub for biomedical innovation, is at the forefront through institutions like Duke-NUS Medical School. Recent advancements highlight GenAI's potential to democratize high-quality imaging in resource-limited settings, but underscore the need for robust oversight to prevent harms like hallucinated pathologies or amplified dataset biases.

Duke-NUS npj Digital Medicine Paper: A Landmark Perspective

Published on March 19, 2026, in npj Digital Medicine, the paper "Innovating global regulatory frameworks for generative AI in medical devices" by Jasmine Chiat Ling Ong and colleagues from Duke-NUS' Centre of Regulatory Excellence (CoRE) and international partners calls for urgent regulatory evolution. Led by Duke-NUS researchers, it dissects risks unique to GenAI—such as data poisoning vulnerabilities and non-deterministic outputs—and critiques static frameworks unfit for dynamic AI.

The authors, including experts from Stanford, University of Birmingham, and Duke-NUS AI + Medical Sciences Initiative, emphasize multidisciplinary collaboration. Duke-NUS' role exemplifies Singapore's higher education leadership in bridging AI research and policy.


Key Risks Highlighted in GenAI Medical Devices

GenAI in medical imaging risks perpetuating biases if training data underrepresents diverse populations, leading to inequitable diagnostics. For example, models trained on Western datasets may falter on Asian ethnicities common in Singapore. Other concerns include adversarial attacks altering images imperceptibly and ethical issues like synthetic data attribution.

  • Hallucinations: Fabricating non-existent tumors in scans.
  • Vulnerabilities: Prompt injection exploiting LLMs.
  • Equity gaps: Poor performance in low-resource settings.
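One concrete way to surface the equity gaps above is to compare model accuracy across demographic subgroups. The sketch below is a minimal, illustrative check (the labels, predictions, and group names are hypothetical, not from the paper): a large gap between the best- and worst-served subgroup flags potential underrepresentation in training data.

```python
import numpy as np

def subgroup_gap(y_true, y_pred, groups):
    """Largest accuracy gap between demographic subgroups.

    A minimal equity check: a wide gap suggests the model may
    underperform on populations underrepresented in training.
    All data below is illustrative.
    """
    accs = {}
    for g in set(groups):
        idx = [i for i, gg in enumerate(groups) if gg == g]
        accs[g] = float(np.mean([y_true[i] == y_pred[i] for i in idx]))
    return max(accs.values()) - min(accs.values()), accs

# Hypothetical screening results for two subgroups
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 1, 1, 0, 0]
groups = ["A", "A", "A", "B", "B", "B", "B", "B"]
gap, per_group = subgroup_gap(y_true, y_pred, groups)
```

Production audits would use clinically meaningful metrics (sensitivity, specificity, calibration) per subgroup rather than raw accuracy, but the reporting pattern is the same.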

The paper advocates algorithmovigilance, akin to pharmacovigilance, for post-market monitoring.
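The algorithmovigilance idea can be sketched as a rolling post-market monitor: track recent agreement between model output and clinician ground truth, and raise an alert when performance drifts below a threshold. The class, window size, and threshold below are illustrative assumptions, not a mechanism specified in the paper.

```python
from collections import deque

class Algorithmovigilance:
    """Rolling post-market monitor, analogous to pharmacovigilance.

    Hypothetical sketch: tracks recent agreement between model
    outputs and clinician labels, alerting when accuracy over the
    window drops below a threshold. Parameters are illustrative.
    """
    def __init__(self, window=100, threshold=0.90):
        self.window = deque(maxlen=window)   # recent agree/disagree flags
        self.threshold = threshold

    def record(self, model_output, clinician_label):
        self.window.append(model_output == clinician_label)

    def alert(self):
        if not self.window:
            return False
        accuracy = sum(self.window) / len(self.window)
        return accuracy < self.threshold

monitor = Algorithmovigilance(window=100, threshold=0.90)
for _ in range(80):
    monitor.record(1, 1)   # model agrees with clinician
for _ in range(15):
    monitor.record(1, 0)   # drift: disagreements accumulate
```

A real deployment would also log case metadata for root-cause analysis and feed alerts back to the regulator, mirroring adverse-event reporting for drugs.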

Current Global Regulatory Landscape

The U.S. Food and Drug Administration (FDA) has cleared over 1,000 AI/ML-enabled devices, many for imaging, via the 510(k) pathway, but lacks GenAI-specific rules. Its 2025 Action Plan emphasizes lifecycle management yet struggles with adaptability (see the FDA's AI/ML SaMD guidance).

In the EU, the AI Act (effective 2024) classifies medical AI as high-risk, mandating transparency and human oversight, intersecting with the Medical Device Regulation (MDR).

Singapore's HSA: Pioneering AI Medical Device Oversight

Singapore's Health Sciences Authority (HSA) achieved WHO Maturity Level 4 in 2026, the highest globally, with AI-enabled risk classification tools and updated Software Medical Devices guidelines (GL-04 r4, Dec 2025). HSA's framework covers lifecycle approaches for AI-SaMD, including public institution notifications (see the HSA Software Medical Devices guidelines).

Duke-NUS CoRE contributes via initiatives like regulatory sandboxes, aligning with IMDA's Privacy Enhancing Tech sandboxes for safe GenAI testing.


Case Studies: GenAI in Action and Regulatory Hurdles

In diabetic retinopathy screening, GenAI combined with federated learning (a Duke-NUS example) preserves privacy across sites. Yet FDA-cleared imaging AIs exhibit predicate-network vulnerabilities, and Singapore trials of GenAI for chest X-rays still need equity safeguards.
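The federated learning pattern mentioned above can be sketched as federated averaging (FedAvg): each hospital trains locally, raw patient images never leave the institution, and only weight vectors are pooled. This is a minimal illustration of the general technique, not the actual Duke-NUS pipeline; all weight values are made up.

```python
import numpy as np

def federated_average(site_weights, site_sizes):
    """FedAvg: combine model weights trained at separate hospitals.

    Each site contributes in proportion to its local dataset size;
    only weights are shared, never patient data. Illustrative sketch.
    """
    total = sum(site_sizes)
    return sum((n / total) * w for w, n in zip(site_weights, site_sizes))

# Two hypothetical hospitals with different amounts of screening data
w_site1 = np.array([0.2, 0.4])   # locally trained weights (illustrative)
w_site2 = np.array([0.6, 0.0])
global_w = federated_average([w_site1, w_site2], [100, 300])
```

In practice the averaging step repeats over many communication rounds, and secure aggregation or differential privacy is layered on top so that even the shared weights leak as little as possible.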

| Region | Key Framework | GenAI Focus |
|---|---|---|
| USA (FDA) | AI/ML Action Plan | Lifecycle management; no GenAI-specific rules |
| EU | AI Act + MDR | High-risk classification; transparency |
| Singapore (HSA) | Software MD Guidelines | AI risk tool; sandboxes |

Duke-NUS Recommendations: Adaptive and Collaborative Paths

The paper proposes:

  • Global regulatory science centers (e.g., expand Duke-NUS CoRE model).
  • Ethical tools: CARE-AI checklists, TRIPOD-LLM reporting.
  • Sandboxes for interoperability testing.
  • IMDRF harmonization including HSA, FDA.

Read the full npj Digital Medicine paper for detailed strategies.

Singapore Higher Education's Pivotal Role

Duke-NUS, alongside NUS and NTU, drives AI-health convergence. CoRE trains regulators, fostering Singapore's ecosystem. Universities collaborate on federated learning, addressing data silos while complying with PDPA.

Stakeholder Perspectives: From Clinicians to Policymakers

Clinicians praise GenAI for workflow efficiency but demand explainability. Regulators like HSA prioritize equity; ethicists urge diverse datasets. Duke-NUS' Nan Liu notes, "Global collaboration ensures AI benefits all populations."

Future Outlook: Harmonized Frameworks by 2030?

With IMDRF and WHO efforts, harmonized GenAI regulations could emerge. Singapore is positioned as a leader via HSA's Maturity Level 4 status and Duke-NUS innovations, potentially exporting its frameworks to ASEAN.

In medical imaging, expect adaptive licensing, real-world evidence mandates, and GenAI-specific predicates.


Actionable Insights for Singapore Academics and Researchers

  • Leverage HSA sandboxes for prototypes.
  • Adopt CARE-AI in grants/publications.
  • Collaborate via Duke-NUS initiatives.
  • Focus on equity in datasets for local relevance.

This positions Singapore universities as global AI-health pioneers.

Prof. Evelyn Thorpe, Contributing Writer

Frequently Asked Questions

🖼️ What is generative AI in medical imaging?

Generative AI creates synthetic images for tasks like reconstruction and simulation, enhancing diagnostics but requiring strict oversight.

⚠️ What are the key risks of GenAI medical devices, per the Duke-NUS paper?

Biases, hallucinations, vulnerabilities such as data poisoning, and equity gaps across diverse populations.

🇸🇬 How does Singapore's HSA regulate AI medical devices?

Through lifecycle guidelines, AI risk classification tools, and regulatory sandboxes; the HSA achieved WHO Maturity Level 4 status in 2026.

🇺🇸 What is the FDA's approach to GenAI in imaging?

Over 1,000 clearances via the 510(k) pathway; its Action Plan addresses lifecycle management but has no GenAI specifics yet.

🇪🇺 How does the EU AI Act affect medical AI?

High-risk classification mandates transparency and human oversight, and aligns with the MDR.

📋 What does Duke-NUS recommend?

Global collaboration, regulatory sandboxes, CARE-AI checklists, and algorithmovigilance.

🎓 What role do Singapore universities play in AI regulation?

Duke-NUS CoRE leads research and regulator training; NUS and NTU work on federated learning.

🔬 What are examples of GenAI in medical imaging?

MRI gap-filling, synthetic CT generation, and diabetic retinopathy screening.

🔮 What is the future of global AI medical device regulation?

IMDRF harmonization and adaptive licensing, potentially by 2030.

🤝 How can researchers engage with Duke-NUS AI initiatives?

Collaborate via CoRE, apply to regulatory sandboxes, and focus on equity in datasets.

📊 What is algorithmovigilance?

Post-market monitoring of AI performance, analogous to drug safety surveillance.