
What to know before saying yes to your doctor using AI

by Priya Shah – Business Editor

AI Medical Scribes Gain Traction Amidst Patient Privacy Scrutiny

Doctors increasingly use AI for note-taking, but regulatory gaps and data diversity remain concerns

Artificial intelligence is rapidly entering Australian healthcare, with AI digital scribes now assisting nearly a quarter of general practitioners nationwide. While the technology promises more direct patient interaction for doctors, experts urge caution and full patient disclosure.

Understanding the Digital Scribe

AI digital scribes are software applications that convert spoken conversations between clinicians and patients into written text. The AI then uses these transcripts to generate clinical summaries or letters, which are added to a patient’s medical record or shared with other healthcare providers.

“We’ve had digital scribes for a while now,” explains Yves Saint James Aquino, a research fellow at the Australian Centre for Health Engagement, Evidence and Values. “But the new stuff that we have are the scribes that are using large language models (LLMs) to summarise the text. So, it’s not just transcribing exactly… they are now summarising a conversation or a clinical interaction based on an audio recording.” Aquino emphasises that clinicians bear the responsibility of verifying the accuracy of these AI-generated summaries.
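For readers curious about the mechanics, the workflow Aquino describes boils down to two stages: speech-to-text transcription of the consultation, then LLM summarisation of that transcript into a draft note a clinician reviews. The sketch below is purely illustrative; the function names and placeholder logic are hypothetical stand-ins, not the code of any actual scribe product.

    # Illustrative sketch only: a two-stage "digital scribe" pipeline like the
    # one described above. All names here (transcribe_audio,
    # summarise_transcript) are hypothetical placeholders.

    def transcribe_audio(audio_file: str) -> str:
        """Stage 1: speech-to-text. A real scribe would send the recording to a
        speech-recognition service; this placeholder just reads a saved transcript."""
        with open(audio_file.replace(".wav", ".txt"), encoding="utf-8") as f:
            return f.read()

    def summarise_transcript(transcript: str) -> str:
        """Stage 2: an LLM condenses the transcript into a draft clinical note.
        Here the first few lines of the transcript stand in for a model call."""
        return "\n".join(transcript.splitlines()[:5])

    def draft_clinical_note(audio_file: str) -> str:
        transcript = transcribe_audio(audio_file)
        draft = summarise_transcript(transcript)
        # The draft is only a suggestion: the clinician must check and correct
        # it before anything is saved to the patient's medical record.
        return draft

The structure makes the article’s key point concrete: whatever the model produces is a draft, and responsibility for its accuracy stays with the clinician.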

Patient Rights and Transparency

Patient safety advocate Jennifer Morris stresses the importance of transparency and patient autonomy in the adoption of this technology. “Patients have the right to access their medical records and decide for themselves whether they think it’s accurate enough,” she states. Morris has personally opted against the use of AI scribes in her own consultations, finding the output did not meet her standards for accuracy.

Patient safety advocate Jennifer Morris advises that patients can review AI-generated notes for accuracy.

Experts recommend that clinicians clearly explain what the digital scribe software is, how it functions, and where patient data is stored. Aquino cautions that if data is processed or stored outside Australia, privacy protections equivalent to those under Australian law must be in place.

Regulatory Landscape and Potential Risks

A critical point of discussion is the regulatory status of these AI tools. Aquino notes that many AI scribes currently do not meet the Therapeutic Goods Administration’s (TGA) legal definition of a medical device. This means they have not undergone the rigorous testing expected of medical products, leaving their potential implications and long-term harms largely unexamined.

“It means we don’t really know the implications or we don’t really know any evidence of safety or long-term harm,” Aquino commented.

The TGA has acknowledged that digital scribes sometimes propose diagnoses or treatments beyond what a clinician has identified, suggesting they may indeed fall under medical device regulations requiring pre-market approval. The agency is currently reviewing this area to address the rapid integration of AI in clinical settings. Simultaneously, the Australian Health Practitioner Regulation Agency (AHPRA) has developed a code of practice for clinicians using AI.

Yves Saint James Aquino highlights the need for more research into AI scribe effectiveness in healthcare.

Morris also raises concerns about the training data used for AI models, particularly its diversity. “So, if you have AI that’s trained on a population, for example, of mostly white, wealthy, middle-aged people, it may not be as effective,” she explained. A lack of representation could affect the AI’s accuracy and efficacy for Australia’s diverse population.

While clinicians report benefits such as time savings and improved patient interaction, the evidence for effectiveness remains largely anecdotal, underscoring the need for further research. The TGA’s recent report and AHPRA’s code of practice signal a growing effort to navigate the complexities of AI in healthcare, aiming to ensure both innovation and patient safety.
