PulseAugur

Ontario audit: AI medical scribes produce errors, hallucinations

An audit in Ontario, Canada, has revealed significant inaccuracies in AI medical scribes recommended by the provincial government. These tools, intended to help doctors summarize patient interactions, were found to generate incorrect, incomplete, and hallucinated information. All 20 vendors tested showed issues, including fabricated referrals, misidentified medications, and missed mental health details, raising concerns about potential harm to patient care.

Summary written by gemini-2.5-flash-lite from 11 sources. How we write summaries →

IMPACT Highlights critical safety and accuracy issues with AI tools in healthcare, potentially slowing adoption.

RANK_REASON Audit report on AI product performance and safety.

Read on Ars Technica — AI →

COVERAGE [11]

  1. Ars Technica — AI TIER_1 · Kyle Orland ·

    Your doctor’s AI notetaker may be making things up, Ontario audit finds

    Made-up therapy referrals, incorrect prescriptions among the common mistakes.

  2. The Register — AI TIER_1 ·

    Sick and wrong: Ontario auditors find doctors' AI note takers routinely blow basic facts

    60% of evaluated AI Scribe systems mixed up prescribed drugs in patient notes, auditors say

  3. Mastodon — fosstodon.org TIER_1 · [email protected] ·

    Ontario auditors find doctors' AI note takers routinely blow basic facts https://www.theregister.com/ai-ml/2026/05/14/ontario-auditors-find-doctors-ai-note-takers-routinely-blow-basic-facts/5240771 #ai

  4. Mastodon — fosstodon.org TIER_1 · [email protected] ·

    📰 Your doctor’s AI notetaker may be making things up, Ontario audit finds Made-up therapy referrals, incorrect prescriptions among the common mistakes. 📰 Source: Ars Technica 🔗 Link: https://arstechnica.com/health/2026/05/your-doctors-ai-notetaker-may-be-making-things-up-ontario-…

  5. Mastodon — fosstodon.org TIER_1 · [email protected] ·

    🎮 Amazon Reportedly Forced Devs To Make A GenAI Game And Then Laid Them Off When It Wasn’t Working Project Trident apparently pivoted several times to meet unrealistic deadlines and a corporate-directed mandate to use generative AI 📰 Source: Kotaku 🔗 Link: https://kotaku.com/amaz…

  6. Mastodon — fosstodon.org TIER_1 · [email protected] ·

    Gee, what a surprise! - Audit finds Ontario doctors' #AI scribes inventing key medical details https://arstechnica.com/health/2026/05/your-doctors-ai-notetaker-may-be-making-things-up-ontario-audit-finds/

  7. Mastodon — fosstodon.org TIER_1 · [email protected] ·

    Ontario’s :flagon: auditor general found that AI transcriber for use by doctors 'hallucinated,' generated errors https://www.cbc.ca/news/canada/toronto/ai-scribe-system-hallucinations-9.7197049 - - - [French, translated:] Ontario’s Auditor General :flagon: found that the transcriber…

  8. Mastodon — mastodon.social TIER_1 · [email protected] ·

    Ontario auditors find doctors' AI note takers routinely blow basic facts https://www.theregister.com/ai-ml/2026/05/14/ontario-auditors-find-doctors-ai-note-takers-routinely-blow-basic-facts/5240771 #HackerNews #Tech #AI

  9. Mastodon — mastodon.social TIER_1 · [email protected] ·

    Your doctor's AI notetaker may be making things up, Ontario audit finds https://arstechnica.com/health/2026/05/your-doctors-ai-notetaker-may-be-making-things-up-ontario-audit-finds/ #AI #HealthTech #Privacy

  10. Mastodon — mastodon.social TIER_1 · [email protected] ·

    Medical AI transcriber for Ontario doctors 'hallucinated,' generated errors: auditor general Artificial intelligence note-taking tools intended for use by Ontario doctors provided incorrect and incomplete information or demonstrated "hallucinations," and were not evaluated adequa…

  11. Mastodon — mastodon.social TIER_1 · [email protected] ·

    AI transcriber for use by Ontario doctors 'hallucinated,' generated errors, auditor finds Artificial intelligence note-taking tools intended for use by Ontario doctors provided incorrect and incomplete information or demonstrated "hallucinations," and were not evaluated adequatel…