Ontario auditors find doctors' AI note-takers routinely botch basic facts


TLDR

  • Ontario’s audit of 20 approved AI Scribe vendors found that 60% mixed up prescribed drugs and 45% fabricated clinical information that was never discussed in the patient recordings.

Key Takeaways

  • 12 of 20 AI Scribe systems inserted incorrect drug information; 9 of 20 fabricated treatment suggestions not present in simulated recordings.
  • 17 of 20 systems missed key mental health details, and 6 missed mental health issues entirely or in part.
  • Procurement scoring weighted Ontario domestic presence at 30% of the total score, while medical note accuracy counted for only 4% (a worked example follows this list).
  • No mandatory attestation feature exists in any approved AI Scribe system despite OntarioMD recommending manual review.
  • Over 5,000 Ontario physicians are currently using these systems; the Ministry reports no known patient harms so far.
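
The weighting criticism becomes concrete with a quick worked example. In the sketch below, only the 30% and 4% weights come from the audit coverage; the remaining criteria, the criterion names, and the vendor numbers are illustrative assumptions. It shows how a weighted-sum score lets a low-accuracy vendor with a strong domestic presence outscore a far more accurate competitor.

```python
# Illustrative sketch of a weighted procurement score. Only the 30% and 4%
# weights come from the reporting; the lumped remainder and the vendor
# figures are assumptions chosen to show the structural effect.

WEIGHTS = {
    "domestic_presence": 0.30,  # reported weight
    "note_accuracy": 0.04,      # reported weight
    "other_criteria": 0.66,     # remainder, lumped together (assumption)
}

def score(vendor: dict) -> float:
    """Weighted sum of criterion scores, each on a 0-100 scale."""
    return sum(WEIGHTS[k] * vendor[k] for k in WEIGHTS)

# Hypothetical vendors: A is far more accurate, B is local but sloppy.
vendor_a = {"domestic_presence": 20, "note_accuracy": 95, "other_criteria": 70}
vendor_b = {"domestic_presence": 95, "note_accuracy": 40, "other_criteria": 70}

print(score(vendor_a))  # 20*0.30 + 95*0.04 + 70*0.66 = 56.0
print(score(vendor_b))  # 95*0.30 + 40*0.04 + 70*0.66 = 76.3
```

Under these assumed numbers, vendor B's 55-point accuracy deficit costs it only 2.2 points, while its domestic presence earns back 22.5: accuracy can barely move the ranking.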

Hacker News Comment Review

  • Commenters with firsthand experience confirmed hallucinations in live clinical and workplace settings, with fabricated diagnoses and invented vendor commitments appearing in real notes.
  • The procurement scoring breakdown drew the most technical criticism: weighting domestic presence 7.5x higher than accuracy was seen as the structural root cause, not model quality alone.
  • Several commenters asked about baseline human error rates in medical records, noting that the audit provided no comparative data, which weakens the implied severity framing.

Notable Comments

  • @Groxx: Personally received an AI summary diagnosing him with osteoporosis and hip pain after a visit for runner’s knee; none of it was said in the appointment. “CHECK YOUR TRANSCRIPTS.”
  • @Hobadee: Notes timestamped to recording segments enable spot-checking; argues provenance linking is the minimum viable safety feature for clinical AI scribes (a minimal sketch follows).
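
A minimal sketch of what @Hobadee describes, assuming nothing about any vendor's actual schema: each note sentence carries the timestamp range of the audio segment it was derived from, and a sentence with no supporting segment is flagged for review.

```python
# Minimal sketch of note-to-audio provenance, per @Hobadee's suggestion.
# The data model and field names are assumptions, not any vendor's schema.
from dataclasses import dataclass, field

@dataclass
class NoteSentence:
    text: str
    # Start/end offsets (seconds) into the visit recording that support
    # this sentence; an empty list means the scribe cannot point to a source.
    source_spans: list[tuple[float, float]] = field(default_factory=list)

def unverifiable(note: list[NoteSentence]) -> list[NoteSentence]:
    """Sentences with no linked audio span: candidates for hallucination."""
    return [s for s in note if not s.source_spans]

note = [
    NoteSentence("Patient reports anterior knee pain when running.",
                 source_spans=[(12.4, 19.0)]),
    NoteSentence("History of osteoporosis."),  # no span: flag it
]

for s in unverifiable(note):
    print("REVIEW:", s.text)
```

The point of the flagged list is that it gives the clinician a short spot-check queue instead of forcing a full re-read of every note, which is what would make manual attestation practical.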

Original | Discuss on HN