Many medical centers use AI-powered transcription tools built on Whisper, OpenAI's speech-to-text model, to transcribe patients' interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as hallucination, raising the possibility of errors like misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.