“A mother recently wrote to me with concerns about her 14-month-old daughter. The child's hospital had begun using AI to record visits -- and her husband had unknowingly consented. The woman feared that her child's sensitive information might live online forever or leak.
That fear captures what many families feel as artificial intelligence moves quietly into exam rooms, in the form of AI scribes. These digital tools transcribe and summarize your doctor visit into notes -- sometimes recording and storing the audio in the process. After the doctor reviews the notes, they go into your medical record so you and the doctor can refer to them in the future.
Some doctors recorded patient visits and used computer dictation and transcription before AI, but adding the new technology to the mix has made the practice much more popular. Around 30% of U.S. physician practices have adopted AI scribes, and doctors praise them for reducing after-hours paperwork (often referred to as "pajama time") and allowing more attention for patients.
Yet these systems also raise hard questions about privacy, accuracy and consent. Perhaps the biggest issue: With AI scribes, your words are usually sent to a third party, often cloud-based, that converts audio to text. It is often not clear where the recordings are stored, for how long and who has access to them.
Here are questions every patient or parent should ask their providers before agreeing to let an AI scribe into the consultation room.
1. Can the doctor record without my consent?
It varies. States have different rules about getting patient consent before recording a doctor visit. In some cases, the providers must ask you first; in others, they don't. That said, most health systems still obtain explicit patient consent as a matter of policy -- but once again, policies vary, and some practices may not let you opt out of an AI scribe for a nonurgent visit.
So, be sure to check your state's rules and ask about the provider's policy right off the bat. Let them know your preference explicitly. You can also ask providers to have a human handle the transcription, not an AI, but in most cases they have no legal obligation to comply.
2. What is being recorded?
Assume everything is. And that means anything you say while the recorder is on may be reflected in your medical record, where your insurer may be able to see it. So you should have the doctor pause the recording during any discussion of sensitive topics such as sexual or mental health, or substance use.
As always, your rights vary from state to state and clinic to clinic -- but, generally, if doctors are required to get your consent to record in the first place, they must also pause a recording when you ask them to.
3. Who is processing my data, and where is it stored?
The visit recordings usually end up in the hands of third-party vendors -- ranging from big names like Microsoft to medical-technology startups -- usually in the cloud. As you would expect, their privacy practices differ.
Ask the doctor's office which vendor processes your visit and ask for specifics. Some critical points to check on: Who can access the data? How long is it kept? What security controls apply, and who contacts you if there is a breach? Also be sure to find out if your healthcare provider has a HIPAA Business Associate Agreement with the vendor. This imposes requirements for things like permitted uses and safeguards of your data, and makes vendors directly liable for security under HIPAA.
4. Who checks the notes for accuracy?
The transcription process poses risks of its own. Modern AI tools often deliver lower overall error rates (around 1% to 3%) than older speech-recognition systems (7% to 11%). Even so, small percentages matter in medicine, and AI can introduce new kinds of errors, like invented details known as hallucinations. It may omit relevant information while summarizing, mix up who is speaking and misinterpret context -- such as missing nonverbal cues.
Bias is a risk, too. Recent studies found that AI scribes showed substantially poorer accuracy in transcribing strong regional accents, as well as Black patients' speech. Problems also arise when patients have limited English proficiency.
5. Will my information be shared, or used to train AI?
Reputable clinics should be able to say no -- both about their own policies and their vendor's. Under HIPAA, a healthcare provider may use a patient's information for treatment, payment and routine operations. But if the data is used by the clinic or third-party vendor to develop a product, service or other commercial application, separate patient consent is generally required.
The bottom line: If you don't want your information used for these purposes, tell your provider to record the preference in your chart. That way, any other provider in the network will know what you want.
6. Can I say no to a scribe and get the same care?
Once again, it varies. Even in states that require your consent to record, providers may technically refuse to see you if you say no, provided the visit isn't an emergency. Most clinics, though, will respect your wishes.” [1]
1. Akhlaghpour, Saeed. "If Your Doctor Uses a Bot, Ask Questions." Wall Street Journal, Eastern edition, New York, N.Y., 23 Dec. 2025: A10.