• 0 Posts
  • 37 Comments
Joined 11 months ago
Cake day: August 11th, 2023

  • We’re actually trained not to just use what the roomer wrote. The reason is that what the receptionist writes and roomer writes can be inaccurate, and inaccuracies can multiply each time they’re transcribed.

    For example, the call center might write “pain in testicle,” and then the roomer might write “lump in left testicle for 2 weeks” and then the patient tells me the lump has been in the right testicle for 3-4 weeks. If we just all copied the original note, we might be working with the wrong symptoms or wrong location. And asking questions assuming the notes are 100% accurate can lead a patient into giving us inaccurate answers, which is a much lower risk if we ask open-ended questions and let you fill them in. We do read the roomer’s notes, but our documentation is much better if we are getting the information directly from you rather than playing telephone.

    As for cutting people off, I can’t speak for your individual doctors, other than to say there is a certain personality type who will answer every question (even yes/no questions) with a 1-2 minute meandering answer. And if we have 20 questions to get through, we simply can’t make every remaining patient that day wait an extra 20-40 minutes just to avoid cutting people off. If your doctor is doing that even when you’re giving a one-sentence answer, though, you may need to look for a new one.


  • Yep, exactly.

    As a doctor who’s into tech, before we implemented something like AI-assisted diagnostics, we’d have to consider what the laziest/least educated/most tired/most rushed doctor would do. The tools would have to be very carefully implemented such that the doctor is using the tool to make good decisions, not harmful ones.

    The last thing you want is a doctor blindly approving an inappropriate order suggested by an AI without applying critical thinking, harming a real person because the machine generated a factually incorrect output.