
“mimic doctor’s notes and reduce the amount of paperwork a physician would have to manually compile.”

So there is AI making up doctors’ notes? That’s extremely contentious.

In Ontario, doctors are paid per patient per year and also per visit or procedure. This doctor is just outsourcing her work, which taxpayers are footing the bill for, to AI. This is wrong in so many ways. What a lazy practitioner!




Did we read different articles? Did you just fixate on that line and not read the rest? She was the one doing the examination and verbalising her concerns. The AI took the information the doctor provided and reformatted it into bureaucratic prose, just like medical scribes have been doing for the last few centuries. I don't see the issue here in the slightest. None of the "doctoring" has been outsourced, just the scribe work.


In my line of work (healthcare in the UK), an AI system that changes what a human writes in any way would be considered a medical device, and would require an absolute ton of paperwork to certify. In the case of LLMs, I don't know if certification would even be possible, because you have to show that your system doesn't change the intent of the human, which is impossible to do with LLMs.


Sure, if the LLM note were filed straight into the chart without review, it’d be pretty unsafe.

But these systems are meant to generate a draft note that the doctor still has to review, edit, and sign. At the end of the day, it’s still up to the doctor to ensure the note is correct.


It's mostly transcribing dictation and recordings. The doctors are still liable if it's wrong, so there's plenty of incentive to simply read the output, which is still a major productivity improvement over writing and then reading.
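
For anyone curious, the workflow is roughly "speech-to-text, then reformat". A minimal sketch, assuming OpenAI's Whisper and chat APIs purely for illustration (the article doesn't name the actual product, and the file name and prompt below are made up):

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # Step 1: transcribe the clinician's dictated audio (speech-to-text only).
    # "visit_dictation.mp3" is a placeholder file name.
    with open("visit_dictation.mp3", "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )

    # Step 2: reformat the transcript into a structured draft note.
    # The prompt constrains the model to restructuring what was dictated,
    # not inventing findings; the clinician still reviews and signs it.
    draft = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Rewrite the clinician's dictation as a SOAP-format "
                        "draft note. Do not add findings that were not stated."},
            {"role": "user", "content": transcript.text},
        ],
    )

    print(draft.choices[0].message.content)  # draft for the clinician to review

The point of the sketch is that the model only sees what the doctor said; the review and the sign-off stay with the doctor.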



