
An in-law had a minor tremor in their left hand. While this wasn't pointed out to said in-law, it was noticed, and there were other small tells about what the problem might be, which ultimately led to a diagnosis and appropriate treatment.

The frequency with which any LLM hallucinates makes it inappropriate as a solution in this scenario. An LLM is not a physician, it's an LLM.

You can make soup in an electric kettle, but that doesn't make it the right tool for the job, and it comes with a lot of compromises.
