
Deception requires intent to deceive. LLMs don't have intent to do anything except respond to prompts.

Incorrectness doesn't require intent to deceive. It's just being wrong.



That’s what I think as well, but I’m curious about the alternative perspective.



