
Probably so, assuming that what it spits out is actually real and not some hallucination, but that's not at all a given. And I also assume that the people most inclined to regurgitate what an LLM spits out overlap heavily with the people who are least likely to verify that the information is correct, check primary sources, or even think to ask for sources in the first place.

