
The SOTA chatbots are gaining more and more functionality beyond plain LLM inference: they can search the web, process files, and integrate with other apps. I think that's why most people will consider local LLMs insufficient very soon.


But that's just software, and it runs fine locally too. A local LLM plus a few tools can do it.
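The plumbing really is small. Something like this loop is the whole idea (a sketch only: local_llm is a placeholder for whatever local inference server you call, and the tool set here is made up for illustration):

    import json

    def local_llm(messages):
        # Placeholder: call whatever local inference you run (llama.cpp,
        # Ollama, vLLM, ...) and return its text output for these messages.
        raise NotImplementedError

    TOOLS = {
        # Made-up tools for illustration; swap in real implementations.
        "web_search": lambda q: f"stub results for {q!r}",
        "read_file": lambda path: open(path).read(),
    }

    def chat(user_msg):
        messages = [{"role": "user", "content": user_msg}]
        while True:
            out = local_llm(messages)
            messages.append({"role": "assistant", "content": out})
            try:
                # Convention: the model emits JSON like
                # {"tool": "web_search", "args": {"q": "..."}} to request a tool.
                call = json.loads(out)
            except ValueError:
                return out  # plain-text answer, we're done
            result = TOOLS[call["tool"]](**call["args"])
            messages.append({"role": "tool", "content": str(result)})

Everything the hosted chatbots bolt on (search, files, app integrations) is just more entries in that tool table.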


Well, I don't see people running their own web search locally, so I don't think they'll run their own search + LLM either.


No, but you can call out to the Google or DDG APIs.
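For example, DuckDuckGo's Instant Answer endpoint is callable with just the standard library (a sketch; to the best of my knowledge the endpoint and field names are right, but check the docs, and note it returns instant-answer summaries rather than full web results):

    import json
    import urllib.parse
    import urllib.request

    def ddg_instant_answer(query):
        # Query DuckDuckGo's Instant Answer API and return its JSON summary.
        url = "https://api.duckduckgo.com/?" + urllib.parse.urlencode(
            {"q": query, "format": "json", "no_html": 1}
        )
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        # AbstractText holds the summary when DDG has one for the query.
        return data.get("AbstractText") or data.get("Heading", "")

    print(ddg_instant_answer("large language model"))

Point the local model's web_search tool at something like that and you've got search without running a crawler yourself.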


Nah, I disagree; tool calling isn't that difficult. I've got my own Cats Effect-based model orchestration project I'm working on, and while it's not 100% there yet, it can do web browsing, web search, memory search (this one is cool), and more on my own hardware.



