Hacker News

Like most software, LLMs can be run locally or on private infrastructure. This was on the front page yesterday; it's not the only way to run an LLM locally, but it's about the easiest way possible: https://news.ycombinator.com/item?id=38464057


Thanks! Well, yeah, I just thought the quality of offline models might not yet be good enough. But I'm glad to be told otherwise :)



