
So "private" refers to two things here, sorry for any confusion.

When we say "chat over private data" we mean data that isn't publicly available and that no LLM has seen in training. With our system you can ask questions about team-specific knowledge, for example "What features did customer X ask about in our last call?" Obviously if you ask ChatGPT this, it will have no idea.
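To make that concrete, the usual pattern behind this kind of system is retrieval-augmented generation: find the team documents relevant to the question, then hand them to the LLM as context. Here's a minimal sketch of that idea; the function names, the sample docs, and the keyword-overlap scoring (a stand-in for real embedding search) are all illustrative, not Danswer's actual pipeline.

```python
# Minimal RAG-style sketch: retrieve team-specific docs, then build an
# LLM prompt that includes them as context. Scoring is naive keyword
# overlap purely for illustration.

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Rank docs by keyword overlap with the query (toy retrieval)."""
    q_terms = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q_terms & set(d.lower().split())))
    return scored[:top_k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Stuff the retrieved docs into the prompt sent to the LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical private team data no public model has seen:
docs = [
    "Call notes: customer X asked about SSO and audit logging.",
    "Roadmap: Q3 focus is mobile support.",
]
prompt = build_prompt("What features did customer X ask about in our last call?", docs)
```

The resulting prompt contains the call notes, so whichever LLM you send it to can answer from your data rather than its training set.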

The other part is data privacy when using the system. The software can be plugged into most LLM providers or locally running LLMs, so if your team doesn't trust OpenAI but has a relationship with, say, Azure or GCP, you can plug into one of those instead. Alternatively, a lot of users have recently been setting up Danswer with locally running LLMs via tools like Ollama. In that case you have a truly airgapped system where no data ever goes out.


