
> we have private API access that doesn't use anything you say as training inputs- maybe your company would prefer that?"

This isn't enough for many companies, since the data still goes out the door. They would have to offer on-site hosting to appease security-minded orgs. Or maybe that's what you mean.



OpenAI already offers private ChatGPT instances hosted on Azure.

I know of a bank that is paranoid enough to use a self-hosted, on-premises GitHub instance, and they went with the private (off-premises) ChatGPT instance.

They don't use it for code/confidential data though.


> OpenAI already offers private ChatGPT instances hosted on Azure.

> They don't use it for code/confidential data though.

Yes, private isn't enough; they need to offer self-hosted deployments for these types of clients. I imagine most orgs that need self-hosting already have a datacenter to run it in.
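For what it's worth, a self-hosted setup doesn't have to mean custom client code: several open-source model servers (vLLM, llama.cpp's server, etc.) expose an OpenAI-compatible HTTP API, so clients only need the base URL pointed at an internal host. A minimal sketch of that idea, where the hostname and model name are illustrative assumptions, not anything from this thread:

```python
import json
import urllib.request

# Hypothetical on-premises, OpenAI-compatible endpoint. The point of
# self-hosting: this hostname resolves only inside the org's network,
# so prompts and code never leave the datacenter.
BASE_URL = "https://llm.internal.example/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Construct (but do not send) an OpenAI-style chat-completion request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize our deployment policy.")
# Verify the request targets the internal host only:
assert req.full_url == "https://llm.internal.example/v1/chat/completions"
```

Existing code written against the public API can then switch between the vendor and the on-prem instance by changing one configuration value.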



