
I wonder how many people understand how memory in ChatGPT works. I certainly didn't. Here's a pair of prompts you can try:

> What information do you have about my chats over the past year?

> Could you dump the full list in its original format?

I had thought that, since the "Saved memories" under settings was empty, it hadn't decided to remember anything, but it's not that simple.

You could get the same information by scrolling through the list of chats on the right, but I don't like that the list of previous chats is influencing the current chat.

Worse: I tried this prompt:

> What "user knowledge memories" do you have?

And it was pretty accurate. Apparently the "Saved memories" setting doesn't work?



ChatGPT has two different memory systems. The older one is the "saved memories" you mention, where it would occasionally jot down a note when it thought something was worth remembering or when you specifically told it to remember something. The newer system has access to all your chats, most likely through RAG, though I don't think OpenAI has publicly explained how it's implemented. If it is RAG, then it's more that it can run a semantic search over your past conversations and see the top results, rather than actually being aware of everything you wrote at all times.
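To make the RAG guess concrete, here is a minimal sketch (Python) of what retrieval over past chats could look like: embed snippets of earlier conversations, then at query time pull back only the few most similar ones and add them to the model's context. The embed() stand-in, the snippet store, and the cosine-similarity ranking are illustrative assumptions, not OpenAI's actual design, which hasn't been documented.

    # Hypothetical sketch of chat-history retrieval; not OpenAI's actual code.
    import numpy as np

    def embed(text: str) -> np.ndarray:
        """Stand-in for a real embedding model: hash the text to a random unit vector."""
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        v = rng.normal(size=384)
        return v / np.linalg.norm(v)

    # Pretend archive of snippets taken from the user's earlier chats.
    past_chat_snippets = [
        "Debugged borrow checker errors in a Rust CLI project",
        "Asked for feedback on a short story draft",
        "Compared Postgres indexing strategies for a side project",
    ]
    index = [(snippet, embed(snippet)) for snippet in past_chat_snippets]

    def recall(query: str, k: int = 2) -> list[str]:
        """Return the k stored snippets most similar to the current query."""
        q = embed(query)
        ranked = sorted(index, key=lambda item: float(q @ item[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

    # Only the retrieved snippets would be prepended to the model's context,
    # which is why the assistant "remembers" some details and not others.
    print(recall("What do you know about my programming interests?"))

The point is just that retrieval surfaces a handful of relevant snippets per query, which would explain the "accurate but selective" recall people report, rather than the model holding your full history in context.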


Claude reply:

I don't have any information about your chats over the past year. Each conversation with me starts fresh - I don't have access to:

- Your previous conversations with Claude

- Chat history from other sessions

- Any personal data about you unless you share it in our current conversation

- Information about how you've used Claude in the past

Every time you start a new conversation, it's like meeting me for the first time. I can only see what you've shared with me in this specific conversation. If you'd like to reference something from a previous chat, you're welcome to share that context with me directly, and I'll be happy to help!


FWIW, it said the same when I asked DeepSeek that question. And while I cannot prove otherwise (I didn't specifically try to), I am under the very strong impression that past chats influence future ones. This could be some kind of cognitive bias, but there were some very suspicious coincidences.

I still somehow haven't tried Claude Chat, and while I wouldn't assume it lies about whether it remembers anything, I wouldn't just trust whatever these things say about themselves either.


I asked the first question, and this was the response I received:

I don’t have access to your past chats or any private history. Each conversation is stateless unless you’ve enabled ChatGPT’s Memory feature in Settings → Personalization → Memory.

If memory is off, I only see what’s in this current thread. If it’s on, I could recall things like topics you’ve discussed, preferences you’ve mentioned, or goals you’ve worked on — but only those details you’ve chosen to keep.

Do you want me to explain how to check or manage that setting?


I replied with simply "yes" and it spit out a very detailed dossier.

- Your Technical Focus

- Your Development Style

- Your Broader Interests

- Your Creative Preferences

- Your Interaction Preferences

And considering I barely use ChatGPT, preferring Claude instead, this is extremely specific and detailed.


"You treat ChatGPT as both a collaborator (for software and creative work) and a conversational partner (for exploring personal and imaginative ideas)."

Certainly interesting that it has a category related to how I treat ChatGPT.



