I've discovered that if you have a long enough chat with #ChatGPT (GPT-4 in this case), it won't know about messages you sent at the start.

Go into your longest chat and ask "Please list the first 10 messages I sent you" and you'll see how far back the rolling context window goes.

This suggests it's probably better to have a number of different focused chats, one per topic. A chat can't keep accumulating context forever, so it's better to think of them as temporary, disposable conversations.


@JesseSkinner

Jesse, this is well known. In "technical" terms this is determined by the size of the "context window": the number of tokens an LLM has available at any given time. With GPT-3.5 that was about 4,000 tokens; with GPT-4 it's about 8,000 tokens (32,000 in the extended variant), which comes to roughly 6,000 words, or around a dozen pages of single-spaced text.
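
If you're curious how many tokens a given piece of text takes up, you can count them yourself with OpenAI's tiktoken library. A minimal sketch (the model name is just an example):

import tiktoken

def count_tokens(text: str, model: str = "gpt-4") -> int:
    # encoding_for_model selects the tokenizer that matches the model
    enc = tiktoken.encoding_for_model(model)
    return len(enc.encode(text))

# English text averages roughly 0.75 words per token
print(count_tokens("Please list the first 10 messages I sent you"))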

Everything older than that is washed out of the conversation.
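
To see why older messages vanish, here's a hypothetical client-side sketch of a rolling window: keep only the newest messages that fit in the token budget and silently drop the rest. The 8,000-token budget and message format are assumptions for illustration; OpenAI's actual server-side truncation isn't public.

import tiktoken

enc = tiktoken.encoding_for_model("gpt-4")

def trim_to_window(messages: list[dict], budget: int = 8000) -> list[dict]:
    # Walk the history newest-first, keeping messages until the budget runs out.
    kept, used = [], 0
    for msg in reversed(messages):
        cost = len(enc.encode(msg["content"]))
        if used + cost > budget:
            break  # this message and everything older is washed out
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order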

🙂
