Jesse, this is well known. In "technical" terms it is determined by the size of the "context window": the number of tokens an LLM has available at any given time. With ChatGPT-3.5 that is about 4,000 tokens; with ChatGPT-4 it is about 8,000 tokens (8,192, with a 32K variant), which comes to roughly 6,000 words, or about twelve pages of single-spaced text.
Everything beyond that is washed out of the conversation.
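If you want to check for yourself how much of a conversation still fits, here is a minimal sketch using OpenAI's tiktoken tokenizer (assumes the `tiktoken` package is installed; the limits are the published context sizes, and the sample conversation string is just a placeholder):

```python
# Rough sketch: count tokens with tiktoken to see how much of a
# conversation still fits in the context window.
import tiktoken

# Published context sizes for the base models (assumption: base variants only)
CONTEXT_LIMITS = {"gpt-3.5-turbo": 4096, "gpt-4": 8192}

def tokens_used(text: str, model: str = "gpt-3.5-turbo") -> int:
    """Return the number of tokens the given model's tokenizer sees in text."""
    enc = tiktoken.encoding_for_model(model)
    return len(enc.encode(text))

# Placeholder conversation text, repeated to simulate a long chat
conversation = "Jesse: How long does ChatGPT remember things?\n" * 50

for model, limit in CONTEXT_LIMITS.items():
    n = tokens_used(conversation, model)
    status = "fits" if n <= limit else "older turns get dropped"
    print(f"{model}: {n} tokens used of {limit} ({status})")
```

Once the count exceeds the limit, the oldest turns simply no longer get sent to the model, which is exactly the "washing out" above.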
🙂
@boris_steipe ok, I guess I was the last to know.