this post was submitted on 30 Dec 2024
653 points (98.5% liked)
Programmer Humor
you are viewing a single comment's thread
view the rest of the comments
How does conversation context work though? Is that memory not a form of learning?
The context window is a fixed size. If the conversation gets too long, the oldest messages get pushed out and the AI no longer sees anything from the start of the conversation. It's more like a human with a notepad in front of them: the AI can reference it, but it doesn't learn from it.
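A minimal sketch of that truncation, with made-up tiny sizes and names (real models use windows of thousands to millions of tokens, and real tokenizers split text differently):

```python
# Sketch of a fixed-size context window (hypothetical sizes/names).
# Once the conversation exceeds the window, the oldest tokens are
# simply dropped: the model never "learned" them, it just no longer
# sees them in its input.

CONTEXT_WINDOW = 6  # tokens; tiny on purpose, for illustration

def build_context(conversation_tokens):
    """Return the tokens the model actually receives as input:
    only the most recent CONTEXT_WINDOW tokens survive."""
    return conversation_tokens[-CONTEXT_WINDOW:]

tokens = ["my", "name", "is", "Ada", ".",
          "what", "is", "my", "name", "?"]
print(build_context(tokens))
# ['.', 'what', 'is', 'my', 'name', '?']
# "Ada" has been pushed out, so the model can no longer answer.
```

Nothing is written back into the model's weights here; the "memory" is just whatever text still fits in the input.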
Also, a key part of how GPT-based LLMs work today is that they get the entire context window as input all at once, whereas a human has to listen or read one word at a time and remember the start of the conversation on their own.
I have a theory that this is one of the reasons LLMs don't understand the progression of time.