this post was submitted on 28 Jun 2023
28 points (100.0% liked)
Showerthoughts
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted, clever little truths, hidden in daily life.
Here are some examples to inspire your own showerthoughts: 1
Rules
- All posts must be showerthoughts
- The entire showerthought must be in the title
- No politics
- If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
- A good place for politics is c/politicaldiscussion
- If you feel strongly that you want politics back, please volunteer as a mod.
- Posts must be original/unique
- Adhere to Lemmy's Code of Conduct
If you made it this far, showerthoughts is accepting new mods. This community is generally tame so its not a lot of work, but having a few more mods would help reports get addressed a little sooner.
Whats it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report the message goes away and you never worry about it.
you are viewing a single comment's thread
My guess here is that today's LLMs are neural networks (transformer models) that primarily guess the next (and previous) words, since that is literally what they are trained to do. Guessing words is not linguistics, although since they are neural networks, splinter skills are to be expected. GPT most likely learned to understand language because the words it has to predict form patterns, and our word order is tied to meaning, so internal machinery that understands words makes the next-word guesses better (there's a code sketch of that objective below). It's a bit like how GPT-4 picked up a "mind's eye" from constantly reading descriptions, which can form visual patterns even though it was never trained on images (the vision-enabled GPT-4 had CLIP stitched onto it, but GPT-4 can do visual tasks without CLIP if the scene is described to it).

Despite most likely understanding language to some degree, GPT still prioritizes word guessing over language comprehension. See https://www.youtube.com/watch?v=PAVeYUgknMw by "AI Explained", where GPT-4 prioritizes syntax (word order) over meaning even while being well aware that what it is defending is rather dumb. It's as if its mind is telling it that THIS is the right answer despite having arguments against it, and it keeps defending it anyway, because if it MUST be the right answer, then there MUST be a reason for it. Again: guessing.

Also, GPT and many other AIs run on a kind of "stream of consciousness": their thoughts are semi-conscious and they don't think twice. GPT-4 with RLHF does better than most models on this problem (we sometimes have a similar issue with words too, until we semi-consciously stop and rethink them, I guess).
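To make the "guessing the next word" point concrete, here is a minimal sketch of that objective. It's my own illustration, assuming the Hugging Face `transformers` library and the small open "gpt2" checkpoint; the prompt and the 20-step greedy loop are arbitrary choices, not anything specific to GPT-4:

```python
# Minimal sketch: a causal transformer LM only ever scores "which token comes next".
# Assumes: pip install torch transformers  (model and prompt choices are illustrative).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Thoughts that pop into your head in the shower are"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):
        logits = model(input_ids).logits   # shape: (1, seq_len, vocab_size)
        next_id = logits[0, -1].argmax()   # greedy pick of the most likely next token
        input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

Everything that looks like "understanding" has to emerge from getting that one guess right over and over; nothing in the loop rewards comprehension directly.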
Also, GPT-4 is more like a huge "Wernicke's area" than a complete brain. If you stripped the Wernicke's area out of someone's brain and hooked it up to a (g)old ThinkPad with a messy dollar-store aux cable, you'd probably get output similar to what GPT-4 is "puking" out.