this post was submitted on 24 Aug 2024
-23 points (17.1% liked)

science

Key points
  • The hippocampus enables abstract reasoning; LLMs mirror this through pattern-based language prediction.
  • Future AI could emulate human inference by integrating multimodal learning and reinforcement methods.
  • AI's evolution hinges on bridging prediction and reasoning, moving toward deeper, human-like understanding.
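The "pattern-based language prediction" in the first key point can be sketched with a toy bigram model. This is an enormous simplification of a real LLM (which uses learned neural representations, not raw counts), and the corpus and function names below are made up for illustration:

```python
from collections import Counter, defaultdict

# Toy illustration only: a bigram model that "predicts" the next word
# purely from patterns observed in its training text.
corpus = "the cat sat on the mat the cat ate the food".split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed continuation, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" — the most common continuation of "the"
```

The point of the sketch: nothing here "knows" what a cat is; the model only reproduces statistical patterns, which is the crux of the thread's disagreement about whether prediction alone can amount to reasoning.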
top 7 comments
[–] fart_pickle 22 points 1 month ago
[–] [email protected] 13 points 1 month ago

"Can LLMs think like us?"

No.

"Can LLMs think—?"

No.

"Can LLMs—?"

No.

[–] just_another_person 7 points 1 month ago
[–] [email protected] 5 points 1 month ago

Facts, reasoning, ethics, etc. are outside the scope of an LLM. Expecting otherwise is like expecting a stand mixer to bake a cake: it's helpful for a decent part of the process, but it can't turn batter into a tasty dessert on its own. An AI like the ones from the movies would require many more pieces than an LLM can provide, and saying otherwise is a category mistake*.

That isn't to say that something won't be developed eventually, but it would be FAR beyond an LLM if it is even possible.

(* See also: https://plato.stanford.edu/entries/category-mistakes/)

[–] [email protected] 5 points 1 month ago

Not like us, but maybe like OP 🤣

[–] A_A -1 points 1 month ago

"Can LLMs think?" YES. "Like us?" NO ... not right now, anyway.

[–] Zexks -3 points 1 month ago

The fear in here is palpable.