this post was submitted on 16 Sep 2024
725 points (94.9% liked)

Atheist Memes

[–] [email protected] 3 points 1 day ago (1 children)

Complexity, for one. A cramped foot influences the brain, as apparently do the gut bacteria. Focusing on the brain is a starting point, and we don't even understand that very well.

If someone perfectly simulated your entire brain, would that digital brain be sentient?

I don't know. It could be. For now I don't think so. Are you comparing that to an LLM? That would be like comparing the trails of snail slime to a comic: one could compare storylines and art styles to something that just isn't there, and never will be.

What is sentience?

Sentience is the ability to experience feelings and sensations (per Wikipedia). It's a word based not on a clear understanding but on an attempt to categorize. Nonetheless, an LLM doesn't experience anything. It uses pattern recognition and human-provided categorization to generate new output, all within the confines of those recognized patterns.

I think it's strange to say that AI will never be sentient.

That's why it's important to distinguish between "AI" and "LLM". AI, in the sense of AGI, is something we might be able to build one day. LLMs might be a step on the way there, but not in their current form.

[–] [email protected] 2 points 1 day ago (1 children)

You have a point with most of what you said; it's mostly a matter of perspective and how you define things. The only thing I really fundamentally disagree with is equating AI with AGI.

[–] [email protected] 1 points 1 day ago (1 children)

Why do you disagree with that? No, that's a stupid question. How do you disagree with that? Can you elaborate on your point?

[–] [email protected] 2 points 12 hours ago* (last edited 12 hours ago) (1 children)

"AI" refers to lots of things, including image recognition or generation models. AGI refers only to artificial general intelligence, i.e. the kind of AI you would see in science fiction movies. We have AI; we don't have AGI.

[–] [email protected] 1 points 10 hours ago

Yeah, I see how this looks. I was trying to comment on how, for some people, an AI (as in an LLM) seems like a real person (or something different, but sentient), so I was reducing the category "AI" to LLMs.

AI is also, as you said, used for e.g. pathfinding algorithms in games. I never liked the word "AI" for that, but I came to terms with it as game AI got more sophisticated and rounded, making characters in games appear more natural in their behaviour. Also, I don't have a better word for it.
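To be fair, the "game AI" kind of AI can be almost trivially simple. A minimal sketch (breadth-first search on a grid; the function name and grid format are made up for illustration):

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path between two (row, col) cells; '#' cells are walls."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while queue:
        cur = queue.popleft()
        if cur == goal:
            # Walk the came_from chain backwards to rebuild the path.
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cur
                queue.append((nr, nc))
    return None  # goal unreachable

grid = ["....",
        ".##.",
        "...."]
print(bfs_path(grid, (0, 0), (2, 3)))
```

No learning, no model, no "intelligence" in any deep sense; just a queue and a visited set, yet it's what makes a game character walk around obstacles "naturally".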

I used AGI because that's the only branch of AI that I consider to have a chance of being or becoming sentient. That's why I went in that direction: to contrast it with LLMs, despite LLMs being perceived by some as sentient.

So yeah, the categorization was a bit off in order to drive home a point. I didn't realize you wanted to discuss semantics. (I know that sounds sarcastic, but I also tend to correct people on semantics when I can, so no sarcasm intended.)