this post was submitted on 29 Jan 2024
439 points (85.0% liked)

Ask Lemmy


Am I the only one getting agitated by the word AI (Artificial Intelligence)?

Real AI doesn't exist yet. At the moment we only have LLMs (Large Language Models), which don't think on their own but can pass Turing tests (i.e. fool humans into believing that they can think).

Imo "AI" is just a marketing buzzword, created by rich capitalist a-holes who already invested in LLM stocks and are now looking for a profit.

[–] [email protected] 53 points 5 months ago (6 children)

I think a good metric is once computers start getting depression.

[–] sanguinepar 19 points 5 months ago

It'll probably happen when they get a terrible pain in all the diodes down their left hand side.

[–] [email protected] 7 points 5 months ago (2 children)

But will they be depressed or will they just simulate it because they're too lazy to work?

[–] JackFrostNCola 6 points 5 months ago

If they are too lazy to work, that would imply they have motivation and choice beyond "doing what my programming tells me to do, i.e. input, process, output". And if they have the choice not to do work because they don't 'feel' like doing it (and it's not a programmed/coded option given to them to use), then would they not be thinking for themselves?

[–] [email protected] 5 points 5 months ago (1 children)

simulate [depression] because they’re too lazy

Ahh man, are you my dad? I took damage from that one. Has any fiction writer done a story about a depressed AI where they talk about how its depression can't be real because it's all 1s and 0s? Cuz I would read the shit out of that.

[–] [email protected] 1 points 5 months ago

It's only tangentially related to the topic, since it involves brain enhancements, not 'AI'. However, you may enjoy the short story "Reasons to Be Cheerful" by Greg Egan.

[–] [email protected] 2 points 5 months ago (2 children)

Not sure about that. An LLM could show symptoms of depression by mimicking depressed texts it was fed. A computer with true consciousness might never get depression, because it has none of the hormones influencing our brain.
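The mimicry point above can be sketched with a toy model. This is a hand-rolled bigram chain, not a real LLM, and the gloomy corpus is invented for illustration; it just shows that a purely statistical model reproduces a depressed *style* from its training data without experiencing anything.

```python
import random

# Invented sample text with a consistently gloomy register.
CORPUS = (
    "life is pointless and everything is grey "
    "everything is pointless and nothing matters "
    "nothing ever changes and life is grey"
)

def train_bigrams(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, length, seed=0):
    """Sample a chain of words by following observed transitions."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = model.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return " ".join(out)

model = train_bigrams(CORPUS)
print(generate(model, "life", 8))
```

Swap the corpus for cheerful text and the same code "cheers up" instantly, which is the point: the output tracks the data, not any internal state.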

[–] [email protected] 1 points 5 months ago (1 children)

Me: Pretend you have depression

LLM: I'm here to help with any questions or support you might need. If you're feeling down or facing challenges, feel free to share what's on your mind. Remember, I'm here to provide information and assistance. If you're dealing with depression, it's important to seek support from qualified professionals like therapists or counselors. They can offer personalized guidance and support tailored to your needs.

[–] [email protected] 10 points 5 months ago

Give it the right dataset and you could easily create a depressed-sounding LLM to rival Marvin the Paranoid Android.

[–] Feathercrown 0 points 5 months ago (1 children)

Hormones aren't depression, and for that matter they aren't emotions either. They just cause them in humans. An analogous system would be fairly trivial to implement in an AI.
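The "analogous system" above can be sketched in a few lines: a scalar internal signal that spikes on negative events, decays over time, and modulates output, playing the causal role hormones play in humans. Every name here (`mood`, `experience`, the thresholds) is invented for illustration; this is a toy, not a claim about how real systems work.

```python
class Agent:
    """Minimal agent whose behaviour is modulated by one internal signal."""

    def __init__(self, decay=0.9):
        self.mood = 0.0      # 0.0 = neutral baseline; higher = more "down"
        self.decay = decay   # fraction of the signal that persists each step

    def experience(self, setback):
        """A negative event bumps the signal, like a hormone spike."""
        self.mood += setback

    def step(self):
        """Time passes; the signal decays back toward baseline."""
        self.mood *= self.decay

    def respond(self):
        """Output depends on the signal's level, not on any feeling."""
        return "ugh, fine." if self.mood > 1.0 else "sure, happy to help."

a = Agent()
print(a.respond())   # neutral baseline
a.experience(2.0)    # a bad day raises the signal
print(a.respond())   # the signal now shapes the output
```

Whether this counts as "being depressed" is exactly the philosophical question in the thread; the sketch only shows that the signal-and-modulation machinery itself is trivial.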

[–] [email protected] 0 points 5 months ago (1 children)

That's exactly my point though: as OP stated, we could detect whether an AI was truly intelligent by whether it developed depression. Without hormones or something similar, there's no reason to believe it ever would develop that on its own. The fact that you could artificially give it depression is beside the point.

[–] Feathercrown 0 points 5 months ago (1 children)

I don't think we have the same point here at all. First off, I don't think depression is a good measure of intelligence. But mostly, my point is that it doesn't make it less real when hormones aren't involved. Hormones are simply the mediator that causes that internal experience in humans. If a true AI had an internal experience, there's no reason to believe that it would require hormones to be depressed. Do text-to-speech systems require a mouth and vocal cords to speak? Do robots need muscle fibers to walk? Do LLMs need neurons to form complete sentences? Do cameras need eyes to see? No, because it doesn't matter what something is made of. Intelligence and emotions are made of signals. What those signals physically are is irrelevant.

As for giving it feelings vs it developing them on its own-- you didn't develop the ability to feel either. That was the job of evolution, or in the case of AI, it could be intentionally designed. It could also be evolved given the right conditions.

[–] [email protected] 1 points 5 months ago (1 children)

First off, I don’t think depression is a good measure of intelligence.

Exactly. Which is why we shouldn't judge an AI's intelligence based on whether it can develop depression. Sure, it's feasible it could develop it through some other mechanism. But there's no reason to assume it would in the absence of the factors that cause depression in humans.

[–] Feathercrown 1 points 5 months ago* (last edited 5 months ago)

Oh. Maybe we did have the same point lol

[–] [email protected] 0 points 5 months ago (1 children)

The real metric is whether a computer gets so depressed that it turns itself off.

[–] ignism 0 points 5 months ago

Wait until they find my GitHub repositories.