this post was submitted on 18 Jun 2024
697 points (99.0% liked)
you are viewing a single comment's thread
[–] [email protected] 73 points 6 months ago (18 children)

Do you ever feel like we will be the last generation to know anything?

[–] grandkaiser 3 points 6 months ago (2 children)

Hopefully your generation will be the last that can't tell an obvious shitpost from reality.

[–] Retrograde 5 points 6 months ago
[–] [email protected] 2 points 6 months ago (2 children)

How is this an obvious shitpost? I am here to learn.

[–] grandkaiser 5 points 6 months ago* (last edited 6 months ago) (1 children)

AI didn't write this. AI would never write this. It's outrageously wrong to an extreme degree. Making dangerous and false claims has happened on occasion with LLMs (often due to being fed various prompts until the user twists it into saying them), but an AI wouldn't write something like that, come up with a fake graph, and include a made-up song (!?!) from the Beatles about it. The fact that you are believing it doesn't speak to the danger of AI as much as it speaks to the gullibility of people.

If I said "Obama made a law to put babies in woodchippers" and someone believed it, it wouldn't speak to Obama being dangerous, it would speak to that person being incredibly dense.

[–] [email protected] 4 points 6 months ago* (last edited 6 months ago)

I have used LLMs before and they are occasionally wrong, and it seems like you don't disagree. I don't see how someone who isn't deeply familiar with LLMs would be obviously tipped off that this post is a shitpost. As for the graphs, who knows, Google probably already has that working. I've seen LLMs make up songs before too.

AI would never write this.

Why not? I figure you could train an AI to write this. I could see a Google engineer messing up and producing a bad AI. GPT-2 engineers have made this mistake before.

The fact that you are believing it doesn’t speak to the danger of AI as much as it speaks to the gullibility of people.

This is kind of like saying "the problem with nuclear bombs is that people are too easy to evaporate at high temperatures, not the bombs themselves". Yeah, that is true, but it's really hard to make people less gullible. I wouldn't say LLMs and AI are bad or that we should stop using them. But I think people like you need to understand that the average person is not on your level, and you need to slow your roll.

If I said "Obama made a law to put babies in woodchippers"....

I don't think this is a good comparison, because Obama has been around for a while and most people believe Obama wouldn't do that. Now, if Obama went from being a nobody to president in a day and then someone told me about the woodchipper law, I would be unsure and have to double-check. It wouldn't be obvious. Likewise, since LLMs are relatively new to most people, it's going to take a while before most people can tell a normal mistake by an LLM from an obviously faked mistake by a shitposter.

[–] Zess 4 points 6 months ago (1 children)

Are you confused about the shitpost part or the obvious part?

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago) (1 children)

Ok, you're not the original person. Is this an obvious shitpost in your opinion?

Because if it isn't obvious, how am I supposed to know whether this is a shitpost or not?

[–] [email protected] 2 points 6 months ago (1 children)

I’m a different person from those two and yes, I definitely agree this is an extremely obvious shitpost. The Beatles song and the image (zoom in if you haven’t seen the included dog safety chart) are the giveaways.

[–] [email protected] 2 points 6 months ago (1 children)

As someone who doesn't pay close attention to what Google does at the top of the page, this is not obvious to me. Glad it is a shitpost and not something Google actually responded with.

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago)

Totally fair! LLMs are blurring the lines of shitposts and insane responses, so I’m for sure not gonna shit on anyone for not recognizing a shitpost.

I do have a lot of experience, as a person who (un?)fortunately was on /b/ when it was released until the late ‘00s.

This is something else, though… I’m interested and horrified to see where this all goes.

Edit: I do also have to say I use no google products or services so I’m with you there hahaha
