this post was submitted on 26 Jul 2024
520 points (99.6% liked)

196

all 28 comments
[–] cornshark 149 points 3 months ago (3 children)

Good. It would be very time-consuming

[–] Iheartcheese 42 points 3 months ago (2 children)
[–] [email protected] 28 points 3 months ago
[–] [email protected] 6 points 3 months ago

Did you just ASSUME Skynet's gender?

[–] disguy_ovahea 14 points 3 months ago

It takes time to digest the joke.

[–] [email protected] 8 points 3 months ago

Ok google, order another tube of genital rash cream

[–] NickwithaC 92 points 3 months ago

Same energy:

[–] cornshark 47 points 3 months ago (1 children)

They're delicious. You'll go back for seconds

[–] [email protected] 14 points 3 months ago

You went for seconds? No wonder you look so long.

[–] [email protected] 33 points 3 months ago

I'm laughing.

[–] PolyLlamaRous 28 points 3 months ago (1 children)

"Clock". I did not read "clock" the first two times.

[–] [email protected] 15 points 3 months ago
[–] cornshark 17 points 3 months ago (1 children)

You can eat it with your hands

[–] radicalautonomy 34 points 3 months ago (1 children)

Eat it with your hands? Not on my watch.

[–] Rebels_Droppin 7 points 3 months ago

Now this is good

[–] [email protected] 7 points 3 months ago

Well, it did make me laugh.

[–] angrystego 4 points 3 months ago (3 children)

I don't get it. Too tired or too stupid.

[–] Rebels_Droppin 8 points 3 months ago (1 children)

Google Assistant is bad, and so are its jokes

[–] Blyfh 5 points 3 months ago
[–] [email protected] 2 points 3 months ago

Por qué no los dos ("why not both")

[–] raspberriesareyummy 2 points 3 months ago (1 children)

Same - could someone please explain for the not-so-enlightened among us?

[–] [email protected] 4 points 3 months ago (1 children)

AI assistants have certain input words that break off the conversation, presumably to keep you safe and give you an off switch. Google's AI doesn't seem to be aware that a user's "No" makes it stop and drop everything, and it tells a joke in the format of a yes/no question that expects a "No" response if you don't already know the joke. So it ends the conversation with an "OK" because the user said no. Luckily, that breaks our expectations of what should have happened in a way that makes some of us laugh anyway.
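The failure mode described above can be sketched in a few lines. This is a hypothetical toy, not Google's actual code: the stop-word list, handler name, and punchline are all made up for illustration. The point is only that a global off-switch check running *before* dialogue-specific logic will swallow a "No" that the joke itself expects.

```python
# Hypothetical sketch of a global stop-word check preempting a yes/no joke.
STOP_WORDS = {"no", "stop", "cancel"}  # assumed global off-switch phrases


def handle_reply(reply: str, expected: str) -> str:
    """Return the assistant's next line given the user's reply."""
    normalized = reply.strip().lower()
    # The global safety check runs before any dialogue-specific logic,
    # so "no" ends the conversation even when the joke *expects* it.
    if normalized in STOP_WORDS:
        return "OK."
    if normalized == expected:
        return "To get to the other side!"  # placeholder punchline
    return "Never mind."


# The joke asks "Have you heard this one?" and expects "no" to continue,
# but the stop-word check fires first and the assistant just says "OK."
print(handle_reply("No", expected="no"))
```

A fix would be to let the active dialogue claim the reply first and only fall back to the global stop-word check when no flow is waiting on a yes/no answer - which also matches the last comment in this thread: answering "I haven't" instead of "No" sidesteps the stop-word list entirely.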

[–] angrystego 2 points 3 months ago
[–] [email protected] 4 points 3 months ago

I mean, it made me chuckle

[–] [email protected] 1 point 3 months ago

Google is the joke, at this point.

[–] [email protected] 1 points 3 months ago

The response halted it. Saying "I haven't" would've probably let it continue.