this post was submitted on 28 May 2024
100 points (96.3% liked)

Linux and Tech News

top 11 comments
[–] TommySoda 19 points 5 months ago

Funny how the average person figured this out almost immediately, while Google and its researchers needed half a year. It's almost like they were ignoring it as long as they could for the sake of profit. Fuck around and find out, I guess.

[–] [email protected] 10 points 5 months ago

Where were the signs? Why didn't someone warn us?!

[–] homesweethomeMrL 7 points 5 months ago

If only anyone - anyone at all - could have foreseen this horrible outcome

[–] something15525 6 points 5 months ago

Version that doesn't require an account: https://archive.is/OA7Jb

[–] Anticorp 3 points 5 months ago

They're admitting that they are the source of a massive problem. But are they going to do anything about it, or keep pushing their shitty, half-baked AI? It's crazy to me how much worse their AI is than ChatGPT, considering all of the financial and engineering resources available to Google.

[–] [email protected] 3 points 5 months ago

It's alright guys, I just looked up a solution and Google suggests that eating glue and a few small pebbles will solve the issue.

[–] requiem 3 points 5 months ago

Google Researchers Now Also Say We All Should Use Their Shit AI Search That Tells Us To Eat Glue

[–] [email protected] 1 points 5 months ago

GRNASWASUTSASTTUTEG.

[–] darthelmet 3 points 5 months ago

With their shitty AI, this belongs on Not The Onion.

[–] cobysev 2 points 5 months ago

Ahh, just in time for the election season.

[–] [email protected] 1 points 5 months ago

I think it's almost certain that disinformation based on fake accounts simply posting memes or targeted viewpoints, hoping to send the message through sheer repetition, is still a lot more common than doctored factual information. (Not that faked-up disinformation isn't a problem - just saying I think it's still relatively rare as a vehicle for disinformation.)

Why would you even open yourself up to "see, the underlying citation for this thing they're saying is not true" when you might as well not enter into the sphere of backing up what you're saying with facts at all, and just state your assertions as if they were facts instead?