this post was submitted on 10 Jan 2025
74 points (100.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

top 7 comments
[–] dual_sport_dork 16 points 1 week ago

Guess what!

When accuracy matters, the labor cost of babysitting the LLM's output is the same as doing the work yourself. That's before you even consider having to unfuck it after it paints itself into a corner and ruins its own model.

[–] [email protected] 16 points 1 week ago (2 children)

“Even in the best case, the models had a 35% error rate,” said Stanford’s Shah

So, when the AI makes a critical error and you die, who do you sue for malpractice?

The doctor for not catching the error? The hospital for selecting the AI that made a mistake? The AI company that made the buggy slop?

(Kidding, I know the real answer is that you're already dead and your family will get a coupon good for $3.00 off a sandwich at the hospital cafeteria.)

[–] FlyingSquid 9 points 1 week ago (1 child)

"AIs are people" will probably be the next conservative rallying cry. That will shield them from all legal repercussions aside from wrist-slaps just like corporations in general.

[–] dual_sport_dork 6 points 1 week ago

Cool, so they are entitled to wages and labor protections, then.

"Not like that!"

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago)

So, when the AI makes a critical error and you die, who do you sue for malpractice?

Well, see, that is the technology: it's a legal lockpick for mass murderers to escape the consequences of knowingly condemning tens of thousands of innocent people to death for a pathetic hoarding of wealth.

[–] Lost_My_Mind 4 points 1 week ago

I hope those employees work really really slowly.

[–] [email protected] 2 points 1 week ago

I feel like any AI tool that's being sold as saving you money just won't do that. Some of the ones that sell improved detection rates might.

AI that works as a tool designed to be used by an existing or new professional to augment their abilities works as well as any other tool. An ultrasound doesn't save you money except in the abstract sense that it's more freely usable than an x-ray, allowing more checks with less equipment.
A tool that highlights concerning areas on a mammogram isn't replacing a person any more than the existing tools that highlight concerning heart rhythms do.

Trying to get LLMs to replace people, particularly when it comes to explaining the content of a potentially technical medical discussion, is just not going to be reliable.