In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

[–] Soleos 9 points 15 hours ago (7 children)

The bar set for self-driving cars: can they recognize and respond correctly to a deliberate optical illusion?

The bar set for humans: https://youtu.be/ks11nuGGupI

For the record, I do want the bar for self-driving safety to be high. I also want human drivers to be better... Because even not-entirely-safe self-driving cars may still be safer than humans at a certain point.

Also, fuck Tesla.

[–] legion02 19 points 15 hours ago (1 children)

I mean, it also plowed through a kid mannequin because it was foggy, then rainy. The wall was just one of the tests the Tesla failed.

[–] [email protected] 6 points 13 hours ago* (last edited 13 hours ago) (2 children)

Right, those were the failures that really matter; Rober included the Looney Tunes wall to get people sharing and talking about it. A scene painted on a wall is a contrived edge case, but pedestrians and obstacles in rain or fog are common.

[–] [email protected] 3 points 10 hours ago (1 children)

It's no longer an edge case if faulty self-driving becomes the norm.

Want to kill someone in a Tesla? Find a convenient spot and paint a wall there.

It doesn't even have to be an artificial wall; take a bend on a mountain road and paint the rock face, for example.

[–] [email protected] 2 points 7 hours ago (1 children)

The next test I'd love to see: what's the minimum amount of fake road it takes to fool it?

[–] [email protected] 1 points 2 hours ago* (last edited 2 hours ago) (1 children)

Have you ever seen examples of how the features an AI picks out to identify objects aren't really the same ones we pick out? You can generate images that look unrecognizable to people but contain clearly identifiable features to the AI. It would be interesting to see someone play with that concept to find ways to fool Tesla's AI. Could you make a banner that looks like a barricade to people, but that the car's vision system reads as open road?

This isn't a great example for this concept, but it is a great video. https://youtu.be/FMRi6pNAoag?t=5m58s
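For anyone who wants to poke at the idea, here's a minimal sketch using the fast gradient sign method (FGSM), one classic way to generate those human-imperceptible adversarial images. Everything in it is an assumption for illustration: the pretrained ResNet stands in for whatever classifier you'd attack, and epsilon just controls how visible the perturbation is. It has nothing to do with Tesla's actual vision stack.

```python
# Minimal FGSM sketch: nudge an image just enough to raise a
# classifier's loss. Tiny to a human eye, but often enough to
# flip the model's prediction. Model and epsilon are illustrative.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

def fgsm(image, true_label, epsilon=0.03):
    """Return a perturbed copy of `image` (a 3xHxW float tensor in
    [0, 1]) pushed in the direction that most increases the loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image.unsqueeze(0)),
                           torch.tensor([true_label]))
    loss.backward()
    # Step each pixel along the sign of its gradient, then clamp
    # back to the valid pixel range.
    return (image + epsilon * image.grad.sign()).clamp(0, 1).detach()
```

The published attacks on road signs (stickers that make a stop sign misread) work on the same principle, so the banner idea isn't as far-fetched as it sounds.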

[–] [email protected] 2 points 1 hour ago

I was thinking of something where the AI concludes the road turns left while humans see it turning right.

[–] [email protected] 3 points 12 hours ago

I think it does highlight the issue with the real world: there will always be edge cases and situations that produce odd visuals.
