this post was submitted on 27 Apr 2024
885 points (95.8% liked)

[–] [email protected] 11 points 5 months ago (57 children)

It only matters if Autopilot causes more deaths than an average human driver over the same distance traveled.
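To make the comparison this comment is describing concrete, here is a minimal sketch in Python of normalizing fatality counts by distance driven before comparing Autopilot to a human-driver baseline. Every number below is a hypothetical placeholder, not a figure from the article or from NHTSA.

```python
# Minimal sketch of a per-distance fatality comparison.
# All input numbers are hypothetical placeholders; substitute real
# NHTSA / Tesla figures to do this comparison properly.

def fatalities_per_100m_miles(deaths: float, miles_travelled: float) -> float:
    """Normalize a raw fatality count to deaths per 100 million miles driven."""
    return deaths / miles_travelled * 100_000_000

# Hypothetical inputs only.
autopilot_rate = fatalities_per_100m_miles(deaths=20, miles_travelled=1_500_000_000)
human_baseline = fatalities_per_100m_miles(deaths=40_000, miles_travelled=3_000_000_000_000)

print(f"Autopilot:      {autopilot_rate:.2f} deaths per 100M miles")
print(f"Human baseline: {human_baseline:.2f} deaths per 100M miles")
print("Autopilot is worse than the baseline"
      if autopilot_rate > human_baseline
      else "Autopilot is at or below the baseline")
```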

[–] [email protected] 10 points 5 months ago* (last edited 5 months ago) (7 children)

Is it linked to excess deaths, though? Technically it could be saving lives at a population scale. I doubt that's the case, but it could be. I'll read the article now and find out.

Edit: it doesn't seem to say anything regarding "normal" auto-related deaths. They're focusing on the bullshit designation of an unfinished product as "Autopilot", and a (small) subset of specific cases that are particularly egregious, where there were 5-10 seconds of lead time before the incident. In those cases, a person who was paying attention wouldn't have been in the accident.

Also some clarity edits.

[–] [email protected] 7 points 5 months ago (1 children)

I just read a LinkedIn post from a laid-off Tesla engineer.

He said, "I checked my email while autopiloting to work."

The employees know its capabilities better than anyone, and they still take the same stupid risk.

[–] [email protected] 6 points 5 months ago (1 children)

This is the best summary I could come up with:


In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today.

The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.

NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.

Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.

The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.


The original article contains 788 words; the summary contains 158 words. Saved 80%. I'm a bot and I'm open source!
