this post was submitted on 21 Dec 2023
59 points (100.0% liked)

7.1 million miles, 3 minor injuries: Waymo's safety data looks good

Waymo says its cars cause injuries six times less often than human drivers.

[–] [email protected] 23 points 9 months ago (3 children)

If those same miles had been driven by typical human drivers in the same cities, we would have expected around 13 injury crashes.

I'm going to set aside my distrust of self-reported safety statistics from tech companies for a sec to say two things:

First, I don't think that's the right comparison. You need to compare them to taxis.

Second, we need to know how often Waymo's employees intervene. According to the NYT, Cruise employed 1.5 staff members per car, intervening to assist these not-so-self-driving vehicles every 2.5 to 5 miles, making them arguably less autonomous than ordinary human-driven cars.

Source: https://www.nytimes.com/2023/11/03/technology/cruise-general-motors-self-driving-cars.html?unlocked_article_code=1.7kw.o5Fq.5WLwCg2_ONB9&smid=url-share

[–] [email protected] 5 points 9 months ago* (last edited 9 months ago)

First, I don’t think that’s the right comparison. You need to compare them to taxis.

It's not just that: you generally have a significant distribution shift when comparing self-driving systems or driving assistants to normal human drivers. People only engage self-driving in situations where it has a chance of working. This is especially true of something like Tesla's system, where drivers won't even start Autopilot when conditions get tricky (never mind intervening dynamically: they don't engage it in the first place).

For instance, one of the most common confounding factors is the ratio of highway driving to non-highway driving. Highways are inherently less accident-prone, since you don't have to deal with intersections, oncoming traffic, people merging in from driveways, or children chasing a ball into the street. Self-driving fleets tend to report a much higher share of highway miles than ordinary drivers, because where the technology is available dictates where you end up measuring. You can correct for that by, e.g., explicitly computing the conditional likelihood p(accident | highway) and then reweighting with a common p(highway) derived from the entire population of car traffic.
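The reweighting correction described above can be sketched in a few lines. All numbers here are hypothetical, purely for illustration; the point is only that mixing the same per-condition rates with a different highway/city split changes the headline number:

```python
# Sketch of the p(accident | condition) reweighting correction.
# All rates and mileage splits below are hypothetical, for illustration only.

def mixed_accident_rate(p_acc_highway, p_acc_city, p_highway):
    """Combine per-condition accident rates using a given highway share.

    p_acc_highway, p_acc_city: accidents per mile, conditional on road type.
    p_highway: fraction of miles driven on highways.
    """
    return p_acc_highway * p_highway + p_acc_city * (1.0 - p_highway)

# Hypothetical per-mile rates measured on a self-driving fleet:
p_acc_highway = 0.5e-6   # accidents per highway mile
p_acc_city = 2.0e-6      # accidents per city mile

# Suppose the fleet logs 80% highway miles, but general traffic is only 30%.
naive = mixed_accident_rate(p_acc_highway, p_acc_city, 0.80)
corrected = mixed_accident_rate(p_acc_highway, p_acc_city, 0.30)

print(naive)      # rate weighted by the fleet's own (highway-heavy) mix
print(corrected)  # rate reweighted to the whole population's mix
```

With these made-up numbers the naively weighted rate comes out roughly half the population-reweighted one, i.e. the fleet looks safer than it would under a typical driving mix, which is exactly the distribution-shift effect being described.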
