this post was submitted on 30 Aug 2023
202 points (93.2% liked)


Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg — Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

[–] [email protected] 1 points 1 year ago (1 children)

I obviously don't know for sure, but it's at least conceivable that erratic behavior by other drivers, provoked by someone driving slower than them, causes a significant number of accidents every year that would not have happened had everyone been driving at the same speed.

In that case, forcing the self-driving vehicle to never exceed the speed limit means knowingly choosing an option that leads to more people dying rather than fewer.

I think there's a pretty clear moral dilemma here. I'm not claiming to know the right way forward, but I want to point out that strictly following the rules without exception is not always what leads to the best results. Of course, allowing self-driving cars to break the rules comes with its own issues, which only underscores the complexity of the problem.

[–] [email protected] 1 points 1 year ago (1 children)

Then again, if that follow-the-others behavior means driving faster, it also leads to accidents — not so much with the other, frustrated drivers, but with, say, wildlife. People more often fail to stop in time, and die, because of the increased speed and the correspondingly longer braking distance.

That is why bendy, narrow roads have lower speed limits: the limit is a function of the expected reaction time and the sight distance available.

You can't cheat physics: the more speeding there is, the longer the braking distance, and the more often what would have been a near miss — braking in time — becomes a full-on collision.
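That braking-distance point follows from textbook kinematics: total stopping distance is reaction distance plus braking distance, and the braking part grows with the square of speed. A minimal sketch — the 1.5 s reaction time and 0.7 friction coefficient below are illustrative assumptions, not measured values:

```python
G = 9.81          # gravitational acceleration, m/s^2
REACTION_S = 1.5  # assumed driver reaction time, s
MU = 0.7          # assumed tire-road friction coefficient (dry asphalt)

def stopping_distance_m(speed_kmh: float) -> float:
    """Reaction distance plus braking distance for a given speed."""
    v = speed_kmh / 3.6                # km/h to m/s
    reaction = v * REACTION_S          # distance covered before braking starts
    braking = v ** 2 / (2 * MU * G)    # from v^2 = 2*a*d with deceleration a = mu*g
    return reaction + braking

for kmh in (80, 100, 120):
    print(f"{kmh} km/h -> {stopping_distance_m(kmh):.0f} m")
```

Under these assumptions, going from 80 to 120 km/h roughly doubles the total stopping distance — the quadratic braking term dominates at highway speeds.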

So sure, everyone is more in sync — but everyone is in sync with less reaction margin when the unavoidable chaos factor raises its head. Chaos like wildlife (which is neither obligated nor inclined to follow traffic rules), or someone blowing a tire and suddenly losing speed and control.

[–] [email protected] 2 points 1 year ago

When a self-driving car drives at or below the speed limit on a fast-moving highway, it can disrupt the natural flow of traffic. This can lead to a higher chance of accidents when other human drivers resort to aggressive maneuvers like tailgating, risky overtaking, or sudden lane changes. I'm not claiming that it does so for a fact, but it is conceivable, and that's the point of my argument.

Now, contrast this with a self-driving car that adjusts its speed to match the prevailing traffic conditions, even if it means slightly exceeding the speed limit. By doing so, it can blend with the surrounding traffic and reduce the chances of accidents. It's not about encouraging speeding but rather adapting to the behavior of other human drivers.

Of course, we should prioritize safety and adhere to traffic rules whenever possible. However, sometimes the safest thing to do is to temporarily go with the flow, even if that means bending the speed limit slightly. The paradox is that by mimicking human behavior to a certain extent, self-driving cars can contribute to overall road safety. It's a nuanced issue, but it underscores the complexity of integrating autonomous vehicles into a world where human drivers are far from perfect. This would not be an issue if every car were driven by a competent AI and there were no human drivers.