this post was submitted on 14 Feb 2024
482 points (98.6% liked)


Last year, two Waymo robotaxis in Phoenix "made contact" with the same pickup truck that was in the midst of being towed, which prompted the Alphabet subsidiary to issue a recall on its vehicles' software. A "recall" in this case meant rolling out a software update after investigating the issue and determining its root cause.

In a blog post, Waymo has revealed that on December 11, 2023, one of its robotaxis collided with a backwards-facing pickup truck being towed ahead of it. The company says the truck was being towed improperly and was angled across a center turn lane and a traffic lane. Apparently, the tow truck didn't pull over after the incident, and another Waymo vehicle came into contact with the pickup truck a few minutes later. Waymo didn't elaborate on what it meant by saying that its robotaxis "made contact" with the pickup truck, but it did say that the incidents resulted in no injuries and only minor vehicle damage. The self-driving vehicles involved in the collisions weren't carrying any passengers.

After an investigation, Waymo found that its software had incorrectly predicted the future movements of the pickup truck due to "persistent orientation mismatch" between the towed vehicle and the one towing it. The company developed and validated a fix for its software to prevent similar incidents in the future and started deploying the update to its fleet on December 20.
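Waymo hasn't published its prediction code, but a minimal sketch of how a persistent orientation mismatch can poison a simple constant-velocity prediction might look like this (all names and values below are illustrative, not Waymo's):

```python
import math

def predict_position(x, y, heading_rad, speed_mps, dt_s):
    """Naive constant-velocity prediction that trusts the detected body heading."""
    return (x + speed_mps * math.cos(heading_rad) * dt_s,
            y + speed_mps * math.sin(heading_rad) * dt_s)

# A pickup towed backwards is actually moving east (+x) with the tow truck,
# but its body faces west (pi radians), so a heading-based predictor walks
# it the wrong way.
print(predict_position(0.0, 0.0, math.pi, speed_mps=5.0, dt_s=2.0))
# -> roughly (-10.0, 0.0): predicted 10 m west, while the truck really ends up 10 m east
```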

[–] [email protected] 70 points 7 months ago (5 children)

The company says the truck was being towed improperly

Shit happens on the road. It's still not a great idea to drive into it.

The company developed and validated a fix for its software to prevent similar incidents

So their plan is to fix one accident at a time…

[–] DoomBot5 19 points 7 months ago (1 children)

Rules are written in blood. Once you figure out all the standard cases, you can only try to predict as many edge cases as you can think of. You can't make something foolproof, because there will always be a greater fool who comes along.

[–] [email protected] 13 points 7 months ago (1 children)

Unexpected or not, it should do its best to stop or avoid the obstacle, not drive into it.

An autonomous vehicle shouldn't ever be able to actively drive forward into anything. It's basic collision detection that ought to brake the car here. If something is in the position the car wants to drive to, it simply shouldn't drive there. There's no reason to blame the obstacle for being towed incorrectly.
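In code, that rule could be as simple as this toy sketch (the occupancy-cell representation here is assumed, not any real planner's):

```python
def should_brake(planned_cells: set, occupied_cells: set) -> bool:
    """Brake if any cell the car is about to drive through is occupied
    right now, regardless of where the object is predicted to go next."""
    return bool(planned_cells & occupied_cells)

print(should_brake({(0, 1), (0, 2)}, {(0, 2)}))  # True: something is in the path
```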

[–] NotMyOldRedditName 7 points 7 months ago* (last edited 7 months ago) (1 children)

In this case it thought the vehicle had a different trajectory due to how it was improperly set up.

The car probably thought it wasn't going to hit it until it was too late and the trajectory calculation proved incorrect.

Every vehicle on the road is a few moments away from crashing if we calculate that incorrectly. It doesn't matter if it knows it's there.

[–] [email protected] 2 points 7 months ago (1 children)

Same thing applies to a human driver. Most accidents happen because the driver makes a wrong assumption. The key to safe driving is not getting into situations where driving is based on assumptions.

Trajectory calculation is definitely an assumption and shouldn't be allowed to override whatever sensor is checking for obstructions ahead of the car.
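As a toy sketch of that priority ordering (the function and parameter names are invented for illustration, not from any real stack):

```python
def commanded_speed_mps(cruise_mps: float, predicted_path_clear: bool,
                        obstruction_detected: bool) -> float:
    """The obstruction sensor can veto the planner, never the reverse."""
    if obstruction_detected:   # hard sensor evidence always wins
        return 0.0
    return cruise_mps if predicted_path_clear else 0.0

print(commanded_speed_mps(13.0, True, True))  # 0.0: a detected obstruction vetoes the plan
```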

[–] NotMyOldRedditName 2 points 7 months ago (1 children)

The car can't move without trajectory calculations though.

If the car ahead of you pulls forward when the light goes green, your car can start moving forward as well, keeping in mind the lead car's trajectory and speed.

If the rule were just "don't hit an object in your path", the car wouldn't move forward until the lead car was halfway down the block.

The car knew the truck was there in this case; it wasn't a failure to detect. Due to a programming failure, it thought it was safe to move because the truck wouldn't be there.

If you're following a vehicle with proper distance and it slams the brakes, you should be able to stop in time, as you've calculated its trajectory and a safe speed behind it. But if that same vehicle slams on the brakes and goes into reverse, well... Good luck.

It's all assumptions, assuming the detection is accurate in the first place.

[–] [email protected] 1 points 7 months ago (1 children)

If you're following a vehicle with proper distance and it slams the brakes, you should be able to stop in time, as you've calculated its trajectory and a safe speed behind it.

You don't need to calculate their trajectory. It's enough to know your own.

If a heavy box falls off a truck and stops dead in front of you, you need to be able to stop. That box has no trajectory, so it's an error to include other vehicles' trajectories in the safe distance calculation.

Traffic can move through an intersection closely by calculating a safe distance, which may be smaller than the legal definition but still large enough to stop for anything suddenly appearing on the road. The only thing needed is that the distance is calculated from your own speed and a visually confirmed position of other things. It can absolutely be done regardless of the speed or direction of other vehicles.
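For instance, a stopping-distance bound computed only from your own state might look like this minimal sketch (the reaction time and deceleration are assumed values):

```python
def stopping_distance_m(v_mps: float, reaction_s: float = 0.5,
                        decel_mps2: float = 7.0) -> float:
    """Distance covered while reacting plus braking distance v^2 / (2a)."""
    return v_mps * reaction_s + v_mps ** 2 / (2 * decel_mps2)

print(stopping_distance_m(27.0))  # ~66 m needed at 27 m/s (~60 mph)
```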

Anyway, a backwards-facing truck is a weird thing to misinterpret. Trucks sometimes face backwards for whatever reason.

It would be interesting to know how the self-driving car would react to a wrong-way driver.

[–] NotMyOldRedditName 1 points 7 months ago* (last edited 7 months ago) (1 children)

You don't need to calculate their trajectory. It's enough to know your own.

This doesn't make sense. It's why I was saying the car won't move at a stoplight when it goes green until the lead car is halfway down the street.

If the car is 2.5 seconds ahead of me at 60 mph on the highway, it's only 2.5 seconds ahead of me if the other car is doing 60 mph. If the car is doing 0 mph, then I'm going to crash into it.

It needs to know how fast and in what direction the obstacle is going, calculate its rate of acceleration or deceleration, and extrapolate from there.
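A toy illustration of that point (units and numbers are illustrative):

```python
def time_to_collision_s(gap_m: float, my_speed_mps: float,
                        lead_speed_mps: float) -> float:
    """A 2.5 s gap is only 2.5 s if the lead car keeps moving."""
    closing = my_speed_mps - lead_speed_mps
    return gap_m / closing if closing > 0 else float("inf")

gap = 2.5 * 27.0                             # 2.5 s gap at 27 m/s (~60 mph)
print(time_to_collision_s(gap, 27.0, 27.0))  # inf: lead matches our speed
print(time_to_collision_s(gap, 27.0, 0.0))   # 2.5: lead is stopped dead
```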

[–] [email protected] 0 points 7 months ago (1 children)

2.5 seconds at 60 mph is more than enough to come to a full stop. If the car in front of you dropped an anvil (traveling at 0 mph) on the road, you could stop before crashing into the anvil. You do not need to drive into the other car's trajectory path.

[–] NotMyOldRedditName 1 points 7 months ago* (last edited 7 months ago) (1 children)

You can't be driving behind that vehicle at 60 mph with a 2.5-second gap WITHOUT knowing its trajectory.

You keep trying to say it doesn't need to know the trajectory of all objects around it, but that's not true.

[–] [email protected] 1 points 7 months ago (1 children)

Yes you can. It is a stopping distance. 2.5 seconds at 60 mph is 220 feet, and a car can brake from 60 to 0 in less than 220 feet. It will take longer than 2.5 seconds to do, but it won't hit the object that was originally 2.5 seconds ahead.
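A quick check of that arithmetic (the 0.8 g deceleration is an assumed figure):

```python
MPH_TO_FPS = 5280 / 3600           # 1 mph = ~1.467 ft/s

v = 60 * MPH_TO_FPS                # 88 ft/s
gap_ft = 2.5 * v                   # 220 ft: the 2.5-second following gap
decel = 0.8 * 32.2                 # ~25.8 ft/s^2: a firm 0.8 g stop (assumed)
braking_ft = v ** 2 / (2 * decel)  # ~150 ft to stop from 60 mph
print(gap_ft, round(braking_ft, 1), braking_ft < gap_ft)  # 220.0 150.3 True
```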

[–] NotMyOldRedditName 2 points 7 months ago (1 children)

Maybe following straight behind isn't as good an example, although even there the car is calculating the likelihood of the lead continuing to go straight.

An oncoming car, drifting out of its lane towards your lane.

It's not going to hit you until it's in your path, but its trajectory is heading into your path.

If you don't consider where it's going and how fast it's going, you won't know if it's going to enter your lane before you pass it.

If you're only trying to avoid hitting objects and it's not in your path until the last quarter second, you won't take appropriate action, because you don't know it's coming at you.

All these measurements are taken as time between you and them, and the car uses that info to calculate the trajectories.
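A toy version of that drift extrapolation (all values illustrative):

```python
def enters_lane_before_passing(lateral_gap_m: float, drift_mps: float,
                               time_to_pass_s: float) -> bool:
    """Extrapolate the oncoming car's lateral drift: does it cross into
    our lane before we are past it?"""
    if drift_mps <= 0:
        return False               # not drifting toward our lane
    return lateral_gap_m / drift_mps < time_to_pass_s

print(enters_lane_before_passing(1.5, 0.8, 3.0))  # True: crosses in ~1.9 s, before we pass
```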

[–] [email protected] 1 points 7 months ago

Yes, I know, and it should. What I am saying is that the trajectory calculations should never be allowed to override the basic collision calculations, like they did in this case.

It does not matter if the towed truck appeared to have a different trajectory than it actually had, because it was very obviously within collision range.

Do you have a reverse sensor in your car that beeps when you're close to stuff?

It was the self-driving car that drove into the tow truck. All its sensors must've been beeping, and it still decided to keep driving.

[–] [email protected] 13 points 7 months ago (1 children)

Honestly, I think only trial and error will let us get a proper autonomous car.

And I still think autonomous cars will save many more lives than they endanger once they become reliable.

But for now, this is bound to happen...

To be clear, they are still responsible for these cars and the safety of others. They didn't test properly.

They should be trying every edge case they can think about.

A large screen on the side of a truck? What if a car is displayed on it? Would the car's sensors notice the difference?

A farmer dropped a hay bale on the road? It got flattened by rain? Does the car understand that this might not be safe to drive on or to brake on?

There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

But even if you try everything there will be mistakes and fatalities.

[–] [email protected] 1 points 7 months ago (1 children)

There are hundreds of unique situations that they should be trying before an autonomous car gets even close to a public road.

Do you think "better than human drivers" is sufficient for deployment on public roads, or do you think the bar should be higher?

[–] [email protected] 2 points 7 months ago

Honestly, I'm pragmatic: if fewer people die in accidents involving autonomous cars, then yes.

The thing is, we shouldn't be trusting the manufacturers for these stats. They have to be reported by a government agency or something.

Similarly, autonomous car software should have to be certified by an independent organization before being deployed, and the same goes for updates to the software. Otherwise we would get deadly updates from time to time.

If we deploy and handle autonomous cars with the same safety approach as in aviation, I'm sure this transition can be done fairly safely.

[–] [email protected] 12 points 7 months ago

Just like Tesla! And people wonder why they are a hated company.

[–] [email protected] 12 points 7 months ago (3 children)

So their plan is to fix one accident at a time…

Well how else would you do it?

[–] [email protected] 28 points 7 months ago (3 children)

You drive a car and can't quite figure out what is happening in front of you.

Do you:

  • A: Turn up the music and plow right through.
  • B: Slow down (potentially to a full stop) and assess the situation.
  • C: Slow down, close your eyes, and continue driving slowly into the obstacle.
  • D: Sound the horn and flash the lights.

From the description offered in the article, the car chose C, which is wrong.

[–] [email protected] 15 points 7 months ago (1 children)

Given the millions of global road deaths annually I think B is probably the least popular answer.

[–] [email protected] 0 points 7 months ago

Honestly, slowing down too much can easily create an accident that didn't exist in the first place.

Not every situation can be handled by slowing down.

If that's the default behavior on a high-speed road, it could be deadly for the car behind you.

[–] HeyThisIsntTheYMCA 2 points 7 months ago

I mean that's machine learning for ya

[–] [email protected] 1 points 7 months ago (1 children)

I wasn't asking about the car's logic algorithm; we all know that the SDC made an error, since it [checks notes] hit another car. We already know it didn't do the correct thing. I was asking how else you think the developers should be working on the software other than one thing at a time. That seemed like a weird criticism.

[–] [email protected] 6 points 7 months ago

Sorry, I didn't answer your question. Consider the following instead:

Your self-driving car has crashed into a goddamn tow truck with a backwards-facing truck.

Do you:

  • A: Program your car to deal differently with fucking backwards-facing trucks on tow trucks.
  • B: Go back to question one and make your self-driving car pass a simple theory test.

According to the article, the company has chosen A, which is wrong.

[–] [email protected] 4 points 7 months ago

Radars > Don't hit stuff

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

Ideally they don't need actual accidents to find errors, but instead discover such issues in QA and automated testing. Not hitting anything sounds like a manageable goal, to be honest.

[–] [email protected] 5 points 7 months ago (1 children)

In this case it fixed two accidents at one time. But only because they were exactly the same.