this post was submitted on 14 Aug 2023
522 points (96.8% liked)

Technology


New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times

Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[–] [email protected] 174 points 1 year ago (2 children)

In fact, by the time the crash happens, it’s alerted the driver to pay more attention no less than 150 times over the course of about 45 minutes. Nevertheless, the system didn’t recognize a lack of engagement to the point that it shut down Autopilot

I blame the driver, but if the above is true there was a problem with the Tesla as well. The Tesla is intended to disengage and disable autopilot for the remainder of the drive after a small number of ignored alerts. If the car didn’t do that, there’s a bug in the Tesla software.

I think it’s more likely the driver used a trick to make the car think he was engaged when he was not. You can do things like wedge a water bottle in the steering wheel so the car thinks you've tugged on the wheel to prove you're engaged. (Don’t ask me how I know)

[–] NeoNachtwaechter 87 points 1 year ago (6 children)

The Tesla is intended to disengage and disable autopilot

What about: slow down, pull over to the right, stop the car, THEN disengage?

[–] [email protected] 40 points 1 year ago (1 children)

After 3 alerts, it's off until you park. There are visual cues that precede the alert, though, and these don't count. I don't recall how many there are or how long they last, but you start by seeing a message asking you to put your hands on the wheel, then a blue line at the top, then the line starts pulsing, then you get an audio alert, which is the first strike. Three strikes during the same drive and you need to park before using Autopilot again.
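For what it's worth, the escalation described above amounts to a small state machine. Here's a rough sketch; the cue names and the class itself are my own invention, not Tesla's actual code:

```python
# Hypothetical sketch of the three-strike escalation described above.
# Cue names and thresholds are illustrative, not Tesla's real implementation.

CUES = ["hands_on_wheel_message", "blue_line", "pulsing_line", "audio_alert"]

class AttentionMonitor:
    def __init__(self, max_strikes=3):
        self.max_strikes = max_strikes
        self.strikes = 0
        self.cue = 0            # index of the next cue to show
        self.locked_out = False # Autopilot unavailable until the car is parked

    def ignore_cue(self):
        """Driver ignored the current cue; escalate to the next one."""
        if self.locked_out:
            return "autopilot_unavailable"
        shown = CUES[self.cue]
        if shown == "audio_alert":
            # Only the audio alert counts as a strike; visual cues just escalate.
            self.strikes += 1
            self.cue = 0
            if self.strikes >= self.max_strikes:
                self.locked_out = True
                return "autopilot_disabled_until_park"
        else:
            self.cue += 1
        return shown

    def driver_input(self):
        """Steering-wheel torque detected: reset the visual escalation."""
        self.cue = 0

    def parked(self):
        """Parking clears the strikes and the lockout."""
        self.strikes = 0
        self.locked_out = False
```

The water-bottle trick elsewhere in this thread works precisely because `driver_input()` (torque on the wheel) resets the escalation before it ever reaches a strike.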

[–] meco03211 15 points 1 year ago (2 children)

And those alerts don't come if you've overridden the system by putting a weight on the wheel or something.

[–] SargTeaPot 12 points 1 year ago (1 children)
[–] Ado 12 points 1 year ago (1 children)

Balancing an orange on the steering wheel?

[–] daikiki 84 points 1 year ago (4 children)

I have a lot of trouble understanding how the NTSB (or whoever's ostensibly in charge of vetting tech like this) is allowing these not-quite self driving cars on the road. The technology doesn't seem mature enough to be safe yet, and as far as I can tell, nobody seems to have the authority or be willing to use that authority to make manufacturers step back until they can prove their systems can be integrated safely into traffic.

[–] SpaceNoodle 86 points 1 year ago

It's just ADAS - essentially fancy cruise control. There are a number of autonomous vehicle companies who are carefully and successfully developing real self-driving technology, and Tesla should be censured and forbidden from labeling their assistance software as "full self-driving." It's damaging the real industry.

[–] [email protected] 39 points 1 year ago (2 children)

That's similar to cruise control. Cruise control can be dangerous because someone could fall asleep (not having to manage your speed invites sleepiness) and the car wouldn't slow down.

In my opinion, with these options it's the driver's responsibility to know their own limits and understand that the tool is just a tool; you are responsible for making sure your driving is safe for others. Tesla Autopilot adds a ton of safety features that avoid a lot of collisions caused by lapses in attention, sleepiness, and other drivers' faults. But it's still just a tool, and the driver is responsible for their own car and driving.

[–] daikiki 31 points 1 year ago (5 children)

The difference is that cruise control will maintain your speed, but 'autopilot' may avoid or slow down for obstacles. Maybe it avoids obstacles 90% of the time or 99% of the time. It apparently avoids obstacles enough that people can get lulled into a false sense of security, but once in a while it slams into the back of a stationary vehicle at highway speed.

It's easy to say it's the driver's responsibility, and ultimately it is, of course, but in practice, a system that works almost all of the time but occasionally kills somebody is very dangerous indeed, and saying it's all the driver's fault isn't really realistic or fair.
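The "lulled into a false sense of security" point survives a quick back-of-the-envelope check: a per-encounter success rate that sounds impressive still all but guarantees failures at scale. The numbers below are illustrative, not measured Autopilot statistics:

```python
# Illustrative only: if a system handles each stopped-obstacle encounter
# with 99% reliability, the chance of at least one failure grows quickly.
p_success = 0.99
encounters = 1000  # e.g. over several years of highway driving

p_at_least_one_failure = 1 - p_success ** encounters
print(f"P(at least one failure) ≈ {p_at_least_one_failure:.5f}")
```

At 99% per-encounter reliability and 1000 encounters, at least one failure is a near-certainty, yet the driver has seen the system work 99% of the time.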

[–] abhibeckert 18 points 1 year ago* (last edited 1 year ago) (1 children)

A lot of modern cruise control systems will match the speed of the car in front of you and stop if they stop. They'll also keep the car in the current lane. And even without cruise control, most modern cars will stop if a pedestrian steps onto the road.

It's frustrating that Tesla's system can't detect a stationary police car in the middle of the road... but at the same time apparently that's quite a difficult thing to do and it's not unique to Tesla.

It's honestly not too much to ask a driver to step on the brakes if there's a cop car stopped on the road.

[–] [email protected] 17 points 1 year ago

The problem with Tesla is that their entire marketing is based on "Our cars drive themselves".

[–] Not_Alec_Baldwin 16 points 1 year ago

It's not "not-quite-self-driving" though, it's literal garbage. It's cruise control, lane assist and brake assist. The robot vision in use is horrible.

There are Tesla engineers badmouthing the system openly.

Musk is a scammer and they need to issue an apology for all of the claims around autopilot, probably pay a great deal of money, and then change the name and advertising around it.

Oh, and also this guy should never drive again.

[–] CaptainProton 76 points 1 year ago* (last edited 1 year ago) (12 children)

This is stupid. Teslas can park themselves, they're not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

That being said, the driver knew this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver's fault and they should be held responsible for their actions. It's not the court's job to legislate.

It's actually the NTSB's job to regulate car safety, so if they don't already have it, Congress needs to grant them the authority to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.

[–] [email protected] 10 points 1 year ago (2 children)

There's no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement and I've experienced it first hand.

[–] chris2112 9 points 1 year ago

The driver is responsible for this accident, but Tesla should still be liable IMO for all the shady and outright misleading advertising around their so-called "self driving". Compare Tesla's marketing to, say, GM's or Hyundai's, both of which essentially have feature parity with Tesla's system, and you'll see a big difference.

[–] zerbey 72 points 1 year ago (5 children)

That's 150 more warnings than a regular car would give; ultimately it's the driver's fault.

[–] [email protected] 10 points 1 year ago (10 children)

Tesla actively markets their cars as "the car drives itself".

[–] [email protected] 63 points 1 year ago (3 children)

So if the guy behind the wheel died and couldn't react to the alerts, the car can't make the decision to just stop instead of crashing into a police car?

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago) (2 children)

He was reacting to alerts, complying with them by simply touching the steering wheel. He did that 150 times during that 45-minute trip (not all of the trip was on Autopilot).

So if the guy had died, the car would have disengaged Autopilot (I'm not sure how this works).

You can check the video in the article. It's quite informative.

Edit

I saw another video and it takes ~60 seconds after taking off your hand from the steering wheel for the car to safely come to a full stop.

[–] [email protected] 11 points 1 year ago (1 children)

So the headline should be "drunk driver hits police car."

[–] [email protected] 53 points 1 year ago (5 children)

Driver is definitely the one ultimately at fault here, but how is it that the Tesla doesn't perform an emergency stop in this situation and instead just barrels into an obstacle?

Even my relatively 'dumb' car with adaptive cruise control handles this type of situation better than a Tesla?!

[–] daikiki 50 points 1 year ago (1 children)

Your relatively 'dumb' car probably doesn't try to gauge distance exclusively by interpreting visual data from cameras.

[–] [email protected] 20 points 1 year ago (4 children)

Wait, the Model X doesn’t have RADAR/LIDAR to supplement the cameras?

[–] [email protected] 25 points 1 year ago (2 children)

Nope. For whatever reason, Musk decided to just use cameras

[–] AnotherRyguy 10 points 1 year ago

"Whatever reason" is obviously just trying to cut corners and improve the bottom line with no regard for the consequences.

[–] [email protected] 13 points 1 year ago (3 children)

No, they were too expensive for Musk

[–] hark 40 points 1 year ago

Setting aside the driver issue, isn't this another case that could've been prevented with LIDAR?

[–] Md1501 37 points 1 year ago (3 children)

You know what might work: program the car so that after the second unanswered alert, the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, autopilot is disabled for that car for a period of time.
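That proposal maps to a tiny escalation table. A sketch of it (purely hypothetical policy, not how any shipping Autopilot build behaves):

```python
# Hypothetical escalation policy per the comment above; not real Tesla behavior.
def respond_to_unanswered_alert(count: int) -> str:
    """Return the action to take after the Nth consecutive unanswered alert."""
    if count <= 1:
        return "audible_alert"            # first warning: just alert again
    if count == 2:
        return "slow_down_hazards_on"     # second: reduce speed, hazards on, or pull over
    return "autopilot_disabled_cooldown"  # third and beyond: lock out for a while
```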

[–] [email protected] 15 points 1 year ago (10 children)

I drive a Ford Maverick that is equipped with adaptive cruise control, and if I were to get 3 "keep your hands on the wheel" notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn't do something similar.

[–] [email protected] 9 points 1 year ago (10 children)

This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

[–] n3cr0 30 points 1 year ago (2 children)

Poor ~~drunk~~ impaired driver falling victim to autonomous driving... Hopefully that driver lost their license.

[–] [email protected] 29 points 1 year ago (6 children)

Don't see how that's a Tesla problem... Drunk/high driver operating their car incorrectly.

[–] [email protected] 28 points 1 year ago* (last edited 1 year ago) (1 children)

Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.

Is the video slowed down? In the video, if you pause 2.5s before the crash, the stopped police car already seems very close. An (awake) human driver would've recognized the stopped police car from much farther away than that.
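Taking the quoted figures at face value, the implied closing speed is surprisingly low, and a full emergency brake initiated at detection could plausibly have stopped in time. A back-of-the-envelope check, assuming constant closing speed and roughly 0.8 g braking on dry pavement:

```python
# Back-of-the-envelope from the quoted figures: 37 yards, 2.5 seconds.
# Assumes constant closing speed and ~0.8 g braking on dry pavement.
YARD_M = 0.9144
distance_m = 37 * YARD_M          # ≈ 33.8 m
time_s = 2.5
speed_mps = distance_m / time_s   # ≈ 13.5 m/s ≈ 49 km/h

g = 9.81
decel = 0.8 * g                   # ≈ 7.85 m/s²
braking_distance_m = speed_mps ** 2 / (2 * decel)  # ≈ 11.7 m

print(f"closing speed ≈ {speed_mps:.1f} m/s, "
      f"braking distance ≈ {braking_distance_m:.1f} m "
      f"vs. {distance_m:.1f} m available")
```

A ~13.5 m/s closing speed is only about 30 mph, which does seem low for a highway crash, so the commenter's suspicion that the timing or distance figures don't quite add up looks reasonable.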

[–] [email protected] 12 points 1 year ago

I find that it can be hard to tell when a car ahead is stopped; maybe the vision system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I'm not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.

[–] Snapz 27 points 1 year ago

This source keeps pushing Tesla propaganda. There's always an angle trying to sell that it wasn't the Tesla's fault.

[–] EndOfLine 24 points 1 year ago* (last edited 1 year ago) (2 children)

Officers injured at the scene are blaming and suing Tesla over the incident.

...

And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

I hope those officers got one of those "you don't pay if we don't win" lawyers. The responsibility ultimately resides with the driver and I'm not seeing them getting any money from Tesla.

[–] [email protected] 18 points 1 year ago (5 children)

I'm not so sure that disengaging autopilot because the driver's hands were not on the wheel is the best option while on a highway. Engage hazard lights, remain in lane (or, if able, move to the slowest lane), and come to a stop. Surely that's the better way?

Just disengaging the autopilot seems like such a copout to me. Also, the fact that it disengaged right at the end, so that "the driver was in control at the moment of the crash", again feels like bad "self" driving. Especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

Also if you cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually bright emergency lights saturating the image it was trying to analyse) it's again a sign you shouldn't be releasing this to the public. It's clearly just not ready.

Not taking any responsibility away from the human driver here. I just don't think the behaviour was good enough for software controlling a car used by the public.

Not to mention, of course, the reason for suing Tesla isn't because they think they're more liable. It's because they can actually get some money from them.

[–] liontigerwings 15 points 1 year ago

Hard to argue Tesla is at fault when the driver was clearly impaired and at fault here.

[–] BroederJakob 12 points 1 year ago

It's also so misleading that Tesla uses the word Autopilot for what is basically adaptive cruise control and lane assist.
