this post was submitted on 07 Jul 2024
238 points (99.6% liked)

News

23307 readers
3722 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only; accusing another user of being a bot or paid actor also counts as bad faith. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (url) that is as reliable and unbiased as possible and must only contain one link.


Obvious right- or left-wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130. If you feel any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the article used as source.


Posts whose titles don't match the source won't be removed, but the autoMod will notify you; if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, and we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the autoMod will leave a message. Please remove your post if the autoMod is correct. If the post that matches your post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The autoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago
MODERATORS
top 32 comments
[–] [email protected] 104 points 4 months ago* (last edited 4 months ago) (2 children)

The company is responsible. Waymo should get the citation. If there were a live driver, the driver would get the citation. If companies want to go down the route of AI, then whoever owns the AI or is responsible for training it should be responsible for its actions.

[–] FlowVoid 54 points 4 months ago (2 children)

Arizona law does allow officers to give out tickets when a robotaxi commits a traffic violation while driving autonomously; however, officers have to give them to the company that owns the vehicle. Doing so is “not feasible,” according to a Phoenix police spokesperson.

[–] [email protected] 49 points 4 months ago (2 children)

I'm not sure why the police say it's "not feasible" to issue Google a citation. Google is the registered owner of the vehicles and is thus responsible for any actions they perform; just mail them a ticket?

[–] FlowVoid 29 points 4 months ago* (last edited 4 months ago) (2 children)

I'm just speculating, but there is probably a very efficient workflow for sending a ticket to an individual (given the number of tickets police write and the revenue they generate), and I wouldn't be surprised if the workflow doesn't accommodate an AI operated vehicle. Kind of like how a restaurant would need to restructure its workflow to accommodate DoorDash.

In other words, "infeasible" might actually mean "would take extra effort".

[–] [email protected] 7 points 4 months ago

Yeah, they probably just use a 20-year-old, out-of-date system (like any government agency that respects itself) that doesn't take into account that maybe a car doesn't have a driver.

[–] SlopppyEngineer 3 points 4 months ago (3 children)

I thought the laws in the USA prevented this. It's why you have manned speed traps: citations must be handed personally to the driver. Other countries have automated speed check systems and send the ticket to the owner of the car, which can be a leasing company, for example.

[–] [email protected] 11 points 4 months ago

How about you tape/glue copies of the ticket over the lenses of any exposed cameras and let Google figure out the logistics of how to pay it?

[–] FlowVoid 5 points 4 months ago* (last edited 4 months ago)

citations must be handed over personally to the driver

In Arizona, the operator of an AI vehicle must submit a law enforcement interaction plan that specifies how they will be ticketed.

However, it's quite possible that actually following the plan is a pain in the butt for traffic cops, and they simply don't want to put in the effort.

[–] Spiralvortexisalie 2 points 4 months ago (1 children)

Generally in the United States you have an opportunity to cross-examine all evidence. These cameras are not calibrated regularly and are generally not kept up (arguably they are so low-budget that they get no upkeep), so they become inadmissible when you challenge them. Many people win such challenges because the camera was last calibrated and cleaned when it was installed.

[–] SlopppyEngineer 1 points 4 months ago (1 children)

We have that opportunity too. You can opt not to accept the proposed (automated) settlement and challenge the citation itself. People have done that and won. However, the administrative fees for that are often higher than the proposed settlement, so it's only worth it in special cases.

[–] Spiralvortexisalie 1 points 4 months ago

Can’t speak to other countries, but that generally offends American courts; it comes off as retaliatory for exercising your American rights and has been struck down numerous times in various venues. One of the most sacred rights in America is to be heard and reheard in front of a court of competent jurisdiction; we all have our day in court.

[–] [email protected] 2 points 4 months ago (1 children)

I think the issue is that there's no specific person or driving license attached, but that's pure speculation on my part.

[–] [email protected] 18 points 4 months ago (1 children)

Then the vehicle is being operated unlicensed. Impound it, suspend the registration, and fine the owner. That's how it works for everyone else.

[–] [email protected] 6 points 4 months ago

I'd love to see that happen

[–] [email protected] 12 points 4 months ago (1 children)

How is it not feasible? Companies have addresses and records of employees. I know you're just quoting, but something doesn't sound right. Then again, we are talking about Phoenix police, so that could explain it.

[–] Regrettable_incident 9 points 4 months ago

Maybe it means they can't be bothered to try something different.

[–] FlyingSquid 35 points 4 months ago

Corporations are people until a crime is committed, at which point you can't punish a corporation for a crime a person commits.

I don't understand it, but apparently that's how it works.

[–] [email protected] 40 points 4 months ago (1 children)

If a human did this, they would at least get a ticket with a fine, and have the violation recorded on their license which would be revoked if it kept happening. With the computer controlled car, the cop called customer support and was like, "hey you might want to look into it or something."

I guess we can't expect the people hired to protect capital to act against capital, but it's still a bit disturbing.

[–] [email protected] 3 points 4 months ago (1 children)

If they even get an answer from support.

[–] [email protected] 1 points 4 months ago

In the video they answered right away?

[–] [email protected] 27 points 4 months ago (2 children)

AI can't recognize a pattern it wasn't trained on? Imagine that...

[–] [email protected] 45 points 4 months ago (4 children)

I used to work on the software for these cars, so I can speak to this a little. For what it's worth, I'm no longer with the project, so I have no reason to be sucking Google's dick if these weren't my honest opinions on the tech being used here. None of this is to excuse or defend Google, just sharing my insight on how these cars operate based on my experiences with them.

Waymo's cars actually do a really good job at self-navigation. Like, sometimes it's scary how good they actually are when you see the conditions they can operate under. There are so many layers of redundancy that you could lose all of the camera feeds, GPS, and cellular data, and they'll still be able to navigate themselves through traffic using the LIDAR sensors. Hell, even if you removed the LIDAR from that scenario, those cars accurately know their location based on the last known location combined with how many times each tire has revolved (though the car would just run into everything along the way; at least it'd know where it's located the entire time). All of the other sensors and data points collected by the cars actually end up making GPS the least accurate sensor on the car.
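The wheel-revolution trick described above is standard dead reckoning. As a toy illustration only (hypothetical tire size, constant heading, nothing like a real SDC implementation):

```python
import math

WHEEL_CIRCUMFERENCE_M = 2.0  # hypothetical tire circumference, in meters


def dead_reckon(x, y, heading_rad, wheel_revs):
    """Advance a last-known (x, y) position by the distance inferred
    from counted wheel revolutions, assuming the heading is constant."""
    distance = wheel_revs * WHEEL_CIRCUMFERENCE_M
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))
```

A real system fuses this with steering angle and an IMU, since wheel counts alone drift quickly; the sketch only shows why position survives losing GPS.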

That said, the article mentions that it was due to "inconsistent construction signage", which I'd assume to be pretty accurate from my own experience with these cars. Waymo's cars are usually really good at detecting cone placements and determining where traffic is being rerouted to. But... that's generally only when the cones are where they're supposed to be. I've seen enough roadwork in Phoenix to know that sometimes Mad Max rules get applied, and even I wouldn't know how to drive through some of those work zones. It was pretty rare that I'd have to remotely take over an SDC, but 9/10 times I did it was because of construction signs/equipment being in weird places and I'd have to K-turn the car back where it came from.

That's not to say that construction consistently causes the cars to get stuck, but I'd say it was one of the more common pain points. In theory, if somebody were to run over a cone and nobody picks it back up, an SDC might not interpret that obstruction properly and can make a dumb decision like going down the wrong lane, under the incorrect assumption that traffic has been temporarily rerouted that way. It sounds scary, and probably looks scary as hell if you saw it on the street, but even then it's going to stop itself before coming anywhere near an oncoming car, even if it thinks it has right of way, since proximity to other objects takes priority over temporary signage.

The "driving through a red light" part I'm assuming might actually be inaccurate. Cops do lie, after all. I 100% believe in a Waymo car going down the opposing lane after some sketchy road cones, but I have a hard time buying that it ran a red light, since they will not go if they don't detect a green light. Passing through an intersection requires a positive detection of a green light; positive or negative detection of red won't matter, it has to see a green light for its intended lane or it will assume it has to stop at the line.

In the video, the cop says he turns on his lights and the SDC blows through a red light. While I was working there, red light violations were so rare that literally 100% of the red light violations we received were while a human was driving the car in manual mode. What I'd assume was likely going on is that the SDC was already in a state of "owning" the intersection for an unprotected left turn when the lights came on. When an SDC thinks it's being pulled over, it's going to go through its "pullover" process, which first requires exiting an intersection if currently in one. So what likely ended up happening is the SDC was already in the intersection preparing for a left turn, the light turns red while the SDC is in the box (and still legally has right of way to the intersection), cop turns on the sirens, SDC proceeds "forward" through the intersection until it's able to pull over.
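The behavior described here (clear the intersection first when pulled over, and require a positive green detection to proceed) amounts to a simple priority check. This is purely my illustration of that description, with invented names, not real Waymo code:

```python
def next_action(being_pulled_over, in_intersection, green_light_detected):
    """Hypothetical sketch of the described priority order:
    pullover handling first, then the positive-detection light rule."""
    if being_pulled_over:
        # Exiting an intersection takes priority over stopping for police.
        return "clear_intersection" if in_intersection else "pull_over"
    if not green_light_detected:
        # No confirmed green means stop at the line, even if no red is seen.
        return "stop_at_line"
    return "proceed"
```

Note how the first branch reproduces the scenario in the comment: lights come on while the car owns the intersection, so it proceeds through before pulling over.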

But, that's just my speculation based on my somewhat outdated understanding of the software behind these cars. I'd love to see the video of it, but I doubt Waymo will release it unless there's a lawsuit.

[–] [email protected] 12 points 4 months ago

This is pretty interesting to read, thanks! I would think that Waymo employs an abundance of visual sensors that could give us an idea of what happened, if they chose to do so. Construction zones can be hard; maybe they need to own this one?

[–] [email protected] 12 points 4 months ago

The red light bit seems spot on. In every article stating “it blew through a red light” there’s always the caveat that it’s just trying to clear the intersection while getting pulled over. Technically people are allowed to do that (and/or move to a safer area, such as getting into the right lane when being pulled over in the left lane).

I think media like to add the intersection stuff to rile people up.

[–] bitwaba 3 points 4 months ago

If you listen to the video of the interaction with the police officer and the two Waymo guys, it's clear to me he's not making anything up about the events that took place. The car did run through the intersection when he turned on the lights. He's not trying to issue tickets or anything; he really is interacting with the Waymo people to let them know "your car was behaving erratically, it needs to be off the road." It's very possible the road-construction uncertainty, plus being in an oncoming traffic lane, plus being lit up by the police triggered some very specific failure of process in the code.

[–] doubletwist 2 points 4 months ago (1 children)

So I've been in situations where I was stopped at a red light, and emergency vehicles were coming and I was waved by a policeman to cross the intersection against the red light to clear the way.

So what, is a self driving car going to just sit there and keep the intersection blocked?

[–] [email protected] 3 points 4 months ago* (last edited 4 months ago)

(I'm assuming we're talking about unprotected left turns.)

I don't know if I ever saw it happen, myself, so I can't say for certain. My understanding of the SDC's logic is that if it was already in the intersection, it would complete the turn, and then pull off to the right shoulder to let the emergency vehicle pass. If it hasn't yet entered the intersection and detects siren lights behind it, I believe it will turn on the hazard lights and remain stationary unless honked at (I could be mistaken, but I think it'll recognize being honked at by emergency vehicles, and will assume it to mean "move forward and clear a path"). The SDCs have an array of microphones around the car to detect honks, sirens, nearby crashes, etc, and can tell the direction the sounds are coming from for this purpose.

That said, because it's listening for sirens, the SDC will usually be aware that there's an emergency vehicle heading toward it well ahead of time, and if they've got their lights on, the SDC will usually be able to determine which vehicle, specifically, is the emergency vehicle, so it can monitor its trajectory and make sure it's staying out of the way when possible. Typically, they will be proactive about steering clear of anything with lights/sirens running.

This would also be considered a higher-priority event, and usually it will automatically ping a live human to remotely monitor the situation; depending on the specific context, they may either command the SDC to remain stationary, proceed forward, make a U-turn, or whatever else may be necessary. In case the emergency vehicle has a loudspeaker, we'd be able to hear any requests they're making of us as well.
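Putting the behaviors described above together (complete the turn if already in the intersection, move forward on a honk, otherwise hold with hazards, and always loop in a remote human), a hypothetical reconstruction with made-up action names could look like:

```python
def respond_to_siren(in_intersection, honked_at):
    """Hypothetical sketch of the described response when a siren is
    detected directly behind the car. Action names are invented."""
    # Higher-priority event: a remote human is always pinged to monitor.
    actions = ["ping_remote_operator"]
    if in_intersection:
        # Finish the maneuver, then yield to the emergency vehicle.
        actions += ["complete_turn", "pull_to_right_shoulder"]
    elif honked_at:
        # A honk from the emergency vehicle means "clear a path".
        actions.append("move_forward_to_clear_path")
    else:
        actions.append("hazards_and_hold")
    return actions
```

This is a speculative rendering of the comment, not Waymo's logic; the real system reportedly also tracks the emergency vehicle's trajectory via the microphone array before it ever gets this close.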

For what it's worth, I know that Waymo also works pretty closely with the Phoenix PD and provides them with updates about any significant changes to the cars' behaviors, plus tips and tricks for dealing with a stuck car in an emergency, so if a situation got particularly sticky, the cops would know how to work around it. My understanding is that Phoenix PD has generally been very cooperative, though they've apparently had issues with state troopers, who don't seem to care to learn how to deal with the cars.

[–] [email protected] 13 points 4 months ago* (last edited 4 months ago) (1 children)

It did recognize the patterns, but the construction signs were (allegedly) inconsistent.

Also, not that your comment was alluding otherwise, but self-driving cars only use AI for recognition. The decision-making is handled by deterministic algorithms.

[–] [email protected] 8 points 4 months ago

Don't confuse me with facts.

[–] [email protected] 10 points 4 months ago* (last edited 4 months ago)

Waymo told multiple outlets that the vehicle drove into the oncoming lane because of “inconsistent construction signage,” and that it “was blocked from navigating back into the correct lane.” The company said the car drove away from the cop “in an effort to clear the intersection” before pulling into the parking lot where the traffic stop took place

...

Waymo didn’t immediately respond to The Verge’s request for comment. The company told Fox 10 Phoenix that its cars “are three-and-a-half times more likely” to avoid a crash than a human being.

I actually rode in a Waymo yesterday. They're quite cool, and as much as I hate the car-centric society in the West, I hope they catch on.

[–] JeeBaiChow 3 points 4 months ago

Isn't a company at least responsible for the safe operation and training of human drivers? Wouldn't it be the same for the training of self driving cars?

[–] benjihm 2 points 4 months ago

We're still so far away from this technology being viable for everyday use, aren't we?