this post was submitted on 22 Nov 2023
399 points (98.1% liked)

Technology


Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective::Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami

all 33 comments
[–] [email protected] 54 points 11 months ago (4 children)

It’s asinine that Tesla is trying to do full self driving without actually using some sort of LiDAR. Using video/photos to judge distance is just unreliable and stupid.

[–] xanu 33 points 11 months ago (1 children)

but it's MUCH cheaper, so, in keeping with every other shitty idea he's ever had, Musk was REALLY banking on Tesla engineers making a crazy breakthrough so he could reap billions in reward.

It worked at SpaceX because of a perfect concoction: all the best rocket scientists and engineers wanted to work there, since it was one of the only space programs not owned by a government and could push the boundaries; the technology was possible and wildly practical to implement; and there were massive government subsidies.

Tesla is in the car market, which is notoriously competitive. While they do have massive government subsidies, they don't have the best engineers, and Musk's insistence that they "figure out" how to shove autonomous driving into a medium that simply doesn't provide enough information drives even the better engineers away.

I really wish my government would stop funding his ego and let his fantasy projects die already.

[–] [email protected] 9 points 11 months ago (2 children)

The tech has gotten so cheap now that there is no reason to skimp out on it.

[–] [email protected] 10 points 11 months ago (1 children)

Oh, there definitely is: marginally higher profits at the cost of public safety.

A tale as old as capitalism: short-term profit first, who gives a shit about later.

[–] topinambour_rex 1 points 11 months ago

And Tesla is the car company with the highest profit per car sold.

[–] topinambour_rex 7 points 11 months ago

We're talking about a company that got rid of the ultrasonic sensors used to make the car reverse park.

[–] [email protected] 7 points 11 months ago* (last edited 11 months ago) (2 children)

With two offset cameras, depth is reliable, especially when pairing a wide-angle and a narrow-angle lens. This is what OpenPilot does with the Comma 3 (FOSS self-driving).

Radar is better, but some automotive radar seems to only be great at short ranges (from my experience with my fork of OP in combination with radar built into a vehicle).
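The geometry behind the two-offset-camera claim is simple triangulation: depth is focal length times baseline over disparity. A minimal sketch (all numbers made up for illustration, not Comma 3 specs):

```python
# Minimal sketch of stereo depth: with two cameras a known baseline apart,
# depth falls out of the pixel disparity between the two views:
#   depth = focal_length_px * baseline_m / disparity_px

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (meters) of a point matched in both camera images."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or mismatched feature")
    return focal_length_px * baseline_m / disparity_px

# A feature 50 px apart between the views, with a 10 cm baseline
# and a ~910 px focal length, sits about 1.8 m away.
print(stereo_depth(910.0, 0.10, 50.0))  # 1.82
```

Note the flip side: disparity shrinks with distance, so depth error grows rapidly for far objects, which is part of why radar or lidar still helps at range.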

[–] MeanEYE 6 points 11 months ago (1 children)

depth is reliable

No it's not. The world is filled with optical illusions that even our powerful brains can't process, and yet you expect two webcams to handle them. And depth is not the only thing that's needed when it comes to autonomous driving; distance is an absolute factor. Case in point: at least two bikers were killed because their motorcycle had two tail lights instead of one, and the Tesla thought it was a car far away instead of a motorcycle close by. It ran them over as if they were not there. We as humans would see that rider and realize it's a motorcycle, first because of the sound and second because our brain is better at reasoning, and we'd avoid the situation. This is why cars MUST have more sensors: the processing is lacking so much.
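The far-car-vs-near-motorcycle confusion described above is a textbook scale/distance ambiguity: a single camera only measures angular separation (roughly width over distance), so two geometries can produce identical images. A toy illustration with made-up dimensions:

```python
import math

# A camera measures the angle between two tail lights, not their distance.
# A narrow pair of lights close by can subtend exactly the same angle as
# a wide pair far away. Dimensions below are illustrative guesses.

def angular_separation_rad(width_m: float, distance_m: float) -> float:
    # atan2(width, distance); for small angles this is ~ width / distance
    return math.atan2(width_m, distance_m)

motorcycle = angular_separation_rad(0.25, 15.0)   # lights 25 cm apart, 15 m away
distant_car = angular_separation_rad(1.50, 90.0)  # lights 1.5 m apart, 90 m away

print(motorcycle, distant_car)  # identical (~0.0167 rad): same image, very different hazard
```

Stereo or radar breaks the tie by measuring distance directly, which is the commenter's point about needing more sensors.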

[–] [email protected] 1 points 11 months ago

Here is an alternative Piped link(s):

Case in point

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] [email protected] 7 points 11 months ago

Sitting in a Tesla and watching it try to understand anything other than highway driving is so unnerving. It gets so much wrong about other cars' direction of travel that it's not too shocking one occasionally is plowed into, or plows into someone else.

[–] topinambour_rex 0 points 11 months ago (2 children)

Using video/photos to judge distance is just unreliable and stupid.

It all depends on how powerful the computer managing the data is. A human brain does the job, for example.

[–] [email protected] 15 points 11 months ago* (last edited 11 months ago) (1 children)

I thought the whole point was to overcome human shortcomings, not just make a worse version of a human driver. Humans don't even rely purely on visual cues.

[–] [email protected] 43 points 11 months ago* (last edited 11 months ago) (1 children)

Tesla FSD... Coming ~~2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023,~~ 2024

At this rate Ferrari will win another championship before FSD comes out

[–] MotoAsh 22 points 11 months ago (1 children)

The sun will explode before Tesla succeeds in making full self-driving work with only basic cameras.

[–] aaaantoine 3 points 11 months ago (2 children)

Are there at least two front facing cameras for depth perception?

[–] MeanEYE 3 points 11 months ago

Which means pretty much nothing. Perception is just perception, not reliable absolute data.

[–] [email protected] 1 points 11 months ago (1 children)

I think they quietly reversed that decision and cars now have lidar

[–] MeanEYE 1 points 11 months ago

Tesla doesn't have lidar. Just cameras, and they removed the radar. So it's blind right now.

[–] [email protected] 9 points 11 months ago (1 children)

This is the best summary I could come up with:


A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner anyway, according to a recent ruling issued in Florida.

The lawsuit, brought by Banner’s wife, accuses the company of intentional misconduct and gross negligence, which could expose Tesla to punitive damages.

The ruling comes after Tesla won two product liability lawsuits in California earlier this year focused on alleged defects in its Autopilot system.

“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.

Bryant Walker Smith, a University of South Carolina law professor, told Reuters that the judge’s summary of the evidence was significant because it suggests “alarming inconsistencies” between what Tesla knew internally, and what it was saying in its marketing.

“This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Smith said.


The original article contains 462 words, the summary contains 195 words. Saved 58%. I'm a bot and I'm open source!

[–] NotMyOldRedditName -4 points 11 months ago* (last edited 11 months ago) (2 children)

“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.

If that's what this centers on, I don't think this is necessarily a correct ruling.

That semi was moving fairly slowly as it crossed the road, as any semi would from a stop.

Radar does not detect stationary objects at high speeds, and this slow-moving cross-traffic vehicle could look like one. I imagine there's some limit where cross traffic moving very slowly appears, for all intents and purposes, stationary, since it fills the entire roadway horizontally and not just a portion of it.

The car explicitly warns you that Radar won't detect stationary objects at high speeds. Other manufacturers explicitly warn about this very same problem as well.

It'll be interesting to see what happens with this case, but if that's what it hinges on, IMO, it doesn't look good for the plaintiff.
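The "radar won't detect stationary objects" behavior the comment describes comes from Doppler-based clutter rejection: the radar measures range rate, and trackers commonly drop targets whose ground-frame radial speed is near zero to avoid braking for bridges and signs. A toy sketch (threshold and speeds are illustrative assumptions, not any vendor's actual values):

```python
import math

# Why a slow crossing vehicle can look "stationary" to automotive radar.
# The radar measures range rate (Doppler). Subtracting ego speed gives the
# target's own radial speed; if that is near zero, many trackers treat the
# return as fixed clutter (bridges, signs, parked cars) and ignore it.

EGO_SPEED = 30.0             # m/s, ~108 km/h down the highway (assumed)
STATIONARY_THRESHOLD = 1.0   # m/s, assumed clutter-rejection cutoff

def looks_stationary(target_speed: float, crossing_angle_deg: float) -> bool:
    # Component of the target's own motion along the radar boresight.
    radial = target_speed * math.cos(math.radians(crossing_angle_deg))
    measured_range_rate = -(EGO_SPEED + radial)        # closing rate is negative
    ground_radial_speed = measured_range_rate + EGO_SPEED
    return abs(ground_radial_speed) < STATIONARY_THRESHOLD

# A semi creeping across the road at 2 m/s, nearly perpendicular (85 deg),
# is indistinguishable from a fixed obstacle by Doppler alone...
print(looks_stationary(2.0, 85.0))   # True
# ...while lead traffic moving along the lane (0 deg) is clearly dynamic.
print(looks_stationary(20.0, 0.0))   # False
```

That's the ambiguity at the heart of the warning: to Doppler alone, a crossing trailer and a stationary overpass produce the same signature.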

[–] [email protected] 6 points 11 months ago (1 children)

Doesn't Tesla only use cameras and image processing though? As in no radar at all?

[–] NotMyOldRedditName 3 points 11 months ago* (last edited 11 months ago)

This was in 2019, when radar was still more primary than vision; they weren't vision-only yet.

[–] [email protected] 3 points 11 months ago (1 children)

Here’s the basis of the finding:

Palm Beach county circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.

Judge Scott also found that the plaintiff, Banner’s wife, should be able to argue to jurors that Tesla’s warnings in its manuals and “clickwrap” were inadequate. He said the accident is “eerily similar” to a 2016 fatal crash involving Joshua Brown in which the Autopilot system failed to detect crossing trucks.

The bot that parses the articles creates a worse summary than you’d get by just reading random sentences.

In any case, we should note that this finding was reached after the recent media disclosures that Musk and Tesla deliberately created a false impression of the reliability of their Autopilot capabilities. They were also deceptive about the capabilities of vehicles like the Cybertruck and their semi, as well as things like range estimation, which might show a pattern of deliberate deception, demonstrating that it is a Tesla company practice across product lines. The clickwrap defense, compared to what the CEO says on stage at massively publicized announcements, sounds to me a bit like Trump's defense that he signed his financial statements but noted that by doing so he wasn't actually confirming anything, and the people who believed him are the ones to blame.

Given his groundless lawsuit against media matters and his threats against the ADL, I think Elon might have started circling the drain.

[–] NotMyOldRedditName 1 points 11 months ago* (last edited 11 months ago)

Ah, that makes a lot more sense. Shouldn't trust the bot.

I'd only add that the "clickwrap" is actually a well-laid-out screen with an infographic showing the problem and a few lines of text.

They'd be hard pressed to say that warning was difficult or hard for anyone to read or understand.

Somewhat tangential, but like GDPR, where privacy explanations need to be short, concise, and easy to understand, I'd say the clickwrap screen was more than adequate and would exceed those standards.

But as you point out, that's only a part of it.

Edit: trying to find an image of it for reference, but my GoogleFu is failing me :(

Edit: to further clarify, I'm only talking about the radar and stopped vehicles in terms above. The whole agreement I think was larger in some places, but my memory is a little foggy on that without images.