this post was submitted on 28 Aug 2023
435 points (97.8% liked)
Technology
First of all, what is it that you consider safe? I'm sure you realize that a 100% safety rating is pure fantasy, so what is the acceptable rate of accidents for you?
Secondly, would you mind sharing the data "that's piling up is showing that Autopilot is deadly"? Reports of individual incidents are not what I'm asking for because, as I stated above, you're not going to get 100% safety, so there are always going to be individual incidents to talk about.
You also seem to be talking about the FSD beta and Autopilot interchangeably, though they're different things. Hope you realize this.
There are very strict regulations around what is allowed on the streets and what isn't. This is what protects us from sloppy companies releasing unsafe stuff onto the streets.
Driver assist features like Autopilot are operating in a regulatory grey zone. The regulation has not caught up with the technology, and this allows companies like Tesla to release unsafe software onto the streets, killing people.
Exactly. Driver assist features. These aren't something to be blindly relied on; everyone knows this, and the vehicle will remind you. Every crash is the fault of the driver, not the system.
Now, if you don't mind, show me the data that's "piling up showing that Autopilot is deadly".
Except Tesla isn't selling them as such. Their advertisement videos as early as 2016 say "the driver is not necessary, the car is driving itself". This is false marketing in its purest and simplest form: https://www.theguardian.com/technology/2023/jan/17/tesla-self-driving-video-staged-testimony-senior-engineer
I'm still waiting for the data that you said is piling up. You also did not specify what number of accidents you find acceptable for a self-driving system. It's almost like you're trying to evade my questions…
Give me a break. The WaPo article is linked above. Also, when it comes to safety, the burden of proof is on those arguing that something is safe.
If there are piles of data, it shouldn't be difficult to prove it's unsafe.
You still haven't even specified what you consider safe.
Stop talking nonsense. The anecdotes that are piling up clearly indicate there is a problem with Tesla's autopilot.
It's in the WaPo article you were already linked; you just keep choosing to ignore it. Another user on my blocked list.
Do you think Tesla would get sued if the data wasn't piling up?