this post was submitted on 17 Dec 2023
Tesla has to get more punishment than NOTHING. This sucks
Yeah, judging by the article, Tesla should take some responsibility here. Not that the driver should get off; if your car is blowing through a red light at 120 km/h, you're just not paying proper attention.
Sure, I'd prefer to know exactly how much time passed. Was it 2 seconds or 25? But my premise is that this shouldn't happen in the software at all. I recall reading some time ago that Teslas would shut off the software moments before a collision, leaving no time to react, but I'd have to double-check that. All to shift the blame onto the customer.
Automakers should not be allowed to use the unsuspecting public as test subjects for their experimental software; a car quickly becomes a 1-4 ton death machine. But I think we agree on that.
Oh yeah, I work in software development myself. No way I'd trust my life to something like Tesla's Autopilot, which is perpetually in beta, relies on just the camera feed, and is basically run by a manager with clear issues of over-promising and under-delivering (among other things). You can get away with shit like that for a website or mobile app, but these are people's lives.
Suing Tesla seems a little dumb to me. Sue the DMV that's giving people like this licenses
Auto manufacturers must be held liable for faulty software. If it's not safe, it does not go on the road
Only if the software caused the accident or prevented the driver from avoiding one. Here, the software's fault was not slowing down after exiting the highway (which, in my experience, must be a very specific situation, because it most certainly does normally); the driver could have disengaged Autopilot or applied the brakes to stop at the red light. The software explicitly states that it can't stop at red lights, and it alerts the driver when it's about to run one. The fault here is 100% the driver's.
Are manufacturers solely responsible for safety, or lack thereof, on the public roads of the USA?
I don't believe they are.
Solely? No. But if the airbag, seatbelt, or self-driving autopilot feature that they created contributed to someone's death, they are partially responsible and should face consequences or punishments. Especially if they market it as a safe feature.
https://en.wikipedia.org/wiki/Section_230
Your point being?
They are legally protected from any punishment beyond nothing. Nothing will ever hit them with more than NOTHING.
Then the law has failed, which was my point from the start.
Why would you try to belittle me by repeating my opinion? That doesn't make sense at all.
Because unless you plan on becoming a lobbyist, politician, or activist, nothing will change. Sitting around saying "they have to get in trouble in some manner" doesn't accomplish anything. If you want that to happen, since they are legally protected from what you want, go make a change.
I don't live in the US, but the first step toward change is getting mad and raising awareness; that's the change I can make.
Closer to home, I wholeheartedly support the strike against Tesla Sweden, for example.