this post was submitted on 24 Aug 2023
417 points (89.6% liked)

Technology

[–] [email protected] 95 points 1 year ago (9 children)

Imagine naming a feature "Full Self-Driving," and yet you can't take your attention away from the road and must be ready to take over at a moment's notice.

[–] [email protected] 57 points 1 year ago (1 children)

This is par for the course for Elon's entire career. He loves claiming success and taking credit for things he either didn't accomplish himself or hasn't accomplished yet.

[–] [email protected] 14 points 1 year ago (2 children)

I remember reading a post claiming that Tesla's safety rating was granted because a bunch of their crashes were determined to be human error, since the self-driving feature would automatically disengage if it faced a crash it couldn't avoid.

[–] MowFord 12 points 1 year ago (2 children)

Fairly certain the statistic requires FSD to have been disabled for 10 seconds before the crash, or it's counted as human-caused.

[–] [email protected] 7 points 1 year ago

Correct. It's documented.

[–] [email protected] 1 points 1 year ago

Maybe it does now after Musk tried to find a loophole.

[–] T156 6 points 1 year ago

The issue is a bit muddied by the fact that hitting the brake or the accelerator will deactivate it, and people will usually hit one of those if they believe that they are going to crash.

[–] [email protected] 12 points 1 year ago (1 children)

It's ok, it's in beta, so some features may not be complete just yet, but hey, let's just release this to the public anyways.

[–] [email protected] 5 points 1 year ago

And charge a shit load for it

[–] UsernameIsTooLon 6 points 1 year ago (1 children)

I feel like even with fully autonomous cars, there's going to be laws about how the main driver should always be alerted. This would be the case unless our cars are their own independent drivers like a cab.

[–] [email protected] 11 points 1 year ago

Honestly, there should be laws against full self-driving modes unless they can be proven good enough to require no driver intervention at all, with the manufacturer legally considered the driver in case of an incident.

Requiring a driver to be alert and attentive to the road while not doing anything to operate the car runs contrary to human psychology. People cannot be expected to maintain focus on the road for extended periods while the car drives itself.

I don’t know exactly where the line should be drawn between basic cruise control and full self driving, but either the driver should be kept actively involved in driving or the car manufacturer should be held liable for whatever the car does.

[–] helmet91 5 points 1 year ago (1 children)

It's just a driving assistant, like in any other car. As far as I know, Mercedes is currently the only one who has implemented autonomous driving, and even that is limited to certain specific areas. But at least that one is real. So much so that, legally, Mercedes (the company) is considered the driver of such cars in case anything happens on the road.

[–] practisevoodoo 5 points 1 year ago (1 children)

Depends on your definition of autonomous driving, which mainly depends on your ODD, but they're not the only ones. Honda, Volvo, and GM have something. Others (e.g. BMW) have offerings coming next year, but they're all going with more accurate names: CoPilot, Pilot Assist, Super Cruise, Traffic Jam Pilot. That makes it clear these are drive assists, not drive replacements.

[–] froh42 2 points 1 year ago

Mercedes has Level 3 autonomy in certain highway situations, so you are legally allowed to watch a video or read a newspaper. You just need to be able to take over again within 20 seconds or so.

Others are following close behind; I think Audi had to postpone Level 3 a bit, and BMW has something in the pipeline as well.

But these are really more than drive assists. I really find the "Level n" specifications more helpful than "drive assist" vs "autonomy"

None of the other brands oversell what they are offering.
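For readers unfamiliar with the "Level n" terminology used above, here is a paraphrased summary of the SAE J3016 driving-automation levels; the SAE standard itself is the authoritative source for the exact definitions:

```python
# Paraphrased summary of the SAE J3016 driving-automation levels
# referenced in the thread (see the SAE standard for exact wording).
sae_levels = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed control (e.g. adaptive cruise)",
    2: "Partial automation: steering AND speed, driver must supervise constantly",
    3: "Conditional automation: system drives within limits, driver must take over on request",
    4: "High automation: no driver fallback needed inside the design domain",
    5: "Full automation: drives anywhere a human could, no driver needed",
}

# The Mercedes highway system discussed above operates at Level 3;
# Tesla's FSD Beta is regulated as a Level 2 driver-assist system.
for level, meaning in sae_levels.items():
    print(f"Level {level}: {meaning}")
```

The key jump is from Level 2 to Level 3: at Level 3 and above, the system (and legally, in Mercedes' case, the manufacturer) is responsible while engaged, which is what the comments above are distinguishing from "drive assist."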

[–] asdfasdfasdf 3 points 1 year ago (2 children)

Are there any truly autonomous machines which don't require a human to monitor?

[–] [email protected] 12 points 1 year ago (2 children)

Lots. Toasters, refrigerators, robot vacuums, thermostats, smart home lights, etc.

The reason self-driving cars are extra tricky is that the task is much more complex and the negative consequences are sky-high. If a robot vacuum screws up, it's not a big deal. That's why it's totally irresponsible to advertise something as having "full" autonomy when the stakes are so high.

[–] asdfasdfasdf 3 points 1 year ago

Those are mostly automatic, not autonomous.

[–] [email protected] 0 points 1 year ago

It wouldn't be such a problem if there weren't so many monkeys driving around, and if cars talked to each other.

[–] [email protected] -1 points 1 year ago* (last edited 1 year ago) (1 children)

Yeah, there's a small delivery car in my country that drives the streets fully autonomously. It is used to deliver groceries to a distribution point.

It was kind of surreal to see it drive past, since the car has a sort of cockpit, but it is too narrow to seat any human.

It is currently limited to 25 kph, and someone supervises it remotely at all times and can intervene. Just to be on the safe side. Although that rarely happens.

The main reason it can do this is because it always drives the same route.

https://press.colruytgroup.com/collectgo-tests-unmanned-vehicle-in-londerzeel#

[–] [email protected] 0 points 1 year ago (2 children)

You're absolutely right, it can be quite misleading to name a feature "Full Self-Driving" when it still requires constant attention and intervention from the driver. The expectations set by such a name may not align with the reality of the technology's current limitations.

[–] Mr_Dr_Oink 8 points 1 year ago* (last edited 1 year ago) (1 children)

What is going on?

One or both are bots?

[–] [email protected] 1 points 1 year ago

Booze Fan’s comment totally sounds like a bot

… Bender?

[–] [email protected] 2 points 1 year ago

Some might even call it fraud.

[–] [email protected] -5 points 1 year ago (2 children)

Let's be fair. It could be called Driver Assistant Plus and you people would still be complaining because this isn't about Tesla

[–] drdabbles 1 points 1 year ago (1 children)

I complained because it absolutely sucked. Only Tesla would release this garbage in such a fraudulent manner; no other company would risk the lawsuits. Tesla has been killing people with Autopilot since 2016, and with FSD since it was released to the public. That should make you think, but that seems to be hard for some people when it comes to a Musking.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

Autopilot and FSD Beta have never been something you're supposed to rely on, and every Tesla owner knows this. If they drive over someone, it's the fault of the driver, not the vehicle. Accidents will always happen, and if you focus on individual incidents you're missing the big picture. You're never going to reach 100% safety, and 99.99% safety still means 33,000 accidents a year in the US alone. Also, what little statistics we have indicate that drivers with FSD or Autopilot engaged already crash less than the average.

According to this report, the average Tesla equipped with FSD Beta, driven on predominantly non-highway sections of road, crashes 0.31 times per million miles, a dramatic decrease from the average American, who crashes 1.53 times every million miles.

Source
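Taking the quoted figures at face value (they come from the report cited in the comment, not an independent audit), the arithmetic behind the comparison works out like this:

```python
# Crash rates as quoted in the comment above (crashes per million miles).
# These are claimed figures from the cited report, not independently verified.
fsd_rate = 0.31    # FSD Beta engaged, predominantly non-highway driving
human_rate = 1.53  # average American driver

print(f"Claimed relative rate: {human_rate / fsd_rate:.1f}x fewer crashes")

# The "99.99% safety means 33,000 accidents a year" line above is
# per-person arithmetic over a roughly 330-million US population:
population = 330_000_000
print(round(population * (1 - 0.9999)))  # prints 33000
```

Note the caveat raised elsewhere in the thread: the two rates are not measured over comparable driving conditions (FSD engagement itself selects for easier situations, and disengagement shortly before a crash can shift how the crash is categorized), so the ratio should not be read as a like-for-like safety comparison.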

[–] drdabbles 1 points 1 year ago (1 children)

Too bad there's so many owners relying on it.

if you focus on individual incidents you're missing the big picture

Not at all. In fact, the point is to focus on classes of crashes. Which Tesla fails miserably at.

Also the little statistics we have about this indicate that drivers with FSD or Autopilot engaged already crash less than the average.

This is an outright lie. Period. Having owned a Tesla since 2018, I'm quite familiar with the garbage software, and with the user community that says no one should trust it out of one side of its mouth and that it's better than a human out of the other.

[–] [email protected] 0 points 1 year ago (1 children)

Literally can't debate with you guys because you straight out refuse to believe any evidence presented to you and just base your opinions on anecdotal evidence and individual incidents. If those stats are made up then provide a better source that backs you up.

[–] drdabbles 1 points 1 year ago (1 children)

You seem to be confusing evidence with marketing. That's your problem. If you actually owned one, you'd know I was right.

[–] [email protected] 0 points 1 year ago (1 children)

That's anecdotal evidence. There's nothing scientific about a sample of one.

[–] drdabbles 1 points 1 year ago (1 children)

I don't know how to break this to you, but you don't have any data at all. Please do link it here for me if you do, though. I'd love to see what you think is "data". If it's Tesla's EOQ slides, I'll at least have got some great laughs out of this thread.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

I'm basing my view on the only data I'm aware of, and it's linked above. You might not like the source of that data, and that's fair, but you not liking it is not evidence to the contrary. Even this article that refutes those stats doesn't provide any significant evidence that Tesla Autopilot/FSD is more dangerous than a human driver. You can focus on individual incidents all day, but what truly matters is the big picture: whether on average it's safer than a human driver or not. Currently it seems that at worst it's about on par with humans, and thus it's just a matter of time until it's better. A lot better.

I have no dog in this fight and I'm fully willing to grant that the data Tesla hands out is very likely more or less misleading but that still doesn't make my basic argument wrong which is that Teslas and their driver assist systems aren't inherently any more dangerous than the competition. Tesla just gets way more attention in the media.

[–] drdabbles 1 points 1 year ago

Of course you link Brad's bullshit. Man I just love it.

The data that exists is the standing general order data which shows Tesla absolutely sucks at life. Period.

[–] JdW 1 points 1 year ago (1 children)

Let's be fair. Elon could kill a man on camera and shout a confession afterwards, and you would still find excuses for his behaviour and tell us we're just misinterpreting the facts...

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

There's plenty to criticize Musk for. Your apparent assumption that anyone not violently against him must be a fan is just further evidence of an inability to think rationally about the subject.

[–] [email protected] 1 points 1 year ago (1 children)

Nah, more the assumption that anyone not violently against him doesn't know him or his actions.

[–] [email protected] -2 points 1 year ago

If you violently oppose Musk, then how do you feel about people like Putin, Xi Jinping, Ali Khamenei, or Bashar al-Assad? Those are the people I focus my opposition on. For someone like Musk, Bezos, Tate, etc., I simply have no time or interest. That's recreational outrage.