this post was submitted on 03 Sep 2023
331 points (92.1% liked)
Technology
Autopilot beta? People are willing to test betas for cars? Are you insane? Insurance is going to have a field day.
What bothers me is that I have to share the road with people running some braindead Elon Musk software?
Have you seen how humans drive? It's not a very high bar to do better.
But betas? Seriously?
I was taught to always drive defensively. You never know when someone's going to get distracted, get stupid, have a stroke... Add glitchy robots to the list and it doesn't make a whole lot of difference.
And yet FSD is still worse than the one time I got in the car with an exchange student who had never driven a car before coming to the US and thought her learner's permit was the same as a driver's license.
From what I read, Autopilot (AP) just keeps you in your lane, while Full Self-Driving (FSD) just switches lanes into oncoming traffic.
Funny how George Hotz of Comma.ai predicted this exact same issue years ago: "if I were Elon Musk I would not have shipped that lane change".
This issue likely arises because the car's sensors cannot look far enough ahead in the lane it is changing into. That can lead to being rear-ended by much faster traffic and, in this case, lane confusion, since the car cannot see the oncoming traffic.
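For a rough sense of why limited sensing range matters during a lane change: the car has to see at least as far as the other vehicle travels while the maneuver plays out. A back-of-envelope sketch with made-up numbers (not Tesla specifications):

```python
# Back-of-envelope sketch with illustrative numbers (not Tesla specs):
# how far the car must "see" in the target lane before changing into it,
# given how quickly the other vehicle is closing the gap.

def required_sensing_range_m(closing_speed_kmh: float, maneuver_time_s: float) -> float:
    """Distance (m) the other vehicle covers while the lane change plays out."""
    return (closing_speed_kmh / 3.6) * maneuver_time_s

# Oncoming traffic on a two-lane road: roughly 90 + 90 km/h closing speed,
# ~5 s to commit to, perform, and recover from the lane change.
print(required_sensing_range_m(180, 5))  # 250.0 m

# A much faster car approaching from behind on a motorway, ~60 km/h delta:
print(required_sensing_range_m(60, 5))   # ~83 m
```

If the effective range in that lane is shorter than those distances, the car has already committed to the lane change before it can even see the conflict.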
Even better, several people have died using it or killed someone else. It also has a long history of driving underneath semi truck trailers. Only Europe was smart enough to ban this garbage.
FSD has never driven under a truck; that was Autopilot, which is an LKAS (lane-keeping assist) system. The incident happened a year before "Navigate on Autopilot", so the car in question wasn't even able to change lanes on its own. The driver deliberately instructed the car to drive into the trailer.
FSD beta is currently available in most of Europe and has been for several months.
Yes it has. Well, into the back of one fast enough that it went underneath, at least.
So is FSD. 🤣 It's Level 2, bud; you're really, REALLY confused for someone pretending to own one.
Are you saying Josh Brown killed himself? Because if you are, that would be a new repulsive low even for you Elon simps.
The craziest part of the article is just how much effort the author put into collecting data, filing feedback, and really, really hoping that Tesla could pull the videos (they can), and then went on to actively try, and succeed, at recreating the problem at high speed next to another car.
Not Autopilot (AP). There's a difference between FSD and AP. AP will just keep you between the lane lines and pace the car in front of you. It can also change lanes when told to. There's also Enhanced Autopilot (EAP), which was supposed to bridge the gap between AP and FSD: it would go "on-ramp to off-ramp", switching lanes as needed and getting to exit ramps. FSD is the mode where you shouldn't need to touch anything outside of answering the nag (the frequent prompt to "apply force to the steering wheel" so it knows you are still alive and paying attention)*.
* At least I think that's the same for FSD. I'm only on AP with AP1 hardware. Never had an issue that I'd blame on a "bug" or the software doing something "wrong".
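For what it's worth, the nag described above amounts to a simple attention watchdog: prompt the driver periodically, look for steering-wheel torque, and escalate if nothing comes back. A toy sketch of that pattern (purely illustrative; `detect_steering_torque` and the timing constants are assumptions, not Tesla's actual implementation):

```python
import time

NAG_INTERVAL_S = 30      # how often to prompt the driver (assumed value)
RESPONSE_WINDOW_S = 10   # how long the driver gets to respond (assumed value)
MAX_STRIKES = 3          # missed prompts tolerated before giving up (assumed value)

def detect_steering_torque() -> bool:
    """Hypothetical hook standing in for the steering-wheel torque sensor."""
    return False

def attention_watchdog():
    strikes = 0
    while strikes < MAX_STRIKES:
        time.sleep(NAG_INTERVAL_S)
        print("Apply force to the steering wheel")
        # Wait for torque within the response window; reset strikes if it shows up.
        deadline = time.monotonic() + RESPONSE_WINDOW_S
        responded = False
        while time.monotonic() < deadline:
            if detect_steering_torque():
                responded = True
                break
            time.sleep(0.1)
        strikes = 0 if responded else strikes + 1
    # Too many missed prompts: stop assisting and bring the car down safely.
    print("No response: disengage assistance and slow down")
```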
It’s the beta part that scares me the most, the type of assistance isn’t really relevant. People shouldn’t be driving around in betas. These aren’t phones.