this post was submitted on 06 Feb 2024
107 points (98.2% liked)

Enough Musk Spam


For those that have had enough of the Elon Musk worship online.

No flaming, baiting, etc. This community is intended for those opposed to the influx of Elon Musk-related advertising online. Coming here to defend Musk or his companies will not get you banned, but it likely will result in downvotes. Please use the reporting feature if you see a rule violation.

Opinions from all sides of the political spectrum are welcome here. However, we kindly ask that off-topic political discussion be kept to a minimum, so as to focus on the goal of this sub. This community is minimally moderated, so discussion and the power of upvotes/downvotes are allowed, provided lemmy.world rules are not broken.

Post links to instances of obvious Elon Musk fanboy brigading in default subreddits, lemmy/kbin communities/instances, astroturfing from Tesla/SpaceX/etc., or any articles critical of Musk, his ideas, unrealistic promises and timelines, or the working conditions at his companies.

Tesla-specific discussion can be posted here as well as in our sister community /c/RealTesla.

founded 2 years ago

Self-driving tech is widely distrusted by the public, and Tesla's huge Autopilot recall and Cruise's scandals don't seem to have helped.

top 19 comments
[–] [email protected] 21 points 10 months ago (1 children)

That's because Teslas aren't autonomous vehicles. They're just calling lane assist features self-driving.

A true self-driving vehicle wouldn't need a human watching it.

[–] Spiralvortexisalie 9 points 10 months ago

Even the more advanced cars, the so-called Level 3 cars that have come out in the last year (Tesla, I believe, is still at the lower-rated Level 2), do not do as much as people think. And even the Level 3 cars I have tried suffer from a serious drawback: they will ditch self-driving with little warning in populated areas. In the burbs and country, with straight highways, the vehicles can do alright, or at least enough to convince people they are FSD. In city areas I have seen vehicles lose GPS lock, fail to read the lane markings, and simply kick out of self-driving mode, which can happen semi-often around large trucks and heavy bridges. That's a life-or-death situation if you weren't paying attention, or were stupid enough to sit in the back for a TikTok.

[–] FlyingSquid 16 points 10 months ago

Maybe they should try public transportation instead.

[–] [email protected] 13 points 10 months ago* (last edited 10 months ago)

It’s been 2667 days since Musk promised fully autonomous cross country trips: https://elonmusk.today/#autonomous-cross-country-trip

And 2954 days since he promised cross country summons: https://elonmusk.today/#cross-country-summons
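Day counts like these are easy to reproduce with simple date arithmetic. A minimal sketch, where the promise date used below is an assumption (Musk's late-2016 cross-country demo pledge), so the count is approximate:

```python
from datetime import date

# Assumed promise date for the autonomous cross-country trip;
# the exact day is a guess, so the result is approximate.
PROMISE = date(2016, 10, 19)
POST = date(2024, 2, 6)  # the date this thread was posted

days_elapsed = (POST - PROMISE).days
print(days_elapsed)  # roughly matches the ~2667 days cited above
```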

[–] n3m37h 4 points 10 months ago (1 children)

Teslas make driving on the roads dangerous. I was driving on a two-lane highway and passed a vehicle that was going a bit slow, just past an off-ramp. I decided to merge back behind a car going approximately the same speed as me; as I started to merge back, the bloody car saw a speed limit sign and started braking. I nearly rear-ended the fucking Tesla. Got a video too.

The first day I got my dash cam, it was winter and the roads had just been plowed, so there was an approximately car-width-wide strip of salt/sand in the middle of the road. On my way to work I was coming down a hill on my side of the road; coming up the hill was a Tesla SUV driving with all four wheels on the sand (aka on the yellow line).

Both the cars and the drivers are brain dead.

[–] [email protected] 3 points 10 months ago (1 children)

That's on you for not maintaining a safe distance, not the car for obeying speed limit signs.

[–] n3m37h -1 points 10 months ago (2 children)

That's not it, m8. The average speed on the 401 is 120 kph on slow days, 130 typically, and the speed limit is 100. Suddenly having a vehicle drop from 130 to 100 because it saw a sign is fucking dangerous.

The Tesla was doing about 125 or so when I started merging, and I was 3 car lengths behind them; then it randomly braked when there was nothing in front of it.

There is nothing safe about abrupt actions, which those POS cars take all the time.
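The danger being described can be put in rough numbers with a closing-time calculation. A sketch, assuming a 3-car-length gap (about 4.5 m per car) and a constant 25 kph speed difference while the lead car brakes:

```python
def seconds_to_close(gap_m: float, speed_diff_kph: float) -> float:
    """Time until the trailing car reaches the lead car,
    assuming the speed difference stays constant."""
    return gap_m / (speed_diff_kph / 3.6)

# ~3 car lengths (assumed 4.5 m each) with a 25 kph closing speed:
print(round(seconds_to_close(3 * 4.5, 25), 1))  # ~1.9 s to react and brake
```

Under two seconds is barely average human reaction-plus-braking time, which is why an unannounced speed drop in flowing traffic is risky regardless of who is legally at fault.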

[–] [email protected] 1 points 10 months ago* (last edited 10 months ago) (1 children)

If you're close enough to the car in front of you that them slowing down by 25kph causes you to almost hit them, you're too close for the speed you're traveling.

It doesn't matter if there's a clear reason for it or not, or if everyone else is breaking the law. You're responsible for maintaining a safe distance, not the person you're behind.

3 car lengths is not a sufficient distance for 125kph.
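As a rough sanity check on that claim, the common "2-second rule" converts speed into a minimum following distance. A sketch, with an assumed typical car length of 4.5 m:

```python
def two_second_gap_m(speed_kph: float) -> float:
    """Distance travelled in 2 seconds at the given speed
    (the "2-second rule" minimum following distance)."""
    return speed_kph / 3.6 * 2.0

CAR_LENGTH_M = 4.5  # assumed typical sedan length

gap = two_second_gap_m(125)
print(round(gap, 1), round(gap / CAR_LENGTH_M, 1))
# 69.4 m, i.e. roughly 15 car lengths, versus the 3 described above
```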

[–] n3m37h -1 points 10 months ago (1 children)

Easy, Karen, I don't need a driving lesson. I did not get into a fucking accident, because I was paying attention to the road and the traffic around me. Something the driver of the Tesla was not doing, because who the fuck slows down suddenly on a highway with no one in front of them?

Seriously take the hard sandpapery object out of your ass and calm down

[–] [email protected] 2 points 10 months ago

Apparently you do, since you're blaming others for your mistakes.

[–] AreaKode 0 points 10 months ago* (last edited 10 months ago) (1 children)

And I would argue that you would see 10 humans do this same action in the time you'd see a robot do it once. Humans are horribly inefficient drivers, and eventually computers will be far superior. I hope to see much better advances in the coming years.

They just need to... not call it "Autopilot" until it actually functions like an autopilot.

[–] n3m37h 3 points 10 months ago

Hahahahahahaha

Tesla won't, because cameras alone won't work. The companies that have access to lidar and radar, as well as mics, aren't anywhere close to universally safe autonomous driving (outside of a predefined/mapped location).

And machine learning is in its infancy right now; it is unpredictable and unreliable in a lot of applications. And don't even get me started on this stuff operating in any conditions other than bright sunny days.

I've used lane assist (Elantra, Tucson) and I find it to be terrible in many cases, long turns being the worst.

Autonomous vehicles in warehouses have just as many issues, with nowhere near the variables that a car needs to account for...

Please take your head outta your arse and come back to reality.

[–] [email protected] 0 points 10 months ago (1 children)

Self driving tech is pretty good and getting better at an insane rate. I think people only distrust it because of bad media reporting.

[–] [email protected] 5 points 10 months ago (2 children)

I don't trust it because Musk lies all the time. It may work fine, but you can't tell lies like he does and expect people to believe you this time.

[–] [email protected] 2 points 10 months ago

I don't trust it because they can't even get the car part of self driving car right.

[–] [email protected] 1 points 10 months ago (1 children)

Self-driving tech isn't only Tesla. There are many implementations, and they are pretty amazing in my opinion.

[–] [email protected] 3 points 10 months ago (1 children)

Sure, but it's impressive in the same way that a dancing bear is impressive - and it's not because the bear dances well.

Even the best self-driving implementations are limited to warm sunny days in well-mapped areas.

[–] dvoraqs 0 points 10 months ago (1 children)

Actually, it can work pretty well. My Comma 3X could see and navigate the road better than I could in heavy rain on the highway. There are many different levels of maturity here, but even lane-keep assist makes driving easier and is useful for that.

You're still right to distrust these systems, but that doesn't mean that they are bad.

[–] [email protected] 2 points 10 months ago

Oh yeah, it can work great. And it can work terribly. We haven't hit the point where it's reliably "great" though. And that makes it rather more dangerous to me, since it builds a sense of security that is unwarranted (not that I'm saying you disagree; I'm just expanding on my distrust).

One of the major problems is that the failure modes can be very different from how a person fails. Like when you see a car just sitting in the middle of a road because it can't figure out what to do for some reason. A person you could wave on; an AI you can't. We understand human behavior, but we can't really understand the AI's decision-making process.

This is why I can't quite get behind the "all AI needs to do is be slightly better than people" argument. On one hand, from a purely statistical point of view, I get it. But if self-driving cars were "basically perfect" except that every now and then one of them randomly exploded (still killing fewer people than auto accidents do), would people be okay with that? Automobile accidents aren't truly "random" like that.