this post was submitted on 25 Feb 2024
698 points (97.7% liked)

• Concerns rise as Neuralink fails to provide evidence of brain implant success, raising safety and transparency questions.

• Controversy surrounds Neuralink's lack of data on surgical capabilities and alarming treatment of monkeys with brain implants.

• While Neuralink touts achievements, experts question true innovation and highlight developments in other brain implant projects.

[–] [email protected] 170 points 9 months ago (4 children)

Plus, he likes to pretend he's an expert on the industries of the companies he runs. That's already potentially dangerous with Tesla and SpaceX, but in this case his hubris is very directly dangerous to the people receiving his services.

[–] [email protected] 92 points 9 months ago

The difference is that with Tesla and SpaceX he has actual experts doing the work; with Neuralink he gets the worst of the crop: no successful or ethical medical professional is going to want to work with him on this.

[–] [email protected] 62 points 9 months ago* (last edited 9 months ago) (3 children)

Teslas are already directly dangerous to his customers but our society is numb to traffic violence so people don’t care as much as they should. But “full self-driving” has already killed people.

Edit: removed “a lot” because while I suspect it is true, it remains unproven.

[–] [email protected] 28 points 9 months ago (1 children)

But ~~“full self-driving”~~ false advertising has already killed a lot of people.

[–] [email protected] 4 points 9 months ago

Sure, that’s what I was referring to. But I’m realizing not everyone is as aware of the whole story here.

[–] [email protected] 19 points 9 months ago* (last edited 9 months ago) (2 children)

“full self-driving” has already killed a lot of people.

There's only one death linked to the FSD beta, and even then the driver was drunk.

In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology

Von Ohain and Rossiter had been drinking, and an autopsy found that von Ohain died with a blood alcohol level of 0.26 — more than three times the legal limit

Source

However, there are approximately 40 accidents that have led to serious injury or death involving the less advanced driver-assist system, "Autopilot."

[–] [email protected] 15 points 9 months ago

You’re right, I was conflating the two. However, I suspect there are more cases than just this one due to Tesla’s dishonesty and secrecy.

[–] [email protected] 4 points 9 months ago (3 children)

(Why would the human's inebriation level matter if the vehicle is moving autonomously?)

[–] [email protected] 8 points 9 months ago (1 children)

Because it's not autonomous, nor "full self-driving." It's glorified adaptive cruise control. I don't think it even reaches SAE Level 3... (I'm not the biggest fan of the autonomy "levels" classification, but it's an OK reference for this.)

[–] [email protected] 2 points 9 months ago

Tesla would just get up and lie to the public like that?

[–] [email protected] 2 points 9 months ago

Agreed. Also while it’s impossible to say in any individual case I suspect people might be more likely to drive while inebriated if they believe the autopilot will be driving for them.

[–] [email protected] 0 points 9 months ago

This kind of thinking is why these accidents happen. The goal of autonomous driving is for it to one day be a better driver than the best human driver, but the technology is still in its infancy and requires an attentive driver behind the wheel. Even Teslas tell you this when you engage these systems.

[–] [email protected] 9 points 9 months ago (1 children)

Tesla’s secrecy around its safety data makes it hard to do a robust analysis but here’s a decent overview: https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/

[–] [email protected] -3 points 9 months ago (1 children)

What if we compare that to human-caused injuries?

I bet more people were killed by other human drivers today. Probably another one right now...

I'm not excusing a lack of tech safety, but I think there's a double standard when the numbers aren't put in context.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago) (1 children)

So I hear what you’re saying—what we really want to measure is deaths avoided versus those caused. But it’s a difficult thing to measure how many people the technology saved. So while I’m cognizant of this issue, I’m not sure how to get around that. That said, the article mentions that Tesla drivers are experiencing much higher rates of collisions than other manufacturers. There could be multiple factors at play here, but I suspect the autopilot (and especially Tesla’s misleading claims around it) is among them.

Also, while there may be an unmeasured benefit in reducing collisions, there may also be an unmeasured cost in inducing more driving. This has not been widely discussed in this debate but I think it is a big problem with self-driving technology that only gets worse as the technology improves.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago)

Yeah, I'm hoping though it progresses to the point that we can reasonably reduce vehicle related incidents.

Between drunk driving, texting, and generally not paying attention, I'd love more people using automated driving if it became statistically safer.

Some people are scared to fly even though it's statistically safer. They don't want to be the rare happening. Unless it's Boeing; then check your doors...

Edit, I also agree you can't easily track or correlate things that didn't happen with all the factors here.

[–] Zerlyna 18 points 9 months ago (1 children)

That didn’t work out for Elizabeth Holmes either.

[–] Tyfud 30 points 9 months ago

She exploited and got rich off rich people, though, like SBF, so she went down. Musk exploited and got rich off the working class and apartheid exploitation in South Africa. So that's OK; he's one of them.

[–] pigup 2 points 8 months ago

Elon is a dirty, manipulative liar out for power and all the money. It saddens me that there are people stupid enough to trust this guy with their bodies and let him implant a chip whose main purpose is to make him even richer.