this post was submitted on 13 May 2024
636 points (96.6% liked)

memes

all 43 comments
[–] gedaliyah 104 points 7 months ago (2 children)

Sir, this place is for memes, not news reporting

[–] Gutek8134 41 points 7 months ago (1 children)

Speak for yourself, memes are my primary source of information /hj

[–] Viking_Hippie 44 points 7 months ago (1 children)
[–] [email protected] 16 points 7 months ago (1 children)
[–] WraithGear 10 points 7 months ago (1 children)
[–] RealFknNito 8 points 7 months ago (2 children)

This is getting out of hand. /x

[–] [email protected] 5 points 7 months ago (1 children)
[–] [email protected] 4 points 7 months ago (1 children)

Nah an x means a kiss. Here's a hug /o

[–] [email protected] 1 points 7 months ago

Just don't confuse it with \o

…never confuse it with \o

[–] [email protected] 1 points 7 months ago

Wait, what news?

[–] n3cr0 47 points 7 months ago (3 children)

You are always liable, in both cases.

[–] radiohead37 20 points 7 months ago

I take it this is just a hypothetical situation. Eventually it will be the case.

[–] [email protected] 8 points 7 months ago (1 children)

I assumed a future case where Level 3+ autonomous driving was in play and the human is not the responsible party.

[–] [email protected] -2 points 7 months ago (1 children)

The human should always be attentive behind the wheel, autonomous driving or not. They should be just as liable as if they were driving the car themselves.

[–] [email protected] 3 points 7 months ago

That will be especially applicable to cars without a steering wheel or any other controls at all.

And before you say that an AI will never be able to do that: it's already possible with a human operating the car remotely; it's just not widely adopted yet.

[–] [email protected] 2 points 7 months ago

Right, unless it’s that new Mercedes and you’re in traffic on one of a specific set of roads under 40 MPH.

Think they’ve sold a handful.

[–] [email protected] 38 points 7 months ago (3 children)

Yup. The trolley problem is one of ethics and responsibility, not of whether one person or several people die.

The deaths themselves are irrelevant; your responsibility for those deaths is the point.

I didn't get it either until a good friend and I were discussing it, and he said: forget the trolley. How about this: you're walking down the street after eating at Subway (or some similar shop) with half a sandwich left, and you pass someone begging for food. You can either choose to give it to them or not. If you choose not to, and later that same day the person dies of starvation, are you responsible for their death because you didn't give them the excess food you had?

The dilemma rests on a few questions: if you take action and a person dies, are you responsible for the death you caused? And if you take no action, are you responsible for the deaths you could have prevented by acting, when you chose not to?

In OP's post, legally, if you are the driver/operator of the vehicle, you are always 100% responsible for anything the vehicle does, whether it is under autonomous control or not. That is the law. Whether you are morally at fault is a matter of debate. You didn't direct the car to run over people, but you also didn't stop the car from running over them.

There's an argument to be made about duty of care, etc.

However, this is the root of the trolley problem.

Thank you for coming to my Ted talk.

[–] [email protected] 8 points 7 months ago (1 children)

With level 3+ autonomous driving, the "driver" is not responsible.

[–] [email protected] 8 points 7 months ago (1 children)

Legally, or morally?

Maybe neither?

IDK. I'm not going to start a philosophical debate here. Just asking for you to clarify.

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago) (1 children)

Fair. I was talking legally, but morally is a whole other thing.

[–] [email protected] 2 points 7 months ago

Agreed. I won't get into it, since the trolley problem has taught me that there are a lot of opinions on it, which makes it seem relevant, but there's nearly zero consensus on the correct analysis of the situation. At the end of the day, the dilemma is a near impossibility.

The courts have made up their mind on it and that's all I'm going to concern myself with for the moment.

[–] [email protected] 3 points 7 months ago* (last edited 7 months ago) (1 children)

This issue has been explored previously, and with a better example of the trolley problem that centers the ethical dilemma entirely on the autopilot.

I do agree that in most situations, the driver retains full control over the vehicle, and therefore remains fully responsible, even if there's a case to be made that the autopilot neglected the safety of others outside the car.

However, I'd also argue that this example leaves a possibility where fault cannot be assigned to them: If the driver became aware of the hazards at a reasonable time (i.e. spotting the pedestrians just around a sharp bend, rather than 200m down a straightaway), and made every reasonable effort to stop within that time but could not. There are limits to the driver's responsibility, but the most interesting cases are crashes that the autopilot is capable of preventing (even if the driver reasonably cannot), but fails to do so.

[–] [email protected] 7 points 7 months ago (1 children)
[–] LordKitsuna 11 points 7 months ago* (last edited 7 months ago) (2 children)

Given the amount of distance between that car and the crosswalk, and the fact that it's a crosswalk, meaning the car is not going to be traveling at freeway speeds, I would hazard a third option and say maybe just, kind of, lightly press on the brakes? ( ͡° ͜ʖ ͡°)
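For a rough sense of why "just brake" is plausible here, here's an illustrative stopping-distance estimate. The speed, deceleration, and reaction-time figures are my own assumptions for a typical dry-road, ~30 mph scenario, not anything stated in the thread:

```python
def stopping_distance(speed_mps: float, decel: float = 7.0, reaction_s: float = 1.5) -> float:
    """Total stopping distance: distance covered during the driver's
    reaction time, plus braking distance from v^2 / (2a)."""
    reaction_dist = speed_mps * reaction_s
    braking_dist = speed_mps ** 2 / (2 * decel)
    return reaction_dist + braking_dist

# ~30 mph is about 13.4 m/s; with these assumptions the car
# needs roughly 33 m to come to a complete stop.
print(round(stopping_distance(13.4), 1))
```

An autonomous system has effectively zero reaction time, which only shortens that figure.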

[–] [email protected] 1 points 7 months ago (1 children)

What if the brakes were malfunctioning?

[–] LordKitsuna 4 points 7 months ago* (last edited 7 months ago) (1 children)

Then the car could just turn to the closest side and grind on the barrier to stop itself ༼ つ ◕_◕ ༽つ

[–] [email protected] 4 points 7 months ago

Additionally, the car should honk frantically to alert the pedestrians and fellow drivers that a dangerous situation is happening.

[–] Chriszz 30 points 7 months ago* (last edited 7 months ago)

Since the fine is meaningless to Elon, this becomes an ethics problem: swerve, killing one person, and be charged with manslaughter, or kill five people and be found not guilty.

[–] [email protected] 22 points 7 months ago (1 children)

Realistically though, when incidents occur, Tesla pretends FSD wasn’t on and they’ll never pay anything.

[–] [email protected] 9 points 7 months ago (3 children)

It's level two, so they don't even need to pretend; it's always the driver's fault. What will be interesting is Mercedes' level three turning itself off right before a crash.

[–] [email protected] 3 points 7 months ago

But your honour, the driver had a whole 873 milliseconds to react before the impact! It's clearly their fault 🧑‍⚖️

[–] [email protected] 3 points 7 months ago

Nevertheless, people still want to know. Tesla apparently has a habit of turning off FSD less than a second before a crash happens, and Tesla does publicly disavow responsibility. If FSD does something entirely ridiculous, like suddenly veering off the road or braking unexpectedly, the driver may not have time to respond. Check out this article and lawsuit, for instance:

https://www.denver7.com/news/evergreen/teslas-autopilot-drove-car-into-tree-killing-colorado-man-in-fiery-crash-lawsuit-alleges

Tesla’s advanced Autopilot driving system malfunctioned and caused one of the electric car maker’s Colorado employees to drive off the road and die in a fiery crash, a newly filed wrongful death lawsuit alleges.

The car was too mangled to be analyzed as to whether FSD was on. The surviving passenger said yes, the driver was using it. Musk denies it:

https://www.teslarati.com/tesla-elon-musk-clarifies-fsd-not-enabled-model-3-fatal-dui-crash/

"He was not on FSD. The software had unfortunately never been downloaded. I say “unfortunately”, because the accident probably would not have happened if FSD had been engaged."

[–] Neon 1 points 7 months ago (1 children)

What levels are you referring to?

[–] [email protected] 2 points 7 months ago

I think he means the levels of driving autonomy.
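The SAE J3016 automation levels mentioned above can be sketched roughly like this. This is a simplified illustration of who is expected to monitor the driving task at each level, not a legal definition:

```python
def driver_must_supervise(sae_level: int) -> str:
    """Who monitors the driving task at each SAE J3016 level (simplified)."""
    if sae_level <= 2:   # levels 0-2: driver assistance only
        return "driver supervises at all times"
    if sae_level == 3:   # conditional automation
        return "driver must take over when the system requests"
    return "no human supervision needed within the system's domain"  # 4-5

print(driver_must_supervise(2))  # e.g. Tesla FSD today
print(driver_must_supervise(3))  # e.g. Mercedes Drive Pilot
```

The liability argument in this thread hinges on the jump from level 2 to level 3: that is where the manufacturer, rather than the human, starts taking on responsibility while the system is engaged.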

[–] [email protected] 14 points 7 months ago* (last edited 7 months ago) (1 children)

If they were restrained in the road against their will, I'd save 4 people and take the heat. But seeing as they are just hanging out in the middle of the street of their own volition, I'ma let autopilot handle it. ^/s^

[–] [email protected] 2 points 7 months ago

That's some Agent J from Men in Black level of argument, good job

[–] [email protected] 12 points 7 months ago

In Germany, nobody cares whether Autopilot was on, because you are still the driver of the car and responsible for everything it does. The law says you need to be able to react to it; if you don't, you are solely responsible.

[–] Konstant 6 points 7 months ago

The troll (Elon) problem

[–] werefreeatlast 5 points 7 months ago

You could pretend to swerve but still clip them all except for the hot chick. Then jump and pretend to save her from your evil car. So that's how you get lai....wait what was the question?