this post was submitted on 16 Apr 2024
139 points (95.4% liked)

World News

[–] Gigan 63 points 2 months ago (2 children)

If they're going to go this way, I don't think it should be limited to just porn. There are plenty of ways you could ruin someone's life without a deepfake being sexually explicit.

[–] Deestan 17 points 2 months ago (1 children)

There are already a lot of laws covering that. This one covers an additional angle: people who create a deepfake without provably publishing it, the intent being that showing it to friends and verbally threatening to "leak" it should be easier to prosecute.

If you create a deepfake and share it, you're slapped with two crimes.

[–] [email protected] 10 points 2 months ago (1 children)

the intent being that showing it to friends and verbally threatening to "leak" them should be easier to prosecute.

That's blackmail, which is already illegal.

[–] nogooduser 7 points 2 months ago

Using a mobile phone while driving has always been illegal if you could argue that it was dangerous driving or driving without due care and attention. They then made a law specifically saying that using a mobile phone without hands-free is illegal anyway. This makes it easier to prosecute because you don't need to argue that the driver was driving dangerously or without due care.

I imagine that this law has the same intent of making this specific act illegal to prevent them having to argue that it fits another crime.

[–] billbasher 8 points 2 months ago

Yeah, the way people can recreate someone "in need of assistance" to trick family or associates is really scary, especially for people who aren't exactly tech savvy. That seems to me to be a worse crime than an explicit video that is pretty obviously doctored.

[–] [email protected] 20 points 2 months ago (2 children)

I wonder what happens when it just accidentally looks like someone but was intended to be a fictional person. Also, how much can you base it on a real person before it's considered a deep fake of that person? Would race-swapping be enough to make it a "new" person so it's not illegal anymore? My intuition is that just eye colour or something wouldn't be enough, but it's a sliding scale where the line must be drawn somewhere even if it's a fuzzy line.

What about an AI-generated mashup of two people, like those "what the child would look like" pictures back in the day? Does that violate both people or neither?

What about depicting a person older than they are now? That's technically not somebody that exists, but might in the future.

What if you use AI but make it look like it's hand-drawn or a cartoon?

What if you use AI to create sexual voice clips of a real person but use images that don't look like them or no image at all?

There are just so many possibilities and questions that I feel it might be impossible to legislate in a way that isn't always 10 steps behind or has a million unforeseen consequences.

[–] [email protected] 5 points 2 months ago

There are already laws against using someone's likeness for commercial purposes without their consent. I'm guessing this will require the same fuzzy cutoff and basically be up to the jury to decide or the judge to dismiss.

[–] [email protected] 11 points 2 months ago (3 children)

I have a hard time accepting this as a crime. What if the illustration is hand-drawn, or clothed but still sexual in character? Is caricature illegal, by this standard?

[–] [email protected] 15 points 2 months ago (1 children)

You'd better not have a particularly vivid imagination or else you'll be prosecuted for daydreaming.

[–] [email protected] 6 points 2 months ago* (last edited 2 months ago)

Yea, this is a funny thing to think about.

You can jerk off to photos of people, you can imagine some wild things involving other people etc.

If you just create some deepfake porn to masturbate by yourself to, I don't see a big problem with that.

The only issue I can see is that, through negligence, someone else sees it, or even hears about it.

The problem starts with sharing. Like it would be sexual harassment to tell people who you are masturbating to, especially sharing with the "actors" of your fantasies.

There is however another way this could go:

Everyone can now share nudes with way less risk.

If anyone leaks them, just go: "That's a deepfake. This is obviously a targeted attack by the heathens of the internet, who are jealous of my pure and upstanding nature. For me, it's only married missionary with the lights out."

[–] andrewta 12 points 2 months ago (2 children)

There’s a big difference between a deep fake and a caricature.

[–] [email protected] 3 points 2 months ago (2 children)

Yeah, but only one of degree.

[–] [email protected] 4 points 2 months ago (1 children)
[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

It's making an image of someone that portrays them in an unrealistic and offensive context.

[–] [email protected] 5 points 2 months ago

So if I use AI to make pornography of 50 men gang-banging you, would you consider that to be on the same level as going to a carnival and getting a caricature done?

[–] [email protected] 0 points 2 months ago

The difference is so big, it easily becomes qualitative.

[–] Deestan 10 points 2 months ago

Is caricature illegal, by this standard?

No.

The official government announcement is linked in the article btw.

[–] gedaliyah 7 points 2 months ago* (last edited 2 months ago) (1 children)

This is why we should be making laws around likeness rights. If you damage somebody by publicly using their name to spread falsehoods, that's defamation or libel. But if you produce an image or video of their likeness instead of using their name, there's no legal recourse. It makes no sense in this day and age.

[–] [email protected] -2 points 2 months ago (1 children)

Who decides how similar somebody is "allowed" to look to another? There are people who bear an uncanny resemblance to others. And what of identical twins? Can one sue the other if they do porn?

[–] [email protected] 11 points 2 months ago

The courts, probably. That's what they are for.

[–] FlyingSquid 6 points 2 months ago (3 children)

"Without consent." I'm very curious who would consent to having deepfake porn made of themselves.

[–] [email protected] 12 points 2 months ago (2 children)

I can imagine a non-zero amount of people would consent to a deep-fake porn video of themselves having sex with some generic hot woman, just as one example.

[–] FlyingSquid 2 points 2 months ago

That makes sense, I hadn't thought of that sort of deepfake.

[–] [email protected] -1 points 2 months ago

Better make sure the generic hot woman doesn't resemble anyone real though.

[–] [email protected] 8 points 2 months ago

Could be very lucrative if you are already in porn and want to make some money from your likeness. This guy's gonna pay me $500 to make a video and I don't even have to do anything?

Could also be very good for porn stars who have "aged out" but can still make videos using their younger bodies, as weird as that may be.

[–] Grimy 5 points 2 months ago* (last edited 2 months ago)

A user shared a story a while back about his wife and her sister giving photos and agreeing to it. Lots of kinky people out there.

[–] [email protected] 5 points 2 months ago (2 children)

seems like the only way to deal with it.. make it equivalent to sexual assault..

[–] [email protected] 1 points 2 months ago

A naked picture of me simply existing is not equivalent to sexual assault. If you want to make it illegal then treat it as its own thing.

[–] [email protected] -3 points 2 months ago (1 children)

The only way to deal with it is to let so much of it flood the digital world that nobody cares anymore because there's a deepfake porno of everyone.

This is a waste of money to ensure rich people don't get porn made of them by poor people.

Poor people won't be able to afford lawyers and aren't able to take time off to show up in court.

[–] [email protected] 2 points 2 months ago (1 children)

you have a cracked view of jurisprudence.. maybe you look at too much deepfake porn..

[–] [email protected] -1 points 2 months ago

I prefer real porn.

[–] [email protected] 3 points 2 months ago

So what if the deepfake consents?

[–] [email protected] 0 points 2 months ago (1 children)

“Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence and intimidate women — both on and offline,” Meta Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.

considers

I think that there's an argument for taking the opposite position. If someone could make deepfake porn trivially and it were just all over the place, nobody would care about it; everyone would know that it's fake.

In fact, it'd kind of make leaked actual pornography no-impact as a side effect, unless there were a way to distinguish deepfakes from real footage. And that's a harder issue to resolve. I was reading a discussion on here yesterday about sextortion, talking about how technically difficult it would be to keep someone from recording sex video chats; there'd always be an analog hole at least. But there is another route to solve that, which is simply to make such a video valueless because there's a flood of generated video.

[–] [email protected] 2 points 2 months ago

Deepfake recognition is already available. And while what you predict sounds logical, these criminals prey on emotions. I feel that a lot of innocent people will be victimized even if deepfake porn becomes common.

[–] [email protected] -4 points 2 months ago* (last edited 2 months ago)

Government grasping at straws. More news at 8.