this post was submitted on 03 Dec 2023
468 points (95.2% liked)


A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

[–] [email protected] 57 points 1 year ago* (last edited 1 year ago) (5 children)

Maybe it is just me, but it's why I think this is a bigger issue than just Hollywood.

The rights to famous people's "images" are bought and sold all the time.

I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be "owned" by anyone but yourself.

The fact that making a law like this isn't a priority means this will get worse because we already have a society and laws that don't respect our rights to control of our own image.

A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else's image. In this case, altering the image to make it seem real could even be considered defamation. They defamed her by making it seem like she took nude photos of herself to spread around.

[–] [email protected] 53 points 1 year ago* (last edited 1 year ago) (3 children)

There are genuine reasons not to give people sole authority over their image though. "Oh that's a picture of me genuinely doing something bad, you can't publish that!"

Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed.

[–] Zachariah 7 points 1 year ago (1 children)

Seems like a typical copyright issue. The copyright owner has a monopoly on the intellectual property, but there are fair use exceptions for genuine reasons (journalism, satire, academic use, backups, etc.).

[–] [email protected] 9 points 1 year ago (1 children)

Reminder that the stated reason for copyrights to exist at all, per the US Constitution, is “To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.”

Anything that occurs naturally falls outside the original rationale. We've experienced a huge expansion of the concept of intellectual property since then, but as far as I can tell there has never been a consensus on what purpose intellectual property rights are supposed to serve beyond the original conception.

[–] afraid_of_zombies 2 points 1 year ago

Makes sense. If I do something worth taking a picture of, that means I have zero rights to it since that is "natural," but the person who took the photo has all the rights to it.

Tell me this crap wasn't written for and by the worst garbage publishers out there.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

Yeah I'm not stipulating a law where you can't be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

Making fake images whole cloth is.

The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don't have enough time for right this moment.

[–] [email protected] 7 points 1 year ago (1 children)

That sounds pretty dystopian to me. Wouldn't that make filming in public basically illegal?

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago)

In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or close-ups) without asking for permission first, with the exception of public events, as long as you do not focus on individuals. It doesn't feel dystopian at all, to be honest. I'd rather have it that way than end up on someone's stupid vlog or whatever.

[–] [email protected] 5 points 1 year ago (1 children)

Wait you thought this was a problem for Hollywood?

[–] MargotRobbie 8 points 1 year ago (2 children)

It is for actors, since you would be handing over the right to your likeness to studios for AI to reproduce for eternity.

It was one of the main issues for the SAG-AFTRA strike.

[–] [email protected] 4 points 1 year ago (1 children)

Well at least they're getting paid for it. But someone could copy your likeness for free.

[–] MargotRobbie 5 points 1 year ago

They could be impersonating me as we speak!

[–] APassenger 1 points 1 year ago

Or their identity?

[–] CleoTheWizard 3 points 1 year ago

The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and photos have been possible for a long time now. The biggest way we deal with them is to go after large distributors of that content.

When it comes to younger people, the penalty should be pretty heavy for doing this. But it's the same as distributing real images of people – photos that you don't own. I don't see how this is any different, or why we would treat it any differently.

I agree with your defamation point. People in general and even young people should be able to go after bullies or these image distributors for damages.

I think this is a giant mess that is going to upturn a lot of what we think about society but the answer isn’t to ban the tools or to make it illegal to use the tools however you want. The solution is the same as the ones we’ve created, just with more sensitivity.

[–] afraid_of_zombies -5 points 1 year ago (1 children)

Many years ago I mentioned this on reddit, complaining about how photographers can just take pictures of you or your property and do what they want with them. Of course the group mind attacked me.

Problem just seems to get worse by the year.

[–] [email protected] 15 points 1 year ago (1 children)

That's because your proposal would make photography de facto illegal: getting the rights to everyone and everything that appears in a photograph would be virtually impossible. Hell, most other kinds of visual art would be essentially illegal as well. There would be hardly anything but abstract art.

[–] afraid_of_zombies 4 points 1 year ago (1 children)

Bullshit.

Taking a photo of yourself or your family at a public landmark? Legal.

Taking a photo of yourself or your family at a celebration? Legal.

Zooming in on the local Catholic school to get a shot of some 12 year olds and putting it on the internet? Illegal.

We need to stop pretending that just because photography is a thing, there is zero expectation of privacy whenever someone is able to violate it. This is the crap we see with police using infrared cameras to get around the need for warrants, and the crap we see with people using drones to stalk. You have the right to be left the fuck alone, and if someone wants to creep on teens, well, sorry, you are out of luck.

[–] [email protected] 10 points 1 year ago (1 children)

There are literally already cases where taking a photo of yourself in front of a public landmark is illegal because of copyright issues.