Excerpt from the article:
The Senate unanimously passed a bipartisan bill to provide recourse to victims of porn deepfakes — that is, sexually explicit, non-consensual images created with artificial intelligence.
The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. The legislation has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.
The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.
“Current laws don’t apply to deepfakes, leaving women and girls who suffer from this image-based sexual abuse without a legal remedy,” Durbin posted on X after the bill’s passage. “It’s time to give victims their day in court and the tools they need to fight back. I urge my House colleagues to pass this bill expediently.”
Senate Majority Leader Chuck Schumer (D-N.Y.) praised the bill’s passage, commending Durbin for his work. “This isn’t just some fringe issue that happens to only a few people — it’s a widespread problem,” said Schumer. “These types of malicious and hurtful pictures can destroy lives. Nobody is immune, not even celebrities like Taylor Swift or Megan Thee Stallion. It’s a grotesque practice and victims of these deepfakes deserve justice.”
Ocasio-Cortez, the progressive New York lawmaker, first announced she was co-leading the bicameral legislation in an interview with Rolling Stone. She’s had personal experience with this specific type of abuse, and discussed the trauma it can cause.
“There’s a shock to seeing images of yourself that someone could think are real,” she told us in March. “And once you’ve seen it, you’ve seen it. It parallels the same exact intention of physical rape and sexual assault, [which] is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people.”
In a press release following the bill’s passage in the Senate, Ocasio-Cortez said it “marks an important step in the fight to protect survivors of non-consensual deepfake pornography,” adding: “I’m committed to collaborating with colleagues from both sides of the aisle to shepherd the bill through the House of Representatives to get it to the president’s desk. Together, we can give survivors the justice they deserve.”
When the bill first came up for a unanimous vote in the Senate in June, it was blocked by Sen. Cynthia Lummis (R-Wyo.). The most recent version of the bill now includes a “findings” section, refining the definition of “digital forgery” and updating the available damages to ensure victims receive appropriate compensation, along with some other clarifications.
The “findings” section discusses facts previously reported on by Rolling Stone: Technology like generative AI has made it easier for people to quickly generate digital forgeries without technological experience; victims of this abuse can potentially experience depression, anxiety, and suicidal ideation; the harms of this abuse are not mitigated through labels depicting the image as fake; and victims do not know how to prevent future abuse.
“I had no idea if the person who did it was near me location-wise, [or] if they were going to do anything else to me,” one survivor told Rolling Stone in our April print issue. “I had no idea if anyone who saw that video was going to try to find me. I was very physically vulnerable at that point.”
The Senate introduced the DEFIANCE Act on Jan. 30, about a week after several AI-generated, sexually explicit deepfakes of Taylor Swift went viral on X. If the bill passes the House, it would become the first federal law to create a civil cause of action for deepfakes. There are currently a few federal statutes that can be used for criminal prosecution of deepfakes depicting minors, but DEFIANCE would allow both adults and minors to sue.
Past legislative efforts by Rep. Yvette Clarke (D-N.Y.) and Rep. Joe Morelle (D-N.Y.) to rein in deepfakes were not successful. Those bills involved criminal penalties, while DEFIANCE focuses on civil legal recourse.
“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” Ocasio-Cortez previously told Rolling Stone. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”