If it's wrong to use AI to put genitals in someone's mouth, it should probably be wrong to use AI to put words in their mouth as well.
Damn.
snaps
I agree, and I get that it's a funny way to put it, but in this case they started the video with a massive disclaimer that they were not Carlin and that it was AI. So it's hard to argue they were putting things in his mouth. If anything, it sets a praiseworthy standard for disclosing when AI was involved, considering the hate mob that revealing it attracts.
The internet doesn’t care though. If I make fake pictures of people using their likeness and add a disclaimer, people will just repost it without the disclaimer and it will still do damage. Now whether or not we can or should stop them is another story.
Completely true. But we cannot reasonably push the responsibility of the entire internet onto someone when they did their due diligence.
Like, some people post CoD footage to YouTube because it looks cool, and someone else, either by mistake or maliciously, takes it and recontextualizes it as combat footage from active warzones to shock people. Then people start reposting that footage with a fake explanation text on top of it, furthering the misinformation cycle. Do we now blame the people sharing their CoD footage for what other people did with it? Misinformation and propaganda are something society must work together to combat.
If it really mattered, people would be out there warning others that the pictures being posted are fake. In fact, even before AI, that's what happened after a tragedy: people would post images claiming to be of what happened, only for those images to later be confirmed as being from some other tragedy. Or how some video games have fake leaks because someone rebranded fanmade content as a leak.
Eventually it becomes common knowledge or easy to prove as being fake. Take this picture for instance:
It's been well documented that the bottom image is fake, and as such anyone can now find out what was covered up. It's up to society to speak up when the damage is too great.
"This case is not just about AI, it’s about the humans that use AI to violate the law, infringe on intellectual property rights and flout common decency."
Well put.
Eh…. I don’t know that I can agree with this.
I understand the intent behind it, but this specific instance is legitimately in parallel with impersonators, or satire. Hear me out.
They are impersonating his voice, using new content in his style, and making no claim to be legitimate.
So this comes down to “this is in bad taste” which, while I can understand and might even agree with… isn’t illegal.
The only novel concept in this, is that “scary tech” was used. There was no fraud, there was no IP violation, and no defamation. Where is the legal standing?
I'm torn. I can see why they would be upset. And they may have a case with likeness rights.
But at the same time, this specific example isn't trying to claim any kind of authenticity. It goes out of its way to explain that it's not George. It seems clearly to be along the lines of satire. No different than an impersonator in an SNL-type sketch.
I guess I don't have any real problem with clearly fake AI versions of things. My only real problem would be with actual fraud. Like the AI Biden making calls trying to convince people not to vote in a primary. That's clearly criminal fraud, and an actual problem.
My only real problem would be with actual fraud. Like the AI Biden making calls trying to convince people not to vote in a primary.
That's the difference between impression and impersonation. My disappointment in the Lemmy community for not understanding the difference is immeasurable. We're supposed to be better than this but really we're no better than Reddit, running with ragebait headlines for the cheap dopamine hit that is the big upvote number.
If it were a human doing a Carlin impression, literally NOBODY would give a fuck about this video.
I've been thinking about this a lot, and if you think about it like they are selling a stolen product, then it can be framed differently.
Say I take several Mega Man games, take a copy of all the assets, and recombine them into a new Mega Man game called "Unreal Tales of MegaMan". The game has whole new levels inspired by Capcom's Mega Man. Many would argue that the work is transformative.
Am I allowed to sell that Mega Man game? I'm not a legal expert, but I think the answer to that would generally be no. My intention here is to mimic a property and profit off of a brand I do not own the rights to.
Generative AI uses samples of original content to create the derivative work, synthesizing the voices of actors. The creator of this special's intention is to make content from a brand that they can solely profit from.
If you used an AI to generate a voice like George Carlin's to voice the Reptilian Pope in your video game, I think you would have a different situation. It's because they synthesized the voice, then called it George Carlin and sold it as a "New Comedy Special" that it begins to fall into the category of bootleg.
You couldn't sell that game, even if you created your own assets, because Mega Man is a trademarked character. You could make a game inspired by Mega Man, but if you use any characters or locations from Mega Man, you would be violating their trademark.
AI, celebrity likeness, and trademark are all new territory, and the courts are still sorting out how corporations are allowed to use celebrities' voices and faces without their consent. Last year, Tom Hanks sued a company that used an AI-generated version of him for an ad, but I think it's still in court. How the courts rule on cases like this will probably determine how you can use AI-generated voices like in your Reptilian Pope example (though in that case, I'd be more worried about a lawsuit from Futurama).
This lawsuit is a little different, though; they're sidestepping the issue of likeness and claiming that the AI is stealing from Carlin's works themselves, which are under copyright. It's more similar to the class-action lawsuit against ChatGPT, where authors are suing because the chatbot was fed their works to create derivative works without their consent. That case also hasn't been resolved yet.
Edit: Sorry, I also realized I explained trademark and copyright very poorly. You can't make a Mega Man game because Mega Man, as a name, is trademarked. You could make a game that has nothing to do with the Mega Man franchise, but if you called it Mega Man you would violate the trademark. The contents of the game (levels, music, and characters) are under copyright. If you used the designs of any of those characters but changed the names, that would violate copyright.
Celebrity likeness is not new territory.
Crispin Glover successfully sued the filmmakers of Back to the Future 2 for using his likeness without permission. Even with dead celebrities, you need permission from their estate in order to use their likeness.
It's worth pointing out here that this script was probably written by a human.
Edit: reporting now indicates that it was human written https://arstechnica.com/ai/2024/01/george-carlins-heirs-sue-comedy-podcast-over-ai-generated-impression/
I'll take Lawyers Maximizing Billable Hours for $500, Alex
What's the alleged crime? Comedy impersonation isn't illegal. And the special had numerous disclaimers that it was an impersonation of Carlin.
Sounds like a money grab by the estate, which Carlin himself probably would have railed on.
Where's the line? Were they parodying Carlin? Or just using his likeness? Can Fox News do this with Biden?
This is a far larger thing than just a comedy impersonation.
It's something the law isn't equipped to handle as written.
And fear of things for which no law can be readily imagined, taken to their extremes, is how I got my current attitude toward everything legal.
About the event itself - well, I suppose Carlin himself would be amused by the fact.
Whether it's presented as real seems a reasonable line to me.
Fox News could not use it to mislead people into thinking Biden said something that he did not, but parody like "Sassy Justice" from the South Park creators (using a Trump deepfake) would still be fine.
Fox News could run it with every disclaimer out there and it would still get picked up by every other conservative channel and site as legitimate.
This is why likenesses are protected.
What do you mean by "comedy impersonation" - parody, or just copying a comedian?
If I were to set up a music show with a Madonna impersonator and slightly changed Madonna songs (or songs in her style), I'd get my pants sued off.
If Al Yankovic does a parody of a Madonna song, he's in the clear (He does ask for permission, but that's a courtesy and isn't legally mandatory).
The legal term is "transformative use". Parody, like when SNL has Alec Baldwin impersonating Trump, is a recognized type of transformative use. Baldwin doesn't straight up impersonate Trump; he does so in a comedic fashion (the impersonation itself is funny, regardless of how funny Trump is). The same logic applies when parodying or impersonating a comedian.
Internet: this is awful, of course your inheritors own your image as stewards.
Also Internet: I have a right to take pictures of you, your car, your house, or record you without consent. Edit it however I want. Make as much money as I want from the activities, and you have no rights. After all, if technology allows me to do something, you have no expectation that I won't.
We are demanding that a public figure who is dead have more rights than a private person who is alive.