AI may be a buzzword on Wall Street, but on the West Coast it’s at the center of Hollywood’s biggest labor dispute in more than 50 years. Among those warning about the technology’s potential to cause harm is British actor and author Stephen Fry, who told an audience at the CogX Festival in London on Thursday about his personal experience of having his identity digitally cloned without his permission.

“I’m a proud member of [actors’ union SAG-AFTRA], as you know we’ve been on strike for three months now. And one of the burning issues is AI,” he said.

Actors’ union SAG-AFTRA, which has around 160,000 members, went on strike in July over pay, working conditions, and concerns related to the use of AI in the film industry. It joined the Writers Guild of America—a union representing thousands of Hollywood writers—which went on strike in early May, marking the industry’s biggest shutdown in more than six decades.

A key sticking point for actors on strike is the possibility that studios could use AI to digitally replicate their image without compensating them fairly for the use of their likeness.

Speaking at a news conference as the strike was announced, union president Fran Drescher said AI “poses an existential threat” to creative industries and that actors needed protection from having “their identity and talent exploited without consent and pay.”

During his speech at the CogX Festival on Thursday, Fry played the audience a clip of an AI system mimicking his voice to narrate a historical documentary.

“I said not one word of that—it was a machine. Yes, it shocked me,” he said. “They used my reading of the seven volumes of the Harry Potter books, and from that dataset an AI of my voice was created and it made that new narration.”

Fry—who has appeared in movies including Gosford Park, V for Vendetta, and The Hitchhiker’s Guide to the Galaxy—is the narrator of the British Harry Potter audiobooks, while actor Jim Dale narrated the American version of the series.

“What you heard was not the result of a mash-up; this is from a flexible artificial voice, where the words are modulated to fit the meaning of each sentence,” Fry said.

“It could therefore have me read anything from a call to storm parliament to hard porn, all without my knowledge and without my permission. And this, what you just heard, was done without my knowledge. So I heard about this, I sent it to my agents on both sides of the Atlantic, and they went ballistic—they had no idea such a thing was possible.”

Fry added that when he discovered his voice was being used in projects without his consent, he saw it as just the beginning of an emerging threat to creative talent, warning his angry agents: “You ain’t seen nothing yet.” “This is audio,” he said he told them. “It won’t be long until full deepfake videos are just as convincing.”

As AI technology has advanced, doctored footage of celebrities and world leaders—known as deepfakes—has been circulating with increasing frequency, prompting warnings from experts about the risks of artificial intelligence. Fry warned on Thursday that those technologies had much further to go.

“We have to think about [AI] like the first automobile: impressive but not the finished article,” he said, noting that when cars were invented, no one could have envisioned how widespread they would become.

“Tech is not a noun, it is a verb, it is always moving,” he said. “What we have now is not what will be. When it comes to AI models, what we have now will advance at a faster rate than any technology we have ever seen. One thing we can all agree on: it’s a f***ing weird time to be alive.”

Not the first

Fry isn’t the only famous actor to publicly voice concerns about AI and its place in the film industry.

At a U.K. rally held in support of the SAG-AFTRA strike over the summer, Emmy-winning Succession star Brian Cox shared an anecdote about a friend in the industry who had been told “in no uncertain terms” that a studio would keep his image and do what they liked with it.

“That is a completely unacceptable position,” Cox said. “And that is the position that we should be really fighting against, because that is the worst aspect. The wages are one thing, but the worst aspect is the whole idea of AI and what AI can do to us.”

Oscar winner Matthew McConaughey told Salesforce CEO Marc Benioff during a panel event at this year’s Dreamforce conference that he had concerns about the rise of AI in Hollywood.

“We have a real chance, if we are irresponsible, of cannibalizing ourselves and creating this digital god that we’ll bow to, and we’ll all of a sudden become tools of this tool,” he said.

Meanwhile, Star Trek and Mission: Impossible star Simon Pegg has called AI “worrying” for actors.

“We’re looking at being replaced in some ways,” he said at the rally in London in July. “We have to be compensated and we have to have some say in how [our image is] used. I don’t want to turn up in an advert for something I disagree with… I want to be able to hang on to my image, and voice, and know where it’s going.”

A spokesperson for the Alliance of Motion Picture and Television Producers (AMPTP), the entertainment industry’s official collective bargaining representative, was not available for comment when contacted by Fortune.

[–] flossdaily 166 points 1 year ago (16 children)

The much, much, much more concerning aspect of voice cloning technology is that it will be used to scam people on a massive scale.

Imagine you get a call at 4am from a loved one who tells you that they are in an emergency and had to borrow a phone to call you. They beg you to Venmo some money to a stranger's account so that they can get their car fixed/get a plane ticket/pay someone back for giving them a lift/etc.

You recognize your loved one's voice. They can respond to your questions (because chatbot AI). They know details about your life (because social media). It's the middle of the night. You're scared and not thinking clearly.

This technology all exists TODAY. In 10 or 20 years it'll be so terrifyingly sophisticated, even the most wary people will be vulnerable to it.

[–] [email protected] 62 points 1 year ago (2 children)

Easy solution, don't have any loved ones. Checkmate scam artists

[–] [email protected] 37 points 1 year ago (3 children)

"Hey it's ur son I've been arrested in Mexico"

"Well good then".

[–] [email protected] 10 points 1 year ago

My uncle was recently arrested in another state. We had a similar reaction.

[–] kittyjynx 5 points 1 year ago

Someone tried to scam my grandpa with that. He told "me" to enjoy rotting in jail, then called me up to ask how jail was.

[–] Buddahriffic 4 points 1 year ago

"It's about time they caught you! Oh wait, they don't want the reward money, do they? Ah fuck it, they can't unarrest you, tell them to get fucked!"

[–] FinalRemix 7 points 1 year ago

Or if you do, make sure none of 'em are dumb enough to rely on "cash apps" like Venmo. Even Zelle, through our bank, is suspicious as shit.

[–] [email protected] 50 points 1 year ago (1 children)

That's why I do like Gilbert Gottfried and do two voices: one in public and one for friends and family.

It gets confusing when we dine outside.

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago) (2 children)

😂 Wtf does that guy sound like at home? Posh mid-Atlantic accent or some shit? I'm so curious now.

[–] [email protected] 12 points 1 year ago (1 children)
[–] [email protected] 2 points 1 year ago

He passed away last year. I personally wouldn't call that long dead.

[–] Rebels_Droppin 8 points 1 year ago

There's a Howard Stern clip of Gilbert's "normal" voice on his voicemail.

[–] MossBear 18 points 1 year ago

They can't get me if I live in a hole. Not a nasty, dirty, wet hole, filled with the ends of worms and an oozy smell, nor yet a dry, bare, sandy hole with nothing in it to sit down on or to eat: but a hobbit-hole, and that means comfort.

[–] RGB3x3 15 points 1 year ago (2 children)

The solution that EVERYBODY needs to learn for something like that is to hang up and call them back using the contact you have in your phone. They can afford 10 seconds while you do that if they're calling you for money. And if it isn't them calling for money, well, sorry for waking you up, Frank, but an AI was posing as you asking for cash.

[–] [email protected] 10 points 1 year ago (2 children)

That, and a family or per-person verification word or protocol or something.

"Clumsy..."

"Draconiquist!"

[–] [email protected] 5 points 1 year ago

Thank you, your suggestion has been added to the training data.

/s

[–] [email protected] 3 points 1 year ago (1 children)

"Oh my god, agent_flouder, I was just in a car accident and they need the bank info to process the co-insurance so I can get the organ transplant that is expiring in minutes! I've lost a lot of blood, have a concussion and have forgotten our code word. PLEASE don't do this right now or this might be the last time you hear from me..."

In the capitalist hellscape that is the US, that isn't that far-fetched, and with emotions running high, I don't think it's unlikely to work. On the other hand, I can see a news article that reads, "Man lets daughter die by refusing hospital critical information needed for transplant."

[–] [email protected] 2 points 1 year ago (1 children)

Lawl they don’t do insurance shit for emergencies like needing an organ or blood immediately, they deal with that shit after the operation

[–] [email protected] 3 points 1 year ago (1 children)

Yeah but this is America. And do you think the average person knows that, will remember it when their loved one calls them crying, and will have the temerity to actually refuse when there's a time constraint?

[–] [email protected] 3 points 1 year ago
[–] [email protected] 3 points 1 year ago (1 children)

Unless they're calling from the hospital, the police station, a borrowed cell phone after a car accident, etc.

[–] FinalRemix 3 points 1 year ago* (last edited 1 year ago)

Then you call THAT number. The station's non-emergency number, the hospital, etc.

[–] _number8_ 8 points 1 year ago

yet another reason to never answer the phone

[–] [email protected] 7 points 1 year ago

Ah, that'll be our age's equivalent of what spam emails were for the age before.

[–] [email protected] 6 points 1 year ago (2 children)

Easy peasy. Tell them okay. Then hang up and call that loved one using your record of his/her number. Confirm.

[–] Its_not_Dave 5 points 1 year ago (4 children)

In the referenced scenario they had to borrow a phone to call you.

Presumably their phone is out of battery, broken, stolen, or they're in another country without service.

[–] FlexibleToast 14 points 1 year ago (1 children)

So that means the "real" person definitely won't answer their phone, right? That's still useful for trying to confirm someone is who they say they are.

[–] [email protected] 5 points 1 year ago (1 children)

Yeah, it has the flaw that at that hour the real person might not answer, though (if they shut their phone down or muted it or whatever). But yeah, that's the common approach.

[–] FlexibleToast 6 points 1 year ago

It's not perfect, but it's something.

[–] [email protected] 4 points 1 year ago

Deviant had this fascinating/awful video about this kind of situation: https://www.youtube.com/watch?v=6ihrGNGesfI

Here is a (really) top infosec expert saying that when someone you know is in jail, you absolutely have to turn off all your call filtering and spam filtering, because who knows what shitty system the facility they've been moved to today is using to route calls.

[–] [email protected] 3 points 1 year ago

Well since it's a scam, it means that the phone is really not broken or out of commission.

Not readily believing anything unusual or out of the ordinary will save you in the future.

Call just to check. If the phone is unreachable, call someone close to that person: a wife, a son/daughter. Just think: if they have a wife or kids, why would they call you in the first place?

Sometimes it's good to be not overly trusting.

[–] Buddahriffic 2 points 1 year ago

Or even better, conference them in to that call. Then get them to debate each other about which one is real.

[–] [email protected] 6 points 1 year ago

Scam them out of what?

90% of the world will be unemployed and fighting for whatever scraps of food are grown between the constant flooding and fires that we had 100 years to prevent but didn't, because it wasn't profitable.

We will have to learn to live entirely without factual information as every form of communication becomes hopelessly compromised by corporations, governments and billionaire extremists.

You won't even be able to trust the people you meet in meatspace. You think Fox News addicts are fucked up? Wait until every piece of entertainment is propaganda that's been personalised just for you.

And when it inevitably turns to war and you're put in charge of the big red button, will you even care if the order to press it has been deep faked by a death cult?

Unemployment and petty scams are small fry. With this technology, we can end the world.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

I've had friends fall for email scams. We're from Canada, and I remember one claiming that another friend of ours was stuck in Wales and needed a bank transfer. I knew it was a scam, but a few of my friends were worried. I said that I live in the UK now and can take a train to Wales if you REALLY want. They were still panicking and saying they should do it just in case. I'm like, you can't be serious!

So yeah, it can already be bad, and with ChatGPT they can pass the Turing test. All of our friends will probably test each other on our memories: "Tell me the name of your ex-gf, and which year and how you two broke up again?"

[–] [email protected] 2 points 1 year ago

Most people don't know what their loved ones sound like on the phone. This scam already exists, and you should never believe someone calling you like that. You can ask them something only you two know, or just tell them to call the police and that you'll meet them at the police department or hospital or whatever. Never give out credit card info etc. over the phone. Nobody would ever do that in a legit situation.

[–] SCB 2 points 1 year ago (1 children)

Easy enough to teach people to just hang up and call their friend.

[–] kmkz_ninja 1 points 1 year ago

Physical two-factor authentication. Have a code word for your kids to tell you, or for you to tell them.
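
To make a shared code word hold up even if a scammer has recorded someone saying it, you can turn it into a challenge-response check instead of speaking the word itself. Here's a minimal sketch in Python; the HMAC-based scheme, the helper names, and the example secret are all just my own illustration of the idea, not anything from the thread:

```python
# Rather than saying the secret aloud (where a scammer could record and
# replay it), both sides derive a short one-time response from the secret
# plus a fresh challenge. Everything here is illustrative, not a spec.

import hashlib
import hmac
import secrets

FAMILY_SECRET = b"draconiquist"  # agreed in person, never spoken on a call

def make_challenge() -> str:
    """The person receiving the call picks a fresh random challenge."""
    return secrets.token_hex(4)  # e.g. "9f3a1c2b", easy to read out loud

def respond(challenge: str, secret: bytes = FAMILY_SECRET) -> str:
    """The caller proves they know the secret without revealing it."""
    digest = hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()
    return digest[:6]  # short enough to read back over the phone

def verify(challenge: str, response: str, secret: bytes = FAMILY_SECRET) -> bool:
    """The receiver recomputes the expected response and compares safely."""
    return hmac.compare_digest(respond(challenge, secret), response)

# Receiver reads out a challenge; caller reads back respond(challenge).
challenge = make_challenge()
assert verify(challenge, respond(challenge))
```

The point is that the secret never crosses the untrusted channel: the caller only reads out a short response that's useless on any future call, since the next challenge will be different.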