this post was submitted on 13 Jan 2024
580 points (95.9% liked)

People Twitter

5367 readers
1759 users here now

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a tweet or similar.
  4. No bullying or international politics.
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

founded 2 years ago
[–] gastationsushi 82 points 11 months ago (3 children)

Are we sure it's AI?

I've heard of this scam happening maybe a decade ago with my extended family. The voice was a real person overseas with a lot of experience tricking grandparents. The scammers only had basic information.

They act like a freaked-out kid, and the victim gets roped in. They scam for thousands of dollars each time; even succeeding a few times a day would net a big profit. Cell connections are also low fidelity, which I bet aids their ability to trick the victim.

[–] TORFdot0 19 points 11 months ago (1 children)

Yeah, this happened to my grandparents. The scammers just say, “I sound like shit because I’ve been crying.”

[–] [email protected] 13 points 11 months ago

Same. Years ago my grandfather received a call from a guy claiming to be my younger, male cousin saying he was in jail for something and needed bail. Luckily (?), my grandfather was an asshole and told him to call his mother.

[–] [email protected] 7 points 11 months ago

Yeah, my dad called me one day asking if my brother was out of the country, because our grandma got a call saying he'd been kidnapped in Mexico and she needed to put up money for his release.

It's wild.

[–] slimarev92 57 points 11 months ago (5 children)
[–] AFaithfulNihilist 18 points 11 months ago (3 children)

I don't know, the "Spanish prisoner" is a scam that seems to be reinvented every few years every time we see a little bit of a change in technology. It wouldn't take much to fake a person's voice with a trained model, especially if that person has an online profile open to the public where they post content in their own voice.

[–] [email protected] 13 points 11 months ago (1 children)

Yeah, it has some sus vibes. I'm usually far too trusting, but even my bullshit detector went off here.

[–] Mr_Blott 14 points 11 months ago

You know that old adage "Never attribute to malice that which can be easily explained by stupidity"?

We need a new one along the lines of "Never attribute to truth that which can be easily explained by attention-starved teenagers"

[–] [email protected] 42 points 11 months ago (1 children)

Where did they get their voice?

[–] [email protected] 63 points 11 months ago (3 children)

You can train an AI with just a single voice clip, and you can do it on your desktop. Microsoft doesn't need to sell shit; you put that clip on TikTok yourself.

[–] [email protected] 12 points 11 months ago (2 children)

You don’t even need to upload anything. They can call you, have a short convo and then just say “oh sorry wrong number” or something. You’d never know.

[–] [email protected] 8 points 11 months ago (1 children)

Well, they said they don't share their voice anywhere; if that's true, it would be concerning. I for one just don't use any centralized, unencrypted services that could scrape my voice, but I would assume most people think that if they don't publish anything, they're safe...

[–] overzeetop 5 points 11 months ago (1 children)

You don’t talk to anyone on the phone through a PBX? Never call your bank? Your doctor? Your credit card company? Any of your insurance companies? Even on private systems, all of those calls are recorded for legal reasons. And all of them will eventually be compromised.

[–] [email protected] 8 points 11 months ago (11 children)
[–] [email protected] 50 points 11 months ago (2 children)

The 'old' way of faking someone's voice, like you saw in 90s spy movies, was to get enough sample data to capture each possible speech sound someone could make, such that those sounds could be combined to form all possible words.

With AI training you only need enough data to know what someone sounds like 'in general' to extrapolate a reasonable model.
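The contrast can be sketched in a toy snippet. This is purely illustrative (the phoneme names follow ARPAbet loosely, and the "recordings" are placeholder strings, not audio): the old concatenative approach can only say words whose every sound was captured, which is why it needed so much sample data.

```python
# Toy sketch of 90s-style concatenative synthesis. The "recordings" are
# placeholder strings standing in for recorded units of each speech sound.
PHONEME_BANK = {
    "HH": "[hh.wav]", "EH": "[eh.wav]", "L": "[l.wav]", "OW": "[ow.wav]",
}

def concatenative_tts(phonemes):
    """Stitch pre-recorded units together; fail if any sound is missing."""
    missing = [p for p in phonemes if p not in PHONEME_BANK]
    if missing:
        raise KeyError(f"no recording for: {missing}")
    return "".join(PHONEME_BANK[p] for p in phonemes)

# "Hello" works because every unit was captured:
print(concatenative_tts(["HH", "EH", "L", "OW"]))  # [hh.wav][eh.wav][l.wav][ow.wav]
# But "world" would raise KeyError until W, ER, and D units are recorded too.
```

A modern model, by contrast, has already learned what speech in general sounds like, and a short clip only tells it whose voice to imitate.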

One possible source of voice data is spam-calls.

You get a call, say "Hello?" and then someone launches into trying to sell you insurance or some rubbish. You say, "Sorry, I'm not interested, take me off your list please. Okay, bye" and hang up.

And that is already enough data to replicate your voice.

When scammers make the call using your fake voice, they usually use a crappy quality line, or background noise, or other techniques to cover up any imperfections in the voice replica. And of course they make it really emotional, urgent and high-stakes to override your family member's logical thinking.

Educating your family to be prepared for this stuff is really important.

[–] [email protected] 13 points 11 months ago

Clutch explanation. I just sent a screenshot of your comment to my folks.

[–] Hildegarde 7 points 11 months ago (1 children)

Soo.... use a funny voice when answering the phone? Poison the data

[–] [email protected] 11 points 11 months ago

Keep a helium balloon on you at all times, just in case.

[–] [email protected] 38 points 11 months ago (2 children)

When I was a kid, my parents had "the talk" with me. It was about sex. Now I'm older and my parents are too. I have to have "the talk" with them. It's about scams.

[–] theangryseal 12 points 11 months ago (1 children)

My uncle got divorced a few years back and it nearly crushed him. He was a ridiculously handsome, successful young man, so women chased him. At any point when he was younger, he had at least a handful of women actively pursuing him. Now he was older and divorced, and those women were long gone, all having married and carried on with their lives. He didn’t expect to struggle with dating like he did, and that made the whole thing even harder.

I set him up on all of the big dating sites. I didn’t know how bad it was, I’d never used them.

He was talking to at least 10 scammers a day, probably more.

He’s kind of a miser so no one was going to get any of his money, but his hobbies showed his wealth and oh boy did they try.

It was so bad that he gave up on the dating sites entirely. He’s had a few girlfriends since then but he only met one person in over a year on the dating sites.

It blows my mind just how many people are out there making a living scamming people.

[–] [email protected] 7 points 11 months ago (1 children)

The sad thing is that, in the current era, virtually all dating sites are scams riddled with bots, and have been for over a decade. Their goal is to make money, not produce matches.

[–] theangryseal 8 points 11 months ago

He really struggled with it.

I’d do a reverse image search and find the actual person, he’d thank me and move on. Some of them got in his head. One even faked a Skype call with a video of a beautiful woman and somehow scouted his Facebook profile to really dive deep into his personality. The video was like 13 pixels and she’d say, “I’m sorry, I live way out in the wilderness and we have bad satellite internet.”

He said, “She’s too good to be true. No one is this agreeable.” I told him to ask her to make a specific gesture because of all the scammers. He did; he asked her to hold her hands above her head in the shape of a triangle. She refused and said something like, “I can’t believe you don’t trust me. That breaks my heart. You know me.” She stopped talking to him, then a couple weeks later she messaged, “I’m so sorry, my mom is in the hospital and I have no money to eat. I wouldn’t ask you but I have been alone so long you’re all I know.” He told her that if she was that desperate, she could prove who she was and he’d help. Nope. Nothing. Radio silence. That one really hurt him. Whoever it was played the scam game real well.

You might not believe this, but I have a cousin (my mom’s first cousin, actually) who fell in love with her scammer. He conned her out of thousands of dollars and turned out to be from Nigeria when he’d said he was from somewhere else. About 6 months in he said, “Listen. I am not who I have said I am. My real name is John, I am truly in love with you. I know you and I want you to really know me.”

She was in her 50s, he was in his 30s. She was not an attractive woman. She was short, fat, walked with a limp and was born with physical deformities (her nickname to us kids was “old bat”, playfully of course). He wasn’t attractive either, but still: 20 years younger, from another country, and had, I don’t know, scammed her. My mom tried her best to stop it, but she flew to Nigeria and married the dude. She stayed over there a few months and then he flew back with her. He stayed with her about 4 years, I guess, and he ran around with any woman who would have him. He finally drained her money and rolled out.

The last thing I heard about him, he was arrested for breaking into the grocery store he worked at and robbing the safe.

It is crazy to me just how much money can motivate people to do absurd and crazy things. I can’t relate at all. I’ve been broke as shit and all I had to do was sell a few things to get back on track and I couldn’t motivate myself to even do that. Money just doesn’t mean enough to me to go to any trouble to get it haha.

[–] [email protected] 34 points 11 months ago (4 children)

Uh, there's zero chance these big tech companies are selling voices like this. Also, this sounds very targeted and planned, so there must be more context to this. And why the hell are they on Bluesky?

[–] [email protected] 14 points 11 months ago* (last edited 11 months ago) (1 children)

They're on BlueSky because there are more people there who would actually believe this total load of bullshit. Plenty of scams claim to be a vague family member (no names, just "son" or "daughter," "aunt" or "uncle"), but it's highly unlikely they're getting an AI to mimic your voice. Just like the vagueness of the family member they claim to be, the voices used may coincidentally sound similar.

[–] [email protected] 5 points 11 months ago (1 children)

Because it's probably the best Twitter alternative.

[–] [email protected] 5 points 11 months ago (3 children)

Not even close to Mastodon

[–] ZombieTheZombieCat 4 points 11 months ago

I wish I were this naive

[–] [email protected] 33 points 11 months ago (2 children)

All those TV shows that taught us how to spot which twin was the evil one by asking about life history were just training us to beat AI

[–] lastweakness 7 points 11 months ago (1 children)

Given the sheer scale of data collection now, I doubt even that will protect us in the future.

[–] [email protected] 33 points 11 months ago* (last edited 11 months ago) (3 children)

The only way to train an AI voice model is with lots of samples. As scummy as they are, neither Microsoft nor Apple is selling your voice recordings with enough info to link them to you specifically. This person probably just forgot about an old social post where they talk for long enough for a model to be trained. Still super scary stuff.

[–] [email protected] 34 points 11 months ago* (last edited 11 months ago)

Not true anymore. You can create a reasonable voice clone with like 30 seconds of audio now (11labs, for example, doesn't do any kind of verification that the voice is yours). The results are good enough for this kind of thing, especially in a lower-bandwidth situation like a phone call.

[–] nifty 13 points 11 months ago* (last edited 11 months ago)

This person probably just forgot about an old social post…

Or recordings made during customer service calls, maybe a disgruntled employee decides to repurpose the data.

[–] [email protected] 6 points 11 months ago* (last edited 11 months ago)

True for creating voices from scratch, but that work has already been done.

Now we're just taking these large AIs trained to mimic voices and giving them a 30-second audio clip to tell them whom to mimic. It can be done quickly and gives convincing results, especially when hidden by phone-call quality.

[–] sleepmode 31 points 11 months ago (3 children)
[–] VelvetStorm 17 points 11 months ago

I don't have one. The automated message just reads my number and tells you to leave a message.

[–] [email protected] 9 points 11 months ago

People still set that?

[–] [email protected] 5 points 11 months ago

My voicemail is just me going "BRRRRRRRKZM" or some sort of screaming distorted gibberish

[–] [email protected] 28 points 11 months ago

You don't even need to fake a voice for these scams, though; it's very difficult to differentiate a voice while the speaker is crying.

[–] Zeshade 24 points 11 months ago (4 children)

Do a lot of people put their voice on the internet "as much as they're able to"? It sounds like that person may post their voice online more than the average person...

[–] [email protected] 5 points 11 months ago

Discord just automatically opted everyone in to having their voice recorded for clips.

[–] [email protected] 4 points 11 months ago

If you post videos of yourself online, your voice will be caught. Whether social media or public presence as a streamer. I’m pretty sure that’s all they meant.

[–] LaunchesKayaks 22 points 11 months ago

My grandmother got a call from scammers pretending to be me. They didn't use my name, but I was the only adult granddaughter at the time lol. Anyway, the scammers said that they needed money for hospital bills and a bus ticket home. They said they got into a fight at a friend's funeral in New Jersey and had to go to the hospital. And then after that their car got stolen. My grandmother knew that I was not in New Jersey, and told the scammers that she'd call them back once she got to the bank. She then informed my parents, who told me. It was hilarious.

[–] hperrin 19 points 11 months ago

Seriously, it was only one person, and it was with a golf cart.

[–] DirkMcCallahan 18 points 11 months ago (1 children)

Waiting for the comment that's going to say something like, "Joke's on you, my parents don't even talk to me."

[–] [email protected] 34 points 11 months ago (1 children)

My parents probably do not talk to you either.

[–] [email protected] 15 points 11 months ago (1 children)

I advise everyone to contact their loved ones and inform them of this possibility. I also advise agreeing on a code word to be used if there's an emergency and money needs to be sent.

For example: if more than $100 is being asked for, we have to share the code word or we won't transfer the money.
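The code-word rule above is just a threshold check plus a comparison. A minimal sketch, where the code word and the $100 threshold are made-up placeholders (pick your own in person, never over the phone):

```python
import hmac

CODE_WORD = "donkeyballs"  # placeholder; agree on yours face to face
THRESHOLD = 100            # dollars; requests at or below this skip the check

def approve_transfer(amount, spoken_word):
    """Refuse any request over the threshold unless the code word matches."""
    if amount <= THRESHOLD:
        return True
    # constant-time compare, so a partial guess leaks nothing
    return hmac.compare_digest(spoken_word.lower(), CODE_WORD)

print(approve_transfer(50, ""))                   # True: small amount, no word needed
print(approve_transfer(5000, "donkeyballs"))      # True: code word matches
print(approve_transfer(5000, "it's really me!"))  # False: hang up and call back
```

The point isn't the code, it's the protocol: a pre-shared secret that a voice clone cannot know, no matter how convincing it sounds.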

[–] diffcalculus 6 points 11 months ago (4 children)

The code word is Donkeyballs

[–] [email protected] 9 points 11 months ago

Tbh, it’s not that hard to stop these scams: treat EVERY call you get as a potential scammer.

Either phone back on a known number, not some number they give you, or, if they claim you need to post bail, ask for a reference number and the place where the person is being held, then look up that place’s number yourself and call it. If they get pissed, it’s a scam!

No real police force is going to care if you call back; it’s not like cops get a percentage of the bail money. But scammers always seem desperate to get you to pay and lose their patience pretty quick.
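The call-back rule boils down to one invariant: only ever dial a number you looked up yourself, never one supplied during the call. A minimal sketch (the contact names and numbers are invented):

```python
# Numbers YOU looked up beforehand: saved contacts, the official website, etc.
KNOWN_NUMBERS = {
    "son": "+1-555-0101",
    "county jail": "+1-555-0199",
}

def number_to_call_back(claimed_identity, number_given_by_caller):
    """Ignore whatever number the caller supplies; use your own records."""
    del number_given_by_caller  # deliberately unused: it cannot be trusted
    # None means "can't verify": treat it as a scam and hang up.
    return KNOWN_NUMBERS.get(claimed_identity.lower())

print(number_to_call_back("Son", "+1-555-0000"))        # +1-555-0101
print(number_to_call_back("IRS agent", "+1-555-0000"))  # None
```

Discarding the caller-supplied number is the whole trick: a scammer controls everything said on the call, but not your contact book.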
