this post was submitted on 13 Jan 2024
580 points (95.9% liked)
People Twitter
5367 readers
1611 users here now
People tweeting stuff. We allow tweets from anyone.
RULES:
- Mark NSFW content.
- No doxxing people.
- Must be a tweet or similar
- No bullying or international politics
- Be excellent to each other.
- Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.
founded 2 years ago
you are viewing a single comment's thread
The 'old' way of faking someone's voice, like you saw in 90s spy movies, was to collect enough sample data to capture every possible speech sound a person could make, so that those sounds could be combined to form any word.
With AI training, you only need enough data to capture what someone sounds like 'in general', and the model extrapolates the rest.
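The 'old' approach can be sketched in a few lines. This is a toy illustration, not real audio processing: the phoneme 'recordings' are stand-in lists of numbers, and the phoneme names are just assumed labels for this example.

```python
# Toy sketch of concatenative synthesis: a bank of recorded speech
# sounds (phonemes) is stitched together to form any word. The samples
# below are placeholders, not actual audio data.

PHONEME_BANK = {
    "HH": [0.1, 0.2],  # stand-in samples for the 'h' sound
    "EH": [0.3, 0.4],
    "L":  [0.5, 0.6],
    "OW": [0.7, 0.8],
}

def synthesize(phonemes):
    """Concatenate the stored snippet for each phoneme into one waveform."""
    waveform = []
    for p in phonemes:
        waveform.extend(PHONEME_BANK[p])
    return waveform

# "Hello" is roughly HH-EH-L-OW; any word the bank covers can be built
# the same way, which is why the old method needed every sound on file.
hello = synthesize(["HH", "EH", "L", "OW"])
```

The catch is exactly what the comment describes: the bank has to contain every sound the speaker can make, which takes a lot of recordings, whereas a trained model generalises from far less.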
One possible source of voice data is spam-calls.
You get a call, say "Hello?", and someone launches into trying to sell you insurance or some rubbish. You say "Sorry, I'm not interested, take me off your list please. Okay, bye" and hang up.
And that is already enough data to replicate your voice.
When scammers make a call using your fake voice, they usually use a crappy-quality line, background noise, or other tricks to cover up any imperfections in the replica. And of course they make the call emotional, urgent, and high-stakes to override your family member's logical thinking.
Educating your family to be prepared for this stuff is really important.
Clutch explanation. I just sent a screenshot of your comment to my folks.
Soo.... use a funny voice when answering the phone? Poison the data
Keep a helium balloon on you at all times, just in case.