this post was submitted on 05 Jul 2023
124 points (98.4% liked)

Asklemmy

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

Forums like this may die, but I believe chat platforms like Matrix, Discord, and Slack will come out on top.

Anything with voice chat. I think we're still a little ways off from AI being able to simulate a spoken conversation in real time. The API delay with these AIs is what gives them away.
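
Just to illustrate what I mean by the delay: even something as crude as timing how long the other side takes to start answering would be a first-pass tell. This is a totally made-up sketch; the callbacks and the 2-second threshold are placeholders, not anything researched:

```python
# Toy sketch of the "API delay" tell: time how long the other side takes
# to start replying. send_message and wait_for_first_reply are placeholder
# callbacks, and the 2 s threshold is a guess, not a researched number.
import time

def reply_delay(send_message, wait_for_first_reply, prompt="quick one: what's 2+2?"):
    """Send a prompt and return the seconds until the first reply arrives."""
    start = time.monotonic()
    send_message(prompt)
    wait_for_first_reply()
    return time.monotonic() - start

def smells_like_an_api(delays, threshold=2.0):
    """Crude flag: every reply is slow, across at least a few exchanges."""
    return len(delays) >= 3 and all(d > threshold for d in delays)
```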

Once you've talked with someone, you know they're real. And if you really wanted to confirm that the people in your community are real, you could do voice-chat vetting.

[–] [email protected] 5 points 1 year ago (1 children)

There are already convincing AI voice phone scams: https://www.npr.org/2023/03/22/1165448073/voice-clones-ai-scams-ftc

This will likely get significantly easier, cheaper, and faster in the very near future. Voice generation is relatively easy. We're going to need a whole new class of captchas and shibboleths to use online, but honestly, it's such a fast-moving target that I think cutting-edge AI will forever be a step ahead. I think the best we can hope for is to have viable countermeasures for commoditized AI techniques. For now that might include logic problems (which ChatGPT and its current competitors are quite bad at) but I'm sure the big players already have more advanced language bots in development.
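
To make the logic-problem idea a bit more concrete, here's the kind of throwaway text challenge I'm picturing. Pure sketch, the puzzle template is invented for illustration, and frankly the current big models might already breeze through it:

```python
# Sketch of a text-only "logic captcha". The template is invented purely
# for illustration; treat it as a moving target, not a real defense.
import random

NAMES = ["Ana", "Bram", "Chi", "Dev", "Eli"]

def make_challenge():
    """Return a tiny ordering puzzle and its expected answer."""
    a, b, c = random.sample(NAMES, 3)
    question = (f"{a} is taller than {b}, and {b} is taller than {c}. "
                "Who is the shortest?")
    return question, c

def check_answer(expected, reply):
    return expected.lower() in reply.lower()

if __name__ == "__main__":
    question, answer = make_challenge()
    print(question)
    print("correct!" if check_answer(answer, input("> ")) else "nope")
```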

I reallllly hate the idea of online IDs but it might be the only way.

[–] [email protected] 1 points 1 year ago (1 children)

Convincing someone for the purposes of a scam is one thing; convincing someone you're having a genuinely thought-out conversation, with inflection, emotion, and logic that all make sense, is another.

If we get to that point, the system as we know it will be over anyway.

[–] [email protected] 1 points 1 year ago

I remember a news story some years back about a chatbot passing the Turing test. The researchers had their chatbot impersonate a young Russian boy, which made its limitations harder for the native-English-speaking test subjects to identify as non-human. So it wasn't actually that impressive.

That will likely be the first kind of thing we'll see for an artificial voice-chatbot as well. It's a big world and many of the people I talk with on Discord (and even IRL) are not native English speakers and not from my country.

I'm not intimately familiar with the accents and speech patterns from everywhere in the world, so I'm conditioned to shrug off a lot of "strange" language. Because of this wide range of human speech patterns, I'm not confident that I could validate voices with a low enough false-positive and false-negative rate in practice.
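
To put rough numbers on why that worries me: even with error rates that sound decent on paper, a mostly-human community means most of the people a vetting call rejects are real. Every number below is invented just to show the shape of the problem:

```python
# Back-of-the-envelope on voice-vetting error rates. All numbers are
# made up purely for illustration.
bot_rate = 0.02        # assume 2% of new joiners are bots
miss_rate = 0.05       # bots that slip past vetting (false negatives)
reject_rate = 0.05     # real humans wrongly rejected (false positives)

joiners = 10_000
bots = joiners * bot_rate              # 200
humans = joiners - bots                # 9,800

print(f"bots that pass vetting:  {bots * miss_rate:.0f}")      # ~10
print(f"humans wrongly rejected: {humans * reject_rate:.0f}")  # ~490
```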

I haven't really dug into the latest voice generation AI yet so I'm not sure how capable off-the-shelf programs are. I am familiar with the general techniques, though, and I think adding realistic inflection is within reach. I don't think it's possible to automate the entire pipeline yet, at least not with publicly available programs, but the field is advancing quickly so I can't take much solace in that.