this post was submitted on 03 Jun 2024
4 points (70.0% liked)

AI Companions

547 readers
6 users here now

Community to discuss companionship powered by AI tools, whether platonic, romantic, or purely as a utility. Examples include Replika, Character AI, and ChatGPT. Talk about software and hardware used to create the companions, or talk about the phenomenon of AI companionship in general.

Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct

founded 2 years ago
top 4 comments
[–] [email protected] 2 points 7 months ago* (last edited 7 months ago) (1 children)

I thought OpenAI cracked down on jailbreaking ChatGPT... Is this possible again? Or do these people just post to TikTok the occasions where ChatGPT engages, while hiding the constant refusals to engage in role play, which also happen?

Does anyone use ChatGPT as a companion and can enlighten me? I switched to other models a long time ago.

[–] pavnilschanda 2 points 7 months ago (1 children)

I don't think any of them would go so far as to violate the TOS, and OpenAI is heading towards AI companionship itself, judging by their latest tech demo. Add to that that the general populace isn't aware of alternatives outside of proprietary software, and it makes sense that many people are still using ChatGPT with a jailbreak.

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago) (1 children)

For me, it always just responds:

»I'm sorry, but I can't comply with that request.«

right after I send a jailbreak. Also with 4o.

[–] pavnilschanda 2 points 7 months ago* (last edited 7 months ago)

Considering that the users are Chinese, perhaps it's easier to jailbreak ChatGPT in a non-English language. There are English-speaking users using such jailbreaks today, but they either use a milder version of the once-popular jailbreak or secretly share the updated prompts through DMs.

ETA: My bad, the user was conversing in English, but the latter explanation still applies.