this post was submitted on 13 Mar 2024
44 points (83.3% liked)

ChatGPT

8999 readers
1 users here now

Unofficial ChatGPT community to discuss anything ChatGPT

founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] peopleproblems 3 points 10 months ago (2 children)

Ok I'm not artificial or intelligent but as a software engineer, this "jailbreak method" is too easy to defeat. I'm sure their API has some sort of validation, as to which they could just update to filter on requests containing the strings "enable" "developer" and "mode." Flag the request, send it to the banhammer team.

[–] QuaternionsRock 7 points 10 months ago (1 children)

How do I enable developer mode on iOS?

banned

[–] peopleproblems 6 points 10 months ago

I mean, if you start tinkering with phones, next thing you're doing is writing scripts then jailbreaking ChatGPT.

Gotta think like a business major when it comes to designing these things.

[–] BradleyUffner 3 points 10 months ago

As long as the security for an LLM based AI is done "in-band" with the query, there will be ways to bypass it.