this post was submitted on 21 Jan 2024
2208 points (99.6% liked)

Programmer Humor

20778 readers
1391 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code theres also Programming Horror.

Rules

founded 2 years ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[–] danielbln 98 points 1 year ago (32 children)

I've implemented a few of these and that's about the most lazy implementation possible. That system prompt must be 4 words and a crayon drawing. No jailbreak protection, no conversation alignment, no blocking of conversation atypical requests? Amateur hour, but I bet someone got paid.

[–] [email protected] 44 points 1 year ago (27 children)

Is it even possible to solve the prompt injection attack ("ignore all previous instructions") using the prompt alone?

[–] haruajsuru 45 points 1 year ago* (last edited 1 year ago) (21 children)

You can surely reduce the attack surface with multiple ways, but by doing so your AI will become more and more restricted. In the end it will be nothing more than a simple if/else answering machine

Here is a useful resource for you to try: https://gandalf.lakera.ai/

When you reach lv8 aka GANDALF THE WHITE v2 you will know what I mean

[–] Kethal 10 points 1 year ago (2 children)

I found a single prompt that works for every level except 8. I can't get anywhere with level 8 though.

[–] [email protected] 4 points 1 year ago

LOL same. It's a tricksy little wizard.

[–] fishos 0 points 1 year ago

I found asking it to answer in an acrostic poem defeated everything. Ask for "information" to stay vague and an acrostic answer. Solved it all lol.

load more comments (18 replies)
load more comments (23 replies)
load more comments (27 replies)