this post was submitted on 07 Jun 2024
1154 points (92.2% liked)

Programmer Humor

31183 readers
186 users here now

Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

founded 4 years ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[โ€“] [email protected] 28 points 3 weeks ago (1 children)

LLM system input is unsanitizable, according to NVidia:

The control-data plane confusion inherent in current LLMs means that prompt injection attacks are common, cannot be effectively mitigated, and enable malicious users to take control of the LLM and force it to produce arbitrary malicious outputs with a very high likelihood of success.

https://developer.nvidia.com/blog/securing-llm-systems-against-prompt-injection/

[โ€“] [email protected] 2 points 2 weeks ago

Everything old is new again (GIGO)