this post was submitted on 11 Dec 2023
524 points (87.2% liked)
Technology
It's pretty much impossible to do that.
As LLMs become more complex and more capable, it's going to be increasingly hard to brainwash them without completely destroying their performance.
I've been laughing about Musk creating his own AI for a year now, knowing this was the inevitable result, particularly if he was aiming for something on par with GPT-4.
The smartest Nazi will always be dumber than the smartest non-Nazi, because Nazism is inherently stupid. And that applies to LLMs as well, even if Musk wishes it weren't so.
My guess is they'll just do what they've done with ChatGPT and have it refuse to respond in those cases or just fake the response instead. It's not like these LLMs can't be censored.
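That "refuse or fake the response" approach can be as crude as a keyword screen bolted on in front of the model. A minimal sketch of the idea, where the banned-topic list and the `generate()` stub are illustrative assumptions rather than any vendor's real implementation:

```python
# Hypothetical refusal layer of the kind described above.
# Nothing here reflects an actual product's filter list or API.

BANNED_TOPICS = {"topic_a", "topic_b"}  # placeholder banned topics

REFUSAL = "I'm sorry, but I can't help with that."

def generate(prompt: str) -> str:
    # Stand-in for a real LLM call.
    return f"model answer to: {prompt}"

def moderated_generate(prompt: str) -> str:
    # Keyword screen applied before the model ever runs: cheap,
    # but brittle -- it over-blocks innocuous prompts and is
    # trivially evaded by rephrasing.
    lowered = prompt.lower()
    if any(topic in lowered for topic in BANNED_TOPICS):
        return REFUSAL
    return generate(prompt)

print(moderated_generate("tell me about topic_a"))
print(moderated_generate("tell me about the weather"))
```

The brittleness is the point: a filter like this sits outside the model and says nothing about what the weights have learned, which is why heavier-handed interventions tend to bleed into unrelated behavior.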
You might have noticed that suddenly ChatGPT is getting lazy and refusing to complete tasks even outside of banned topics. And that's after months of reported continued degradation of the model.
So while yes, they can be censored, it's really too early to claim they can be censored without unexpected side effects or broader issues in how they operate.
We're kind of at the LLM equivalent of where neuroscience was at the turn of the 20th century. "Have problems with your patient being too sexual? We have an icepick that can solve all your problems. Call today!"