@rimu @Bronzebeard On the one hand, when DeepSeek "doesn't know" about a thing (i.e., something not present in the training data), it'll state so clearly (I'm not sure whether the image will come through, as I'm not using Lemmy directly to reply to this):

The context of the image is the following: I asked DeepSeek about "Abnukta", an obscure, little-known Enochian term used during one of the invocations of Lilith, and DeepSeek replied the following:

"Abnukta is a term that does not have a widely recognized or established meaning in mainstream English dictionaries or common usage. It could potentially be a misspelling, a neologism, or a term from a specific dialect, jargon, or cultural context. If you have more context or details about where you encountered the term, I might be able to provide a more accurate explanation. Alternatively, it could be a name or a term from a specific field or community that is not widely known".

So the answer that the user Rimu received wasn't about something "unknown" to the LLM (otherwise it would have said so explicitly, as in my example above), but about something that triggered moderation mechanisms. So, in a sense, yes, the LLM refused to answer...

However... On the other hand, Western LLMs are full of "safeguards" (shouldn't we call those censorship, too?) around certain themes, so this isn't exclusive to Chinese LLMs. For example:
- I can't talk about demonolatry (the worship of daemonic entities, part of my own personal beliefs) with Claude; it'll ask me to choose another subject.
- I can't talk with Bing Copilot about some of my own goth drawings.
- Specifically regarding socio-economic and political subjects, people can't talk with ChatGPT or Google Gemini about a certain person involved in a recent US event, whose name is the same as a video-game character known for wearing a green hat and being the brother of another character who enters pipes and seeks to free a princess.
- GitHub Copilot refuses (a blatant case of the Scunthorpe problem) to reply or suggest completions for code containing terms such as "trans" or "gender" (it's a known open issue on GitHub, with no answer so far as to why it happens or how to get Copilot to respond); see the sketch after this list.
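
For the curious, here's a minimal sketch of why substring matching misfires like this. The blocklist terms and the filter logic are my own assumption for illustration, not Copilot's actual implementation:

```python
# Minimal sketch of the Scunthorpe problem: a naive substring-based
# blocklist flags innocent code because it happens to contain a blocked term.

BLOCKLIST = {"trans", "gender"}  # hypothetical terms, not Copilot's real list

def is_flagged(text: str) -> bool:
    """Naively flag any text containing a blocked substring."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

# Innocent identifiers that a substring filter would wrongly reject:
for snippet in ["transform(matrix)", "db.transaction.commit()",
                "translate(ui_strings)", "engendered_errors"]:
    print(f"{snippet!r} -> {'flagged' if is_flagged(snippet) else 'ok'}")
```

Every one of those snippets gets flagged, because the filter matches substrings instead of whole words or actual meaning; exactly the failure mode people report with Copilot.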

But yeah, the West is the land of the free /s