tl;dr: It’s not DeepSeek the model, it’s their app and its privacy policy.
DeepSeek, the self-hosted model, is pretty decent even distilled down to 8B, but I always make sure I get an abliterated version to remove the Chinese censorship (and the built-in OpenAI censorship, given the history of how the model was actually developed).
To be clear, that only removes (or attempts to remove) refusals; it doesn't add training data that the model doesn't have. Ask it about Tiananmen Square, for example.
The abliterated DeepSeek model can fully discuss Tiananmen Square. I've even tried the 4chan copypasta that allegedly gets Chinese chat users' sessions dropped, and the prompts work fine.
ensure i get an abliterated version
Could you expand on how to do this? Please and thank you.
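Not the original commenter, but a minimal sketch of the usual route: community "abliterated" builds of the DeepSeek-R1 distills are published to the Ollama model library, so you just pull one instead of the official tag. The exact namespace and tag below (huihui_ai/deepseek-r1-abliterated:8b) is an assumption; search the Ollama library for "abliterated" to find a current build.

```shell
# Pull a community abliterated build instead of the official deepseek-r1 tag.
# NOTE: namespace/tag here is illustrative -- check the Ollama library for
# whatever abliterated build is currently maintained.
ollama pull huihui_ai/deepseek-r1-abliterated:8b

# Chat with it interactively to verify the refusals are gone:
ollama run huihui_ai/deepseek-r1-abliterated:8b
```

"Abliteration" itself means editing the model's weights to suppress the refusal direction, so there's nothing extra to configure at runtime, it's just a different checkpoint.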
Learned something new. Thank you!
You can also run your own fancy front-end and host your own GPT-style website locally.
I'm doing that with Docker Compose in my homelab, and it's pretty neat!
services:
  ollama:
    volumes:
      - /etc/ollama-docker/ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama
    ports:
      - 11434:11434
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0']
              capabilities:
                - gpu

  open-webui:
    build:
      context: .
      args:
        OLLAMA_BASE_URL: '/ollama'
      dockerfile: Dockerfile
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - /etc/ollama-docker/open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434/'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
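Once that stack is up (docker compose up -d), the web UI is on port 3000 and the Ollama API itself is on the published port 11434, so you can also script against it directly. A minimal sketch of a one-shot client using only the standard library -- the model name "deepseek-r1:8b" is just an example, use whatever you've pulled:

```python
import json
import urllib.request

# Port 11434 is what the compose file above publishes for the ollama service.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send a one-shot prompt to the local Ollama container, return the reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the full answer in "response".
        return json.load(resp)["response"]
```

Usage would be something like ask("deepseek-r1:8b", "Hello!") from a Python shell on the same host.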
Much obliged!