this post was submitted on 28 Jan 2025
496 points (86.5% liked)

Privacy


Is anyone actually surprised by this?

[–] [email protected] 26 points 1 day ago (6 children)

I run it locally on a 4080, how do they capture my keystrokes?

[–] dai 4 points 1 day ago (2 children)

No idea, have you monitored the package / container for network activity?

Perhaps this refers to other clients not running the model locally.
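One rough way to do that check, as a Linux-only sketch: parse `/proc/net/tcp` to list current outbound connections from the host running the model. (Tools like `ss`, `lsof`, or `tcpdump` do the same job more thoroughly; the field layout here is from the Linux procfs documentation, and the helper names are my own.)

```python
# Sketch: list (local, remote) TCP endpoints by parsing Linux's /proc/net/tcp.
# Addresses in that file are hex-encoded, with the IPv4 bytes little-endian.

def decode_addr(hex_addr):
    """Turn '0100007F:0050' into '127.0.0.1:80'."""
    ip_hex, port_hex = hex_addr.split(":")
    # reverse the byte order of the little-endian IPv4 address
    octets = [str(int(ip_hex[i:i + 2], 16)) for i in range(6, -2, -2)]
    return ".".join(octets) + ":" + str(int(port_hex, 16))

def parse_proc_net_tcp(text):
    """Return a list of (local, remote) 'ip:port' pairs."""
    conns = []
    for line in text.splitlines()[1:]:  # skip the header row
        fields = line.split()
        if len(fields) < 3:
            continue
        conns.append((decode_addr(fields[1]), decode_addr(fields[2])))
    return conns

if __name__ == "__main__":
    try:
        with open("/proc/net/tcp") as f:
            for local, remote in parse_proc_net_tcp(f.read()):
                print(local, "->", remote)
    except FileNotFoundError:
        print("no /proc/net/tcp on this system (non-Linux)")
```

Run it while the model is generating: a locally hosted model doing pure inference should show no unexpected remote endpoints.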

[–] [email protected] 3 points 22 hours ago (1 children)

It doesn't. They run using stuff like Ollama or other LLM tools, and all of the hobbyist ones are open source. The model itself is just the inputs, node weights and connections, and outputs.

LLMs, or neural nets at large, are kind of a "black box", but there's no actual code that gets executed from the model when you run it; the model is just data processed by the host software according to the rules for how these networks work. The "black box" part is mostly because they're so complex that we don't know exactly what they're doing or how they produce a given answer, only that it works to a degree. It's a digital representation of an analog brain.
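The "model is data, not code" point can be shown with a toy example (illustrative only - nothing like a real LLM's architecture): the only executable logic is the host's, and the "model" is just a dict of numbers. It has nowhere to hide a keylogger.

```python
# Toy sketch: all executable logic is host code; the model is pure data.

def relu(x):
    return [max(0.0, v) for v in x]

def matvec(w, x):
    # host-side matrix-vector multiply: the only "execution" that happens
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]

# the entire "model": weights and biases, no code at all
model = {
    "w1": [[0.5, -0.2], [0.1, 0.3]],
    "b1": [0.0, 0.1],
    "w2": [[1.0, -1.0]],
    "b2": [0.05],
}

def forward(model, x):
    """Two-layer forward pass, driven entirely by the host software."""
    h = relu([v + b for v, b in zip(matvec(model["w1"], x), model["b1"])])
    return [v + b for v, b in zip(matvec(model["w2"], h), model["b2"])]

print(forward(model, [1.0, 2.0]))
```

A real checkpoint (safetensors, GGUF, etc.) is the same idea at scale: tensors of numbers that the runtime multiplies through, not a program that runs.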

People have also done a ton of hacking on these models - retraining and other modifications - which would have surfaced anything like a keylogger if it existed.

[–] dai 1 points 12 hours ago* (last edited 12 hours ago)

Yeah, I've not had much of a deep dive into anything "AI" - the closest thing I've got is a Google Coral monitoring my cameras. My current GPU selection is rather limited - I sold my 1080 Ti and currently have a 3070 in my gaming rig plus an unused 1660, which would run out of VRAM / be limited in what models they could run. I'm really not looking to run out and grab another card with more VRAM to play with either; maybe in a few years.

So the article is referring to the mobile app, and therefore has nothing to do with someone running this model at home on their own hardware. I've not looked at DeepSeek's repo yet, but assuming there isn't anything other than the model in the repo, people need to calm down.

edit: deepstack to deepseek...

[–] [email protected] 1 points 1 day ago (1 children)

Yeah, I'm looking for a deep analysis of this as well.

[–] [email protected] 1 points 13 hours ago* (last edited 13 hours ago)

Are you looking for a deep learning of this as well?
