this post was submitted on 22 May 2024
197 points (98.5% liked)

Screenshots Rule. (lemmy.world)
submitted 4 months ago* (last edited 4 months ago) by ekZepp to c/196
[–] [email protected] 16 points 4 months ago (1 children)

And based on their track record, they will just quietly turn it back on.

Microsoft is so far beyond the benefit of the doubt they couldn't get back to it if they tried.

[–] efstajas 2 points 4 months ago* (last edited 4 months ago) (1 children)

Are there actually any documented cases of them re-enabling userland features after they've been disabled? The only thing I'd heard of before was registry edits / telemetry changes being undone. Not that that's okay either, of course, but at least it's not like it asks you for your privacy settings during setup and then undoes your choices. Maybe I'm just out of the loop, though.

Generally though, what do you think would actually be Microsoft's motivation to randomly re-enable this particular feature? Do you think that the claim that the data doesn't leave the device is a lie?

[–] [email protected] 6 points 4 months ago

Does it get much worse than telemetry settings being quietly enabled? It's spyware at the best of times, let alone when they get sneaky about it. And I've definitely had them revert privacy/telemetry options I set during setup, multiple times.

I don't necessarily think they're stupid enough to come out with the full data-harvesting machine on day one. They'll release what they can get away with - in this case, taking screenshots and storing them locally - and boil the metaphorical frog from there. Maybe they offer more powerful AI by running it through their servers, and then they can start "accidentally" opting people into that "service".

I'm not even necessarily saying there's some grand scheme going on here, but nobody can possibly deny they have every incentive to push that boundary until it breaks, and they have consistently shown that they will pursue that incentive without any regard for user privacy whatsoever.

We know this because they have done it so many times before.