this post was submitted on 19 Feb 2024
[–] SpaceNoodle 57 points 9 months ago* (last edited 9 months ago) (2 children)

There's no excuse for a buffer overflow in a caching component to lead to a security hole like this. If the data were properly encrypted and could only be decrypted by the client on their own device, the result would have been users simply not seeing videos instead of being able to view others'.
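A minimal sketch of what that model looks like, assuming a per-device AES-GCM key that never leaves the user's own hardware (the function names and payload layout here are made up for illustration, not Wyze's actual API):

```python
# Sketch only: in an end-to-end model the cloud/cache stores ciphertext it cannot read.
# Assumes the 'cryptography' package; names and payload layout are hypothetical.
import os

from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def encrypt_clip_on_camera(device_key: bytes, clip: bytes) -> bytes:
    """Camera encrypts before upload; the server only ever handles this blob."""
    nonce = os.urandom(12)
    return nonce + AESGCM(device_key).encrypt(nonce, clip, None)


def decrypt_clip_on_phone(device_key: bytes, blob: bytes) -> bytes | None:
    """Client decrypts with its own key. A blob that really belongs to some
    other user's camera fails authentication instead of being watchable."""
    nonce, ciphertext = blob[:12], blob[12:]
    try:
        return AESGCM(device_key).decrypt(nonce, ciphertext, None)
    except InvalidTag:
        return None  # wrong key -> no video, rather than someone else's video


my_key = AESGCM.generate_key(bit_length=256)      # stays on my devices
their_key = AESGCM.generate_key(bit_length=256)   # someone else's camera
mixed_up_blob = encrypt_clip_on_camera(their_key, b"someone else's footage")
assert decrypt_clip_on_phone(my_key, mixed_up_blob) is None
```

In that world, the worst a cache mixup can do is hand you a blob you can't decrypt.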

[–] Ottomateeverything 25 points 9 months ago

It doesn't even need to go that far. If some cache mixes up user IDs and device IDs, then when those user IDs go to request a video feed, the serving authority should be like "woah, YOU don't have access to that device/user". Even when you fucking mix these things up, there should be multiple places in the chain where this gets checked and denied (see the sketch below). This is a systemic/architectural issue, not "one little oopsie in a library". That oopsie simply exposed the problem.

I don't care if I was affected or how widespread this is. This just shows Wyze can't be trusted with anything remotely "private". This is a massive security failing.
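A minimal sketch of the kind of ownership check being described, sitting between whatever the cache returns and the feed that actually gets served (all names hypothetical):

```python
# Sketch only: refuse to serve a feed for a device the requesting user doesn't own,
# no matter what an upstream cache claims. All names here are hypothetical.


class NotAuthorized(Exception):
    pass


def serve_feed(requesting_user_id: str, device_id: str,
               device_owners: dict[str, str], feeds: dict[str, bytes]) -> bytes:
    """Re-check ownership at serve time, even if a cache already 'resolved' the pair."""
    if device_owners.get(device_id) != requesting_user_id:
        raise NotAuthorized(f"user {requesting_user_id} cannot access device {device_id}")
    return feeds[device_id]


device_owners = {"cam-123": "alice"}
feeds = {"cam-123": b"alice's video"}
serve_feed("alice", "cam-123", device_owners, feeds)    # ok
# serve_feed("bob", "cam-123", device_owners, feeds)    # raises NotAuthorized
```

The point is that this check has to live at (or behind) the serving authority itself, so a corrupted cache entry alone can never be enough to cross accounts.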

[–] [email protected] 11 points 9 months ago (2 children)

If the data were properly encrypted and could only be decrypted by the client on their own device

Yeah, but part of Wyze's sales pitch is their AI image-recognition features. Doing that would cut them off from training data and force the processing to happen locally, both of which would be a dead end for them.

I realize these might not be features you want or care about... but those are the features they want to offer.

[–] [email protected] 7 points 9 months ago (3 children)

If I had said 30 years ago that people in the future would pay money for a device that lets companies basically spy on them, and then also sell the data, I would have been branded a lunatic and sent for psychiatric help. Yet here we are.

[–] [email protected] 5 points 9 months ago

You're talking about a full-color, high-definition surveillance camera that works at night, can be viewed from literally anywhere in the world, and can be configured to send you alerts when it sees people/animals/packages/whatever. And it only costs them an inflation-adjusted $13.

I don't think the "they wouldn't believe this shit" argument really applies, given how rapidly tech has changed.

[–] [email protected] 3 points 9 months ago

That, and Ring doorbells being used as a big data-harvesting point for the police.

The surveillance culture we have is so normalized now that people don't even care that their security camera is more of a corporate livestream than a secure loop. But hey, how else am I gonna post pictures of the guy stealing my 3rd Amazon package of the day?

[–] [email protected] 1 points 9 months ago

30 years ago was 1994. The internet was quickly becoming a thing, and if you had told people that companies would eventually offer extra services if you chose to store your data with them, they would have believed you, because that's how the banking system had worked for centuries.

[–] SpaceNoodle 2 points 9 months ago

Even just encrypting it before transmission would have prevented this, and still allowed them to harvest data.
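One way to read that (a sketch, not Wyze's actual pipeline): the backend keeps the plaintext for its AI and training features, but wraps each clip in a per-user key before it ever reaches the caching layer, so a mixed-up cache entry is unreadable to the wrong account. Everything below is hypothetical naming:

```python
# Sketch only: the backend still sees plaintext (AI/training keep working),
# but anything handed to the cache is wrapped in a per-user key first.
from cryptography.fernet import Fernet

user_keys: dict[str, bytes] = {}   # per-user keys held by the backend
cache: dict[str, bytes] = {}       # stand-in for the buggy caching layer


def run_ai_on(clip: bytes) -> None:
    pass  # placeholder for the image-recognition / training pipeline


def ingest_clip(user_id: str, clip: bytes) -> None:
    run_ai_on(clip)                # server-side processing still possible
    key = user_keys.setdefault(user_id, Fernet.generate_key())
    cache[user_id] = Fernet(key).encrypt(clip)


def fetch_clip(requesting_user_id: str, cache_key: str) -> bytes:
    # If cache_key points at another user's entry, decryption with the
    # requester's key raises InvalidToken instead of showing their video.
    return Fernet(user_keys[requesting_user_id]).decrypt(cache[cache_key])
```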