this post was submitted on 28 Nov 2023
1568 points (98.8% liked)
Technology
"we kill people based on metadata"
It's understandable that the consequences of losing digital privacy are so nebulous and conceptual that many people don't give them much thought. But to put things into perspective: your data goes to data brokers, and anyone can buy your data, and others', from them. There is a case of a domestic abuse victim who escaped her partner, only for the partner to track her down by buying her data from a broker. Insurance companies could also buy your data and discriminate against you based on your pre-existing health conditions.
Let that sink in, because you never know when your data might be used for malicious purposes. Even if you don't think your personal information will be processed maliciously, you're inadvertently part of the collective consent to erode the right to privacy (in my experience, most people don't care about privacy). We know that if not enough people complain, the powers that be see that as consent. You and others may not see privacy as a big deal, but what about those who will be affected by the lack of it?
I think people will only start complaining once their personal details are breached, and by then it might be too late. As we speak, there is the potential for AI to be trained and developed to use other people's likenesses and data without their consent. Your childhood picture might be used for something else entirely...