Technology
This is the official technology community of Lemmy.ml, for all news related to the creation and use of technology and for civil, meaningful discussion around it.
Ask via DM before posting product reviews or ads; otherwise, such posts are subject to removal.
Rules:
1: All Lemmy rules apply.
2: Do not post low-effort posts.
3: NEVER post naziped*gore stuff.
4: Always post article URLs or their archived versions as sources, NOT screenshots. Help blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people).
6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist.
7: Crypto-related posts are disallowed unless essential.
Snapchat is not the only problem here, but it is a problem.
If they can't guarantee their recommendations are clean, they shouldn't be offering recommendations. Even to adults. Let people find other accounts to connect to for themselves, or by consulting some third party's curated list.
If not offering recommendations destroys Snapchat's business model, so be it. The world will continue on without them.
It really is that simple.
Using buggy code (because all nontrivial code is buggy) to offer recommendations happens only because these companies are cheap and lazy. They need to be forced to take responsibility where it's appropriate. This does not mean they should be liable for the identity of posters on their network or the content of individual posts (I agree that expecting them to control that is unrealistic), but every curation algorithm is created by them and is completely under their control. They can provide simple sorts based on data visible to all users, or leave things to spread externally by word of mouth. Anything beyond that should require human verification, because black-box algorithms demonstrably do not make good choices.
It's the same thing as the recent Air Canada chatbot case: the company is responsible for errors made by its software, to about the same extent as it is responsible for errors made by its employees. If a human working for Snapchat had directed "C.O." to the paedophile's account, would you consider Snapchat to be liable (for hiring the kind of person who would do that, if nothing else)?
No, I would not, unless it was proven that the employee knew the person was an S.O. and knew that the account belonged to a minor (though at that point the employee should have disabled the account per Snapchat's policy regardless). If that data was not available to them, they would have had no way to know, so I would not consider Snapchat at fault.
Then, in my opinion, you would have failed to perform due diligence. Even if you'd thought C.O. was an adult, suggesting that a woman strike up a private conversation with a man neither of you knows always deserves a second look (dating sites excepted), because the potential for harm is regrettably high.