this post was submitted on 11 Sep 2023
1133 points (96.0% liked)
Technology
This is so so stupid. We should also sue the ISPs then, they enabled the use of YouTube and Reddit. And the phone provider for enabling communications. This is such a dangerous slippery slope to put any blame on the platforms.
I think the issue isn't just providing access to the content. It's using algorithms that make it more likely for deranged people to view more and more content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen, all because the platforms make more money the more people watch, whether the content is harmful or not.
This.
I don't know about Reddit, but YouTube 100% drives engagement by feeding users increasingly inflammatory and hateful content.
Hell, they'll even take ad money to promote Jan 6th conspiracies.
Yeah, the difference is in whether or not the company is choosing what to put in front of a viewer's eyes.
For the most part an ISP just shows people what they request. If someone gets bomb making directions from YouTube it would be insane to sue AT&T because AT&T delivered the appropriate packets when someone went to YouTube.
On the other end of the spectrum is something like Fox News. They hire every host, give them timeslots, have the opportunity to vet guests, accept advertising money to run against their content, and so on.
Section 230 of the Communications Decency Act treats "online service providers" like YouTube and Reddit as if they're just ISPs, merely hosting content that is generated by users. OTOH, YouTube and Reddit use ML systems to decide what the users are shown. In the case of YouTube, the push to suggest content to users is pretty strong. You could argue they're much closer to the Fox News side of things than to the ISP side these days. There's no human making the decisions on what content should be shown, but does that matter?
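The engagement-driven selection being described can be sketched as a toy scoring function. To be clear, this is purely illustrative: the field names, weights, and topics below are assumptions for the sake of the example, not anything from YouTube's actual system. The point is that when the only optimization target is predicted watch time, the ranking naturally feeds people more of whatever they already watch.

```python
# Toy sketch of an engagement-maximizing recommender.
# All fields, weights, and topics here are hypothetical.

def score(video: dict, user_watch_minutes: dict) -> float:
    """Predicted engagement for one video: how much the user already
    watches this topic, times how much of such videos people finish.
    Note there is no term for whether the content is harmful."""
    affinity = user_watch_minutes.get(video["topic"], 0.0)
    return affinity * video["avg_watch_fraction"]

def recommend(videos: list, user_watch_minutes: dict, k: int = 3) -> list:
    """Return the top-k videos by predicted engagement."""
    ranked = sorted(videos, key=lambda v: score(v, user_watch_minutes),
                    reverse=True)
    return ranked[:k]

# A user who has already watched a lot of one topic...
history = {"conspiracy": 120.0, "science": 10.0}

videos = [
    {"id": "a", "topic": "science", "avg_watch_fraction": 0.9},
    {"id": "b", "topic": "conspiracy", "avg_watch_fraction": 0.6},
    {"id": "c", "topic": "conspiracy", "avg_watch_fraction": 0.8},
]

top = recommend(videos, history)
# ...gets recommended more of that topic first, because that is
# what maximizes expected watch time: ids "c", "b", then "a".
```

Even in this simplified sketch, the feedback loop is visible: every recommendation the user accepts increases their watch history for that topic, which raises its score next time. That loop, not any single recommendation, is what the "rabbit hole" argument is about.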
Yep. I often fall asleep to long YouTube videos that are science or history related. The algorithm is the reason why I wake up at 3am to Joe Rogan. It’s like a terrible autocomplete.
The algorithm is tailored to you. This says more about you. I never get recommended Rogan.
Absolutely. I saw a Google ad the other day, from PragerU I think, claiming climate change isn't real, while I was searching for an old article that was more optimistic about outcomes. There was actually a label by the ad saying it was being shown as a suggestion, and thankfully you could report it, which I did immediately. It pissed me off a ton.
A friend recently shared a similar suggested video/ad they got on YouTube, which was saying "Ukrainians are terrorists". PragerU or TPUSA.
I can see the argument for allowing these ads to exist as a freedom of speech thing, fine. But actively promoting these ads is very different. The lawsuit has merit on this point. I'd prefer if this content was actively minimized, but at the very least it shouldn't be promoted.
What if it isn't algorithms but upvotes? What if Lemmy is next?
and we all know what reddit mods do.
If you were head of a psychiatric ward and had an employee you knew was telling patients "Boy, I sure wish someone would kill as many black people as they could", you would absolutely share responsibility when one of them did exactly that.
If you were deliberately pairing that employee with patients who had shown violent behaviour on the basis of "they both seem to like violence", you would absolutely share responsibility for that violence.
This isn't a matter of "there's just so much content, however can we check it all?".
Reddit has hosted multiple extremist and dangerous communities, claiming "we're just the platform!" while handing over the very predictable post histories of mass shooters week after week.
YouTube has built an algorithm and monetisation system that is deliberately designed to lure people down rabbit holes, and then done nothing to stop it luring people towards domestic terrorism.
It's a lawsuit against companies worth billions. They're not being executed. There are grounds to accuse them of knowingly profiting from the grooming of terrorists and if they want to prove that's not the case, they can do it in court.
Do ISPs actively encourage you to watch extremist content? Do they push that content toward people who are at risk of radicalization to get extra money?
the isps don't encourage people to see content that makes them mad
Utilities aren't the same thing as platforms.
But giant media platforms run by giant tech corporations who have repeatedly shown that they don't give a shit about people? If they're not putting guardrails on their algorithms and content out of choice, and are consequently creating mass murderers, then they should be regulated to have some guardrails.
No corporation has proven that it will make the best choices for society, it's up to people to force them to.
I agree that his parents are culpable.
I think suing the company that is nearest to the user should work fine. (What follows is hyperbole.) If you don't do it that way, then yes, it becomes a slippery slope, because eventually you'd have to sue the Big Bang. And that makes no sense.
So if an attack is planned via mail you think we should sue the postal service? The phone company if it's done over the phone?
No, because these things should be private. Social media however needs some kind of moderation. edit: also go blame the user too, but that should be a given
I think just the poster should suffice, we should leave the platforms out of it. If anything, it helps to out the assholes who would post stuff that enables this.
Blocking a user and removing content from a platform should be relatively easy and fast, which should help prevent organized crimes. Suing someone afterwards takes way more resources and time.
But a platform can remove content without getting sued. Why sue them too? Because if you don't sue their asses they don't care.
Of course moderation takes time and can't be perfect, and this should be considered when suing the platform owners. And yes, this could help the assholes, but I think you can report such behavior to the FBI or someone.
If my buddies and I spend a month plotting a crime in my cousin's spare room, the cousin would be complicit, since he knowingly allowed us to use his property for a criminal conspiracy. The USPS doesn't know what I am sending in the mail since they are a common carrier.
Great analogy
Actually, they'd just try to seize his house, since proving his complicity is more challenging than proving that the house was used for the planning of a crime.
Change mail (private) to moderated public notice board (not private). The owner of the public notice board should probably be sued for allowing the content to stay up.
Is the postal service intentionally routing more mail about "attacks being necessary" to people interested in attacks? If the postal service were doing that just to increase total mail volume, then yes, we should sue them too.
Then what just give up hold Youtube account for their actions
This comment would have really benefitted from some punctuation
THen why not just give up, and never hold youtube accountable for their actions
So... Punctuation and a few extra words lol