this post was submitted on 05 Nov 2023
276 points (96.6% liked)

Technology


It's not just about facts: Democrats and Republicans have sharply different attitudes about removing misinformation from social media

One person's content moderation is another's censorship when it comes to Democrats' and Republicans' views on handling misinformation.

top 50 comments
[–] [email protected] 45 points 1 year ago* (last edited 1 year ago) (1 children)

Democrats and Republicans have sharply different attitudes about whether disinformation is desirable.

[–] [email protected] 10 points 1 year ago (1 children)

It benefits Republicans, so of course the side that benefits from it wants to keep it.

[–] [email protected] 21 points 1 year ago (10 children)

Part of the problem is who decides what is misinformation. As soon as the state gets to decide what is and isn't true, and thus what can and cannot be said, you no longer have free speech.

[–] [email protected] 75 points 1 year ago (7 children)

Education is key. Destroying education and critical thinking is the problem.

[–] TrickDacy 19 points 1 year ago (6 children)

Don't worry, the person you responded to is conservative so they're doing their damnedest to finish off education

[–] scarabic 57 points 1 year ago* (last edited 1 year ago) (1 children)

The state deciding on speech is a red line, yes, but that’s not even on the table here. This is about social media moderation. It actually seems suspiciously disingenuous to bring that up here.

OP: Thread about social media moderation

You: The state deciding what’s true is the death of free speech!

Actually, your comment is one of the big problems in this debate. People can’t tell the difference between a private social media firm moderating hate content and the government taking away their freedom of speech, and you just blurred the two together yourself by bringing this up here.

[–] [email protected] 5 points 1 year ago (3 children)

Centralized for-profit companies policing speech doesn’t really solve free speech concerns. It doesn’t violate the US first amendment, but corporate-approved speech isn’t really free speech either. No person or organization is really suitable to be the arbiter of truth, but at the same time unmoderated misinformation presents its own problems.

[–] scarabic 6 points 1 year ago (1 children)

Yes it solves it. Companies are not required to carry your voice around the world, which is what their platforms do. Stop equating guaranteed amplification with your freedom of speech. It’s wrong and dumb. I’ve lived in countries that actually restrict speech and whatever the Facebook mod did to you is NOTHING. The only reason Americans even fall into this stupid way of thinking is because their speech is so free. When your speech has never truly been restricted you have no idea what that freedom even means.

[–] [email protected] 2 points 1 year ago (3 children)

I’m not necessarily in favor of “guaranteed amplification”, as you put it. Anyone is free to yell whatever ideas they have on a street corner; barring some specific exceptions, that is free speech. I understand why a for-profit company might not want to amplify anything and everything someone decides to spew out. We’ve designed an un-free capitalist system though where some people do have guaranteed amplifiers, and others do not. That’s the problem I’m more interested in solving. It’s not about forcing any one company to amplify any specific idea, but rather about making sure that centralized authorities, be they governments, social media companies, etc., can’t in unison stamp out those ideas. I think decentralized platforms like this one are somewhat key to that goal, even with individual instances having full moderation and federation control.

[–] partial_accumen 3 points 1 year ago (2 children)

I’m not necessarily in favor of “guaranteed amplification”, as you put it.

...and...

We’ve designed an un-free capitalist system though where some people do have guaranteed amplifiers, and others do not. That’s the problem I’m more interested in solving.

Those two statements of yours seem in opposition to one another. In your second statement you're calling out that some people have guaranteed amplifiers while others don't, and you say that's a problem. However, your first statement says you're not in favor of guaranteed amplifiers for everyone.

The only logical third outcome I can make out that would make those two statements NOT contradictory is if you don't want guaranteed amplifiers for ANYONE, but I don't think you're saying that.

Can you clarify who you believe should have guaranteed amplifiers?

[–] [email protected] 3 points 1 year ago

No person or organization is really suitable to be the arbiter of truth

Courtrooms are arbiters of truth literally all the time. There are plenty of laws for which truth is a defence, and dishonesty is punished.

When battling misinformation, the problem is not that lying on the internet is legal - it is still actionable. Fraud is still illegal. False or misleading advertisements are still illegal. Defamation is still illegal. Perjury is illegal in the criminal law sense, not just torts. Ask Martha Stewart who the "arbiter of truth" is.

The problem is that it's functionally impossible to enforce on the scale of social media. If 50,000 people call you a pedophile because it became a meme even though it was completely untrue, and this costs you your job and you start getting death threats, what are you going to do about that? Sue them all?

So we throw up our hands and let corporations handle it through abuse policies, because the actual law is unworkable - it's "this is illegal but enforcing it is so impractical that it's legal". Twitter and Facebook don't have to deal with that crap so we let them do a vague implementation of the law but without the whole "due process" thing and all the justice they can mete out is bans.

If you disagree, then I've got a Nigerian prince who'd like to get your banking info, and also you're all cannibals.

[–] Dkarma 2 points 1 year ago

You don't have free speech.

[–] echo64 30 points 1 year ago* (last edited 1 year ago) (1 children)

You do not have free speech on social media today; private platforms decide what content they want to host.

The state does not have to be the one to decide these things, nor is it a case of "deciding" what is true. We have a long history of using evidence to establish something as fact, as propaganda, or as somewhere in between; this is functionally what the study of history is about.

[–] [email protected] 9 points 1 year ago (12 children)

That brings up another thing. At what point does it become a "public space"?

There's an old Supreme Court case about a company town that claimed someone was trespassing on a sidewalk. The Supreme Court ruled the sidewalk was a public space, and thus the person could pass out leaflets there.

https://firstamendment.mtsu.edu/article/marsh-v-alabama-1946/

Imo, a lot of big sites have gotten to that stage, and should be treated as such.

[–] [email protected] 14 points 1 year ago (10 children)

I think this is an underrated point. A lot of people are quick to say "private companies aren't covered by free speech", but I'm sure everyone agrees legal ≠ moral. We rely on these platforms so much that they've effectively become our public squares. Our government even uses them in official capacities, e.g. the president announcing things on Twitter.

When being censored on a private platform is effectively social and informational murder, I think it's time for us to revisit our centuries-old definitions. Whether you agree or disagree that these instances should be covered by free speech laws, this is becoming an important discussion that I never see brought up, but instead I keep seeing the same bad faith argument that companies are allowed to do this because they're allowed to do it.

[–] [email protected] 14 points 1 year ago (6 children)

This is an argument for a publicly-funded “digital public square”, not an argument for stripping private companies of their rights.

[–] [email protected] 9 points 1 year ago (2 children)

Why not both?

While I agree that punishing companies for success isn't a good idea, we aren't talking about small startups or local businesses run by individual entrepreneurs or members of the community here. We're talking about absurdly huge corporations with reach and influence the likes of which few businesses ever achieve. I don't think it's unreasonable to apply a different set of rules to them, as they are in a distinctly different situation.

[–] [email protected] 8 points 1 year ago

It's different because the company built and maintains the space. Same goes for a concert hall, a pub, etc...

Nobody believes that someone being thrown out of a pub for spouting Nazistic hate speech is having their "free speech trampled". Why should it be any different if it's a website?

You rarely see the discussion, because there's rarely a good argument here. It boils down to "it's a big website, so I should be allowed to post whatever I want there", which makes little to no sense and opens up a massive quagmire of legal issues.

[–] SexyTimeSasquatch 6 points 1 year ago (2 children)

There is a key difference here. Social media companies have some liability for what gets shared on their platforms. They also have a financial interest in what gets said and how it gets promoted by algorithms. The fact is, these are not public spaces. These are not streets. They're more akin to newspapers, or really to the people printing and publishing leaflets. The Internet itself is the street in your analogy.

[–] puppy 5 points 1 year ago (2 children)

Your newspaper analogy isn't accurate either. The writers of a newspaper are paid by the company, and everyone knows the writers execute the newspaper's agenda. Nothing gets published without review, and everything aligns with the company's vision. Information flows one way: readers buy the paper to consume it. They don't expect their voice to be heard, and the newspaper doesn't pretend that readers have that ability either. This isn't comparable to a social media site at all.

[–] [email protected] 3 points 1 year ago

Companies probably shouldn't be liable for what individuals share or post; the individuals should be. Yet social media companies constantly control the push and promotion of posts, using algorithms to decide what gets shown or shared and when.

I hate this so much. I want real, linear feeds from all the friends I'm following, not a personally curated, sanitized feed tuned to my supposed interests and sensibilities.

[–] [email protected] 26 points 1 year ago (3 children)

Nobody (besides maybe extreme conservatives) is advocating for "the state" to decide what "is and isn't true". That's not what this is about.

Furthermore, "misinformation" and "disinformation" can refer to things that are true! Propagandists don't always need to invent false facts to use them in deceptive ways. To suggest that the government should stay out of the matter unless it wields a perfectly foolproof fact-o-meter is, IMO, shortsighted. "The state" makes policy decisions with imperfect facts all the time.

[–] dhork 18 points 1 year ago (3 children)

Except there have always been limits on speech, centered mainly on truth. Your freedom of speech doesn't extend to yelling "Fire" in a crowded theater when there is no fire, for instance.

But we live in an age of alternative facts now, where science isn't trusted if it comes up with conclusions that conflict with your world view. Do you get a pass if you are yelling "Fire" because you are certain there are cell phone jammers in the theater that are setting your brain on fire because you got the COVID shot and now the 5G nanoparticles can't transmit back to Fauci's mind control lair?

[–] FireTower 8 points 1 year ago (2 children)

Do you get a pass if you are yelling "Fire" because you are certain there are cell phone jammers in the theater that are setting your brain on fire

Yes. Anyone in good faith attempting to warn others of any potential harm that they believe to be true to the best of their abilities should have their speech protected.

[–] dhork 10 points 1 year ago* (last edited 1 year ago) (6 children)

Anyone in good faith attempting to warn others of any potential harm that they believe to be true to the best of their abilities

But what if their beliefs are verifiably false? I don't mean that in the sense of a religious belief, which cannot be proven and must be taken on faith. I mean that the facts are clear: there are no 5G nanoparticles in the vaccine for cell phone jammers to interfere with in the first place. That isn't even a thing.

It's one thing to allow for tolerance of different opinions in public. It's another thing entirely to present things that can be objectively disproven as true, just because you've tied them to a political movement. Can that really still be considered good faith?

[–] grabyourmotherskeys 6 points 1 year ago

I wrote a comment about this earlier today. People who have been brainwashed to believe total nonsense often act in ways that are rational to them, but irrational to people who see the world through different eyes.

That's fine until it's violent action.

The alcoholic who thinks he's "fine to drive" believes he's perfectly rational: he drives drunk all the time and has had no accidents. That's wonderful until he kills a family some night.

[–] [email protected] 17 points 1 year ago

Uh, you know that happens regularly in courtrooms right? Like, almost every court battle hinges on what's true and what's not. And courts are an arm of the state.

In some cases it's directly about the truth of speech. Fraud, defamation, perjury, filing a false report, etc. are all cases where a court will be deciding whether a statement made publicly is true and punishing a party if it was not. Ask a CEO involved in a merger how much "free speech" they have.

[–] TrickDacy 12 points 1 year ago (1 children)

Oh weird, you coincidentally are a conservative mod lol

Gee so surprising you're mad about cEnSoRsHiP

[–] scarabic 5 points 1 year ago

Well, here’s how that was framed for participants of this study:

identified as misinformation based on a bipartisan fact check

And even with this, Republicans didn’t care if it was true or not.

We’re actually past the point where Republicans will consider anyone truthful. It either tickles their feelings right or it doesn’t, and that is all.

[–] [email protected] 4 points 1 year ago

Section 230 gets the state involved from the get-go. Remove that state-granted liability protection and everything else will shake out; make little tweaks from there as necessary. The broad protection of Section 230 is causing this issue.

[–] cheese_greater 3 points 1 year ago (1 children)

Isn't a grand jury enough to deal with this kind of thing? Like, before damage is done? I don't see why that mechanism can't be useful here too.
