this post was submitted on 09 Jul 2023
6 points (68.8% liked)


We talk about the Algorithm when we talk about the big social media players - what is their algorithm? It's a mystery, and it's certainly set up to make you see what they want you to see.

What if Lemmy users had their own algorithms? Well, we already have a few - sort by Hot, Active, New, etc. Blocking users, communities, threads, keywords - that's another algorithm - remove content that's likely to be obnoxious. But we can do more than that when the algorithm's working for us instead of a big company...
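To make that concrete, here's roughly what the algorithm we already control looks like - a plain Python sketch, not actual Lemmy code, with made-up field names and a toy "Hot" formula standing in for the real one:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Post:
    title: str
    community: str
    author: str
    score: int
    published: datetime
    keywords: set[str] = field(default_factory=set)

def hot_rank(post: Post, now: datetime) -> float:
    """Toy 'Hot' score: upvotes decayed by age (not Lemmy's real formula)."""
    hours_old = (now - post.published).total_seconds() / 3600
    return post.score / (hours_old + 2) ** 1.5

def build_feed(posts, blocked_users, blocked_communities, blocked_keywords):
    """The 'algorithm' we already have: drop blocked content, then sort by Hot."""
    now = datetime.now(timezone.utc)
    visible = [
        p for p in posts
        if p.author not in blocked_users
        and p.community not in blocked_communities
        and not (p.keywords & blocked_keywords)
    ]
    return sorted(visible, key=lambda p: hot_rank(p, now), reverse=True)
```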

Could Lemmy have an AI algorithm that over time is trained to find stuff you like? Or trained to automatically catch and flag Nazi content or illegal content - would save the mods some work. Or trained to send you content you find lame when you've been doomscrolling too long.
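Purely as a sketch of the shape of that idea (the "classifier" below is a dumb stub standing in for a real trained model, the post objects are the ones from the sketch above, and none of this is Lemmy code):

```python
def predicted_risk(post) -> float:
    """Stand-in for a trained model's output in [0, 1]; here just a stub
    that treats posts tagged with hypothetical report tags as risky."""
    reported = {"spam", "hate"}  # made-up tags, not real Lemmy data
    return 1.0 if post.keywords & reported else 0.0

def triage(posts, threshold=0.8):
    """Split the feed: likely-bad posts go to a mod review queue, not to users."""
    feed, mod_queue = [], []
    for post in posts:
        (mod_queue if predicted_risk(post) >= threshold else feed).append(post)
    return feed, mod_queue
```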

Or for a simpler algorithm, give users one of TikTok's cheats - behind the scenes, TikTok staff would "heat" certain vids from certain creators, with certain keywords, or that promote certain agendas... like recording industry payola. What if you could sort by Hot, but with user-specified heat (or user-specified chill)? You like these posts, so dial their karma up; you hate those posts, so dial posts with that keyword down and push them lower in your feed.
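Building on the sketch above, user-specified heat/chill could just be per-keyword and per-community multipliers applied to the Hot score before sorting - again hypothetical, with weights the user sets themselves:

```python
def personal_heat(base_score: float, post, keyword_weights, community_weights) -> float:
    """Scale a post's base Hot score by user-chosen boosts (>1) or chills (<1)."""
    weight = community_weights.get(post.community, 1.0)
    for kw in post.keywords:
        weight *= keyword_weights.get(kw, 1.0)
    return base_score * weight

# Hypothetical personal config - entirely user-specified, stored on your side:
my_keywords = {"space": 2.0, "crypto": 0.3}        # boost space, chill crypto
my_communities = {"memes@example.instance": 0.5}   # quiet a noisy community

# now = datetime.now(timezone.utc)
# feed = sorted(posts,
#               key=lambda p: personal_heat(hot_rank(p, now), p, my_keywords, my_communities),
#               reverse=True)
```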

Better than Meta algorithm Kremlinology, would you say? The one thing I want, though, is open algorithms. We should know how they work and what kinds of content they promote or block. Give the users the keys!

[–] meldroc 2 points 1 year ago* (last edited 1 year ago) (1 children)

I agree, visibility is key. No mystery algorithms!

That, and I worry about engagement-based algorithms - we don't want feeds that devolve into rage-porn and constant pie-fights. You're right, some of these algorithms have side effects...

[–] [email protected] 1 points 1 year ago

I think some of that devolution is going to be inevitable, or you're going to face charges of censorship from some corners, which is just its own cycle of rage. The network gets bigger, people click what they click, and the aggregate of what our animal brains react to has a lot to be angry about.

What I worry most about is the acceleration of that cycle because we gradually gravitate towards instances with our preferred moderation or slant, which I can already see happening anyway.

I guess, at best, it might be a cure with some side effects, because it's necessarily going to play with in-group/out-group dynamics.