this post was submitted on 24 Feb 2025
667 points (99.7% liked)

Fediverse


More from the episode on YouTube: https://youtu.be/nf7XHR3EVHo

[–] FauxLiving 4 points 18 hours ago (1 children)

They're good at predicting what people want to see, yes. But that isn't the real problem.

The problem isn't that they predict what you want to see; it's that they use that information to give you results that are 90% what you want to see and 10% what the owner of the algorithm wants you to see.

X uses that to mix in alt-right feeds, Google uses it to mix in messages from the highest bidder on its ad network, and Amazon uses it to mix in recommendations for its own products.

You can't know what they're adding to the feed, how much of it is genuine recommendation based on your needs and wants, and how much is artificially boosted content based on the needs and wants of the algorithm's owner.

Is your next TikTok really the highest-ranked piece of recommended content, or is it something being boosted on behalf of someone else? You can't know.
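
To make the mixing concrete, here's a minimal, hypothetical sketch of a feed blender along the lines described above. Everything in it is an illustrative assumption: the names, the 90/10 split, and the ranking signals are made up for the example and don't reflect any platform's actual code.

```python
# Hypothetical sketch only; names, the 90/10 split, and scoring signals are
# illustrative assumptions, not any platform's real implementation.
import random
from dataclasses import dataclass

@dataclass
class Item:
    id: str
    relevance: float      # estimate of how much *you* want to see this
    boost_value: float    # how much the platform/advertiser wants you to see it

def build_feed(organic: list[Item], boosted: list[Item],
               size: int = 20, boosted_share: float = 0.10) -> list[Item]:
    """Return a feed that is mostly user-relevant content with a hidden
    fraction of owner-promoted content mixed in."""
    n_boosted = int(size * boosted_share)
    n_organic = size - n_boosted

    # Organic slots: ranked purely by predicted user interest.
    organic_picks = sorted(organic, key=lambda i: i.relevance, reverse=True)[:n_organic]

    # Boosted slots: ranked by what the owner is paid (or wants) to promote.
    boosted_picks = sorted(boosted, key=lambda i: i.boost_value, reverse=True)[:n_boosted]

    # Shuffle so boosted items are indistinguishable from organic ones.
    feed = organic_picks + boosted_picks
    random.shuffle(feed)
    return feed
```

The point of the sketch is that both the ratio and the shuffle happen server-side: from the outside, every item just looks like "the next recommendation."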

This has become an incredibly important topic now that people are using these systems to drive political outcomes that have real effects on society.

[–] [email protected] 3 points 17 hours ago (1 children)

You’re very fixated on something we all agree with and missing the thrust of the point.

People want an algorithm, whether it's parasitic or manipulative or whatever. Most people do not care enough to object. They will pick it over a Mastodon/Lemmy/etc. experience to get curation. That's all we're saying.

[–] FauxLiving 1 points 15 hours ago (1 children)

I'm carrying on multiple conversations in this thread, so I'll just copy what I said in a different thread:

Of course people like these features; these algorithms are literally trained to maximize how likable their recommendations are.

It’s like how people like heroin because it perfectly fits our opioid receptors. The problem is that you can’t simply trust that the person giving you heroin will always have your best interests in mind.

I understand that the vast majority of people are simply going to follow the herd and use the thing that is most like Twitter, recommendation feed and all. However, I also believe that it is a bad decision on their part, and that the companies taking all of these people into their alternative social networks are just going to be part of the problem in the future.

We, as the people who are actively thinking about this topic (as opposed to the people just moving to the blue Twitter because it's the current popular meme in the algorithm), should be considering the difference between good and abusive uses of recommendation algorithms.

Having social media controlled by private entities that use black-box recommendation algorithms should be seen as unacceptable, even if people like it. Bluesky's user growth is fundamentally due to people recognizing that Twitter's systems are being used to push content they disagree with. Yet they're simply moving to another private social media network that's one sale away from being the next X.

It'd be like living under a dictatorship, deciding you've had enough, and moving to the dictatorship next door. It may be a short-term improvement, but it doesn't address the fundamental problem: you're still choosing to live in a dictatorship.

[–] [email protected] 2 points 14 hours ago

My dude, I agree with you. We're saying that we need to fulfill the request for an algorithm.