this post was submitted on 20 Mar 2024
1014 points (98.0% liked)

[–] [email protected] 4 points 7 months ago* (last edited 7 months ago)

I’ve literally watched friends of mine descend into far-right thinking, and I can point to the moment when algorithms started suggesting content that sent them down a “rabbit hole.”

Like, you’re not wrong that they were right wing initially, but over a period of about six months they became the “lmao I’m an unironic fascist and you should be pilled like me” variety. They started stockpiling guns and so on.

This phenomenon is so commonly reported that it makes you start to wonder how all these people deciding to “radicalize themselves” seemingly came out in droves all at once.

Additionally, these companies are responsible for their content-serving algorithms. If those algorithms didn’t shape what users think, why would nation-state propaganda efforts work so hard to get their narratives and interests surfaced by them? Did we forget the rise and ensuing fallout of the Arab Spring?