this post was submitted on 11 Sep 2023
1133 points (96.0% liked)
Technology
you are viewing a single comment's thread
I sub to primarily leftist content, and YouTube's Shorts algorithm insists on recommending the most vile far-right content on the planet. It's gotten to the point that I'm convinced YouTube is intentionally trying to shift people far right.
I primarily watch woodworking or baking content on YouTube. I feel like the far-right content is super prevalent with Shorts. I'll watch something like a quick tool review, and the next video will be someone asking folks on the street if it's OK to be white. What color you are isn't your decision, but what you do every day is, and being some dumbass white kid accosting black tourists in Times Square for shitty reaction content is just gross.
It doesn't matter how often I say I dislike the content, block channels, or whatever. YouTube has just decided it's going to check in from time to time and see if I want to let loose my inner Boomer and rage with Rogan.
It could be that pushing videos on the other side of the political spectrum gets interactions in the form of people sharing/commenting on it. Even if you disagree, going "Why does YouTube recommend this, this is awful" is still a share.
The algorithm prioritises interactions above all else, and few things get people interacting more than being wrong, or disagreeing vehemently.
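The dynamic described above — a ranker that counts any interaction as a positive signal — can be sketched as a toy example. This is purely an illustration of the commenter's argument, not YouTube's actual system; the video titles and weights are made up:

```python
# Toy illustration (NOT YouTube's real ranking): if a recommender scores
# purely on raw interaction counts, an angry comment or a "look how awful
# this is" share counts exactly the same as a like.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    likes: int
    dislikes: int
    comments: int
    shares: int

def engagement_score(v: Video) -> int:
    # Every interaction counts, regardless of sentiment.
    return v.likes + v.dislikes + v.comments + v.shares

def recommend(videos: list[Video], k: int = 1) -> list[Video]:
    # Rank purely by total engagement, highest first.
    return sorted(videos, key=engagement_score, reverse=True)[:k]

feed = [
    Video("quick tool review", likes=900, dislikes=10, comments=50, shares=20),
    Video("rage-bait street interview", likes=300, dislikes=700, comments=2000, shares=500),
]
top = recommend(feed)
print(top[0].title)  # the rage-bait video wins on raw engagement
```

Under this scoring, the widely disliked video still ranks first, because outrage generates comments and shares — which is exactly the mechanism the comment above is pointing at.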
This is happening on my FB video feed. I watch a funny chick called Charlotte Dobre who does funny reaction videos. I honestly love her, but all my algorithm shows me for recommendations are these cop-brutality videos with comments praising the cops, and right-wing crap praising Abbott's wall and DeSantis's dictatorship. It drives me nuts, and no matter how many pages I block, I always get more right-wing recommended crap videos.
Wow I am so surprised by this. I watch mainly tech and gardening YouTube and my shorts have been extremely applicable to me.
Even when I use a new computer like at work the shorts are mostly pop culture.
Didn't make Shorts any less annoying, though.
I literally only get Marvel Snap/general gaming, College Humor, tech, educational, stand up comedy, and drones. That's it. I don't mean to victim blame, but it learns what you click and what you stay to watch.
Mine acted similar to yours. I recently started watching a few more short videos and now it's showing me an unfortunate amount of that far right nonsense.
I am pretty sure it is just showing politically charged content to people who watch other politically charged content. I feel the blame is misdirected at something that only serves up content people will probably like based on their viewing history.
Depends entirely on what you're subscribed to. If you have multiple linked YouTube accounts (such as the premium family plan), it also depends on what THEY'RE subscribed to, and on location (my recommendations at home and my recommendations at work are wildly different).