this post was submitted on 01 Sep 2023
489 points (93.4% liked)


Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

top 50 comments
[–] TropicalDingdong 150 points 1 year ago (2 children)

Bro, people were eating tide pods and we saw a resurgence of Nazism and white nationalism.

I think we at least know the effects of what was happening before.

[–] NegativeInf 68 points 1 year ago (3 children)

Could we convince the Nazis to eat the tide pods?

[–] Sabin10 25 points 1 year ago (1 children)
[–] metallic_substance 12 points 1 year ago

They are a famously suggestible lot

[–] [email protected] 16 points 1 year ago (3 children)

Just get 4chan to convince the imbeciles that it's a white supremacist symbol like they did with the okay sign.

[–] Cornelius_Wangenheim 11 points 1 year ago (2 children)

If a bunch of "ironic" racists start using a symbol as a "joke" and one of them flashes it after murdering 50 people because of their religion, then it's officially a hate symbol.

load more comments (2 replies)
[–] Duamerthrax 10 points 1 year ago (2 children)

You got that turned around. 4chan convinced politicians/pundits the ok symbol was white supremacist. Honestly, it worked, but they should have picked the shocker. Would have actually been funny.

load more comments (2 replies)
load more comments (1 replies)
[–] tmsqhazdzp 10 points 1 year ago

…something something… making your whites whiter… they’ll get the message they’re after

[–] [email protected] 32 points 1 year ago (1 children)

apparently there weren't really any people eating tide pods

[–] handhookcardoor 12 points 1 year ago

I’m pretty sure more people did it after it blew up in the ‘news’ than ever did it before that point.

[–] [email protected] 137 points 1 year ago* (last edited 1 year ago) (8 children)

Recently watched a documentary called 'The YouTube Effect' by Alex Winter (Bill of Bill and Ted), which goes into how YouTube was instrumental in creating the current global state of radicalized individuals.

In the earlyish days of the internet (late 1990s / early 00s), I fell deep down the rabbit hole of right-wing hate and conspiracy theories.

One subject of the doc explains his descent, and it is almost exactly mine. Only these days it's hyper-stimulated, laser-targeted, data-driven psychological warfare, wrapped in polished, billionaire-backed campaigns.

It comes at you from wherever you are.

Crypto bros. Health/hydro bros. incel bros. Christian bros. Muslim bros. Rogan bros. Peterson bros. Elon bros. Tech bros. Anon bros. etc.

By the time a lot of people realize what's happened, if ever, they're already in too deep.

[–] [email protected] 56 points 1 year ago (8 children)

Crypto bros. Health/hydro bros. incel bros. Christian bros. Muslim bros. Rogan bros. Peterson bros. Elon bros. Tech bros. Anon bros. etc.

Hmm I'm sensing a theme here...

[–] NegativeInf 28 points 1 year ago (3 children)

I think I fell in the other YouTube rabbit hole? My recs are all progressive podcasters, history video essays, and YouTube creator podcasts that complain about YouTube?

[–] [email protected] 14 points 1 year ago (1 children)

Mine's basically all nerd shit like physics, ham radio, robotics. And also call me kris.

load more comments (1 replies)
load more comments (2 replies)
[–] nyoooom 25 points 1 year ago (2 children)

Loneliness, lack of purpose, which then gets fulfilled by a relatively small community driven by defending an idea, ideology, and/or an individual.

[–] [email protected] 19 points 1 year ago

Yup, it's the same old song of fascists and cultists.

"Are you lonely, sad, angry, or just generally dissatisfied with your life? Have you tried blaming your problems on a minority with less power in society than yourself? Act now, and I'll throw in a second minority, free!*"

*Just pay shipping and handling.

load more comments (1 replies)
[–] NewNewAccount 15 points 1 year ago (1 children)

Leftists aren’t immune. My YouTube has a lot of Vaush, Hasan, Sam Seder, etc.

Though I do also get Patrick Bet David and PragerU thrown in too, I think because I can’t help but watch for a window into their line of thinking.

[–] [email protected] 8 points 1 year ago

It is not quite the same. It recommends things you watch. If you watch Hasan you tend to get more Hasan stuff but only on rare occasions do you get Vaush stuff.

Back in the day you could watch one non-political Thunderf00t video about some scam and the recommendations would be a rogues' gallery of anti-SJWs with no other recommendations.

Now you can get radicalized because you want to be and it's a nice saunter down the hill. Then it was a sheer cliff you could accidentally fall into. If you didn't experience it you can't really imagine how stupid it was.

[–] ST5000 12 points 1 year ago

There's a recent documentary movie about that called Bros. I suggest you check it out.

load more comments (4 replies)
[–] MacGuffin94 28 points 1 year ago (6 children)

I like to watch videos of media critiques. Somehow all the ones I keep getting recommended are anti-woke d-bags who blame every bad movie choice on the company/producer/director/etc. going woke. I've pretty much had to stop watching those types of videos and try to rebalance the algorithm by watching literally anything that seems remotely left-leaning. It's been 2 months and it's barely better.

[–] AustralianSimon 10 points 1 year ago

Yeah, a lot of channels used to be actual discussion; now it's all culture war BS.

load more comments (5 replies)
[–] scarabic 14 points 1 year ago (1 children)

We need more liberal critical thinking bros.

load more comments (1 replies)
load more comments (5 replies)
[–] Duamerthrax 103 points 1 year ago (17 children)

Weird. YouTube keeps recommending right-wing videos even though I've purged them from my watch history and always selected Not Interested. It got to the point that I installed a third-party channel blocker.

I don't even watch too many left leaning political videos and even those are just tangentially political.

[–] nutsack 37 points 1 year ago* (last edited 1 year ago) (8 children)

I think if you like economics or fast cars you will also get radical right-wing talk videos. If you like guns it's even worse.

load more comments (8 replies)
[–] Kuya 19 points 1 year ago (4 children)

I've been watching tutorials on jump rope and kickboxing. I do watch YouTube Shorts, but lately I'm being shown Andrew Tate stuff. I didn't skip it quickly enough, and now 10% of the things I see are right-leaning, bot-created content. Slowly, gun-related, self-defense, and Minecraft videos are taking over my YouTube Shorts.

[–] [email protected] 13 points 1 year ago

Kickboxing to Andrew Tate is unfortunately a short jump for the algorithm to make, I guess

load more comments (3 replies)
load more comments (15 replies)
[–] [email protected] 49 points 1 year ago (5 children)

The article below:

Around the time of the 2016 election, YouTube became known as a home to the rising alt-right and to massively popular conspiracy theorists. The Google-owned site had more than 1 billion users and was playing host to charismatic personalities who had developed intimate relationships with their audiences, potentially making it a powerful vector for political influence. At the time, Alex Jones’s channel, Infowars, had more than 2 million subscribers. And YouTube’s recommendation algorithm, which accounted for the majority of what people watched on the platform, looked to be pulling people deeper and deeper into dangerous delusions.

The process of “falling down the rabbit hole” was memorably illustrated by personal accounts of people who had ended up on strange paths into the dark heart of the platform, where they were intrigued and then convinced by extremist rhetoric—an interest in critiques of feminism could lead to men’s rights and then white supremacy and then calls for violence. Most troubling is that a person who was not necessarily looking for extreme content could end up watching it because the algorithm noticed a whisper of something in their previous choices. It could exacerbate a person’s worst impulses and take them to a place they wouldn’t have chosen, but would have trouble getting out of.

Just how big a rabbit-hole problem YouTube had wasn’t quite clear, and the company denied it had one at all even as it was making changes to address the criticisms. In early 2019, YouTube announced tweaks to its recommendation system with the goal of dramatically reducing the promotion of “harmful misinformation” and “borderline content” (the kinds of videos that were almost extreme enough to remove, but not quite). At the same time, it also went on a demonetizing spree, blocking shared-ad-revenue programs for YouTube creators who disobeyed its policies about hate speech. Whatever else YouTube continued to allow on its site, the idea was that the rabbit hole would be filled in.

A new peer-reviewed study, published today in Science Advances, suggests that YouTube’s 2019 update worked. The research team was led by Brendan Nyhan, a government professor at Dartmouth who studies polarization in the context of the internet. Nyhan and his co-authors surveyed 1,181 people about their existing political attitudes and then used a custom browser extension to monitor all of their YouTube activity and recommendations for a period of several months at the end of 2020. They found that extremist videos were watched by only 6 percent of participants. Of those people, the majority had deliberately subscribed to at least one extremist channel, meaning that they hadn’t been pushed there by the algorithm. Further, these people were often coming to extremist videos from external links instead of from within YouTube.

These viewing patterns showed no evidence of a rabbit-hole process as it’s typically imagined: Rather than naive users suddenly and unwittingly finding themselves funneled toward hateful content, “we see people with very high levels of gender and racial resentment seeking this content out,” Nyhan told me. That people are primarily viewing extremist content through subscriptions and external links is something “only [this team has] been able to capture, because of the method,” says Manoel Horta Ribeiro, a researcher at the Swiss Federal Institute of Technology Lausanne who wasn’t involved in the study. Whereas many previous studies of the YouTube rabbit hole have had to use bots to simulate the experience of navigating YouTube’s recommendations—by clicking mindlessly on the next suggested video over and over and over—this is the first that obtained such granular data on real, human behavior.

The study does have an unavoidable flaw: It cannot account for anything that happened on YouTube before the data were collected, in 2020. “It may be the case that the susceptible population was already radicalized during YouTube’s pre-2019 era,” as Nyhan and his co-authors explain in the paper. Extremist content does still exist on YouTube, after all, and some people do still watch it. So there’s a chicken-and-egg dilemma: Which came first, the extremist who watches videos on YouTube, or the YouTuber who encounters extremist content there?

Examining today’s YouTube to try to understand the YouTube of several years ago is, to deploy another metaphor, “a little bit ‘apples and oranges,’” Jonas Kaiser, a researcher at Harvard’s Berkman Klein Center for Internet and Society who wasn’t involved in the study, told me. Though he considers it a solid study, he said he also recognizes the difficulty of learning much about a platform’s past by looking at one sample of users from its present. This was also a significant issue with a collection of new studies about Facebook’s role in political polarization, which were published last month (Nyhan worked on one of them). Those studies demonstrated that, although echo chambers on Facebook do exist, they don’t have major effects on people’s political attitudes today. But they couldn’t demonstrate whether the echo chambers had already had those effects long before the study.

The new research is still important, in part because it proposes a specific, technical definition of rabbit hole. The term has been used in different ways in common speech and even in academic research. Nyhan’s team defined a “rabbit hole event” as one in which a person follows a recommendation to get to a more extreme type of video than they were previously watching. They can’t have been subscribing to the channel they end up on, or to similarly extreme channels, before the recommendation pushed them. This mechanism wasn’t common in their findings at all. They saw it act on only 1 percent of participants, accounting for only 0.002 percent of all views of extremist-channel videos.
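The mechanics of that definition are easier to see spelled out. Below is a minimal, hypothetical Python sketch of how a "rabbit hole event" might be flagged from a viewing log, going only by the criteria the article summarizes; the tier labels, the View record, and the function itself are illustrative assumptions, not the study's actual classifier or data format.

```python
from dataclasses import dataclass

# Hypothetical extremity tiers, ordered from least to most extreme
# (an assumption for illustration, not the paper's taxonomy).
TIERS = ["mainstream", "alternative", "extremist"]


@dataclass
class View:
    channel: str               # channel the watched video belongs to
    tier: str                  # extremity tier assigned to that channel
    via_recommendation: bool   # reached by following an on-site recommendation?


def is_rabbit_hole_event(view: View, prior_views: list[View],
                         subscriptions: dict[str, str]) -> bool:
    """Flag a view as a 'rabbit hole event' per the article's summary:
    the user followed a recommendation to a more extreme type of video than
    they were previously watching, and was not already subscribed to that
    channel or to similarly extreme channels."""
    if not view.via_recommendation:
        return False
    rank = TIERS.index(view.tier)
    # Must be more extreme than anything the user was previously watching.
    most_extreme_prior = max((TIERS.index(v.tier) for v in prior_views), default=-1)
    if rank <= most_extreme_prior:
        return False
    # Must not already be subscribed to this channel, or to any channel of
    # equal or greater extremity.
    if view.channel in subscriptions:
        return False
    if any(TIERS.index(t) >= rank for t in subscriptions.values()):
        return False
    return True
```

Counting views flagged this way across a panel is one way to arrive at per-participant and per-view rates like the 1 percent and 0.002 percent figures cited above.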

This is great to know. But, again, it doesn’t mean that rabbit holes, as the team defined them, weren’t at one point a bigger problem. It’s just a good indication that they seem to be rare right now. Why did it take so long to go looking for the rabbit holes? “It’s a shame we didn’t catch them on both sides of the change,” Nyhan acknowledged. “That would have been ideal.” But it took time to build the browser extension (which is now open source, so it can be used by other researchers), and it also took time to come up with a whole bunch of money. Nyhan estimated that the study received about $100,000 in funding, but an additional National Science Foundation grant that went to a separate team that built the browser extension was huge—almost $500,000.

Nyhan was careful not to say that this paper represents a total exoneration of YouTube. The platform hasn’t stopped letting its subscription feature drive traffic to extremists. It also continues to allow users to publish extremist videos. And learning that only a tiny percentage of users stumble across extremist content isn’t the same as learning that no one does; a tiny percentage of a gargantuan user base still represents a large number of people.

This speaks to the broader problem with last month’s new Facebook research as well: Americans want to understand why the country is so dramatically polarized, and people have seen the huge changes in our technology use and information consumption in the years when that polarization became most obvious. But the web changes every day. Things that YouTube no longer wants to host could still find huge audiences, instead, on platforms such as Rumble; most young people now use TikTok, a platform that barely existed when we started talking about the effects of social media. As soon as we start to unravel one mystery about how the internet affects us, another one takes its place.

[–] [email protected] 19 points 1 year ago (2 children)

Another way to put that study’s weakness, in scientific terms, is that there’s no control group against which the studied group is being compared. There’s zero indication that the 2019 changes had any effect at all, without some data from before those changes.

load more comments (2 replies)
load more comments (4 replies)
[–] [email protected] 43 points 1 year ago (2 children)

I see the same headline 3 times. õ.Ô

[–] [email protected] 8 points 1 year ago

Now it's 4. They multiply!

[–] Matriks404 8 points 1 year ago* (last edited 1 year ago)

I have read it 3 times and still didn't understand what it is about.

[–] [email protected] 26 points 1 year ago (1 children)

Weirdly, YouTube's algo propelled me down the Pinko-commie anarcho-socialist boy-we-suck-at-democracy rabbit hole. I was an avid BreadTuber long before I ever heard the name BreadTube.

load more comments (1 replies)
[–] dylanTheDeveloper 24 points 1 year ago* (last edited 1 year ago) (1 children)

I keep getting 'rescue' animal videos, which involve people purposely putting puppies and kittens in distressing situations so they can 'save them'. It's sick, and no matter how often I block and report those videos, they reappear next month. I also get a lot of 'police shooting people' videos, which I also try to block.

load more comments (1 replies)
[–] afraid_of_zombies 20 points 1 year ago (1 children)

All my YouTube recommendations went downhill about 3 years ago. I am bombarded by right-wing Christian stuff no matter how many times I flag and complain.

[–] megalodon 10 points 1 year ago (3 children)

I'm bombarded by Joe Rogan stuff. I keep blocking the channels but there is an endless stream of them

load more comments (3 replies)
[–] [email protected] 18 points 1 year ago* (last edited 1 year ago) (1 children)

Wait what? Maybe I’m misunderstanding, but this is what I got out of the article:

“We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”

That seems like really, really bad science. Or at least, really really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago.

I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.

load more comments (1 replies)
[–] Pat12 18 points 1 year ago

Because TikTok replaced YouTube in that regard.

[–] [email protected] 8 points 1 year ago (1 children)

Who did these stats? I'm getting more right-wing propaganda than ever. Also, Facebook is just as bad as ever. I really like stuff like the fediverse since I can control my feed.

load more comments (1 replies)
load more comments