TLDR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and thus rank higher in the algorithm. They say the algorithm isn’t necessarily left wing or right wing, but that the alt-right has simply understood better how to capture and grow an audience on it.

  • gmtom@lemmy.world · +2 · 35 minutes ago

    So you’re saying we need to start pumping out low quality left wing brainrot?

  • danciestlobster@lemm.ee · +19/−1 · 6 hours ago

    I don’t think it makes me feel better to know that our descent into fascism is because Gru promised 1MM rizz for it

  • DarkFuture@lemmy.world · +3 · 4 hours ago

    I like YouTube (with adblocker) but shorts are pretty trashy. It’s mostly shorts of women who are as naked as they can get on YouTube without breaking the rules, who have purposefully given themselves super camel toes and set the thumbnail to show it off and grab people’s attention. And it’s just a front to get you to their OnlyFans.

  • WrenFeathers@lemmy.world · +5 · 5 hours ago

    Same happened to me (live in WA) but not only do I get pro-tyranny ads and Broprah (Rogan) shorts, I also get antivax propaganda.

    I always use the “show less of this” option or outright remove it from my feed. Seems better now.

      • WrenFeathers@lemmy.world · +4 · edited · 4 hours ago

        True, but the comparison lies more in the fact that, according to her fanbase, she can seemingly do no wrong. They gobble up anything she says like a nest of hungry baby birds.

        I see Rogan’s army of dudebros as being no different, only less intelligent.

  • HoMaster@lemm.ee · +28 · 9 hours ago

    Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains react to more strongly (because of potential danger) than positive videos spreading unity and love. It’s all about getting as many eyeballs as possible on the video to make money, and this is the most effective way to do it.

      • Duamerthrax@lemmy.world · +10 · 7 hours ago

        Are people making clickbait/ragebait articles about climate change? Are people seeking out clickbait about climate change?

        I don’t need to be constantly reminded of climate change, but an old “friend” is constantly telling me about the politics of video games he doesn’t even have a system to play with.

      • deus@lemmy.world · +7 · edited · 7 hours ago

        All alt-right content is made to generate outrage, but content that generates outrage is not necessarily alt-right.

        • I Cast Fist@programming.dev · +5 · 7 hours ago

          Another important part of alt right bullshit is that they blame people that viewers can easily identify on the streets. Crime? It’s the immigrants and blacks! Shit economy? Jews and the deep state!

          So, I guess the only way to fight climate change is by accusing every petrol CEO of being a deep state Jew gay communist

          • MaggiWuerze@feddit.org · +2 · 6 hours ago

            I don’t think you meant it that way, but how are Jews ‘easily identifiable’ on the street?

            • I Cast Fist@programming.dev · +2 · 5 hours ago

              Ever seen that caricature of a Jew? The one with a huge nose and a grin, curly hair? That’s how the idiots picture all Jews. It doesn’t matter that it’s a racist/xenophobic stereotype; it gives them a “clear, recognizable face” for the enemy, an image of “the enemy” in their minds.

      • Saltycracker@lemmy.world · +1/−1 · 5 hours ago

        I feel like they have a hard time defining alt-right. If you type in “is drinking coffee alt right”, there’s an article; same for playing video games or driving cars.

  • socialmedia@lemmy.world · +34/−1 · edited · 10 hours ago

    I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.

    The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

    This is pervasive, and even if educated adults tune it out, there are always children, who get Mr. Beast and thousands of others trying to trick them into liking, subscribing, and following.

    This is something governments should be looking at how to control. Propaganda created for the sole purpose of making money is still propaganda. I think that, at this point, any site that feeds content through an algorithm personalizing the feed for each user is compromised.

    • ayyy@sh.itjust.works · +1 · 19 minutes ago

      This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

    • whoisearth@lemmy.ca · +9 · 8 hours ago

      The problem is education. It’s a fool’s game to try to control human nature, which commodifies everything; you will always have commercials and propaganda.

      What is in our means is to strengthen education on how to think critically and understand your environment. This is where we have failed, and I’ll argue there are people actively destroying this for their own gain.

      Educated people are dangerous people.

      It’s not 1984. It’s Brave New World. Aldous Huxley was right.

  • doortodeath@lemmy.world · +15 · 12 hours ago

    I don’t know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but over the last few years the site has slowly pushed right wing content and is now infested with alt-right and even blatantly racist “memes” and comment sections. Feels to me like astroturfing to push viewers and posters in a certain political direction. As an example: during the US election, the war on Palestine suddenly became a recurring theme depicting the Biden admin and Jews as “bad actors” and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn’t intervene in Palestine…

    • blubfisch@discuss.tchncs.de · +1 · 9 hours ago

      From what I heard, the site was astroturfing long before it took a right turn. But my only sources are online rumors…

  • vga@sopuli.xyz · +5 · edited · 9 hours ago

    The people where I live are – I guess – complete morons because whenever I try to check out Youtube without being logged in, I get the dumbest of dumb content.

    But as another weird data point, I once suggested my son check out a Contrapoints video which I found interesting and about 1 year later she told me she wanted to get a surgery – I don’t exactly remember which kind as I obviously turned immediately into a catatonic far right zombie.

  • Shardikprime@lemmy.world · +2 · 7 hours ago

    If the channel is popular, those videos will get recommended.

    If it has engagement on top of that, you are fucked, it will definitely get recommended to you.

    Either block the channel, block the user, or use incognito. Or don’t.

  • Blackmist@feddit.uk · +30 · 14 hours ago

    From my anecdotal experiences, it’s “manly” videos that seem to lead directly to right wing nonsense.

    Watch something about how a trebuchet is the superior siege machine, and the next video recommended is like “how DEI DESTROYED Dragon Age Veilguard!”

    • Valmond@lemmy.world · +20 · 14 hours ago

      Or “how to make ANY woman OBEY you!”

      Check out a short about knife sharpening or just some cringe shit and you’re all polluted.

  • Victor@lemmy.world · +60/−1 · 16 hours ago

    I keep getting recommendations for content like “this woke person got DESTROYED by logic” on YouTube. Even though I click “not interested”, and even “don’t recommend channel”, I keep getting the same channel, AND video recommendation(s). It’s pretty obvious bullshit.

    • lennivelkant@discuss.tchncs.de · +16 · 14 hours ago

      You’d think a recommendation algorithm should take your preferences into account - that’s the whole justification for tracking your usage in the first place: recommending relevant content for you…

      • Ace@feddit.uk · +11 · 13 hours ago

        “recommending relevant content for you”

        The aim is not to recommend relevant content. The aim is to recommend content you will engage with. That may be because you’re interested, or it may be because it’s ragebait that you will hate but watch anyway.

        • lennivelkant@discuss.tchncs.de · +6 · 10 hours ago

          Even in the best-intentioned recommender system, trained on the content you watch to estimate what you’re interested in and recommend similar things, that would be the drift of things. You can’t really mathematically judge the emotions viewers might feel unless they express them in a measurable way, so the system observes their behaviour and recommends similar content by whatever heuristic it uses. And if they keep clicking on rageposts, that’s what the system has to go on.

          But at least giving the explicit indication “I don’t want to see this” should be heavily weighted in that calculation. Just straight up ignoring that is an extra layer of awful.
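
          A rough sketch of the kind of weighting I mean, with purely made-up names, scores and weights (not any real platform’s system): if predicted engagement is the only thing being optimized, ragebait wins by default, and an explicit “not interested” only changes the outcome if it carries a heavy weight.

          ```python
          # Toy illustration: hypothetical scores and weights, not any real platform's system.

          def rank_videos(candidates, not_interested_ids):
              """Rank candidate videos by a naive engagement heuristic.

              candidates: list of dicts like
                  {"id": "abc", "predicted_watch_time": 42.0, "predicted_click_prob": 0.3}
              not_interested_ids: set of ids the user explicitly rejected.
              """
              NOT_INTERESTED_PENALTY = 1000.0  # explicit feedback should dominate the score

              def score(video):
                  # Base score: expected engagement only.
                  s = video["predicted_watch_time"] * video["predicted_click_prob"]
                  if video["id"] in not_interested_ids:
                      s -= NOT_INTERESTED_PENALTY  # push rejected content to the bottom
                  return s

              return sorted(candidates, key=score, reverse=True)


          candidates = [
              {"id": "calm_essay", "predicted_watch_time": 300.0, "predicted_click_prob": 0.05},
              {"id": "ragebait", "predicted_watch_time": 45.0, "predicted_click_prob": 0.60},
          ]
          print([v["id"] for v in rank_videos(candidates, set())])         # ragebait first (27 > 15)
          print([v["id"] for v in rank_videos(candidates, {"ragebait"})])  # ragebait drops to the bottom
          ```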

      • andallthat@lemmy.world · +8 · 14 hours ago

        it is. But who said that **you** get to decide what’s relevant for you? Welcome, and learn to trust your algorithmic overlords.

    • SaharaMaleikuhm@feddit.org · +16 · 15 hours ago

      Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.

      • Victor@lemmy.world · +4 · edited · 13 hours ago

        Are those available on PC/Linux? On my TV? 😭 I have them on my phone but I feel like there’s too much hassle to do on my main viewing devices.

  • TankovayaDiviziya@lemmy.world · +9 · 12 hours ago

    Yeah, I’ve gotten more right wing video recommendations on YouTube, even though I have turned off my history. And even when I had my history turned on, I typically watched left wing videos.

  • CircuitGuy@lemmy.world · +1 · 7 hours ago

    I use YouTube and don’t get much far-right content. My guess is it’s because I don’t watch much political content. I use a podcatcher and websites for that. If I watched political content, it might show me some lurid videos promoting politics I disagree with because that tends to keep viewers engaged with the site/app longer than if they just showed videos consistent with the ideology I seek out. That gives people the feeling they’re trying to push an ideology.

    I made that up without any evidence. It’s just my guess. I’m a moderate libertarian who leans Democratic because Republicans have not even been pretending to care about liberty, and for whatever reason it doesn’t recommend the far-right crap to me.