TLDR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended Shorts without skipping. They repeat this five times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)
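As a rough back-of-envelope (my own toy model, not from the video): if alt-right Shorts were served independently at some fixed per-Short probability p, the first one would appear after about 1/p Shorts on average, so the observed counts imply a rate of roughly 0.4–1.1% per Short:

```python
# Toy model, not from the video: treat the index of the first alt-right
# short as a geometric waiting time, so the implied per-short rate is
# about 1/count. City names and counts are the ones reported above.
counts = {"Houston": 88, "Chicago": 98, "Atlanta": 109, "NYC": 247}
implied_rates = {city: 1 / n for city, n in counts.items()}
for city, p in implied_rates.items():
    print(f"{city}: implied rate of roughly {p:.2%} per short")
```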

There however, was a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator narrating verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion and thus rank higher in the algorithm. They say the algorithm isn’t necessarily left wing or right wing, but that alt-right creators have better understood how to capture and grow their audience.
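That hypothesis can be sketched as a toy ranker (a made-up illustration, not YouTube’s actual system; the signal names and weights are invented): if the score is purely a weighted sum of engagement signals, emotionally charged content wins the ranking even though the scorer knows nothing about politics.

```python
# Toy engagement-only ranker: no notion of left or right, just
# "what keeps eyes on it". Weights are arbitrary illustrative values.
def engagement_score(video):
    return (0.5 * video["watch_fraction"]
            + 0.3 * video["comment_rate"]
            + 0.2 * video["share_rate"])

candidates = [
    {"id": "calm_explainer", "watch_fraction": 0.6, "comment_rate": 0.01, "share_rate": 0.02},
    {"id": "outrage_bait",   "watch_fraction": 0.9, "comment_rate": 0.20, "share_rate": 0.10},
]
# The emotionally charged video ranks first purely on engagement.
ranked = sorted(candidates, key=engagement_score, reverse=True)
```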

  • @ragebutt@lemmy.dbzer0.com
    47 points · edited · 2 months ago

    Do these companies put their fingers on the scale? Almost certainly

    But it’s exactly what he said that brought us here. They have not particularly given a shit about politics (aside from “no taxes” and “let me do whatever I want all the time”). However, the algorithms consistently reward engagement. Engagement doesn’t care about “good” or “bad”; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.

    This is a natural extension of human behavior. Behavior occurs because it serves a function: I do X to obtain reinforcement, whether that’s attention, access to something tangible, escape, or automatic reinforcement.

    Attention-maintained behaviors are tricky because people are shitty at removing attention, and attention is a powerful reinforcer. You tell everyone involved “this person feeds off of your attention, ignore them”. Everyone agrees. The problematic person pulls their bullshit, and then someone goes “stop it”. People call that negative reinforcement (it isn’t; it’s probably positive reinforcement, or arguably positive punishment, though it’s questionable how aversive it is).

    You get people to finally shut up and they still make eye contact, or non verbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it and then someone new enters the mix who doesn’t get what’s going on.

    This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls”. You can teach every single person on Lemmy or reddit or whoever to simply block a malicious user but tomorrow a dozen or more new and naive people will register who will fuck it all up

    The complexity behind the algorithms is similar. The algorithms aren’t people but they work in a similar way. If bad behavior is given attention the content is weighted and given more importance. The more we, as a society, can’t resist commenting, clicking, and sharing trump, rogan, peterson, transphobic, misogynist, racist, homophobic, etc content the more the algorithms will weight this as “meaningful”

    This of course doesn’t mean these companies are without fault. This is where content moderation comes into play. This is where the many studies that found social media lead to higher irritability, more passive aggressive behavior and lower empathetization could potentially have led us to regulate these monsters to do something to protect their users against the negative effects of their products

    If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state with zero transparency as to its inner workings.

      • @x00z@lemmy.world
        1 point · 2 months ago

        Commenting on stuff definitely strengthens it, but I wouldn’t know if a shadow ban changes that. I don’t think there’s much difference if you are shadowbanned or not, you’re still interacting with the content.

        • @glowing_hans@sopuli.xyz
          1 point · 2 months ago

          In my view, Instagram blocking the search term “democrat” for short periods is kind of a shadowban … on all Democrats.

          • @x00z@lemmy.world
            6 points · 2 months ago

            That’s not what a shadowban is. A shadow ban is where the user does not know they are banned. These search terms were very obviously censorship and not a shadowban.

            If it were a shadowban then you would still get results and be able to interact with it. But some results might have been hidden and your interactions would be hidden to others too. A shadowban is meant to make you believe you were not censored.

  • @Shardikprime@lemmy.world
    1 point · 2 months ago

    If the channel is popular, those videos will get recommended.

    If it has engagement on top of that, you are fucked; it will definitely get recommended to you.

    Either block the channel, block the user, or use incognito. Or don’t.

    • @psx_crab@lemmy.zip
      3 points · edited · 2 months ago

      I didn’t watch the video, but these are YT Shorts; you just swipe, like TikTok. The few ways to curate the algorithm are to swipe away quickly, click the “not interested” button, downvote, or delete watched Shorts from your history. If you don’t interact with any of these and watch the video full-length, the algorithm is gonna assume you like this kind of content. They’ll also introduce you to content you’ve never watched before to gauge your interest; a lot of the time it’s not even related to what you currently watch, and if you don’t do any curation, they’re gonna feed you that exact type for some time. I don’t know how they manage the curation, but that’s the gist of it from my experience. My feed has zero politics, mostly cats. I control the feed strictly, so I get what I demand.
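The curation signals described above could be sketched as a toy interest score (field names and weights are invented for illustration, not YouTube’s real system): watching to the end nudges a topic up, swiping away early or hitting “not interested” pushes it down.

```python
# Toy per-topic interest score. Watching past the halfway point adds,
# swiping away early subtracts, and an explicit "not interested"
# is a large negative signal. All numbers are made up.
def update_interest(score, watch_fraction, not_interested=False):
    if not_interested:
        return score - 1.0                     # explicit negative signal dominates
    return score + (watch_fraction - 0.5)      # full watch +0.5, instant swipe -0.5

s = 0.0
s = update_interest(s, watch_fraction=1.0)     # watched fully
s = update_interest(s, watch_fraction=0.05)    # swiped away almost immediately
s = update_interest(s, 0.8, not_interested=True)  # hit "not interested"
```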

  • @Imhotep@lemmy.world
    4 points · 2 months ago

    fresh YouTube account

    change their location to a random city in the US

    yeah but you’re still bound to IP addresses. I was under the impression Youtube used those for their profiling

      • @Imhotep@lemmy.world
        2 points · 2 months ago

        most likely yes.

        My point is, if YouTube customizes the feeds based on IPs, then the YouTube accounts used are not really “fresh”; there’s already some data entered into the profiles upon their creation.

      • @ObsidianNebula@sh.itjust.works
        3 points · 2 months ago

        You are correct. He used a VPN for several US locations in the video. He then compared what content was shown in different regions of the US to see if everyone sees the same thing or if the content is radically different depending on where you are.

        • @Shardikprime@lemmy.world
          1 point · 2 months ago

          Even then, do they really think YouTube has not flagged VPN connections and taken them into account in its suggestions?

          If a lot of people are using that VPN IP range, it’s basically the same as doing nothing.

  • Pennomi
    202 points · 2 months ago

    I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.

    • @credo@lemmy.world
      15 points · 2 months ago

      I refuse to watch those shit shorts; I think your theory has legs. Unfortunately there doesn’t seem to be a way to turn them off.

        • @Clinicallydepressedpoochie@lemmy.world
          0 points · 2 months ago

          Was it a conspiracy in 2016? Was it a conspiracy that Elon bought X to control the narrative? Was it a conspiracy that TikTok has avoided shutdown by glazing Trump? Was it a conspiracy when Zuck changed how Facebook did moderation and then showed up at Trump’s inauguration?

          Youtube cucks are the worst.

    • @jimmy90@lemmy.world
      4 points · edited · 2 months ago

      yeah i created a new youtube account in a container once and just watched all the popular/drama suggestions. that account turned into a shitstorm immediately

      these days i curate my youtube accounts making liberal use of Not interested/Do not recommend channel/Editing my history and even test watching in a container before watching it on my curated account

      this is just how “the algorithm” works. shovel more of what you watch in your face.

      the fact that they initially will give you right-wing, conspiracy fueled, populist, trash right off the bat is the concern

    • @Plebcouncilman@sh.itjust.works
      65 points · edited · 2 months ago

      I was gonna say this. There’s very little liberal or left-leaning media being made, and what there is is mostly made for a female or LGBTQ audience. Not saying that men cannot watch those, but there’s not a lot of “testosterone-infused” content with a liberal leaning (one of the reasons Trump won), so by sheer volume you’re bound to see more right-leaning content, especially if you are a cisgender male.

      Been considering creating content myself to at least stem the tide a little.

      • @BassTurd@lemmy.world
        39 points · 2 months ago

        I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.

        • @anomnom@sh.itjust.works
          9 points · 2 months ago

          Plus fact based videos require research, sourcing and editing.

          Emotional fiction only takes as long to create as a daydream.

        • @Plebcouncilman@sh.itjust.works
          15 points · 2 months ago

          Not necessarily. For example, a lot of “manosphere” guys have taken hold of philosophy, health, and fitness topics; a liberal influencer can give a liberal view on these same subjects. In philosophy, for example: explain how Nietzsche was not just saying that you can do whatever the fuck you want, or how Stoicism is actually a philosophy of tolerance, not of superiority, etc. There’s really a lot of space that can be covered.

      • @sugar_in_your_tea@sh.itjust.works
        -13 points · 2 months ago

        Really? As someone who dislikes both mainstream extremes (I consider myself libertarian), I see a lot more left-leaning content than right-leaning content. I wouldn’t be surprised if >75% of the content I watch comes from a left-leaning creator, not because I seek it out, but because young people into tech tend to lean left, and I’m into tech.

      • @UraniumBlazer@lemm.ee (OP)
        4 points · 2 months ago

        Agreed 100%. Whenever I’m in India, I get Hindu nationalist content A LOT. I briefly attempted the dislike/don’t recommend thing, but nope! I was getting absolutely spammed with stuff like this regardless. I just disabled shorts after that.

  • @Valmond@lemmy.world
    13 points · 2 months ago

    I bet those right-wing shorts are proposed and shoehorned in everywhere because someone pays for the visibility. Simple as that.

  • @FireTower@lemmy.world
    15 points · 2 months ago

    Saying it disproportionately promotes any type of content is hard to prove without first establishing how much of the whole is made up by that type.

    The existence of proportionately more “right” leaning content than “left” leaning content could adequately explain the outcomes.
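A quick simulation of that base-rate point (the 10% and 2% shares are made-up numbers, purely to illustrate): with an unbiased random feed, a category that makes up a larger share of the candidate pool simply shows up sooner on average, with no thumb on the scale required.

```python
import random

# How many shorts until the first hit, if each short independently
# belongs to the category with probability `share` (geometric model).
def first_hit(share, rng):
    n = 1
    while rng.random() > share:
        n += 1
    return n

rng = random.Random(42)
trials = 10_000
avg_right = sum(first_hit(0.10, rng) for _ in range(trials)) / trials
avg_left = sum(first_hit(0.02, rng) for _ in range(trials)) / trials
# With these made-up shares, the first "right" short arrives around
# short #10 on average, versus around short #50 for "left".
```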

    • @glowing_hans@sopuli.xyz
      -3 points · edited · 2 months ago

      Adding to this: YouTube’s audience seems male-dominated, and males are, on average, more right-leaning.

      “approximately 54.3 percent of YouTube [users are] male” (Statista)

      • ZeroOne
        3 points · 2 months ago

        Yeah, I wonder why they’re right-leaning? It’s not as if something is pushing men to the right.

            • @patatahooligan@lemmy.world
              3 points · 2 months ago

              Bro, no way you’re not a troll. I clicked on the 2nd link to see what the source is. It’s a guy using Copilot to calculate the percentage of women smoking and drinking during pregnancy, as well as late abortions, then doing completely arbitrary math to conclude that women are more violent. If I wrote a parody of a person abusing stats to prove stupid points, I would never have managed to make it as ridiculous as this.

              • ZeroOne
                -3 points · edited · 2 months ago

                Ok, what is the arbitrary math here? So you’re simply calling evidence fake/ridiculous because you don’t agree with it, without showing any counter-evidence? Great, keep doing that. Go there and argue with the guy if you are capable of showing more accurate math.

                Also, trolls don’t link evidence that supports their talking points; they do exactly what you’re doing.

                • @patatahooligan@lemmy.world
                  2 points · 2 months ago

                  Comparing pregnant women who drink to men in prison as equally violent individuals?

                  Straight up adding pregnant women who smoke to pregnant women who drink alcohol to women who get late stage abortions with no concern that an individual might belong to more than one group?

                  Removing fathers who drank before conception from the equation entirely, with the justification that an article called it less harmful (but clearly not harmless), which is the opposite of what he did when he put the number of drinkers and convicted criminals in the same equation?

                  Go there & argue with guy if you are capable of showing a more accurate math

                  I’m not going to argue with that dipshit because it’s total waste of time. And so is arguing with you. I only commented for the benefit of other users who might scroll by without noticing the absolutely ridiculous evidence you cited, and get trapped into taking your position at face value.

            • @glowing_hans@sopuli.xyz
              0 points · 2 months ago

              Yes, I mean all cultures created by humans, no exception exists, as you seem to imply.

              […] such that men comprise 95% of those convicted for homicide worldwide (United Nations Office on Drugs and Crime 2013). https://www.journals.uchicago.edu/doi/10.1086/711705

              From my view, males are to an extent biologically programmed to be more right-leaning and more violent, and always have been. This is why they so often die in wars and in male-on-male violence, and why on average they stand atop hierarchies of physical power. This risk-taking behavior might have positive side effects in case of victory as well. And let’s not forget that YouTube was founded by an all-male team.

        • magic_lobster_party
          7 points · 2 months ago

          This is my thought, but I think many young men are (rightfully) frustrated, but they don’t know what they’re frustrated about. It’s hard to get a job - especially without higher education. It’s hard to buy a home and build a family. Many young men are increasingly more alone.

          At the same time, there’s a lot of talk about “white male privilege”. “What privilege?”, they might think. They don’t feel particularly privileged about their situation.

          And then they find people like Jordan Peterson, who seems to speak to their struggles. For the first time, they hear someone who seems to understand them. And he points to a (very wrong) diagnosis of the situation: it’s the fault of woke identity politics! But that’s good enough for them, and that’s where the alt-right pipeline starts.

  • @bulwark@lemmy.world
    11 points · 2 months ago

    I noticed my feed almost immediately changed after Trump was elected. I didn’t change my viewing habits. I’m positive YouTube tweaked the algorithm to lean more right.

  • @danciestlobster@lemm.ee
    29 points · 2 months ago

    I don’t think it makes me feel better to know that our descent into fascism is because gru promised 1MM rizz for it

    • @eronth@lemmy.world
      20 points · 2 months ago

      Insanely, that seems to be the play. Not logic or reason, but brainrot and low blows. Which is a bit at odds with the actual desire.

      • @xor@lemmy.dbzer0.com
        8 points · 2 months ago

        fight fire with fire i guess….
        maybe people get on board quicker if they feel the emotions first, and then learn the logic….
        one good example is Noam Chomsky: everything he says is gold, but he says it so slowly and dispassionately that even people who agree with him find it hard to watch.

    • @sudo@programming.dev
      7 points · edited · 2 months ago

      It just has to be extremely simplified and evoke emotional reactions. That’s just basic propaganda rules. The brainrot quality of the content is a consequence of its sheer quantity: you can’t make that volume of content without fully automated AI slop.

      What the experiment overlooks is that there are PR companies being paid to flood YouTube with right-wing content, actively trying to game its algorithm. There simply isn’t a left wing with the capital to manufacture that much content. No Soros-bucks for AI minions in keffiyehs talking about Medicare.