(This post is referring to TikTok, Instagram Reels, and YouTube Shorts-type content.)

So it’s no secret that short-form video content has become increasingly popular over the past half decade, to the point where your uncle probably watches it on some platform, and so does your 6-year-old.

Now, in terms of addictiveness, privacy falling through the floor, advertising shoved down your throat, etc., we’ve reached new records. But that isn’t what this post is about.

I’ve never used this format much, except in 2020 during the pandemic, but I can’t help noticing something when I do, even if I make a brand-new VPN’d account for it: every 5th to 10th post or so will be some kind of culture war propaganda.

The kind of stuff that made me an edgy right-wing bigot as a teenager, because I got influenced by it on YouTube. Luckily, I grew and changed. Unfortunately, many former friends, not so much.

On a completely new account you’ll get bombarded with videos like:

  • “This is why women NEED real MEN”
  • “How to piss off a cyclist”
  • “If a climate activist is blocking the road, you should be allowed to run them over”
  • “When MEN were thrown a grenade, they reacted quickly; WOMEN got confused and scared. That’s why we NEED men in the army”
  • A wholesome-looking video with the homophobic caption: “This is why children need both a mum and a dad”

These are all examples I got today when I decided to make completely new accounts on a new device with a VPN. I tried YouTube Shorts and TikTok for this and scrolled through 20-ish videos on each without liking anything.

It scares me how strong the rightward shift among the younger population is, to the point where 18-24s in recent German elections have voted the most for the borderline neo-Nazi AfD party. I know these parties are by far the most active and very popular on TikTok.

I know that to some people these examples might seem annoying but trivial. But you’re forgetting how impressionable children and teenagers are. There really is such a thing as the alt-right pipeline. One very specific example which affected me at a younger age: you start with “mad Karen” type videos, then after a couple you get recommended “mad SJW” videos. I think you see where I’m going with this; after a while you’re on “feminism debunked” and watching Peterson/Crowder/Shapiro. I know many childhood friends who went down this pipeline and became insufferably racist, bigoted adults.

I wonder what others think on this?

  • Servais (il/le)@discuss.tchncs.de (+30/-1) · 3 months ago

    To be fair, alt-right content is present in all types of media: short videos, but also tweets, articles, long videos, essays, etc.

    The fake Facebook groups created by Russia to antagonize two local groups did not need short videos to succeed: https://www.npr.org/2017/11/01/561427876/how-russia-used-facebook-to-organize-two-sets-of-protesters

    If you look at Twitter nowadays, the “For you” tab is going to be more or less what you listed above.

    • FundMECFSResearch@lemmy.blahaj.zone (OP) (+5) · 3 months ago

      You’re completely right. But somehow, I feel like it’s mostly escapable in most formats, except the short-form video type, where you rely entirely on an algorithm.

      • Servais (il/le)@discuss.tchncs.de (+4) · 3 months ago

        except the short-form video type, where you rely entirely on an algorithm.

        Facebook, Twitter, even Reddit nowadays promote a certain type of content (reaction-inducing), so unfortunately it’s not only shorts.

        • FundMECFSResearch@lemmy.blahaj.zone (OP) (+1/-1) · 3 months ago

          Again, correct, but you still customise your feed to a large extent by following stuff.

          Although you can follow people on TikTok and the like, your feed is almost entirely determined by how you interact with content, i.e. how long you stay on certain videos, what you like, what makes you close the app, etc.
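
          To make that concrete, the core of such an engagement-driven ranker is conceptually something like the sketch below. The signal names and weights are made up for illustration; this is obviously not TikTok’s actual system, just the general shape of the feedback loop.

          ```python
          # Hypothetical sketch of an engagement-driven feed ranker.
          # Signal names and weights are invented for illustration; this is
          # not TikTok's actual system.
          from dataclasses import dataclass

          @dataclass
          class Interaction:
              watch_seconds: float  # how long you stayed on the video
              liked: bool           # whether you tapped like
              closed_app: bool      # whether this video made you close the app

          def engagement_score(i: Interaction) -> float:
              """Higher score = the video held your attention."""
              score = i.watch_seconds
              if i.liked:
                  score += 30.0
              if i.closed_app:
                  score -= 60.0
              return score

          def update_topic_weights(weights: dict, topic: str, i: Interaction,
                                   lr: float = 0.1) -> None:
              """Nudge the feed toward whatever you engaged with."""
              weights[topic] = weights.get(topic, 0.0) + lr * engagement_score(i)

          weights: dict = {}
          update_topic_weights(weights, "culture_war", Interaction(45.0, False, False))
          update_topic_weights(weights, "cooking", Interaction(3.0, False, True))
          print(weights)  # "culture_war" ends up far ahead of "cooking"
          ```

          The point is that nothing in a loop like that cares what the content says, only how long it keeps you watching.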

          • Servais (il/le)@discuss.tchncs.de (+1) · 3 months ago

            you still customise your feed to a large extent by following stuff.

            I’m not that familiar with the latest versions of the three I listed, but I’ve seen quite a few complaints recently that Reddit seems to be pushing irrelevant content just for rage bait.

            Also the things you mentioned about tracking are there on Twitter too:

            Twitter’s privacy policy states they use the information collected to: “improve and personalise our products and services so that you have a better experience on Twitter, including by showing you more relevant content and ads, suggesting people and topics to follow, enabling and helping you discover affiliates, third-party apps, and services.”

            It’s important to note that opting out of Twitter’s interest-based ads doesn’t necessarily mean you won’t see targeted advertising. For example, you may still see ads which are based on information such as what you Tweet, who you follow and the links you click on Twitter.

            Often while browsing, if Twitter recognises that you’re interacting with something repeatedly, they will offer for you to follow that particular topic – which can be anything from celebrities to sports teams.

            https://www.scotsman.com/lifestyle/tech/what-does-twitter-know-about-you-and-your-data-how-to-see-everything-twitter-thinks-about-you-4125827

  • 9point6@lemmy.world (+25/-3) · 3 months ago

    Alt-right fuckos have unfortunately been targeting impressionable kids for decades now.

    TikTok is just where the kids are today; a few years ago it was the YouTube alt-right pipeline, and before that it was basically just weirdos on 4chan.

    That’s not to say it’s not alarming or something to drag your friends away from, but it’s not new and it’s not going away.

    No one rational can possibly support their ideology, so they need to find easily manipulated people to dupe in order to remain relevant.

  • ContrarianTrail@lemm.ee (+8) · 3 months ago

    My hot take on this is that the algorithm is working as intended. I don’t think there’s a secret agenda behind it, and even if there were, it would likely skew left due to the nature of tech companies. The algorithm is simply trying to maximize engagement, and certain kinds of content do that better than others. People might feel like they’re being served content they don’t like, but I think they’re actually drawn to that content despite not liking it. It’s like driving past a car wreck; you know it’s not good, and you might see something you wish you hadn’t, but you still look.

    I actively block content from my feed using word filters. I’ve decided I no longer want to see posts about certain things, so I block them. However, every now and then, a post on one of these topics manages to evade my content filters. And what do I do? I click on it and make a comment, getting myself into hot water again. I know I shouldn’t, and I’ve even taken active steps to stop myself, but I still do.
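
    For what it’s worth, the word filters I mean are nothing fancy, conceptually just a keyword blocklist along these lines (the terms here are placeholders, not my actual list):

    ```python
    import re

    # Placeholder terms; in practice this is whatever topics you've muted.
    BLOCKED_TERMS = ["topic a", "topic b"]

    # Word-boundary match so short terms don't hide unrelated posts.
    _pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, BLOCKED_TERMS)) + r")\b",
        re.IGNORECASE,
    )

    def is_blocked(title: str) -> bool:
        """True if the post title mentions any muted term."""
        return _pattern.search(title) is not None

    feed = ["Topic A strikes again", "Unrelated wholesome post"]
    print([p for p in feed if not is_blocked(p)])  # only the second survives
    ```

    Anything that dodges the exact wording slips straight through, which is exactly how those posts still end up in front of me.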

    • AwkwardLookMonkeyPuppet@lemmy.world (+1/-1) · 3 months ago

      it would likely skew left due to the nature of tech companies

      What do tech companies have to do with it? TikTok is controlled by the Chinese government, which is about as far right as you can get.

  • barsquid@lemmy.world (+9/-1) · 3 months ago

    I agree and I wish I knew how to stop it or counter it for larger numbers of people.

  • GBU_28@lemm.ee (+7/-2) · 3 months ago

    Homie is clicking alt-right stuff.

    I should be targeted, based on demographics. I don’t see any of this shit. Don’t click the bait. We learned this in the 90s.

    CLICK HERE

    DOWNLOAD NOW

  • Boozilla@lemmy.world (+4/-1) · 3 months ago

    I’m guessing the content-serving algorithms are looking at your IP, browser profile, etc. and throwing stuff at you that they assume you and “people in your area” want to see. Even with brand-new accounts, they always try to figure this stuff out. They often get it wrong, but that never stops them from trying.

  • lurch (he/him)@sh.itjust.works (+3/-1) · 3 months ago

    It’s a problem that normal, responsible adults immediately recognise these apps as problematic, or at least a waste of time, and stay away. That creates a loop that makes them more problematic and shitty with every iteration.

  • RobotZap10000@feddit.nl (+1) · 3 months ago (edited)

    Despite having maybe a bit too much freedom on the internet when I was younger, I never fell down that hole. Even though many openings presented themselves, I always avoided them for reasons I couldn’t describe at the time. I doubt that many others had the same experience.

    But then again, Reddit was really the only social media I properly interacted with. I did also get a nice, fat dose of YouTube, but I (thankfully) never watched the kinds of videos you speak of; I found them appalling.

    I’ve only ever really heard about what you describe secondhand, as I don’t talk much with people who hate my guts. But by now I’ve heard about it so much that I don’t think it’s an exaggeration anymore. I can only hope that those affected by it eventually free themselves, even if much later in life. Mindless hatred hurts everybody, including the hater.

    • FundMECFSResearch@lemmy.blahaj.zone (OP) (+9/-1) · 3 months ago (edited)

      I’m worried that the hole is getting bigger and bigger, and it’s becoming almost normal for young kids to fall into it.

      My older brother works at a middle school. Every year, they do mock elections to teach kids about voting. He told me that over the past decade it’s gone from 75% of kids choosing the Green Party (I’m in Switzerland, btw) to 50% choosing the far-right party, with the rest split between all the others.

  • MiDaBa@lemmy.ml (+1) · 3 months ago

    I didn’t read this post because it is way too long. Please make a short ten-to-twenty-second video summarising your points, but make sure it’s also funny or I’ll get bored and swipe.

  • P4ulin_Kbana@lemmy.eco.br (+1) · 3 months ago

    Off-topic, but this reminds me of a Shorts channel I randomly subscribed to, only to then see him post a video criticising Hollywood’s casting.

  • AwkwardLookMonkeyPuppet@lemmy.world (+1) · 3 months ago

    I think the entire concept is damaging, regardless of who is on there. It’s all one giant marketing engine. People aren’t making those videos because they love it; they’re making them because they’re hoping to become financially successful doing it, which means selling out and shilling products, and that happens all the time in those videos. It’s a less obvious form of advertising, and it has a strong sway over people.

  • HubertManne@moist.catsweat.com (+1/-1) · 3 months ago

    I don’t care about any of that shit. People will find the things they want to find. I just want to have places free of bullshit, or really just configurable enough that I can block the bullshit.