A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits. But this user is particularly notable because they plainly and openly demonstrate one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • guyrocket@kbin.social · 7 months ago

    This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me. The result is the same: fake porn/nudes.

    And all the hand-wringing in the world about it being nonconsensual will not stop it. The cat has been out of the bag for a long time.

    I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

    • EatATaco@lemm.ee · 7 months ago

      I suck at Photoshop, and I’ve tried many times over the years to get good at it. Yet I was able to train a local Stable Diffusion model on my and my family’s faces and create numerous images of us in all kinds of situations in 2 nights of work. You can get a snap of someone and have nudes of them tomorrow for super cheap.

      I agree there is nothing to be done, but it’s painfully obvious to me that the scale and ease of it make it much more concerning.

    • echo64@lemmy.world · 7 months ago

      I hate this “just accept it, women of the world, accept the abuse because it’s the new normal” techbro logic so much. It’s absolutely hateful towards women.

      We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

      • SharkAttak@kbin.social · 7 months ago

        It’s not normal, but neither is it new: you could already cut and glue your cousin’s photo onto a Playboy girl, or Photoshop the hot neighbour onto Stallone’s muscled body. Today it’s just easier.

        • echo64@lemmy.world · 7 months ago

          Sorry if I didn’t position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.

          • Thorny_Insight@lemm.ee · 7 months ago

            Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.

            • echo64@lemmy.world · 7 months ago

              Yes yes, #alllivesmatter, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

      • brbposting@sh.itjust.works · 7 months ago

        It’s unacceptable.

        “We have legal and justice systems to deal with this.”

        For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

        Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

        Telegram got right on it (not). Fuckers.

    • kent_eh@lemmy.ca · 7 months ago

      “People have been Photoshopping this kind of thing since before there was Photoshop. Why ‘AI’ being involved matters is beyond me.”

      Because now it’s faster, can be done in bulk, and requires no skill from the person doing it.

      • Vespair@lemm.ee · 7 months ago

        “no skill from the person doing it.”

        This feels entirely like a non sequitur, to the point of damaging any point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm between our artistic skill.

      • ArmokGoB@lemmy.dbzer0.com · 7 months ago

        I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.

      • Bob Robertson IX @discuss.tchncs.de · 7 months ago

        A kid at my high school in the early ’90s would use a photocopier to literally cut and paste yearbook headshots onto porn photos. That could also be done in bulk, and it didn’t require any skills a first-grader doesn’t have.