A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly demonstrating one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • kent_eh@lemmy.ca

    People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

    Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

    • Bob Robertson IX @discuss.tchncs.de

      A kid at my high school in the early 90s would use a photocopier and would literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk and doesn’t require any skills that a 1st grader doesn’t have.

    • Vespair@lemm.ee

      > no skill from the person doing it.

      This feels entirely non-sequitur, to the point of damaging any point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm that exists between our artistic skill.

    • ArmokGoB@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      0
      ·
      6 months ago

      I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.