• Kusimulkku@lemm.ee
    9 days ago

    Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

    I get how fucking creepy and downright sickening this all feels, but I’m genuinely surprised that it’s illegal or criminal if there are no actual children involved.

    It mentions sexual extortion, and that’s definitely something that should be illegal, same for spreading AI-generated explicit stuff about real people without their consent, whether it involves children or adults, but idk about the case mentioned here.

    • Korhaka@sopuli.xyz
      9 days ago

      It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don’t count.

      • Kusimulkku@lemm.ee
        9 days ago

        It sounds like a very iffy thing to police. Since drawn stuff doesn’t have actual age, how do you determine it? Looks? Wouldn’t be great.

        • JuxtaposedJaguar@lemmy.ml
          9 days ago

          Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.

  • Xanza@lemm.ee
    9 days ago

    I totally agree with these guys being arrested. I want to get that out of the way first.

    But what crime did they commit? They didn’t abuse children… the images are AI-generated and the children in them don’t exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it’s flat, but where’s the line here? If they draw pictures of non-existent children, is that also a crime?

    Does that open artists to the interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like what if they did a painting of their own child in the bath or something? Sure, the content’s questionable, but it’s not exactly predatory. And if you add safeguards for these people, couldn’t predators then just claim artistic expression?

    It just seems entirely unenforceable and an entire goddamn can of worms…

    • sunbeam60@lemmy.one
      8 days ago

      It obviously depends on where they live and/or committed the crimes. But most countries have broad laws against anything, real or fake, that depicts child sexual abuse.

      It’s partly because, as technology gets better, it would be easy for offenders to claim anything they’ve been caught with is AI-created.

      It’s also because there’s a belief that AI-generated CSAM encourages real child abuse.

      I shan’t say whether it does - I tend to believe so but haven’t seen data to prove me right or wrong.

      Also, in the end, I think it’s simply an ethical position.

    • Allero@lemmy.today
      8 days ago

      I actually do not agree with them being arrested.

      While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.

      AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.

      By taking action against AI-generated materials, they make such materials just as illegal as the real thing, leaving an interested party one less reason to stay away from sites where actual children are being abused, perpetuating the cycle and creating more real-world victims.

      • Dr. Moose@lemmy.world
        8 days ago

        Nah, the argument that this could grow “pedophile culture” and even encourage real activities is really not that far-fetched and could even be true. Without very convincing studies, do you take a chance that real kids could soon suffer? And I mean the studies would have to be really convincing.

        • Allero@lemmy.today
          7 days ago

          The thing is, banning is also a consequential action.

          And based on what we know about similar behaviors, having an outlet is likely to be good.

          Here, the EU takes an approach of “banning just in case” while also ignoring the potential implications of such bans.

  • Allero@lemmy.today
    8 days ago

    I’m afraid Europol is shooting themselves in the foot here.

    What should be done is to develop better ways to mark and identify AI-generated content, not to impose a carpet ban and criminalization.

    Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as ongoing investigations suggest, there’s no shortage of supply or demand on that front. If everything is equally illegal, and some outlet is needed anyway, it’s easier to escalate, and that’s dangerous.

    As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

    • Dr. Moose@lemmy.world
      8 days ago

      This relies on the idea that an “outlet” is not harmful. It might even be encouraging, but who would ever study this to help us find out? Can you imagine the scientists who’d have to lead studies like this? An incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.

    • Fungah@lemmy.world
      8 days ago

      I haven’t read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I’ve been told having access to drawings and images and whatnot makes people more likely to act on their impulses.

      And like. I don’t think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.

      Really couldn’t give two squirts of piss about anything that makes a pedophile’s life harder. Human garbage.

      • Allero@lemmy.today
        8 days ago

        As an advocate for the online and offline safety of children, I did read into the research. None of the research I’ve found confirms, with any sort of evidence, that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being, we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people will tend to opt for what is legal and more readily accessible - and we can make AI CSAM into exactly that.

        For now, people are criminalized for a zero-evidence-it’s-even-bad crime, while I tend to look quite positively on what it could bring to the table instead.

        Also, pedophiles are not human trash, and this line of thinking is itself harmful: it makes more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.

        They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct way is celibacy and refusal of any source of direct harm to children, including the consumption of real CSAM. This might be hard for many, and to aid them, we can provide fictional materials so they could let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.

    • raptir@lemmy.zip
      7 days ago

      What would stop someone from creating a tool that tagged real images as AI generated?

      Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

      • Allero@lemmy.today
        7 days ago

        Some form of digital signatures for allowed services?

        Sure, it will limit the choice of where to legally generate content, but it should work.
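
        For illustration, here’s a minimal sketch of what such a signature scheme could look like - a hypothetical Ed25519 key pair per approved service, with verifiers checking against a public allow-list (the names and the allow-list are my assumptions, not anything from the article):

        ```python
        # Sketch: every approved generator signs its outputs; anyone can verify
        # that an image (byte-for-byte) came from a registered service.
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        service_key = Ed25519PrivateKey.generate()  # kept secret by the approved service
        allow_list = [service_key.public_key()]     # public keys registered somewhere central

        def sign_output(image_bytes: bytes) -> bytes:
            """The service attaches this signature to each generated image's metadata."""
            return service_key.sign(image_bytes)

        def is_registered_ai_output(image_bytes: bytes, signature: bytes) -> bool:
            """True only if some allow-listed service produced exactly these bytes."""
            for key in allow_list:
                try:
                    key.verify(signature, image_bytes)
                    return True
                except InvalidSignature:
                    continue
            return False
        ```

        The useful property is that a valid signature can’t be forged onto a real photo without a service’s private key, which addresses the “tag real images as AI-generated” worry - though it all hinges on the keys staying secret, and any re-encoding or cropping of the image invalidates the signature.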

        • raptir@lemmy.zip
          7 days ago

          I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.

            • raptir@lemmy.zip
              7 days ago

              …and then we’re back at “someone can take that model and tag real images to appear AI-generated.”

              You would need a closed-source model run server-side in order to prevent that.

    • turnip@sh.itjust.works
      8 days ago

      You can download the models and run them yourself, so banning them will be about as effective as the US government was at banning encryption.

  • JuxtaposedJaguar@lemmy.ml
    9 days ago

    Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because allowing it would make limiting actual CP impossible.

    As long as it’s clearly fictional, though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting, yet they shouldn’t be illegal.

      • ifItWasUpToMe@lemmy.ca
        9 days ago

        I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.

        • DoPeopleLookHere@sh.itjust.works
          9 days ago

          That’s not how these image generators work.

          How would it know what an age-appropriate penis looks like without, you know, seeing one?

          • lime!@feddit.nu
            9 days ago

            no, it sort of is. considering style transfer models, you could probably just draw or 3d model unknown details and feed it that.

            • DoPeopleLookHere@sh.itjust.works
              9 days ago

              Again, that’s not how image generators work.

              You can’t just make up some wishful thinking and assume that’s how it must work.

              It takes thousands upon thousands of unique photos to make an image generator.

              Are you going to draw enough child genitalia to train these generators? Are you actually comfortable doing that task?

              • lime!@feddit.nu
                8 days ago

                i’m not, no. but i’m also well-enough versed in stable diffusion and loras that i know that even a model with no training on a particular topic can be made to produce it with enough tweaking, and if the results are bad you can plug in an extra model trained on at minimum 10-50 images to significantly improve them.

                • DoPeopleLookHere@sh.itjust.works
                  8 days ago

                  Okay, but my point still stands.

                  Someone has to make the genitals for the models to learn from. Some human has to be involved, otherwise it just wouldn’t exist.

                  And if you’re not willing to get your hands dirty and do it, why would anyone else?

          • Allero@lemmy.today
            8 days ago

            That’s exactly how they work. According to many articles I’ve seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed any CSAM material, but it seems to be good enough for people to get off - which is exactly what matters.

            • DoPeopleLookHere@sh.itjust.works
              8 days ago

              How can it be trained to produce something without human input?

              To verify its models are indeed correct, some human has to sit and view it.

              Will that be you?

              • Allero@lemmy.today
                8 days ago

                As with most modern AI, it’s able to train without much human intervention.

                My point is, even if the results are not perfectly accurate and don’t exactly resemble a child’s body, they work. They are widely used, in fact - so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that’s what matters.

                I do not care about how accurate it is, because it’s not me who consumes this content. I care about how efficient it is at curbing worse desires in pedophiles, because I care about safety of children.

              • TheRealKuni@midwest.social
                8 days ago

                How can it be trained to produce something without human input?

                It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.

                No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.

                • DoPeopleLookHere@sh.itjust.works
                  8 days ago

                  It was able to produce that because enough images of both feet and Donald Trump exist.

                  How would it know what young genitals look like?