• macniel@feddit.de

    Mhm I have mixed feelings about this. I know that this entire thing is fucked up but isn’t it better to have generated stuff than having actual stuff that involved actual children?

    • BrianTheeBiscuiteer@lemmy.world

      I have trouble with this because it’s like 90% grey area. Is it a pic of a real child but inpainted to be nude? Was it a real pic but the face was altered as well? Was it completely generated but from a model trained on CSAM? Is the perceived age of the subject near to adulthood? What if the styling makes it only near realistic (like very high quality CG)?

      I agree with what the FBI did here mainly because there could be real pictures among the fake ones. However, I feel like the first successful prosecution of this kind of stuff will be a purely moral judgement of whether or not the material “feels” wrong, and that’s no way to handle criminal misdeeds.

      • Zorque@kbin.social

        Everything is 99% grey area. If someone tells you something is completely black and white you should be suspicious of their motives.

    • PhlubbaDubba@lemm.ee

      I think the point is that child attraction itself is a mental illness and people indulging it even without actual child contact need to be put into serious psychiatric evaluation and treatment.

    • Catoblepas@lemmy.blahaj.zone

      Did we memory hole the whole ‘known CSAM in training data’ thing that happened a while back? When you’re vacuuming up the internet you’re going to wind up with the nasty stuff, too. Even if it’s not a pixel by pixel match of the photo it was trained on, there’s a non-zero chance that what it’s generating is based off actual CSAM. Which is really just laundering CSAM.

    • retrospectology@lemmy.world

      The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out it muddies the water for CP of real children, but it also potentially would allow pedophiles easier ways to network in the open (if the images are legal they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.

      As a society we should never allow the normalization of sexualizing children.

      • lily33@lemm.ee

        Actually, that’s not quite as clear.

        The conventional wisdom used to be that (normal) porn makes people more likely to commit sexual abuse (in general). Then scientists decided to look into that. Slowly, over time, they’ve become more and more convinced that (regular) porn availability in fact reduces sexual assault.

        I don’t see an obvious reason why it should be different in case of CP, now that it can be generated.

        • Lowlee Kun@feddit.de

          It should be different because people cannot have it. It is disgusting, makes them feel icky, and that’s just why it has to be bad. Conventional wisdom sometimes really is just conventional idiocy.

      • forensic_potato@lemmy.world

        This mentality smells of “just say no” for drugs or “just don’t have sex” for abortions. This is not an ideal world, and we have to find actual plans/solutions to deal with the situation. We can’t just cover our ears and hope people will stop.

    • Murvel@lemm.ee

      It feeds and evolves a disorder which in turn increases risks of real life abuse.

      But if AI generated content is to be considered illegal, so should all fictional content.

      • SigHunter@lemmy.kde.social

        Or, more likely, it feeds and satisfies a disorder which in turn decreases risk of real life abuse.

        Making it illegal has so far helped nothing, just like with drugs.

        • Murvel@lemm.ee

          That’s not how these addictive disorders work… they’re never satisfied and always need more.

      • Norgur@kbin.social

        Two things:

        1. Do we know if it fuels the urge to seek out real children? Or do we just assume that through repetition, like the myth of “gateway drugs”?
        2. Since no child was involved and harmed in the making of these images… On what grounds could it be forbidden to generate them?
        • Thorny_Insight@lemm.ee

          An alternative perspective is to ask: does watching normal porn make heterosexual men more likely to rape women? If not, then why would it be different in this case?

          The vast majority of pedophiles never offend. Most people in jail for child abuse are just plain old rapists with no special interest towards minors; they’re just an easy target. Pedophilia just describes what they’re attracted to. It’s not a synonym for child rapist. It usually needs to coincide with psychopathy to create the monster that most people think about when hearing that word.

        • HopeOfTheGunblade@kbin.social

          I would love to see research data pointing either way re #1, although it would be incredibly difficult to do so ethically, verging on impossible. For #2, people have extracted originals or near-originals of inputs to the algorithms. AI-generated stuff (plagiarism-machine-generated stuff) runs the risk of effectively revictimizing people who were already abused to produce those inputs.

          It’s an ugly situation all around, and unfortunately I don’t know that much can be done about it beyond not demonizing people who have such drives, who have not offended, so that seeking therapy for the condition doesn’t screw them over. Ensuring that people are damned if they do and damned if they don’t seems to pretty reliably produce worse outcomes.

      • Zorque@kbin.social

        Is everything completely black and white for you?

        The system isn’t perfect, especially where we prioritize punishing people over rehabilitation. Would you rather punish everyone equally, emphasizing that if people are going to risk the legal implications (which, based on legal systems the world over, people are going to do) they might as well just go for the real thing anyways?

        You don’t have to accept it as morally acceptable, but you don’t have to treat them as completely equivalent either.

        There are gradations of questionable activity, especially when there are no real victims involved. Treating everything exactly the same is, frankly speaking, insane. It’s like having one punishment for all illegal behavior. Murder someone? Death penalty. Rob them? Straight to the electric chair. Jaywalking? Better believe you’re getting the needle.

        • Mastengwe@lemm.ee

          Ironically, you ask if everything is completely black and white for someone while not accepting that there’s nuance to the very issue you’re calling out. And “everything” is a very black and white term; not very nuanced, is it?

          No, not EVERYTHING, but some things. And this is one of those things. Both forms should be illegal. Period. No nuance, no argument, NO grey area.

          This does not mean that nuance doesn’t exist. It just means that some believe that it SHOULDN’T exist within the paradigm of child porn.

    • ImminentOrbit@lemmy.world

      It reminds me of the story of the young man who realized he had an attraction to underage children and didn’t want to act on it, yet there were no agencies or organizations to help him; it was only after a crime was committed that anyone could get help.

      I see this fake CP as only a positive for those people. That it might make it difficult to find real offenders is a terrible argument against it.