He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
It feeds and escalates a disorder, which in turn increases the risk of real-life abuse.
But if AI generated content is to be considered illegal, so should all fictional content.
Or, more likely, it feeds and satisfies a disorder, which in turn decreases the risk of real-life abuse.
Making it illegal has so far helped nothing, just like with drugs.
That’s not how these addictive disorders work… they’re never satisfied and always need more.
Two things:
1. An alternative perspective: does watching ordinary porn make heterosexual men more likely to rape women? If not, why would it be any different in this case?
2. The vast majority of pedophiles never offend. Most people in jail for child abuse are just plain old rapists with no special interest in minors; children are simply an easy target. Pedophilia only describes what someone is attracted to. It’s not a synonym for child rapist. It usually needs to coincide with psychopathy to create the monster that most people picture when hearing that word.
I would love to see research data pointing either way on #1, although it would be incredibly difficult to gather ethically, verging on impossible. For #2, people have extracted originals or near-originals of training inputs from these algorithms. AI-generated material therefore runs the risk of effectively revictimizing people who were already abused to produce those inputs.
It’s an ugly situation all around, and unfortunately I don’t know that much can be done about it beyond not demonizing people who have such drives but have not offended, so that seeking therapy for the condition doesn’t screw them over. Ensuring that people are damned no matter what they do seems to pretty reliably produce worse outcomes.