A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because they plainly and openly show one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.
I say stop antagonizing the AI.
The only difference between a skilled artist making it with Photoshop and someone using a Neural Net is the amount of time and effort put into creating the instance. If there are to be any laws against these, they need to apply to any and every fake that’s convincing enough, no matter the technology used.
Don’t blame the gunpowder for the dead person.
People were being killed by thrown stones before that. The laws that oppress us on a daily basis suck ass, I’ll give y’all that for fucking sure… but downvoting someone wishing for the law to be applied equally to all?
Maybe I should go back to 4chan.
We are acting as if throughout history we managed to orient technology so as to keep only the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. And it is not gonna be different with AI.
I doubt tbh that this is the most severe harm of generative AI tools lol
Israel’s facial recognition program, for example.
It’s stuff like this that makes me against copyright laws. To me it is clear and obvious that you own your own image, and it is far less obvious to me that a company can own an image its creator drew decades ago that everyone can identify. And yet one is protected and the other isn’t.
What the hell do you own if not yourself? How come a corporation has more rights than we do?
This stuff should be defamation, full stop. There would need to be a law specifically saying you can’t sign it away, though.
That’s a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img pass. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.
Wait, this is a tool built into Stable Diffusion?
As for people doing it themselves, it might be a bit too technical for some people to set up. But I’ve never tried Stable Diffusion.
It’s not like deepfake pornography is “built in”, but Stable Diffusion can take existing images and generate stuff based on them. That’s kinda how it works, really. The de facto standard UI makes it pretty simple, even for someone who’s not too tech savvy: https://github.com/AUTOMATIC1111/stable-diffusion-webui
Thanks for the link. I’ve been running some LLMs locally, and I’ve been interested in Stable Diffusion. I’m not sure I have the specs for it at the moment, though.
The people being exploited are the ones who are the victims of this, not people who paid for it.
There are many victims, including the perpetrators.
And mechanics exploit people needing brake jobs. What’s your point?
It’s gonna suck no matter what now that the technology is available. Perhaps in a bunch of generations there will be a massive cultural shift to something less toxic.
May as well drink the poison if I’m gonna be immersed in it. Cheers.
I was really hoping that with the onset of AI people would be more skeptical of content they see online.
This was one of the reasons. I don’t think there’s anything we can do to prevent people from acting like this, but what we can do as a society is adjust to it so that it’s not as harmful. I’m still hoping that it eventually becoming easily accessible and usable will help people look at all content much more closely.
This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me. The result is the same: fake porn/nudes.
And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.
I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.
People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me
Because now it’s faster, can be generated in bulk, and requires no skill from the person doing it.
no skill from the person doing it.
This feels entirely like a non sequitur, to the point of damaging any point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints one, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm between our artistic skill.
A kid at my high school in the early ’90s would use a photocopier to literally cut and paste yearbook headshots onto porn photos. That could also be done in bulk, and it doesn’t require any skills a first-grader doesn’t have.
I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.
But I saw it on tee-vee!
The irony of parroting this mindless and empty talking point is probably lost on you.
God, do I really have to start putting the /jk or /s back, for those who don’t get it like you??
Upgraded to “definitely.”
Okay, okay, you won. Happy now? Now go.
I hate this: “Just accept it women of the world, accept the abuse because it’s the new normal” techbro logic so much. It’s absolutely hateful towards women.
We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.
It’s unacceptable.
We have legal and justice systems to deal with this.
For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):
Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.
Telegram got right on it (not). Fuckers.
Because gay porn is a myth I guess…
Sorry if I didn’t position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.
Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.
Yes yes, #alllivesmatter, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.
It’s not normal, but neither is it new: you could already cut and glue your cousin’s photo onto a Playboy girl, or Photoshop the hot neighbour onto Stallone’s muscular body. Today it’s just easier.
I don’t care if it’s not new; no one cares about how new it is.
I suck at Photoshop, and I’ve tried many times over the years to get good at it. Yet I was able to train a local Stable Diffusion model on my and my family’s faces and create numerous images of us in all kinds of situations in two nights of work. You can get a snap of someone and have nudes of them by tomorrow for super cheap.
I agree there is nothing to be done, but it’s painfully obvious to me that it’s the scale and ease of it that makes it much more concerning.
I’d like to share my initial opinion here. “Non-consensual AI-generated nudes” are technically a freedom, no? Like, we can bastardize our presidents, paste people’s photos onto devils or other characters, so why is AI nudes where the line is drawn? The internet made photos of trump and putin kissing shirtless.
The internet made photos of trump and putin kissing shirtless.
And is that OK? I mean I get it, free speech, but just because congress can’t stop you from expressing something doesn’t mean you actually should do it. It’s basically bullying.
Imagine you meet someone you really like at a party, they like you too and look you up on a social network… and find galleries of hardcore porn with you as the star. Only you’re not a porn star, those galleries were created by someone who specifically wanted to hurt you.
AI porn without consent is clearly illegal in almost every country in the world, and in the ones where it’s not illegal yet, it will be soon. The First Amendment will be a stumbling block, but it’s not an impenetrable wall: Congress can pass laws that limit speech in certain edge cases, and this will be one of them.
The internet made photos of trump and putin kissing shirtless.
And is that OK?
I’m going to jump in on this one and say yes - it’s mostly fine.
I look at these things through the lens of the harm they do and the benefits they deliver - consequentialism and act utilitarianism.
The benefits are artistic, comedic and political.
The “harm” is that Putin and or Trump might feel bad, maaaaaaybe enough that they’d kill themselves. All that gets put back up under benefits as far as I’m concerned - they’re both extremely powerful monsters that have done and will continue to do incredible harm.
The real harm is that such works risk normalising this treatment of regular folk, which is genuinely harmful. I think that’s unlikely, but it’s impossible to rule out.
Similarly, the dissemination of the kinds of AI fakes under discussion is a negative, because they do serious, measurable harm.
I think that is okay because there was no intent to create pornography there. It is a political statement. As far as I am concerned that falls under free speech. It is completely different from creating nudes of random people/celebrities with the sole purpose of wanking off to it.
Is that different than wanking to clothed photos of the same people?
They’re making pornography of women who are not consenting to it, when that is an extremely invasive thing to do with massive social consequences for women and girls. This could (and almost certainly will) be used on kids too, right? This can literally be a tool for the production of child pornography.
Even with regards to adults, do you think this will be used exclusively on public figures? Do you think people aren’t taking pictures of their classmates, of their co-workers, of women and girls they personally know and having this done to pictures of them? It’s fucking disgusting, and horrifying. Have you ever heard of the correlation between revenge porn and suicide? People literally end their lives when pornographic material of them is made and spread without their knowledge and consent. It’s terrifyingly invasive and exploitative. It absolutely can and must be illegal to do this.
I think the biggest thing with that is that Trump and Putin live public lives. They live lives scrutinized by the media and the public; they bought into those lives, they chose them. Because of that, there are certain things we do to them that wouldn’t necessarily be legal if we did them to a normal, private citizen, but because their lives are already public we turn a bit of a blind eye. And yes, this applies to celebrities, too.
I don’t necessarily think the above is a good thing; I think everyone should be entitled to some privacy. Having the same thing done to a normal person living a private life is a MUCH clearer violation of privacy.
Public figures vs. private figures. Fair or not, a public figure is usually open season. Go ahead and make a comic where Ben Stein rides a horse home to his love nest with Ben Stiller.