Thanks ahead of time for your feedback
I think it’s “barrier to entry.”
Photoshop took skills that not everyone has/had, keeping the volume low.
These new generators require zero skill or technical ability, so anyone can do it.
So anyone can do it, and the victim can be your next-door neighbor, not some celebrity, where you can internally normalize it with “well, that’s the price of fame.”
Unfortunately, this list is only going to grow: https://en.wikipedia.org/wiki/List_of_suicides_attributed_to_bullying
When Photoshop first appeared, image manipulations that would seem obvious and amateurish by today’s standards were considered very convincing—the level of skill needed to fool large numbers of people didn’t increase until people became more familiar with the technology and more vigilant at spotting it. I suspect the same process will play out with AI images—in a few years people will be much more experienced at detecting them, and making a convincing fake will take as much effort as it now does in Photoshop.
Nope, the AI will continue to get better, and soon spotting the fakes will be nearly impossible.
I have been a professional editor for decades, and I can tell you that probably 30 to 40% of fakes still get past me, and I am much better at spotting these things than any of you are, lol.
IMHO, not dissimilar to the model planes → drones transition.
To operate a model plane, there was a not-small amount of effort you needed to work through (building, specialist components, local club, access to a proper field, etc.).
This meant that by the time you were flying, you probably had a pretty good understanding of being responsible with the new skill. In the era of self-stabilising, GPS-guided UAVs delivered next-day ready-to-fly, the barrier to entry plummeted.
And it took a little while for the legislation to catch up from “the clubs are usually sensible” to “don’t fly a 2 kg drone over a crowd of people at head height with no experience or training.”

It would also take a lot more effort to get something even remotely believable. You would need to go through thousands of body and face photos to get a decent match and then put in some effort pairing the two photos together. A decent “nude” photo of a celebrity would probably take at least a day to make the first one.
Ehhhh, I like to think that eventually society will adapt to this. When everyone has nudes, nobody has nudes.
Unfortunately, I doubt it will be everyone. It will primarily be young women, because we hyper-sexualize those…
You might think so, but I don’t hold as much hope.
Not with the rise of holier-than-thou moral crusaders who try to slut-shame anyone who shows any amount of skin.
I like to be optimistic, eventually such crusaders will have such tools turned against them and that will be that. Even they will begin doubting whether any nudes are real.
Still, I’m not so naive that I think it can’t turn out any other way. They might just do that thing they do with abortions, the line of reasoning that goes “the only acceptable abortion is my abortion,” now changed to “the only fake nudes are my nudes.”
Have you tried to get consistent, goal-oriented results from these AI tools?
To reliably generate a person you need to configure many components, fiddle with the prompts and constantly tweak.
To do this well in my eyes is a fair bit harder than learning how to use the magic wand in Photoshop.
I mean, inpainting isn’t particularly hard to make use of. There are also tools specifically for the purpose of generating “deepfake” nudes. The barrier for entry is much, much lower.
It is honestly kind of horrifying how much shrugging I’m seeing on this post over this issue
You could also just find the prompts online and paste them in.
Honestly? It was kind of shitty back then and is just as shitty nowadays.
I mean, I get why people do it. But in my honest opinion, it’s still a blatant violation of that person’s dignity, at least if it’s distributed.
It’s not that now it’s bad… it’s that now it’s actually being addressed. Whereas before it was just something people would sweep under the rug as being distasteful, but not worthy of attention.
It was easier to ignore when teenagers couldn’t produce convincing images of their classmates with about 5 minutes of research and a mediocre piece of software, then plaster them all over their friend groups or, god forbid, online forums/social media.
Because previously, if someone had the skills to get rich making convincing fake nudes, we could arrest and punish them, and people with similar skill sets would usually prefer more legitimate work.
Now some ass in his basement can crank them out and it’s a futile game of whack-a-mole to kill them dead.
it’s a futile game of whack-a-mole
It’s still going to be futile even with this law in place. Society is going to have to get used to the fact that photo-realistic images aren’t evidence of anything (especially since the technology will keep improving).
It blows my mind when I think about where we might be headed with this tech. We’ve gotten SO used to the ability to communicate instantly with people far away in the technology age, how will we adapt when we have to go back 300 years and can only trust something someone tells us in person? Will we go back to local newspapers? Or can we not even trust that? Will we have public amphitheaters in busy parts of town, where people will gather around to hear the news? And we can only trust these people, who have a direct chain of acquaintance all the way back to the source of the information? That seems extreme, but I dunno.
I think most likely we won’t implement extreme measures like that, to ensure we’re still getting genuine information. I think most likely we’ll just slip into completely generated false news from every source, no longer have any idea what’s really going on, but be convinced this AI thing was overblown, and have no idea we’re being controlled.
I don’t think it will be quite that bad. Society worked before photography was invented and now we have cryptographic ways to make sure you’re really talking to the person you think you’re talking to.
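That cryptographic point can be made concrete. Here’s a minimal sketch using Python’s standard-library `hmac` module, under the simplifying assumption that the two people exchanged a secret key in person beforehand (real systems use public-key signatures so no pre-shared secret is needed, but the principle is the same):

```python
import hmac
import hashlib

# Assumption for this sketch: Alice and Bob exchanged this secret key
# in person, so only they can produce a valid tag for a message.
key = b"shared-secret-exchanged-in-person"

def sign(message: bytes) -> str:
    """Produce an authentication tag only a key-holder can compute."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"meet me at noon")
print(verify(b"meet me at noon", tag))  # genuine message: True
print(verify(b"meet me at ten", tag))   # tampered message: False
```

The point is that the math, not the medium, vouches for who sent the message, so trustworthy communication doesn’t have to collapse back to face-to-face.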
Truuue, I hadn’t thought of that. Okay, at least it won’t be as bad as I feared.
Now I just have to sell all these vintage printing presses I bought…
Because now, anyone can do it to anyone with zero effort and a single photo.
Sure, before, anyone with decent Photoshop skills could put together a halfway convincing fake nude, but it was still significantly more effort and time than most would bother with, and even then it was fairly easy to spot and dispute a fake.
Most people weren’t concerned when a celebrity’s fake nudes were spread around before, but now that a colleague, student, teacher, family member, or even a random member of the public could generate a convincing photo, the threat has become far more real and far more conceivable.
To be fair, Photoshop has made tasks like this incredibly simple. With a “good” photo, the process is much less esoteric now than it once was.
- it still takes time/knowledge and isn’t automated
- it can’t be turned into an unending assembly line where one 16-year-old with basic computer literacy can pump out thousands a day if they want
I have a similar opinion. People have been forging/editing photographs and movies for as long as the technique has existed.
Now any stupid kid can do it; with AI, the hard part is actually not getting porn. Maybe it will teach everyone that fake photos are a thing and make nudes worthless (what’s the point of a nude anyway? Genitals look like… genitals).
Imagine this happening to your kid(s) or SO. Imagine this happening to you as a hormonal, confused 15 year old.
Would you tell them “this doesn’t really matter”?
Kids have killed themselves because they feel ostracized by social media, the act of just not being included or having a “great life” like they think they see everyone else having. You think they’re simply going to adapt to a world where their classmates (and complete strangers) can torture them with convincing nude images?
This is the bullying equivalent of a nuclear weapon IMO
Mostly dating situations are where it matters.
When does it matter in a dating situation?
Sexting
In what way exactly? I’m sorry, I don’t see it
People who are thinking of dating, or who are dating have been known to send nudes to each other.
This serves several purposes: it indicates interest, it’s meant to entice interest, it’s a sign that things are going well, it sets the mood, sets the tone, etc. It is a form of sexual communication which is common, and not unheard of, when people are dating.
Oh, I thought you were talking about where fake nudes matter. I didn’t think we were talking about “legitimate” uses of nudes, because this whole thread is about fakes :D
It’s always been a big deal; it just died down as Photoshop became normalized and people grew accustomed to it as a tool.
Because now teenagers can do it with very little effort whereas before it at least required a lot of time and skill
If AI is so convincing, why would anyone care about nudes being controversial anymore? You can just assume it’s always fake. If everything is fake, why would anyone care?
You’re right. I’m going to go make convincing images of your partner/sibling/parents/kids/etc. and just share them here since no one should care.
In fact, I’ll share them on other sites as well.
That is where we are not seeing this the same way. You wanna make fake images, knock yourself out. I don’t care who they are. Make some of me for all I care.
So if this happened to your significant other or children, and they are clearly upset and want it to stop, you would just go “I don’t care?”
We all sometimes dig in our heels and become absolutist in our arguments online, but dude…do I really need to explain how insane that sounds? Do you really just have a complete lack of imagination and/or empathy?
What happens when someone makes a convincing fake of you having sex with a kid and spreads it around “for the lulz”?
Real helpful and smart comment, btw. You must be fun to be around.
I should’ve clarified that this was meant to be tongue in cheek but also illustrative of the issue. Something being “fake” doesn’t make it any less real to a victim. I understand it was flippant and a bit aggressive, but the intention was to get them to consider the impact it could have on them personally and those they love. It’s a very serious problem and I just struggle to see how people can shrug it off, which admittedly isn’t something I should take out on folks here.
Hopefully the point still comes across
Specifically because it’s convincing. You may just assume everything is fake, that doesn’t mean everyone will. You may not care about seeing someone’s imperceptibly realistic nude, but if it’s depicting them they may care, and they deserve the right for people not to see them like that.
Just because it’s not logistically feasible to prevent convincing AI nudes from spreading around doesn’t make it ethical
- easy for anyone to do it
- easy to do it at scale
Doctored photos have always been a problem and, legally speaking, could lead to the faker being sued for defamation, depending on what was done with the person’s image.
AI photos are only part of the problem. Faking the voice is also possible, as is making “good enough” videos where you just swap in the head of the actual performer.
Another part of the problem is that this kind of stuff spreads like wildfire within groups (and it’s ALWAYS groups where the victim is) and any voices stating that it’s fake will be drowned by everyone else.
How do you prove it’s not you in either case? Photoshop doesn’t make a whole video of you fucking a sheep. But AI can, and is actively being used that way. With Photoshop it was a matter of getting ahold of the file and inspecting it. Even the best Photoshop jobs have some key tells: artifacting, layering, inconsistencies in shading and lighting, how big the file is, etc.
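On the “inspecting the file” point, here’s a toy illustration, not a real forensic tool (the signature list and the raw-bytes scan are simplifying assumptions of mine): many editors leave their own name in an image’s embedded metadata, which even a naive scan can spot.

```python
def editing_software_traces(data: bytes) -> list[bytes]:
    """Crude check: many editors embed their name in a JPEG's
    metadata segments; scan the raw bytes for known signatures."""
    signatures = (b"Adobe Photoshop", b"GIMP", b"Paint.NET")
    return [sig for sig in signatures if sig in data]

# Demo on fabricated file bytes (a real check would parse EXIF/XMP properly).
fake_jpeg = b"\xff\xd8\xff\xe1...Adobe Photoshop 2024..." + b"\xff\xd9"
print(editing_software_traces(fake_jpeg))  # [b'Adobe Photoshop']
```

A determined faker strips this metadata, of course, which is exactly why the pixel-level tells mattered: they were much harder to erase than a text tag.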
I want to add something. What if all of a sudden it’s your 12-year-old daughter being portrayed in this fake? What if it’s your mom? It would have been a big deal to you to have that image out there of your loved one back in the ’90s or early 2000s. It’s the same kind of big deal now, but more widespread because it’s so easy now. It’s not okay to just use the image of someone in ways they didn’t consent to. I have a similar issue with facial recognition, regardless of the fact that it’s used in public places where I have no control over it.
In addition to the reduced skill barrier mentioned, the other side effect is the reduced time spent finding a matching photo and actually doing the work. Anyone can create it in their spare time, quickly and easily.
I sorta feel this way. Before this, people would make cutout mashups, or artistic types might depict something. I do get that it’s getting so real that folks may think the people actually did the thing.
I got a few comments pointing this out. But the media is hell-bent on convincing people to hate AI tools and advancements. Why? I don’t know.
Tin-foil-hat take is that it can be an equalizer. Powerful people who own media like to keep powerful tools to themselves and want the regular folk to fear them and regulate ourselves out of using them.
Like could you imagine if common folk rode dragons in GOT. Absolutely disgusting. People need to fear them and only certain people can use it.
Same idea. If you’re skeptical, go look up all the headlines about AI in the past year and compare them to right wing media’s headlines about immigration. They’re practically identical.
“Think of the women and children.”
“They’re TAKING OUR JOBS”
“Lot of turds showing up on beaches lately”
“What if they kill us”
“THEY’RE STEALING OUR RESOURCES”
You’re looking for a cat’s fifth leg. There is no conspiracy. It’s just new technology, and what’s new is scary, especially big leaps, which this new age of machine learning seems to be part of.
It was a big deal back then as well, dumbass
You deserve the insult more than him