First off, I am sex positive, pro porn, and pro sex work. I don’t believe sex work should be shameful, and there is nothing wrong with buying intimacy from a willing seller.
That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest.
I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand, it might be trained on other people’s work and take people’s jobs.
I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal material. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of these things being legal, but I can’t logically argue for it being illegal without a victim.
without a victim
It was trained on something.
It can generate combinations of things that it is not trained on, so there is not necessarily a victim. But of course there might be something in there, I won’t deny that.
However, the act of generating something does not create a new victim, unless it uses someone’s likeness and it is shared? Or is there an ethical angle here that I am missing?
(Yes, all current AI is basically collective piracy of everyone’s IP, but besides that.)
Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.
So take that video and modify it a bit. Color correct or something. That’s still abuse, right?
So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
I can’t make that call. And because I can’t make that call, I can’t support the concept.
I mean, there’s another side to this.
Assume you have exacting control of the training data. You give it consensual sexual play, including rough play, BDSM play, and CNC (consensual non-consent) play. We are 100% certain the content is consensual in this hypothetical.
Is the output a grey area, even if it seems like real rape?
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?
Or is it none of them?
Consensual training data makes it ok. I think AI companies should be accountable for curating inputs.
Any art is ok as long as the artist consents. Even if they’re drawing horrible things, it’s just a drawing.
Now the real question is, should we include footage of rapes of people who have died and have no family? Because then you can’t even argue increased suffering of the victim.
But maybe this just gets solved by curation and the “don’t be a dick” rule. Because the above sounds kinda dickish.
We already allow simulated rape in TV and movies. AI simply allows a more graphic portrayal.
I see the issue with deciding how much of a crime is enough for it to be okay, and the gray area. I can’t make that call either, but I kinda disagree with the black-and-white conclusion. I don’t need something to be perfectly ethical, few things are. I do however want to act in an ethical manner, and strive to be better.
Where do you draw the line? It sounds like you mean no AI can be used in any case unless all the material has been carefully vetted?
I highly doubt that most big-tech AI models, of any size, are free of illegal content.
I am not sure where I draw the line, but I do want to use AI services, just not for porn.
It just means I don’t use AI to create porn. I figure that’s as good as it gets.
It’s not just AI that can create content like that, though. 3D artists have been making victimless rape slop of your vidya waifu for well over a decade now.
Yeah, I’m ok with that.
AI doesn’t create, it modifies. You might argue that humans are the same, but I think that’d be a dismal view of human creativity. But then we’re getting weirdly philosophical.
With this logic, any output of any pic gen AI is abuse… I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all the training data, as far as I understand the statistical behaviour of pic gen AI.
With this logic, any output of any pic gen AI is abuse
Yes?
We could be sure of it if AI curated its inputs, which really isn’t too much to ask.
Well, AI is by design not able to curate its own training data, but the companies training the models would in theory be able to. It just isn’t feasible to sanitise such a huge stack of data.
There is no ethical consumption while living a capitalist way of life.
😆 as if this has something to do with that
But to your argument: it is perfectly possible to tune capitalism using laws to make it veerry social.
I mean, every “actually existing communist country” is at its core still a capitalist system, or how would you argue against that?
MLs, always there to say irrelevant things
What’s illegal in real porn should be illegal in AI porn, since eventually we won’t be able to tell whether it’s AI.
That’s the same as saying we shouldn’t be able to make videos with murder in them because there is no way to tell if they’re real or not.
That’s a good point, but there’s much less of a market for a murder-video industry.
I mean, a lot of TV has murders in it. There is a huge market for showing realistic murder.
But I get the feeling you’re saying that there isn’t a huge market for showing real people dying realistically without their permission. But that’s more of a technicality. The question is: is the content, or the production of the content, illegal? If it’s not a real person, who is the victim of the crime?
Yeah the latter. Also murder in films for the most part is for storytelling. It’s not murder simulations for serial killers to get off to, you know what I mean?
I’ve found that a lot of things on the Internet went wrong because they were ad-supported and “free”. Porn is one of them.
There is ethically produced porn out there, but you’re going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it’s mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.
Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.
without a victim
You are wrong.
AI media models have to be trained on real media. Illegal content in the output would mean illegal media in the training data, and benefiting from, supporting, and profiting from crimes against victims.
The lengths and fallacies pedophiles will go to justify themselves is absurd.
Excuse me? I am very offended by your insinuations here. It honestly makes me not want to share my thoughts and opinions at all. I am not in any way interested in this kind of content.
I encourage you to read my other posts in the different threads here and see. I am not an apologist, and do not condone it either.
I do genuinely believe AI can generate content it is not trained on; that’s why I claimed it can generate illegal content without a victim. It can combine stuff from things it is trained on and end up with something original.
I am interested in learning and discussing the consequences of an emerging and novel technology on society. This is a part of that. Even if it is uncomfortable to discuss.
You made me wish I didn’t…
Don’t pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works.
You are right that it is a victimless crime (for the creation of the content). I could create porn with minions without using real minion porn, to give the most random example I could think of. There’s the whole defamation issue of publishing content without someone’s permission, but I feel that discussion is independent of AI (we could already create nasty images of someone before AI; AI just makes it easier). But using such content for personal use… it is victimless. I have a hard time arguing against it. Would the availability of AI-created content with unethical themes allow people to get that out of their system without creating victims? Would that make the far riskier and more horrible business of creating illegal content with real, unwilling people disappear, or at the very least become much more uncommon? Or would it make people more willing to consume the content, creating a feeling of false safety around content that was previously illegal? There are a lot of implications that we should really be thinking about, including how it would affect society, for better or worse…
Don’t pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works
Thank you 😊
I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal material. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of these things being legal, but I can’t logically argue for it being illegal without a victim.
I’ve been thinking about this recently too, and I have similar feelings.
I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?
More importantly, what should it be?
It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn’t?
If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).
And to know that, we’d need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don’t know whether it would be or not), will many politicians actually call for it to be allowed? Seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten-foot pole.
what is the law’s position on AI-generated child porn?
Pretend (i.e. simulated) underage porn is illegal in the EU and some other countries. I believe, in the US, it is protected by the First Amendment.
Mind that when people talk about child porn or CSAM that means anything underage, as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There were some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.
I believe, in the US, it is protected by the First Amendment.
CSAM, artificial or not, is illegal in the United States.
I see. I’ve looked up the details. Obscenity - whatever that means - is not protected by the First Amendment. So where the material is obscene, it is still illegal.
https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition
Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true:
- Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.
- Bob can’t claim actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.
A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who, even if they haven’t done any harm yet, can be profitably prosecuted to society’s ultimate benefit, so long as you value innocent kids more than perverts.
Am I reading this right? You’re for prosecuting people who have broken no laws?
I’ll add this; I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?
This sounds like some Minority Report hellscape society.
Correct. This quickly approaches thought crime.
What about an AI gen of a violent rape and murder? Shouldn’t that also be illegal?
But we have movies that have depicted that sort of thing for years, graphically. Do those then become illegal after the fact?
And we also have movies of children being victimized, so do these likewise become illegal?
We already have studies showing that watching violence does not make one violent, and while some refuse to accept that, it is well-established science.
There is no reason to believe the same isn’t true for watching sexual assault. There have been many, many movies that contain such scenes.
But ultimately the issue will become that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient, but it’s possible, and efficiency will increase.
The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication; even then I’m sure workarounds would occur.
Prohibition requires society to sacrifice freedoms, and we have to decide what we’re willing to sacrifice here, because as we’ve seen with other prohibitions, once we unleash the law on one, it can be impossible to undo.
Ok, watch adult porn, then watch a movie in which women or children are abused. Note how the abuse is in no way sexualized, exactly the opposite of porn. It likely takes place off screen, and when rape in general appears on screen, between zero and no nudity co-occurs. For children it basically always happens off screen.
Simulated child abuse has been federally illegal for ~20 years in the US, and we appear to have very little trouble telling the difference between prosecuting pedos and cinema, even whilst we have struggled enough with sexuality in general.
But ultimately the issue will become that there is no way to prevent it.
This argument works well enough for actual child porn. We certainly don’t catch it all, but every prosecution takes one more pedo off the streets. The net effect is positive. We don’t catch most car thieves either, and nobody suggests we legalize car theft.
Am I reading this right? You’re for prosecuting people who have broken no laws?
No, I’m for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal. Notably, you are free to fantasize about whatever you like; it’s the actual creation and sharing of images that would be illegal. Far from being a Minority Report hellscape, it’s literally the way things already are in many places.
Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind it?
Demand doesn’t really drop when something is made illegal (the same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable (this attracts shady money grabbers who hate regulation / don’t give a shit about law enforcement and therefore do illegal stuff for money), and it means you have to spend a shitton of government money maintaining all the prisons.
Basically every pedo in prison is one who isn’t abusing kids. Every pedo on a list is one who won’t be left alone with a young family member. Reducing AI CP doesn’t, by itself, actually do anything.
Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies show, that dealing with Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation to be had. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.
People are locked up all the time for just possessing child porn without having abused anyone. This isn’t a bad thing, because they are a danger to society.
Good arguments. I think I am convinced that both cases should be illegal.
If the pictures are real, they probably increase demand, which is harmful. If the person knew they were real, then the act should result in jail and forced therapy.
If the pictures are not real, forced therapy is probably the best option.
So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in one case it is “victimless”. If professionals don’t find them plausibly rehabilitated, then jail time for them.
I would assume (but don’t really know) that most pedophiles don’t truly want to act on it, don’t want to have those urges, and would voluntarily go to therapy.
Which is why I am convinced prevention is the way to go, not sacrificing privacy. In Norway we have anonymous ways for pedophiles to seek help. There were posters and ads for it in a lot of places a year or so back. I have not researched how it works in practice though.
Edit: I don’t think the therapy we have in Norway is conversion therapy. It’s about minimizing risk and helping deal with the underlying causes, medication, childhood trauma etc. I am not necessarily convinced that conversion therapy works.
Therapy is well and good and I think we need far more avenues available for people to get help (for all issues). That said, sexuality and attraction are complicated.
Let me start by saying I am not trying to state there is a 1:1 equivalence, this is just a comparison, but we have long abandoned conversion therapy for homosexuals, because we’ve found these preferences are core to them and not easily overwritten. The same is true for me as a straight person, I don’t think therapy would help me find men attractive. I have to imagine the same is true for pedophiles.
The question is, if AI can produce pornography that can satisfy the urges of someone with pedophilia without harming any minors, is that a net positive? Remember the attraction is not the crime, it’s the actions that harm others that are. Therapy should always be on the table.
This is a tricky subject because we don’t want to become thought police, so all our laws are built in that manner. However, there are big exceptions for sexual crimes due to the gravity of their impact on society. It’s very hard to “stand up” for pedophilia because, if acted upon, it has monstrous effects, but AI is making us open this can of worms that I don’t believe we ever really thought through beyond criminalizing and demonizing (which could be argued was the correct approach with the technology at the time).
That said, sexuality and attraction are complicated.
There’s nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
Nobody here has at all suggested it’s ok to rape kids. I hope you can understand the difference between thinking something and doing something.
I really don’t know enough about the subject or how that therapy works. I doubt that it is conversion therapy, but I really don’t know. I would assume it’s about handling childhood trauma, medications, etc.
Both therapy and whether satisfying urges through AI-generated content works are questions that should be answered scientifically. If there is research, that should be the basis for the decisions taken; if there is a lack of research, then more research should be the next step.
If the pictures are not real, forced therapy is probably the best option.
This is true, but it really depends on how “therapy” is defined. Forced therapy could mean anything, including things like the old conversion therapy approach used on gay people.
You might argue that these days we wouldn’t do anything so barbaric, but considering that the nature of a pedophile’s attraction is very unsavory, and the fact that it will never be acceptable, unlike homosexuality, people would be far more willing to abuse or exploit said “therapy”.
I think the concern is that although it’s victimless, if it’s legal it could… normalise the practice (within certain circles). This might make users confident enough to do something that does create a victim.
Additionally, how do you tell if it’s real or generated? If AI does get better, how do you tell?
what is the law’s position on AI-generated child porn?
Already illegal here in the UK https://metro.co.uk/2025/02/02/makers-ai-child-abuse-images-jailed-uk-introduces-world-first-law-22481459/
Illegal in most of the West already, as creating sexual assault material of minors is illegal regardless of method.
It’s so much simpler than that: it can be created now, so it will be. People will use narrative twists to post it on the clearnet, just like they do with anime (she’s really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to be happening.
The only question is whether politicians will stop mudslinging long enough to have an adult conversation, or whether we just shove everything into the more obscure parts of the internet and let it police itself.
No adult conversation required, just a quick “looks like we don’t get internet privacy after all, everyone.” And more erosion of civil liberties. Again.