
This is probably controversial, but if we assume that pedos will (unfortunately) exist and will generate demand for this kind of material, isn't it better if it comes from an AI instead of real children?


This is an old discussion about "pretend paedophilia" using either art or (adult) actors, and AI essentially changed very little about it. The two views, very briefly, are:

- "It's a safer outlet and prevents actual child abuse, so it's a good thing."

- "It will encourage and enforce paedophilic tendencies and (indirectly) encourages actual child abuse, so it's a bad thing."

The last time I looked, the evidence was inconclusive. It's a difficult topic to research well, so I'm not expecting anything conclusive any time soon.

My own view is that, most likely, there are different kinds of paedophiles and that different things will be true for different groups, because these things aren't that simple. This kind of nuance is even harder to research, especially on such a controversial topic fraught with ethics issues.

There's also the issue of training material, which is unique to AI. Is it possible to have AI-generated child abuse material without training material of that kind? I don't know enough about AI to comment on that.


Does the legality of fake material increase the demand for real material? Does the availability of fake material "awaken" or otherwise normalize desires that might have remained dormant? Studies have shown a link between violent porn and abusive behavior. A link, of course, does not mean causation, but given the potential for monstrous harm, I think we need to be wary of legalizing this kind of material. There's also the question of the training set used to generate this type of imagery.

However, I also think thoughtcrime is a very dangerous and slippery slope. It's not an easy question with an easy answer.


For it to come from AI, it needs to come from real children. It's a chicken-and-egg scenario.

Someone abuses a child, films it, and feeds it into an AI. Now they have a model of that child.

Throw away the child and, as things currently stand, they're free of any charges. Of course that won't be enough, so they repeat the process.

It's not like someone is creating a model in Blender and then running that through an AI. Not that it doesn't happen anyway.


Gen AI does not need to be trained on photos of nude children to produce photos of nude children. It can generate a flying pig soaring over Congress without ever being trained to do so.


It may not need to be, but why wouldn't it be trained on real child imagery, if only to produce photorealistic results?

If you had the opportunity to tune your AI on real photography rather than on self-generated imagery, and a true photograph of a pig produced higher-quality generations with fewer defects, why would you not go for it?


If a generative AI knows the concept of children and the concept of porn, it can generate porn with children in it (with varying degrees of success and realism). It’s not stuck producing only what was strictly in the training set. AIs are fundamentally extrapolation machines.


> For it to come from AI it needs to come from real children.

Yes, but given that CSAM data already exists, and we can't go back in time to prevent it, there's no further cost to obtaining that dataset. Unlike all future real CSAM, which will be produced by abusing children IRL.

I see parallels with Unit 731 here. Horrible things have already happened, so what do we do with the information?


It's not about the cost. Why do new movies get produced when existing movies already exist?

Because of new content. If the AI is being trained on real data and new content, then the datasets don't end up stale.


New movies get produced because people want to make and sell movies. They don't have to make movies that are 100% reality. Movies actually use special effects and CGI to fake all kinds of things that would be illegal in real life.

For example, there was a time when, to get a flood effect, filmmakers actually flooded a set; three extras died. Later on they were told they couldn't do that, but they could simulate it. Tons of movies show people getting overcome by floods, but no one dies in real life anymore.


> New movies get produced because people want to make and sell movies.

Same with CP.

But real movies still use real effects. It's just that a lot more of it is done on a green screen now, as a cost-saving exercise and because of the demand for the movie to be out now, now, now.

If the same quality went into making films as it did in the past, the movie industry wouldn't be such a shovelful of shite. Those films were real, with real actors and real acting. Now you've got CGI; even so, scenes are still produced for real.


It feels really wrong to write this, but what happens if someone makes a model that's "good enough" and there's no incentive to abuse children any more? Also, a lot of models are never trained on real pictures of what they generate.


If a "good enough" model ever came be it could split the pedophile category in to two. (On a basic level)

Those who seek sexual gratification from the abuse of a minor. The real deal.

And those who are aroused by the body of the minor, or by watching the abuse of a minor.

If the model is "good enough", then you could potentially say that those who are interested in pedophilia probably won't seek the further extremes to fulfil their pleasure.

However, in the long run they are still pedophilic, and for that first group the real deal will always be what matters more.


> what happens if someone makes a model that's "good enough" and there's no incentive to abuse children any more?

That isn't how anything works.



