
Even without the changes they made to Stable Diffusion, it was already able to generate CP; that's why they restricted it from doing so. There was no child pornography in the training set, but there was plenty of ordinary adult nudity, adult pornography, and fully clothed children, and the model was able to extrapolate from those.

Anyway, one obvious application: FBI could run a darknet honeypot site selling AI-generated child porn. Eliminate the actual problem without endangering children.



> FBI could run a darknet honeypot site selling AI-generated child porn. Eliminate the actual problem without endangering children.

It's very unlikely AI-generated child porn would even be illegal. Drawn or photoshopped images aren't, so I don't think AI-generated ones would be.


That isn't how the law works in many countries. Whether an image is illegal does not depend solely on the means of production; if the images are realistic, they are often illegal.

https://en.m.wikipedia.org/wiki/Legal_status_of_fictional_po...

Don't forget that pornographic images and videos featuring children may be used for grooming, socializing children into accepting sexual abuse. There's a legitimate social purpose in limiting their production.



