I've had to avoid taking certain cutesy photos of my 3-year-old son because he is sometimes naked (example: bathtime or the beach), and I have to worry about my innocent photos being misidentified as CSAM.
What's crazy is that my library contains naked photos of ME as a toddler (I scanned a bunch of old slides a few years ago), and of course I have to wonder if those are going to get flagged. (My parents were German immigrants. Germans DNGAF about human nudity, unlike the puritanical Americans.)
There's a cost to automating this. You might say "well a human can tell" but humans don't scale.
Regarding the evaluation of everything else, I find it useful to ask what concrete, demonstrable harm has actually been done. Not hypothetical harm, mind you. So for automatic generation of self-indulgent pornographic material (or, for example, things that aren't even possible in reality, such as... hentai?), I don't see an issue unless it is acted upon and demonstrably harms someone or violates consent (harm caused by creating a court case shouldn't count toward this, since that is circular reasoning). Most people who are only attracted to adults have fapped to content depicting something they would never do in reality; I don't think the mere possibility that someone might act on it is a strong enough argument to ban it.