"Nonconsensual AI porn" is a weasel term because it implies that it should be necessary to get someone's consent to create fake porn using their faces.


It definitely should be. I don’t want my face used for porn just because I happened to appear in a stock photo.

Controlling likenesses in AI was the whole point of the SAG strike.


What if I hired a human artist to hand draw the fake instead?

What if they used Photoshop instead of drawing from a reference?


People can, and do, sue for that successfully.

The fact that a small problem exists does not negate the existence of a larger problem.


Only if it's published or commercialized, right?

Yeah, widely publishing or making money off of someone's likeness is a whole different can of worms. I was thinking for strictly personal use.


Nonconsensual porn laws generally aren't restricted to commercial use, and some include fake images with intentional and recognizable use of likeness (some also don't; it's a mixed bag.)

Yes, commercial use of likeness is also an issue, and it may or may not be violated simply by distribution of something using the likeness on a commercial website like civitai.


Won't section 230 cover the former?


Section 230 might cover right of publicity claims for an innocent host or user, but not for the user submitting the content.

Revenge porn laws are generally criminal, and as such unaffected by Section 230.


Presumably a16z is not investing in a company for personal use.


Well that is worrying.

Right now CivitAI is basically Stable Diffusion social media. It doesn't seem profitable at all, even if they are selling user data or getting subs or whatever.

But if they start charging for downloads or the like, they may cross a line into making money off of likenesses, rather than just "hosting user content" the way Facebook, Twitter, and any old-school image hosting service get to say they do.


"What if I had significantly increased barriers to doing this very inappropriate and deeply weird thing?"


That's still a problem, don't you see?


How can that possibly be enforced though? You can't really stop people from drawing or photoshopping stuff for personal use, and this is essentially a further extension of the technology.


> Controlling likenesses in AI was the whole point of the SAG strike.

No, it wasn't.

It was one important issue in the strike, but there were others (streaming residuals were a big issue, for instance.)


> Controlling likenesses in AI was the whole point of the SAG strike.

I think they had 5 or 6 demands.


Well said. If you fancy using my likeness as a dartboard, or in a meme, or as a Photoshop asset, or painted on a canvas, or drawn by AI, or mistakenly randomly generated, etc, great! Have fun. Not my circus, not my monkeys.

I'm not entitled to categorically own/forbid using a look. That's nonsense and leads to self-inflicted quandaries: How do I know a video of unknown provenance contains me, not a dead ringer that gave consent? How different must a depiction be to not require my consent? 9 pixels? 30%, whatever that means? At least an eye color change?

It's impossible to consistently enforce, presumptive, and effectively thought-policing a concept. In short: it's absurd.


> How do I know a video of unknown provenance contains me, not a dead ringer that gave consent?

> It's impossible to consistently enforce, presumptive, and effectively thought-policing a concept. In short: it's absurd.

I mean, come on. It’s fine to disapprove of the law, but this isn’t some uniquely difficult thing that the legal system couldn’t possibly handle. It’s certainly nowhere near the level of complexity and ambiguity of, for instance, criminal fraud law, where things like the intent of the accused and the “reasonable person” are routinely crucial elements.


Actually it is arguably a higher level of complexity, because while intent is not normally an element of right of publicity, it has been looked to, along with effect, to disambiguate which cases that aren't simple image or voice likenesses (such as voice impersonations) are nevertheless covered.

But it is neither novel nor unique to AI.


Across the internet spreads a noisy video. It's pornography with an absolute dead ringer celebrity face. There are no context clues -- physical SMT, celebrity references, video provenance, etc.

What do?


I mean, if you don’t know who is responsible, what do you ever do? What if you find a dead body but no clues about how they died? What’s uniquely tricky about this particular type of crime?


There is no unique trickery here. In both cases, you do nothing -- don't charge presumed innocent people with "likeness" theft or with murder.



“In the United States, the right of publicity is a state law–based right, as opposed to federal, and recognition of the right can vary from state to state.”

So, the USA-specific answer is: it depends on the specific US state(s) whose law is relevant to the action in controversy.

There are countries with national rights in this area, but the USA is (and your source highlights this) not one of them.


It unequivocally should. Stop abusing peoples' privacy or you are going to get your toys taken away...


The irony of course is that people are only able to create deepfakes of non-celebrities because social media has already gotten the average user very comfortable with letting go of their privacy.


True, but I'm not going to blame the average person for being the victim of industrialized psychology.


Taking a pornographic movie and putting someone's face in place of one of the actors' does not violate their privacy in any way, since nothing private was shared that wasn't public before (their face).


Sharing a photo on Facebook doesn’t imply a public license for any pervert to use it for pornography. Don’t complain when that gets codified in law either — people like you are way too cavalier with other people’s livelihoods.


People are too entitled in trying to own a look. Can I spread porn fakes of myself? What if I look identical to Taylor Swift? Do I lose my right to free expression?


Blame Disney, not me. And I'd argue most individual people are interested in controlling what is done with their own visage more than anything. It's the same legal logic as revenge porn laws. If you are my enemy, all I have to do is find your sibling's Instagram account and I could make an entire <yourlastname>hub website.

> What if I look identical to Taylor Swift? Do I lose my right to free expression?

Taylor Swift's corporate legal team would already do a pretty fine job of excising that right from you. No additional legislation needed.


> Can I spread porn fakes of myself?

Yes.

> What if I look identical to Taylor Swift?

Yes, still.

Now, if you are trading commercially on the appearance similarity in a way which presents, either explicitly or implicitly on your images as Swift’s, then you open yourself up to right of publicity claims in some jurisdictions, and the same may be true of revenge porn laws in some jurisdiction, even without the context being commercial.


If you genuinely believe that my kosher, not-Swift deep fakes would ever legally survive -- regardless of context/provenance claims -- then I have a bridge to sell you.


I'd brush up on some information regarding that, 'cause that, my friend, is very much illegal, and people could very easily sue and win.


No, they could not easily sue and win, especially if no money was exchanged and the video wasn't presented by the maker as actually depicting the person in question. There are plenty of porn sites whose sole existence is deepfaking rich and powerful people performing sex acts. Those sites use the word FAKE or something akin to that in the title/intro of the video to add some protection.


> because it implies that it should be necessary

Who gets to decide what should or should not be necessary? Do you think your opinion about this is the majority view by people?


Any term given to a specific criminal action will often be used to refer to that action with the implication that it’s a crime, yes.


It should be necessary to get someone's consent to create fake porn using their faces.


It should, obviously, be necessary to do this.



