Quite the horrifying read, and absolutely a case that will be used as ammo to tear down S230.
I love S230 in principle. I don’t think platforms should, up to a point, be held accountable for their users’ postings. On paper, it’s an excellent safeguard that has enabled the modern internet, for better or worse.
As with all “good things”, however, bad actors have found ways to exploit it for their own ends, and an incompetent Congress has failed to reform it. Modern polarization isn’t helping, as the respective camps regarding S230 tend to be “protect it as is” or “repeal it entirely”, neither of which is appealing or worthwhile. S230 is flawed, and it must be amended, updated, or replaced with a better version that addresses common grievances.
Subjectively, I would prefer to see S230’s protections “scale” with the size of the platform and its function. For instance (and I know this will be a highly controversial take), I don’t think large, centralized platforms like FB or X should have S230 protections at all, because of their high degree of control, the immense profits available to invest in R&D or better moderation, and their repeated failures to moderate misinformation and harmful content while also claiming to be authoritative and reliable sources of public information. On the flip side, your small (sub-10k users) community should have full S230 protections, because you’re a niche that likely serves a smaller cohort rather than a centralized monolith of the general public.
Could such restrictions be exploited to create similar harms as the above? Of course, but at least a modest reform like that would serve as a warning shot to both sides that no legislation is ever “done”, and we must continue to adapt it to ensure it meets the needs of modern times.
My take on it is that opaque, individualized content recommendation algorithms should be treated the same as human editors selecting the content in question. Something Facebook-sized that only shows users a reverse chronological feed of things they manually subscribed to wouldn't be nearly so problematic.
Since it's opaque, we can't even know whether it's human editors behind the scenes making all the decisions instead of algorithms. I've heard TikTok uses a lot of human judges to decide what content should go viral; when you do that, why does Section 230 still apply?
That's acting enough like a publisher that 230 might not apply. If they decide to make something illegal viral, it could lead to interesting litigation.
> and it must be amended, updated, or replaced with a better version that addresses common grievances.
That assumes it will be "better" and less flawed. To that point:
> In my subjective experience, I would prefer to see S230’s protections “scale” with the size of the platform and its function. For instance (and I know this will be a highly controversial take), I don’t think large, centralized platforms like FB or X should have S230 protections at all, because
sec230 in its current form is better than every "reform" I hear suggested.
Not really. If you look at the other case law, almost all of these semi-professional "amateur" porn cases come down to he said, she said. One party agrees to be filmed/paid for sex, backs out halfway through, and then sues for involuntary pornography and lack of consent. Very few are the true voyeur cams you would expect when you initially read about these cases. In this case they couldn't prove any of it, so the defendants got a plea deal with no jail time. Suing OnlyFans was just a cash grab, like how lawyers love to go after Pornhub (the MindGeek company mentioned in the other case law). If anything, this is section 230 working as intended. There is nothing broken to reform.
What the federal government should perhaps do is maintain a federal disclosure and consent-form database for all participants in an adult video, listing what they are willing to do, for how long, and when they signed it. That would go a lot further than these sorts of wishy-washy, he-said-she-said money-grab lawsuits.
If you're filming sexual acts to be placed on a website for monetary gain, then there should be a written form showing that consent. He said/she said shouldn't come within 10 miles of these conversations. The consent form should also grant either party the ability to withdraw that consent, again in writing.
See my other comment about the "GirlsDoPorn" cases. The models signed release forms with the expectation that their videos would go to private buyers, but instead they were sold online, so the models turned around and sued the producers for sexual assault. This sort of thing needs sunshine lists and clear regulations. Plus, having to disclose your identity on a public registry would reduce porn participation too, so it's a win-win even for the Republican legislators who are against porn in the first place.
I would go as far as to suggest that full SSN-level KYC ought to be legally required for all actors. From my understanding, the adult film/sex industry currently relies purely on physical signatures and maybe driver's licenses for its releases. If you need to disclose your SSN to sign up for a penny stock app like Robinhood, then you damn well should be required to do the same for anything sex work related. And if an underage person tries to do that, automatically notify the parents. It's not as if certain segments of the American government are unfamiliar with "don't say trans" tattle-to-the-parents legislation for kids.
> If you need to disclose your SSN to sign up for a penny stock app like Robinhood, then you damn well should be required to do the same for anything sex work related.
Stock trading accounts need your SSN for its original purpose, which is taxes.
And I think that having the government keep a list of people who do things that many consider immoral might worry people who don't trust the government to use such lists responsibly.
> The models signed release forms with the expectation
If the website hosting the content requires consent forms before allowing the video to be published, then these types of situations will be covered. To allow the content to be published, the consent must explicitly grant the host site the right to publish; if not, the site should not allow the content. All content should have explicitly granted permission for its use. This is how the rest of the world works with licensed content. Not just photography: even fonts grant a license for personal desktop use but require additional licensing for use on a website or in printed material. Music grants rights for personal use, but again needs additional rights for use in movies, TV, commercials, social media platforms, etc.
It's yet another example of a working system already in place for everyone, yet the people who disrupt or flagrantly disregard those systems cause chaos.
This is in the same ballpark as S230: it sounds great on paper but is terrible in practice. Having such a database would absolutely expose performers to reprisal or persecution by a bad-actor government - e.g., LGBTQ+ actors being persecuted by a religious or fascist government - and therefore shouldn’t exist.
However, I do agree with other posters here that any sort of adult production should absolutely have signed, written consent forms outlining safewords/signals for all participants involved, the nature of what will be filmed, and the expected acts of the parties therein, such that there can be no question about these issues in a court of law. Paperwork saves lives.
> One party would agree to be filmed/paid for sex, back out half way through, and then sue for involuntary pornography and consent.
I mean, I understand what you're saying, but you're making it sound like "it's not real rape but more like a contractual disagreement".
If someone backs out halfway through, they absolutely should be allowed to. There might be compensation terms in their contract, etc., but those are implementation details. The base fact is that just because you said "Yes, I agree to do X for Y money" doesn't mean you can't quit your job after it starts, and your employer cannot force you to keep working until he's satisfied with the result. That applies to pretty much any job, but especially to something like porn.
> your employer cannot force you to keep working until he's satisfied with the result. That applies to pretty much any job, but especially to something like porn
Yes, but there is no equivalent of clear-cut "at-will" employment for sex workers, and this needs to change. Legislators refuse to touch this topic with a six-foot pole, and that's why it's all in a grey area. Contract law shouldn't come into play here at all, and given the high stakes, it's unfair for producers to be ipso facto accused of sexual assault every time there's a breach of contract (you can extend this to prostitution too).
IMO, it's pretty simple: freedom must be bundled with responsibility. However, if a company doesn't want responsibility, it should be given immunity, but in return it must give up control over user content to some extent.
In practice, the bigger the platform, the more it should be regulated, and platforms with billions of users should be overseen by elected officials.