
Well, remember what actually happened: Apple removed their app! That's a bigger problem in general (I think at some point, legislators will need to force Apple to allow third-party installs without jailbreaking a phone).

Tumblr had issues with jailbait and child porn. Even with a lot of moderation and policing, it was difficult to keep under control. As an interesting consequence, people who use scripts to rip entire blogs may have underage selfies and other pics they're not even aware of. Possession is a strict liability crime in the US, so even having huge Tumblr dumps can be risky!

There were a lot of factors involved in the censorship, and it makes me think the future of the open web needs to be more federated/distributed. Sites like Twitter, Tumblr, and Facebook need to lose relevance.




> Tumblr had issues with jailbait and child porn. Even with a lot of moderation and policing, it was difficult to keep under control.

That's because they had crappy design:

* How does one report a post? Click the share button (I'm not clicking something labeled 'share' on child porn).

* Can moderators delete posts and the comment chain that allows for coordination? Nope, just the images.

Doesn't matter how much moderation one does if the tools aren't there.


Apple specifically removed it for child porn, and once Tumblr took care of that (before removing adult content in general), Apple added them back to the store.


Taking care of it on an ongoing basis might have been too great a burden for the (presumably understaffed) team, or they had too few tools to deal with it and no incentive or manpower to create new ones that worked with the existing adult-content system.

Simply banning adult content en masse allows them to use heuristics to identify nudity and genitalia in images.

Previously, they would have had to make a judgment call on whether those depicted were of adult age. By banning all of it, they no longer have the responsibility to make that choice.
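To make that concrete: a blanket ban means the classifier only has to answer "is there nudity?" rather than "is the person depicted under 18?". Below is a minimal, purely illustrative sketch of the crude end of such heuristics (a skin-tone pixel ratio); the function names and threshold are made up, and any production system would use trained image classifiers rather than anything this naive:

    # Toy heuristic: flag images where a large share of pixels fall in a
    # rough "skin tone" range. Illustrative only; real moderation
    # pipelines use trained classifiers.
    from PIL import Image  # pip install Pillow

    def skin_pixel_ratio(path: str) -> float:
        """Fraction of pixels loosely matching skin tones."""
        img = Image.open(path).convert("RGB")
        pixels = list(img.getdata())

        def is_skin(rgb):
            r, g, b = rgb
            # Classic rough RGB rule; produces plenty of false positives.
            return (r > 95 and g > 40 and b > 20 and r > g and r > b
                    and (r - min(g, b)) > 15)

        return sum(is_skin(p) for p in pixels) / len(pixels)

    def flag_for_review(path: str, threshold: float = 0.35) -> bool:
        # Anything above the (arbitrary) threshold gets queued for humans.
        return skin_pixel_ratio(path) > threshold

The point is that "contains nudity" is a tractable machine problem; "depicts a minor" is not.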


> or they had too few tools to deal with it

That wasn't the case. They had the industry-standard tools (PhotoDNA plus a partnership with NCMEC), but I can't say how well they were being maintained at the time of the situation with Apple.
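PhotoDNA itself is proprietary, but the general shape of such tools is hash-and-match: compute a robust perceptual hash of each upload and compare it against a database of hashes of known illegal images (the list NCMEC maintains for its partners). As a hedged illustration only, here is a toy average-hash version; the function names and the Hamming-distance threshold are invented, and PhotoDNA's actual hash is far more robust to cropping and re-encoding:

    # Toy stand-in for perceptual hash matching (NOT PhotoDNA).
    from PIL import Image  # pip install Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """64-bit perceptual hash: downscale, grayscale, threshold at the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def matches_known_hash(upload: int, known_hashes: set, max_distance: int = 5) -> bool:
        # Perceptual hashes survive small edits, so match on Hamming
        # distance rather than exact equality.
        return any(bin(upload ^ h).count("1") <= max_distance
                   for h in known_hashes)

The catch is that hash matching only catches known images; newly produced content still needs human review, which is exactly the ongoing burden discussed above.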


And at the time, HN roasted Apple pretty roundly for the decision: https://news.ycombinator.com/item?id=18494137

There was some great snark about Safari being next on the ban list.


The thing is, a naked 17-year-old and a naked 18-year-old look fairly similar. One is a felony; one is not. It would be very difficult to distinguish between them. Apple probably wanted to head off the outrage that someone would eventually drum up, and passed the problem to Tumblr. Tumblr figured it was easier to classify all of it as inappropriate, as there's no real way to classify something so fuzzy with the requisite accuracy.


Other porn sites with user-contributed content exist. This is not as big a problem as you make it out to be. Not to the extent that a big company like Automattic wouldn't be able to solve it and keep that quite profitable interest group on board. The reason they won't is that they don't like to go against the grain of the currently advertiser-mandated vision of an exclusively family-friendly internet, where 'internet' here means the ad-supported part of it; i.e., all of the bigger commercial content silos.

The outrage comes when people stumble upon photos of minors in the early pubescent or even prepubescent stage of development intended to titillate. That is, content that is fairly consistently classed as child pornography, with no apparent action undertaken to purge it.

> […] as there's no real way to classify something so fuzzy with the requisite accuracy.

For the odd case where an account is uploading content that looks like it might involve a minor nearing adulthood, a platform privately and confidentially asking for proof of identity and age is reasonable enough. It's a fair solution for, to name just one example, the odd flat-chested twenty-something exhibitionist of Asian descent.


> sites with user-contributed content exist.

Very true, but there's a notable distinction: those sites can devote 100% of their expertise to moderating such nuances. It would be costly for a company such as Tumblr (one which is not built around adult material) to build out and maintain that expertise.

> The reason they won't is that they don't like to go against the grain of the currently advertiser-mandated vision of an exclusively family-friendly internet, where 'internet' here means the ad-supported part of it; i.e., all of the bigger commercial content silos.

Are you surprised this is the case? What brand wants to be associated with that sort of stuff?

> The outrage comes when people stumble upon photos of minors in the early pubescent or even prepubescent stage of development intended to titillate. That is, content that is fairly consistently classed as child pornography, with no apparent action undertaken to purge it.

There have been quite a few instances of minors being prosecuted for sending explicit pictures of themselves: https://www.thedailybeast.com/cheats/2010/03/21/is-sexting-c...

> For the odd case where an account is uploading content that looks like it might involve a minor nearing adulthood, a platform privately and confidentially asking for proof of identity and age is reasonable enough. It's a fair solution for, to name just one example, the odd flat-chested twenty-something exhibitionist of Asian descent.

I'm sure there was a more polite way to phrase that. Regardless, much of the internet consists of downloading and re-uploading media, so verifying the original subject's age would remain difficult.


> Other porn sites with user-contributed content exist.

I bet those sites host vast amounts of illegal material as well. The law isn't fully enforced, because doing so would get so many people arrested that it would end up invalidating the law.

As you said, the outrage doesn't really exist in those hard-to-tell cases, but the law isn't written to align with the outrage. So the content is largely ignored unless a specific case happens to get popular attention.

Overall, the standards are about as sane as our standards around other similar topics, which is to say not at all.



