Well, remember what actually happened: Apple removed their app! That's a bigger problem in general (I think at some point, legislators will need to force Apple to allow third-party installs without jailbreaking a phone).
Tumblr had issues with jailbait and child porn. Even with a lot of moderation and policing, it was difficult to keep under control. As an interesting consequence, people who use scripts to rip entire blogs may end up with underage selfies and other pics they're not even aware of. Possession is a strict liability crime in the US, so even having huge Tumblr dumps can be risky!
There were a lot of factors involved in the censorship, and it makes me think the future of the open web needs to be more federated/distributed. Sites like Twitter, Tumblr, and Facebook need to lose relevance.
Apple specifically removed it for child porn, and once Tumblr took care of that (before removing adult content in general), Apple added them back to the store.
Taking care of it on an ongoing basis might have been too great a burden for the (presumably understaffed) team, or they had too few tools to deal with it and no incentive or manpower to build new ones that worked with the existing adult-content system.
Simply banning adult content en masse allows them to use heuristics to identify nudity and genitalia in images.
Previously, they would have had to make a judgment call on whether those depicted were of adult age. By banning all of it, they no longer have the responsibility to make that choice.
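To make that distinction concrete, here's a minimal sketch (in Python, with hypothetical placeholder models, not anything Tumblr actually ran): a blanket ban only needs one well-understood signal (is there nudity?), while the old policy also needed an age estimate, which is the part no model can do reliably.

```python
# Hypothetical moderation sketch -- both model functions are placeholders,
# not real APIs. The point is the shape of the decision, not the models.

NSFW_THRESHOLD = 0.8  # assumed cutoff for "this image contains nudity"

def nsfw_score(image_bytes: bytes) -> float:
    """Placeholder for an off-the-shelf nudity/NSFW classifier (0.0 - 1.0)."""
    raise NotImplementedError

def estimated_age(image_bytes: bytes) -> float:
    """Placeholder for an age-estimation model -- the unreliable part."""
    raise NotImplementedError

def violates_blanket_ban(image_bytes: bytes) -> bool:
    # Ban all adult content: one threshold on a well-studied classifier.
    return nsfw_score(image_bytes) > NSFW_THRESHOLD

def violates_age_aware_policy(image_bytes: bytes) -> bool:
    # Allow adult content but not minors: now the decision hinges on an
    # age estimate that can't tell 17 from 18 with any legal certainty.
    return (nsfw_score(image_bytes) > NSFW_THRESHOLD
            and estimated_age(image_bytes) < 18)
```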
That wasn’t the case. They had the industry standard tools (PhotoDNA + partnership with NCMEC), but I can’t say how well they were being maintained at the time of the situation with Apple.
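For context on how those tools work: PhotoDNA itself is proprietary, but the general approach is to compute a perceptual hash of each upload and compare it against a list of hashes of known illegal images supplied by NCMEC. A rough sketch of that idea, using the open-source imagehash library purely as a stand-in (this is not PhotoDNA's actual algorithm, and the hash list here is made up):

```python
# Perceptual-hash matching against a known-bad hash list (illustrative only).
from PIL import Image
import imagehash

# Hypothetical: hashes of known illegal images, as a clearinghouse would supply.
KNOWN_BAD_HASHES = [imagehash.hex_to_hash("d5c1c3c3c3c3c1d5")]

MAX_HAMMING_DISTANCE = 5  # tolerance for re-encodes, resizes, small crops

def is_known_match(path: str) -> bool:
    """True if the upload is a near-duplicate of something on the hash list."""
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - bad <= MAX_HAMMING_DISTANCE
               for bad in KNOWN_BAD_HASHES)
```

The catch is that hash matching only catches re-uploads of material that's already known; it does nothing for novel content, which is exactly where the 17-vs-18 judgment problem below comes in.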
The thing is, a naked 17-year-old and a naked 18-year-old look fairly similar. One is a felony; one is not. It would be very difficult to distinguish between them. Apple probably wanted to head off the outrage that someone would inevitably drum up, and passed the problem on to Tumblr. Tumblr figured it was easier to ban all adult content, as there's no real way to classify something so fuzzy with the requisite accuracy.
Other porn sites with user contributed content exist. This is not as big a problem as you make it out to be. Not to the extent that a big company like Automattic wouldn't be able to solve it and keep that quite profitable interest group on board. The reason they won't is because they don't like to go against the grain of the currently advertiser-mandated vision of an exclusively family-friendly internet — where 'internet' here means the ad-supported part of it; i.e., all of the bigger commercial content silos.
The outrage comes when people stumble upon photos of minors in the early pubescent or even prepubescent stage of development intended to titillate. That is, content that is fairly consistently classed as child pornography, and no apparent action is undertaken to purge that content.
> […] as there's no real way to classify something so fuzzy with the requisite accuracy.
For the odd case where an account is uploading content that looks like it might involve a minor nearing adulthood, a platform privately and confidentially asking for proof of identity and age is reasonable enough. It's a fair solution for, to name just one example, the odd flat-chested twenty-something exhibitionist of Asian descent.
Very true, but there's a notable distinction: those sites can devote 100% of their expertise to moderating such nuances. It would be costly for a company such as Tumblr (one which is not built around adult content) to build out and maintain such expertise.
> The reason they won't is because they don't like to go against the grain of the currently advertiser-mandated vision of an exclusively family-friendly internet — where 'internet' here means the ad-supported part of it; i.e., all of the bigger commercial content silos.
Are you surprised this is the case? What brand wants to be associated with that sort of stuff?
> The outrage comes when people stumble upon photos of minors in the early pubescent or even prepubescent stage of development intended to titillate. That is, content that is fairly consistently classed as child pornography, and no apparent action is undertaken to purge that content.
> For the odd case where an account is uploading content that looks like it might involve a minor nearing adulthood, a platform privately and confidentially asking for proof of identity and age is reasonable enough. It's a fair solution for, to name just one example, the odd flat-chested twenty-something exhibitionist of Asian descent.
I'm sure there was a more polite way to phrase that. Regardless, much of the internet consists of downloading and re-uploading media. This would remain difficult.
>Other porn sites with user contributed content exist.
I bet those sites host vast amounts of illegal material as well. The law isn't fully enforced because doing so would get so many people arrested that it would effectively invalidate the law.
As you said, the outrage doesn't really exist in those hard-to-tell cases, but the law isn't written to align with the outrage. So the content is largely ignored unless a specific case happens to get popular attention.
Overall the standards are about as sane as our standards around other similar topics, which is to say not at all.