That points to a major complication in the process your partner was part of: if you suspect content is CSAM, you *legally must not* view it, open it, etc. The rule exists to keep bad actors from abusing those positions, and at the same time, no one (innocent) who has had to work under it wants that law changed.
It makes building detection systems harder, though.
Your intuition points to a typical PR pattern around Meta: employees disagree internally and take their frustration to the press. So many big-tech scandals are an occasion to remember the "two wolves inside of you" story.