Taking care of it on an ongoing basis might have been too great a burden for the (presumably understaffed) team, or they had too few tools to deal with it and no incentive or manpower to build new ones that worked with the existing adult-content system.
Simply banning adult content en masse allows them to use heuristics to identify nudity and genitalia in images.
Previously, they would have had to make a judgment call on whether those depicted were of adult age. By banning all of it, they no longer have the responsibility to make that choice.
That wasn’t the case. They had the industry-standard tools (PhotoDNA plus a partnership with NCMEC), but I can’t say how well they were being maintained at the time of the situation with Apple.