
Maybe a dumb question, but why are media decoders, which are notoriously high risk, not well sandboxed?


Because sandboxing on iOS is terrible. Not that any of the other commercial vendors are any better.

If they could provide good sandboxes, do you think the highest security certifications advertised on their website [1][2] would only certify protection against attackers with “basic attack potential”, the lowest possible level? That is three whole levels below “moderate attack potential”. I mean, seriously, they certify on their own website that their security sucks; is it any wonder their security sucks?

[1] https://support.apple.com/guide/certifications/ios-security-...

[2] https://support.apple.com/library/APPLE/APPLECARE_ALLGEOS/CE...


From a security perspective, Common Criteria certification isn’t particularly meaningful.

Plus, it’s not really worth getting certified at a higher level than you need. Why expend extra effort?


No. From a security perspective, a Common Criteria certification at the lowest possible level does not establish meaningful security. That is kind of the point.

The companies that develop easily hacked systems that are repeatedly hacked hundreds of times a year, like Apple, Microsoft, Cisco, Amazon, Google, etc., can only achieve certification levels indicating they are easily hacked. They have never once succeeded at certifying meaningful security. The certification is pinpoint accurate; the trillion-dollar commercial IT companies just do not like the results.

I agree it is largely not a useful differentiator, but that is because all of the commercial IT vendors are certified incompetent. The Common Criteria will not help you determine which fish in the barrel is hardest to shoot. Its job is to distinguish serious security by professionals.


And why haven't they been rewritten yet, considering this keeps happening?


It takes a while. At Google, at least, new systems in Android are required to be built in Rust, and there are major efforts underway to rewrite significant existing ones. But it takes time, rewrites are risky in their own ways, and you need all the tooling to support everything else an engineer does beyond simply writing code.

From where I sit, it also feels like the industry has really only coalesced around "the only real solution is safer languages" in the last 2-3 years. "Rewrite it in Swift/Rust" was way more controversial in 2019. So hopefully we'll see significant progress in the next several years.
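To make the "safer languages" point concrete: the classic media-decoder bug is trusting a length field from untrusted input. Here is a minimal sketch (the one-byte "chunk" format is made up purely for illustration) of how Rust's bounds-checked slice access turns that bug from memory corruption into a recoverable error:

```rust
// Hypothetical chunk format: first byte declares the payload length.
// In C, a lying length field could drive a memcpy past the buffer;
// in Rust, `get` with an out-of-range span just returns None.
fn read_chunk(data: &[u8]) -> Option<&[u8]> {
    let len = *data.first()? as usize;
    // Bounds-checked: None if the declared length exceeds the buffer.
    data.get(1..1 + len)
}

fn main() {
    // Well-formed chunk: declared length matches the payload.
    assert_eq!(read_chunk(&[3u8, b'a', b'b', b'c']), Some(&b"abc"[..]));
    // Malicious chunk: declares 200 bytes but only carries 1.
    assert_eq!(read_chunk(&[200u8, b'a']), None);
    println!("malformed chunk rejected safely");
}
```

The same logic written with unchecked pointer arithmetic is exactly the kind of bug that keeps showing up in image and video parsers.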


But it's already been a while...


How long do you think it is reasonable to go from "we are now in agreement that rewriting stuff is the right call" to "all media processing code is written in a memory safe language"?


Why do you exclude the time it took to get to "we are now in agreement"?


1) Because that takes work. 2) Because sandboxing makes things a bit slower, so it's a stand-off between Apple and Google: neither of them wants to be the "laggy" phone.


It makes no sense, especially for things like thumbnails in file viewers.


They are.


They clearly are not, seeing what's happening.


They are. Look up "BlastDoor" and other related efforts to sandbox decoders.


Sandboxing is not perfect. There are sandbox escapes all the time.



