Reversing a MacOS Kernel Extension (lightbulbone.com)
100 points by supro on March 31, 2018 | hide | past | favorite | 22 comments



> Reversing the Unprotect Algorithm

Protected binaries have everything after the first three pages encrypted with 512-bit Blowfish. The key is the “Don’t steal Mac OS X” poem (I believe with spaces removed):

Your karma check for today: There once was a user that whined his existing OS was so blind, he'd do better to pirate an OS that ran great but found his hardware declined. Please don't steal Mac OS! Really, that's way uncool. (C) Apple Computer, Inc.

Fun fact: CommonCrypto doesn’t do Blowfish with a key size this large anymore due to CVE-2016-1802: https://blog.timac.org/2016/0710-blowfish-operations-with-ke...


> Note that using Blowfish with a key longer than 448 bits is not recommended as it weakens the security guaranteed by the algorithm.

That is quite interesting. This is the kind of thing that people like me, who are not cryptographers, would not have expected. We are taught that longer is always better.
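There is a concrete reason extra key bits stop helping: Blowfish's key schedule XORs the key, cycled byte by byte, into the 18 32-bit P-array words, which consumes at most 18 × 4 = 72 bytes (576 bits). Bytes past 72 are never read at all. A minimal sketch of just that XOR-folding stage (illustrative only, not a full Blowfish implementation; the real key schedule goes on to mix the P-array by repeatedly encrypting zero blocks):

```python
# Sketch of the key-folding stage of Blowfish's key schedule. The P-array
# holds 18 32-bit subkeys; the key is cycled byte-by-byte and packed into
# 32-bit words that get XORed into it. Only 18 * 4 = 72 key bytes are
# ever consumed, so anything past 576 bits is simply ignored.

def fold_key(key: bytes) -> list[int]:
    words = []
    j = 0
    for _ in range(18):          # 18 P-array entries
        word = 0
        for _ in range(4):       # 4 key bytes per 32-bit word
            word = (word << 8) | key[j % len(key)]
            j += 1
        words.append(word)
    return words

k = bytes(range(72))             # a 576-bit key
# Appending more material changes nothing: the tail is never read.
assert fold_key(k) == fold_key(k + b"ignored tail")
```

Since the rest of the key schedule depends only on these folded words, two keys that fold identically yield the same cipher. Schneier's original spec recommends at most 448 bits; the 448-576 bit range has subtler problems, which is what the CVE linked above is about.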


No - what people are being taught instead is never to implement cryptography themselves and to leave that to somebody who knows what they are doing.


Yes, it's always good to leave X to somebody who knows how to do X. However, this ought not be left to "professional" cryptographers. After all, "professional" merely means you're getting paid for it.

History has shown again and again that many professional cryptographers do not know what they're doing at all. From Crypto AG through the Wi-Fi encryption standards and early SSL to the cell phone encryption standards, professional cryptography has failed again and again. There's probably more snake oil in commercial cryptography than in open-source hobbyist projects.

Professional cryptographers, as well as intelligence agencies and police authorities around the world have a vested interest in perpetuating the myth that you shouldn't roll your own crypto, even though they know that e.g. certain combinations of cryptographic primitives or higher round numbers in existing algorithms are more secure at best and no less secure in the worst case. There are many other modifications that would potentially strengthen cryptography, e.g. triple encryption with minimum key or cascading ciphers.

The advocacy against home-made cryptography started in response to the cypherpunk movement in the 90s, which was perceived as a threat for a very simple reason: nonstandard implementations need to be reverse-engineered individually, and this is very labour intensive and can become prohibitively expensive even for large, well-funded government agencies. Use of standard libraries is therefore much more desirable, as it also makes it easier to tailor your side-channel attacks to the endpoint. Moreover, if there is a weakness in a library, it can be exploited on more systems. Also, you can attack the servers and machines of fewer developers, e.g. exchange binaries when they are downloaded.


> History has shown again and again that many professional cryptographers do not know what they're doing at all.

Case in point: you should never use plain-text ASCII as a key directly. You should always run your source key through a KDF, which will produce an actual key of the proper length. Also, what matters is not the length of the key but how much entropy it has. Anything more than 128 bits is serious overkill.
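Concretely, the fix is one call to a standard KDF. A sketch using Python's stdlib PBKDF2 (the passphrase, salt, and iteration count here are illustrative, not a recommendation; in real use the salt is random and stored alongside the output):

```python
import hashlib

# Derive a fixed-size key from an ASCII passphrase instead of using the
# passphrase bytes as the key directly. dklen=16 gives a 128-bit key,
# which is plenty if the input has enough entropy to begin with.
passphrase = b"Please don't steal Mac OS!"
salt = b"example-salt"  # illustrative; use a random, stored salt in practice
key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 600_000, dklen=16)

assert len(key) == 16  # always the proper length, whatever the input was
```

A KDF also normalizes away exactly the kind of oversized-key edge case the Blowfish CVE above is about: the cipher only ever sees a key of the size you asked for.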


This is very good advice that almost nobody I know follows. It is such a simple step covering such a large attack surface that I am surprised it isn't more common practice, since it gives you nice peace of mind about the input.

Every other time I suggest it in code review, I get a `but it works without it` and then I have to re-explain a bunch of stuff.


> it works without it

All security "works" until it doesn't.

All of the hand-wringing about FB and Equifax drives me nuts because all of it was completely predictable. But no one wants to hear it until the horses are out of the barn and galloping over the horizon in a cloud of dust.


I have wondered that too. When state agencies see a crypto stream they want to crack, I'm sure they are happy to see it's using a standard cipher, because they have done years of research into how to crack it.


Or, in reality, a standard cipher has been researched by thousands of independent researchers, so you know with greater probability that the standard is not vulnerable.

Question is, do you believe more in the ability of “the agencies” to penetrate independent researchers, or do you believe otherwise?


> No - what people are being taught instead is to never implement cryptography themselves and leave that to somebody who knows what he is doing.

The point is: what is "implement"? If I use OpenSSL (or any other crypto library with Blowfish), and I don't know about this length issue, I'm still screwed.


Software developers tend to assume that any domain they build for is already so well established that there's a single web page on the internet that will explain how to do everything.

For example, in physics it's the laws of physics. You can sometimes get away with reading an article on Wikipedia, and it's probably going to tell you how things are today and a decade into the future.

For security, it's special cases - for example, that certain hash functions will lose some of their entropic properties if applied twice. This changes almost every year, and someone needs to keep their docs up to date so that you can know about it.

The issue is that security is incredibly contemporary in nature, while the laws of physics most developers deal with haven't changed for at least half a century.
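The entropy point above has a concrete basis: a good hash behaves like a random function, and a random function on a finite set is not a permutation, so each application shrinks the set of reachable outputs. A small demonstration with SHA-256 truncated to one byte (the truncation is only to make the domain small enough to enumerate):

```python
import hashlib

def h(b: int) -> int:
    """SHA-256 of a single byte, truncated to its first byte."""
    return hashlib.sha256(bytes([b])).digest()[0]

domain = set(range(256))
once = {h(x) for x in domain}    # image after one application
twice = {h(y) for y in once}     # image after two applications

# A random function maps ~63% of a set onto distinct outputs, so the
# reachable set shrinks with each pass: |twice| <= |once| < |domain|.
assert len(twice) <= len(once) < len(domain)
```

This is exactly why iterated hashing (without salting or a proper KDF construction) slowly narrows the output space rather than preserving it.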


Key length is a user-facing parameter of encryption libraries.


A library interface does not imply a safe interface. More often than not, one is required to study the supplied documentation, but even then some libraries require a certain amount of domain knowledge before one can apply one's programming skills.

That goes beyond security.


The reason we're talking about it is because someone sang the "leave crypto to the experts" tune. Apparently not, because this was about key size - responsibility for which the experts punt to non-experts.

I'm not saying don't study the docs. I'm pointing out the irony in bringing up this "leave it to the experts" platitude when it's so clearly about something left to end users.

And to that effect: if interfaces are not safe, why are experts designing unsafe interfaces?

I'm tired of this victim blaming in crypto failures.


What would be the purpose of such a "bad" encryption? Apple must have known that it is not difficult to decrypt the protected binaries, so what is the point? To deter people who are not determined enough to learn about the code? To tell people who do decrypt them not to steal code?


I'd assume it's for the same reason that Oracle used a poem in one of their protocols[1] - to make distributing the key copyright infringement.

[1] https://dacut.blogspot.com/2008/03/oracle-poetry.html


It’s probably meant as a slight deterrent. People who are determined to Hackintosh are going to do it anyways; this just puts a small barrier in place to them doing so. Plus, the poem makes them feel bad about their actions.


Technical Protection Measures.


It's not a very good one, is it? Nothing close to what iOS does.

Nice username BTW ;)


This is interesting. I wonder if this is the same encryption used to encrypt App Store apps as well; I believe on iOS the kernel extension is FairPlay. It would be interesting to know if someone has tried to decrypt an app without a jailbroken device.


Funny - reverse engineering C++ calls by using Google instead.


(2016)



