It's the reason why, like many here, I have never bought a Samsung Android device (with the exception of the old Galaxy Nexus, which was free of Samsung's software crap).
They tend to provide good hardware features (waterproofing, camera, etc.), but they kill it with bad software. I had a few meetings with them a few years ago and got the impression that they thought software was cheap and not worth improving.
The software on my Galaxy Note 2 was actually pretty fine, though (this was just over a year ago). It broke down and I had the motherboard replaced under warranty; when it came back it had been wiped anyway, so I figured I might as well try CyanogenMod. I expected a culture shock like switching from a Windows desktop to Linux, but it's really much the same. Samsung ships pretty much stock Android with a load of crap you can disable. On first boot you disable ChatOn and the various Note pen features and you're good to go: it won't ever bother you again and you have a nice Android system. If anything, CyanogenMod drains more power than the stock Samsung software did...
Re-reading this, my first sentence is "the software [...] was pretty fine"; what I meant is that they didn't mess up Android too badly. Their own software, I don't know. I'd ignore Knox entirely simply because it's closed source and tries to do security. Now we've seen that it fails, but even without that knowledge I wouldn't have wanted to use it.
I've ranted about this more than enough on HN, so I'll summarize: Samsung's software is consistently complete shit. I would be truly surprised if during a security audit Knox turned out to be secure using even the most liberal definition of "secure".
> "The fact that they are persisting the key just for the password hint functionality is compromising the security of that product completely."
Ugh. How this gets through the hands of so many talented engineers boggles my mind. It's easy to just blame the project managers, but at some point don't we have a responsibility to say no, this is a terrible idea that completely compromises the premise of the product?
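To make the quoted complaint concrete, here is a minimal Python sketch (hypothetical function and storage names, not Samsung's actual code) of the broken pattern versus the obvious alternative: a container key should be re-derived from the password at unlock time, never persisted alongside hint material.

```python
import hashlib
import os

def derive_key(password: bytes, salt: bytes) -> bytes:
    # Re-derive the container key from the password on every unlock;
    # the key itself never needs to touch persistent storage.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

def store_hint_broken(password: bytes, hint: str, storage: dict) -> None:
    # Broken pattern: persisting the password (or anything key-equivalent)
    # so the hint feature can use it. Anyone who reads `storage` now
    # recovers the key without ever guessing the password.
    storage["hint"] = hint
    storage["password"] = password

def store_hint_ok(hint: str, storage: dict) -> None:
    # Safer pattern: the hint is shown before unlock, so it is stored in
    # the clear -- but nothing key-equivalent is stored with it.
    storage["hint"] = hint

salt = os.urandom(16)
key = derive_key(b"hunter2", salt)
# Same password + same salt always yields the same key, so persistence
# of the derived key buys nothing except an extra thing to steal.
assert key == derive_key(b"hunter2", salt)
```

The point of the sketch is that the hint feature never needed the key at all; only the broken design couples the two.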
Saying a definite 'no' in Korean culture is often hard (as is saying a definite 'yes'). Saying a firm 'no' to your boss, which would make him say a firm 'no' to his boss, is harder still.
/* Include a typical "that's why such things should be open-source" essay here. */
I work in the US and I can't remember ever working with someone from Korea or of Korean descent.
There have been at least a handful of times where I was the only one who spoke up about how irresponsible something was. I have yet to see another US developer speak up about the ethics of something like this. Actually, much worse stuff than this. Not once.
I routinely see dumb-ass decisions, imposed by the product manager, that totally ignore design/UX/technical principles while most of the company is against them.
It weakens the product, first because they are bad decisions, and second because we spend time implementing them instead of actually improving it.
Thankfully I have not yet encountered decisions like this where users' personal information is affected, but I can totally relate to this kind of situation.
I attended a session at Simon Fraser University earlier this year, hosted by Samsung. Those guys, including a product guy, made it pretty clear that the development work happened in their Greater Vancouver R&D lab, not in Korea. So...
Forget about Samsung's software engineers -- they've proven how "talented" they are one time too many already -- but what about the NSA? They just approved Samsung's Knox devices for classified information. Unless, of course, they want these devices to be vulnerable to bugs only "they" know about (at least until today).
I question how talented the actual engineers are. I doubt there's some project manager insisting upon some particular implementation of a password hint function. More likely is that they're relying on some freshly minted CS grad with no experience who doesn't grok how hard crypto is.
This is actually true, I think. One of my high school classmates, who graduated in EE or ECE with no CS background/courses and not much interest or knowledge in CS, worked on Knox right after college. Of course, I can't generalize from just one case, but it does make me skeptical about the product.
Maybe I am missing something here, but I didn't even know Samsung Knox did storage encryption. As far as I understand, the use case for this technology was isolating enterprise apps from user-installed/general apps. Given that, is there any scenario in which non-Knox apps could read the password from disk and use that to mess with apps inside the Knox container? If so, then this is a serious threat to Knox as a technology, if not, it still sounds like a silly way to implement passwords, but not a deal-breaker (kinda like having a plain-text /etc/shadow in mode 600 is silly - or a sign of negligence, depending on your standards regarding defense-in-depth - but not a full attack vector in and of itself).
I always assumed Knox was something you used in addition to Android's whole-system encryption, not instead of, but I could be wrong.
The API has been "adapted", and the new data separation will be "analogous", but it appears that it's an adaptation of the current AOSP/Google cryptography to fit a similar need, not a direct contribution from Samsung.
I imagine that as the number of programmers is growing exponentially a lot of wheels will be reinvented unnecessarily, but this is a security product from a top-tier company selling products at premium prices.
Just because an organization is big, or even a government agency, does not mean it has the skill. I've worked for both, and the real security expertise (when they bother) is usually brought in through consultants. The apathy around security in both the public and private sectors is astounding. I keep hearing that it's difficult to get right. It's not. It does take some up-front thought, though (security is not a bolt-on/refactor thing, it needs to be built in from the start), and a LOT of work. I know because I've gone through it (https://www.wittenburg.co.uk/Entry.aspx?id=a7aae391-9ac8-411...).
That's cherry-picking non-intersecting definitions, though. "Difficult" can also mean work that involves a lot of physical or mental effort, not just work that is intellectually challenging. "Hard work" can likewise refer to work that requires mental effort or stamina, not only highly physical labor.
That is indeed cherry-picking, and was my intent. Bluntly, most devs I ask about their lack of ability in the security space think they're not smart enough to get it right, so don't even bother to try.
First of all, as I mentioned in the paragraph below, I analysed the pre-installed Knox container app known as Knox Personal, which shipped with the Samsung S4 I bought, not Knox EMM: "Knox EMM is an enterprise cloud-based management solution for mobile devices which was not part of this analysis."
I investigated the following versions (as stated in the names of the apk files on the device): KNOX_com.sec.knox.app.container_2.0_2.apk, KNOX_com.sec.knox.containeragent_3.0_30.apk
A lot of comments and posts claimed that I had just investigated an early developer version. Version 2.0_2 doesn't exactly sound like an early developer version, does it? And Samsung, why are you shipping early developer versions of a product on customer devices?
I did the analysis about one month ago with a new Samsung S4 and all updates installed. That doesn't seem like an early developer version, right? Or did I buy a fake one ;)?
Samsung mentioned the following in their press release: "Concerning the second issue, KNOX does save the encryption key required to auto-mount the container’s file system in TrustZone. However, unlike what is implied in the blog, the access to this key is strongly controlled. Only trusted system processes can retrieve it, and KNOX Trusted Boot will lock down the container key store in the event of a system compromise."
I think Samsung is talking here about their Knox Agent. At the beginning of my analysis I used geohot's towelroot to gain root access on the Samsung device. During the analysis the phone wanted to update some "Samsung Security Policies". After the update, the Agent blocked root access to the phone. So this agent seems to work like an ordinary anti-virus tool: it can only detect attacks it already knows about. And as we all know, anti-virus is useless against unknown attacks :). The same goes for their so-called "TrustZone".
All the other points in the press release were about Knox 1.0 software, which has now been replaced by MyKnox. I don't know what Knox 1.0 is, how to get it, or which devices it is installed on. All I know is that the Knox Container version 2.0_2 installed on my Samsung S4 is thoroughly insecure.
When I first read "Samsung Knox" in an article merely talking about device features, I rolled my eyes and thought "there's no way Samsung wrote a solid piece of security software".
Knox is difficult to implement in an enterprise environment anyway -- when we evaluated MDMs, two vendors flat out said that they were aware of zero customers actually using it.
The problem is, of course, that the security model for Android is a real mess, especially in BYOD scenarios. These garbage third-party solutions exist because the platform doesn't provide this itself.
I was surprised that so many people didn't have the patience to think through the logic in that original post. The inference was: since there is a function that derives the key from the password, this must be how Samsung handles the password. That logic doesn't fly.
Not really brutal. It's a PIN code, not a passphrase. Unless PINs are used with HSMs (as with bank or SIM cards), their only use is to prevent immediate access by a non-technical adversary, so there's little point in securing them beyond typical UNIX filesystem permissions (given that you're not supposed to have root access or access to the raw block device, huh).
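The keyspace point behind that argument can be made concrete with a bit of arithmetic (illustrative figures only, not tied to any particular Knox version):

```python
import math

# Back-of-envelope keyspace comparison: a 4-digit PIN vs. an
# 8-character lowercase passphrase.
pin_space = 10 ** 4      # 10,000 candidates
phrase_space = 26 ** 8   # 208,827,064,576 candidates

pin_bits = math.log2(pin_space)        # ~13.3 bits of entropy at best
phrase_bits = math.log2(phrase_space)  # ~37.6 bits

# Without an HSM or a retry counter to rate-limit guesses, 10,000
# candidates is nothing: any offline attacker who obtains the stored
# blob exhausts the PIN space instantly, which is why PINs only make
# sense paired with tamper-resistant hardware.
```

This is why the parent comment treats a bare PIN as a deterrent against a casual snoop rather than real cryptographic protection.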
It's security of your device from yourself -- the same kind of security thinking that denies you root access. The idea is that your company can provide you with an app, but you can't tinker with it. The whole point is that, if the thing worked as advertised, the company could consider your device "secure" and not worry about it leaking their data, while still letting you use it.