Based on the write-up, Samsung has lower-quality iris recognition than an undergrad could write in a few hours. I say that having done so.
Most obviously, the system should never tolerate a constant-size pupil. The pupil has micro-dilations around twice per second, and your system is really terrible if you don't verify that the diameter is changing.
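For illustration, a liveness check along those lines could be sketched like this (the thresholds and per-frame pupil measurements are hypothetical; a real system would first have to segment the pupil out of each video frame):

```python
import statistics

def passes_hippus_check(pupil_diameters_mm, min_std=0.02, min_frames=20):
    """Reject a capture whose pupil diameter never changes.

    Physiological hippus makes a live pupil oscillate slightly about
    twice per second, so over a second or two of video a static
    printout should show near-zero variation in measured diameter.
    """
    if len(pupil_diameters_mm) < min_frames:
        return False  # not enough frames to judge liveness
    return statistics.stdev(pupil_diameters_mm) >= min_std

# A printed photo behind a contact lens gives a constant diameter:
print(passes_hippus_check([3.50] * 30))  # False
# A live eye fluctuates slightly from frame to frame:
print(passes_hippus_check([3.50, 3.47, 3.53, 3.49, 3.55, 3.46] * 5))  # True
```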
Also, multi-spectral is a pretty good test, though I don't know enough about the capabilities of the S8 camera to know if that's feasible (it shouldn't be that hard). Capturing the patterns of the iris at 500, 800, and 1200 nm results in three templates that are quite different from one another.
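A multi-spectral consistency check might look something like this sketch (the iris-code bitstrings are toy-sized and hypothetical, as are the wavelengths used as keys and the distance threshold; real iris codes run to roughly 2048 bits):

```python
def hamming_distance(a: str, b: str) -> float:
    """Fraction of differing bits between two iris-code bitstrings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Hypothetical captures at three wavelengths. A genuine eye produces a
# distinct template per band; a flat printout looks alike in all of them.
templates_live = {500: "10110010", 800: "01110001", 1200: "11010100"}
templates_print = {500: "10110010", 800: "10110011", 1200: "10100010"}

def looks_multispectral(templates, min_cross_band_distance=0.3):
    """Require every pair of bands to differ by a minimum fraction."""
    bands = list(templates)
    pairs = [(bands[i], bands[j]) for i in range(len(bands))
             for j in range(i + 1, len(bands))]
    return all(hamming_distance(templates[a], templates[b])
               >= min_cross_band_distance for a, b in pairs)

print(looks_multispectral(templates_live))   # True
print(looks_multispectral(templates_print))  # False
```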
CCC were able to do this for about the cost of an S8. I would say this is one of the rare situations where defeating the attack would have been even cheaper. It's that simple a programming exercise.
Samsung always seems to me as if they race to match any iPhone feature–but never more than skin-deep.
So when the iPhone gets a fingerprint sensor that saves only a hash of the actual data in a special enclave of a custom chip, Samsung responds with an iris scanner that saves an image of the iris as a world-readable jpeg in your home directory.
Thus, their marketing material can claim feature-parity (or even exceed Apple). But it never seems like they actually care.
It's not like Apple doesn't run into similar problems (not sure if the fingerprint sensor has been defeated–it's a bad idea for 5th amendment reasons in any case). But at least they do the minimum in trying.
>It's not like Apple doesn't run into similar problems (not sure if the fingerprint sensor has been defeated–it's a bad idea for 5th amendment reasons in any case). But at least they do the minimum in trying.
>All needed was a photograph of the fingerprint on a glass surface.
And wood glue! Looks like that method proved unreliable, so they expanded it:
"To create the mold, the mask is then used to expose the fingerprint structure on photo-sensitive PCB material. The PCB material is then developed, etched and cleaned. After this process, the mold is ready. A thin coat of graphite spray is applied to ensure an improved capacitive response. This also makes it easier to remove the fake fingerprint. Finally a thin film of white wood glue is smeared into the mold. After the glue cures the new fake fingerprint is ready for use."
Yes. They explained that all the innovative magic of Apple's fingerprint sensor was better image resolution. So all they had to do was improve that on their end too. I imagine the body changes everywhere over time, so this resolution game has a hard limit. A fingerprint is the worst choice of biometric data, as people leave them everywhere...
That was CCC in 2008. To underscore the inherent problems of biometric authentication, they pulled Wolfgang Schäuble's fingerprints off a dinner glass and published them in their magazine. That issue also included a ready-to-use replica. Schäuble was Germany's interior minister at the time and a strong proponent of biometric data in passports and increased surveillance.
Read the original CCC article or watch the video! They got the print off the phone itself. The only requirement is a high-resolution scanner or camera to capture it.
They also managed to get the fingerprint of a politician via a high resolution picture taken at a speech.
(Also a phone PIN via eye reflection captured by the front camera.)
As I have come to understand it, biometrics is a good identifier (telling who you are) but a lousy authenticator (telling that you are allowed).
The use of biometrics on mobile devices somewhat mixes the two, with the assumption that if the user was authenticated within a certain time frame (via a PIN or some other knowledge-bound check), a simple ID check is enough to extend that authentication.
Authorization is the counterpart to Authentication: authentication proves who you are (with passwords/tokens/biometrics aka something you know/have/are), authorization controls what you can do (with permissions/ACLs/roles/etc.)
To put it another way, the bouncer at a club checks your photo ID to see that you match it (authentication via something you are), then uses it to see if you can enter (authorization by checking your birthdate against a cut-off/name against a guestlist).
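The bouncer analogy can be written down as a small sketch (the guestlist, age cut-off, and function names are all hypothetical):

```python
GUESTLIST = {"Alice", "Bob"}  # hypothetical club policy
MINIMUM_AGE = 21

def authenticate(photo_matches_face: bool) -> bool:
    # "Something you are": does the person match their photo ID?
    return photo_matches_face

def authorize(name: str, age: int) -> bool:
    # What the now-verified identity is allowed to do.
    return age >= MINIMUM_AGE and name in GUESTLIST

def may_enter(name: str, age: int, photo_matches_face: bool) -> bool:
    # Authorization is only meaningful after authentication succeeds.
    return authenticate(photo_matches_face) and authorize(name, age)

print(may_enter("Alice", 25, photo_matches_face=True))    # True
print(may_enter("Mallory", 25, photo_matches_face=True))  # False: authenticated, not authorized
print(may_enter("Alice", 25, photo_matches_face=False))   # False: ID doesn't match
```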
> Samsung always seems to me as if they race to match any iPhone feature–but never more than skin-deep
Firstly, any biometric technique can be beaten. Against a known, committed foe, it is almost impossible to defend with surety. And for that matter, who couldn't obtain the PIN code of any other user given a short amount of time and focused attention? The notion that "if someone takes a high-resolution, close-up IR photo of your iris they can defeat your security" is asinine given that the same people could obtain your PIN in a million and one ways.
These mechanisms exist to induce users to use some security. The primary defense is against lost or stolen phones, so they have to be convenient enough not to get disabled.
Secondly, how did this somehow turn into yet another Royal Apple spiel? Aside from the easy beatability of the Apple fingerprint sensor, why wouldn't you compare the fingerprint sensor on a Samsung?
> ...if someone takes an IR high resolution, close photo of your iris they can defeat your security...
There are commercial security products that regularly perform "IR high resolution" iris scans from several meters away and require no cooperation from the target. Stanley CSS sold one that sat on top of a doorway over five years ago, their product literature says that you need to look at it - but having demoed it myself, I can say that is not true.
You've misunderstood my point, which is that distant iris data collection from non-cooperative subjects is so easy now that vendors are selling solutions pretty cheaply. So using it for security, without additional checks in place, is like walking around with your pin written on your forehead.
Samsung always half-asses their new features. They are so desperate to be first to market or to make a splash they rarely take the time to make it work well. It's my biggest frustration with Samsung, they have the resources and market position to take the long view and give their teams more time to polish, but never do.
And this is why it's so difficult to choose a high-end phone. I have been shouting about how my Pixel is a far better device than any other phone I've ever used, including the S8.
On paper it looks awful, but everything this phone does works 100% of the time quickly and without stuttering or failing.
I'm on my second Samsung. The first was a very annoying pre-capacitive-touch model that never worked well. This current one is an old Galaxy S5. It is better, but it likes to restart periodically (fortunately Android handles that well). Battery life is crap and it gets sluggish easily.
Before I had a Motorola. It was much better. I will not buy another Samsung.
I wouldn't say expected, they have made some poor performing phones in the past (for various definitions of "made").
But my point was more that on paper it doesn't look like much. It doesn't have waterproofing, it's not "best in class" in anything except the camera, it runs software which has less "on paper" features than other brands, it doesn't have an SD card or removable battery, etc...
But when you actually go to use it, it's a night and day difference between it and other devices.
I think that tbihl and bebna are trying to say that it would cost Samsung less to defeat the attack by programming one of the listed countermeasures than it cost the CCC folks to buy one of these phones. (I don't agree.)
That is exactly what I meant. Like you're probably thinking, my estimate IS an amateurish one, taking into account no business admin/healthcare/etc. But I remember implementing this (as a small part of a project) at a time when I couldn't be bothered to put more than a few hours into any project. At any rate, I can't explain their failure to implement these checks, because they're SO trivial.
"You only need a laser printer, a decent camera and a contact lens."
No, considering that the picture doesn't have to be that good, you need only a rogue picture from somewhere on the Internet (Facebook, Instagram, or whatever), the means to get it printed cheaply (most likely using a public printing service), and the contact lens. Oh, and the contact lens doesn't have to be new/hygienic. The total cost boils down to pennies.
Edit: 'micro-oscillation' was a term that I lazily invented to describe a phenomenon with which I am only passingly familiar. It is actually called 'hippus' or 'pupillary athetosis'.
Have you read George Orwell's "Politics and the English Language"? If not you should give it a read, it's short. This comment reminded me of his criticism related to Latin usage.
I have now! Hopefully you're accusing me only of saying 'micro-oscillations' when I ought to have said "shrinks and swells slightly at regular intervals."
This is remarkable, even though it's obvious to any of their phone users (myself included) that Samsung's software development is shoddy at best, despite its excellent hardware. Which is a huge pity, but may well have something to do with the culture of corporate deference and rigid hierarchy. http://www.asianews.network/content/feature-samsung-debacle-...
Let's not pretend that people here in extremely permissive environments don't build hugely buggy things that wouldn't survive a similarly superficial test.
Why would that only affect their software, but not their hardware?
While I agree that Samsung is at its core a hardware company and software engineers are still treated like second-class citizens, you can't just expect them to compete with Google or Microsoft overnight, IMO. I'm disinclined to believe that their military-like corporate hierarchy is to blame.
I don't know: perhaps the company's structures work better for its core expertise, hardware, while software development, which perhaps came somewhat later and sits lower down the hierarchy, doesn't work so well in that context?
Well, a few hours is quite the understatement if you're talking about starting from scratch.
Also, the cellphone image resolution is far too low to recognize dilations. Looking at the video, I'm surprised that it works as well as it does, to be honest.
It seems that there are reasonably large changes due to thinking about something, but not due to just looking, unless your camera can detect 0.01 mm changes in diameter. Maybe tbihl's could, but that seems unreasonable for a smartphone camera.
I would guess tbihl has never tried to get iris recognition working on a cost-constrained consumer device at a huge scale. They might not think it is so trivial then.
It's true that I've only tried iris recognition in dedicated setups. Sadly, I can't speak to whether this test is within the abilities of a standard smartphone camera without any macro lens.
Saccades are not what I was referring to. 'Micro-dilations' was my attempt to describe the phenomenon. After some searching, I believe it's called hippus (maybe specifically 'physiological' hippus, to distinguish it from the pathological kind) or 'pupillary athetosis'.
'Hippus', specifically 'physiological hippus', or 'pupillary athetosis' seem to be the correct terms for what I was describing as 'micro-dilations'. A saccade, as I understand it, is a rotational shift of focus by 1–2 degrees, rather than a repeated contraction/relaxation of the pupil.
Saccades in general are just rapid, ballistic eye movements: they don't have to be small and they're often voluntary. You often make saccades without explicitly thinking about it, but you can countermand these and hold your eyes still when you want to.
> Most obviously, the system should not tolerate a constant-size pupil, ever. The pupil has micro-dilations around twice per second, and your system is really terrible if you don't verify that changing diameter.
What if one of the subject's eyes is a glass eye? What if they're wearing colored contact lenses? Wouldn't both of those situations complicate that?
If someone's wearing a colored contact lens I would expect iris recognition to not work. That's like trying to use a fingerprint reader with gloves on.
Glass eye I'm not sure about either. That seems along the lines of using a fingerprint reader on a prosthetic hand. Sure, you could mold a fingerprint onto one and scan it, but with a sophisticated fingerprint sensor it's going to look like a fake finger. Do we deliberately make everyone's fingerprint scanner weak to allow someone without fingers to use it?
Maybe the answer is to introduce a "weak mode" option where most users could have the scanner verify "yes, this is a real eyeball," and if someone with a glass eye still wanted to use iris scanning in a way that can be copied by a photo, they have the choice to disable the security measures.
The first time I saw a blind person using an iPhone on the subway (years ago, I think it was an iPhone 4), I was completely blown away at how much he could do with it.
Personally I think apple should allow you to turn the backlight for the screen off entirely for their use. Perfect over the shoulder privacy and better battery life to boot!
I suppose Samsung tuned the system to avoid false negatives, which might be difficult in all the lighting conditions that might come up in real world use. It's a device for the mass market and average user after all.
Might be that they did a bad job with the iris recognition, but why not give them the benefit of the doubt and consider that they were aware of the trade-offs involved?
Knowingly and willingly giving users a false sense of security? How is that not worse in every way?
If they couldn't manage to get false negatives down to a sensible level without compromising security in such a blatant way, there are two courses of action: live with it, or don't release it.
How is this any different than the swipe to unlock pattern? If you go to a starbucks and sit down for 20 minutes in a higher seat, you can get the passwords to probably 2-5 phones depending on everyone's security and how crowded it is (I inadvertently memorized 3+ coworkers' unlock patterns because they're so easy to see).
If someone has physical access to the user they can probably get into most people's phones (that's just saying people aren't careful enough though).
The phone must lock automatically quite quickly; otherwise somebody could just grab it after you have unlocked it. This means the password needs to be typed in constantly if you are frequently picking up the phone. You also often want to grab the phone with one hand, so you need to be able to type the password with one hand. Combine that with the frequent typing and you probably come to the conclusion that you can't have a proper, secure passphrase. Instead you resort to a PIN code of some length. Now remember that you need to type the PIN code constantly to unlock the phone. With one-handed operation there's little you can do to protect yourself against shoulder surfing. This means your PIN code is not that private.
Iris scanning or fingerprints are easy for a determined attacker, but I would say they are hard for somebody who just grabs your phone. Vice versa for the PIN code.
I think a good balance between security and usability would be to allow fingerprint or iris scan when the phone has been constantly in my proximity but require a pin (password) if the phone is taken away. The proximity could be determined for example by pairing the phone with smart watch.
An encrypted NFC or Bluetooth bracelet, sold with or separately from a phone, would be nice. The phone pings it every so often; if it can't find it, it automatically self-locks. If it can't find it for X days and a password hasn't been entered in that time, it wipes or locks the phone.
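The logic described above could be sketched roughly like this (the X-day cutoff and all names are made-up values for illustration):

```python
WIPE_AFTER_S = 3 * 24 * 3600  # hypothetical "X days" without bracelet or password

class ProximityLock:
    """Lock on lost bracelet contact; wipe (or hard-lock) after a long
    absence during which no password was entered either."""

    def __init__(self, now: float = 0.0):
        self.locked = False
        self.wiped = False
        self.last_contact = now

    def on_ping(self, bracelet_found: bool, now: float):
        if bracelet_found:
            self.last_contact = now
            self.locked = False
        else:
            self.locked = True  # self-lock immediately on lost contact
            if now - self.last_contact >= WIPE_AFTER_S:
                self.wiped = True

    def on_password(self, now: float):
        # A successful password entry resets the wipe countdown.
        self.last_contact = now
        self.locked = False

lock = ProximityLock()
lock.on_ping(bracelet_found=False, now=60)
print(lock.locked)  # True: lost contact locks the phone
lock.on_ping(bracelet_found=False, now=4 * 24 * 3600)
print(lock.wiped)   # True: no bracelet and no password for over 3 days
```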
Should be significantly more secure than Mifare though. Ideally something like a contactless OpenPGP card or similar.
Recently I searched for passive NFC ICs that would be suitable for implementing that, but came up empty. The use case was exactly that: an NFC device located at about the wrist. My laptop has an NFC reader at just the right place on the palmrest to read it, and I'd probably transplant an NFC reader into my desktop computer's keyboard for the same purpose.
Just had an idea: maybe not the wrist for mobile devices, but a ring that is always on the hand that's on the back of the phone. I don't know, but I can't help thinking about how things can be made more secure, and that there is a market for those with security in mind.
Just found this also in my search while typing this comment.
I don't like rings (the kind you put on your fingers), but that's just a personal preference. I'd be okay with wristbands though. Yes, the proximity to smartphone NFC readers would be a benefit of a ring.
Quoting GP: I think a good balance between security and usability would be to allow fingerprint or iris scan when the phone has been constantly in my proximity but require a pin (password) if the phone is taken away. The proximity could be determined for example by pairing the phone with smart watch.
When combined with a fingerprint sensor, smart lock keeps the device completely unlocked while "triggered" (by being on-body, or close to a trusted BT device, etc), and the fingerprint unlocks it while not triggered. It doesn't ever escalate to requiring the pin/password/pattern. Please correct me if I'm wrong, because I'd like to be.
My solution is to use the "Screen off and Lock" app[1]. It acts as a local Device Administrator to be able to force a password unlock. I added the widget to my home-screen where I can actually lock my phone if needed (leaving it somewhere for a period), and I can use the fingerprint the rest of the time.
Definitely not a perfect system. I wish that I could set timeouts and map the power button to do an admin lock. Also, having to use a 3rd party app for this is quite likely its own threat vector.
Yeah, the Smart Lock functionality doesn't really support configuring your own primary-vs-fallback behavior.
I'd love to have features where the fingerprint is only good enough under certain circumstances, such as when the phone hasn't been idle for too long, or when combined with an RFID tag.
You're not wrong. I use smart lock and fingerprint on my 5x. I actually would prefer it to require fingerprint when near my watch with password fallback, and require fingerprint+password when away from my watch, but that isn't currently a possible configuration as far as I can tell.
The password model on Google's version of Android and iOS (as examples) is not biometric based. You need the password every few hours (at least on Android, not sure about iOS) and whenever you restart. The biometric is a keep-alive for that "session". For my threat model, that's sufficient. For many, that is sufficient. For some, it absolutely is not and they should disable biometrics entirely.
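That session model (a password establishes a session; biometrics only extend it) might be sketched as follows; the four-hour lifetime is an assumption for illustration, not the actual Android or iOS value:

```python
SESSION_LIFETIME_S = 4 * 3600  # assumed "every few hours" window

class LockScreen:
    """Biometrics extend a password-established session; they can
    never establish one on their own."""

    def __init__(self):
        self.session_started = None

    def unlock_with_password(self, ok: bool, now: float) -> bool:
        if ok:
            self.session_started = now  # password opens a fresh session
        return ok

    def unlock_with_biometric(self, ok: bool, now: float) -> bool:
        # Biometric unlock only works inside a live password session.
        if self.session_started is None:
            return False  # e.g. right after reboot: password required
        if now - self.session_started > SESSION_LIFETIME_S:
            self.session_started = None  # session expired
            return False
        return ok

screen = LockScreen()
print(screen.unlock_with_biometric(True, now=0))        # False: no session yet
screen.unlock_with_password(True, now=0)
print(screen.unlock_with_biometric(True, now=3600))     # True: within session
print(screen.unlock_with_biometric(True, now=5 * 3600)) # False: session expired
```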
Fingerprints and iris scans are not equivalent to a username or email, since the username and email can be changed easily. Proper biometric data cannot be changed, it is tied to the individual. So when the data is compromised and published in the wild and the devices can be fooled with it (which will always be possible), then the user of biometric recognition devices can become the victim of identity theft for the rest of their life.
Exactly this. We need the big 4 to start pushing this as a standard, and with all three factors in place, passwords can be simpler, like 8 alphanumeric characters. There is NO reason that 2FA can't be built into the mobile OSes and shown on screen/watch with fingerprint verification as the trigger.
>Iris recognition may be barely sufficient to protect a phone against complete strangers unlocking it
I suppose that's the attack scenario those systems (at least in phones) are supposed to protect against, to be fair. The alternative might be that some users use a predictable PIN or none at all. Fingerprints or the iris sensor are an improvement for them because they are quick and easy to use.
Of course it's still good to deflate the hype around iris scanners a bit and demonstrate that it is currently a very limited technology after all. Especially considering their remark that iris scanners are spreading to other devices too.
>Fingerprints or the iris sensor is an improvement for them because they are quick and easy to use.
I'm not sure about it being an improvement; human laziness always finds a way to make something less secure. Like buying "fingerprint stickers" because you're too lazy to pull off a glove when you want to unlock the phone [0].
The CCC always does interesting stuff like this; a couple of years ago they reproduced a politician's fingerprint just using photos of her hands [1].
This kind of stuff turns biometrics from something "you are" (your fingerprint, your iris) to something "you have" (a fingerprint on a glove, a picture of an iris) making biometrics often very trivial to bypass.
Fingerprints are even worse. They are all over your phone. So if someone steals it the key is already included.
Fingerprints are something you leave all over the place right now, and with the increased camera placement and tracking done everywhere, pictures of your iris won't be much better for long. So neither is something you are or have; they are something everyone you ever passed on the street has access to.
Using a fingerprint for security is like writing your password on everything you touch. Using your iris is like walking around with your password written on your forehead.
> The Samsung Galaxy S8 is the first flagship smartphone with iris recognition.
That's not quite true: the Lumia 950, Lumia 950 XL and HP Elite x3 came out a lot earlier than the Galaxy S8, and all of them use iris recognition (still undefeated, by the way).
There could be a bit of a bias: Lumias and the HP Elite x3 aren't anywhere near as popular as the Samsung Galaxy S8, the same way most malware targets Windows rather than Linux/macOS.
I am curious how much eye damage these systems can cause. The S8 gives a warning before you activate it that you should not place the phone too close to your face.
How bright is this infrared light, and can it cause eye damage even though we can't see it?
A nice campfire that noticeably warms your face when you look at it probably gives off a couple orders of magnitude more IR than the phone. IR can damage eyes [1], but the phone probably won't contribute significantly.
In general yes, invisible things can burn your eyes but it's probably of no concern here. The more common thing you'll run into is things like cheap DPSS green lasers that output a large amount of IR, you don't have a blink reflex for things outside of your visible range and these will cause damage on the higher end.
Anecdotally I do not notice any eye discomfort while using this n times per day. It's also not "bright" from my perspective but someone with this system would better answer the eye damage question.
It's unlikely to be dangerous (see other answers), but "eye discomfort" is a bad yardstick: the retina doesn't have nociceptors, and any discomfort would only be secondary, appearing long after the damage is done. Witness the few people who manage to hurt their eyes observing each solar eclipse with binoculars.
Whilst it's certainly valuable to make people aware of the limitations of the security systems we use, this shouldn't really come as a surprise. If someone is close enough and motivated enough to take a high-res photo of your face just to access your mobile device, they're also probably close enough to film you typing in your passcode. Sure, you might do that less often, but for an average user, is either of those things a real concern? The security model hasn't really been "broken", because if someone steals my phone they don't have access to the device by default.
> If someone is close enough and motivated enough to take a high-res photo of your face just to access your mobile device, they're also probably close enough to film you typing in your passcode
Not sure about that. All of my friends, family and work colleagues are 'close enough' to me to take a high res photo of my face (and I'd gladly let them do it), but none of them can see my passwords when I'm typing or unlock my phone without my permission. For me this revelation is of a big concern.
With a 4k camera and decent light conditions that could be possible in public space. Whereas you can avoid entering your passcode in public spaces (or shield it well enough).
It shows again that for people with very valuable data (where others would spend significant amounts of money to get data), passwords remain the only secure way.
We should understand the attack model better and the likelihood of a successful eye capture.
We know that it's certainly easier than learning someone's password under a semi-sane password policy, but harder than scraping Facebook selfies and printing them.
Thinking out loud, I certainly don't share my phone password with my eye doctor, so there's one example of disclosure.
> But biometric authentication does not fulfill the advertised security promises
This is completely out of context. For the average smartphone user, iris recognition on a phone (just like Touch ID) vs. a disabled PIN is a huge step forward.
> The patterns in your irises are unique to you and are virtually impossible to replicate, meaning iris authentication is one of the safest ways to keep your phone locked and the contents private.
Also your pin disabled argument doesn't make a lot of sense. That's like saying 123456 is a good password because many people disable the password prompt at login.
Yes, but that doesn't say anything about the security of passwords in general. (The same way that bad iris recognition being better than no auth at all doesn't say anything about the security of iris recognition in general.)
I can obtain an image of you online, while I need to be on-site to spot you typing your pin. So if I have your phone, I can do research at home to break in.
Additionally, you cannot change your iris once it's compromised. This is an absolute no-no for secure systems! Changing your pin is easy.
This is definitely not a huge step forward. And, as already mentioned, the average user gets misguided by exaggerated marketing promises.
The trouble with statements like 'for the average xyz user' is:
1. 50% of your users won't have their needs met - that's a large proportion, assuming a uniform distribution
2. We can't be sure a uniform distribution is appropriate in the first place
3. If we're going to assume an average user, then why don't we assume an average phone too: if the average user gets by without something today, then why bother building it as a new feature?
In the end a product should not be designed for an average user. It should be designed for a well-defined audience whose needs will be met well by the product. If you're going to bother with fancy biometric tech as a feature and selling point, then you're clearly NOT aiming at the average user, who couldn't care less...
These are general consumer devices and there are trade offs. The most sophisticated hacker the vast majority of users must defeat is a prying family member or friend.
Retinal scans are somewhat more difficult to fool (it's hard to photograph a retina from a distance), but the scanners are too unwieldy to fit in phones and lay people are queasy about lasers scanning their retinas.
I realize that to some it may appear obvious, but quite often the obvious is overlooked as people respond to the hastily promoted propaganda around a system and become emotionally entangled in it instead of holding to reason.
Having prefaced my response with the above clarification, such an outcome should be expected rather than being unexpected. There's no such thing as a totally secure and uncompromisable system. Any system can be compromised. Where there's a system, there's a way to compromise it.
When all is said and done, what can reasonably be expected is a system that's as secure as it can be reasonably made and a genuine effort to patch vulnerabilities as quickly as humanly possible.
Daugman has left the job incomplete; there is still space for research. "Liveness" checking is, of course, a challenge in such cheap and simplistic setups.
Anyhow, for operator-attended application scenarios it is still OK. On the other hand, I see biometrics as a convenience feature in physical or logical access control scenarios (as long as the security level is at least equal to or higher than conventional methods).
The important thing is how much more or less difficult this is than spoofing a fingerprint. If it is significantly harder, I still see it as a win for Samsung's security.
Spoofing the eye is significantly easier than spoofing a fingerprint, because modern fingerprint technology detects whether a fingerprint is "alive" by looking at sweat pores, pulse, veins under the skin, and other features of a living finger.
Well, maybe. Usually flagship phones get the best (most expensive) chips; budget phones get knock-offs.
It depends on which fingerprint hw/sw the phone is using. There are about 4 large players in the phone fingerprint chip space that have 90 percent of the market.
Fingerprinting is heavily patented so all four vendors have their pros and cons.
A few dozen small players compete for the remaining 10 percent of the market. They probably don't do much of it.