Forcing suspects to reveal phone passwords is unconstitutional, court says (arstechnica.com)
284 points by LeoNatan25 on Sept 25, 2015 | 187 comments



All we really need is a real passcode, and then a second special passcode that wipes the phone instantly if typed.

So my normal code might be 123456, but if someone asks what my code is and I say 345678, then the phone does a data wipe that isn't obvious from the outside: it just deletes all credentials, cookies, history, documents, etc.

Is this workable?
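A minimal sketch of the proposal (the codes and the `unlock` handler are invented for illustration; a real implementation would live inside the OS lock screen):

```python
# Hypothetical unlock handler with a second, destructive passcode.
# REAL_CODE and DURESS_CODE are placeholders, not a real phone API.
REAL_CODE = "123456"
DURESS_CODE = "345678"

def unlock(entered: str, storage: dict) -> bool:
    if entered == DURESS_CODE:
        storage.clear()   # silently drop credentials, cookies, history, documents
        return True       # behave exactly as if the unlock succeeded
    return entered == REAL_CODE
```

The key property is that the duress path returns success, so nothing visible distinguishes it from a normal unlock.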


Not a new idea. There has long been a meme going around that if you enter your ATM PIN backwards, it will let you access your bank account but silently calls the police to your location (AFAIK it's just an urban myth). False passwords have a much older history than that.

Investigators are going to take a very dim view of such events. They probably didn't get to the point of demanding access without a documented, sensible reason to believe the evidence is there, and they may very well end up with actionable proof that you destroyed evidence, which will not turn out in your favor.


"AFAIK it's just an urban myth"

Your doubt tells us that your PIN is not a palindrome. The second most common PIN is 1-1-1-1, followed by 0-0-0-0 (says http://www.datagenetics.com/blog/september32012/ ), and looking at the top 20 numbers, over 10% of people have a palindromic PIN. (If PINs were selected randomly, that should be 1%.)

So either the police get called a lot, or there's some special flag that says "palindrome PIN - false alarm" (and it must be supported for international cards) ... or it's simply not true.
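The baseline rate for randomly chosen 4-digit PINs is easy to check; it comes out to 1%:

```python
# Fraction of all 4-digit PINs that read the same backwards,
# i.e. how often "PIN entered backwards" equals the real PIN.
pins = [f"{n:04d}" for n in range(10_000)]
palindromes = [p for p in pins if p == p[::-1]]
print(len(palindromes) / len(pins))  # -> 0.01, i.e. 1% of random PINs
```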


Excellent reasoning. On my home security alarm, if you enter your PIN +1 or -1, the alarm disables and it silently sends a distress signal to the police. It's an advertised feature.


My home security system has a "duress code" that will disable the alarm sound and silently alert the company. It's not the reverse of the normal one, it's a different one that can be set.


Our system has the same thing. I can see how this would be a useful feature across other devices/technology – If I sign in this way, do this, etc.


I wonder what you're supposed to do when the guy holding the gun says "Give me the alarm code. Good. Now give me the duress code."


You give him any code. It's not like he is going to use it. If there is a pattern like ChuckMcM's where it's +1, then give him that one. Of course, in all probability this scenario will never happen and the xkcd wrench gets used instead.


How likely are you to actually remember the code when you're under duress, though?


More than 0%, which makes it more useful than not.


You could make it an easy one. Say your normal code is 5829. The panic code should be something like 1111. You'll never key it in accidentally, and people have codes like that, so it won't tip off an assailant.


the thing is, it's not the worst thing in the world if a bad person guesses it.


The ATM feature you mentioned is not totally apocryphal. It's software called SafetyPIN, which has apparently been patented but never commercially adopted.


We used to have something similar in a previous workplace of mine. Doors were unlocked by touching an RFID badge to a reader and then entering a PIN. If you entered PIN+1 instead, everything worked as normal and a silent alarm was triggered.

The same place also had a security system for a critical data center room: Opening the door from the outside (i.e. entering) would increment a counter by one, opening it from the inside would decrement it. If the room, according to that counter, was supposed to be empty, any movement within would trigger the same silent alarm. The system worked safely up to the point where two people entered together, one of them left the room and the other one's badge had a temporary malfunction. We were quite impressed with how quickly a large assortment of police would show up, guns blazing and all ;-)
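The failure mode described is easy to reproduce with a sketch of that counter logic (the class and method names are invented for illustration):

```python
class RoomMonitor:
    """Counts occupants by door direction; motion in an 'empty' room alarms."""
    def __init__(self):
        self.occupants = 0

    def enter(self):           # door opened from the outside
        self.occupants += 1

    def leave(self):           # door opened from the inside
        self.occupants = max(0, self.occupants - 1)

    def motion(self) -> bool:  # True means: trigger the silent alarm
        return self.occupants == 0

# Two people walk in together on one working badge...
room = RoomMonitor()
room.enter()          # counter: 1, actual occupants: 2
room.leave()          # one person leaves; counter: 0
print(room.motion())  # -> True: the remaining person trips the silent alarm
```

The counter tracks badge events, not people, so any tailgating desynchronizes it from reality.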


Always having +1 as the duress code seems like a bad idea. What if a criminal asks for your code and then types in the code you gave minus 1? If it works, he now knows you tried to set him up. If it does not, he can still try the PIN you gave him.


I hope the bad guys don't read hacker news.



FFS, what a terrible patent[1]. It really is basically "hey, what if people put their PIN in backwards; we could call the cops" padded out to many pages. How any patent examiner could read this and think "yes, this is novel and obviously required effort worth protecting" is beyond me.

1: https://www.google.com/patents/US5731575


This is a frequent misunderstanding, but the patent system is not subjective. It cannot be, really, if you think about it. It is a legal system and hence has to be as objective as possible.

So patent examiners cannot just look at something and say, "this is crap, no patent for you". They have to prove a patent application invalid. There are many ways to do this, but the most common is by showing sufficient prior art. If they cannot find one or more previously published works that disclose each and every element of the claim, they must allow it. Apparently the examiner could not find evidence of somebody thinking of this before, and so it was allowed, regardless of how novel it appears to us now.

Another way to find an application invalid is to show that it does not disclose enough detail about how to implement the invention. That is probably why this patent (and most others) are really long-winded.


I read a summary on Wikipedia[1] and that helps. But why should the fact that someone didn't think of (or document their thoughts on) something be evidence of non-obviousness?

I think of touch UIs and the patents there, for example Apple's rubber-band scrolling patent[2]. Sure, it's quite probable that no one thought of this before if they weren't designing touch UIs. And even if they were, they might have thought of that, plus several other ideas, while developing. Why should this make any difference?

For instance, no one has made scrolling that intentionally segfaults if you scroll fast enough (or insert other silly thing here). You won't find prior art on this. Should it be eligible for a patent just because it's novel? There should be some sort of criteria where the effort required to invent something is taken into account. If it's likely to come about merely as a result of playing in the space, then what does the public gain by issuing a patent?

What knowledge is contained in "reverse PIN dials the cops" that merits protection? Even if nobody had that idea before (or bothered to document it), what does that matter? If your patent can be constructed just by asking a simple question ("think of some ways to raise an alarm when being robbed at an ATM"), well, that should be grounds for it not being valid.

Edit: Another example. Things like algorithms. Look at a simple database indexing system: ISAM. Sort your data, sample every so often to form an index. Repeat if the index is too large. Ta-da. Going back far enough, this was very novel. But any worker that had to figure this problem out would arrive at the same solution. It's nearly as fundamental as binary search.

1: https://en.wikipedia.org/wiki/Inventive_step_and_non-obvious...

2: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=H...


> Should it be eligible for a patent just because it's novel? There should be some sort of criteria where the effort required to invent something is taken into account. If it's likely to come about merely as a result of playing in the space, then what does the public gain by issuing a patent?

One line of reasoning is that there are not enough people "merely playing in some space" to find more solutions to problems. This is not just theoretical. There is empirical evidence, based on historical data, showing how the introduction of patent protection to previously ineligible arts influenced innovation: innovation increased in areas where protection had previously been unavailable, or where it had been too easy for competitors to rip off inventions. See http://www.jstor.org/stable/4132712 for instance.

> If your patent can be constructed just by asking a simple question...

Many scientists, mathematicians and engineers will tell you that the best way to solve a difficult problem is by framing it correctly. Put another way, you have to first ask the right questions. The solution may become obvious then, but it may be very hard to first ask the right question. And hindsight is a very powerful bias. What may seem like a simple question today may not actually have been so before it was posed.

This is also one of the many problems with requiring effort as a criterion. Besides being difficult to quantify in general, how do you measure the effort required to formulate the right question? How do you quantify flashes of insight? Maybe a person has 20 years of experience in some field X, but sees a problem and reaches a solution in 20 seconds. Was the effort required 20 seconds or 20 years?


If that weren't an urban myth, it would suck to be one of those 1% of suckers with palindromic PINs.


12% of suckers


oh man, I'd get killed because it'd be pretty obvious when I start mouthing a 12-digit pin to try and figure out what it is backwards.....


You have a 12 digit ATM PIN? I didn't know you could do that and I'm 99% certain that card won't work anywhere overseas.


Yeah, I just looked and that appears to be true.

Wells Fargo and a couple of others allow maximums anywhere from 8 to 16 digits, though I reduced mine to 12 because some ATMs were kind of finicky at the longer lengths.

Luckily (hah) I work for a US company and don't get enough time off to actually go overseas, so it's never been something I had to worry about.

Edit: It doesn't seem so weird anymore, either. I used to get questions all the time from cashiers, like "Wow, I didn't know your PIN could be longer than 4". But I haven't been asked about it in 1-2 years.


My credit union assigned me a 5-digit PIN... most ATMs will accept it, but for those that don't, the first 4 digits work :/


Awesome, my pin is 4444


Not really, no, because it turns out cops aren't actually stupid. When they arrest you and take your phone (or anything electronic, really) the first thing they do is clone the memory. You might succeed in wiping out a copy by giving them the alternate password, but that's just going to add to your charge list.

There was a case a few years back (discussed on HN) where a gang had software installed on everyone's phone that caused a remote wipe when activated. The cops did a big raid, and even though they took everyone's phone someone they hadn't caught yet was able to wipe them.

The cops changed their SOP so that when they get ahold of your phone the first thing they do is yank the battery.


> When they arrest you and take your phone (or anything electronic, really) the first thing they do is clone the memory

Depends on the situation. If it's a raid due to software piracy, then probably. If they pulled you over for speeding and arrested you, they probably won't, or won't even know what to look for.

And I really can't think of a way to clone a device like an Android phone without unlocking it. ADB nowadays requires your explicit permission from a prompt. And if you're thinking "oh, they have ways", I would be very interested in that, because it sounds like whatever they're doing uses an exploit or some sort of backdoor.


Behold the Samsung Anyway Jig: http://forum.xda-developers.com/showthread.php?t=1629359

The link is a few years old. No idea if there is a current version that works on modern phones, but it seems reasonable to assume there would be.


I found this article [1] and I already suspect FUD.

The first part talks about Bluetooth pairing with the device - which requires unlocking. The second page talks about unlocking an iPhone with plist files from a synced computer. So it's not a matter of some magical device that can backdoor a phone - but rather using interfaces that already exist.

I found this company [2] and it has the usual marketing ploy - but a quick Google search doesn't reveal any actual reviews from people using it. I can find paragraphs of marketing spin, but no one actually saying "we used this to get into cell phones that were password protected".

I'm not saying it's not possible - I just find it hard to believe without it making modifications to the underlying software (i.e. flashing a ZIP on Android that zeros out the PIN/password or something).

[1] http://arstechnica.com/tech-policy/2011/04/michigan-state-po...

[2] http://www.cellebrite.com/Mobile-Forensics/Products/ufed-tou...


>And I really can't think of a way to clone a device like an Android phone without unlocking it. ADB nowadays requires your explicit permission from a prompt. And if you're thinking "oh, they have ways", I would be very interested in that, because it sounds like whatever they're doing uses an exploit or some sort of backdoor.

No, they don't need to use any kind of exploit. They have hardware that allows them to clone the device's memory. I doubt it has to be on at all.

There's no way to secure a device if the attacker has physical access. The best you can do is secure the data with encryption.


How do you yank the battery and clone memory on an iPhone?


You don't. You drop it into a Faraday bag and then later mess with it in a Faraday glovebox.


What I meant is they yank the battery as soon as they get the phone. I believe you can clone the memory without powering the device if you have the right hardware. If not they could always disconnect the antenna before powering up.


The question stands. How do you quickly yank a battery from an iPhone?


You don't do it quickly.

But what I wonder is how they circumvent the entire trust mechanism when the phone is locked. When the phone is locked, most of the storage is encrypted too.


I wasn't trying to say they can somehow get your plain text - this is just a copy of encrypted data. What they can do is keep you from tricking them into destroying the data they have.


Yeah, but isn't the whole point of the secure enclave on the iPhone so that the decryption has to be done on device?


Yes. The A7 came out after anything I've read on the subject, so it may be more difficult. But really all this means is they need to get the copy without destroying the phone and have some way to put it back. The point isn't to brute force the data somehow, but rather to protect it from being destroyed while the wheels of justice crush you.

I've always thought the way to deal with this is to use an OTP scheme. If you have a one-time pad that's as large as your data set (assuming we're talking about some reasonably small number of critical documents here), you could generate the ciphertext from your key and then generate another key that translates your ciphertext into something innocuous - grocery lists or whatever.

There's no way the court could prove the key you gave them isn't the right key.
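The two-key trick works because XOR with a truly random pad makes every same-length plaintext equally plausible. A sketch (the messages are made up):

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

secret = b"meet at the docks at midnight!"
cover  = b"eggs, milk, bread, coffee, tea"   # innocuous text, same length

pad = secrets.token_bytes(len(secret))       # the real one-time pad
ciphertext = xor(secret, pad)

# Derive a second "key" that decrypts the SAME ciphertext to the cover text
fake_pad = xor(ciphertext, cover)

assert xor(ciphertext, pad) == secret        # the real key
assert xor(ciphertext, fake_pad) == cover    # the key you hand over
```

Since any plaintext can be "decrypted" out of the ciphertext with a suitable pad, there is no mathematical way to show which key was the original.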


It'd probably be best to do this in a format that can plausibly hold such extra data (assuming you generate more operational data than you do plausible cover data). An encrypted disk file is probably a good bet. So long as it's not too egregious, it'll probably be OK. "Yes, I have a 64MB encrypted disk, even though my working set is only 2MB."


I'm pretty sure there's a lead you can cut.


In fact, I wager that you could find a spot on any phone model that you could hit with a narrow drill bit to disconnect the power. You could make a jig that you put the phone into, which guides the drill bit into just the right spot.

It would only take seconds to use, but you would need to be prepared for the specific phone model.

Maybe you could even have a CNC machine preloaded with the data for a wide variety of phone models. You put the phone down on the work surface, key in the model, and the CNC machine deftly cuts through the right power lead. Less portable, but would require less precise planning.


I'm pretty sure you just made that up.


I'm a little confused here. Are you trying to say there's no way you can disconnect the battery of an iPhone even if you don't care whether or not it works when you're done? Really? I begin to doubt your fitness for this conversation.


When you say 'memory,' are you talking about volatile or nonvolatile memory?


Nonvolatile, of course.


I was under the impression that you couldn't run a bruteforcer against these phones because the key is kept inside of a hardware security module. Couldn't the duress code just wipe the key in the HSM?

If they can copy the HSM, that wouldn't help, but in that case, it doesn't really seem like they need your help getting the PIN in the first place.


>I was under the impression that you couldn't run a bruteforcer against these phones because the key is kept inside of a hardware security module. Couldn't the duress code just wipe the key in the HSM?

That's something I hadn't considered. Is it set up so that you can create and destroy the keys but never get access to them?


You're assuming your device is still in your control and hasn't been imaged. This isn't how a proper forensic process actually works.

More important to the issue at hand, I believe that in at least one case the reason the passcode/phrase counted as "testimony" isn't so much that you're revealing what's in the locked container, but that you're demonstrating that you have access to/control over its contents. So entering your erase code would undermine this point. A courier could plausibly not know a passphrase yet be deputized to erase the contents, but that is going to be an uphill argument.

What we really want is a proper layered steganographic filesystem, with an arbitrary number of unlockable levels. But we need an OS and apps that play nicely with that as well.


And then they get you for destruction of evidence?


Wouldn't they have to prove that there was evidence pertaining specifically to the current case to get you for destroying evidence?


It must be the case that evidence can be either damning or exculpatory, and either way you're not allowed to destroy it. Otherwise destroying evidence would be incentivized: by the time they can actually prove that the damning evidence was present before you destroyed it, that proof is as good as having the evidence in the first place.

A judge isn't a robot. If you say "I hit an emergency button to reformat my hard drives as the cops were breaking down the door, because I was preparing to sell those drives on ebay and needed to clean them up", the judge is not obligated to respond "Shoot, I can't prove otherwise, you're free to go."


"What the hell did you people do to my computer? Where's all my shit? I'm going to sue you assholes!!!"


That's probably a decently clever approach. Thank goodness most criminals aren't clever. Or at least, so we assume based on the percentage of them who get caught.... Oh dear.


Not for "anticipatory obstruction of justice". It doesn't matter whether what you destroyed is important to a case or not. There does not even need to be a case at all. You can still be charged if you deleted something, and the maximum penalty is 20 years' imprisonment. Hanni Fakhoury, an EFF attorney, gave a talk about this at ShmooCon 2012[1].

[1] https://www.youtube.com/watch?v=LzssKvRwrzQ


Doesn't this mean that even clearing your browsing cache is a crime? I deleted temp files recently because they were taking up a bunch of space. I did it to free up space but there could've been evidence there for some investigation (maybe some virus on my machine left evidence of who wrote the virus in temp files that would've allowed the feds to track down the creator).

This seems like one of those laws that sounds good at first but makes a lot of normal behavior illegal, thus allowing the legal system to be able to pick and choose who gets punished.


The burden is on the prosecution to prove that "The accused acted at least “in relation to” or “in contemplation’” of such issue or matter." https://en.wikipedia.org/wiki/Obstruction_of_justice#Obstruc...


That's not hard. At the time of arrest, suspect was observed using Facebook. Here are the correlating logs from Facebook. After unlocking suspect's device using provided code, no Facebook login credentials were present, thus proving that the suspect used the erase code and not the usual code.


My client favors using private browsing mode for a number of reasons, including preventing friends from being able to access his personal accounts and lowering the risk of tracking and accidentally acquiring viruses.


Here are the Facebook logs from the past two weeks all using the same session token, demonstrating that your client is lying. And now when you try to explain that ok, maybe some information was deleted but that's only because it was embarrassing, and it wasn't evidence of any crime whatsoever, you'll have zero credibility...


They could almost certainly get you for interfering with an investigation, though.


Exactly. Unless the device has untamperable auditing running behind the scenes that shows stuff was deleted, there's virtually no way for the police to know.


Probably not. Erasing leaves marks.

The only marketable purpose for an instant erasure system like that is protection of information (personal or commercial). To build that, you'd make something that zeros all the files and deletes them, deletes and overwrites all the contacts, and such.

But the structures left after that don't look like a brand new device. They look like you had a bunch of things and then erased them. Cops won't know what you erased, but they'll know you handed them a phone that was recently erased.

Now maybe somebody will build an app tuned for obstruction of justice, so it tries to make the phone look brand new. And maybe you'll be very lucky and they'll get it working perfectly on your specific phone. But then you have to explain how you have a brand new phone that was actually purchased a year ago. And how it has no record of any of the calls that your phone carrier will have records of. The obvious conclusion is that you wiped it sometime after your last call and with special software that only appeals to people planning on hindering an investigation.

Is that enough to convict you for obstruction of justice? I have no idea; it probably depends on how much a prosecutor cares. But is it enough to convince cops you are vigorously hiding something? You bet.


> make something that zeros all the files and deletes them, deletes and overwrites all the contacts

No, there's no need for anything that complicated.

In principle, encrypting all data on a phone is really simple to implement. In practice, it's carefully thought out to avoid edge cases. The general idea is something like this:

   1. when first activated, the phone generates a random 256-bit AES key

   2. the phone uses this random AES key to encrypt all data stored on the phone

   3. the phone retains this random key in a special location, and encrypts this key using the user-provided PIN
To quickly erase all data on the phone, all that's necessary is to overwrite the key in the special location with random data. From that point on, there is no feasible way to recover anything on the phone. Period.

It isn't necessary to erase an entire device. It's merely necessary to replace a 32-byte field (that contains the true AES key) with 32 bytes of random data.

From then on, it doesn't matter what the PIN is. Data on the phone is gibberish unless and until the proper 256-bit AES key is produced. That key no longer exists, so from that point on the only way to recover the data is by brute-forcing AES, trying all 2^256 possible combinations.

> They look like you had a bunch of things and then erased them

No, what remains is indistinguishable from the case where the correct PIN hasn't been provided. Having "things" on the phone is no evidence of guilt. There is absolutely no evidence that the phone was erased. All that is known is that the provided PIN isn't able to decrypt the data.
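A stdlib-only sketch of that key-wrapping idea, with XOR standing in for AES (a real phone would use an authenticated cipher and a hardware-protected key store; the salt and function names here are invented):

```python
import hashlib
import os
import secrets

SALT = b"per-device-salt"  # illustrative; real devices use a unique random salt

def derive_kek(pin: str) -> bytes:
    # Key-encryption key derived from the user's PIN
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, 100_000)

def activate(pin: str):
    data_key = secrets.token_bytes(32)   # random 256-bit data key
    wrapped = bytes(a ^ b for a, b in zip(data_key, derive_kek(pin)))
    return data_key, wrapped             # only `wrapped` is stored on flash

def unwrap(pin: str, wrapped: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(wrapped, derive_kek(pin)))

def crypto_erase(_wrapped: bytes) -> bytes:
    # "Wiping" is overwriting 32 bytes; the bulk data is never touched
    return os.urandom(32)

data_key, stored = activate("1234")
assert unwrap("1234", stored) == data_key   # correct PIN recovers the key
stored = crypto_erase(stored)
assert unwrap("1234", stored) != data_key   # even the right PIN now fails
```

After the erase, a wrong PIN and a destroyed key produce the same observable result: the data simply doesn't decrypt.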


Good point. If the phone OS supports encryption and allows this sort of auto-destruction of keys, then that's a lot better.

What we were discussing is abakker's proposal for something that "wipes the phone", and I think my comments are still relevant to that approach.

Of course, a suddenly unreadable phone is still suspicious, but if your plan were perfectly implemented, it might be impossible to prove obstruction of justice.


It depends. Is there any legal requirement that the device actually record a call log in the first place?


Not sure how that matters. Is there any legal requirement to keep your receipts? No, but if you suddenly throw them out because you think the police might find them useful, that's still obstruction of justice.


Interestingly, I didn't destroy any evidence. I just verbally said something, and they took an action which destroyed the evidence. It was their action that destroyed it, not my words.


Clever! A judge will surely look favorably on your ingenuity in making the technical facts of your actions not a violation of a pedantic reading of the law, and disregard the intent of your actions.

Similarly, did you know you can legally murder anyone by poisoning their food? You didn't make them eat the poison, they voluntarily ate the food. It gets ruled a suicide.

For more on this and other little-known facts about the law, please subscribe to my YouTube Guide to Being a Sovereign Citizen, only $199/month.


If you share the information-destroying code, then you're the one who caused the destruction of evidence after an investigation is underway. Which you don't want.

Perhaps some software that clears the phone when someone tries to break into the phone / copy data? A prudent security precaution for all sorts of reasons.


I really don't see why not. My home security system has a similar code: it disables the alarm but signals to send police now, without calling.

I could even see it with fingerprint sensors - say, teaching it that the middle finger wipes the puppy.

With regards to the court decision, I am not sure it will stand in this context. The phones belong to the employer and not the employees. So how, as an employer, do you retain some access to your provided phones? I can see both sides here. To be honest, as a company I don't think it's worth the legal ramifications to have power over the content of the phone, as it just opens a can of worms.

Where I work, we are required to passcode our phones if we access the corporate internet or Exchange servers, but we are not required to divulge our phone contents or passcode, nor to place software on the phone giving the company any such ability.


Or after 3 unsuccessful tries, the system wipes itself. Try: macworld, oops...macw0rld, oops... macwor1d!

There's no law against incompetence, true?


A: They'll probably try to clone the memory. B: If a judge thought you were doing this intentionally, they'd hit you with contempt (or other things).


A more fun solution would be to count the time it takes to enter the password. If it's too slow, then wipe the data. Or more advanced, count the tempo, beat and duration of key presses. "Your honor, my password is 1234121234... but be sure to type the 4's as dotted eighth notes, take a rest after the 2's, and moderato!"
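A toy version of the timing check (the threshold and function are made up; a real lock screen would record keystroke timestamps itself):

```python
# If the whole passcode takes too long to type, assume coercion and wipe.
TIME_LIMIT = 5.0  # seconds; an assumed threshold, not a real phone setting

def evaluate(entered: str, code: str, key_times: list[float]) -> str:
    # key_times: timestamp of each key press, in seconds
    if key_times and key_times[-1] - key_times[0] > TIME_LIMIT:
        return "wipe"    # typed too slowly: destroy the data
    return "unlock" if entered == code else "reject"

print(evaluate("1234", "1234", [0.0, 0.4, 0.9, 1.3]))  # -> unlock
print(evaluate("1234", "1234", [0.0, 2.0, 5.5, 9.0]))  # -> wipe
```

Checking inter-key rhythm rather than total time would be the same idea with a per-interval threshold.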



port-knocking to a funky beat


> Is this workable?

I know in the Windows Mobile/BlackBerry days, if you typed in the incorrect password too many times, it would reset the device.

However, many Android devices nowadays have a "guest mode" that is activated using a different unlock code. This mode can be limited so that it can't even make or receive calls. Arguably, most people won't know what this "mode" is or whether they are in it.


Or alternatively, you can have multiple layers of encryption a la Truecrypt Hidden Volumes (http://www.howtogeek.com/109210/the-htg-guide-to-hiding-your...)

The first layer contains something embarrassing but legal like gay porn, and the second layer contains the stuff you really want to hide.

You just unlock the first layer and act really embarrassed if forced and never acknowledge the existence of the second layer.


Police do know about this and can (hypothetically) check to see if there is another encrypted volume inside. Also, didn't TrueCrypt shut down because they had some huge security vulnerability?


Most implementations so far have been kind of shaky (e.g. relying on having a bunch of "unpartitioned" space on the drive, which would seem weird), but it's technically feasible to have a single encrypted volume which decrypts to two results based on the key. For plausible deniability, one would have to make sure the clean volume had recent browsing history, et cetera. Maybe this is a real-world use for ad-browsing bots.


How can they check? It's certainly possible that Truecrypt has some implementation issues, but I can't see why it's not theoretically possible to do this well.


Here's some good reading. Looks like they can't irrefutably establish a hidden volume's presence, but there are lots of clues nonetheless.

http://security.stackexchange.com/questions/9058/is-it-possi...


Well, driving around in a $300k car that you can't explain purchasing is a HUGE clue that you're a drug dealer, but it's not enough on its own to charge you with anything. You won't get thrown in jail for refusing to cough up the location of the stash house based on the car alone.

What do the proponents of these encryption laws expect the penalties to be for 'suspicion of possession of encrypted material'? I don't see any way to create effective deterrents here without making it easy to persecute anyone for any reason. Imagine cops planting USB keys with random bits on minorities they don't like, etc.

Edit: or worse, 15-year-olds planting USB keys with random bits on teachers they don't like and claiming it's CP and then watching them get fired and go to prison for 'refusal to decrypt' shudder


Sure, everything you said is correct. But the alleged drug dealer will probably face an IRS audit and get charged with tax evasion, based on the fact that he has an unusually expensive car. The car's existence will provide investigators with a path to follow in tracking down the suspect's finances, laundering activities, income sources, etc. It's not the entirety of the case (like you said, not enough to charge you with a crime), but it is a clue that an investigator can launch from.

Just because something isn't proof doesn't mean it isn't evidence. And if I am doing something my government disapproves of, I'm going to try to be mindful of all evidence that can expose that. (Note that I'm NOT making any value judgements here.) If an inner hidden volume isn't as hidden as I thought it was, then that's a security risk to me.


My point is that either:

a) 'forced decryption' legislation is toothless. That is, suspicion, but not proof that encrypted material exists and is within your power to decrypt is not enough to throw you in jail for contempt or some other charge.

b) It becomes extremely easy for bad actors like racist cops or asshole teenagers to frame anyone and everyone they want and put them in prison forever.


The guy quoted at the end seems to think it's the passcode itself that would be incriminating, rather than the contents of the phone. Weird. I've seen a theory that compelling someone to disclose a password can be incriminating because it is the same as asking them to admit that they stored the data in the first place, and obviously there's a case to be made that compelling disclosure of the data on the phone could be self incriminating, but the idea that the password would be something like yesiinsidertraded4 is new to me.

Edit: I missed the part where he's a former federal prosecutor. Mystery solved.


This opens up the possibility for an interesting defense, what if I make my password contain the coordinates of those illegal drugs I buried?


Well, that was already covered under "no forced self-incrimination" AFAIK.


He also revised his opinion on the matter earlier today:

https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...


"If we disbarred the defence attorney, then they would't have client/lawyer privilege"


Even then, that's not retroactive. It's your conversations that are privileged, not you. Removing you as the person's attorney doesn't make the conversations you had as their attorney not subject to privilege.


The location of the body isn't the evidence, it is the body found there that is the evidence. This is why forcing someone to tell the location of the body and holding them in jail until they do is perfectly fine. Except that would be unconstitutional.


The massive difference being that you might not have buried a body at all, but your phone definitely exists.


No, there is definitely a place (in fact, there are lots of places) - and it could have a body.

The fact that you have a phone, which could contain something, doesn't make the phone case any different.


Which you might not have the password for. You could've recently changed it and forgotten it (I've had to reset phones of friends and family who have done this). Or you never locked it but someone else did. Or a number of other situations where the password, like the body, might not exist.


I don't know enough about this issue to have a real opinion. But I know that Orin Kerr is a smart guy who I usually agree with, so I'm inclined to give his analysis some weight.


Do you have any passwords that would be embarrassing to you if they came out? Can you imagine someone that breaks the law might have an encryption password that was incriminating?


That's actually an interesting thought:

If my password is "iKilledColonelMustardWithACandlestick" - and it's actually an incriminating fact, how does that factor into potentially self-incriminatory discovery?


This sounds too much like something Barry Zuckerkorn would come up with, so maybe not a good idea. http://arresteddevelopment.wikia.com/wiki/Barry_Zuckerkorn


What if it was for a very minor crime? Say you stole a quarter from a friend's home. Still a crime, no?


They offer immunity for the crime exposed by the password and thus you are required to disclose.

So make sure that your password is an admission of whatever crime for which your phone contains evidence.


They offer immunity for the crime exposed by the password and thus you are required to disclose.

Does it work that way? Can you legally compel self-incriminating speech as long as you offer immunity?

If that was true, why wouldn't it be used all the time to incriminate other people? Currently they need to get the subject to agree to such a deal, or at least that's the impression I get from the media.


This concerns me a little bit, actually.

I know there are folks here on HN who believe that they should have an absolute power to exclude the government at all times. I'm not one of them, though. Particularly in situations where law enforcement has obtained a lawful warrant, I think they should have a way to get that information. People do commit crimes, and the police do need to solve them.

One way to grant the police access is to somehow give them privileged access to the encryption. For me, this idea is dead on arrival. There is no way to grant privileged access to the police without dangerously weakening the encryption in general. I'm a believer that encryption, properly implemented without backdoors, creates a lot more good than bad.

So what does that leave? It leaves compelling the owner of the phone to unlock it. If the police get a warrant to search your house, you are legally required to unlock the door and let them in. It seems to me that a passcode on the phone serves exactly the same purpose.

So, my concern is that if compelling the phone owner to unlock is not an option, it will put a lot more pressure behind the idea of encryption backdoors, as the "only option" to give law enforcement the power they need to do their jobs.


See https://news.ycombinator.com/item?id=10271971

It's not totally unreasonable. On the other hand, there's the 5th amendment, and you are not actually required to open the door when presented with a warrant, that's just a way of saving your door.


Your argument as presented seems to conflate the right to access information and the ability to access information. (Not that you're necessarily doing this, but the way you've written it doesn't seem to distinguish.)

For example, under the law I have the right to go to the Moon. But I don't have the ability. Having the right doesn't compel anyone to create that ability for me.

Back to the issue at hand, I believe the government should have the right to access any information relevant to a criminal investigation, if they have a warrant for it. But I don't think they should be able to compel the ability. At best they should be able to compel that people don't interfere, but the actual ability to access the material is up to them. So while I don't think I fall into that category of person you describe as "believe that they should have an absolute power to exclude the government," I also think that given the current state of technology, that power does exist.

I'm not really worried about backdoors. They look to me to be so ridiculous that it won't even be possible to attempt to mandate them. Even if they are mandated, how will that be enforced? The genie is far out of the bottle on this.


You're correct to make the distinction, but when it comes to law enforcement, both rights and abilities originate in the same place: legislation.

The law gives police the right to enter your premises if they have a valid search warrant. But it also gives them the rams, guns, body armor, etc.--the ability to exercise that right.

From a practical perspective, lawmakers try to address both rights and abilities when crafting legislation that enforces laws. I think it's really unlikely that politicians would say "here's the right, but the ability is your problem." No, they'll write a law that gets the effect they want, even if means granting powers that seem ridiculous to us. There is no shortage of things that have the force of law today despite seeming ridiculous to us.

I am worried about backdoors. They are being seriously considered at the highest levels of government.


What if his house is on the moon, and the government wants him to give them a free ride (to the tune of a few billion) to the moon to execute the search warrant? I would argue that he's compelled to turn off the giant laser cannon to allow the police to get there without being destroyed, but not to let them use his personal craft.

That's the problem with encryption, it's totally unlike anything else that exists in the world. It's impossible (so far) to prove whether someone legitimately forgot the password, or if they're just saying that they forgot it in order to hide/destroy evidence.

What other thing quite literally may or may not exist, depending on the state of a person's mind?


Consider that politicians could just outlaw building houses on the moon entirely, to avoid this problem. Just like they could mandate backdoors in encryption if they believe it is the only way to give police access that the politicians believe they should have. EDIT TO ADD: I believe this is a bad idea but I am concerned it could happen.

Also in terms of a phone, it's easy to tell if it is encrypted, because modern phone operating systems encrypt by default. So the question is whether a person can be compelled to give their password.

If the person wants to assert that they cannot provide the password because they forgot it, that is a defense that can be evaluated in a trial. Is the person lying, or did they really forget it? The court system exists to resolve questions like that. EDIT TO ADD: I'm just making the point that the possibility that someone might claim they forgot their password does not in any way prevent the passage of a law requiring people to provide their password. People try various tricks to get around all sorts of laws--that's why we have a court system.


I don't see how the contents of a person's brain can be discovered by a court. That's a fact which (so far) can only be known by the defendant. Even if a week ago they showed someone that they knew the password, they still could have forgotten it in the last week.

It'd be like trying to determine with 100% accuracy what a person's favorite color is. No matter how many witnesses you call and how many decades of documentation that it's blue you have, that doesn't make it blue now. It just means it was blue then.


Intent is in the brain and criminal trials do try to discover it.

Oscar Pistorius shot and killed his girlfriend. "I thought she was an intruder; I did not mean to hurt her," he said (paraphrase). The facts were not in doubt; the entire case hinged on whether the jury believed he was telling the truth.

I agree that there is no way to recover a passcode from the brain (outside of science fiction). But it seems like a court could rule on whether claims of forgetfulness are real.


There's a huge difference between shooting a person and forgetting a password.

Further, any random assortment of bits is potentially an encrypted volume. At least with a murder there is a body. With a "forgotten" password -- or a password you never had to begin with because there is no encrypted volume -- nothing bad has to happen. So it's far more likely to be abused to put people behind bars that "we know did it" but where the police or whoever can't get them on anything else. Like how mobsters often go to jail for tax evasion rather than the violent stuff.
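That point is worth making concrete: a competent cipher's output is designed to be statistically indistinguishable from random noise, so no test on the bytes themselves can prove a blob is (or is not) an encrypted volume. A rough, purely illustrative sketch in Python, comparing byte entropy (the data and sizes here are made up for the example):

```python
import math
import os
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte distribution; 8.0 is the maximum."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A blob of pure noise, like what a "planted" USB key might contain.
noise = os.urandom(64 * 1024)

# Ordinary text, for contrast.
text = (b"the quick brown fox jumps over the lazy dog ") * 1500

print(f"random bytes: {entropy_bits_per_byte(noise):.2f} bits/byte")
print(f"plain text:   {entropy_bits_per_byte(text):.2f} bits/byte")
```

Well-encrypted data scores just as close to 8.0 bits/byte as the random blob does, which is exactly why "that's just random data" and "that's an encrypted volume you refuse to unlock" cannot be distinguished from the bytes alone.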


Intent is a factor in almost all U.S. criminal prosecutions, even people who are being prosecuted for shoplifting and say they just forgot to pay.

In terms of the encrypted volume, law enforcement has to prove each step on its own. They can't seize the computing device until they prove it is actually likely to contain evidence relevant to the prosecution of a crime. And they can't ask a suspect for their passcode until they prove that there is actually an encrypted volume to be unlocked (this is probably easier than you think). And if the person says they forgot the password, law enforcement would have to prove that they are lying.

Courts already handle similar situations with contempt and perjury proceedings. Even though criminal contempt or perjury is usually secondary to the "main case," it has to be proven beyond a reasonable doubt just like any other crime.

That's why it's not "abuse." It's not like law enforcement can magically lower the bar for conviction by some special procedure. When Al Capone went to jail for tax evasion, he really had evaded taxes, and the government proved it beyond a reasonable doubt.


I'm sure Capone really had evaded taxes. But the idea that the government couldn't get him on anything besides tax evasion is pretty sad.

The idea that the police or prosecutor has to "prove" all this stuff is nice, but not in line with reality. Warrants get issued on suspicion all the time. There's not a pre-trial trial to determine in front of a jury if the suspected encrypted volume is in fact an encrypted volume.

The judge orders you to provide the password and if you don't you're held in contempt for as long as you don't. Which, if you don't know the password, could be the rest of your life.

That's why some in law enforcement might be tempted to invent an encrypted volume that doesn't exist. Because it short-circuits the normal judicial process. And when you know someone is guilty but the damn red tape gets in the way, it can be tempting to try and make things "right".


A warrant can be issued on suspicion, but once the judge or prosecutor tries to prosecute for contempt, they would need to prove beyond a reasonable doubt that it was actually an encrypted volume.


> I don't see how the contents of a person's brain can be discovered by a court. That's a fact which (so far) can only be known by the defendant. Even if a week ago they showed someone that they knew the password, they still could have forgotten it in the last week.

Mental states -- including knowledge of specific facts, intent, belief, etc. -- are necessary elements of many offenses and are determined by courts in the same way as any other facts.

> It'd be like trying to determine with 100% accuracy what a person's favorite color is.

No, it's not, because even the "beyond a reasonable doubt" standard for criminal prosecutions is not targeting 100% accuracy, which would be unattainable for any class of facts (not just mental states).


If the only evidence that someone committed a crime is on their cell phone, the authorities haven't done a thorough investigation and they shouldn't be rewarded for laziness.

Let's take it one step back from electronic devices.

If a corrupt businessman kept encoded records in a notepad, the authorities could compel him to turn over that notebook, but no warrant can compel him to explain its contents. That's what encryption is. The authorities can seize the device, but they cannot force you to explain how to interpret what's in it.

Fortunately, high profile data breaches have put the thought in the public mind that weaknesses can be exploited. We can use that concern to keep people from demanding back-doors.


I'd argue that it's much more important to maintain strong privacy rights than it is to convict the minority of criminals that would be caught by indiscriminately searching their phones.

Keep in mind, almost all this data lives elsewhere too. If it's important enough they can still get it.


If a crime is so overwhelmingly in cyberspace that only the evidence on a computer would make the case, I'm very skeptical that that is the kind of crime that would impact me if it went unsolved.


You aren't legally required to unlock your door and let the police in with a warrant. They will force their way in without your consent, however, in some cases without asking nicely first.


To which law enforcement will say: "but there is no analogous way to force our way into strong encryption, so we need to create one." To reiterate: I think that would be bad. Backdoors are bad.

Given the choice, I'd rather have warrants compelling people to unlock, than a law saying that encryption has to have a backdoor so law enforcement can force their way in.


On the other hand, if you can be compelled to divulge the passphrase to any encrypted container, what happens if you forget it? What happens if you happen to have a block of random-looking data? It might be an encrypted container -- or it might be something entirely different.

If I were to go through my old hard drives, USB sticks and whatnot, I'm reasonably sure I'd stumble onto random blocks of data and/or encrypted archives with passwords long since forgotten. If I could be punished for not producing matching passphrases, those could become a huge liability, very quickly.


Plus, it would be an easy way to incriminate someone. Drop an encrypted USB stick in the person's belongings, and use that as an excuse for putting them in jail. Hell, it would be a lot more convenient than the War on Drugs.


This is a very interesting thread. I didn't think I'd ever feel differently about this issue, but you've really made me consider some other implications of locking police out of phones. Thanks for chiming in.


The real problem here is the lack of oversight on these court orders: the judge handing out the order may not have sufficient knowledge of the matter at hand, so there is huge potential for misuse by the police or intelligence agencies.

In the end it boils down to the simple dilemma of choosing between either catching all "criminals" or protecting the rights of the people.

The latter way of course makes life a bit harder, since you'll never be able to prevent all crimes and people will potentially die. But people die all the time; dying is a basic risk of life.

The former way is essentially a rabbit hole, because it allows you to rewrite the definition of a "criminal" to just about anything you'd like, and once you get a court order the "criminal" is pretty much done with their life. This was and still is used in dictatorships of all sizes and employed by numerous secret police agencies around the world. Once there is a way to criminalize any action and then instantly "get rid" of that person, people who are going to use these powers WILL pop up and seize power.

Now of course in the short term no third-party candidate is going to show up on the US political stage, win a majority overnight, and then install a dictatorship on top of all the pre-existing powers by simply outlawing all other parties and anyone who objects to the new rulers' claims. But 10 or 20 years from now things might be different, and if we don't fight abuse of the law and protect the general public, we might end up in a pretty bad situation someday.


Court orders are oversight. Police are by law limited on their own. To go further, they have to convince a judge.

The oversight on the judge? Higher-level judges. If you are arrested after a warrant that is later found to be invalid, the evidence collected based on the warrant is thrown out. This can go all the way to the supreme court. As it did in Franks v Delaware: https://en.wikipedia.org/wiki/Franks_v._Delaware


Don't forget the NSL system of secret court orders...


Given that key escrow failed eventually, what is different today that would make backdoors possible?


There is no need for a back door. Phones are not encrypted once the user accesses them. The carriers can push and pull any files to or from the phone over the air. Some of them do not require a warrant. I was always told to never ask for one.


Backdoors are an absolute bad idea.

My point is that when a bad idea appears to be the only alternative, there will be pressure (from people who do not understand the technology but do understand law enforcement) to do it even though it's bad.


> So what does that leave? It leaves compelling the owner of the phone to unlock it.

You're constructing a false dichotomy. Backdooring all encryption and compelling the accused to provide a password are not the only alternatives. And they're both terrible.

You seem to understand why backdoors are problematic.

If we require the government to prove the original crime in order to convict for refusing to testify then there is no point in making refusal to testify a crime, because any time they could prosecute for it they could already prove the original crime and don't need the testimony. The only way it helps the government is if they are also allowed to convict people who refuse to testify when being prosecuted for crimes they did not commit.

You might argue that an innocent person can be vindicated by telling the truth instead of refusing to testify, so innocent people don't need to refuse to testify. But that doesn't work when the government is playing "show me the man and I'll show you the crime." Everything you say can and will be used against you in a court of law and lying to the police is illegal. You have to be allowed to shut up or the government would be able to convict anyone they want just by compelling them to keep talking until the imperfect human in the fish bowl says something a court will accept as incriminating evidence.

So you want to make an exception for passwords. But if passwords are different than other testimony it's because we should be more protective of them. They're the ultimate fishing expedition. There is no relationship between the password and the crime. If there is no evidence of the crime under investigation whatsoever (perhaps because you didn't do it), they still get to see everything on your phone. Then they can charge you with whatever completely unrelated crime their fishing expedition uncovered and use parallel construction to bypass any restrictions on what the original evidence was supposed to be used for.

Moreover, it remains possible to investigate crimes even if you can't look at the contents of every suspect's device. You still have all the evidence supplied by the victim and witnesses and all the methods of traditional police work that existed before everyone started carrying around personal surveillance devices. Being able to force suspects to supply their passwords might help, but lots of things might help, and most of those things might help enough that it would at least slightly increase the percentage of guilty people convicted. That only tells you that some civil rights come at the cost of not convicting some guilty people. It doesn't imply that we should erase every one that does.

Roper: So now you'd give the Devil benefit of law!


More: Yes. What would you do? Cut a great road through the law to get after the Devil?


Roper: I'd cut down every law in England to do that!

More: Oh? And when the last law was down, and the Devil turned round on you — where would you hide, Roper, the laws all being flat? This country's planted thick with laws from coast to coast — man's laws, not God's — and if you cut them down — and you're just the man to do it — d'you really think you could stand upright in the winds that would blow then? Yes, I'd give the Devil benefit of law, for my own safety's sake.


Any lawyers (or wanna be lawyers) want to chime in -- is your thumbprint protected in the same way? I can see how the 5th amendment prevents them from compelling you to reveal your passcode, but does it prevent them from grabbing your thumb and using TouchID to unlock your phone? (That, or, just using a thumbprint provided during booking or whatnot...)


From what I've read in the past, it doesn't cover your thumbprint. [1]

Thumbprints are physical, so they don't get the same protections. It's kind of like having a physical key to a physical lock. It's not self-incrimination for law enforcement to take that key and use it in the lock.

Same with writing your passcode on a piece of paper. It's no longer a matter of self-incrimination if they find that and use it.

[1]: http://arstechnica.com/tech-policy/2014/10/virginia-judge-po...


The difference is described in Orin Kerr's reversal of his opinion earlier today:

https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...

Basically, having to provide something in your mind (the passcode) is testimony; a fingerprint, much like a key, is not.


I'm just in the wannabe category (not even that), but I seem to recall it being ruled that since this was just a physical thing, it could be compelled. If you're paranoid, try to shut off your phone before any such interaction, since it requires a passcode when powering on for the first time. Alternately, use the wrong finger five times in a row to lock out TouchID.


I'm glad they've narrowed the conditions under which suspects can be compelled to reveal passwords. It just seems it was too wide open before, when any minor hint of suspicion was enough (sans warrant) for these fishing expeditions. If they have reasonable suspicion, then they need to get a friggin' warrant. I just don't get why law enforcement is getting so sloppy these days.


I think it's the result of the easy-going attitude toward such things in the past, in the effort to appear tough on crime: police were given lenient rules for how they approached crimes and suspects. Those rules are now slowly being taken away as unconstitutional, but police are still trying to operate as if they were there.


I assume there is no difference between being forced to reveal a phone password and being forced to reveal a password to some other system such as a PC, encrypted container or website?


Here's a somewhat-related quote from patio11 that I often refer back to:

> Developers have a cultural quirk where they believe that, e.g., "file sharing is not theft" / "manipulating a URL can't be a crime" / "laws about disclosing protected information invariably contain a public policy exception which comports to the temperament of the dev community" are axiomatic and thereby create an internally consistent legal system which fails to falsify those axioms but also fails to meaningfully resemble the legal system we actually operate in.

> This results in developers sincerely believe things like "Your Bitcoins are unprotected by the legal system because nobody can steal a number", which is a proposition that is absurd to the legal system as "JavaScript is not a programming language" is to a programmer.

(https://news.ycombinator.com/item?id=7367312)

In other words, no. There is—and should be—nothing special about computers.


File sharing is not theft, and there's unambiguous legal precedent to that effect. Really bad example. Stop conflating two laws that mean different things.

Nothing special about computers? Okay. "Persons, papers, and effects" no longer means emails, computer files, or anything other than physical, tangible documents that existed when that law was drafted. I really don't think that's the world you want to live in, or the argument you really want to be making.

The fact that the legal system thinks there's "nothing special about computers" is the cause of a great deal of difficulties that should not exist in a sane world. This is a world with new concepts that did not exist when a lot of our laws were written, and it doesn't make much sense to presuppose that there is or even can be a 1:1 mapping between the tangible and the not, all of the time.


> Stop conflating two laws that mean different things.

Can you point out where I did?


You're being intentionally obtuse here, since they're literally the first words of my post. You did quote someone else and pretty much endorsed it, but that quote contains a pernicious falsehood.


Pretty sure he is making a wider point about techie attitudes to the legal system, nothing actually specific about file sharing out of context.


> "file sharing is not theft"

It's not, and your tone and sentiment indicates that you think it is.


Clear enough. It is not theft. It is violation of the copyright law. A crucial distinction. Theft means you take away someone's legal property so they're not in possession of it any more. File-sharing does not take their copyright away and give it to the thief. Copyright-theft is a different thing. Right?


I actually agree with you on this one, for a change.


There is to me. I patently reject the notion that information is property. And a lot of others do as well, at least to some extent.

Do you really believe that up until this week, "Happy Birthday" was someones 'property' ? Do you really believe that posting the ETA for city busses is a patentable innovation ?

Just because some legal thug perverts the definition of words (such as 'property' ) in the lawbooks does not change the way I regard them. Just like back when the books said that a black man was 3/5ths of a human being. Just because it says that in a book doesn't make it correct.


Yes and as OP was pointing out, just because you regard them as powerless doesn't mean they are powerless. You can't simply make rules for yourself outside of society because you wish it so. You may, in your mind, disagree. But in reality you will be subject to the legal rules of society.


So to put it simply: they are right because they have the guns on their side. Lovely.

Also, you word it as if this is society's legal system. Society has about as much control over the legal system as a peasant has over the dictates of their king. Unless you limit society only to the wealthy and well connected.


you're partially correct, but actually if people just say "fuck you" to specific laws, en masse, there actually is some safety in it, viz. the RIAA trying to sue people for pirating. Sure some unlucky folks will have to pony up their protection money, but while they get shorted, millions of us enjoy the benefit of disregarding that law. essentially if everyone disregards the law, "they can't catch all of us". I'm not intending to make a solid case for piracy specifically, but just using that as an example about the safety in numbers.


Except this makes the situation worse because now everyone is a criminal and the legal system gets to pick and choose who to punish.

"Hmmm... I don't like gay minorities, let's hit them up for all the illegal things that everyone is doing, but which are still illegal."

Maybe there is a defense if the prosecutor says the above line exactly, but in general it allows for unfair application of the law. This quickly becomes 'don't piss off a cop/judge'.


But when your own personal definition of what you think a word ought to mean and the legal definition of that word collide, whose definition do you think matters?

Choose your own definitions of words if you like, but don't be surprised or outraged if the world doesn't share them.


Which court cases have file sharing prosecuted under "theft"? I always have heard of them being called "copyright infringement".


Has any court ruled file sharing is theft not copyright infringement? This is clearly a case of dissolving the question[1]. No one disputes what's actually going on - copyright is being violated. What's in dispute is the definition of "theft". This is only interesting from a rhetorical/political point, as the flavour of the word theft strongly implies negativity, whereas copyright infringement is debatable.


Forcing the suspect to input the password still looks legal, according to the last paragraph. They just can't force you to tell them the password.


There's a more elegant solution: the government should be allowed to compel you to disclose your password if probable cause exists to search your device but only subject to an evidentiary privilege that prevents your knowledge of the correct password from itself being admitted as evidence to prosecute you.


And what happens if you don't know the password or have forgotten it? I have old encrypted files and lost devices that have passwords I no longer remember.


That's what judges are for, right? They'll review the evidence and judge if you're likely to actually have forgotten the password, or are just saying so. If you're traveling with a phone or laptop that appears to have been in-use by you, and don't know the password to it, chances are you're not being truthful. Otherwise, you'd have a plausible alibi story (I just bought this phone off Craigslist; here see the email and the ad).


What happens if I keep passwords on RAM-only servers in Elbonia, which self-destruct if a code is not entered once per week?
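For what it's worth, the mechanism described is trivial to build. A minimal, purely illustrative sketch (Python, names hypothetical) showing only the timing logic: the secret lives in memory and is discarded for good if no valid check-in arrives within a week.

```python
import time

WEEK = 7 * 24 * 3600  # check-in window from the hypothetical

class DeadMansSwitch:
    """Holds a secret in memory only; drops it if check-ins stop."""

    def __init__(self, secret, code, now=None):
        self._secret = secret
        self._code = code
        self._last_checkin = time.time() if now is None else now

    def check_in(self, code, now=None):
        """A correct code resets the one-week countdown."""
        if code == self._code:
            self._last_checkin = time.time() if now is None else now

    def get_secret(self, now=None):
        """Returns the secret, or None forever once the window lapses."""
        now = time.time() if now is None else now
        if self._secret is not None and now - self._last_checkin > WEEK:
            self._secret = None  # self-destruct; nothing left to disclose
        return self._secret
```

A real deployment would run this server-side with no persistence at all, which is the whole point of the "RAM only" part; the sketch just shows that the logic is a few lines.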


Destruction of evidence charges? I am often surprised at how adaptive the law is for preventing loopholes like this. I would say with great confidence that this wouldn't fly.


How can I be destroying evidence if the system was set up before I was in custody? And the government preventing you from storing your data with a dead man's switch is punishing a thought crime.


I think they would argue you set up the dead man's switch with the intention of destroying potentially incriminating information in the future. It's not really any different than setting a flare to burn down your office and destroy filing cabinets of evidence of tax fraud or whatever. That would definitely not fly, arson aspects aside. Just because you set it up before there was any incriminating information doesn't really matter; it is a liability you accept in setting it up that it might destroy evidence you are not legally allowed to destroy.


I think this ruling marks the recognition that the information processing performed by the devices we carry has, in a legal sense, "merged" with the thoughts we carry in our heads, and is now worthy of the same fifth amendment protections.

I'm in favor, but for some reason this also makes me a little worried.


If backdoors legislation is passed, would there be any economic effect? Personally, I wouldn't be interested in doing anything more than "basic" stuff with my phone and would not feel compelled to upgrade phones so soon.

As for computing devices, aside from proprietary systems like Windows and Macs, do we not have reliable options that allow us to use uncompromised encryption?

What's to stop app developers from embedding encryption packages, or would they be forced to use compromised solutions?

Can you stop a person from building a secure line over a compromised medium, if that is even doable?
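Probably not. Any two parties holding a pre-shared key can layer a confidential channel over an arbitrarily hostile medium, since the medium only ever carries ciphertext. A toy sketch using a one-time pad (deliberately simplistic and not production crypto; a real system would use an authenticated protocol, and the variable names here are mine):

```python
import secrets

def otp_xor(key: bytes, data: bytes) -> bytes:
    """One-time pad: XOR each byte of data with the pad.
    The same call both encrypts and decrypts, since XOR is its own inverse."""
    assert len(key) >= len(data), "the pad must be at least as long as the message"
    return bytes(k ^ b for k, b in zip(key, data))

# Pre-shared pad, exchanged out of band before the medium was compromised.
pad = secrets.token_bytes(64)

plaintext = b"meet at noon"
ciphertext = otp_xor(pad, plaintext)   # all the compromised medium ever sees
recovered = otp_xor(pad, ciphertext)   # the recipient reverses the XOR

assert recovered == plaintext
```

The point is structural: once the pad exists on both ends, nothing done to the transport, backdoored or otherwise, reveals the plaintext.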


This is the reason (along with probably constitutional issues) I'm not terribly concerned about most of this sort of talk [0]. It seems to me that it'd be a temporary problem, mostly with proprietary systems, that would be resolved in a few years. Sort of like the encryption export issues in the 90s, and the OpenSSH project being hosted in Canada.

[0] Ok, I'm not concerned about it actually happening. The part of it that bothers me is the otherwise seemingly-sane individuals who agree with these backdoors. It's very difficult to discuss the issue with some of them because their interest in it is largely driven by emotions, specifically a desire for security and justice/revenge/control of criminal/terrorists/whatevers.


Why should they really need your password? They can ask the telephone company for a saved copy of your incoming and outgoing calls from the last year, so they already have (or can trace) 90% of the interesting data in your phone.

This also seems to protect the police from the temptation to do stupid things that could lead to future lawsuits against them (like leaking photos of you drunk at a party, or of your girlfriend brushing her teeth naked, and so on...)


This seems reasonable. Anybody want to play internet lawyer to explain why you can't/shouldn't be compelled to unlock a fingerprint protected device?


After Riley v California (2014), a warrant is required before the search of a mobile device. Therefore, in a scenario where you are asked to unlock your device, the only question is whether you are required to cooperate by giving your password or offering your finger. In the case of a password, the answer appears to be no, due to the privilege against self-incrimination. In the case of fingerprints, there is no such right. You are merely identifying yourself to the phone and your identity cannot be construed as incriminating testimony.

http://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf

http://scholarlycommons.law.northwestern.edu/cgi/viewcontent...


The analogy I've often heard is that it's like a safe. Presented with the proper warrants, you can't refuse to open a safe on the basis of the 5th amendment.


I find it interesting. "It is a crime to do crime" is a tautology. But isn't "it is a crime to try to hide the fact that you are doing crime" a bit of the same? If you're doing crime, OF COURSE you are trying to hide the fact that you are doing crime. Does it make it more or less of a crime, since in practice every criminal is already doing that? Except terrorists; they want everybody to know they are doing it. Does that make them less criminal, then?


But what if the safe had documents in it that were written in a language only you know? Are there any circumstances where you could be compelled to translate them?


Some bookkeepers for crime organizations have from time to time used their own notation to keep the books. Typically, they have not been sophisticated enough to hide the malfeasance; in addition, the shadiness of it is often enough evidence of wrongdoing, or at least leads to forensic accounting that reveals tax evasion.


Yeah, that would be the closest thing I can think of in the real world.


Not that I do or don't agree... My current operating theory is that investigators may use a "rag doll" view: they can't compel you to admit or divulge anything, but they can make use of your limp uncooperative (but not resisting per se) body. They may note your biometrics (eye color, iris pattern, hair samples, fingerprints, etc), doing so non-invasively (no blood samples unless you're already bleeding); as such they could take your hand and press your finger onto a fingerprint reader, just as they are allowed to gather prints for a background check without a warrant (AFAIK). Corrections welcome.


You are essentially correct. The fifth amendment says you can't be compelled "to be a witness against himself." Courts have found that biometrics aren't witnessing; they're just evidence being collected about you.


Can someone comment on what level of court this decision was made at, and how final this decision is likely to be (i.e. how many more appeals are possible at this point)?


I'm glad when the courts uphold the U.S. Constitution in a meaningful way. I have faith America will sort its shit out.

But we shouldn't have to rely on the law alone. We should be able to rely on technology to make it impossible to compel people to give up their most intimate data. Computing devices have become an extension of the mind, and no one on earth has a right to the contents of your mind.


Bet it's not unconstitutional to force someone to give up the password to their PC, though? :/


From my limited understanding, the difference is the government having a reasonable level of certainty that the device in question contains the evidence they're looking for.

The analogy I would use is a locked closet full of file boxes. If the government is certain that the files relating to a specific crime are in the closet, then you can be compelled to assist them in opening your closet or face an obstruction of justice charge. However, if the police suspect you of a crime and suspect that you're the type of person who would keep the evidence in your locked closet, that is not enough to compel you to open the closet so the police can check up on their hunch.

In this case, I read it as: the men are suspected of insider trading, and the government believes that they would have used their cell phones to communicate about the deal and that the phones contain evidence of such. There is no actual evidence that the phones were used, and so they're not obstructing the police in obtaining evidence the police know is there, but rather preventing the police from poking around to see if the evidence exists at all.


I find it odd that they aren't just able to get the data from the carriers. The only reason it'd be on the device but not the carrier/Facebook/Twitter/etc servers is if they encrypted the messages, but they would probably mention this if it were the case.


So is Stingray and what the TSA does to you at the airport but I guess we are never going to address that.


Here's hoping we're making progress toward ending the TSA's fishing expedition. Given the concerns about mass murder being preventable via a security check, we should at least establish that the TSA's job is ONLY to watch for explosives et al., and that it may not act on the discovery of other contraband that is harmless to the flight and passengers.


The police will stop forcing suspects and will start inducing cooperation.


[flagged]


Please edit name-calling ("clearly what this asshole wants", "idiots like Kerr", "go around spewing bullshit") out of your comments to HN. You have a substantive point to make here, but its signal-noise ratio is upset by extraneous venting, which breaks the site guidelines.

https://news.ycombinator.com/newsguidelines.html


He has reversed his opinion on the matter earlier today:

https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...


When Apple said they can't turn over data, the FBI said that means adorable children will die. I wonder what they will have to say about this?



