How Lavabit Melted Down (newyorker.com)
287 points by jeanbebe on Oct 8, 2013 | 172 comments



The integrity and bravery he has shown in this fight is impressive. He has definitely earned enough "cred" to restart this business outside the US and be very successful.


We should celebrate Ladar for making the decision to put himself at risk in order to protect his users, but I think we should be careful not to forget that Ladar was forced to make that decision because the security of Lavabit was all a total handwave.

This wasn't untested water, either. The exact same thing happened to Hushmail for the exact same reason, and should have been evidence enough that the model isn't viable.

So I think we should definitely support Ladar as a person, but we also need to be careful not to confuse that with supporting Lavabit, which posed a very real danger that should never be repeated (again).


It wasn't a handwave. It was the best current technology could offer, but this technology was not meant to deal with an oppressive government that can compel any company to reveal any information. As soon as Ladar realized that, and that he was within that government's jurisdiction, Lavabit was done.


"It was the best current technology could offer"

Sure, if we ignore the existence of things like PGP, S/MIME, smart cards, and the dozens of other ways we can have secure email without relying on some trusted third party like this.

"this technology was not meant to deal with the oppressive government that can compel any company to reveal any information"

Then it was not meant to deal with the evil hacker who takes control of the server and grabs valuable information.

"As soon as Ladar..."

There's the handwave. The security of the system depends not on technology, math, or physical laws, but on the whims of just one man. What if Ladar was less principled?


  Sure, if we ignore the existence of things like PGP, 
  S/MIME, smart cards, and the dozens of other ways we can 
  have secure email without relying on some trusted third 
  party like this.
As I understand it Snowden was using Lavabit to communicate with journalists who didn't themselves have/use/understand PGP. Lavabit is technology you can use even if no-one else uses it.

How do PGP and S/MIME solve the bootstrapping problem, where no-one uses them because no-one uses them?


There is no bootstrapping problem. Snowden was also able to get journalists to use OTR, without relying on some web interface with questionable security claims. The problem with PGP and S/MIME is that the existing software is too hard to set up. It is hard to tell journalists to go use Thunderbird, get Enigmail set up, and publish a key. The situation is even worse with S/MIME, despite the built-in support in many email clients.

Basically, the problem is that PGP and S/MIME as they exist today are (and I shudder to say it) too security conscious. The reality is that most users are not going to take the time to maintain their public keys, to verify others' public keys, etc., nor are most users going to spend the time to get their keys verified by a CA. We need a "lite" PGP client that is less obnoxious about using untrusted keys, and basically a "newbie" PGP that is more vulnerable to active MITM attacks (which are actually much harder to pull off than one might expect). It would also be nice if public keys could be easily published via QR codes, so that someone can literally hand their public key to another person.
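To make the QR idea concrete, here is a rough sketch of what handing someone your key could look like, assuming the third-party qrcode and python-gnupg packages (the output filename is arbitrary, and qrcode needs Pillow installed to write a PNG):

    import gnupg
    import qrcode

    gpg = gnupg.GPG()
    # Pick a key from the local keyring; in practice you'd select your own.
    fingerprint = gpg.list_keys()[0]["fingerprint"]

    # Encode only the 40-hex-digit fingerprint, not the whole armored key,
    # so the code stays small enough to print on a card or show on a phone.
    qrcode.make(fingerprint).save("my-key-fingerprint.png")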

The worst thing we can do is lie to people about the security they receive e.g. telling them things like "Lavabit is set up so that nobody but you can read your email!"


QR codes for GPG public keys - sounds like a great idea. Any idea why no one has done it yet? It seems like it could be a simple transform of an ASCII-armoured output.

Bookmarking reply


You have to use a very large QR code to do it. It won't fit on a business card and phones are picky about scanning them.


Sure, but we have key servers for distributing the actual keys. You would only need the fingerprint in the QR code.


Kind of defeats the point - how do you know you're downloading the private key you scanned?

While someone might have their business cards replaced, it's a lot easier just to send a different key to someone.


"Kind of defeats the point - how do you know you're downloading the private key you scanned?"

Assuming that "private key" is a typo (you download public keys), you can just check the fingerprint of the key against the fingerprint you were given. That is easily automated.


This puts us back in the same kind of problem that was being complained about though - keyservers, publishing keys etc.


Not really. The problem is not about publishing keys or key servers, but with the cumbersome process of having to enter a hexadecimal string (and having to check an even longer string). The point of using QR codes is to simplify things: the user just scans a QR code, and the key is fetched and assumed to be valid (i.e. no need for the user to check fingerprints).


I thought a QR code could be effectively any size - I certainly remember a Japanese billboard that used shadows etc.

A QR code can have ~4000 chars of a-z and ~3000 Latin1 iirc

so with my limited gpg knowledge that handles my public key quite happily.

would be interested on the business card front though
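For what it's worth, the size difference is easy to check yourself (a sketch; assumes the qrcode package, and "mykey.asc" is a hypothetical gpg --export --armor dump). A full armored key usually needs a very dense code, while a bare fingerprint fits in a small, card-friendly one:

    import qrcode

    def qr_version_for(data):
        qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_L)
        qr.add_data(data)
        qr.make(fit=True)   # picks the smallest version that fits; errors if none does
        return qr.version   # 1 (21x21 modules) up to 40 (177x177)

    armored_key = open("mykey.asc").read()
    print("full armored key:", qr_version_for(armored_key))
    print("fingerprint only:", qr_version_for("0123456789ABCDEF0123456789ABCDEF01234567"))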


The mere fact that Snowden was talking to someone at the Guardian would be quite damning even if the contents of the messages were not known. The problem in catching the leak is knowing who's talking, not what they're saying - finding out what was said is the easy part; finding out who said it is the hard part. Access to envelope info solves the who part, and full access to the mail server gives full access to the envelope.


How would PGP or S/MIME help you if any provider could be required to turn over all the traffic it gets, unencrypted? The only way you could securely communicate is peer-to-peer with the trusted party, but email doesn't work this way. Unless you always send mail directly to your target's SMTP server, hosted by the recipient himself (which kind of defeats the whole idea of having email as a service and turns it into just a peer-to-peer chat with funny headers), the adversary would have access at least to the envelope information.

>>> Then it was not meant to deal with the evil hacker who takes control of the server and grabs valuable information.

Indeed, it is not. If the hacker gets full control of your mailserver, at least your envelope information is completely compromised.

>>> The security of the system depends not on technology, math, or physical laws, but on the whims of just one man

This is completely false. Security depended on adversary not having full access to the Lavabit servers, not on Ladar's "whim". As soon as Ladar realized there's no possibility of legally providing it in the US, he closed the service. Assuming your adversary doesn't have full access to your service is kind of a precondition of using the service as means of security. Using a lock assumes the adversary does not have the key; if he does, the lock is useless as a security measure. As soon as Lavabit became essentially useless for the purpose it was created, it was shut down.


"the adversary would have access at least to the envelope information."

This is a strawman, because Lavabit never did anything to protect headers. What you are missing is that Lavabit could respond to a demand for plaintext, if Ladar were willing to do so; on the other hand, Google cannot give anyone access to the plaintexts of PGP encrypted messages that I send through their servers because of technical barriers. That is the point of doing your encryption locally, and that is why security and privacy are not a service.
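For anyone unfamiliar with what "doing your encryption locally" looks like in practice, a minimal sketch (assumes python-gnupg and that the recipient's key is already in your keyring; the fingerprint here is a placeholder):

    import gnupg

    gpg = gnupg.GPG()
    recipient = "0123456789ABCDEF0123456789ABCDEF01234567"   # placeholder fingerprint
    result = gpg.encrypt("meet at the usual place", recipient)
    assert result.ok, result.status

    # str(result) is an ASCII-armored PGP message. Paste it into Gmail,
    # Lavabit, or anything else: a warrant served on the provider can only
    # ever yield this ciphertext (plus the headers, of course).
    print(str(result))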

"Indeed, it is not. If the hacker gets full control of your mailserver, at least your envelope information is completely compromised."

Except that with Lavabit, an attacker could also get all your message bodies.

"Security depended on adversary not having full access to the Lavabit servers, not on Ladar's "whim"."

Let's put it this way: if you were involved in a lawsuit against Ladar, would you trust your communications with your lawyer to Lavabit? Of course not, because Ladar could have modified the code at any time, without alerting his users at all, to read any plaintext he wanted. If he had been willing to cooperate with the government, he could have, and nobody would have had a clue.

"Assuming your adversary doesn't have full access to your service is kind of a precondition of using the service as means of security."

In other words, security is not something you can get as a service. The entire model is fundamentally and fatally broken.

"That's like using lock is assuming the adversary does not have the key, if he does, the lock is useless as a security measure"

No, it is like storing your key with the bartender at your favorite night club and assuming that he will not allow your adversaries to use it.

"As soon as Lavabit became essentially useless for the purpose it was created, it was shut down."

It was shut down because Ladar chose to shut it down rather than capitulate. He could have chosen to keep it going while the government eavesdropped on it instead. That means that, as I said, security boiled down to Ladar and his principles. That is why PGP and S/MIME provide you with better security: mathematics are not subject to the choices that human beings make, and with PGP, S/MIME, etc. your security is a matter of mathematics.


>>> This is a strawman, because Lavabit never did anything to protect headers.

Since SSL keys are mentioned, I assume SSL was used in communication. This means the claim that Lavabit did nothing to protect headers is false.

>>> Except that with Lavabit, an attacker could also get all your message bodies.

This is true; however, the body of the message may contain encrypted information, which is useless to an observer. Envelope information cannot be encrypted in a way that is not readable by the mail server - that's the point of the mail server.

>>> The entire model is fundamentally and fatally broken.

That is kind of what I was saying - that current email technology cannot do what Lavabit tried to do if the adversary can do what the courts said it can do. I'm not sure what you're disagreeing with here.

>>> He could have chosen to keep it going while the government eavesdropped on it instead.

That would be a betrayal of his users' trust, since any claim about what his service does would be necessarily false, and he would become a liar if he ever claimed his server allows them to communicate securely. I agree that this boils down to one's principles. I think a principle of "don't lie to your customers" is a good one to have.

>>> That is why PGP and S/MIME provide you with better security

No they do not, unless you can establish a peer-to-peer channel with your other party. In which case you're not using email anymore. Neither PGP nor S/MIME can prevent an adversary from collecting envelope information in emails, and the inability to do this is what made Lavabit impossible to continue.

>>> mathematics are not subject to the choices that human beings make,

You seem to be either genuinely confused about how email works and what running a secure email server involves, or trying to say something other than what you're actually saying, or not making any sense. There are no "choices" that allow you to create a secure email server in the situation in which Lavabit found itself.


"Since SSL keys are mentioned, I assume SSL was used in communication. This means the claim that Lavabit did nothing to protect headers is false."

That is like saying that GMail is protecting the privacy of your headers, because GMail uses SSL.

"This is true, however body of the message may contain encrypted information, which is useless to observer"

Except that in the case of Lavabit, that encryption was just a side show since all the cryptographic operations were performed on the server.

"current technology of the email can not do what Lavabit tried to do if the adversary can do what the courts said it can do"

Except that someone who encrypts their message bodies before sending them (i.e. performs the cryptographic operations locally) leaves the server unable to fulfill demands for plaintexts.

">>> That is why PGP and S/MIME provide you with better security

No they do not, unless you can establish a peer-to-peer channel with your other party. In which case you're not using email anymore."

Really, you think I am not using email anymore if I am running my own personal mail server? It is also false to claim that one must have a peer-to-peer system to protect header information, or that PGP on its own is not enough. You could broadcast an encrypted message via Usenet. You could use anonymous remailers (e.g. "Type I" remailers, which use PGP). In all these cases, however, you need to perform your cryptographic operations locally.

"There are no "choices" that allow you to create secure email server in situation in which Lavabit found itself."

Sure there are: Ladar could have sold smartcards instead of selling cryptography as a service. That is one design decision that could have given his users meaningful security while still maintaining the convenience of webmail. To boost security even more, those smartcards could be coupled with a thumb drive that includes the necessary client software, so that Lavabit could not pull a Hushmail on its users.


>>> That is like saying that GMail is protecting the privacy of your headers, because GMail uses SSL.

It does. Unlike Lavabit, though, there is ample reason to assume Google invalidates this protection by granting governmental adversaries access to the information stored on its servers - the thing that Lavabit refused to do, at least on a mass scale.

>>> since all the cryptographic operations were performed on the server.

I don't see what precludes you from sending emails already encrypted. Security has layers. You are not limited to using just one.

>>> leaves the server unable to fulfill demands for plaintexts.

You forgot to read the part of my answer where I use the word "envelope".

>>> Really, you think I am not using email anymore if I am running my own personal mail server?

It is irrelevant what you personally are doing - what is relevant is what Lavabit was trying to do. They were trying to provide a certain service, for people who - like, I assume, most Guardian journalists - are not technically advanced enough to run a secure mail server on their personal computer and set up everything else in a way that allows it to accept email from the internet. This proved impossible, thus Lavabit is not with us anymore. That was my whole point.

>>> Ladar could have sold smartcards instead of selling cryptography as a service.

He could also have sold hotdogs and Caribbean timeshares; how is that related to the topic? He was trying to create secure email as a service - for those who cannot create the same by themselves, which is 99.999% of the population of email users. Not provide a solution for a completely different problem, such as key exchange, creating VPNs, and other stuff.


I want secure email but I have yet to figure it out. What are "the dozens of other ways we can have secure email"? Honest question


Have you heard about "thermorectal decryption"?


Unless he actually used properly implemented forward-secure SSL for every connection, which I doubt all of his customers' browsers or the SMTP servers he talked to supported, didn't his choices actually put his customers in more danger?

He could have complied with one of the several valid court orders that requested he give the FBI data on a specific account but stopped short of installing FBI code or devices on his system or handing over the keys. Had he done so, it would have stopped there.

Instead, it escalated to the point where he actually was forced to expose all his users. Anyone who has transcripts of those connections (e.g. the NSA) can now read them, get the passwords, and decrypt any mail they got from the server. It seems like a boneheaded move unless his only goal was to protect Snowden at all costs.


According to the article, the FBI jumped straight to "give us all the SSL keys for everything", and would not let him limit compliance to that selective warrant.

He rightly observed that those leaked keys would then get into the hands of God-only-knows-who.


The story, as far as I have read from this article and others, was that they asked for data (probably with an NSL), and he said no. They got a court order. He said no. At some point he was willing to cooperate, but by then they didn't care because they thought he was jerking them around. They then requested the SSL keys. This article is clearer about the exact sequence of events[0], but the posted one says so as well. The initial request was not for the SSL keys.

From the newyorker article : "On June 10th, the government secured an order from the Eastern District of Virginia. The order, issued under the Stored Communications Act, required Lavabit to turn over to the F.B.I. retrospective information about one account, widely presumed to be that of Snowden. (The name of the target remains redacted, and Levison could not divulge it.) The order directed Lavabit to surrender names and addresses, Internet Protocol and Media Access Control addresses, the volume of each and every data transfer, the duration of every “session,” and the “source and destination” of all communications associated with the account. It also forbade Levison and Lavabit from discussing the matter with anyone. "

Sometime after his initial refusal and then offer to comply with some caveats that the fed's interpreted as stalling:

"Prior to the hearing on July 16th, the U.S. Attorney filed a motion for civil contempt, requesting that Levison be fined a thousand dollars for every day that he refused to comply with the pen-register order. EARLIER IN THE DAY, Hilton issued a search-and-seizure warrant, authorizing law enforcement to seize from Lavabit “all information necessary to decrypt communications sent to or from [the account], including encryption keys and SSL keys,” and “all information necessary to decrypt data stored in or otherwise associated with [the account].” (emphasis mine)

[0] http://www.wired.com/threatlevel/2013/10/lavabit_unsealed/


This is an inaccurate account of events. If you read the actual documents [1], you can see that the FBI had exactly 2 demands: A pen register device, attached to his servers; and his SSL private key. That is the sum total of what they wanted: complete, near-real-time access to all of Lavabit's data. A physical device to copy the server traffic and send it to the FBI, and the SSL key, to decrypt that traffic.

The stated use of these two things was to get information concerning a single person, but they never wanted just that information. On page 100, Levison states that he can manage to get the information the FBI is looking for, without providing the FBI with Lavabit's encryption keys. Someone (AUSA[censored]) says that the proposed solution does not satisfy the subpoenas and court orders, because it would not provide real-time access to the data.

---

[1]: http://cryptome.org/2013/10/lavabit-orders.pdf


It's entirely possible it's an inaccurate account of events. I haven't read all of the primary documents, just secondary sources.

In your linked documents, Exhibit 1 is the original June 10th order. Attachment A of it (page 4 of the PDF) details what he was ordered to hand over. It does not mention SSL keys at all. Instead it asks for a bunch of metadata. In fact, it explicitly doesn't even cover communication contents. It also doesn't specify how Lavabit has to execute the order, just that it must provide the data.

This was the order Lavabit apparently initially refused.

Can you point to the first point at which they demanded the SSL keys? The stuff on page 100 looks like it pertains to the July 16th order, which is, again, considerably after the June 10th order that originally asked for the data and after Lavabit refused that order. Also, it's totally in line with the narrative of events as I presented it.

Regarding pen registers: a pen register can be done in software and is typically done by the service provider, not the government. The term is an anachronism dating back to telegraphs. It doesn't necessarily mean government hardware or software[0]. Hence the discussion on page 99 of the PDF about "implementing the pen-trap device" in section d. So that's not blanket access.

[0]http://en.wikipedia.org/wiki/Pen_register


The June 10th order is on page 2, and seems to be only for the Target's account details (not metadata on messages, AFAICT.) Page 19 (and again on 97) says "Mr. Levison provided very little of the information sought by the June 10,2013 order." This sounds like he did not refuse it, and may have actually not had much data to turn over since part of his business niche was to not collect that kind of stuff. (Page 98 says "Levison claimed 'we don't record this data'" although in context "this data" appears to be non-content message data, which would not apply to the June 10th order.)

The June 28th order ("pen register/trap and trace order", page 7) is the one he started refusing, then tried to negotiate on later. I think the order "that Lavabit shall furnish agents from the Federal Bureau of Investigation, forthwith, all information, facilities, and technical assistance necessary to accomplish the installation and use of the pen/trap device" includes keys implicitly. The June 28th Order Compelling Compliance Forthwith (to the earlier order on the same day) notes, "To the extent any information, facilities, or technical assistance are under the control of Lavabit are needed to provide the FBI with the unencrypted data, Lavabit shall provide such information, facilities, or technical assistance forthwith."

The first explicit order referring to keys seems to be the July 16th search warrant, specifically Attachment B on page 36. According to page 98, FBI agents discussed encryption keys with Levison as early as June 28th.


So all they wanted was... everything.


You definitely did not read the article.


I did. You either didn't or read what you wanted to read. The FBI didn't ask for SSL keys right off the bat. They asked for info on Snowden's account. After Lavabit refused and then agreed but allegedly dragged its feet on doing so, a federal judge issued an order for the SSL keys.

See my response above. https://news.ycombinator.com/edit?id=6517845


"On August 8th, rather than turning over the master key, Levison shut down Lavabit." seems to imply that they never got the information necessary to decrypt the data.


I did actually miss that part. I figured he gave them an electronic copy when he shut down.

Minimally, he gave them an "illegible copy" in 4-point font. Assuming it's the actual key, I'm sure the FBI or NSA can OCR that. Even if parts of it are screwed up, SSL private keys actually have a fair amount of structure in them (at least enough to recover from someone writing "FUCK A DUCK" over and over again over part of one [0]), and even if, say, half the bits of the key are totally illegible, the rest give you more than enough to factor it.

[0]http://crypto.2012.rump.cr.yp.to/87d4905b6d2fbc6ad2389debb73...
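On the "fair amount of structure" point: a PEM-encoded RSA private key stores far more than the mathematical minimum, which is why partial damage is survivable. A sketch of what's actually in the file (assumes the Python "cryptography" package; "lavabit-key.pem" is hypothetical):

    from cryptography.hazmat.primitives import serialization

    key = serialization.load_pem_private_key(
        open("lavabit-key.pem", "rb").read(), password=None)
    priv = key.private_numbers()
    pub = priv.public_numbers

    # The file redundantly stores the modulus and exponent plus the primes
    # and the CRT components; recovering a subset is enough to rebuild the rest.
    print(pub.n, pub.e)                              # public half
    print(priv.p, priv.q)                            # n = p * q
    print(priv.d, priv.dmp1, priv.dmq1, priv.iqmp)   # private exponent + CRT values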


Read the article.



How was it a handwave if it actually worked? And this is the first time I've heard that Hushmail was forced to betray their users, rather than doing it in response to a simple request. Do you have more information on that?


"How was it a handwave if it actually worked?"

Ladar claimed that he had no way to access user emails. The hidden subtext of that, which was sort of hand-waved away, was that he had no such access with the code he had written, and that all that stood between your mail and his eyes was his own willingness to write that code. The entire security of Lavabit depends on Ladar and his principles.


Isn't that always going to be true?

These systems are entirely built on code, and code is malleable. If you have control over the code that gets executed at every point in the stack (which the operator of a web app certainly does), there's always room for the operator to change the code and therefore change the behavior to do anything, including log things that were previously considered secure.

> The entire security of Lavabit depends on Ladar and his principles.

That has always been true. If he was a less scrupulous individual, he could have run an off-the-shelf e-mail solution with 0 fancy security, but claimed that it was all secure on the backend in his marketing. You wouldn't know the difference, because you can't audit the source code.

Unless you have the access, time, and expertise to perform your own source audit, any claims of security are always implicitly built upon a foundation of trust that the operator is doing what it claims.


"These systems are entirely built on code, and code is malleable. If you have control over the code that gets executed at every point in the stack (which the operator of a web app certainly does), there's always room for the operator to change the code and therefore change the behavior to do anything, including log things that were previously considered secure."

Which is exactly why nobody should have believed Ladar's claims in the first place. He trumpeted the fact that he had not yet developed any wiretapping capability and tried to distract everyone from the hard reality of "encrypted webmail."

This is why real security and privacy require end-to-end encryption.


You might be missing the point a little bit. You're correct, but end-to-end encryption is irrelevant if the operator changes the (e.g., Javascript) code in a webmail app and maybe even just for a single user.

The trust model has to include the device/OS/browser/etc. that you're accessing the service on, as well as all code and keys (including WOT servers, GPG keys, and the webmail code itself, or, if locally installed, all binary packages, updates, etc.).

Which I think is really the point: in the face of a government who can compel, e.g.,

* an OS vendor to install a backdoor in their software,
* a microprocessor vendor to critically weaken the Random Number Generator,
* the author and BDFL of a key operating system kernel to use said RNG,
* a distribution to include compromised or even unsigned packages,
* a mobile carrier who will allow random hardware on their network that can sniff, MITM, and inject evil data,
* a mobile phone manufacturer who will install CarrierIQ,
* (heavens!) a user/customer who visits an app store and installs an application with either a trojan attached or one that is directly a trojan, ...

In other words, no amount of end-to-end crypto will save us if we're attacked at each end and in the middle by people who will stop at literally NOTHING (jail) just to get at the data. This is why Schneier wrote "The Government has betrayed the Internet."

http://www.theguardian.com/commentisfree/2013/sep/05/governm...

Encryption is no panacea when you're dealing with someone who controls both ends and the middle. :( To put this on Lavabit is poor form, IMO. This is one end who stood up and fought and paid a huge price.


"You might be missing the point a little bit. You're correct, but end-to-end encryption is irrelevant if the operator changes the (e.g., Javascript) code in a webmail app and maybe even just for a single user."

Which is why, as I have said elsewhere, webmail is the problem here; more generally, you cannot have security or privacy as a service.

As for the government pushing back doors into products, that is at least upping the ante. It is harder to sneak back doors into software that people are actually using, especially software that is widely used, than it is to sneak a back door onto a server that only a handful of people can inspect. It is also harder to sneak in a back door when there are many, separately maintained implementations of a common protocol; you are forced to try to sneak a back door into an abstract description, which a lot of experts will be reviewing (see e.g. Dual_EC_DRBG; the backdoor was discovered within a year of the PRNG being publicized).

There are also limits to what sort of back doors can be put at a lower level. Suppose you sneak a back door into my CPU, but I am running software that was developed after you developed your back door. You can make a straightforward computability argument that in general, your back door will be rendered useless by unexpected changes to software, particularly sweeping changes to it.

In any case, my point is not that encryption is a panacea, but rather that webmail is, by its nature, insecure. There was no way that Lavabit's security could have ever been more than just a handwave.


Which is why, as I have said elsewhere, webmail is the problem here; more generally, you cannot have security or privacy as a service.

This is it in a nutshell.


Are you really alleging that CarrierIQ was installed at the behest of a government?

If you're going to use an appeal to authority, then Bruce Schneier is not the way to go. He's a cryptologist but he does make many unfounded claims.


http://en.wikipedia.org/wiki/Hushmail#Compromises_to_email_p...

I remember this happening. Whoever runs Hushmail didn't go to the press and his blog and make a big scene after being approached by the feds (from both the US and Canada, mind you, since Hushmail's a Canadian company). There's a reason why Hushmail was targeted: it was at the time the best way to figure out who's shipping large quantities of drugs everywhere. In the same year, Hushmail and eGold became compromised, and Bush passed a law stating police can open your mail in the USPS if they think you're a terrorist (previously they needed a warrant, and by the time they obtained one you'd already have your package). Coincidentally, this was, I believe, around the time Silk Road started up. :)

So, Hushmail was involved in the pre-Silk Road days of internet drug trafficking, and I'm pretty sure that is not the position they wanted to be in. We'll never know if Hushmail was "forced" to give up users' information, but I would say that even if they didn't want to, they pretty much would have no other choice other than shutting their doors.


Being legally bound to give up the information is equivalent to being forced. If it wasn't forcible, why did Lavabit have to choose between compliance and shutting down?

Could you clarify what you mean by "How was it a handwave if it actually worked?" Lavabit, like Hushmail, has had no problem giving up information to law enforcement in the past.


I think I missed some important things lavabit had done in the past, I concede.


He shouldn't have any "cred" when it comes to security or availability. Maybe with sticking it to the man, but that doesn't a secure service make.

The system did server-side encryption and decryption. Lavabit has, in the past, complied with court orders for data. They have access to the data. This is the fundamental problem with Lavabit.

It's rather unclear why they didn't comply with the initial court order in this case, which was just for data on Snowden's account, but it's clear they could have complied (though maybe not as quickly as the Feds wanted) and that they have in the past.

So, if you are contemplating using Lavabit II, keep in mind that sooner or later it will get a lawful court order for data from whatever jurisdiction it's in. Either it will comply with it --- in which case its street cred is worthless --- or it will refuse with the same apparent "fuck the police" bravado everyone likes, leading to the same set of escalations that happened here. The end of that is either/all of 1) the service failing because of computer seizure or the founder being held in contempt so he can't maintain it, 2) him caving and handing over keys again, or 3) him shutting down the service to prevent 2.

None of these are good. If the court order is targeted at you, there is a decent chance your data is handed over. If it's not, there is a decent chance your data is still handed over or the service goes under.

Oh, and lastly, if you're worried about the NSA's dragnet surveillance, they can just hack a foreign server.

Don't rely on non-end-to-end secure systems.


There are no end-to-end secure systems.

So you're typing this on OpenBSD? Sure, because that's secure.

Your RPM's are signed, so you're pretty sure they're safe?

Or you're using Funtoo or Arch and compiling from source? Have you read all that source? Even if you did, did you UNDERSTAND it all?

Are you using an Intel Ivy-Bridge or later processor with RdRand?

Are you using a phone?

Do you log into your webmail from one or more locations? How about email?

Do you trust SSL certificates? which ones?

And now... is the one other person that you're writing to who actually also uses GPG exercising all of the same caution? :)

I'm just saying... if you want to know that you're not being sniffed, you have to simply make it impossibly expensive to do so, which probably means get on a plane, meet and hand the person a note, and then burn it, and then scatter the ashes to the seven winds.

Then worry about the metadata of your flight, time, license plate readers, traffic and surveillance cameras.

The basic problem is that allowing this continued, blatant destruction of the 4th Amendment threatens the entire Bill of Rights which ultimately threatens the very foundation of our government and the rule of law.

Legal power, jail, and bullets trump crypto.


We really need more companies (and those in control of those companies) that stand up for their customers this same way. A public company would not have been able to make this play, so keep that in mind when you are making decisions about where to put your data.


I think Qwest Communications was a public company when they refused to comply with the NSA.


And if I recall correctly their ex-CEO is now in jail on allegedly trumped up charges.

If true this is utterly despicable by the government. Alas, I'm not surprised.


he was recently released.


Is insider trading a trumped-up charge? I didn't hear the internet outrage about Enron's CFO being indicted for the same charge.


If I recall right, the reason it was considered insider trading was that he had published numbers expecting a contract from the government. When he chose to stand up for principles, the contract never landed, which made those predictions false.

I'd appreciate a clearer explanation of this, but last time I read about it, it really did seem like the loss of the contract was out of revenge, and that his prosecution would never have happened had he caved to the requests which he felt were unconscionable.


The problem is he dumped the stock after it became clear (to him) they weren't getting the contracts, but before he told anyone else.


Insider trading by itself is not a trumped-up charge - probably none of the charges available under the US Code are, by themselves. What makes it trumped up is the particular circumstance - the details of the case. And the details look definitely very fishy.


Didn't they die after that due to the government canceling contracts with them?


They've been merged into CenturyLink, but I doubt that's because of losing some government contracts. Those would be a drop in the bucket for an ILEC. USA telco consolidation is inexorable, for anti-consumer reasons. Actually it's only a matter of time before VZN acquires CenturyLink and then spins its less profitable/deadwood pieces back out to private equity firms. Much money will be made by executives, and much wailing to PUCs will be heard when service craters.


> A public company would not have been able to make this play,

Isn't it funny to hear that a public company would not have been able to do something when it is about something good for its consumers and ultimately its country and citizens, but when it is something extremely bad, like compliance, obedience and evilness, then they suddenly can and are free to do it.

There is nothing stopping Google, Facebook or Microsoft from acting as Lavabit did. Nothing at all except for their own cowardice, malicious intents and disregard for laws and their customers.

It is like Obama: when it comes to doing something good, his hands are tied; when it comes to breaking laws and shitting on the constitution to extend mass surveillance, or to wage war and drone-kill civilians, then his hands are not tied.

Now, let's hear some more excuses for the way the most successful tech companies have acted: their hands were tied, they would face huge repercussions, they didn't know better, they only meant to comply with the law, and so on.

The truth is, they don't have democracy's interest, or their consumers' best interest, at heart; in fact, you as a consumer and user of their services and these companies are diametrically opposed, and for your own good you should consider them enemies.


"There is nothing stopping Google, Facebook or Microsoft to act as Lavabit did. Nothing at all except for their own cowardice, malicious intents and disregard for laws and their customers."

Emphasis re-arranged by me. May I remind you that Lavabit was shut down in reaction to actions taken by law enforcement, doing something that is almost certainly actually legal? "Regard for the law" here would seem to entail doing what they want. You appear to be simultaneously criticizing public companies for disregarding the law and not disregarding the law.

(I object to its legality, but that does not change its legality.)


The Supreme Law of the Land is the Constitution, right?

The Fourth Amendment has a few things to say on this. I'd strongly advise you to actually read it -- it's only a few sentences -- and make your own determination on what you think it actually says.

The FISC, authorized under the FISA, goes back to the 70's and was in reaction to the crimes that Nixon committed. Meet the new boss, same as the old boss. Obama has taken that ball and run with it in a big way. Most transparent administration, ha.

And even then, the FISC was and is itself contradictory to the Bill of Rights, 14th Amendment, etc,....

Guess which law wins? Assuming the Supreme Court agrees (and there's no guarantee they would.)

So if you're ordered to do something awful, but you know that there is a higher law that trumps that law, you can simultaneously be disregarding "the law" whilst maintaining "regard" for the law.

The key is: which law?


What does FISA have to do with Lavabit?


> There is nothing stopping Google, Facebook or Microsoft from acting as Lavabit did. Nothing at all except for their own cowardice, malicious intents and disregard for laws and their customers.

Try being CEO of Google, Facebook, or Microsoft, and suddenly deciding to close the entire business, and see how far you get before being removed.


I'm guessing that if the CEO of Google had known of this, he had a big chip with which to fight it. If Google threatened to shut down or something, public outrage would destroy the NSA.

But they don't have to go up to the CEO; they get the foreign sysadmin guy, tell him he can't tell his boss or his lawyer or he goes to jail, and they've got anything they want.


There are some causes that are worth being removed as CEO for. The Lavabit founder is finally allowed to speak about this.

It is enough to threaten to shut down the business to make it known to the G-men that you won't play (the totalitarian) ball. We stopped SOPA/PIPA with just a partial blackout.

But these companies didn't make a squeak. They are accomplices.


> There are some causes that are worth being removed as CEO for.

Yeah, but not if it's an empty gesture that doesn't change anything. Any CEO of Google both wouldn't be able to shut it down rather than comply, and would be removed for trying. It would accomplish nothing.


So, you think the hyper publicity of a billion dollar tech darling CEO being ousted because he wanted to defend American liberties would accomplish nothing?

The first thing it does is piss off one very wealthy person, whoever this CEO is. The second thing it does is piss off all of their very connected, wealthy friends. The third thing it does is serve as an egregious example of abuse to the public at large.

Hmm. Wealthy, influential people with large public support? Nah, these are two things politicians don't concern themselves about.


The most scary quote in the whole article is this:

THE COURT: You want to do it in a way that the government has to trust you /.../ THE COURT: And you won’t trust the government. So why would the government trust you?

The whole idea on which the US is built - the Constitution and other founding ideas - was based on trusting the government with only the very little that is necessary for it to function and no more, and on having the ultimate power reside in the hands of the citizens. Now it has come to trust in the government being implied, and if a citizen doesn't trust the government, he is not to be trusted and must be subjected to coercion. And that's coming from the courts, which are supposed to be protecting constitutional rights. America has come a long and very sad way since its noble origins.


That quote made me feel sick to my stomach. I mean, I knew it had gotten that bad... I've been involved in Restore The Fourth organizing, and before that I'd been paying close attention to all the previous leaks about the surveillance state. But knowing it and seeing a judge state it outright are two very different things. It used to be under cover. It only happened in the darkness of secret documents and agencies. Now it's come out into the light of day... and they're getting away with it. Not even getting away with it, really... they're wearing it proudly, as though they are the people in the right; they honestly believe they are the people who have nothing to hide or be ashamed of.

It's astonishing that more of our reps aren't standing up and shouting about this. So many of the people in power are complicit, it feels hopeless at times.


The fact that the government wanted the SSL keys makes it obvious they wanted to get at all his customers, not just the one they were targeting.

Levison offered multiple times to write a specific script for the single user that would do what they wanted and at a minimal cost to the government - and they refused. A pretty clear indication they wanted unfettered access to his client base and his network.

Then you add in the lack of ANY oversight on either Lavabit's side or the government's, and you have to praise him for what he did.


Do you really consider the judicial warrant system a lack of ANY oversight?

After Levison's lack of cooperation, could the investigators really trust Levison to hand over all the information?


Why should Levison trust a government who has proven to be untrustworthy when it comes to data collection? Levison didn't lie or mislead anyone, he even offered to get the data for them as long as it was targeted. The government has basically zero credibility in matters like this, and yet he was expected to trust them with no oversight (in the article, he was told there was no independent audit of their use of the data)?


What is this monolithic government you speak of? Are you saying that the FBI and the NSA are synonymous? I'm sure plenty of FBI investigators would take exception to an accusation like that.

I can play this game too. Dread Pirate Roberts used StackExchange, so therefore StackExchange users cannot be trusted to build websites that don't host drug deals and supposed hitmen.


Why the constant influx of throwaways?


Once they had the SSL keys and the tap system, they wouldn't need a warrant to access specific data - they could just look and see anything they want. Of course, some of it would probably be inadmissible in court, but who cares - they'd just use different evidence in court or "parallel construct" it. If they already have the data, the warrant requirement is useless.


> Levison offered multiple times to write a specific script for the single user ... A pretty clear indication they wanted unfettered access to his client base and his network

i don't think this is the correct interpretation. in a court of law, acquiring evidence is something procedural and governed by rules and regulations. having a third party (lavabit) acquire the evidence and then turn it over to the government is probably something that wouldn't pass muster in court due to chain of custody and other rules.

the government investigators had a particular target, and they needed to collect evidence that would be admissible in a court of law. it's actually pretty tough to come up with a good alternative here for the government.


> i don't think this is the correct interpretation. in a court of law, acquiring evidence is something procedural and governed by rules and regulations. having a third party (lavabit) acquire the evidence and then turn it over to the government is probably something that wouldn't pass muster in court due to chain of custody and other rules.

Search for "$" on this page: http://paranoia.dubfire.net/2009/12/8-million-reasons-for-re... It is very common for third party service providers to search records themselves on behalf of law enforcement, and law enforcement has historically trusted that and even compensated them.

It's the only way that really makes sense IMO. Can the FBI really bust down, say, Verizon's doors, seize all their hard drives, and correctly generate evidence based on systems that aren't theirs, and may in fact be unique among the phone systems of the entire world? Maybe an email system is simpler, but still.

Also, I think it's inappropriate to seize 400,000 users' data just to (ostensibly) get to one person. But I'm not a lawyer, maybe it is arguably legal.

P.S. Also see https://www.eff.org/files/filenode/social_network/Yahoo_SN_L... for all the things Yahoo does on behalf of LE.


On page 100 of the actual documents [1], the refusal was not based on chain-of-evidence concerns, but that by using the solution proposed by Levison, the FBI would not have real-time access to the data:

    The e-mail again confirmed that Lavabit is capable of providing the means for the FBI to 
    install the pen-trap device and obtain the requested information in an unencrypted form. 
    AUSA[censored] replied to Mr. Levison's e-mail that same day, explaining that the 
    proposal was inadequate because, among other things, it did not provide for real-time 
    transmission of results...
---

[1]: http://cryptome.org/2013/10/lavabit-orders.pdf


Great point. On page 46 it says Levison "would be able to collect the data required by the pen register and provide that data to the government after 60 days (the period of the pen register order)."

Then, "The prosecutors informed Mr. Levison that the pen register is a devise used to monitor ongoing email traffic on a realtime basis and providing the FBI with data after 60 days was not sufficient."

So, I am not sure what to think. After some research I couldn't tell whether "pen register" data is required to be real-time. The original order on page 7 doesn't seem to say "real time", but there could be some legalese or other laws that imply it.

Levison was clearly dragging his feet, but it seems sort of bizarre that they escalated their demands in response. Why not just get him charged with contempt and move from there? Since they played the "we can't trust him since he's been uncooperative" card, does this mean they would have trusted him if he complied immediately? What if he immediately turned over info and it didn't help their investigation or went counter to their preconceived notions -- would they have accused him of withholding some and demanded his private keys anyway?

There's weird things on both sides here. It sounds like Levison's actions on the whole are not very defensible though. :/


We still don't know everything that transpired here, but here's my interpretation of the article:

- The government is granted the right to snoop on Snowden's e-mail from a court. The court implores Lavabit to provide the technical expertise necessary to make this successful.
- Lavabit replies that all traffic is encrypted, and that it would be expensive to change that. Lavabit offers to make those changes so long as the government covers the costs of the change (in this case, one week of developer time).
- The government balks at the prospect of paying for the change and tries to snoop themselves. They realize that the encrypted data they can snoop on themselves is worthless, and demand the ability to decrypt it themselves.

All of this seems to hinge on Lavabit's demand that it be paid to make the changes necessary to make the pen register effective. Presuming the government was willing to pay him (or he was willing to work for free), there would be a http://fbi.lavabit.com/snowden that mirrored all of Snowden's metadata and it would interoperate with the government's pen register. There would be no need to compromise everyone else's SSL key.

I'm not familiar with legal procedure, but how does that corrupt the chain of custody in a way that other solutions would not?


I am blown away by the bravery, I know I'd never be so bold.

Also confused why he didn't end up in prison on mysterious "pervert" charges out of the blue or even dead. And don't lecture me that is far fetched after this past year.


Well, if they killed him, they probably wouldn't be able to get the keys. And they probably had to keep the bigger "punishment", imprisonment, looming over his head in case he reveals confidential information about the case.


The Guardian said Snowden is to release US gov't assassination program documentation in a week's time. So perhaps we will get insight into the inner workings of systematic killing.


I believe the preferred nomenclature is 'targeted killing'.


I'll be surprised if it doesn't have some whitewashed name like "Citizenship Revocation Program" that lets people talk about it without saying "killing".


More like "Citizen Revocation Program". Wouldn't want citizenship reduced and people earning and paying taxes to other countries.


It wouldn't surprise me if they had some sort of back door with Verisign or other certificate companies for this.


I'm 100% sure they do, but if you're careful when setting up your SSL/TLS keys, Verisign (or any other CA) never sees your private keys; they just sign your CSR. If the FBI had wanted to, they could have seized the servers and either broken into them and replaced the private key with their own, or replaced the servers with ones they'd built with their own SSL key pairs. But, as the article points out, what they were trying _very_ hard to get was complete infiltration of the entire Lavabit operation without any of the 400,000 paying customers of the supposedly secure email provider knowing about it. Kudos to Levison for not allowing that to happen, at great personal cost. (I doubt I'd have the courage to do the same.)


CA like Verisign don't have the key though, this misconception is too common. If you're doing it right the CA is just signing a cert you've generated, they never see the key.
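For the curious, this is all the CA ever sees: you keep the private key and send only a certificate signing request. A sketch with the Python "cryptography" package (the hostname and filenames are examples, not anything Lavabit actually used):

    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.x509.oid import NameOID

    # Generated locally; this never leaves your server.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # The CSR contains only the public key and identity, signed with your key.
    csr = (x509.CertificateSigningRequestBuilder()
           .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME,
                                                       u"mail.example.com")]))
           .sign(key, hashes.SHA256()))

    # This file is the only thing you submit to Verisign & co.
    open("request.csr", "wb").write(csr.public_bytes(serialization.Encoding.PEM))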


Having CA access does allow them to create a silent MITM in the absence of certificate pinning.


With a different key though.

Once you've got this new cert you can MITM, but you can't use it to decrypt the traffic already captured. Also, anyone paying attention sees the cert fingerprint change out of the blue.


A) Law enforcement doesn't need to decrypt previously-captured traffic; they either want to fish for criminal activity or they'll allow their target to build up new incriminating evidence. B) Who pays attention?


A) That's what they were after though: “all information necessary to decrypt data stored in or otherwise associated with [the account].” A rogue cert and MITM would get the password for the account though, unless B.

B) Anyone who knows what they're doing and has something they really want to keep secret? Maybe if someone had such a secret they'd learn to check the cert, maybe even install an extension that would highlight unexpected changes.
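A crude version of "paying attention" - recording the fingerprint you first saw and complaining when it changes - only takes a few lines (a sketch; the host and pinned value are placeholders, and legitimate key rotation would also trip it):

    import hashlib
    import ssl

    def cert_sha256(host, port=443):
        pem = ssl.get_server_certificate((host, port))
        return hashlib.sha256(ssl.PEM_cert_to_DER_cert(pem)).hexdigest()

    PINNED = "paste-the-fingerprint-you-recorded-on-first-contact-here"
    current = cert_sha256("lavabit.com")
    if current != PINNED:
        print("cert fingerprint changed - possible MITM (or a routine renewal):", current)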


Most people aren't verifying that the cert doesn't change every time they visit a site though - if they have Verisign sign a new cert and replace the old one 99% of users will never notice, because their browsers won't yell at them.


It didn't happen because that sort of thing doesn't generally happen.

Simply raising the specter of 'this past year' doesn't prove some vast conspiracy on the part of the government (or maybe it's the puppetmasters who control the 'real' government?).


Maybe they read 1984 as a manual: don't destroy him, convert him. Or at least make an example of him.


The more I read the more sympathy I have for the government here. They had a (presumably lawfully obtained) warrant against a specific user; it's not they who designed lavabit such that it was impossible to execute this without obtaining access to every other user. The proposal that Levison would extract the information himself rather than turning over the keys strikes me as completely unrealistic - any information so obtained would be quite rightly thrown out of court, because there's no reliable evidentiary chain, only (in effect) Levison's word. Even if he had turned over the SSL keys, the US still has a fairly strong "fruit of the poison tree" doctrine: any information the government happened to obtain on other users would be invalid for prosecution because it wouldn't be covered by their search warrant.


The more I read, the less I have. Why? Because it is individuals within the government who know that what they truly need is limited access, but rely on the backing of the US Government to just get everything in hopes something neat falls out.

Worded differently, far too many of these agents are willing to abuse the power of the courts, and the secrecy of it all, to intimidate anyone, because it makes them feel equally powerful.

there is NO justice when the courts are secret. Justice does not exist where the public is not allowed to go.

the issues he faced proved that there is nothing you can do, because they will simply slap "secret" on anything and then threaten you with jail - and once there, you're stuck, because no one was allowed to know.


Since when are search warrants conducted without being under seal? Warrants pursuant to a murder or racketeering case are not public information prior to execution. Otherwise the evidence would be disposed of by the perpetrators.


I think he was referring to the fact that so much of the case remained, and still does remain, secret for long after the warrant was served.


> it's not [the government] who designed lavabit such that it was impossible to execute this without obtaining access to every other user.

That's true, but they're still essentially implying that services which are explicitly designed to omit backdoor capabilities for the government to spy on you -- that is, services offering actual cryptographically guaranteed privacy, not just "no one has looked yet, and if they did, it'll all turn out okay in the end trust us" -- are broadly illegal and will get you criminal contempt.


It has always been illegal to not comply with search warrants. Why would search warrants for digital data be any different?


It's also always been legal to say "sorry, we never record any of that information". You can get a warrant asking a bartender to provide you with names, addresses, and arrival and departure times, and transcripts of every conversation of every patron of his bar, and he can stand up in court and say "sorry, I don't have that information, and I'm not prepared to start collecting it.". Would you be happy for the FBI to be able to compel a bar owner to start collecting those records for everybody who walks into his bar? Without letting either new patrons or regulars know he was doing so?


>You can get a warrant asking a bartender to provide you with names, addresses, and arrival and departure times, and transcripts of every conversation of every patron of his bar

No, you can't; the courts don't issue warrants like that. They'd issue a warrant for that information for one particular patron whom they already had reasonable suspicion of.

If the FBI have a warrant to put a bug on one regular's table, that doesn't seem unreasonable. If the bartender's carefully built the bar so that you can hear every conversation from every table and so the only way you can place a bug is one that would also record everyone else's conversations... yeah, I'd say that's unfortunate (and pretty poor design on the owner's part), but the FBI has the right to place the bug they've got the warrant for, and if there's really no way to do that without it picking up other conversations then too bad.


The nature of email is that the government's methods would simply collect all users' data, regardless of whether it was the target's or not. This would be true whether or not Lavabit was designed with user protection in mind, unless specialized methods such as those Levison offered to write were employed. Such methods would necessarily vary from provider to provider, because each provider is set up differently. The government simply defaults to massive collection in all cases so they don't have to write that code or pay someone to write it. He offered to give them a means to get only the data they asked for, but they turned that down.
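
For illustration, here's a minimal Python sketch of what such a targeted, per-warrant filter might look like. The record layout and names are hypothetical, not a description of what Levison actually offered to write:

  # Hypothetical per-warrant filter: only records belonging to the named target
  # account are handed over; every other user's data stays with the provider.
  from dataclasses import dataclass
  from datetime import datetime
  from typing import Iterable, Iterator

  @dataclass
  class MailRecord:                 # made-up shape for a provider's mail/log record
      account: str
      timestamp: datetime
      payload: bytes

  def targeted_intercept(records: Iterable[MailRecord],
                         warrant_target: str) -> Iterator[MailRecord]:
      """Yield only the records the warrant actually covers."""
      for record in records:
          if record.account == warrant_target:
              yield record
          # records for all other accounts are never disclosed

The point is that something like this is cheap to write but specific to one provider's setup, which is exactly the per-provider effort the government apparently preferred to avoid.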


It's also always been legal to say "sorry, we never record any of that information"

Only if it is true. He has the data; the article says he has worked with law enforcement before, in a kiddie porn case.


CALEA requires that all telephone companies (and now mobile phone and cable companies) provide a means of "tapping" a phone line. To my knowledge, there's nothing similar that says a data service has to provide the ability to retrieve unencrypted data.


What they required was more like the means to tap all the phone lines simultaneously.


See Section 216 of the Patriot Act, which throws in the ability to get "pen register" information (metadata) from Internet communications.

http://www.justice.gov/archive/ll/subs/add_myths.htm#s216

Yet another reason to repeal the "Patriot" Act.


Wouldn't metadata in this case be something like "IP X connected at Y and transferred Z bytes"? Similar to what a pen register produces for a phone call.


Lavabit had no such guarantees. First, the court order was for meta-data which wasn't encrypted. Second, encryption was done server side, so they did have access to the data the government requested.

The judge wasn't talking about end-to-end encryption. They were talking about Lavabit's assertion that it was difficult to get the data they admitted to having.


You say "Lavabit had no such guarantees."

Without me arguing one way or another whether that's true -- do you think if they had better guarantees, that anything would be different here? That the government would say "oh no, we'll back down now"?


Suppose that Lavabit sold smart cards for end-to-end encryption, and as a service would store ciphertexts for you. The government shows up with a warrant, Lavabit says, "Sorry, we cannot comply because we cannot access secret keys or decrypt messages by some other means." What exactly does the government do after that?

The point is that Lavabit's architecture did not resist these kinds of demands at all. At best all Lavabit could do is protect your mail until you log in.
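
For contrast, a minimal sketch of the kind of design being alluded to, where encryption happens on the client (or a smart card) and the provider only ever stores ciphertext. It uses Python's "cryptography" package and raw RSA-OAEP purely for brevity; a real system would use hybrid encryption:

  # Mail is encrypted to the recipient's public key before it ever reaches the
  # provider, so a warrant served on the provider yields only ciphertext.
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.hazmat.primitives import hashes

  # The recipient generates this key pair locally (or on a smart card); the
  # private half never touches the server.
  private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  public_key = private_key.public_key()

  oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                      algorithm=hashes.SHA256(), label=None)

  ciphertext = public_key.encrypt(b"meet at the usual place", oaep)  # what the provider stores
  plaintext = private_key.decrypt(ciphertext, oaep)                  # only the key holder can do this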


So here's a hypothetical for you. Suppose we build an architecture which you feel is more secure. Then the NSA people say "we have analyzed it, and there's a theoretical information leak. because that information leak exists you have to install our software which will exploit it. but if you'd designed your system better you would have been okay. also, incidentally, you're now legally forbidden from fixing this leak."

Then your legal obligations are primarily determined by the effectiveness of your security architecture; whether the architecture contains defects that, left unexploited, would have preserved security becomes the thing that matters to the court system.

That sounds tantamount to what you're proposing; is that really the sort of legal framework you think we have / should have? I personally think it's ridiculous, and I reject your justification and others which rely on a technical deficiency in Lavabit's implementation to endorse the legality and moral authority of the court's orders.


"Suppose we build an architecture which you feel is more secure. Then the NSA people say "we have analyzed it, and there's a theoretical information leak. because that information leak exists you have to install our software which will exploit it. but if you'd designed your system better you would have been okay. also, incidentally, you're now legally forbidden from fixing this leak.""

One of the features of a good security system is that there is no single party that can be compromised in a way that violates everyone's security. Basically, your example assumes that the system has a fundamentally risky design.

Even if you need a trusted party in your design, it should be the case that the party only has a limited ability to violate everyone's security. For example, consider an identity-based encryption system, which by its nature requires a trusted party to generate keys. Under the right circumstances, that party can delete the master secret key needed to generate keys; one can imagine setting things up so that the master secret is destroyed periodically, say once per month, so that only keys that were recently issued can be compromised if the authority is compromised (your identity under such a system might include some information about when your key was issued).
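
A rough sketch of that epoch-scoped identity idea. The ibe_* primitives below are hypothetical placeholders standing in for a pairing-based IBE scheme (e.g. Boneh-Franklin); no real library is being referenced:

  # Identities embed the month in which the key was issued, so destroying an
  # epoch's master secret limits what a later compromise of the authority exposes.
  from datetime import date

  def current_epoch() -> str:
      return date.today().strftime("%Y-%m")      # e.g. "2013-10"

  def scoped_identity(email: str, epoch: str) -> str:
      return f"{email}|{epoch}"                  # identity string bound to its epoch

  # Authority, once per epoch (ibe_setup / ibe_extract are placeholders):
  #   master_secret, public_params = ibe_setup()
  #   user_key = ibe_extract(master_secret,
  #                          scoped_identity("alice@example.com", current_epoch()))
  #   ...at month's end, delete master_secret. Keys already issued keep working,
  #   but compromising the authority later cannot recover them.
  #
  # Sender (ibe_encrypt is a placeholder):
  #   ct = ibe_encrypt(public_params,
  #                    scoped_identity("alice@example.com", "2013-10"), message)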

"I reject your justification and others which rely on a technical deficiency in Lavabit's implementation to endorse the legality and moral authority of the court's orders."

I think it would be fantastic to have a trustworthy government, but that is orthogonal to the issue here, because the security of a system like Lavabit is not limited to resisting court orders. Suppose Lavabit had become as big as Gmail, i.e. big enough that the government might threaten an antitrust investigation if special favors were not performed. Suppose that a foreign government was trying to conduct industrial espionage against Lavabit's users, and surreptitiously installed eavesdropping software or equipment.

For that matter, suppose that you were personally involved in a lawsuit against Ladar. Would you communicate with your lawyer using Lavabit in that situation?

Also consider the reality that laws can change. It might take a massive terrorist attack to cause laws to change for the worse. Maybe a dictator will rise to power. Maybe people will just stop caring. Maybe it will be a combination of things, but the point is that building a system that relies on the law to protect users' rights is a fundamentally bad approach. It was not merely laws that stopped mass surveillance previously; when letters were sent primarily via the postal system, it was the lack of abundant and automatic copying that prevented mass surveillance. Laws are easy to change; widely deployed and widely used technologies are much harder to change (at least in ways that conform to the desires of politicians and governments).


You are incorrect. The usual process in cases where the government subpoenas data from service providers is for the service providers to gather the data and supply it to the government. Doing anything else is nearly nonsensical, since the government does not have the domain knowledge of the service provider's systems needed to retrieve the data.


> "They had a (presumably lawfully obtained) warrant against a specific user; it's not they who designed lavabit such that it was impossible to execute this without obtaining access to every other user."

This is what the battle over the clipper chip was about, and the government lost that one. Communications providers, absent any prior court orders, have no legal obligation to make their systems mass backdoor-compliant. It is the government, nominally, which is burdened here by the obligation to conduct their investigation without trampling the rights of 400,000 other people.

> "Even if he had turned over the SSL keys, the US still has a fairly strong "fruit of the poison tree" doctrine: any information the government happened to obtain on other users would be invalid for prosecution because it wouldn't be covered by their search warrant."

Which is why the government uses "parallel construction" to get around this restriction. And because the original source of the evidence remains classified, nobody can say for sure why the defendant was randomly stopped on the highway.


This situation exemplifies a type of conundrum: conditions or goals X and Y are incompatible; they cannot coexist. Persons A and B observe this fact, and A concludes "Therefore X must be sacrificed to protect Y" and B concludes "Therefore Y must be sacrificed to protect X".

Your argument starts with the premise that the government is always entitled to data on particular individuals being investigated, and all other values must always be sacrificed to that end. Your review of the costs imposed, however, recognizes only the wiretapping of "every other user".

There is also an additional price paid, one that is even more significant and unfortunately recognized neither in your argument nor in Levison's arguments in the case. The SSL private key is not only for encryption; it is also designed to authenticate the server to the user. Thus, revealing it enables the receiving party to impersonate the server owner online. (In this case, the impersonation would have taken the form of a compromised server, no longer under the owner's control, purporting to be the uncompromised server still under the owner's control.)

The evil comes in with the combination of this demand and the gag order. If the server owner could either keep the key secret or warn the users, this evil could not happen - but when both are prohibited, he is forced to enable the third party to lie to the users and make his server a fraud.

Accordingly, my view is that one of these things must yield - either the secret key demands or the gag orders. This is a higher social value than the wiretapping of persons alleged to be suspected of crimes ("alleged" because the secrecy of the process prevents the citizens verifying that the government has any legitimate grounds at all).
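
To make the authentication point concrete: in TLS, the server proves its identity by demonstrating possession of the private key matching its certificate, so whoever holds that key can present themselves as the server. A small sketch using Python's "cryptography" package (file names are placeholders) showing the check an impersonator armed with the handed-over key effectively satisfies:

  # In TLS, possession of the private key matching the certificate IS the
  # server's identity: encryption and authentication stand or fall together.
  from cryptography import x509
  from cryptography.hazmat.primitives import serialization

  with open("server.crt", "rb") as f:              # placeholder file names
      cert = x509.load_pem_x509_certificate(f.read())
  with open("server.key", "rb") as f:
      key = serialization.load_pem_private_key(f.read(), password=None)

  def spki(pub):
      return pub.public_bytes(serialization.Encoding.PEM,
                              serialization.PublicFormat.SubjectPublicKeyInfo)

  # If these match, the holder of server.key can complete a handshake as this
  # certificate's subject -- which is why handing over the key enables impersonation.
  print("key matches certificate:", spki(key.public_key()) == spki(cert.public_key()))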


Regardless of "fruit of poison tree" laws, information known is hard to unknow, and it wouldn't be surprising if information leaked to others could be used to lead hounds to foxes via different paths.


Indeed: what they routinely do in those cases is take some otherwise-ignorant agent, give them a little advice like "hey, you really should check out some stuff in this area" (nudge-nudge/wink-wink), and have him "rediscover" this information. Then they tell the court that it was Obtained Through That Agent's Normal Ordinary Investigations. "Parallel construction", aka "intelligence laundering".

https://www.eff.org/deeplinks/2013/08/dea-and-nsa-team-intel...


I have not reviewed the documents in question, so forgive me if I'm wasting your time, but I suspect it was not, in fact, a warrant. Generally requests for so-called 'metadata' don't require the same amount of supporting evidence as search warrants (the theory being that you shouldn't have an expectation of privacy for that data).

Lavabit was a service that explicitly provided secure and private email to its paying customers. In my opinion, an expectation of privacy _did_ in fact exist for its users' 'metadata'.


I don't mean to be mean or rude, but yes, you are wasting our time. It was, in fact, a warrant.


I think you could have the piece of software reviewed by a digital forensics expert and have it somehow sign the captured documents. That might stand up in court.


Surely there must have been a way to have Levison decrypt that user's account that would have been satisfactory to the government. I agree that him extracting the information himself isn't a solution, but surely there's a way to do it properly.

The solution isn't either of the extremes, 1) give access to all Lavabit users or 2) refuse to comply with a legal warrant; it's somewhere in the middle.


> the US still has a fairly strong "fruit of the poison tree" doctrine: any information the government happened to obtain on other users would be invalid for prosecution because it wouldn't be covered by their search warrant.

After everything we've seen from the Snowden leaks until now, do you honestly still believe this? Parallel construction? Hello??


> While he opposes the bulk collection of domestic communications, he has no such strong feelings about the N.S.A.’s foreign-surveillance efforts.

As a non-American, I have a problem with this seemingly widespread idea even among privacy advocates in the USA that only Americans are entitled to the protection of their rights from the American government.


National defense is a bogeyman for liberals such as myself. It's one of the most justifiable evils available, because the current peace of the world is somewhat of a modern illusion.

Personally, I don't like the existence of national borders of any kind, but I also can't wish them away. And when you draw a line, that says there's something you want to exclude on the other side of it, and the stuff over there just might decide to cross the line in a way you don't like.

My general strategy has been to support inclusive immigration policies, globalization efforts, and joint international operations. The more of these that exist, the less meaningful national boundaries become, the less necessary national defense will be.


To be fair, your government should be working to protect you from foreign threats like this. You should not rely on foreign powers to protect you.


Believe me, I'm not happy that my government is all-in on the American mass surveillance game. But my specific concern here is with Americans who think it's okay for the US government to conduct mass surveillance on the rest of the planet, just not on Americans.


Because we never know where the next threat will come from, or perhaps the threat after that.

Your country may be perfectly at peace with the US now, but there is no way to guarantee that peace unless we have people to continually watch for potential threats. Even that is, in itself, no guarantee, but it's better than nothing.

Maybe someday, mankind will be able to share universal goodwill and peace, but until that time, trust, but verify, at a minimum.


I call "Bullshit" on that.

For all of the talk about "all men created equal" and "do to others as you would have them do to you", fundamentally Americans are brought up to believe they are different from, or "exceptional" compared to, other humans. This belief lets them distort reality so that spying on innocent foreigners is OK but spying on innocent Americans is an abomination. Hypocrisy.

Timothy McVeigh and others prove that the domestic threat to the US is as serious as the foreign.


Right, but it's not an either-or thing.


Yep, because our government basically controls the majority of what goes on the Internet, right?

/s


It's not a "widespread idea"... it's the legal reality. As a non-American, you do not share the rights of US citizens under the constitution. Period. They are the only people entitled to the protection of their rights from the American government, unless your country has signed some sort of treaty with the US government giving you extended rights from the US, which I doubt it has.

Without such an agreement, this is nothing more than international espionage, which is a natural (and arguably necessary) part of having different nations that don't always see eye-to-eye. Do you take issue with spying in all forms, or only the 21st century brand of spying?


Well, that's nice; I'm glad to see that our common interests as HN users extend only to the US border.

Widespread unchecked surveillance is evil in and of itself, it doesn't matter _who_ is doing the surveilling.

Following your argument, if the Chinese government engaged in exactly the same sort of surveillance against you that the NSA does, you'd be fine with it, since it would merely be international espionage. But in the end, the same information about you is being stored. And your information is probably less trustworthy with the Chinese government than even with the NSA!

'Spying' in the traditional meaning is qualitatively different to dragnet surveillance of all communications.

I understand that all countries engage in a level of spying, and that friends may not always remain so, but the US's seemingly widespread lack of trust in _any_ other country only works when you're (1) right and (2) the most powerful country.

These may not always hold true, and it's when bad things happen that you need friends you can trust (and that can trust you).


Firstly, I'm not from the US. Secondly, yes. Yes, I would be OK with the Chinese government spying on me, if they were able to.

In the case of the NSA, the NSA coerced/worked with many private US companies and took the data from them (rather than somehow collecting the data themselves). Such coercion of American companies is what I believe should be curtailed.

The thing is, China has no such reach in my life. I don't use any Chinese websites to store my data. I don't use any Chinese social networks to keep up with my friends. I don't use a Chinese email service.


No, it is a widespread and very dangerous misconception. Rights as conceived at the time of the drafting of the US Constitution are not granted but rather inherent. The Bill of Rights enumerated rights the founders believed all people had. It was a list but not meant to be complete. In fact one strong argument against enumerating them at all was that people would come to believe the list was complete or that it somehow granted the enumerated rights.

The religious ones thought these rights came from God but in any case most all agreed these are "Natural Rights" that you have because you are a human being. Such rights can only be violated by governments. They cannot be granted or taken away.

This attitude that rights only adhere to citizens of the US, common among united statesians, is a pathetic indicator of the level of indoctrination. It's also morally and philosophically appalling. It's the same attitude that allows united statesians to support bombing brown people on the other side of the world for whatever reasons their leaders give them.


Of wonderful note:

At approximately 1:30 p.m. CDT on August 2, 2013, Mr. Levison gave the F.B.I. a printout of what he represented to be the encryption keys needed to operate the pen register. This printout, in what appears to be four-point type, consists of eleven pages of largely illegible characters. To make use of these keys, the F.B.I. would have to manually input all two thousand five hundred and sixty characters, and one incorrect keystroke in this laborious process would render the F.B.I. collection system incapable of collecting decrypted data.

I tip my hat to this magnificent bastard.

EDIT:

The core issue is summed up nicely thereafter:

Levison believes that when the government was faced with the choice between getting information that might lead it to its target in a constrained manner or expanding the reach of its surveillance, it chose the latter.


For the fortitude he has shown in fighting the good fight, please consider donating to his defense fund: http://lavabit.com/ (link at the end).


wtf? The site's [https://lavabit.com] SSL cert has been revoked: http://d.pr/i/sc71


Great!

It's because the certificate's private key was forcibly supplied to the government. No longer secure => revoke it.


Ah, makes sense. I donated, anyway.


Well, if it's the same cert that had its private key provided to the government, that would make sense.


There is a huge disconnect between the "justice" system and technology which needs to end. You've seen it before if you're in IT, that glazed eyes look when explaining why their Word document is missing…

Anyone with judicial experience know if judges have trusted advisory panels that can help wrap their heads around technology to better rule on cases such as this?


You mean like Judge Alsup, who taught himself Java so that he could rule that Oracle's APIs are not copyrightable? Or Judge Wells, who ordered SCO to show him the code and then threw the case out when they failed to do so? We haven't done so badly on tech judges recently - it seems to me that the problem with the Lavabit/NSA cases is not so much the technical side, but the classic one of government powers, and the fact that there is no explicit constitutional protection of privacy.


Why would technology be any different from anything else? The prosecution and defense are free to call on expert witnesses to explain SSL and heart surgery to the judge and jury.


It is up to the participants to bring in experts to make sure the judge has enough relevant information to make an informed decision. Really, even in this case, it does not seem like the judge is doing that badly on grokking the technology.

I think it's actually the other way around, where Lavabit doesn't understand the judicial system and the concept of evidentiary chain of custody.


Do you really think IT is that much harder than, say, medicine or the nuances of structural engineering or any of the myriad other technical areas that no doubt show up in courtrooms every single day?


I still don't get one thing about this story:

>> To make use of these keys, the F.B.I. would have to manually input all two thousand five hundred and sixty characters, and one incorrect keystroke in this laborious process would render the F.B.I. collection system incapable of collecting decrypted data

Doesn't the FBI have some ultra-high-DPI scanner with advanced OCR software? And even if they live under a rock, it's still not that hard to manually type ~2,500 characters using a magnifying glass. If so, what was the point of shutting down Lavabit AFTER turning in the printed keys?

P.S. I still highly respect Lavabit and people behind it, but this point in a story doesn't make sense at all.


This assumes that it was printed with a high-dpi printer. From the reports I've read he intentionally chose a font that would be hard for a computer to read. There's also no evidence that he printed it on any special high-dpi printer, so the process almost certainly lost enough information as to make the printed text not-fully-recoverable.


What really happened is that the court then ordered him to turn over the key on a CD or continue to face the $5k/day fine, so he eventually did turn over the key on a CD.


Was it a bmp file on the CD?

I should read the article...


The order was to supply the key in PEM format.
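
For reference, a PEM-encoded RSA private key is just base64-encoded DER between BEGIN/END header lines. A quick illustrative sketch with Python's "cryptography" package, using a throwaway 2048-bit key (this says nothing about Lavabit's actual key sizes):

  # Generate a throwaway RSA key and inspect its PEM serialization.
  from cryptography.hazmat.primitives.asymmetric import rsa
  from cryptography.hazmat.primitives import serialization

  key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  pem = key.private_bytes(
      encoding=serialization.Encoding.PEM,
      format=serialization.PrivateFormat.TraditionalOpenSSL,
      encryption_algorithm=serialization.NoEncryption(),
  )
  print(pem.decode().splitlines()[0])    # -----BEGIN RSA PRIVATE KEY-----
  print(len(pem), "bytes of PEM text")   # roughly 1,700 bytes for a 2048-bit key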


News outlets keep repeating "11 pages of 4-point type totaling 2560 characters", which just doesn't match up since that number of characters fits on one page in a fairly normal font size. Also, RSA keys just aren't that big, so the 11 pages must have either been many keys or some other data.

As I understand Lavabit's architecture, there is no "master" key. Instead, incoming mail is encrypted using an asymmetric per-user key. All the key pairs were created by Lavabit and stored on-site, but locked by a password to be provided over TLS. Since Levison probably didn't compromise his system to store users' passwords, presumably the keys that he was handing over in 4-point type were still locked with a password.
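
A minimal sketch of the scheme as described above (the details here are guesses, not Lavabit's actual code): the provider generates a per-user key pair, stores the private key wrapped under the user's password, and encrypts incoming mail to the public key, so stored mail stays locked until the password arrives at login:

  # Per-user keys, wrapped with the user's password; uses Python's "cryptography".
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.hazmat.primitives import hashes, serialization

  password = b"correct horse battery staple"      # only ever seen at login, over TLS

  user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  stored_private_pem = user_key.private_bytes(    # what sits on the provider's disk
      serialization.Encoding.PEM,
      serialization.PrivateFormat.PKCS8,
      serialization.BestAvailableEncryption(password),
  )

  oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                      algorithm=hashes.SHA256(), label=None)
  stored_ciphertext = user_key.public_key().encrypt(b"incoming message body", oaep)

  # Only when the user logs in and supplies the password can the key be unlocked:
  unlocked = serialization.load_pem_private_key(stored_private_pem, password=password)
  print(unlocked.decrypt(stored_ciphertext, oaep))

Note that even in this sketch the provider briefly sees plaintext as mail arrives and sees the password at login, which is exactly why, as pointed out elsewhere in the thread, the protection only extends to mail at rest.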


I don't understand why, if he was making a principled stand, he would have bothered with the printout in 4-point type. That was sort of a juvenile move, one that could only serve to justify the government's attitude towards him. Weird.


>News outlets keep repeating "11 pages of 4-point type totaling 2560 characters", which just doesn't match up since that number of characters fits on one page in a fairly normal font size. Also, RSA keys just aren't that big, so the 11 pages must have either been many keys or some other data.

You're making assumptions about the encoding...


It was five keys and included the full certificate chain.


I've been skeptical of LavaBit, chalking it up to the general deification that HN gives to its heroes du jour, but he really seems to have made a highly principled stand while still allowing the government to intercept any individual for whom it had a warrant.


Demanding the SSL keys to the entire database was clearly an insane overreach on the FBI's part, a mistake that they compounded if it's true that they refused to work with Ladar on the more targeted approach he suggested. I would like to kick in some bucks towards Ladar's defense, but I'd rather do it through the EFF (where I'm already a member) than rally.org, which I've never heard of.

Does anyone have any experience with (or thoughts about) rally.org -- or, for that matter, any knowledge of why the EFF isn't running point on this case?


Is anyone else thinking that their systems should include a self-destruct button? (for LavaBit I'd imagine a process that e-mailed each user the SSL key used to encrypt their mailbox, then deleted the key from the system. A user could still decrypt their mailbox by downloading it and using the key).
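
A hypothetical sketch of that "red button" idea in Python (the record layout and the send_mail stub are made up; this is just the shape of the idea):

  # Mail each user the key needed to decrypt their own downloaded mailbox, then
  # destroy the server's copy. Everything here is illustrative.
  import os
  from dataclasses import dataclass

  @dataclass
  class UserRecord:
      address: str
      key_path: str          # where the user's wrapped key lives on disk

  def send_mail(to: str, subject: str, body: str, attachment: bytes) -> None:
      raise NotImplementedError("stand-in for the provider's outbound mail path")

  def self_destruct(users: list[UserRecord]) -> None:
      for user in users:
          with open(user.key_path, "rb") as f:
              key_material = f.read()
          send_mail(user.address, "Your mailbox key",
                    "Keep this safe; the server's copy is being destroyed.",
                    key_material)
          # Overwrite-then-delete is weak on SSDs and journaling filesystems,
          # but it conveys the intent.
          with open(user.key_path, "r+b") as f:
              f.write(os.urandom(len(key_material)))
          os.remove(user.key_path)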


The problem is that services like Lavabit want to do something that is technically not possible: give you access to your encrypted mail from any computer i.e. the convenience of webmail. If I can just download a key and keep it on my computer, why would I not just generate the key on my computer by e.g. using PGP or S/MIME?


No ... I meant a "red" button that could be used just prior to wiping the servers clean. Your point is completely valid while the service is running.


Here's a Defcon talk about more or less that: http://www.youtube.com/watch?v=1M73USsXHdc


Thanks for sharing that link. I couldn't find that talk last time the question of emergency data destruction came up.


This isn't any different than shredding all your business documents when you hear the cops knocking on your door. It was frowned upon when Enron undertook a mass shredding during their investigation.


It was frowned upon because what Enron was doing was both illegal and immoral. What Lavabit was doing was neither. Further, Lavabit's founder took a stand based on a belief in the right to anonymous communication. Compare this to Enron, where the destruction of evidence was purely an effort to hide evidence of culpability by those who ordered the destruction.


Knowingly and willfully destroying evidence that's been legally subpoenaed is indeed illegal, whether done by Enron "immorally" or $tech_company "morally".


The amount of support for Levison and ire toward the government in this case is absurd. The FBI followed the Constitutional process of obtaining a warrant for the information of the "one user".

I suspect that the only reason anyone cares about this case is because Lord Snowden the Infallible deigned to grace Lavabit with his email traffic.

Would the internet outrage be the same if the targeted user was found out to be a Goldman Sachs executive or a Westboro Baptist Church minister?


Well, if the hallowed FBI et al had actually taken the reasonable offers of access to one user and not escalated it into full access to the entire service's and customer's content/traffic, there might be a lot less to be outraged about, hmm?


The FBI followed the Constitutional process of obtaining a warrant for the information of the "one user".

Not a big fan of reading the article, are you? The government's demand was constitutional for one user, and unconstitutional for 399,999 others.


Er yeah, he offered to backdoor the one user for them and they refused.


nice troll attempt



