Phone calls, emails, and text messages don't have e2e encryption, and we have been using them for 40 years. We are still using them.
Nobody cares.
US communications are under constant mass surveillance, by a group of people that denies it, abuses it, uses secret courts to rule on it even when it's unconstitutional, coerces others to hand over all their data and shut up about it, and destroys the life of whoever points a finger at it.
And people still don't care.
Good luck telling people who are recording their entire life on Facebook, Snapchat and TikTok to be mindful about the tools they use.
What’s the point of lumping private phone calls in with things you post publicly to the internet?
Like even someone who posts to IG and TikTok still probably wants to have a private conversation. I think the issue in tech circles is that people outside of them don't necessarily seek “mathematically private” conversations. My conversations on FB, Hangouts, phone calls, email, etc. aren't literally mathematically provably private, but neither is any IRL conversation.
And ultimately the point is that I don’t particularly care that FB in theory can see my conversations because my threat model doesn’t include Google and Facebook — it includes my friends, family, peers.
It’s not a lack of caring. It’s a misunderstanding about what people’s threat models actually are.
The people at Facebook are peers you don’t know. Not figuring them into the model, but figuring everyone you do, is poor threat modeling.
People’s threat model is to protect their physical/literal life. Without that, as far as we know, emergent emotional ideas of threats are nothing to be bothered with.
Death isn’t to be feared. I’ve been medically dead. It’s just blank. It’s the pressure humans bring at scale to have their work externalized into easy-to-use tools that freaks me out. How easily they seem to be willing to dismiss threats of human behavior.
So long as society stays stable they’re the least of my concern while billions are destroying the environment. But they are part of a real threat at scale.
The pandemic response in action right here: this thing that will grow into a problem isn’t a problem. My stable neighbors, when measured by their activity of routinely going to a job, are the real threat.
Yeah that’s true but not at the scale you’re imagining it.
Pretending we should, at scale, explicitly model protectionism for all those private customizations is a fool's errand.
Let’s customize social protection for shared biological traits, and let people wank their specialness in private. Oh, but we have to live like the romantic stories old people carry forward. Where external social pressures rule us.
What’s the point of carving up data into pools generated by any one person except when scientifically interesting? Oh right to build my custom little emotional castle.
It’s people like us, instigating others to give their special details to our DBs over empty false promises of something special occurring at scale if they do.
How anyone smart doesn’t see the obvious parallel to religion there, I don’t know.
You’re not owed a cool tech job while someone else grows your potatoes. You should include the pressure you put on people who don’t owe you anything personally in your threat model.
> my threat model doesn’t include Google and Facebook
Who does it include? Does it include the government if Facebook develops a sentiment model trained on public and private data that governments can use only public input to estimate [dissent|radicalism|political affiliation|etc]?
Do not underestimate how companies can abuse your privacy without direct disclosure.
I think he means that he doesn't care if Google and Facebook, or even the government, find out his Aunt Sue's apple pie recipe that she posted on his wall.
If he was throwing an overthrow-the-government party, then he would put some thought into which tool to use and probably stick to verifiable, open source utilities.
Repeating this mantra all the time is not helpful.
People have been told that modern computing and privacy cannot coexist. That all software companies spy on them and users can only choose between giving up privacy or giving up technology.
On top of that, privacy, by itself, is meaningless. What matters is information inequality. Inequality is power.
When people can monitor a government you have democracy.
When people cannot monitor a government and the government monitors people you have tyranny.
Unsurprisingly, there are many paid privacy and anonymity services for wealthy people.
Please don't say that people just don't care. People have been educated to be meek to authority.
Educated people have been repeating "don't put things on facebook", "use free software", "gov is spying on you".
The answers have been "I don't have anything to hide" and "you are paranoid".
Because nobody cares.
You can craft the best technological solution to that, if nobody cares about it, nothing will happen.
You can give the best information, if nobody cares, nothing will happen.
You can provide the most secure thing; if the competition gains an advantage from not doing it and you gain nothing, you will lose.
And deep inside you know that because:
1 - you are using a throwaway account
2 - you don't provide any solution to the problem
The situation is exactly the same as with all other topics where we gain a lot of comfort individually to do the wrong thing collectively.
Did we slow down global warming? Did we prevent 60% of the insects from disappearing? Did we stop delegating slavery to Asian countries? Did we even stop buying from all those companies that enrage us in the news?
This is not the only way political fights happen. There are indeed many whistleblowers with ruined lives. But there are also many more people who devote a part of their life to educating others and encrypting what they can. I put GNU/Linux on every computer of my relatives that I could reach. I participate in the I2P and Tor networks regularly. I also have a day job and I am not going to risk it. But if a tiny part of the population did what I do, the world would be entirely different.

People do care about anonymity and privacy. It's just that they have very limited resources, and not everyone can devote a significant part of their time to this. Please help them and do not spread the mood of giving up.
tl;dr: Privacy is not binary. You can always increase the price of hacking you and others.
You make the mistake a lot of geeks make, thinking it's a technical problem. It's not.
You can have the best tech in the world, if the state is against you, if laws are against you, if society is against you, you'll only have scraps of a life.
It's all about people caring. Because only the mass of humans can oppose anything. The rest are just sparks.
And you can state "do care about anonymity and privacy" as much as you want. Words don't matter. Actions do.
And people are still putting their entire life willingly on Facebook.
I never said it was a technical problem. It is a problem of asymmetry in information and resources. When more people know that you can comfortably use devices without spying and violations of privacy, more people will oppose spyware and stupid laws removing our basic rights.
However, not everyone has time/energy to think about those issues. You need to help people by showing them better options, especially if they do not impose any restrictions. And a lot of free software is like this in 2020.
I wish I could upvote your post 1000x. People don't want to spend time and energy on these things. They just want to live their life. It is only when they can't that people start to care. Despite everything, quality of life in the US and western world is far too high to have people (in large numbers) pushing back in any real way.
I've never heard of a bank account being compromised because of lack of e2e encryption by Facebook, Google, Microsoft, etc. When that happens then post about it and I'll start caring.
People will not give up convenience and performance for encryption/privacy. And they only barely care when it comes to their financials - because they assume the bank carries the credit card fraud risk for them (whether it's true or not is moot).
And encryption is important, but not important enough for the general public. This is why nobody uses Tor for everyday usage, even though doing so is massively better for the privacy of society.
By this logic GPS is "not secure" because the US government pays for that too.
But GPS receivers don't even have to transmit anything; they calculate your position by receiving broadcasts made by satellites with timestamps, trilaterating based on how long it takes for messages from particular satellites to reach you. It would be difficult to have better security properties than this, which was done on purpose because the US military can't have the positions of their units leaking to the enemy.
We know this is true not just because their incentives are aligned but because the system's mechanism of operation is public and anybody who discovers a vulnerability can publish it. And so it is with Tor. The US government funds it because they use it, but it's also completely open source. If there is a vulnerability in it, as they say, PoC or GTFO.
Most vulnerabilities don't come from breaking GPS or Tor or AES. They come from, you have a suitably private way of calculating your position using GPS and then your phone takes that information and uploads it to The Cloud. They come from people visiting a website using Tor and then typing their name into it. They come from people uploading their TLS private key to a public repository, or generating it with a predictable random number generator.
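To make the receive-only point concrete, here's a toy sketch of the position calculation (2D only, and ignoring the receiver clock error that real GPS also solves for, so treat it as an illustration rather than actual GPS math):

```typescript
// Toy 2D trilateration: given beacon positions and ranges derived from signal
// travel time (delay * speed of light), refine a position estimate with a few
// Gauss-Newton steps. Real GPS works in 3D and also estimates the receiver's
// clock offset as a fourth unknown.
type Point = { x: number; y: number };

function trilaterate(beacons: Point[], ranges: number[], guess: Point): Point {
  let p = { ...guess };
  for (let iter = 0; iter < 25; iter++) {
    // Normal equations (J^T J) step = -(J^T r) for residuals r_i = |p - b_i| - range_i
    let a11 = 0, a12 = 0, a22 = 0, g1 = 0, g2 = 0;
    beacons.forEach((b, i) => {
      const dx = p.x - b.x, dy = p.y - b.y;
      const d = Math.hypot(dx, dy);
      const r = d - ranges[i];
      const jx = dx / d, jy = dy / d;
      a11 += jx * jx; a12 += jx * jy; a22 += jy * jy;
      g1 += jx * r;   g2 += jy * r;
    });
    const det = a11 * a22 - a12 * a12;
    const stepX = -(a22 * g1 - a12 * g2) / det;
    const stepY = -(a11 * g2 - a12 * g1) / det;
    p = { x: p.x + stepX, y: p.y + stepY };
    if (Math.hypot(stepX, stepY) < 1e-9) break;
  }
  return p;
}

// Receiver is really at (3, 4); it never transmits, it only measures how far
// away each beacon's broadcast says it is.
const beacons = [{ x: 0, y: 0 }, { x: 10, y: 0 }, { x: 0, y: 10 }];
const truth = { x: 3, y: 4 };
const ranges = beacons.map(b => Math.hypot(truth.x - b.x, truth.y - b.y));
console.log(trilaterate(beacons, ranges, { x: 1, y: 1 })); // ~{ x: 3, y: 4 }
```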
Which still doesn't give them the location of your GPS receiver.
It also applies even less to Tor, because they "turn off" GPS by encrypting it so that only their own receivers can use it. Doing the same thing with Tor both doesn't work because they don't run all of the relays, and doesn't work because it would compromise the anonymity of their own users if they were ever the only ones using it.
I for one don't care as long as my important communication is secure. And should the US government ever turn authoritarian and malevolent, then we can all use VPNs and switch to better communication, as happens in China. It's not like Americans are forever accepting insecure communications, they just don't feel a need for more security. When the need arises Americans will adapt. Hopefully they don't have to.
I'm not sure I could adapt at this point. Yes, I know how to use encryption to properly secure a conversation, but that assumes I can trust my device, my recipient's device, and that the underlying algorithms are really as unbreakable as advertised.
At this point I go through life under the assumption that the government (or bad actors within the government) could eavesdrop on every facet of my life if desired, and live with the anxiety that entails, which is low enough to be drowned out by other, more traditional anxieties. And I focus on political solutions rather than technological ones.
Everything goes out the window if you have a government willing to do anything and everything. It's a moot point. All this end-to-end encrypted technology would just be banned as you suggest.
So what? Conference tools are a replacement for face-to-face meetings. The expectation in such a meeting is privacy.
Further, tapping your phone is unlikely to hijack your computer, steal your credentials and company data.
False equivalences like this are not useful. Security should work. Most folks are unaware of how vulnerable they are, and the potential for abuse is astronomical.
If one looks at an issue that would emotionally charge the population, the protection of children, and then looks back 150 years, you'll find that much of society didn't appear to care. In the US, for a period of time, pets had more legal protection than children. I'll avoid the grimmer details, but people were willing to tolerate things that today would cause mass outrage. It may have taken over a century to change public opinion, but public opinion did change and massively.
So how can we do the same with privacy and protection of basic rights?
That was because nobody really paid attention to your specific calls/emails/text messages and nothing really happened. The government could do it, but most of us don't show up on their radar. We didn't care about them because they didn't care about us.
Nowadays that is changing. Computers are able to listen to everything, match it all up to people, and with a little capitalism and profit, use the information. But don't let on.
Voila, personalized pricing. Hire only the age you like. Don't just rent to anybody. Date only thin people (with poor impulse control). Loan money to desperate people.
> Voila, personalized pricing. Hire only the age you like. Don't just rent to anybody. Date only thin people (with poor impulse control). Loan money to desperate people.
All of this stuff was happening at least 30+ years ago -- and probably much much longer (just I can personally recall it from 30 years ago).
Yes, you neglect the concept of precaution. I draw the parallels to
• wearing face masks,
• neither confirming nor denying when asked whether one did something naughty. (See the Plausible Deniability story arc in HPMOR.)
If one does not do it regularly and only in times of necessity, it stands out as weird, socially not acceptable, drawing attention.
We want encryption everywhere, every time, even when it's not strictly needed, so it is a socially normal thing to do, which in turn fuels wide-spread use, which benefits everyone through strong network effects.
It's encrypted, and there's a long (somewhat silly) explanation of why it's not in a plain ASCII format like everything else. You can't just crack it open and try something if you have a wireless device that's hardware-locked to a country (via its firmware). It's still possible to do here with a bit of effort, but the next one isn't.
You cannot get access to the decryption keys if you haven't rooted your phone. You need to hack your own device to look at what you stored on your SD card. If you have a failing SD card and want to do recovery on it, well, too bad, it's now utterly impossible, even if you own the phone.
These are just two real world actual problems I've run into in the past week alone.
Encryption can be, and often is, used to build an extremely hostile, closed-data system: devices I Own, holding My data, that I don't have access to and cannot see. That's not a better society, just like closed source isn't a better world.
The documentation talks about optionally signing the database with a key. You confuse things. A signature is not encryption.
> Android adoptable storage […] will encrypt without telling the user […] You can not get access to the decryption keys if you haven't rooted your phone. […] If you're having a failing sd card and want to do recovery on it, well too bad, it's now utterly impossible, even if you own the phone.
Works as designed? In this scenario, you're not the recipient of encrypted transmission, you're the adversary.
You should not be upset at encryption as a generic tool. You should be upset at the creators of this system that turns people who paid good money to own a mobile computer into dependent thralls: they cannot exercise the freedom to do with their property what they want. The answer to that injustice is not to abolish encryption, but to abolish the use of non-free software that enables the creation of such systems.
The predictions from my previous post still stand.
Of course it's the implementation that matters and how it's used, that's the whole point. The reality is important. It Has To Be Done Better.
Sorry about the regdb example, that was me being frustrated with it and not looking deeply into it yesterday. I saw the binary format, saw the signing, and just kind of assumed the rest (I've been trying to get a stronger 802.11 signal from these USB Alfa and Panda adapters on the Pi 4 I'm using).
Anyway, it could be a bold protector of personal liberty or another cargo cult ceremony on the checklist and callously shoehorned in inappropriate places.
It could be a defense of privacy that Schneier holds dear or a way for a company to hide from the user what it's recording and tracking through closed encrypted data
It could be Zimmerman's dream or make it so people get locked out of things they own. Locked out of their Bitcoin wallets, hard drives, phones, just about everything.
The "encrypt everywhere" assumes the rosiest of intentions from one group and the most malevolent from another. It assumes perfect technology without failure or fault, perfect human memory, and perfect organization.
That's why I can sit around all day and give you endless real life horror stories. For the bitcoin example, I've personally lost north of a million dollars through mismanagement of wallets (I wasn't monitoring disk failure on a raid array around 2012 and 2 of them finally went). You can look at this list (https://bitinfocharts.com/top-100-richest-bitcoin-addresses....) - there's a number of wallets, hundreds, with histories like this: https://bitinfocharts.com/bitcoin/address/12ib7dApVFvg82TXKy..., last activity 2010. Either this person has the discipline of a soldier or they lost access to it. Hard disk crash, accidental deletion, who knows - normal human failure in an imperfect world.
Human practicality matters. Imagine if a traditional bank kept 243 million locked away forever because you lost a special ID card they sent you in the mail in 2009. This stuff matters. An excuse of "well it's your fault for not being perfectly organized on perfect systems" doesn't cut it.
Encryption and security is too often an unsophisticated hammer that's getting tossed and applied without consideration of the nuances of reality.
It Has To Be Done Better. Not "don't do it". But instead "stop foolishly doing it so naively". Otherwise there will be a large push back and nobody will want it at all.
It seems you are conflating transport encryption with encryption at rest. I cannot imagine a scenario where encrypted end-to-end transit can "bite" you, as there is no recovery scenario apart from snooping network operators/attackers.
E2E encryption can be really resource consuming, especially with the amount of traffic used for video chat. You might not want this just because of a worse user experience.
Video chat is something like 1.5Mbps. Even phones can do chacha20 at >1000Mbps these days. It's not that expensive.
Where people start to feel it is if you want the server in the middle to be decrypting and re-encrypting everything, because it's not that much for one user but it starts to add up for a million users. But the obvious solution there is to use E2EE and then it isn't doing that.
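A rough back-of-the-envelope check, assuming Node's built-in chacha20-poly1305 support (the numbers are machine-dependent; the point is just orders of magnitude):

```typescript
import { createCipheriv, randomBytes } from 'crypto';

// Encrypt 1000 simulated "seconds" of a 1.5 Mbit/s video stream with
// ChaCha20-Poly1305 and see how long it takes.
const key = randomBytes(32);
const oneSecondOfVideo = randomBytes(1_500_000 / 8); // ~187 KB

const start = process.hrtime.bigint();
for (let i = 0; i < 1000; i++) {
  const nonce = randomBytes(12); // fresh nonce per chunk
  const cipher = createCipheriv('chacha20-poly1305', key, nonce, { authTagLength: 16 });
  cipher.update(oneSecondOfVideo);
  cipher.final();
  cipher.getAuthTag();
}
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`1000 stream-seconds encrypted in ${elapsedMs.toFixed(0)} ms`);
```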
I cannot inspect what closed source software is sending to the server, how it is tracking me or what information it has. It is encrypting my data against me so not even I can view how I'm being studied or recorded. This applies to the public writ large, nobody outside the deep annals inside the company knows how the public is being monitored and sold.
I've tried to make various tools and utilities for things like logging, auditing, inspection, etc, and have had to do self signed keys and a number of interesting hacks to work around encryption. Often times it's impossible.
It's thousands of times more likely that the authorized party is spying, snooping and stealing my information to package and resell it in open bidding data markets, and trying to hide that fact and cover their tracks, than that they're trying to be bold protectors of my liberty against roving blackhat ne'er-do-wells. Although they'll run to that defense whenever necessary.
Ever see the amount of data trackers on, say, a local news site? It's astounding, and they are the technically incompetent. Imagine what happens when people with the same wanton disregard for scruples know what they are doing.
I've convinced myself at least that Instagram listens to the microphone and serves ads based on conversations. I have black-box tested this in many ways - and that's the only way I can do it, black-box - that's the problem. I could lay out the tests if you want; none are of course definitive, they can't be, that's the point. It's uninspectable and can't be proven either way because it's encryption that none of us can inspect.
Like any other powerful party trying to hide information and abuse their subjects, they claim to actually be protecting us against the dark treachery that surely lies beyond the forest. Let's all tremble in fear and thank them for their service! Give me a break. It's a sleight of hand: they've taken our digital stuff and protected the process against our own inspection; the protection against an additional third party is merely a side effect.
If I can't turn encryption off and see what's going on OR if I can't inspect my own device then I don't trust the network OR the supposedly "trusted" party with access to it. I don't trust Google, or Facebook with access to my microphone or ad networks running on apps and you shouldn't either. If I can't inspect the contents of the payloads, then I don't know what's happening. Platitudes and assurance be damned, what data is going OTA, that's all that matters.
I understand there's brigading and hivemind against this basic reality, but I'm dedicated to reality, not to popular opinion.
See, this is the one complicating factor for L7 E2E, and I think you make a compelling case; you pretty much need open source/reproducible builds to be able to have any kind of protection against leaking PIIs over encrypted channels.
I think the best one can do today with services like Facebook/Instagram is to use third-party clients, or, in lack of that, the web apps.
IMO DoH/DoT is the main venue where this tradeoff becomes obvious today; privacy with regards to infrastructure providers, or ability for the user to see/control what your software reaches out to?
I've been using Wire [1] for years on desktop in large part because of their E2E claims; I wanted something simple and secure that worked relatively fluidly for video calls. I've had mostly pleasant experiences with it, and with many calls I'm surprised by the video quality. I've never tried their mobile app, but they've got one and it looks nice, aesthetically. Main drawback has really been that few people I know use it and I've had to cajole people into doing so a little bit, which stinks.
But I'm not a cryptographer and am unable to verify the company's security claims. For all I know it's got zero encryption whatsoever. Are there people on HN who feel qualified to comment? Has anyone used Wire before?
> I'm not a cryptographer and am unable to verify the company's security claims. For all I know it's got zero encryption whatsoever. Are there people on HN who feel qualified to comment? Has anyone used Wire before?
That's why they have independent security audits, see the reports near the bottom of https://wire.com/en/security (I know the people that performed the audit). A few things are new since 2018 but nothing conceptually changed, so as far as I know the audits are still quite current.
I work for a security firm, so we speak of unpatched vulnerabilities at our customers on a daily basis, and use Wire as our main communication system for that and other things.
Leaks information about the voice stream. It's not inconceivable that a well trained algorithm could recover sentences from the transmission pattern, although I don't know for sure if it could do quite that well.
If they actually cared about offering a secure product that would be enough all on its own.
Compare IdentitiesOnly over in the SSH discussion, a feature that avoids the relatively smaller leak of potentially allowing an adversary to correlate your identity if you voluntarily connect to their server.
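For reference, that's a one-line ssh_config option (the host name and key path below are just placeholders):

```
# ~/.ssh/config
# Only offer the key meant for this host, instead of trying every loaded identity.
Host git.example.com
    IdentityFile ~/.ssh/id_ed25519_example
    IdentitiesOnly yes
```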
> Main drawback has really been that few people I know use it and I've had to cajole people into doing so a little bit, which stinks.
I find it fascinating how committed everyone now is to Zoom when, at least in my circle, almost nobody had used it before two weeks ago. At that point, everyone installed it as soon as the first meeting came up, and besides the ten minutes of everyone figuring it out it was plain sailing.
It's incredible to me that something everyone did without a thought two weeks ago (installing a new chat application) is now enough of a burden to not bother with. Highlights strikingly the value of being the first mover (or first adopted, as the case may be).
> I find it fascinating how committed everyone now is to Zoom when, at least in my circle, almost nobody had used it before two weeks ago.
Anecdotally, almost every startup/VC/etc. in my circle has been using Zoom for the past 2+ years (Paris and London). So there was probably a seed ready to take root not far from you. I've had the occasional Hangouts, and the rare WebEx or Teams meeting with some larger orgs, but that's it. I haven't even logged in to Skype in that time period.
Even my parents have been able to use zoom seamlessly from their phones. I think it helps that you don't even have to login or have an account if it's always someone else creating the meetings.
Zoom was not the first mover in chat/video apps, far from it. But over the past few years, it has overtaken all the other, older ones for the early adopters in the technology adoption cycle, at least in my anecdotal experience. The last few weeks have accelerated the mass adoption that could have taken much longer or never happened to an instant.
I'm sure, I gathered from the uptake that it wasn't new. My point wasn't about being a first mover (although I compared it to that), but about being the first to be taken up. How people's attitudes towards that initial hurdle of getting a system set up change extremely quickly.
It was easy to convince everyone to install Zoom two weeks ago, now I don't believe I could convince them to install anything else to replace it.
While Zoom may have overtaken others over a couple of years due to being a better system or what have you, that had no impact on my friends and colleagues who hadn't used it or others to come to an informed opinion on the matter.
At my company we've used Hangouts, Bluejeans, and Zoom, each for a few years. We got acquired, so now we're also using Skype and Teams alongside Zoom. Zoom didn't really invent anything groundbreaking; they did the same thing but with better execution, and it's more pleasant to use.
Similarly, we used Hipchat before Slack, and now we have Teams and Slack. Slack serves the same purpose, but it's more pleasant to use than the other two.
Neither Slack nor Zoom were groundbreaking products, and they had significant competition. They both do the same thing as their competitors, and you could argue the difference isn't crazy, but the user experience improvements are enough to justify switching products.
I guess I'm just confirming that I had a similar experience to yours.
Anecdotal, but I've only ever heard from people that Zoom web did not work. Everyone I've asked told me the same: 'no, it did not work, but then I installed it, and it did work'.
Apparently their network effect is so big now, that people use it, despite it not working on first encounter.
I use Zoom on Kubuntu when needed for work, and I always use the web version as I refuse to install the app. It has worked well with Chromium and Brave. Firefox doesn't work well if at all for video chat.
Funny, had to use Zoom recently because of some clients who insisted on using it, and I had to install software (OS X), it didn't work in the browser or I didn't find the switch.
As seen in [0], it just changes the URL. This can be done with the Redirector [1] addon, which is generic and can help with similar problems for other webpages (Twitter -> Nitter; Youtube -> Invidious; www.reddit.com -> old.reddit.com). I wish there was a way to make and see user-made rules.
I've used a few video conferencing packages and Zoom just works, and works well. No quality issues or dropouts, Linux isn't treated as a second-class citizen.
A quick look over the source code shows that they do not use an SFU - as such they will likely be full mesh p2p, and therefore e2e encrypted by default. I don't have the time to verify this, but I understand they are audited for this kind of thing.
The downside to this architecture is that it is not very scalable (i.e. limited participants in a room). There are some ways around this using a lot of signalling between peers to throttle feeds, but you will still always be "uploading" a stream per peer.
I'm not sure how much relaying Skype supernodes did, but as Skype wasn't WebRTC-based they would have more control over stream encryption - allowing routing of encrypted streams for true e2e encryption.
For small WebRTC conferences, treating certain peers as an SFU can certainly work. However, bandwidth requirements would be substantially higher than full-mesh for the supernodes, and equivalent to SFU peers for non-supernodes - so supernodes would need to be chosen wisely (and you would still be very limited on number of participants - then you are entering the realms of peering supernodes and intelligent routing of streams). Additionally, this still wouldn't be e2e encrypted. It's just a (current) limitation of WebRTC.
A couple months ago I tried Wire and it commonly delivered notifications several hours late. I had the same issue on Signal. With such a limitation it was useless for me.
No problem on Messenger, Hangouts, Whatsapp and Zello.
Now I would like to have a private communication via Zerotier channel, but only with my wife and close family.
EDIT: I'm not here to discourage. I should have clarified, that I wonder what could be wrong. It was a problem on Android between me and my wife's phones.
I don't have that issue on Android, iOS, or in Chromium on Linux, and the people I chat with on Wire don't have that either. I wouldn't discourage others from trying what I think is the most secure, open and featureful platform currently out there.
(Matrix and Telegram don't have encryption on by default, nor do they do video calling; WhatsApp isn't open in any way whatsoever; Signal doesn't do video calls afaik, requires a phone number, and still wants me to unfirewall Google Services on Android... those are the main contenders I know of, and Wire has advantages over them all, sometimes few and sometimes many, at least in terms of security. My main issue is just that Wire is sluggish to the point where I don't expect many people will want to use it.)
After people kept bugging them (Moxie initially told me to get lost when I first opened a ticket about it), they implemented some form of fallback for GCM, and I happily tried it but it doesn't work for me. My guess is that it uses Google services when they're installed, and I didn't uninstall them because that would break a lot of other apps, but I did firewall it off. So this means that it doesn't work without Google Services on my phone and that it leaks some metadata to Google for almost everyone. I wouldn't say it's false to say that it still wants me to unfirewall GOOGLE. The apk I can get through Aurora store, that's not necessarily the issue (though the alternative distribution method of the official website is definitely a plus!).
Sorry about not knowing that they have video calls now, that's nice to hear. Does it also do group calls like Wire? The article doesn't say and while I'd love to try...
Honestly, for most use cases e2e video does not really seem necessary. I am a big privacy advocate, but I'd recommend FOSS like Jitsi anytime over a private tool like Wire.
Wire is open source. It'd be a pain to self-host, but it's doable. And at the very least, you can verify that your client is indeed using E2E encryption and therefore trusting the server isn't necessary.
Can someone with more expertise comment on these two? (I understand that because WhatsApp isn't F/LOSS, there's a lot that we can't know about how it works).
Riot (Matrix to be precise) only supports E2E encryption for one to one calls. Group calls are handled by a Jitsi backend and are unavailable in E2E encrypted conversations with more than two members.
It's very simple, the vast majority of popular chat applications out there don't have open source backends. Which means we basically have to trust their owners to do the right things and hope for the best.
The business model of choice in this space is to (sometimes) have partially open source clients but almost never use OSS in the backend. There's a lot of secret sauce and magic that these companies use to differentiate from each other and most of that is proprietary. This provides these companies with a control point to lock in their users to their network.
For the same reason attempts at standardization of protocols and federating chats and calls between networks have largely stalled/failed. XMPP is technically still around; it's just that none of the popular solutions in this space use it.
I'd say Signal is the positive exception in this space where both server and client are OSS and the client UX is pretty decent. Telegram talks a lot about security as well but their OSS seems limited to mostly client side. Of course a lot of the crypto is client side so depending on how paranoid you are that may or may not be good enough. Either way there probably is a lot of server side stuff that is relevant to end to end security that is not being scrutinized outside Telegram.
For both, even if the client is OSS, what goes into the app stores may still include stuff not accounted for in the source code. Auditing the source code and the binaries are two things. And then there's the runtime environment to consider, which is a OS that is probably proprietary that includes lots of stuff that is a bit icky from a security point of view.
So, there's multiple levels of "trust us, we know what we are doing" that you'd have to buy into in order to feel secure. IMHO that has been a problem for a while but most people/companies seem to be indifferent when it comes to security and happily pay through their nose for 100% proprietary security snake oil peddled by the sales people of e.g. MS, Zoom, Slack, etc.
> the vast majority of popular chat applications out there don't have open source backends. Which means we basically have to trust their owners to do the right things and hope for the best.
Err, if that were the case, then why do we bother with encryption at all? End to end encryption means you don't need to trust the owners, at least for message contents. For metadata, yeah, that's why I choose not to use a known-bad company like Facebook (i.e. WhatsApp) and would rather use Signal or Wire and hope for the best, but that's only metadata. Doesn't mean I trust Wire with the contents of my communication, at least when verifying people's keys.
Assuming you use stuff like this, that would be a good question to ask yourself. IMHO it probably helps a little bit keeping some people out, but I have few illusions that the likes of the NSA, Russian, Chinese, North Korean, and other intelligence agencies don't know of dozens of ways to listen in if they choose to, with varying levels of ease/convenience. As I like to point out, assuming it's only your friendly local security agencies listening in would be a misguided assumption.
In any case, I use zoom, google meets, slack, skype, facebook messenger, whatsapp, probably a few more things regularly both privately and for work. I'd prefer using Signal more but the people that reach out to call me and the people that use Signal are basically a Venn diagram consisting of two separate circles. I've never actually done a call via Signal. I'm assuming it actually has this feature, but I'm not even sure it does ;-)
Signal supports one to one calls. Basically it's a secure alternative to the way you'd normally use your phone. As you see in this thread, multi-party secure video conferencing is hard and Signal's preference is just not to do things until they can see how to do them securely and then implement that.
Hence not having "kick member from group" for ages after adding groups. Most alternatives just have the server know who is in the group and then the server can manage it, but now the server owners can know who is in which groups and secretly join and leave any group - so Signal had to invent a whole bunch of new techniques.
I started to use it as an alternative to WhatsApp calls for their superior call quality. Many people already have it without knowing about it. AFAIK that's due to being preinstalled on newer devices or being available as an instant app(?) That does seem to imply that you don't have to actively install it for it to work? ... but I'm not sure.
My understanding of how these E2E products often do their (video) call encryption is not that they use a clever communication protocol like the one used in Signal to protect each frame/packet (which would likely introduce huge amounts of delay - not a good user experience!). Instead during call initiation, they do the key negotiation/sharing over the same messaging protocol they use to transmit their normal messages; i.e. effectively they generate the call encryption key on the device, send it as a E2E encrypted message to the call recipient(s), and use the standard call encryption methods (probably based on SRTP?) to actually protect the call.
Thus, assuming the call encryption methods work correctly (which I have no immediate reason to believe they don't), for an adversary to obtain the key (and decrypt the call), they would need to break the E2E encryption used to transmit the messages; therefore the call is E2E encrypted.
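A minimal sketch of that flow (not any particular app's real protocol; `sendOverE2EMessaging` is a stand-in for the existing encrypted messaging channel, and AES-256-GCM stands in for whatever SRTP profile is actually negotiated):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from 'crypto';

// Stand-in for the app's existing E2E messaging channel (already encrypted end to end).
const sendOverE2EMessaging = (to: string, payload: object) => {
  /* hand payload to the normal message pipeline */
};

// 1. Caller generates a fresh media key and ships it to the callee as an ordinary E2E message.
const mediaKey = randomBytes(32);
sendOverE2EMessaging('callee', { type: 'call-offer', key: mediaKey.toString('base64') });

// 2. Both ends then protect each media frame with that key for the duration of the call.
function sealFrame(frame: Buffer, seq: number): Buffer {
  const nonce = Buffer.alloc(12);
  nonce.writeUIntBE(seq, 6, 6); // unique per-frame nonce derived from the sequence number
  const cipher = createCipheriv('aes-256-gcm', mediaKey, nonce);
  return Buffer.concat([nonce, cipher.update(frame), cipher.final(), cipher.getAuthTag()]);
}

function openFrame(sealed: Buffer): Buffer {
  const nonce = sealed.subarray(0, 12);
  const tag = sealed.subarray(sealed.length - 16);
  const body = sealed.subarray(12, sealed.length - 16);
  const decipher = createDecipheriv('aes-256-gcm', mediaKey, nonce);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(body), decipher.final()]);
}

// The only way to recover mediaKey is to break the messaging channel it was sent over,
// which is the sense in which the call itself ends up end-to-end encrypted.
console.log(openFrame(sealFrame(Buffer.from('frame data'), 1)).toString()); // "frame data"
```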
Wire's video calls are also limited to 4 people. I think it's a client-side limit, but with 4 our laptops/phones already struggle so it seems to be a sensible limit.
Audio calls can be with more people, at least six but I don't know how many.
I guess such are the limitations of encryption currently, at least when one wants to have browser support instead of native code and wants to support mobile systems.
WhatsApp is owned by a company in the business of collecting metadata so I don't trust them as far as I can throw them. Even if the call is E2E, your address book is uploaded directly to their server.
Not an Apple die-hard (have never owned one of their computers) but their privacy efforts are, for me, increasingly a reason to either own or use their products.
I think privacy will be the backbone of Tim Cook's legacy.
A lot of people here probably won't believe this, but Cisco Webex offers end to end encryption, but not by default. If you use it, you lose a bunch of features. But it is available, and Webex is immensely popular in the corporate world.
WebRTC supports e2e encryption between peers. Full mesh p2p videoconferencing is very inefficient for large conferences, so an SFU (Selective Forwarding Unit) is used - a server acting as a peer that redistributes the streams to call parties. This SFU is basically a middleman - it decrypts the stream and re-encrypts it as it forwards - meaning that there is no e2e encryption. This is a current limitation of WebRTC media streams. There is discussion underway to fix this with "Insertable Streams", allowing you to transform streams (e.g. insert an additional encryption layer).
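A sketch of what that could look like with Insertable Streams (experimental and behind a flag in Chrome at the time of writing, so `createEncodedStreams` may not exist in your browser, and the XOR "cipher" below is a placeholder, not real cryptography):

```typescript
// Add an extra end-to-end layer on top of a WebRTC sender, so the SFU only ever
// forwards payloads it cannot read after terminating DTLS-SRTP.
async function addE2eLayer(pc: RTCPeerConnection) {
  const media = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  for (const track of media.getTracks()) {
    const sender = pc.addTrack(track, media);
    // Experimental API, not in the standard TypeScript DOM typings, hence the cast.
    const { readable, writable } = (sender as any).createEncodedStreams();
    const transform = new TransformStream({
      transform(encodedFrame: any, controller) {
        encodedFrame.data = e2eTransformPayload(encodedFrame.data);
        controller.enqueue(encodedFrame);
      },
    });
    readable.pipeThrough(transform).pipeTo(writable);
  }
}

// Placeholder so the sketch is self-contained; a real implementation would use an AEAD
// keyed by material exchanged directly between participants, never a fixed XOR.
function e2eTransformPayload(data: ArrayBuffer): ArrayBuffer {
  const bytes = new Uint8Array(data.slice(0));
  for (let i = 0; i < bytes.length; i++) bytes[i] ^= 0x5a;
  return bytes.buffer;
}
```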
MY THOUGHTS:
Zoom already bypasses the usual WebRTC media streams - instead they use wasm ffmpeg to record from a canvas and send that data over a data channel to their SFU. It seems to me that they could instead encrypt this data before sending it over the stream, and distribute keys directly between peers via a full mesh - this would give them full e2e.
As an aside, the ability to implement something like this is SEVERELY hampered by Apple's decision not to support webm. I simply do not understand why they refuse to do so. If they supported webm (and the MediaRecorder API) then there would be no need to perform unaccelerated wasm-based patent-encumbered video encoding in the browser in order to provide e2e encrypted media streams via SFU.
DISCLAIMER:
I don't work for zoom, but I do work on a product that handles e2e webrtc via SFU using similar mechanisms to that I explained above.
I know webrtc works p2p when possible and does some form of encryption, but I didn't know it claims e2e encryption. Is this for real, like, any standard browser implementation can generate keys for e2ee and display them to the user for verification?
I don't believe there is a way to retrieve the keys via javascript, but you can get a fingerprint from the SessionDescription. These fingerprints are used to validate the certificates during the DTLS handshake, so displaying those should be sufficient for manual verification.
Note that there may be multiple independently encrypted streams per peer, so you may have to display a few. It's not quite as simple as a single fingerprint, usually.
I think this is probably an artifact of deriving the SRTP key from the DTLS key. I would much prefer the option to manage SRTP encryption keys independently, which would simplify this somewhat.
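For example, pulling them out of the local description is just string filtering (a sketch; you'd do the same with the remote description and compare both out-of-band):

```typescript
// Collect every a=fingerprint line from a peer connection's SDP so it can be
// shown to users for manual verification. There may be one per m-section.
function fingerprintsFromSdp(sdp: string): string[] {
  return sdp
    .split(/\r?\n/)
    .filter(line => line.startsWith('a=fingerprint:')) // e.g. "a=fingerprint:sha-256 AB:CD:..."
    .map(line => line.slice('a=fingerprint:'.length).trim());
}

const pc = new RTCPeerConnection();
// ...after setLocalDescription()...
console.log(fingerprintsFromSdp(pc.localDescription?.sdp ?? ''));
```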
Yes, webrtc when done in p2p/mesh, is end-to-end encrypted, even when using TURN servers in between (these are relay servers when direct p2p is not possible).
Unfortunately mesh doesn't work out well when it's a conference call (many callers), so MCUs, SFUs, etc come to play.
HTML canvas, yes. They use a webgl context (but you can also do similar with a 2d context) containing a video texture (from the mediastream - video/webcam/screencap/etc).
edit: apologies, here they read the pixel data from the video element, but the principle is the same. One reason for using a canvas intermediary (or even multiple) is to provide multiple sizes, or perform some processing, such as background removal, for example.
That google duo doesn't support multiparty conferences in the browser? You can try starting one if you like. They only support 1-to-1 calls in the browser.
It would be a pretty exhaustive list if they stated everything they don't do... As such, I cant give you a link for that.
Jitsi Meet works pretty well (though I haven't used it for really big meetings). But in the context of this article, it suffers from the same issues it outlines because it's based on WebRTC.
I have used Jitsi a couple of times but it has been a bit buggy. Not sure if that is because I used the meet.jit.si-server instead of setting up my own, it might be under heavy load.
I have had more success with Nextcloud Talk[0], which we have started testing on the internal network in my company. It even has decent mobile apps and webinar functionality. It's also a bit easier on the CPU compared to Jitsi.
I've tried setting Jitsi up and didn't find it was very easy (though I also didn't spend that much time on it), but I used it a few times with a friend's setup and as a client it worked alright.
At my company we use a self-hosted Nextcloud instance with the Talk plug-in. It has chat, video, audio and screen-sharing. It may not be E2E encrypted, but the data never leaves our servers. It also makes a better impression when you share a <own_company>.eu link instead of a .webex.com or .zoom.us link.
I couldn't quite understand what he meant about WebRTC. Is it encrypted, but not super encrypted, or is it such junky encryption that a script kiddie can decrypt it?
Just launched a product that uses WebRTC, and we have a marketing blurb that says that our P2P WebRTC is encrypted. Is that wrong? Twilio says it is.
It is encrypted, yes. The discussion is over end-to-end encryption - i.e. is there a middle server that has access to the unencrypted streams, or do peers communicate directly with each other. This depends on your implementation.
Does e2e work with a large group of people video conferencing?
I thought encryption might get pretty expensive if streaming 1080p directly between peers. For a team of 30 people it eats bandwidth quickly. If there's some kind of server (or supernode) transcoding, does transcoding work with e2e?
Yes, you can have e2e encryption videoconferencing with large groups of people - just not using webrtc (unless you do some crazy hacks - mentioned in my other comment).
By definition, a transcoding server or supernode (acting as an MCU - Multipoint Conferencing Unit) would be a MITM, so that wouldn't be e2e encrypted.
A simple way with WebRTC for a small conference would be to transcode on the origin and provide multiple streams to peers - one for thumbnails, one for higher quality, low-bandwidth fallbacks, etc. This doesn't scale well, but it works nicely for small conferences.
HTTPS is point-to-point encryption. It's good for client <-> server communication.
If you need to send data to another client through the server, then HTTPS won't suffice. The server decrypts the data and then sends it to the other client via HTTPS.
So if there's more than two points (client <-> server <-> client) you'll need end-to-end encryption.
Yeah, that'd be pretty tricky. End-to-end encryption is really easy to implement if you're sending data to only one user. It gets really hard and really slow once you start adding more users.
A simple way would be to encrypt your message with a symmetric key. Then encrypt this symmetric key with your private key, and create copies of it for each recipient, encrypting each with their public key. So each recipient can decrypt the symmetric key and use that to decrypt the message.
You can probably guess that this gets really messy as you start adding more users. And you need to know the difference between symmetric key encryption and public key encryption.
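A toy version of the wrapping pattern with Node's crypto module (sketch only: here the symmetric key is wrapped with each recipient's public key, and signing is left out):

```typescript
import {
  createCipheriv, createDecipheriv, generateKeyPairSync,
  privateDecrypt, publicEncrypt, randomBytes,
} from 'crypto';

// Recipients' RSA key pairs, generated here only to keep the example self-contained.
const recipients = ['alice', 'bob'].map(name => ({
  name,
  ...generateKeyPairSync('rsa', { modulusLength: 2048 }),
}));

// 1. Encrypt the message once with a random symmetric key.
const messageKey = randomBytes(32);
const nonce = randomBytes(12);
const cipher = createCipheriv('aes-256-gcm', messageKey, nonce);
const ciphertext = Buffer.concat([cipher.update('meet at noon', 'utf8'), cipher.final()]);
const tag = cipher.getAuthTag();

// 2. Wrap that symmetric key separately for every recipient using their public key.
const wrappedKeys = recipients.map(r => ({
  name: r.name,
  wrapped: publicEncrypt(r.publicKey, messageKey),
}));

// A recipient unwraps their copy of the key, then decrypts the single shared ciphertext.
const alice = recipients[0];
const myWrapped = wrappedKeys.find(w => w.name === alice.name)!.wrapped;
const recoveredKey = privateDecrypt(alice.privateKey, myWrapped);
const decipher = createDecipheriv('aes-256-gcm', recoveredKey, nonce);
decipher.setAuthTag(tag);
console.log(Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8'));
// -> "meet at noon"
```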
It's probably better to look at a peer-to-peer solution, or find a more efficient protocol. This is a pretty enlightening StackOverflow post on the subject:
Most public/private schemes use symmetric encryption for the data anyway, but they use a key derivation scheme to share the key. You can't use public/private encryption for data sizes greater than the key size. Obviously you can sign/verify any data size.
Also, above you said "encrypt this symmetric key with your private key", did you mean sign this symmetric key?
I'll be upfront, I'm a little biased... my dad wrote ZRTP and helped design the encryption on the Silent Phone app. There are a number of reasons I trust it. One is that the user is involved in authenticating the call. Most other encryption apps bury that authentication in the settings, allowing you to not worry your pretty little head about it. I trust both the good intentions behind the design as well as the implementation.