I don't understand how they will get the chats except from individual phones. The conversation is encrypted between the Telegram app and their servers, and the servers aren't available to Belarus' government officials, so how are they going to get the messages? Obviously they can if they're monitoring public group chats, because all they have to do is join, but person-to-person or private groups, how are they going to get to those? Confiscate everyone's phones?
They are not going to get them, and the people saying otherwise don't use Telegram, and/or do not know what they are arguing over; it's pedantry. Telegram works well. Anyone can start a private chat encrypted end-to-end; those messages only stay on the devices, and you can set them to auto-delete from BOTH users' devices after 3, 5, 10, or 30 seconds, 1 minute, 1 hour, or 1 day. Nobody at all is going to get those messages. Go ahead and wireshark your connection and start using Telegram.
This isn't good advice for trying to show someone their messages aren't being sent in clear text. It could be encrypted using a weak cipher or have other implementation bugs that make it trivial for a nation state to decrypt it. You need to be able to look at the application's source code to tell what encryption it's using and if it's secure enough.
Wireshark would show that group messages in Telegram are indeed encrypted. However, they are encrypted only in transit, meaning the server will see 100% of group chats. This cannot be detected with Wireshark at all, hence the advice to use the analyzer is useless and downright dangerous. It's like using a radiation detector to find cancer.
> Anyone can start a private chat encrypted end-to-end; those messages only stay on the devices, and you can set them to auto-delete from BOTH users' devices after 3, 5, 10, or 30 seconds, 1 minute, 1 hour, or 1 day.
Sure they can encrypt E2E but they don't.
Most chats on Telegram are unencrypted. Huge channels like the one mentioned above are full of agents and nothing prevents them from taking screenshots.
I recently found this out. You need to specifically create a "secret chat" in order for Telegram to actually use E2E. I originally chose Telegram because I wanted a messaging app that respects my privacy, and E2E by default seems like an obvious requirement in that case.
Maybe I'm stupid for not realizing this or looking it up before choosing Telegram. Luckily there are a lot of other apps out there in the same space that do this better. I have switched to Signal now and really like it! Privacy should never be an opt-in feature like it is in Telegram!
"people saying otherwise don't use Telegram, and/or do not know what they are arguing over"
So that would include world famous professional cryptographers like Bruce Schneier and Matt Green. Meanwhile the people recommending Telegram are random usernames on internet forums. It sounds like you're the only one who doesn't know what they're talking about.
"Anyone can start a private chat encrypted end-to-end"
This is such a shill talking point. Telegram doesn't even support E2EE for groups. Every dissident group leaks 100% of its chats to the server, with no possibility to opt out.
Oh yeah that feature which conveniently deletes your "Down with the <dictator name goes here>" group and its member list every 60 minutes and forces users to create a new one.
"I don't understand how they will get the chats except from individual phones."
~~By default~~ Telegram's group chats used to organize protests aren't E2EE. If the Belarusian government hacks a Telegram server, it can trivially read every dissident group's chat history.
I have been frustrated with Signal on this. I pitched bidirectional deletion as a good idea because of a scenario:
> You're protesting with people. Cops pick them up, but not you. You can delete their messages and it is likely that you are able to do so before the police can clone your phone or copy the messages (screenshot, whatever).
I got a few strange responses back:
- Deleting messages doesn't mean they can't be saved (yeah... this is probabilistic privacy, not guaranteed)
- My device, my data (okay?)
- Some people run custom apps that save everything (how does that apply here? Funny enough, a sibling comment said something similar)
- Just own up to your typos (-_____-)
- Don't use Signal for communication then because you can't be guaranteed privacy (great, I'll use smoke signals with my friends to organize)
To be fair to Signal, the devs did not get into the forums. To also be fair, Signal is taking the same position and is only going to allow deletion for an hour after a message was sent.
As much as I love Signal (it is my preferred messaging app), I think they are not in touch with the needs of people. We should look at why people are turning to Telegram when protesting. What can we do to better preserve the privacy of people protesting in HK, Belarus, America, etc.? Everything is probabilistic security and privacy when it comes down to it. But what tools would help these people the most? I would argue that bidirectional deletion, to reduce the chance of self-incrimination, is one of them. The others are group messaging, channels, and anonymous messages (so your phone number isn't visible in channels).
Emojis are nice and fun for day-to-day use, but it is getting more and more important to push these other features (yes, I know they are extremely difficult to do while actually preserving privacy to the standard Signal currently does; I think many would be fine if it was an incremental increase in privacy with these newer features).
I don't know why you're getting down-voted. It's an accurate statement that many loyal Signal users can attest to. Signal has been my primary messaging app for years now, but that's my main issue with them, besides group MMS issues still being problematic after all these years. Their slow response or lack of care was especially apparent after the huge outcry over constant nag notifications for verifying the PIN, setting a profile name, and asking contacts to join Signal. It's like they don't understand how badly they need better adoption for Signal to be effective. If 90% or more of my contacts don't use Signal, then what good is it? They need to start listening to their users better.
I do think that things like emojis and the (now fixed) link previews do help with adoption. But I think there is another and more compelling adoption method given the current state of the world: privacy and security. The reason people are turning to telegram is because they think it is secure. Signal will never gain mass adoption without good groups. And honestly, they probably need channels too. If it had both those things then all these protestors would turn towards Signal. After all, isn't that why they get funding from the US government? To "enable" democracy in other countries?
I was pitching the following idea to a friend of mine yesterday:
- the UI should hide e2e/“reallyprivate” conversations by default
- as in "not visible anywhere" (edit: unless the app is in the foreground and you are chatting of course)
Unless you:
- do the “add a new user/conversation"
- then instead of adding mail/GUID/phone you add a whatevercanberemembered number/emoji/sentence that unlocks the private conversation you initiated long before
There should be no trace in the UI that private conversations are going on.
What does HN think? Why hasn't it been done before?
Edit: there could even be notifications disguised as another app (news subscriptions, medical reminders, battery low, etc.)
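Rough sketch of what I mean, in code (purely hypothetical naming and storage, not how any real messenger does it): the remembered phrase is run through a KDF and yields both an opaque lookup tag and the key, so the UI never has to list the conversation at all.

```python
# Hypothetical sketch: a remembered phrase both locates and decrypts a hidden
# conversation, so nothing in the UI has to list it. Not how any existing app works.
import json, base64, hashlib
from cryptography.fernet import Fernet  # pip install cryptography

KDF_ITERATIONS = 600_000  # slow the derivation down against guessing

def derive(phrase: str, app_salt: bytes) -> tuple[str, Fernet]:
    """Turn the remembered phrase into an opaque lookup tag plus a decryption key."""
    material = hashlib.pbkdf2_hmac("sha256", phrase.encode(), app_salt,
                                   KDF_ITERATIONS, dklen=64)
    tag = material[:32].hex()
    key = base64.urlsafe_b64encode(material[32:])
    return tag, Fernet(key)

def hide_conversation(store: dict, app_salt: bytes, phrase: str, messages: list) -> None:
    tag, f = derive(phrase, app_salt)
    store[tag] = f.encrypt(json.dumps(messages).encode())  # opaque blob under opaque tag

def unlock_conversation(store: dict, app_salt: bytes, phrase: str):
    tag, f = derive(phrase, app_salt)
    blob = store.get(tag)  # wrong phrase -> wrong tag -> nothing found, no error, no trace
    return json.loads(f.decrypt(blob)) if blob else None
```

A wrong phrase just derives a tag that maps to nothing, so there is no error message distinguishing "no hidden chats exist" from "wrong phrase".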
This doesn't give you plausible deniability if law enforcement gets their hands on your unlocked phone, as they can see that the file size of the encrypted message logs doesn't match the visible content. If the phone is jailbroken and the key for the message logs is leaked, it doesn't help at all.
Wouldn't the solution here just be to allocate a larger disk space and encrypt that? Then when the space is filled up you expand again? I've seen this done before.
Rather I'd change the GP's solution to having a secret vault in an already encrypted chat system (so you can do the above), essentially making it two layers. Just the second layer isn't a button that says "look at me, I'm where all the real secret shit is."
I agree that security through obscurity isn't a winning solution, but it is part of the toolkit. It would just be dumb to rely on your security solely being obscurity. Encrypted steganography is still a powerful tool, hackers obscure code, and real spies use obscurity all the time. It just isn't the dominant factor.
> This doesn't give you plausible deniability if law enforcement gets their hands on your unlocked phone, as they can see that the file size of the encrypted message logs doesn't match the visible content.
Hmmm. What about from the get-go saying that the app allocates 100Mbytes of space and fills it randomly at regular time until some encrypted content is generated. That'd put a 100Mbytes log/message limit to conversations but that'd be by design and nobody could be sure those bytes are random or genuine messages.
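Something like this, as an illustrative sketch (made-up sizes, and ignoring everything real deniable storage has to worry about, like journaling, backups, and flash wear):

```python
# Sketch: a fixed-size container pre-filled with random bytes that never grows,
# so its on-disk size reveals nothing about how much real content it holds.
import os

CONTAINER_SIZE = 100 * 1024 * 1024  # 100 MB, allocated up front

def create_container(path: str) -> None:
    with open(path, "wb") as f:
        f.write(os.urandom(CONTAINER_SIZE))  # random fill, indistinguishable from ciphertext

def write_record(path: str, offset: int, ciphertext: bytes) -> None:
    if offset + len(ciphertext) > CONTAINER_SIZE:
        raise ValueError("container full: that limit is by design")
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(ciphertext)  # overwrite padding in place; file size never changes
```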
> If the phone is jailbroken and the key for the message logs is leaked, it doesn't help at all.
Why would the key get leaked if it's never stored ?
> We should look at why people are turning to Telegram when protesting.
Every single person I know who uses Telegram does it for either porn or piracy or both. So using what you already have for protests, if they occur, makes sense. Trying to get people to install a different app is much more complicated at this point, and sometimes it may even be prevented by the regime.
See, the lack of bidirectional deletion is one of the reasons I prefer Signal. Nobody other than me should have the ability to delete data on my device.
I disagree. I see my phone as an extension of my brain. If I have an in-person conversation, the other party can't force me to forget the conversation, and they shouldn't have that ability for my phone either.
What if only the other party's messages were deleted?
In Telegram it is understood that 'secret chats' constitute confidentiality. As such, both parties, I believe, ought to be able to delete everything.
I kind of see your point about non-secret chats.
But then we are back to an opt-in model for privacy.
Personally: what I tell you at the coffee machine, in confidence or not, is ephemeral. I would probably not talk to you at all if you were tape-recording all conversations, as you want to do with messages... so I think both parties should be able to delete text conversations. And privacy should be on by default.
> I would probably not talk to you at all if you were tape-recording all conversations
You hit the nail on the head with this one. To me deletion is a nice compromise and why the coffee shop analogy isn't a good comparator. Similarly we don't record video calls (and Moxie himself doesn't like this). So why should every text be recorded and parties do not have control over that data? I do feel that each person in the conversation has a right to control that data (if anything the sender more so) and when policy fails it should fail in the direction that has more privacy (which is the message not existing within Signal's log^). But currently people aren't given this choice and there is no consideration of failure modes.
^ Careful wording because if I don't make this added comment people think I'm unaware that screenshots exist.
I don't really make that distinction. I think it's harmful to have E2E as optional, and only use platforms that have either mandatory E2E encryption (Signal, WhatsApp) or no E2E encryption (SMS, email).
If you have an in-person conversation with me in confidence, that doesn't grant you any additional powers to make me forget details of the conversation.
> Personally: what I tell you at the coffee machine, in confidence or not, is ephemeral. I would probably not talk to you at all if you were tape-recording all conversations, as you want to do with messages...
What if I have a very good memory, and follow conversations by writing up their details in personal memos that you can't delete? (e.g. Comey's contemporary memos of conversations he had with Trump.)
> so I think both parties should be able to delete text conversations. And privacy should be on by default.
The problem for you is that I'm not going to agree to that - if you won't use Signal, I'm going to force a downgrade to SMS or email, and then you get even worse security and privacy.
If you want to have a conversation that can't be recorded in an automated way, you basically need to meet in a sauna.
> If you won't use Signal, I'm going to force a downgrade to SMS or email, and then you get even worse security and privacy.
Or we will set up e2e encrypted telegram. Or not talk.
> What if I have a very good memory, and follow conversations by writing up their details
You saying that you remember I said something, even took a screenshot vs you can prove I said something, is a big difference.
If I am doing a snowden, I might go to a sauna. If I am planning to overthrow my boss, I think e2e telegram is okay. Because I can delete the conversation it might even be preferable to signal.
Sorry, I just can't agree with your take. You're fundamentally trying to use technology to restrict rather than enable use cases, and doing so in ways that aren't actually robust to your use cases and threat models.
I'm not sure what this has to do with anything. Sure, maybe this doesn't help you in a channel, but one on one? Or small groups? Most people don't run custom apps and you're probably going to know if your friends do. The biggest use I see of bidirectional deletion is if you see your friends be picked up by a nefarious actor and you can delete the messages. This reduces the chance of self incrimination because you can probably delete the messages before the phone is cloned or the messages are saved in some way.
It has everything to do with this when the conversation is if the government is going to use your messages in the unencrypted channel to come after you. If your friends are picked up by a nefarious actor, you would have to know that they were–and also, you'd have to ensure that Telegram isn't keeping some sort of deletion log.
>> Sure, maybe this doesn't help you in a channel, but one on one? Or small groups? Most people don't run custom apps
The bidirectional part is helpful in the non-public channel context.
How does this help? Why does this matter? Well you can keep a public face and a private face. Private channels, group chats with friends, or one on one messages you can be more open and use this tool. But this is normal. Everyone shows a different face in public than what they show to friends (offline!).
> If your friends are picked up by a nefarious actor, you would have to know that they were
Sure. But they're your friends. I don't know how you interact with your friends, but usually when I'm out with them I'm physically near them and know what they are doing. Chances are pretty high I'd know within a few hours if they got arrested/abducted.
> you'd have to ensure that Telegram isn't keeping some sort of deletion log.
This is a different issue and FWIW that's why I don't personally use Telegram. There's no verification so no trust. But that doesn't mean that the deletion tool can be useful in certain contexts if the implementation is correct. No reason to throw the baby out with the bath water. It is about the probability of reducing self incrimination, not guaranteeing.
I certainly do not know what my friends are doing 24/7, perhaps not even within a day or two. And that's plenty of time for law enforcement to install a third-party client on their phone, or just read the messages. I agree that having it is better than not having it, but I would not put too much faith in it being useful against law enforcement. Perhaps retracting a mistakenly sent message, but not much more than that.
I feel like you're being needlessly dense. The threat scenario is being at a protest with your friends, not some midnight abduction. And all I'm arguing is that it is better to have it than not have it because there's a __chance__. When it comes down to it, every aspect of security and privacy is probabilistic. Security walls aren't impenetrable, just unlikely to be penetrated in a given time-frame. If it doesn't reduce the floor on security or privacy but increases the probabilistic upper bound, why not? So my complaint to Signal is: why shoot yourself in the foot by limiting this to 1 hour? (24 if you run a custom app)
There's a reason big companies/government employers want root access to your phone and will wipe data if it is lost or stolen: because it reduces the chance that company/state secrets leak. No one thinks it is a guarantee. But if given the choice of "revealing a secret" vs "rolling a die to see if I reveal a secret or not", I'm going with the latter no matter the odds.
So disappointing that true anonymous communication is technologically feasible but is only unavailable due to government intervention and public apathy.
The main issue is that any form of anonymous communication gets instantly abused for things that very few people are OK with. It's a classic Catch-22, and a very well-known one at that.
That can happen anywhere and not just with Telegram - imagine what a repressive government can do with a dump of GMail. Iran's a much bigger country with a regime much more capable and willing to use violence.
Imagine how hard it must be to run a presidential campaign in the US when your incumbent opponent in an election controls the systems that get to read any message in GMail.
Explain how this works. A blue party voter writes to another blue party voter: Hey, let's vote blue this year. NSA that intercepts the message and __________.
The thing being suggested here is that the campaign of the incumbent is reading all the communications of the campaign of the challenger. I don't think anything of the sort actually happens or is really that easy to (completely secretly!) make happen, but that's the proposed scenario.
'Congressional investigators determined that "targeting of US political figures would not occur by accident, but was designed into the system from the start."'
So yeah that might still be going on. Signal etc. make it harder but it's not like the NSA isn't hacking endpoints so hard to say if it's actually secure. We can only hope the next Snowden will let us know if NSA's spying on the opposing political party.
I don't mean that, I'm just pretty sure the person you are replying to meant that. As to the other stuff, no, even if your security services are collecting this sort of thing, by design or not, it doesn't mean it's in your daily briefing, let alone available to your campaign. If it was, Nixon wouldn't have needed to hire a bunch of incompetent cosplayers to be 'Plumbers'.
My kids tested sending https://kamalaharris.info and https://joebiden.info to each other on Instagram, in private messages. The sender would see that the message was successfully sent, but it would never arrive.
Another case is that the person who ran the primary campaign for Kamala Harris now works at Twitter, where he blocked an opponent's campaign account.
By default it's no more encrypted than HN (as in, traffic to their servers uses TLS, messages on the server are not encrypted at all).
There's Secret Chats feature which they claim to be end-to-end encrypted, meaning that it's no more secure than Facebook's Messenger (also end-to-end encrypted in Secret Conversations). Even less so considering that they roll their own encryption (MTProto), while Facebook's Messenger uses Signal's protocol.
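To make that distinction concrete, here's a toy sketch with symmetric keys only (nothing here resembles MTProto or the Signal protocol; it just shows who can decrypt what):

```python
# Sketch of the distinction being discussed: transport/cloud encryption vs end-to-end.
# Toy symmetric keys only; nothing here resembles MTProto or the Signal protocol.
from cryptography.fernet import Fernet  # pip install cryptography

transport_key = Fernet.generate_key()   # shared between client and server (TLS-like)
e2e_key = Fernet.generate_key()         # shared only between the two chat endpoints

msg = b"meet at the square at 18:00"

# "Cloud chat": the server terminates the encryption, so it can read everything.
on_the_wire = Fernet(transport_key).encrypt(msg)
server_sees = Fernet(transport_key).decrypt(on_the_wire)
assert server_sees == msg

# "Secret chat": the payload is encrypted before it ever reaches the server,
# which only relays an opaque blob it cannot decrypt.
payload = Fernet(e2e_key).encrypt(msg)
relayed = Fernet(transport_key).encrypt(payload)   # transport layer on top
blob = Fernet(transport_key).decrypt(relayed)      # server peels its own layer...
# ...but it has no e2e_key, so trying to decrypt `blob` with transport_key would fail.
recipient_sees = Fernet(e2e_key).decrypt(blob)
assert recipient_sees == msg
```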
Can we stop using 6-year-old info for apps that get updated monthly? The problems they had with MTProto were patched literally 5 years ago, the only other criticism comes from a direct competitor, and they recommend WhatsApp despite the fact that it's closed-source and nobody can verify if its encryption truly works.
Facebook is planning to merge Messenger, WhatsApp and Instagram, which makes it even more awful of a choice.
Telegram still doesn't encrypt chats end to end (by default¹), which means it's not a strictly superior choice to WhatsApp.
Facebook can't read your WhatsApp messages (of course they can add an update any time to do that), but Telegram has access to all your messages right now.
¹ Yes, you can select the end-to-end encrypted sessions, but they're very crippled from a usability perspective. I don't remember the last time anyone used it with me, yet all my chats on WhatsApp are end-to-end encrypted without anyone doing anything.
Are we sure it can't? Because WhatsApp is closed-source, its GDrive backups are unencrypted and Facebook's whole profit model is based around snooping. Unless they make the app open-source, I'm not trusting them even with a grocery list. People act like E2E is the be-all and end-all but trusting an incredibly shady company on its word is not something I'm comfortable with.
Yes, people are reverse engineering the app. You can check the discussions on HackerNews when security of WhatsApp is discussed.
GDrive backups are not readable by Facebook; they're readable by Google. End-to-end, if properly implemented, is the be-all and end-all. Except for metadata, which is a problem, but a different one, and Facebook definitely abuses that. But they don't/can't read the contents of chat messages (for now).
It's not merely trusting that shady company, but also realizing that FB not actually having E2E-encrypted messages would definitely make the news; you'd be aware of it.
> It's not merely trusting that shady company, but also realizing that FB not actually having E2E-encrypted messages would definitely make the news; you'd be aware of it.
Right.. consider what your adversary would be giving up by revealing such a secret, even if it was true. That alone provides a not-insubstantial amount of security.
The real question is, why would Telegram be more secure? There's a 100% chance it can read your group messages, because their own documentation describing the cloud encryption says so. There is no E2EE at all for groups. There is no E2EE at all for desktop. Together these mean the E2EE is completely neutered and useless. I'm a privacy researcher and I don't use them at all. Why would an average joe?
Open source is not the be-all end-all of security either. Closed source apps can still be audited (with increased difficulty), and open source apps might still be impractical to audit even though they are open source.
No, it is not necessary _or_ sufficient. That is what I'm saying. You can audit a closed-source app, and there also might be open-source apps which are impractical to audit despite them being open source.
If you have your closed-source app audited, everyone needs to trust the audit company. And I've seen some shit audits in my life that told absolutely nothing about the actual security.
Open source means anyone can audit it and verify that nothing was changed after the audit.
Moxie more or less audited WhatsApp's Signal protocol implementation, and people are right to be concerned about whether changes have been made since FB bought the app.
Facebook does get your WhatsApp communication metadata, and has been for years now. As the three letter agencies showed, metadata is actually quite valuable in many respects without needing to trawl through massive amounts of content.
Can’t Facebook read most people’s WhatsApp messages because cloud backups of chats are enabled by default, and only the tiny minority of users who disable that feature will get truly end-to-end encryption?
I don't see how the problem of using a hand-rolled encryption algorithm, or the strange choices that went into that algorithm, counts as "patched literally 5 years ago".
"Can we stop using 6-year-old info for apps that get updated monthly?"
The fact Telegram's E2EE has not been available
1. by default
2. on desktop apps
3. for group messages
for seven years tells you exactly how secure it is.
"the only other criticism comes from a direct competitor"
Fuck this attitude. Everyone has the right to criticize. If Telegram can't own their mistakes, it's their fault, not that of the people criticizing them. Also, impartial professional cryptographers like Bruce Schneier and Matthew Green have told people not to use Telegram. Why is that, if not because it's so horribly insecure? Why isn't there a single recommendation for Telegram from ANY cryptographer on the entire planet?
"they recommend WhatsApp despite the fact that it's closed-source and nobody can verify if its encryption truly works."
Because they've helped implement the encryption? Also, if proprietary tools doing encryption are not secure, then why do Telegram users think it's OK for Telegram to use a closed-source server that does the "distributed datacenter encryption" for group messages' at-rest protection? There's not even documentation available for this, let alone source code.
Fair point, but from my perspective, even if it was absolutely the best end-to-end encryption there is, it wouldn't mean much unless everyone's using Telegram for 1-to-1 communication using Secret Chats feature.
> Some of its channels helped unconnected, scattered rallies mature into well-coordinated action.
This line alone makes their encryption rather meaningless for this use case, since Secret Chats only work between two people.
Which is why I'm confused people are even talking about their encryption in this thread.
This has nothing to do with secure chats and everything to do with Telegram's Channels feature. But a ton of people that have never used Telegram nor read the article don't know that.
And proxies. Telegram has great proxy support and virtually anyone can install their own MTProxy in 5 min.
A multitude of proxies, shadow optic cables over the border and a bit of whitelisting from the government to allow payment processing made Telegram invincible.
Correct. What anyone in an oppressive regime could do, though, is make sure their settings are set to "share your phone number with no one," as well as delete their own messages from the channel in their entirety 15-30 minutes after they've been read, or after whatever arbitrary time they'd like. They would do best not to use an @username or account name which could identify them. Beyond that, there's no way anyone in Belarus can do a thing besides resorting to physical violence and taking an individual's or a group of people's phones.
There are also options for invite-only channels (I manage several TG channels, public and private), in which nobody can join without having been given the invite link, or being added to the channel if their settings permit other users adding them to channels.
This is all information in bad faith.
The protocol and all of Telegram are open source. Are you a cryptographer?
And who "rolled" the Signal protocol, Moxie Marlinspike? Did he not design that himself?
This is demonstrably false. Telegram's apps are open source (except Telegram X, for some reason), same as Signal's (no exceptions). Neither of the two offers you their server's code.
> And who "rolled" the Signal protocol, Moxie Marlinspike? Did he not design that himself?
And again, this is completely irrelevant because even if Telegram's end-to-end encryption was absolutely the best there is, a) it doesn't work on group chats, and b) it's not enabled by default, only in Secret Chats. The vast majority of Telegram's usage is not end-to-end encrypted at all.
"The vast majority of Telegram's usage is not end-to-end encrypted at all."
This. This is the backdoor right here. It was never going to be a shady flaw in the implementation. It's SO much easier to put it out there in the open, spread misinformation about Telegram being at the forefront of the privacy battle, silence all criticism (my links were shadowbanned on their subreddit), and attack straw men like people posting examples of Telegram's bad track record. tl;dr: damage control.
Telegram's encryption OTOH was designed by Nikolai Durov, who is not a cryptographer but a geometer. That's like asking a gynecologist to perform brain surgery, lol.
Signal Protocol won the Levchin Prize at Real World Crypto, which was awarded by a panel of several of the most renowned academic cryptographers in the field (including Dan Boneh and Kenny Paterson). Other winners include Bellare, Krawczyk, and Joan Daemen. The protocol has been extensively analyzed and is the current gold standard for messaging encryption.
This. It's not the Durov brothers who are moving the field of secure messaging onwards, or talking at conferences. They're complete amateurs surrounded by fanboys who don't understand the very basics of the field, and who think copy-pasting from https://tsf.telegram.org/manuals/e2ee-simple makes them useful as opposed to spreading propaganda.
But the standard we should apply to secure chat protocols isn't how many awards it won, but whether it's watertight. Obviously winning a prestigious prize means it's watertight, but the converse doesn't follow. A protocol can be safe for practical use without winning any prizes.
It can, but given Telegram's history and professional cryptographers like Schneier[1] and Green[2] saying DO NOT USE IT, it's obvious it's _anything_ but watertight.
No. Still not E2EE by default, still no E2EE for groups, still no E2EE for desktop clients. Why do you want to imagine Telegram magically got better when it's so obvious it didn't?
Because they “magically” updated and improved tons of stuff in the last four years. So I think it’s not unreasonable to consider whether their encryption improved too.
But yes, not having encryption on by default speaks poorly of them. OTOH it’s not concrete proof that the encryption still sucks as of now.
Don't get me wrong, I'm not saying the E2EE encryption itself is flawed. I'm saying it's not being used at all by default. And I'm saying it's not possible to use it for groups or desktop clients. That's _the_ travesty, and the proof that this is the state of things is so obvious people don't realize how serious it is. And my concern is that will lead to a tragedy.
Yeah, it’s true that not having E2EE makes Telegram a bad choice for the purposes of the protesters. Convenience and inertia wins out though. And when you have groups of hundreds of thousands of people, there aren’t too many choices in the first place.
The expectation of privacy loses its meaning as the group size grows. It's more likely that what you said remains private when you say it in a group of five people than in a group of 50, 500, 5,000, or 500,000 people. IMO supergroups and channels don't need E2EE; normal groups in Telegram definitely do. It's not an all-or-nothing thing. E2EE where an expectation of privacy can be assumed from the group size isn't a problem.
Also, Signal has no upper group size limit, but E2EE would make a group with 100,000s of members a bit sluggish. That's a problem that shrinks with Moore's law, though.
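Back-of-the-envelope, assuming naive pairwise fan-out and a made-up per-encryption cost (real group protocols, e.g. sender keys, do much better, so treat this as an upper bound rather than Signal's actual cost):

```python
# Why naive pairwise fan-out for E2EE groups gets sluggish as groups grow.
PER_ENCRYPTION_MS = 0.1  # assumed cost per pairwise encryption + upload overhead

for members in (5, 50, 500, 5_000, 500_000):
    sends_per_message = members - 1  # one ciphertext per other member
    print(f"{members:>7} members: {sends_per_message:>7} ciphertexts, "
          f"~{sends_per_message * PER_ENCRYPTION_MS / 1000:.1f} s per message sent")
```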
No, and obviously it doesn't have to, because I'm replying to you. You hint at Telegram's protocol being inferior based on the number of awards it won, a heuristic that isn't too relevant in practice.
First of all, most of this goes back five years and things have likely changed, but basically MTProto used several non-standard and out-of-date security mechanisms (no authenticated encryption and using SHA-1 were fairly notable at the time), whereas Signal built on fairly standard and widely used mechanisms (OTR). It's possible that many of those failures have been addressed over the years, but I haven't followed it closely. It's worth noting that Signal has been widely vetted over time and is the underpinning of WhatsApp, whereas MTProto continues to have a poor reputation, it seems.
The very fact that out-of-date security mechanisms made it into the first version should tell you the developers don't follow their field, or that they're complete amateurs. Both are flags so red Stalin would have a problem with them.
The Signal Protocol[0] is based on OTR, a technology which had already seen a number of implementations and informed scrutiny by the time Signal came along.
Also an important aspect is that it is open sourced, meaning others can audit it. I'm a little untrusting of people that say "trust me" but also "no, you can't look at it." (unless there is a good reason to hide it, which in this case I do not believe there is)
(The DH ratchet is still there. 1536-bit FF-DH was replaced with X3DH etc., but the basic idea is still there. Adding a hash ratchet for non-round-trip messaging was a good idea, as was storing pre-keys on the server. IMO it's fair to say it's been expanded around OTR.)
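The hash-ratchet idea in miniature, as a sketch of the concept rather than Signal's actual code:

```python
# Minimal sketch of a symmetric ("hash") ratchet: each message key is derived from
# a chain key that is immediately advanced one-way, so compromising today's chain
# key does not reveal yesterday's message keys. Concept only, not Signal's code.
import hmac, hashlib, os

def kdf(chain_key: bytes, label: bytes) -> bytes:
    return hmac.new(chain_key, label, hashlib.sha256).digest()

class SendingChain:
    def __init__(self, root: bytes):
        self.chain_key = root

    def next_message_key(self) -> bytes:
        message_key = kdf(self.chain_key, b"message")   # used once, then discarded
        self.chain_key = kdf(self.chain_key, b"chain")  # ratchet forward (one-way)
        return message_key

# Both sides start from the same shared root (in practice the output of a DH ratchet step).
root = os.urandom(32)
alice, bob = SendingChain(root), SendingChain(root)
for _ in range(3):
    assert alice.next_message_key() == bob.next_message_key()  # keys stay in sync
```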
It is encrypted by default, but end-to-end is only for calls and Secret Chats (one-on-one). You can delete any message at any time without a trace for both sides, which protesters often do. I really don't think the government needs messages to pin a crime on them, though; hell, they've pinned crimes on people for literally no reason before.
So when you try to tell the other person's device to delete your message, how does it get into their iCloud backups, or some other backup, and delete that message there?
Don't depend on asking someone else's device to delete the data as that data being gone.
It is stored locally, although only temporarily. I rarely connect my phone to the internet and still can scroll through quite a bit of message history.
Not by default, no, because that has UX implications (e.g. the chat will only be available on one of your devices instead of being synced between all of them). Though it's quite easy to start an encrypted chat, and you can decide to have self-destructing messages.
I'm pretty sure Signal at least doesn't encrypt at rest on your phone. So the drive would have to be encrypted as well, which is not the default on Android.
Signal does encrypt your messages locally. Also, Android supports file-based encryption, so you don't need to use full-disk encryption anymore. And I think the policy changed in Android 10.
> All compatible Android devices newly launching with Android Q are required to encrypt user data, with no exceptions.
Signal traditionally had an easy-to-get encryption key for the local encryption. Now there is a PIN, but I don't think it is any protection against someone having access to the disk. The Signal people would prefer that you deal with endpoint security yourself, because they really can't do much there.
Indeed, the PIN is just for SVR. Exported message logs on Android use separate, client-generated 30-digit PINs.
Unless the OS+HW provide an API for some sort of TPM, it's not possible to provide strong protection for app databases without asking for a strong password every time the app is opened. Android has had some sort of sandboxing for a while, but it's not comparable to secure enclaves etc., AFAIK.
Android has encrypted storage by default since a few years ago. Of course, by default it uses a default key. But, the point is, enabling "encryption" just means changing that key, not reencrypting the entire device.
Apart from that, regardless of whether you're on Signal or Telegram, if authorities get hold of a protester's identity on such an app and have the power to access the app's servers, they can gradually uncover social networks by reading metadata (if I'm not mistaken).
I think you are mistaken. Before your text is sent to Signal your sender information is encrypted with the receiver's public key. So while Signal's servers can see who to deliver the message to they cannot see who sent it. Only the receiving client can decrypt and authenticate the message. This feature was rolled out in late 2018 and is called "sealed sender". It was developed to prevent leakage of any social network information via the message metadata.
But as far as I know Telegram has no equivalent feature.
"So while Signal's servers can see who to deliver the message to they cannot see who sent it."
Why can't they look at the TCP headers of incoming packets to determine source-IP? Also, why can't they look at session identifier or signal ID like phone number to determine who the sender is?
I assume if you are trying to hide your communications you aren't connecting directly to Signal's servers, so the IP should get them nothing. There is no session identifier or Signal ID attached to your message; it's contained within the encrypted part of the message, so only the receiver can determine who the message was sent by. https://signal.org/blog/sealed-sender/
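The core trick, sketched (this only shows the "encrypt the sender info to the recipient's public key" step; Signal's real design adds sender certificates, delivery tokens, and so on):

```python
# Rough sketch of the sealed-sender idea: the sender's identity rides inside an
# envelope encrypted to the recipient's public key, so the relaying server learns
# only the destination. Not Signal's actual construction.
import os, json
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey, X25519PublicKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def _derive_key(shared: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"sealed-demo").derive(shared)

def seal(recipient_pub: X25519PublicKey, sender_id: str, body: str) -> dict:
    eph = X25519PrivateKey.generate()                 # fresh ephemeral key per message
    key = _derive_key(eph.exchange(recipient_pub))
    nonce = os.urandom(12)
    envelope = json.dumps({"from": sender_id, "body": body}).encode()
    return {
        "eph_pub": eph.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw),
        "nonce": nonce,
        "ct": AESGCM(key).encrypt(nonce, envelope, None),  # server sees only this blob
    }

def open_sealed(recipient_priv: X25519PrivateKey, sealed: dict) -> dict:
    peer = X25519PublicKey.from_public_bytes(sealed["eph_pub"])
    key = _derive_key(recipient_priv.exchange(peer))
    return json.loads(AESGCM(key).decrypt(sealed["nonce"], sealed["ct"], None))

bob_priv = X25519PrivateKey.generate()
sealed = seal(bob_priv.public_key(), "alice", "hello")  # server routes by recipient only
assert open_sealed(bob_priv, sealed)["from"] == "alice"
```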
Encryption isn't enough. They could just suspect or arrest anyone who has Telegram installed. Or they could check telecom operators' logs for anyone who has used Telegram during the past weeks.
In Turkey, they arrested people who had the ByLock app installed. It didn't matter how people had used it. Having installed it was enough.
Let's differentiate between the heroic individual activists striving against all odds, versus the technologists whose market-driven decisions ensure that the activists are betrayed to their oppressive governments.
It's not so against the odds. The EU just implemented mass sanctions against Belarus and mobilised €53m to support agitators. Top politicians such as Varadkar tweet support.
If Russia so overtly threw money at organising American riots, it would be front-page news. There's been a year of mass unrest and yellow vest riots in France, yet Macron's junta still reigns supreme.
I never replied as you simply restated plainly obvious information about Telegram. I'm intimately familiar with, and would not trust Bruce Schneier. It's not my first day in cryptography.
What you've counterpointed doesn't exactly negate what I said. There are no Belarus state-controlled or regionally located Telegram servers. I fully understand and accept the risk that server-side code is manipulable, and I also fully know they are able to edit open conversations from the server, and this has been done. It's still a better alternative these days to choose a network adversarial to your own state, choose E2E, and do your best to use a throwaway number, than to choose something that's been gamed by your own state, for fun and for profit, to eavesdrop on all conversations out of the gate, or to use backdoored WhatsApp. Choose all throwaway, blend in, and don't talk too much.
We saw it with the Green Revolution in Iran. We've seen it several times since.
So long as messages are not encrypted, messaging apps are much more naturally suited as tools of oppression than tools of revolution.