Cryptographers unearth vulnerabilities in Telegram's encryption protocol (cyberscoop.com)
378 points by wglb on July 18, 2021 | 226 comments



Do Signal, Wire, or Matrix do this transcript consistency thing, or is there some other way by which they prevent message reordering? From what I know, encrypted messengers almost never implement that. At least in Wire I noticed reordering at some point and it uses an implementation of what is now called the Signal protocol.

And that's the most severe vulnerability from the article. The second one is "mostly of theoretical interest" according to the researchers themselves, the third requires "an attacker [to] send many carefully crafted messages to a target, on the order of millions of messages", and the fourth "requires sending billions of messages to a Telegram server within minutes". So while this is not bad research, not at all (it's really great that people take this close a look!), the title sounds rather more severe and people will obviously jump to the conclusion that "See, MTproto is bad like we said all along!" without reading anything. Good research, bad article?

«Obligatory notice in Telegram comment threads: the real issue in Telegram is that end-to-end encryption is optional and made extremely inconvenient to use: no implementation at all in TDesktop, can't use it across multiple devices, doesn't work for group chats at all... That is the main reason Telegram isn't secure: they can read all your chats except when you use this stupid inconvenient mode.»


> Do Signal, Wire, or Matrix do this transcript consistency thing, or is there some other way by which they prevent message reordering? From what I know, encrypted messengers almost never implement that. At least in Wire I noticed reordering at some point and it uses an implementation of what is now called the Signal protocol.

If I'm reading this right, the issue is that a network MITM can reorder the messages in the client to server connection (without being able to read them). This would likely reorder messages in the chat transcript too, but that's a different issue.

TLS or Noise protocol are open alternatives here, and IIRC both require that you have received all previous packets in order to validate a new packet, which means either you've received all the packets in order or the MITM has compromised the handshake.

Signal protocol is the end to end protocol between clients; those messages do not require reception in order. Because the messages go through a server and may be retried, there's lots of ways things could go wrong and some messages may be dropped or delayed and can arrive out of order. You could refuse to display message N until you've displayed N - 1, but if the server dropped N for whatever reason, you need to get the sender to resend it, and that depends on availability of the sender which is limited (their phone may be off, out of coverage, or execution may be denied/delayed by the OS, or the user may have removed the app by the time the receiving app notices the message is missing, etc).
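
To make the "refuse to display message N until you've displayed N - 1" idea concrete, here is a toy sketch (the class and the resend callback are made up for illustration, this is not Signal's actual logic):

    class OrderedInbox:
        """Toy in-order display buffer: hold messages back until every earlier one was shown."""
        def __init__(self, request_resend):
            self.next_seq = 0          # next sequence number we are willing to display
            self.pending = {}          # seq -> message, held back until its turn
            self.request_resend = request_resend

        def receive(self, seq, message):
            self.pending[seq] = message
            displayed = []
            while self.next_seq in self.pending:    # flush everything displayable in order
                displayed.append(self.pending.pop(self.next_seq))
                self.next_seq += 1
            if self.pending:                        # stuck on a gap: ask for a resend,
                self.request_resend(self.next_seq)  # which only works if the sender is reachable
            return displayed

If message N was dropped, everything after it stays hidden until the (possibly offline) sender answers the resend request, which is exactly the availability problem described above.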


At least the UI can potentially indicate discrepancies around message ordering.


>Do Signal, Wire, or Matrix do this transcript consistency thing, or is there some other way by which they prevent message reordering?

Rooms in Matrix are directed graphs[0], so if I'm understanding right, transcript consistency should be the default.

[0] https://matrix.org/docs/spec/#id14
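
Concretely (field shape simplified, not the actual event schema from the spec): each event lists the events its sender had already seen, and a client can order the room by topologically sorting that graph, e.g.:

    # Toy version of the room DAG: {event_id: events it points back at}.
    from graphlib import TopologicalSorter  # Python 3.9+

    prev_events = {
        "$a": [],            # first message
        "$b": ["$a"],        # sent after seeing $a
        "$c": ["$a"],        # sent concurrently with $b
        "$d": ["$b", "$c"],  # sent after seeing both branches
    }

    print(list(TopologicalSorter(prev_events).static_order()))
    # e.g. ['$a', '$b', '$c', '$d']: $b and $c may swap, but never land before $a or after $d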


Generally, the thing with cryptography is the first breaks tend not to be useful. Crafted messages, terabytes of data, a compromise in one round of several in the key, etc. Theoretical stuff.

But they then tend to get worse rapidly, building on those useless cracks to drive the wedge deeper. And so in general it's best to assume that if something's slightly broken, soon it will be all the way broken.


As a counterpoint (for crypto algo design), we seem to have gotten much better at designing things so that small breaks don't turn into large breaks.

https://rwc.iacr.org/2020/slides/Aumasson.pdf


That isn’t really the case. SHA-1 still isn’t broken in a particularly useful way. MD5 is still basically safe from pre-image attacks.


This is quite disingenuous, as with hash functions there are infinitely many preimages that map to each digest.


It’s not at all. Understanding the differences between the types of hash resistance (preimage, second preimage, collision) is critical to knowing whether a given break matters for your application.



SHAttered is a collision attack on SHA-1, not a pre-image attack. There is no known pre-image attack for SHA-1.


Yet. We get closer every time a new vulnerability is discovered.


> Yet. We get closer every time a new vulnerability is discovered.

Skipjack is the famous counterexample to this: it's been broken up to 31 rounds, but specified usage mandates 32 rounds. Skipjack was given immense scrutiny due to the NSA's role in its design, and we're no closer to a full-round break of it than we were in 1999.


Even that is only theoretical, it's not a practical attack.


MD5 even still has preimage resistance. A practical preimage attack would indicate something far more serious was overlooked in the design of a hash function: it would mean you could recover an input that produces a given hash, using nothing but the hash itself.

Only seriously lousy cryptographic hash functions have bad preimage resistance. Heck, even the pigeonhole principle makes this aspect of the design easier for a hash function. If a widely used hash function somehow had a practical preimage attack, I would be more concerned about our process of standardizing algorithms.
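
To make the gap between the two properties visible, here is a toy experiment on a hash truncated to 24 bits (obviously not a real attack on MD5, just the birthday-bound arithmetic in action):

    # A collision needs roughly 2^(n/2) tries, a preimage roughly 2^n. On this 24-bit
    # toy hash the collision search finishes after a few thousand hashes, while the
    # preimage search needs millions; the same asymmetry holds for full-size hashes.
    import hashlib
    from itertools import count

    def h24(data: bytes) -> bytes:
        return hashlib.md5(data).digest()[:3]   # truncate MD5 to 24 bits

    seen = {}
    for i in count():                           # birthday collision search
        d = h24(str(i).encode())
        if d in seen:
            print(f"collision after {i} hashes: inputs {seen[d]} and {i}")
            break
        seen[d] = i

    target = h24(b"some fixed target")
    for i in count():                           # brute-force preimage search
        if h24(str(i).encode()) == target:
            print(f"preimage after {i} hashes")
            break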


Collision attacks can still be useful in an attack tree


Care to give some sources for this claim? Even with some examples, it seems very vulnerable to survivorship bias (examples where small flaws did not lead to a big blowup get no attention).


They're not wrong in principle: attacks do often start as implausible and then some other shortcut or weakness is found and bam, there is a realistic exploit out of something that was deemed completely infeasible.

I think in this context, it's a bit more nuanced. These shortcomings are fixed (I can't personally vouch for the correctness of the fix as I haven't looked at the source diff, but let's assume it's solid), so they're not left to linger and build on. There was also a more serious bug in the past, though I forgot the details. If this is the worst they're finding now, I'm not so worried. What I hear of mtproto doesn't sound like bugs are compounding but rather like they're ironing out small (but not completely unimportant, of course) flaws. But of course we won't truly know if it's secure until someone finds a devastating attack.


The old attack was that the server provided the client with randomness when generating keys for private chats, allowing the server to effectively MITM all private chats.

Pretty weird design decision.


> As an aside, Jakobsen and Orlandi wrote: “We stress that this is a theoretical attack on the definition of security and we do not see any way of turning the attack into a full plaintext-recovery attack.” Similarly, the Telegram “FAQ for the Technically Inclined (MTProto v.1.0)” provides the following analogy: “A postal worker could write ‘Haha’ (using invisible ink!) on the outside of a sealed package that he delivers to you. It didn’t stop the package from being delivered, it doesn’t allow them to change the contents of the package, and it doesn’t allow them to see what was inside.” In hindsight, we think that this is incorrect. As explained above, our timing side channels essentially exploit this behaviour in order to do message recovery (but we need to “chain” two “exploits” to make it work, even ignoring practicality concerns).

https://mtpsym.github.io/


You can simply see the history of algorithms over the last two decades. Look at things like RC4, which is notoriously hard to use safely. It had a history of many issues and was ultimately abandoned even though you could theoretically use it securely to this day. It is situational, but with cryptography, algorithms are deprecated from an abundance of caution in many cases. It’s a little different with a protocol and not a foundational algorithm, though. We can often correct a protocol in a safe way compared to an algorithm that is often simply deprecated. Still the sentiment is mostly correct, but perhaps a bit overstated.


AES has been broken for over a decade. 256 bit AES is actually only 255 bits strong. It has not progressed any further, and AES is the most popular algorithm globally and has had over two decades of eyes on it.

This definitely is not generally true, especially for “academic breaks” that don’t even approach feasibility.

I really think security people fear the wrong things. Unless they are fundamentally flawed very few actual crypto breaks happen in practice. Most real world breaks are the result of implementation bugs, side channel attacks, and social engineering.

If I had to choose between a buggy code base implementing Noise with the latest hipster crypto and a well written memory safe audited code base and protocol using RSA-1024 and RC4, I would feel safer with the latter. (Unless my adversary were the NSA or someone else at that level, in which case I would layer three or four systems for defense in depth.)

This is not an argument for the use of weak crypto. It’s an argument for fearing the implementation, deployment, and users more than the crypto.

Big blobs of C are scary regardless of what they implement, especially if they are messy and have not had many eyes on them. The meat bag in front of the computer is usually the least secure component. If you look at real world breaks it’s usually bad opsec or phishing. The hipsterest crypto won’t help you if you are tricked into running malware or you have an insecure PHP script on a web server.



Point taken. I may have reached for too extreme an example, but AFAIK there were other issues with the WEP construction besides just RC4 being weak.

I think the point still stands though. When I read about breaks (and when I saw them back when I did infosec) it was phishing most of the time. Someone would be tricked into running malware. That’s how most organizational compromises happen. The rest were memory errors like buffer overflows in non-security application code and bugs in bespoke code exposed to the Internet.

My point was that people are afraid of the things that are sexy to be afraid of, and tend to ignore the more mundane but more commonly attacked vectors.

The same applies to firewall obsession in netsec. People will obsess over the firewall and then type “npm install” on production systems. It’s not sexy to worry about that.

Package managers scare the shit out of me…



i.e. messages can be out of order, so no transcript consistency, is that what you're saying?


I think that the linked article described that messages that arrive out of order can still be decrypted despite the decryption keys being generated in a strict order. As far as I know, the order of the keys that are used to decrypt incoming messages also indicates the order in which the decrypted messages were sent.
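
Sketching that idea with a made-up hash-chain KDF (not MTProto's or Signal's actual derivation): if message keys come from a strict chain, the index of the key that decrypts a message tells the receiver where it belongs in the send order, even when the network delivers it late.

    import hashlib

    def message_key(chain_key: bytes, index: int) -> bytes:
        """Derive the key for the index-th message by ratcheting a hash chain forward."""
        k = chain_key
        for _ in range(index):
            k = hashlib.sha256(k + b"\x01").digest()   # advance the chain one step
        return hashlib.sha256(k + b"\x02").digest()    # per-message key off the chain

    root = b"chain key agreed during the handshake"
    # The sender uses message_key(root, 0), (root, 1), ... in order, so a receiver that
    # decrypts a late packet with key index 7 knows it was the 8th message sent.
    print(message_key(root, 7).hex())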


So would a potential fix be just to add some metadata in the message somewhere that says when it was sent?


That would at least arrange the messages in the right order later on (I think they do this already), but it still means they might get received in the wrong order. So you get the notification and look at your phone, but see only one of the later messages, without context or any indication that there is missing information yet to arrive. That can be very irritating and annoying.


This is also annoying because you are seeing your messages on your end instantly, so your chat history order is not the same as everyone else's. This is most noticeable in a group with lots of messages.

Not a crypto issue, but a UI one though.


Transcript consistency is usually something that matters in group chats, and Signal, Matrix, and pretty much all group chat apps rely on a trusted server to order messages.


See Appendix C of https://mtpsym.github.io/paper.pdf (the underlying research paper). Reordering of messages (in one direction) and "causality preservation" (both directions) are two different things.


From what I've seen of this type of research, a vulnerability like these may be worked on and someone will refine it to make it exploitable, so even if it's not possible to use it in its current form, it's important to be aware of it to prevent future attacks.


Mm-hmm.

iMessage and WhatsApp are encrypted “out of the box”—it’s the default. Telegram requires an explicit choice.


Unlike Telegram, iMessage and WhatsApp are closed source apps. How can you verify that they really encrypt the messages?


Here's the thing. There's perhaps a 0.0001% chance that iMessage or WhatsApp contains a backdoor that does not really encrypt a message.

There's a 100% chance that every cloud message, every Windows/Linux desktop message, and EVERY message by default, is sent to Telegram servers, where the vendor, and any intelligence agency, organized crime unit, etc., can steal it. Telegram is a for-profit company. It can also be sold, and that will include the billions of messages stored on their servers. The only thing stopping them from selling out is "they don't feel like it".

Telegram is not backdoored, but it is proven to be frontdoored, and this is extremely dangerous precisely because "it's nothing new, we knew this, it's obvious". Yes. It's blindingly obvious to your average crypto anarchist who's spent a year reading about cryptographic protocols. It's anything but obvious to your average user.

To give some idea, I did a survey at my university computer science department: there are lots of Telegram users among the students. These students study cryptography as part of their network course and number theory course, and there's even a grad-level cryptography course run by Valtteri Niemi. Here's what I found: less than 20% even knew what secret chats were, let alone used them. Everyone just winged it and assumed Telegram was safe by default. Now, if a computer science student body whose field this is doesn't know about it, what chance is there that a hair stylist, a political science major, a dissident/activist, or a journalist knows about it?

But I can't blame them: I've long since lost count of how many times I've seen news media bundle apps together: "Signal, Telegram, Threema ... are secure replacements for WhatsApp". This kind of publishing is incredibly irresponsible, and it's no wonder people get the wrong idea.

So yes, Telegram is open source with reproducible builds. But to technical people that confirms nothing but the existence of the front door: the app doesn't end-to-end encrypt anything by default.


A 0.0001% chance that a Facebook product has backdoors to spy on people? I see that you have a real hatred of Telegram but come on, this is such a biased take. You are willing to protect a closed-source app from Facebook just because you hate the alternative so much. You are not helping anyone with this.


>A 0.0001% chance that a Facebook product has backdoors to spy on people?

Yeah, I have never once heard of anyone MITM-attacking a WA user, and I have never seen any claims the app leaks plaintext data or keys to the server or third parties. If you have some factual information instead of a "hunch", then by all means, let's hear it. I'm not arguing Facebook isn't abusing WhatsApp metadata like there is no tomorrow. They absolutely are, and often metadata is more revealing than the content. I'm not saying WhatsApp is secure enough, I'm saying it's better than Telegram. Consider group messages:

Telegram spies on users' group messages with 100% probability. Therefore, WhatsApp can be at most as insecure as Telegram. Even if there were a 99.9999% probability that WA is backdoored, those would still be better odds than Telegram's.

If you think WhatsApp can't be trusted because it's proprietary and you have to trust the vendor, then you absolutely can't say Telegram is safe, because you have to trust the vendor not to look at the group messages.

So it boils down to the actual probability that WA is backdoored: the number is not zero, but it most certainly is not 100%. If you have useful factual information that overrides Moxie personally telling me that he oversaw WA implement the Signal protocol, I'm willing to update my threat probability estimates, but until then, I'm going to stick with saying it's highly improbable.

>I see that you have a real hatred of Telegram but come on, this is such a biased take.

I have nothing personal against Telegram. I've researched private messaging apps for close to 10 years, and I'm only interested in all apps improving. But Telegram isn't in the process of locking themselves out of user data; on the contrary, even the new features, like group video calls, collect 100% of metadata AND content. Telegram isn't helping the world, it's amassing terabytes of data into a silo that's one zero-day away from the biggest breach in modern history.

Here in Finland we've recently had some taste of it when the Vastaamo psychotherapy center was hacked. https://www.wired.com/story/vastaamo-psychotherapy-patients-... Read it. Then realize Telegram private chats can contain messages people wouldn't share with their therapists. Imagine tens of millions of extortion cases (that never end, even if you pay), ruined lives, relationships etc.

The Telegram Hack of 20** is not going to be just another hack, it's going to be the scandal of the century.

> You are willing to protect a closed-source app from Facebook

No, I'm not protecting anyone here. I'm making the distinction that the backdoor is very improbable, because a) Facebook doesn't need it due to the metadata and b) eavesdropping on your users via a backdoor is a felony. Compare that to Telegram, where the users send practically everything to the server willingly. There is zero expectation of privacy the user can claim.

>you hate the alternative so much

Telegram is not "the alternative", it's "out of the frying pan, into the fire". As every one of my posts here argues.

There are plenty of open source, secure alternatives like Signal, Element, Wire, Threema, Briar, Cwtch, OnionShare Chats etc.


I know people who worked on WhatsApp and they wouldn't let that happen. Take it for what it is, but it's a pretty serious team there.


How long do you think the principled people will still be working on the project?


As long as it is end to end encrypted, the day they remove that feature is the day I would stop using the app.


Please review the HN guidelines. It’s fine to question Facebook’s integrity, given what we know about the company. Attacking another person’s inferred motivations is not.

https://news.ycombinator.com/newsguidelines.html


WhatsApp claims to use a well-known algorithm, so it shouldn't be too hard for a researcher to encrypt a message with their own implementation of the algorithm (while copying the seed/keys from WhatsApp) and confirm that they get the same bytes out of both.

Of course that wouldn't prove that there's no backdoor, just that the normal code paths work as described.
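
A generic illustration of that comparison (AES-GCM stands in for whatever primitive the app actually uses, and the key/nonce are placeholders you would have to extract from the running app):

    # With the same key, nonce and plaintext, an independent implementation must produce
    # byte-identical ciphertext, so any mismatch means the app is not doing what it
    # claims at that step.
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = bytes(32)          # placeholder: key recovered from the app under test
    nonce = bytes(12)        # placeholder: nonce observed on the wire
    plaintext = b"test message"

    reference = AESGCM(key).encrypt(nonce, plaintext, None)
    observed = reference     # stand-in for the ciphertext captured from the app
    print("implementations agree:", observed == reference)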


The more likely question is, is your version of the app consistent with everyone else's? I don't think there is a general backdoor, but a crafted one? I'll take that bet.


Decompile and read the resulting source code, then snuff the network traffic to make sure it’s matching what’s expected. Trusting the Hardware and OS is a larger issue.


People always take that lightly, as if it's trivial to read bytecode (tptacek takes that stance a lot if I recall correctly, and he being who he is, people take it as gospel). Just looking at the underhanded C (style) contests, it's not trivial to spot backdoors in the source code, let alone in the code that a machine is meant to interpret after it has run through a compiler. If it were so trivial to find the flaws that allow attacks in the easiest-to-read kind of code (open source, docs available, mitigations applied, nice choice of language, everything), we would not have security issues in good software in the first place.


As I said, reading the source code is only the first step; you also sniff the network traffic. A program that's only sending data you understand can't also be sending data you don't want it to send.

Of course this isn’t trivial, but the point of comparison was an open source application which has exactly the same underhanded C style risks. The hardest to evaluate risk is simply using a subtly flawed algorithm or source of entropy which looks secure in any language, but isn’t.


*sniff


> can't use it across multiple devices

The reason is that you have to choose between something you can use from whatever device you want, where you can simply log in to a web interface on whatever computer, and end-to-end encryption.

Telegram made a choice of usability: you are able to use Telegram from whatever device (mobile app, desktop app, web app, smartwatch app, command-line client, third-party clients) without any problem; notifications arrive on every device; you can start a conversation on one device and continue it on another; everything is synchronized in real time.

Is it in theory less secure? Yes, since whoever has access to Telegram servers can in theory read your messages. Is it in practice a problem? No, it's not. Telegram doesn't, as far as we know, give access to government or other third parties to the data of the users. WhatsApp does for example, not for the message content but for the metadata (which they can read). What is more secure, in the real world rather than in theory?

Signal is more secure, yes, but it's more inconvenient than both Telegram and WhatsApp. I prefer a convenient and slightly less secure messaging service over a super secure one (in theory) that is inconvenient to use.


>You have to choose between something you can use from whatever device you want

You're being disingenuous. End-to-end encrypting every 1:1 chat by default, as well as end-to-end encrypting <1000 member groups, has already been shown to be possible by Signal, WhatsApp, Wire, Element etc.

Telegram is saying "fuck security" to gain a competitive edge with message latency. They have no idea how dangerous a game that is.

>Telegram made a choice of usability

Insecure features are not usable features because security is the fundamental attribute of EVERY feature. Telegram has footguns, nothing more.

>"Is it in practice a problem? No, it's not"

And what credentials do you have to make such assertions?

>Telegram doesn't, as far as we know, give access to government or other third parties to the data of the users.

Considering the fact that Telegram lacks the know-how to implement secure protocols (see OP's article), what chance is there that they can defend against EVERY zero-day exploit against their server infrastructure? Telegram most likely would never even know they were hacked. And the intelligence agencies that have owned their systems aren't bragging. And even IF Telegram did detect such an attack, would they disclose it, when they know they lack the know-how to prevent the next exploit, i.e. the know-how to deploy E2EE for everything?

>Signal is more secure, yes, but it's more inconvenient than both Telegram and WhatsApp.

You can't be serious. Let's take the most basic possible convenience feature. I chat with my buddy over E2EE 1:1 chat while riding the bus to work. At work, I sit down in front of my computer, and want to continue the conversation using keyboard. Signal allows me to do that with zero hassle, Telegram forces me to drop end-to-end encryption.

Telegram's convenience is an outright joke. The moment you want flexibility without losing security, it's garbage. We can thus argue Telegram strongly incentivizes insecure use (come on, just let us see your messages), because nobody is going to unlock their phone hundreds of times a day to reply to their chats.


> You're being disingenuous. End-to-end encrypting every 1:1 chat by default, as well as end-to-end encrypting <1000 member groups, has already been shown to be possible by Signal, WhatsApp, Wire, Element etc.

None of these applications syncs seamlessly across multiple devices. WhatsApp doesn't work if the phone is offline; Signal does, but the solution only works for desktop clients, i.e. I can't have two mobile clients (I have two mobile phones, one Android and one iOS, that can access the same Telegram account, plus a tablet and a smartwatch). I have the Telegram client on every computer I own; it's the first thing I install, just because I use it to share data between computers (it's practical to send a file or a link to the "Saved Messages" to transfer it between devices).

> Insecure features are not usable features because security is the fundamental attribute of EVERY feature. Telegram has footguns, nothing more.

Telegram is secure enough for most users. We have very private conversations over email, and it's mostly still a plaintext protocol, with a lot of servers that don't even support SSL these days. So I guess Telegram is secure enough to use to organize a beer with my friends, if email is secure enough to receive my bank access codes, my medical reports, and other sorts of private conversations...

> And what credentials do you have to make such assertions?

Because it is. Most conversations of normal people don't contain that much concerning information; they are mostly full of useless stuff. Probably the most interesting thing an attacker would find is some nudes that were sent... and in Telegram you have secret chats for that anyway.

> Considering the fact that Telegram lacks the know-how to implement secure protocols (see OP's article), what chance is there that they can defend against EVERY zero-day exploit against their server infrastructure? Telegram most likely would never even know they were hacked. And the intelligence agencies that have owned their systems aren't bragging. And even IF Telegram did detect such an attack, would they disclose it, when they know they lack the know-how to prevent the next exploit, i.e. the know-how to deploy E2EE for everything?

Do you think someone would even bother to break Telegram encryption if he wants to access your data? https://xkcd.com/538/

Authorities routinely access WhatsApp messages even though they are encrypted. How? Give me the phone and give me the password. Most of the time it's the user's birth date, or 123456, 0000000, or some other stupid code like that. You don't want to give me the password? You get in trouble. And we will probably extract the data anyway by exploiting the operating system, which is probably an outdated Android version anyway...

The European Union, where I live, is passing a law that will require chat applications (and WhatsApp has already declared that it supports the initiative) to implement a way to check every message that is sent for child pornography (of course that's just an excuse; once they have the system in place, let's just use it for terrorism too, and then for piracy, why not). How do they implement that when messages are encrypted? Well, of course, just implement the check in the client, before encrypting the message! And if you find some suspicious content, of course send it in the clear to the authorities to be analyzed. You said what about privacy?!?

I'm pretty sure that Telegram will never implement something like that. And if it does, the Telegram client is open source: someone can fork a client removing that part, and one could install it on their phone. With WhatsApp that is impossible, and even though the Signal client is open source, Signal is against third-party clients anyway (a fine example of open source!).

> You can't be serious. Let's take the most basic possible convenience feature. I chat with my buddy over E2EE 1:1 chat while riding the bus to work. At work, I sit down in front of my computer, and want to continue the conversation using keyboard. Signal allows me to do that with zero hassle, Telegram forces me to drop end-to-end encryption.

What if I can't install the software because I don't have administrative rights? I guess I can't continue the conversation. What about old messages that arrived before I logged in? I guess they are lost.


>I have two mobile phones, one Android and one iOS

Cool edge case. You can have two phone numbers too you know.

> I use it to share data between computers

You can always tell a shill when they try to advertise the product's features every chance they get. Nobody in real life needs to go around telling how convenient something is.


> Is it in practice a problem? No, it's not.

This is not an assertion you can make. Because:

> Telegram doesn't, as far as we know, give access to government or other third parties to the data of the users.

Approximately every country (probably every one, but I haven't researched them all) has various court order types to go and get this information and force secrecy on whoever is involved. Thus any information which is available to a company is equally available, in secret, via court orders. There is nothing they can do to prevent it or tell you.


> Telegram doesn't, as far as we know, give access to government or other third parties to the data of the users.

Just because we don't know that they do doesn't mean that they don't, or won't in the future.

And even if they do do their best to keep messages private, what if their servers are compromised, or they are served a warrant?


Just because we don't know that they don't doesn't mean that they do, or will in the future.


>You have to choose between something you can use from whatever device you want, that you can simply log in into a web interface on whatever computer, or end2end encryption.

>Telegram made a choice of usability

Bullshit. Plenty of services have e2ee and cross-device usage. All you need for multi-platform e2ee is to sync an encryption key across devices. Either through a "trusted" party (like iMessage) or by literally syncing the key locally (Signal). You can even have one device act as the message store so you don't even store the encrypted messages in the cloud.

Telegram did not choose usability, they chose the opposite.


iMessage doesn't share private keys between devices. Each device has its own secret key, and when you send a message to a person with multiple devices, you encrypt multiple copies, one for each (source: https://support.apple.com/guide/security/how-imessage-sends-...)

An issue here is that the software doesn't show the user the list of keys used to encrypt a given message. If an attacker can inject their own keys into the identity service records for a given user, they would then receive a copy of all messages sent to that user.
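
A toy model of that fan-out, and of why an injected key is invisible to the sender (per-device symmetric keys here just to keep the sketch short; the real design uses per-device public keys fetched from Apple's identity service):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # What the identity service claims are "alice"'s devices. The sender has no way to
    # tell that the last entry was injected by an attacker.
    directory = {
        "alice-phone":  AESGCM.generate_key(bit_length=256),
        "alice-laptop": AESGCM.generate_key(bit_length=256),
        "injected-dev": AESGCM.generate_key(bit_length=256),
    }

    def send(message: bytes) -> dict:
        """Encrypt one copy of the message per device listed for the recipient."""
        copies = {}
        for device, key in directory.items():
            nonce = os.urandom(12)
            copies[device] = (nonce, AESGCM(key).encrypt(nonce, message, None))
        return copies

    print(list(send(b"see you at 8")))   # the injected device received a readable copy too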


It's true iMessage is not safe from such attacks, but you don't have to look further than Signal to see it's possible to create multi-device E2EE with seamless sync.

iMessage and its bad design can only serve as a red herring here; let's leave it out of the discussion.


If you can have encrypted group chats, then clearly you must be able to join and synchronize an encrypted conversation with more than one device. Encrypted group chats exist in Signal and other applications, so clearly this is possible.

(There are other mechanisms (to have encrypted group chats) than just pretending each device is a separate chat member, but I thought this was an intuitive way to understand how that could securely work.)

Regarding giving metadata to governments: I want that. If someone commits a crime, then sure the government should be able to request the data on this person (it might exonerate them or implicate them, though of course it's only ever supportive and never a sole reason to convict). The governments that most people here chose are not evil. Dragnet surveillance is not good, but individual data requests for individual persons after a judge approved it? That makes sense to me, at least from the perspective of the Netherlands. If Telegram doesn't comply with these requests (and that's a big if, because I heard otherwise), it would be a reason why a democratic country might rightfully ban Telegram.

But anyway that's perhaps a more political discussion rather than technical. Telegram has the data, just like every other centralized platform either collects or can be forced to collect it.


> Obligatory notice in Telegram comment threads:

Sadly, obligatory notice in Telegram threads:

Telegram is encrypted by default and always has been, unlike a certain big competitor that has been pushed here.

The issue isn't whether it is encrypted but whether it is end-to-end encrypted, and also the quality of the encryption.

Saying it is not encrypted is directly misleading.

Your point that they can read your messages if they want still stands.

This however is also true for mail, banking and about everything else except the most secure instant messaging networks.


Encrypted, as it's used with messengers, means E2E encrypted. Anyone calling Telegram 'encrypted by default', while technically correct, is being very, very misleading.


Well, calling it "unencrypted" is equally misleading

... and technically wrong as well.

In a somewhat technical community we should strive to communicate correctly and not invent new meanings for well-defined terms.

By all means come up with a new nasty sounding term, but don't lie and excuse it with "it means something else now", that is borderline childish/sleazy salesman-ish.


> Well, calling it "unencrypted" is equally misleading

Encryption implies it protects message content from third parties. Show me someone who agrees it's bad if the third party is some random guy from the ISP, but it's good if it's some random guy from Telegram LLC.

Telegram is effectively unencrypted, because the messages are by default readable by someone else, just as would be the case if there were no encryption at all.

Third parties are not just your average script kiddie capturing packets at your internet cafe; third parties are also everyone who hacks a Telegram server, every intelligence agency that hacks Telegram servers, and any parent company that buys Telegram.


I think there's a pretty big difference between "your messages can be intercepted and arbitrarily modified by anybody who has any access to any router or server between one of your devices and the Telegram server" and "your message security is the security of the Telegram servers only".


> Show me someone who agrees it's bad if the third party is some random [girl] from the ISP, but it's good if it's some random [girl] from [service operator]

DoH


Oh apologies, I didn't want to be gender exclusive here.


To be clear, DoH refers to DNS over HTTPS (also thanks to u/wizzwizz4 for clarifying that in a sibling comment) and wasn't at all meant to be a sound. Not sure if that came across wrong, now that I read it again I see that it might have.


This is DNS over HTTPS, and good point.


> Saying it is not encrypted is directly misleading.

Sorry, that's my bad. I edited the message to clarify that I meant end-to-end encryption, since Telegram's implementation (mtproto) of e2ee is the topic of this submission.

> also true for mail, banking and about everything else except the most secure instant messaging networks

That is indeed the state of things. People like the convenience though: imagine you could only retrieve your data from Protonmail by knowing the encryption password. People would irrecoverably lose all their emails on a regular basis. Banking makes less sense though, since the bank needs to know how much money you have to be able to give some of that money (per your instruction) to e.g. merchants. Websites can be end to end encrypted, if the endpoint is owned by the person you're trying to reach. Cloudflare's "TLS MITM as a service" (and similar offerings) undermine that, but if you go to https://lucb1e.com then your traffic is end-to-end encrypted to me. If you want, we can also verify fingerprints out of band, just like you should with encrypted chats to make sure you're not trusting the server (or in this case the signing authorities)!
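
For reference, a rough sketch of that out-of-band check (lucb1e.com only because it's the example above; the expected fingerprint is whatever the site owner gives you over another channel):

    # Fetch the server's certificate and print its SHA-256 fingerprint; compare it
    # against the fingerprint obtained out of band. A match means no CA or middlebox
    # substituted its own certificate on this connection.
    import hashlib, ssl

    pem = ssl.get_server_certificate(("lucb1e.com", 443))
    der = ssl.PEM_cert_to_DER_cert(pem)
    print("SHA-256:", hashlib.sha256(der).hexdigest())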


Given how bashing MTProto has been a favourite cryptographers' activity since day 0, I'm actually kinda encouraged by the fact that's all they found. It's especially nice that they could fix all of them without breaking backwards compatibility, which isn't really a given for crypto protocols. Would appreciate more technical and less dumbed-down responses from the Telegram team, though. I have no idea what the fuck "counting grains in the bags of sand" is supposed to mean.

(Also, it's funny, but being accustomed to SMS and such I always somehow assumed Telegram messages can be reordered by chance too, if I send them fast. I never really thought about it, obviously, but simply didn't expect it to be otherwise.)


That's my takeaway here as well - no serious issues found, and Telegram actively worked with them to fix their minor theoretical issues, and did all of the fixes before the paper was published and without breaking any clients.


For the grains of sand thing, here's a technical explanation of that: https://core.telegram.org/techfaq/UoL-ETH-4a-proof#4-attacki...

This document seems like a better response to the theoretical problems the researchers found.


    that's all they found
Besides the trivial MITM in MTProto 1.0?

Telegram protocol defeated. Authors are going to modify crypto-algorithm: https://news.ycombinator.com/item?id=6948742


Telegram messages do sometimes randomly go out of order, especially under bad network conditions.


If they are sent out of order.

If a previous message is in a retry state, the second message could go through without issue.


>I have no idea what the fuck "counting grains in the bags of sand" is supposed to mean.

They have been doing this since the beginning. Their strategy of downplaying every attack against their protocol serves to mask the fact their team has absolutely no qualifications to implement cryptography. Like the original article stated, Telegram did not even file for CVEs, simply because they look bad. CVEs have been filed for much smaller things than these.

Their security game isn't on the side of secure-by-design, but grass-roots damage control to justify a protocol that was designed by the owner's brother (who is a geometrician, not a cryptographer) and is insecure by default. Nepotism and Russian pride should never come before users' security. But they've never cared about that.

Slightly related: you should see /r/telegram on Reddit, which outright censors articles that criticize the app or discuss its vulnerabilities. From the looks of it, Telegram is more a cult these days than a collective of users. Its users are willing to spend hours arguing over semantics, and Telegram shamelessly makes use of them having no life by recruiting them into something known as the Telegram Support Force: https://tsf.telegram.org/


Russia has been mentioned 7 times in this discussion now, every time by you. I don't have any stake in this fight; I don't use or recommend any of these applications. But you do seem to have a bone to pick.


In Russia it's "legal" to hack other countries or their citizens, so it's not a weird take.


It's also "legal" to do so in other countries.

I think it's better to focus on Telegram's open specification and implementation instead of bashing it because the developers happen to be born somewhere you don't like.


The weird part is being the only guy bringing it up half a dozen times in one conversation.


With Russia? No, it's just useful to remember Durov's background, and since he's Russian, there aren't many countries he'd be secretly collaborating with.

When the vendor is intentionally collecting, and defending the collection of, such vast amounts of data, you have to ask who might benefit if Telegram were a front.

I'm not specifically against Russia or US, I'm just someone who'd prefer people to have the right to their privacy. Mass surveillance isn't healthy for society, and Telegram is enabling it with their silo of user data.

So ultimately, it doesn't matter whether Durov is directly playing for any big party; the problem is there's not enough tech in place to prove he isn't, and when that's the case, Telegram's servers and the user data that sits there are a free-for-all. A server in the UK is not legally (or technically) protected from the NSA, CCA, Fancy Bear, Unit 8200, the Iranian Cyber Army, etc.

The only bone to pick here is with any vendor who claims to be fighting for privacy when their supposedly privacy-by-design architecture is indistinguishable from that of honeypots and surveillance-capitalist companies.


>A server in the UK is not legally (or technically) protected <snip> a free-for-all

So exactly like servers from Facebook, Signal, Google, the German police, etc., anywhere on earth. Just ask the NSA (or Snowden). This is the whole China debate all over again: who would you rather have access to all your old data the day something you did years ago suddenly becomes a danger to you because some fascist became a dictator? China, which likely cannot touch you if you are outside mainland China, or the US, which will happily kidnap you from almost any place on earth if they dislike you enough (often with local help)? I'd rather put my eggs in a Chinese (or Russian) basket than a US/UK one if I had to pick only between those options.


What are you talking about :D This isn't about whose basket you'd dump your eggs into. The point is, Telegram isn't deploying end-to-end encryption, which is why there is a basket in the first place. Had they used E2EE, it wouldn't matter where the server is located, because the server would never have built a massive trove of messages, regardless of where it is physically located.

I'm equally against everyone reading my private messages. I don't want to have to choose a country that has my messages; I choose my computer and the computer of my peer, wherever they are.


Durov's background is that he left Russia for political reasons, which include, among other things, several confrontations with the government over stuff like handing users' data for prosecution.


If they have "absolutely no qualifications to implement cryptography" then they are doing an insanely amazing job, since so few vulnerabilities were found. All that other crap about russia, censorship, reddit threads, russia, cult, russia, is beside the point here. It makes no difference to the quality of the cryptography. If they suck at cryptography then at least post code that shows why (and a fix).


>then they are doing an insanely amazing job, since so few vulnerabilities were found.

If they are doing an amazing job, then

1. Why aren't desktop chats end-to-end encrypted?

2. Why aren't group chats end-to-end encrypted?

3. Why isn't any chat end-to-end encrypted by default?

If they're so good, why do SO MANY vendors like Signal, Element, Wire, WhatsApp, Session, Threema, Briar, Cwtch manage to pull off ubiquitous end-to-end encryption?

They have the money, so why can't they pay a cryptographer to design a protocol with best practices and modern primitives?

>cult, russia, is beside the point here. It

They're not beside the point. When the E2EE is practically non-existent, the project becomes practically indistinguishable from honeypots like the recent Anom, and you have to ask whose honeypot operation it would be. As for the cult aspect: if you're not part of the cult fanbase, surely you can stop defending a random company, and take a rational stance and condemn Telegram for not deploying ubiquitous E2EE? It's not like their doing so would hurt you in any way. Surely they, in their infinite wisdom, can implement it?

Also, I don't need to fix their crappy protocol to be able to criticize it. I don't need to show you the line of code for you to agree there is no E2EE for group chats etc. As for my job, I managed to implement ubiquitous end-to-end encryption for my project, TFC, even for groups, on an extremely constrained split-TCB HW architecture. If a CS major can pull it off in such circumstances, why the f can't a multi-millionaire with award-winning mathematicians?


You are missing the point though. If I as a random person create a client with E2EE, then whether it is a good client otherwise, or E2EE by default, doesn't matter to the question "Are they doing a good job at cryptography?". It would be like saying "That expensive and very beautiful plate sucks because the cake you put on top of it is disgusting". Irrelevant to the discussion at hand.

>surely you can stop defending a random company, and take a rational stance and condemn Telegram for not deploying ubiquitous E2EE

Yes I can, and I do, by comparing Telegram to likewise badly made clients (like iMessage), but this is still beside the point that you seem to avoid: if they made a piece of software with no professional knowledge of the area it was made for (cryptography/E2EE), then they are extremely good at what they are doing, no matter whether you like the default settings or not.

I don't use Telegram, but this comment thread is nothing but a tirade against something because it is Russian. If you wanted to discuss the actual work, then there's no reason at all to mention Russia. You criticize them, say a CS major can pull it off too, and then try to argue that their home country and default options are proof of how vulnerable their software is. It is not proof. Code quality, default settings, and where you live are not the same thing. Your comment reads as "TL;DR Made in Russia, hence bad".


My guess for 1 and 3: because they haven't figured out a way to sync this to all devices.

For 2, maybe scaling issues? Some groups tend to be like public IRC channels.


Telegram already has two different group categories: normal and super groups. Telegram bumps 200-member groups to super groups. Signal's group E2EE scales up to 1000 members (a hard limit). It's both doable and scalable far beyond group sizes that have a reasonable expectation of privacy.

I.e. they could drop end-to-end encryption after the group grows too large, e.g. after 200, 500 or 1000 members.

Public IRC-channel groups don't need E2EE. The overwhelming majority of groups are small, and would benefit tremendously from E2EE.

As for syncing messages between devices, it's doable, as shown by Signal, Wire etc.


Of course it's doable, but citing Signal for it ... you don't get old messages, only new ones. Might be a security "feature" but then again, let me export/import the history manually.


I was wrong about the CVE numbers not being there at all; the original article did state Telegram didn't prioritize them, but here we go: https://nvd.nist.gov/vuln/detail/CVE-2021-36769


While bashing Telegram is basically HN's favorite pastime, there aren't many alternatives that work just as well in the area I care most about: being a great chat app. Telegram is fast, has native clients instead of a garbage web ui (or electron based app), has better stickers than the alternatives, is easy to write bots for and has a larger user base where I'm at than any of the alternatives other than WhatsApp.


>Telegram is fast

It's fast because it's insecure. Were they to deploy end-to-end encryption for their groups, it would be as slow as the competition.

>has native clients instead of a garbage web ui

The "native clients" are used to manage data stored on a remote server over an encrypted tunnel. The data that sits on Telegram servers is plaintext. The security benefits of native clients (e.g. handling E2EE in a secure way) don't really play out when the desktop application doesn't feature end-to-end encryption at all, not even for 1:1 chats.

>has better stickers than the alternatives,

My Signal client has a 1:1 match of my Telegram sticker packs. This is a non-issue. Porting them is trivial.

>has a larger user base

Sure, which means a larger amount of your data is hoarded by the Mark Zuckerberg of Russia who will take zero responsibility if his servers are hacked.

Just a thought: you should probably refrain from advertising features of an app whose glaring security issues you can ignore only out of privilege. There are a LOT of oppressive regimes in the world, and people living there shouldn't take your advice.


I don't understand your first two rebuttals. I didn't want to keep talking about the fact that Telegram is not secure by default (there are plenty of other comments that go into this); I understand this, thank you.

It doesn't have anything to do with the competition being unable to provide a fast _native client_ that I can use. I dislike web based apps, and I refuse to use them whenever a native alternative is available.

> It's fast because it's insecure. Were they to deploy end-to-end encryption for their groups, it would be as slow as the competition.

Doubtful - it would increase the amount of compute required on their side, sure, but if done properly it will have close to 0 impact on the end users in terms of latency.

> Mark Zuckerberg of Russia who will take zero responsibility if his servers are hacked.

As opposed to _x_ from country _y_ that WILL take responsibility if their servers are hacked? Please.

I will continue to talk about what I like about the user interface and what it does better than the competition because I want to.


[flagged]


> I'd use native Signal desktop client written on Rust if there was an option. But there isn't, thus I'm using the Electron app.

I’m not going to say anything else other than that this is exactly what Signal wants. They are actively hostile towards efforts to make third-party clients.

You can argue that this is for security reasons, and maybe I’d buy that.

But I would also bring up the fact that they implicitly control both ends, in a binary-distributed or web-based (thus backdoorable) fashion, most of the time. I'm not saying they do that, but since they have control of the clients, the servers, and the protocol, it is not impossible.


Reproducible builds help here. Not all clients are supported yet, I'll give you that, but perhaps you can see how far injecting backdoored clients at mass scale is from "users sending a plaintext copy to the server every time". Both achieve the same thing; one would fill the headline of every newspaper in the world if it happened once, the other happens right now, billions of times a day, and nobody cares. There's no perfect verification process, but there's a metric fuckton of low-hanging fruit we need to address first.


HN is not the place where you get to tell people to check their privilege or to warn people. Write a blogpost, or write to dang@ if you feel strongly about it.


Most of Telegram's advantages come from it not being end to end encrypted by default. They've proven that simply claiming to be the most secure app is better for most users than actually being the most secure. Because true security comes with drawbacks, and users don't like drawbacks.


Most of Telegram's advantages come from:

* anonymous accounts. You need a phone number to set up an account, but you don't need to share your phone number to talk to people on it

* network effects. The sticker game is great on Telegram because there are so many people on it making stickers.

* the secure chat *option*. It has an option of being secure, and therefore people think it's secure, and when they find out it's not secure all the time, they're still happy that it has the option of being secure.


A good app offers options. While most HN folk don't like Telegram's default, it at least gives you somewhat of a choice (the user can force the E2E option even if the other party doesn't like it), though they do not give the option to make all chats E2E by default, which is indeed a dark pattern. The other apps just shove possibly backdoored E2E down people's throats with no choice at all.


>The other apps just shove possibly backdoored E2E down people's throats with no choice at all.

You can check e.g. Signal is not backdoored by reading the source code. You can find it here https://github.com/signalapp/

You can verify that the client you downloaded from the Play Store hasn't been tampered with. Instructions for that are here: https://github.com/signalapp/Signal-Android/tree/master/repr...

Your post is slightly ironic, considering Telegram doesn't give you any choice about using E2EE for groups, or for any chats for that matter, on the majority of desktop clients. Know this: when you're not using end-to-end encryption, i.e. when you're using Telegram cloud chats, those chats are by definition as private as a backdoored E2EE chat would be. So one could argue they are front-doored by design.


Signal just spent a year being closed source.


This is an incorrect take. It's true that Signal's server code had not been updated for many months, but that doesn't have an effect on security as messages are end-to-end encrypted locally on the clients/apps. The client/app code had been consistently updated during this period.

Signal responded to the server code being outdated as well: https://github.com/signalapp/Signal-Android/issues/11101#iss...


But Signal won't allow federation and is hostile to independent client development.

Also, what about the server source code?


Signal choosing not to federate isn't in the scope of this thread. IMO not federating is a guarantee of client quality. The last time I had a look at Matrix E2EE implementations, only Riot was barely usable. There were a ton of clients that didn't support it, didn't intend to, or didn't have the confidence to go about implementing it, etc.

>Also, what about the server source code?

https://github.com/signalapp/Signal-Server There you go, last commit two days ago.


It's not a quality guarantee for clients in practice since you can't control Client Side with open source

Regarding Riot, which is called Element by now, you may want to update your knowledge


Yeah I'm familiar with Riot being called Element now, and you don't need to look further than this chat for me to state that.

"It's not a quality guarantee for clients in practice since you can't control Client Side with open source"

What? An open-source native client with reproducible builds is literally the gold standard of individual control over software; it's even GPL-licensed.


> IMO not federating is a guarantee of client quality.

That's not how it worked out for email long-term; why should IM be any different?


I know Telegram’s default is to give all the data to the server. But being honest about this makes them able to provide significant UX benefits. Backdoored E2E gives a false sense of security, and, backdoored or not, E2E’s UX sucks. (Also note that the backdoor/malware could be at any level of the stack, from WhatsApp’s plaintext backups to NSO’s zero-days.)


This, seriously. I've switched to mostly Matrix chat (with bridges for backwards compatibility) and I seriously miss the Telegram experience. The official client is your run-of-the-mill Electron chat app, which comes with the standard lag and slowness. Alternative clients either lack features (notably, E2EE), are based on other terrible frameworks (Flutter, QML+Python, etc.) or don't work on the platforms I want to run them on.

Spectral looks promising, and so does Fractal in a sense. I've had trouble getting E2EE to work on Nheko, and everything that does work is slow and laggy compared to native chat clients.

One day I will probably get frustrated enough to hack the Matrix protocol into the Telegram app, if someone hasn't done that already by then. The lack of anything better annoys me to no end!


I use WhatsApp and it's pretty good for messaging friends; for large group chats I don't find Telegram useful and I find Discord much better.


Matrix or IRC if you hate yourself.


I'm a long time user of IRC, but none of my irl friends are on it (anymore).


IRC for me is a way to seek support for OSS projects and nothing else. It is not really designed to be an IM for phones...


IRC is really hard to do well, since it requires a persistent TCP connection to really be useful :/


Combine the parent's suggestions and use IRC over a Matrix bridge. It checks all the boxes when it works and as their comment mentioned you'll learn to hate things when it doesn't!


FWIW there are efforts to prevent this being required and they’re quite close to mature.

I currently run an irc network with these extensions.


I use a self-hosted web client (thelounge). It's still not good for casual IM usage IMO, but it's pretty painless.


I run znc on my server and Revolution IRC on my android phone and couldn't be happier about it.


> has better stickers than the alternatives

do stickers count as a feature or a bug?


It is tiring that this keeps happening. These media outlets publish misleading titles and abstracts implying that Telegram is not secure, knowing very well that most people don't even read the articles. I assumed good faith for the first ~100 times, but it is always the same :D

Anyway, wouldn't it be better to link to the academic source ( https://mtpsym.github.io/ ), or at least to remove click bait titles?


It's tiring to see people claim Telegram is secure e.g. "because it hasn't been hacked yet" :D These people don't realize Telegram is front-doored by design: it leaks 100% of your chats to the Mark Zuckerberg of Russia, just like Facebook Messenger leaks 100% of its messages to the Mark Zuckerberg of the USA.


I did not claim Telegram to be secure. It has nothing to do with what I said. Moreover, saying that something "is secure" does not make too much sense, without specifying secure against what.

Assuming you are in good faith, I will try to explain better: The title of the article states there are vulnerabilities in the encryption protocol.

According to RFC 4949 a vulnerability is:

> A flaw or weakness in a system's design, implementation, or operation and management that could be exploited to violate the system's security policy.

Clearly stating that there are vulnerabilities in Telegram's encryption protocol raises concerns, a lot of confirmation bias among Telegram haters, and leaves people who only read the titles with the feeling that Telegram encryption is vulnerable to attacks.

However, among the 4 flaws reported by the researchers, 3 are not exploitable ("This attack is mostly of theoretical interest", "Luckily, it is almost impossible to carry out in practice", "Luckily, this attack is also quite difficult to carry out, as it requires sending billions of messages to a Telegram server within minutes") and the other one is about reordering encrypted messages.

Therefore, a more fair headline which would undoubtedly raise less interest could be "Researchers found a way to change the order of your Telegram messages, even if they still cannot read them", or "Researchers found some purely theoretical or almost impossible to carry out vulnerabilities in Telegram's encryption protocol".

And don't even get me started on the fact that literally everybody, including expert security researchers, feels entitled to bash Telegram for having rolled their own crypto at every chance they get.


>leaves people who only read the titles with the feeling that Telegram encryption is vulnerable to attacks.

I agree with you that these attacks are not so severe that they completely broke Telegram. But it is living proof Telegram authors don't have the know-how on how to implement secure protocols. If you heard some bridge builder had replaced every third bolt with fifty zip-ties, you wouldn't be defending the bridge, you'd want to know who the f is overseeing that project, and to ensure the entire design was being reconsidered and that qualified engineers were working on the fixes.

This set of vulnerabilities isn't an indication that Telegram's encryption is bound to have a breaking vulnerability. It's saying they don't have the qualifications to protect the data we know sits on their servers effectively in plaintext. And I'm saying effectively, because sure, it's encrypted, but the database key sits in RAM, 4cm away from the CPU, and is one privilege escalation vulnerability away from compromise.

You using the term "Telegram hater" does a disservice to everyone, because you're lumping together people with no tech background parroting headlines and legitimate concerns from people who've actually spent time looking into this on a technical level.


> But it is living proof Telegram authors don't have the know-how on how to implement secure protocols

I strongly disagree with this claim. Can you back your claim with some evidence? The vulnerabilities shown here are mostly purely theoretical, I don't see how this goes to show that Telegram engineers are incompetent.

What I see is that Telegram engineers chose to ignore what the Computer Security academic community regards as best practices, and this has led to an infinite amount of criticism (including by the authors of the vulnerabilities we are discussing). Despite this, in the ~8 years since launch, the only serious vulnerability I am aware of was discovered and immediately patched right after Telegram first launched.


>I strongly disagree with this claim. Can you back your claim with some evidence?

Absolutely. Telegram isn't end-to-end encrypted by default. The author admits so here: https://telegra.ph/Why-Isnt-Telegram-End-to-End-Encrypted-by...

Q.E.D.

This set of 4 vulnerabilities isn't the issue with Telegram. Vulnerabilities can often be patched. The issue is the fundamental way Telegram functions.

Also, since you're obviously going to claim the article justifies it as a design decision, read my rebuttal here before replying https://telegra.ph/Why-you-should-stop-reading-Durovs-blog-p...

Finally, I'm a bit puzzled, you seem to be "open minded" yet your post didn't even touch on this massive issue of failure to provide E2EE for groups, desktop clients, or anything by default. Were you unaware of it? Or would you argue that the endless list of competition that actually does E2EE properly (Signal, Wire, Threema, Element...) overdoes security?

You're also not even remotely interested in agreeing with the academic community, but instead just observe and basically imply: "no breaches have been made public, therefore it must be secure". How familiar are you with the field of computer security, do you know how security is quantified?


Let's recap what is happening here, because we are going a bit off-track with this discussion.

My original post was about the fact that I am tired of media outlets making borderline denigratory titles all the time about Telegram.

You replied, stating that I claimed that "Telegram is secure", which I did not do. Then, I tried to clarify my original post.

Then you claim that these vulnerabilities show that "Telegram authors don't have the know-how on how to implement secure protocols". I asked you to back your claim, because I don't see how the discovery of a bunch of "almost impossible to carry out in practice" vulnerabilities might imply that Telegram's engineers are incompetent.

To which you reply that "Telegram isn't end-to-end encrypted by default". Now, unless I am missing something obvious here, you just stated a fact that has no relevance whatsoever with your former claim. The claim to prove was "Trivial vulnerabilities discovered --> Telegram authors are incompetent". Now, if you changed your mind, and want instead to argue that they are incompetent because they did not implement e2ee by default, it's a totally different discussion and has no relation at all with my original post, nor with the article we are commenting (imo).

> Finally, I'm a bit puzzled, you seem to be "open minded" yet your post didn't even touch on this massive issue of failure to provide E2EE for groups, desktop clients, or anything by default. Were you unaware of it?

I am aware of how Telegram works. But why do you suggest I should have talked about this? It is totally unrelated to my original point.

> Or would you argue that the endless list of competition that actually does E2EE properly (Signal, Wire, Threema, Element...) overdoes security?

I never stated such a thing.

> You're also not even remotely interested in agreeing with the academic community

It's not that I am not interested in agreeing with them. I am openly criticizing the behaviour of some of its members. It's a different thing. But also this is a different discussion, and I should not have included that comment, maybe.

> "no breaches have been made public, therefore it must be secure".

I did not claim this.

> How familiar are you with the field of computer security, do you know how security is quantified?

Please do not patronize me.

Finally, I am not interested in having a discussion that is unrelated with the topic of the article, or my original comment about it (because it would be too long and tiring). However, if you want to know my opinion on all this related issues that you brought up, you can read what I wrote about it here: https://germano.dev/whatsapp-vs-telegram/ (even though this does not talk about Signal or other open source e2ee messengers).


>Now, if you changed your mind, and want instead to argue that they are incompetent because they did not implement e2ee by default, it's a totally different discussion and has no relation at all with my original post, nor with the article we are commenting (imo).

No, I didn't change my mind. The incompetence is all around. Both the presence of these vulnerabilities AND the fact that Telegram's E2EE is practically non-existent tell of the incompetence. The vulnerabilities here are not the major problem; focusing on the vulnerabilities is seeing the trees without the forest.

If every time there is a discussion about Telegram's issues we only focus on the narrow set of already fixed vulnerabilities, there's never a place to discuss the elephant in the room: that the whole game is rigged. The backdoor is massive, right in front of us, and nobody's doing anything to fix it. These security issues do not matter until the glaring hole is fixed.

>Please do not patronize me.

That wasn't my intention. I was genuinely interested. Because if you look at the infosec bubble on Twitter with big names like Matt Green, JPA et al. they all know about these issues yet don't even bother to name them. It's like the uncle you never talk about.

Given that you wrote your article before Signal even had desktop clients, I don't think it's even remotely up to date to vouch for any kind of fruitful discussion. But! Let me know if you update it at some point, I'm sure I'd like to read it then!


> there's never a place to discuss the elephant in the room: that the whole game is rigged. The backdoor is massive, right in front of us, and nobody's doing anything to fix it

I am tempted to take the bait, and ask you what would be this massive backdoor, which nobody has time to discuss. If I am guessing right, you are still referring to "no default E2EE". In that regard, I would encourage you to consider that not everybody has the same security requirements, and many people are fine trusting Telegram and with the security it provides.

Personally, I cannot wait for Matrix to become more widely adopted, and to see the UI/UX of their clients to become remotely comparable with the one of Telegram.

Anyway, since it doesn't seem our discussion is going anywhere, maybe it's time to stop.

Thank you for the chat, I liked how we managed to stay polite even though we completely disagree :)

> Given that you wrote your article before Signal even had desktop clients, I don't think it's even remotely up to date to vouch for any kind of fruitful discussion

Yeah, I intentionally did not want to compare it to Signal (because the article was already too long that way).


>many people are fine trusting Telegram and with the security it provides.

So here's my concern: They would not be fine with waking up one morning with their entire message history out in the open after a massive hack. Surely you can't argue Telegram will never be hacked. Facebook has had multiple data breaches and I've never heard anyone be happy about that. This is what I've had to be second hand witness to https://www.wired.com/story/vastaamo-psychotherapy-patients-... I've seen the devastation someone's most private life out in the open does to them. I can't think of many things more terrifying than that.

There's a reason I made TFC (my work) E2EE by default. There's a reason Signal, Wire, Threema, Element, WhatsApp, and Session all felt they didn't want to be liable for user data.

>Personally, I cannot wait for Matrix to become more widely adopted, and to see the UI/UX of their clients to become remotely comparable with the one of Telegram.

Yeah, Element is improving and will get there, and Signal's polishing the UX, hopefully adding usernames etc. by the end of the year.

>Thank you for the chat, I liked how we managed to stay polite even though we completely disagree :)

Likewise!


Being in academia and knowing its culture, I will help translate. The cryptographers studied the Telegram protocol and did not find practical vulnerabilities. So to pay their "academic debts" they published a paper on a vulnerability which can mess up the sender's messages in a random way.


Since you know the culture in academia so well you may also be familiar with that custom where you glance over the actual paper before making assertions about it. Not even asking you to read it, just glance at it.


Yes, I am guilty of that. The original source [1] is indeed much more nuanced and humble than I thought it would be.

[1]:https://mtpsym.github.io


Were you in academia, you'd know Telegram should've hired a cryptographer when they started, almost a decade ago. They have never done that, because to them pride is more important than doing things right. You, an academic, spending your Sunday defending some random messaging app company online, and doing it for free, is almost as stupid as the parent company's decision to favor grass-roots damage control over secure-by-default design.


You seem to have a chip on your shoulder regarding telegram. I’ve seen your comments up and down this thread and they do not merit rebuttal because they’re so dearly held opinions.

But let me be clear: all cryptography is “untested” or “unused” at some point. There’s nothing inherently wrong with making a new cryptographic method.

Criticisms I would agree with are:

1) Advertising as a secure messenger (or: the most secure) as it is stupid to say “most” and the security is untested.

2) bug bounties on the protocol as a marketing method to show that it’s not broken.

3) being of russian origin (which, obviously they can’t control, but if this was a US company doing this it wouldn’t have gotten so much negative press).

—-

What we might forget, especially as telegram marketing is so slick and the art so good: building things is hard, we might not always make the right move- it’s fair to criticise but I doubt any of us would do markedly better, we’d just make different trade offs; and some other forum for “accessibility design” would be lambasting us for making our own keyboard or something.


>You seem to have a chip on your shoulder regarding telegram.

You need to understand I don't see messaging apps as living things. I see muscles, bones, nerves and veins. I have no grudge toward any app. That would be silly. My problem is with dangerous implementation of cryptography in general. It's completely agnostic of vendor. I've criticized a myriad of apps in my lifetime from Palringo to Foocrypt to DataGateKeeper to Telegram to TimeAI (lol): bullshit crypto has many forms. I've even criticized apps I now find more or less good, such as Threema and Element (during Riot times).

> all cryptography is “untested” or “unused” at some point

That's not the problem. Telegram's cloud encryption doesn't become "tested" at some point. It's fundamentally broken because by definition the decryption key is with Telegram (the service provider), and NOT your peer.

"Advertising as a secure messenger (or: the most secure) as it is stupid to say “most” and the security is untested."

Yeah I tend to agree. If you want to take a look at how far security design in secure messaging rabbit hole goes, my research might be of interest https://github.com/maqp/tfc

>being of russian origin

No, please don't take my comments to infer anything from something being of Russian origin. There isn't a "Russians are bad" aspect to my criticism. There is Durov's military training, there are connections to the government from VKontakte days, and there are technical deficiencies that are indistinguishable from state-sponsored honeypots. I'm not as interested in WHO Durov might give keys to access the servers. I'm interested in why it's dangerous that it can happen in the first place: lack of ubiquitous E2EE.

>telegram marketing is so slick

Exactly, they have a fantastic social media team that's expert at handling criticism with snide remarks, memes, and pop culture references. They really get people. And I find that terrifying. The platform's an orgy of fun, and all I see, looking at the veins and bones, is another Facebook. Hundreds of millions of people living 90% of their social life through an app they think stands for them, without understanding they're feeding another monster.

Durov should know better what is ethical to build, but he doesn't care. Even if he didn't collect data for the purpose of using it against people, it's his responsibility to know he's not all-powerful, his servers are not hack-proof, and the data that sits there is a tremendous liability.

Moxie might not be able to pull off the UX, but at least his heart is in the right place, and he's come the closest wrt design that's secure by default. But the most astonishing thing was the v2 group design, which required pushing the boundaries of cryptography as a field. That was an incredible achievement. It seems all we can hope for now is that Signal's features will one day match a large enough portion of Telegram's footguns. Lack of usernames, markdown mode, and replying with stickers are the main gripes for me ATM. Or perhaps Threema, Wire, or Element will catch up and surpass Signal. Time will tell.


It doesn't matter much what vulnerabilities there are when a large chunk of your users don't even use e2e encryption. For a while recently I had used Telegram to communicate with a few people, and as an experiment I did not turn on e2e myself. Well, neither did any of my counterparts (even the more security conscious ones). I suspect people are inclined to use it only in special cases, as the UX rather gives off the vibe of pulling your counterpart into the cover of a dark side alley at night (rather than being the default of a presumably secure messenger).


It has severe drawbacks if you use e2e. The default is without because that's what most people want.


"It has severe drawbacks if you use e2e. The default is without because that's what most people want."

I would suspect the majority of users would answer "no" to the question "are you ok with Telegram keeping a plaintext copy of your messages".

The most technical answer you can expect is "All Telegram chats use MTProto encryption. Their End-to-end encryption protocol is called MTProto. Ergo, they can't see my messages".

Ask any user if they want privacy or no privacy. 0% will tell you, oh I explicitly want no privacy.


What's so hard to understand about having a choice? Telegram doesn't keep a plaintext copy of your messages if you use secret chats. It's up to you, besides the fact that it's also completely voluntary to even use Telegram.

>The most technical answer you can expect is "All Telegram chats use MTProto encryption. Their End-to-end encryption protocol is called MTProto. Ergo, they can't see my messages".

Sorry, but that's just stupid. Any user who cares can clearly read which chats are e2e encrypted and which are not. Zero technical knowledge is needed. The name of the protocol or other irrelevant info is also not needed. The reality is that people often just don't care.

>Ask any user if they want privacy or no privacy. 0% will tell you, oh I explicitly want no privacy.

Again, this is at the verge of pure stupidity. I don't even know what your argument is here. No one forces you or anyone to use Telegram, nor is the question privacy versus no privacy. It's mostly privacy versus convenience, and more likely it's use-cases versus nothing. Because other solutions don't have public channels and public groups and other Telegram features people want. And "privacy" is meaningless for stuff meant to be public. Anonymity would be the next best thing, and Telegram gives that, although they ofc know who posts on a channel since there is no feasible way to change that. They have to know who's responsible for publicly posted content.

If you dont want or need any of that dont use it, its that simple.


>What's so hard to understand about having a choice?

It's absolutely not hard. Literally my point :D I want choice but there is no choice. I can't enable secret chats on desktop clients. I can't enable secret chats for groups. The choice is choosing between another app entirely, not from within options in Telegram.

>It's up to you, besides the fact that it's also completely voluntary to even use Telegram.

That's like saying "if you don't like society, don't criticize it, join the Amish". I have every right to criticize Telegram; if you think they're immune to criticism, understand that's a dangerous situation. It's really weird that my being concerned about your (and others') safety is a threat to your self-esteem. If Telegram is part of your personal identity, you're in a world of trouble.

>Any user who cares can clearly read which chats are e2e encrypted and which are not.

No, 90% of people don't understand what encryption is. 99.9% of people don't understand the distinction between E2EE and cloud encryption. Your views are distorted by your bubble. Also nice work with the weasel word "who cares". So let me say this: if you care about privacy, you read my comments with the mindset "this guy is trying to help me", not "this guy is trying to attack my self-esteem".

"Its mostly privacy versus convenience"

It absolutely boils down to that when we start to compare applications. I'm not denying that. But there are two ways to achieve the same level of convenience: one is easy and insecure, the other is hard but secure. The problem isn't e.g. "you can't do end-to-end encrypted group chats", the problem is actually "it's hard to do end-to-end encrypted group chats". We know, because apps like Telegram have failed to do that.

So it's not "use-case vs nothing", it's not "insecure group chat or nothing". It's "Telegram lacks know-how on how to create secure group chats vs Signal knows how to make secure group chats". Surely you must see this.

"Because other solutions dont have public channels and public groups and other telegram features people want."

But now you're lying. Signal's groups can be public, because the group link can be made public. If you want to make the argument that Signal groups are not searchable inside the app, or that Signal doesn't scale to massive group sizes, then by all means do so. Please understand I'm not advocating E2EE for massive, 1000+ member groups, I never have. The point is, small groups have the right to privacy too, and there is no technical reason non-super-groups can't have E2EE, with upgrading to supergroups dropping the E2EE.

"they ofc know who posts on a channel since there is no feasible way to change that."

Funnily enough, Telegram could make all clients connect via Tor by default, not require phone numbers, and thus, make everyone anonymous. But I can totally see why it's not feasible. E.g. the E2EE calls in telegram would stop being low latency. Adding six relay nodes between users would make it unusable. So you see, I'm not blind to real life technical limitations. But I'm not afraid to point out bad design decision when there is no actual technical limitation, either.

The fact is, Telegram doesn't know how to make E2EE sync across devices. The current design allows two people who each own an iPhone and an iMac to have secret chats. But this is what it looks like: one secret chat between phone and phone, another between desktop and phone, a third between desktop and desktop, and a fourth between phone and desktop. So four chats in total. And you can never know if the contact has read the message.

Compare that to Signal that has one chat per contact, seamlessly synced on all devices. The convenience is right there, embedded in the clever key management system.
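
To make that arithmetic concrete, here's a toy sketch (Python, made-up device names; not how either app actually models its sessions) of why per-device secret chats multiply while a per-contact chat stays at one:

    from itertools import product

    # Hypothetical device lists for two users
    alice_devices = ["alice-phone", "alice-desktop"]
    bob_devices = ["bob-phone", "bob-desktop"]

    # Per-device secret chats: one independent chat per device pair
    pairwise_chats = list(product(alice_devices, bob_devices))

    # Per-contact model: one logical chat, synced to every device
    per_contact_chats = 1

    print(len(pairwise_chats), "device-to-device secret chats vs", per_contact_chats, "synced chat")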

"And "privacy" is meaningless for stuff mean to be public"

I ABSOLUTELY agree with you. But surely you have your own friends, and you form groups with them. The stuff you say to each other is not public. And if you disagree with that assessment, surely, you don't think every other group of close friends, family etc. have nothing to hide from the rest of the world?

I say, let the public channels be public. I'm not trying to rob you of the enjoyment of things like the millions of cryptocurrency spam channels. That stuff doesn't benefit from E2EE. But I say, give us the choice to E2EE small groups and 1:1 chats regardless of platform.

Why are you opposed to users having technically guaranteed privacy in these contexts?

"If you dont want or need any of that dont use it, its that simple."

The problem in modern society is, we often don't really have such a choice. I can't go and explicitly tell hundreds of contacts, let alone hundreds of millions of people, everything I know. The ones I've told have been absolutely horrified to learn Telegram is not nearly as private as they'd previously thought. And everyone has thought Telegram is right there, next to Signal. So kudos to the Telegram marketing team. If only the engineers were able to actually deliver privacy tech that matches the public's perception.


>The choice is choosing between another app entirely, not from within options in Telegram.

That's exactly what you should do. Telegram is not the tool/service that does what you want. It is just as much not what you want as Skype isn't what you want, and MS Paint is probably also not what you want. Any and all criticism is pointless; there is nothing wrong with these tools, they just don't do what you wish them to. Leave them to the people who actually want the features that Telegram/Skype/Paint or any other tool provides.


Part of my point. I use WhatsApp and Signal, both using the Signal protocol, and experience no drawbacks whatsoever.

In any case, for a messenger that advertises as secure, placing security after convenience seems pretty unusual.


No drawbacks? Whatsapp requires your phone to be on and have Whatsapp running in the background if you want to use it over Whatsapp Web. Also the amount of Messages and Media that Telegram delivers to me would not fit on my Phone.

Also let's not forget what a bad user experience Whatsapp offers apart from that. No bots, no channels, no hiding your phone number, no polls, limited permission system in groups,... i could continue for 20 minutes. But what is most infuriating to me is the backup system. It never works and you can only store it in iCloud/Google Drive.



Still, all the other points. And I am REALLY not sure if switching from a company that is known to try not to give data to governments to Facebook, which makes its money by analysing and selling data, would make a lot of sense.


Ah. I’m not using any of those features, I just use it for text messaging.


If you use Whatsapp as a SMS replacement, then you'll be fine.

Many Telegram users use them for chats with large groups of people (hundreds or thousands) with native bot support being a huge thing.

Whatsapp and Signal don't give many tools for handling groups that big, with Telegram you can actually manage it.


May I recommend Matrix, where group chats can actually be secure. By using Telegram for that you are basically leaking your group chats to a tiny company of unknown structure, as well as whomever they may be liable to be influenced by.

If you only use Telegram for convenience and not security, then I guess my point upstream stands.


No drawbacks over what? Over WhatsApp before they got e2e? Compared to Telegram both of these are 5+ years behind. And they have a different use case for a lot of users, Telegram being some kind of social media IM with a focus on public content and public messaging where the whole security thing is moot anyway. Personally >99.9% of my messages on Telegram are public or at least accessible by strangers whom I have zero reason to trust. There is no point encrypting such messages. They are similar to HN comments. I consider these messages public but still like the fact that neither Google nor FB indexes or analyses them to send me spam/ads.

>for a messenger that advertises as secure

It's pretty much PR nonsense, but to be fair it also comes from the early days when Telegram was the only IM with any e2e that had a relevant market share. I think it's understandable that they will not remove "secure" from the description after all those years; it would raise some eyebrows if they suddenly no longer said it is secure, right?


Telegram's implementation has severe drawbacks that people don't want; e2ee in general does not have to have those drawbacks.


Maybe try Telegram someday? Because you don't seem to know what it can do. The features it has are NOT possible with e2e. It's not an implementation flaw, it's an implementation decision they made on purpose, and they clearly explained in their FAQ why they made it.

>e2ee in general does not have to have those drawbacks.

How come no other messenger simply copies Telegram's features but with e2e? Also what exactly is the point of e2e chats if every user that joins gets a key to decrypt the whole chat. It's completely useless "security".


> The features it has are NOT possible with e2e. It's not an implementation flaw, it's an implementation decision they made on purpose

Oh don't be so vague, go on and name one!

> How come no other messenger simply copies Telegram's features but with e2e?

I've been wondering the same thing, especially for something as young as Matrix, which didn't yet have a decent client itself (at a time when Telegram was already good).

I've also been thinking of doing exactly that myself, but then it's basically a full-time job to get something halfway decent. People have worked on reverse engineering a server: not an encrypted server, just any working self-hostable server at all. I don't remember the repository name but it's on GitHub. Even this was abandoned because it's just a ton of work that very few people are going to use. Additionally, modifying the clients to work with a different protocol is even more work.

So I can see why people aren't doing this because I'm one of them, but yeah what I don't get is why the folks at Signal/Matrix/Threema/etc. would rather start from scratch than do this.

It might be easier to have some sort of plugin that overlays encryption on top of Telegram and uses the existing servers. On F-Droid there's this project called OverSec <https://f-droid.org/en/packages/io.oversec.one/> that I've been meaning to try, though presumably it hooks in at the input-field level and so couldn't handle images, so it's not a full solution. You'd need to really modify the client (every one of them) and everyone needs to use your custom clients.


>Oh don't be so vague, go on and name one!

See my reply above.

>I've been wondering the same thing...

It's not possible (or feasible) to implement the key Telegram features in Matrix or other always-e2ee protocols.

>It might be easier to have some sort of plugin that overlays encryption on top of Telegram [...] and everyone needs to use your custom clients.

It's against the Telegram API ToS. You are not allowed to implement features that require other parties to download your client. You can only add local features that do not affect other users who use the official client.


>>Oh don't be so vague, go on and name one!

>See my reply above.

Yeah I saw that. You wrote:

> Not even gonna read it all Have a nice day and keep using whatever you want

so I don't see why I should bother trying to answer your question as well.


You should not. I don't care which IM app you use, or why, or why you don't wanna use Telegram. And you don't care or understand why people use Telegram and what they use it for.

I tried to explain it but people like you don't want to know. You just want to argue over e2ee nonsense that no one who uses Telegram for public conversations cares about. You could ask HN to make comments e2ee; it's about that "useful" for what I use Telegram for. I want my messages to be read, it's that simple.


So Telegram's ToS conveniently forbids end-to-end encryption plugins. :D That's amazing security design!


You are such an idiot


What key features are you talking about?



Well, then your key features are already implemented. So much to that


>The features it has are NOT possible with e2e.

Which feature?

>clearly explained in their FAQ why they made it.

They absolutely did not.

>How come no other messenger simply copies Telegram's features but with e2e?

Which feature isn't being copied? Signal just implemented effin stickers with end-to-end encryption.

>Also what exactly is the point of e2e chats if every user that joins gets a key to decrypt the whole chat.

Firstly, new users in e.g. Signal groups don't get access to group message history. Secondly, overwhelming majority of Signal groups are not public. Your claim that E2EE in groups is useless assumes anyone can join any group. That's not the case, therefore your argument is completely baseless and thoughtless.


>Which feature?

Almost all. Either not possible or not useful. Telegram has a huge public community. People need to be able to join/leave and forward stuff to other places etc. It just makes no sense to add any e2ee to that. Even in normal groups the default is that any user can add someone so the new user would need to get the decryption key on invite. Then you have bots that need to be able to read messages so they need the key too. Then you have cloud search, so Telegram itself needs the key as well. It's just completely nonsensical at that point, even if it somehow were possible to implement.

>They absolutely did not.

https://telegram.org/faq#q-why-not-just-make-all-chats-39sec... I have no trouble understanding it. You may not like it but its clear and easy to understand.

>Which feature isn't being copied?

All the ones that are not possible with e2ee.

>Signal just implemented effin stickers with end-to-end encryption.

Yes, and it's an anti-feature for most people. People don't care if a sticker sent to a chat is e2e encrypted. But what users care about is how fast it is, how much data it uses and how much it drains the battery. Telegram easily wins all of this because it does not e2e encrypt stickers; instead they are stored in the Telegram cloud and the message only contains a file_id for the other user to load the sticker (once, then it is cached locally).

>Firstly, new users in e.g. Signal groups don't get access to group message history.

More anti-features; normal users want a new user in a chat to be able to read the history. (If not, Telegram allows it to be disabled.)

>overwhelming majority of Signal groups are not public.

How is that relevant? Use cases are different. I use Telegram almost exclusively for public stuff or semi-public stuff. Maybe this is the problem here. You don't use Telegram and don't know that it is way more than a messenger. It's more like a social media platform.

"I dont want or use any of that (you, probably?)" Fine, stick with what you use but dont tell me everything telegram does could/should be done with e2ee. Its absurd and nonsensical. It would be like making twitter completely e2ee rather than just give the user and secure e2ee DM option. Which is what telegram did. You have the option for e2ee one-on-one chats if you actually need it. Ive used it a few time like to send someone a password or a private document. But for the most part I have no use for it.


>Even in normal groups the default is that any user can add someone so the new user would need to get the decryption key on invite.

Signal has group invite links.

>Then you have bots that need to be able to read messages so they need the key too.

Signal has bots.

>Then you have cloud search

What you have is search, and Signal has search too, for the local message log. Telegram's log search lacks partial and wildcard searches. It's extremely inferior. Also, Telegram's cloud search gets extremely slow if you try to find anything older than few months.

>You may not like it but its clear and easy to understand.

It doesn't explain why they can't implement E2EE for normal group chats from technical PoV. Neither does Durov's blog post.

> But what users care about is how fast it is

What's the difference? Give me numbers

>how much data it uses

What's the difference?

>how much it drains the battery

What's the difference?

How about some nice facts and sources?

I think it's cute you try to label every secure feature an anti-feature, without understanding that security is a fundamental attribute of every feature. You wouldn't use a feature that leaked the content to your worst enemy, so why would you upload it to a server that, when hacked, allows your worst enemy to read it?

>I use Telegram almost exclusively for public stuff or semi-public stuff.

Sure, if you personally have nothing to hide, you're welcome to use Palringo (which AFAIK still pushes everything over HTTP) for all I care. Just don't force your privileged threat model on anyone else.

>It's absurd and nonsensical

You're in a conversation about the security of Telegram. If you don't give a shit about security, go, enjoy your life. Why did you bother coming here to brag about your privileged life that doesn't have to be concerned with security?

> It would be like making Twitter completely e2ee

That's bullshit. Twitter doesn't e.g. have groups that would enjoy an expectation of privacy. Twitter is also not a messaging app, it's social media; moreover, it's a micro-blogging site. Its content is intended to be public. Sure, the direct messages should probably use opportunistic E2EE, but it's not exactly advertising itself as "heavily encrypted", and it doesn't IMO have to be.

>You have the option for e2ee one-on-one chats if you actually need it.

No, I want E2EE for my group of 10 close friends. I don't want E2EE for 2000-member super groups. And the problem literally is, I don't have the option for E2EE 1:1 chats; they're N O T available for the Linux desktop I use. We don't actually have the option. What Telegram has is a sad excuse for E2EE 1:1 chats on limited platforms, which only functions in online debates. Telegram does NOT have E2EE in practice. If it had, I'd be having my chats on Telegram instead of Signal, and wouldn't have to raise awareness of the issue.


You ignore everything I posted and just list stuff. Not even gonna read it all. Have a nice day and keep using whatever you want (signal obviously)


When you really think about it, almost nobody can be sure that their conversations are end-to-end encrypted. The only way you can be is to verify the fingerprint on both ends, provided you have exchanged fingerprints over a medium known to be secure or in person. Nobody does that.
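
For anyone wondering what "verify the fingerprint on both ends" means in practice, here's a rough sketch of the idea (Python, toy keys and a made-up short format, not Signal's actual safety-number algorithm): both clients derive the same short string from the two public keys, and the users compare it over a channel the server can't tamper with.

    import hashlib

    def fingerprint(my_pub: bytes, their_pub: bytes) -> str:
        # Sort the keys so both parties compute the identical value.
        digest = hashlib.sha256(b"".join(sorted([my_pub, their_pub]))).hexdigest()
        # Truncate and group into something humans can read out loud.
        return " ".join(digest[i:i + 4] for i in range(0, 24, 4))

    alice_pub, bob_pub = b"alice-public-key", b"bob-public-key"
    print(fingerprint(alice_pub, bob_pub) == fingerprint(bob_pub, alice_pub))  # True
    # If a MITM had swapped in its own key on one side, the two displayed
    # fingerprints would no longer match.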


You make this idiotic assumption that chat companies are MITM attacking users. Sure, the only way to be sure there is no MITM is to check the fingerprints. But that's akin to a false dichotomy. You're not actually choosing between

1. 100% verified E2EE chat and

2. 100% MITM attacked insecure chat

When you don't check the fingerprints, when you are choosing between E2EE and client-server encryption, you're actually choosing between

1. The chat vendor having to commit a felony with mandatory minimum sentences to read your messages

2. You voluntarily sending EVERY SINGLE MESSAGE to the vendor without any expectation of privacy, and thus waiving your legal right to privacy.

So no, opportunistic end-to-end encryption is definitely not equivalent to cloud encryption.

Sure, if your personal threat model is that there must be zero chance of some messages ending up in the wrong hands (maybe you're a lawyer sending private info to a client, or naughty pics to your SO), then sure, you will want to perform the fingerprint check. But for the majority of communication, it's enough that there is a significant threat of users verifying the fingerprints: getting caught doing a MITM against users is extremely damaging for the company and, again, will land you jail time with very high probability.


I'm surprised this one even needs to be spelled out. If it turns out WhatsApp/FB have access to message contents, they will be liable. Telegram conveniently (for itself) makes that only the case if a setting is changed.

This wouldn’t matter if people gave up arguing Telegram is secure and agreed that it’s just more convenient, but this in fact is a security-related thread.


It’s verifiable, though.


The Telegram team's response to this isn't very encouraging: https://telegra.ph/LoU-ETH-4a-proof-07-16

They don't seem to go into technical details, relying on analogies like bags of sand - maybe this is comforting enough to the general public, but I'm skeptical.

I didn't read the technical details of the exploits though, so Telegram might be entirely in the right to be so dismissive.


Here is the comments from the Telegram team for the technically inclined: https://core.telegram.org/techfaq/UoL-ETH-4a-proof

Linked in the third paragraph of the submission:

> The researchers also highlighted several traits of MTProto that were changed as the result of our discussions before the paper was published. This document provides an accessible overview of these changes. For more technical details, see here.


I would like you, or someone more knowledgeable in the field to argue that the response is not accurate:

> Re-ordering

Exposed no information about the plaintext and needs additional work to become a fully fledged exploit, patched anyway.

> Re-sending: This is a purely theoretical point that has no bearing on the security of messages but is inconvenient for researchers who want to formally analyze the protocol.

Researchers being able to analyze the protocol to find faults - good thing. But can't this also work against people trying to break the protocol?

> Implementation problems - "For an analogy, imagine a door with multiple locks, one of which could be unlocked — the door to your messages could still not be opened because it had more than one lock"

I think this contradicts with the summary above, which is "recover some plaintext", which does not make sense given the analogy. Can someone explain? Possibly the only serious issue found.

> RSA Decryption: "This may sound scary but was not possible in practice"

If it's not possible in practice that seems like a bunch of other well established crypto protocols I know where it's just infeasible to brute force with currently available non state resources.


1. Re: "needs additional work to become a fully fledged exploit": We have verified this attack in practice, see the paper.

2. We give an example application where it has some "bearing on the security of messages" at the end of https://mtpsym.github.io/ under the heading "Did we really break IND-CPA?" and in the paper.

3. Perhaps a better analogy is: the remaining door requires the date of birth but luckily they kept that secret. But analogies aside: "Luckily, it is almost impossible to carry out in practice. In particular, it is mostly mitigated by the coincidence that certain metadata in Telegram [a salt and an id] is chosen randomly and kept secret." https://mtpsym.github.io/

4. By cryptographic standards this is the strongest attack, as it would imply full compromise if successful. It costs on the order of 2^32 noise-free queries within minutes, where it is not clear how many noisy queries are needed to get "one noise-free query", since we didn't want to test it against Telegram's servers. This falls significantly short of "well established crypto protocols [...] where it's just infeasible to brute force". NB: these provide guarantees also against state resources. This does not mean it is a practical concern.
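
To put the 2^32 figure in perspective, some rough arithmetic (the five-minute window below is just an assumed, illustrative number):

    queries = 2 ** 32                 # ~4.3 billion noise-free queries
    minutes = 5                       # assumed "within minutes" window
    print(queries)                    # 4294967296
    print(queries / (minutes * 60))   # ~14.3 million queries per second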

The key contribution of our paper, however, is that we prove that MTProto's symmetric cryptography (with some fixes and when implemented carefully) can give you something comparable to TLS (well, its Record protocol), which is its closest "competitor".

This proof comes with some caveats, though: MTProto is tricky to implement correctly (as highlighted by our attacks on the official clients). Secondly, MTProto relies on unstudied assumptions that you do not need to make if you use just TLS. See "A Somewhat Opinionated Discussion" at https://mtpsym.github.io/


Why are you saying this? Bags of sand is a fantastic analogy, no?


I don't like the way they communicate regarding security, but for context: So far there has been an extreme amount of FUD thrown their way from people who pushed WhatsApp and Signal.


> I don't like the way they communicate regarding security

In general they are cocky but in an expressive way. It is heartening that when papers like this are published they fix the issues despite the PR spin. I've had one email thread interaction with Telegram and it was to unban my account that got banned for messing around with old clients, I can't say they were pleasant but they did answer many of my questions.

>So far there has been an extreme amount of FUD thrown their way from people who pushed WhatsApp and Signal.

Signal's people did not inform their Android users of the potential for backdoored IME's in places like China for over a year (thankfully their site does mention it now). It was telling that the app was available and usable there until word got out that this was a very obvious/easy surveillance target for such regimes. Personally I think it's possible the Signal Foundation was under a gag order and could not report on this. But I would not rule out cockiness in their own way, to date I've never received a single response to the most basic of questions (back when I was an active user/promoter of their apps) from any of their team dating all the way back to Textsecure days.

HN discussed the IME issue around 6 months ago: https://news.ycombinator.com/item?id=25758995


> from people who pushed WhatsApp

So Facebook?


No, I'm thinking about well meaning security people who focused a bit too much on E2E-encryption and forgot that WhatsApp sent metadata in bulk to Facebook and uploaded backups unencrypted to Google and/or iCloud.


>forgot that WhatsApp uploaded backups unencrypted to Google and/or iCloud.

This might finally get fixed:

https://www.theverge.com/2021/7/16/22580800/icloud-google-dr...

Great news, because it disproves Durov's BS claim that you can't have backups without giving the vendor access.


Did he claim that?

I'm fairly sure his claim was that it was easier to do it this way.


"Secret chats are e2e-encrypted chats that never under any circumstances get backed up. Cloud chats are encrypted in the same way, but also have a built-in cloud backup. "

https://telegra.ph/Why-Isnt-Telegram-End-to-End-Encrypted-by...

Of course he's not dumb enough to step on the mine of saying it directly. But he justifies the need to split chats in the context of backups, which gives any reader the impression that you can't do client-side encrypted cloud backups of secret chats. He never even tries to explain why he doesn't try to bridge the feature gap.


As opposed to Telegram, where chats are unencrypted by default, and synced to the cloud?


are we a community of technical people or sleazy sales people?

What you say about being synced to the cloud in a way that can be decrypted is correct and can be problematic for some people but calling it unencrypted is wrong and detracts from your message.


Saying the database is encrypted when the database key has to sit in the RAM of the server is as disingenuous as saying this is proper way to lock up gems: https://i.imgur.com/0gBOPoQ.png

You're basically arguing that "you can't say it's not locked when the lock is right there"


> Saying the database is encrypted when the database key has to sit in the RAM of the server

First you are replying to something I didn't say.

Secondly, going down that path and applying it as strictly to everything, you'll find that nothing is encrypted: not your communication with the bank, certainly not WhatsApp, as it stores temporary databases unencrypted (not only with a password stored in RAM), and probably not much else either. In fact I guess if we apply your criteria to Signal clients, they are not encrypted either.


Your comment says nothing about what GP actually stated.


Calling Telegram unencrypted isn't practically wrong when the key is stored right next to the data.


This is pretty short and to the point: https://mtpsym.github.io/


This is a write-up of our research presented here: https://mtpsym.github.io/paper.pdf

We give a high-level overview of what we found here: https://mtpsym.github.io/

We also include a discussion of how to interpret our attacks.


I would avoid Telegram even without having read this. Their homegrown encryption and the secretive nature of the back-end creep me out. The irrational part of me is also wary of the developers' background. At the same time I also cannot bring myself to trust the other popular options either. WhatsApp should be a pretty obvious case of facebookiness, and Signal suffers from the same lack of back-end transparency and decentralization as Telegram. And, believe me I don't want to bring this up, but Moxie is just such an incredibly untrustworthy figure I wouldn't be able to sleep at night trusting his thing.

The only option with any real adoption (still not at the scale of the previously mentioned though) is Matrix. Having familiarized myself with Olm and their implementation of it for the purpose of building a clone for educational purposes, I feel safe trusting it. You can't go much more open and transparent than Matrix does.


Maybe this reddit comment will make you less worried /s

>The lead Telegram dev is a 3x International Math Olympiad gold medalist, won another gold in the informatiks olympiad, went on to earn two Ph.D.'s in algebraic geometry, all while working full-time as a programmer?

>Him rolling his own encryption algorithm is not the same as your copy-paste StackOverflow code monkey who scraped by with C's at his community college rearranging the alphabet letters in a caesar cipher.


>The lead Telegram dev is a 3x International Math Olympiad gold medalist

The lead dev

* doesn't have ANY qualifications as a cryptographer (he got his position through nothing other than nepotism) and thus

* thought AES-IGE was best practice

* used SHA-1 10 years after SHA256 was published

* didn't understand the importance of DH parameter pinning

* left in a 64-bit pre-computation MITM attack vector

* initially implemented crappy QR-code like fingerprint for secret chats without understanding the need for hex-decimals that could be compared over authenticated channels

* couldn't implement IND-CCA secure protocol

* didn't prevent these FOUR new vulnerabilities

But most importantly:

* doesn't have the know-how on how to implement E2EE for groups

* doesn't have the know-how on how to implement E2EE for 1:1 on Win/Linux desktop clients

* doesn't understand E2EE needs to be enabled by default

They are literally just winging it. Their Russian Pride would take too large a hit from publishing a CVE wrt the most recent issues, thus they downplayed the issues and wiggled out to maintain the prestigious image in front of the cult that is their users.


People here failed to realize Telegram made a clear choice between usability and security. Cloud messages make it very useful and their groups are awesome.

And you can always turn on E2E if you really want to use it. It works only on smartphones (probably they should add an option...) but it's there and works pretty fine.


And let's not forget. People are studying their encryption protocol and they're not finding anything special. It's not like you can actually decrypt messages.


There is nothing to break when the cloud chats are a backdoor by design. Imagine a vendor selling bulletproof glass. They're arguing the glass itself is extremely durable. These jabs at the Telegram protocol are like attempts to break the glass. Here's the problem: that glass has a three-meter hole in the middle of it. That hole is called cloud encryption. Telegram leaks

* 100% of messages by default to the server. This is the same as if Signal had a backdoor for all of its messages.

* 100% of group messages with no chance to opt out. This is the same as if Signal had a backdoor for all of its group messages.

* 100% of all messages for Windows/Linux desktop clients. This is the same as if the Signal desktop client for Windows/Linux had a backdoor that leaked all messages to Moxie.

Telegram's strategy is to exploit that three-meter hole. They published a $100,000 bounty for anyone who could break the glass. But the competition didn't award any points for pointing at the massive hole in the glass.

So no, nobody's finding anything special in Telegram's encryption, because EVERYONE got bored of pointing at the hole back in 2013 when Telegram was released.

Telegram's backdoor is the front door. It's so absurdly obvious it's insane anyone would ever have to point it out. But it's right there, if you just bother to look.


As I said, easily fixed by just using secret chats... The worse thing is that they don't work with group messages, etc. They gave people the choice though. You can use it however you feel like. Basically what you're doing is what Apple does for its users: patronizing. You just think what's great for everyone.

Also, they explain in their FAQs that they don't have the encryption key in the same datacenter, that they have those keys in different datacenters in different jurisdictions so law enforcement can't force them to decrypt the messages.


"They gave people the choice though"

There is no choice for opting in to E2EE group chats, except using another application.

"You just think what's great for everyone."

... Obviously?

"Also, they explain in their FAQs that they don't have the encryption key in the same datacenter"

Ok, explain to me on a technical level how the database encryption of an incoming packet is done (before the packet is committed to the encrypted database) when the key is not located in the RAM of the server. I.e. how does a CPU encrypt data without the key being present on the system?

Is it quantum teleported into the CPU registers from another country?
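
To spell the point out with a toy sketch (Python, using the third-party `cryptography` package; names and sizes are illustrative, this is not Telegram's actual code): any process that encrypts an incoming message before storing it has to hold the key in its own memory at that moment, wherever else copies of that key are also kept.

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    import os

    db_key = AESGCM.generate_key(bit_length=256)  # key material now lives in this process's RAM
    aead = AESGCM(db_key)
    nonce = os.urandom(12)
    ciphertext = aead.encrypt(nonce, b"incoming plaintext message", None)
    # However many other datacenters also hold a copy of db_key, compromising
    # this running process while it works is enough to expose the key.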


Reading just the article and going out on a limb, it sounds like they are exploiting the window of valid sequence numbers in a session to enable out of order delivery.

I'd speculate the basic problem in the protocol is that session sequence numbers (counters) can be incremented by sending invalid messages, and there is a window of n plus or minus x, where x is how much the counter/sequence number on a valid message can be off by and still be processed.

Flooding the session to increment the counter above the valid window would yield undefined behavior in a few ways. When a peer gets desynched, either the client or server may kill the session and require a new negotiation handshake with the peer (a DoS vulnerability) - or it attempts to recalculate the window of counter values it will accept based on its last known good message, which sounds like it effectively causes the valid messages in that chain to "fold" back on themselves in their order.
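
Purely to illustrate that speculation (Python, made-up numbers, not MTProto's actual logic): a receiver that accepts anything within a tolerance window around the highest sequence number seen will silently tolerate reordering inside the window, and reject (i.e. desync on) anything pushed outside it.

    WINDOW = 16  # hypothetical tolerance around the highest sequence number seen

    class Receiver:
        def __init__(self):
            self.highest_seq = 0
            self.seen = set()

        def accept(self, seq: int) -> bool:
            # Reject replays and anything too far outside the window.
            if seq in self.seen or abs(seq - self.highest_seq) > WINDOW:
                return False
            self.seen.add(seq)
            self.highest_seq = max(self.highest_seq, seq)
            return True

    r = Receiver()
    print(r.accept(1), r.accept(3), r.accept(2))  # True True True: reordering tolerated
    print(r.accept(1000))                         # False: counter pushed far past the window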

I'd suspect this was a design decision and understood by the protocol architects.

If you aren't using a CMAC, a ratchet function, or a KDF based on hashing which proves deterministically what the current sequence of messages is, and are instead using a sliding window, you are going to have similar or analogous sequence-number "window" problems in your protocol.

Some message sequence schemes use an obfuscated counter where, instead of say integers, you use some function over a field (hence GCM, as I loosely apprehend it). But even the seed for that function just becomes another secret to manage (imo), so protocols that depend on counters earn extra scrutiny.

The reason to use sequence numbers (counters) instead of ratcheting or hashing is that sometimes you're just substituting the manageability of a sequence-number window problem for another key management problem that has a different set of known vulnerabilities, and you make the call based on your threat model.

Perhaps the researchers' actual findings are more complex than this, and it isn't just a gotcha criticism of a design decision that had clear trade offs, which few are equipped to object to.


Who would possibly care that their instant messages might be made to arrive out of order? You could even claim it as some sort of feature. If you get confronted about something you said, just claim that you said it in a different order. Plausible deniability through forgeable ordering. Which strikes me as sort of funny, because deniability is something that often gets promoted as a critical feature but has no practical implications for anyone either.


I often wonder if it's just easier to crack at the phone level and pull the data from memory, where it's unencrypted on the device, rather than attack the quite good protocols that may have some problems. Are we all looking in the wrong place? Personally I'm sure that if state actors want to, they can read your messages one way or another.


Good job


One day, as the nature of hardware changes, these cryptos will be exposed.


Wow, zero protection against character transposition.

That’s like an intentional weakening of “secured-ness”.


Where do you read character transposition?


Swapping words is a form of character transposition, but on a larger scale.


It's not swapping words, it's swapping message ordering. To swap words, each word would need to be sent in a separate message.


I'm really curious how Telegram makes money. Signal had (has?) their crypto thing, Whatsapp snarfs up the user graph / interactions, but Telegram just supposedly shows ads in large groups.

I really don't know how that is enough to support half a billion users, just in terms of hardware cost.

Are they spying on some group of crypto whales to get insider info?


It took me like 5 seconds to get an answer for this with a search. Please don't posit wildly speculative scenarios in public forums.


?

Article 1:

>How does Telegram make money?

>Telegram doesn’t make money, or at least it doesn’t generate revenues, as of 2019. Durov pointed out on a blog post that he “believes in fast and secure messaging that is also 100% free.”

>On the same blog post, Telegram notes that if it were to run out of money, it might introduce “non-essential paid options” to supplement developers’ salaries.

Article 2:

>All the features that are currently free will stay free. We will add some new features for business teams or power users. Some of these features will require more resources and will be paid for by these premium users. Regular users will be able to keep enjoying Telegram – for free, forever.

>In addition to its messaging component, Telegram has a social networking dimension. Our massive public one-to-many channels can have millions of subscribers each and are more like Twitter feeds. In many markets the owners of such channels display ads to earn money, sometimes using third-party ad platforms. The ads they post look like regular messages, and are often intrusive. We will fix this by introducing our own Ad Platform for public one-to-many channels – one that is user-friendly, respects privacy and allows us to cover the costs of servers and traffic.

>If Telegram starts earning money, the community should also benefit. For example, If we monetize large public one-to-many channels via the Ad Platform, the owners of these channels will receive free traffic in proportion to their size. Or, if Telegram introduces premium stickers with additional expressive features, the artists who make stickers of this new type will also get a part of the profit. We want millions of Telegram-based creators and small businesses to thrive, enriching the experience of all our users.

so... they don't?


>Telegram doesn’t make money, or at least it doesn’t generate revenues, as of 2019. Durov pointed out on a blog post that he “believes in fast and secure messaging that is also 100% free.”

Durov may well just be paying lip service here. Telegram reserves the right to change its ToS. Facebook didn't spy on 100% of its users' actions when it started; it was a picture-rating service. The business model came after. Durov still has money left from after he was forced out of VKontakte, so he's in no rush to cash in. There's no telling when Telegram will be sold: it's not a non-profit and can be sold at any moment.

I can understand philanthropy, but given how Durov has paid zero dollars over the past 8 years to cryptographers to design a protocol that ACTUALLY locks him out of terabytes of insanely valuable user data, I find it hard to believe his motives are pure.

With Moxie and Signal I can trivially say it's not a CIA front, despite any money from Radio Free Asia, because I can check that the end-to-end encryption works and protects every message. Given Durov's background in information warfare and disinformation, his connections to Russia etc., and him having access to almost 100% of users' content and metadata, I can't say with confidence that Telegram isn't an SVR (Russian NSA) front.


>I can check the end-to-end encryption works and protects every message.

You would be a fool and a poor spook if you created Signal as a honeypot and made it insecure from the get-go to anyone looking. It would be made insecure either later on or via an extremely hard-to-find "bug". We know from Snowden that the NSA secretly weakens encryption products to make them vulnerable. All your case proves(?) is that Signal is a better-made honeypot than Telegram, if either of them is actually a trap.


"You would be a fool and a poor spook if you created Signal as a honeypot and made it insecure from the get go to anyone looking. It would be made insecure either later on or by an extremely hard to find "bug". We know from Snowden that the NSA secretly weakens encryption products to make them vulnerable. All your case proves(?) is that Signal is a better made honeypot than Telegram if any of them are actually a trap. "

You can't be serious. Anyone can find such a vulnerability in the codebase. Signal has internal code review processes, and the git log is public. There is no better process out there, and if there is, you'll have to explain it. There is no perfect process, but if you're trying to argue that this justifies Telegram doing nothing w.r.t. e.g. group chat end-to-end encryption, you're out of your mind.

Here's the thing: there's nothing that indicates Signal has a backdoor. OTOH, Telegram doesn't have to be converted into one; it ALREADY works that way. All that the NSAs of the world have to do is hack the server. And since you're so intimately aware of Snowden's output, might I remind you of his SXSW talk from 2014 where he said the NSA hacks systems all the time?


....then show us your "like 5 seconds" of results? (* like: apart from grammatical comparisons and "to enjoy", does it mean anything anymore, er... innit?)


Here's a source from TC. TL;DR: non-targeted ads in channels, which are like RSS feeds or blogs, and no ads anywhere else. They're also considering "premium" animated stickers:

> The service, which topped 400 million active users in April this year, will introduce its own ad platform for public one-to-many channels — “one that is user-friendly, respects privacy and allows us to cover the costs of server and traffic,” he wrote on his Telegram channel.

> “If we monetize large public one-to-many channels via the Ad Platform, the owners of these channels will receive free traffic in proportion to their size,” he wrote. Another way Telegram could monetize its service is through premium stickers with “additional expressive features,” he wrote. “The artists who make stickers of this new type will also get a part of the profit. We want millions of Telegram-based creators and small businesses to thrive, enriching the experience of all our users.”


Telegram is still the fastest, easiest to use, secure messaging platform today, and it's only getting better.


Telegram is definitely the most convenient one with mass adoption, especially among the furry community

But let's not mistake ease of use for security


Dunno about all that, but I still haven't seen someone decrypt a message.


It's not even possible to use it without a phone number.


You can't use Signal or Whatsapp without a phone number either.

But what's different is that Telegram doesn't share your phone number with everyone in the same group chat.


And the phone number is shared with the provider, and if I lose my phone number it can be a serious pain to recover things quickly. Signal, Telegram and WhatsApp are abominations in this respect. Matrix doesn't tie accounts to cellular numbers (God, we're still in a world where cellular numbers are a requirement to sign up for most things). The Matrix network is as transparent as it can be, and Element is a decent client that runs on iOS, Android and in a web browser.

Element is such a good alternative that the app store appears to be subtly demoting it in the search results, and never seems to show it as a related messaging client. Strange world we live in.

Edit: furthermore, Matrix tech, both the server and the clients, is entirely open source, and you can spin up your own network.


Whataboutism.

Also, 1) you generally don't need end-to-end encryption with people you're afraid to share your phone number with, and 2) Signal is already working on usernames.


What I don't want is everyone in the school parents' group knowing my personal phone number, thank you very much.

I don't care if the chat is encrypted or not, if some malicious actor intercepts our plans for the bake-sale I think we can live with the consequences.

Also like 95% of my Telegram group chats moved over from IRC, which also had zero encryption. What we wanted was feature parity (moderation and bots).


> What I don't want is everyone in the school parents group to know my personal phone number thank you very much.

The usernames are in the works.

>I don't care if the chat is encrypted or not, if some malicious actor intercepts our plans for the bake-sale I think we can live with the consequences.

Perhaps double check which historical figure said "you have nothing to fear if you have nothing to hide".

Sure, you have nothing to hide, but don't expect others not to have anything to hide either. And realize that when you're only reachable over Telegram, that forces other people to reach you over Telegram.

>What we wanted was feature parity (moderation and bots).

That's totally understandable; apps like Signal do support both, however. The way I see it, Telegram's TLS-equivalent group encryption is like upgrading your 1988 IRC to 1995-era SSL; that's how old client-server encryption is. E2EE for 1:1 IMs was introduced by OTR in 2004, and around 2013 we got E2EE group chats with Signal.


>> What we wanted was feature parity (moderation and bots).

> That's totally understandable, apps like Signal do support both however.

Does Signal have a bot API, and if so, where? Like an API specifically for creating utility bots for Signal, not a client API that can be used/exploited to create bot-like users?


But that's just the thing, isn't it. It's NOT secure; that's the problem.

* It leaks 100% of messages to server by default.

* It leaks 100% of Win/Linux desktop chats with no chance to opt out.

* It leaks 100% of group message content with no chance to opt out.

* It leaks 100% of metadata with no chance to opt out.

* It always leaks the intention to hide messages: Telegram knows with whom you have secret chats, it knows when, it knows the message sizes, etc. This metadata is extremely valuable.

Telegram is not in the process of fixing any of these, so its security doesn't even reach the bare minimum, and every feature it implements is another way for the company to collect data about you.

It's just another Facebook, masquerading as a "private app for the people". It's run by the man who is literally called "the Mark Zuckerberg of Russia". That man has a military education in information warfare / disinformation: he knows how to create a literal cult around his product.

There's almost nothing we know about the company: who it employs, how it makes money, what it does with its data. Journalists who tried to get an interview with Durov travelled to Dubai, only to find empty offices. The office workers next door said they had never seen anyone enter the Telegram offices; they suspected Telegram was using the office for tax evasion, which is not a good sign. Telegram's been very enthusiastic about a story where they're evading foreign intelligence. If that's the threat model, why is their security strategy "store all messages on one server" (as if the NSA et al. didn't have zero-day exploits)?

The claims about how the data stored on the servers is protected are outright lies that a first-year computer science student who's taken Computer Organization 101 can disprove[0].

Telegram is a dumpster fire and I'd LOVE to be able to say its security is getting better, but it's NOT. They just implemented group video calls; are those end-to-end encrypted? No, they f'n aren't. Not even NEW features in Telegram are secure by default. You know, the features no one asked for, that they were in no rush to deploy. It's especially condemnable as the major competition like Zoom, Signal and Jitsi all have end-to-end encrypted group video chats. Telegram staff don't care about me, you or any other user one bit. The only thing they seem to be interested in is "move fast, break things", or rather "move fast, f** security".

No matter how you put it, Schneier is right when he says data is a toxic asset[1]. Telegram can't guarantee that data sitting on their servers is protected forever, so they shouldn't be collecting it in the first place. WhatsApp is in the process of deploying client-side encrypted cloud backups, which completely shreds Durov's BS claim that Telegram has to have access to your data. There can only be two reasons they collect it: either they don't care about your security, or they are actually interested in your data.

[0] https://security.stackexchange.com/questions/238562/how-does...

[1] https://www.schneier.com/blog/archives/2016/03/data_is_a_tox...


Lol



