Telegram has released user data to German Feds in multiple cases (twitter.com/disclosetv)
217 points by CHEF-KOCH on June 4, 2022 | 180 comments



Did anyone expect otherwise? Telegram is a mostly unencrypted chat application; of course it's going to cooperate if local law enforcement comes knocking on their door with a warrant. If you don't want your chats to end up in the hands of law enforcement, then you should consider using an end-to-end-encrypted messenger application.

Signal will hand over your data too if the police show up, but they don't have any data to hand over.


> Did anyone expect otherwise?

In a sense, yes. In case you don't speak German, here is the relevant passage from the article [0], translated:

> That Telegram is providing information about users to authorities at all marks at least a cautious U-turn in the course of the company, which was founded in 2013. For a long time, German investigators got no answers whatsoever when they wanted to know who was behind Telegram accounts spreading criminal content online. The operators still declare on their site: "To this day, we have disclosed 0 bytes of user data to third parties, including governments."

In a nutshell, this is considered a turning point because Telegram's official stance is (even today) that they don't share data, not even with any government. And now they did, apparently. So yes, this is definitely news, if also a bit surprising.

The German government has even hinted that it will seek to ban Telegram from app stores if it continues to refuse to comply with law enforcement [1].

[0] https://www.spiegel.de/netzwelt/apps/telegram-gibt-nutzerdat...

[1] https://www.spiegel.de/netzwelt/netzpolitik/faeser-will-tele...


> Telegram's official stance[...]

... doesn't really matter.

This is something I find surprisingly difficult to get through to people at both a personal and professional level (as I often work in security-related projects).

People speak of whether they trust this or that entity. Reality is that it's an irrelevant discussion. Trusting an organization is always misplaced, no matter who they are. If you hand over data to an organization, you must immediately assume that data is compromised. Yes, that's all SaaS and all cloud providers, for example. Any party who could see the data in unencrypted form must be assumed to have it and to be able to abuse it at some point.

This has nothing to do with the ethics of whoever is running the given organization at the moment. There are many points in time in various organizations where they truly protect the data, and I'm thankful for that. But it's not a stable situation. Things can change. Less ethical management can come in during a reorg and suddenly everything is fair game, and the consumer won't know until it is too late. And there is always the threat of governments demanding the data, at which point no matter how ethical the organization may be, nothing matters, since there is nothing they can do.

For any data that could at any level be potentially sensitive, you must make sure it is either not sent through third parties at all, or be encrypted with keys that only you control. Otherwise, assume it is compromised and sold to every bidder.


This is why Apple is also not the solution for security or privacy.


>> In a nutshell, this is considered a turning point because Telegram's official stance is (even today) that they don't share data, not even with any government.

If you care about your privacy, the "official stance" alone is close to worthless. That goes for governments and corporations alike. I think we have already learned that. Just as with security, you need defense in depth for privacy (i.e. hardware, software, and "official policy" as well).


Their stance has always been that they don't protect terrorism suspects; they even had a channel where they reported how many ISIS groups/channels got shut down.


Unfortunately, Telegram doesn't cooperate with German authorities in the majority of cases, because it operates outside their jurisdiction.

Telegram probably figured if they don't at least share information on child abuse and terrorism, they'll just motivate regulatory action.


> Unfortunately

Why is that unfortunate? If I patronize a company, I want to be damn sure they are only giving up info to LEO when it's absolutely necessary.


If I'm a decent human being, I'd rather no company helped criminals do crime stuff.

The standards to turn over such information should not be decided by a private entity but by democratically enacted and enforced laws. Is enforcement of child porn prosecution "absolutely necessary"? Are death threats ok? Where's the line? I don't think a private company sitting in an unaccountable jurisdiction should make that decision. I trust German courts a lot more in that regard.


I'm inclined to agree, partly. If companies have the information, they should not have the last word about whether law enforcement gets access. That said, I do consider properly secure communication tools desirable and am very concerned about ongoing attempts to ban them.

The uncomfortable truth is that if a communication method isn't secure for child molesters and terrorists, it isn't really secure for anyone.


In the case of Telegram, they do require you to use a mobile phone number to sign up and use their service, mainly so they can be sure you don't abuse THEM. So the safety of criminals from law enforcement does come down to Telegram refusing to give up information it has.

And this concerns an area of crime where the perpetrators don't very actively evade detection; there are other means to do so. And in the case of Germany, openly and publicly criticizing the state is no problem as long as you don't propose to violently overthrow it.


> If I'm a decent Human being I'd rather no company helps criminals do crime stuff.

Does that include gas stations, supermarkets, utility companies, sporting goods stores, book stores, clothing stores? Each sells things that are used to aid in crimes. That doesn't mean the government should spy on everybody who visits them.


> Each sells things that are used to aid in crimes. That doesn’t mean the government should spy on everybody who visits them

Of course they do, and they have been doing it for the longest time.

That's why they are forced to keep a record of what they sell and to hand over that information upon request.

It's the authorities asking, not some random dude passing by.


But different jurisdictions have different rules? For instance, are you OK with a country that outlaws homosexual acts firing hundreds of subpoenas at a company to be able to target its gay users? CSAM is also a wide spectrum offense, with many jurisdictions now banning cartoon images, CG images and pictures of dolls.

I'm not for total chaos. It's just that the world is a very complicated place.


It's less complicated if you don't do business in countries whose jurisdictions you don't trust. The European Union largely outlaws any such abuse by authorities.

Depending on tech companies to protect your data from authorities is a shitty strategy. At best it works the other way around. If not, you're screwed.


The vast majority of users of Telegram come from (and reside in) a country whose jurisdiction you probably wouldn't trust. Most of its developers, too (though they have relocated).


Translation: If you trust Telegram, you're screwed anyway.


Lol


The problem is when the definition of "crime" changes.

Is crime in Saudi Arabia anything that MBS doesn't like?

In the 30s and 40s in Germany, just being Jewish was a crime.

Whether you need modern or historical references, the problem is the same.


Don't do business in those countries. Simple solution. Trusting a private company to keep you safe from your jurisdiction is a stupid idea.


So, don't do business in Germany? Because they sure changed the rules, starting in the mid-30s.

Don't do business in France? Russia? Saudi Arabia? China?

Are there any countries which have never changed the rules?


"Never changed"? What a ridiculous and stupid standard. Germany's constitution is one of the best at guarding individual rights, and as far as nations go, it has one of the best chances for staying that way. German courts are a much better instance for judging what is a reason to turn over personal information, than any private company ever could be, especially Telegram.

No, don't do business in Russia, Saudi Arabia or China, unless you plan on turning in your clients. Greed makes companies overlook these problems. They tend to eventually regret it, because they can't even make a decent profit when, for example, Russia does what it just did.


What's your view on the war in Ukraine? What if I told you that calling it a war is a crime in Russia?

Congrats, you're a criminal.


> If you don't want your chats to end up in the hands of law enforcement

don't use chats

meet in person

like all respectable criminals do

Anyway, it would still mean that you're being investigated for something and that a court authorized the request.

Now imagine what would happen if US law enforcement got their hands on the chats of the Texas shooter.

Truth is they didn't do anything even though he posted publicly about what he was going to do.

https://www.cbsnews.com/live-updates/texas-school-shooting-u...

So maybe the idea that everyone is spied on and controlled by law enforcement or governments is not exactly true.


Don’t worry, it’s worse.

They collect everything, illegally and without any ethics.

Then they’re incompetent as well, so action doesn’t happen when it should.

Instead we get capricious, haphazard enforcement, and a dark future with all of this stored in permanent record to be retrieved using the search tools of 2040 and weaponized later when we later make nuisances of ourselves in response to future legislation.


The app does have E2E encryption. It's just not the default. My one wish is they would change this.


I know, and I agree. MTProtov1 criticisms aside, the E2EE system Telegram uses is perfectly safe.

It's just disabled by default, unavailable in group chats or channels, and enabling it reduces usability (i.e. you can't use multiple devices to chat if you enable E2EE).

Telegram as a chat app has the best UX of any chat app out there in my opinion, so the lack of proper E2EE is simply disappointing. I don't really trust either, but I consider WhatsApp more secure than Telegram, despite Meta mining my metadata.


> but I consider WhatsApp more secure than Telegram, despite Meta mining my metadata

Ha, I still prefer Telegram. I just use secret chats by default and enjoy the awesome UX


Yes, Secret Chat's awesome UX of requiring both recipients to be online and the inability to use it on desktop.


Try Unigram to start "secret chats", though they will not sync with other clients. Unigram works on Windows desktop.


> the E2EE system Telegram uses is perfectly safe

Is it? How would you go about confirming that?


> WhatsApp more secure than Telegram

Why is that? With WhatsApp client being closed source, we simply don't know if it really is doing E2EE at all.



This makes no sense. There is nothing to prove that WhatsApp is really using the aforementioned Signal code. It's closed source, so it could be anything inside.


The answer to your objections is literally in the comment I linked, did you read it? One keyword is RE.


[flagged]


Your objection to a perfectly cromulent answer is just namecalling. People take apart and find vulnerabilities, including cryptographic ones in closed source software all the time.


It's a closed source app. Translation: you and I don't know what it does. Take it apart all you want, but you're not going to find any backdoors or learn about how good the E2EE implementation is. Claiming otherwise is ridiculous.


> It's a closed source app. Translation: you and I don't know what it does.

This just isn't true. It's a claim trivially disproved in public by, say, gazillions of detailed P0 posts. Again, all you have is namecalling and confidently stated things with obvious counter-examples.

> Claiming otherwise is ridiculous.

Yours is an extraordinary claim that requires, never mind extraordinary, any evidence.


Yes we do, as there are third-party clients (e.g. yowsup) that also need to implement it.

The problem is that you don't know if the app is leaking your keys somewhere else.


This is my point exactly. It doesn't matter as long as the app is closed source. You could do the world's most secure E2EE implementation, but then send a copy of all the keys to Facebook's servers.


>we simply don't know if it really is doing E2EE at all

Why don't we know? Isn't it trivial to set up a MITM test rig, sniff the traffic, and analyze it?


It's trivially easy to encrypt the traffic and then send a copy of the private key to Facebook's servers. You would not be able to decrypt it, but they would.
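To make that concern concrete, here is a minimal, purely hypothetical Python sketch (using the widely available `cryptography` package; none of the names refer to any real client) of how an app could do genuine end-to-end encryption on the wire while also escrowing the session key for the operator. A packet capture would show nothing but ciphertext either way, which is why traffic sniffing alone can't settle the question:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

    def send_message(plaintext: bytes, peer_session_key: bytes, operator_key: bytes) -> dict:
        """Encrypt for the peer, but also wrap the session key for the operator (hypothetical)."""
        nonce = os.urandom(12)
        ciphertext = AESGCM(peer_session_key).encrypt(nonce, plaintext, None)

        # The escrow step: the same session key, encrypted so that only the operator
        # (holder of operator_key) can recover it. On the wire this is just more
        # opaque bytes, indistinguishable from any other protocol payload.
        escrow_nonce = os.urandom(12)
        escrowed_key = AESGCM(operator_key).encrypt(escrow_nonce, peer_session_key, None)

        return {"nonce": nonce, "ciphertext": ciphertext, "escrow": (escrow_nonce, escrowed_key)}

    session_key = AESGCM.generate_key(bit_length=256)
    operator_key = AESGCM.generate_key(bit_length=256)
    packet = send_message(b"hello", session_key, operator_key)
    print(len(packet["ciphertext"]), "bytes of ciphertext, plus an opaque escrow blob")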


If your message in WhatsApp gets reported, approximately 1000 Facebook/Meta employees will be able to read the last 5 messages you sent in WhatsApp.

https://oneandroid.net/whatsapp-will-read-your-last-5-messag...


No amount of E2EE protects you from the recipient (the reporter in your example); it's the same as if I forward a message I received to the world.


What? It's not E2EE by default? This is honestly disappointing. So most of their privacy-focused stance is just a marketing ploy? A lot of my peers now use Telegram to move away from the Facebook ecosystem, and now you're telling me they're just as bad, if not worse?


Everyone will likely hand over whatever they can... which means the most decentralized options which leave as little to hand over as possible are best.


Most encrypted.

Centralised or decentralised means little compared to a lack of encryption.

You can't give up that which you don't have access to.


Both are important. When governments mandate back doors such as in EU chat control or US Earn it act, centralized services can be targeted much more easily than the thousands of xmpp and matrix servers running around the world.


Isn't that federated rather than decentralized?


What distinction do you mean? Federation allows for interoperable decentralization. Without federation, we would have thousands of chat/mail/social media servers that can't talk to each other. Some may choose not to federate, but most want to federate to create a useful protocol.


I always believed decentralized was if each user doubled as a server and required no external setup (example: Scuttlebutt), whereas federated was a plurality of servers with users communicating with each other but no central authority (email, Mastodon, Matrix, etc). However, reading some peer-to-peer literature like that 1500-page behemoth of a book, "Handbook of Peer-to-Peer Networking", it seems they are used relatively interchangeably.


the terminology we use in Matrix is:

“federated” = servers can talk to each other; eg email, xmpp, sip, activitypub

“decentralised” = data is replicated between servers; eg matrix rooms are replicated equally between the participating servers; usenet

“distributed” = data is replicated between p2p nodes; eg git, p2p matrix, bittorrent.


Either way it's a lot better than walled-garden style messengers like Telegram or Signal.


The app famous for not cooperating will cooperate?

https://hongkongfp.com/2020/07/05/exclusive-telegram-to-temp...


> Telegram to temporarily refuse data requests from Hong Kong courts amid security law

The headline you linked makes it clear it was temporary. Which means they do of course cooperate in normal situations. Otherwise they would be blocked everywhere; you cannot maintain a service such as a chat application without cooperating with governments.

There is no story here, Telegram shared information with BKA in cases of terrorism and child abuse, as every service operating in Germany would and should do.


I agree that they should, the story is that they can. Err, not a story, it’s a known fact that they don’t encrypt communication.


It’s easy to refuse requests from Chinese law enforcement if you don’t have offices, employees, or assets in China. Meanwhile, European countries recognize the legitimacy of each other’s court systems and will enforce judgments and orders across borders.


It really doesn't matter; in general, you can never be sure that any app that stores any data won't ever meet requests from officials or from people with guns.

The case of Telegram was especially simple: since the SEC filed a complaint, we could clearly see who the main investors are. It's not necessarily about the country or the investors either.

You can only choose the place where things are stored and expect the company to act according to local laws (e.g. Protonmail doing its Proton things under Swiss jurisdiction).

And, I guess, a thing we have to teach people is something vague and unclear like a post-privacy mindset: how one has to operate knowing that pigeon mail can always be spoofed, no matter how encrypted the conversation is.


These days, any popular messaging app that won't cooperate with local law (by choice, or by design in the case of E2E) would just be banned/removed from stores in Germany or most other countries. Strong encryption for a wide audience cannot exist today.


Signal works fine in Germany, doesn't it?


They also have the user's phone number and in most cases their phone's address book.

It's not really about breaking E2E encryption but rather "reveal the identities of the following users, please". That could be due to all sorts of illegal activities (e.g. hate speech, sale of banned substances, terrorism, child abuse, etc.)


Signal cannot in fact provide your address book to German (or any other) authorities. The whole point of Signal's design, and the reason it's less featureful than things like Telegram, is that it's designed not to collect serverside metadata about who's talking to who.


The client has access to the address book and it is hard to verify what the client does in reality. I receive updates of the client every other day and who knows what it brings with it.


I don't think it is that hard to verify what the client does with your data. It is right there in the source.

> Since version 3.15.0 Signal for Android has supported reproducible builds.

https://github.com/signalapp/Signal-Android/blob/main/reprod...

I haven't tried to compare a local Android build to the published version myself, so can't directly confirm the accuracy of this document.

Either way, I agree that a released build can slip by unnoticed by most users. This is not a problem unique to Signal though.

At least with Signal you have the option to verify a build before updating. You can also build and run the entirely open source client yourself, which makes verification redundant.
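As a rough illustration of the idea (not Signal's actual procedure, which is described in the linked document and compares APK contents while ignoring signing metadata), verifying a reproducible build boils down to building the client yourself and checking whether your artifact matches the one you were shipped:

    import hashlib, sys

    def sha256(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Usage: python compare.py my-local-build.apk store-download.apk
    local_build, store_build = sys.argv[1], sys.argv[2]
    print("local :", sha256(local_build))
    print("store :", sha256(store_build))
    print("match :", sha256(local_build) == sha256(store_build))

A naive whole-file hash like this would actually fail on a signed store APK, which is exactly why the real procedure compares contents instead of raw files; the sketch only shows the shape of the check.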


Thanks for correcting!


What you write is, I think, the main point in this case: there is no hint in the article that Telegram is providing access to the communications themselves, but rather data about the account holders which might let the authorities determine their identity. The communications themselves are already in the possession of the authorities, either because they were direct recipients as members of the groups the communications were sent to, or because a recipient provided them to the officials.


Not sure where you got this from, but Element (and other Matrix-based clients) and Signal are obviously a thing. Matrix is even decentralized.


> Telegram is a mostly unencrypted chat application

That's just plain incorrect.

> if local law enforcement comes knocking on their door with a warrant

How is German law enforcement relevant to an app HQ'd in Dubai? They've been openly criticised before for not cooperating with law enforcement.


Mate I love telegram but that's not plain incorrect.

Telegram has transport layer encryption, like literally everything else in 2022. For all intents and purposes, Telegram can read and access the majority of your conversations on it.

This isn't a super big deal because telegram is aiming to be a social media platform, rather than an encrypted comms platform, and e2ee on groups over a certain size is pretty useless.

I think telegram can still improve by making private messages e2ee by default.


> transport layer encryption, like literally everything else in 2022… for all intents and purposes telegram can read and access

Not literally everything. By and large everything, sure. Effectively everything, even. But not literally everything.

For instance, take a deep dive into how e2e encryption works in the Apple ecosystem…

https://support.apple.com/en-us/HT202303

… and why allowing iCloud Backups for usability becomes the weakest link for Apple Messages:

https://support.apple.com/en-au/guide/security/sec3cac31735/...

To save a click:

Messages in iCloud, which keeps a user’s entire message history updated and available on all devices, also uses CloudKit end-to-end encryption with a CloudKit service key protected by iCloud Keychain syncing. If the user has enabled iCloud Backup, the CloudKit service key used for the Messages in iCloud container is also backed up to iCloud to allow the user to recover their messages, even if they have lost access to iCloud Keychain and their trusted devices. This iCloud service key is rolled whenever the user turns off iCloud Backup.

If you are in this ecosystem, and feel your potential loss from disclosure is greater than your potential loss by losing/damaging your device, go turn off iCloud Backup — and make sure your keychain is secured to your needs.


HQ location can be quite irrelevant. Legal intercept laws can be quite old-fashioned and might make the case that two German citizens having a conversation while on German soil makes the conversation fall under German jurisdiction. There can be a surprisingly large number of ways the jurisdiction can be determined, for all parties involved, and, without analysis of German law, I would not readily make assumptions as to whether they have a legal basis to talk to Signal or not. And if they do, I'm sure Signal is a law-abiding company.


> That's just plain incorrect.

By default nothing is E2E encrypted.

So yes, we can say it is mostly unencrypted.


No we can't. Otherwise a simple tcpdump would suffice. Doesn't work though with transport encryption.


We are talking about E2EE here. Almost everything is covered by TLS these days, so it is not the relevant argument or discussion point anymore.


It is.


> That's just plain incorrect

All chats are unencrypted by default.


That's what the other user said and it is still incorrect. [0] People either don't read the basic FAQ or conflate E2EE with being the only encryption in the world, which is ridiculous.

[0]: https://telegram.org/faq#q-so-how-do-you-encrypt-data


Encryption in transit is assumed, and rightfully so. That still means that Telegram gets full access to the plaintext and as such is able to give that information to anyone, and do with it as they wish.

I suppose there are some people out there that think "unencrypted" here means everyone can listen in, but certainly not the Hacker News crowd.


> Encryption in transit is assumed, and rightfully so.

Heh, we've come far. True unencrypted chat was once popular, and technically still exists (although most IRC networks now default people to TLS.)


Honest question. Can you clarify this?

I read the FAQ and even skimmed the MTProto 2.0 docs, but from where I stand this server-client encryption sounds like encryption in transit where the server still has the ability to decrypt.

This, from a privacy against law enforcement perspective (which is what the article and comments are about), is more or less the same as no encryption.

Edit: s/transport/transit/, add "perspective" to the last paragraph.


It’s true that Telegram only uses encryption for data in transit for normal person-to-person chats and group chats. Data at rest is stored in a way the server can read. That’s one of the things that makes Telegram search so fast.

The encryption part [1] is covered in the FAQ, along with more details.

Also see the question and answer on "Do you process data requests?" [2]

Telegram has a feature called secret chats, which are only person-to-person. That uses end-to-end encryption.

[1]: https://telegram.org/faq#q-so-how-do-you-encrypt-data

[2]: https://telegram.org/faq#q-do-you-process-data-requests


I'm aware of Secret Chats, but there's extra friction to enable it and I suspect most Telegram users are not aware of them at all - or are unwilling to use them for almost everything.

Also they should now update that FAQ answer where they say:

> To this day, we have disclosed 0 bytes of user data to third parties, including governments.

In fact, if the OP is indeed true, they should probably update the entire answer, since it's misleading at best and an outright lie at worst.


You don't understand what it means. Server-side encryption does not matter from the user's perspective. Telegram has all the keys and can access all the data, so there is no real privacy.

For E2EE, you need to open a separate one-on-one chat, which is optional, not the default.

And when it comes to group chats or channels, none of them support E2EE.


Server-side encryption = encryption. The fact that you don't find it sufficient, like other opinions, is irrelevant when it comes to people just plain wrongly stating things, such as "unencrypted" for clearly encrypted data.

It's like going outside in the rain, getting wet and saying "Well, it's not actually raining, I didn't get a pint of water in my boots."


> Server-side encryption = encryption. The fact that you don't find it sufficient and other opinions are irrelevant when it comes to people just plain wrongly stating things, such as "unencrypted" for clearly encrypted data.

We have clearly been talking about E2EE (end-to-end encryption), and server-side encryption is not that. E2EE means the data is encrypted between you and the message recipient; the server is a middleman which should not have access.

Almost everything in today's world is already encrypted with TLS during transmission, and regulations require server-side encryption. That is not even our main interest to talk about anymore; we are past that.

The main issue in the original post is the lack of E2EE.


Look up Grice's Maxims sometime. Conversations have context. The context here is a comment section for an article about a nation state requesting chats from Telegram. The only relevant kind of encryption that would be able to prevent this is end-to-end encryption; in such a context, 'Telegram is unencrypted' is easily and near-universally understood to refer to E2E encryption, even if absent such context the meaning would be less clear.

A better rain analogy would be someone saying 'I'd like to go for a smoke, is it raining', and you reply 'yes' because there is somewhere in the world where it is raining (just not there). You would be technically correct, but in the context of the question, the person was clearly interested in whether it was raining _there_.


Encryption doesn't matter if Telegram has the keys.

If you put the key next to a locked door it doesn't matter if you lock the door.

Real encryption means that even Telegram couldn't decrypt it.


But that's not "real" encryption. You're just abusing language — as most are in this thread — to get a result you want.

If you want to discuss E2EE, do so but it does not make it more "real" than other encryption.

Unencrypted is false. Not E2EE is true. Most use the former to wage war against an app they don't like because they prefer an app like Signal that satisfies their desirable qualities. Moxie actually started this trend and it is despicable. I'd say the exact same thing if Durov started referring to E2EE as "pedo-encryption" or anything else that distorts meaning.

Don't distort meaning. Use precise language.


Useless encryption is the same as no encryption. If you put the key next to the lock, it's not locked.

It's an abuse of language to call that encryption, because saying "encryption" implies security. But this is not secure, and if it's not secure the encryption is useless, because security is the reason for encryption. Encryption is not used for its own sake but to protect the content of a message from unwanted access.


> Encryption is not used for the sake of encryption but to protect the content of a message from unwanted access.

Yes, that is what Telegram is doing. It may not be protecting the contents from who you want it protected from (everyone but you and the message recipient), but it does protect the contents from other (notice I did not say all) adversaries that Telegram and its users don't want accessing it.

It is still encrypted, so please use correct language and do not weaponize words to your own designs.


The context was about end-to-end encryption, so the language was perfectly correct. It is one type of encryption.

It is more likely that you are trying to weaponize the words for your own designs.


The context doesn't change the definition of encryption.

> It is more likely that you are trying to weaponize the words for your own designs.

Please point to where I have weaponized a word because on its face that accusation doesn't make any sense. I have not decided encryption means unencrypted. I have doggedly insisted words be used appropriately and even went so far as to give an example of mischaracterization of E2EE where I would call someone out.


If we go by definitions, it is not encrypted. Ideally, encryption means the process of encoding information such that only authorized parties can understand it.

During the transport of the information to the target recipient, the data in this case is in plaintext at some point on Telegram's server, and therefore it is not encrypted for the whole duration, going against the idea of transferring or holding the information in ciphertext so that only authorized parties can read it.

If we consider Telegram to be the intended recipient, then it would be encrypted, as the data is transferred or held in ciphertext for the whole process. However, Telegram is not the recipient, and the encryption is removed in the middle of the process.

> Please point to where I have weaponized a word because on its face that accusation doesn't make any sense. I have not decided encryption means unencrypted. I have doggedly insisted words be used appropriately and even went so far as to give an example of mischaracterization of E2EE where I would call someone out.

You brought it up in the first place with a twisted definition.


From Wikipedia which you quoted bits from: "In cryptography, encryption is the process of encoding information. This process converts the original representation of the information, known as plaintext, into an alternative form known as ciphertext. Ideally, only authorized parties can decipher a ciphertext back to plaintext and access the original information."

> You brought it up in the first place with a twisted definition.

I did no such thing. You appear to be confusing idealism with the definition of encryption.

In any case we already have words for transport encryption, encryption at rest, and end-to-end encryption when referring to modes of encrypted data. Those are sufficient to cover the spectrum of encryption which exists. Calling one mode of encryption "unencrypted" just because it is not your ideal mode is disingenuous at best.


> conflate E2EE to being the only encryption in the world

It is the only relevant one. Nobody who cares about protected messages would be satisfied with untrustworthy encryption.

Sure, technically even a messenger using Caesar cipher is encrypted, but most people expect more than a ticked checkbox. No real user cares about what technically still counts as encryption, just like nobody outside of biology cares whether walnuts are actually nuts.
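To make the walnut point concrete: the tiny Python sketch below is "encryption" in the most technical sense, yet an attacker recovers the message by trying all 26 shifts. Ticking the encryption checkbox says nothing about whether the content is actually protected from the party you care about:

    def caesar(text: str, shift: int) -> str:
        # Shift lowercase letters by `shift`, leave everything else untouched.
        return "".join(
            chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.islower() else c
            for c in text
        )

    secret = caesar("meet me at noon", 3)   # -> 'phhw ph dw qrrq'
    for s in range(26):                     # brute force: one of these lines is the plaintext
        print(s, caesar(secret, -s))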


Hop-by-hop encryption is practically useless in a secure messaging setting, and people shouldn't take the "TLS counts as encryption" argument seriously. But it's good Telegram advocates keep making it, because it's an easy way to sum up their security posture.


If you are claiming that encryption in transit is what people think means "encrypted chat" then you are misguided.


Not GP. But I think your comment would be more meaningful if you elaborated on "people" (like which people you're referring to). Telegram markets itself as a secure messenger and its CEO has written many times about WhatsApp being worse for security and privacy. I don't think a non-tech person can differentiate well between these.


I agree with everything you said.

Telegram is outright lying of course. I can't remember if WhatsApp still uses E2E encryption by default or not. If not, they are equivalent, but Telegram isn't better in any meaningful way.


Just clarify what you mean. E2E (sometimes with double E) is the correct term.


Nah, when someone calls it an unencrypted messenger, one can assume they mean it's unencrypted on the server, as in-transit encryption is ubiquitous and thus a meaningless signifier.


No, this can't be assumed.


Yes it can. If anyone reads "encrypted messenger" they're assuming only they and the intended recipients can decrypt it.

Rather, this is more of a debate of what the layman expects, and frustration with misleading marketing. A great example of this is the whole Zoom debacle; they claimed it was encrypted, people assumed it was E2EE, and got a lot of blowback for that to the point that they ended up implementing E2EE.

Another great example: a few of my friends were using Telegram for a while, and thought it was E2EE until I pointed out that only their "Secret Chat" feature is E2EE.


This is not a layman forum though. I don't assume that, so this isn't a general statement. Precise wording matters.


Even if that were the case, I'd still agree with OP's wording that it's a mostly unencrypted chat. It's encrypted in transit for the milliseconds it takes to reach the server. Once on the server, a third party has access to the plaintext until the end of time. It's a minimally encrypted chat.

And if the wording wasn't precise enough, context still matters more in this case. I'm sure everyone here knew what was meant, despite the familiarity with cryptography. Telegram claims your messages are "heavily encrypted" which is just false, aside from their very limited secret chat feature.

HN prefers substantive discussion, not nitpicking over semantics.


Honest question, how do you prove you don't have user data?


> is a mostly unencrypted … of course it's going to cooperate if local law enforcement comes knocking on their door with a warrant

I don't see how these are connected. All messengers will hand over all metadata they have to comply. The chat groups in focus are themselves mostly public groups; you don't have to play James Bond to read what's there. LEAs are arriving at the scene from that vector, not the other way round. "The NGO CeMAS monitors 3,000 German-language channels & groups for "disinformation, antisemitism, and right-wing extremism."" - it's literally in the tweet, man.

Metadata logging is unrelated to encryption; I'm not sure what the sensation is without comparing what messengers would actually have on hand in case of a warrant, minus publicly accessible info.


This is unrelated to encryption. Telegram is still encrypted, otherwise you wouldn't need to ask them.


It's not end-to-end encrypted. Of course the communication between the client app and the server is encrypted using TLS.

But this allows Telegram itself to see the content of the conversation when it arrives on their servers. This has piqued the interest of LEAs, who want continuous, real-time access to that information.


hmm. The offending claim is certainly still there

https://telegram.org/faq#q-do-you-process-data-requests

>To this day, we have disclosed 0 bytes of user data to third parties, including governments.


> 0 bytes

Such a weird way of saying they haven't disclosed data. They might say they disclosed the data by writing the user's details on a piece of paper and submitting that to the government.


Information on paper can still be measured in bytes... so maybe, maybe.


I think Spiegel and Telegram use different definitions of "user data" here:

Telegram says with the E2E chats they had no data to share, but that only refers to the content of the chats, obviously not the IP address or the linked phone number, which Telegram says it can also share with governments. When they mention later in the paragraph that they didn't share any data from unencrypted chats, that's with the caveat.

That Telegram has shared _more_ than just metadata I think is unlikely.


I'm pretty sure Germany, while having a reputation for being very privacy-focused, makes more data requests per capita than any other government.


Perhaps Germany makes more official requests because of their privacy laws, whereas in America the government simply buys people's data on the open market without having to go through official channels, get warrants, etc?


Under US law, it is illegal to give the government large categories of user data, even if you can legally sell that data on the open market to private parties.


Maybe so, but do laws really count for much in this country anymore?

> DHS Authorities Are Buying Moment-By-Moment Geolocation Cellphone Data To Track People

> The Department of Homeland Security also argues that using the information is perfectly legal and that the agency does not need a warrant to purchase it, according to a memo obtained exclusively by BuzzFeed News.

https://www.buzzfeednews.com/article/hamedaleaziz/ice-dhs-ce...

> Intelligence Analysts Use U.S. Smartphone Location Data Without Warrants, Memo Says

> The disclosure comes amid growing legislative scrutiny of how the government uses commercially available location records.

https://www.nytimes.com/2021/01/22/us/politics/dia-surveilla...


Why are you pretty sure about that? All I hear about is the US doing that every day, and they can mandate the parties into silence... so how would you even know?


Lots of companies openly state how many requests they get from each country. I think I saw a report where someone tallied all the requests from all the major sites that report on the requests they get and Germany was at the top.

For example, https://www.apple.com/legal/transparency/ Germany is at 11k+ while the UK, which is of similar size, is at 2k+.


> Lots of companies openly state how many requests they get from each country.

As I said, the US can stop you from making such disclosures and they do, which we know because some such embargoes have ended in individual cases. We have no idea how many are still in place.


The US can stop you from making disclosures in very specific cases - i.e. under the Patriot Act, which is a federal law. But the majority of requests are made by other parties.

Quite simply, to "We have no idea how many are still in place": actually, we have an idea. That idea is: not many. It's simply FUD to bring up the US' ability to stop disclosures when talking about who is making the most requests, especially when the US is still pretty near the top of countries making requests.


"Very specific cases", like the rest of their spying, right.

You can believe that if you want, but I'm not remotely convinced. We consistently find that the US spies and abuses its power more than anyone imagines. By induction, unless anything profound changes, I'll bet on that trend continuing.


> We consistently find that the US spies and abuses its power more than anyone imagines.

Spies, not law enforcement. Spies spy. It's kinda in their job description.


Anyone can spy. Cops spy all the time.


And in Germany, it’s literally a crime to have certain opinions and discuss them. Edit: downvote away, but this is literally true.


And yet its version of freedom of speech is, in my opinion, still stronger than the US's.

In Germany, there is Meinungsfreiheit, which is freedom of opinion. This freedom cannot override others' rights, though. So your right to an opinion may not infringe on someone else's. For example, you can't insult others in a way that dehumanizes them. However, you can insult people, just not in a way that dehumanizes them. A German court found an employer couldn't fire an employee just because they called them an autistic asshole in a text message. The employee was entitled to his opinion that the employer was an autistic asshole. It did not dehumanize the employer because it was done privately and did not interfere with the operation of the business. So in Germany, private companies are not allowed to breach your rights, unlike in the US where this only applies to the government.


This is not correct at all. You are perfectly entitled to be a fascist in your own four walls, even discuss your fascist ideals with your friends. You can not, however, advocate for fascism in public or use insignias or texts of the NSDAP for anything but educational purposes.

EDIT: The comment below phrases it even better.


> This is not correct at all.

A quick glance at https://en.wikipedia.org/wiki/Legality_of_Holocaust_denial suggests that you may have made up the public advocacy requirement:

> (3) Whosoever publicly or in a meeting approves of, denies or downplays an act committed under the rule of National Socialism of the kind indicated in section 6 (1) of the Code of International Criminal Law, in a manner capable of disturbing the public peace shall be liable to imprisonment not exceeding five years or a fine.[36][37]

Of course the distinction you are trying to draw smacks of sophistry to begin with. From what I can tell, you can be anti-islamic in your own four walls and even discuss your secular ideals with your friends in Pakistan, beacon of free speech, as well[1].

[1] As long as you don't defile the name of a prophet. That seems to carry a mandatory death sentence (plus fine, to really rub it in), even if it occurs within your own walls.


The wording might be off, but public advocacy is basically what is meant by "in a manner to disturb the public peace". This does not include discussing fascism in your home, but it does include not being able to hang an NSDAP flag from your window.

I wonder why so many free-speech advocates are hell-bent on enabling fascists to spread their propaganda. They would certainly not be the first, nor the last, group the fascists drag to their camps or shoot.

To compare this kind of law to fundamentalist religious law is a special kind of ignorant.


> I wonder why so many free-speech advocates are hell-bent on enabling fascists to spread their propaganda.

Do you also wonder why so many cryptography advocates are hell-bent on enabling not just fascists but also pedophiles and terrorists to plan and commit their crimes or are your difficulties of comprehension more selective?

> To compare this kind of law to fundamentalist religious law is a special kind of ignorant.

Pakistan (unlike, say, Saudi Arabia) has constitutionally enshrined freedom of expression. Can you point to some fundamental difference that makes the restrictions Pakistan places on this right incomparable to those Germany places on it?


[You're technically wrong, now let me explain the ways in which you're basically correct.]


Depends on what you mean by “discuss them”, maybe you can elaborate on what you’re referring to and under what circumstances. I don’t think there’s any country on earth that has absolute freedom of speech. Certain things are illegal to say for good reasons, because they can do real harm to other people.


I am not the GP, but if you cuss someone out online, just saying bad words (not racist ones, just the f-word), you could end up in jail.

I am not supporting that kind of law or anything as I have no skin in the game :).


Like the US, Germany has free speech protections, and ranks well globally in terms of journalistic protection (https://rsf.org/en/index better than the US), but yes, it looks like there is a long list of exceptions there too. The restrictions do perhaps look a little stricter than in the US, but of course it's always hard to say. Most free speech restrictions have hard-fought and won case law behind them.

You’re right; in Germany, insult is illegal. I’m reading about it, and articles say that the majority of minor insults are not reported, and that generally insult claims are dismissed (for example https://www.latimes.com/world/europe/la-fg-germany-insult-la...) Nazism and Holocaust denial are also illegal, and for pretty good reasons. Since the OP’s comment didn’t mention insulting people, but simply having opinions and discussing them, perhaps this is what they’re referring to? That might be the only subject that is taboo without qualification, the rest of them (such as blasphemy) appear to hinge on whether it could affect the safety of people?

https://en.wikipedia.org/wiki/Censorship_in_the_Federal_Repu...

The US has similar restrictions, most of them for similar reasons.

https://en.wikipedia.org/wiki/United_States_free_speech_exce...


I wouldn't compare it with the US. All the late night show hosts would be in jail.

Watching Bill Maher for the first time was an eye opener for me on free speech. I couldn't imagine anyone doing it in my own country.

Probably OP is referring to Nazi stuff. But people do get fined for giving the finger at protests. Even though most cases are dismissed, the process of going through the system is the punishment if you have no legal insurance.

All in all, I agree Germany is not a restrictive society and honours the tenets of free speech. But I won't compare it to the USA, which is light years ahead.


It's literally wrong. You can have any opinion you want, but yes, there are certain thoroughly refuted "opinions" you can get in trouble for publicly announcing. But even that has certain limits.

For example you can deny the Holocaust among a small circle of friends, and that would not be "public".

And it's extremely hard to get anything of the sort prosecuted on the internet. See "tatütata.fail".


Which opinions?


like?


I can't speak to opinions, but giving someone the finger, even online, is a misdemeanor and an expensive affair.

For cursing at or insulting someone online (calling them an idiot), you can end up in jail for 2 years.


The only reason companies like Telegram care about customer privacy is because that's how they sell their product and make money. They'll happily snitch on any of their users if doing so is more big-picture profitable than not snitching. You can never trust a for profit company bro, when will people learn this lol


In many cases, the way the authorities get your chat logs is by simply confiscating the phones. Either yours or that of the other party. In those cases, they don't even have to bother with Telegram, Facebook, or whatever. They just screenshot the hell out of everything - messenger apps, email, sms, calendar, todo apps.


I'm unfamiliar with German law, and haven't really paid much attention to telegram in a long time. But what do you expect?

A company can say "we don't share with the government", but if they get served a warrant (or subpoena?) telling them to provide X data, they don't simply get to say "no". They /could/ send their lawyer to a courthouse and have them argue the warrant/whatever is unlawful, or what have you, but if the court says it's valid then failing to comply is unlawful.

The only way you can not provide data to law enforcement is if you never have it. But companies want data for all sorts of reasons. My assumption would be that even if you say "we don't keep data" you could be served documents telling you to store that data.

Of course IANAL, but this is all basic "function in a society" stuff.


This is such a shame because, from a user experience and fun point of view, Telegram is light years ahead of WhatsApp.

And I was just starting to get traction in terms of bringing my network over from WhatsApp. And that is huge, as it was almost impossible before. It has been a sea change for me over the last year or two.


Isn't this simply in line with the Data Retention Directive?


Telegram doesn't care about EU law that much...


Why is Telegram making allowances for CP and terrorism related crimes? Why is child abuse being tagged as worse than, say, drugs-related offenses or murder?


Does it not seem reasonable that if a chat group has, let's say, more than 1000 participants, whatever is said there should be considered said in public?


People, the only safe (textual) chat app is TorChat. Everything else is merely smoke and mirrors.


What about deleted messages, do they get deleted from servers too?


> The NGO CeMAS monitors 3,000 German-language channels & groups for "disinformation, antisemitism, and right-wing extremism."

So groups and channels that were already public in the first place... No private chat data, no backdoors? I'm fine with that.


I think the idea is that CeMAS is monitoring those channels and associates potential crimes with accounts. They can make a criminal referral to the police that a crime may have been committed, but they may not be able to find out the person's actual name or persona from their account or other public information.

But it would be easy for Telegram to just hand out the telephone number, and in the majority of cases, the number would be registered to the offender. We're mostly talking about unsophisticated offenders here, people agitating for hate, neonazis, unsophisticated child abusers...


Honest question because I don’t understand most of the comments on this submission: what’s supposed to be the issue with cooperating with law enforcement?

I personally find mass dragnet-like unsupervised surveillance extremely worrying, especially in the US where federal authorities like to abuse gag orders. But that's not what we are talking about here. In this case we are speaking about judicially supervised data gathering in the context of an ongoing investigation. I have no issue with this as long as there are proper limits and supervision in place.


It’s not a problem, it’s just directly contrary to Telegram’s and Telegram’s supporters’ previously stated privacy/security arguments.


> Honest question because I don’t understand most of the comments on this submission: what’s supposed to be the issue with cooperating with law enforcement?

Depends on the laws they are enforcing. There is a very thin line between law enforcement and outright spying and authoritarian oppression.


It all starts out with "judicially supervised data gathering"; a few years later, these companies just create a portal for law enforcement to search for whatever they want without any warrants whatsoever.


What a weird article, unless Google Translate really messed up for me. Spiegel basically references itself as the source, cites some minister who says she is putting pressure on the app, and then explains that they plan to fine Telegram for not cooperating.

So they claim that Telegram cooperates... and then claim it does not cooperate. This is ridiculously vague.

Not to mention that the article doesn't clarify whether it's data or metadata. Weird, poor journalism.


> references itself as a source

This might be a 'germanism'. The article says 'according to SPIEGEL information' which translates to 'according to undisclosed sources that gave us information'

> cooperating and not cooperating

Recently there was a huge controversy about Telegram in Germany about them not disclosing personal information about accounts spreading illegal information according to German law. There were claims about banning Telegram in Germany, but after pressuring the app stores, Telegram seems to have given in slightly. So they are still not disclosing as much information as the authorities want them to, but it seems like they started to cooperate a bit.

>data or metadata

'Nutzerdaten' in my understanding means mostly personal data (real name, IP, address), but you are right, it is quite vague.


They claim to have learned Telegram gave up IP-Addresses and/or telephone numbers

> on suspects in the areas of child abuse and terrorism. In the case of violations of other criminal offenses, it remains difficult for German investigators to obtain information from Telegram, according to security circles.

So Telegram is not giving free access or following German law across the board, but it did help out in some specific cases.

Telegram claims otherwise and says it never responded to requests.

Yes, they are protecting their source(s), so you have to take them on faith. Der Spiegel has, however, an excellent reputation and good track record.


I think you missed the first paragraph:

The operators of the messenger app Telegram have handed over user data to the Federal Criminal Police Office (BKA) in several cases - contrary to what has been publicly reported so far. According to SPIEGEL information, this involved data on suspects in the areas of child abuse and terrorism. In the case of violations of other criminal offences, it is still difficult for German investigators to obtain information from Telegram, according to security circles.

(Translated with www.DeepL.com)


So, what I'm hearing is that unlike all the other apps sufficiently popular to matter, Telegram doesn't provide unlimited access to its user data.


WhatsApp is end-to-end encrypted for all messages by default, so Telegram is worse by having full access to the plaintext for the vast majority of messages on the service.


I am very happy that they cannot MitM my convos, I am less happy that they literally control the app I use to see/send/store those messages.

E2EE only means stuff if you have a baseline of trust for the app developer.


That's why there are vendor-independent standards like IRC or XMPP. We need to stop talking about messaging "apps" and make compliance with internet standards a requirement.


I have doubts, given their owner is Facebook.


You can of course go verify it. Or you could trust the many many people who have.

Or you can spread FUD on the internet…


How can this be verified? WhatsApp isn't open source.


You could use a network traffic analyzer, Frida, or trust third party security audits that WhatsApp publishes like https://research.nccgroup.com/2021/10/27/public-report-whats...


What if it acts normal for a vast majority of users, but a user who is secretly flagged on Facebook's back end will secretly report plaintext? Or a certain list of conditions will trigger more snooping? Network traffic analysis works for proving that the app, right now, in this exact circumstance and time and date and location etc., probably isn't snooping on me. There are lots of sneaky ways to exfiltrate data that you wouldn't notice. Imagine encoding data through the timing of requests made or the exact ordering of simultaneous requests.


>What if it acts normal for a vast majority of users, but a user which is secretly flagged on Facebook's back end will secretly report plaintext?

You can see that by reverse engineering the binary.


Software verification rarely uses the source (at least exclusively) because you can’t trust it.

Typically it’s a combination of decompiling and traffic analysis.


[flagged]


Does Germany have a digital wallet for finance, health, etc.?


Lesson learned, don't close your API. Maybe your users will trust you more.


Telegram has released user data to the Federal Criminal Police Office of Germany in several cases


Yes, you already mentioned that.


The privacy movement should really distance itself from criminals; it always looks sketchy when the most vocal advocates are people who just got caught doing something illegal.


“The trouble with fighting for human freedom is that one spends most of one’s time defending scoundrels. For it is against scoundrels that oppressive laws are first aimed, and oppression must be stopped at the beginning if it is to be stopped at all.”

— Commonly attributed to H. L. Mencken (1880-1956)


Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them.—In this formulation, I do not imply, for instance, that we should always suppress the utterance of intolerant philosophies; as long as we can counter them by rational argument and keep them in check by public opinion, suppression would certainly be most unwise. But we should claim the right to suppress them if necessary even by force; for it may easily turn out that they are not prepared to meet us on the level of rational argument, but begin by denouncing all argument; they may forbid their followers to listen to rational argument, because it is deceptive, and teach them to answer arguments by the use of their fists or pistols. We should therefore claim, in the name of tolerance, the right not to tolerate the intolerant. We should claim that any movement preaching intolerance places itself outside the law and we should consider incitement to intolerance and persecution as criminal, in the same way as we should consider incitement to murder, or to kidnapping, or to the revival of the slave trade, as criminal. – Karl Raimund Popper (28 July 1902 – 17 September 1994)


That argument, even if correct, can be used to target any and all viewpoints. It’s just a matter of who gets to decide what is and isn’t “rational”, “deceptive” or indeed “intolerant”. Shift the definitions ever so slightly in your favor and you have an iron-clad tool to suppress, with good conscience, your adversaries, no matter which side you are on.


Successful use by criminals is among the best possible proofs of the effectiveness of any privacy technology. If it protects even criminals, it will surely protect us. Would be even better if some government agency or military started depending on it for their covert operations.


I wonder if you would still feel that way if you ended up being the victim of said criminals.

There has to be a balance, or at least recognition that encryption leads to situations never before possible.



