Twilio incident: What Signal users need to know (signal.org)
672 points by input_sh on Aug 15, 2022 | hide | past | favorite | 478 comments



This info gives us an interesting opportunity to estimate the rate at which Signal is adding new users. They've been very tight-lipped (understandably) about their usage stats but anecdotally they seem to be an increasingly common presence on my friends' phones, even the non-techies.

As far as I can tell, Signal uses Twilio only to send SMS for phone number verification. Verification happens when a user registers a new number or changes the number on their existing account.

The rate at which Signal is adding new users could be calculated by:

1900 * (proportion of new registrants among SMS recipients) / (length of Twilio incident)

You could probably make some common-sense assumptions about the first variable. But I can't find any publicly available info on when Twilio was first compromised. Their press release only mentions that they discovered the intrusion on August 4, which is presumably close to the end date of the incident. Does anyone know what the estimated start of the incident might be?
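A minimal sketch of that back-of-envelope calculation (both the fraction of new registrants and the incident duration below are made-up placeholder assumptions, not published figures):

```python
# Hypothetical back-of-envelope sketch of the estimate above. Both the
# fraction of new registrants and the incident length are assumptions.
def new_users_per_minute(sms_recipients: int,
                         new_registrant_fraction: float,
                         incident_minutes: float) -> float:
    return sms_recipients * new_registrant_fraction / incident_minutes

# e.g. assume half of the 1,900 recipients were new registrations and the
# attacker had access for roughly 3 days:
rate = new_users_per_minute(1900, 0.5, 3 * 24 * 60)
```

Pinning down the incident's start date is exactly what would let you replace the placeholder `incident_minutes` with a real number.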


> Verification happens when a user registers a new number or changes the number on their existing account.

Doesn't verification also occur when you re-install the app? Between that and how hard Signal makes device <-> device upgrade transfers, I wouldn't be surprised if most messages were for existing users.


My wife and I recently bought new phones and had no trouble transferring Signal whatsoever.


Signal's SMS registration codes expire after a few minutes, so you wouldn't even need to know the duration of the incident. Let's be conservative and say the codes expire after 5 minutes (it's probably shorter); then Signal is registering 380 devices a minute.


My reading of the post is that they determined the "1,900 users" figure by the number of users who had requested a code during the duration of the Twilio incident, as the attacker could have accessed their SMS messages at any point during the compromise:

> During the window when an attacker had access to Twilio’s customer support systems it was possible for them to attempt to register the phone numbers they accessed to another device using the SMS verification code. The attacker no longer has this access, and the attack has been shut down by Twilio.


At least the number of requests plus the number of open, unverified, unexpired requests. So you need to guess the average time to verify and the abandon rate.

It also seems like there is a buffer period when the numbers are registered but not yet purged from the Twilio system:

> 1) their phone numbers were potentially revealed as being registered to a Signal account, or 2) the SMS verification code used to register with Signal was revealed.

We don't know how often Signal purges that, so, although unlikely, it could be a day or a week or more of registrations.


380 devices / minute would imply Signal is adding 547,200 users / day, or 199,728,000 users / year. That seems way too high. Granted some could be multiple devices per user, but still...
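For what it's worth, the arithmetic does follow from the sibling comment's assumptions; it's the premise (treating all 1,900 as codes outstanding within a single 5-minute expiry window) that the replies dispute:

```python
# Reproducing the extrapolation above. The weak link is the assumption
# that all 1,900 codes were live within one 5-minute expiry window,
# not the multiplication itself.
per_minute = 1900 / 5
per_day = per_minute * 60 * 24
per_year = per_day * 365

assert per_minute == 380
assert per_day == 547_200
assert per_year == 199_728_000
```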


200M users/yr does not sound unrealistic to me given the network effects of Signal. I'm certainly not suggesting that's the actual rate - but I'd be willing to believe it's within an order of magnitude.

Here's an article from last year mentioning 50M+ downloads in 10 days. https://webcache.googleusercontent.com/search?q=cache:hsVnnQ...


> Here's an article from last year mentioning 50M+ downloads in 10 days.

That was a period in which WhatsApp had just announced their new ToS, indicating that the data about their users was to be merged with Facebook's. Consequently, many users became aware of the enormous privacy issues with all Facebook/Meta products, and abandoned WhatsApp for other services, mostly Signal and Telegram.

I doubt that this period can be used to extrapolate Signal's growth for the rest of the year, although the network effect has probably resulted in a much higher growth rate than before the WhatsApp exodus.

Edit: a Tweet from Elon Musk ("use Signal") most probably contributed a lot to that rise in awareness, and to the one-time surge of increased growth for Signal.


If by some you mean... quite a lot?

I've registered dozens of devices, since the account pairing expires after five minutes (ok, slight exaggeration).


I found this site with data about users and downloads: https://www.businessofapps.com/data/signal-statistics/


That sounds right.


It would also include re-registrations, fwiw.


Interesting idea. The number could also be an attempt to cover up their actual intention of targeting specific users.

Among the 1,900 phone numbers, the attacker explicitly searched for three numbers, and we’ve received a report from one of those three users that their account was re-registered.


> This info gives us an interesting opportunity to estimate the rate at which Signal is adding new users. They've been very tight-lipped (understandably) about their usage stats but anecdotally they seem to be an increasingly common presence on my friends' phones, even the non-techies.

I am assuming US or Germany.

I can't remember which thing it was exactly but there was a huge privacy scare in the US at some point which got people to switch in droves to Signal. Maybe the WhatsApp T&C change?

Germans have always been more privacy-aware (hence their much bigger cash-payment culture than almost anywhere else in North-West Europe), and it seems like a steady trickle is switching over.

But for example here in The Netherlands, I'd say 99% of people are on WhatsApp, 10% are on Telegram, and 0.1% are on privacy-focused messaging services.


You might be in a different bubble than I am in :)

Netherlands here. 80 contacts on my phone. 20 on Signal, of which 10 are quite normal people. Almost all 80 are on Whatsapp. No idea about Telegram.


Belgium here. I don't have WhatsApp, Messenger, or any social network account.

I only use Signal. Most of my contacts are on Signal (meaning people from work or close friends/family). I would say ballpark 85%+. My shrink and MD are on Signal. I'm lucky enough to work for an organisation that promotes and emphasises privacy and security. I'm also quite a good evangelist to my family and a few friends ;-)

Once I step out of that circle/bubble of people, I realize how peculiar my situation is, mind you. I have 2 teenagers and sometimes I have to communicate with "regular" people. EDIT: in that case, I use regular email, SMS or phone calls. WhatsApp is widely used in Belgium, as is Messenger. My wife tried to remove WhatsApp at some point, but that made her social life too difficult.


Are you aware that you are, by far, an outlier?


The attack Twilio suffered is almost identical to the recent attack against Cloudflare: https://blog.cloudflare.com/2022-07-sms-phishing-attacks/ (even down to the wording of the text messages, which are nearly identical). Cloudflare's use of security keys prevented the attackers from getting access to any accounts in that case.

These attacks are sophisticated and are capable of bypassing TOTP or mobile-app-based MFA. If this is widespread, I’d be surprised if we didn’t see a massive influx of breaches soon. The vast majority of companies are not well defended against this.


> These attacks are sophisticated and are capable of bypassing TOTP or mobile-app-based MFA.

To be honest, I wish people would stop parroting that these attacks were "sophisticated". In my opinion, I'd call something like Pegasus spyware "sophisticated". I don't think these attacks were that sophisticated at all - they were just standard-issue MITM attacks using targeted text messages - and they took advantage of what is always the weakest link in security: people. I think of myself as a fairly middle-of-the-road software developer, but I think I could have easily replicated this attack myself.


To be clear, they are not able to "bypass" TOTP or mobile-app-based MFA in the way security folks think of that term. They were able to bypass humans[1], who are often the weakest link in security-related matters.

[1]: "Twilio became aware of unauthorized access to information related to a limited number of Twilio customer accounts through a sophisticated social engineering attack designed to steal employee credentials. This broad based attack against our employee base succeeded in fooling some employees into providing their credentials. The attackers then used the stolen credentials to gain access to some of our internal systems, where they were able to access certain customer data. " https://www.twilio.com/blog/august-2022-social-engineering-a...


I would consider myself "security folks", and while maybe I wouldn't choose the word "bypass", the effect is that TOTP is basically useless against phishing and always was, so I don't object to that word from lay people.

At Cloudflare, or Google, or several other places that took this seriously, "fooling some employees into providing their credentials" doesn't get you anywhere. With WebAuthn your employees don't have a way to give bad guys credentials the bad guys can use - no matter how badly they were fooled.

TOTP is effective against credential stuffing, but it does nothing for phishing.
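A minimal RFC 6238-style TOTP sketch (illustrative code, not any vendor's implementation) shows why: the code is a pure function of the shared secret and the clock, with no binding to the origin it's typed into, so a phishing proxy can simply relay it within the validity window.

```python
import hashlib
import hmac
import struct
import time

# Minimal RFC 6238-style TOTP. The point: the code depends only on the
# shared secret and the current time step. Nothing ties it to the site
# the user thinks they're typing it into.
def totp(secret: bytes, t: int, step: int = 30, digits: int = 6) -> str:
    counter = struct.pack(">Q", t // step)
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

secret = b"12345678901234567890"       # RFC 6238 Appendix B test secret
t = int(time.time())
victim_types = totp(secret, t)         # entered into the phishing page...
server_expects = totp(secret, t)       # ...and accepted by the real site
assert victim_types == server_expects  # no origin binding to break the relay
```

WebAuthn, by contrast, signs a challenge that includes the relying party's origin, which is why a credential "given" to a phishing site is useless against the real one.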


The way that Cloudflare attack was working out sounds similar in nature to the way MailChimp was attacked a few months ago:

https://www.bleepingcomputer.com/news/security/hackers-breac...


I'd contest the "sophisticated" characterisation. Phishing activity targeting banking has done this for ages, and it's been common against Office 365 accounts for a long time as well. And it's plainly obvious to common sense that the adversary can just proxy the verification-code check in a phishing attack.


I wonder if Twilio's corporate culture has an aversion to more secure MFA methods such as security keys - or outright never implemented or banned them - because of their $$ acquisition of Authy. Anyone have any insight?


>Among the 1,900 phone numbers, the attacker explicitly searched for three numbers, and we’ve received a report from one of those three users that their account was re-registered.

I wonder if this was a curious attacker trying to see what they could do with their access, or a targeted attack.


I think someone might know that certain numbers belong to certain users and wanted to prove it. Happened a lot with Disqus accounts.

You can btw use the password reset function on many sites to correlate it with notifications. Easy at public events.


It sure feels like it was targeted. Is trying to re-register a Signal account the sort of thing an attacker is likely to do at random?


Similar to the other reply, I imagine it would be along the lines of:

"Holy shit are those Signal 2FA codes? That's wild"

In my head, this is something more like what a teenager (e.g. Lapsus$) might think?


> Is trying to re-register a Signal account the sort of thing an attacker is likely to do at random?

Yes. I mean why not, you've got the number(s).


The page is also quite vague about how the attacker got these 1900 phone numbers. It seems to imply that they were just the ones around when the attacker got access. But it doesn’t actually state that clearly. Were they 1900 random numbers or were they chosen somehow? The latter is of course far worse.

They also apparently have logs of the attacker searching out three specific accounts within these 1900. That seems odd. What's the chance that, out of all Signal accounts, the three they are curious about just happen to be among the 1900 they got access to? (Perhaps Signal/Twilio don't have logs from failed searches? That would be pretty poor logging though)


Those 1900 phone numbers would be all the accounts that started the registration/re-registration process with Signal during the time the unauthorized access was available. That process is started on Signal's side and Twilio is only used at the midway point to send a device verification SMS.

Any Signal accounts that did not start that process during that time would not be able to be intercepted or accessed since Twilio has no means to begin it. The three specific accounts mentioned would be the cases found that the verification message was accessed through Twilio to register the account on the attacker's device.

So yes, in effect the 1900 were only the ones around when the attacker got access. Whether the specific three were targeted attacks or random messing around isn't clear though.


The attacker sought out 3 specific numbers. The 1900 figure is the number of registrations that occurred during the time the attacker had access – accounts they could have re-registered, but for the most part likely didn't.


This is a weird thread. There's a product that does secure messaging with usernames and only requires user/pass. It's called Keybase. If this is the product you want, then go use it. I don't understand why everyone wants Signal to be something it's not.

I quite like Signal as it is, and this "incident" demonstrates exactly what happens if a carrier gets compromised: nothing. Nothing happens. Signal decides not to trust any phone verifications from the period of compromise and requires affected numbers to re-register. All the important crypto has nothing to do with phone numbers in Signal's domain.

And this is exactly why I use Signal. It lets me send secure messages to people using a tried-and-true UX: text messaging, but with its own secure application layer. It's really difficult to build a usable security product, and Signal has done it successfully.


I love Keybase, but I would never recommend it today.

Zoom acqui-hired the team in 2020: https://blog.zoom.us/zoom-acquires-keybase-and-announces-goa...


If you're looking for a Keybase replacement, check out Peergos (https://peergos.org). Peergos is a P2P E2EE global filesystem and application protocol that's:

* fully open source (including the server) and self hostable

* has a business model of charging for a hosted version

* designed so that you don't need to trust your server

* audited by Cure53

* fine-grained access control

* identity proofs with controllable visibility

* encrypted applications like calendar, chat, social media, text editor, video streamer, PDF viewer, kanban

* custom apps - you can write your own apps for it (HTML5), which run in a sandbox which you can grant various permissions

* designed with quantum resistance in mind

You can read more in our tech book (https://book.peergos.org) or source (https://github.com/peergos/peergos)

Disclaimer: co-founder here


Hmm, I'm looking for a Keybase replacement but one of the main reasons I use Keybase is their native apps that let you mount the cloud storage as a FUSE or FUSE-like (Dokan) native storage device.

This is great for distributing encrypted keychains/configuration files and such across various platforms (where many apps are not cloud-aware but are happy interacting with the filesystem). So far the "mount" approach seems to have also performed considerably better than syncing-based services (Dropbox, G Drive, OneDrive etc.), which from time to time have resulted in hard-to-untangle merge errors.

Peergos looks promising but it also looks to be a web-only service, so not exactly a replacement yet. Would be nice to see an option for local native mounting. Preferably via a native app with a FUSE-like mount point, but I'd also probably be OK with something like WebDAV maybe.


We have a FUSE mount and CLI. For details see: https://github.com/peergos/peergos#fuse-native-folder-mounti...


Ooh, good to know. This should really be advertised a bit more on the website... I went straight to the website and it doesn't mention local mounting in the features at all, and no screenshots show anything about local mounts either...


can communication happen across servers/vendors like in Matrix?


Yes, it's P2P. Anyone on any server can share and communicate with anyone on any other server.

You can also migrate server unilaterally and keep your social graph without needing to tell everyone, all links to your stuff continue to work afterwards.


Looks interesting. Built on top of IPFS?


Yep, we built a super minimal ipfs replacement - ipfs-nucleus (https://github.com/peergos/ipfs-nucleus) with added block level access control, which is also post-quantum.


I am aware. For one, it still works just as well as it ever has; the Zoom acquisition didn't change anything there. So if you care about features, there shouldn't be any problem. For sure it seems to be in maintenance mode, but nothing they were doing of late with Lumens was that exciting anyway (trying to become a crypto wallet like everyone and their mothers).

I would pay $/mo for a Keybase reboot with the goal of building a sustainable business like Signal did instead of taking VC money for a shot at the moon. Until someone does that, Keybase continues to work as a messaging app with usernames instead of phone numbers.


Yeah I'd rather use Keybase which has username / password than the disaster that Signal is right now. Especially when you have both Twitter and Twilio breaches, SS7 attacks, SIM swapping attacks, etc.

Keybase still works and for a simple messaging app does the job better than Signal or any other messaging app that requires a phone number. This is a total disaster.

> but nothing they were doing of late with Lumens was that exciting anyway (trying to become a crypto wallet like everyone and their mothers).

Just like Signal did, with their own private crypto wallet and cryptocurrency that they had been working on suspiciously in the background for a year before being questioned.


Why doesn't anyone ever consider XMPP using one of the many clients like gajim, conversations, etc.? It's got all of the encryption features anyone would want but has zero mentions as a secure messaging option.


I've replied in a sibling comment about Peergos which is trying to do just that.


Definitely checking it out, thanks!


It's kept updated; we use it heavily to interact with the Chia blockchain team, and there's just no substitute for its identity feature for knowing who you're talking to.


Barely so. Looks more like bare-minimum life support. The slump in code contributions after the acquisition speaks for itself:

https://github.com/keybase/client/graphs/contributors


Lately I've been pretty happy with Jami. It's still a little unpolished, but it's been good to me so far.


> If this is the product you want, then go use it.

This is great advice if your goal is to send messages to yourself. In the real world, though, a messaging app that you're the only one using is about as useful as a bag of ice in a snowstorm. People don't need "like signal but with usernames," they need "signal with usernames (or email addresses or...)" so they can communicate with people who use signal.


This doesn't make any sense. My assertion is that Signal would not be Signal if it had usernames. The subtext that I did not state specifically is exactly the question of why more people don't use Keybase regularly. Maybe it's not the winning UX?

You don't get to look over at Signal and say "wow what a great user base I need to be a part of that" and then draw the conclusion that "Signal needs to support my ideological aversion to using a phone number". You're missing the possibility that Signal is the way it is because it requires users to verify their phone number.

If you can't use a phone number but need to talk to people who do, securely, then you need to convince them to use a product that accommodates your niche. Why can't you use PGP and email, or Keybase, or <insert one of the 10s of other products that let you send encrypted messages>?

Sure, signal could add support for usernames. But how do you know there'd be anyone left after they did for you to talk to? Maybe it's not what Signal's users need.

Anyway, if Signal found a way to support usernames that didn't compromise on all the reasons I use signal and also didn't open the network up for tons of spam and low quality content, I don't think I'd complain. But that's a big IF.


I've learned the hard, or at least slow, way that this discussion is mostly futile. All I can say is that a large part of the world doesn't use phone numbers like that anymore. One of the major benefits of messaging services is that they aren't tied to a country, carrier, area, address, personal identity or even your phone. It doesn't end up in random databases of shopping websites or advertising networks. You can share it with someone you briefly met, someone unknown or even have someone else share it for you.

I've found, and I think more than me have, that the overlap between having an immediate need for security and wanting to share your phone number is surprisingly small. And even just a subset of those people are on Signal.

It's just never been very useful for me when other services are.


fwiw I am a user of signal and I am expressing my need. Allowing it access to my contact list and my phone number is a privilege I extend nearly uniquely to it among similar apps and I want that gone. Because I can't just "not use signal," because signal is where the people I need to talk to are. Users are a key feature of any social product, you can't just "all else equal" them away.

It's not really my problem if it's hard. That's for them to figure out. Until they do I will continue to be an unhappy user of their product, and no amount of people on the internet willing to defend their choices as if they were their own is going to change that.


Allowing the Signal client to access your contact list is literally the premise of Signal; it's the core security UX trade it makes: no durable logs of who's talking to who on the servers, and contact lists stored exclusively on the client.


These two things are not related in any way. You could clearly have a messenger that stores its contact lists exclusively on the client but does not appropriate the identifiers and contact lists of a different application (PSTN calling software).


Let's concede that using other applications' identifiers is strictly bad. Probably everyone agrees. Now, how do I message you on this pristine application?

Using phone numbers is a compromise taken in order to enable a UX that actually wins users. Have we forgotten what that word means?


You've said something like this many many times and I just don't see the logic of the question. You're talking about a feature that you admit is a privacy compromise and then comparing it to an absolutely maximalist alternative, or a world where people only connect in literally one way (through their phone contact lists). Is it really so hard to imagine that other compromises may be possible, or even coexist?

The answer is I give them my email or username. They give me theirs. We connect.

Using phone contact lists shortcuts this process, but the exchange still had to happen at some point. Is it really so hard to believe some users might choose to do it again? Or, god forbid, with someone they'd rather not give a phone number to?


I'm not comparing to some absolutely maximalist alternative. I'm asking how you get an equivalent product experience without the compromise (which would make everyone happy). I strongly believe the UX afforded by the compromise is how Signal has won all its users. The threat model and all it entails is the value prop.

I genuinely believe there is a lot of commentary on this thread from people who have never designed a secure system. You never get 100% security and 100% privacy. Even if you only use public keys, web3 style, you're still a traceable public key--by definition not private. Okay everyone uses a fresh key for every action. Well now you have a problem figuring out who anybody is and whether you should trust them. Either trust isn't self-sovereign or it is. And we've learned time and time again that self-sovereign trust systems are akin to anarchy. Signal leverages the verifiable short identifiers available to a mobile phone, at the expense of 100% perfect anonymity when asking the question "has this phone number used signal". Literally everything beyond that point is 100% secure and as private as two public keys corresponding can be.

1. As a signal user, I don't want to see the threat model weakened so that we can include email anons, personally.

2. Even if we did, I don't understand how doing so in any way solves the privacy issue. How is email any more private than phone? If an email provider got phished people would be yelling the same thing "how could signal be so stupid to use email, don't you know it's insecure". Email providers can still be compelled into shenanigans, too.

3. Signal as a product has to facilitate a key exchange. I'm pretty sure you can check out their source code, run their protocol, and solve the key-exchange portion differently if you so desire. You could have "signal without phone numbers or email" tomorrow if you wanted. As long as your users are willing to copy and paste public keys into their messenger, that is.

To sum up: the key exchange and distribution is the entire problem. And Signal presents an adequate solution: bind phone numbers to asymmetric crypto, add perfect forward secrecy and give people secure messaging. Surely it's not for everyone, but this incident in my eyes only further validated that this premise is solid.


> I genuinely believe there is a lot of commentary on this thread from people who have never designed a secure system.

Gosh that's quite the conclusion. I hope my employer never finds out about this discovery of my competency based on some comments on a message board.

I think you've very much lost the thread of what I'm saying here, because at no point have I suggested anything about 100% security or 100% privacy. It would actually be pretty weird for me to be advocating for that while also asserting that you're making maximalist arguments.

I also never said email is inherently more private than phone. I assert that it's a different privacy tradeoff, and one that I'm more comfortable with for various reasons. I could get into those if you want, but I don't think they're relevant. (1) is the more interesting question in the end. (3) is just "it's open source, you can fix it yourself!", which is... not very useful on any level. Yes, I can go make my own Signal-based platform and talk to precisely no one over it. No, I'm not interested in doing that. I've been to the social network rodeo and have the mental scars to prove it.

So ok, assuming we go with email addresses as the alternative mechanism, and the email addresses still require verification same as the phone numbers, and you still have to mutually have each other on your contact lists to communicate through Signal: how, specifically, has the threat model been weakened?


That comment wasn't directed at any single individual. There's just been a lot of "I imagine you can just type in a username and that would all work, QED. Duh." type of comments across the board, hence my broad statement.

I agree Signal could add email addresses specifically, if verified and it wouldn't affect the threat model outside of introducing the network to more spam-able identifiers. Like I've said, if they figured out how to do that without degrading the quality of the experience today I doubt I'd be up in arms. I'm working on adding more email addresses to my contacts book, slowly. It probably makes more sense today than it did when Signal was born.

It's not about what Signal can and can't do, though. Signal needed a readily available, offline, locally owned and operated contacts book to make their product vision work. So they used the one everyone has on their phone, and it worked. They upgraded the security of everyone sending SMS and MMS. I think there's a way to celebrate that while asking for email address support, without getting into the realm of "zomg Signal sux because they use gross phone numbers what idiots would design a system like that what a fucking mess of a royal debacle attn. whistleblowers and abortion seekers: signal is not for you". That type of response is what I'm railing against by simply reminding people that Signal is a successful product that does indeed work as advertised and that it doesn't exist in a vacuum.


I've been reading up on the state of things. So Signal actually is working to remove the phone number requirement: https://twitter.com/moxie/status/1281353114063257600?s=20&t=...

They've been working on it for years. Their solution is that they have to take client-side ownership of your contacts list, keep it associated with your "account" and sync it across your devices so that when you correspond with someone by username, it becomes available to you everywhere. They have to be your contact book. I can find nothing on how they plan to verify usernames, perhaps in the traditional style with email.

So yeah, absolutely not some trivial change that they just don't want to do because fuck the few people that don't have a phone number (or don't want to use it). They're working toward supporting usernames and at every turn keep getting reamed by HN because, in their effort to solve a problem that only exists on HN, they have to deploy a solution that means you have to trust them in a teeny tiny way you didn't previously IF you set a weak pin on your account. It's mind boggling. It must be so disheartening to see that type of response.

But, that's my point. Signal can't add short names without changing the fundamental trust model which appealed to everybody initially. No amount of hiding a password as a PIN will change that. I really hope they don't kill their product along the way...

(Also man WTF they're running Raft on SGX enclaves just so they can rate limit attempts to brute force users' weak pins. While super cool, technically, what an incredible waste of resources just to try and make weak passwords okay. Probably the most backwards thing I've seen a security company attempt like ever. Just tell your users if they want a username they need a strong password. Or just generate the entropy for them and only allow the username option to people who also want to take custody of their new 32-bytes of entropy and have a signal-managed synced contact book.)


> Just tell your users if they want a username they need a strong password.

If their goal is to shift responsibility to the user, that solution works. If their goal is to provide secure communications to the general public, it doesn't. As you probably know, strong passwords are widely recognized as a failed security technology for the general public.

Also, what happens when the user forgets their strong password? Data loss is not an acceptable outcome for general end users, whose priority usually is not ultimate security but usability. Thus (as I understand it) Signal allows weak passwords ('PINs') that stay with the client, and adds 'invisible' entropy which is backed up to server-side SGX (because the user doesn't know the entropy, it must be backed up off-phone in case the phone is lost). It's a great, no-tradeoff solution IMHO: if SGX is compromised, the user is no worse off than if the supplemental entropy didn't exist at all - they have their (weak) password. If you don't want to depend on the 'supplemental entropy', use a strong password, and then Signal's entropy and SGX security become irrelevant.
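A hedged sketch of that general scheme (the function name, KDF choice, and iteration count are illustrative assumptions, not Signal's actual Secure Value Recovery implementation): the weak PIN is stretched together with server-held entropy, so the derived key is strong as long as the server-side store holds, and degrades to PIN-only strength if it doesn't.

```python
import hashlib
import secrets

# Hypothetical sketch only: a weak user PIN is combined with 32 bytes of
# server-held supplemental entropy. The derived key is strong while the
# server-side store (e.g. SGX) holds; if that is breached, the user is
# back to the strength of the PIN alone.
def derive_master_key(pin: str, supplemental_entropy: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), supplemental_entropy, 100_000)

entropy = secrets.token_bytes(32)  # backed up server-side in the real scheme
key = derive_master_key("1234", entropy)
assert len(key) == 32
# Same PIN + same recovered entropy reproduces the key on a new device:
assert derive_master_key("1234", entropy) == key
```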

> Or just generate the entropy for them and only allow the username option to people who also want to take custody of their new 32-bytes of entropy and have a signal-managed synced contact book.

AFAICT, Signal is not interested in implementing features that are valuable only to geeks and that everyone else ignores, and those kinds of features don't seem to fit their mission.


I agree almost completely. It's just that my guess is that nobody actually cares about usernames either, just the few people who can't use a phone for <reasons>. So I'm thinking they're already kinda in the realm of building out this feature for nobody, which is why I was suggesting something more wallet-like, like generating 32 bytes of entropy and showing users the mnemonic representation and telling them not to lose it (which is familiar, despite being a terrible UX, at least). Perhaps I'm underestimating how many people actually would use a username instead of their phone number, in which case I think you're 100% spot on.


> in their effort to solve a problem that only exists on HN

I don't understand why your takeaway from the fact that they're implementing it is that the people saying they want or need it are irrational and only exist on HN, instead of "hmm, maybe I'm wrong and this is a legitimate feature request".

Anyways, let me assure you that the people who get "reamed" are in fact anyone who even casually mentions they want this feature; they get a bunch of very dedicated people telling them how utterly wrong they are, that no one should ever want that, and that anyway it's impossible actually.

Trust me.


I'd consider the way you're asking for the feature. There was definitely an air of "this is such a simple feature, why can't I just have it, it should be no trouble for everyone involved, it's just a username". I think if people asking for this feature were to spitball through it and acknowledge the tradeoffs, rather than incessantly repeat how uncompromising they are in their need for usernames and their need for Signal to have them yesterday, the conversation wouldn't seem so volatile. I actually wasn't trying to dive in and sling mud. I see this conversation all the time on HN and, coming across it again, wanted to suggest that maybe another product with usernames would work better for these people, since literally every time Signal comes up on HN the peanut gallery shoots off with tired smears and entitled quips about how Signal uses phone numbers.

The strong response you encounter is people trying to communicate that it isn't that simple for them. That it means enough of a shift in Signal's model that they're really worried about the change to the product if Signal implemented it, not least because it changes the very thing that drew them to the product in the first place. And unfortunately, it seems the worries are not unfounded. I genuinely don't think many of the people asking for usernames would want them if the proposition was clear: "you can have them but you have to trust us with your contacts book and personal information". It's the catch-22: in order to have the privacy of a username, you must give up the privacy you'd win. For some people, they trust Signal with that responsibility more than their carrier (like a VPN) and it's a good tradeoff.

Me? What was compelling about Signal is that it was my contacts book and encrypted communication. No accounts/profiles, no passwords, no proprietary software, no invasive product analytics, just a global DB associating phone numbers with pubkeys. That was my pipe dream, but I also acknowledge I'm not the center of the world either: in the same way you begrudgingly use Signal with a phone number, it's also not the end of the world for me if we have yet another company out there where I need to maintain a profile and stick a password in my password manager and log in periodically. But sadly, if Signal gets to that point, it ultimately means the "Signal experiment" portion of the product's life will have come to an end. <- This, more than anything else, is why the suggestions to go use one of the products that already provide the experience you're looking for are apropos and not dismissive. We don't want the experiment to end. The entire point of championing Signal in the first place was the idea that we could collectively participate in a product that didn't do what everyone else on the internet did and send off all your data to their servers the minute you opened their app.


You’re angrily lashing out at strawmen to justify why the lookup key is constrained to a phone number. That does not need to be bound to a phone number, it could be an identifier someone just types in.

What you’re arguing for is the recovery mechanism to get back online when you lose your private key, which is totally unrelated and could be solved independently for people who choose to give a phone number vs those using an email or some other arbitrary identifier.


1. I'm not angry at all.

2. Let me make this clear: an imperative component of Signal's product is that the identifier used is verifiable, and that the only thing they store for a period of time is that users in fact did verify their number. Everyone arguing for typed-in identifiers is missing this point. That wouldn't be Signal. That's the core of what I'm saying. That would be something else, where people claim short identifiers and then have to share them with each other via some other channel which I'd have to independently verify, etc.

3. Nobody arguing for non-phone-number short identifiers has proposed a solution for how you verify them and manage them that doesn't change Signal's fundamental threat model and information architecture, which, at the end of the day, is what many users are bought into. I use Keybase, feel free to hit me up there if you need a messaging platform with socially verified short identifiers. My proof is in my profile. If you want an unverified short id, email works great, I respond to that too. Point being there are existing options for "type in a short id and send it a message".


Following up on:

> Nobody arguing for non-phone-number short identifiers has proposed a solution for how you verify them and manage them that doesn't change Signal's fundamental threat model and information architecture, which, at the end of the day, is what many users are bought into.

So it turns out Signal is building support for usernames and their solution is indeed rather involved. In order to achieve usernames they've:

1. added a contacts book and profile

2. added a "passpin" by introducing Intel as a trusted actor; the PIN you enter when using Signal is actually your Signal account password

3. presumably adding support for usernames and <TBD> username-based verification

They've been working on this for years. They're not just sitting on their hands. So my point seems to stand: it's not just "add a username field and let people type shit in, we have input fields amirite". It's a massive overhaul of their fundamental architecture. And sadly it's not happening very publicly, because the stuff they're doing to make it happen is also not okay according to the other half of the security community. Carriers? Not okay? Well, how about usernames? Oh, you're using 4-digit PINs as passwords and SGX to throttle login attempts? Well, that's not okay either; SGX has been pwned a billion times. Lose/lose for Signal. I pity them, honestly. It sucks.

I'm not personally enraged or anything. I think the SGX stuff is actually a pretty cool compromise. But, alas, it's still a compromise in order to make usernames equally feasible as phone numbers. Either way you're compromising. And that's what this thread is about: to make security accessible you can't live in an ivory tower and demand perfection. You have to get down in the field and make compromises in order to build a successful product that people will actually use.


“Using phone numbers is what makes signal signal and everyone else is stupid who doesn’t see that!”

“Oh, signal is relaxing that constraint. I was still right and signal employees are wrong for doing this.”


I think you're missing the part where they are in the "trying to figure out how to relax that constraint" phase (they haven't yet) and having trouble in paradise.

They've run into all the issues and nuances elucidated in this thread. They have been receiving pretty intense feedback from people who have stopped using their product because of the concessions made. They've clammed up in response and are losing even more people because they are not clearly articulating the changes to their users (many of whom would be fine with it if communicated transparently and respectfully). They have people who desperately want usernames, but also not if it means what Signal is proposing, and who admit "okay, you heard my request and tried, but hmm, let's not do that, I don't like this PIN UX and it's not what I wanted when I said I want usernames". And they even saw their product forked the minute it became clear what they were doing: https://getsession.org (its blog starts in Dec 2019, which is around the time Signal started messing around with secure value recovery stuff, at least publicly).


That may be their product management premise, but it's not why I use it. I use it because people I need to talk to are there and it has proper e2e messaging. I'm not beholden to their expectations of why I want to use their product.

Also I'm not advocating for anything to be kept server side, nor do I see any reason why other identifiers couldn't be kept client side. An address book is just a list of identifiers, it's not magical just because it's phone numbers and already on my phone.

We've had this conversation before though. I remain unconvinced.


> I use it. I use it because people I need to talk to are there

This is exactly what makes it "your problem".

Signal worked out a way to provide E2E messaging that practically everybody who cares and all their friends use. You can choose to accept their phone number requirement compromise and take advantage of that huge and growing network of users, or you can go your own way and somehow convince "the people you need to talk to" to also use some alternative that more closely meets your specific needs.

> I remain unconvinced.

I get that. I understand and even partly agree with your stance. But the pragmatist in me is way happier with having a significant portion of the people in my contact list also on Signal and having a zero-effort way to have an E2E encrypted chat with them. I am old enough to have gone to PGP key parties in the late 90s. I have verified private keys for a handful of friends with some combination of privacy/security/paranoia outlooks. I can't remember the last time I sent or decrypted a PGP message (that wasn't a computer-generated alert). Person-to-person encryption key exchange has been tried and has never gained anything like a ubiquitous network. Signal isn't perfect, but it's got very close to that, which makes it day-to-day usable and extremely useful. At least for me and all my friends and most of my business contacts. YMMV.


> This is exactly what makes it "your problem".

No, it’s still signal’s problem too. There is no reason to bootlick here.

> Signal isn't perfect, but it's got very close to that,

It’s really far from it. Being tied to SMS and phone numbers is a nearly fatal flaw.


Fatal by what definition? Signal appears quite successful if you simply look at it.


Signal replaces messaging services that were all keyed by phone number. Use something else. I don't think anybody can do better than explaining why Signal works this way, and what the benefits are, vs. the (amply articulated) liabilities.

This is one of the most boring repeated conversations that occurs on HN. It's incessant. Avoiding these incessant superficial conversations is, in fact, part of the premise of HN.


You sound like people defending PGP when everyone knew there were major downsides and usability issues. How can keeping phone numbers as the only option be more important than everyone being able to publish "Signal:39475638" someplace like GitHub? Are phone numbers part of the encryption somehow, such that you absolutely can't use some other number even in addition to them? Because I refuse to believe you don't understand the downsides of phone numbers, and I know you understand the protocol is good enough where it is relevant. So surely there has to be some technical limitation, because what other legitimate reason is there?


And yet, there is no PGP replacement in existence despite it having died a thousand deaths and having promised replacements for decades.

> So surely then there has to be some technical limitation because what other legitimate reason is there?

It's like people aren't reading the whole thread and just responding to specific comments they don't like. The premise of Signal, or at least what's made it practically useable, is that the short identifiers are immediately available and verifiable on a mobile device. When I first reach out to someone on Signal I know the person I'm reaching out to is the owner of the identifier I used unless their phone carrier is actively compromised when I exchange the first message. To Signal's users, this is an acceptable compromise. On top of that, I don't need to do a key exchange dance every time I want to talk to a new person because I have a contacts list of their phone numbers, which Signal has verified and bound to their keys.

Signal is really pretty simple: trade key exchange parties for the phone numbers already acquired through countless years of past parties, and have locally grown crypto sans intrusive cloud services. And, do it explicitly not-for-profit so there's no possible motivation to abuse this contract with users in service of shareholders.

Obviously Signal could implement whatever random people felt the need for at any given moment. But they don't and it doesn't seem like whining about it is changing anything. If you don't like that then go use one of the many alternatives or build a replacement. I'm honestly surprised nobody's built one at this point. Literally spin up a signal server, make a build of their mobile app, and let users paste in pubkeys instead of phone numbers when starting a message. See how many people use your product. Or just change the phone number db to a shortname db and remove the verification step.

Yes, these conversations are exhausting. What's even more exhausting is the perpetual outrage from "hardcore" "security" "nuts" and absurd anons driveling on about why all the practical solutions that work for users are nonsense and how they could be made "better" but who balk at actually building the solution they think the world deserves. It's a tale as old as time in the security community, sadly.

It's funny, Moxie actually did something about it and it still isn't good enough. Signal is probably the closest thing to a PGP+email replacement we've ever had. What more do people want?


None of these are a reason not to also have a different number that you can publish publicly without giving someone your phone number. You can have your phone number for everyone in your phone book, and a one-way-derived or random number for everyone else.

> When I first reach out to someone on Signal I know the person I'm reaching out to is the owner of the identifier I used unless their phone carrier is actively compromised when I exchange the first message.

Compromise is, in this case, rather common via SIM swapping and spoofing (you can barely even call it spoofing). Phone numbers are not useful as some sort of continued point of trust, and I doubt Signal uses them like that under the hood.

> What more do people want?

Before you complain about other people maybe you should give other people the courtesy of reading what they wrote first. I have already said what I want, a public id I can publish on for example GitHub without the implications of publishing a phone number. Implications which anyone with a relevant opinion should already understand.


I think you're being hyperbolic about how weak phone numbers are. Yes, you can get SIM swapped. But you pretty much know immediately, since your phone stops working. We've never even heard of an attack where someone was swapped for days, weeks, or months and didn't know about it. It's an active attack, and while it's possible, and yes, future messages with Signal users are vulnerable while it's happening, it's not a persistent threat. And your contacts will see your safety numbers change and reach out to make sure you're really you. That leaves, as the only real damage, the problem of somebody reaching out to contact you for the first time while you're actively being simjacked.

But, none of this even matters if you turn on registration lock. Sim swapping attack thwarted.

I've read your request worded in different ways many times and what people keep doing is pointing a finger at phone numbers, yelling "they're insecure", and then pointing at usernames and saying "look, it can be better". Nobody has actually argued how it could be better, just that phones suck. I don't find that a compelling argument, sorry.

Usernames/email are no less susceptible to whatever service you use to register them getting jacked. There is literally zero security difference and emails are easier to spam. Usernames just don't have KYC baggage that phones do in the US. But honestly as Signal has shown time and time again, all that law enforcement can get from Signal is that a given phone number registered with Signal. Because they have impeccable application layer crypto which is what actually matters.

Okay so what if Signal uses a username/password DB and doesn't allow email reset. That removes the 3rd party from the equation and now Signal takes the burden of being the central authority for usernames. And, while possible, it entirely inverts the whole premise of Signal in the first place.

Good news for you, that's not just my argument, it's actually happening. Signal is trying to add support for usernames by forcing everyone to add a PIN. It's not at all clear that this PIN is now the password to a Signal account that is used to sync your contacts data and profile. That's not a problem in and of itself, because it's all theoretically good crypto. The problem is that it isn't good crypto. It's a 4-digit PIN for the majority of users. Signal knows this and is in a bind, trying to slip in things they know would piss off half their users (because it's shit security) just in order to make usernames possible. And they're getting called out for it.

aside: It's not passwords per se that are bad (even though they are, because people and UX). It's that Signal is telling everyone "hey, add this quick PIN" and people don't realize that's actually a password for your whole account and that the whole model is changing underneath them. If you know this and set a strong passpin, you're fine.

Anyway, the kicker is this: instead of having to deal with what it means to have passwords and get users up to speed, they developed a technically really cool but batshit-insane system to throttle PIN attempts, so that the burden of trust gets moved from your carrier to Intel and they can wash their hands of how insanely bad a 4-digit PIN is in terms of entropy. So you want usernames because you don't trust your carrier? Did you know that would come at the cost of trusting Intel instead? They don't exactly have a great track record recently...

My entire point is not that people are stupid for asking for usernames or something. It's that they don't come "for free" as everyone seems to think. If you want traditional username/password, then the world changes so that Signal becomes a cloud service you must trust to maintain a new global contacts book of usernames just for use on Signal. Signal didn't like that and that's definitely a problem for all the people who use Signal because they don't have their fingers in that cookie jar. So they punted and are moving the trust point to Intel.

They've been working on this for years.


I agree, it's an exhausting repeated conversation. It's almost as if there's a frustrating unmet need with signal as it stands for a lot of people that isn't actually placated by the repetition of an argument about how they grow as a ~~business~~ (sorry, as a non-profit).

And again, signal is the only thing that can talk to people on signal so "use something else" is not helpful.


> It's almost as if there's a frustrating unmet need with signal as it stands

Do you have an alternative suggestion? Is there an app and platform you'd rather use over Signal? Maybe Wickr? Matrix? (AN0M? <smirk>)

My take is there's a very small "unmet need" that frustrates such a small number of people that everybody who's tried to usurp Signal has effectively failed.

Signal has literally become "SMS but secure" for everybody I know.

> signal is the only thing that can talk to people on signal

That's untrue. There is nobody in my signal messages that I cannot talk to over the phone, via SMS, and almost everybody I can talk to via email (with a vanishingly small number of those for whom I have trusted PGP keys).

I agree with your premise that it'd be really nice to piggyback on Signal's contact graph without having to do the work and make the compromises Signal has made to create that graph. But that's a totally unreasonable expectation.

(And FWIW, I think Signal totally lucked out early on by being in the right place at the right time to build their contact graph. My network of friends/colleagues exploded back when WhatsApp fucked up their messaging/policy a few years back, and practically overnight my "normal" and non privacy focussed or recreationally paranoid friends all rage quit Facebook messaging and encouraged each other to move to Signal. There was a super obvious step change in who my available Signal contacts became back then, and I'm not convinced Signal would be what it is today without that fuckup by Facebook back then.)


Why are you saying they need to grow? You of all people are the first to admit that everyone you need to talk to already uses it.


I'm not saying they need to grow. I'm saying that arguments resting on the importance of phone numbers to the growth of their social graph are also resting on the idea that signal must grow. I am, in fact, saying that while this may be important to them it is not strictly important to me.

And I never said everyone I need to talk to is on it. I have like 6 different messaging apps and accounts because nothing has everyone. And I'm pretty conservative about which ones I'll use compared to most people I know.

I would rather use signal than most of those, other than the fact that I also frequently need to communicate with people who have no business knowing my phone number.


I guess I missed where the growth argument was being used. Sounds like we agree that there's no implicit need for signal to explode into oblivion like a unicorn prancing over a rainbow.

I've never regarded a phone number as something extraordinarily personal. The amount of spammers that happen across my phone number is ridiculous. It's nice when you interact with a real human using your phone number (unless it's a recruiter ffs), so the more I give it to the more likely that is to happen. I guess I just don't understand what's personally revealing about a phone number. I give my phone number to mundane things all the time so people can communicate with me. The "need to know" bar for my phone number is pretty low. It plays about the same role as an email address in my life.


I think that people's experience with the privacy and significance of their phone number varies a lot by demographic.


They're not a business.


It is certainly fair to be frustrated. Respectfully, I'd challenge anyone who thinks they can build a successful secure messaging platform that concocts the perfect UX while being absolutely privacy preserving to do so. I'd give it a spin.


Except that like I said, users are a feature here. The perfect thing may exist but it doesn't matter if no one's using it. I don't know about you but I lost belief in the idea of a perfectly meritocratic world of social products a long time ago.


I don't know whether or not it has always been the case, but Signal works fine without "access to my contact list". The Android app does seem pretty persistent in asking for it, though!


> It's not really my problem if it's hard. That's for them to figure out.

It's totally your problem.

You want a platform they have figured out they are not interested in building. That cannot possibly be their problem.

If I were a journalist critical of the Saudi regime or an NSA whistleblower or a government leader or the leader of a drug cartel or something similar, I'd also be unhappy with needing to tie a phone number to my Signal app to be able to use it. But there's been a whole bunch of very suspicious-looking drug busts happening over the last year or two which are without doubt related to drug dealers choosing to use AN0M instead of Signal. You need to be _very_ careful when choosing a Signal alternative...


I use signal/whatsapp etc without giving them access to my contacts. I have to type in the phone number (only first time) with whom I want to chat. And that's okay.


It's interesting to me that you used Keybase as the example. My brain doing its guessing ahead thing assumed you were going to say Matrix. I've seen several popular instances of it, and run in to people actively using it at least monthly where I haven't seen anyone use Keybase in years (since the Zoom acquisition). Do you see a lot of people _actively_ using Keybase still?


I don't use Matrix a bunch so it might just be that I'm not as familiar and out of the loop. To me Keybase (despite all the drama) seems like the most isolated/pure example of a product that took the approach of username/password style accounts and applied it to application layer crypto to achieve secure messaging. Keybase later added all the network-y chat type features that make me think more of a product like Matrix. But if Matrix is good for 1:1 "chat up my contacts and groups thereof", then great. Matrix always seemed more like federated Discord or "crypto" IRC to me with the whole needing to join channels thing.


I personally use it for LOTS of stuff, both personal and commercial (as a Slack replacement). Other than a couple bugs (pinch to zoom on Android, media playback), it's fine - I don't feel like I need any more features, though I'd love it to be a bit snappier. KBFS has been excellent for stuff like secrets in CI pipelines.

Disclaimer: I'm one of the ex-Keybase, now Zoom people. I'm definitely in a bubble. The non-Keybase people I talk with are my consultancy's employees + a couple clients.

Keybase's security model is excellent in protecting you from attacks like the one described in the OP. If you can't sign your device with another one, you can only recover a username if:

- it's not in [lockdown mode](https://book.keybase.io/docs/lockdown)

- it has a verified email / phone number

- you either click a reset link in the email / SMS _or_ know the password

- _and_ the user fails to cancel the reset over many days of warnings.

And if you manage to go through all that trouble, all your contacts will get blasted with warnings about your identity. Fun!
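As a rough illustration, the gating conditions above can be sketched as a single decision function (a toy model, not Keybase's actual code; the field names and the waiting period are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Account:
    lockdown_mode: bool
    verified_contact: bool      # a verified email or phone number on file
    reset_wait_days: int = 7    # hypothetical cancellation window

def reset_allowed(acct: Account, clicked_link: bool,
                  knows_password: bool, days_waited: int) -> bool:
    """Return True only if every condition in the list above holds."""
    if acct.lockdown_mode:
        return False            # lockdown mode blocks resets outright
    if not acct.verified_contact:
        return False            # need a verified email or phone number
    if not (clicked_link or knows_password):
        return False            # must prove control of email/SMS, or know the password
    # the user has had days of warnings in which to cancel the reset
    return days_waited >= acct.reset_wait_days
```

The design point is that every check is conjunctive: an attacker has to clear all of them, and even then contacts are warned about the identity reset.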


Matrix is the protocol I think one should turn to if Signal's requirement of phone numbers is a deal-breaker.


Matrix is great. Just remember that the Matrix threat model isn't the Signal threat model: you're usually telling a Matrix instance --- or, really, anyone who can compromise or suborn that instance --- a lot more about your communication patterns than you are with Signal.

Matrix, right now, is a lot more amenable to the kinds of messaging that people on HN tend to want to do than Signal is. The problematic thing is that HN people tend to believe that their workflows are (1) the most important and (2) the ones with the most sophisticated threat models. Neither are true; (2) is very un-true.

For, like, talking to team members about a shared dev project, I'd always use Matrix in preference to Signal --- of course, for that kind of work, what I'd really do is just use Slack or Discord. Which gets at something about what HN wants from Signal.


Interesting work in this space is being done by OpenPrivacy with their "cwtch" app. An express design goal is to minimise side-channel leakage of information.


If the phone number requirement is the only part you dislike, buy a tourist SIM card with cash. The greatest lie of online identity is that phone numbers are tied to individuals forever.


That's a lot of work considering XMPP or Matrix providers get this right.


Isn't Keybase semi-abandoned? There hasn't been a blog post since 2020 when they were acquired by Zoom.



I got a 6.0.1 update for Keybase like yesterday. I agree with the sentiment, though, feels like it's in maintenance mode. But its core value prop and feature has never stopped working. Point was that it's there and it works and if it's the UX model you prefer then by all means, use it at least until someone comes and reboots the concept.


I think a lot of people abandoned it after Zoom acquired it.


Anecdata point, I jumped ship as soon as that became public.

Zoom is a looooong way from having the sort of trust that Signal/WhisperSystems have in my mind.

(Which is a pity, I really liked the idea of Keybase.)


I don't think it's stated enough just how easy Signal is as a drop-in replacement for WhatsApp, the main communication method for a significant portion of the world. The ability to install a new app, use your phone's contact database, and be able to use the app nearly exactly the same way you used WhatsApp is an incredible feature. With almost zero effort you can significantly reduce (capitalist or nation-state) surveillance against you. It's not perfect, but it's a lot of value for little effort.

All of these feature requests require less knowledgeable users to do new things or weigh alternative options which involves time spent developing onboarding. Having "one way," an opinionated way, to do a particular type of thing is a very useful engineering value especially if you have limited engineering resources. Simplicity is an extremely underrated feature.

Being 80% perfect for 20% of the work is laudable.

What's even better about Signal is that Facebook's competitive data is the list of people you know. Facebook wins every time a person adds a friend without adding their contact info to their phone. That means Facebook is the source of truth for who you know, and Facebook is the intermediary for communicating with someone else. That's why, in retrospect, WhatsApp was an obvious competitor worth spending a lot of money acquiring. WhatsApp drove people to use their phone's contact list as the source of truth for who you communicate with, not Facebook's friend list.


A drop-in replacement would mean that you can still communicate with people on WhatsApp. The Matrix protocol allows you to bridge WhatsApp and many other SaaS comms platforms to a single client, truly making it a drop-in replacement for WhatsApp.


I installed signal and it worked. I told a friend to install signal and it worked. I told my mom to install signal and it worked. The interface was basically the same. Any friend who installed it appeared the same way they would appear in WhatsApp. I didn't have to teach any of these people anything to get them to use it. I didn't have to talk them into making an account to use it. That is what I mean by drop in.

It's not a drop in for the behavior (talk to other people who use Facebook owned services in a way where Facebook can read all of your conversations), it's a drop in for interface (communicate with others who use the app in the same way you communicated with others using WhatsApp).

I just spent time looking at matrix.

  google "matrix"
  oh right, name space collision with popular 90s movie
  google "matrix app"
  oh, this is some library or something, not an app
  searching the page for client. "Maybe under matrix live?"
  see clients button in sub menu
  see 10 plus options I don't recognize the name of and immediately lose interest

  search "matrix" on app store
  see apps with 2 stars or less than 10 reviews, nothing official.
Matrix is what you get from the people who say "isn't Dropbox just rsync?" "isn't a chat client just a GUI for a protocol?"

100 bespoke solutions to the same problem (10 different desktop clients) is an engineering nightmare and it robs a service of "economies of scale" enabled improvements.

I don't want to read a wall of text to understand something, and neither does my mom. We want to search a keyword (or "best chat app"), download an app, and use it for its purpose. That might not be optimal, but you won't see widespread adoption without it. If you tell me "Matrix is best for chat" and I can't search for Matrix on Google, one-click download a client, and be chatting with another person who did the same with minimal setup, it's not just a non-starter for widespread adoption, it's very far from "drop in." I don't personally have a single friend who has asked me to use Matrix or a Matrix client, or told me how awesome Matrix is. Matrix supporters should ask themselves why the main place you ever see Matrix mentioned on Hacker News is in posts about Signal.

Protocols don't win customers, user interfaces do. The average end user wants to download an app and use it without having to understand what "federation" or what the security implications of something is or who owns what data. They want their most knowledgeable friend (or reddit/hn) to tell them what to use and then use it and trust that they know what they are talking about.

I'm pretty confident that as long as this page: https://matrix.org/clients/ looks like that, matrix will never see widespread adoption and it will never be the obvious choice except among the people who prefer to move their documents around with rsync and know what IRC is.

Matrix being a brand for a protocol rather than a full chat app is another self harm. The customer for a protocol is software engineers. The customer for a chat app is all humans with a phone.


This is as daft as Googling "email" and expecting a de facto client. You're on HN, it's nerdville, expect more interest in the protocol than clients.

People search for "email clients". Try searching for "Matrix clients". Element is the best thus far, IMO.

"Widespread adoption" includes the EU's military, healthcare and government, so I wouldn't be so certain you'll end up being right. It's hit 60m publicly addressable accounts, which doesn't include any private servers or any kind of healthcare, gov or military: https://news.itsfoss.com/matrix-sixty-million-users/

EU is forcing interoperability standards. I have a gut feeling that you will look silly in five years time, but I'll buy you a very nice craft beer if I'm wrong. Hit me up on the bet in 5 years time: me@hammyhavoc.com

You were discussing "drop-in replacements", I gave you a drop-in replacement that still maintains interoperability with WhatsApp et al, which Signal does not. Classic shifting of the goalposts.


I think we meant different things when we said "drop in replacement" and therefore were referencing different ideas of what makes something a "drop in replacement," which is why it sounds like there are feelings of goal posts shifting.

If the messages are still sent via Facebook servers, that is not a WhatsApp replacement; it's an alternative WhatsApp client, still at its core "performing" WhatsApp. It is not a WhatsApp replacement, but a WhatsApp client replacement. I moved to Signal specifically to sever my relationship with Facebook, because I don't trust Facebook. Not communicating with Facebook is the feature that made Signal appealing and made me want to replace WhatsApp (not the client, but the service as a whole) with something else.

I think conflating the idea of replacing "WhatsApp the service" and replacing "the WhatsApp Client" is the crux of our talking past each other and why my focus is on how it's UI compatible and ignoring the idea of protocol compatibility.

> You're on HN, it's nerdville, expect more interest in the protocol than clients.

We are on HN, and it's nerdville, it's true. It is the place where the very same comment (rsync files around) I am using to criticize the hubris of nerds (myself included) was made. (https://news.ycombinator.com/item?id=8863). Plug and Play (referenced in the HN link) is a winning idea. The crux of my statements here is how amazingly plug and play signal is, specifically for WhatsApp users.

> This is as daft as Googling "email" and expecting a de facto client.

I am saying this in good faith and I hope you take it as a good faith comment and not an aggression, but have you googled "email"? I understand the point you were making and I think it applies to a lot of other protocols, but googling "email" returns gmail 1st and 2nd, then outlook 3rd. Wikipedia is the 4th result...

> Try searching for "Matrix clients". Element is the best thus far, IMO.

I hate to respond so directly to this too, but have you googled "matrix clients" because I did, and I am pretty confident the results don't prove the point you wish to be proven. The number 1 result is the matrix website matrix client page I had already found by searching "matrix app". The second and third results are top 10 lists. The 4th result which you have to scroll down to see is Element. How do top 10 lists outperform clients themselves? That paints a pretty bleak picture for the difference in quality between the best app and the worst app.

Engineers have a way of being arrogant when they think the thing they have is technically superior or part of a grand vision. Betamax was better, right? Being protocol-first rather than customer-first is, to me, a form of hubris. The customer experience is what wins; everything else is just an implementation detail for the vast majority of people, even engineers.

> Widespread adoption" includes the EU's military, healthcare and government, so I wouldn't be so certain you'll end up being right. It's hit 60m publicly addressable accounts, which doesn't include any private servers or any kind of healthcare, gov or military: https://news.itsfoss.com/matrix-sixty-million-users/

This is definitely interesting and something I find worth considering. I certainly have an American-centric view. Clearly it's in every country's best interest not to have all its communication going through foreign servers, so the idea of Europe migrating to its own chat, much like Korea chose Kakao, doesn't surprise me. I kind of suspect that the idea of federating chat before the balkanizing of it might be too forward-thinking/premature.

> EU is forcing interoperability standards.

It will be interesting to see if American fights this or embraces it.

> I have a gut feeling that you will look silly in five years time

I would not be surprised to see Element, for example, become the most popular client and obvious choice. All the comments, FAQ's, etc, already seem to acknowledge Element is the right choice, but the structures that make it easy to download (google ranking result, links from the main matrix page, people saying "use Element," not "use Matrix") are not yet in place. When Element eclipses Matrix or Matrix starts being talked about outside the context of chat apps I'll start to take matrix seriously. Certainly my criticisms are not based on immutable flaws, but what I think are strategic blunders.

As a final note this is directly from the matrix website:

  Empowering the end-user
    The user should be able to choose the server and clients they use
    The user should be able to control how private their communication is
    The user should know precisely where their data is stored
This means the user has to be informed about servers and clients, informed about communication privacy levels, and informed about what their data is and where it lives. While that might be nice, it's even nicer to trust someone to solve these problems for you, so you can think about how to spend quality time with the people you appreciate in your life, not about the security level of your data.

`resistance = sum(dilemmas) + sum(onboarding/required knowledge)`. Anything for which `reward < resistance` will probably not succeed. That's my calculus. Minimize choice, minimize required knowledge/onboarding, maximize reward. That is a winning combo. I don't think Matrix is optimizing for that. I guess we'll see in 5 years whether I have to find the flaw in my reasoning.
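The adoption heuristic above can be sketched as a toy function. All the weights here are made up for illustration; only the shape of the inequality comes from the comment.

```python
# Toy model of the adoption "calculus": adoption is plausible only when
# the reward outweighs the combined friction (dilemmas + onboarding).
def likely_to_adopt(dilemmas, onboarding_steps, reward):
    """dilemmas and onboarding_steps are lists of friction scores."""
    resistance = sum(dilemmas) + sum(onboarding_steps)
    return reward > resistance

# Signal-style onboarding: one app, phone-number signup, contacts auto-discovered.
print(likely_to_adopt(dilemmas=[0], onboarding_steps=[1, 1], reward=5))        # True
# Matrix-style onboarding: pick a client, pick a server, learn what federation is.
print(likely_to_adopt(dilemmas=[3, 2], onboarding_steps=[2, 2, 2], reward=5))  # False
```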


The flaw in your reasoning is that Matrix is like the web, and Element is like a browser. Just as mainstream folks don’t say “go look at my Chrome site”, but are smart enough to say “go look at my website with your web browser”, the same goes for Matrix too.

The only reason this doesn't happen yet is that Matrix clients like Element are still too geeky, and Joe Public doesn't care about the advantages of open decentralised e2ee comms. Our plan to fix that is to transform Element's UX: hide the decentralisation complexities and let it punch its own weight against the centralised alternatives - https://matrix.org/blog/2022/08/15/the-matrix-summer-special... has more details.

The user would still need to understand it’s talking on an open network though, because that’s what it is. But they don’t need to care that much about it.


FWIW, everybody has different results on Google. They vary between mobile and desktop, regionally, and can even be personalized.


Are you saying your results are different?


Yes. They can and will vary significantly. I'm in the UK FWIW.


SMS isn't an app as well, neither is Email.

But you're right, the site is pretty bad for Matrix


Not sorting by (and making visible) a popularity count, or highlighting a preferred client per platform, presents a potentially interested person with an arbitrary decision. Rather than "let's experiment and see if I like the default experience," the marketing would have needed to win me over enough to make the work of researching a good client appealing.

Googling "matrix client" leads you to the matrix website clients page rather than Element or the domain of the most popular client. I have to scroll downward before I even see Element. Element showing up after 2 "best 10 matrix apps" articles is an ecosystem failure and to me communicates immaturity.

When I see signs of immaturity, I immediately assume weak/untested security. I want to know other people are trusting something before I trust it. Signs of immaturity also make me cautious about making recommendations to someone else because I don't want to become someone else's tech support.

FWIW, if you google either e-mail or SMS, there are client(s) in the first 3 results for both.


There is also Session (getsession.org) which is a fork of Signal without phone numbers or centralization.


Too bad it is owned by a crypto company, Loki.


What's the issue with that?


where does Telegram fit in your opinion?

genuine question from someone oblivious to messaging advances in the last decade.


The e2e encryption protocol is the definition of letting someone who has just learned about Diffie-Hellman roll their own crypto. It's called MTProto, and version 2 mostly updates the padding and uses SHA-256 instead of SHA-1. Yes, SHA-1 was deprecated before Telegram even existed. No, version 2 is not better.

Cryptographers praise Signal because the protocol makes sense and because it's not run by someone as data-hungry as Meta or Alphabet (though I think it's hosted on AWS).

Threema is a good alternative if you want username/password, but it has fewer users (probably since it's a paid app) and fewer neat security properties (not even forward secrecy).

I agree Signal is not perfect and has never played the open source game very well (even under Moxie, reports from the community were largely ignored), and the MobileCoin move is weird. I also have not followed the direction the project has taken since Moxie left. However, the _entire_ code is open source (which IIRC is not the case with Telegram), the protocol makes sense (and has been extensively studied), and there are a lot of eyes on the development. I remember code changes that suggested a pivot away from using phone numbers as identifiers (i.e. maybe requiring them for registration but not showing them to everyone you talk to).

I wonder whether MLS will go anywhere and actual projects will adopt it. Last time I checked it did require consensus on message ordering, which seems to make it less well-suited for non-centralized protocols like Matrix, but we'll see.


Telegram only provides e2e encryption for one-to-one conversations, and only if you specifically create a "secret chat", largely for usability and discoverability reasons related to their major point of focus. It's probably better discussed in comparison to services like IRC, Matrix, Discord, or Slack that concentrate on feature-rich group chat with easy discoverability, organization, and mobility, for which encryption either does not exist or is an opt-in or bolt-on feature.

Services like Signal, Whatsapp, Keybase, or iMessage that provide e2e encryption for all chats, group or otherwise, (albeit with differing levels of implementation security) have chosen to do so at the expense of things like mobility of chat history across devices and the ability to easily discover and join new group chats and instead focus on a less organized, more ad-hoc form of messaging that's a rather different use case than Telegram's.


Doesn’t matrix also do all that with the added benefit of federated Id and pulling Chat history from an e2e group chat?


telegram "supports" e2e encryption, but it is frustrating to use and is not enabled by default


Last time I checked, it also doesn't work for group chats. Has that changed?


Tgm is more a database, rather than just a messenger. It's a centralised huge server-side searchable abyss. It is both its good and bad side. On the good side: if you're using it for public and non-sensitive things like running tech support, it's easily the best thing. Once you type a word and search - you'll get everything from the very distant past. It's very good for dev-ops activity. On the other hand - forget privacy: a phone number ID, server-decrypted chats, e2e hidden behind two menus and not available on some platforms (like Linux).


This article is worth a read on that front: https://www.wired.com/story/how-telegram-became-anti-faceboo...


Telegram is first and foremost a social network, and that type of service is incompatible with what we expect from a modern secure chat.


The reality is that there's room for a new messaging app that uses username, has good UI, and is secure. Perhaps wire is that app?


> It's really difficult to build a useable security product, and Signal has done it successfully.

I'd argue it hasn't. Signal still has no way of backing up your chat history (with photos, etc). Lose your phone and it's all gone forever. The PIN that the app annoyingly tells you to set up does not serve as an encryption key for your backups. There are no backups.

Once again, if your phone dies (this happened to me recently), all your data in Signal is gone forever. And there is no way to prevent that.

In this day and age, I consider this unacceptable. That is not a "useable security product".


> I consider this unacceptable.

On the other hand, I consider this a feature.

I'm not saying you are wrong, but I am saying different people have different ideas and requirements about how they want things like this to work.

For me, most of my Signal chats have disappearing messages enabled, to intentionally ensure there is no long-term archive of conversations (assuming you trust the other people not to be screenshotting everything). It gets you into the habit of saving messages that may be useful later (for me, mostly things like event details or addresses), with the benefit of making everybody in the conversation a little more inclined to treat it all as ephemeral and be somewhat more candid than you might be in SMS or email. Not _quite_ as candid as face to face in private, but closer.

There's a widely used and agreed on signal for most of my group chats, where setting disappearing messages to 5 minutes is understood to mean "juicy gossip or legal grey area chat is about to follow" and setting it back to 8 hours or 1 week means "OK, we're done with that discussion, back to regular chat".


> different people have different ideas and requirements about how they want things like this to work

Agreed. But I can't convince people and family to use Signal if I know that one day they will inevitably lose the pictures of their loved ones. Because that's how most people use communicator apps.


Let me show you this picture of my grandson....

opens Signal, scrolls for weeks

You could show them where the Save button is — that's how "most" people use messaging apps. Even "friends and family".


Signal on my Android phone makes an encrypted backup every day, this includes photos and I can copy the file off my phone if I desire (plus I point the backups to my microsd card which should still be good if the phone dies).


Run the desktop client on a Pi in a VNC session at home and automatically receive the identical messages - no problem!

Not a proper solution but a hacky workaround possibility.


"Data in Signal" isn't a thing.

Save messages and media you want to keep outside of your encrypted chats...


Those are features.


> I quite like Signal as they are and this "incident" demonstrates exactly what happens if a carrier gets compromised: nothing. Nothing happens. Signal decides not to trust any phone verifications from the period of compromise and requires affected numbers to reregister.

cool, but entire carriers being compromised has never been a concern. it's state agencies forcing carriers to compromise individuals.

>I don't understand why everyone wants Signal to be something it's not

we don't. we just warn people against using it. it's not a privacy tool, it's a larp toy like a commercial VPN.


Doesn't everyone get notified when your verification status changes? Don't you need to rescan people's security numbers or whatever they call it? If this is truly a gripe you have couldn't signal also add some sort of delay to the re-verification process so that device resets take weeks to be trusted and with lots of warning and opportunities for both parties to disengage before any hostile actor takes over?

As far as I'm aware, Signal is used by plenty of people who may be targeted by state agencies. Has there been even one "High value target apprehended because Signal" headline?


> Has there been even one "High value target apprehended because Signal" headline?

There have been a few "high value target apprehended because AN0M" headlines, and if you keep an eye out for it way more headlines/articles where you go "Yeah, that's totally another AN0M bust they just haven't publicly attributed it".

I also suspect (but have nothing more than suspicion to go on here) that companies like NSO can probably exploit phones deeply enough that they can exfiltrate screenshots of Signal. But that they do so at such high prices that it is rarely used and even more rarely hinted at publicly.


"...Twilio, the company that provides Signal with phone number verification services..."

Perhaps this is why Twilio (and Twilio-issued) VoIP numbers work so well for Signal when I don't want to use the number issued by my cellular carrier? Kinda hard to SIM-swap me if you don't know my real phone number.


Do they offer numbers you can use that way or do you just use their APIs with a minimal app?


I spin up a number and verify it can receive SMS (which it forwards to me via email). Since I need receive-only, this is fine. No need to futz with APIs or apps.


A lot of banks refuse to send SMSes to voip numbers. Google voice runs into this, and presumably twilio too.


>it was possible for them to attempt to register the phone numbers they accessed to another device using the SMS verification code

That's a thing? If my number expires and gets reassigned to someone else, and they register for Signal, I'll get locked out of my account just like that? And they'll start getting all the messages that were addressed to me?


It's not your account any more. The new owner gets "your" SMS and phone calls too. The identity is backed by the ownership of the number, not your person.

Importantly the safety number will change since it's a new device. If you care about stuff like this, verify the new device out of band and distrust any unexpected changes. Most people don't care and they still see a huge improvement over plain SMS.


Though note that your message history is still private, as you have to manually export and import the local message history whenever you get a new device.


Services like Signal and WhatsApp can use third-party services that notify them when a phone number is rotated (given to a new user). They should ideally be doing this; I cannot verify whether they are.

Second, Signal and other services have implemented secondary registration requirements such as a PIN, which they will require during a new device install or at other times.

Third, you can build models or crude business logic to identify when a number no longer appears to be used for a period of time. Carriers do not reassign a number immediately: reassigning a number that one user cancelled to another user is seldom done before a 90-day hibernation period.

I used to work at a cell provider.
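That third point could be sketched as a simple eligibility check. The 90-day window matches the figure above, but the function and field names are illustrative assumptions, not any carrier's real system.

```python
# Sketch of the "crude business logic": a number is safe to reassign only
# after a hibernation window with no observed activity since cancellation.
from datetime import datetime, timedelta

HIBERNATION = timedelta(days=90)  # assumed window, per the comment above

def eligible_for_reassignment(cancelled_at, last_activity, now):
    # A number must have been idle (no activity AND cancelled) for the
    # full hibernation period before it can go back into the pool.
    idle_since = max(cancelled_at, last_activity)
    return now - idle_since >= HIBERNATION

cancelled = datetime(2022, 4, 1)
print(eligible_for_reassignment(cancelled, cancelled, now=datetime(2022, 8, 1)))  # True
```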


Your account is tied to your phone number so pretty sure that’s the case, yep!


That sounds horrible. Would I be SoL even if I had ticked "Registration Lock" prior to that?


Apparently as long as you use Signal at least once every 7 days from a linked device, you should be good: https://support.signal.org/hc/en-us/articles/360007059792-Si...

Still, given that your number is used as a primary identifier, I'd avoid using it in that way for an extended amount of time. Among other things, I'm not sure if it's possible to re-register using just your phone number and PIN (but not access to SMS-OTPs on the associated phone number) in case you lose your own device, for example.


No, this is exactly the kind of thing registration lock is intended to address.

If you enabled Registration Lock, your account cannot be hijacked by sms, provided you’ve been actively using your signal account within the last week.

There’s an automated keep-alive for the case that you still have signal installed but haven’t been sending/receiving any messages.
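My reading of that behavior, sketched as pseudologic (this is an interpretation of Signal's documented registration lock, not actual Signal server code): an SMS code alone cannot re-register a locked account unless the lock has lapsed through seven days of inactivity.

```python
# Sketch of registration-lock gating: with the lock enabled, an attacker
# holding only the SMS code is blocked until 7 days of account inactivity.
from datetime import datetime, timedelta

LOCK_GRACE = timedelta(days=7)  # per Signal's documented inactivity window

def can_register_with_sms_only(lock_enabled, last_active, now):
    if not lock_enabled:
        return True                        # SMS code alone is sufficient
    return now - last_active > LOCK_GRACE  # lock lapses only after inactivity

now = datetime(2022, 8, 15)
print(can_register_with_sms_only(True, now - timedelta(days=2), now))   # False
print(can_register_with_sms_only(True, now - timedelta(days=30), now))  # True
```

The keep-alive mentioned above effectively keeps `last_active` recent as long as the app is installed, so an active user stays inside the protected window.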


I absolutely do not understand why I have to link my very sensitive Signal account to a very insecure and hard to change ID: my phone number (which can be traced to my identity in too many ways).

Why Signal does not allow fully anonymous IDs (like Threema does) is a mystery to me.

Signal is fine for most users, but it is inherently _unsafe_ for high-value sensitive communications where participants can expect targeted phishing attacks.


Anonymity isn't part of Signal's risk model. If you need to stay anonymous, then there are more suitable options.


It's not about that, it's pretty much the same as using a dynamic IP to authenticate you


Not really, a phone isn't assigned a random number from a pool every time you turn it on or reconnect to a tower, then given to other users ad hoc.

A static IP maybe, except the IP is portable to a new AS if/when you want to move to a new provider. It's even susceptible to a false BGP route =)


It is not about being anonymous (though this also could be nice in some situations), it is about identity theft and credentials theft. There are numerous ways to steal my phone number and then impersonate me on Signal. For me, it is not a big deal (though a dedicated hater can probably ruin my life with that). For many people in sensitive positions, this is literally a matter of life and death.


On average, stealing a phone number is much more difficult than stealing someone's password, because of the frequency of password reuse and data breaches.

If someone were to do that, it would be blocked by registration lock (which it prompts you to do). If they were to guess that, all your contacts would be notified that your identity has changed.


My phone number (and probably yours) are in the Facebook 2019-2021 leaks. These are easily downloadable.


You don't. Register with Signal using a temporary number.


I cannot do it in my country without physically going to some office and showing my passport. Doesn’t feel “temporary” to me.

SIM cloning is a thing. S7 hacking is a thing. Phone numbers are _insecure_ as IDs, as simple as that. Signal’s insistence to use nothing but phone numbers is somewhat suspicious these days.

(Both major competitors in secure messaging, Wire and Threema, allow pseudonymous temporary IDs in addition to phone numbers).


but if the temporary number gets recycled and somebody else uses it, can they re-register your account?


Cloudflare was attacked with the same attack but they were able to prevent any harm since they use hardware keys for 2-step instead of OTP.

This is a very simple phishing attack and I am surprised it has proven to be effective.


This incident points to something much more severe. What role did the employee(s) whose credentials were compromised have? How did those credentials allow even an employee to see plain-text auth codes being sent out to end users? Such a permission should be granted extremely sparingly.


I suspect many Twilio support reps need access to outgoing SMS, because manually looking over those will be an important component of handling a "someone is using your service for spamming" complaint.


I disagree. They would not need to access the full contents of outgoing SMS to perform this duty. For example they could see the auth codes masked.


How would Twilio know what portion of the outgoing SMS was auth codes?

Are you proposing they add an API where senders can annotate part of their message as private? (Not a bad idea...)


Are you familiar with their API? We use their SMS auth service at my employer. Twilio is the one composing the outbound message including auth code. The API caller is not providing Twilio with an auth code and phone number. Twilio 100% knows which portion of the outgoing SMS is the auth code.


sorry, not familiar with an Auth API. About 5 years ago I worked at a company that used their API, but we just used it as a service for sending texts to specific numbers. (And mostly we used different services, because it was more expensive than our other options)

Do we know that Signal was using the Twilio Auth product and not something custom on top of Twilio?


We do know. Check the texts you’ve gotten while signing into Signal. You’ll notice that they originate from short codes (like 22395) that are also used by other services like Discord, square pay, just to name two.

Furthermore, it still doesn’t matter whether Signal was using their authy service or not. There should be very tight data controls at Twilio where few employees would ever be able to retrieve clear text messages being sent to end users.

This incident is not getting nearly the attention it should imho.


Wouldn't the spammers then just mark 100% of their spam as private?


grep [0-9]+ should cover enough of the problem space.
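The masking idea floated in this subthread could look something like the sketch below: redact digit runs server-side before anything is shown to support staff, rather than trusting a sender-controlled "private" flag. This is a hypothetical illustration, not Twilio's actual tooling.

```python
# Redact likely auth codes (runs of 4+ digits) from SMS bodies before
# displaying them in a support console; short numbers pass through.
import re

def redact_for_support(sms_body):
    return re.sub(r"\d{4,}", lambda m: "#" * len(m.group()), sms_body)

print(redact_for_support("Your Signal verification code is 739201"))
# Your Signal verification code is ######
```

Support could still review message shape and sender patterns for spam complaints without ever seeing the secret itself.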


Totally agree. The message content should be private and not accessible by employees. Kind of scary when you think that so many 2FA codes are sent via Twilio.


Exactly. A malicious employee could log in as any user of popular services like WhatsApp, Telegram, and others that are SMS-auth only, simply by knowing which endpoint to hit to kick off an auth session. I hope I am not understanding this exploit correctly. This would be a massive failure on Twilio's part to allow employees access to the auth code.


The phishing against Twilio looks very much like the attempt on Cloudflare.

I wonder how many other companies have been successfully phished that we don't know about.


Yes, Cloudflare pointed out the similarity and they suppose it's a single attacker with multiple targets.

If you have access (I don't in my current role and I don't care enough to spend money to do this on my own account) you can ask a passive DNS system about names like cloudflare-okta.com that were used in the Cloudflare attack, identify patterns (same registrar, same hosting, that sort of thing) and also the IP addresses Cloudflare listed.

You should probably assume that anywhere which doesn't actually have FIDO or similar and was actively targeted is screwed, because it only takes one lapse to let the bad guys in.


It certainly makes me wonder: is Twilio more competent because they noticed a phishing success? I assume there are plenty of people at every company who could fall for social engineering scams, so I think it's safe to say most companies aren't totally safe -- they just haven't noticed a major breach.


What I want to know is: what is the ultimate aim behind the overlapping targets of Twilio, 1900 (and 3 specific) Signal users, and Cloudflare? Which product at Cloudflare, and which of those 1900 (and 3) users, were being targeted?


1900 is an impressively small upper bound of affected users, implying that Signal and Twilio got onto this very, very quickly. Seems like the response was swift. I would be interested in learning more about the process that led to such readiness at either company.


I gave signal my landline. They don't need SMS.


So that means for the duration of the attack active contacts and groups were exposed?


No. When you sign up to Signal, they send you a text message verification code. This is done via a service called "Twilio."

The attackers were able to view outgoing Twilio messages, so they could enter your number on the registration screen, read the code that Twilio sent to you, then use that code to complete the sign-up process.

Attackers were not able to view information about your current Signal account (if present) through this SMS service.


But attackers are able to impersonate you to your contacts no?


If an attacker successfully registered your phone number, and if your contact either never checked the Safety Number for your conversation† or they ignore the fact that Signal says the number has changed, then yes, the attacker would be able to impersonate you to that contact.

Twilio says 1900 Signal users are potentially affected (attackers saw the confirmation code or could have seen the confirmation code) so Signal disabled those accounts pending re-registration.

† For any particular conversation pair, Signal has a large unique number it calls a Safety Number calculated from the long term cryptographic identities, if you got a new phone (or if I'm pretending to be you on a new phone), messages you send will have a different number because the new phone won't know the old phone's cryptographic keys. The phone app can display the number (so you can compare them) or scan a QR code from another phone to check they match without the boring work of comparing numbers.
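The idea in the footnote can be sketched roughly as below. Signal's real derivation is more involved (iterated hashing of identity keys plus identifiers, rendered as 60 digits); this is a simplified illustration of the key property, namely that both parties compute the same number and it changes whenever a long-term key changes.

```python
# Hedged sketch of a safety-number-style fingerprint: a short comparable
# value derived from both parties' long-term public keys.
import hashlib

def fingerprint(my_pubkey: bytes, their_pubkey: bytes) -> str:
    # Sort the keys so both sides compute the identical number.
    material = b"".join(sorted([my_pubkey, their_pubkey]))
    digest = hashlib.sha256(material).digest()
    # Render as digit groups, loosely like the app's display.
    digits = "".join(str(b % 10) for b in digest[:15])
    return " ".join(digits[i:i + 5] for i in range(0, 15, 5))

alice, bob = b"alice-identity-key", b"bob-identity-key"
print(fingerprint(alice, bob) == fingerprint(bob, alice))  # True
```

An attacker who re-registers your number gets fresh keys, so every conversation partner computes a different fingerprint and the app flags the change.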


How much can one trust Signal for secure communication?


It won the Levchin Prize at Real World Crypto, and is essentially the gold standard for messaging cryptography. People on HN hate its ergonomics. You could reasonably prefer something else, but you can't reasonably call it untrustworthy.


I don't really know why, even. I get the account-vs-phone-number argument (though I prefer the phone number, as it is familiar to regular people in a way that usernames are not), but aside from that it has every feature I could want, and I feel like I am a fairly demanding user. It is my primary messaging app and I send hundreds of messages a day to my friends, both individually and in groups, on my phone and on my desktop.


Transacting in phone numbers doesn't feel right to people who have been soaking in message board and IRC culture for (in some cases) decades, even though it's absolutely natural to ordinary people (remember, WhatsApp was for a very long time the most popular messaging service in the world).

And those same people have very strong feelings about services where you can't build your own client from scratch, and a viscerally negative reaction to Signal's principle that open clients create a lowest-common-denominator anchor for security. I get that too, even though the point is kind of indisputable (see: what happened with Matrix E2E crypto).

Signal has a clear vision for how security and privacy work in messaging, and they're uncompromising about it. I deeply respect that, since there are a lot of things they could do to pick up Internet points that they don't do because they haven't worked out the privacy details yet. But opinions differ.


I get that (I grew up on IRC and with an ICQ# too) but it does massively simplify the onboarding process - both because it’s familiar to WhatsApp users (of which 99%+ of people who get signal would have previously had WhatsApp) and because you immediately know which of your contacts already have signal.


Is Signal a thing of the past yet? Do they still try to glue the word "security" and a phone number together? :-/


A phone number is fine for 99% of users.

They recommend that users with higher-than-average security requirements set a PIN, which removes phone number attacks from the threat model, but you're still dependent on the security of SGX.

Users with extreme security requirements can set a 42 character alphanumeric PIN, thus also excluding SGX from the picture, but at that point you're getting owned no matter what you do.


A PIN does not detach you from your provider, country and misery of having a cell phone, let alone a chain of possible points of SMS interception.


"All users can rest assured that their message history, contact lists, profile information, whom they'd blocked, and other personal data remain private and secure and were not affected."

I do not understand how you can re-register someone's account to a new phone and not have the data read. If it is re-registered successfully, then you should be able to login. If you can login, you can see the data...right?


The registration means "from now on, any messages sent to [phone number] will be delivered to this device", it's not logging into an account.


No, because a new key is generated every time you reinstall/register the app, and your contacts are sent along (hashed) when you log in. So there shouldn't be a way to see anything shared in the past.
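The hashed contact upload can be sketched like this (a toy illustration with hypothetical names; Signal's real contact discovery is more involved and now runs inside SGX enclaves):

```python
import hashlib

def contact_token(phone_number: str) -> str:
    """Illustrative sketch: normalize a phone number, then upload a
    truncated hash of it rather than the raw number."""
    normalized = "+" + "".join(ch for ch in phone_number if ch.isdigit())
    return hashlib.sha256(normalized.encode()).hexdigest()[:20]

# The server can match tokens between users without seeing raw numbers.
# Caveat: the phone-number space is small enough to brute-force such
# hashes, which is why plain hashing alone is considered insufficient.
```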


assigning accounts to numbers is the dumbest thing. I remember when I got a new phone number a few years ago I managed to log in to someone else's venmo account. Numbers are like dynamic IPs; why anyone would use them to authenticate you is beyond me.


>Numbers are like dynamic IPs

Maybe for you. For other people who have had the same phone number for years or decades, they're one of the most persistent forms of communication or identification available.


The ability to transfer phone numbers when changing mobile providers has been around for a very long time in the US, but it wasn't the case in some other countries until recently.


Even then personally I moved abroad so many times that it just doesn't make sense to use phone numbers. Not everyone stays in the same place all their life


Yes, it's a horribly dated idea, as-is receiving any kind of 2FA code via SMS.


Is it not possible to use an Authenticator app for Signal, given their privacy setup?


Signal uses phone numbers as (the only) unique identifier in their system currently so SMS (or phone call) is necessary to verify the device owns the number.

They've been talking about moving away from phone numbers as identifiers for a while and have implemented features like account pins that head in that direction but it hasn't happened yet.
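For illustration, a minimal expiring-code verification flow (all names hypothetical; Signal's real registration service is more involved) might look like:

```python
import secrets
import time

class VerificationStore:
    """Toy sketch of SMS verification: issue a short code per phone
    number and only accept it within a time-to-live window."""

    def __init__(self, ttl_seconds: int = 300):
        self.ttl = ttl_seconds
        self.codes = {}  # phone -> (code, issued_at)

    def issue(self, phone: str, now=None) -> str:
        code = f"{secrets.randbelow(1_000_000):06d}"
        self.codes[phone] = (code, time.time() if now is None else now)
        return code  # in reality, sent to the phone via SMS (e.g. Twilio)

    def verify(self, phone: str, attempt: str, now=None) -> bool:
        entry = self.codes.get(phone)
        if entry is None:
            return False
        code, issued = entry
        now = time.time() if now is None else now
        if now - issued > self.ttl:
            del self.codes[phone]  # expired codes are discarded
            return False
        # Constant-time comparison to avoid leaking the code via timing.
        return secrets.compare_digest(code, attempt)
```

The short TTL is why an attacker who can read the SMS stream (as in the Twilio incident) only gets a brief window per victim to complete a fraudulent registration.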


Using a phone number as an identifier wouldn’t preclude using TFA.


They want to know the phone numbers. Despite everyone telling moxie that it's not secure from the day signal was started.


Please, stop using phone numbers. There is no reliable way to hold a phone number. The messaging protocols are insecure. If your service uses phone numbers or SMS, that means it's not secure or reliable.


Not only that, I don't want any service that I use tied to a phone number. Partially for the reasons you listed, but also because there are better alternatives; email, authenticator apps, physical keys, cards, etc.

I hate looking at my phone. I hate using my phone. I don't want to have even more reasons to keep my phone charged and in my hand. Phones suck.


The Signal desktop app doesn't require your phone to be turned on (once it's been "paired") by the way, as opposed to for example Whatsapp.


Well sure, but does it require me to use my phone at some point in the process of account creation? That's the part I have an issue with.

I had assumed that there would be an alternative to access the service post-creation. My gripe is more with the fact that a phone is a requirement at any point.

To put it another way: imagine I had to send a letter to Signal's HQ in order to make an account. Now, obviously I'm not going to be sending and receiving letters constantly to/from Signal, but the mere fact that writing and sending a letter would be a requirement as part of the process would be to the user's personal detriment. That's the point I was trying to get at - that it forces the user into a specific method which is undesired, arbitrary, and frustratingly unnecessary.


Yeah agreed, as I mentioned elsewhere I don't see why phone number should be required. It's a good default (esp. for non-technical users) to use phone number and contacts (= "social network"), but I really don't see what the problem is with using email or just username/password combo and then adding contacts manually from wherever. Why not have that option? Most people would likely still use their phone number for this.


Whatsapp has finally gotten away from requiring your phone to be on. It works the same way as the Signal app now.


Ah, didn't know that, but you're right (just tried it)! That's cool. I ran into this issue around a year ago when my phone broke.


By desktop app you mean a website cosplaying as a desktop app.


I didn't know Signal had a web app.

Yes, I'm aware the desktop app is made with Electron. So what? I keep it running almost all the time and I've never had any performance issues with it.


Well, it's slow, it has awful accessibility, you can't create accounts from a computer, and you will have performance issues with it if you need the resources for something else.

Also it won't work on linux phones.


What identity token would you prefer?

Would it be bound to the mobile device in any way?

Would it require that a canonical list of registered identities be stored server-side?

How would you impose a cost on spam accounts without burdening users?

Just a few considerations.


Nice try, FBI


Yes, Signal’s phone number requirement is bad. But, given that, the fact that they don’t store any messages on their side and everything is client side is still a huge benefit over a lot of other apps and still a huge step forward for privacy! Criticism is definitely important but I just wanted to put that out there that all things considered, Signal is still very much a good thing.


I refuse to use or recommend Signal due to blatantly bad design choices that put people that need privacy most at risk like security researchers, journalists, abortion seekers, or dissidents.

If you learn a contact phone number then you can buy their location history. Requiring phone numbers and requiring you share them with everyone you contact is brain dead.

This alone is bad enough to abandon Signal but then consider they have centralized control of client binaries, and metadata protection anchored on centralized SGX they can trivially access. This negligent design makes them vulnerable to coercion or even court orders if any judge realizes they actually -can- decrypt messages and dump metadata.

Matrix supports Signal crypto but in a federated network with no PII requirements like Signal. Also no lock-in or central control of apps.


>I refuse to use or recommend Signal due to blatantly bad design choices that put people that need privacy most at risk like security researchers, journalists, abortion seekers, or dissidents.

I understand your concerns, and if I was a security researcher, journalist, abortion seeker or dissident, I wouldn't use Signal either.

But, like the vast majority of us, I am not any of those things. As such, for my (and most others) use case, Signal is great.

For those at risk from highly motivated and/or state-level actors, Signal isn't nearly enough. Nor, unless you build and run your own servers and clients (and never screw up your OpSec), is Matrix.

Signal isn't perfect. However, for most people, it's good enough.

Don't make perfect the enemy of the good. Because perfect doesn't exist.


Those of us that do not need high privacy today might need it tomorrow, or maybe someone we frequently communicate with.

We also have a responsibility to favor tools and practices that make those that really need privacy not stand out.

Element or other Matrix clients are easy to use and lack the serious flaws I outlined for Signal.


I'd point out that for most people (I suppose that could change, and I wouldn't be upset if such changes resulted in better privacy), messaging is often phone-based and includes folks who use secure methods like Signal and Matrix as well as those who use iMessage and OEM SMS clients.

When it comes to that sort of messaging ("I'm running a few minutes late and will meet you inside the restaurant," or similar) I don't (and won't) separate those out. I just use Signal for all such messages.

Which makes for inconvenience when people (especially iPhone users) install Signal and still use iMessage.

I'd add that if I have something to discuss that I don't want recorded (don't forget that it's not just your device that puts you at risk; anyone who's received such messages does so as well), I'll use encrypted voice calls (with the assumption -- valid or not -- that the other participant(s) aren't recording the conversation) with Signal or Matrix.

In both my personal and professional life, I've always made sure to only put in writing that which I wouldn't care if it was shared with the world.

Which is no different than it's ever been. I'm not sure why anyone thinks this is a new thing or that somehow "technology" obviates the need for good OpSec. It never did and still doesn't.


Matrix bridges support iMessage and SMS and even Signal so you do not need to fragment your communications to multiple apps if you do not want to.


> if I was a security researcher, journalist, abortion seeker or dissident, I wouldn't use Signal either.

I mean that's really bad, right? Supposedly Signal is the go-to alternative to doing things the hard way (e.g. GPG over email), but apparently it's just not good enough for those with the highest security needs. Given that the alternative is that these people go back to using extremely brittle software, shouldn't someone do something about that?

The implication of course is that Signal should do something about that, because they already have the user base and are in a position to adopt user identifiers that are not based on phone numbers.


>I mean that's really bad, right? Supposedly Signal is the go-to alternative to doing things the hard way (e.g. GPG over email), but apparently it's just not good enough for those with the highest security needs.

Is it? If that's what you got from my comment, then I certainly didn't communicate my thoughts clearly.

Signal is great for what it is. And that's as a centralized encrypted messaging platform that's easy to use.

Have some tradeoffs been made (e.g., not strictly p2p, some data is stored, in encrypted form, on their servers, etc.) in making Signal easy to use? Yes.

For the majority of folks, Signal is more than good enough.

AFAICT, Signal has been quite successful in that space.

However, if you're the target of motivated folks and/or state-level actors, any product that relies on third-party interaction of any kind is suspect. For that use case, you need more.

Why is it Signal's responsibility to do that? Should they be held responsible for the (lack of) OpSec[0] of others, whether they're Signal users or not?

I mean, I get it. Why should you (or anyone else) have to do any work (other than download this handy app) to protect yourself, especially if your communications are of interest to motivated hostile adversaries?

The whole "telephone number as identifier" bit, and the network discovery it provides, is the primary reason Signal has had the level of adoption it has. And is something of a red herring in this case, IMHO[1].

And Signal is a centralized service. All messages (stored until they're delivered) and metadata (the stuff that Signal stores for each user) are stored on Signal's servers.

Storing anything on systems accessible to the Internet is risky. Plain text is much worse than encrypted blobs, but there's definitely still a non-zero risk.

That alone makes it unsuitable for those whose life may depend on their ability to maintain secure communications.

There are other tools that folks can use (a bunch of folks have mentioned Matrix, which is great too), but which should be either fully p2p or privately hosted/managed on hardware under one's physical control, again assuming that you might be harassed, imprisoned or killed for your communications.

But for most of us, myself included, Signal is more than good enough.

[0] https://en.wikipedia.org/wiki/Operations_security

[1] Since Signal is a centralized service, they need a mechanism(s) to identify their users. In some respects, using a phone number for that purpose is sub-optimal, but it doesn't really impact the security of messages sent through the service, nor does it attach (other than an optional photo and other information voluntarily provided by the user) any information that could be used to personally identify the user in question. As such, even if the encrypted blobs were to be accessed and decrypted, they wouldn't be all that useful anyway, except as a self-selected list (those who have registered with Signal) of phone numbers. I'm not sure how much of an issue that is for most folks, given that dozens, perhaps hundreds of other organizations (almost all of whom don't give a rat's ass about your security) have your phone number associated with your name, your address, your shopping/browsing/travel habits and a raft of other PII.


> The whole "telephone number as identifier" bit, and the network discovery it provides, is the primary reason Signal has had the level of adoption it has.

I agree. Signal absolutely should not abandon this. Rather, it should add other user identifiers that can be used alongside phone numbers.

> Storing anything on systems accessible to the Internet is risky. Plain text is much worse than encrypted blobs, but there's definitely still a non-zero risk. That alone makes it unsuitable for those whose life may depend on their ability to maintain secure communications.

I strongly disagree. A lot of critical work has been done with email + PGP, and that's about as leaky (in terms of metadata) as it gets. Obviously there are use cases where you do worry about this, but "storing data on the Internet" is not as such always a problem for those who need the highest security guarantees. Signal adopting alternative user identifiers would open it to use in some of these extreme cases, but would not (of course) make it usable in every situation - and that's okay.


>I strongly disagree. A lot of critical work has been done with email + PGP, and that's about as leaky (in terms of metadata) as it gets. Obviously there are use cases where you do worry about this, but "storing data on the Internet" is not as such always a problem for those who need the highest security guarantees. Signal adopting alternative user identifiers would open it to use in some of these extreme cases, but would not (of course) make it usable in every situation - and that's okay.

I'm old school. If it's connected to the Internet, eventually it will be compromised.

Yes, strong encryption can (and does) make data compromise immensely more difficult in terms of time and resources (much longer than our star will exist -- about five billion years -- which isn't really that big a deal, since the Earth will be uninhabitable in a billion years or so), but once that centralized server(s) is compromised, all bets are off.

I don't disagree that strong encryption is a valuable tool for maintaining data privacy and integrity, but it absolutely does not reduce the risk to zero.


With Matrix you can use F-Droid build of the client. And you don't really need to trust the server too much, right? Maybe it's not enough for Snowden, but it's better.

I'm not saying "don't use Signal", in fact I still recommend it to non technical people, since it's just much simpler. But pointing at the flaws is a necessary requirement for them to be fixed


>With Matrix you can use F-Droid build of the client. And you don't really need to trust the server too much, right? Maybe it's not enough for Snowden, but it's better.

And why should I trust F-Droid's build of Matrix over Signal's build of Signal?

Please understand, I agree with your point. Mine was orthogonal: If you are under threat from motivated and/or state-level actors, using someone else's servers (or clients, for that matter) is a bad idea.

And that includes Matrix. I run my own Matrix server and the users of that server can interact (especially via voice/video) without any fear of being intercepted -- even by me.

What's more, I can't decrypt the conversations folks have in Matrix "rooms" that don't include me without a long process of brute-forcing.

No one is coming to my house to confiscate my server. That scenario is much more likely with a public/commercial service hosted at a data center/cloud provider.

So yes, Matrix is likely more secure than Signal if, and only if you build and install your servers and clients from source with a compiler/linker you built yourself using a trusted tool chain on hardware whose components you've personally confirmed to be free of compromise[0].

[0] https://users.ece.cmu.edu/~ganger/712.fall02/papers/p761-tho...


I like Matrix, but I admit its E2EE rooms seem to leak more metadata (users in the room, reactions, maybe replies, display names, avatars) than Signal.


They leak metadata to the operators of the server. So does Signal, albeit anchored to SGX nodes they pretend they cannot access. Signal also has phone numbers, making it even worse.

With matrix at least you can pick a server operator you trust to provide your metadata to, or host a server yourself.


What a terrible argument. You might as well just switch to Telegram


>What a terrible argument. You might as well just switch to Telegram

Should I? What specific features of Telegram make it superior to Signal?


How does signal allow you to learn someone's phone number from message history? As far as I understand the only thing one can learn by inspecting signal's protocol is that:

1. generally, a certain phone number uses signal

(1) happens once, upon registration of your phone number. You don't see history of which phone numbers are communicating, do you?

In other words, you don't need Signal to buy someone's location history. You just need their phone number and Signal doesn't particularly provide that to you.

Per my understanding, Signal provides subpoena/nation-state resistant level security and is in fact used by many people with high security needs.


Sadly the protections you mention are only true if we ignore the last decade of security research and dragnet surveillance activity. Metadata protection in Signal has major asterisks they do not like to talk about which could be activated covertly by warrant, threats, or money.

1. Google, Apple, or Signal could compile a malicious Signal binary that generates weak keys and deliver it to specific users, or all users, via app stores.

2. Signal sysadmins or third party datacenter techs could use any of a pile of SGX exploits to dump all their centralized metadata in plain text.

3. Signal aggregates all IP metadata to one place making it easy for their cloud providers and ISPs to work out who is talking to who.

4. Carriers see SMS activations and know who uses Signal. They also know all of the cellular data IPs. An entity that buys this along with data from other ISPs would quickly learn the identities of most conversation participants and their current locations. Enrich that with data widely sold from drivers license office and you also get race, home address, etc, etc.

Centralized PII requiring services that claim to be promoting security and privacy should be met with extreme scrutiny.


Okay...

(1) is abstractly possible for any software and always has been. Signal cannot directly send my phone a bespoke binary... I got it through the app store. If that were allowed it would be Apple's or Google's breach of the model, not Signal's.

(2) there were SGX vulnerabilities, yes, but they've been patched and Signal is no longer vulnerable (in the one instance where they were), no?

(3) citation please, these are IP logs for conversations?

(4) this is not Signal's problem to solve. If you buy into what Signal offers, you're saying it's okay that my carrier knows that I registered with Signal because that's all they know. Being able to inspect IP headers for traffic on the internet is possible regardless of the software you're using. If you don't trust the internet with your communications then you need to take them off the internet... I don't know what else to say.

Further, typically companies can't be compelled to do something like (1) because it represents an undue burden on operation of their business. This is why Apple refused to give the FBI a bespoke build of iOS that bypassed the pin code. Not to mention the loss of business when people find out that a breach of trust had happened. Also I thought Signal had reproducible builds in every instance possible.

Idk, it sounds like you really shouldn't use any software you didn't write yourself and hardware you didn't build yourself and network where you don't trust every single node if your threat model involves IP logs and hardware tampering and targeted malicious software... that is hardly practical by any stretch of the imagination.


I agree that Signal does have several questionable design decisions, but that's not one of them. You can get a sim, register with it, and take it back out. There, no location. Or even better, you can simply get a voip number.

Bottom-line, despite Signal's issues it is still the #1 IM app that I recommend to "normal people" seeking to have private conversations. No, it's not perfect, yes, it's a massive improvement over facebook/instagram/whatsapp/telegram/etc.


You cannot buy a SIM without KYC in almost all countries. Also most users will not realize these consequences and will just assume the defaults on Signal protect them with their everyday phone number and SIM.

Also facebook/instagram/whatsapp/telegram/etc are not trying to advertise themselves for the high risk use cases Signal is actively promoted for. I obviously do not recommend anyone use those either, regardless.

Matrix is all I suggest for most people.


> You can not buy a sim without KYC in almost all countries.

I'd be curious to see stats on this. At least in the US, it is very easy to buy a SIM and sign up for a pre-paid plan with zero KYC.


The US is actually the only exception I am aware of world wide which gives us a distorted view of this problem.


Unless things have changed in the last few years, there are apparently countries in Europe that don't require registration: https://www.reddit.com/r/europe/comments/9ziqfi/european_cou...

And that's a quite high regulation part of the world, I'd be surprised if South American or African countries were stricter.


Requiring SIM registration is nearly universal outside of Europe and NA.

https://www.comparitech.com/blog/vpn-privacy/sim-card-regist...


Great link, thanks. Interesting that my intuition was off, although I suppose it makes sense that regulations are loosest in countries with the strongest speech and privacy protections. Although I'm skeptical of how well the rules on paper are enforced in some of the countries listed as requiring registration. I have seen SIMs for sale at roadside stalls in a couple countries listed as requiring registration, and I don't think they were checking ID...

But it looks like the official answer is 36:

> Those without any SIM-card registration requirements are Bosnia and Herzegovina, Canada, Cabo Verde, Comoros, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, Iceland, Ireland, Israel, Kiribati, Latvia, Liechtenstein, Lithuania, Maldives, Malta, Marshall Islands, Micronesia, Moldova, Namibia, the Netherlands, New Zealand, Nicaragua, the Philippines, Portugal, Romania, Serbia, Slovenia, Sweden, the United Kingdom, the United States, and Vanuatu.


In some places, the sale itself doesn't require anything, but you need to go to the provider and show an ID to activate.


You need a government issued ID to get mobile service many places. You can’t just get a SIM in the same way you get a burner email address.


> still the #1 IM app that I recommend to "normal people"

What app do you recommend to HN types? (I'm getting ready to switch messaging platforms. All my friends use iMessage and I'm so tired of typing on my phone at them. They can be lured over to something else with the promise of encryption.)


Try Element.

Effectively the same crypto as Signal but you can be anonymous as needed. Also decentralized with many app options.


And if Element is not the desired application to use matrix, then there are plenty of others, and available across many device and OS platforms: https://matrix.org/clients/

...Of course, Element remains the oldest and likely still most feature-full app.


I haven't really looked into Matrix. I appreciate the nudge!


Beyond any research that you undertake around Matrix, I would add: while it's not new (having existed for several years now), its popularity has increased quite a bit in the last year or two, so in my opinion it's still early days, although it's evolved quite nicely. Some rough edges might be encountered - more so on the client/app side, less on the protocol side - but as the superfan that I am, I really feel it represents the future of distributed messaging, of which one use case is chat. Good luck!


Aren’t the apps reproducible? Meaning, if the open source part does not match the binaries then it could be a canary for compromise.

Mind you, the last time I looked there were no alternate implementations of the Signal protocol, and even the usage of libsignald was frustrating due to continuous backwards-compatibility breakage. I would love for a proper libpurple implementation.


> abortion seekers

Uhhh, not sure what koolaid you've swallowed, but including them in that list is almost laughable.



I agree none of this is laughable.

Regardless of your position, please read the entire article that you referenced for the facts (pre-RvW overturn, pregnancy at 23~28 (?) weeks, took Pregnot, buried in back yard, Nebraska law was at that time 20 weeks).

The Vice article seems to have quite a lot more facts and references. https://www.vice.com/en/article/n7zevd/this-is-the-data-face...


I did read it.

The point is that the mother is being charged with aiding her daughter to have an abortion due to evidence collected from chats they thought were secure, but which were still susceptible to a warrant.

Now, there are other charges. And at the time the abortion happened, the 20 week ban wasn't being enforced, because the state knew it wouldn't hold up under Roe (it's only illegal and chargeable now, with the court having overturned Roe). So, yes, it's super interesting.

But the point is that police are charging someone for aiding an abortion due to texts the sender thought were secure. That's the entire relevancy. Anything else about this particular incident isn't germane.


I wonder how the people putting "abortion seekers" on such lists would feel if I included "self-defense rights advocates" for people 3d printing guns or smuggling them in from abroad on similar lists.


I'd wonder, if it's for self-defense, why you didn't buy your firearm legally, since, you know, it's legal to do so. I haven't done a deep dive, but as far as I can tell in most cases it's legal to 3d print too, though admittedly that's something that there are some semi-serious efforts to change.

In other words I'd suspect the classification of "self defense advocate" to be a self serving branding effort since there are legal ways to accomplish the same, but I wouldn't doubt the need of this person for a secure messaging platform.


>I'd wonder if it's for self defense why you didn't buy your firearm legally, since, you know, it's legal to do so.

Outside of the United States, that's usually not the case. Even if countries do allow private gun ownership, the restrictions on how to obtain them (and what they can legally be used for, what kinds are available, etc.) are exceptionally onerous.

And even within the United States, there are individual states that have attempted to severely curtail private firearm ownership. Were it not for certain Supreme Court decisions, handgun ownership would be outright illegal in the District of Columbia and likely in several other states.


> Even if countries do allow private gun ownership, the restrictions on how to obtain them (and what they can legally be used for, what kinds are available, etc.) are exceptionally onerous

Citation needed. I, and probably the majority of the citizens of those countries, do not consider the standard test/psych eval/background check/random checks in the future to make sure you're following the rules to be "exceptionally onerous". And I think most non-Americans would agree that adding some friction to a fringe case (owning a personal firearm for protection or fun is not something most people do, even in the US) is worth it if it nearly eliminates blatant misuses of firearms - whether making suicides easier and more terminal, enabling easier revenge murders, or making your average school/public place shooting easier.

What would you consider a just middle ground between "onerous requirements" and "everyone can buy any weapon without any requirements but paying for it"?


>Citation needed. I, and probably the majority of the citizens of those countries do not consider the standard test/psych eval/background check/random checks in the future to make sure you're following the rules to be "exceptionally onerous".

Just because you've accepted the boot on your neck doesn't make it not a boot. When (not if) a currently free and democratic Western nation decides to be not so democratic anymore (whether due to invasion, international pressure from economic partners like Russia and China, or just that the assholes in power decided to seize even more power) the citizens (or rather subjects) of those countries will have no means of fighting back. You can already see it with several countries' response to covid.

>What would you consider a just middle ground between "onerous requirements" and "everyone can buy any weapon without any requirements but paying for it"?

My feelings on gun control can be summed up as "I want mail order rocket launchers delivered to my doorstep." The state should fear its people, not the other way around, and the best way to ensure that is to give the people the means to put a bullet (or several) into any would-be tyrants.

And, regardless of what "the majority of citizens" feel about bootlicking and trampling on their own natural rights, advances in home manufacturing are quickly making any efforts to do so a pipedream.


> I want mail order rocket launchers delivered to my doorstep. The state should fear its people, not the other way around, and the best way to ensure that is to give the people the means to put a bullet (or several) into any would-be tyrants.

You should look into France and its protest culture. When the people are unhappy with the government's actions, they go out on the street and protest - without any weapons, this being a civilized country where violence is only a last resort. And you know what? Governments listen and adapt, even without the fear of direct death.

So I find your premise wrong to begin with. There is no natural right to murder, so I disagree that owning a weapon is a natural right.

And I find it extremely funny that the country so proud of its "everyone should be armed so the government is afraid of the people" culture has such dysfunctional governments that act against the people's interests so often. Where are the armed uprisings against the Patriot Act, civil asset forfeiture, racist abuse, abortion restrictions, failures to combat climate change, or money wasted on useless wars abroad? No? When, then?


>My feelings on gun control can be summed up as "I want mail order rocket launchers delivered to my doorstep."

I don't know. I'm a believer in extreme gun rights as well, but giving people the power to own rocket launching systems like MANPADS just seems a bit dangerous.


You already have the legal ability to own a rocket launcher - it's not any different from any other "destructive device". The main barrier to ownership is finding someone willing to sell you one, and the price they would likely ask for it. There are rich collectors in US who own tanks (with active turret), artillery etc - mostly older stuff, but still plenty destructive.


A quick google suggests that if you want your tank to have working guns and/or turret you need a Federal Destructive Device Permit, which includes a background check and ATF approval.


Destructive devices are NFA items, true. If you want to own a DD, you need to pay the $200 federal transfer tax, which is done by submitting a form to the ATF and getting a tax stamp from them.

It's not a "permit", though. And there are no special limits on who can own one - if you can legally own a gun, you can legally own a DD or any other NFA item. One doesn't even need to be a US citizen or a permanent resident for that, even people on student and work visas can do it.


This is the site that I was looking at: https://nationalfirearmsact.com/nfa-regulated-items/destruct..., which seems to spell out the background check as a requirement. It also says you need to be a resident of the US. Am I missing something?


Background check is a requirement for regular gun sales as well, except private person-to-person sales - in practice, this is the vast majority of transfers.

ATF can be more thorough with NFA items because the law doesn't have a limit on how long they can look at you, unlike those regular NICS checks which have a hard limit - but the list of things that makes one ineligible to own is the same.

As far as residency, you have to be a resident somewhere in US, but you don't need to be a permanent resident / green card. A student or a work visa is good enough, combined with proof of current residency (such as utility bill with your name and address).

This isn't quite what OP asked for, of course - you can't have one "shipped to your doorstep" - but this is also true for most regular firearms (there's a collector license that enables this for some old guns).


If it's a choice between wearing a mask at the grocery store and the idiot next door blowing up my house with their mail order rocket launcher, I'll take the mask. If that makes me a bootlicker so be it I suppose.


I have a suspicion that I already know, but why are you jumping to a non-sequitur about masks? I tend to agree with the user to whom you're responding on this particular issue, and I still wear a mask in places such as public transit, enclosed spaces, etc.

So...I guess my point is that you don't _have_ to choose between masks and gun rights. I'm unsure of why you would bring it up.


From the comment I'm replying to:

> You can already see it with several countries' response to covid.

Perhaps the commenter was going for something else, but at least where I'm at we've had two straight years of people insisting masks are muzzles, an infringement on our god-given rights, and the beginning of a slippery slope to tyranny. Since they didn't spell out what specifically about the "response to Covid" they intended to solve with a mail-order rocket launcher of all things, I was left to interpret for myself.


Fair enough, I suppose. I immediately thought of (what I consider to be) excessive lockdowns and enforcement in countries like Australia, but I can see how you went to masks.


It certainly could be. But even then I'm not sure I consider measures attempting to control a pandemic the height of tyranny. I realize this might sound like I'm pro-lockdown; I'm not. I actually think most countries completely botched their handling, with measures insufficient to meaningfully slow the actual spread while restrictive enough to cause significant damage in their own right.

I also take issue with the idea of gun (or rocket launcher) ownership as a means of prevention. I mean, look at the top countries for (citizen) gun ownership. Sure, you've got the US, Serbia, Canada, Uruguay, and Finland up there - not bad. But you've also got Yemen at #3 and Lebanon at #11. If that's the kind of "freedom" private gun ownership ensures, then I'm not buying.

And again I'm not even that pro gun control. I think you should be required to get a background check to get one, I think you should be required to be trained on their use and safety, and I think you should be required to take reasonable measures to protect your firearms against theft. I'd say that's it, but I suppose I'm also against mail order rocket launchers. But for the most part, having met those requirements I think you should be able to buy what you want (within reason, again let's skip the rocket launcher). But as protection against government tyranny? Doubt.


> Outside of the United States, that's usually not the case.

A good and fair point. I'd fallen into the trap of being too US centric on HN.

> Were it not for certain Supreme Court decisions, handgun ownership would be outright illegal in the District of Columbia and likely in several other states.

Sure, were it not for the Supreme Court. But as there remains plenty of ways to legally obtain guns in the US, I'm still going to doubt that you've resorted to gun smuggling for "self defense"


I would have no problem seeing that included on such lists either. I have friends who hunt and I myself enjoy shooting on a range once in a while. I also know single parents that live alone in sketchy areas that are well trained and level headed enough to trust with firearms for home self defense.

There are almost always reasonable uses of many services and tools we tend to have knee-jerk-ban reactions to as a society.


How do you figure? Several states had abortion laws that were never repealed and others have trigger laws on the books that have gone into effect or will shortly, so yes you can be prosecuted for abortion now. Hence the need for privacy.


I wouldn't even call it bad. In many ways it's good, actually. It's good because it allows Signal to build a product that is relevant and usable. Phone numbers only connect people and are a bridge to allow all the perfect crypto to do the legwork. The knee-jerk "phone bad" reaction is understandable, sure. But I don't think it's warranted for Signal. Signal would look like Keybase without phone numbers. Keybase (or their key exchange UX, at least) is great. But it's not the same product.


People use telegram more and more which is based on usernames...


> don’t store any messages on their side

Google Play services are still required for the official builds because of the (not really verifiable) encrypted backups.

> everything is client-side

Signal's FOSS fork developers would disagree. They got outright legal problems after they wanted to implement an open source alternative. Most APIs in regards to contact management are server-side. There's Molly as a younger fork but I'm waiting for Signal to write them also a cease and desist letter.

Honestly this is why I think that Signal should be treated the same like WhatsApp. Supposedly end to end encrypted, but only until you suddenly have the FBI with printed out chats in front of your door.

As long as Signal uses proprietary services and contains proprietary blobs in their (default aka Play store-provided) app we have to treat it as an unsecure messaging system.

Especially given the RCEs that it had in the past, where it was as simple as injecting an HTML with a script tag to install malware on your system.


> Google Play services are still required for the official builds because of the (not really verifiable) encrypted backups.

??? I use the official build with encrypted backups without Google services, and have been doing so for at least 3 years. I've been forward-carrying my backup since 2016, too.


We still don't know of anyone with their messages printed out. Signal probably doesn't have all the contacts hoarded on their server, while WA certainly does. For me, there is a big difference between "might be buggy" and "certainly is privacy hostile".

I still prefer Matrix, but Signal is clearly the next best thing for chats. And it also has quite a number of non-HN users :)


I will admit that this requirement always confused me. What is there to benefit from by requiring it?


It means that Signal doesn't need you to create or upload a list of your contacts; it uses the existing contact list from your phone. This also lets you use Signal to replace the default text messaging app on Android, automatically upgrading conversations to be encrypted when possible. This in turn means that just using Signal to communicate with someone becomes a normal, everyday activity, and less of a sign of suspicious activity (from the point of view of law enforcement, etc.).


I think the problem is that it's a requirement, not a feature you can choose to use. I'd be more inclined to use Signal if I choose to use only a user/pass. Just need a block function.


How would other people contact you?


With your username?


How do you text a username?


By opening Signal, putting in the username, and sending a message. Signal only uses SMS/MMS as a fallback when communicating with someone not on Signal; when both parties are on Signal, SMS/MMS is not used. Presumably they are OK with not being able to communicate with people not on Signal.


Why do you need to text? Sending messages without the arcane UX and unreliability of old-school SMS, or the attempts to improve it (RCS), is simply better.


You do know that sharing your contact list is optional?


On an iPhone, what does 'sharing your contact list' imply ?

Does the app get just name and phone numbers or all the meta data like address and personal notes that I put into my contacts ? I haven't been able to figure this out - does anyone know what Apple's policy is on this ?


Contact Notes, specifically, require [0] a special entitlement to access them, so normal chat apps should never have access to them on iOS.

All other fields, for all contacts, are accessible once Contacts access is granted.

[0] https://developer.apple.com/documentation/bundleresources/en...


I believe they have covered this question many times before, but I don’t see an answer on signal’s website. From memory, it had to do with not wanting to own the user’s contact list. Using a phone number allowed them to rely on a contact list on the users phone, which is not tied to the signal account. There was more nuance than that though.


> it had to do with not wanting to own the user’s contact list. Using a phone number allowed them to rely on a contact list on the users phone, which is not tied to the signal account.

That doesn't make any sense. Signal did the total opposite. It started keeping sensitive user data in the cloud including your name, your photo, your phone number, and a list of your contacts. It stores that data on their servers permanently.


The list of your contacts bit is patently false, they've discussed in detail about how they securely organize contact lists: https://signal.org/blog/contact-discovery/

Signal has always kept your name/pic/etc on their servers I believe, because otherwise you turn signal into a P2P application, which it is not. It's a fully encrypted application that stores minimal information. It is NOT P2P.

For example, your messages are stored on their servers until they're delivered.


> The list of your contacts bit is patently false,

You are wrong and your blog post from 2014 doesn't take into account their new data collection practices. See: https://community.signalusers.org/t/proper-secure-value-secu...

If this is the first time you're hearing about the data collection, that should tell you everything you need to know about how trustworthy Signal is.

> Signal has always kept your name/pic/etc on their servers I believe

Wrong again I'm afraid. There really was a time when Signal didn't collect and store any user data on their servers. They've repeatedly bragged about times when governments have come around asking them for data and they were able to turn the feds away because that data was never collected in the first place. That changed with the update which added pins. Today, Signal now collects that very same data.


I don't think this is true, do you have a source?

They store registered users phone numbers and allow discovery by making a request with a hashed version of the phone numbers on your contact list. They add an extra layer to allow attestation of the software doing this using Intel's secure enclave. They give many examples of responding to warrants with only whether the number has been registered and the timestamp of registration, which they explain is the only information they hold.

Private Contact Discovery: https://signal.org/blog/private-contact-discovery/
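To see why the enclave and attestation layers matter at all, here is a toy version of hash-based discovery (illustrative only - this is the naive scheme the blog post argues against, not Signal's actual protocol):

```python
import hashlib

def discovery_token(phone_number: str) -> str:
    # Hash a phone number for a hypothetical discovery request.
    # Signal runs the real comparison inside an SGX enclave precisely
    # because a bare hash of a phone number is trivially reversible.
    return hashlib.sha256(phone_number.encode()).hexdigest()[:20]

token = discovery_token("+15555550123")

# The space of plausible numbers is small enough to enumerate,
# so hashing alone doesn't hide them from the server:
rainbow = {discovery_token(f"+1555555{n:04d}"): f"+1555555{n:04d}"
           for n in range(10_000)}
print(rainbow[token])  # prints "+15555550123"
```

The tiny keyspace of phone numbers is the whole problem: any party holding the hashes can rebuild the numbers, which is why the blog post leans on attested enclave code rather than hashing.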



There's a horrible conflation of concepts here. A pretty big one.

When people talk about cloud services, they generally mean part of an application that runs on the cloud that participates as a trusted actor in the application's trust model.

What people in the linked thread are realizing is that "signal has a server" and they are confused because they thought signal didn't have a server, or something.

So, what's important about Signals servers is that, outside of initial key exchange which is verified by two parties out of band, they are not a trusted entity, ever. When you send a message it goes through signals servers. When you sync your profile picture with other devices, same thing. The data transits signals servers. This is made possible because of cryptography. By encrypting the data in a way that is indecipherable by 3rd parties (Signal's servers included) your data is isomorphic to random noise. So, the only thing Signal needs to do is route the random noise to the right place. If it doesn't do that, it's a denial of service and about the only attack you're vulnerable to if you use Signal. Otherwise, the receiver gets the exact random noise that you sent, but only they can make sense of it because of the miracle of cryptography.

If you're really going to throw a fit because Signal syncs a profile picture between your devices using the same level of crypto as is used for messaging, then you're honestly crazy.

No. Signal did not "not have a cloud" and now they "have a cloud". Not by any reasonable interpretation of the events.


Signal has a "cloud": a server where they collect and store your name, your phone number, your photo, and a list of every person you've contacted using Signal. That data isn't some ephemeral encrypted string that is only present when you "sync your profile picture" or when you send a message. It is collected and stored on their server, where it will sit for at least as long as you have an account.

The justification for it was so that you could get a new device and have Signal download all of your info from your Signal's server down to your device. The data collection first takes place as soon as you set a pin or opt out of setting one (at which point a pin is assigned for you automatically).

The data is encrypted, but that does not make it impossible for signal or for 3rd parties to access it. see: https://community.signalusers.org/t/proper-secure-value-secu...

If you're a whistleblower or an activist, a list of every person you've been contacting using Signal is a highly sensitive data. No matter how you want to spin it, Signal is hosting that highly sensitive user data on their servers where Signal and 3rd parties alike could possibly gain access to them.


You should assume every bit of information sent on the internet is archived in a massive warehouse somewhere, because it is.

Thus, we have to trust the cryptography itself. Sending an encrypted message to a peer is no different from sending an encrypted message to yourself (other than the use of symmetric vs asymmetric crypto). The fact that you send a message to yourself which is stored persistently on signal's server doesn't change anything (and it's even opt in AFAIU). Sure, there are concerns about the implementation, but until someone can decrypt the blobs in storage (the crypto is broken) I don't see reason for outrage.

Pretty simply, if you don't trust the crypto then you have a very different threat model to pretty much everyone else. If you don't trust crypto you can't use the internet because you can't use TLS. You're relegated to networks where you trust every single node (where you don't need crypto) and other such stuff. Most of us trust the crypto because it's really the only practical option. I don't see the problem.


> You should assume every bit of information sent on the internet is archived in a massive warehouse somewhere, because it is.

Leaving aside the whataboutism here, you shouldn't assume that when you're using a secure messaging app that claims to be designed to never collect or store user data. Signal makes that claim at the start of their privacy policy and it is a lie. It started out true, but they began collecting data and they refuse to update their policy.

> Thus, we have to trust the cryptography itself.

No one is suggesting we can't trust cryptography. The fact is that it doesn't matter how strong your algorithm is when you're encrypting that data with a 4-digit number. You can 100% "trust the cryptography" and still acknowledge that it won't take very long for someone to brute-force your PIN and recover your data in plaintext.
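To put numbers on that: a 4-digit PIN has only 10,000 possibilities. A minimal sketch (stand-in primitives - Signal layers Argon2 and SGX rate limiting on top; the point is only the size of the keyspace):

```python
import hashlib, os

ITERATIONS = 1_000  # deliberately low so the demo runs fast

def derive_key(pin: str, salt: bytes) -> bytes:
    # Stand-in KDF; the real scheme uses Argon2, but no amount of
    # key stretching rescues a 10,000-entry keyspace.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, ITERATIONS)

salt = os.urandom(16)
target = derive_key("1234", salt)  # key derived from the user's PIN

# Trying every possible 4-digit PIN takes seconds, not centuries:
found = next(f"{n:04d}" for n in range(10_000)
             if derive_key(f"{n:04d}", salt) == target)
print(found)  # prints "1234"
```

Without an external rate limit (which is exactly the role Signal assigns to the SGX enclave), offline brute force of a 4-digit PIN is trivial on any hardware.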

> Sending an encrypted message to a peer is no different from sending an encrypted message to yourself... (and it's even opt in AFAIU).

This has nothing to do with "sending data to yourself" and everything to do with Signal collecting data from you and storing it for itself. There is a massive difference between encrypting something yourself and sending that data to yourself, and someone else copying data from you, encrypting it, and saving it for themselves.

This data collection is also not opt in. At all. You can opt out of setting a pin, but if you do, one will be automatically generated for you and your data still gets silently uploaded to Signal's servers to be stored. The community spent months begging for Signal to add a way to opt out of this data collection, but they were ignored.

See:

https://community.signalusers.org/t/dont-want-pin-dont-want-...

https://community.signalusers.org/t/mandatory-pin-without-cl...

> Pretty simply, if you don't trust the crypto then you have a very different threat model

"The crypto" isn't the problem here. The problem is Signal collecting sensitive user data and permanently storing it on their servers in a manner that could allow it to be accessed by third parties and then not clearly disclosing that to their users and refusing to update their privacy policy to reflect the change.


Signal can't possibly read the data. How is that for itself? Only you can decrypt it! Signal doesn't have your data. They have garbage bits of effectively random noise.

You can prove it to yourself. Go take one of Signal's servers and try to find someone else's data there. You won't.

Why would Signal update their privacy policy to reflect the desire of misguided fear mongers? I certainly wouldn't do that if I were them.


> Signal can't possibly read the data.

They literally can. If you can brute force a 4 digit pin, you can access any of the data protected by a 4 digit pin. Some pins are longer, but it's notable that even after a lot of backlash they continue to push for "pins" and not "passwords" knowing that many will continue to use a simple four digit number.

> You can prove it to yourself. Go take one of Signal's servers and try to find someone else's data there. You won't.

um... what?

> Why would Signal update their privacy policy

To accurately reflect the data they collect and how it is used? So that they don't lie to their users by making claims that are demonstrably false? To notify whistleblowers and activists that their information and the information of those who they are in contact with could be discovered by state actors who can force Signal to give them access? There's three good reasons right there.

I'm sorry you're so upset by this. I know the reality is uncomfortable but that doesn't make it "fear mongering". I honestly wish it wasn't true. I wish they weren't collecting user data, I wish they were doing more to secure what they do collect, and most of all I wish they were honest and forthcoming about what they are doing, but wishes can't change what is. I hope that regardless of if you use Signal or not, you'll try to accept facts even when they aren't easy to accept.


Let me make this clear: if the data is stored in a way that Signal's service cannot decipher it, then it's not collected by any reasonable definition of "collected". In order for Signal to collect it they would have to obtain it, which they don't, and can't, do.

This term isn't just some loose word to be thrown around and abused on message boards. If we take your definition of collected where handling encrypted data is collecting it, then "the internet" collects all data. Uh oh.

What signal does is route encrypted messages between principals in a system. That's all they do. They don't collect personal information. Read their subpoena responses, they publish all of them.


> Let me make this clear: if the data is stored in a way that Signal's service cannot decipher it, then it's not collected by any reasonable definition of "collected".

I think this is misguided, and confuses the truth. Data collected and stored remotely is being "collected and stored remotely" regardless of how well protected it is.

I will however concede that it is possible to design a system where data is encrypted on a device and then uploaded to the cloud in such a way that simply having that encrypted data on a remote server doesn't put that data at risk. Signal did not design their system in that way.

> If we take your definition of collected where handling encrypted data is collecting it, then "the internet" collects all data. Uh oh.

Again, this isn't about handling encrypted data - it's about the long term storage of highly sensitive but encrypted data - and as I said above, even that is fine if it's done correctly. Signal has done a poor job of designing their system which leaves user's data at risk.

> What signal does is route encrypted messages between principals in a system. That's all they do.

That used to be "all they do". Then, about two years ago, they decided they wanted everyone to have profiles kept in the cloud. As soon as you install the software, before you try to send any message to anyone, you're asked to provide a PIN to secure your data. Once you set one (or opt out of setting it yourself), it collects a bunch of data from your device (not needed for routing anything - remember, you've just installed the app and are not trying to send or receive any message at this time), encrypts it on your device using the PIN, and uploads it to their cloud. That data can be recovered by you (or anyone else, for that matter) by providing the PIN that you set. The data they just collected and stored is not used to transmit, route, or deliver messages; this collection takes place in addition to any information needed temporarily for those purposes.

> Read their subpoena responses, they publish all of them.

That's incorrect. They publish the ones they are allowed to publish under the law (look up "national security letters" for more info) and their refusal to provide one agency with data says nothing about the requests they are forced to comply with. Their favorite examples involve cases where Signal was unable to hand over the data because they didn't collect it in the first place. Today, because of changes in their data collection practices, they now collect exactly the kinds of data they were not collecting before and were therefore unable to provide.

It's unlikely that Signal would be compelled by a standard subpoena to brute force their users pins to access the encrypted data. It is far more likely that the data is already being collected by an agency on-site, and that the data collection is continuous and ongoing (look up "Room 641A" for an example of on-site data collection by the state).

The fact that it is unlikely that Signal would be compelled by a standard subpoena to brute force their users pins does not mean:

- Signal employees can't do it themselves any time they feel like it.

- State actors can't do it whenever they feel like it

- A hacker couldn't gain access to a server and do it

Because of the sensitive nature of the messages sent over the platform, and because they have explicitly marketed themselves to vulnerable groups like whistleblowers and activists it is critical that Signal be honest about the risks of using their software. They insist they don't collect any data, while in practice they do. They say they secure the data they have, in practice that data is exposed by way of multiple vulnerabilities that could very well endanger the freedom or even the lives of the people using Signal.


Can you link to the implementation? I'll agree that a 4 digit pin is rather egregious and trivially crackable. I don't know a single serious cryptographer that would allow such nonsense which is why your comment sounds so unbelievable. I thought they were blending the pin with some device-local entropy to make a reasonably strong key. I'd like to verify your claim.


Basically, they planned to get around much of the problem by depending on a far-from-secure "secure enclave" to make up for a lack of basic sound security practices.

The scheme they came up with to store user data in the cloud was described here: https://signal.org/blog/secure-value-recovery/

The code is here: https://github.com/signalapp/SecureValueRecovery

This site does a pretty good job of explaining why this isn't a good design: https://palant.info/2020/06/16/does-signals-secure-value-rec...

I'm sure I've linked to it already, but please review the discussion here as well: https://community.signalusers.org/t/sgx-cacheout-sgaxe-attac...

Even more details here: https://community.signalusers.org/t/wiki-faq-signal-pin-svr-...


They definitely do not encrypt your data with a 4-digit pin. They use Argon2 (a slow hash - not that it matters much here, since the security depends largely on the entropy) to derive a 32-byte key. Then they derive subkeys: an auth key, and part of a final encryption key. The other part of the encryption key is 32 bytes of entropy. You store your entropy in an SGX enclave with a limited number of attempts allowed, to combat the possibility of a weak pin.
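The derivation just described can be sketched like this (stand-in primitives: real SVR stretches the PIN with Argon2 and uses proper HKDF; here PBKDF2 and HMAC play those roles, and the labels are made up):

```python
import hashlib, hmac, os

def stretch(pin: str, salt: bytes) -> bytes:
    # Slow hash of the PIN -> 32-byte master key (Argon2 stand-in).
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def subkey(master: bytes, label: bytes) -> bytes:
    # Derive an independent subkey per purpose (HKDF stand-in).
    return hmac.new(master, label, hashlib.sha256).digest()

salt = os.urandom(16)
master = stretch("982465", salt)        # from the user's PIN
auth_key = subkey(master, b"auth")      # authenticates you to SVR
key_part = subkey(master, b"enc")       # client half of the data key

enclave_entropy = os.urandom(32)        # the half held by the enclave
data_key = hashlib.sha256(key_part + enclave_entropy).digest()
# A correct PIN guess alone yields nothing without the enclave's
# 32 bytes - that's the part the rate-limited enclave protects.
```

So the full data key only exists when the PIN-derived half meets the enclave-held half, which is why the whole argument reduces to whether you trust SGX to actually enforce the attempt limit.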

Few things:

1. The vulnerabilities in question for SGX have been patched, only one of which affected Signal at all.

2. Signal preemptively combats any future speculative execution vulns by adding "don't speculate about this next branch" instructions before every single branch.

3. nit: SVR is a scheme to store the 256 bits of entropy in the cloud, not the actual user data. It's unclear from those links whether Signal has actually deployed the "store encrypted contacts" portion.

4. It is concerning that the security of this entropy is tied to Intel's SGX implementation.

5. If you use a strong password, which security nuts would, none of this matters.

6. If you turn off your pin, none of this happens at all (so it's at least opt out but IIRC setting a pin was optional).

7. I don't find your interpretation particularly charitable to the truth of what's actually happened. It's incredibly reactionary.

I will give you:

1. The trust model for Signal has changed to include a dependence on a piece of Signal cloud to enforce a rate limit on (really access to) escrowed entropy IFF you use a weak pin.

2. There does seem to be unnecessary confusion surrounding this whole thing.

What bothers me reading through this is that it was never made clear to users that the security model would change if you enabled a weak pin - in other words, that the strength of your pin/password is now important if you don't/can't/won't trust Signal+Intel. If that had been made clear there would be no issues at all: concerned citizens would simply disable their pin and deal with the not-improved UX, or choose a strong pin such that the entropy escrow SVR thing is entirely moot.

I don't think they need to update their privacy policy or user agreement to reflect these technical implementation details, though, as I've previously stated.

Moxie blames the poor reception on not having analytics. I'd say they should have known, it's pretty obvious you can't pretend you don't need a password and try to hide it from users if you want to add stuff that needs a password, like usernames. But I also know from first hand experience how difficult it is to just sit there and say "whelp, we can't build this thing that will make many users happy and make the product better because it isn't perfect".

What's sad is actually that this is all in service of enabling username messaging and dropping the phone number requirement which is exactly what everyone is yelling about. So it's like, they listen to feedback from people who want to use Signal without a phone number requirement. Then they build the thing that lets them take a crack at the nut. And then they get reamed by HN for having the audacity to try and build a secure solution to a problem that largely only exists on HN and only for Signal (nobody gives a shit that every other app under the sun just stores your contacts in plaintext). Must really suck to get that kind of response.

I'll probably go turn off my pin. I have no interest in signal managing my contacts.


I did oversimplify their encryption scheme, but the issue is that in the end you still only need a pin to get the unencrypted data. I agree that if they'd been honest about passwords and the need for a strong one this wouldn't be as big an issue. It's because they were not honest that I don't think it's fair to expect their users (even the security nuts) to do it. Their target demographic will include whistleblowers and journalists who aren't necessarily all that tech-savvy.

The strengths and weaknesses of SGX are debatable, I may lean on the pessimistic side, but as you say it impacts the security model of Signal users and to me that means they (and new users) should be clearly informed. The first line of their privacy policy says "Signal is designed to never collect or store any sensitive information." which is demonstrably false.

As for opting out, unless something has changed they still store your data on the cloud, it's just handled differently:

https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...

I don't know what options someone has after they've already created a pin, or whether there's a way to remove your data from the cloud. I stopped using Signal before they forced the pin (back when you could still just ignore the notice), and getting real answers to these kinds of basic questions is way more difficult than it should be. This is, again, a service targeting very vulnerable people whose lives and freedom may be on the line.

I was one of those Signal users who wanted them to move away from requiring a phone number too. That said, what I was looking for was something more like Jami. They managed to create a system with usernames and passwords but without phone numbers or accounts keeping your data in the cloud.

I'm not shitting on Signal's efforts overall. A lot of great work went into Signal and I'm pissed I still haven't found a good replacement for it, but the changes they made hurt the security and safety of the people who depend on Signal. They are a massive intelligence target and I can't blame them for anything they were forced to do, and if their goal was to subtly drive people away by raising a bunch of red flags I thank them, but if this is their best effort at communication and building trust how charitable can they expect us to be when two years later so many of their users don't have a clear idea of what's being collected and stored or what that means for their safety?


I just wanted to thank you for the information and the ensuing thread. Very interesting.


>That doesn't make any sense. Signal did the total opposite. It started keeping sensitive user data in the cloud including your name, your photo, your phone number, and a list of your contacts. It stores that data on their servers permanently.

This is the first I've heard of that. And if it's true, it's a big problem.

Is there any documentation of this behavior that you can direct me to?


You aren't alone. There are a ton of people who have no idea Signal has been collecting and storing sensitive user data on their servers. There was a ton of discussion about it when the update rolled out and a lot of backlash from their users, which they ignored. They've since refused to update their privacy policy as well which I personally see as a canary warning users to avoid their service.

https://community.signalusers.org/t/proper-secure-value-secu...

https://community.signalusers.org/t/what-contact-info-does-t...

https://community.signalusers.org/t/can-signal-please-update...

https://community.signalusers.org/t/dont-want-pin-dont-want-...

https://community.signalusers.org/t/sgx-cacheout-sgaxe-attac...


>You aren't alone. There are a ton of people who have no idea Signal has been collecting and storing sensitive user data on their servers. There was a ton of discussion about it when the update rolled out and a lot of backlash from their users, which they ignored. They've since refused to update their privacy policy as well which I personally see as a canary warning users to avoid their service.

Edit: This bit is apparently not the case. And more's the pity.

====Section affected by edit=========

I can't (and wouldn't try to) speak for anyone else, but if you disable the PIN functionality[0], Signal doesn't upload the information you're talking about.

==== End section affected by edit=========

Which isn't a new change (IIUC, PIN disablement was introduced ~2 years ago). I'd say that using the PIN functionality should be opt-in rather than opt-out, so in that respect I agree.

Further, Signal should probably update their policy documents to reflect the current state of affairs.

But I stand by my previous comment[1].

[0] https://support.signal.org/hc/en-us/articles/360007059792#pi...

[1] https://news.ycombinator.com/item?id=32474579


> I can't (and wouldn't try to) speak for anyone else, but if you disable the PIN functionality[0], Signal doesn't upload the information you're talking about.

This is also incorrect. If you opt out of setting a pin, Signal creates a pin for you and uses that to encrypt the data it uploads to their servers. Again, not your fault. Signal has gone out of their way to avoid answering direct questions about this in a plain way.

See: https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...


This is absolutely NOT true. (1) Signal doesn't store your contacts, and (2) Signal only stores a name and a profile photo if you want, and in a secure way https://signal.org/blog/signal-profiles-beta/


I'm sorry to be the one to tell you, but Signal 100% stores your contacts. They keep your name, your photo, your phone number, and a list of every person you've contacted using Signal. That data is permanently stored on their servers.

See: https://community.signalusers.org/t/can-signal-please-update...

> "This should be updated for the recent changes where contacts are uploaded to Signal’s servers and stored permanently along with Groups V2 and other data, protected by a 4-digit minimum PIN and Intel SGX – there have been concerns 5 raised 2 in these forums, particularly if one of your contacts chooses a brute-forceable PIN which in the context of an Intel SGX vulnerability 1 could leak a lot of contact data if hacked, even if you choose a strong password."

See the two links cited in that comment for more information on why it isn't actually stored in a "secure" way.


Why not just store a contact list of usernames on the phone though?


What would this list contain? You don't have a signal username. If you did you'd have to claim it somehow (degenerates to email or phone verification). It's not that simple.

Using phone numbers allows signal to plug into the existing state of the world and leverage it to upgrade the security of messaging for everyone who uses it. The one compromise is that it treats phone number as a short identifier (importantly, not cryptographic, it uses real crypto for that).

If you don't use phone numbers, your product would look more like Keybase. You have to somehow facilitate key exchange between people in a way that's actually usable. Otherwise all your security benefits go out the window because nobody uses your product. Signal understands this nuance perfectly which is why they're a successful product.


If I don't use Signal, then I'm not going to keep a list of my friends' Signal usernames on my phone.

If I subsequently sign-up for Signal, then I have no way to discover which of them use Signal - short of contacting them via some other method and asking for their Signal username, if any.

By making the Signal username the same as the user's phone number, I actually DO have a list of Signal 'usernames' on my phone already. As soon as I sign-up, I can send my list of friends' phone numbers to Signal and they can tell me which of those people have Signal accounts.
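That discovery step could be sketched like this (a toy illustration with a hypothetical hashed lookup, not Signal's actual protocol — Signal runs private contact discovery inside an SGX enclave precisely because plain hashes of phone numbers are trivially reversible):

```python
import hashlib

def discover(my_contacts, registered_numbers):
    """Naive contact discovery: the client hashes the phone numbers in
    its address book and learns which ones match registered users.
    (Toy sketch only; the phone-number space is small enough that these
    hashes can be brute-forced, which is why a real design needs more.)"""
    registered = {hashlib.sha256(n.encode()).hexdigest()
                  for n in registered_numbers}
    return [n for n in my_contacts
            if hashlib.sha256(n.encode()).hexdigest() in registered]

# One of my two contacts is already on the service:
print(discover(["+15551234567", "+15559876543"], ["+15551234567"]))
# ['+15551234567']
```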


It's the easiest anti-spam measure, because it makes it expensive enough that spammers won't have a million accounts. Fake account detection is effectively put on the back of mobile carriers


Without it, you wouldn’t be able to see which of your contacts are on signal.

And then nobody would use Signal.

It’s very unfashionable today, but they decided to not let perfect be the enemy of good.


While that is nice, I see no reason to require that. Some people just don't care about that feature.


What is this future UX you're imagining? How does the future solve the contacts book/short identifiers problem?


I'm not saying it's an amazing experience or that it solves the problem systematically. Again, some people simply don't need these features. You can literally just take part of the public key and that's it. That is totally fine for some use-cases.


Then use urbit. It already exists.


Yeah, but the whole point of Signal is to allow very secure communication. With little effort they could allow this use-case. It would address a major criticism, and they already have the underlying infrastructure.

But I guess you can just keep moving the goal post.


I don't understand how that's moving the goal post. Urbit developed a novel way to phonetically encode larger amounts of entropy than people are used to dealing with in order to build a network where your cryptographic identifier is your namespace and prime identity. You can spin up an urbit ship/planet and securely message anybody on the network using that short identifier. You suggested just using part of somebody's public key as an identifier directly. Urbit basically does that and a whole lot more.

I suspect it would be rather trivial to cut out the phone parts of Signal and have a UI where you paste in the first 8 characters of pubkeys and it matches those. Why not try building it?
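That prefix-matching idea could be sketched as follows (purely hypothetical — this is not an existing Signal feature, and the function name is made up):

```python
def match_pubkey_prefix(prefix: str, known_pubkeys: list[str]) -> list[str]:
    """Toy sketch of 'paste in the first 8 hex chars of a pubkey'.
    8 hex characters is only 32 bits of the key, so collisions are
    possible; a real design would need longer prefixes or out-of-band
    verification of the full key."""
    prefix = prefix.lower()
    return [k for k in known_pubkeys if k.lower().startswith(prefix)]

keys = ["a1b2c3d4e5f6a7b8c9d0", "ffeeddccbbaa99887766"]
print(match_pubkey_prefix("a1b2c3d4", keys))  # matches only the first key
```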


I know nothing of Urbit.

My point was just that there is a simple technical solution that Signal could apply if they wanted to make people happy who have no phone number and its moving the goal post to say 'use some other app'.


Claiming that the solution is simple is the problem. Signal actually has been trying to add usernames for years. It involves an account, contacts book, trusting Signal, trusting Intel, SGX remote attestation, Raft, and passpins. They are getting reamed for it because it's not really possible to treat a 4 digit pin as a strong password but they're trying to do it anyway. It inverts the whole value prop of Signal on its head. That's my point here. The reason I'm saying use something else is because it's not simple like you claim. And it changes Signal's model enough that people are leaving Signal because it's not what they signed up for.

Put simply, telling Signal to add usernames is like telling the existing users to "use something else" because that's what Signal must turn itself into to satisfy the "I need usernames right now" crowd.


Again, I'm not talking about usernames, I'm talking about public keys.


Well you should try: https://getsession.org. It's a fork of Signal with pubkeys instead of phone numbers.


I mostly use Matrix already. I prefer the way it works and I prefer its approach.


The difference between Matrix and Signal is that my mom can use Signal


Just like this website. Usernames. Easy peasy.


And how do you claim a username?


By knowing the password?


Where do you put the passwords DB?


Besides fighting spam accounts, finding and connecting with others is easier. No more needing to ask somebody what their account name / friend code / ICQ number / whatever UID a system uses to add them to your contacts. It's not a good user experience to type in someone's sometimes insane username.

Meanwhile, someone in my contacts that installs Signal automatically sees my name pop up and can start chatting. Far easier, and helps drive adoption.


Except the reality is that I'm asked for a phone number to add people (well, on WhatsApp here in Europe), and most screen names people choose can be communicated verbally, while phone numbers often have people resorting to handing around the phone displaying the number.


There's no need to make it a requirement for this. Matrix has similar functionality, but doesn't require that you add your phone, just makes it an option.


They trade your privacy for not having to figure out some technical issues.

Also, when they rolled out their cryptocurrency payment system (after keeping the server-side source code secret for more than a year, during which Moxie, who is a paid advisor to that same cryptocurrency, denied they were working on a payment system), they got KYC for free.


Less spam. A phone number is a much better deterrent against opening a hundred accounts than let’s say an email address.


Using the phone number as the identity. You need to verify ownership of the phone number to prove your identity. I guess they could use email as an alternative.


Besides the technical reasons, tech companies are valued for their access to contact information, and Signal has had huge investments made in it. This despite the obvious fact that it is highly unlikely Signal would benefit directly from that data in any way. But much of Signal's design is taken from the companies that came before, which did.


It also pushes fraud detection up the pipeline to mobile operators.


Making it more expensive to create spam accounts, I'd guess.


Maybe this will make Signal re-think their hard requirement of a phone number to register for Signal.

...eh, who am I kidding?


I think they are. I just also think the problem is a lot harder than people give it credit for. If they just go with a standard username (as in some form of a database lookup) then I'll be upset. But I'll be upset because this effectively doesn't solve any issue, and introduces others that have big privacy impacts and requires Signal to be a trusted source (which is antithetical to Signal's proposed mission).

I do wish Signal would be more transparent though. Given that this is such a difficult problem and they've been struggling with it for years, it stands to reason that it's time to seek help. This is like when a student just studies for hours and hours on end, spinning their wheels. They aren't learning anything. At some point you need a tutor, help from a friend, or to ask the professor for help. (Or the analogous situation at work, if that's a better fit.)

Signal, it is time to be transparent.


Signal is a trusted source already – you trust them telling you which number is which user.


Sorry, let me clarify. We should trust Signal as little as possible. That's how the design should work. Zero trust is very hard to create but let's minimize it.

Opening up usernames (in the conventional sense) you will end up needing to verify and act as an arbiter. This is due to the fact that certain usernames have real world meaning behind them and you don't want to create an ecosystem where it is really easy to honeypot whistleblowing platforms (how do you ensure that CNN gets {CNN, CNN-News, CNNNews, etc}?). They've suggested that this might be the case given that the "Notes to self" and "Signal" users are verified with a blue checkmark. The issue is that verifying users not only makes Signal a more trusted platform but the act of verification requires obtaining more information about the user. It also creates special users. All things I'm very against. I'd rather hand out my phone number than have Signal collecting this type of information. So yeah, it isn't completely trustless, but I certainly don't want to move in the direction of requiring more trust.


You aren't supposed to trust Signal on that; you are supposed to verify it out-of-band using Safety Numbers.


Signal does not tell me which number is which user. I know which number is who myself. The most Signal does is presumably warn me when the key associated with the number changed (eg. new phone).

And that's where I have to trust Signal, but as a protocol not a "trusted source" of information.


> I do wish Signal would be more transparent though.

You mean like updating their privacy policy to explain that they are keeping sensitive user data in the cloud? They refuse. There are people in this very discussion who are (or were at least) unaware that Signal is collecting and permanently storing user data on their servers. Signal's communication on what they're collecting and how has been a total joke. I cannot consider them trustworthy and at this point I suspect that refusing to update their privacy policy is a giant dead canary intended to warn users away from their product.


Please stop spreading objectively inaccurate FUD in the thread.


What part of what I said was inaccurate?


What user data is being stored in the cloud? Can they decrypt it?


The information they collect includes your name, your photo, your phone number, and a list of all the people you've been in contact with using Signal.

They encrypt it using a pin which they ask you to set or one they generate for you. They (and anyone else) can decrypt that data by brute forcing what is often just a 4 digit number.
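To make the brute-force concern concrete: a 4-digit PIN gives only 10,000 possibilities, so if an attacker can run the key derivation offline (i.e. if the SGX-enforced rate limiting fails), exhausting the space is trivial. A toy sketch with an assumed KDF — not Signal's actual construction, which uses Argon2 plus a secret held inside the enclave:

```python
import hashlib

def derive_key(pin: str, salt: bytes) -> bytes:
    # Toy KDF for illustration; Signal's Secure Value Recovery actually
    # combines Argon2 with an enclave-held secret, which is the part
    # that is supposed to make exactly this offline attack impossible.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100)

salt = b"per-user-salt"
target = derive_key("4821", salt)  # stand-in for the victim's derived key

# Without enclave-enforced rate limiting, the whole 4-digit space
# is only 10,000 KDF evaluations:
recovered = next(p for p in (f"{i:04d}" for i in range(10_000))
                 if derive_key(p, salt) == target)
print(recovered)  # 4821
```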


Can you show me this? I'd guess it would have to be in the server code, so where is it? I don't see any real information about this on their site. Their website suggests that this information is stored locally

https://support.signal.org/hc/en-us/articles/360007459591-Si...


I know. I really wish they had a simple page that explains everything. They've gone to great lengths to be confusing about what they're collecting and how. There were several complaints about misleading communication when this change rolled out. Examples:

https://community.signalusers.org/t/dont-want-pin-dont-want-...

https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...

They've never updated their privacy policy to cover their new data collection practices even after requests (see https://community.signalusers.org/t/can-signal-please-update...)

The scheme they came up with to store user data in the cloud was described here: https://signal.org/blog/secure-value-recovery/

The code is here: https://github.com/signalapp/SecureValueRecovery

This site does a pretty good job of explaining why this isn't a good design: https://palant.info/2020/06/16/does-signals-secure-value-rec...

Lots of discussion here: https://community.signalusers.org/t/proper-secure-value-secu... and also https://community.signalusers.org/t/sgx-cacheout-sgaxe-attac...

Even more details here: https://community.signalusers.org/t/wiki-faq-signal-pin-svr-...

The major change was that pins were mandatory initially but the UX was so bad they had to give people a way to opt out of setting a pin. There is no opting out of having your profile/contacts data collected and stored in the cloud though.


Link to the lines of code on the server that store the data in the way you describe or stop slinging mud.

You linked to a repository that uses Intel SGX which is used in this instance specifically to address your false claim that the e2e encryption used is easily bruteforced.

It also doesn't store any lists of who you contact; this claim is false. You've already been asked once for code references for your inaccurate claims, it's time for PoC or GTFO.


All of your answers are in the links I provided, I'm more than happy to help, but please make an effort too.

Here is the data that gets collected and stored in the cloud:

https://github.com/signalapp/Signal-Android/blob/3553a28683d...

> It also doesn't store any lists of who you contact; this claim is false.

The entire point of Signal adding pins was to protect the data Signal now stores so that you can recover it. That includes: your contacts, profile, and settings. Signal is required to store your contacts in order for that to happen.

Signal does not even try to deny that they collect and store your contacts. They just don't say so plainly and they often present the fact in misleading ways. Here's one example of them explaining their reasoning for storing your contacts on their servers:

"We're trying to add support for identifiers that aren't phone numbers, since that's what we've heard from users. If we do that, your signal contacts can't live in your address book anymore. Every other app just stores that in plaintext on their servers, which we don't want to do." [source](https://twitter.com/moxie/status/1277737851107471360?s=20)

> You linked to a repository that uses Intel SGX which is used in this instance specifically to address your false claim that the e2e encryption used is easily bruteforced.

It doesn't "address my false claim" it supports it. Again, please read the links. Especially https://community.signalusers.org/t/proper-secure-value-secu... since it addresses both the brute force issue, and why SGX is not able to protect the data. You might find https://community.signalusers.org/t/sgx-cacheout-sgaxe-attac... helpful as well.


I guess I'm just confused because I don't see how what you're linking answers my question. For example, the github link here shows mostly bool values and bytes. The strings I do see do include name, so I do get the argument that your name is stored (though you choose your name). But the code makes me think it is only storing a string to tell the program where your profile picture is. As I understand it, the server holds this information (encrypted) but still requires your phone to decrypt and even provide the source for things like the profile picture. This doesn't seem as bad as what you suggested previously.

As for SGX, my understanding is that 1) this exploit is pretty technical 2) it requires physical access and 3) that it is not the primary method of security, but a secondary one. If I understand this correctly I don't really see why this is an issue.

I'm not quite sure what I'm missing here, but I do appreciate your responses. I'm more than willing to admit that I just don't know/understand. I'm not sure where the breakdown in communication is happening (I'm not a security expert so it very well might be me).


This is the line which remotely stores your contacts' encrypted E.164-formatted phone numbers:

https://github.com/signalapp/Signal-Android/blob/main/app/sr...


> I guess I'm just confused because I don't see how what you're linking answers my question.

You wanted to know what Signal was storing and if they could decrypt it. I linked to an FAQ which says:

Storage Service (the “cloud”) What is stored?

All information stored is encrypted; note again that each storage record uses a different derived key for encryption.

This protobuf file explains which information is stored, and how it is structured. You’re probably most interested in this part which shows the actual data that’s stored; it should be self-explanatory, so not copying the list here. Notably, message history is currently not backed up using Storage Service.

The "You’re probably most interested in this part" bit linked to that same github page.

The same FAQ continues:

What is it used for?

Restoring some information upon re-installation/registration of the Signal app (on same or new device) by entering your Signal PIN. This is only possible if you are re-registering with the same phone number you used previously. Not available if you’ve disabled the Signal PIN (in this case only possible with manual backup/restore (Android) or transfer (iOS); these methods additionally preserve your message history).

Syncing contacts and groups to linked devices (this is made possible by syncing the “base” storage service key to linked devices). This is still partially being done using Signal Protocol sync messages, but that is unreliable

This tells you what they are collecting, but you also wanted to know if they could access it, and the answer is yes. I linked to one article explaining some of problems with the security of Signal's set up, but here is another https://www.vice.com/en/article/pkyzek/signal-new-pin-featur...

> As for SGX, my understanding is that 1) this exploit is pretty technical 2) it requires physical access and 3) that it is not the primary method of security, but a secondary one.

Finding exploits is hard; using them is often pretty easy. As the article put it, SGX enclaves are "a sort of wet paper bag for clustering sensitive info", but if you're interested here's a discussion of some of the issues with SGX: https://news.ycombinator.com/item?id=23468746

Physical access isn't a problem for state actors or Signal employees, and without the enclave we're back to being protected by 4 digit PINs again which is no protection at all.


The protobuf you linked does not support your claim that Signal uploads your contact lists. You'll note that AccountRecord does not contain a list of ContactRecords other than those pinned (4 max). Indeed the application UX does not either.

I've asked for evidence twice and you have supplied none.


protobuf files just contain data structure definitions. StorageService.proto contains only one reference to StorageRecord, but Signal actually syncs hundreds of StorageRecords.

Instead of linking to source code examples, which is easily misunderstood, here is Matthew Green's summary:

https://blog.cryptographyengineering.com/2020/07/10/a-few-th...

Alternatively, you can just open the app settings, which tells you that it will attempt to restore your contacts if you create a PIN:

> PINs keep information stored with Signal encrypted so only you can access it. Your profile, settings, and contacts will restore when you reinstall. You won’t need your PIN to open the app


Look pal, I've given you all the information you need. It's really all right there. I can lead you to data, but I can't make you think.

If you want to go on believing that someone at Signal has found a way to backup your contacts so that you can recover them when you use a new device that does not in any way involve Signal collecting your contacts and storing that data to push back down to you later be my guest. You believe in magic, and I'll continue to believe their own statements, code, and documentation.


Signal does not recover my contacts on a new device.


> Signal does not recover my contacts on a new device.

Are you arguing that the feature to restore contacts doesn't exist or just stating that you can't make it work?


From your links, Moxie says

> We're trying to add support for identifiers that aren't phone numbers, since that's what we've heard from users. If we do that, your signal contacts can't live in your address book anymore. Every other app just stores that in plaintext on their servers, which we don't want to do.

I'm not a sec person, but isn't this along the lines of what I was getting to with my original post? Seems like you can't have your cake and eat it too.


I didn't have a problem with Signal wanting to provide a way to restore people's contacts. The problems I had initially were:

- No way to back up this information without cloud storage: why not let users backup their settings and contacts and profile to an encrypted file stored locally that can be copied to SD card, transferred via USB, or even uploaded wherever the user wants?

- No way to opt out entirely: What if I don't want this functionality at all and I don't care if my new phone doesn't have my settings and contacts and profile picture?

They've stated that they have plans to expand the data they're keeping in the cloud for other things down the road. It's not clear yet what they will do with it, but I get the impression that the reason they chose to start collecting user data was to enable them to grow into whatever they become.

There are lots of ways Signal could have improved their data collection and storage practices to better protect their users. Requiring strong passwords instead of pins would have been great! Not depending on the security of an enclave that has already been proven to be vulnerable to attacks would have been even better. Not storing user data at all would be ideal, but yeah, I get that they are within their rights to change the scope of their product to include data collection and cloud storage in order to offer new features.

My real issue with Signal is that this change was so poorly communicated that a lot of people (as evidenced by many in this discussion) aren't even aware that they are collecting data at all. All of this confusion is entirely Signal's fault, and it'd be a bad look for any company, but this is a product where trust is critical. Where literal lives are on the line. Anyone using Signal deserves to know what their risks are, and Signal has worked to make that extremely difficult.

The fact that they are being deceptive, even after all this time, makes me think it's possible the service has been compromised and they are communicating to users to avoid using Signal as loudly as the law will allow. Not updating their privacy policy works wonderfully as a canary.

For what it's worth, I was a big fan of Signal. I loved the app. I recommended it to everyone! One of the most disappointing things about this entire fiasco is that there's no replacement I've seen that is as good! I've been playing around with a few alternatives, but Signal had the polish that made it ideal for both secured communications and plain old SMS/MMS. It's damaged goods now though. If I were fine with using an app I couldn't trust I might as well use whatever shipped with my phones.


Here's an idea that's been floated many times in the past: Add e-mail as an alternative alongside phone numbers. Support in contacts in both Android and iOS. The only real difference these days is that one requires government ID (by law in an increasing number of countries) and one does not.

I fail to see any fundamental difficulty here.


I've been complaining about the glaring privacy/integrity problem in their SMS-based account verification scheme for years. I don't think any snafu can make them reconsider. It would forfeit the valuable social network mapping they've already poured millions of dollars into through sending verification SMSes.


It's not so much "valuable social network mapping" as it is "the only social network available to Signal", by design. Without phone numbers, they can't use clientside contact lists (they can build their own, of course, but if it's strictly clientside it won't sync, and so it won't work for most of their users). The alternative design, which HN would wildly prefer, admits to usernames or email address accounts, but requires Signal to keep a database of contacts serverside. That's untenable for Signal's threat model.


After countless discussions of Signal on HN, I have yet to see an explanation for why Signal can use phone numbers from a client-side contact list, but not email addresses from a client-side contact list. Surely, in either case the identifier can be treated as an opaque string, right?

Or in other words: suppose the definition of "phone number" was expanded to include alphanumeric characters and @. What aspect of Signal's current design would break? By saying "without phone numbers, they can't use clientside contact lists", you seem to be suggesting that something would break, but I can't imagine how, unless it's as trivial as a database constraint that says "this field must contain only digits".
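As a sketch of the "opaque string" point (hypothetical, not how Signal's code actually works): if lookups only ever touch a normalized hash of the identifier, nothing downstream can tell whether it was a phone number or an email address.

```python
import hashlib

def identifier_token(raw: str) -> str:
    # Treat any identifier as an opaque string: normalize, hash, look up.
    # Whether `raw` was "+15551234567" or "alice@example.com" is invisible
    # to everything that consumes the token.
    return hashlib.sha256(raw.strip().lower().encode()).hexdigest()

print(identifier_token("+15551234567"))
print(identifier_token("alice@example.com"))  # same shape either way
```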


> Or in other words: suppose the definition of "phone number" was expanded to include alphanumeric characters and @. What aspect of Signal's current design would break?

I feel like you're mis-analyzing a social problem or some other design goal as a low-level technical problem.

I don't know their real reason, but I can say that my email contact list is waaay messier and less curated than my phone contact list. It would probably be annoying if I got a "So-and-so joined Signal!" notification for a bunch of randos I've emailed once who had their email auto-added to my address book.


That seems like a problem that could easily be solved by sending fewer notifications. Do I really need to know if somebody has joined Signal until I actually want to talk to them? Isn't it better to have a larger pool of people with whom I can communicate securely using Signal?

I'm mostly just confused because this is being presented as a technical limitation: using email addresses would supposedly "require Signal to keep a database of contacts serverside". I don't understand how or why that's true.


> That seems like a problem that could easily be solved by sending fewer notifications. Do I really need to know if somebody has joined Signal until I actually want to talk to them?

I don't know what the real reason is, what I said was just something that popped into my head. Another comment mentioned spam-prevention as a reason (by making it infeasibly expensive), and that actually makes more sense. Honestly, there probably isn't just one reason, but a cluster of tradeoffs.

> Isn't it better to have a larger pool of people with whom I can communicate securely using Signal?

IMHO, the number of people who care deeply enough about the phone number thing to boycott Signal is vanishingly small; not even a rounding error. Sure they're loud on HN or maybe even Twitter, but giving tiny but loud minorities whatever they demand is bad policy.


Sorry, just to clarify: I didn't mean "it's better if people who don't want to give out their phone number can use Signal" (although I happen to think that's also true).

What I meant was: "it's better if I can use Signal to communicate with people even if all I know is their email address".


Of course there is no real reason Signal should be spamming anybody with these crap "notifications" in the first place. Who wants to wake up at 1 AM to know some dude they texted years ago is now on Signal?


> I'd get a "So-and-so joined Signal!" notification for a bunch of randos

One of the reasons I wish I could use something other than my phone number and access to my contact list to work with signal is these notifications creep me the fuck out and I would rather never get them, or have anyone get them about me.

I'm fine with it being a feature for people who want it, but I don't. I want to make my own damn choices about who I talk to through it.


No one wants those messages


> I have yet to see an explanation for why Signal can use phone numbers from a client-side contact list, but not email addresses from a client-side contact list

OS functionality? There's no "Grant access to Gmail contacts" (= email addresses) on Android/iOS, so that client-side list would have to be manually maintained, while (practically) everyone already has a contact list containing phone numbers of their friends.

That said I don't see why a user would have to have a stored social network at all, why can't it simply be opt-out?


At least on Android, any application that I give permission to access my contact list can see email addresses for the client-side contacts that are synced with my Gmail account. Maybe iOS behaves differently?


People don't have email addresses in their contact lists because all their email contacts are stored on Google servers.


Having the ability to identify by other strings than "phone number" wouldn't take away any functionality, just add it. It would be possible to communicate with devices that have email but not phone numbers (children without SIM cards, for example).

But this is all moot because phone numbers aren't just opaque binary strings. They are more useful than other forms of identification.


That isn't why Signal uses phone numbers. It uses phone numbers because users have a local list of phone numbers they communicate with and Signal can use that to determine if people you communicate with are on Signal. If there was a reliable local list of email addresses they could use that but most people use cloud email now and don't have their email addresses they communicate with on their device.


Why would contacts need to be saved though? If I change my phone, I expect my contacts will be copied in some manner, and they'll be available on the new phone. Why would a messaging app need a persistent social network of contacts?


The sane alternative is that it could keep a client-side contact book that users would be responsible for managing entirely on their own, including when setting the app up on a new phone.

Addendum: also, there is nothing preventing this type of contact book data from being backed up/synced to a new phone, like any other data and settings of any other app. iOS has had this feature for something like 7 years now. Android, too, I'm sure.


That's a good way to build a secure messaging app nobody ever uses.


Signal still assumes you manage your contacts no? It happens via iCloud or a google account today.


That anonymous IDs can be _optional_, in addition to phone numbers. Again, like Threema is already doing.


It may very well be the case for the smartphone-flipping demographic that prefer WhatsApp and TikTok, but I think it's a misunderstanding/misrepresentation of the crowd that go for e.g. Signal and Telegram.


Large swaths of my non-tech social group are on Signal because it offers a large amount of practical security and privacy without the kinds of sacrifices people seem to assume signal users want to take. Signal has successfully and dramatically increased the number of people enjoying privacy in their communications by not making that assumption.


The majority of my signal contacts aren't particularly tech-literate. The crowd that go for signal and the crowd that go for telegram are different crowds, in a large part because signal designed itself to be accessible to nonexperts.


So they are tech-literate enough to use a smartphone, and apps for it, and they are tech-literate enough to type in their Signal password reminder in a hidden text field (and probably also passwords on dozens of web pages because password/keychain apps are "hard") but typing in e.g. an anonymized "user token" to add a buddy would be too "tech" for them? I refuse to believe a word of what you're saying.


> anonymized "user token" to add a buddy would be too "tech" for them? I refuse to believe a word of what you're saying.

How do you transmit said anonymous user token securely? Using the secure messaging app you're already using? Meeting up in real life? Posting it on keybase? Each of these has downsides that are all solved by a phone number.


I don't see why the user token ("account name") has to be secret in every conceivable way. It just needs to be anonymous. What's wrong with meeting in real life, or exchanging account names in whatever way you initially exchanged phone numbers? You don't seem concerned over the security problem of account activation codes being sent over SMS, so I don't see why you should be concerned over exchanging anonymous account names in the same or more secure ways.


> What's wrong with meeting in real life

I regularly DM people I haven't seen in person in years. I'm not going to fly cross-country to bootstrap a communication channel.

> or exchanging account names in whatever way you initially exchanged phone numbers?

Well because I exchanged phone numbers irl 7 years ago. I do not have a time machine.

> You don't seem concerned over the security problem of account activation codes being sent over SMS, so I don't see why you should be concerned over exchanging anonymous account names in the same or more secure ways.

Correct, because the bit of information "I have a signal account" is far less revealing than the bit of information "I have shared my signal account with a particular individual".

You avoid that only with some kind of public attestation of your signal identity (in keybase or on twitter or whatever) which is the best option, but generally requires everyone have a known public index of their forms of contact, which my friends from high school, generally speaking, don't.


Signals very explicit goal is making something that's usable for everybody. If they'd wanted to make a nerd-messenger they'd make a different product.

(I still consider it a major downside that the phone number is the only lookup key)


Sounds pretty drastic to me to draw the line between "everybody" and "nerds" at the point of the contact book being available or not.


Feel free to insert your word choice of "smartphone-flipping demographic" instead. The point is that if you argue based on "the crowd that goes for signal", that crowd being everyone is the clear aim of Signal, and thus they design around that goal.


> "It's not so much "valuable social network mapping" as it is "the only social network available to Signal", by design."

I don't understand why you state this, when you obviously know that data is connectable and joinable across discrete sources. Being "the only social network available to service X" is the inherent case for every single online service on the entire planet when viewed as an isolated entity. But this isn't a case of anonymized UUIDs. It's a case of personal phone numbers.


That doesn't follow. A client side contact list does not need to consist of phone numbers.

Apart from using the device contact list (which contains email addresses as well as phone numbers) the client can also keep a private contact list.


I feel we've had this conversation before. IIRC this is where we left off:

E-mail as a complement should work fine and supported in all contact lists. It wouldn't change a thing wrt what you're describing.


It was because of over-represented complaints about phone number requirements that Signal implemented the mistake that is SGX and server-side contact lists. Now the social graph of millions of Signal users is instead centrally protected by Intel's attestation obfuscation and a weak 4-digit PIN. All to eventually support usernames, which normies won't use.


Why are server-side contact lists needed to support identities not linked to phone numbers?


Basically there are many options, none of them perfect:

1. Support phone number contact discovery, with persistence provided by the contacts provider. This is seamless and causes the least amount of complaints.

2. Support username discovery, with persistence provided by passphrase-encrypted online storage. This is painful and risks backlash from people losing access to data. Also, now the threat model must account for or ignore weak derived keys (which is probably most of them).

- 2a. Enforce strong passphrase requirements. Many users will abandon the product.

- 2b. Sync usernames between linked devices (using a generated key). Requires multiple devices, risks people losing data, more complaints.

- 2c. Sync usernames using custom contacts provider fields (e.g. email). Nobody is accustomed to doing this, but it might work. Automatic discovery rates would be low. Possibly requires an odd workflow for people adding Signal contacts by their email/username.
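The weak-derived-key concern in option 2 can be made concrete with a minimal sketch (the KDF choice and iteration count here are illustrative assumptions, not Signal's actual parameters):

```python
import hashlib

# Hypothetical sketch of option 2's key derivation; parameters are
# illustrative, not Signal's actual KDF or iteration count.
def derive_storage_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 slows brute force linearly in the iteration count, but a
    # 4-digit PIN still has only 10,000 candidates no matter what.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

key = derive_storage_key("1234", b"per-user-salt")
assert len(key) == 32  # a 256-bit key backed by only ~13 bits of entropy
```

The derived key is full-length, but its effective strength is capped by the passphrase's entropy, which is exactly the threat-model gap option 2 has to account for.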


Signal (or, more accurately, one of its predecessors) used to use client-side private set intersection for contact discovery, but this scales poorly [1].

Now they use a solution based on Intel SGX and server-side trusted computing [2].

[1] https://signal.org/blog/contact-discovery/

[2] https://signal.org/blog/private-contact-discovery/
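Part of why naive alternatives don't work, as the first post discusses: merely hashing phone numbers before upload doesn't anonymize them, because the phone-number space is small enough to enumerate. A toy sketch with hypothetical numbers:

```python
import hashlib

def h(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

# A client could upload hashes instead of raw numbers...
uploaded = h("+15551234567")

# ...but the phone-number space is tiny (~10^10 worldwide), so a server
# can precompute every hash and invert the "anonymized" upload. Here we
# enumerate just one hypothetical 10,000-number block:
rainbow = {h(f"+1555123{i:04d}"): f"+1555123{i:04d}" for i in range(10_000)}
assert rainbow[uploaded] == "+15551234567"
```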


Because right now, Signal can use the contact list on your device to find your Signal contacts. If you replace phone numbers with usernames, there will be no way to match a Signal user to your contacts without:

- Having a server to hold your contacts

- Or having signal app to maintain contact list and sync it across devices


> weak 4-digit PIN

Are you sure that it makes sense to require people to pick something longer and non-numeric for a PIN?

Or is your claim (wrongly) that people don't have longer non-numeric PINs (lots of us do) ?


Most people will choose weak passwords given the option. And so I think it's the responsibility of the developer to enforce strong requirements (edit: when dealing with data encryption susceptible to brute-force attacks): entropy estimations, 128(+)-bit static keys, etc. If any user has chosen a weak passphrase, and still believes it to be secure, the developer has likely failed.
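As a rough illustration of the kind of entropy estimation described (a real product would use a proper strength estimator such as zxcvbn; this naive charset-based version overestimates because it ignores dictionary words and patterns):

```python
import math
import string

# Naive charset-based entropy estimate, for illustration only.
def estimated_entropy_bits(password: str) -> float:
    pool = 0
    if any(c.islower() for c in password): pool += 26
    if any(c.isupper() for c in password): pool += 26
    if any(c.isdigit() for c in password): pool += 10
    if any(c in string.punctuation for c in password): pool += len(string.punctuation)
    return len(password) * math.log2(pool) if pool else 0.0

assert estimated_entropy_bits("1234") < 20   # 4 digits: ~13.3 bits
assert estimated_entropy_bits("correct horse battery staple!") > 80
```

An enforcement policy would reject anything below some threshold (e.g. 64 bits) before allowing it to protect encrypted remote storage.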


> when dealing with data encryption susceptible to brute-force attacks

The "brute-force attacks" imagined here are a bad guy somehow controls Signal's systems, or else the US government seizes them and then decides to try to brute force them, right ?

But these are attacks where for various rival systems it was already game over. So your assumption is that Signal's casual users, people who maybe were also considering Whatsapp or iMessage or something, should be required to have a cryptographically strong passphrase so as to defeat this unlikely circumstance, as a minimum?

Moxie's whole deal is that this stuff only works when it's for everybody. If there are five people in your country who use Signal, guess what, the Secret Police can round them up as suspected terrorists and execute them. Were they planning to bomb the President For Life? Or just organising a pizza party? Don't care, it's just good policy. But if there are five million people who use Signal that's a different matter.

Even if all five million are terrorists, that's numbers where you're going to have to tear up your "no negotiating with terrorists" policy, 'cos there are just too many of them.


I do think considering the average use case is paramount. That's why I think remote encrypted contacts storage should have never been implemented: most people won't choose strong passwords. Giving them the false notion that their sensitive stored data is cryptographically indecipherable is wrong.

As it stands now, people who create a Signal PIN aren't even warned about the security implications of using weak numeric PINs, which is among the worst of all possible worlds.

If this feature is critical, it should have been gated behind prominent passphrase entropy warnings, along with the data being put at risk [1], or it should have enforced actual strength requirements.

Signal is still better than most other messengers. I am mostly comparing it to its former self. And its former self worked flawlessly without needing to upload persisted contacts information.

1. https://github.com/signalapp/Signal-Android/blob/main/libsig...


> All to eventually support usernames, which normies won't use.

But brogrammers will. /s


Do "normies" care about high quality encryption? And did signal ever turn on username support?


I think developers have a moral responsibility to make their products as secure as possible, within reason and while still being usable. It doesn't matter if the users care about the benefits. To your second question: no, not yet.


I fully agree with you, and my first question was asked somewhat sarcastically. For the second, my implication is that developers also have a moral responsibility to make their products as private as possible, and SMS verification aint it.


which is a curious thing to me as the phone number i created a Signal account with is no longer my phone number. what happens if the person currently assigned that number tries to join Signal and what happens to me if they do?


It's unsafe. They could lock you out of your Signal account and impersonate you. Someone who doesn't know you changed your number (or forgot, or doesn't think about it) could then send a message to the person who now has your old phone number, thinking it's you at the other end. Most people don't bother with the warning about the safety number having changed. I also personally assume that the phone number of a contact in Signal is theirs and that I can send an SMS or call them with this phone number. This not being true is at best confusing even if you are not locked out.

It'd also be nice to let the current user of your old number to be able to join Signal without issues.

I'd say don't play with fire and migrate as soon as possible before having issues.

You probably could follow https://support.signal.org/hc/en-us/articles/360007062012-Ch...

Then the people you've chatted with on Signal will be notified of your number change. If you can't follow this, it's better to decide how to handle it now instead of dealing with it after you've lost access.


Thanks for the link. I have been way too lazy after having the new number for nearly a year. I just couldn't not do it after someone on the internet did the heavy lifting for me.


Happy to know I had a (hopefully) positive impact :-)


Yes, you get your Good Deed For the Day merit badge!


They mention it, use registration lock: https://support.signal.org/hc/en-us/articles/360007059792-Si...

Basically if someone tries to register to Signal with your phone number they'll need to enter that PIN Signal consistently reminds you of.


So if you have a phone number that someone else used to create an account, you can't use Signal?


Close, but not quite.

You see, if the other person didn't use registration lock, now you've got access to complete strangers account.

Problem solved!


Not exactly. Registering again will create an entirely new identity with a different key pair. The new phone holder won't get access to your contacts or your message history. I believe your contacts will also be notified of the signature change.


Fair enough. I was just making a bit of a joke. :)


Which is less scary than it sounds because a signal "account" is a phone number. Oh no your privacy!

If you view Signal as "a service that allows you to send E2E messages to phone numbers" then this is fine. Your friends will even get a message that says the chat has been rekeyed once the new person sets up Signal.

And if you're worried about governments compelling your cell carrier to turn over your phone number, then rest assured that usernames wouldn't help you, since they could just compel Signal to turn over your username. So much safer.

As long as the source of identity is something other than a private key that is owned and controlled by the user (with device keys signed by that key to be considered valid), it will be the same issue.


I was actually just trying to make a (poor attempt at a) joke. Honestly, I don't really know, or care, how Signal works.

I have no respect or interest in using any service which requires a cellphone to use it by fiat.

I don't really have any great concern about the government requesting my information, not because I "don't have anything to hide" or "because I'm too boring to care about" but simply because I don't care if they do. They will do what they will do. I will do what I will do. It's immaterial for me to worry and fret over the actions of someone else. Governments are simply a form of authority which protect those who pay up, and harm those who don't. It's neither my protector or my enemy, it's just a thing that demands money from me from time to time.

As far as secure communications are concerned, if someone is truly concerned about such things, the only reliable method I'm aware of is a one-time pad. [1] For most, such a system would be far too bulky and cumbersome to bother with, meaning that the communication itself is, in actuality, not worth securing to the highest degree. This, in turn, makes the thousands of digital alternatives "good enough" for all but nation-state threat actors. [2]

[1] https://en.wikipedia.org/wiki/One-time_pad

[2] https://nordvpn.com/blog/nation-state-threat-actors/
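For what it's worth, the one-time pad itself is only a few lines; the bulk and cumbersomeness the parent describes is in generating, exchanging, and never reusing the pad material. A minimal sketch:

```python
import secrets

# Minimal one-time pad: information-theoretically secure only if the pad
# is truly random, at least as long as the message, and never reused.
def otp(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data)
    return bytes(a ^ b for a, b in zip(data, pad))

msg = b"meet at dawn"
pad = secrets.token_bytes(len(msg))
ciphertext = otp(msg, pad)
assert otp(ciphertext, pad) == msg  # XOR with the same pad decrypts
```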


They'll create a new key and your contacts will be notified that your key has changed.


How do you deal with spam without requiring a phone number to register?


Signal doesn't ask for phone numbers simply to combat spam; the phone number isn't an elaborate captcha. Rather, as this article repeatedly points out, Signal doesn't keep your contact lists and other data available to its servers. It uses phone numbers because phones already have contact lists, stored clientside, keyed by those numbers.

To replace the numbers with usernames, Signal users would have to either give up contact lists altogether (at which point nobody would use the service), or allow Signal to keep a serverside database of contacts ready at all times for users who log in. This is what other messaging services do, and the result is that the servers have a plaintext log of who talks to who on their service, which is the most valuable information a secure messaging service can make available to a state-level adversary.


> It uses phone numbers because phones already have contact lists, stored clientside, keyed by those numbers.

This makes sense for contact discovery, which is important for normal people who just want a chat app that works.

But there are a very important segment of signal users, people with an elevated threat model, who I would be willing to bet a good portion of them would gladly sacrifice contact lists if it meant not having to share their phone number.


> or allow Signal to keep a serverside database of contacts ready at all times for users who log in. This is what other messaging services do, and the result is that the servers have a plaintext log of who talks to who on their service, which is the most valuable information a secure messaging service can make available

Why couldn't they just keep using the current system, alongside usernames, for phone number contact discovery? Of course for those who opt into it; imo that's what it should be.


Users having to add their contacts each time they set Signal up on a new phone, should the app keep its own client-side contact book, doesn't sound like much of a hassle.

Could you please explain how Signal does not have a social network map, when 1) user accounts are equal to mobile phone numbers, and 2) Signal servers route messages between user accounts.


The Signal protocol has had "sealed sender" since 2018 - Signal server does not know who sent a message, because the sender's identity is E2E encrypted along with the message.

Even if Signal's server saves a message (they claim not to, once downloaded), Signal's server by design has no way of knowing who sent the message.


Every inbound message is authenticated, and credentials are stored somewhere. Correct me if I am wrong, but I'm betting that it's the same credential/channel as for logging in a user (aka the "sender").

Also, wasn't "sealed sender" broken (again) earlier this year by a group of researchers?


No, sealed sender messages are not authenticated. The sender's client uploads two things: 1) an encrypted message (with sender id encrypted), and 2) a zero-knowledge proof that the sender's client knows the recipient's delivery token.

There is no authentication by the sender, and the sender does not upload any credentials.


I guess I have to rephrase myself: the API calls are authenticated, because the API endpoints will not consume anonymous requests. I'd be glad if you could point me to documentation proving that the messaging API uses completely different credentials than those for user login, and that the two are also disassociated.


This doesn't prove anything, but:

> Without authenticating, hand the encrypted envelope to the service along with the recipient’s delivery token.

Source: https://signal.org/blog/sealed-sender/#:~:text=Without%20aut...

The sender's client sends a certificate derived from the recipient's profile key.

This certificate is sent to the server as the header "Unidentified-Access-Key" - you can see how this header is derived from the Signal clients' source.

So yes, these API calls are authenticated, but not using the sender's credentials in any way.
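A loose sketch of the idea (NOT Signal's actual derivation; the label constant and truncation below are hypothetical): the sender presents a token that is a function of the recipient's profile key alone, so the server can gate delivery without learning which contact is sending:

```python
import hashlib
import hmac

# Hypothetical sketch only: the real Unidentified-Access-Key derivation
# differs; see the Signal clients' source.
def access_key(profile_key: bytes) -> bytes:
    return hmac.new(profile_key, b"unidentified-access", hashlib.sha256).digest()[:16]

# Anyone the recipient has shared their profile key with can compute the
# same token, so the server can check it without ever learning which of
# those contacts is the sender.
recipient_profile_key = b"\x01" * 32
token = access_key(recipient_profile_key)
assert len(token) == 16
```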


Good luck finding documentation about the protocols and APIs used by signal. While every random cryptocurrency has a cryptography whitepaper, it seems that Signal does not.


Signal published detailed specifications of the protocol with reference implementations since at least Feb 2017 (group messaging protocol was added later on): https://signal.org/docs/

The server and clients are open source: https://github.com/signalapp


You forgot the part where joining signal "conveniently" discloses that to everyone - with no way to opt out(!).

Also, everyone not sharing their contacts with the signal app already have that UX. Minus the privacy benefits of course.


Signal has always prioritized message security and integrity over anonymity. If you want anonymity, Signal is not, has not, and probably never will be the tool for you.


Huge difference between having a low profile and actively advertising out new registrations. Does not sit well with any conceivable notion of privacy. Which supposedly is one of their strongpoints.


Are you basing your argument on assumptions that:

  A) Most contacts have phone numbers
  B) Most contacts don't have email addresses?
I think you're assuming this (and as it happens, I agree, although the number of email-only contacts is still nonzero for a lot of people).

Are you also assuming that it just adds a lot of complexity to be willing to search by both phone numbers and emails?

That's the part of your argument I'm not grasping. Signal has to be willing to intersect known-account-identifiers with this-device's-local-contact-handles, what's the problem with preferring phone numbers but allowing emails?


> ...and the result is that the servers have a plaintext log of who talks to who on their service

Is there a reason why a user's address book can't be encrypted as well?


Give people the option to pay. I would gladly pay $100 one time fee if it meant I could avoid having a phone number associated. https://jmp.chat is a great work around but I would rather just have an email address or ideally nothing but a receipt directly associated with my signal account.


Isn't email even worse for security?


Depends on whether you're able to poison the DNS of the mail provider, hack the recipient's mail server, or do a phishing attack.

I just want to be able to communicate without sharing my phone number (since my phone number is bound to Swedish "Swish") meaning someone can get my ID from my phone number here.

This is why drug dealers use Wickr, Threema and others, because they don't expose identity, not because they're "safer".

I have a contact on Threema who I've met many times, but I have no idea how to contact him outside of Threema, because I don't know his identity and we'd both like to keep it that way.


"security" is vast and means different things to different people.

Email can absolutely be used with e2e encryption, keeping the content of exchanges from external eyes.

Email can absolutely not be used for hiding metadata of who talks to who.


You can trivially create as many emails as you want, anonymously and for free.

In many countries, registering phone numbers anonymously is illegal and/or impossible.


> You can trivially create as many emails as you want, anonymously and for free.

Where? Gmail and hotmail both don't allow this.


I agree with you on this one. The person you are replying to is correct in that no matter how much you learn or teach yourself, getting a new phone number is far from trivial, whereas getting a new anonymous email account, if you take the time to learn, is trivial, depending on what you care about.

Do you just care about making burner anonymous emails that just work all the time? There are vendors that you can pay for that service (my go to is protonmail), and there are free options out there as well (my go to is riseup). The problem is finding the vendor that suits your need, since these are niche services for the most part.

If you care more about the privacy / sovereignty side of things, you can set up your own mail server, and once that's done, it's incredibly trivial to spin up new burner emails. But that's an even bigger knowledge gap, to the point where it's fairly common even on tech forums like HN for folks to say "Self-host email? Nope, that's too hard."

So yeah, I agree. This is not really an option for an average user. But in the context of Signal, I think it makes perfect sense to allow it as an option, even if it's not the default.


One option could be to not be able to send unsolicited messages in the first place. Make it required for everyone to "accept interaction" before messages can actually be sent between two parties. Add in rate limiting so you can only have N open "invitations" and spamming should be very limited.
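The proposed scheme could be sketched roughly like this (the class, names, and the cap N are illustrative):

```python
# Toy sketch of the proposed scheme (the cap N=2 is illustrative): each
# account may hold at most N unanswered invitations, and new invites are
# refused until earlier ones are accepted or declined.
class InviteLimiter:
    def __init__(self, max_open: int):
        self.max_open = max_open
        self.open = set()

    def invite(self, recipient: str) -> bool:
        if len(self.open) >= self.max_open:
            return False  # rate limited: too many pending invitations
        self.open.add(recipient)
        return True

    def resolve(self, recipient: str) -> None:  # accepted or declined
        self.open.discard(recipient)

lim = InviteLimiter(max_open=2)
assert lim.invite("alice") and lim.invite("bob")
assert not lim.invite("carol")  # blocked until someone responds
lim.resolve("alice")
assert lim.invite("carol")
```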


I don't see the difference between some Isabelle showing up as someone to accept or deny, and some Isabelle with an "I'm a hot single in your area, click this link" message, which at least makes it obvious it's spam.

On Telegram this is rampant: on average probably one spam contact per day. It shifted from e-gold scams to sex a few months ago, but both are still present. People who aren't in big groups (where the spammers scrape user IDs) have zero problems, so the trick is revealing your random identifier only to those you want to contact you. Phone number identifiers are the antithesis of spam protection: we keep our ranges just full enough that we can't shorten them by a digit, but empty enough that we have small growth possibilities. You're very likely to hit a subscriber, by design, by trying random numbers.


> Add in rate limiting so you can only have N open "invitations" and spamming should be very limited.

That also sounds like a good way to limit adoption, at least for anyone with more than N contacts, particularly >= 2N, as that likely means a minimum waiting period before you can transfer over "more" contacts, since some people will never accept/reject the invite because they don't use the app much.

If it were me and I had to wait on others to accept or reject my invite before I can continue transferring contacts, I'm gonna move on.


Who is dealing with the spam SMS messages I get all year round? Phone numbers do not stop spam, they are used to distribute it. I know I received no spam when I had Google Talk... It is a solvable problem that I guess benefits nobody in power to solve.


Spam is one of the major reasons people don't use SMS. "The product we're replacing sucks, so our product can suck too."


I don't use SMS because I can't download an open source client to use on desktop, send pictures or other files, edit messages, encrypt conversations, share a live location, hold a poll, or have group chats with some semblance of scale; it just doesn't work for more than receiving an occasional message as a last resort.

Spam via sms doesn't seem to really exist here, maybe two per year now, up from zero until three years ago.


Threema seems to manage just fine. I guess payment is the natural limiter for spam there.


And accessibility.


Hey look, the centralized nature of Signal has come back to bite it. Who could ever have predicted?

Edit to try and make this less snarky and more productive: In n number of threads regarding Matrix, and the Matrix vs Signal controversy[0][1], the point is that Signal took a single, simple approach and tried to proclaim it was necessary because it allows them to be agile and responsive and integrate new features or security standards and not be weighted down. However, they jumped in and married themselves to a centralized service to do that, and have just essentially ignored that the centralized service may have faults. Whereas Matrix, while slower to evolve, is setup to be far more resilient to a single point of failure.

[0] https://signal.org/blog/the-ecosystem-is-moving/

[1] https://matrix.org/blog/2020/01/02/on-privacy-versus-freedom


If they (Signal) care about privacy, they need to drop the phone number requirement for their service. There are many ways of dealing with spam (rate limiting, captchas, ...), and a truly private/secure messenger app should not require any user-identifiable info. The argument that "Signal was the first e2ee messenger app to go mainstream, so they can keep ignoring users' privacy, ... yada yada..." is naive at best; they should lead by example. Right now there are many solutions that are way more private (Briar, SimpleX, Session, Wickr, ...). I use Signal, and I like it; it's just a shame they soft-refuse ("We are working on it...") to remove phone numbers from the equation.


I second recommending Briar as a messenger. Codebase is well maintained. Specifications and documents are audited, as well as the official clients.


Briar is nice but I wish it worked through Tor like Tox does.


I was always opposed to this STUPID requirement that you NEED to tie Signal to your phone number. I never used it and AFAIK even for the Desktop client you need to have a phone number. Wire and other messengers that use the same or similar protocols do not have this issue.

I am not the only one hating this, from the very start this was a huge critique on Signal by many people, but they never changed. I did not even know they used a 3rd party service to "verify" phone numbers. This is a HUGE issue. People who truly want to stay PRIVATE, politically hunted people who are in life and death situations should never ever use Signal.


Unrelated to the current focus on the weakness of using a phone number as a contact reference, I have experienced a different problem: if a user has ever used Signal and for whatever reason reverts back to SMS, any message another user sends them via Signal is lost with no indication. This is a growing problem as people try out Signal and then, when they get a new phone, forget or don't care to install it again.


I'm reading yet another argument against centralization.

Matrix protocol, anybody?


How is this an argument against Signal and/or centralization? Pretty much the exact same thing could have happened with Matrix 3PID servers.

I really like Matrix (and Signal), I use it and even run a public Matrix home server, but jesus I can't stand obnoxious Matrix fanboys who feel the need to shoehorn some Matrix plug into literally every conversation involving Signal.


Because all of those users are exposed by the same SaaS fault. Too many digital eggs in one digital basket.

What is there to plug? It's FOSS. Use it, or don't, who fucking cares? Whether you or anybody else use it or not is of absolutely zero consequence to me.


the recommended fix here is to add a PIN + enable registration lock

IIRC the Signal PIN was very controversial back in the day because they were 1) forcing users to set one and 2) forcing them to opt in to some data collection as part of creating a PIN. Signal backed down on requiring a PIN, but it's now unclear from their settings page whether setting a PIN will share data as well. The marketing copy on my Android device says:

> PINs keep information stored with signal encrypted so only you can access it. your profile, settings and contacts will restore when you reinstall

For a company with relatively good cred, this is a shady way of saying "we will upload your contacts and encrypt them with a short string".
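To make the "short string" worry concrete: without some server-side limit on guesses, a key derived from a 4-digit PIN falls to offline brute force in seconds, because there are only 10,000 candidates. A self-contained sketch (this is not Signal's actual KDF or parameters; real deployments use far more iterations):

```python
import hashlib
import os

def derive_key(pin: str, salt: bytes) -> bytes:
    # Key stretched from the PIN. 1,000 iterations keeps this demo fast;
    # real systems tune this much higher, but it only scales the attack linearly.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

salt = os.urandom(16)
target = derive_key("4821", salt)  # pretend this key protects the uploaded contacts

# Offline brute force over the entire 4-digit PIN space.
recovered = next(f"{p:04d}" for p in range(10_000)
                 if derive_key(f"{p:04d}", salt) == target)
print(recovered)
```

Blocking exactly this attack, by rate-limiting guesses inside hardware the server operator supposedly can't open, is what Signal's SGX-based Secure Value Recovery design is for.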


From my five-minute reading of the Signal source code, it seems that disabling the PIN results in the generation of a random 256-bit master key for cloud storage encryption:

https://github.com/signalapp/Signal-iOS/blob/main/SignalServ...
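For contrast with the PIN case, a uniformly random 256-bit key is not brute-forceable at all (a 2^256 keyspace). A sketch of what generating such a master key looks like; this is illustrative, not Signal's actual code:

```python
import secrets

# A uniformly random 256-bit (32-byte) master key from the OS CSPRNG.
master_key = secrets.token_bytes(32)
print(master_key.hex())  # 64 hex characters
```

So if the linked code does what it appears to, disabling the PIN trades recoverability for a key that is effectively unguessable.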

The primary developer has stated previously that cloud storage is still used even with the PIN disabled:

https://community.signalusers.org/t/beta-feedback-for-the-up...


They do some monkey business with SGX to make brute-forcing harder, but yeah, it's worrisome.

https://blog.cryptographyengineering.com/2020/07/10/a-few-th...

https://signal.org/blog/secure-value-recovery/


Using phone numbers as identifiers is terrible; use Session! https://getsession.org/

Your private key is used to encrypt your messages, and your public key is your identifier.


It's ridiculous that this is downvoted. The client is a fork of Signal on all platforms. The fix exists; use it.

The arguments against it are usually just "ad hominem" attacks on blockchain.


"We conducted an investigation into the incident and determined the following."

Guys how is the punctuation here not a colon?!

This is an extremely serious issue according to my English Major inclination towards pedantry.



