Closed tools like Signal, Snapchat, Slack, WhatsApp, iMessage, Messenger, Hangouts, etc. will all make the privacy promises you want to hear, but at the end of the day they are closed source, and an update or command can be sent to your handset at any time that makes it ship plaintext to their servers.
The question is under what criteria this will happen. Insider abuse? Government order? To make money?
Trying to make it illegal for companies to do this sort of thing on a country by country basis is worth pursuing, but it is not a real solution. We need to stop trying to use the law to enforce security.
The solution is to use tools that take court-ordered backdoors off the table. Support open and federated communications networks where anyone can build their own clients or servers, so that pressure on any single entity can't put community-built clients at risk.
There is a range of clients/protocols that meet these criteria, such as IRC with OTR, XMPP, IRCv3, Silence, and Matrix.org.
Take your pick, and convince your contacts to use it instead of a company whose word you just have to take that they won't backdoor you when it suits them.
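To make the "anyone can build their own client" point concrete, here is a minimal sketch of talking to a Matrix homeserver directly over its documented client-server HTTP API, using Python and the requests library. The homeserver URL, account, and room ID are placeholders, and a real client would layer end-to-end encryption (Olm/Megolm) on top of this; the point is only that writing a client requires nobody's permission.

    import requests

    HOMESERVER = "https://matrix.example.org"   # any homeserver you trust, or one you run yourself
    ROOM_ID = "!someroom:example.org"           # a room this account has already joined

    # Log in with password auth to obtain an access token.
    login = requests.post(
        f"{HOMESERVER}/_matrix/client/v3/login",
        json={
            "type": "m.login.password",
            "identifier": {"type": "m.id.user", "user": "alice"},
            "password": "correct horse battery staple",
        },
    ).json()
    token = login["access_token"]

    # Send a plain-text message into the room.
    requests.put(
        f"{HOMESERVER}/_matrix/client/v3/rooms/{ROOM_ID}/send/m.room.message/txn-1",
        headers={"Authorization": f"Bearer {token}"},
        json={"msgtype": "m.text", "body": "hello from a hand-rolled client"},
    )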
Signal is free software[1] -- GPLv3 in fact. Don't get me wrong, it has its own issues with Moxie having very strange views of the threat model (and being anti-federation and anti-distribution), but it is definitely not proprietary. I also concur with the Matrix.org recommendation.
Signal likes to throw around that they are free software but really that is more marketing than fact.
The Signal -client- is open source, but the server is closed. Yes, partial server code is open, but running your own server is not allowed. The signed Signal clients will only talk to the closed server, and Moxie has made it clear he does not want forked clients, or clients not built by his team, connecting to his network. Open source F-Droid builds are not permitted, and if you want updates you must use the Play Store builds. Those of us with open source phones must enable unverified sources and risk man-in-the-disk attacks to install the apk from the Signal website.
Security you can't fully verify is just called marketing.
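For those of us who do sideload, here is a rough sketch of the manual check that is left to the user: hash the downloaded APK and compare it against a checksum obtained out of band. The file name and expected hash below are placeholders, and note that this only verifies the integrity of the download; it says nothing about whether the signed build matches the published source.

    import hashlib

    APK_PATH = "Signal-website-universal-release.apk"    # placeholder file name
    EXPECTED_SHA256 = "<checksum published out of band>"  # placeholder value

    digest = hashlib.sha256()
    with open(APK_PATH, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)

    if digest.hexdigest().lower() == EXPECTED_SHA256.lower():
        print("Checksum matches the published value.")
    else:
        print("Checksum mismatch -- do not install this APK.")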
> The Signal -client- is open source, but the server is closed. Yes, partial server code is open, but running your own server is not allowed.
This is not true at all: the server is entirely AGPLv3[1]. You can run your own server, but they don't want to federate and don't want people to distribute forks of their code (that connect to their servers). What they want is irrelevant, because the license they've put the code under explicitly allows you to do these things -- though arguably they are allowed to restrict connections to their servers, because that is a freedom AGPLv3 leaves them.
So while I agree (and I explicitly said I agreed in an earlier comment) with the problems with Signal -- you are not helping explain why Signal has issues by spreading misinformation about it being proprietary. It isn't proprietary, nor is it unverifiable; instead it is run by a company that has no interest in federation or in solving much more important issues with their service. That is a serious enough problem that you don't need to make up issues that will just discredit legitimate complaints.
It does look like they have moved away from closed server components like RedPhone and claim all sources are published now. I appreciate that correction.
The source code for -a- Signal server is a nice gesture for anyone who wants to build a Signal fork, but it does nothing to prove that the server actually in use is running the published code without last-minute patches applied. I can't run my own server for the real network, so I must hope that no employee is ever pressured into changing live systems or signing malicious client binaries with no one noticing. It seems we both agree this is a real problem.
Verifiable security and centralized trust are incompatible.
I do still generally consider any code that can't be verified to be closed, but I agree accuracy is important.
Again, the server code is AGPLv3, so it would be copyright infringement (against the contributors to the code) on the part of WhisperSystems if they were to patch the server code and not provide the sources to users. There isn't a technical way to stop this problem (federation wouldn't solve it either -- it would let you switch to a host you trust more, but that's a different problem); you just sort of have to trust that WhisperSystems isn't breaking the law.
To reiterate -- I agree with you on the general point that federation and having a decentralised system are important for many reasons. But you're moving the goal-posts so that now, even if the server code is AGPLv3 (which requires giving source code access to network users), you still can't be sure that code is running in production, and thus it's still effectively proprietary. That's not a reasonable argument.
As an aside, WhisperSystems has remote attestation of parts of the server code using Intel SGX, which means that it actually has some degree of verifiability[1].
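Roughly, what that verification boils down to on the client side is the following sketch -- conceptual only, with placeholder names and values, and it omits the real quote format and Intel certificate chain. It shows the two places the check bottoms out: Intel's signature over the attestation quote, and the enclave measurement matching a build you reproduced yourself from the published source.

    import hmac

    # Measurement reproduced by building the published enclave source with the
    # documented toolchain (placeholder value).
    EXPECTED_MRENCLAVE = bytes.fromhex("00" * 32)

    def accept_remote_enclave(quote_mrenclave: bytes, intel_signature_valid: bool) -> bool:
        # (1) Everything rests on Intel's attestation signature being honest.
        if not intel_signature_valid:
            return False
        # (2) Compare the reported measurement against our own reproducible build.
        return hmac.compare_digest(quote_mrenclave, EXPECTED_MRENCLAVE)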
Given the context of the Five Eyes backdoor discussions, it is not unreasonable to expect that a government could pressure WhisperSystems to manipulate a client update or patch a server, copyright law be damned. A single employee could also be bribed or blackmailed. Intel could also be compelled to falsely attest an SGX enclave.
When it comes to protecting privacy against highly motivated and sophisticated adversaries, centralized trust is just not an option to be taken seriously. It creates a Lavabit-sized target.
A company as serious about privacy as their marketing indicates would, like Tor, encourage as many people as possible to run servers, so there is no central pressure point to abuse.
Cards on the table: I find Moxie's insistence on a walled garden while using Open Source and Privacy to market it simply unethical.
I do again appreciate the updates on the current status of their public source code though. I will strive for better accuracy in the future on this.