Hacker News
Security for the people (google-opensource.blogspot.com)
110 points by avsaro on Sept 18, 2014 | hide | past | favorite | 53 comments



The simplysecure.org domain uses Google Analytics which isn't disclosed on their privacy page (as required by Google Analytics' TOS). There is also a mixed-content warning because someone hard-coded an http:// link to the balloons image in the blog post on their https site. I went to email them and first looked for PGP keys (us pro-privacy & security people all use PGP, right?) and found none for the domain on any key servers.

I point all this out because it perfectly illustrates their point. Here are a group of smart people launching a great initiative with a beautiful and modern website, and they made some mistakes just like the rest of us do.

I've offered to help and recommended they switch to Piwik (and fix the http link). I hope others here on HN who care about security and privacy will also offer to help out. Some of this stuff is easy to fix and I think it's a great initiative.


    I've offered to help and recommended they switch to Piwik (and fix the http link).
HN does tech satire so much better than the rest of the internet.


What exactly do you mean?


Telling the Google Open Source people to use Piwik instead of Google Analytics seems like satire.


I know the Piwik recommendation gave some a reason to chuckle, but I was dead serious. This initiative was promoted by Dropbox (who recently brought Condoleezza Rice on board) and Google (an ad company currently removing privacy-enhancing apps from their mobile marketplace), so this group will need to distinguish themselves to appear credible.

It's that or be dismissed as just another iteration of Microsoft's 'embrace, extend, extinguish' plays.


Not sure exactly, I was copying this comment: https://news.ycombinator.com/item?id=35143


Researching and developing usability and security auditing practices. How do we measure the two in a single assessment?

You don't. They're wildly different disciplines. Security auditing is fundamentally a systems programming problem. The least effective security "auditors" approach security as something different than software engineering. The most significant security issues arise from correctness issues; finding and fleshing them out involves discovering the degrees of freedom offered to an attacker by faulty assumptions made by blocks of code scattered throughout an entire system.

That's not to take anything away from the practice this team is trying to start! Someone should be doing security usability work, because I don't know anyone who does it well now.

Providing real software security for open source projects is a tricky problem. Talented software security people are in enormous demand. Bill rates are going up. One thing a project like this might want to tackle is onboarding more technical people into the discipline, to address the supply problem.


I'm not so sure. "User uses the system as designed" is a huge faulty assumption -- look at all the people who put SSH private keys (or Amazon AWS keys) on GitHub or who write a shared company-wide password on a sticky note taped to their monitor. A more usable system might prevent these sorts of failure modes, or at least inform the user of the risks that result from such decisions.


I'm probably being imprecise. There's a big and important discipline of assessing the usability of a system and the impact of all the affordances the interface of a system provides. I believe that also takes a special skillset, and it's a skillset I'm happy to see new initiatives like this taking on.

I am not suggesting that security usability (or, to keep it technical, security UX) is easy, or that software security practices are necessarily good at it.


I agree that these are separate disciplines, but I think one reason we're not seeing anyone who does security usability work well right now is that the best practitioners in each field tend to be siloed by their specialization.

A single assessment is not necessarily a single metric, and an assessment composed of audits in each domain seems like a good first step toward building understanding of a common goal and measuring progress toward it. Putting these audits together will hopefully start to expose not only the tradeoffs but also the synergies at play in designing secure systems.

It seems to me that at the root of the security usability problem is a failure in collaboration between developers with solid software engineering practices and designers with solid UX design practices. Talent on both sides is in high demand, but there are few organizations that are able to get both working together effectively on these hard problems.


>Talent on both sides is in high demand, but there are few organizations that are able to get both working together effectively on these hard problems.

Somehow Duo Security[1] has managed to do this phenomenally well. Their product is both a UX wonder and a security marvel. My first thought was "This doesn't feel like a security product. This feels like a solid UX design demo." But looking at their open source code, it's some of the more beautifully designed security software I've seen. I wonder what their secret is-- and if they'd be willing to share.

[1] https://www.duosecurity.com/


It's very odd that they lead off by focusing on unrelated FUD.

> However, if people are indeed working to protect themselves, why are we still seeing incidents, breaches, and confusion?

That references a completely different security area, and as much as it's a juicy source of scare stories for mass media, it's unrelated to end user security with respect to government and corporate mass surveillance.

The former is basically insecure by design (based on the assumption that transactions are always reversible), with the duct tape occasionally failing. The latter will never manifest itself as a discrete problem for the vast majority of people, just an ever-growing set of annoyances and a chilling effect on one's thoughts and actions.

They require completely different approaches. For the former simply having backup credit cards, being prepared to sue your banks for negligently giving away your money, and flagging the pop culture scare articles off Hacker News - that's about all you can do, because the deficient technology is not yours.

The latter requires proactively analyzing the implications of one's technology choices and avoiding the attractive nuisances. Fixing these problems is not at all straightforward and is one of the great struggles of our time, which is why it is such a disservice to conflate the two.


But the former now leads to further erosion of the latter, as well as reinforcing the misconceptions and fear tactics that lead to greater and greater incursions on the private sphere. Those "annoying" mass media articles have led Congress to enact laws, a court to seize the operations of a private company and hand it to a competitor, and ever-tighter public-private partnerships that provide end-runs around due process.

These things can happen because the public is uninformed, disinterested, and subjected to carefully coordinated messaging by security companies selling solutions, "credit monitoring" companies that package insurance, and mass media that doesn't have a deep commitment to getting the facts right on this topic.


And don't forget the End-to-End project[1], Google's JavaScript crypto library.

The significance of these types of projects extends beyond browser privacy. As cryptocurrencies become more prominent, we NEED better, carefully audited JavaScript crypto libraries.

Right now, all the crypto code is home-baked, e.g.: https://github.com/bitcoinjs/bitcoinjs-lib/blob/master/src/e...

While I think they are all doing a fine job, it is unsettling that this mission-critical crypto code is not vetted by cryptographers.

In fact, a few months ago there was a bug where the signature nonce was not generated properly, which meant that from two signatures sharing a nonce you could work out the private key. Some users lost funds due to the bug.
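A nonce-reuse failure of that kind breaks ECDSA with plain modular arithmetic; no curve operations are needed for the recovery step. A minimal sketch (all concrete values below are made up for illustration; in real ECDSA, r would be the x-coordinate of k*G):

```python
# ECDSA nonce-reuse key recovery: two signatures (r, s1), (r, s2) over hashes
# z1, z2 with the same nonce k satisfy s_i = k^-1 * (z_i + r*d) mod n, so an
# attacker can solve k = (z1 - z2)/(s1 - s2) and then d = (s1*k - z1)/r mod n.
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 order

d = 0x1C0FFEE  # "victim" private key (illustrative)
k = 0xBADC0DE  # nonce reused across both signatures -- the bug
r = 0x5EED     # shared r value; any nonzero value demonstrates the algebra

z1, z2 = 0x1111, 0x2222  # message hashes (illustrative)

# Signing: s = k^-1 * (z + r*d) mod n
s1 = pow(k, -1, n) * (z1 + r * d) % n
s2 = pow(k, -1, n) * (z2 + r * d) % n

# Attacker, seeing both signatures share r, recovers the nonce, then the key:
k_rec = (z1 - z2) * pow(s1 - s2, -1, n) % n
d_rec = (s1 * k_rec - z1) * pow(r, -1, n) % n
assert (k_rec, d_rec) == (k, d)
```

The same algebra is what made the real-world bug exploitable: once any two signatures on the chain shared an r value, the corresponding private key was public knowledge.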

These open initiatives will lay an important foundation.

[1]https://code.google.com/p/end-to-end/


I don't think that JS or any other interpreted/JIT-compiled crypto code will ever be vetted by cryptographers. Simply the fact that you can't control the memory, CPU cache and instruction scheduling means that your code is vulnerable to at least side-channel exploits.
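One concrete class of side channel is timing: a naive comparison bails out at the first mismatching byte, so how long it runs leaks how much of a secret (say, a MAC tag) an attacker has guessed correctly. A sketch (function names are illustrative):

```python
import hmac

def naive_equals(a: bytes, b: bytes) -> bool:
    # Early exit on first mismatch: running time depends on the secret data.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def safe_equals(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest is written to take time independent of where
    # (or whether) the inputs differ.
    return hmac.compare_digest(a, b)
```

Even the "safe" version only holds up as far as the runtime lets it: in a JIT-compiled environment you can't pin down memory layout, cache behavior, or instruction scheduling, which is exactly the parent's objection.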


The Stanford Javascript Crypto Library was written / overseen by Dan Boneh who is a serious cryptographer by any definition.

http://bitwiseshiftleft.github.io/sjcl/


> We believe that SJCL provides the best security which is practically available in Javascript. (Unfortunately, this is not as great as in desktop applications because it is not feasible to completely protect against code injection, malicious servers and side-channel attacks.)


And? It's vetted by a cryptographer who noted the caveats that apply. Do you take 'vetted' to mean 'unreservedly recommend'?


I would, yes.

His disclaimer mentions three game-over problems.


Obligatory link to "Javascript Cryptography considered harmful" which neatly summarizes the pitfalls here: http://matasano.com/articles/javascript-cryptography/


Browser extensions are cryptographically signed and verified, while web application javascript is not.

The problem isn't with javascript, it is with delivering javascript in a web-based application (amongst other concerns).

Most of the other concerns about web-delivered JavaScript also don't apply to extension security. Example: a web application can't interfere with the execution of extension code, since extensions reside within their own context and cross-origin rules apply (there are special APIs, accessible only from the extension, to call into the page's JavaScript).

End-to-end from Google is a browser extension, and it is signed by the developers and then verified on install. It is more secure than a traditional desktop software installation.


Holy crap, people are actually using javascript crypto for bitcoin‽


They also pretty commonly use in-browser JS SHA256 to derive private keys from human-generated passphrases: https://brainwallet.github.io/ There have been reports of pretty obscure, but still guessable, keys being cracked this way.
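The weakness is structural: the private key is just one unsalted, unstretched SHA-256 of the passphrase, so an attacker can hash a wordlist offline and compare. A minimal sketch (passphrases and wordlist below are illustrative):

```python
import hashlib

def brainwallet_key(passphrase: bytes) -> int:
    # Brainwallet-style derivation: private key = SHA-256(passphrase).
    # No salt, no key stretching -- every guess costs one cheap hash.
    return int.from_bytes(hashlib.sha256(passphrase).digest(), "big")

victim_key = brainwallet_key(b"letmein")

# Dictionary attack: hash candidate phrases until one matches.
wordlist = [b"password", b"hunter2", b"letmein", b"trustno1"]
cracked = next((w for w in wordlist if brainwallet_key(w) == victim_key), None)
assert cracked == b"letmein"
```

In practice the attacker compares derived public addresses against funded ones on the blockchain rather than keys directly, but the economics are the same: human-chosen phrases fall to precomputed wordlists.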

Bitcoin is fascinating just because it makes these sorts of things worth untraceable money, sometimes a lot of money, and puts it in the hands of people who have never had that sort of responsibility. Whatever else cryptocurrency does, maybe it will teach laypeople about these things, spur new ways to teach them, and drive new technological measures to increase their safety.


An excellent use of an interrobang!


I wonder what the high level thinking at Google is these days around user privacy. They must be aware that the growing noise about it means it is going to be the stick they are going to be attacked with, and yet their business relies on people willingly giving some of it up.

While this particular announcement seems fluffy, I do somewhat applaud the intent. The question of how we expect to run the internet economy and preserve user privacy is becoming more pressing for everyone though.


I feel "willingly giving some [privacy] up" and "being incapable of even assessing one's level of privacy" are two very different conditions to be in.


> "The question of how we expect to run the internet economy and preserve user privacy is becoming more pressing for everyone though."

It should be run like the rest of the economy should be run: with consumers paying directly for the things they consume, rather than via an advertiser proxy at significant additional cost to us all[1]. Then there would be zero need to violate user privacy except where it is strictly required for functionality the end user wants, and users could vote with their wallets.

The invisible hand only works correctly with such a direct buyer-producer relationship.

This notion of "free" websites and web services is a lie and needs to be exposed[1].

[1] https://news.ycombinator.com/item?id=7485773


https://simplysecure.org/ is the project that Google is supporting, which seems to be an open consultancy that works with OSS crypto/security/privacy projects to make them easier to use.


On the topic of bringing better security to normal users, the project I'm most hopeful about is the FIDO alliance's U2F and UAF standards.[0] There's a new YubiKey that implements U2F, which is supposed to ship soon.[1]

I'm optimistic because these are open standards backed by many big organizations, and web browsers will be shipping with support built right in, no drivers, no plugins.

Tokens like that YubiKey work over USB HID (they look like keyboards to the OS, so they don't need dedicated drivers). They also work over NFC on mobile devices (OK, on Android devices; still waiting on iOS).

[0] https://fidoalliance.org/specifications [1] http://www.yubico.com/2014/09/yubikey-neo-u2f/


This sounds smart. And I'm pretty pragmatic. I don't think Google and Apple and Microsoft are really trying to ruin my privacy.

But something about this makes me uncomfortable. The fact that when this appeared on the front page of HN it was with posts from Google and Dropbox about how they support this.

Google, which Assange is releasing a book about, and Dropbox, which went a day where it didn't matter what password you entered and which has former Secretary Rice on its board.

I have a feeling this organization might help keep things like the recent celebrity iCloud break from happening, but as someone else said, real security is not easy. And false security is worse than no security.


A PR announcement of an initiative to form a coalition to investigate making it easier to use security tools that most people don't care about. How thoughtful of you, Google. This wouldn't be an attempt at improving the public's negative view of you with regard to privacy issues, now would it?

Funny how Google doesn't offer a messaging service that's secured to a physical device the way Apple does; combined, these projects would result in something like that.


You seem to imply that there is no real value in the initiative. I humbly suggest you ask the teams doing all the work (Guardian Project, Open Whisper Systems, F-Droid, etc.) how excited they are about this. I suspect it's not as intangible or fuzzy as you imagine. They're expecting real resources and real funding, which will result in real progress for their already-strapped projects.


I'm sure they're very happy to get funding. The fact that they're excited to get money for their projects does not mean their projects have value, or that most people will be affected by them.


I see the motivation behind it and appreciate the cause; however, it is not just about the tools. Yes, we need better tools, but that alone won't help. One-click solutions often promise a lot but only end up giving a false sense of security. I believe that a certain amount of understanding of the underlying, general technology will always be required, both for users and developers.

Encrypting your e-mail does not help you if your machine is compromised.

Your super-secure 24 character password is useless if you use it on every single website.

Hashing and salting your users' credentials is useless if your SSH password is 6 digits.

What I'm saying is: REALLY staying secure is not easy, and it never will be.


I'd like to see Hangouts eventually implement some of Signal/TextSecure's crypto.


I'd like Hangouts to do a lot of things. It looks like they made the right choice in moving away from XMPP compliance in favor of usability; it certainly improved the apps' popularity, but took a lot of steps back on security and openness. I'd at least like to see Google include Hangouts in their upcoming end-to-end extension (https://code.google.com/p/end-to-end/).


I'm excited to see how the Guardian Project will use the resources. God love em, but their projects seem to have less UX polish compared to those of Open Whisper Systems, likely due to Twitter's positive post-acquisition influence on the latter.


"We’re excited for a future where people won’t have to choose between ease and security, and where tools that allow people to secure their communications, content, and online activity are as easy as choosing to use them."

That future is now according to Apple.

In any case the real issue is protecting people from Google itself.


Apple's devotion to privacy ends at phone unlock; the easiest attack vector to social engineer or forensically determine. Today's announcement is not much more than a PR stunt to cover their ass and distract from their own repeated failings to secure their platform or networks.


"Apple's devotion to privacy ends at phone unlock."

If you had read either the privacy policy or security architecture, you would know that you have made a false statement here.

I think it's you who is trying to distract us from the fact that Google itself uses your private data for its own business purposes, and has a vested interest in not protecting you from itself.


I find it adorable how people buy into the rhetoric and trust that Apple doesn't leverage and track your data. They are a hardware and content company, and they absolutely leverage all the data they can in order to better market to you the content they think you might want. It's not just business, it's personal. Even with the lion's share of their money coming from hardware sales, so what? Without deep, targeted tracking and mining of user data for market research, their entire mobile platform would implode, along with the business models of their content partners and app makers.

Meanwhile, I neither show nor imply any support for Google on the subject. They are no better, perhaps worse.


So you simply choose to believe they are lying with no evidence at all.

Not much point in discussing anything with you, I guess.


... and we devolve into the same old debate over the definition of metadata


Nope. That's just you presenting more cynicism without a shred of evidence.


Hey Google, you can start by phasing out RC4 on YT.


Nitpicking:

They should use a SHA256 certificate :-)

And DANE and TLSA


Baffling that anyone could still be advocating for DANE after the NSA slides indicating that the USG has owned up CAs came out. Why exactly does anyone believe the DNS roots are harder to own up?


We merged the comments from https://news.ycombinator.com/item?id=8335470 into this thread, since the stories are the same.


The toolbox logo for Simply Secure is killing me. Whoever made the logo has never used any hand tools. The saw is part hacksaw, part panel saw.


It's a back saw. Sure it has a weird handle and low tpi, but every woodworker knows what a back saw is.


And how many woodworkers have a back saw with a hacksaw handle? I just went through a ton of clip-art toolboxes and can't find any where the designer decided to go with this extemporaneous saw design.


If you're nitpicking the handle design on some clip-art, then I think you should probably just step back a bit.


This thread has some great 'Shit HN says' lines. We are truly a tiny little bubble.



