Hacker News

As far as I can tell there was basically a wink-wink, nudge-nudge agreement between the various browser vendors/web standards authors (the same people) that timing attacks which revealed things like the user's browsing history, or which allowed tracking of users, weren't a problem and weren't sufficient to block a feature from being added to browsers. See the many issues in [1][2][3][4][5], which stretch back over six years (and aren't all fixed), for an idea of how long this has been going on (and a previous rant I wrote here: https://news.ycombinator.com/item?id=10022315 ).

The cache attacks never received as much publicity, for some reason (the papers being harder to read may be one). I do wonder what would have happened if this hadn't come out at the same time as Meltdown. It's quite possible it would have been brushed under the carpet yet again.

There were token 'fixes' for these things in the past (coarsening timers), but they never worked, and everybody involved knew they wouldn't. The introduction of features like SharedArrayBuffer revealed how (un)seriously they really took the problem. They knew it could be used to implement high-precision timers, but it got added to browsers anyway because it was central to the project of making the web an application platform.
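One well-known reason coarsening alone doesn't work is clock-edge interpolation: an attacker recovers sub-tick resolution by counting how much work fits between two visible ticks of the coarse clock. A generic sketch (not code from any of the linked papers; `Date.now()` stands in for any coarsened timer):

```javascript
// Clock-edge interpolation: even when a timer is coarsened, finer resolution
// can be recovered by counting how many operations fit between two visible
// ticks of the coarse clock. `now` is any coarse timer, e.g. a rounded
// performance.now() or Date.now().
function opsPerTick(now) {
  // Align to a tick edge first so the measurement starts at a known point.
  let edge = now();
  while (now() === edge) {} // spin until the clock ticks over
  edge = now();

  // Count operations until the next tick; this count is the sub-tick unit.
  let ops = 0;
  while (now() === edge) ops++;
  return ops;
}

// Knowing the per-tick budget, the point within a tick at which a measured
// operation completes leaks timing finer than the tick itself.
console.log('ops per tick of Date.now():', opsPerTick(Date.now));
```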

They perceive a need to allow high-precision timers (or features that can be used to implement them) because without them the web won't be able to do a lot of things that are possible in native applications.

I'd like to think that this is the moment browser vendors come to their senses and rethink what they are doing, but I doubt it. Google is a multi-billion-dollar company built on the web as a platform and on running untrusted JavaScript on other people's computers. Dropping the idea of the web as the platform to end all platforms would be an existential crisis for Mozilla. They are locked into this madness with no way to stop that wouldn't effectively be corporate suicide. Expect years of half-hearted 'fixes' which don't fix the problem.

[1] https://news.ycombinator.com/item?id=10455735

[2] https://www.nds.rub.de/media/nds/veroeffentlichungen/2014/07...

[3] https://contextis.com/resources/white-papers/pixel-perfect-t...

[4] https://www.mozilla.org/en-US/security/advisories/mfsa2013-5... and https://bugzilla.mozilla.org/show_bug.cgi?id=711043

[5] https://bugzilla.mozilla.org/show_bug.cgi?id=884270




> because it was central to the project of making the web an application platform

This is very insightful. A lot of web standards are attempts to add what were traditionally native-app capabilities to the browser (e.g. 2D graphics => Canvas).

What's very clear now is that native-like web capabilities imply native-like vulnerabilities, but delivered over the network.

Browser vendors rethinking what causes a user to launch complex JavaScript in tabs they visit (or worse, in invisible iframes) would be a great start. It was one thing when all JavaScript could do was style and manipulate the DOM. We can now compile vim to JavaScript, and that demands a completely different response.

This is not a call to stop progress on web standards, but if the web community takes this opportunity to level up its security practices, it'll help them (and web application developers and users) in the long run.


Wow, thank you for writing this. It's very, very eye-opening and insightful.



