
> you now have to either pay a bazillion dollars to some shadowy certificate cartel in order to produce a binary, or convince users to click through a scary-looking dialog which assures them your program is an evil virus that will steal their data, format their hard drive, and empty their bank account

Aren't we talking about a fee on the order of one developer hour per year? I share the annoyance that there isn't a better way to handle trustworthiness, but look at it from Microsoft's perspective: the odds are actually quite good that your users are seeing that scary dialog because someone really is trying to scam them into installing adware, trojans, etc., which will degrade their system and almost certainly lead to complaints about how slow/buggy Windows is.

I think the long-term answer is going to look a lot like Apple's sandboxing on OS X: app-level permissions, with the scariness of the dialogs proportionate to how much access an app requests outside of that sandbox. Until that's widely accepted, people are going to double down on the code-signing path, and that's going to lead to a lot of intentional slowness in the verification process. This sucks, but I don't see a better solution with the current security models.
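For anyone curious what the verification half of that looks like in practice, here's a minimal sketch using Apple's Security framework to check a bundle's code signature from Swift. The app path is just a placeholder for illustration, and this is only one strictness level; it isn't the exact check Gatekeeper runs:

    import Foundation
    import Security

    // Placeholder path; point this at any app bundle you want to inspect.
    let appURL = URL(fileURLWithPath: "/Applications/Example.app") as CFURL

    // Build a static-code object representing the on-disk bundle.
    var staticCode: SecStaticCode?
    guard SecStaticCodeCreateWithPath(appURL, [], &staticCode) == errSecSuccess,
          let code = staticCode else {
        fatalError("could not create code object for bundle")
    }

    // Validate the signature; checking all architectures covers fat binaries.
    let flags = SecCSFlags(rawValue: kSecCSCheckAllArchitectures)
    let status = SecStaticCodeCheckValidity(code, flags, nil)
    print(status == errSecSuccess ? "signature OK" : "invalid or unsigned: \(status)")

The point being: the mechanics of checking a signature are cheap; the expensive, slow part is the human/CA process that decides whose signatures get trusted in the first place.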



