
It's not an arbitrary dichotomy; it's a factual one. Mac OS doesn't resemble a centrally controlled 'display terminal' at all, whereas Chrome OS does.

You also didn't answer the part about trusted code.




Mac OS hasn't reached the point where one needs to jump through hoops to install arbitrary programs, but given that the beginnings of such a system are present in Lion, and that it's already the status quo on all of Apple's more recent devices (and one of their main selling points), one can only conclude that's the direction they will be taking consumer-oriented Mac OS in.

> You also didn't answer the part about trusted code.

Because it wasn't there when I replied.

Centrally signed code repositories on their own aren't the worst thing; I rely on them myself (apt-get). The problem arises when a device is fixed to one possible repository. If the proverbial "average user" cannot install and administer a friend-approved but Apple-unapproved app with nearly the same ease as an Apple-approved one, we end up with a situation where Apple directly controls what average users are capable of. It also causes users who wish to own their device to hope for new exploits to be publicly discovered, which is utterly backwards. I understand that progress on actual security (isolation, capabilities, proper deputies, etc.) takes significant work, but coarse-grained whitelists aren't the answer.


I agree with your first point, but I disagree that it makes Mac OS into a 'display terminal'. Although hoops are being introduced (for good reason), there is no evidence that Mac OS X will be closed in the way iOS is.

Your second point, as I've indicated, is just conjecture. I don't see good evidence for it actually happening; certainly nothing that Apple has publicly disclosed suggests that it will. It seems pretty unlikely to me.

But yes, if they did go that far, they would indeed control what average users were capable of.

If it ever gets to that point, I'd hope that by that time, there would be some obvious killer apps for another OS to demonstrate why it was a problem.

[edit: Coarse-grained whitelists might not be the answer, but I highly doubt that Apple is going to stop there. Every OS release is a step along the way. It's worth noting that iOS has generally developed in the direction of providing more capabilities to programmers over time, rather than fewer.]


I've actually thought about this a bit, and it's not a "killer app" that will be missing from locked-down systems, since the repository can always add anything that becomes popular elsewhere (after a little delay for porting/approval/etc).

The difference starts at the foundation, and manifests in a pervasive lack of respect for the user (whose ultimate control and understanding should be a prime usability concern).

For instance, that whole device-id brouhaha - iOS apps really get a unique device-id, which they are then supposed to partially obscure according to Apple's guidelines? Why in the world is an app allowed to directly query a fixed identifier in the first place?! There should be a specific ID API, which the user controls via a system dialog the same way a user controls how long a browser stores cookies. Sandboxing+auditing then make sure apps aren't using something like the Ethernet address to get around the user's choice.
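To make that concrete, here's a rough sketch of what such a user-controlled ID API could look like. This is illustrative Swift of my own; IdentifierPolicy, IdentifierBroker, and the bundle IDs are made-up names, not anything Apple actually ships:

    import Foundation

    // Hypothetical, user-controlled identifier API (not a real Apple API).
    // The user picks a policy per app in a system dialog, the same way a
    // browser lets you choose how long cookies are kept.
    enum IdentifierPolicy {
        case denied        // the app gets no stable identifier at all
        case perSession    // a fresh identifier on every request
        case persistent    // a stable identifier until the user resets it
    }

    final class IdentifierBroker {
        private var policies: [String: IdentifierPolicy] = [:]
        private var persistentIDs: [String: UUID] = [:]

        // Set from the (hypothetical) system settings UI, never by the app.
        func setPolicy(_ policy: IdentifierPolicy, forApp bundleID: String) {
            policies[bundleID] = policy
            if policy != .persistent {
                persistentIDs[bundleID] = nil   // changing policy rotates the ID
            }
        }

        // The only identifier an app can ever ask for; no hardware serials,
        // no MAC addresses, nothing the sandbox doesn't mediate.
        func identifier(forApp bundleID: String) -> UUID? {
            switch policies[bundleID] ?? .perSession {
            case .denied:
                return nil
            case .perSession:
                return UUID()
            case .persistent:
                if let id = persistentIDs[bundleID] { return id }
                let id = UUID()
                persistentIDs[bundleID] = id
                return id
            }
        }
    }

    // The user allows one app a persistent ID and denies another:
    let broker = IdentifierBroker()
    broker.setPolicy(.persistent, forApp: "com.example.reader")
    broker.setPolicy(.denied, forApp: "com.example.adnetwork")
    print(broker.identifier(forApp: "com.example.reader") as Any)    // stable UUID
    print(broker.identifier(forApp: "com.example.adnetwork") as Any) // nil

The point is that the policy lives with the owner, in a system dialog, and the app never sees anything more fixed than what the broker hands it.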

But unfortunately, most of the developers who actually know enough to analyze this are on the take of the ad companies, and think that their stake on the user's device is equivalent to, or even overrides, the owner's! So Apple kowtows to the advertisers and permits uncontrollable tracking, while end users are stuck with their only choice being to 'use or not use' an app based on how much they perceive it abusing them. Instead of being introduced to a world full of self-determination and limitless possibilities (as early computer adopters were), modern-day users are shown a standard no-free-lunch world where "they either get you coming or going". Developers are still able to seek out freedom, but the goal of empowering an end user to solve their own problems couldn't be farther from sight.

(And yes, Android has most of these same problems in addition to some of its own, which is why I said dichotomies aren't useful.)


> Instead of being introduced to a world full of self-determination and limitless possibilities (as early computer adopters were), modern-day users are shown a standard no-free-lunch world where "they either get you coming or going". Developers are still able to seek out freedom, but the goal of empowering an end user to solve their own problems couldn't be farther from sight.

I couldn't agree more with this. However, that dream seemed to die with the breakup of Alan Kay's original group. Nobody is even approaching this problem except perhaps Kay's own FONC group, and even that seems to be more academic than practical now.

That said, I think that as digital culture matures and more generations grow up with digital creation, programmability will become the primary constraint, and then we might see progress in this area. If Apple doesn't keep up (although I expect they will), this is the domain I expect the killer app to emerge from.

I'm not sure why you bring up the device-id thing; Apple corrected that issue without external pressure. Also, in the real world, I think that expecting end users to manage a second cookie-like entity, with subtly different semantics from cookies, is unrealistic.


It's easy to point to Alan Kay's research and then wistfully say that it will hopefully bear fruit one day, but that actually does a disservice to anybody working on today's systems that treat the user as a mature, self-actualizing owner - whether they be creating new software or making existing software more user-friendly.

A system that's built on a philosophy of eliminating capabilities can never progress into a system that allows a user to gradually learn more and empower themselves, as there's nothing "further down" that unifies the whole thing. Software that starts off requiring significant effort to administer can progress into having a user-friendly interface and be incorporated into systems with sensible defaults.

One shouldn't require a user to have to configure everything out of the gate (say, cookie policy), but one shouldn't prevent them from doing things they know they want. Wasn't the Apple device-id thing "fixed" by only allowing tracking on a per-app basis? With cookies, I can have them deleted every time I close the page.


> It's easy to point to Alan Kay's research and then wistfully say that it will hopefully bear fruit one day, but that actually does a disservice to anybody working on today's systems that treat the user as a mature, self-actualizing owner - whether they be creating new software or making existing software more user-friendly.

Setting the straw men aside, which systems did you have in mind?


Every mainstream operating system has unique IDs readily available to applications - it's just that native applications don't traditionally include ads.
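For a sense of just how readily: on most desktop Linux systems, for example, any unprivileged process can read a stable machine identifier in a couple of lines (macOS similarly exposes a hardware UUID through IOKit/ioreg). A quick Swift illustration, assuming a Linux box where systemd's /etc/machine-id exists and is world-readable:

    import Foundation

    // /etc/machine-id is a stable, world-readable identifier on most
    // systemd-based Linux distributions; any process can read it.
    if let id = try? String(contentsOfFile: "/etc/machine-id", encoding: .utf8) {
        print(id.trimmingCharacters(in: .whitespacesAndNewlines))
    }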



