Apps? No root? Your device serves others, warns Berners-Lee (zdnet.com)
192 points by lispython on Feb 1, 2013 | 131 comments



Cory Doctorow explains more fully:

"When we turn a computer into an appliance. We're not making a computer that runs only the "appliance" app; we're making a computer that can run every program, but which uses some combination of rootkits, spyware, and code-signing to prevent the user from knowing which processes are running, from installing her own software, and from terminating processes that she doesn't want. In other words, an appliance is not a stripped-down computer -- it is a fully functional computer with spyware on it out of the box." [The Coming War on General Computation]

Video: http://www.youtube.com/watch?v=HUEvRyemKSg

Transcript: https://github.com/jwise/28c3-doctorow/blob/master/transcrip...


More like Cory Doctorow grinds his axe. Most people don't want to know how it works or hack on it; they just want it to operate reliably for the purposes for which they purchased it. It's not a war on general-purpose computation, it's an unprecedented variety of embedded computers.


"it's an unprecedented variety of embedded computers."

... at the expense (decreased availability and public tolerance) of the general ones. Possessing or promoting general purpose computing might well wind up on the FBI's list of suspicious behaviors, alongside such threatening habits as paying cash for gum; "normal" people are still basically terrified of magic/the unknown. Hackers are the new witches. A plethora of embedded devices, where one general purpose computer could have done the same, would be tragic.

Of course, all of this is just my humble opinion.


Oh, I remember people worrying about the same thing 20 years ago and general purpose computing is cheaper and more accessible than ever. Embedded devices exist because they serve a purpose well. I like a good Swiss army knife, but I also have a bag of specialized tools.

As for hackers being 'the new witches,' people aren't exactly scared of computers. If anything, they take the complexity of the digital age with too much complacency.


I think the complacency you describe is only one side of the coin. People are happy as long as knowledge doesn't empower other people more than it does them, but as soon as you see and say that the emperor has no clothes (especially if they're the emperor), the lawyers and the Luddites descend like the four horsemen of the infopocalypse.

Admittedly, I don't think general purpose computing will in any sense "disappear," but I do think that we are already in a technological depression ("where's my flying car already?") and the witches metaphor is particularly apt. Until "normal" people catch up on the details of general purpose computers ("what's a filesystem?") and feel empowered by them (confident enough to secure their own systems, and write software to automate their lives), we will continue to stagnate, and embedded devices will thrive.

The proliferation of embedded devices is a barometer of the public's aversion to generality, pure and simple... I would even expect a new model for general purpose computing to result from the current Cambrian explosion eventually, and be a net win.

But I think it's important to realize we spend less time inventing today than 5 or 10 or 20 years ago, and more time dealing with the political and social response to the full implications of the current breadth of computing, if only because it is still entering the mainstream. Anyone interested in pushing the limits today has to focus as much, if not more, on addressing the distaste for change as on the underlying technology (at least if they want to put their name on it). Designing and coding Bitcoin is the easy part...


Look at how the Chumby is now useless, as an example.

The original iPhone is still useful if you are a dev, but not so useful for regular people, as many apps can't run on it any more.


Most people also don't want to pay attention to politics and, when possible, refrain from participating in elections. How well is that working?


That seems a dubious analogy at best.


Reminds me of Steve Jobs saying in 2008 that something is wrong if a phone needs a task manager.


The web is the least free of all platforms from the user's point of view,* so I find it bizarre that Berners-Lee would use the 'freedom for users' angle to try and convince people to switch to web technologies. I fully support the idea that we should have root on devices we own; it's just a shame he had to dilute that point by pushing his own platform.

* nearly always closed source; in the rare cases where you have the source, you can't modify the code that actually runs when you use the service; usually no control or even ownership of your data; it runs on a machine you don't control, which can make your data subject to mining by foreign governments or advertisers


>> the web is the least free of all platforms from the user's point of view

You're confusing "the web" with "networked software".

When you use the web, you have no control of the server but full control of your client (if your browser is open source). When you use a networked app, you have no control of the server AND no control of the client.

"Do I want control of the software I run on my machine?" is a separate question from "do I want that software communicating with other computers that I don't control?"


There is no confusion here: I want control of my data.

Currently the number of web apps (by which I mean something with an html/css/js interface that runs in a web browser) that I can run locally with my own data that I store locally (or not, my choice) is tiny. The number of native apps that give me full control of my data is huge, because native platforms make that easy to do. Local file access APIs are relatively new in the browser.

You are correct that there are plenty of tasks that are inherently about communicating with another computer, but that is not what we are talking about here. Writing a document needn't involve communicating with another computer. Editing an image needn't involve communicating with another computer. Maintaining my todo list, calculating how much time to bill a client; these needn't involve communication with another computer that I don't control.


"I want control of my data."

One might even argue the idea of individual control is somewhat at odds with the logical structure of FOSS, particularly in its GNUist form - i.e. for a Lisper (and Stallman was/is one), there's no semantic distinction between code and data. Your position requires distinguishing bits from bits.

Solutions which distinguish bits from bits are political, not computational.


> there's no semantic distinction between code and data

I struggle to understand how this can be true, and I say that as a Lisper (spiritually if not currently), because 'this data does something meaningful when interpreted by a computer' is exactly a semantic distinction from 'this other data doesn't'.


What is the semantic difference between data which does nothing meaningful and lisp code that does nothing meaningful?

Or to be Wittgensteinian about it, point to the part of a datum which is the meaning.


I quite like my code separated from my data.

Being able to open up a word processing document without it being able to execute arbitrary code on my computer is one of the many benefits.

Oh, wait, Microsoft Word macros...


Bits submitted by me to the server or derived from or mixed with bits submitted by me are my data. Bits that are created by the web developer that would be there if I didn't even exist are his app. The division isn't political; it is one of origin.


This entire question is political, not computational.


I thought the original article was about locked platforms. I don't have access to any of my data on my iPhone except photos and video. Everything else is sealed up in that system with no way to get at it.

How is that better?

If I had root I could get at that data.

That's not taking away from your point that Web apps may have similar problems, but there are at least three distinct platforms:

* Open Computers with native apps (Windows, Mac, Linux)

* Closed computers with native apps (iOS, Android, XBox, PS3, ...)

* The Web

The article is ranting about closed computers with native apps.


I think we're actually in agreement. Let me restate my argument, and, I think, Berners-Lee's:

"I don't always send data over the network, but when I do, I use an open-source browser that uses open protocols."


I understand your point. It struggles a bit due to the nature of JavaScript and the user's direct access to source code.

But "freedom" is a cluster concept [e.g. "game" - see Wittgenstein], and Berner's Lee is pointing to another aspect of what we mean by "free". Again JavaScript is a useful example - the user has the ability to turn it off in their browser if they deem that to be in their interest.

Or rather, the user has traditionally had that ability. Smartphones and other appliances increasingly are curtailing or eliminating the user's freedom to do so. And a user might want to do so based on the concerns about data you raise.

The FOSS idea of freedom is important. But the problem it solves is of another era, when those accessing computers rather than terminals tended to be programmers. Stallman's concern arose at the MIT AI Lab, not among owners of Atari 2600s or Commodore VIC-20s.

Berners-Lee is addressing a world where our game consoles and computers are collecting and transmitting detailed data about what we are doing, and we are not given permission to turn the collection off.


"Stallman's concern arose at MIT AI lab, not among owners of Atari 2600's or Commodore Vic 20's."

While I do not deny the unsavory side of Stallman exists, I am intrigued at how smoothly we've moved from his detractors mocking his "The Right to Read" [1], to his detractors ignoring it, to us standing on the precipice of living in it, even as the mockery continues unabated and even the HN zeitgeist seems to be that he's some sort of whacko who should be ignored... some sort of whacko who, I might add, appears to have been a great deal more correct about the future than the people labeling him such.

[1]: http://www.gnu.org/philosophy/right-to-read.html


I respect and admire Stallman personally. Likewise, I respect and admire his contribution to the world. The problem he set out to tackle was and is very real.

I question whether it remains a primary issue in regard to the future impact of computing on the world. In no small part this is because Stallman's principles have achieved so much. The threats to GNU and Linux do not appear to be existential.

I am a bit troubled by the times at which the larger FOSS community focuses on ideology over problem solving, and by its tendency toward a tribalism which casts so many as unthinking malevolent "others."

Stallman's principles are valuable analytical tools but they are not a hammer which also saws wood and turns screws. Some problems are better solved by trying wrenches first.


> I respect and admire Stallman personally. Likewise, I respect and admire his contribution to the world. The problem he set out to tackle was and is very real.

> I question whether it remains a primary issue in regard to the future impact of computing on the world.

There are more threats to freedom now than ever. Look at UEFI on new laptops, graphics cards or wifi hardware that only work with closed-source drivers, and the whole world of smartphones that do not run without closed-source binary blobs.

A proprietary BIOS has always been a bone of contention, but with UEFI Secure Boot you can't boot your own OS at all unless it is blessed by Microsoft.

In the "olden days" things moved a lot slower, and some companies - like Intel - even released drivers under the GPL. Now when a new phone comes out, its SoC is 99% powered by closed source binary blobs. The Nexus 4 is a disaster - look at this list of proprietary blobs https://android.googlesource.com/device/lge/mako/+/77f8e6b51... - how can you say there's no issue here? Almost all the interesting stuff this phone can do is powered by non-free software.

In addition most phones ship with proprietary bootloaders that don't let you change the operating system. I'd say that freedom-wise, today is much worse than the situation with computers from 5-6 years ago, especially if you expand computers to include phones and tablets.


'a tribalism which casts so many as unthinking malevolent "others."'

I'm actually not worried about people cackling about how they can finally enslave the masses. Generally they do such a bad job when that is a primary goal that they end up failing. (Remember DIVX, the time-limited DVDs, for instance?) I'm worried about people incrementally making a small choice here and a small choice there, and before you know it, we've still pretty much ended up in the same place. Yes, FOSS as a community has done a lot of stuff, but it worries me to see so many people declaring the job done and that we don't have to worry any more. Using the FOSS community as the freedom backstop doesn't work if the FOSS community no longer thinks it has to worry about freedom because of past successes.


I don't think the job is done. I don't think it can be - FOSS is a tool which for some problems can remain in the box without lessening the benefits which come when a particular problem is solved.

The problem of appliances locked down with spyware is an example. I believe it is a problem whose first order implications are more serious than those which FOSS (and even GNUism) are intended to address. The implications include threats to physical safety and political liberty.

I understand that there are different opinions. I acknowledge that it might be the case that the best possible solution adheres to GNUist principles. I just think that an unwavering GNUist approach is likely to be an impediment to achieving a good and practical solution.

I use "GNUist" because FOSS doesn't prevent lockdown - and "Stallmanist" sounds too much like "Stalinist". And like "pragmaticism" is too ugly to be co-opted for other purposes.

My Symbian phone would load open source code written in Java, but only after it was signed. I was granted the means of self-signing, but only by Nokia's grace. And that turtle goes all the way down. The OS could have been FOSS and the device still constructed so as to require code signing for the OS.


I think you're right that the huge inroads free software has made contributes to a declining feeling of urgency in the community at large. I'm a huge fan of his but admit to the same general feeling most of the time.

The depth of his impact and contributions is really only obvious with the benefit of hindsight, though. Even among relative friends, it would be hard to find a time in that evolution that people weren't questioning similar issues of net future benefit, practicality and the reliance on strict dogma at the expense of practical compromise.

Based on that I'm inclined to believe his vision is as vital as ever and that the traits that at times make him and the FSF hard to work or deal with could very easily be the same ones that lead to long term benefits.


The GNUist vision was developed before the web. It doesn't really address the situation where computers are networked.

If you send a message to my computer, and my computer sends a response (or no response), how does that relationship justify your access to the source code of my computer, even under the most liberal interpretation of GNUist principles?

Assuming of course that software on my computer is not GNUist. And even if it were GNUist, should I be required to notify you of that with each transaction to make you aware that you may modify the code with which you are interacting? I'll skip the logical extension of GNUist thought to the point where you can modify GNUist code on my computer.

GNU was designed around single computer, multiple users. But today's world is single user, multiple computers. The pressing problems of that world are different than those from the days of dumb terminals even if those problems are a result of the success of the dumb terminal era solutions.


The web is more free than a locked smartphone -- at least the web has open competition with no gatekeeper.


Tell that to the people in Iran.


That's quite true. The consumer currently faces three choices:

(1) Dangerous insecure apps-can-trash-my-machine desktop OSes with complicated or confusing install / uninstall semantics and "OS rot" due to installs/uninstalls actually modifying things outside the app.

(2) Jailed devices, where you control very little outside the app world and someone else has root.

(3) The cloud, where you don't control anything.


Tim has never really been a free software guy (not against it, just not his issue), and here he was at linux.conf.au, where pretty much the whole audience will be FOSS diehards and probably at his talk just to check out that guy who started it all. At the end of the day almost anyone will try to present some common ground there, even if it's not super strong.

I don't disagree with your assessment of the general concerns about web apps, but from a practical perspective it's a lot easier to provide FOSS web apps that temper some of these concerns (self hosting, etc.) while still operating within a well tested security framework. To end up in the same position with most current native mobile apps usually requires a device configured for developers with all the security relaxations, difficulties with redistribution, and often program fees.

Neither one is very awesome, but it's not like many iOS and Android apps don't wall away your data while mining it on remote servers.


I don't think that really holds - how hard is it to actually run your own server these days? Hosting is dirt cheap. While I'll grant that the learning curve is a bit much, and the software is not quite there yet (and even then, I would say that the software exists, it just hasn't been integrated properly), that doesn't even compare to the situation with phones, where it is flat out illegal for you to do whatever you want with them, and you definitely don't have source code or control over your data. That is to say, all the exact same complaints you have against web services, and more, apply to smart phones.


"Hosting is dirt cheap." When the hardware is shared, Yes. On the metal, perhaps not so much.

"the learning curve is a bit much, and the software is not quite there yet," answers the question about how hard it is to run a server.


Thanks for quoting me out of context by cutting off the really relevant part; full quote:

While I'll grant that the learning curve is a bit much, and the software is not quite there yet (and even then, I would say that the software exists, it just hasn't been integrated properly)

Setting up your own web server isn't much harder (I'd actually argue it's easier) than rooting or jailbreaking a phone. On top of that, it's legal. And once you've rooted/jailbroken your phone, how many services will you continue to use that are hosted on someone else's hardware, using closed source software? The software to host yourself is available, it's just not well integrated yet. Also, the lock-in to things like FB is a difficult, but not insurmountable, problem.


"yet" summarizes the parenthetical remark.

I apologize if I have taken you out of context.


I don't know if it really matters to the average user to have root on their system; for technical people it certainly implies "true ownership" of the whole stack on the device. What I find bizarre and a bit unsettling is the appification of general purpose computing devices. When it was mobile phones, locking the user out (almost) made sense - the device was being financed by the carrier via subsidy, so they should have a bit of control. Then tablets came along and provided an identical interface in terms of restricting user access, but the user pays the entire cost up front[1]. Now Windows 8 (more so RT, but even the x64 version) and Mountain Lion are moving to the walled-garden, segregated, impotent app approach. The best example I can give of why this is terrible is Xcode:

I don't really develop for Mac, but there was a Linux tool which also shipped a 32-bit Darwin binary. I thought "I'd like to use that on my 64-bit Mac, surely I can just compile it with Xcode". Downloaded Xcode from the App Store (after giving my mother's maiden name, my blood type, etc.), and it doesn't appear on the PATH, because it can only write to /Applications/Xcode. Jesus [2].

My point is, somehow subsidized phones turned into locked-down tablets, turned into a compiler that can't install itself to /usr/local/bin. That seems daft to me.

1. ASUS let me root my tablet for free, no strings. So it's opt in, but not terribly evil.

2. If anyone has a good Mac Homebrew tutorial: I was wondering if I could compile with a different toolchain than Xcode? I'm really not up on this stuff; I need to learn.


"I don't know if it really matters to the average user to have root on their system"

I don't think what matters to the average user should matter to any thinking person. You're right, freedom doesn't matter to lots of people in the western world or, for that matter, in China either. Lots of people choose to be powerless or have no desire to be. Lots of people simply fail to see the long-term implications of paying more and more for less and less freedom. If control is one definition of ownership and possession is nine tenths of the law, by giving up even the expectation of control, we are all abdicating ownership for good. "Good luck with all of that." Berners-Lee is right (and Stallman, etc...). Deliberately and only buying freedom is the only sensible investment; anyone and anything that tries to remove individual (users') freedoms should be dismissed as absurd, IMHO.


The thing is, freedom of speech, movement, etc. are much more obvious to non-technical people than freedom to run arbitrary software. I think it's sort of offensive to imply that because someone isn't aware of the implications of the app store, they aren't a "thinking person". There is no risk of being rounded up and sent to a labour camp for re-education because you jailbroke your iPhone. I'm not really sure where you're going with the possession thing; that's pretty much just a hand-wavey aphorism. Obviously possessing stolen property is not 90% legal.

I agree with you generally, that we as technical people and experts need to be aware of, and create awareness of, these issues. I think you're falling into the same trap as RMS, which is to get absurd about the abstract notion of freedom, without making change attainable or actionable for the average user. If you want to have a massive shift in user attitudes towards the App store, you're not going to achieve that by comparing Apple to China. Nor can you simply 'dismiss them as absurd', because you're alienating your audience: they think Apple is reasonable.

What I liked about Sir Tim here was that he makes the argument pretty reasonably: it sucks that apps aren't cross-platform, that you have to depend on a proprietary interface to save them, share them with other users, etc. He makes it relatable in the present, he doesn't just fall back on the dystopian slippery-slope argument.


"There is no risk of being rounded up and sent to a labour camp for re-education because you jailbroke your iPhone."

:|

ISP/YouTube (etc) copyright school, anti-circumvention laws, three/six strikes, the CFAA, etc... Prison often involves "public service"/forced labour. To people agreeing with those articulations it even seems "just": surely criminals deserve what they get (people even seem OK with rape, as long as it happens to a murderer). To people who don't agree with the current imaginary property laws, there's no practical difference between the threat of prison and the threat of a prison "camp." Suicide might even feel like a viable alternative. I know memory is short, but that was basically the front page for about a week, and only that long ago.

"I think it's sort of offensive to imply that because someone isn't aware of the implications of the app store, they aren't a 'thinking person'."

That wasn't my meaning, nor was I trying to be offensive, only blunt. Nonetheless, an uninformed person who bases his opinion on an average of the public's opinion, going no further, is abdicating thought, and can be accurately called "unthinking." (Obligatory Socrates reference... "I appear to be wiser than he, because I do not fancy I know what I do not know.")

"I'm not really sure where you're going with the possession thing[...]"

Simply but strongly: that it's not worth spending money on something you don't ultimately control (be it stocks, socks, tablets, or smartphones; don't buy shares in dictatorships either, even if they are going up: you're only betting that they will win). Everyone has that choice, and it is as important as ever. Save your money for freedom.

Wikipedia: "Possession is nine-tenths of the law is an expression meaning that ownership is easier to maintain if one has possession of something, and much more difficult to enforce if one does not." ( https://en.wikipedia.org/wiki/Possession_is_nine-tenths_of_t... )

Now is the most effective time to hang onto that presumption of control, since it's harder to get back when it's gone.

"he doesn't just fall back on the dystopian slippery-slope argument."

It's an oldie, but a goodie. While I wish it weren't applicable, I do foresee a dystopian future, at least compared to the present, where we have sold the "right-of-way" to the devices and data that augment and define us for brushed metal and an Apple logo (not to single out any one company). At least give me a bit of credit though: I didn't once mention Hitler. :)

Edit: I'm not trying to be combative. As you said, we mostly agree.


Open the Xcode preferences and install the command line tools from the 'Components' part of the 'Downloads' section, although it does say this:

Before installing, note that from within Terminal you can use the XCRUN tool to launch compilers and other tools embedded within the Xcode application. Use the XCODE-SELECT tool to define which version of Xcode is active. Type "man xcrun" from within Terminal to find out more.

Downloading this package will install copies of the core command line tools and system headers into system folders, including the LLVM compiler, linker, and build tools.


Yes, if you use the Terminal and cd to /Applications/Xcode.app/Contents/Developer, you'll see

  drwxr-xr-x   3 root  wheel  102 Jan 25 14:52 Documentation
  drwxr-xr-x   7 root  wheel  238 Jan 29 13:33 Library
  drwxr-xr-x   7 root  wheel  238 Jan 29 13:11 Makefiles
  drwxr-xr-x   5 root  wheel  170 Jan 25 14:53 Platforms
  drwxr-xr-x   3 root  wheel  102 Jan 25 14:54 Toolchains
  drwxr-xr-x  22 root  wheel  748 Jan 29 13:12 Tools
  drwxr-xr-x   7 root  wheel  238 Jan 25 14:54 usr
Inside ./usr/bin are gcc, git, etc.

Basically, the developer tools used to live by default in /Developer at the root of the filesystem; now the Developer directory is inside Xcode.


And if you install the command line tools they live in /usr/bin.


Thanks, that was a real RTFM moment.


>I don't know if it really matters to the average user to have root on their system; for technical people it certainly implies "true ownership" of the whole stack on the device.

The average user will find it very convenient when their technical friends can fix, upgrade or unlock features in their device. So even if the average user never uses it, they will benefit greatly by having root access to their device.

Remember back in the day when your friends would have you fix their computer? Has that ever happened with phones/tablets? Why not?


Maybe because the same user who broke things on the computer is locked out on the phone/tablet, so they can't break things. If it's broken it'll either be a hardware issue or a firmware/OS/app problem that has to be solved by the maker/developer.

In my experience supporting users of PCs, Macs, and Android and Apple phones/tablets, very few users have any desire for root access or any idea what to do with it when they have it. Their device is going to belong to others no matter what; maybe it's better for it to be Apple or Google or a somewhat vetted developer in a walled garden than the owner of a malicious bot-net.


The moment an OS maker requires signed apps (on desktop) is the moment I jump ship. They know it's the death knell for desktop OSes though. I'd be really surprised if it happens.

I'm not sure what you mean about Xcode though. It's the same toolset (minus private headers) that Apple uses to build their own OS X and iOS apps. There's one extra step to put all of the command line tools on the path for you (install the Command Line Tools package). Then just get Homebrew and you're all set.

Open source components of Xcode (parsers, compilers, linkers, command-line tools) here: http://www.opensource.apple.com/release/developer-tools-45/

Homebrew: http://mxcl.github.com/homebrew/


The moment an OS maker requires signed apps (on desktop) is the moment I jump ship.

Why the parenthetical? What's different about a computer you can hold in your hand that makes it justifiable for the vendor to lock it down and advocate criminal penalties for attempting to run software of your choice?

They know it's the death knell for desktop OSes though.

"They" have already decided that desktop OSes for the general public are deprecated. Steve Jobs years ago said that PCs are "trucks" that most consumers shouldn't use, and Microsoft has no hesitation about sacrificing Windows 8 usability on desktops in favor of tablets. Even Canonical seems to be headed that way with Unity.


I'm a bit scared of this myself. I forward the syslog messages on my Mac to another host (a Linux system) and I've found that /usr/libexec/taskgated logs (at level 'debug') "no system signature for unsigned /Users/spc/source/sysloginter/syslogintr[181]" (a program I wrote). It also logs that message for some other programs I've compiled (like Synergy [1]) and others I've downloaded (like Firefox).

In fact, I've seen that message now for the past few years. I'm still awaiting the day when I can no longer run my own software. I hope it doesn't come to pass.

[1] http://synergy-foss.org/


edit: anon1385 gives the same answer as I did, but wrote faster :-)

I agree with you on the locking down issue. I don't like the ways things are going. :-(

For Mac command line compiling, you need to install the Xcode command line tools package.

In Xcode 4.5: see Menu->'Xcode'->Preferences->Downloads pane->Components tab. Then click the install button on the "Command Line tools" package

This should give you a working compiler and tool chain (llvm or llvm-gcc)

Hopefully, anyway.

I think the package is actually available separately from the Apple developer tools site, although I don't have a link for it. Links on the ADC site tend to change so often, it's hardly worth quoting them.
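
Once the Command Line Tools package is installed, a quick sanity check of the toolchain is to build a trivial C file (just a sketch; both invocations are standard clang usage):

  /* hello.c: after installing the Command Line Tools, either of
     these should work from Terminal:
         clang hello.c -o hello && ./hello
         xcrun clang hello.c -o hello
  */
  #include <stdio.h>

  int main(void)
  {
      printf("toolchain works\n");
      return 0;
  }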


This comment will probably be unpopular, but I think that root access is overrated. Not that it isn't useful in the appropriate hands, just that people make it out to be more than it really is. Like it is some magic wand that grants wishes.

The fact is that there will always be gatekeepers, whether it be the OS maker, the firmware maker, or the chip makers -- root be damned. You'll never have true ownership of your device unless you created, understand, and own those three parts. Maybe I'm stating the obvious, but I think that phone OEMs and OS makers would do better to focus on making more functions available that don't require root access. You shouldn't need root to install a tethering app or be able to uninstall some shitware that comes pre-installed by the phone carrier.


>Now Windows 8 (more so RT, but even the x64 version) and Mountain Lion are moving to the walled-garden, segregated, impotent app approach

What can't you do in ML that you could in OS X 10.1?

If and only if app signing and approval by Apple becomes mandatory to run apps, a la iOS, then you'll have a point.

As for now, all the new signing etc. are additions on top of the regular infrastructure, and the reason they were added was security.

Most security nerds find app signing and sandboxing a good idea.


> Most security nerds find app signing and sandboxing a good idea.

Those same security nerds have been doing both on FLOSS operating systems (BSD, GNU/Linux) for years, so that point is irrelevant to the original discussion.


Which is not the same at all as having thousands of proprietary third-party apps distributed in closed source form (as opposed to having control of some open source repo with redistribution and re-compile/modify rights, etc.), right?

So your point is even more irrelevant to the original discussion.


> Which is not the same at all with having thousands of proprietary third-party apps to distribute in closed source form

What are you talking about? Nothing about app signing and sandboxing requires open-source repositories.


When I clicked on that link, my "device" contacted Facebook, LinkedIn, Twitter, Omniture, and CBS Interactive. These sites attempt to compile information about me, track me, and make that information available to others - I'm assuming. Of course, I don't know because they don't tell me.

By using the web, my device is serving others (rather, serving me to others) on a massive scale, every day.


At least in such a case you have some control over that tracking, by installing Ghostery or whatever.

Without root you cannot count on being able to do that.


It's a terrible trend, but I don't see it as some kind of conspiracy as some people seem to. It's a legitimate market trend being driven by several factors:

(1) Operating systems have an outmoded security model. Most focus on multi-user security, which still has its uses, but they fail to focus on application isolation. Applications should be installable by anyone and isolated completely from other apps unless specifically granted permission by a user/administrator. The entire malware problem can be laid at the feet of this.

(2) The (related) poor state of installability. Mac has this problem the least with drag-and-drop .app installation, though sometimes even that can be confusing (and .dmg packages are weird beasts... why?). Linux has .rpm or .deb, which applies a massive and complex band-aid to the otherwise awful state of installability on that platform. Windows is absolutely horrible... it's like Linux where you have "installers" that have to do package management instead of a formal package system.

The open source world -- and even commercial vendors that want to keep the PC alive -- have to either address this problem or accept the dominion of the locked-down vendor-controlled consumer compute device. This will require abandoning the old fashioned Unix design philosophy (and the similar way Windows works) and thinking seriously about the problems of installability and isolation. It would be worth taking cues from iOS and Android here, though there's also a lot of room for new ideas.

Oh, and I forgot to mention. If we don't address these problems, all app vendors will pay a ~30% per-sale tax to Apple, Google, and Microsoft in exchange for the valuable service of a platform that provides installability and application isolation. And you know what? The market will pay it, because for most users those things are that valuable.

Point, click, install, with no fear of damaging my system. If I don't like it I click and uninstall. Anything else is completely broken.

It's interesting to note that the prevalence of virtualization is also a sign of the failure of operating systems. OSes in a box (whether via complex container overlays like OpenVZ/Virtuozzo or hypervisors) are an ugly hack to fix the fact that the OS security model is broken even for multi-user operation. The fact that everything requires root to install is the deepest issue, along with the lack of permission structures for things like network interfaces. It should be possible to run an OS and sell accounts to the general public, not VMs, and people should be able to run whatever they want from their local account and this should be safe. The fact that this isn't viable is because OSes are broken, thus we have the huge overhead of virtualization as an ugly band-aid to fix it.


> Most focus on multi-user security, which still has its uses, but they fail to focus on application isolation. Applications should be installable by anyone and isolated completely from other apps unless specifically granted permission by a user/administrator.

Incidentally, this is exactly what Android does. They provide the isolation by creating a new user account for each app, and by default apps get no permissions. The permissions you accept at install time then allow extra access. (Behind the scenes there are also mechanisms to provide security permissions across processes, so that a second process doing work on behalf of your process is limited to your process's permissions.) Multiple problems have happened, though.
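
A minimal sketch of that mechanism in plain C (the uid value and path here are hypothetical; Android assigns each installed app its own uid starting at 10000, and an app's private directory is mode 0700, so the isolation is ordinary Unix permission checking):

  #include <stdio.h>
  #include <string.h>
  #include <errno.h>
  #include <fcntl.h>
  #include <unistd.h>

  int main(void)
  {
      /* Drop to the uid assigned to "app A" at install time
         (hypothetical value; this needs to start as root, as the
         zygote does when it forks app processes). */
      if (setuid(10001) != 0) {
          perror("setuid");
          return 1;
      }

      /* "App B"'s private files are owned by B's uid with mode 0700,
         so the kernel refuses this open with EACCES. */
      int fd = open("/data/data/com.example.appb/files/secret", O_RDONLY);
      if (fd < 0)
          printf("denied: %s\n", strerror(errno));
      else
          close(fd);
      return 0;
  }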

The initial default permissions in Android 1.0 were a little too permissive and for example allowed reading from the sdcard. While that seemed harmless at the time, it was in retrospect a bad choice.

There is no hard validity checker. For example, if an application ignores the API call to find the location of the sdcard and instead just uses /sdcard, then no one would know. This means that it is very easy for apps that don't comply with best practises to be out there, with the developers not even knowing it, and the platform has problems if it wants to fix the issues due to the large installed base. (If Angry Birds stopped working because they hard coded things instead of using APIs, would Android or Rovio get blamed?)

Android also made a decision that you either accept an app and its permissions, or not install it. There are good arguments both ways of only allowing them as a whole or being able to disable some permissions. But the net effect is that many apps have more permissions than they do use, and have to declare them if there is any chance they could be used. Users can choose to not install apps with too many permissions, but you can't leave a review explaining that. I see a lot more pushback over permissions in app store reviews these days, but the many years of prior practise mean far too many apps have too many permissions. (BlackBerry had this scheme where for each permission you could set it to grant, deny or ask on each use. For the tech crowd it is great, but for someone who just wants to play a game it is annoying and probably not helpful.)

The permissions themselves are not sufficiently granular - for example "Internet" is on or off. An email app contacting my email server is great, but contacting a command and control server in another country is a huge problem. Of course some sort of per connection prompting could happen, but isn't really practical for the vast majority of users.

Commercial issues have been a problem too. For example, the Google Play Store is not available in China (and many other countries). Consequently users have to use a different store, and some have different incentives, as app stores are not lucrative.

Finally we have the "fragmentation" issue. Google did not implement Android in such a way that the majority of the core could be upgraded by Google. Consequently you end up with full stack releases which depend on manufacturers and carriers deploying after they have already sold a device - something they have no incentive to do. This would need to be solved by aligning incentives, and is hard.

TLDR: the hard stuff is hard. Application isolation is already done by all the mobile operating systems. It is the nitty gritty details beyond that that need solving.


Interesting. Wasn't familiar with the deep details, though I knew Android did something like this.

I don't like the user-account-per-app paradigm. It seems like an ugly abuse of multi-user access control to compensate for the missing app isolation piece.

I agree that it's a hard problem.


If we ignore the word "user" it feels (and is) a lot less like abuse and actually fits very well. Linux has a proven and secure method of isolating access, and on Linux servers and Linux on the desktop this is used by different users to protect them from each other (intentional or accidental). In Android it's more important to isolate access between apps, again protecting them from each other, intentional or accidental. Even for well intentioned apps, like Chrome or Flash, that kind of secure isolation is extremely desirable, and I don't see much value in a custom built sandbox over reusing the isolation that's already proven.


Per-application permissions, sandboxing applications, and one-click installs are great for the users. I'd like to have a filesystem like Plan 9's, where an app cannot even see outside of its box unless I let it. But for some apps I'd like to let it.

Preventing an app from opening other apps or arbitrary files if the user desires is awful for the users. Siloing data so that a user cannot use an arbitrary application to read a PDF is awful for the users. One-click installs and refusing to run unsigned apps is awful for the users when the only CA is Apple or Microsoft.

All of the current systems are poorly implemented. They could either be a step forward or two steps back. I'm not sure of the direction they are taking yet.


What about hypervisor as microkernel and VM as app? Is this a viable new model for system security?


They do not provide real isolation, only the illusion of isolation.

Remember, software engineers have been stamping out bugs in the x86/64 arch for decades and they still find new bugs all the time. The ARM arch is no different; they'll be stamping out bugs for decades too. By adding a hypervisor and virtual machines you're layering a whole new set of bugs on top like a bug sandwich. Complexity skyrockets; security is the first casualty.

As for not having root on a proprietary device it's a double edged sword. On one hand you're firmly in the land of feudal security leaving everything up to the developers, on the other hand if you root the device you're now opening up priv escalation and NFC exploits galore.


A process is basically a lightweight VM instance.


You mean a VM instance is a lightweight process? With hardware-assisted virtualization, multiple cores, and no preemption, shouldn't an OS-less VM be lighter than a general-purpose kernel-scheduled process?


Lighter how? A VM and a process look almost the same from the running software's point of view. A process is basically the same as it was in DOS - one big memory space, except some memory regions are off limits, and some regions contain kernel trampolines (assembly is still fair game). QEMU/KVM just hooks basic things like MMU/page lookups, and interrupts (using CPU extensions, iff available), so hardware interfaces (ex, PCI) can be simulated, but otherwise the VM runs as a normal process using the kernel's scheduler. Xen just re-implements some of the scheduler stuff on the theory that not everyone needs the full kernel. Unless CPUs start incorporating scheduling (preemption of processes/VMs) and memory-space isolation and kernel interfaces (like fork, and k/exec), I'm not entirely sure what an "OS-less VM" might look like. (It would be great if half the kernel - ex, the Xen part - could be BIOS/CoreBoot-level firmware though, as it'd be nice to just use kernel drivers and common scripts to boot, etc, rather than Grub-specific or iPXE-specific ones, etc...)


It's feasible, but I personally think the hypervisor overhead stinks. Better to fix the OS permission model.


Why not use filesystem permissions/ACLs on named pipes to access native interfaces (as a poor-man's microkernel)?
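
Something along those lines can be sketched with plain POSIX calls (the path and the 0620 mode are illustrative): a service creates a named pipe as its request endpoint, and the kernel's permission check on open(2) decides which processes may reach the native interface behind it:

  #include <stdio.h>
  #include <string.h>
  #include <errno.h>
  #include <sys/types.h>
  #include <sys/stat.h>

  int main(void)
  {
      /* Hypothetical endpoint: the owner (the service account) may
         read, members of its group may write requests; everyone
         else is shut out. */
      const char *endpoint = "/var/run/netsvc.fifo";

      if (mkfifo(endpoint, 0620) != 0 && errno != EEXIST) {
          fprintf(stderr, "mkfifo: %s\n", strerror(errno));
          return 1;
      }

      /* Clients outside the group get EACCES from open(2); the check
         is enforced by the kernel, not by the service's own code. */
      return 0;
  }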


Hypervisor overhead for hardware-assisted virtualization is worse than OS-level preemptive threading? Why bother with an OS at all and just run mobile apps in a hypervisor with system service ACLs?


As people require a license and training to drive a car, I feel that people should require training (free, of course) to use a computer. The whole problem with computers is that they are tools for consumption AND creation, while these new tablets and phones are heavily consumption-only. The problem with computers is not that they are complicated, but that most of society has zero competency in using these powerful tools. It is a failure of education.

So we end up with shitty phones and tablets that try to make consuming old media easier, instead of users making their own media and applications.


>Applications should be installable by anyone and isolated completely from other apps unless specifically granted permission by a user/administrator.

There is also the Sandboxie approach to installing apps.

Once installed, the OS monitors the app (and any spawned processes) for any and all changes it makes to the OS, and once closed, the settings are reverted (except local settings). When uninstalled, any files/settings/etc it changed are also reverted by the OS instead of by the uninstaller.

[1] http://www.sandboxie.com/



These features are hardly inseparable from being denied root access to the device.


That's not true. Root access can be permitted to the user. It just should not be permitted to apps without specific permissions and it should not be required to install an app. Apps should not have to vomit all over the filesystem. They should be self-contained and run in isolation.


This is absolutely correct. Apps should be completely isolated by default but users should be allowed to add access to other resources on a per-resource basis.


I have always liked the web as a platform because I can build something once and have it work across many operating systems/browsers/etc... and because the web is built on open standards.

One criticism people have of this approach to application development is that web apps aren't as efficient as native apps.

This pro/con discussion of web vs native apps reminds me of the C programming language, why it was created and what for.

The C Programming language was designed to be a portable language, meaning that it can be compiled on virtually any platform, any operating system. Yet C was also designed to constitute the minimum abstraction away from a given platforms native assembly language.

Thus a program written in C can be compiled on almost any machine, and run as efficiently or nearly as efficiently on that machine as a program written in that machine's native assembly language.
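
For instance (a minimal sketch; the platform macros are the standard compiler-predefined ones), a single C source file can compile unchanged on each platform, with the preprocessor selecting a thin platform-specific layer:

  #include <stdio.h>

  /* One portable source file; only this thin layer differs between
     platforms, and each compiler emits native machine code directly. */
  #if defined(_WIN32)
  #  define PLATFORM "Windows"
  #elif defined(__APPLE__)
  #  define PLATFORM "Mac OS X"
  #elif defined(__linux__)
  #  define PLATFORM "Linux"
  #else
  #  define PLATFORM "some other system"
  #endif

  int main(void)
  {
      printf("Hello from %s\n", PLATFORM);
      return 0;
  }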

So perhaps we could use something like the C programming language for the web. A technology which allows us as developers to write our applications once in a portable open format, without needing to sacrifice in terms of performance.

Any thoughts?


Sounds like http://en.wikipedia.org/wiki/Google_Native_Client is what you're looking for.


I think the best thing the tech community can do in this area is continue to investigate and provide specific criticism of individual apps that are doing nasty things. Great examples include the uproars over Google's data collection, or how Path uploaded the entire address book without asking. These are the sorts of specific nasty things the popular press will cover, and the general public wants to know about.

Root access would be very helpful for the tech community in doing this, but on its own, it's probably too esoteric to make much of an impression on the press or general public.


Not a terribly coherent discourse by Tim B. Lee. On the one hand, he wants you to run apps downloaded off the internet in your browser. On the other hand, he wants the user to have root. There's not even a whelk's chance in a supernova that I will ever run a browser as root. I assume that Tim's OK with that as a security stance, but it then follows that browser apps can't have root, and if you're a non-tech-savvy user, if your apps don't have root, then you don't have root.

Focusing a bit more on the whole root access thing, I think it would be well to remember that we tried the open root thing already, and it turned out to be a security disaster for normal users. I think it's a good thing that the mainstream computing platforms (Windows, MacOSX, iOS) are trying a new approach. I also think that it's a very good thing that there are more open solutions, such as Android (which, just to be clear, is also a mainstream choice) and Linux, that allow us to see what can be done with more open access. Then the closed platforms can see how to go about providing that functionality in a more secure manner for less tech-savvy users.


I don't think he's suggesting that you literally run your browser as the root user.

The argument seems to be that if you don't have the ability to access whatever is the equivalent to the root account on your device at all then this means that whoever does has more control of your device than you do.

This means that you have to trust the device or OS manufacturer. For example if you buy an iPhone there is no way to "untrust" Apple without throwing the device out or doing a jailbreak.

With a more open system such as Debian you can decide to untrust the OS vendor by simply replacing the entries in /etc/apt/sources.list with something else.

This kind of model is often used in corporate environments where computers are locked down by the IT dept; therefore the IT dept can make choices on behalf of the users as to what the security settings should be, what software is installed, etc.


It sounds like he's talking about something different than root - freedom to control access at the level the user cares about.

To pun on Bertrand Russell: "It's roots all the way up!"


I don't live here; I just rent.

Sometimes it's OK to just rent. If I want to own, I'll move elsewhere.


I guess that your landlord can come tonight and check if you are sleeping alone, and you can't tell him it's not his business.


No, my landlord and I have an agreement that stipulates the terms under which they can enter the place I rent, and I trust them to honor them. Ultimately, they have a key though. It's a risk I accept.

The most valid argument that I can think of is that, in a "device appliance" context, many people aren't aware that they're renting. I don't suspect that they'd change to owning once they understood that they're renting, but it might make them more sensitive to the agreement under which they are renting.


>and I trust them to honor them.

Except that with renting property you have specific rights clearly defined, and there are people with guns and handcuffs and jails to uphold them. You have no rights at all with your phone, and nobody to keep your "landlord" honest. Nevermind the obvious fact that you are not renting your phone, you purchased the device, it is your sole property.


I agree that we need a better set of baseline laws protecting consumers who "rent" devices from unfair practices, but I don't think that's relevant to your second statement.

I do own my phone. I have every right to throw it off a cliff if I want to. I do not own the software that runs on it. I license it, and when I bought the phone, I agreed to that license.

Apple cannot stop me (legally) from running whatever software I like on my iPhone. The caveat is that they're under no obligation to make that easy. I bought the device knowing what software it runs. Were I to buy the device with the intention of running different software, then caveat emptor applies. Apple does not represent that their mobile devices can run any other software, so I really don't see where there's a valid argument that these "renters" are anything but consensual. I have the right to do whatever I want with the device, but I may not have the ability.

When you get down to it, the argument being made is that one should want to own, rather than rent. Go ahead and make that argument, but it's not fair to dress it up under the precept of "rights".


Apple cannot stop me (legally) from running whatever software I like on my iPhone.

Not for lack of trying. Jailbreaking is only legal because of a specific DMCA exception granted by the Copyright Office, which Apple opposed. And IIRC that only applies to phones, so it actually is illegal to jailbreak an iPad.


>I license it, and when I bought the phone, I agreed to that license.

That's the claim they want you to accept, but it isn't true in many jurisdictions, and in those where it is (US) people should be fighting to fix that mistake, not using it to excuse immoral behaviour. I do not require a license to use software. I did not agree to any license when I purchased my phone. I did not sign any contract.

>Apple cannot stop me (legally) from running whatever software I like on my iPhone.

This contradicts your previous claim that you do not own the software, and that you must abide by some license. Their license says "you can't bypass the restrictions we put on you". You can't argue that you are both bound to it, and not bound to it.

>but it's not fair to dress it up under the precept of "rights".

Why not? We have rights in other areas, specifically to protect people against predatory companies doing shitty things because "haha you agreed to it!".


You're conflating use of hardware with a restriction on software. They are not the same. If you can find a way to load Linux on an iPhone, Apple won't even bother to try and stop you.

I agree that we should have better laws protecting consumers from abuse, but I'm not sure what to make of "I do not require a license to use software". If I create something that I do not want to give away, I expect to be able to define some restrictions on its use. You may have a different expectation, but I don't think you have majority support on that.


>You're conflating use of hardware with a restriction on software.

No, you are assuming the only case of software restriction is limiting my OS choice. Apple works very hard to try to stop people from helping others to fix their crippled phones, so that they can install software on them that Apple has not approved.

>I'm not sure what to make of "I do not require a license to use software"

Software is protected by copyright. Just like other creative works, like a book for example. Copyright gives the author exclusive rights to certain things, like distribution, copying, and creating derivative works. It does not give them exclusive rights to use the work. Copyright law prevents you from selling a book written by someone else (without their permission). It does not prevent you from reading it. The same is true of software. You can't sell iTunes, but you are entirely within your rights to use it.

>You may have a different expectation, but I don't think you have majority support on that.

The law supports me on that.


> Copyright law prevents you from selling a book written by someone else (without their permission)

Actually, you can't sell copies of the book. You can sell your original.


The problem becomes obvious when you discover that there's no place to move in order to own. When you have no choice, are you really renting or are you a serf?

Even if you think that you own your phone, you still have to get it onto a network for it to become useful and you'll be renting that network access and you'll have no choice between contracts because they're all the same.


When you have no choice, are you really renting or are you a serf?

Not a straightforward question. As a renter, I can pack up and leave on relatively short notice, with limited liabilities. But as a homeowner, I might as well be an indentured servant to my mortgage company. If I own my home and suddenly find that I need to sell it and move elsewhere for work or other reasons, I can expect to take a six-figure bath in today's market.

Personally I would rather be a serf than a prisoner... hence my preference for renting.


They aren't all the same. 3 years of contract at $70+/mo != prepaid $30/mo != prepaid 10c/min.
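
To put rough numbers on that (illustrative prices only): 36 months at $70/mo comes to about $2,520 on contract, versus $1,080 at $30/mo prepaid, before accounting for any handset subsidy.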


They are all the same in many respects though. Do any of them offer true unlimited data? Tethering? Usage of unlocked phones? Decent coverage? Data privacy?


Well, none of them offers a pony, either. I think that only one of these (data privacy) is a right; that (2)–(5) are conveniences; and that true unlimited data is impossible (assuming you mean "… at a fixed price").


However, if I want to own a pony I certainly have the option of purchasing one.

2–5 are perfectly reasonable offerings, and the corporations that control the data pipes have all accepted our tax breaks, which help them maintain their virtual monopolies. My point is that the rent analogy does not work at all. It's too simple.


Using a task manager on your phone is like analyzing the water coming out of your tap before you drink it, only a little less important for your general well-being.

Most people are simply ignorant of the details because they trust an authority to keep things in order (e.g. the FTC). Perhaps that trust is more misplaced with Apple and Google than with your water supplier; or perhaps those working in the food or water industry are as wary about water quality as we are about whatever spyware is running on our phones, and we will find their wariness a bit ridiculous.


My iPhone does a pretty darn good job of serving me. In fact, that's why I own it.

If it happens to serve others collaterally, well, I'm not the jealous type.


Minor quibble: having to port to multiple devices may be 'boring', but writing native apps can be truly wonderful if you love the platform, framework, etc. My personal experience is largely with audiovisual apps -- and for that, I'm a big fan of iOS. Core Audio in particular may be a tough framework to approach, but what you can achieve with it is very exciting.


I fully agree about things that require native apps. But when I click a link to a YouTube video or a tweet and, instead of showing the result in the browser, it pulls me out into the YouTube or Twitter app, I can't bookmark it or share it. I hate that.


This is more of the idealistic view that I haven't seen pan out. Android may be more open, at least while the latest version's source is released. But when the carriers modify and lock down the phone, it's a hassle. It's a hassle even on an unmodified phone, which is out of reach for the vast majority. So on the 98%+ stock devices (carrier mods included) you have apps that aren't as jailed as on iOS, and (I still haven't read anything on this either way) pretty much one online advertiser supplying it all for free.

Real-world circumstances render this ideal view, however correct in principle, mostly or entirely untrue.


I'd rather use native applications that know how to take advantage of the underlying hardware and OS.

The browser should have stayed a document-only thing.


> The browser should have stayed a document-only thing.

I think the browser makers will take that deal when the app stores decide to reject anything whose only purpose is showing blobs of text on a screen.

When did a newspaper become an "app"? What new functionality does a newspaper bring to a phone that it needs its own binary code clogging up the flash storage?

To read NPR, I need to install the NPR app. To read BBC, I need to install the BBC app. To read CNN, I need the CNN app. And I need to keep them updated and if a new version of the OS comes out, they need to be updated to use the new APIs and... oh, what a waste of life. I just want to read the damn news. Perhaps next, I'll need separate televisions to watch different TV channels.


Could you give me an example of anything other than a game that is actually "taking advantage of the underlying hardware and OS" in some way that the browser can not? Looking at all the apps people use, games are literally the only things I've seen that aren't just poorly re-implemented websites.


Interestingly, I consider websites to be poorly reimplemented native applications. They make sense when there is a centralized database at the back, but even then UI and UX usually suffer. For one thing, they often lose input methods available to the underlying system. For example, how many web apps use context menus? How many _can_ use them without breaking the browser's own UX?


>but even then UI and UX usually suffer

Because of bad designers. Most native apps suffer for the same reason.

>For one thing, they often lose input methods available to the underlying system.

Such as?

>For example, how many web apps use context menus? How many _can_ use them without breaking the browser's own UX?

I have no idea how many do, but they don't break anything unless the designer messes up.


I've seen some sites override the context menu to provide their own copy/paste commands (for unknown reasons, probably to change the look). In doing so, they remove all the options the browser puts in there.
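To make that concrete, here's a minimal sketch (my own illustration, not taken from any particular site) of how a page does it; showCustomMenu is a hypothetical helper:

    // hypothetical helper that draws the site's own menu
    declare function showCustomMenu(x: number, y: number): void;

    // Cancelling the 'contextmenu' event suppresses every built-in
    // browser entry at once; the site then draws its own menu.
    document.addEventListener('contextmenu', (e: MouseEvent) => {
      e.preventDefault();                    // native menu is gone
      showCustomMenu(e.clientX, e.clientY);  // site-specific menu instead
    });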

For example, right now in Opera I have 17 entries in the context menu when clicking on an empty spot on the page, 9 entries when clicking on selected text, 15 entries when clicking inside a text edit field, and yet another set when clicking on a link. Some of the options control the behaviour of the browser, some deal with the current page, some deal with a specific page element. If HN wanted to use context menus for some app-specific reason, all of those options would be wiped out (at least for the affected elements) because the app would be in direct conflict with the browser.

Touch events are often another casualty -- sometimes they are commands to the OS (swipe to change active desktop), sometimes to the browser (pinch to zoom), sometimes to the app (drag to drag around a map, or maybe pinch to zoom?).
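A rough sketch of that last case (mine; the 'map' element and panMap handler are hypothetical), showing how a page claims a gesture for itself:

    // hypothetical map-panning handler
    declare function panMap(touches: TouchList): void;

    // Cancelling touchmove stops the browser from scrolling or
    // pinch-zooming; the page has taken that gesture for itself.
    const map = document.getElementById('map')!;  // hypothetical element
    map.addEventListener('touchmove', (e: TouchEvent) => {
      e.preventDefault();   // browser scroll/zoom no longer fires
      panMap(e.touches);    // the page pans its map instead
    });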

Additional layers always introduce problems and cause things at the end of the chain to have worse UX or adapt by using the lowest common denominator. A crazy example: Gmail running in a browser running in Windows in a VM on a Mac, which is remotely connected via VNC from an Android tablet. >:D Android -> VNC app -> OSX -> VMWare -> Windows -> IE -> Gmail app. OK, this doesn't illustrate much besides the point that the browser _is_ an element of the input chain (one that makes a disproportionate impact on UX, too; the other links are at least trying to be transparent), I just like the idea. :)


>I've seen some sites override the context menu to provide their own copy/paste commands (for unknown reasons, probably to change the look). In doing so, they remove all the options the browser puts in there.

Of course, and you've probably seen much worse than that. Incompetent web designers are plentiful, but the capabilities of a web site are not limited to "broken sites made by incompetent designers".

>If HN wanted to use context menus for some app-specific reason, all of those options would be wiped out (at least for the affected elements) because the app would be in direct conflict with the browser.

If HN wanted to do something incredibly stupid, they could do something incredibly stupid. Right. I'm not sure what the issue is there. If they wanted to not do something incredibly stupid, that is also an option. Why judge only on the worst possible scenario and assume every website is designed to be as broken and shitty as possible?

>Touch events are often another casualty -- sometimes they are commands to the OS (swipe to change active desktop), sometimes to the browser (pinch to zoom), sometimes to the app (drag to drag around a map, or maybe pinch to zoom?).

This applies equally to apps vs web, and isn't even specific to phones.

>Additional layers always introduce problems and cause things at the end of the chain to have worse UX or adapt by using the lowest common denominator

That statement is false, and not relevant. The browser is no more an "additional layer" than an app is. The same number of layers in both cases.


> If HN wanted to do something incredibly stupid, they could do something incredibly stupid. Right. I'm not sure what the issue is there. If they wanted to not do something incredibly stupid, that is also an option. Why judge only on the worst possible scenario and assume every website is designed to be as broken and shitty as possible?

But the context menu is a useful concept (sometimes). It _would_ be nice if it could be used when it is called for. The problem is that it is impossible (I think) to use a context menu, or _any other right-click action_ you could think of, without breaking browser functionality. It would be, like you said, incredibly stupid.

Web apps thus effectively lose a whole mouse button, reducing the available input bandwidth.

Another example of lost input is the lack of system-global shortcuts. A web app, to my knowledge, can't have them. I listen to music while I work and I tend to switch tracks or pause them once in a while. So when I listen to YouTube playlists, I have to go back to the browser, find the playing tab and do it there. (And you never know if the video currently playing is SFW. :))

> That statement is false, and not relevant. The browser is no more an "additional layer" than an app is. The same number of layers in both cases.

The browser is a layer, and the native app is a layer, except that to get to a web app you go through the browser first, while a native app sits at the same level as the browser (so no portion of the input capabilities is used up by the browser itself).


>But the context menu is a useful concept (sometimes).

Of course. Which is why you would use it only where it is useful, not hijack a particular input event globally.

>Web apps thus effectively lose a whole mouse button, reducing the available input bandwidth.

No, you can use right click all you want, and it need not interfere with the browser's use of right click. Also, very few people use a mouse on their phone, so this does seem like a bit of a stretch.
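For instance, a sketch of scoping a custom menu to a single element (my illustration; the 'drawing-area' id and openToolMenu are hypothetical):

    // hypothetical app-specific menu
    declare function openToolMenu(x: number, y: number): void;

    // Only this one element gets a custom menu; right-clicking
    // anywhere else still shows the browser's full context menu.
    const area = document.getElementById('drawing-area');
    area?.addEventListener('contextmenu', (e: MouseEvent) => {
      e.preventDefault();                  // suppress the native menu here only
      openToolMenu(e.clientX, e.clientY);
    });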

>Another example of lost input is the lack of system-global shortcuts. A web app, to my knowledge, can't have them.

Right. There are a few other things like this that web apps can't do, and apps that people would want this sort of interface for should not be web apps. But as I said originally, virtually all the apps people use are not of that nature (and those that are, are almost always apps that came with the phone); they are just websites turned into clunky, awkward apps for no reason.

>except that to get to a web app you go through the browser first

The browser is the app.


Instagram and similar

Navigation

Shazam and similar


Instagram? How is taking a picture something that can't be done with a browser?


I am not sure which mobile OS/browser gives access to the camera controls, but Instagram does more than take a picture.


>I am not sure which mobile OS/browser gives access to the camera controls

All modern mobile browsers do.
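For example, a sketch of the sort of access meant here (my example, using the standard media-capture API; support varies by browser and era, so treat it as an illustration rather than a compatibility guide):

    // Ask for the camera and show the live feed in a <video> element.
    navigator.mediaDevices.getUserMedia({ video: true })
      .then((stream) => {
        const video = document.querySelector('video');
        if (video) video.srcObject = stream;  // live camera feed
      })
      .catch((err) => console.error('camera unavailable:', err));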

>but Instagram does more than take a picture.

Be more specific, then; I assumed that's what you thought couldn't be done. Tell me what it is you think can't be done so I don't have to guess.


> Tell me what it is you think can't be done

Actually, you did not ask about 'what can't be done', but for an 'example of anything other than a game that is actually "taking advantage of the underlying hardware and OS"'.

Instagram takes advantage of the underlying OS and hardware when it uses photo filters. Same as native games do. Could it be done in the browser? Probably. Same as games.


>Actually, you did not ask about 'what can't be done', but for an 'example of anything other than a game that is actually "taking advantage of the underlying hardware and OS"'.

You left off the rest of the sentence though, the part that said "in some way that the browser can not".

>Instagram takes advantage of the underlying OS and hardware when it uses photo filters. Same as native games do. Could it be done in the browser? Probably. Same as games.

The difference is that photo filters can be done, quite trivially, in the browser. Many games cannot, because of the performance of the JavaScript engines and the limited hardware-accelerated drawing.
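To illustrate the "trivially" claim, here is a sketch of mine of a sepia filter (the kind of effect Instagram popularized), using the standard canvas 2D API and the usual sepia matrix:

    // Read the pixels, apply the sepia matrix, write them back.
    function sepia(canvas: HTMLCanvasElement): void {
      const ctx = canvas.getContext('2d');
      if (!ctx) return;
      const img = ctx.getImageData(0, 0, canvas.width, canvas.height);
      const d = img.data;  // RGBA bytes
      for (let i = 0; i < d.length; i += 4) {
        const r = d[i], g = d[i + 1], b = d[i + 2];
        d[i]     = Math.min(255, 0.393 * r + 0.769 * g + 0.189 * b);
        d[i + 1] = Math.min(255, 0.349 * r + 0.686 * g + 0.168 * b);
        d[i + 2] = Math.min(255, 0.272 * r + 0.534 * g + 0.131 * b);
      }
      ctx.putImageData(img, 0, 0);
    }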


> All modern mobile browsers do

At the time of Instagram's release, that wasn't the case.


I don't recall suggesting that Instagram made a mistake. I was responding to the idea that right now, apps other than games exist which take advantage of the OS in ways the browser can not.


If 'open' wants to compete, it should do so by being better than 'closed'. The argument in the article tries to outlaw being 'closed'. No one stops anyone from making an 'open' system (OS or hardware or whatever), so why should anyone stop someone from trying to make a 'closed' one? If someone wants to make a closed system and someone else wants to buy it, sure, by all means.


If "safe" wants to compete, it should do so by being more fun than "dangerous". No one stops anyone from making any "safe" system (automobile or recreational drug or whatever) and why should anyone stop someone from trying to make a "dangerous" system.

One of the best uses of law is to place limits, regulations and standards on what businesses are allowed to get away with.


I don't have root on my TV and it serves me just fine for what I need from it. There might be good arguments to keep platforms open, but this ain't one of them.


OTOH, I would love to root my DVD player so I could fast-forward when I want to, instead of being at the mercy of the FBI warning.


Hacking the player would be useless. DVDs are programmed, and any (and all) limitations are there because a programmer put them there.

Same with Blu-rays.


This isn't correct. The DVD can only specify that fast-forward and other functions are to be disabled. It's up to the player firmware to enforce or ignore the DVD's user permission flags.
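In toy pseudocode (my sketch, not real player firmware), the relationship is simply:

    // The disc only carries a "prohibited user operation" flag;
    // whether fast-forward actually works is the firmware's call.
    interface TitleFlags { prohibitFastForward: boolean }
    function canFastForward(flags: TitleFlags, honorUOPs: boolean): boolean {
      return !(honorUOPs && flags.prohibitFastForward);
    }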


Good point.


From the comment section:

>Trust will disrupt Openness

There are a couple of factors that will disrupt the Web architecture.

The Web has trained us to build dumb clients and centralize anything of value on the server, at a huge cost and with never enough trust. We can safely predict today that lightweight protocols, mediated by the mobile OS (and its Platform), will directly challenge the Web architecture, precisely because we can leverage the platform trust model. That evolution is extremely profound.

For instance, apps running on your device can securely and privately share information without requiring a complex temporal integration involving a 3rd party service (such as Google AdSense). The information is produced and consumed on the device or the device of a related end-user. What happens on your device can now stay on your device.

Just to be clear, and to show how disruptive that architecture is: the primary key of your private data becomes your phone, not your identity. Merchants no longer need to identify you. They couldn't care less about YOU; they just care to know some information about you. The problem with the Web architecture was that the only way to do that was to associate PII with a primary key on a server, hence merchants needed to identify you to track your every move (and they shamelessly did).

The second factor is just as profound: the very open nature of the Web is driving scale over scope. The Web has successfully nurtured the largest Catalog, the largest Search engine, the largest Auction site, the largest Social Network, but I see this as a negative side effect of the Web architecture because it limits the scope of what people can do. In other words, the scope of what Amazon, Google or Facebook offer is limited by the scale (and hence the revenue) they can achieve.

I actually argue that a trust-based neutral Platform will support a more vibrant and diverse ecosystem than a truly open model, because in essence a Web business couples the level of trust it can achieve with the functionality it can deliver. The Platform decouples the trust from the functionality, and it enables much smaller actors to deliver many more scenarios while relying on the trust established by the Platform.

I would be surprised if the Web can resist being disrupted by the Platform. Actually, I think it already is.


Unfortunately, you can't really trust a device, since there is always a chance that it could be rooted. See the long history of rooted game consoles.

The best you can do is sort-of trust the device. But that's not much better than not trusting the device in terms of the kind of architectures and products you can build.


"you can't really trust a device, since there is always a chance that it could be rooted. See the long history of rooted game consoles"

How many get rooted without the user's intervention?


I do not have an answer to your question, but in my mind the targeted attack is more concerning than a random drive-by that hits everyone. That means just one instance is too many.

As another example: in the earlier days of iOS, you could root it just by visiting a website. Who knows how many fell prey and have malicious code running on their devices today? With a targeted attack, you don't even get the benefit of security researchers scouring the code as you would with something more widespread, so there really is no way to even keep track of what might be out there.



