
This is a recurring cycle in application development. We've gone from mainframes and minicomputers serving plain text to dumb terminals, to programs running on personal computers accessing applications and data on servers, to web servers serving structured text to 'dumb' browsers, to powerful in-browser runtime engines accessing applications and data over the web.

There's no magic bullet.



It is another iteration of that cycle, only this time we have something to lose, namely the web. We are regressing to client/server with a bunch of ever-changing single-site APIs and shoddy client code third parties can't readily fix, and these are destroying the world-wide web of repurposable content in open formats at stable addresses.


Aren't we just embracing the difference between a site and an API? It's hard to do both well at the same URL. The API provides the repurposable content in an open format, and the site itself is free to experiment with different presentations. Is that so bad?
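To make that concrete, here's a minimal sketch, assuming a Node/TypeScript server; the /articles route and the data shape are just illustrative. One stable URL serves HTML to browsers and JSON to everything else, keyed off the Accept header:

    // Same resource, same address, two representations.
    import { createServer } from "http";

    const articles: Record<string, { title: string; body: string }> = {
      "1": { title: "Hello", body: "Stable, repurposable content." },
    };

    createServer((req, res) => {
      const match = req.url?.match(/^\/articles\/(\w+)$/);
      const article = match ? articles[match[1]] : undefined;
      if (!article) {
        res.writeHead(404);
        res.end();
        return;
      }
      if (req.headers.accept?.includes("application/json")) {
        // Repurposable content in an open format for third-party clients.
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify(article));
      } else {
        // The site's own presentation, free to change without breaking the above.
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end(`<h1>${article.title}</h1><p>${article.body}</p>`);
      }
    }).listen(8080);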


Before devs started experimenting with client-side rendering, all sites' content was amenable to the same set of tools. Now there are more and more services with broken frontends and unique APIs that are incompatible with everything else and aren't even stable: when you can rev your own client js instantly, you don't know or care whether any other clients broke. I'm stuck with the only client that works at all (your js), and I can't fix it, which is almost everything that client/server got wrong the first time around. It's not impossible to carefully implement a stable, form-compatible API in common with a bunch of similar sites, but I don't see it happening.
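A stable, form-compatible endpoint isn't much work either. A minimal sketch (again Node/TypeScript, with a hypothetical /comments route and "text" field): the same address accepts an ordinary HTML form POST and JSON from the site's own js, so revving the js can't break anyone else:

    // One stable contract for both a plain <form> and fetch() with JSON.
    import { createServer } from "http";
    import { parse } from "querystring";

    createServer((req, res) => {
      if (req.method !== "POST" || req.url !== "/comments") {
        res.writeHead(404);
        res.end();
        return;
      }
      let body = "";
      req.on("data", (chunk) => (body += chunk));
      req.on("end", () => {
        const type = req.headers["content-type"] ?? "";
        // Decode urlencoded form posts and JSON the same way downstream.
        const fields = type.includes("application/json")
          ? JSON.parse(body)
          : parse(body);
        res.writeHead(201, { "Content-Type": "application/json" });
        res.end(JSON.stringify({ ok: true, text: fields.text }));
      });
    }).listen(8080);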


It's the prisoner's dilemma: yeah, everyone on the web would be better off if businesses weren't doing this, but individual businesses will continue to do it as long as it's in their best interest to retain tight control over, and limit third-party access to, the data and metadata they capture from users.

I'm not sure how or even if the infrastructure of a distributed system like the web could be engineered so as to prevent this kind of situation. Perhaps the solution is to build in a system of financial incentives -- not unlike what the Bitcoin folks have done to solve the Byzantine Generals problem. It's an interesting problem.


For usability, response time is everything, so for the moment people who want to build richer apps will have to make them asynchronous. But do you think there will ever come a day when we can assume the network is always fast, and we can go back around the circle to the mainframe again?



