Stop Breaking the Web (ponyfoo.com)
79 points by bevacqua on Sept 15, 2015 | 46 comments



Considering that the average page size (first load) keeps getting bigger, there is certainly some truth to it. Pages are getting slower and more complex, and I see more and more browser tabs take 80 MB+ for (seemingly) simple sites.

I also agree that a web site and a web app are slightly different things, but: there are blogs and sites that load, show a blank page, then fade in some animation/spinner while loading fonts, styles and more scripts, and then slowly fade in the content. However, if something goes wrong (a font doesn't load, JS is blocked, etc.) you see a blank page with a spinner forever. HOW is that progress? That's just awful.

I recently decided to start a little blog[1] using the static site generator Hugo[2] and I don't feel like I'm missing out on anything. Well, without JS you won't be able to load the Disqus comments and you get no code highlighting, but that's it.

[1] http://code-and.coffee [2] http://www.gohugo.io


Consider that when most people get sites that don't meet their needs, their gut reaction is to open 20 tabs and mentally slice data from all over the website. That doesn't seem very progressive either.


I am not saying there is no need for more complex web sites. Personally I just think it is sad that in 2015 there are lots of sites that won't load anything at all if you disable JS or have a bad connection.

With a 50k connection at home I simply can't accept waiting 3+ seconds until your site has loaded its 500 KB of JS and four different web fonts to finally show something.


Sure! Parts of that are indeed ridiculous.

Although I'm pretty sure the download size from a CDN is far less important from a performance standpoint (within reason). More important things include:

- Time to first byte from a slowish server (0.5 s to create the HTML and 0.5 s to deliver it).
- How many separate requests you are making (somewhat going away with HTTP/2).

Can we start ranting about the excessive amount of tracking pixels everywhere now? :)


Sorry, but how does this respond to the parent post above? Opening more tabs as a response to site bloat?

If anything I'll abandon the site.

Still not a positive outcome.


This is nonsense. Every few minutes you hear someone complain about the lack of progressive enhancement. The case for progressive enhancement is only valid when you have "content" that is easily parseable by human eyes.

In a world where you have too much data and users want to see slices of it, progressive enhancement fails to deliver a fast, pleasant, engaging user experience. In the case where you have complex tools that help a user meet some end goal more quickly, progressive enhancement falls short of providing those easy-to-use tools. The only case where this works in any way, shape or form is when a site can be driven by simple forms and all data can be retrieved by visiting URLs. Any slightly more complicated use case fails to be delivered at any reasonable pace with any sort of reasonable performance; you know, the kind of performance that wouldn't make your server side fall over.

This whole progressive enhancement thing is mired in decade-old dogma. While progressive enhancement can work sometimes, it is NOT the only tool. We shouldn't wholesale prescribe solutions without knowing someone's problem.


Progressive enhancement does make sense in some cases, such as using a feature that not all browsers support yet but that the site can function without.

And progressive enhancement makes perfect sense for things like CSS; if your site content makes no sense with CSS turned off, it probably makes no sense with a screen reader. So, for instance, remember to put content in a sensible order in the HTML rather than arbitrarily rearranging it with CSS.

However, I no longer think progressive enhancement makes sense for things like JavaScript, as long as you use features just about every browser supports. Otherwise, you'd have to effectively write your site twice: once with JavaScript, and again as entirely forms/links and server-generated content.

The critical reason why I no longer think this makes sense is that it's completely sensible to follow an API-first approach to site design, where the first thing you write is an API usable both by third parties and by your own first-party site. Then you can write your site on top of your own API, something like the sketch below. I don't think we need to target human-readable first; on the contrary, I think we get a better, more extensible, more programmable, more open web if we build APIs first and foremost.
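
A minimal illustration of what I mean, assuming Express and made-up endpoint names (a sketch, not anyone's actual implementation):

  // Hypothetical API-first sketch: the JSON API is written first,
  // and the first-party page is just another client of it.
  import express from "express";

  const app = express();

  // The API, usable by third parties and by our own front end alike.
  app.get("/api/articles/:id", (req, res) => {
    // A real app would hit a database; hard-coded for illustration.
    res.json({ id: req.params.id, title: "Stop Breaking the Web", body: "..." });
  });

  // The site: a thin server-provided page whose script consumes the same API.
  // (Unescaped interpolation is fine for a sketch, not for production.)
  app.get("/articles/:id", (req, res) => {
    res.send(`<!doctype html>
  <div id="article">Loading...</div>
  <script>
    fetch("/api/articles/${req.params.id}")
      .then(r => r.json())
      .then(a => { document.getElementById("article").textContent = a.title; });
  </script>`);
  });

  app.listen(3000);

The HTML page ends up being just one more consumer of the same endpoint that third parties would hit.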

Now, all that said, there are other ways this article is completely right about not breaking the web. In particular, having an app is no excuse not to have a website, or to have one that isn't usable on mobile. And if you're going to display a "you might want the app" banner, have a "go away and stop asking me" option that does not break navigation to the specific page the user was trying to visit in the first place.


Progressive enhancement for JavaScript is not only an accessibility and usability issue; it has become a security issue. The Tor Browser enabled JavaScript by default because the web would break without JavaScript support, and that breakage hampered Tor's adoption rates.

We already had a programmable web. If anything, we are moving away from machine-readable code.

If progressive enhancement does not fit your development style and takes too much time to build, well... fine. I see this not as a fault of progressive enhancement, but as a fault in your approach. To me it is akin to saying that writing unit tests takes you too much time, and that testing therefore makes no sense.


I think the article was pretty clear about API-first being bad, and even linked to Twitter's write-up of having to tear out that failure: https://blog.twitter.com/2012/improving-performance-on-twitt...


I don't actually think an entirely client-side-rendered application makes sense; on the contrary, I do think it makes sense to do most HTML generation on the server and, where appropriate, hand out snippets via the API rather than handing out JSON or similar and making JavaScript produce HTML. But that's different from rendering entire HTML pages on the server. And for primarily dynamic content, I see nothing wrong (for many sites, at least) with assembling server-provided HTML snippets from JavaScript into a server-provided HTML base page, without any fallback to an entirely server-provided page.
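
For what it's worth, a minimal sketch of that snippet-assembly pattern, with made-up URLs and plain fetch (just an illustration, not any particular framework):

  // Hypothetical sketch: the server renders HTML fragments,
  // and the client script only fetches and inserts them.
  async function loadSnippet(url: string, targetSelector: string): Promise<void> {
    const response = await fetch(url);          // e.g. "/comments/42/snippet"
    if (!response.ok) {
      throw new Error(`Failed to load ${url}: ${response.status}`);
    }
    const html = await response.text();         // server-rendered HTML, not JSON
    const target = document.querySelector(targetSelector);
    if (target) {
      target.innerHTML = html;                  // no client-side templating involved
    }
  }

  // Fill the comments pane of a server-provided base page.
  loadSnippet("/comments/42/snippet", "#comments").catch(console.error);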


Of course/agreed! It took me a while to get there, but basically the conclusion I was trying to make is "it's not the only tool; please stop prescribing it for every scenario blindly."


> This whole progressive enhancement thing is mired in decade-old dogma. While progressive enhancement can work sometimes, it is NOT the only tool. We shouldn't wholesale prescribe solutions without knowing someone's problem.

The "first load" problem is a recent addition to the canon of reasons why progressive enhancement is a solid approach. I think that trashes your argument that this is "decade old dogma".

And to further counter your point, the notion that "decade-old dogma" is "bad" is refuted by the success of POSIX/UNIX. And wheels. Some "old" ideas are still state of the art.


Getting a little tired of reading these rants. People are building "applications" that just happen to run in the web browser, for convenience. They aren't building repositories for static information.


Most newspapers are building sites for mostly static content. There's no reason for those sites to be so terrible, yet they are almost all terrible.


Yeah, these rants never account for the fact that there are a number of different types of websites, and tools for each. Don't use Angular or React for a blog, for the same reason you shouldn't use Jekyll or whatever for an application. Different tools for different tasks.


The points apply equally to those things you call "applications" as they do to those things you call "sites". They're essentially the same thing. They're all functionality delivered over web technologies, and for that reason the OP's points still hold. In the case of a "web site" the functionality is "displaying content".

An "application" can be built using progressive enhancement. The OP even describes ways to approach this. There were applications before there were AJAX or Angular. Their experiences leave a little to be desired, sure, but they were certainly usable. A basis in simpler, server-centric interaction with enhanced, client-side experience jazz is still more in-keeping with "how the web was made", and with how it is consumed.


Then how come half the blogs today show blank pages when JavaScript is disabled? Not just applications.


The rants are tiring, I think, because they're fighting some kind of (at least temporarily) losing battle and screaming for everyone to STOP DOING THE BAD THING when it's clear that they don't want to.

If Taco Bell decides to close their web site in favor of an app... well, a rant isn't going to change that, no matter how strongly worded. Blame consumer capitalism or PR firms or whatever.

It's frustrated prescriptivism. As a voice in the discourse, I suppose it has its value.


Think about what happens with the "old web". You make an HTTP request to a server. The server evals short snippets of dynamic language inside a template to generate a really big string. Then the server sends that string down the socket.

If you want to view another object, you have to request another slightly different big string.

Because of how resource-intensive it is to assemble these strings, programmers deployed "caching software" to memoize every big string that came up. Server hardware was loaded with RAM so that it could store a lot of big strings.
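
In other words, something like this hypothetical Node sketch, where the memoized "big string" cache is the whole trick:

  // Hypothetical sketch of the "big string" model: render a template on the
  // server and memoize the result so repeat requests skip the expensive work.
  import * as http from "http";

  const cache = new Map<string, string>();      // URL -> rendered HTML

  function renderPage(path: string): string {
    // Stand-in for a template engine evaluating snippets into one big string.
    return `<!doctype html><h1>Page for ${path}</h1>`;
  }

  http.createServer((req, res) => {
    const path = req.url ?? "/";
    let html = cache.get(path);
    if (!html) {                                // cache miss: assemble the string once
      html = renderPage(path);
      cache.set(path, html);
    }
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(html);                              // ship the whole big string every time
  }).listen(3000);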

- - -

The new approach is to send the basic code of the website to the client the first time, then have the client make additional requests for data via AJAX, assembling the UI as it goes. Sanity.
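
And a correspondingly minimal client-side sketch (hypothetical /api/posts endpoint, again just an illustration): the shell arrives once, then the client asks for data and builds the UI itself.

  // Hypothetical sketch of the client-rendered model.
  interface Post {
    id: number;
    title: string;
  }

  async function showPost(id: number): Promise<void> {
    const res = await fetch(`/api/posts/${id}`);  // data request instead of a full page
    const post: Post = await res.json();
    const el = document.createElement("article");
    el.textContent = post.title;                  // UI assembled in the browser
    document.body.appendChild(el);
  }

  showPost(1).catch(console.error);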


I very much agree with this. Get the content to the user as fast as possible and then progressively enhance.

Most of the web sites that I get value from really just need HTML and CSS.

That said, I don't mind waiting for rich web email clients, the web version of Office 365, etc., that I leave open for a while.

Complex web pages that are really just showing content also eat up a lot of CPU and memory resources.


I'm sorry, but I'm still trying to understand what the author is trying to say. It seems like he had a bad experience with Angular and he's taking it out on hash routing and client side rendering (so basically, SPAs?). Beyond that, I'm not really sure.


Absolutely. We have thrown the baby, and the sink, out with the bathwater.

I'll plug my little strike back against the insanity:

http://intercoolerjs.org

You can build a highly dynamic web site with normal URL schemes, sane HTTP caching behavior and zero client-side templating.

I need to do some more work on making it meet progressive enhancement goals, but I'm convinced this is a better approach than heavy client side logic for most web applications.


It's been a while since I've seen anyone take a stand for IE6 users.

As a web application developer whose bootstrapped startup needs every user it can get, let me just say: I am perfectly willing to forgo all IE6 traffic.


His point regarding IE6 is that substantively similar concerns exist even for several subsequent generations of MSIE.

IE6 is admittedly a low bar. But it remains a target which, when accommodated, also addresses many other low-capability browsers.

I don't use IE6 myself (though I had to, some years back on Linux, to access work-related and intranet sites on occasion). Currently my main beef is that my, yes, ancient Android 2.2 browser is utterly normed on many sites. That includes no font downloads, so no text at all, on far too many websites, Wired and Medium among many others, and sidebars obscuring content elsewhere.

Really annoying.

TBL's original HTML docs remain quite accessible, however.


Yeah, it's a negligible share. The article lacks a grounding in utility. You shouldn't bend over backwards for 0.1% of leads. If your product looks 10% cooler for your best leads at the cost of not working for the 0.1%, that's actually a good deal.


Web design and development is also an art. Good artists look beyond business value.

You actually should bend over backwards to cater to as many users as possible, especially if you are getting paid to build websites for users.

Some countries require sites to be accessible to the disabled. If you design sites for the US government, they should also be accessible to 0.1% of blind or no-script users.

Sure, IE6 as a baseline is very progressive, but it is certainly doable. Less so if you start with an inaccessible website and catering to as many users as possible becomes an annoying, time-consuming afterthought.

If you cannot muster an accessible, progressively enhanced website, then you cannot muster a JS-only, ARIA-compliant website either. Your only hope is to make something profitable. That's being a marketer or businessman with a little HTML skill, not being a solid web dev.


These companies need to realize that end users, especially me, now pretty much expect their apps and pages to:

    - annoy me for their benefit
    - spy on me
    - waste my battery
    - waste my bandwidth
It's interesting that these companies don't seem to care much about gaining the trust of their customers or actually serving them in a convenient fashion.


Companies don't care, because end users, for the most part, don't care. At least not enough to significantly change their behavior.


The point is to make them care. There are ways of doing this.

Google is steering progress around HTTPS and app-install interstitials, for example.


To a lot of you arguing that progressive enhancement doesn't apply to "web apps": as Jake Archibald argues[1], that really isn't an excuse if you can't distinguish a "web app" from what you consider a "web site". Is your content static? Is interaction core to your content (like a video game or a data-viz explorer)? Even if interactivity is core to your content, why not first offer something like a product page? Describe your interactive content in text, in images, or in a demo video; just the minimum static content a user can quickly download before the rest of the richer content is loaded. It's not much of an extra effort; you aren't back-porting your interactivity to less capable devices. Instead you're using static content to show what your game could be.

[1] https://jakearchibald.com/2013/progressive-enhancement-still...


So for the novice web developer, what is the recommended course of action for creating a "modern," mobile-friendly website? The options seem to be:

- Code all HTML and internal CSS by hand

- Use a CSS framework

- Use something like Bootstrap or other aesthetic rendering powered by JavaScript/jQuery

The problem seems to be that less work done by the designer means increased reliance on JavaScript or external engines, which then cause additional overhead, bandwidth use, and possible breakage for people using NoScript or AdBlock. And to be honest, I'm sort of on their side, as it seems silly that a single website needs to reach out to three or four outside domains just to style the page properly.

What's the solution here?


As I've mentioned elsewhere, the best solution that I've been able to come up with is intercooler:

http://intercoolerjs.org/

You can build dynamic websites with minimal or no JavaScript and just use the same old techniques you are used to.


I don't think the problem is the novice web developer at all. Most of the time using just one framework is not going to give users a problem. The issue is big sites that use maybe one or two big-name web frameworks for compatibility, then have an internal framework, and then add in a video player framework, an ad-serving framework, and an analytics framework. It all adds up after a while and ends with a lot of sites just using "apps" for mobile instead.


If web developers want to build applications for anyone outside the West, they'll need to work on faster page loads. A large percentage of the world views the internet on mobile devices with slow connections.

Blocking content until a large JS file is downloaded, at least one more AJAX request is made, and another round of downloads occurs is unacceptably slow in a developing country.

Isomorphic applications or server-side rendering gives a company a competitive advantage in these markets.


Bullshit. The author is confusing web sites and web apps.


No, I think many people do. People are beginning to treat all websites as if they should be web apps.

Look specifically at Wix sites, Google Groups, and Blogger. These should not be web applications; they are standard websites, repositories of information that need not live-update or run any JavaScript whatsoever for the most basic of use cases, and yet they are entirely, 100% unusable without JavaScript.

Browse the web without JavaScript and you'll find that most things are broken; even static content just doesn't work. That's the problem. Web apps are cool. Your blog or a newsgroup _requiring_ JavaScript just to be read is not cool.


Part of the problem is that the web content and security model requires one to treat a domain as somewhat isomorphic to an app.

It wouldn't be so bad if the technology existed to reliably say "foo.com is serving data that can be rendered via the some-standard-component presentation layer." Then the browser itself would have the capacity to cache and optimize the rendering layer for common content schemas. As it stands, your choices are: code to one schema (the HTML5/CSS/no-JavaScript presentation standard, with all the restrictions that implies); write a web app that will be a special little snowflake different from all the other special little snowflakes on the web, complete with its own chunk of memory and processing in your browser; or throw up your hands and commit your content to one of the existing solutions that build the app for you (but then also basically control your data).

We're missing a functional division between data and app that we have in the desktop model.


How about we parse the HTML and CSS on the server, because, you know, the user might have a browser with a slow rendering engine. Then we just ship them the images.


That's actually more-or-less how Opera Mini works: https://en.wikipedia.org/wiki/Opera_Mini#Functionality


Your sarcasm falls flat for me: if browser HTML and CSS renderers were written in JavaScript, we probably would have to take extreme measures to keep the web usable, especially on mobile. The effect it would have on battery life chills the blood.


In my clueless ignorance, that was what I first thought the author was ranting about when they mentioned "server-side rendering"...


We should turn Adobe Photoshop into a web browser!


I hope you’re being sarcastic.


indeed


Puts a damper on the SPA + AWS API Gateway + Lambda app I was developing.


Here is what I think about this...

  =============================
  |                         X |
  |                           |
  |  Would you mind taking a  |
  |  survey to give feedback  |
  |  about this Hacker News   |
  |  comment? It will only    |
  |  take 5 minutes and you   |
  |  can win a new Macbook    |
  |  Pro.                     |
  |                           |
  |   YES      NO     LATER   |
  |                           |
  =============================
In conclusion, stop breaking HN.



