Hacker News new | past | comments | ask | show | jobs | submit login

Sending incomplete pages means those of us who don't run javascript[1] don't actually see your site. First impressions matter, and when your page is:

    SomeSite

    {{storyTitle}}

    {{storyBody}}
    {{curPage}} of {{numPages}}
...the common interpretation is "broken site". This current fad of being too lazy to implement progressive enhancement is a regression. Rendering on the server so you serve up an actual page is trivial, and you can still provide javascript that loads the next pages faster. Serving up only a template (or worse: an empty body tag) is insane.
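A minimal sketch of what "render on the server" means here (plain Node, no framework; the template mirrors the placeholder example above, and the `render` helper is illustrative, not any particular library's API):

```javascript
// Fill the same {{placeholder}} template on the server, so the first
// response is already a complete page. Client JS can then take over
// for faster subsequent navigation.
const template = `<h1>{{storyTitle}}</h1>
<div>{{storyBody}}</div>
<p>{{curPage}} of {{numPages}}</p>`;

function render(tpl, data) {
  // Replace each {{key}} with the corresponding value; unknown
  // placeholders are left untouched.
  return tpl.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in data ? String(data[key]) : match);
}

const page = render(template, {
  storyTitle: 'Hello', storyBody: 'World', curPage: 1, numPages: 3,
});
console.log(page); // complete HTML, readable before any client JS runs
```

The same template can still be shipped to the client for re-rendering later pages; the point is only that the first response stands on its own.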

The usual counter, that "javascript is always available", not only ignores the risks; I also suspect the claim is based on bad data. How do you know how many people disable javascript? We aren't going to show up in most analytics...

[1] for the numerous security and privacy reasons. Running arbitrary instructions in a Turing complete language is a bottomless pit of problems, and "analytics" is still spyware. Google shouldn't get to build a log of every page we visit.




> How do you know how many people disable javascript?

In this day and age, most businesses don't care about this type of user. I have no sympathy for those who intentionally cripple the web and don't care to cater to them. You aren't worth it; progressive enhancement isn't worth the effort. It's cheaper to presume Javascript and ignore users like you altogether; don't forget this is business, our motives are profit, not doing things "right".


So so misguided - what happens when your javascript request fails? Or a broken build throws a script error? Progressive enhancement makes sure your content is accessible no matter what the conditions.


> what happens when your javascript request fails?

You hit 'refresh'

> Or a broken build throws a script error?

The same thing that happens when a broken build returns a 500---the user can't use that service until the developer fixes it.

Progressive enhancement is a theoretically good idea that---in practice---actually adds a lot of overhead to developers (because every layer of progression is its own UI, with its own user experience and considerations).


"You hit 'refresh'"

_You_ don't, the customer does.

Would you care to name the sites you work on so I can stay away?


"Me" in this context is the customer. I'm talking about pages I see in the wild; when pages can't talk to their backends, they throw a "We crashed; please refresh" dialog, and I do what I'm told. ;)

Pages I write generally try their best to wrap calls that can fail in a reasonable retry envelope, with some intelligent discernment of which response codes can be retried (429, the occasional 420 if someone thought Twitter was cute, the VERY occasional 500 if I just happen to know that the service in question is flaky), and only surface the error to the user if the client can't retry it. That's a better user experience than the non-JavaScript forms-only sites I've used, which tend to throw their 429s and 500s straight at the user and expect them to know what a "back" button is (and whether it's safe to resend a form in this context).

(Incidentally: I do find myself having to re-invent that "retry envelope with a success handler, fail handler, and filter to determine if the response should be retried" boilerplate over and over as I move among frameworks; if anyone's built a smooth request wrapper for that, it'd be nice-to-have).
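A sketch of that retry envelope, under my own assumptions (the name `withRetry`, the option names, and the defaults are all illustrative, not any framework's API):

```javascript
// Wrap an async request function; retry only when a caller-supplied
// filter says the failure is retryable (e.g. HTTP 429), otherwise
// surface the error immediately.
async function withRetry(fn, { retries = 3, retryable = () => false, delayMs = 0 } = {}) {
  let lastErr;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (!retryable(err) || attempt === retries) throw err;
      if (delayMs) await new Promise(r => setTimeout(r, delayMs));
    }
  }
  throw lastErr;
}

// Usage sketch: a fake request that rate-limits twice, then succeeds.
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw Object.assign(new Error('rate limited'), { status: 429 });
  return 'ok';
};

withRetry(flaky, { retries: 3, retryable: e => e.status === 429 })
  .then(result => console.log(result, 'after', calls, 'calls')); // "ok after 3 calls"
```

The `retryable` filter is where the response-code discernment described above would live; a real version would likely add exponential backoff via `delayMs`.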


What happens when the server-side templating code fails? That doesn't seem to be an issue with client vs server side HTML generation.


The administrator gets a report immediately. Also, the server environment is much more under your control. Been there, done that; the author has a lot of good points. Partial server rendering is rock solid and so much faster. Not looking back.


I use the noscript tag to tell people with JS disabled that the site requires JS, end of interaction. Enable js or go away, I have more important things to do than bother with people who intentionally break their browsers. They aren't the target market and aren't worth catering to.
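For what it's worth, that pattern is a single tag; a minimal sketch (the wording is illustrative):

```html
<noscript>
  <p>This site requires JavaScript. Please enable it to continue.</p>
</noscript>
```

Browsers render the contents only when scripting is disabled or unsupported, so it costs nothing for everyone else.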

This is business, not academia, right and wrong are judged by profit and loss and opportunity cost, not by what is ideal given unlimited resources. Work on feature X or double my work so a few people a day who break their browsers can still use the site... one is practical, the other is not.


I actually would like to get HN's take on this. Maybe someone should submit an HN poll. I'm still of the opinion that disabling javascript is an extreme measure, and those who do it need to come to terms with whatever broken internet experience they get. While I do think it's worth it to display something like "Looks like javascript isn't enabled; you'll need to turn it on for this site" to the end user, is it truly reasonable to spend development time making your site functional without client-side javascript? Probably depends on your audience. I bet sites that are more commonly accessed through Tor have more visitors with javascript turned off.

But yeah, anyone here work webdev where your webapp is expected to fully work without client-side javascript?


I've worked on projects where that was the expectation; I've set that expectation for projects.

The reasoning has both been concrete/practical and philosophical (but still practical):

* Mobile processing time is still costly in terms of battery life and performance. And the number of http calls (and their latency) also makes a difference in performance; SPAs tend to have smaller requests but larger numbers of them and it seems to me that's actually the opposite profile of what 3/4G cellular networks are good at. (And while this is all less true on the desktop I'm starting to find it annoying that we're nevertheless finding ways to make things choppy and slow on 2 GHz machines with operations not more complex than scrolling).

* This is more vague, but I find there's a discipline imposed in starting the conception of the app in terms of plain HTML/HTTP that seems to keep things better organized, while projects that start with a focus on a rich/heavy UI devolve into overspecific yet mixed concerns more quickly. This doesn't work for everything, since some apps just aren't about resources and media types. But honestly, your app probably is. :)

* Being able to debug/autotest with something like curl is pretty nice.


Nope. To me complaining that an app doesn't work with JS disabled is like complaining that the layout is busted with CSS disabled. It's not clear to me why I should worry about this case as a developer. You can turn shit off if you want to but don't complain that it doesn't work now.


I don't make that many web apps, mostly websites for clients, and I worry over IE8/9 users more than I do about people who turn their javascript off. It's a non-issue for us. If they have javascript turned off at such an aggressive level then they are used to things being broken all over the place.


I think it is not about web apps so much as casual "one time visit" browsing. There is usually zero incentive to enable JS for a random website where you just want to read an article or look at images. JS there is usually (my impression) used for things that distract or try to lure me into other content. I like distraction-free consumption. There is a high incentive to enable JS for a dynamic web app or a site you regularly visit (e.g. HN), though.

Good browser-level control over JS and other requests makes this quite convenient for an (expert) user. I like uMatrix a lot.


When I'm developing a website, if I have time to do things the right way, I do my best to present the user a usable website both without CSS and without JS. Good semantic markup usually provides a good user interface. And, as crawlers usually don't like JS, making a non-JS version helps them crawl the site anyway.


I browse with NoScript. Most sites are usable without JavaScript, maybe not as good looking, but who cares. A few sites are not usable and I don't like them. If I really care about their content I enable some of their scripts. After a little while, one gets good at recognizing the obvious candidates to enable and the obvious tracking scripts to keep disabled. This and that if I want to see videos (maybe I don't want), this and that if I want to hear audio, not that one because it's the ad script, etc. If I don't really care, I go somewhere else. The same content is usually available elsewhere.


What would be better is the ability to selectively enable individual scripts. I want the javascript for your webpage to run normally, but I don't want the javascript for that ad that moves around the screen, or for that tracker bug, to run at all. This was much easier when ads were mostly in Flash.


Just FYI, Firefox removed the setting to disable JavaScript from its preferences dialog a while ago. Now the only way to disable it is through about:config. This means that to a second or third approximation, 100% of Firefox users will have JS enabled. I expect other browsers to follow suit. JS is now a standard part of the web. Additionally, remember that JS is not required to track you: an invisible GIF works just as well to record which pages you visited.
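To make the invisible-GIF point concrete, a sketch (the tracker host and query string are hypothetical):

```html
<!-- The GET request alone, with its Referer header and query string,
     logs the visit on the tracker's server. No JavaScript involved. -->
<img src="https://tracker.example/p.gif?page=%2Farticle%2F42" width="1" height="1" alt="">
```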

BTW, how do you feel about apps running on your smartphone? Is that much different? Have you seen forecast.io? Well, probably not since you don't run JS, but check it out. Some people develop fantastic mobile apps in JS instead of Objective-C/Java and in that MO there is no option to render things server-side.


I install noscript on every firefox install that I can. Most people like that it makes a lot of sites run significantly faster.

I don't use a smartphone - they are a pathological platform entrenched firmly on the wrong side of the War On General Purpose Computing.

For the record: I do run some javascript, on a carefully selected whitelist basis. A big part of my point is that a website that works is something I might whitelist to access better features. The problem is showing a broken page instead of showing a basic page and progressively enhancing it with the fancier features.


I hope you don't drive a car either: it has many computers which are even further on the wrong side of the War On General Purpose Computing. How do people even come up with the idea that anything with a CPU inside should be able to do General Purpose Computing :(


Sending incomplete pages means those of us that don't run javascript[1] don't actually see your site.

Sure, and that's a choice that every website owner has to make. Building a working no-JS app is non-trivial for all but the simplest things. Increasingly, businesses I've worked with have found that "doesn't allow Javascript" is shorthand for "won't buy things online or share useful data due to security worries", so they're paying less and less attention to your needs. Things I build fall back to a simple no-JS version that prompts the user to phone in orders or turn on JS. I would expect that to become the norm over the next 2 or 3 years.


Unfortunately, you are in the vast minority of people, probably even on Hacker News (I would guess less than 1% of the overall population [1]).

So the decision has to be made, just like whether you want to support IE7 users, whether or not you want to put in the extra time to support those edge cases.

And as always, it depends on the type of site you are running. If it's Amazon, that 1% matters a shit ton. If it's a side project or SaaS startup for example, it makes sense to hold off on supporting those 1% in favor of more pressing features.

1. https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missi...


>Serving up only a template (or worse: an empty body tag) is insane.

No, it's rational if you're serving a web application instead of a web document. Most pages written with AngularJS are web applications.


You are almost alone, dude. Javascript-enabled users are the norm. People don't care about "running arbitrary instructions in a Turing complete language"; they care about buying things and watching videos faster.


I think it's reasonable to assume that either JavaScript is available or the user has made the conscious decision to turn off JS; if so, that user is probably savvy enough to realize the issue and decide whether to enable JS for the particular site.

Far less reasonable is faulting a JS framework for assuming it can use JS.

Still, I agree that if JS isn't essential to the functionality of the site, a non-JS fallback should be available.


> Google shouldn't get to build a log of every page we visit.

Why not block Google at your router?


    $ cat /etc/hosts | grep google-analytics
    0.0.0.0 google-analytics.com
    0.0.0.0 www.google-analytics.com
    0.0.0.0 ssl.google-analytics.com
Unfortunately, the problem is dynamic; any blacklist is always going to be outdated. A whitelist approach is the only blocking method that works. Javascript spyware has gotten a lot worse in the last ~year. A mainstream news site I happened to test recently wanted to issue HTTP requests to no less than 34 unique hosts, just to render a typical static news article. That wasn't the ads (adblock edge).


I think you should distinguish between web applications and web sites. It is nice when content sites work without JS; an application really doesn't have to.


Not at all. Even when you are serving up an "application", sending a broken page isn't a good idea. I've written a couple of very heavyweight "web apps" myself (for in-house use), and even those are careful to always send full pages, even though page-rewriting was used most of the time. Given that this was easy to do in Rails, I fail to see why rendering the template on the server during the first request is hard. It's pure laziness by these newer frameworks.

If "app" means some pure-javascript game or similar, the very least you can do is provide a proper page indicating that the game requires javascript. A warning message conveys useful information; a broken template or empty body tag conveys "bugged website".


People who intentionally disable javascript in the year 2015 should also disable CSS, then close their browsers, then remove themselves from the network, then turn off their computers, then go live in a cave.



