Another option is to make a website that works without JavaScript. Only dynamically fetch pages when JavaScript is enabled. Progressive enhancement rocks.
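To make that concrete, here is a minimal sketch of the idea (the element IDs, URLs, and use of fetch are my own illustration, not anything Monocle actually does): the markup is an ordinary link that works everywhere, and a small script upgrades it to an in-page fetch when JavaScript is available.

    <!-- Works without JavaScript: a plain link to a server-rendered page -->
    <a class="post-link" href="/posts/how-yield-will-transform-node">
      How yield will transform Node
    </a>
    <div id="content"><!-- server-rendered article goes here --></div>

    <script>
      // Runs only when JavaScript is available: intercept the click and
      // swap in the fetched page instead of doing a full page load.
      // (A real app would request just a fragment or JSON, not a whole page.)
      document.addEventListener('click', function (e) {
        var link = e.target.closest('a.post-link');
        if (!link) return;
        e.preventDefault();
        fetch(link.href)
          .then(function (res) { return res.text(); })
          .then(function (html) {
            document.querySelector('#content').innerHTML = html;
            history.pushState(null, '', link.href);
          });
      });
    </script>

With something like that in place, search engines, NoScript users, and curl all get the plain link, while JavaScript users get the single-page behaviour.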
All page titles are <title>Monocle</title>, which isn't very descriptive, and the meta description is the same on every page. Only the escaped_fragment version gets descriptive page titles; for JavaScript users, every page is simply titled "Monocle".
There is no unique content to rank #1 for; the articles are all found on other sites. From a quick glance, I don't see Monocle ranking #2 much either. Those spots go to other aggregator sites.
The Google guidelines say:
> Make pages primarily for users, not for search engines.
> "Does this help my users? Would I do this if search engines didn't exist?"
I'd extend that to JavaScript apps. Why make escaped_fragment especially for search engines, and then forget to offer this functionality for human users too?
Honestly, with every worthwhile browser (both desktop and mobile) supporting JavaScript and having it enabled by default, it doesn't make much sense to support browsers with it disabled. The couple million people with NoScript installed who enable it only on specific sites know to enable it if a site doesn't work.
- Not all search engines can handle JavaScript; Google is leading the way, but even Google isn't perfect at it.
- Those NoScript users could be potential clients or users of your site. I love it when my conversion rate goes up 1%, and that becomes harder when you ignore 1-5% of users.
- Screen readers generally do not support JavaScript [not true, see comment]. If even one blind user gets to access the content/design I created, that is worth it to me. I am thinking as a front-end engineer here, not as a business owner, where time is money. (Also, depending on the jurisdiction, it may be against the law to be inaccessible.)
- NoScript users will likely bounce in large numbers when they see just a blank page. Simply adding a <noscript> tag explaining why you need JavaScript goes a long way.
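For instance, something as small as this (the wording is just a placeholder) is already a big improvement over a silent blank page:

    <noscript>
      <p>
        This site loads its content with JavaScript. Please enable JavaScript
        for this domain (or whitelist it in NoScript) to read the posts.
      </p>
    </noscript>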
As a front-end engineer I go for maximum content accessibility. I don't meddle in the politics of things ("If we don't drop IE6 support, the web won't move forward!").
I totally understand the new landscape, where a lot of people have JavaScript on. Some web apps cannot use progressive enhancement because the JavaScript is core to the app. But for "static" content websites like this one, it is certainly possible to make a site that is usable by most users, human and robot.
Thank you! I've been behind the times. I'll start viewing it as a general accessibility issue, not something specific to screen readers. But as Steve Klabnik commented, and per http://www.w3.org/WAI/intro/aria.php , there are a few more steps to take to make JavaScript-enabled screen readers play nice:
"WAI-ARIA addresses these accessibility challenges by defining how information about this functionality can be provided to assistive technology. With WAI-ARIA, an advanced Web application can be made accessible and usable to people with disabilities."
But WAI-ARIA has nothing to do with JavaScript as such; rather, it is the application of semantic roles and attributes so that non-native controls can be interpreted by screen readers.
I'm not saying that you shouldn't ensure that your website is crawlable by search engines. You absolutely should. But we're at a point today where you simply don't need to worry about whether your site's functionality works without JavaScript. Even Mozilla is removing the option to disable JavaScript from the browser's UI (you can still use the advanced about:config page or extensions).
As mentioned elsewhere, screen readers fully support JavaScript and have for a very long time (which you've updated your post to acknowledge). That was one of the big reasons I used to ensure sites worked without JavaScript years ago.
NoScript users know enough to turn it on when a site needs it to work. NoScript users only make up 0.08% of internet users worldwide, a far cry from 1-5%. Basically, you can safely give them a significantly reduced site experience or just a message to turn JavaScript on. They're not really worth the effort in most cases. Hardly anyone disables JavaScript anymore, as most sites simply won't work right without it.
If you want to expend the extra effort, more power to you. It's just that for most sites nowadays, it's not worth the time/money anymore.
It's not a question of having JavaScript enabled or disabled. It's a question of whether the JavaScript gets delivered from your server to the client's browser, properly and fully, in a state where it can be executed successfully. Progressive enhancement is about robustness: staying usable when the network fails to be perfect.
Look at it this way: __escaped_fragment__ is only supported by Google; no other search engine supports it. Whereas if you build with progressive enhancement first and then enhance it with JavaScript, you get content that's indexable by any search engine, not just Google.
The workload is the same, the complexity is the same; the only difference is focusing on progressive enhancement first rather than trying to bolt on a clearly less optimal solution later.
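For context: Google's AJAX crawling scheme rewrites a hashbang URL such as example.com/#!/posts/123 into a crawler request for (roughly) example.com/?_escaped_fragment_=/posts/123, so supporting it means special-casing that query parameter on the server. A rough Node/Express sketch, with a made-up renderStaticHtml placeholder, looks something like this:

    var express = require('express');
    var app = express();

    // Placeholder for whatever produces the pre-rendered page.
    function renderStaticHtml(fragment) {
      return '<html><body><!-- pre-rendered content for ' + fragment + ' --></body></html>';
    }

    app.get('*', function (req, res, next) {
      if (req.query._escaped_fragment_ !== undefined) {
        // Googlebot asking for the crawlable version of a #! URL.
        return res.send(renderStaticHtml(req.query._escaped_fragment_));
      }
      next(); // everyone else gets the regular JavaScript app
    });

    app.listen(3000);

If you instead serve that rendered HTML to every client at the canonical URL, the special case (and the Google-only dependency) disappears.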
You have to support non-JS browsers if you want to support search engines crawling your sites. The hash fragment is a non-JS method that only works with search engines. Why not use something that works with noscript browsers as well, and incidentally with search engines that don't support the hash fragment?
Don't forget that JavaScript performance on mobile is abysmal. If your use of JavaScript is much more than light fluff, it's going to seriously annoy mobile users.
> Honestly, with every worthwhile browser (both desktop and mobile) supporting JavaScript and having it enabled by default, it doesn't make much sense to support browsers with it disabled.
Well, JavaScript support in w3m and Lynx is still pretty poor. It also makes it harder to grab content for offline reading using wget/curl.
It does depend on what kind of site/app you're making. I'd say that if your main content is text, then requiring JS makes no sense (especially if you're publishing blog posts on configuring server software -- I might want to download that article to a headless server).
I think most apps also benefit from an old-school REST architecture, so that it is possible to, e.g., script your todo app with curl to create a new todo item without having to go through three pages of API specs.
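As a sketch of what I mean (a hypothetical Node/Express todo app with made-up routes): plain resource URLs responding to ordinary HTTP verbs, so the browser UI and a one-line curl call hit the same endpoints.

    var express = require('express');
    var app = express();

    app.use(express.urlencoded({ extended: false })); // parse form bodies
    app.use(express.json());                          // and JSON bodies

    var todos = []; // in-memory store, purely for illustration

    // List all todo items
    app.get('/todos', function (req, res) {
      res.json(todos);
    });

    // Create a new todo item
    app.post('/todos', function (req, res) {
      var todo = { id: todos.length + 1, title: req.body.title };
      todos.push(todo);
      res.status(201).json(todo);
    });

    // usage from the shell: curl -d 'title=Buy milk' http://localhost:3000/todos
    app.listen(3000);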
I certainly could make the titles more useful; in fact, thanks to your suggestion I've just committed a fix for that.
I'm interested, though: why do you think it's useful for users to read meta descriptions? Or the raw text used by spiders?
How does making this a pure JavaScript web app degrade the end-user's experience? I think I can effectively argue the opposite - that JS and client-side rendering make for a much better experience.
It's slower to reach a first loaded, usable application on the client side.
If we were to take two web apps, one using a rendr-style render-on-server approach ( https://github.com/airbnb/rendr ), and one using a blank-HTML, bootstrap-through-JS approach, the rendr-style app will win out on time to first interaction.
To take a specific example, loading the Monocle home page gives a base HTML time of 355ms for me; setup.js takes another 570ms.
All told, it's an initial load of 355ms vs 970ms. Or "close to instant" vs "is something wrong? oh no, it's good".
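To make the difference visible, here is simplified, made-up markup for the two initial payloads:

    <!-- blank-HTML bootstrap: nothing readable until setup.js has been
         downloaded, parsed, and has fetched and rendered the data -->
    <body>
      <div id="app"></div>
      <script src="/setup.js"></script>
    </body>

    <!-- rendr-style: the first response already contains the rendered posts;
         the same script takes over afterwards for client-side navigation -->
    <body>
      <div id="app">
        <article>How yield will transform Node</article>
        <!-- ...more posts... -->
      </div>
      <script src="/setup.js"></script>
    </body>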
>why do you think it's useful for users to read meta descriptions?
It isn't, really. I made a mistake and thought that every page (served to users and search engines) had the same title and description. A user is unlikely to read a meta description, unless he/she is on a search results page.
>How does making this a pure JavaScript web app degrade the end-user's experience?
NoScript users and special needs users are unable to access your website's content.
> JS and client-side rendering makes for a much better experience.
Agreed. Progressive enhancement makes this possible and gets the best of both worlds: accessible content for NoScript users, and a spiffy JS-rendered single-page app for JavaScript users. RMS can even download your pages through Lynx.
If you go pure JavaScript, you can at least add a <noscript> explaining why JavaScript is needed to enjoy the site.
> "special needs users are unable to access your sites content"
Sorry, I have to call you out on this. Ignoring the use of the term "special needs", how could this possibly affect someone's use of JavaScript?
Visiting http://monocle.io/posts/how-yield-will-transform-node without JavaScript support yields a blank page.