Those people making sites that are completely broken without Javascript have very precise numbers showing that approximately none of their repeat visitors disable Javascript.
We, on the other hand, have no unbiased numbers to look at to discover whether it's a common behavior ;)
I reckon the key word in this comment is "approximately".
I might still be able to get what I need from a site that someone believes is "completely broken", including on repeat visits, without using Javascript.
Sometimes HN commenters debate what it means when a site "does not work" without Javascript. Some believe that if an HTTP request can retrieve the content, then the site works. Others believe that if the content is not displayed as the author intended, then the site is not "working".
I would bet that the definition of "completely broken" could vary as well.
Do the people running sites try to determine how many users are actually using Javascript to make the requests, e.g., to some endpoint that serves the content, maybe a CDN?
Browser authors could in theory include some "telemetry" in their software that reports back to Mozilla, Google, Microsoft, Apple, etc. when a user has toggled Javascript on or off. Maybe it could be voluntarily reported by the user in the form of opt-in "diagnostics".
OTOH, what can people making sites do to determine whether a GET or POST, accompanied by all the correct headers and sent to a content server, came from a browser with Javascript enabled, from a browser with Javascript turned off, or from some software that does not interpret Javascript at all?
The content server just returns content, e.g., JSON. It may distinguish a valid request from an invalid one, but how does it accurately determine whether the HTTP client is interpreting Javascript? If a user were to copy the request out of Developer Tools and replay it from a custom HTTP client that has no JS engine, can/do they measure that?
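For what it's worth, the usual way sites try to measure this is a pair of beacons: a script that fires a request only when Javascript actually runs, and a noscript fallback that fires only when it doesn't. A minimal sketch in TypeScript, assuming Node's built-in http module; the /js-beacon and /no-js-beacon paths are made up for illustration:

    // Counting sketch: one beacon only a JS-executing client will request,
    // one fallback beacon for clients that render the HTML but run no JS.
    import { createServer } from "node:http";

    let jsHits = 0;   // requests that imply Javascript was executed
    let noJsHits = 0; // requests from the <noscript> fallback

    const page = `<!doctype html>
    <html>
      <body>
        <p>Content served to everyone.</p>
        <script>fetch("/js-beacon");</script>
        <noscript><img src="/no-js-beacon" alt=""></noscript>
      </body>
    </html>`;

    createServer((req, res) => {
      if (req.url === "/js-beacon") {
        jsHits++;
        res.writeHead(204);
        res.end();
      } else if (req.url === "/no-js-beacon") {
        noJsHits++;
        res.writeHead(204); // a 1x1 GIF would normally go here; 204 is enough for counting
        res.end();
      } else if (req.url === "/stats") {
        res.writeHead(200, { "content-type": "application/json" });
        res.end(JSON.stringify({ jsHits, noJsHits }));
      } else {
        res.writeHead(200, { "content-type": "text/html" });
        res.end(page);
      }
    }).listen(8080);

But note the limitation: a curl user or a custom client that hits only the JSON endpoint fires neither beacon, so this only counts clients that actually render the HTML. Anyone fetching the content directly is invisible in both tallies, which is exactly the ambiguity above.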
Regardless of how easy or difficult it would be to reliably determine whether a client making a request is interpreting Javascript (i.e. more than simply looking at headers or network behaviour), the question is: how many people making sites actually do that?
They can more easily just assume (correctly, no doubt) that few users are emulating favoured browsers rather than actually using them. One might imagine they could have a bias toward assuming that the number of such users is small, even if it wasn't. :)
> Why is there even a setting? How many people would ever want to turn Javascript off?
Because without JS pages load much faster and the browser uses less memory. Ads and tracking often don't work without JS. Why would anyone want to use JS?
It's on by default because it is very useful. Just because some websites abuse the ability to run active content doesn't mean Javascript shouldn't be an integral part of the modern web. Browsers should allow the user to crack down on the abuse, but expecting sites to cater to people who disable Javascript is a bit much.
If it were off by default, would web developers cater to the incredibly small percentage of people who change default settings to turn it on?
Why is there even a setting? How many people would ever want to turn Javascript off?
When they provide a setting to toggle Javascript, are browser developers catering to an incredibly small percentage of people? How many people use it?