CSS was originally designed with the idea that you would have one CSS file for your whole site, and it would be maybe 300 lines max.
We've obviously gone well off the rails. Selector specificity was supposed to be the scoping mechanism, and it works great in the originally envisaged use case: a single person making a single website. It just doesn't scale to large teams and ancient software projects with complicated layouts.
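To illustrate the scoping problem with a contrived sketch (the selectors and colours here are made up, not from any real project): two teams target the same element, and whichever selector is more specific wins, regardless of who "owns" the component:

    /* team A's stylesheet */
    .sidebar a { color: blue; }

    /* team B's stylesheet, loaded later, accidentally more specific */
    nav .sidebar a { color: red; }

    /* team A escalates, and the specificity arms race begins */
    .sidebar a { color: blue !important; }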
It never worked well in the originally envisaged use case either. CSS 1 didn't work properly with tables, which meant I used to have to inline style attributes for tables as well as set a CSS file. It's always been garbage for centering things; in the early days that meant using the <center> tag. Then you have all the different vendor-prefixed browser extensions (-o-, -moz-, etc.), the inconsistent support for font-face, and so on.
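(For anyone who missed that era, the prefix situation meant writing the same declaration several times over. A rough sketch of what a stylesheet looked like circa 2012:)

    -webkit-transform: rotate(5deg);  /* Safari, Chrome */
    -moz-transform: rotate(5deg);     /* Firefox */
    -ms-transform: rotate(5deg);      /* IE 9 */
    -o-transform: rotate(5deg);       /* Opera */
    transform: rotate(5deg);          /* the eventual standard */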
CSS was a great idea, but it's always been a garbage execution of that idea: no single browser followed the spec correctly for years, and then each browser went off and added their own crap when the spec ended up languishing (thank you, W3C, for sleeping while the web leapfrogged into the future).
I'm not a fan of HTML (I think it was good for its time but could use deprecation) nor JavaScript (though I'll concede this is personal preference - still, wouldn't it be nice to have browsers run bytecode, so you could have a choice of languages rather than having to compile everything down to JavaScript?), but CSS is easily the worst of a bad bunch in my personal opinion.
Frankly, I don't even get why people are defending it. Any programming tool that reduces the developer to hours of trial and error just to do basic things is clearly a misstep. Sure, expert front-end developers with years of experience, using bloated frameworks, cope fine; but why have we allowed ourselves to get into this kind of mess in the first place?
I would honestly welcome a ground up complete reimplementation of the web if browser vendors all decided to work together on one.
That's a great reference (thank you for the link), but one of the great - if not the single greatest - accomplishments of the web was that it allowed anyone with a text editor and access to the internet to publish content with ease. I get that commercialisation of any platform will lead to specialities in that field; however, the tools shouldn't make it objectively harder for newcomers to contribute, let alone discriminate against hobbyists who might not want to spend several hours of their life learning CSS (never mind HTML, perhaps JavaScript, how cookies work and any regional legislation, OWASP should they dare to have any user submission forms, etc.).
Speaking personally, I published my first website in 1994, when there weren't different layout modes. Now it is suggested that I read a multi-chapter book just to learn what's changed in CSS, so I can publish the same content I had before but in a "web 3.0" (for want of a better description) format.
I appreciate my comments are very ranty / preachy, and I do value your comment, as I hadn't seen that link before; so my comments are not directed at you in any negative way at all! What I'm essentially trying to say is that web development has become very frustrating for anyone outside of the webdev community. Heck, I find it frustrating, and I used to specialise in hardening web servers, so I've worked quite close to that community.
It is as simple as it ever was to do what could be done 15 years ago with a simple text editor and access to the internet. Every adjoining aspect has gotten easier and cheaper by an order of magnitude.
It's just that what people want has gotten a lot harder to build.
The removal of the <center> tag in favour of CSS black magic is definitely not "easier" than it was 15 years ago. However, I suspect this is going to be one of those debates where we have to agree to disagree.
I should add that I don't disagree with the deprecation of <center> from a language purist's perspective. However, it was still a step backwards in terms of ease of development, when centring stuff in CSS is so clunky in comparison.
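To make the comparison concrete (a minimal sketch; flexbox is just one of several modern approaches):

    <!-- then: -->
    <center>Some content</center>

    <!-- now: -->
    <div style="display: flex; justify-content: center;">Some content</div>

And that only covers horizontal centring; vertical centring needs align-items: center on top of that.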
>I would honestly welcome a ground up complete reimplementation of the web if browser vendors all decided to work together on one.
Was that the sound of a pig flying by, or the sound of hell freezing over? Any time specs can be interpreted, there will always be these differences in browsers. Maybe we could not call them specs, and just call them suggestions?
Ostensibly I do agree with you; however, that is the reason I phrased it as "browser vendors working together" rather than "a new spec being written".
However, as you said, the chances of that happening are remote. Even putting personal agendas aside (e.g. Google wanting to control content via AMP), the web in its current form is "good enough" that there's no real drive to reinvent the wheel for something with such deep market penetration already.
The question is, why isn't "the javascript equivalent of bytecode" just javascript? Why do you think there's a difference between "javascript" and "bytecode"?