I think it's harmless, and it crystallizes the idea that things are changing, the same way that the term "Web 2.0" did.
Muggles have a lot of trouble with names. I worked for a big Uni back in the age when Mozilla's browser was called "Mozilla" and it was impossible to get anybody to take Mozilla seriously. This was a place full of Unix-heads and Microsoft-haters, and we found that in 2002 we still had a 30% market share on campus of the despicable Netscape 4 browser.
When "Firefox" came out, all of a sudden the muggles realized there was an alternative to IE, so Netscape 4 finally died for us.
The project to obliterate Flash and replace it with something open needs a name... And "HTML 5" is as good as any, imprecise as it is.
Yehuda's point is actually that HTML5 is NOT like Web2.0. Everybody knew that Web2.0 wasn't a thing. Now there are people confusing HTML5 as a thing with the process of making the web better.
People regularly convert verbs into nouns. It's a semantic error, but it makes the invisible process of "making the web better" visible to the muggles.
verbing nouns does indeed weird language, but that's not the point here.
The point is that analysis of HTML5 as a standard misses the aim of the current effort to improve the web, and improve computing in general.
It does not matter where HTML5 and Flash happen to stand at the moment in aggregate usage across companies. Who cares. There are many reasons for adoption of each technology, dependent on the circumstance of each company and their business models.
That's why HTML5 as a noun is a bad banner, avatar, symbol, whatever. Web2.0, nebulous as it may have been, could not be confused with an actual object.
HTML5 suffers the same problem that XML did. People thought anything that touched XML was going to be the wave of the future, and improve everyone's lives, and be both machine and human readable, and lead to a revolution in how metadata was transported.
It is true that XML led to cool new capabilities on the web. It also was not a panacea, nor did it instantly make any project that used it awesome (XML config files are one of the most heinous digital inventions ever).
People should not confuse the movement to create better web technologies, with the web technologies themselves, or we will all miss the point.
"The project to obliterate flash and replace it with something open needs a name..."
How about 'Throwing a perfectly good technology out with the bath water in the name of open source zealotry so that we can replace it with a mish-mash of SVG and Canvas graphics, multiple non-DRM video codecs, and an ever moving WebSocket spec'? :)
Note that Microsoft and Apple want to obliterate flash just as much as anybody does... IE9 shows that even Microsoft realizes that "HTML5" is a far deadlier weapon than Silverlight.
We need to hope that Tim O'Reilly can invent a better term.
I'm quite serious. "Web 2.0" may have been an amorphously defined grab-bag of a concept, and some people didn't like it, but at least it didn't get itself confused with a W3C standards document. If you call something "Web 2.0 compliant" people tend to recognize that you are kidding.
Unfortunately, the term, and therefore the way the technology is understood by tech writers, is causing some fairly serious problems.
The thing is, the core problem isn’t that the name is too fuzzy. It’s what the name implies.
The truth is, the “completion” of HTML5 is absolutely irrelevant.
Basically, the tech press shouldn't use the term HTML5 because they don't understand it? HTML5 is a major milestone for the web and the term carries much more meaning than previous buzzwords (AJAX, Web 2.0, DHTML, etc.). HTML5 is a new technology unlike AJAX or DHTML which defined new ways of using existing technologies (AJAX = Javascript + XMLHttpRequest API, DHTML = HTML + Javascript DOM manipulation). Unlike those terms, HTML5 is very well defined and therefore quite hard to misuse. Am I missing something?
I must admit, I'm confused. It will never be finished? Isn't it just a specification? Isn't there some time when they say the spec is done and signed off on by the appropriate standards bodies?
HTML5 has come to mean two things - the specification, and the loose group of new browser technologies. A lot of people, for example, talk about the new "HTML5 geolocation API", but there's technically no such thing - there's a geolocation API, but it's not HTML5.
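For what it's worth, the API in question is tiny. A sketch of what calling it looks like, through a thin wrapper so it can be exercised outside a browser (`whereAmI` is a made-up name; only `getCurrentPosition` is the real W3C Geolocation API call):

```javascript
// Sketch of the W3C Geolocation API (a separate spec, not HTML5 proper).
// whereAmI is a hypothetical wrapper: it takes any object with the
// Geolocation shape, so you can pass navigator.geolocation in a browser
// or a stub anywhere else.
function whereAmI(geo, onCoords, onError) {
  if (!geo || typeof geo.getCurrentPosition !== 'function') {
    onError(new Error('geolocation not available'));
    return;
  }
  geo.getCurrentPosition(
    function (pos) { onCoords(pos.coords.latitude, pos.coords.longitude); },
    onError
  );
}

// In a browser: whereAmI(navigator.geolocation, showMap, showFallback);
```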
Sorta how people called JavaScript visual effects "AJAX" despite there being no asynchronous requests nor XML involved.
The HTML5 that the tech press talks about (I provided two of a large number of examples) is something entirely different: it refers to the general process of improving today's browsers. The day the HTML5 spec is approved, nothing will have changed on this front.
It's similar to how parts of CSS3 have been implemented before any browser has completely implemented CSS2.1. Improving browsers is a process, and it is not a single news story ("HTML5 NOW KILLS FLASH!" -- "HTML5 NOT READY TO KILL FLASH YET!").
I wasn't aware of that. Won't it be living hell for web developers trying to build cross-browser compatible websites? Will you have to perpetually keep track of the implementation status of every HTML5 feature for every browser in addition to keeping track of every browser version and their market share?
Edit: I wasn't referring to the DOCTYPE. I was referring to the fact that up until now you could just assume it was safe to use HTML 4.01 because you knew it was implemented (almost) consistently across a large share of browsers out there. Now you will have to keep track of features individually instead of keeping track of a discrete set of features AKA an HTML version.
The doctype has never been useful for detecting browser features. You needed, and still need, to do that with user agent sniffing or with Javascript. All the doctype has ever been useful for is controlling the browser rendering mode, i.e. standards compliant mode or quirks mode. HTML5 defines much more precisely how pages should be rendered, including how to render broken HTML pages, thereby obsoleting a thing like quirks mode.
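To make the rendering-mode point concrete: which mode a page ended up in is observable from script via `document.compatMode`, and that is about all the doctype has ever controlled. A tiny sketch (`renderingMode` is a made-up helper name):

```javascript
// document.compatMode reports "BackCompat" in quirks mode and
// "CSS1Compat" in standards mode. The doctype decides which mode you
// get; it tells you nothing about feature support.
function renderingMode(doc) {
  return doc.compatMode === 'BackCompat' ? 'quirks' : 'standards';
}

// In a browser: renderingMode(document)
```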
"The next version of HTML doesn't have a name yet. In fact, it may never have a name, because the working group is switching to an unversioned development model. Various parts of the specification will be at varying degrees of stability, as noted in each section. But if all goes according to plan, there will never be One Big Cutoff that is frozen in time and dubbed "HTML6." HTML is an unbroken line stretching back almost two decades, and version numbers are a vestige of an older development model for standards that never really matched reality very well anyway. HTML5 is so last week."
Check out http://www.modernizr.com, for example, which abstracts feature detection for many advanced "HTML5" features. You (sorta) don't have to worry about browser version, you just have to worry about feature support.
Feature detection is a far easier strategy for development than user agent sniffing, in my opinion at least. Granted it's less helpful for CSS-related issues, which more have to do with inconsistent implementation than with whether a feature is or is not supported.
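Hand-rolled feature detection looks roughly like this (a sketch in the spirit of Modernizr, not its actual code; `detectFeatures` is a made-up name, and taking the global object as a parameter is just to keep the sketch testable):

```javascript
// Probe the capabilities of a global object instead of sniffing the
// user agent string. In a browser you would pass window.
function detectFeatures(global) {
  var doc = global.document;
  return {
    canvas: !!(doc && doc.createElement &&
               typeof doc.createElement('canvas').getContext === 'function'),
    geolocation: !!(global.navigator && 'geolocation' in global.navigator),
    websocket: typeof global.WebSocket === 'function'
  };
}

// In a browser: var f = detectFeatures(window); if (f.canvas) { /* draw */ }
```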
Well, the recommendations won't be "ready" till when, 2018? And there is no way that even the top 3 browsers will ever support it in its entirety.
As it has always been, it's a matter of deciding what you need to do, deciding what your audience is, evaluating the tradeoffs and picking the best solution.
So I do some consulting for an equity research group and was at an event they put together for a bunch of investment banks, hedge funds, etc. These guys would keep asking "what about HTML5?" "I hear HTML5 is the way to go" "Is HTML5 going to kill Adobe?" Much of this was hot on the heels of the iPad announcement.
I kept wondering "Which damn part of HTML5?" The Canvas element? Web Sockets? Video? I think this is where the confusion lies: HTML5 isn't a single technology as much as a collection of cool HTML enhancements.
I actually think it's great that the term has caught on. If people were to talk about "next update to the W3C standard" or about the features individually, it wouldn't be getting nearly as much press as it does. And with the increased press comes increased pressure on the browsers (I'm looking at you, IE!) to actually support the new standards.
Well, I think it's too bad that HTML5 can't be treated like a milestone. Worrying about what will be officially recommended, what is or isn't part of HTML5 (geolocation API apparently) really puts me off thinking about developing for it.
You should be worried about what features have enough browser adoption in your target market to enable their use, not whether the W3C has marked a large group of loosely related technologies "done".
I agree, "HTML5" is a stupid name for what it's being used for... but what is the tech press supposed to call the loose bundle of browser technologies? They're going to call it something chosen by the path of least resistance.