I come from a Python background. However, thanks to work and school, I now program mostly in JavaScript, PHP, and Java (though I still use Python when I get a chance).
Now, I could use underscore_names in Java and JavaScript, but I don't. Even though I personally prefer underscore_names to camelCaseNames, I also realize that those languages are designed with camelCaseNames in mind, that the community conventions are for camelCaseNames, and that it is better to write code that looks nice and idiomatic in that language than it is to write code that looks nice and idiomatic in Python.
Not to mention that there are some cases where camelCaseNames are required - for example, when overriding inherited methods in Java - and if I used camelCase where required and underscore_names everywhere else, my code would be inconsistent, which to me is worse than using a style I don't like. So just because I could use underscore_names when I wanted to, there are a lot of reasons that I shouldn't.
A lot of those points also apply to JavaScript:
* JavaScript was designed with the use of semicolons in mind. Brendan Eich himself has said that ASI was only intended as an error-correction mechanism for sloppy programmers.
* Outside the Ruby on Rails crowd, all the JavaScript I have ever seen uses semicolons. Even within the Rails crowd, this "no semicolons" thing is fairly recent.
* Since the majority of JavaScript syntax is intended to mimic Java syntax, which does require semicolons to separate statements, semicolons blend well with the language, and are therefore nice and idiomatic.
* There are situations where ambiguity forces you to use semicolons to write straightforward code - there are workarounds, like tricks involving !, but they obscure the intent of the code (see the sketch after this list).
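A minimal sketch of that ambiguity (the names are invented):

    // The classic ASI trap: a line beginning with "(" is parsed as a
    // continuation of the previous statement, so this...
    //
    //   var greet = function (name) { console.log('hi ' + name) }
    //   (function () { console.log('IIFE') })()
    //
    // ...actually parses as greet = function(...)(function(){...})() and
    // throws a TypeError at runtime. A semicolon fixes it:
    var greet = function (name) { console.log('hi ' + name) };
    (function () { console.log('IIFE') })();
    greet('reader');

    // The no-semicolon "workaround": prefix the IIFE with ! so the line
    // can never be read as a continuation. It works, but it hides intent.
    !function () { console.log('still an IIFE') }()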
One thing that I noticed is that most of the notable semicolon-haters - fat, mislav, and the GitHub guys - come from Ruby on Rails. Conveniently enough, Ruby does not require semicolons at the end of statements. I suspect this anti-semicolon fervor may come from a desire to use Ruby's conventions with JavaScript.
I think your camelCase example really nails it. Code should be written in a way that is both non-ambiguous and idiomatic to the existing codebase.
If you're in a position where you can define that idiom, then by all means do so. But stay consistent so that when others join a project, or you leave a project, the intent of your codebase is well understood without needing to wade through pages of documentation.
So, IMO, while semicolons are important, they're relatively trivial and an easy "bug" to fix. If I'm looking for an authority, I typically check out the Google Style Guides.
A similar, though more pressing, issue that I've faced recently is the proper parenthesization of conditions in if statements. It's generally nice when you don't have to spend a minute or two remembering or looking up the nuances of C++ operator precedence. Knowing all of the intricacies and tricks of a language yourself doesn't mean every member of your team knows them that well.
Your comment makes me wonder if this is a localized issue, or a more general anti-pattern in polyglot programming. The pattern being: "push my favorite language into the other ones I use". It may or may not be covered by "you can write FORTRAN in any language", or some sort of corollary to Greenspun's 10th law.
Actually, this is one of the reasons why some new languages pick a popular older language's syntax and coding style. For example, most statically typed languages follow C syntax (C++, Java, C#) because most of them expect programmers to migrate from the older language. This makes the migration much smoother. And when the syntax does differ, programmers tend to overestimate the complexity of the newer language (most C++ programmers initially complain a lot about Objective-C's syntax).
As a counterpoint, we decided to go with underscore_names for our JS since we work with a lot of serialized Python data structures via JSON, and using the same naming convention on the server as on the client means our JS doesn't have a mix of camelCase and under_scores for things received via JSON and things defined in the JS. We're not writing reusable libraries for other projects to use, and we don't use a heavy amount of 3rd party code, so it's no big deal.
You bring up a good point. There may be times when technical constraints mean you have to go against the prevailing style -- in your case, interoperability with existing code. (I know that a lot of this works in the other direction when dealing with "foreign" API's in Python.)
However, fat and co. aren't avoiding semicolons because there are technical constraints that require them to avoid them. They just seem to not like semicolons.
"using the same naming convention on server as client means our JS doesn't have a mix of camelCase and under_scores for things received via JSON"
Couldn't that almost be viewed as a feature, rather than a bug? I mean, this way you can instantly tell whether a particular variable was received from the server via JSON or was defined in the client code itself. I would consider a naming distinction like this to be a benefit, rather than a harm.
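A hedged sketch of that distinction (the payload and helper names are invented):

    // Fields deserialized from the Python backend keep their snake_case
    // names; anything defined client-side is camelCase, so provenance is
    // visible at a glance.
    var payload = '{"first_name": "Ada", "last_login": "2012-04-01"}';
    var user = JSON.parse(payload);

    var displayName = user.first_name;             // snake_case: from the wire
    var isRecentLogin = isRecent(user.last_login); // camelCase: defined locally

    function isRecent(dateString) {
      var oneWeek = 7 * 24 * 60 * 60 * 1000;
      return (Date.now() - new Date(dateString).getTime()) < oneWeek;
    }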
Incidentally, we use CamelCaseMethodNames in Python at Google, presumably to be consistent between languages. If you're a company that uses multiple languages, it may be beneficial to use the same style in each, even if that style is non-idiomatic.
It's even inconsistent with the Python core. I'm not going to defend the practice, but it's ended up being a lot less terrible than I thought it was when I first started writing Python here. I've used Twisted before, with its own weird style, and the weird style doesn't really distract much from the rest of the program. It's bad when everyone uses their own style, but if you only have 1.5 styles, it's probably OK. (But again, I would never write a style guide that said methods in Python must start with a capital letter and contain no underscores. That convention is for classes.)
I've been shocked at the level of disrespect for language standards here. Yes: adding semicolons is probably good practice because it avoids the chance of stumbling over bugs like this. And yes: the ASI feature in Javascript is in hindsight a terrible mistake.
That said: you go to war with the language you have, not the one you might want or wish to have.
ECMAScript is ECMAScript. Arguing that your transformation tool doesn't need to handle a feature specified in the language and supported by all known implementations is just ridiculous. Arguing that people making use of a feature that the language standard says they can (and that works) are "sloppy" is equally dumb. Even weirder are the people who jumped on the use of the && operator to effect an "if" as "abuse" -- this is an idiom pervasive in lots of areas and (again) well-supported by Javascript.
Not everyone is going to have the same aesthetics, everyone has a different idea about what features are "fun" and which are "sloppy". And decades of experience in the "code convention hell" world of enterprise programming has taught us nothing if not that this sort of pedantry helps no one. If you want to use Javascript, you have to take the whole language -- warts and all.
This is not really about whether semicolons are required or not.
The point is that you should try to use the language (and framework) in a way which ensures better maintainability, supportability, and portability of your code.
Look, a statement like "a && b" is 100% valid in many languages, but in order to increase the maintainability, supportability, and portability of your code, it should be written like "if (a) { b; }".
The easiest way to understand the point of this rule is to get a job maintaining some old crappy code-base :) - I learned it that way.
No no, I assure you that was the point I took away. And it seems you missed mine, which is that it's fine to make pronouncements like "don't use short-circuit && as an infix if" for your own code. But flaming about them in public is rank pedantry. It's the kind of nonsense that the enterprise world has been dealing with for 20 years now: chasing the "maintainability rule of the week" is going to hurt you badly long term.
Learn to read and maintain the language you have, because you can't win this.
In JavaScript, && does not evaluate to a boolean. It evaluates to the thing on the left if that thing is falsy (in which case the right-hand side is not evaluated at all), and otherwise to the thing on the right. So in said JavaScript console,
"asdas" && "da"
evaluates to "da" and
"" && "da"
evaluates to "".
So in an assignment context |r = a && b| would need to be replaced by: if (a) { r = b } else { r = a }, which may or may not be more confusing than the original statement with &&. But in a statement context, where the result of && is not being assigned to anything at all, "a && b" is exactly the same as "if (a) b;" except harder to understand.
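A small runnable sketch of both contexts (the values are chosen arbitrarily):

    // && returns one of its operands, not a boolean.
    console.log("asdas" && "da"); // "da"  (left is truthy, right is returned)
    console.log("" && "da");      // ""    (left is falsy, right never runs)

    // Assignment context: r = a && b unrolls to an if/else.
    var a = "", b = "da", r;
    if (a) { r = b; } else { r = a; }  // r === "", same as r = a && b

    // Statement context: a && b acts as an infix if.
    var isActive = false;
    !isActive && console.log("toggling"); // same as: if (!isActive) console.log("toggling");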
The && operator short circuits. It evaluates its second operand only if the first is truthy (because if the first is falsy, the result of the expression is already known). It thus acts pretty much exactly like "if", but with an infix syntax. This trick is used pervasively in shell programming, where the native if syntax sucks.
It is concise. There are fewer tokens and bytes needed to write A && B than if ( A ) { B }. And the intent thing is, as I've tried to explain, very much situation-specific. There are many programmers out there very comfortable with this idiom. That it doesn't happen to be popular in the web development community doesn't mean it sucks or isn't worth learning.
Don't pass judgement on syntax you literally just learned, basically. Don't assume the world you know is the only one worth knowing. Good hackers know their tools.
I'm not judging it, I'm just saying that it's not something I'd use because it's confusing for my colleagues / anyone maintaining my work. Unless they're a pure Javascript person.
It is a neat trick though, and proof that I need to spend a little more time hacking in javascript to improve my knowledge :-)
I agree with you to a certain extent, but I do think it is important to distinguish between those that think that they should just use a semi-colon and those that seek to disrespect a standard.
In my case, and I've seen this sentiment expressed repeatedly by others on HN, I think it is odd that a minification script would actively choose not to support a syntax that is standards compliant. On the other hand, I think it is crazy for a major project to use a valid syntax that not only breaks said [popular] minifier but also offers no noticeable benefit.
I do not feel as if I'm disrespecting any standard just because I think simply adhering to that standard is not a sufficient justification for doing something.
But "no noticeable benefit" is your aesthetic decision, not an objective truth. Not everyone feels the same. Pythonistas, for example, might quibble with you about that, because they skip semis when typing all the time and will experience editor friction when using Javascript. And even if you think it's "crazy" to skip the semicolons, you might not think it's crazy to write something like "test && result" as a simple if. Shell and perl programmers like that sort of thing and can read it without difficulty.
I'm not saying that all working code is good code, or that you have to actually use all the language features in all your code, or that you can't have your own well-reasoned opinions about this stuff. What I am saying is that if you're serious about using "Javascript" and interacting with the broader community of "Javascript" programmers, this kind of feature pedantry is going to hurt you and the community badly. You will constantly be running into useful (maybe even brilliant) code that does "crazy" things.
Interesting point, and I do hope that no one ever takes one of my opinions as an objective truth. That said, I do feel strongly that if the only benefit you can name about writing code in one particular manner is that it is aesthetically pleasing to you, then that is hardly a benefit at all.
When it comes to this particular case, I've heard a few different arguments for why the syntax they use is poor, including the following:
1. The syntax relies on parsing behavior that is expected to change in the future.
2. The syntax they use does not work in one of the major minifiers used throughout the community (the crux of the whole issue to begin with, but certainly a drawback in and of itself)
I have yet to hear any benefit to NOT including a semi-colon to resolve this issue beyond aesthetics, and aesthetics alone is just not a rationalization that I can get behind.
> I do feel strongly that if the only benefit you can name about writing code in one particular manner is that it is aesthetically pleasing to you
Isn't that exactly what's happening here? A bunch of people piling on skipped semicolons and infix if's because they think they're "sloppy"? Refusing to support ASI (which is, in fact, precisely specified) in their transformation tools because it's "broken"? Why are some people's aesthetics more important than others?
Again: you use the language you have. You will never get the community on board your private yacht of your "sane subset" Javascript. It's been tried for decades. It doesn't work.
>>On the other hand, I think it is crazy for a major project to use a valid syntax that not only breaks said [popular] minifier but also offers no noticeable benefit.
That's basically the take-away I got from yesterday's semicolon drama. You can appreciate that JS allows you to omit semicolons or you can bash those who choose the ambiguous over the explicit. But if you're the lead on a hugely successful project you should pick the syntax that will make it work everywhere. It's especially odd for a web developer to choose aesthetics over pragmatics when it comes to things like this.
> That said: you go to war with the language you have, not the one you might want or wish to have.
That's taking the quote out of context, which is saying a lot because it's an oft-quoted example of Donald Rumsfeld's tap-dancing. I doubt Mr. Rumsfeld was advocating driving around in Humvees as if they had armor. I'm sure he would laud the attempts of soldiers to improvise and mitigate the risks inherent in their equipment as much as possible.
> And decades of experience in the "code convention hell" world of enterprise programming has taught us nothing if not that this sort of pedantry helps no one.
Over a decade of experience consulting in that world has shown me that adherence to code conventions has tremendous benefits. In shops where there was strict adherence to code conventions, I could be 10X or 100X more productive when refactoring using automated tools.
> If you want to use Javascript, you have to take the whole language -- warts and all.
You need to justify this. This strikes me as a silly and counterproductive notion. Even in a tiny language like Smalltalk, you don't want to use, "the whole language -- warts and all," every chance you get. Hacky tricks have a cost. Just because you can implement an entire parsing system using doesNotUnderstand handlers, doesn't mean you really want to. (And yes, I've seen this happen in real life -- you really Do Not Want!)
Are you kidding? JSMin is open source. If you care so much, send a pull request. JS has plenty of weird corner cases. It's up to the author to decide whether he wants to spend time dealing with every single one.
Haven't yet seen anyone point out what unpleasantly quirky code this was in the first place:
!isActive && $parent.toggleClass('open')
It should have been written like this:
if (!isActive) {
$parent.toggleClass('open');
}
What if somebody needs to add a second bit of code to be executed if isActive is false? In the first case, they'd have to refactor the code into an if statement before adding it. It should have been an if statement in the first place.
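Continuing the snippet above (doSomethingElse is a hypothetical second action):

    // With a real if statement, extending the branch is a one-line change:
    if (!isActive) {
      $parent.toggleClass('open');
      doSomethingElse(); // hypothetical addition, no refactoring required
    }

    // The && form has to be rewritten first, and the naive extension is
    // subtly wrong: doSomethingElse only runs if toggleClass returns a
    // truthy value.
    // !isActive && $parent.toggleClass('open') && doSomethingElse();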
If you want something to happen if something is true, then you should use an if statement. This is not controversial stuff. Don't look for "clever" ways to misappropriate other parts of the syntax in order to appeal to your own personal minimalist aesthetic taste. Be cooperative.
Edit: On re-reading this it comes off as preachy. In fact I've very recently taken a closer look at some of my own "little quirks" and realised how unhelpful they were for other developers. I guess I'm embarrassed about that and want to spread the embarrassment around.
Unless I'm mistaken, 'do_something_else()' won't get executed if toggleClass() returns a value that evaluates to false. Because of short-circuit boolean evaluation.
Actually, you are all missing Crockford's real point, which is that they are considering making ! a binding dereference of function objects. If they do, then this:
there are even fewer good arguments for changing the semantics of the fucking logical negation operator 17 years down the line in a dynamic language with billions of end users and no discrete versioning
About as logical as using the logical negation operator for a purpose other than logical negation. :-)
So, how would you implement this dereferencing feature? Given that this wouldn't be an issue if you added a semicolon, I'm not sure why you are so het up there...
Because all you see it as is an argument about syntax. It's actually an argument between the old guard and the up-and-coming hotshots.
Nobody really cares one way or another, and while it would take Twitter more time to append changes to their code than it would Crockford, my guess is the trouble for either would be negligible.
That said, this is very obviously a pride war between those who stick to convention and those who undermine it. The question isn't "should we use semicolons", it's "who is going to start dictating the direction Javascript goes from here on out?"
Clearly, both of these individuals want that spot, but if there's anything I've learned in my short time of coding, convention always wins out.
That's a pretty dramatized way of framing it, don't you think? I think the take-away is as simple as this: JavaScript is a dumpster-fire, and we should probably fix it so we can avoid more of this noise. Framing it as a soap opera conflict instead of a teachable moment makes it more productivity-draining than it already is.
I'm only explaining what makes sense. I know, personally, I'm not going to change how I'm writing my code because of what two influential Javascript developers bitched about on a Github ticket, and I'm pretty sure I'm not alone in that sentiment.
I'm merely explaining why it's drawing so much attention, and what nerves this debate seems to be touching, because as a technical debate it is insanely boring. But that's clearly why it's not strictly a technical debate.
That's even less interesting. I give 2/10 of a crap about what the best way is for me to avoid missing-semicolon errors in Javascript, but I give 0/10 of a crap about who wins an ego battle between Crockford and some other dudes. Why on God's green earth is that on the front page of anywhere?
>That said, this is very obviously a pride war between those who stick to convention and those who undermine it.
I'd frame it as "a pride war between people wanting a sane codebase not based on fragile parser edge cases (including two who know the language inside out), and the developer of some minor js code who thinks he's too clever for all that".
That's no "undermining convention", that's juvenile bs. I know who I'd rather have on my team.
It wasn't a "fragile edge case", it worked just fine with no ambiguity in every javascript implementation.
The problem is that jsmin is a naive pile of textual replacements (though not quite as abysmal as John Gruber's markdown.pl) instead of being a javascript implementation — something that actually parses javascript into an AST and compiles it to minified javascript.
Unless for some reason you need your minifier to be trivial enough to be portable with no dependencies, you should really be using something along the lines of google's Closure Compiler, which produces far better output and even has stuff like tests!
>It wasn't a "fragile edge case", it worked just fine with no ambiguity in every javascript implementation.
Yes, it worked according to how the parser was designed.
But it depended on a frowned-upon parser feature (ASI) that can lead to ambiguous or faulty results in similar cases, and for no good reason. When the creator of the language and its biggest guru disagree with your code style, well, there's not much room for argument, even if all known implementations work fine with your code.
He also committed an idiotic ?: operator abuse instead of a clear "if" on the same exact line. He probably likes silly "succinct code" tricks, which even C people wouldn't touch with a ten-foot pole...
I started programming and hacking because I wanted to make something cool and interesting that others would appreciate, not to bicker about something as silly as semicolons. If I wanted to do that, I'd work in retail. When an argument over something so trivial gets to this level, I cannot help but be bored by it.
I think it's so boring because it's a one-sided debate.
One side chooses a style that they find aesthetically-pleasing even if it causes issues for some small subset of potential users.
The other side is flabbergasted that someone would be so reckless and argues for the sensible, safe option, which requires simply terminating your lines of code with an extra character, making those few issues for a small subset of potential users vanish instantly.
He says he doesn't use semicolons, except that it caused issues for those who concatenate scripts, so he adds a semicolon to the end of the js file. But, lo! This causes issues for those who use the following pattern:
(function() {
// Le code
})()
This has the potential to cause syntax errors when you execute two anonymous functions after each other. Instead of adding a semi-colon to the end, he "works around" the issue by abusing JavaScript like so:
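Presumably the workaround in question is the leading-! trick that comes up elsewhere in this thread. A sketch of both the breakage and that fix, with invented log messages:

    // Two concatenated files, each an anonymous function, no separating
    // semicolon:
    //
    //   (function () { console.log('file A') })()
    //   (function () { console.log('file B') })()
    //
    // The second "(" is read as a call on the first expression's result,
    // so the concatenation blows up at runtime with a TypeError. The
    // no-semicolon fix is to lead each file with a token that cannot
    // continue a prior expression:
    !function () { console.log('file A') }()
    !function () { console.log('file B') }()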
Actually, my understanding is that there is nothing technical about this debate: the debate is about writing maintainable, supportable, and portable code versus "cool" code.
But I also agree this is "by far the most boring debate ever to hit HN" (without technical :).
Of the sustained debates, I feel it is tied with the "my nosql is better than yours" debate. Both of them seem to boil down to "understand your tools, your use case may not map to the thing someone else is advocating for their use case".
I dunno... semi-colons have practical implications for everyone, but "my use case requires speed and persistence isn't an issue, therefore mongo is better" vs "i need persistence and didn't read the docs, who cares if it's a little slower - riak is better" gets pretty tiresome, particularly when my use case is "i need easy graph traversal, neo4j w/ gremlin is the wave of the future". :P
hahaha - I feel the best use of this thread is that we agree to spawn a series of blog posts debating the most boring debates to hit HN ever. Including sock-puppets to randomly drop surprise left turns like "no you are all wrong, here is why the ocaml vs haskell debate bores me" and "no, the real boredom is the long comment threads in the ongoing 'anarcho-socialist' vs 'enlightened-libertarian' viewpoint wars"
Of course it's more boring than NoSQL debates, but it's also potentially so much more important. The reason people care is because you have two great projects which they might want to use for many reasons, but which are fundamentally incompatible because of a semicolon. It's not the philosophical divides between database technology where you have many subtleties to analyze and choose between, instead it's just a fucking semicolon forcing you take sides between two orthogonal projects which you might have good reason to use jointly.
Disagree. This is so much worse than that. At least with Hungarian notation (which thankfully died a well-deserved death), its advocates were arguing for a relatively well thought-out system of readability and tying member names to their types for quickly-identifiable scope and type referencing.
But, to suggest that simply removing semicolons constitutes some grand gesture towards readability, simplicity, or something as grandiose as a "minimalist aesthetic," is...beyond absurd.
I think this piece is a great place to end it, as it builds a solid pragmatic case.
It was interesting for me to learn about the possibility of using that kind of '!' notation in JS, even if it's impenetrable to most other developers. Maybe I'll be able to parse some other hipster's code thanks to this.
Maybe. But I barely spent any time reading all the arguments and more time reading through and learning about the syntax of JavaScript, so it was probably more interesting for me than most people simply because I tuned out most of the boring bits. Also, I learned that Closure linter's fixjsstyle can add semicolons for you, which is nice to know.
But that's what I always do: scan for the interesting bits or move on.
There's so much FOR it, it's unbelievable. Yes, it's good to 'break the rules' now and then. But add the semicolon.
This is not about whether JS allows it or not. That's important, but mostly irrelevant.
You can also skip 'var' when introducing JS variables, and it will still run. That's going to be a lot of fun when it goes wrong.
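A minimal sketch of that failure mode (non-strict mode, names invented):

    // In non-strict mode, assigning to an undeclared name writes to the
    // global scope instead of creating a local variable.
    var total = 100;

    function resetLocalTotal() {
      total = 0; // meant to be "var total = 0" - clobbers the global instead
    }

    resetLocalTotal();
    console.log(total); // 0, not 100: the "local" reset leaked out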
But here are the million dollar questions:
1 - How much time is spent making code work without a semicolon, as opposed to just typing it? This is not about the time it takes to type ';' - that is irrelevant - but about the mental effort involved
2 - How much time will be wasted fixing minifiers (that is, adding to them the intricacies of no-semicolon parsing, probably by writing a whole different parser)
3 - How much mental effort is needed to comprehend and correctly fix non-semicolon code
Number 3 is the biggest issue, and if you don't believe me, it goes by another name: coding standards.
Yes, JS works without semicolons. And yes, C works without indenting, without meaningful names to variables, etc.
It's not about "to write JS you should know all the nitty-gritty rules of the language and you're stupid if you don't know them so you just add semicolons". It's about teamwork, and facilitating code comprehension (and maintenance).
And language designers make mistakes. They don't know if a feature is going to become a trap, irrelevant to 99% of developers (and with an easy workaround) or just a pain in the behind.
1. How many times are we going to have to debug and rewrite code to work around this defect? If it were just this once, sure, add the ';' - unfortunately it is never just this once.
2. How much time will be saved by using a minifier that actually supports the javascript language? What other language features will break because someone didn't feel like supporting javascript in their javascript minifier?
3. How much mental effort is needed to support half a dozen different minifiers that all support a different subset of the language, as well as the cross-browser differences we already have to take into account?
It is a matter of trust. If your tools skip supporting language features because someone decided they didn't like language feature X, what other corner cases did they skip because they were unliked?
Another advantage of the minifiers with a correct understanding of JavaScript is that they can do way more advanced minifications like shortening variable names.
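For example, roughly the kind of transformation an AST-aware minifier can do (the output line is illustrative, not any particular tool's exact result):

    // Hand-written input:
    function distance(pointA, pointB) {
      var deltaX = pointB.x - pointA.x;
      var deltaY = pointB.y - pointA.y;
      return Math.sqrt(deltaX * deltaX + deltaY * deltaY);
    }

    // Plausible minified output: local names shortened, whitespace dropped.
    // function distance(a,b){var c=b.x-a.x,d=b.y-a.y;return Math.sqrt(c*c+d*d)}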
> How much time will be saved by using a minifier that actually supports the javascript language.
How much time would be saved by widespread use of "The Good Parts" of Javascript? This is likely to be much larger than the time you propose.
> How much mental effort is needed to support half a dozen different minifiers that all support a different subset of the language
Who says you have to? If the community as a whole started omitting the hairy parts of the language, they would tend to converge on the same core. A superset of the 5 or 6 most popular subsets is still going to be a lot cleaner than the whole hairball. In practice, this is about the coverage that just about any language gets from tools like compilers and code transformers -- after all, the serious tools pretty much all have bugs. A superset of the 5 or 6 most popular language subsets is all anyone gets in practice.
Now, this could be easily solved – by adding the friggin semicolon.
What this furore misses is that the original issue (JSMin failing to minify bootstrap-dropdown.js) was already fixed when the bug was raised [1]. Fixed without adding semicolons. Everyone should be happy with that. Developers of bootstrap got to stick to their "no semicolons" schtick, and the person with the original problem got it fixed. Everyone seems to forget this salient point when they rush in to this debate.
For reference, I sometimes use semicolons in my javascript, sometimes I don't. It depends on context and whether it makes the code more readable. It's not an issue I care enough about to get involved in a holy war, one only marginally more relevant than tabs vs spaces.
That isn't really the original issue at all, Crockford's comment is just what brought it to a boilover. The lack of semi-colons has been brought up in issue after issue on the bootstrap project. Each time it's been rejected for the same arguably poor reasoning.
Christian's post is completely spot on here. Javascript was designed to be as tolerant of errors and inconsistencies as it could be. That fact, however, shouldn't be used as an excuse for advocating inconsistent coding. Not that I'm saying semi-colons are the epitome of consistent coding (I prefer Ruby myself), but Javascript was not designed with significant whitespace in mind; rather, it just has a tolerance for inconsistent and arguably erroneous syntax.
The argument here is that we shouldn't let Javascript's tolerance excuse laxness on our parts. We should know better.
By "original issue", I was referring to the actual GitHub issue / bug report.
On the wider issue of semicolons I am in agreement with you, and all my own code is written with that in mind - though I respect the bootstrap authors' preference and would style my pull requests without semicolons.
If their insistence on not using semicolons causes conflict with other popular software, I would hope they see sense and fix those conflicts - even if that meant adding a semicolon or two.
To all the people arguing that the code as presented is good and right, I present this:
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you are as clever as you can be when you write it, how will you ever debug it? ~Brian Kernighan
I couldn't tell that the second line in the code in question was an if statement at first without actually thinking about it. How is that helpful?
The author is right on the money. I've always written code as specified - just because it felt 'right' - but I could never pin down my arguments - the author captures my thoughts beautifully.
Particularly his points about reading other people's code & extending functionality - there are very few use-cases where it makes sense to omit semi-colons, end-tags, etc - and if you aren't sure - yours isn't one of those use-cases.
I have grown extremely weary at the level of discourse that this whole situation has provoked - the linked post is one ad hominem after another! What is this supposed to accomplish? Hopefully I can get the people I'm criticizing to change their ways by making them feel really bad about themselves? By telling them they aren't visionaries, they are semi-colons, they are arrogant, sloppy, lazy? It's destructive, self-indulgent, and completely unnecessary.
I upvoted you. This whole semicolon war is the most banal and self-aggrandizing conflict I've seen in a while, rife with as much vanity and childishness as the "blogosphere" allows, as evidenced by this post. Can we move on, please?
At least everyone acknowledged that the editor war was an inside joke, I would gladly have an editor thread every week on HN over this.
> The main issue with these parser-fetish arguments is that they assume that you write code for a parser – sometimes even a certain browser – and not for other developers.
That's an excellent and important point beyond the petty incident. You write code solely for humans, not for the compiler.
I think the real issue in the whole debate is that 95%+ of people programming do not understand how programming languages are implemented and haven't been exposed to basic theoretical material, e.g. the fact that languages can be ambiguous, that there are cases where it might be impossible to interpret a part of the code, that handling syntax errors is actually hard, etc. This post is a good example; it completely misses the point, as the only reason semicolons are present in some languages is to make parsing possible. I think many people would not argue about this if they had a clue why programming language syntax is the way it is.
This whole debate seems like two folks stuck in their way blowing a whole thing out of proportion.
How about Bootstrap adds the damn semi-colon, and JSLint accepts that for the most part the lack of a semi-colon is working and "valid" JS and allows for the edge case. Now everyone gets to go home happy.
The author is picking and choosing who he wants in this argument of his. He has Douglas Crockford and Brendan Eich on the pro-trailing-semicolon side. On the other side he has @fat. If he wanted to be fair he could have included someone like @izs or Thomas Fuchs. But if he referenced their viewpoints on it, it would make it harder to pretend that all code that doesn't include trailing semicolons after every statement is brittle.
"He has Douglas Crockford and Brendan Eich on the pro-trailing-semicolon side"
Brendan Eich, the conceiver of the language. The best-placed person to understand why the JavaScript language does what it does, and what problem a feature was trying to solve.
Douglas Crockford, the developer who took all of JavaScript and found within it a clean, elegant language. He pulled together what he considered the best bits of the language and used that subset as a starting point. http://anongallery.org/220/javascript-the-good-parts
@fat - no idea who he is, apart from being an employee of Twitter who works on Twitter Bootstrap, which is a framework for other people to use to build websites.
@izs - I think you overreach by grouping him as anti-semicolon. Isaac is a pragmatist; he uses what suits the group of people he works with. npm is a standalone app - a non-expert JavaScript developer doesn't need to delve into npm for its standard use cases - so he chooses a syntax he believes suits the group of people working with him. And yet, he is now the lead developer of Node.js, and he's already on record saying he won't be spending time changing Node.js to his preferred comma-first approach - because the existing team is already used to trailing commas. That's what I like about Isaac - he listens to a wide range of opinions, and uses what works best for the environment he's in.
Thomas Fuchs - heard of him, he's done lots of stuff JavaScript- and PHP-wise. I'll give you that one. He gave us Scriptaculous, and some decent PHP resources (like PHPatterns, IIRC). He's active in Rails.
The main reason the quality of JavaScript has increased dramatically in the last 6 or 7 years has been because of a coalescing towards a JavaScript best practice. This has mainly been led by Douglas Crockford and JavaScript developers at Yahoo.
JavaScript is too flexible and too bastardised a language, leaving lots of weird and broken features. Plus, well written code isn't about being clever, it's about writing code that can be supported and maintained. Douglas has taken the best bits of JavaScript and found that to be quite an effective and useful language. With that subset he, along with the people who later became the YUI library developers wrote code in that subset.
The reason for that subset is that the language is clearer, less ambiguous, and less likely to catch a developer out. Turns out using just a subset of JavaScript's syntactic capabilities improves the quality, reliability and maintainability of code.
JS Lint was built to help developers bring their code into line with this subset of usage, and to disallow perfectly valid but potentially flawed code. It favours code that is readable and quickly understandable by the non-expert developer.
JS Lint is the starting point for this improved use of JavaScript. JS Min is built, as far as I understand, with the assumption that JSLint rules are in place, so yes, it works on a subset of the language. Actually the documentation http://www.crockford.com/javascript/jsmin.html is more specific: "It is suggested that JSLint be used before using JSMin."
And I guess that's the real problem here. Not using JSLint before using JSMin. And then this argument boils down to @fat saying something along the lines of not wanting to be constrained to JSLint's subset of JavaScript. That's a call he can make, and the users of Twitter Bootstrap can then choose accordingly whether to use the toolkit (since if you are in a JSLint / JSMin environment, it's probably best to stay away from Twitter Bootstrap).
Perhaps what would be worthwhile is a JSLint equivalent for whatever syntax and idioms the Rails-induced subset of JavaScript uses (not just the watered-down version of JSLint that JSHint is, but something that encourages what they consider to be best practice rather than merely allowing it), and a minifier that respects those idioms. And something in plain English for those unfortunate developers who stumble into these idioms.
What would "JavaScript the Railified Parts" look like side-by-side with Crockford's The Good Parts, and Flanagan's Definitive Guide?
I'm one of those weird npm-style guys. The thing that breaks my brain is that this isn't about writing clever code; it is about treating the (literally) single broken case (in the browser) as a broken case. It is about writing safe and maintainable code, by making it obvious that you are dealing with that one special case. I have written JS both ways, and with comma-first/semicolon-first I am hit by fewer syntax bugs. That's why I do it; any other reason would be dumb.
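A small sketch of the comma-first layout being described (the names are invented):

    // Comma-first: the separator leads the line, so a missing comma is
    // immediately visible in the left-hand column.
    var host = 'localhost'
      , port = 8080
      , retries = 3

    // With trailing commas, dropping one is easy to miss: ASI quietly ends
    // the var statement and the next name becomes an implicit global.
    // var host = 'localhost',
    //     port = 8080       // comma forgotten here, so...
    //     retries = 3       // ...this line assigns to a global "retries"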
As for Crockford leading the way to sanity, the biggest problem I have with him is that instead of warning and educating, he flat out forbids, and then heckles people who disagree. "with" is considered a "bad part", but it is used by literally every js template framework. He recommends closure-based object construction, but in a big app that is a memory nightmare. And then there is stuff that basically treats js devs as mindless idiots: the "new" operator is discouraged, just in case someone forgets to use it when they want to instantiate something, and the function keyword is discouraged because it hoists.
All of those are opinions, and some are demonstrably terrible. But for some reason The Good Parts is still held up as gospel, while the community just chooses to ignore the parts of the book it doesn't like.
As for minifiers, JSMin is pretty obsolete at this point. If you are using it, you can decrease your page load times with an incredibly easy win by switching to UglifyJS, Google Closure Compiler, or YUI Compressor. All of these support ASI fully, but that is probably the least important reason to use them.
> @izs - I think you overreach by grouping him as anti-semi-colon [sic].
Those are your words, not mine. The "other side", as I put it, isn't necessarily anti-semicolon. To be on the other side of this particular argument, they merely have to disagree with the assertion that a semicolon at the end of the statement is a must, and not a style choice. And izs did choose that style for npm.
> Thomas Fuchs - heard of him, he's done lots of stuff JavaScript and PHP wise. I'll give you that one. He gave us Scriptaculous, and some decent PHP resources (like PHPatterns, IIRC). He's active in Rails.
There's Zepto.js too, and he was a longtime core member on the Prototype JavaScript framework. He's influential.
> The main reason the quality of JavaScript has increased dramatically in the last 6 or 7 years has been because of a coalescing towards a JavaScript best practice. This has mainly been led by Douglas Crockford and JavaScript developers at Yahoo.
I strongly disagree. The driving force was the increase in importance of the JavaScript language. He was part of that trend. He helped speed it up, but it would have happened regardless.
> JS Lint is the starting point for this improved use of JavaScript.
JS Lint is a starting point. There are other valid starting points.
> And I guess that's the real problem here. Not using JSLint before using JSMin.
But @fat isn't using jsmin. And there are great alternatives, so this is a valid choice!
> What would "JavaScript the Railified Parts" look like side-by-side with Crockford's The Good Parts, and Flanagan's Definitive Guide?
"> What would "JavaScript the Railified Parts" look like side-by-side with Crockford's The Good Parts, and Flanagan's Definitive Guide?
It would look like this: http://pragprog.com/book/tbcoffee/coffeescript
Err, that's CoffeeScript, not JavaScript. CoffeeScript gets cross-compiled into JavaScript. It isn't JavaScript in itself.
I think this whole semicolon story is a natural step of a language becoming more wide-spread. The more people spend time appropriating the language, the more they'll want to push the envelope, exploit the quirks and get the best out of the language's syntax.
It's a natural cycle. C programmers went through the very same phase at some point and to this day different coding styles persist.
It's only when those choices cause incompatibilities that friction emerges, but we should see it as a natural step towards a more unified grasp of what Javascript means for people who program with it. That the general tone of the conversation is antagonistic is just a symptom of the fact that people care about their opinions and their choices, which by all means should be seen as a very healthy questioning on the part of the community. Just my 2 cents.
TL;DR: Sure the image conveyed is bad, but the reasons why such a debate emerges are natural and are part of the evolution of a language, it'll get better.
To me this argument boils down to style over maintainability. The arguments for the latter seem so clear in my mind that I don't really want to dignify the other side with a response.
Every programmer should have had a professor who was really, inordinately fond of Ada, so much so that at least one assignment required coding in, or basic knowledge of, the language.
Ada is strict as balls, but unlike its wannabe-successor, C++, its strictness is for the sake of clarity, and the compiler usually actively helps you bring your code into compliance. (Rather than, say, complaining randomly because the syntax for template instantiations changed this week, or punishing you for pronouncing the word "const" with the improper intonation.)
With a bit of exposure to Ada, programmers might understand better why languages are so finicky about syntax details, and that just because a language is lenient, doesn't mean you should take advantage of that leniency. And that, in cases like JavaScript with its bloody semicolons, perhaps leniency is a disadvantage.
I had a professor like that; he was also a signatory to the original ECMAScript standard. He also claims he came up with that name: since they were spending too much time debating the name, he figured they'd call it that for now, and since it was such an awful name they would be sure to go back and change it.
I think the argument that we shouldn't rely on the parser for certain language features is a bit silly (including interpreting end-of-statements). The language is precisely what the parser says it is, and nothing more or less. JSMin is free to not do what the parser does of course, but that won't be Javascript.
Yup- "computer science" is definitely going retrograde. It was recognized a long time ago that the "implementation is the spec" was a disaster for program semantics. And now we have "implementation is the spec" for something as easy and inessential as parsing. 1950s computer science labored under the misapprehension that parsing was hard (FORTRAN contributing greatly to this feeling). Then parsing was solved (while we don't always user machine generated parsers the ability to specify a grammar and see what features make parsing easy versus hard is a huge step forward). In fact this early victory over the formerly hard problem of parsing is one of the reasons people started anticipating higher and higher level languages and the complete automation of programing. But (as Brooks points out in "The Mythical Man Month") parsing was solved precisely because it was an inessential difficulty. Save your mental cycles for essential difficulties (semantics, evolution of state, remote machines, concurrency and actual domain problems).
If he wants his code to work with this particular minifier, he should include the semicolon, nothing more, nothing less.
It's a bit like trying to speak official French. It might be correct according to the officials charged with protecting the French language, but you will sound extremely odd to most French people.
I've heard that in Korea, there's one national standard exam for English that's always been graded incorrectly, so everyone learns the incorrect answer when studying for the test.
I'm not sure if you're making that argument yourself, but it's thoroughly wrong.
First, there's no "the parser". There are a lot of parsers, some of which don't yet exist.
Second, that means there's no way to distinguish between bugs and features. If the language is defined by parser behavior, then any parser bug is now part of the language.
Third, you've eliminated the white space needed for future growth. If you look at any standard that has evolved well over a period of years, you can see in the early days there was a lot left undefined. If the implementation is the spec, then nothing can be treated as open for change.
JavaScript the language itself has a spec that everyone has pretty much always implemented consistently, even in the worst throes of the browser wars (the runtime libraries, not so much).
The problem was that JSMin didn't just violate the spec - it's not a JavaScript implementation at all, but rather a pile of naive textual replacements.
I leave out semicolons in my Haskell, and my Python. I use them in my JS, because JS's implicit semicolon rules are rubbish compared to any other language that I've tried with such rules.
It's kinda presumptuous to assume that people who disagree with you don't "understand the logic behind" your position.
> So, if your code needs syntactical changing when it gets extended, to me you've optimized prematurely. Code will always change and making sure our maintainers have a hard time breaking it is a very simple and good idea.
In other words, a good programmer writes code for project maintainability, not to signal dominance by showing off knowledge of the parser. [+]
(Corollary: One's cleverness is always a finite resource. I'd rather work with someone who devotes that resource to what's good for the project, not his own fame and ego.)
[+] - Unfortunately, the way teaching CS sometimes works, students are rewarded for showing off clever and elite code every chance they get, to show the prof they're a "real" coder.
Writing great code means making it easily readable and understandable by other developers. If you want to show off your quirky syntax skills, for god's sake play Perl golf or enter obfuscated code contests.
This argument is pointless. At the end of the day, the current owner of Bootstrap can do whatever he wants. If the semi-colon issue is that important, people should just fork the project and add semi-colons.
Reminds me how learning python took all the fun out of those semicolon and curlybraces loving languages.
Americans (with US keyboards) really shouldn't design a programming language nowadays.
Well, in nearly every job I've had, there's at least one alpha nerd[1]. Worst of all is the alpha nerd boss (I intentionally avoid the word 'manager' here). I welcome advice on how to find working environments free of this sort of thing.
[1] - two or more is even worse: they're like male gerbils or betta fish, in that they fight all the time.
This is ridiculous. Twitter's developers are lazy.
You don't NEED a semicolon for the interpreter, you NEED a semicolon for the other readers/users of your code!!!
That goes for all other code practices that enhance readability. If you think human readability is less important than what the interpreter/compiler accepts, you're a lazy moron.