I know this is advertising, but there are a couple of particularly objectionable points…
With increasing competition between rival internet companies, speed of delivery and the ability to iterate are the key traits of market leaders. In a competitive scenario, reacting to end-user needs, incorporating their feedback into the offering and delivering updates and changes regularly is essential.
This is literally nothing to do with Node. It's to do with good architecture and project management. The style of architecture generally used and encouraged in the Node community does tend to push SOA etc., but that's nothing to do with the language.
It’s extremely difficult to hire top talent these days; good developers like to learn new things and to use new technologies.
Node will not continue to be 'new technology' for long. This is a shoddy basis on which to select a tool.
doubled the number of requests per second and reduced response time by 35%, or 200 milliseconds.
Measuring reimplementations of code in alternative languages is frequently absolutely pointless, because that reimplementation will always go hand-in-hand with massive re-architecting, and absent second-system effect this will result in better performance. For example:
The old story of linkedin where moving to Node.js from Rails for their mobile traffic, reducing the number of servers from 30 to 3 (90% reduction) and the new system was up to 20x faster.
Because all of the rendering was pushed to the client, no doubt.
Node is great, but it's one of many valid choices for developing huge, modern web apps. It's not that Node is becoming the "go-to technology in the Enterprise" (it's not), but that SOA, loose coupling and the reduction in use of monolithic frameworks are everywhere, and that's something that Node is good at.
Agreed. In the other story about node.js I challenged people to describe the "Node way" of doing things, and was given a laundry list of things that applied to every technology, like "modules".
We are currently living through a programming language and tools renaissance the likes of which have never been seen before. Why are good developers deliberately blinding themselves to all the goodness throughout the whole programming ecosystem, choosing instead one tool (e.g. nodejs) as the One True Hammer? It's so unnecessary and so limiting.
Interesting choice of words, "renaissance", as what we are actually seeing is people blindly reinventing things that were known in the 1970s. At least the true Renaissance thinkers knew they were rediscovering things that existed before the Dark Ages; today people genuinely think they're onto something novel.
what we are actually seeing is people blindly reinventing things that were known in the 1970s
Or people who have worked exclusively in web development and for a relatively short time suddenly discovering general programming knowledge, like why modularity and separation of concerns are important, or that being able to write prototype code and get it into production quickly doesn't necessarily mean that code will be maintainable or support the same pace of development over the long term.
Next week, if you do everything in a dynamic-everything language using a dynamic-everything framework with everything looked up at runtime from a database, your software will be slow. The idea that a simple blog site running on a modern server should have to worry about hitting the front page of an aggregator and then falling over because it got a few thousand visitors in the next hour is rather absurd, if you stop and think about how much processing power and raw bandwidth these modern systems actually have available.
While it is true that the basics of computation did exist in the '70s, it's very clear that programmer productivity today far eclipses what we were doing even 20 years ago. I started coding professionally in the '80s. Much of what I used to do is now handled by users on their own with spreadsheets and query tools. A small team of 1-2 people today can routinely take on a task that would have taken literally 10x the effort back in the day.
For example, anything involving interfacing computer systems together in any way took an order of magnitude more time back in the day. There were no real widely accepted, simple to use methods available. Networking and communications over modems tended to be so unreliable and difficult that systems tended to be made as monolithic things that just did everything for themselves.
It occurs to me that a book could be written describing why and how programmer productivity is better today than it was a couple of decades ago.
It is true that computer technology has been around a long time. And it's also true that a lot of the best stuff, such as Lisp, has been around for a long time. But there are all kinds of factors that let programmers complete projects in a matter of days today that they wouldn't have been able to 10 or 15 years ago.
And yet, an incredible amount of software was written on VT100s. Software that is rock-solid reliable, does things on old hardware that would be impossible even on modern hardware with scripting languages, and is still relevant and doing useful work after many decades. What looks like "productivity" today is just fluff: pretty interfaces that still offload their real work to old-school systems.
I've used a bunch of different languages over my career, from device and graphics libraries in assembly to Windows apps, servers, web stacks and (recently) front-end. These days, admittedly, I do a lot of Node development, and while I'm not a huge fan, it's really easy to get something up and running, all while using the same tools and even the same modules on the front and back ends. When I run a build it's the same Gruntfile for the stack and the UI, there's no fussing about getting disparate servers, paradigms and teams playing nicely, and it's small enough for me to get an entire application installed and running literally within a few minutes, even on resource-limited hardware like a Raspberry Pi - that creative freedom is brilliant!
I can definitely rant about all the things that are bad about Node.js, but as a prototyping tool or side-project toy it's really valuable.
> Node will not continue to be 'new technology' for long. This is a shoddy basis on which to select a tool.
Not to worry - there will be a new hot technology within a year or two, and a good percentage of the node.js people will scramble to rewrite in it to avoid looking old fashioned.
I couldn't disagree more with what I assume you are implying here - that node is just some trendy thing that people are only using to stay hip.
I have a fresh perspective, since I have only been programming (mostly web dev) for a few years, learning on my own with the goal of full-stack competence. I did not care about what was hip; I cared about what would work with my limited resources. With that goal in mind I have spent my time practicing, testing and analysing most of the popular languages and related frameworks - ASP.NET, PHP, Python, Ruby and so on - and from my beginner's perspective (which is an important market to serve) Node is on another level. I'm fairly confident it is one of those big leaps forward, like Rails was, and it will be around a very long time.
Why? In one word I'd say it's speed. It's freaking fast, and I'm not just talking about IO performance; that's just the icing on the cake. Learning curve, development time, configuration - all extremely quick and easy.
One thing I think people forget or don't put enough emphasis on is that Node is a networking library; it's not just server-side JS the way that ASP.NET is server-side C#, it's JavaScript from the bottom to the top. Web server, load balancing - pretty much anything you need can be written in just JS. That makes it extremely easy to set up your own stack that you actually have the potential to comprehend or even write yourself, and launch something on the cheap. For 5 bucks on DigitalOcean you can set up a Node app that will probably scale enough to survive a Slashdot effect or launch an MVP.
Node and its related frameworks still have a ways to go, but I think it is a mistake to dismiss it. And finally, I'd like to mention that its first release was in 2009 and people have been talking about it a lot ever since. I think it's already past the trendy-technology phase and will continue to see widespread adoption.
I know what you mean, but I really think it'll be hard to topple Node. I started with Express and tried a few other frameworks before finally deciding on Sails.js. I haven't had this much fun or been this excited about web development since I started 10+ years ago. I really don't know what it is about Node that's so appealing, as I'm no JS guru, but I think it's going to be around for a very long time.
What makes you think that JavaScript will still be the flavor-of-the-day preferred by this crowd tomorrow, though?
They were pretty gung-ho about Ruby and Ruby on Rails between 2005 and 2010 or so. Then their hype moved to JavaScript and Node.js quite rapidly. It'll very likely move on to something else soon enough.
The difference between Ruby/Rails and Node.js is that JavaScript is literally used everywhere on the stack. Such a HUGE volume of open source code is written in JS. Ruby/Python/Java/C# have not enjoyed that advantage.
But that doesn't give any meaningful advantage. Want to move your database queries to the client? Your business logic? No. Want to have frontend engineers write your backend?
Lots of bad ideas. I mean, it is an OK feature, but by no means a killer feature. Ocsigen does the same thing: you can write your code in OCaml and the client-side part will be translated to JS automatically. Neat? Yes. Killer feature? Probably not.
And JavaScript is really not that great to start with, so developing in something else server-side is not the worst idea.
The powerful use of JavaScript for me is that I can render the exact same templates on the server and the client. That's a lifesaver when you want a fast, client-based app that is also search engine indexable.
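A sketch of that idea, with a hypothetical `renderUser` template function: because it's plain JavaScript with no platform dependencies, the same function can run under Node for the first (crawlable) page load and again in the browser for later updates.

```javascript
// A plain template function with no platform dependencies, so it
// runs identically under Node and in the browser.
function renderUser(user) {
  return '<li class="user">' + user.name + ' (' + user.email + ')</li>';
}

// Server: render real HTML for the first, indexable page load.
// Browser: call the very same function when new data arrives.
const html = renderUser({ name: 'Ada', email: 'ada@example.com' });
```

With two languages you would maintain two copies of every template and hope they stay in sync; sharing one function sidesteps that entirely.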
And JavaScript is really not that great to start with
It's fine. IMO, all languages have warts you have to work around, once you know them it becomes a non-issue. I can't think of the last time one of those "JavaScript WTF" moments actually tripped me up in real code.
I guess what I mean there is less "let's have front-end devs write our backend" and more that the ecosystem is so huge, and so many people know JS, that it's a huge advantage. Nearly every single web developer who's been working over the last 20 years knows some level of JavaScript. No other language enjoys that advantage.
The language (in this case JavaScript) is probably the smallest part of the domain knowledge needed to be a good front-end or back-end engineer. Front-end engineer, know JavaScript? OK: optimize these queries, do we need to cache, make temp tables, map/reduce against the DW, etc.? Hey, back-end developer, know JavaScript? Why won't this render the same in Chrome 33.1 as it does in IE 10... etc.
JavaScript is a convenience language across the stack for when the team is small, or even a single person.
The problem with web dev (I've thought about this a lot) is really that the browser is kind of the hinge. So no matter what you do to unify languages - even porting modules so they're the same front to back on Node - you still can't really ever get to a point where you're unit testing reliably through the full stack.
So you might as well test parts in isolation, and who cares if there is one language throughout the whole stack, as long as your devs are good at the languages required.
One day I think the impedance between server & client will disappear, but only when the browser's role as a rendering engine diminishes or finds better integration/intimacy with the codebase.
(And incidentally, I stopped caring so much about this struggle once I learned how to develop pretty reliable SOA. Haven't totally given up trying to conceptualize the perfect stack, though.)
JavaScript is like Gravity on the client side, and with ES6/7 it's about to become a much nicer language too. You could never run Ruby natively, in a widespread manner, on the browser. With one language you can now build a complete stack and target all devices. That's quite big.
Whenever a new technology/language/framework comes up, I hear the word "fun" all the time. My question is: does that fun translate into reliable infrastructure and decently optimized code?
All of this. NodeJS is a valid tool, yes, but not the sole reason for those performance improvements.
Take the PayPal case, which (iirc) is mostly based on a 10 year old Java stack. If they were to create a new project from scratch, using current-day Java technologies and frameworks based on current-day paradigms (technologies like Play, Akka, Vert.x), they would get equal if not even higher performance improvements.
I don't know how developer productivity and happiness would be using those tools, but the Java development tools are very mature (compared to JS), and happiness... I don't know really, I'm inclined to believe a lot of developers use NodeJS because it's new and hot and Java isn't, instead of looking at their customers' needs.
> and happiness... I don't know really, I'm inclined to believe a lot of developers use NodeJS because it's new and hot and Java isn't, instead of looking at their customers' needs.
I agree. To be happy, we don't program in a particular language - we play video games instead. We program in a language, chosen by us or for us, to satisfy customers' needs.
As much as I like dynamic languages and would like to find good reasons to support them - mainly Python and JavaScript - I have to agree with this. I know there are issues with them, but as a fairly level-field benchmark, the TechEmpower tests show Node.js is usually firmly middle of the pack as far as raw performance goes: http://www.techempower.com/benchmarks/
Rails is plenty fast - for application development. And a well-designed rails app, appropriately tuned, can be suitable for all but the highest volume loads.
This rails bashing is becoming a cliché, so here's another one - use the right tool for the job. I've worked on rails projects happily serving 100req/sec with the servers running at 20% load. I've got friends who work on J2EE sites which take 10 seconds to render a page after the app server's done talking to all its zillion little useless ESB friends. Is ruby faster than compiled java?
I currently work on a combined rails/node site - we use a rails API to feed a node-served front end. Works beautifully. Why is the API using rails? Because in my opinion, rails is far more mature, and one rails API dev can keep up with three node front end devs. Would the decision be the same if speed and efficiency on the server was of topmost priority? Probably not, but it wasn't. Hell, the DB is the performance bottleneck anyway.
I have nothing against Node and will probably use it in an upcoming project where server efficiency is the topmost priority, but geeze, right tool for the job!
And another point, while I'm ranting: I am coming to see Node as the obvious choice for consumer (free) products and Rails as the obvious choice for B2B/SaaS (paid). Why? Because hyper-efficiency is much more important for the former, to keep cost per user down. With B2B, if you've got so many paying customers that you need another Rails server, you throw a party!
Oh, and even the node devs I know deploy using capistrano. There's that right tool for the job thing again.
Rails is fast, if you've never dealt with anything in HPC.
Also, I was mostly making the point that if you rebuild an app you expect it to be faster.
However:
100 requests a second at 20% CPU. Really!?
Considering basically all rails/node.js/php/etc apps do is add frilly bits to data held in a database, there is no real reason why it should be slow.
To put it into context, say you are serving a home page: it's maybe 150k of HTML/JS, of which 90% is the same regardless of who visits the page (hence why proxies are so effective). That's 15 MB a second.
Not exactly uber-fast, considering it's effectively a dumbarse file server.
To put that in context, your standard Linux file server will (over NFS) push out 1.1 GB a second at 30% CPU (assuming the disks allow), from a ZFS file system which is computing the checksums of each 4k block...
You know, when we say "X is fast/slow" we usually are basing the comparison on something even remotely related. If we don't do that, the comparison is basically useless.
Of course rails, and almost any other software, is "slow" compared to HPC. I struggle to name two less similar applications. You might as well say that aircraft carriers are slow compared to F16s - well yes, yes they are.
The rest of your comment indicates to me you have no experience with web programming. Of course the cacheable parts are simply read from a disk, or even memory. It's the uncacheable parts that are the problem. And no, it's not just "adding frilly bits to data held in a database".
You seem to have a bad case of "shit's easy syndrome" - the tendency of people who have no idea what they're talking about to assume that everything is easy, and anyone who disagrees must simply be stupid or at least incompetent. Well, if you can build a web framework that's as productive to use as rails but works 100x faster, you will be a millionaire practically overnight. Have at it. Forgive me if I don't hold my breath.
Have a good long hard look at the data flow of your web app. Take for example a CMS: a user comes along, the web page is generated, the data comes from a database, it's encapsulated in HTML/JSON/whatever and then sent to the client (either all at once, or chopped up into bits and done asynchronously). If you're advanced, you might pull data from an API (still repacking data in a veneer of *ML).
I have a case of "shit's already done" syndrome. For 99% of companies any old web framework will do. The 1% that need a bit extra spend the man-hours engineering ways around the deficiencies of the framework they chose. Most of that effort goes into expanding the namespace of the database, because IO is a pain at scale.
I work in VFX, where efficiency and scale really matter. I look after 16k cores and ~5+ PB of tier-1 disk storage; a 1% increase in CPU efficiency is worth hefty cash. I've seen many, many fads (map/reduce, Rails, MongoDB, CouchDB). All of them are reinventions of the wheel: map/reduce was seen as a task-dispatching system, Rails was supposedly going to replace Qt, Mongo/CouchDB were supposedly going to replace a POSIX file system, and Postgres as well.
We're now back at the stage where Postgres + SQL is awesome, filesystems are good at storing unstructured data, and Python/Qt really isn't that scary after all.
Although node.js and callbacks are starting to become popular now.....
I think we are talking about different things here.
I'm well aware that 99% of companies could get by with any old web framework. Should they, though? Of course not. Since we're talking about rails, let's use that example. It's an extremely productive, flexible, capable rapid application environment. And yes, it's not a speed demon, especially when used naively, no-one denies that. So why do companies choose it?
It's not because they're stupid or inexperienced. It's because the tradeoffs of higher productivity are worth the penalty in execution speed. Slightly higher server costs are worth the massive boost in productivity compared to other options. Why do you think it's so popular? If you're going to reply "because people are cargo cult following lemmings", you are simply wrong. The tradeoffs make sense to them, that's why.
So you're a sysadmin. Linux uses Python all over the place. Why is that? They could have used any old language. But they chose Python, even though it's slower than C, because the tradeoffs - easy to read, write and maintain - are worth it.
VFX could not be a more different environment than rapid web application development. As you say, 1% improvement means big bucks. You know what 1% improvement gets you on the web? Diddly shit. And every few percent raw speed increase on the web, as you drop down into the less dynamic frameworks, costs you massively in dev time, maintainability, hireability. It is a balance people strike. If I could increase my team's productivity 10% at the cost of doubling our server expenses I would take that deal instantly.
I can't speak to the fads you mention, except I also generally disdain them. But progress does happen. Which is why the web sites for the films you help make are probably not just written in any old web framework, they're probably written in rails or node or what have you, because the tradeoffs are worth it.
Remember what site you're on. This is about startups, small companies trying to grow fast. It's not about hyper-optimised rendering farms. Different worlds, dude!
Anyway, I think you actually agree with me, and I with you, if we just step into each other's shoes for a second.
Although I don't know what's so hard about making a couple of animated videos. 99% of the time you can just use any old rendering framework. I mean, you just ask for a frame, it generates it, and saves it to disk, right? Dude, I could do that with POV-Ray in 1995 on a 386. Whyever would you need 16 thousand computers for that? And 5 PB of RAID? I can store a whole movie in like 3 GB, and that's 1080p! Actually my iMac can make pretty realistic graphics in games, maybe you should buy one of them? Shit's easy, right? :P
So long as you had a 386 with the floating-point unit. Ironically, for a while the iMac was the only choice for decent graphics until the Retina (and CPU - still waiting for the fucking dustbin).
Thanks for the reply. This is almost exactly what I wanted to write, especially in terms of programming costs of Rails vs. nodejs.
I am working on a project with exactly the same setup now - Rails for the API and a Node.js proxy for the web. It works beautifully, and with the mature ecosystem around Rails, development of the API has been very easy and focused.
Much of the critique regarding the slowness of Rails relates to earlier versions of both Rails and Ruby, but nowadays it can match the most mature frameworks.
A single tech stack can only teach me a limited amount. When I build something around an unknown-to-me language/framework/tool over a weekend, I usually walk away with new insights.
For example, playing with Meteor (js) was the first time I was able to understand the concept of pub/sub.
That knowledge can sometimes be brought back to other tools I use, sometimes not. When it cannot easily transfer, the new tool gains brownie points. Playing with the "flavor of the month" taught me something useful. If I stayed with the same established workflow I always use, then I wouldn't have had a reason to encounter pub/sub.
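For what it's worth, the core of the pub/sub idea mentioned above can be sketched in a few lines of plain JavaScript (a toy in-process hub, nothing like Meteor's actual implementation):

```javascript
// A tiny publish/subscribe hub: subscribers register callbacks per
// topic, and publish fans each message out to every subscriber.
function createHub() {
  const topics = {};
  return {
    subscribe(topic, fn) {
      (topics[topic] = topics[topic] || []).push(fn);
    },
    publish(topic, msg) {
      (topics[topic] || []).forEach(fn => fn(msg));
    }
  };
}

const hub = createHub();
const seen = [];
hub.subscribe('posts', msg => seen.push(msg));
hub.publish('posts', 'hello');
```

The value of the pattern is the decoupling: the publisher never knows who, if anyone, is listening.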
That's a very bad thing to spread. Haskell might be deep, but after using it for a whole year in production, any developer worth their salt would know enough of it to build non-trivial things.
This type of comment unfortunately keeps developers away from Haskell. Haskell is no C++: you can actually learn it, and it is a very good language.
The Node.js/npm community is going through a lot of issues and seems, unfortunately, immature. People also tend to choose Node.js for the wrong reasons - mainly wanting only one language for everything: database/frontend/backend/deployment... JS itself, as a bare language, is far from perfect. A good place to start is to compare a sample Selenium test in Ruby or Python with one in Node.js: the Ruby/Python one is clear and expressive, whereas the Node.js one is just a callback nightmare. Finally, Node.js's single-threaded nature (anything blocking ties up the event loop) means you have to deal with threads/forks anyway for any serious application.
I wondered this too, so I went out and implemented a large project in Node (Haraka - an SMTP Server). It turned out to be such a success that Craigslist dumped Postfix for it (an SMTP server written in C), and managed to get rid of more than half of their servers due to Haraka being faster.
We just added another example company doing the same - bounce.io - they are seeing similar gains.
Yes Node has problems like any language does, but the gains are real.
I think calling a technology overrated by pointing out a sample selenium test is pointless.
Like any technology, there are some things that Node.js is good at, and some things it's not good at. Callback hell is something that will eventually be ironed out in ES6; as the language stands, it's something any dev will find unappealing about JavaScript.
Callback hell has been my biggest stumbling block with Node, but libraries such as async and Q have provided me with adequate solutions. Also, since we're talking about things to be ironed out, I hope they make it easier to run Node behind Apache or some other web server that's tried and true... why abandon all that progress?
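To illustrate the kind of flattening those libraries give you, here's a sketch with made-up step functions (native promises are used here; Q's API is similar in spirit):

```javascript
// Hypothetical async steps, each returning a promise.
const getUser   = id   => Promise.resolve({ id, name: 'Ada' });
const getPosts  = user => Promise.resolve([user.name + "'s post"]);
const formatCsv = rows => Promise.resolve(rows.join('\n'));

// Nested callbacks would bury this flow three levels deep;
// chaining keeps it flat and funnels errors to one .catch().
const result = getUser(1)
  .then(getPosts)
  .then(formatCsv);
```

Each `.then` receives the previous step's value, so the code reads in the order it executes.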
The claim that developers are happy because they are using "new technologies" (use Node! it's a new technology) is the wrong conclusion. This is like Rails 7-8 years ago. People using Node now are essentially doing greenfield projects - they're writing from-scratch code without any major requirements to keep existing code going.
Come back in 5 years, when someone has to maintain the PayPal Node.js stuff after it has become a moderate mess due to weird edge cases and changed business and legal requirements, and try to hire Node developers to come in and maintain code written by people who are no longer working there, having moved on to Scala/F# projects in 2015. All the new attention will be on 'node 2017', but you'll be stuck trying to make 2017 modules work in 2013 code, and you'll be left trying to patch/maintain something that wasn't as forward-compatible.
For true irony, some of those same developers will be brought back as consultants to develop the next-gen 'paypal 2018!' based on some new Scala framework, and the maintainers will have to watch the experts write the next generation of what they'll be maintaining 4 years on.
EDIT: "The old story of linkedin where moving to Node.js from Rails for their mobile traffic, reducing the number of servers from 30 to 3 (90% reduction) and the new system was up to 20x faster." I strongly suspect that if the team had been given X months of "rewrite the Rails system using all the current knowledge and techniques, as a new system vs patching the live one", they'd have seen similar results in less time.
To the extent that some companies are more SOA flavored than they were 5-10 years ago, this will be easier for everyone involved, of course, but let's not pretend that developers are happy because they're using 'new technology'. They're generally happy in any language when they don't have to work within bone-headed constraints leveled on them by predecessors who were chasing the latest fad before really understanding it.
Wow - a Node.js company writes a blog post about how Node.js is the next technology and why you should switch over (and hire them, they do Node.js consulting!).
Uhm.. yes.. node.js is great but this article is not.
I can write the exact same article with success stories about PHP, Ruby, Python, Erlang, Perl and with a little bit of creativity about brainfuck..
Indeed. That fluff article reminds me of the .NET spam around 2005. Pointing out three times that "the best developers... like to use new technologies": so they'll leave you with a pile of smelly JavaScript in two years when they head off for <fill in>. Are they the best, by the way? Or does it just feel like being the best when you can finally add two numbers in a new language? Oh, and that "developer happiness" argument again. Then: "composed from modules, piped together" contrasted with "traditional monolithic applications". So the writer (or the Node developers he represents) was unable to write composable code in the other languages they know? (Must be a lot, see above.) Just wait until their piped modules look like this: http://www.cyberpunkreview.com/image/brazil21.jpg
Fortran is the oldest programming language still in use today, and because of this it has a lot of cruft. However, if you put your compiler in standard mode (the 2003 or 2008 standard) you have a pretty modern language: modules, interfaces, functions that contain functions, etc. You can do OOP in Fortran if you use a modern compiler and stay close to the standard.
Where Java beats Fortran hands down is the huge ecosystem and the libraries available for a variety of domains. Another weak spot for Fortran is that it doesn't stop you from shooting yourself in the foot: you can use implicit types, case insensitivity and globals all over the place.
It doesn't need to be a blog post; just list some cases of people switching back to PHP, Ruby, Python, Erlang or Perl at the size of Walmart, PayPal, Voxer, etc...
Not the same size, but this was a company that spent a lot of time chasing the Rails fad in 2005/2006/2007, and switched back to PHP. I wouldn't doubt the same type of stories may be told about node in a year or two, although few will ever be terribly public. Few companies like to promote out their bad decisions (not saying node is definitely a bad decision).
At my company (we are small - just 2 devs), I decided to go down the Node route, because why not: the project I had just come off involved a lot of client-side JavaScript (with the Play framework), so it made sense. This was the end of 2011, and I was looking for something to replace Play because of the impending doom of Play 2.0 (which ended up being terrible... for us).
As I got into it more and started writing more business logic, it started to fall apart. The async nature really started to show its face. The async library was awesome in helping to tame the callbacks, but when you want to execute queries in series, iterate over data, manipulate the data coming in and then write some of it out to a CSV, it's just not pretty.
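The shape of the problem described above looks roughly like this - a hand-rolled `series` helper in the style of the async library, with stand-in "queries" instead of real database calls so the snippet is self-contained:

```javascript
// Hand-rolled series() in the style of the async library: run
// callback-taking steps one after another, collecting results.
function series(steps, done) {
  const results = [];
  (function next(i) {
    if (i === steps.length) return done(null, results);
    steps[i]((err, value) => {
      if (err) return done(err);
      results.push(value);
      next(i + 1);
    });
  })(0);
}

// Stand-in "queries" -- real code would hit a database here.
let csv = '';
series(
  [
    cb => cb(null, 'row1'),
    cb => cb(null, 'row2')
  ],
  (err, rows) => {
    if (!err) csv = rows.join(',');  // e.g. one line of a CSV
  }
);
```

Even with the helper, every step is wrapped in a callback and the error handling is threaded through by hand, which is exactly the "just not pretty" part.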
The code looked good, and it was all in really small functions, but making sense of the order of execution with callbacks flying all over the place proved difficult. It was definitely possible, but looking at the same logic in Java was just a big sigh of relief. Not sure how to explain it, but looking at code that executes the way you think it will makes me happier.
Luckily we didn't have too many projects done in Node; we switched to Rails 6 months later and have been happy ever since. I've revisited Node a few times, but even with generators it makes certain things blah - specifically, looping through arrays that need to execute callback code.
I really, really want to like Node.js, but it's just not there for me. For me, having the same language on the client and server is overrated anyway, and with Angular I'm writing less JavaScript than I ever have.
The author makes a point about getting high-quality node developers. That is primarily for two reasons.
1) Early adopters generally tend to do well, whatever the language may be.
2) And more importantly, the asynchrony in Node.js is not easy. Writing maintainable JS is harder than writing Python or Java, so if someone has been through that grind, chances are they're a good developer.
Being able to do everything with a single language is kind of a big deal for many companies; the only thing holding JS back is the messy nature of callbacks (and promises, yuck). I would expect real adoption of JS to begin when ES6 features (generators, classes, etc.) become available.
It is probably more a matter of competing tensions: if Node.js provides 80% of what Erlang can do but is a lot more approachable, is the same language as the web front end, is hyped, and so forth, there is no chance for Erlang to compete.
A stranger, more niche language must offer a lot more than a more accessible technology in order to win.
EDIT: it doesn't help that other popular backend technologies suck so hard performance-wise, since I can't see how JavaScript is better than, say, Ruby as a language (rather the contrary).
I would contend Go does around 50% of what Erlang does. Node.js does around 10% of what Erlang does.
You are right that both Go and Node.js have a lot of mass and a lot of acceleration right now. But Erlang has inertia in the projects where it does well, and it will probably not affect typical Erlang systems much if Node.js or Go grows large. To this extent, Erlang is too different a beast.
Most people's slow Java applications are slow due to architecture/ecosystem, Java itself is pretty fast. That being said, I still prefer Go due to its elegant simplicity, easy concurrency, lack of runtime dependencies and better memory usage. Go's selective choice of which OO features to support also helps discourage some of the Java architectural bloat.
> I would contend Go does around 50% of what Erlang does. Node.js does around 10% of what Erlang does.
You're probably more or less correct, but turn things around: from the point of view of the Node.js user, Node does pretty much everything they need, and Erlang does "extra, cool stuff" that, however, they can live without; or think they can.
They can do fast, dynamic web sites with web sockets with relatively low overhead, and that's really what they are after.
This is related to the disruptive innovation being discussed elsewhere.
You are I'm sure correct that people who need Erlang won't see Node encroach on their turf, but it's something of a missed opportunity for Erlang just the same.
Exactly that. People settle on the most accessible technology that lets them, more or less, do the work they are after. If Erlang was the only way to solve a given problem in a decent way, people would switch to it, as, for example, people switched to Git eventually: it was different, more complex to use in some ways, but it really was so much better at solving a given problem that everybody moved.
Btw, when we talk about Erlang, to me it is not just a matter of the language itself. How accessible, small, and easy to run a language's runtime appears to be makes a big difference as well.
>You're probably more or less correct, but turn things around: from the point of view of the Node.js user, Node does pretty much everything they need, and Erlang does "extra, cool stuff" that, however, they can live without
They're blub programmers.
>As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down. Languages less powerful than Blub are obviously less powerful, because they're missing some feature he's used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub.
Yeah, well I know plenty of "blub" programmers who somehow manage to quickly deliver high quality work, who have high profile public projects with their name on them, and make six digit salaries where the first digit is not a "1".
And I know plenty of hipster programmers forever chasing the new language du jour who deliver little but useless abandoned libraries on GitHub. Check out this half-working Haskell ORM I'm already getting sick of; oh, and there are no tests.
And I work next to a company that foolishly decided to write their API using scala (don't you know rails doesn't scale) and is now reduced to spending 3 months training new devs because they can't hire to save their lives.
So yeah, tell me about "blub" programmers again. I'll tell you, these "blub" programmers make the friggin' world go round. You like PG's essay but riddle me this - how many lisp startups has YC funded?
Basho uses Erlang, despite not really being able to hire Erlang devs (they just need to hire and train, for the most part; it's a small community), because they feel it gives them a competitive advantage, and even attracts more talented developers.
I'm not arguing that you can't or shouldn't get shit done with node.js/Go/PHP/Java/C#/whatever works, just that it's ignorant to say "Haskell/Lisp/Erlang/OCaml/Whatever has lots of weird shit I don't need".
It's funny that you call these the "languages du jour" when Haskell, Lisp, and Erlang are all 20+ years old.
Well, Basho made an eyes-open decision to go with erlang, after making a thorough analysis of their needs, and well aware they were balancing benefits vs costs. I would say they made a good decision. That's not really the kind of mentality I was railing against, though.
What I can't stand is this kind of condescending smugness - "oh, you're still using X? You're a 'blub' programmer!" as if the only conceivable reason for their choice is that they're lazy, stupid, unmotivated, or all three. Well maybe it's because they like that language, they're productive in it, it's mainstream and hireable, and they don't have any unusual requirements!
I get what PG was trying to say in his essay, but I don't think he expressed it very well. Programmers should try to choose the most productive possible tools, of course. But he's made it rather too easy to simply dismiss anyone who hasn't ended up choosing the most hardcore language possible as "blub programmers", rather than acknowledging they possibly made an informed, pragmatic choice. Engineering is about tradeoffs and throwing around these labels does not improve the discussion.
In my opinion PG over-emphasised the role of lisp in that essay. He conflated the ability to use such a language, ie that he and his colleagues are smart guys, with the language itself. They likely would still have succeeded with perl, TCL, even python was around then. And by doing so, he taught the wrong lesson.
Anyway, I think we agree more than not. And "du jour" actually means "of the day" - not necessarily age. Your point is taken, though.
If we are talking from the point of view of "features" of the language, probably yes, but in the get-things-done area things tend to be a lot more malleable, and once you have a single process that performs concurrency with mutable global state you can hack a lot on top of it.
I get the feeling that what Node helps with is concurrency, although being fairly fast helps too.
Ruby on Rails does not really have a good concurrency story for something like web sockets.
Although Rails is still what I would use for bootstrapping something where you're trying to find product/market fit and you are not absolutely sure that you need to squeeze out lots of performance from the get-go. Even if I knew I needed web sockets, I'd probably farm those out to a specific server to handle just those, most likely in Erlang.
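The concurrency story being contrasted here comes down to Node's event loop: slow operations overlap on a single thread instead of each occupying a blocked worker. A toy illustration (`slowRequest` simulates three overlapping slow requests):

```javascript
const order = [];

// Simulate a request whose "slow part" (I/O, a downstream call)
// happens in the event loop rather than in a blocked thread.
function slowRequest(name, delayMs, cb) {
  order.push(`${name}:start`);
  setTimeout(() => {
    order.push(`${name}:done`);
    cb();
  }, delayMs);
}

let pending = 3;
function done() {
  if (--pending === 0) console.log(order.join(' '));
}

// All three start immediately; completions arrive shortest-delay first.
slowRequest('a', 30, done);
slowRequest('b', 20, done);
slowRequest('c', 10, done);
// a:start b:start c:start c:done b:done a:done
```

Nothing here is unique to Node, but it is the default mode of every Node program, which is why long-lived connections like web sockets feel natural there and awkward in a one-thread-per-request Rails setup.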
That's a good point, but there was nothing in Javascript that was exceptionally better at event driven programming compared to other languages: probably the reverse actually, because of the lack of constructs to carry state.
Regarding the "same language of web development", it seems to be a major selling point for nodejs. What worries me is that frontend development is different from backend development, and it's different from kernel development. If you imagine there is a single language capable of being applied in all these layers, do people really think the _mindsets_ and practices adopted aren't different? Like, oh there are millions of web developers using this language, what a marvelous resource to tap into to, let's ask them to write backend code!
The difference is that most backend developers I know also know JavaScript. I think few developers work only on the back end, and on the front end JavaScript has already won; the war's over. So there are more backend devs who know JS than any other language.
"Being able to do everything with a single language is kind of a big deal for many companies;"
It's a shame that so far JavaScript seems to be the only language you can use on both the client and server. I come from a Perl background, and every time I try to do something in JS it seems like a kludge.
I don't have experience with ClojureScript yet, but CoffeeScript is trivial to debug since the source and the compiled JavaScript are structurally very similar.
On top of that Python has a solid set of libraries for it. In my experience with JavaScript (admittedly a lot more limited than with Python or Perl), the library support is far poorer. Angular is one of the most popular frameworks out there and the documentation is terrible. Compare that to any major framework in Python or Perl and you will see the difference.
IIRC, JS was designed and initially implemented by a single programmer in about 10 days.
It's far better than could reasonably be expected given that, but it still sucks. Some of the things that immediately come to mind WRT not being well thought out are bad number handling, lack of modules, and possibly the number of warnings needed about how "this" behaves.
> I would expect real adoption of JS to begin when ES6 features (generators, classes, etc) become available.
The adoption of server-side JS implementations (node particularly) is already "real" I think - the growth in its popularity seems to be very quick compared to other platforms when you consider how short a time has passed since node was first made available.
I agree that one of the things constraining that growth is the ease with which a nightmare of callbacks gets created, which can be a problem for later maintenance/improvement, and some of the newer language features will help that - but I think many will avoid using those features until they are commonly available client side too (there are many out there still using relatively old browsers, and a lot of projects don't have the luxury of being able to ignore them).
I think a major thing making those considering node hold back a little is the very thing that has made it successful: the rapid growth and evolution, which is still on-going. I suspect that many are experimenting with it but waiting until things settle down (which the core no doubt will soon) before committing to using it for major projects.
> but I think many will avoid using those features until they are commonly available client side too
Why would you do that? If you have a few more advanced features on the server side that don't run on the client, why wouldn't a team take advantage of them just for server-side code? Maybe the fear is server-side code ends up getting run on the client, but the separation would be pretty obvious and should be easy to maintain. Node.js lets you use the same language on the frontend and backend, but it's not like people are just moving code around between the two without regard for what context it's running in.
Actually, there are a couple of up-and-coming node.js frameworks that do encourage using much of the same code on client and server. Some of the ReactJS stacks do, for example.
The thing is, it turns out that what people want is proper separation of frontend and backend, not separation of client and server. Outside of single-page apps, there's a fair bit of "generate this page for me" stuff required as a baseline, but also a lot of places where if it's possible, rendering (and animating) on the client-side provides a better experience. GitHub's repository browsing is an example, as is Twitter's entire site, as is Google's Instant Search.
So running the same view logic on the server and client is a good thing, as it results in not having to write it once for each environment. The controller glue for tying it to the model changes, but that can easily be switched out per environment without changing the view's code, and is often generic enough that it can be implemented at least in part by a library.
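As a sketch of that idea: keep the view a pure data-to-markup function, and only the thin controller glue differs per environment (all names here are illustrative):

```javascript
// Environment-agnostic view: data in, markup out. The same module can
// be require()d on the server and bundled for the browser.
function renderProfile(user) {
  return `<div class="profile"><h1>${user.name}</h1>` +
         `<p>${user.repos} repositories</p></div>`;
}

// Server-side controller glue: respond with the rendered string
// (in a real app this would be written to the HTTP response).
function serverHandler(user) {
  return renderProfile(user);
}

// Client-side controller glue would instead do something like:
//   document.getElementById('app').innerHTML = renderProfile(user);

const html = serverHandler({ name: 'ada', repos: 7 });
console.log(html);
```

The view never touches `req`/`res` or the DOM directly, so it stays shared; only the few lines of glue get swapped per environment.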
> Early adopters generally tend to do well, whatever the language may be.
One look at the Rails early adopters should tell you that this is false. Almost every major high-throughput architecture that was initially based on Rails had to be completely reimplemented.
Rails is appropriate for proving ideas and rapid development of complete apps. It's also suitable for implementing small, loosely-coupled SOA-style apps integrated with a larger system, or for basic low-traffic apps.
It's not appropriate for huge commerce or infrastructure-style services.
A lot of developers jumped on it when it was a new technology because it offered lots of benefits - in particular, the whole convention-over-configuration thing and lots of happy developers. They weren't wrong for that. Now that we're trying to build bigger and bigger things on the web, and moving into SOA with APIs for mobile apps etc., something other than Rails is a good choice.
Those early adopters did well, and modern early adopters will do well too.
I don't think OP meant "do well" as in "code or architect" well, but they "do well for themselves" (make more money, play with newer toys, don't have to clean up others' messes as much, etc).
"Early adopters generally tend to do well, whatever the language may be."
It is because early adopters tend to work either on the special kinds of problems the new language is especially suited for, or on rather small projects. Those small projects would do well in almost any circumstances.
New technologies tend to be buggy and quirky, and tend not to stay around. Rarely does someone pick them up for things that are supposed to be maintained in the long term.
> I would expect real adoption of JS to begin when ES6 features (generators, classes, etc) become available.
JS adoption is 101% on the client side already, and this momentum will carry over into server-side responsibilities.
You can't blame the JS way of doing things if you want to brute-force other languages' approaches onto it. You need to write JS with the JS mental model; otherwise, you will be seriously frustrated.
Why is everyone saying callbacks are naturally messy? When I first started node.js I realised I had a pyramid on my screen (six callbacks nested in one another). I then googled "callback hell", and found some nice techniques, not requiring promises or any other fancy library, to write my code nicely.
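The techniques usually suggested amount to returning early on errors and naming each step instead of nesting anonymous functions. A minimal sketch (`readConfig` and `openDb` are illustrative stand-ins):

```javascript
// Two illustrative callback-style steps.
function readConfig(cb) { setImmediate(() => cb(null, { db: 'main' })); }
function openDb(name, cb) { setImmediate(() => cb(null, `conn:${name}`)); }

let connection;

function start(done) {
  readConfig(onConfig);

  function onConfig(err, cfg) {
    if (err) return done(err);   // early return: no else-nesting
    openDb(cfg.db, onDb);
  }

  function onDb(err, conn) {
    if (err) return done(err);
    connection = conn;
    done(null, conn);
  }
}

start((err, conn) => {
  if (err) throw err;
  console.log(conn); // conn:main
});
```

The pyramid becomes a flat list of named steps, and stack traces get readable names for free.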
The messiness isn't in the code; the messiness is in the execution model itself. However nice you may think your code is now, compared to the "pyramid", it's nothing on how nice it is when you write it in a language that has "asynchronousness" at its core, like Go.
Thank you. I actually already apply all these practices in my code; maybe that's why I never understood the callback hell people keep talking about. Or maybe I haven't done big JS projects yet to experience it.
Being able to do everything with a single language is kind of a big deal for many companies
I'm not sure the common language thing has ever been that significant of a factor in node's success, or even in the uptake by developers. I mean it appears in advocacy as a bullet point, but by itself it just isn't that compelling. Especially given that so many developers do so much to distance themselves from JavaScript proper (TypeScript, CoffeeScript, etc). The actual problem space on both the client and the server tends to be dominated by non-JavaScript related issues (e.g. the DOM, state, etc).
Node offered good performance, a decent model for asynchronous implementations (in an era when the PHP / .NET model was entirely synchronous. Every .NET advocate could point out some method of emulating what node does, but it was awkward and not at all idiomatic), and possibly more important than everything else it presented a platform that was a ground zero -- to get into it and start a project using best practices took an afternoon of learning, versus established platforms like .NET and Java where cargo culting and pseudo-best practices made even the most trivial of implementation a monumental task if you wanted to avoid raising the ire of others.
Since then other platforms have largely caught up. Even more so, the way we develop apps has changed: the server side isn't a presentation layer, but instead is often merely a service layer. For that purpose some other technologies are often a better fit. Personally I've mostly transitioned to using Go on the service side, as the glorious glue that hooks information together.
While I really like node.js, I always feel skeptical about "re-implemented gave X% improvement". Often a re-implementation is done because the domain-knowledge has improved significantly since the project was started. There should be an improvement even if the re-implementation was done in the same language.
When I see people talk about X% improvement because of some new tech, I always think about Fred Brooks and the No Silver Bullet essay. Nothing changes under the sun.
There is one reason Node is popular: it's JavaScript (extremely popular) on a server and mostly works with little effort (unlike most similar runtimes).
Performance is a big deal too. If you try to start a VC-backed startup with Rails nowadays, everyone will tell you to stop, since you are just going to have to rewrite it in something performant anyway if anything happens, so just start in NodeJS and save time. Big headline cases like LinkedIn having to move over are in everyone's heads.
Oh my god, on some parts I agree and on some parts you exaggerate massively.
Node is a step in the right direction, but the Node.js way is only the base for building up new or other technologies.
Yes, programmers want to experiment with new things, but when programmers grow up, they want to use things that work without reinventing the wheel.
Node is growing up now, and it is exciting to see what it will become.
It seems that I am too old at 35 ;-) because I see things like Node as a tool, a tool to solve a problem. I use Node, but I don't say "yesssss it is the best thing in the world, wohoooooo" ;-)
I'm 32. Let's just point at the overexcited crowds and have a chuckle as another fad is about to peak.
I like Node.JS a lot; as you say, it's a tool in my toolbox. But due to its nature the code is quite fragile. It's best to keep Node.JS apps small and simple and play to Node's strengths, which are a high volume of computationally simple (and potentially long-running) requests.
I'm a developer at Facebook, so I thought I'd share my experience.
Before I joined, I wasn't really convinced by PHP. My experience using it was good for quick and dirty work but quickly resulted in a mess. However, the Facebook codebase was nothing alike.
The PHP stack uses generators to deal with async requests and doesn't suffer from callback hell. To write the UI, you write new custom components using XHP (XML in PHP). It heavily uses classes, traits, and most badly designed native php functions have a better counterpart available.
On the JavaScript side, all the code is written using CommonJS modules. It runs a custom version of JSLint, and Jasmine for unit tests, with React as the main framework. We also have source transforms for ES6 classes, arrow functions, the spread operator ... It definitely feels like home for any node.js person :)
"The PHP stack uses generators to deal with async requests and doesn't suffer from callback hell."
This is one of the problems with the Node hype; it claims to be this unique and special snowflake, when in fact pretty much any similar language (Perl, Python, PHP, Lua, etc.) has just as much stuff working for it. Even the "but our libraries are 'designed' for async!" isn't as big of an advantage as is claimed, since in many cases the other languages have good libraries too. (Or in the case of something like gevent, a reasonable ability to convert libraries to async.) It's really a throwback, not the awesome step forward it claims to be.
I was listening to a podcast by the PM in charge of the WalMart node.js deployment; he had nothing but good things to say about Node, but using the CPU graph as a measure is stupid and misleading.
The project was using node as a middle-man for some old Java based API which spat out products, prices and stock levels. So node was basically set up to serve static pages, in the hope that later they can migrate the API to node as well.
Anyway, I'm sure it's all great, but unfortunately since I know the WalMart statement is a load of BS I distrust the rest of the metrics provided.
From what I read around I understand that node.js is good for simple projects, but not for heavyweights where concurrency and scalability is the key. This is where I'm pretty sure something like Typesafe stack (Scala/Play/Akka) blows it out of the water. Personally if I work on something I'd like to at least hope that it will be big someday, which is why I'm willing to spend some extra time, but choose the most solid platform that I can afford. Typesafe stack is the most impressive one at least for now, in my opinion.
I really like node.js, but Ruby gives you way more 'happiness' and productivity, imo. Until I see ALL (or at least 80%) of the stack that frameworks like Ruby on Rails have appear in node.js's frameworks, I will still prefer Ruby over node.js for most of my web projects (of course node.js is sometimes more suitable for certain things, especially asynchronous services, but you can still use Ruby).
Do people actually use Node in production for serious things? I actually don't know anybody who shipped anything done in Node. Sure, they do side projects or hobby projects, but nothing more than that.
In fact I work at a company which is on target to process more than 2bn events / 1bn page views this year on a node / python based service oriented architecture. I'd say that's a handful more than your average hobby project.
Because the hipsters thought it was a good idea? Once they realize it's impossible to maintain their codebase, they'll all move on to create their next disaster.
When I saw go-to in the title, I thought it was about callback hell. I have used NodeJS heavily in my projects since version 0.4 and still have difficulties writing good code. Other than that, it seems fast, but of course there is no point believing node is 15x faster than our old implementation. Any complete re-building of a system would be faster when you have more experience with the domain.
"Write server side apps the way you wrote stupid web pages" meme? And all the difficulties with resource management and efficient I/O is someone else's problem. Ok, good luck for those who never learn Java's lessons.)
Btw, the pamphlet forgot to mention compared to what they got their many-X increases and reductions. Ruby, I suppose.)
Well, V8 compiles to native code, but the server side is mostly about I/O, RPC, and data persistence, which requires a bit more skill than getElementById, so the assumption about needing half the number of developers is overoptimistic.
Also, as with Java-based "solutions", I would love to see the ratio between memory used and the size of the served data set, especially compared to Golang, which I think is a little more appropriate for the server side.)
"Let's add it all together" section is especially convincing.)
> NPM effectively kills the possibility of experiencing dependency hell
Am I the only one feeling that package management is the cause of dependency hell? Doing it the old school way where you just always have all the dependencies in the repo will never lead to these problems.
1. it's async! most developers can't do threading. it does not do locking at all! let me say it again: no locks. java devs: what is faster, hashmap or hashtable?
2. it's dynamic. compare to old static langs.
3. it uses same syntax as browser.
4. npm. compare it to Maven.
A way to make it better?
1. thick client, not server side rendering of ui. use a CDN and phonegap build
2. npm -g install typescript (to make it a bit less dynamic)
NPM is probably the worst part. It solves the easy problem using brute force and 20 copies of the same library, but punts on actual packaging. Joyent's advice is "don't use NPM to deploy -- use tarballs." What kind of package manager requires you not to use it for deployment??
I really think NPM is a solution for immaturity on the node libraries. Think about the attack surface when you have 20 different versions of the same library scattered throughout your application.
Not having to understand your library dependencies means you don't understand all of the security bugs from old versions of libraries. You should understand your library dependencies. Forgetting about them doesn't make those relationships go away.
Ok, as a Java-dev I'll bite: They are not really comparable - because their use depends on the situation. One is meant for concurrent access by multiple threads - and thus synchronizes the access, while the other one is not thread safe and should not be used where this is a requirement. If you want the best of both, use ConcurrentHashMap.
There is a concept of a shared resource, e.g. a cache of computed values. A shared resource will necessarily be subject to concurrent modification, and thus requires synchronization. The point is, depending on the situation, the benefits of shared resources may far outweigh the cost of synchronization. These are tools; understand them and use them where they make sense.
>>it's async! most developers can't do threading
Most developers can't do async either. It's really hard to debug if you don't get your callbacks right.
It's just you.
It's easy to do clean, deterministic, event-driven programming. Maybe you are just not used to it. It works just like browser events, and there are kids in elementary school doing it.
The new Silver Bullet is born! Will get you best developers, cut bad things 50% and improve good things 2x! JS is the only language you will _ever_ need!
I wonder how you can build a reliable e-commerce platform on a database that is not ACID... and Node users typically use MongoDB and other NoSQL databases that do not support transactions.
It's true that the most mature DB framework on the nodejs platform is Mongoose, so usually nodejs devs use Mongoose to deal with data. Needless to say, MongoDB is not the right choice for apps that need transactions or where the data is heavily relational. And Mongoose doesn't support cascade operations.
Sequelize and Bookshelf are not mature enough for now IMHO, compared to what one can find on other platforms like Java, Ruby, or PHP. So you either need to write raw SQL queries or delegate all transactions to another app layer.
Using CPU utilisation as a measure like that is pretty pointless anyway.
Without knowing over what period (instantaneous measure across the farm, a ten minute average from a typical node, a ten minute average over the lot, ...) that reading was taken it could mean many things.
If they have pushed a chunk of the rendering duties client-side, then the CPU is not going to be their bottleneck anyway: the back-end data store and network bandwidth will be.
I have read the article and this thread with keen interest. I am curious to solicit community feedback on our choices as we faced the Node vs. Java choice in 2012. In particular, does our business ultimately suffer long term consequences by our stack choice? We think we ultimately have strong advantages vs. any competitor built using Node.JS, but the beauty of HN is that we can open up the thoughts for everyone to contribute.
I am the CEO and founder of Codenvy. We make SAAS developer environments, and have been working as a team since 2009. Starting in 2012, we recognized the limitations of our initial system and needed to rebuild the entire system from the ground up. We had three criteria: performance, security, and modularity (of our core base and an ability for ISVs / enterprise to plug-in their own customizations). We raised $9M in Jan 2013, so we now have big investors and aspirations equally large. Our target market is primarily enterprises and ISVs that build Eclipse plug-ins. Individual developers and consultants are a secondary audience.
We chose to go with a pure Java stack, after a careful evaluation of Node.JS vs. Java. We wanted to provide some of our thoughts, as we continually ask ourselves whether there are downsides to the approach we chose. For the record, we have competitors like Eclipse Orion and Cloud9 that are pure JavaScript / Node implementations, but we are - as far as I have seen - the only Java implementation in our market.
Our team is currently 40 people, with 35 or so in R&D. Our engineers have experience with Java, Scala, JavaScript, AngularJS. Prior to this project, most engineers were working on server systems for content and document management firms. The hard stuff: sync, versioning, correlation, and so on. As for myself, my career started as a Java engineer in the 90s, but for the most part have been hands off as a technologist for a decade. My skills may have faded, but as a CEO, I am conversant.
We did 2 prototypes on Node.JS where the client and server were combined systems. We ran a battery of performance tests and did conclude that we could probably run a large scale system on fewer physical nodes with Node.JS than with Java, but the gains were not enough to justify the choice.
Our long term architecture was different from our competitors'. Most SAAS dev environments provision each user / group their own VM, give you root access, and let you control that system. We chose an approach that would optimize overall system density by offloading builds, debugs, and editing onto separate clusters. User environments will be virtualized and routed to the appropriate cluster on-demand. The user will feel like they have a personal VM, but our elvish helpers route traffic between the nodes across performance queues.
After offloading LDAP (for identity), event / log management (for analytics), and cloud administration, the only core systems left were our API services (account / builder / runner / auth) and editing. These core systems would be subject to heavy iterative development, and Node's architecture could offer higher density of requests / users than Java, but by our tests the comparison was a 10-20% difference, not 300%.
By way of scaling, for every 10,000 concurrent developers, we average 20 editing nodes (I/O constrained), 75 debugging nodes (memory constrained), and 5 builder nodes (CPU constrained). A similar TS / VDI implementation would require nearly 5,000 nodes of the same shape / size to service the same population.
So, we saw a disadvantage to Java as we could have reduced our editing node even further. But we chose Java because:
a) The library selection. Broad and mature. In particular, its data structures, file management, and REST services.
b) GWT. This library gave us cross-browser portability of code written in Java.
c) Eclipse Plug-Ins. We knew we wanted to target Eclipse plug-in developers and saw the Orion project as optimized for Web dev, leaving a gap in Eclipse. We saw it as a business gamble that Eclipse devs who needed to move their plug-ins would find a pure Java plug-in architecture that was similar to RCP more attractive than a newer (but have to learn new tech & rewrite) approach.
d) Our history. We did a straw poll of our engineers before we started. We asked them if they felt that they would rather do Node or Java, and 75% said that they would enjoy Java more. When asked why? They had tool familiarity, were confident in the maturity of maven, and better job prospects. This last one completely surprised me.
e) JVM. The number of optimizations of the JVM would give us advantages for enterprise deployment. We built our system to be automatically installable on-premises. We do so with Puppet, but recognized that we would be encountering many IAAS and PAAS standards internally. Enterprises had security and performance tested the JVM, so we saw an opportunity to win hearts and minds of system admins, datacenter architects, and security specialists by our choice of architecture.
So, we carried forward. Since we started working on our next generation architecture, we have focused hard on the best practices associated with agile, devops, and continuous delivery. We have 7 teams of 4-6 people, each focused on a different aspect of our stack. We are able to release any Jira issue at any point in the day. Each team can push into acceptance, staging or production without coordinating with the other teams. At the same time, we have achieved modularity and a rapid development tempo. While we haven't released our new system to the market yet, we find that the performance of the environment is similar to a desktop environment, and it scales cleanly with simulated load tests. We have also been able to give out our SDK (github.com/codenvy/sdk) to 6 different partners, some of which are Eclipse plug-in developers, who are able to code, build, test, integrate, and publish their plug-ins without complicated dependencies to our core team. Their feedback was that the port of their plug-in took about 1/2 the time of the original development. And there is no real way to estimate what the effort would be to do the same in a Node-based offering like Orion.
What we can't measure is whether our competitors are able to release more functionality faster than we are. It seems like a crapshoot. Another variable we can't account for is the cost of maintaining legacy. Our current system, Codenvy.com, is legacy for us and will be retired in Q3. We have to continually allocate 3-5 people to bugs, special partner projects, and other related items to keep that system live while we also work to roll out the new system. We know this is a drag, and some of our engineers think we may more than double our productivity once we consolidate onto a single architecture. I am a bit skeptical of that - but I find it's unwise to disagree with my engineers. I tend to lose.
So at this point, we feel we made the right choice. But time will tell - we begin selling our enterprise offering in earnest once we ship our new system in Q2.
Node.js will live on because it uses JavaScript, which is mandatory in the browser, for better or worse. Hopefully ES6 and future versions will make this awful language much better.
I really wish ES6 would do that, because as it stands, JavaScript has only two things going for it: the speed provided by the big engines (V8, the *Monkey engines, etc.) and the fact that it's the only language available on the web. Neither of those is a merit of the language itself. Kind of sad.
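For what it's worth, ES6 (ES2015) does fix several of the language's most-complained-about warts. A small sketch (runnable in any ES6-capable Node) of three of them: block scoping with `let`/`const` instead of function-scoped `var`, promises instead of callback pyramids, and template literals with destructuring instead of string/object boilerplate:

```javascript
// Block scoping: with `var`, every closure below would capture the
// final value of i (3). With `let`, each iteration gets its own binding.
const fns = [];
for (let i = 0; i < 3; i++) {
  fns.push(() => i);
}
console.log(fns.map(f => f())); // [ 0, 1, 2 ]

// Promises flatten the nested-callback style of classic Node APIs.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

delay(10, 'first')
  .then(v => delay(10, v + ', then second'))
  .then(v => console.log(v)); // first, then second

// Template literals + destructuring cut down on concatenation noise.
const { name, major } = { name: 'node', major: 6 };
console.log(`${name} v${major}`); // node v6
```

None of this makes JavaScript a different language, but it removes a lot of the day-to-day friction that older ES5 code forces on you.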