I said entirely typical, not that it was a dick-sized "challenge".
A completely standard developer on .NET or PHP in 2008 was building pages that rendered at less than 10 requests per second. This is experience, not a guess, given that my role was improving the performance of those disasters. A completely average developer on Node was building pages with 10x to 100x better performance.
Secondly, simply spawning threads is laughably non-scalable.
Not the OP, but I knew what you meant as soon as I read it. But did the typical .NET app really not make use of the Asynchronous Programming Model that was part of the standard library?
I ask because in 2008 I was only about 3 years into professional web development, and even I knew about stuff like the C10K problem[1] and that I/O Completion ports were apparently one of the few things that Unix people admired about Windows.
> Secondly, simply spawning threads is laughably non-scalable
How else are you going to do it with a non-threaded language?
I've run well into the tens of requests per second with "standard" PHP code on practically vanilla Apache configs. That was far from atypical and is in no way a challenge on a beefy machine.
My job was not to clean up disasters, though. Maybe if yours was, your "typical" was very different from mine?
The problem with multi-threaded request handling, as in PHP, is that you need locks. It's not a problem at low traffic, but once you get hundreds of requests per second there's a real chance of errors like "double posts": both threads check "the post doesn't exist", then both create it, and you end up with duplicates. Or race conditions, double spending, etc. With Node.js you get rid of those problems because your code runs single-threaded.
JavaScript is single-threaded, but IO operations like network and disk use threads in Node.js, so there are still race conditions, for example when accessing the file system. All other operations are single-threaded, like storing data in memory. In PHP, for example, where the server spawns a thread per request, everything becomes multi-threaded. In Node.js, instead of using locks, you use callbacks: when something finishes, a function is called. It's just like event listeners in the browser.
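A rough sketch of why the single thread helps with the "double post" case above (the in-memory store here is hypothetical, just for illustration): a check-then-insert sequence with no IO in between can't be interleaved with another request's JS, so no lock is needed.

```javascript
// Hypothetical in-memory post store, only to illustrate the point.
const posts = new Set();

// Because Node runs all JS on one thread, nothing can interleave
// between the has() check and the add() below (there is no IO here),
// so check-then-insert is effectively atomic without a lock.
function createPostOnce(id) {
  if (posts.has(id)) return false; // "post already exists"
  posts.add(id);
  return true;
}
```

The guarantee disappears as soon as you do IO between the check and the insert (say, a database round trip); at that point you need the database itself to enforce uniqueness.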
Many people complain about Node.js being single-threaded, but it's actually a big relief; they probably haven't had to deal with issues like locks and race conditions.
To get CPU-bound parallelism in Node.js you have to spawn child processes. There's a built-in module in Node.js called "cluster" that abstracts this a bit.
When talking to a child process you could just as well be talking to another machine over the network; the code looks the same, and scaling horizontally across many machines is easier compared to languages that solve concurrency with threads instead of non-blocking IO.
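One way to see the "child process or remote machine, same code" point (handleMessage and the message shape are made up for illustration): keep the handler transport-agnostic and wire it to whatever carries the JSON.

```javascript
// Hypothetical message protocol: { type: 'sum', values: [...] }.
// The handler doesn't know or care where the message came from.
function handleMessage(msg) {
  if (msg.type === 'sum') {
    const value = msg.values.reduce((a, b) => a + b, 0);
    return { type: 'result', value };
  }
  return { type: 'error', reason: 'unknown type: ' + msg.type };
}

// In a worker started with child_process.fork, the wiring would be:
//   process.on('message', m => process.send(handleMessage(m)));
// For a remote machine you'd parse the same JSON off a socket instead;
// handleMessage itself is unchanged.
```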
There's a learning curve, though. It took me about six months to learn async programming, managing messages between different processes and machines, and callbacks. I do remember the "callback hell", but now it has become second nature; it's like brushing my teeth, but more fun.
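For anyone who hasn't hit it, "callback hell" looks roughly like this (stepA/stepB/stepC are invented placeholders; real ones would do IO). Sequential logic nests one level per step:

```javascript
// Toy steps in (err, result) callback style, called synchronously
// here just so the shape is visible.
function stepA(cb) { cb(null, 1); }
function stepB(x, cb) { cb(null, x + 1); }
function stepC(x, cb) { cb(null, x * 2); }

let result;
stepA((errA, a) => {
  stepB(a, (errB, b) => {
    stepC(b, (errC, c) => {
      result = c; // the "pyramid of doom": each step adds a level
    });
  });
});
```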
I don't think half a year is that bad; you'd need just as long to get comfortable in any other language. And I consider JavaScript very newbie-friendly: you don't need a CS degree to write JavaScript.