Webserver benchmarks are often misleading because of how the tests were run.
nginx is a fully fledged webserver with logging enabled out of the box, among other bells and whistles. Just having access logs enabled adds significant load to the server: log formatting, writes to disk, and so on.
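For a fairer comparison, logging can be switched off in the nginx config. A minimal sketch (the rest of the config is omitted; these two directives are the usual idiom):

```nginx
# Disable the access log entirely, and keep only critical errors,
# so the benchmark measures request handling rather than disk I/O.
access_log off;
error_log /dev/null crit;
```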
At the very least, include the configs of each server tested.
The pipelining benchmark is identical to Japronto's (another, very similar project posted here on HN a few days ago). Japronto's GitHub repo contains the wrk pipelining script used.
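For reference, a wrk pipelining script generally follows this pattern (a sketch of the common approach; the pipeline depth and path here are placeholders, not necessarily the values Japronto uses):

```lua
-- Build a batch of identical requests once in init(), then return the
-- whole batch from request(), so wrk writes N pipelined requests per send.
init = function(args)
  local depth = 16            -- placeholder pipeline depth
  local r = {}
  for i = 1, depth do
    r[i] = wrk.format(nil, "/")
  end
  req = table.concat(r)
end

request = function()
  return req
end
```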
I haven't had the time to add configurations for every server tested (especially Apache and NGINX), but the main point here is to showcase the performance difference between plain Node.js and Node.js with µWS.
We don't need to take his word for it. It's open source, so we can run the tests ourselves.
I think it's completely understandable that he threw in the others, probably with default configs, without caring much about them, since they weren't the point of the writeup.
It has a mostly compatible API, but strict conformance doesn't seem to be the goal here. If your application doesn't rely on obscure features of core http (and it could probably be refactored to do without them anyway), then it's a free performance boost.