These benchmark tests for C# are run against MySQL or PostgreSQL on Linux. In a Fortune 500 setup you're probably connecting to SQL Server or Oracle on the back end, for which Microsoft and the DB vendors provide optimized OLE DB drivers.
That, and JSON serialization on .NET using the default MS serializer is super slow. In the real world everyone uses JSON.NET or some other, faster serializer.
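To illustrate the swap (a rough sketch only, assuming a trivial POCO and references to System.Web.Extensions plus the Newtonsoft.Json NuGet package):

    using System;
    using System.Web.Script.Serialization;   // built-in JavaScriptSerializer
    using Newtonsoft.Json;                    // JSON.NET

    public class Message
    {
        public string message { get; set; }
    }

    class SerializerDemo
    {
        static void Main()
        {
            var payload = new Message { message = "Hello, World!" };

            // The stock serializer that default ASP.NET samples tend to reach for
            string builtIn = new JavaScriptSerializer().Serialize(payload);

            // JSON.NET, the drop-in most real-world .NET apps use instead
            string jsonNet = JsonConvert.SerializeObject(payload);

            Console.WriteLine(builtIn);   // {"message":"Hello, World!"}
            Console.WriteLine(jsonNet);   // {"message":"Hello, World!"}
        }
    }

Both calls produce the same JSON for a payload like this; only the serializer underneath changes.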
We have SQL Server tests, but they were not included in this round. They were last run in Round 7 and we'll include them again in Round 9. Here is SQL Server in Round 7:
Also, we have a JSON.NET test implementation thanks to a community contribution. It's test #138, named "aspnet-jsonnet", as seen on the following chart of C# tests:
Just time. We were already two weeks late on Round 8 due to a host of other issues. We'd like to do one round per month if we can get our routine ironed out.
We'll make it a priority to get them run in Round 9.
Please don't be annoyed by my following comment, as I appreciate your effort and like the benchmarks very much (as a hacker - as a project leader I'd prefer the "enterprise frameworks" to perform much better :-) ): You should really avoid publishing incomplete benchmarks. They don't do justice to either the frameworks left out or the ones included.
> You should really avoid publishing incomplete benchmarks.
If we had followed that advice, there never would have been a round 1. I was so uncomfortable with the idea that we'd be publishing surely-flawed benchmarks of frameworks we didn't really understand that I requested to be taken off the project (prior to round 1). It was only after seeing the post-round-1 discussions and the flood of pull requests that I realized I was wrong.
These benchmarks are always going to have flaws. I think it is better for us to regularly publish our best attempt than to try for perfection.
How about those results under the plaintext benchmark, run on Windows with no DB access to MySQL or anything else? Still very slow. The low-level libraries for HTTP/disk/etc. for C# on Windows or Linux simply aren't built for performance in general, and that's what this benchmark is reflecting. You're dismissing these results a bit too quickly and defensively, I think.
Those numbers might not match up to those of other frameworks, but they are by no means slow. ~29k requests per second (standard ASP.NET) equals 2,505,427,200 requests per day.
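For what it's worth, that daily figure looks like straight multiplication: roughly 28,998 requests/second × 86,400 seconds/day = 2,505,427,200 requests/day, i.e. about 2.5 billion.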
That's far more capacity than anyone needs, and if your site ever does reach the point where 2.5 billion people visit it per day, then you can just put up another box and double your capacity to 5 billion requests per day.
This doesn't equal out to 2.5 billion people a day. One request does not equal a unique visitor; in most cases it doesn't even equal one page load. I don't know where the 29k/s number came from, but I'd guess it's a peak load from one of these benchmarks. It isn't realistic to expect a server to be consistently pegged at 100% 24 hours a day, so the real number is going to be a tiny fraction of that.
Sure, as long as your website is just a tiny piece of plain text, all your requests are distributed perfectly evenly over the course of the day, and every person only requests a single file and then leaves. But since none of that is even remotely close to realistic, your numbers aren't either. Yes, 29k req/s is quite slow for a "serve a tiny static file" benchmark. No, you cannot handle a billion visitors a day on one server with such low performance.