Hacker News

One reason that I know of is that you can get better performance.

Now I know there are claims that C# and Java are performance-competitive with C++. You can see some of that in the benchmarks game at http://shootout.alioth.debian.org: for simple problems, a wide variety of results. One interesting one is the n-body problem, which involves a lot of hard-core computation. Is the winner C++, Java, or .NET? Well, http://shootout.alioth.debian.org/u32/performance.php?test=n... says that the winner is . . . Fortran! (Not my favorite language either.) g++ comes in at a factor of 1.3, the best Java at 1.5, and the best C# at 2.1. That is, it takes the best C# program 2.1 times as long as the fastest program to compute this problem.

The StackExchange post said, "But in this era, the performance of a program written in a language based on frameworks such as C# and Java can be pretty close to that of C++." A factor of 1.6 is not pretty close, in my book. If you are in high-speed markets and all other things are equal, what does placing your order at 1.6 ms when the other guy places his at 1.0 ms mean? It means you are further down in the order book. You lose.

So we all know that these toy benchmarks don't really represent what happens in a large, useful program. It would be interesting to build a more extensive benchmark set, don't you think?

Having worked on a very high-performance stock-options feed, I can share some of my experience. The development goes something like this. You start out in C++, because you get objects and other useful stuff. You see that during peaks of the day, say around 0900 CST, you are beginning to fall behind. So you begin to tune.

See, lots of people decry the C++/C combination, saying that they are different languages. Well, sort of. If you are of a theoretical or dogmatic bent, sure. But if you have a C++ program that is taking too long, you can relatively easily, bit by bit, turn it into a C program. So I am fine with the C/C++ designation in practice.

You tune this thing, making sure that you allocate your objects, say, at 0829 and tend to leave them in place until 1500. Then you up the -O level, hoping you are not pulling a Heisenberg. If you still need a little headroom, you learn that you can turn off exceptions in g++. Yes, even though you don't ever throw an exception, leaving them enabled costs you CPU. (What were they thinking?) So while the compile flags and the source-file extension say C++, what you are executing more closely resembles C.

But there are those who fervently say that in such an environment Java is competitive with C/C++. If you look at the cost to build the program, I am likely to agree that that part of the effort is faster with Java. But can you tune Java as far as you can tune C++? Or .NET? I suspect not.

I think we should have a broader, real-life example. I am thinking that a simulated financial exchange, implemented repeatedly in competing languages, might be a more interesting benchmark. In fact, I think I will go off and give that a try. Say Lisp (note that it whips the Java server in some examples; take that!), Java, C++, and certainly not Fortran (due to personal prejudices).

I'll let you know how that turns out.




I'm not a quant, and quite removed from the field. The transaction systems I work with are much slower, and most likely deal with significantly less information.

But is execution speed really the be-all and end-all? Do flexibility and robust recovery not matter?

Everything I'm hearing is that it's better to gut the car down to the frame, with no safety measures, than to keep even a little bit of coverage. If that program crashes or cross-wires some data, how much damage can it do to your financial position?

From what I understand, billions of dollars are on the line every day, but everyone is racing toward that nanosecond mark, generally at the expense of stability.

Things that would never fly in other high-risk environments seem to fly in the stock system. I'm going out on a limb here, but it seems that's because there is no adult watching the kids play in the stock-market sandbox, and no fuse that will blow if an HFT feedback loop triggers some real idiocy. (http://www.zerohedge.com/article/hft-fat-digital-finger-brea... )


There's a dramatic difference between high-frequency trading and other quantitative trading. In HFT, speed is ALMOST everything. It's not entirely, but many of these firms are running VERY similar algorithms (most of the quants have worked at many other firms), and are primarily competing on speed or gaming each other.

Now, there is still a market for advanced quantitative modeling in which reliability is important, and the ease of maintenance, troubleshooting and extensibility make up for microseconds or nanoseconds. A lot of statistical arbitrage can be categorized in this way, although some of the ideas in stat arb are making their way to HFT.

Basically, different applications have different performance requirements. And even though performance is now measured in low microseconds, the battle for nanosecond performance is underway, and extremely lucrative.

Now as to the argument of whether any of this actually provides any value to society, I think the answer is a definitive "no" (wrt HFT). But that's just my opinion - I used to be in the industry but left because of that. The majority of arguments used to justify HFT are really just self-serving rationalizations by people who are only interested in getting rich. It's another example of a screwed up incentive system that is increasing risk in the market rather than decreasing it.


I work in this environment, and execution speed is the "end all be all" for certain programs in the distributed system that makes up the average modern trading environment in most, if not all, firms.

Your analogy to a car is exactly the one that I've used for years: a race car doesn't generally have air bags or electronic stability control; it's stripped down to four wheels, an engine, and a steering wheel. I don't really trust "robust recovery" as an idea in this sort of environment anyway, because most error conditions are extremely rare and usually involve errors in external systems such as the exchange. The recovery code is therefore impossible to test in any normal sense, and thus very likely to be incorrect. My philosophy (developed after a fairly long career in this field) is: "when in doubt, print a descriptive error message, call abort(), and let the support team sort it out." I am extremely risk averse when it comes to the systems I design, so it's not a case of lack of adult supervision, but more an understanding of the limits of what you can do in an extremely complicated environment, and that writing lots of error-handling code gives only the illusion of safety.


I was only peripherally involved in trading systems. My understanding is that the amounts of money to be made, and lost if you're second, completely dwarf the arguable maintenance and personnel costs of C++.


That is true for a small subset of the stuff quants do. For a lot of other stuff, an hour this way or that makes hardly any difference at all. A normal day for one of my quant friends basically looks like this: get into the office and spend the morning putting together some calculations, run them and go to lunch, get back from lunch and discuss the results with some colleagues, and if you decide to trade based on the numbers, call a broker. Seconds and minutes are totally irrelevant to her.




