Wall Street Accelerates Options Analysis With GPU (wallstreetandtech.com)
9 points by signa11 on March 18, 2009 | 10 comments



You will note it includes no standard benchmarks, only a vague hand-waving claim that it's orders of magnitude faster. No real-world numbers are quoted, nothing is said about how good the CPU implementation was, and so on.

After working in GPGPU for a while, I have become really wary of vendor claims. Vendors do all sorts of tricks to make their claims look good: comparing against single-core, non-optimized CPU code; comparing different algorithms on GPU and CPU; not reporting the time required to transfer data between CPU and GPU; comparing only 32-bit float performance when 64-bit floats should be used; and so forth.
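To make the transfer-time trick concrete, here is a minimal sketch (plain CUDA C with a made-up saxpy kernel, not anything from the article) that times the same launch both with and without the host-device copies:

    // Illustrative only: "kernel only" vs "with transfers" timings of the
    // same work can tell very different stories about GPU speedup.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 24;
        std::vector<float> hx(n, 1.0f), hy(n, 2.0f);
        float *dx, *dy;
        cudaMalloc(&dx, n * sizeof(float));
        cudaMalloc(&dy, n * sizeof(float));

        cudaEvent_t t0, t1, t2, t3;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventCreate(&t2); cudaEventCreate(&t3);

        cudaEventRecord(t0);                       // start: includes transfers
        cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaEventRecord(t1);                       // start: kernel only
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
        cudaEventRecord(t2);                       // end: kernel only
        cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaEventRecord(t3);                       // end: includes transfers
        cudaEventSynchronize(t3);

        float kernelMs, totalMs;
        cudaEventElapsedTime(&kernelMs, t1, t2);
        cudaEventElapsedTime(&totalMs, t0, t3);
        printf("kernel only: %.2f ms, with transfers: %.2f ms\n",
               kernelMs, totalMs);

        cudaFree(dx); cudaFree(dy);
        return 0;
    }

Quoting only the kernel-only number is exactly the kind of vendor comparison to be suspicious of, especially for a memory-bound operation like this one.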

Of course, GPUs are certainly very good for some types of problems. But they are not magical solutions and are not suitable for most problems.


I found this blog post pretty insightful on this topic: http://mainroach.blogspot.com/2007/11/gpgpu-vs-multi-core-cp...


I don't understand how he can say that a CPU is as efficient as a GPU at floating-point operations. Yeah, maybe on a per-core basis, but you are comparing (like) 8 cores to (like) 240.

The basic difference is that the silicon a CPU spends on hiding memory latency (cache) goes into raw computational power and memory bandwidth in a GPU. So if the data are structured to fit the data-parallel paradigm, the GPU will kill a normal CPU. Not because of some magic, but because that is what it was designed to do.

You wouldn't try to run an operating system on a GPU, or a web browser, or whatever. But to say that a GPU can only handle rasterization is to vastly understate the realm of applicability. There are plenty of fp-intensive, data-parallel tasks that fit very well into a GPU paradigm. A lot of scientific calculations, for example.
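For a concrete picture of what "fits the data-parallel paradigm" means, here is a minimal sketch (plain CUDA C; the kernel and its inputs are illustrative, not taken from the article) that prices a batch of European calls with one thread per option:

    // Illustrative only: one thread per independent option. Black-Scholes is
    // just a convenient fp-heavy per-element workload for this pattern.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    __device__ float norm_cdf(float x) {
        return 0.5f * erfcf(-x * 0.70710678f);   // N(x) = erfc(-x/sqrt(2)) / 2
    }

    __global__ void bs_call(int n, float r, float sigma,
                            const float *S, const float *K, const float *T,
                            float *price) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;                      // each thread prices one option
        float sqrtT = sqrtf(T[i]);
        float d1 = (logf(S[i] / K[i]) + (r + 0.5f * sigma * sigma) * T[i])
                   / (sigma * sqrtT);
        float d2 = d1 - sigma * sqrtT;
        price[i] = S[i] * norm_cdf(d1) - K[i] * expf(-r * T[i]) * norm_cdf(d2);
    }

    int main() {
        const int n = 1 << 20;
        std::vector<float> S(n, 100.0f), K(n, 95.0f), T(n, 0.5f), P(n);
        size_t bytes = n * sizeof(float);
        float *dS, *dK, *dT, *dP;
        cudaMalloc(&dS, bytes); cudaMalloc(&dK, bytes);
        cudaMalloc(&dT, bytes); cudaMalloc(&dP, bytes);
        cudaMemcpy(dS, S.data(), bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dK, K.data(), bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dT, T.data(), bytes, cudaMemcpyHostToDevice);

        bs_call<<<(n + 255) / 256, 256>>>(n, 0.05f, 0.2f, dS, dK, dT, dP);
        cudaMemcpy(P.data(), dP, bytes, cudaMemcpyDeviceToHost);
        printf("price[0] = %f\n", P[0]);

        cudaFree(dS); cudaFree(dK); cudaFree(dT); cudaFree(dP);
        return 0;
    }

Every option is priced independently, so there is no synchronization or data sharing between threads; that is the shape of problem where the extra ALUs and bandwidth actually pay off.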


"Vendors do all sort of tricks to make their claims look good..."

Reminds me of: http://www.dilbert.com/2009-03-02/


Unfortunately, to get good performance you basically have to use different algorithms on CPU and GPU. This makes comparisons difficult.
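A small example of what this means, assuming the task is just summing an array: the natural CPU version is a single sequential accumulation, while the GPU version is typically restructured as a per-block tree reduction in shared memory with the partial sums finished off afterwards. They are genuinely different algorithms, which is part of why "CPU loop vs. GPU kernel" comparisons are apples to oranges.

    // Illustrative only: summing n floats. CPU side is one running total;
    // GPU side is a per-block tree reduction, finished on the host.
    #include <cstdio>
    #include <vector>
    #include <numeric>
    #include <cuda_runtime.h>

    __global__ void block_sum(const float *in, float *partial, int n) {
        extern __shared__ float s[];
        int tid = threadIdx.x;
        int i = blockIdx.x * blockDim.x + tid;
        s[tid] = (i < n) ? in[i] : 0.0f;        // one element per thread
        __syncthreads();
        for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
            if (tid < stride) s[tid] += s[tid + stride];  // pairwise combine
            __syncthreads();
        }
        if (tid == 0) partial[blockIdx.x] = s[0];  // one partial per block
    }

    int main() {
        const int n = 1 << 20, threads = 256;
        const int blocks = (n + threads - 1) / threads;
        std::vector<float> h(n, 1.0f), partial(blocks);

        // CPU algorithm: sequential accumulation.
        float cpuSum = std::accumulate(h.begin(), h.end(), 0.0f);

        // GPU algorithm: tree reduction per block, then finish on the host.
        float *din, *dpartial;
        cudaMalloc(&din, n * sizeof(float));
        cudaMalloc(&dpartial, blocks * sizeof(float));
        cudaMemcpy(din, h.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        block_sum<<<blocks, threads, threads * sizeof(float)>>>(din, dpartial, n);
        cudaMemcpy(partial.data(), dpartial, blocks * sizeof(float),
                   cudaMemcpyDeviceToHost);
        float gpuSum = std::accumulate(partial.begin(), partial.end(), 0.0f);

        printf("cpu %.1f  gpu %.1f\n", cpuSum, gpuSum);
        cudaFree(din); cudaFree(dpartial);
        return 0;
    }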


It really feels like some sort of joke about how much faster they could go bankrupt would be appropriate here.


The banks went down because of structured products, not exchange-traded options. Not all derivatives are created equal.


This time. I seem to recall those Black-Scholes models, or whatever they are called, have been blamed for other "incidents" (ref. Taleb and The Black Swan here).


Yup. The LTCM meltdown in 1998 was caused by options, if I remember right. Back then the Wall Street banks bailed out LTCM to avoid a global crisis of nightmare proportions.


And lots of lessons were learned, and taken to heart? (cough)



