
> but this is not explicitly visible to the programmer, and serious infrastructure is devoted to hiding that latency

What do you mean? You can hide latency on the SPU by double-buffering the DMA, but in a shader there is no such infrastructure at all and no way to hide it: you just block until the memory fetch completes before you need the data.

> they are outright programmer-hostile

Depends on the programmer, I guess. I enjoyed programming SPUs, and I don't personally know anybody who had complaints. I've only read about the "insanely hard to program PS3" on the internet and wondered, "who are those people?". It's especially funny because the RSX was a pitiful piece of crap with crappy tooling [+] from NVidia, yet nobody complaining about SPUs mentions that.

[+] Not an exaggeration. For example, the Cg compiler would produce different code if you added +0 to or multiplied by *1 random scalars in your shader, and not necessarily slower code, either! So one of the release steps was brute-forcing these tweaks to shave a few clocks off the shaders.


