
ATI and Nvidia would still have to write code to make it work. That's the hard part, not coming up with the idea.



Are you serious? Figuring out the algorithm is absolutely the hard bit here. When was the last time you had trouble implementing an algorithm?

I feel like I'm feeding a troll here - I had to check your profile to be sure I wasn't. I think you're letting your dislike of patents warp your normally intelligent viewpoints.


It's sometimes said on HN that "actually implementing it is the real problem, not coming up with the idea". That saying refers to startups: in that domain, an idea like "let's do a site just like MySpace but with feature X!" is worth nothing, while an actual working product can be worth a lot.

I don't think it's trolling; he just carried the phrase over to a domain where it doesn't apply. That's a good reminder that web startups are not representative of all programming, business, or engineering problems, and that one shouldn't repeat a phrase like a meme without thinking about its implications.


I'm not going to comment on patents here, but you're absolutely wrong in assuming that implementing an optimized algorithm is trivial, especially in real-time graphics.

Sure, coding up a B-tree from a textbook description is easy. But in a video game, the difference between 10 fps and 30 fps is the difference between unplayable and perfect, which means your "by the book" implementation likely won't cut it. Video game developers spend months squeezing the last 1.1x improvements out of their inner loops, using clever bit-packing techniques, cache-aligned data layouts, and often even hand-optimized assembly.
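As an illustration of the kind of bit-coding trick being described (all field names and widths here are invented for the example, not taken from any particular game): packing several small fields into one 32-bit word lets four times as many entities fit in a cache line as four separate ints would.

```cpp
#include <cstdint>

// Hypothetical particle state packed into one 32-bit word:
// x: 12 bits, y: 12 bits, type: 7 bits, alive: 1 bit.
uint32_t pack(uint32_t x, uint32_t y, uint32_t type, uint32_t alive) {
    return (x & 0xFFFu)
         | ((y & 0xFFFu) << 12)
         | ((type & 0x7Fu) << 24)
         | ((alive & 1u) << 31);
}

// Matching accessors: shift the field down, then mask off its width.
uint32_t unpack_x(uint32_t w)     { return w & 0xFFFu; }
uint32_t unpack_y(uint32_t w)     { return (w >> 12) & 0xFFFu; }
uint32_t unpack_type(uint32_t w)  { return (w >> 24) & 0x7Fu; }
bool     unpack_alive(uint32_t w) { return (w >> 31) & 1u; }
```

The speedup comes not from the packing itself but from memory traffic: an inner loop touching millions of these per frame reads a quarter of the bytes it otherwise would.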


Getting the algorithm to run on an unreliable parallel computer (i.e. a video card) is pretty hard.
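One concrete instance of why the port is hard, sketched in plain C++ rather than actual GPU code (function names are made up for the example): float addition is not associative, so merely splitting a sum across workers, as a video card does across thousands of threads, can silently change the answer.

```cpp
#include <cstddef>
#include <vector>

// Serial left-to-right sum.
float sum_serial(const std::vector<float>& v) {
    float s = 0.0f;
    for (float x : v) s += x;
    return s;
}

// The same sum split into two chunks combined at the end, mimicking how
// a two-worker parallel split would associate the additions differently.
float sum_two_chunks(const std::vector<float>& v) {
    std::size_t mid = v.size() / 2;
    float a = 0.0f, b = 0.0f;
    for (std::size_t i = 0; i < mid; ++i) a += v[i];
    for (std::size_t i = mid; i < v.size(); ++i) b += v[i];
    return a + b;  // different rounding than the serial order
}
```

For input {1e8f, 1.0f, -1e8f, 1.0f} the serial order yields 1.0f while the two-chunk order yields 0.0f, because each 1.0f is absorbed into the large term in the second version. Reconciling that kind of divergence, on top of synchronization and driver quirks, is part of what makes GPU ports genuinely hard.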


That may be true for your average webapp, perhaps even for a Reddit or FarmVille, but it's not true for an innovative algorithm.


Bullshit.



