Who else is there… AMD?



Also Intel. Being Nvidia-only is not great from an accessibility point of view: it means that only ML researchers and about 60% of gamers can run this.


Thankfully it's a specialist software package not aimed at gamers or ML researchers.


And most people in Hollywood, who use rendering engines like OptiX.


No, they don't. Also, OptiX isn't a renderer; it just traces rays and runs shaders on the ray hits, on Nvidia cards (rough sketch below). Memory limitations and immature renderers hold GPU rendering back. The makers of GPU renderers want you to think it's what most companies use, but it is not.

Also, Hollywood is a neighborhood in Los Angeles, and most computer animation is not done there. The big movie studios aren't even in Hollywood, except for Paramount.
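
To make the distinction concrete, here is a rough, hypothetical sketch of OptiX 7-style device code (the Params struct, buffer names, and program entry names are made up for illustration). OptiX launches the ray-generation program and calls the closest-hit or miss program for each ray; everything that makes something a renderer (cameras, materials, light transport, sampling, output) is code the application has to supply around it.

    #include <optix.h>

    // Hypothetical launch parameters -- names are illustrative, not from any real project.
    struct Params
    {
        float*                 output;   // one value per pixel, written by the raygen program
        unsigned int           width;
        OptixTraversableHandle handle;   // acceleration structure built by the host application
    };
    extern "C" __constant__ Params params;

    // Ray-generation "shader": decides which rays to shoot. Cameras, sampling and
    // light transport all live in application code like this, not inside OptiX.
    extern "C" __global__ void __raygen__rg()
    {
        const uint3 idx = optixGetLaunchIndex();

        // Hypothetical fixed camera ray; a real renderer derives this from its own camera model.
        const float3 origin    = { 0.0f, 0.0f, -1.0f };
        const float3 direction = { 0.0f, 0.0f,  1.0f };

        unsigned int p0 = 0;  // payload register used to pass a result back from the hit/miss programs
        optixTrace( params.handle, origin, direction,
                    0.0f, 1e16f, 0.0f,              // tmin, tmax, ray time
                    OptixVisibilityMask( 255 ),
                    OPTIX_RAY_FLAG_NONE,
                    0, 1, 0,                        // SBT offset, SBT stride, miss SBT index
                    p0 );

        // What to do with the result is entirely up to the application.
        params.output[idx.y * params.width + idx.x] = __uint_as_float( p0 );
    }

    // Closest-hit "shader": OptiX calls this when a ray hits geometry; the shading is ours to write.
    extern "C" __global__ void __closesthit__ch()
    {
        optixSetPayload_0( __float_as_uint( 1.0f ) );  // just flag the hit; a renderer would shade here
    }

    // Miss "shader": called when the ray hits nothing.
    extern "C" __global__ void __miss__ms()
    {
        optixSetPayload_0( __float_as_uint( 0.0f ) );  // background
    }

The host side (building acceleration structures, the pipeline, and the shader binding table, then integrating the results into an actual image) is likewise application code; OptiX only provides the traversal and program dispatch.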


Except that is what companies like OTOY happen to build their products on.

https://home.otoy.com/render/octane-render/

As for the rest of the comment, it's the usual Nvidia hate.


Octane is exactly the type of thing I'm talking about. It is not what film is rendered with. Film is mostly rendered with combinations of PRMan, Arnold, or proprietary renderers, all CPU software.

I don't know where you're getting "Nvidia hate"; studios that use Linux usually use Nvidia, mostly because of the drivers.

None of this changes the fact that OptiX is not a renderer.


AMD and Intel, yeah.


AMD, Intel, Apple, Samsung and all the other mobile chip makers.


Yes?


NVDA is 82% of the market and rising.


Are we celebrating monopolies now?


The difference between current AMD and Nvidia GPUs isn't even that large when viewed in terms of price/performance... Comparing cards at a similar price, AMD has slightly less performance while offering significantly more GDDR memory.

I still use an RTX 3080 though; thankfully I got one before the current craze started.


The difference between AMD and Nvidia is _huge_ when you look at software support, drivers, and so on. Part of this is network effects and part of it is just AMD itself. But the hard reality is I'd never buy AMD for compute, even if its specs were better.

Just as a random anecdote, I grabbed an AMD 5700 XT around when those came out (for gaming). Since it was sitting around between gaming sessions, I figured I'd try to use it for some compute, for Go AI training. For _1.5 years_ there was a showstopping bug: it simply could not work for all of that time. They _still_ do not support this card in their ROCm platform, last I checked. The focus and support from AMD are just not there.


Actually, they lost 1% and are now at 81%.



