Hacker News
AMD's Answer to GameWorks: Open Source Tools, Graphics FX, Libraries and SDKs (wccftech.com)
164 points by altern8 on Dec 16, 2015 | 70 comments



That's exactly why I buy AMD cards over Nvidia again and again. Who cares about super fancy graphics and yet another 2% faster shader? What I want is properly supported hardware with working open source drivers, from a company that has an interest in its users hacking their own solutions, even when those get better than its proprietary ones.


How does the current AMD driver fare in Linux land?

Two years ago I was working on a large GL project, and basically the only workable drivers were either Intel's or Nvidia's, with Nvidia being far better at pretty much anything, from driver quality to debugging tools.

I'm using an HD4400 on my primary laptop, mostly to have a low-end system for performance tuning. While everybody says that Intel should be one of the best-supported OSS drivers (if not the only one), the i915 driver constantly breaks with a million little issues which are fixed in one release of the driver and broken again in the next. In the end, all these issues pop up in the wild, where I cannot even restrict users to a single driver version (because many of the issues are chipset-specific).

Despite whatever I read here, over and over again, the Nvidia driver has always been rock-solid for me, with Intel being at least on par, and AMD not even remotely comparable.

If anything, I've had a harder time with Intel, due to unpredictable GPU stalls. Even when it comes to power management, the latest i915 driver fails to come back from suspend.

I really want to switch to something higher quality and OSS, so I'm hoping AMD steps up its game.

I do not need another half-assed driver like i915.


This, a thousand times. In our company we have a lot of experience with various graphics chips and drivers (especially mobile). In our experience, NVIDIA is the only company that delivers rock-solid drivers across the board. Everything else is half-baked at best.

I can also confirm the GPU stalling on Intel.

Even though other chips perform OK in benchmarks and some AAA games, when it comes to less popular 3D apps/games or writing your own graphics code, NVIDIA just eats its competition for breakfast. We sometimes get better GL performance out of a two-year-old Tegra 4 than out of everything the competition offers today. And this is entirely down to drivers, not because the Tegra 4 is such an advanced chip. Not even talking about the K1 or X1 here, which are on different levels entirely.

Edit: inserted a "not"


I've been using an AMD card for years now. It's stable and works fine for everyday use without problems. I'm particularly fond of the fact that the installer itself builds distro-specific packages for installation (deb, rpm, etc.). That said, nVidia does outperform it, especially in games. And if you don't play games, the open source AMD drivers are almost good enough to make the proprietary driver unnecessary.


I do not care about games. I work on visualization, which means I do not care about little performance differences, and I do not root for one specific card or the other. I'd like my stuff to work decently on whatever I encounter.

But I do care about general performance and driver quality, especially as a developer. I love the fact that i915 is always the first to get support for the latest kernel features, and it's probably going to have the first usable Vulkan driver, YET every time I find dozens of minor (and not so minor) regressions, to the point that I'm simply tired.

The Catalyst drivers were _never_ an option on Linux. When it comes to development, I cannot afford for the system to fail on me twenty times a day just because I'm playing around.

I miss the days of mach64. I hope AMD delivers.


I love the fact that i915 is always the first to get support for the latest kernel features, and it's probably going to have the first usable Vulkan driver, YET every time I find dozens of minor (and not so minor) regressions, to the point that I'm simply tired.

Are there no automated unit tests for the Intel driver?


AMD has excellent in-mainline-kernel drivers that work out of the box. They're much better than, if not quite as fast as, the proprietary Catalyst drivers.


The first and last time I bought an ATI card was in 2003. In the 12 years since then, the drivers have become a little bit more stable and support the newer OpenGL interfaces. They continue to be light years behind nVidia.


I was in the exact same boat and had nearly the exact same experience around that same timeframe. I did buy another one after the AMD buyout: a Radeon 5870 around 2009, and it has been a really great card. I'm still using it today in this machine. The OpenGL support still lags, but not nearly as badly as in '03.

That said, the ghost of ATI still haunts AMD. To buy from them next time, I'll have to get a great price, or they'll have to be way ahead of Nvidia, which they were in '09 with the 5870.


IMHO that's not true anymore. I use my GPU for casual mining, generating hashes en masse and the like, not for gaming. You can say Nvidia is worlds better in gaming, but it's far behind on these specific points.


Here's a newsflash: nobody, except miners, cares about how fast you can hash with a graphics card. I do not buy a card for hashing. The fact that I can run some fixed OpenCL code safely still doesn't say anything about driver quality.


Years ago I had issues en masse and would have said the very same (without the open source part) about Nvidia, as it just worked better. Today most Linux-based operating systems already ship the open source ATI drivers, which are good enough (sometimes even better) for most people.


And yet, 10 years ago the situation was exactly reversed, with ATi sabotaging early DirectX 9 to run worse on Nvidia cards, and ATi pushing a huge pile of proprietary solutions nobody ended up caring about – like TruForm (ATi brand tessellation) or Close-to-Metal (predecessor to Nvidia's CUDA, only when that proved more successful did ATi switch to pushing OpenCL).

Brand loyalty is useless, because brands don't care about consumers.


This seems unfair considering ATI was bought by AMD and now has completely different leadership. The recent moves towards FOSS are pretty much post AMD buyout.


AMD is so far behind at this point that it will take many years before they can leverage a strong market position to try to push some proprietary solution. If they survive at all.


I agree. At the moment I use mostly AMD at home *because* the lower-end server processors I can afford don't have their feature sets gimped the way Intel's do, and their consumer GPUs support IOMMU passthrough, which NVIDIA needlessly restricts to professional cards.

It may only be their weak market share that inspires their generosity, but it's hooked me and certainly has me rooting for their continued existence.


Unfortunately, that is something the regular non-dev user doesn't care much about. And such users make up most of the customer base.

I can only hope that Vulkan will change the game a little bit, so shader hacking won't affect perceived performance that much.


Well, unfortunately, while the idea is admirable, with the current designs, for the majority of the cards, nVidia's cards consume ~60% of the power of their direct AMD competitors, which is very significant.


2% faster, haha. You don't play any games, do you? Nvidia cards have been much faster than AMD's for a while now.


Hardly? Last generation the 290x was easily competitive with the 780 Ti, particularly at higher resolution.

This generation the Fury and Fury X aren't quite as good as the 980 Ti (turns out Maxwell is really good) but aren't bad either.

I like AMD, but CUDA >>> OpenCL, so for any scientific computing Nvidia is the way to go. For gaming the 970, 980, and 980 Ti are too good this time around. AMD isn't far behind however, and next generation could easily flip things around.


AMD tessellation performance is a joke compared to nVidia's. And that's only the beginning.

Benchmarks are useless for most things because they are always fabricated or don't reflect real-world scenarios. For example, Firefox's JS engine is always very fast in benchmarks, but in the real world we know how it goes. Same with everything.


Performance is a joke <---> Benchmarks are useless.

So the only thing that matters is your opinion.


That's wrong. They have, for the majority of cards, a much higher performance-per-watt ratio, which is a different thing.


Plus a higher performance-per-dollar ratio. If you need the best single-card performance, nVidia is probably still the best for most things (unless you're mining litecoin), but you can always buy two AMD cards and use them together for less money and higher performance than the single nVidia card.


Maybe if you're in the market for a Titan X or something, but the R9 Fury and 390 are both great cards for their price points. Performance will also be impacted by Nvidia software like HairWorks and God Rays.


That's simply wrong. See, I don't game. I do casual cryptocoin mining, generating hashes, cracking hashes, ... AMD is MUCH superior in all these cases.

People always assume only gamers care about recent hardware.
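The reason GPUs dominate these workloads is that every hash is independent of every other one, so the work splits across thousands of GPU threads. Here's a minimal CPU sketch of that embarrassingly parallel structure (only `hashlib` is a real API; the `hash_batch` helper is illustrative, and real miners would run an equivalent OpenCL kernel on the GPU):

```python
import hashlib

def hash_batch(seeds):
    # Each SHA-256 digest depends only on its own input, so a GPU can
    # compute thousands of them concurrently via an OpenCL kernel;
    # here the same loop runs serially on the CPU for illustration.
    return [hashlib.sha256(str(s).encode()).hexdigest() for s in seeds]

print(hash_batch(range(3))[0])  # digest of the string "0"
```

Because no iteration depends on a previous one, throughput scales almost linearly with the number of compute units, which is why raw shader count and memory bandwidth matter more here than driver polish.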


Sometimes, the free market does something that really impresses me. This is one of those times. Here we have AMD releasing open source tools in an attempt to beat a competitor whose hardware is technically superior. Sure, this seems like a desperate measure from a company that's lost its way and will try anything to make a buck. And it's probably motivated by a desire to generate lock-in for their hardware. But that's capitalism!


    > And it's probably motivated by a desire to generate lock-in for their hardware.
I don't see how MIT licensed code can generate lock-in for AMD hardware. Would you care to elaborate?


Perhaps I'm misreading it, but the announcement seemed to imply that the source code, however open, is for AMD hardware. So by open-sourcing it, they attempt to generate lock-in to the hardware, which is still just as proprietary as can be. In other words, I don't think that these tools will work on your NVidia GPU. Or not well, at any rate.


Yeah, you're misreading it, because the slides cover a few different things, some of which were already announced in the past:

* Open source game and rendering middleware and tools. Most of it is for Direct3D 11/12, but some of it is OpenCL/OpenGL. None of it can be AMD-only because it's all based on standard APIs. The tech Nvidia provides with GameWorks only supports hardware acceleration on top of CUDA, which is proprietary.

* HSA-related stuff, including a compiler from CUDA. I'm not an expert in this area, but HSA is also not an AMD-only thing, and there are several companies backing it.

* The Linux graphics stack. It's of course AMD-only, but AMD announced that they'll open source their OpenCL and Vulkan implementations so other vendors may benefit from them.

So no, they don't release things that can only be used with AMD GPUs or anything like that. What's more important, anyone can take MIT-licensed code and turn it into a proprietary product, etc.


But most of the stuff people complain about in GameWorks is multiplatform in the same way, especially HairWorks.


The main issue with that part of Nvidia's middleware is the proprietary license: no one can publicly share improvements for other vendors' GPUs.

Also, as far as I understand, the default licensing option for most of Nvidia's middleware doesn't include source code at all.


Thanks for the correction!


AMD also released FreeSync, an open alternative to G-Sync: a dynamic refresh-rate interface for monitors. The Nvidia version requires Nvidia hardware and licensing and adds significant cost to the monitor, whereas FreeSync can be implemented in the control boards that already exist for monitors, at no extra cost.

I think they both make great graphics cards, but NVidia tries its best to stifle competition.


NV lost their bet on G-Sync; it's just not apparent yet. Intel announced they're adopting the tech behind FreeSync for their own IGPs. G-Sync won't hold up against Intel Kaby Lake's IGP plus AMD's APUs and GPUs.

If the trend of Intel adopting AMD's open-standard/libre alternatives to NV's tech continues, I'm not sure how NV can seriously stand against AMD and Intel using the same solutions. Intel IGPs are only going to get better and more relevant over time; Broadwell's Iris Pro was already impressive, and Kaby Lake intends to take that to the next level.


AMD is very badly mismanaged. Ever since K8, engineering has again been perceived as a source of cost. They lost $400M on the failed ARM-server idea without blinking an eye, but plainly refuse to fund proper software support for their products because that would cost like 30-40 engineers' salaries! :(

AMD is the Commodore of this decade.


I'm all for AMD going open source with these tools (I love the fact that they're doing this), but the reason developers go with GameWorks in the first place is that Nvidia is able to allocate staff/resources to game studios to assist with implementation.


Considering how badly GameWorks features perform in many PC games (e.g. in some Ubisoft titles Nvidia-only features were dropped within the first patches after release), this strategy doesn't work.

Though I've only heard good things about Nvidia's OpenGL developer support, and as far as I know they even support devs who aren't part of the "Meant to be Played" thingy. So I suppose many studios that don't even use Nvidia-only features prefer their middleware because of the support.

Sadly, AMD is the opposite, and developer support for OpenGL just doesn't exist. E.g. you can send them emails and tickets and there won't be an answer for months.


Last time I checked AMD was losing something like $200MM a quarter with around $750MM cash on hand. So they could be out of money in a year.

NVIDIA isn't exactly losing this race ... why not just lock yourself in to GameWorks?


Assuming AMD runs out of money tomorrow, we'd still have an installed base of 50 or 60 million XBox Ones, PlayStation 4s and Wii Us that make use of AMD GPUs, together with a 25% market share on computers.

> why not just lock yourself in to GameWorks?

Because only Nvidia users benefit from it, and Nvidia holds barely 55% of the PC market share – everything else is dominated by AMD or Intel (or, in the case of handhelds, various other companies).


And in the long run if Nvidia dominates the market even Nvidia users won't benefit (or won't benefit as much as they could have).


Well, there's the case for other platforms than PC.

Also, maybe if you are a major studio that will with certainty release its game within 18 months, perhaps you should. But if you invest heavily and lock yourself into a specific SDK from a single vendor, you'd better hope the SDK survives and stays relevant.


>NVIDIA isn't exactly losing this race

Because AMD has been putting out great cards the past few months and completely revamped their driver software, while NVIDIA hasn't been doing much of anything. The latest reports have shown AMD regaining market share in discrete GPUs, and this trend will most likely continue. AMD has Zen coming out next year, and they will probably be profitable again by the end of 2016. AMD isn't going anywhere anytime soon.


Reskin, not revamp.


So is it time to buy a new CPU and GPU?


AMD's finances have been miserable for ten years, and they always managed to find new investors.



It's kinda weird, actually. I mean, AMD often has really interesting ideas that may be easy to sell, but execution is always bad and the brand is damaged.


AMD's bad execution has provably been Intel's fault at times; Intel has been proven to have used anti-competitive contracts with its customers (HP, Dell, etc.) in the past, i.e. limiting good CPU prices/discounts unless customers limited their use of AMD.


That was ten years ago. Ever since, AMD has had a free playing field… not that it helped.


There was a little while when AMD had the advantage, wasn't there? Putting out 64-bit and dual-cores while Intel was struggling with the Pentium 4 architecture.


That is what led to Intel's behaviour; in doing so, they limited AMD's free cash flow, and in turn limited AMD's ability to fund newer die shrinks. And Intel's main advantage over AMD at the moment is that it is one die-shrink generation ahead.


The Intel antitrust lawsuit was on Intel's behaviour in the 90s – it just took until 2005 to finally reach US courts (it was filed with the EC in 2000). By the time AMD had competitive CPUs, Intel had already stopped their practices to avoid further scrutiny.

AMD just stopped being competitive too quickly to gain a proper foothold.


AMD is playing on a fair field, but they aren't innovating. Their top-of-the-line processors haven't changed in two years.


Well, they are trying to innovate: HBM, APUs, VR.

The 20nm process node they wanted to use was cancelled, and now they're basically working to push out their 14nm products instead. Kind of sux for them to be fabless.


They'll have it. Zen will be on a 14nm process. I'm excited for Zen; AMD does good things when they wipe the board clean and start over.


Also, they brought on the guy responsible for some of the earlier iPhone processors, which had great single-threaded performance (AMD's weakest point right now).


Jim Keller, their previous lead architect, has left again.

The Zen design would have been complete, or nearly so, by the time he left, but it looks like they're already slipping on their original time frame.

I'm looking forward to what they do, but it's not without some bumps in the road yet.


Like they did with Bulldozer…?


> Kind of sux for them to be fabless.

Shouldn't have sold their fabs, then.


True. But you know how people say things like "people have been predicting AMD's bankruptcy for 10 years, and they're still going"? Well, the reason they're still alive is that they flogged all the silver. And they hocked their teeth. The company is deeply in debt, with negative equity, which the market still values at $2B.


It's in Intel's best interest that they can find investors; whether they have influenced this is an open question.


Do you think your CPU and GPU will stop working if they go out of business?


Well, the GPU at least would quickly become useless without driver updates. Every time a new AAA title is released, Nvidia and AMD scramble to work around game bugs inside their drivers because the developers don't care.


They care, but drivers contain bugs too. Your attitude seems overly negative without much argumentation. Are you sure your negativity is warranted?


>> drivers because the developers don't care

Yeah, because developers have unlimited access to all the GPU/CPU combos that their customers have...


I meant: I need to buy a new AMD CPU and GPU.


AMD has consistently impressed me with stuff like this. I'm definitely going AMD this time when it's time to upgrade the ol' GPU. Nvidia be damned.


They better get into deep learning territory quicker!


Looks like another attempt at grabbing a share of NVidia's CUDA ecosystem... Not gonna happen.



