Hacker News
AMD's 22 year old GPUs are still getting updates (freedesktop.org)
292 points by titaniumtown 11 months ago | 103 comments



I think drivers should be compulsory to open source 5 years after the hardware is no longer sold, it’d be so good for consumers and make hardware last a lot longer.

I’d include things like digital cameras and printers in this revolution. In a lot of cases it might be better using older hardware as new features would be added by the community forever!


If you're interested in actually making this a reality, I recommend engaging in your country's environmental policy community -- and bringing a lot of patience.

Over at KDE we've done something in that direction: we've worked with the German federal environment ministry on extending the criteria for Germany's Blue Angel environmental label with rules for software products. The resulting criteria include language about what's needed to keep old hardware running and useful (since unnecessary hardware replacement spikes the environmental footprint), including using open source to enable the required maintenance.

Eventually, this will have an effect, e.g. when government procurement rounds require bidders to achieve this label, and others start aligning with government practice.

https://eco.kde.org/


I think what's needed from the government side is indemnity for copyright and patent infringement on open-sourced code. What I've always heard is that AMD steals Nvidia's IP, Nvidia steals AMD's IP, etc. so if they ever open-sourced their highest-performance graphics drivers, they would each sue each other into oblivion since the "borrowing" is now out in the light of day. (Remember that "steals" can mean AMD hires an engineer from Nvidia, they run into the same problem 2 years later, and accidentally solve it the same way they did at their other employer. To someone reading the code or a court, that looks identical to coming into Nvidia's office through a vent and stealing a hard drive. But the intent was never malicious.)

If we removed the concept of trade secrets and made copyright and patents on software last for only 5 years, then this problem would go away.

For an example / cautionary tale, look at what happened with Google's range-check function in the Java standard library. Millions of dollars spent litigating over 4 lines of code. Nobody will ever want to open-source anything again, especially if they independently discovered some clever graphics optimization.

Whatever fines governments can levy for not complying with these environmental regulations probably pale in comparison to what infringement lawsuits cost to defend, so I think that's what regulations have to attack.

So right now, companies are scared about the legal risk that open source exposes them to. It's easier for entities that wish you harm to find infringement in code they can just look at, instead of encrypted binaries that are sent to the "Trust Zone" inside hardware. Because it's easy, people look. And if they find it, the company that open-sourced the thing is ruined. "You copied me too" isn't a defense to copyright infringement; you'd have to prove it to make them settle, and if their code isn't open-source, you can't do it.


Software patents are a disaster, a lot like patenting mathematics. It's ridiculous, and we can all imagine what would happen if one company held a patent on calculus, ray tracing, or matrices: it would hold everything back.


This sounds like mutually assured destruction; I think they would have to come to an agreement, once both were opened, not to do that.


I saw this presented at Bits & Bäume conference - excellent work! I've previously heard other, very high-level approaches on "sustainable software", but few seem to care to bring all the ideas into practice. KDE does, apparently!


>I think drivers should be compulsory to open source 5 years after the hardware is no longer sold,

Why wait five years? The hardware stops being sold on date X, the software/firmware must be published the same day.

> In a lot of cases it might be better using older hardware as new features would be added by the community forever!

And here you have found why OEMs will fight, via hordes of lobbyists, any attempt to legislate required publishing of software/firmware for out-of-production hardware. Keeping old hardware useful after the OEM end-of-lifes it means the OEM sees fewer future sales of its "new and improved V2.0" [1] hardware, which translates directly to lower revenue and profit.

[1] where what is "new and improved" is often nothing more than eye candy to convince you to give up your 1.0 version and buy the 2.0 variety.


That's something I'm really angry about right now. Atomos builds the Ninja V, a portable monitor/recorder for videography purposes.

They sold the Ninja V and the Ninja V+, two exactly identical devices, except the Ninja V has artificial limitations on which resolutions it allows you to process. And of course you have to pay extra to enable other codecs, $99 each.

The only way to remove the artificial limitations of the Ninja V is to throw it away, buy an entirely new Ninja V+, and buy all the unlocks separately yet again. Oh, and if you try to resell any of your devices, Atomos wipes the unlocks and demands that the new owner buy them yet again.

So already the worst kind of DRM-infested, money-grabbing BS imaginable.

But no, they found a way to make it even worse. They just deprecated the Ninja V and V+ and announced the all new Ninja and Ninja Ultra. They're identical. Same battery runtime. Same resolution. Same weight. Same performance.

But now with the new and improved AtomOS 11 software. Can't install that on the Ninja V/V+. Only way to upgrade? Throw your Ninja V away and buy a new Ninja.

If I find a way to jailbreak these devices, I fucking will. God I hate them so much.


The only solution is more competition for the Ninja.


Or regulation like the one from the thread, requiring systems to be open sourced in certain cases. It would definitely help


From my experience in the industry, the problem is usually working out the licensing of the tons of licensed code included from other companies, more than a desire to force upgrades.


This is where regulation works well; if hardware manufacturers are required to open their code, it will be a standard and expected part of supplier contracts. If someone wants to license their code, they will have to allow for that or they'll have no buyers.


Do we know this regulation works well? Where's it been implemented?


Maybe I'm naive, but things like UL, RoHS, GDPR, or SOC 2 seem analogous to me. Want to be a supplier for an organization that needs a certification? Then your product and all of your suppliers and all of your competitors need to be able to meet that requirement, and it's just a table stakes feature.


Graphics card drivers are the absolute worst, because not only do you have these licensing issues, but many games need workarounds: the API isn't called correctly, so the driver just... makes up what should have happened.


The government allows software piracy protection.

There should be a trade-off where the source (with a few variables hidden for privacy/security) is escrowed and released after a certain time. Would MS's share price really drop if 20-year-old programs like Windows XP and Office were forced to be open-sourced? Would anyone take a major hit?

The cost would be small, but the upside is the preservation of our technological history.


Yes, I agree. But in practice, this stuff is evolutionary to the point that they'd simply keep a token number of SKUs in reserve, never actually sold but theoretically still available, in order to block that compulsory order. Because if the competition gets direct insight into how the previous generation looked, they will be able to make a much better stab at competing with the current one.

So to solve this you'd have to mandate that they open source all drivers right from when the goods are sold the first time or commit to supporting devices for as long as there are viable units out in the field or something like that. Again, not likely to ever happen for obvious reasons.


Let's just make it an arbitrary number of years after the selling began then, like for patents.


I bought an unbranded drawing tablet a long time ago from a small Chinese company on Alibaba. The driver CD came with source code for the (Windows 3.11 and 9x) driver and configuration utility as well as a PDF describing the interface protocol (which used the serial port). I was pleasantly surprised but then realised that they likely did that because they only cared about selling the hardware and hadn't thought about wanting other user-hostile revenue streams (yet).

I would be fine without open-sourcing drivers if they just released documentation (and perhaps fragments of demo code), but I suspect a lot of companies are reluctant to do either because they believe there is some secret sauce in the software. Otherwise they'd be more interested in outsourcing the work to the "community" for free.


> I would be fine without open sourcing drivers, but just releasing documentation

The problem with that is that it's far too easy for the 'documentation' to be incomplete, and in non-obvious ways.

OTOH if they release incomplete source code it will likely be much more obvious (e.g. 'foo() calls bar() but bar() does not exist anywhere in the codebase, wtf').


Why is it not compulsory to release the source code from day 1? If you're making a driver it's because you're selling hardware, that's where your profit and value is. Commingling the two leads to perverse incentives and conflicts of interest.


For GPUs the driver is mostly a gigantic list of performance hacks for different software packages. That might be an asset to competitors.

But I think NVidia would be perfectly fine financially if they gave this out.


Considering Intel Arc's launch, it would've been a massive boon for them to have this information. Granted, this information is likely highly customized to Nvidia's architecture so it may be of little use, but Arc made it very apparent exactly how many games only barely work at all. You can also see that in the excellent Dolphin emulator team's write-ups.


Intel's woes with Arc stem from the fact that their silicon straight up doesn't support anything below DX12, so they have to rely on software emulation for older APIs.

Nvidia and AMD both support DX versions below DX12 in their silicon, they don't have to emulate anything which means better performance.

Even if Intel had access to Nvidia and AMD's driver source codes, none of it would be useful because they don't have the silicon.



I'd actually say nvidia wouldn't be perfectly fine giving this out. Their drivers are a big part of the secret sauce that makes things like CUDA so dominant.

Nvidia's biggest product is CUDA.


Because if I can rewrite my printer to not intentionally be a total piece of shit, the company gets less money by intentionally expiring or wasting ink, DRM locking ink cartridge brands, or "service needed" bricking the device after a period of time.

There's also the matter of lawsuits. If you buy a device the manufacturer no longer supports with updates, you install a third-party update, and it ruins the device, the RMA costs the manufacturer. Worse, the customer(s) may file a (class action) lawsuit against the company over being unable to find official firmware/driver updates: people ruin their hardware, get RMA rejections, or aren't even aware of an RMA process and simply buy another one.

Personally, I'd rather there be open source drivers/firmware for everything from day one, but looking at both sides, I fully understand the liability and why that's something that only exists in the hobbyist world.


>> Why is it not compulsory to release the source code from day 1? If you're making a driver it's because you're selling hardware, that's where your profit and value is. Commingling the two leads to perverse incentives and conflicts of interest.

> Because if I can rewrite my printer to not intentionally be a total piece of shit, the company gets less money by intentionally expiring or wasting ink, DRM locking ink cartridge brands, or "service needed" bricking the device after a period of time.

That sounds exactly like the "perverse incentives and conflicts of interest" that the parent comment mentioned. I don't think they're asking why it isn't required now, but why it _shouldn't_ be required. What you're saying sounds more compelling as an argument in favor of what they propose, not in favor of the status quo.


The company can create a way to void the warranty, or disclaim responsibility and leave it to you, when third-party modifications or drivers are used, but it shouldn't outright prevent them or make them a pain for users. Something like how Android ROMs and rooting used to be.


> Because if I can rewrite my printer to not intentionally be a total piece of shit, the company gets less money by intentionally expiring or wasting ink, DRM locking ink cartridge brands, or "service needed" bricking the device after a period of time.

This is exactly the kind of perverse incentives that I'm talking about though. And this kind of misbehavior happens way more than someone installing a 3rd party update that bricks someone's device.


I think the ideal solution here would be if companies were required to ship an open source driver, and then optionally offer a proprietary driver for an extra fee which includes whatever 'special sauce' (as another comment put it) that they don't want to release.

The example I'm thinking of is Nvidia's newer GPUs and DLSS. The hardware would come with open drivers, but if you want the upscaling that's an additional fee. While maintaining additional drivers is more work for companies, I think they'd actually benefit from this because it could be a recurring revenue stream for older hardware.


Why five years? Modules (speaking precisely about Linux) should be open source upon sale!

Nothing new for AMD and Intel. But there's likely a new battleground below that: firmware. Nvidia (and both of the above) push a lot of functionality there now, and firmware often contains security and bug fixes these days.

PS: Hoping right now that the newest firmware for RDNA2 fixes issues with PSR (backlight turns off).


> Modules (precisely speaking about Linux) shall be open-source upon sale!

Linux's interpretation of the GPL allows closed source and proprietary modules (providing you conform to a somewhat limited set of interfaces).

There's a lot of closed source binary lumps out there for different hardware.


> Linux's interpretation of the GPL allows closed source and proprietary modules

Not officially, IIUC. If they were officially allowed, this is what would happen: <https://lwn.net/Articles/162686/>


http://linuxmafia.com/faq/Kernel/proprietary-kernel-modules....

Linus said it was okay long ago; that's as close to official as one can get. His views later became more nuanced, but they still largely authorize binary-only drivers.

It may still be legally problematic (as there are many contributors who might be able to make a claim), but 28 years of policy precedent make such an enforcement attempt difficult.

There's all the EXPORT_SYMBOL/EXPORT_SYMBOL_GPL infrastructure for this exact reason.


> Linus said it was okay long ago

You and I have a very different understanding of the very long text at that link.


> > long ago

> understanding of the very long text

You have the original statement in 1995, which is nice and short.

You clipped from my statement:

> > Later his views became more nuanced

Waffling starts in 2002.

IMO, the fact that there's an interface maintained for the explicit purpose of marking symbols for non-GPL compatible code makes it pretty damn difficult to enforce.


Contrary to popular belief, AMD and Intel only do that for some of their cards, not all of them.


The larger problem is that drivers are written by hardware manufacturers.

This is the perverse incentive: driver optimizations present competitive value to manufacturers, so manufacturers keep drivers secret in order to monopolize that value.

The problem is that by monopolizing their driver, the manufacturer also monopolizes the responsibility of driver creation, compatibility, security, and maintenance. This doesn't just prevent competition from other manufacturers: it prevents collaboration with anyone else.

There is nothing to incentivize a manufacturer to actually uphold the responsibility that they have monopolized. It's actually the opposite: monopolized responsibility is an opportunity to implement planned obsolescence.

The purpose of copyright and patent monopoly is to incentivize knowledge sharing. In this case, the opposite is true. If we want drivers to be maintainable, then we have 2 options:

1. Require all drivers to be open source.

2. Break up the vertical integration of in-house driver development.

I vote we do both.


How would 2 work? I make some hardware, but have to hope that someone else will start a company to create a driver for it?


Yeah, that doesn't make sense. "We're pleased to announce the release of our amazing new hardware that we can't wait to see what people will do with it. No, really. Please, someone do something with it. We're not allowed to do anything other than make it. Please, justify our existence. However, if someone says the hardware is inadequate, that would just be their excuse for making inadequate software, and in no way reflects the true potential of the device"


There is no need to wait until release. Go ahead and publish a hardware spec during development. Go ahead and commission a driver development company, and work closely with them.

The only change is that driver development is allowed to be a competitive market.


Yeah, I've been there, done that, and got the bleeding-edge cuts and scars from it. I worked closely with a hardware decoder's published specs, provided by the manufacturer. We made files that were within the published specs and delivered them for testing. The hardware choked. We had to keep adjusting until performance was acceptable, and the published specs were never corrected to reflect the true abilities. Hence my earlier comment about the software being blamed and not the hardware.


I wasn't asking you personally to take over.

Nothing about the situation you described is any different in the status quo.


Multiple someones, in fact.

Why not? Someone makes a faster driver; another makes, say, a more power-efficient one so the card is quieter. And yes, it would cost money. You could compete on this. Once upon a time there were competing commercial operating systems; I can't see why GPU drivers couldn't compete.


Yes. It would work the same way it works internally: create a hardware spec, and answer questions from driver devs.

The only difference is that driver devs aren't working exclusively for the manufacturer.

If you want an example of third-party driver development, just take one look at Linux.


> I think drivers should be compulsory to open source 5 years after the hardware is no longer sold

This sounds like a reasonable first step towards the real ideal, which would be to free the drivers from day 0.


> drivers should be compulsory to open source 5 years after the hardware is no longer sold

The ironic thing is that it would be legal for the manufacturer to engineer the hardware to die before that date.


That could probably invite needless frivolous lawsuits (IP/patent/copyright/license trolls) on various parts of the source code.


>> it’d be so good for consumers and make hardware last a lot longer.

Which is why hardware manufacturers will fight tooth and claw to prevent such a thing. They don't want you to keep using old hardware. They want you to keep buying new stuff. If the hardware isn't going to break, then a failure of software will do.


I just want a proper open source hardware card and software driver!


That would be great. I'm annoyed that my father's 10-year-old Canon MG5100 series (printer, scanner) does not have a driver for Windows 11 :( Now he has to buy a new one or transfer files to his old Windows 10 laptop to print.


capitalism requires the oil of planned obsolescence where available. when not available, just good old conspicuous consumption will do.

of course society needs regulations like dolphins need oxygen


Good point. This is why you see just as many 25 year old Ladas as BMWs.


Terrible example to prove a point. 25-year-old BMWs are way more reliable than 25-year-old Ladas. You don't see many of the old BMWs around because their wealthy owners upgraded, and you see a lot of Ladas because their impoverished owners can't afford anything new, so they do what they can to keep them running. That's mostly survivorship bias among Communist cars, which had terrible reliability; hell, most had issues straight out of the factory.


Somehow, while you misunderstood me, it was a good enough example to work anyway and you came to the right answer.


ATI GPUs technically https://en.m.wikipedia.org/wiki/Radeon_R300_series

OSS wins again I think.

I had one of these, a fancy red 9700 Pro that I remember playing Max Payne, JK2 and FarCry on.

Specs https://www.techpowerup.com/gpu-specs/radeon-9700-pro.c50


I had one of those as well, with the 75-ohm TV connector and everything; it was my first PC build in high school.

The tuner didn't work as well as just watching tv, but the novelty of it was pretty neat. Bear in mind, this was long before people had a TV in every room.


It was 2002, I think; we were at that era by then.


I remember drooling over this card and half-life 2, while all I had was a voodoo 3. It took me so long to upgrade my computer that by the time I did this series of cards was already two or more generations too old, and in the end I opted for some newer nvidia card. But alas, I finally could play Valve's masterpiece lol


I still have one, though I can't remember if it was an R300 or R600: a Radeon X1200, in a laptop.


A bit of a misleading title, considering that AMD officially stopped updating drivers for Vega GPUs, which are still being sold today(!) in products based on Ryzen 5000 APUs. They'll only see security updates.

So I have mixed feelings about AMD's update support periods. OSS drivers and community dev effort are a different thing from the manufacturer's own efforts and should be described as such.

Edit: @downvoters, would you mind explaining please? Not that I care what you do, I'm just curious on your logic. Thanks


This is definitely something the OSS community deserves most of the praise for, but, AIUI, the overall quality of the radeon drivers is only possible because AMD has a pretty good history of giving the community specs to work with, so they deserve credit there too.


    1. Doesn’t apply to Linux.
    2. Probably also doesn’t apply to Windows[1][2].
Please correct me if I missed something.

[1] https://community.amd.com/t5/gaming/product-and-os-support-u...

[2] https://community.amd.com/t5/gaming/product-and-os-support-u...


AMD split off Polaris and Vega drivers last September; official ROCm support ended earlier but afaik you can still run ROCm on Vega?

Similarly worrisome for gamers is aging architectural feature support as devs catch up to hardware, e.g. mesh shaders in Alan Wake 2, supported only on Turing+ and RDNA2+.

https://www.anandtech.com/show/21126/amd-reduces-ongoing-dri...


My HD 7850 machine on Windows is currently on some legacy-branch "22.6.1" driver, over a year old. Even my RX 580 is on another legacy branch now, with most updates only being for RDNA cards.

AMD's dropping of Polaris and Vega also applies to Linux, where the AMDVLK first-party driver stopped supporting them. ROCm also does not work anymore.

Of course, nobody uses AMDVLK so it's not a huge loss (RADV is generally a lot better), but the loss of ROCm does suck.


Your links are from 2021. :-(


I have to respect AMD for at the very least following the standards they helped create, unlike Nvidia. That makes things like this possible.


> considering that AMD officially stopped updating drivers for Vega GPUs which are still being sold in products today(!)

This happens waaaaaaay too often in hw/embedded :(.

My experience in embedded is that you usually have two teams: the lead "next-gen" product team, which works from the prototype to the first shipping release where things start to barely work. The product is then moved to a "support" team, which will try to punch and kick things around with half the required knowledge until it's somewhat usable.

But as soon as the product cycle shifts to new gen hw with a different codebase, you're out of luck.

The dynamics in this are not so simple, but the net result is that you frequently end up with product still being produced/sold where you effectively only get tech support at best, without real fixes (hw/sw).


We're not talking about embedded here but Ryzen APUs for laptops still being sold.


More accurate headline: open source software gets updates to support 22 year old AMD GPUs.


No, the old GPUs were already supported. The performance is still being improved now.


Yes, but not by AMD; by the community. Lots of 20+ year old community projects still get feature or performance updates these days.

However, finding a vendor who would do that will prove an impossible task.


At some point you have to accept that a 20 year old GPU(!) just isn't interesting for anything other than historical and nerd-sniping purposes.

That said, I'm sure you could pay Collabora or some shop like it to implement any features for these old drivers that you really care about.


oh, definitely.

Everyone who still uses these GPUs probably doesn't require "more speed" (or they would have upgraded already)

However, as long as people have fun fixing (and testing) new features or improvements to these old drivers, why not?


I did a quick glance to see if anybody was funding this, correct me if I'm wrong but it looks to me like a random dude in Czechia who happens to still use the card.


Yeah, I looked up the author, and the changes were committed from a Gmail account. An official contribution from AMD would be made from an address on the amd.com domain.


For what it's worth, that doesn't follow: the Gmail account may well belong to an AMD employee. Using an email that survives changing employers can be less hassle.


An odd thing to me about the GPU space is how rapidly many ML libraries that rely on CUDA deprecate earlier versions. Two years ago I bought a used workstation laptop with (iirc) a Kepler GPU, and I was surprised to find that virtually none of the popular Python libraries would run on it. I spent a whole day stepping back through earlier library versions, which of course created other kinds of dependency problems, before giving up in favor of an external GPU.

I wasn't trying to do anything difficult, I was literally at the 'hello GPU world' stage and wanting to try out some very basic exercises from books. Can't help feeling that Nvidia is doing a lot of the groundwork with CUDA and some library maintainers are just surfing on it.


That's because newer GPUs support newer instruction sets.

Like AVX, AVX2, AVX512, but basically a new set every year instead of every 10 years like in CPU land.

And it's not just about speed, they provide new functionality. So at some point supporting older GPUs significantly slows down those who have the newer GPUs.

Pascal, which followed Kepler (with Maxwell in between), added huge new capabilities; that's why nobody supports Kepler anymore, it's too limited.
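The gating usually happens in terms of CUDA "compute capability" numbers: Kepler is 3.x, Maxwell 5.x, Pascal 6.x. Here is a minimal sketch of the kind of version gate libraries apply; the (6, 0) cutoff and the function name are illustrative assumptions, not any particular library's real policy:

```python
def meets_min_capability(cap, minimum=(6, 0)):
    """Return True if a (major, minor) compute capability is new enough.

    Tuples compare element-wise, so (3, 5) < (6, 0) < (6, 1).
    """
    return tuple(cap) >= tuple(minimum)

print(meets_min_capability((3, 5)))  # Kepler: False
print(meets_min_capability((6, 1)))  # Pascal: True

# With PyTorch and a CUDA device available, you could feed it the real value:
#   import torch
#   meets_min_capability(torch.cuda.get_device_capability(0))
```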

> wanting to try out some very basic exercises

Should be possible to do that with an older Python (3.5?) and associated libraries. Versions around 2016 should work.


I seem to recall that ~6 years ago I had to recompile an ML library because my CPU didn't support AVX2. So even though the release binaries required AVX2, the source code was written in a way that let it work without it.

I wonder whether this required specific effort by the developers of that library, or whether it was taken care of by some underlying tooling? Can you write CUDA code today that uses the latest features but can be compiled for older cards as well?


Deprecating support for a 10-12 year old gpu doesn’t seem rapid to me


I am... confused? My old PC has an X1800 XT, which is significantly more recent than that, and the Linux driver hasn't been updated since 2009. I don't think that driver even works with modern X.Org or Linux kernels. Could someone tell me if I'm missing something? Would be very happy to be able to run this card at full resolution, instead of having to use a generic VGA driver.

https://www.amd.com/en/support/graphics/legacy-graphics/ati-...


Your link is for the proprietary driver; this link is for the open source driver (it's not AMD themselves providing the updates). The X1800 XT is R500, so it should be supported by the open radeon driver just fine, including output and modesetting: https://www.x.org/wiki/RadeonFeature/


22-year-old GPU support improved. Good. Very good! Now, how about making Mesa not freeze the driver, crash, and panic the kernel on GPUs that are only 10 years old?

https://bugzilla.kernel.org/show_bug.cgi?id=85421


By using EXA instead of Glamor, it just works.
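For anyone who wants to try that workaround, it's a one-line driver option in xorg.conf. A minimal sketch, assuming the radeon DDX and a generic device identifier (adjust both to your setup):

```
Section "Device"
    Identifier "Radeon"
    Driver     "radeon"
    # Fall back from the default Glamor acceleration to EXA
    Option     "AccelMethod" "EXA"
EndSection
```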


So maybe my old laptop from 2006 will get a kernel with Mesa support for its old ATI X1600, which is based on the R520.

I couldn't update it past some 3.x kernel because anything newer would not draw the display, or would lose sync; I can't remember which. That happened around 2013.

However, the nail in the coffin was the 4 GB RAM limit and too many VMs and containers.


From the comment, it seems to also improve R500, which is ATI's X1000 series. I have an Athlon 64 with an X1950 Pro (I bought that PC in late 2003; it had a GeForce FX but I later upgraded to the X1950 Pro) and I planned on trying Linux on it to see how it performs.


Yeah, just like the R600 driver supports HD 2000 through HD 6000 hardware ('Terascale'), basically any Radeon HD except HD 7000. And radeonsi originally meant "Southern Islands" aka HD 7000 (and parts of the Rx 200/Rx 300 lineup) but it works on all GCN and RDNA cards.


What I think is incredible is that 20-year-old CPUs may not be able to run the latest client software but are still powerful enough for a lot of tasks. You probably don't need anything fancier than this for a lot of industrial machinery, for example.


Or you could use an ARM chip that consumes 100 times less power.


AMD is a showcase for why competition is good for consumers. I have little doubt that had AMD spent the last 30 years as the PC chip leader, it would have turned to all the exploitative and anti-consumer practices of Intel and Nvidia.

Dominant market leaders tend to lock down their position, while those who are not must innovate and build relationships with consumers by other means, including support or open drivers. I'm just riffing, but it seems to me a recurring theme, and one fresh in my mind as I try to migrate from Windows to Ubuntu Budgie; Microsoft's lock-down of Office and Windows means I must run an unwieldy full VM.


Maybe for some things. But AMD dropped compute (ROCm) support for its RX 580 card after just under four years. And there are no open source drivers for compute.


The open-source AMDGPU KFD driver in the Linux kernel supports compute, including with the RX 580 (Polaris). It's true that AMD dropped official support for that hardware and has been closing Polaris-specific bugs as wont-fix. The last ROCm release that AMD validated on Polaris hardware was ROCm 3.5 (June 2020).

I've packaged the test suites of the ROCm math libraries for Debian Trixie and have personally donated a selection of Fiji and Polaris hardware for the Debian AI Team's CI. It's beyond my abilities to fix everything wrong on old hardware, but I'm hoping that having publicly accessible test logs will at least help people to understand what works and what doesn't.

Of course, it always helps if the community contributes. Debian would certainly be interested in bug reports and patches for Polaris. Users with hardware no longer supported by AMD may find it easier to work with their distro package maintainers than with AMD directly.


> And there are no open source drivers for compute.

Well, technically ROCm is open source (although to me it comes across more as "dump a source code release every now and then").

Also, the link is from the Mesa project, which is working on an OpenCL frontend to the driver called rusticl. It's not yet stable or enabled by default, but if you get a build of it, you can try setting the environment variable `RUSTICL_ENABLE=radeonsi` and it should sort-of work (on any Radeon HD 7000 or Radeon R7/R9/RX GPU!)
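A minimal sketch of trying it out, assuming you have a Mesa build with rusticl compiled in (it's opt-in, so nothing shows up without the environment variable) and the standard `clinfo` tool installed:

```shell
# Opt in to the rusticl OpenCL frontend for the radeonsi driver
export RUSTICL_ENABLE=radeonsi

# rusticl should now appear as an OpenCL platform with your GPU as a device
clinfo
```

Expect rough edges; as noted above it's not yet stable, so some OpenCL programs will work and others won't.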

iirc the rusticl developers are even hoping/planning to get HIP (via chipStar) and SYCL (via Intel's compiler) working on top of rusticl.


I mean, the HIP compiler is built on LLVM and AMD-specific development happens directly upstream, which is more than you can say for other vendors.

Other parts of the overall ROCm system are in a worse place, admittedly. For some reason, AMD doesn't seem to understand the principles of open development.


What are the facts about how long AMD provides drivers for?

Nvidia provide drivers for what feels like a VERY long time.

I have the feeling that AMD gives up on providing drivers after only a few years of product life.

But these are only hunches, anyone know the facts?


Microsoft also still sometimes provides updates for the more-than-20-year-old Windows XP, while Apple stops providing security updates after only a few years, which is a shame.


I never thought macOS releases were planned for longevity after they started being free; they became more like point releases. You were expected to upgrade, and if you didn't, Apple didn't care whether you had a valid reason to stay (PowerPC Rosetta, i386 compatibility): just upgrade. Windows releases, on the other hand, can last 13 years and even longer; there are forums for modern-day XP users and even ports of modern browsers.


Apple provides about 10 years on computers and 7 years on devices. Windows XP got about 8 years of mainstream support. Microsoft offered extended support to businesses for 13 years at a premium, but even that has expired. They will soon stop supporting computers without TPM 2.0 on Windows 11.


I recently had issues with AMD's RX 580 GPU, where the latest drivers would cause problems when trying to run VR games with an Oculus Quest (either SteamVR or the Meta OpenXR implementation). I'd play the game for about 10-40 minutes and then it would crash, the drivers would reset, the screen would go black in the middle for a bit and the OS would make an attempt to recover (which didn't always work).

In the end, turns out that the version of AMD Software and their included drivers from 2023 didn't work and I had to revert to an older version. I ended up going for their 2020 AMD Pro drivers and software because the card is from 2018 and the improvement was immediate - I haven't had a single crash since, whereas previously I could reproduce it 100% of the time, with only the time before the crash varying. Note: this is on Windows 10, which I boot into for gaming, though I actually like their AMD Software which feels a bit better than CoreCtrl on Linux distros.

Some people suggested that 2022 drivers/software would also work, but this is one of those cases where the hardware technically gets updates, but they actually make things worse, perhaps because the older hardware is not tested against as much, especially with newer games and programs. I don't necessarily expect anyone to care that much about hardware from 2018, it was just an interesting case of things going wrong.

Overall, I like AMD because they're affordable (I'm still running a Ryzen 7 1700 CPU on my main workstation and it does everything I need) and decent, and perhaps the driver situation on Linux is a bit better than Nvidia's. Then again, I also like Intel's attempts with Arc, except that Nvidia still wins out because of CUDA for a lot of workloads (I never managed to get ROCm working well locally for running LLMs, and DKMS kicked my butt, needing to recompile stuff after every apt upgrade).

Oh well, here's hoping that we keep getting lots of options for hardware and software, and that market competition stays alive and well, so I can enjoy affordable last-gen options with most of the issues patched out a few years down the line when I decide to upgrade on the cheap. Even in regards to entertainment like gaming, things are getting better thanks to Proton. I look forward to the day when I can switch over to Linux 100%.

My homelab servers already run Debian/Ubuntu on AMD's Athlon 200GE CPUs, which I got dirt cheap and which have a relatively low TDP (35W) while still giving me access to x86, so I can cool them passively.


why though


Alex Deucher. Google it.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: