
I recently had my Framework Desktop delivered. I didn't plan on using it for gaming, but I figured I should at least try. My experience thus far:

    * I installed Fedora 43 and it (totally unsurprisingly) worked great.
    * I installed Steam from Fedora's software app, and that worked great as well.
    * I installed Cyberpunk 2077 from Steam, and it just... worked.
Big thanks to Valve for making this as smooth as it was. I was able to go from no operating system to Cyberpunk running with zero terminals open or configs tweaked.

I later got a hankering to play Deus Ex: Mankind Divided. This time, the game would not work and Steam wasn't really forthcoming with showing logs. I figured out how to see the logs, and then did what you do these days - I showed the logs to an AI. The problem, slightly ironically, with MD is that it has a Linux build and Steam was trying to run that thing by default. The Linux build (totally unsurprisingly) had all kinds of version issues with libraries. The resolution there was just to tell Steam to run the Windows build instead and that worked great.
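For anyone hitting the same wall: forcing the Windows build is a per-game setting (Properties > Compatibility > "Force the use of a specific Steam Play compatibility tool", then pick a Proton version). And if you'd rather read Proton's own logs than dig through Steam's, a launch option roughly like this should drop a log file in your home directory (it's a Proton feature, so it only applies once the game is running under Proton):

    PROTON_LOG=1 %command%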



> I later got a hankering to play Deus Ex: Mankind Divided. This time, the game would not work and Steam wasn't really forthcoming with showing logs. I figured out how to see the logs, and then did what you do these days - I showed the logs to an AI. The problem, slightly ironically, with MD is that it has a Linux build and Steam was trying to run that thing by default. The Linux build (totally unsurprisingly) had all kinds of version issues with libraries. The resolution there was just to tell Steam to run the Windows build instead and that worked great.

I've heard it said in jest, but the most stable API in Linux is Win32. Running something via Wine means Wine is doing the plumbing to take a Windows app and pass it through to the right libraries.

I also wonder if it's long-term sustainable. Microsoft can do hostile things or develop their API in ways Valve/Proton neither need nor want, forcing them to spend dev time keeping up.


MS _can_ do that, but only with new APIs (or by breaking backwards compatibility). Wine only needs to keep up once folks actually _use_ the new stuff… which generally requires that it be useful.


Or MS does deals with developers to get them to use the new APIs. I still haven't forgotten when they killed off the Linux version of Unreal Tournament 3. Don't for a second forget they are assholes.


Plus, if it does happen, folks need to learn a bunch of new hostile stuff. Given how Linux is taking off, why not just move to treating Linux as the first-class platform?


> why not just move to treating Linux as the first-class platform?

This is where the argument goes back to Win32 being the most stable API in Linux land. There isn't such a thing as "the Linux API", so that would have to be invented first. Try running an application that was built for Ubuntu 16.04 LTS on Ubuntu 24.04 LTS. Good luck with that. Don't get me wrong, I primarily use and love Linux, but reality is quite complicated.
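(As a quick illustration of why that tends to fail, and assuming a dynamically linked binary, something like this usually tells the story - the old build wants shared-library versions the newer release simply no longer ships:)

    ldd ./old-game-binary | grep "not found"

The `old-game-binary` name is just a stand-in, of course.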


Stability comes with critical mass. When something is relied on by a small group of agile nerds, we tend not to worry about how fast we move or what gets broken in the process. Once we have large organisations relying on a thing, we get LTS versions of OSes, etc.

The exact same is true here. If large enough numbers of folks start using these projects and contributing to them in a meaningful way, then we end up with less noisy updates: things receive input from a large portion of the population, and updates begin to more closely resemble some sort of moving average rather than a high-variance process around that moving average. And if not less noisy updates, then at least some fork that may be many commits behind, but which, when it does update things in a breaking way, comes with a version change and plenty of warning.


Sounds similar to macOS.


macOS apps are primarily bound to versioned toolchains and SDKs, not to the OS version. If you are not using newer features, your app will run just fine. Any compatibility breaks are published.


> Try running an application that was built for Ubuntu 16.04 LTS on Ubuntu 24.04 LTS. Good luck with that.

Yea, this is a really bad state of affairs for software distribution, and I feel like Linux has always been like this. The culture of always having source for everything perhaps contributes to the mess: the "oh, the user can just recompile" attitude.


I'd love to see a world where game devs program to a subset of Win32 that's known to run great on Linux and Windows. Then MSFT can be as hostile as they like, but no one will use it if it means abandoning the (in my fantasy) 10% of Linux gamers.


That's basically already happening with Unity and Unreal's domination of the game engines. They seem to account for 80% of new titles and 60% of sales on Steam [1], so WINE/Valve can just focus on them. Most incompatible titles I come across are rolling their own engine.

[1] PDF: https://app.sensortower.com/vgi/assets/reports/The_Big_Game_...


Same with Godot. I'm writing a desktop app, and I get cross-platform support out of the box. I don't even have to recompile or write platform-specific code, and it doesn't even need Win32 APIs.


One aspect I wonder about is the move of graphics APIs from DX11 (or OpenGL) to DX12/Vulkan. There have been benefits, and it's where the majority of vendor effort goes, but they are (were?) notoriously harder to use. What strikes me about gaming is how broad it is, and how many developers could make a competent engine at a lower tech level that fits their needs well because their requirements are more modest.

I also wonder about developer availability. If you're capable of handling the more advanced APIs, and probably modern hardware and its features, it seems likely you're going to aim at a big studio producing that kind of big experience, or even at being an engineer at the big engine makers themselves. If you're using the less demanding tech, it will be more approachable for a wider range of developers and manageable in-house.


I believe it's already happening to a minor degree. There is value in getting that "Steam Deck Verified" badge on your store page, so devs will tweak their game to get it if it isn't a big lift.


I can see that number increasing soon with the Steam Deck and Steam Machine (and clones/home builds). Even the VR headset, although niche, runs Linux.

The support in this space from Valve has been amazing; I can almost forgive them for not releasing Half-Life 3. Almost.


There are strong indications that Half-Life 3 (or at least a Half-Life game) is coming soon. Of course, Valve might decide to can the project, but I wouldn't be surprised to see an announcement for 2026.


> Microsoft can do hostile things or develop their API in ways Valve/Proton neither need nor want, forcing them to spend dev time keeping up.

If they decide to do this in the gaming market, they don't need to mess up their API. They can just release a Windows native anti-cheat-anti-piracy feature.


> They can just release a Windows native anti-cheat-anti-piracy feature.

Unless it's a competitive game and it's a significant improvement on current anticheat systems, I don't see why game developers would implement it. It's only going to reduce access to an already growing non-Windows player base, just to appease Microsoft?

Also, wouldn't preventing circumvention of a Windows-native check be extremely excessive and a security risk? To be mostly effective it would need to sit right down at ring 0... just to spite people playing games outside of Windows?


Existing anticheat software on Windows already runs in ring 0, and one of the reasons that competitive games often won't work on Linux is precisely that Wine can't emulate that. Some anticheat products offer a Linux version, but those generally run in userspace and are therefore easier for cheaters to circumvent, which is why game developers will often choose not to allow players running the Linux version to connect to official matchmaking. In other words, for the target market of developers of competitive games, nothing would really get any worse if there were an official Microsoft solution.

On the other hand, using an official Microsoft anticheat that's bundled with Windows might not be seen as "installing a rootkit" by more privacy-conscious gamers, thereby improving PR for companies who choose to do it.

In other words, Microsoft would steamroll this market if they chose to enter it.


Also, Microsoft closing the kernel to non-MS, non-driver ring 0 software is inevitable after CrowdStrike, but they can't do that until they have a solution for how anti-cheat (and other system integrity checkers) is going to work. So something like this is coming, and I'm very sure there is a team at Microsoft working on it right now.


> just to spite people playing games outside of Windows?

These things are always sold as general security improvements even when they have an intentional anti-competitive angle. I don't know if MS sees that much value in the PC gaming market these days, but if they see value in locking it all down and think they have a shot at pulling it off, they'll at least try it.

In theory, a built-in anti-cheat framework could have a chance at being more effective and less intrusive than the countless crap each individual game might shove down your throat. Who knows how it would look in practice.


>I've heard it said in jest, but the most stable API in Linux is Win32.

The API stability question sometimes makes people wonder if things would be better had FreeBSD won the first free OS war in the 90s. But I think there's a compromise being overlooked: maybe Linux devs could implement a stable API layer as a compatibility layer for FreeBSD applications.


Steam has versioned Steam Linux Runtimes in an attempt to address that. Currently it's only leveraged by Proton, but maybe it will help in the future.

It's like a Flatpak: stabilized libraries based on Debian.


Hear me out:

Containers. Or even just go full VM.

AFAIK we have all the pieces to make those approaches work _just fine_ - GPU virtualization, ways to dynamically share memory, etc.

It's a bit nuts, sure, and a bit wasteful - but it'd let you have a predictable binary environment for basically forever, as well as a fairly well defined "interface" layer between the actual hardware and the machine context. You could even accommodate shenanigans such as Aurora 4X's demand to have a specific decimal separator.
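(Outside a VM, the usual workaround for that particular kind of locale shenanigan is forcing a locale per game in the Steam launch options - a rough sketch, assuming the game just picks up the process locale:)

    LC_ALL=en_US.UTF-8 %command%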

We could even achieve a degree of middle ground with the kernel anti-cheat secure boot crowd - running a minimal (and thus easy to independently audit) VM host at boot. I'd still kinda hate it, but less than having actual rootkits in the "main" kernel. It would still need some sort of "confirmation of non-tampering" from the compositor, but it _should_ be possible, especially if the companies wanting that sort of stuff were willing to foot the bill (haha). And, on top of that, a VM would make it less likely for vulnerabilities in the anti-cheat to spread into the OS I care about (à la the Dark Souls exploit).

So kinda like Flatpak, I guess, but more.


Check out the Steam Linux Runtime. You can develop games to run natively on Linux in a container already.

Running the anti-cheat in a VM completely defeats the point. That's actually what cheaters would prefer because they can manipulate the VM from the host without the anti-cheat detecting it.


There is no "real" GPU virtualization available for regular consumers, as both AMD and NVIDIA are gatekeeping it for their server-oriented GPUs. It's the same story as Intel gatekeeping ECC RAM for decades.

Even if you run games in a container, you still need to expose the DRM char device if you want Vulkan/OpenGL to actually work.

https://en.wikipedia.org/wiki/GPU_virtualization#mediated
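For what it's worth, a rough sketch of what that looks like in practice (assuming an image that already contains the Mesa/Vulkan userspace drivers; the image name is made up):

    podman run --rm -it --device /dev/dri my-game-image

Podman accepts a directory for --device, so this hands the render node(s) under /dev/dri through to the container while the rest of the hardware stays hidden.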


I try to get as many (mostly older, 2D) Windows games as possible to run in QEMU (VirtualBox in the past). Not many work, but those that do just keep working and I expect they will just always work ("always" relative to my lifetime).

WINE and Proton seem to always require hand-holding and leak dependencies on things installed on the host OS, as well as on the actual hardware. I've used them for decades and they are great, but I can never just relax and know that because a game runs now it will always run, the way I can with a full VM (or with DOSBox, for older games).


For consumers, GPU access in a VM is available only as full passthrough, not sharing. You have to detach the GPU from the host.


MS has supported GPU virtualization for years in Hyper-V with their GPU-PV implementation. Normally it gets used automatically by Windows to do GPU acceleration in Windows Sandbox and WSL2; however, it can also be used with VMs via PowerShell.

12th gen and later Intel iGPUs can do SR-IOV.


>I also wonder if it's long-term sustainable. Microsoft can do hostile things or develop their API in ways Valve/Proton neither need nor want, forcing them to spend dev time keeping up.

Not while they continue to have the Xbox division and aspire to be the world's biggest publisher.


Yeah that's been my experience with native Linux builds too. Most of them were created before Proton etc got good, and haven't necessarily been maintained, whereas running the Windows version through Proton generally just works.

Unfortunately it seems supporting Linux natively is a pretty quickly moving target, especially when GPUs etc. are changing all the time. A lot of compatibility-munging work goes on behind the scenes on the Windows side from MS and driver developers (plus MS prioritizing backwards compat for software pretty heavily), and the same sort of work now has a single target for people's efforts in Proton.

It's less elegant perhaps than actual native Linux builds, but probably much more likely to work consistently.


Sometimes you see developers posting on /r/linux_gaming, and generally the consensus from the community is mostly "just make sure Proton works", which is pretty telling.

It's sort of a philosophical bummer as an old head to see native compatibility, or maybe more accurately native mindshare, being discarded even by a relatively evangelical crowd, but:

- as a Linux gamer, I totally get it - Proton versions just work; Linux versions probably did work at some point, on some machines.

- as a developer, I totally get it - target Windows because that's 97% of your installs, target Proton because that's the rest of your market and you can probably target Proton "for free". Focus on making a great game, not glibc issues.

I mostly worry about what happens when Gabe retires and Valve pivots to the long squeeze. I don't think Proton fits in that worldview, but I also don't know how much work Proton needs in the future vs the initial hill climb and proof of success. I guess we'll get DX13 at some point, but maybe I'll just retire from new games and keep playing Factorio until I die (which, incidentally, does have a fantastic native version, but Wube is an extreme outlier).


1. I think targeting compatibility is 99% as good as targeting native.

2. You're disregarding the shifting software landscape. SteamOS and Linux are trending towards higher PC gaming market share. macOS has proven you don't need much market share to force widespread (but not universal) compatibility.

3. I don't see the value in a purist attitude around Linux gaming. The whole point of video games is entertainment. I'm much less concerned with whether my video game is directly calling open source libraries than whether my {serious software} is.


On point 3, I guess my views are different because my {serious software} is usually work, and if it stops working, that's kind of a B2B problem and part of doing enterprise. It's just business, as they say.

Gaming is much more meaningful to me as a form of story and experience, and it is important to me that games keep working and stay as open and fair as possible. In the same way it is important I can continue to read books, listen to music or watch movies I care about.


> Unfortunately it seems supporting Linux natively is a pretty quickly moving target

With the container-based approach of the Steam Linux Runtime this should no longer be a problem. Games can just target a particular version and Steam will be able to run it forevermore.


I would hope Vulkan also does a lot of the work here for Linux native builds, but I must admit I am only now starting my journey into that space.


A lot of those Linux native builds will have been using Vulkan.

Parity between DX12 and Vulkan is pretty high, and all around I trust the vkd3d[0] layer to be more robust than almost anything else in this process, since they're such similar APIs.

The truth is that it's just a whole lot harder to make a game for Linux APIs and (even) base libraries than it is to make it for Windows, because you can't count on anything being there, let alone being a stable version.

Personally I don't see a future where Linux continues being as it is (a culture of shared libraries even when it doesn't make sense, no standard way of doing the basics, etc.) and we don't use translation layers.

We'll either have to translate via abstraction layers (or still be allowed to translate Win32 calls) to all of the nasty combinations of libraries that can exist, or we'll have to fix Linux and its culture. The second option is the only one where we get new games for Linux that work as well as they should. The first one undeniably works and is sort of fine, but a bit sad.

0 - vkd3d is the layer that specifically translates D3D12 to Vulkan, as opposed to vkdx which is for lower D3D versions.


It's not really harder to make a good native Linux port that will keep working; it's just not something most game developers have much experience with.

I have a slightly different view. The former scenario is essentially having our cake and eating it too. I'd rather not "fix" Linux culture.


Too late for an edit now, but `vkdx` in the note here is supposed to say `dxvk`.

A wise man once said, "The most stable ABI on Linux is Win32." It sounds like a joke, but it is actually true.


In my experience, the Steam client and most games work great on Debian and Ubuntu, but you should know that for GNU/Linux systems it's only officially supported on Ubuntu (maybe SteamOS is implicit). I can't find that information on Steam's website or support pages, but it's the response I got from Steam Support when reporting a Steam client UI bug on Debian with GNOME a while ago.


Which is funny because Ubuntu is also the only distro that wants you to install the Steam snap instead, which is then again unsupported.


When I run into issues running games on Linux (Steam or otherwise), I find it useful to consult protondb.com to see what others have gotten to work. You can filter by OS, keyword, etc.

https://www.protondb.com/app/337000 for Deus Ex: Mankind Divided


I wiped Windows 10 from my desktop, installed CachyOS and Steam, and installed Path of Exile 2, and it worked, surprisingly. I also see people joking about how Win32 is the only stable API on Linux xD. I've also heard Red Dead Redemption 2 works well on Linux; that might be the next game I check out.


I can confirm I finished RDR2 in story mode in Bazzite, zero issues. Never played the multiplayer part, though.

Story mode is good enough for me

It's the same with Dying Light. They have a neglected Linux version, and I downloaded 16 GiB before I realised I should switch to the Windows version and start again.


I run Fedora 43 and all games (a single tickbox in settings) are running through "compatibility mode" (Wine/Proton). Works great!



