
> I later got a hankering to play Deus Ex: Mankind Divided. This time, the game would not work and Steam wasn't really forthcoming with showing logs. I figured out how to see the logs, and then did what you do these days - I showed the logs to an AI. The problem, slightly ironically, with MD is that it has a Linux build and Steam was trying to run that thing by default. The Linux build (totally unsurprisingly) had all kinds of version issues with libraries. The resolution there was just to tell Steam to run the Windows build instead and that worked great.

I've heard it said in jest, but the most stable API in Linux is Win32. Running something via Wine means Wine is doing the plumbing to take a Windows app and pass it through to the right libraries.

I also wonder if it's long-term sustainable. Microsoft can do hostile things or develop their API in ways Valve/Proton neither need nor want, forcing them to spend dev time keeping up.





MS _can_ do that, but only with new APIs (or break backwards compatibility). Wine only needs to keep up once folks actually _use_ the new stuff… which generally requires that it be useful.

Or MS does deals with developers causing them to use the new APIs. I still haven't forgotten when they killed off the Linux version of Unreal Tournament 3. Don't for a second forget they are assholes.

Plus, if it does happen, folks need to learn a bunch of new hostile stuff. Given how Linux is taking off, why not just move to treating Linux as the first-class platform?

> why not just move to treating linux as the first class platform

This is where the argument goes back to Win32 being the most stable API in Linux land. There is no such thing as "the Linux API", so that would have to be invented first. Try running an application that was built for Ubuntu 16.04 LTS on Ubuntu 24.04 LTS. Good luck with that. Don't get me wrong, I primarily use and love Linux, but reality is quite complicated.
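As a small sketch of why old binaries break: a Linux binary is bound to the specific shared libraries (above all, the C library) it was linked against, not to a single stable OS API. This uses only the Python standard library; the version in the comment is just an example, not a guarantee:

```python
import platform

# Sketch: report which C library (and version) this Python runtime was
# built against. A binary built on Ubuntu 16.04 was linked against a much
# older glibc, which is the usual source of "version `GLIBC_2.xx' not
# found" failures on newer systems. On non-glibc platforms this returns
# empty strings.
name, version = platform.libc_ver()
print(name, version)  # e.g. "glibc 2.39" on Ubuntu 24.04
```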


Stability has critical mass. When something is relied on by a small group of agile nerds, we tend not to worry about how fast we move or what is broken in the process. Once we have large organisations relying on a thing, we get LTS versions of OSes, etc.

The exact same is true here. If large enough volumes of folks start using these projects and contribute to them in a meaningful way, then we end up with less noisy updates: things continue to receive input from a large portion of the population, and updates begin to resemble a moving average rather than a high-variance process around it. And if not less noisy updates, then at least some fork that may be many commits behind, but that at least ships breaking changes with a version bump and plenty of warning.


Sounds similar to macOS.

macOS apps are primarily bound to versioned toolchains and SDKs, not to the OS version. If you are not using newer features, your app will run just fine. Any compatibility breaks are published.

> Try running an application that was built for Ubuntu 16.04 LTS on Ubuntu 24.04 LTS. Good luck with that.

Yea, this is a really bad state of affairs for software distribution, and I feel like Linux has always been like this. The culture of always having source for everything perhaps contributes to the mess: the "oh, the user can just recompile" attitude.


I'd love to see a world where game devs program to a subset of Win32 that's known to run great on Linux and Windows. Then MSFT can be as hostile as they like, but no one will use the new stuff if it means abandoning the (in my fantasy) 10% of Linux gamers.

That's basically already happening with Unity's and Unreal's domination of the game-engine space. They seem to dominate 80% of new titles and 60% of sales on Steam [1], so WINE/Valve can just focus on them. Most incompatible titles I come across are rolling their own engine.

[1] PDF: https://app.sensortower.com/vgi/assets/reports/The_Big_Game_...


Same with Godot. I'm writing a desktop app, and I get cross-platform support out of the box. I don't even have to recompile or write platform-specific code, and it doesn't even need Win32 APIs.

One aspect I wonder about is the move of graphics APIs from DX11 (or OpenGL) to DX12/Vulkan: while there have been benefits, and it's where the majority of vendor effort goes, the newer APIs are (were?) notoriously harder to use. What strikes me about gaming is how broad it is, and how many games could be built on a competent engine at a lower tech level that fits their needs well because their requirements are more modest.

I also wonder about developer availability. If you're capable of handling the more advanced APIs and modern hardware features, it seems likely you'll aim at a big studio producing that kind of big-budget experience, or even become an engineer at the big engine makers themselves. The less demanding tech is approachable for a wider range of developers and manageable in-house.


I believe it's already happening to a minor degree. There is value in getting that "Steam Deck certified" badge on your store page, so devs will tweak their game to get it if it isn't a big lift.

I can see that number increasing soon with the Steam Deck and Steam Machine (and clones/home builds). Even the VR headset, although niche, runs Linux.

The support in this space from Valve has been amazing, I can almost forgive them for not releasing Half Life 3. Almost.


There are strong indications that Half Life 3 (or at least a Half Life game) is coming soon. Of course, Valve might decide to can the project, but I wouldn't be surprised to see an announcement in 2026.

> Microsoft can do hostile things or develop their API in ways Valve/Proton neither need nor want, forcing them to spend dev time keeping up.

If they decide to do this in the gaming market, they don't need to mess up their API. They can just release a Windows native anti-cheat-anti-piracy feature.


> They can just release a Windows native anti-cheat-anti-piracy feature.

Unless it's a competitive game and it's a significant improvement on current anti-cheat systems, I don't see why game developers would implement it. It would only reduce access to an already growing non-Windows player base, just to appease Microsoft?

Also, wouldn't making it hard to circumvent require something extremely excessive and a security risk? To be mostly effective it would need to sit right down at ring 0... just to spite people playing games outside of Windows?


Existing anticheat software on Windows already runs in ring 0, and one of the reasons that competitive games often won't work on Linux is precisely that Wine can't emulate that. Some anticheat vendors offer a Linux version, but those generally run in userspace and are therefore easier for cheaters to circumvent, which is why game developers will often choose not to allow players running the Linux version to connect to official matchmaking. In other words, for the target market of competitive-game developers, nothing would really get any worse if there were an official Microsoft solution.

On the other hand, using an official Microsoft anticheat that's bundled in Windows might not be seen as "installing a rootkit" by more privacy-conscious gamers, therefore improving PR for companies who choose to do it.

In other words, Microsoft would steamroll this market if they chose to enter it.


Also, Microsoft closing the kernel to non-MS, non-driver ring-0 software is inevitable after CrowdStrike, but they can't do that until they have a solution for how anti-cheat (and other system integrity checkers) is going to work. So something like this is inevitable, and I'm very sure there is a team at Microsoft working on it right now.

> just to spite people playing games outside of Windows?

These things are always sold as general security improvements even when they have an intentional anti-competitive angle. I don't know if MS sees that much value in the PC gaming market these days but if they see value in locking it all down and think they have a shot to pull it off, they'll at least try it.

In theory, a built-in anti-cheat framework could have a chance at being more effective and less intrusive than the countless crap each individual game might shove down your throat. Who knows how it would look in practice.


>I've heard it said in jest, but the most stable API in Linux is Win32.

Sometimes the API-stability debate makes people wonder whether things would be better if FreeBSD had won the first free-OS war in the 90s. But I think there's a compromise being overlooked: maybe Linux devs could implement a stable API as a compatibility layer for FreeBSD applications.


Steam has versioned Steam Linux Runtimes in an attempt to address that. Currently it's only leveraged by Proton, but maybe it will help in the future.

It's like Flatpak: stabilized libraries based on Debian.


Hear me out:

Containers. Or even just go full VM.

AFAIK we have all the pieces to make those approaches work _just fine_ - GPU virtualization, ways to dynamically share memory etc.

It's a bit nuts, sure, and a bit wasteful - but it'd let you have a predictable binary environment for basically forever, as well as a fairly well defined "interface" layer between the actual hardware and the machine context. You could even accommodate shenanigans such as Aurora 4X's demand to have a specific decimal separator.
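As a small sketch of the decimal-separator shenanigans mentioned above: numeric formatting/parsing that goes through the C runtime follows LC_NUMERIC, so a game expecting "." can break under a comma locale, which is exactly the kind of thing a pinned container/VM environment sidesteps. The "de_DE.UTF-8" locale name below is an assumption and is only used if it happens to be installed; the C locale always exists:

```python
import locale

# Force the C locale: its decimal separator is always ".".
locale.setlocale(locale.LC_NUMERIC, "C")
sep_c = locale.localeconv()["decimal_point"]
print(sep_c)  # "."

# Under a comma-separator locale, C-runtime parsing sees "," instead,
# which is what breaks games that hard-code "." (e.g. Aurora 4X).
try:
    locale.setlocale(locale.LC_NUMERIC, "de_DE.UTF-8")
    print(locale.localeconv()["decimal_point"])  # "," if the locale exists
except locale.Error:
    pass  # locale not installed on this system; the point stands
```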

We could even achieve a degree of middle ground with the kernel-anti-cheat, secure-boot crowd by running a minimal (and thus easy to independently audit) VM host at boot. I'd still kinda hate it, but less than having actual rootkits in the "main" kernel. It would still need some sort of "confirmation of non-tampering" from the compositor, but it _should_ be possible, especially if the companies wanting that sort of stuff were willing to foot the bill (haha). And, on top of that, a VM would make it less likely for vulnerabilities in the anti-cheat to spread into the OS I care about (à la the Dark Souls exploit).

So kinda like Flatpak, I guess, but more.


Check out the Steam Linux Runtime. You can develop games to run natively on Linux in a container already.

Running the anti-cheat in a VM completely defeats the point. That's actually what cheaters would prefer because they can manipulate the VM from the host without the anti-cheat detecting it.


There is no "real" GPU virtualization available for regular consumer, as both AMD and NVIDIA are gatekeeping it for their server oriented gpus. This is the same story with Intel gatekeeping ECC ram for decades.

Even if you run games in a container, you still need to expose the DRM char/block devices if you want Vulkan or OpenGL to actually work.

https://en.wikipedia.org/wiki/GPU_virtualization#mediated
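As a hedged command-line sketch of that device exposure (the image name and game path are placeholders, not a real setup; the essential part is `--device /dev/dri`, which hands the host's DRM nodes to the container):

```shell
# Hypothetical sketch: run a containerized game with host GPU access.
# --device /dev/dri passes the DRM card/render nodes through, which is
# what Mesa/Vulkan drivers inside the container open to reach the GPU.
podman run --rm -it \
  --device /dev/dri \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -e DISPLAY="$DISPLAY" \
  example/game-image ./game
```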


I try to get as many (mostly older, 2D) Windows games as possible to run in QEMU (VirtualBox in the past). Not many work, but those that do just keep working and I expect they will just always work ("always" relative to my lifetime).

WINE and Proton always seem to require hand-holding, and they leak dependencies on things installed on the host OS as well as on the actual hardware. I've used them for decades and they're great, but I can never just relax and know that because a game runs now it will always run, the way I can with a full VM (or with DOSBox, for older games).


GPU sharing for consumers is available only as full passthrough, with no sharing; you have to detach the GPU from the host.

MS has supported GPU virtualization for years in Hyper-V with their GPU-PV implementation. Normally it gets used automatically by Windows for GPU acceleration in Windows Sandbox and WSL2, but it can also be attached to VMs via PowerShell.

12th-gen and later Intel iGPUs can do SR-IOV.

>I also wonder if it's long-term sustainable. Microsoft can do hostile things or develop their API in ways Valve/Proton neither need nor want, forcing them to spend dev time keeping up.

Not while they continue to have the Xbox division and aspire to be the world's biggest publisher.



