If I had a time traveling magic wand, the one Software Thing I would wish for would be that native, cross platform toolkits had won the war rather than the whole industry punting and declaring The Web Browser to be the target platform, or these super-high-level Game Engines. For decades, I always hopelessly thought the OS and hardware gaps would eventually be bridged by something like Qt or SDL or wxWidgets, and we'd all one day be happily programming cross platform apps using plain old native languages and SDKs instead of Electron or the HTML/CSS/JS triad of pain. As the years go on, and OS vendors move even more towards their own proprietary incompatible native APIs, this dream seems less and less likely.
Platform innovation requires control over your own API, because you want to expose the features and architectures that make your platform excel and that weren’t accounted for in abstracted tools. There will always be incompatible native APIs.
Meanwhile, a ton of apps and games are completely agnostic to those cutting edge platform differences and are going to thrive in least common denominator sandboxes. And making those sandboxes easy to use for some specific style/genre/skill-level is always going to be the competitive difference between them. So the big high-level things are always going to exist too.
But… so are the near-metal abstractions that let you cut through and interleave cross-platform and platform-specific code even in high-performance paths.
You wanted the last group to “win”, but the ecosystem inevitably involves all three. There will always be something like Metal, there will always be something like Unity, and there will always be something like SDL. Winning isn’t necessary.
> Platform innovation requires control over your own API, because you want to expose the features and architectures that make your platform excel and that weren’t accounted for in abstracted tools.
Yeah, I’m not buying that. It’s the story they tell you of course, but I think that’s a marketing lie.
Let’s be clear first that hardware is the platform. Your comment seems to agree with that. Note that for quite a long time, the Windows and Mac worlds used the same hardware (same CPUs, same GPUs), and therefore the same platform. They could have worked together and specified a common API that runs on both macOS and Windows, and they could both have exposed everything the hardware has to offer. Heck, if they really wanted to expose all the goodness the hardware has to offer, they would give us the actual data sheets. They don’t, for various reasons that are generally tied to "IP".
They tell us sweet words about innovation, but let’s be honest they just want to lock us in.
I was trying to respond to "Platform innovation requires control over your own API". The short answer "no it does not": look at CPUs, we just need their ISA to take advantage of any improvement.
In fact, the best way to expose any hardware improvements is to give us the data sheet. Gate keeping direct access to the hardware with an API effectively reduces user access to innovation.
One could criticise how I conflate hardware and platform. I’ll just note that all the goodness we’ve seen in the past 40 years was made possible by hardware. Personally, I’ve seen precious little innovation coming from software specifically. So even if a platform is more than just hardware, actual innovation mostly comes from hardware anyway.
What would make such a platform less of a compromise than a web browser?
How would programming in C++ be less pain than programming in JS / HTML / CSS? At the very least, JS code won't write past array bounds, or smash the stack.
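That bounds-safety difference is easy to demonstrate: in JS, reading past the end of an array yields `undefined`, and writing past the end grows the array rather than corrupting adjacent memory the way a C or C++ buffer overrun can:

```javascript
// Out-of-bounds reads in JS return undefined instead of garbage memory.
const a = [1, 2, 3];
console.log(a[10]); // undefined, not a crash or an info leak

// Out-of-bounds writes extend the array (leaving "holes" in between);
// they never touch a neighboring object's memory.
a[5] = 42;
console.log(a.length); // 6
```

The cost of that safety is the bounds checking and garbage collection overhead the thread argues about further down.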
From relevant olden times, Lisp and Smalltalk environments were closest to the ideal. They were expensive though, and nobody distributed them for free, as Netscape did with the browser. They also notably lacked any protections against untrusted code. But worst of all, they'd likely run even more poorly on consumer PCs circa 1995.
So, enjoy Typescript, V8, flexbox, canvas, web workers, etc. You could end up having a worse deal.
> How would programming in C++ be less pain than programming in JS / HTML / CSS? At the very least, JS code won't write past array bounds, or smash the stack.
A native ABI doesn't mean you have to use C++ though. I can use Qt from Python if I like, or even from the JVM (slightly fiddlier, but doable). I can't do that with the browser.
> nobody distributed them for free, as Netscape did with the browser. They also notably lacked any protections against untrusted code.
The JVM avoids both those problems though - it had a robust security model and was distributed for free. What killed it was that corporations refused to install Java Web Start on their computers because it's a scary "application runtime". But they would happily install web browsers because that's just a "document viewer". Even though they both do the same thing!
I believe that aforementioned “robust security model” was removed several years ago due to issues with its actual robustness https://openjdk.org/jeps/411
If you read your link, it was removed largely because the things that used it (such as Java Web Start) had been removed, which was more because they failed in the market than because its actual security record was particularly poor.
(Yes, there were occasional sandbox escapes, but there are occasional sandbox escapes in web browsers too. Few security mechanisms are perfect)
> A native ABI doesn't mean you have to use C++ though. I can use Qt from Python if I like, or even from the JVM (slightly fiddlier, but doable). I can't do that with the browser.
WASM is that ABI for the browser. Yes, it makes everything a bit slower, but I'm fine with that given the added security.
If we reach the point where a WASM-only app is a first-class citizen and I can write an app that doesn't have to touch HTML/CSS/JS (doing the UI with canvas or whatever), I'll be happy. We're not there yet though.
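Even today, the plumbing side of a WASM-first app is already workable; it's only the UI layer (canvas rendering, input, accessibility) that's missing first-class support. Loading and calling a module takes a few lines of JS glue. The bytes below are a hand-assembled toy module exporting a single `add` function, just to keep the sketch self-contained; a real app would fetch a compiled `.wasm` file via `WebAssembly.instantiateStreaming(fetch("app.wasm"))`:

```javascript
// Minimal hand-assembled WASM module exporting add(a, b) -> a + b.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

WebAssembly.instantiate(bytes).then(({ instance }) => {
  console.log(instance.exports.add(2, 3)); // 5
});
```

The glue is small; what's not yet standardized is everything around it, which is exactly the "first-class citizen" gap.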
I think people really underrate browsers. The browser standards are open and have multiple open source implementations. People associate browsers too much with annoying, trashy, ad-based and otherwise questionable websites to see how good the browsers themselves are.
Electron has an annoyingly heavy download size but it's not the only option for native releases of web-based apps. Windows and some other OSes have built-in browser widgets that can be used with Tauri.
> The browser standards are open and have multiple open source implementations.
The browser standards are open only in name. The sad fact is, implementing those standards is flat-out impossible if you’re not a megacorp. They’re just too damn big: I recall someone counting over a hundred million words.

Using those standards is easy, since you can target a subset. But the number of browser engines that actually support enough of those standards will only decrease.
I'm not really sure how long we will have multiple implementations. And we certainly won't get any new ones; we're stuck with the three we have and can only hope the two non-Chrome ones survive.
That's exactly what people said about IE over 20 years ago. History has proven this reasoning untrue. Web isn't going anywhere. If there's an opportunity to build something 10x better than Chrome, it'll be shipped.
> If there's an opportunity to build something 10x better than Chrome, it'll be shipped.
There won’t be. Since IE6 the standards have grown to inhuman proportions, and implementing a new browser engine is even more difficult than it was then.
I'm sure that with enough dedicated and enthusiastic people, something better than Chrome can be implemented. Whether it would survive at all is another question: Chrome has an effective stranglehold on the market, so for anything else to succeed it will take political will rather than development effort.
Browsers are absurdly well-optimized for performance. If you know how to tap it, you can make screaming-fast apps of various kinds, with top-notch graphics, font rendering, accessibility support, audio, video, etc. They also have really solid networking capabilities, as long as you don't need raw TCP or UDP. In particular, HTTP/2, HTTP/3, WebSockets, and WebRTC allow for a lot of advanced things.
By now, you also have WebGL and WASM, if JS's JIT is not fast enough for you.
I love how people use hyperbole such as "screaming fast" and yet a native application that's not even all that optimized will tend to run absolute circles around these "screaming fast" solutions.
How are we supposed to describe these native apps? What's faster than screaming fast? ear shatteringly quick?
> I love how people use hyperbole such as "screaming fast" and yet a native application that's not even all that optimized will tend to run absolute circles around these "screaming fast" solutions.
And you of course have nontrivial examples to prove that? Or as always source: trust me, bro?
This is like asking to prove the sky is blue on a sunny day. Walk outside and you'll see it. I was talking to friends about how we've forgotten how fast computers are because all we see are web pages and Electron applications. People don't even remember the wonder of native applications.
Try building an operating system or browser engine in JavaScript and you'll see what the parent is saying. I'm just giving you these examples because they are some of the last remaining native applications everybody still uses, but pretty much any native application will be much faster than the JavaScript version. The reason the browser wins is that we've reached a point where the performance is "good enough" and the development cost is significantly lower.
I've definitely seen some really well optimized web targets. Unfortunately that is not the common case in my experience currently.
That said the WebGL/WASM stuff is generally very nice in my experience and is very much changing my opinion. I'm interested to see what comes in the future!
For that to happen, OS vendors would have actually had to care about sandboxing and security, to enable local execution of completely untrusted code without any gatekeeper. It's their complete security failure, still continuing today, that forces everyone to the web.
The other, slightly less important thing is petty rejection of cross-platform APIs (e.g. Apple's refusal to allow Vulkan support in macOS). It's fine to additionally have platform-specific APIs, but there should be a least common denominator cross-platform standard. But middleware can smooth over this problem, while the security problem is something only OS vendors could fix.
Unfortunately, the position of gatekeeper turned out to be so profitable that vendors don't actually want to improve their security to the point where it's unnecessary. And they're also incentivized to prevent the web from improving to the point where it would threaten their gatekeeper status.
Even without the time traveling, I would be happy if there was just a single stable, non-bloated, reliable, portable platform that could be used for when you just want to Write Once and then know that it will Run Everywhere _forever_ (* insert disclaimer about nothing literally lasting forever). Not something that rolls out breaking changes every six months. Or six years for that matter. Would not even have to be an entire API, just a clear declaration that a subset of some APIs will never change, and some tool to verify that my code did not accidentally use any of the other parts of the API.
Unfortunately running things in a browser is no guarantee, even for those that would otherwise consider that a good option.
Web browsers are remarkably backwards compatible. 20 year old websites continue to work fine.
The things you linked are only advised against for new code:
> These features are likely stable because removing them will cause backward compatibility issues and break legacy websites. (JavaScript has the design goal of "don't break the web".) Still, they are not cross-platform portable and may not be supported by all analysis tools, so you are advised to not use them [...]
They are also typically browser-specific extensions that were never cross-platform in the first place, features added based on proposals that were not in the end accepted (such as the Object.observe/unobserve API), or features from the Old Times™ before the specs were fully defined (and therefore typically also not cross-platform).
You've also got a bunch of deprecations for things that were in the spec, will almost certainly be supported forever, but are now seen as bad API design for one reason or another - usually because they don't handle edge cases correctly for historical reasons, or the name doesn't reflect what the function actually does. Unless any of these features actively leads to a security issue, they're very unlikely to be removed.
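A concrete example of that last category (my example, not from the linked list): `String.prototype.substr` lives in the spec's Annex B, has been discouraged in favor of `substring`/`slice` for many years, yet still works in every engine because removing it would break old sites:

```javascript
// Deprecated but universally supported, because "don't break the web":
console.log("hello".substr(1, 3));    // "ell" — substr(start, length)

// The recommended replacements take (start, end) indices instead:
console.log("hello".substring(1, 4)); // "ell"
console.log("hello".slice(1, 4));     // "ell"
```

The trap is exactly the one described above: the old and new functions look interchangeable but interpret their second argument differently.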
20 year old browsers don’t work at all, though. You can’t browse any of the top 100 sites, and you won’t be able to download an old release of firefox with your old version of internet explorer, because SSL.
You can get closer by limiting the depth and breadth of the API, for example by using VT100 and the I/O operations from the C standard library.
That is ‘a bit’ minimalistic, but it is “just a single stable, non-bloated, reliable, portable platform that could be used for when you just want to Write Once and then know that it will Run Everywhere _forever_ […] Would not even have to be an entire API”, and it could run on hardware that has no chance to run Win32.
I don't have experience with .NET Core, but working on a cross-platform Java desktop application has me pretty convinced that "write once, run everywhere" is a pipe dream even before you add on "forever". It's maybe more or less fine for small, simple applications, but eventually you'll run into something that doesn't work the same on Windows as on Mac and you'll start having to write platform-specific workarounds. At some point you'll find yourself reflecting into platform-specific JVM internals to work around some bug or another. Then an OS update will break something. Then a JVM update will break something.
I must admit I never looked at that ecosystem. Does it just happen to have been quite stable or is it a serious design decision they made and are sticking to?
From a quick search I do not get the impression that .NET has been deprecating parts of the API, including Core APIs, in the past.
.NET was pretty stable. I remember porting old .NET Framework 4.x MVC web app to .NET Core 2, and then to .NET 5. Both times it took less than an hour to port. Old directory structure and APIs still work even if they are not the new hot way to do things. Microsoft is known for backward compatibility.
Open source, powered by Skia, backed by JetBrains, and quite battle-tested at this point for small to medium-sized apps. In theory perfectly capable for enterprise as well, since it's basically a spiritual successor to WPF, which has been an industry standard for about 15 years.
They're diving into mobile and WASM as well, but that's more of a recent effort and I haven't tested it yet.
Electron would be great if it weren't for the performance, security, configuration, and packaging issues. The latter two seem to be what the OP suffered from most.
HTML/CSS/JS (and the frameworks on top of them) seem like a pretty low bar for building games and business logic for a variety of apps which, despite huge efforts from the OP, can run on pretty much any modern platform.
Native is the only way to go. The problem with the mythology of cross-platform (pseudo-)development is having to know both the underlying platform AND the abstraction layer. This explains why RubyMotion, PhoneGap/Cordova, and Appcelerator/Titanium were flops, and why Microsoft's MAUI will be too.
There is nothing for free. Abstractions cost performance and confuse troubleshooting.
Flutter is actually pretty close to this right now. I'm building an app that targets Windows, Mac, iOS and Android and so far it's working really well on all of them with more than 90% code reuse.
If Google doesn't give up on it I think it's going to be a much better stack for cross platform applications than the browser is.
Flutter has a multitude of problems, and Google's inability to support anything long-term is probably the smallest of them.
Flutter apps don't look or work like native apps, and the only people who will put up with that are people who have to do so because their enterprise mandates it. Flutter apps have horrible battery performance. Flutter apps are always at least six months behind what is possible with native toolkits and SDKs. Flutter apps use a language that literally no one other than Sass or Flutter developers actually want to use and that offers exactly no benefit over the dozens of other possible languages out there.
Flutter is Java Swing, but worse in pretty much every way.
No, so far I've had a great experience with it. But keeping up with new improvements and changes to the underlying platforms is going to require ongoing investment. Hopefully Google continues to think it's worth it.
If you think any of these will save you from the issues of cross platform development and platform specifics, in a different way from what is described by the post, you are wrong.
You will still suffer through notarization and Apple-ness, Android development being pressured by constantly changing Play Store policies, and every platform/store's specifics: adapting controls, form factors, gestures...
You misunderstand me. I think Java's great. Java applets were quite bloated, more so than Electron. You can still make Java applets if you want, but people have moved on.
There's way more than enough room for both given how many million UIs get made. I think more time should be spent wondering why cross-platform toolkits aren't good enough. It's kind of lazy to point at the incumbent and say it's their fault for some reason.
Or framed this way: your dream exists and it's called Qt and can be used to make some absolutely fantastic applications[1]. What's deficient about it and why?
> Similar to developing for macOS, a Mac is pretty much required for developing for iOS and there’s the $100 per year developer membership fee. I think the combined income of both iOS and macOS (95% of which comes from iOS) barely covers the cost of the membership fee and the cheapest Mac Mini.
I think this contextualizes the post well, seems like overall revenues might be in the <$10k or even <$5k range. That's extremely hobby territory (/buying lottery ticket territory). Feels like at that scale a 'build whatever makes you happiest' heuristic is healthier for the individual and cross-platform support works against you.
Their macOS revenues might have been <$10k, but given they have 1,000 reviews on Steam and their Android version has 100K+ downloads, I imagine the total revenue is more like >£80k.
If you read the quote, macOS plus iOS is <$1k (cheapest Mac Mini + Apple dev account). Given that the article stated iOS & Android have equivalent revenue, you're looking at Android+iOS+macOS revenue of <$1,500. It's not a mobile-first game, but 100k downloads suggests a single Android download is worth a penny, give or take. Brutal.
I'm not necessarily talking about whether the market is _large_ but rather whether users on that platform and its users will participate in raising its profile to a tipping point. More than one platform is likely to contribute to this.
There is a market but not for the hobbyist. You have better luck on iOS than macOS for games. However, tools, you can make a killing selling simple tools with sexy UX on Mac.
> However, tools, you can make a killing selling simple tools with sexy UX on Mac.
I've been thinking about doing something like this recently but I'm not a heavy mac user these days. Any tips on what people are looking for in small tools?
Know your market. If you aren’t a heavy Mac user then you probably should develop for your space. Any attempt will be met with headache as you fight Apple certification, design aesthetics, and enshitification of your code base to support the nuances of Apple.
Ask yourself, what tools can I build that are not only useful to me, but maybe useful to others, and start there.
I suppose Steam handles Windows game installation, so you don't need Windows installer code signing. But it's worth pointing out that for non-Steam applications, compared to the cost of signing a Windows installer, the $99 yearly fee for Mac is an absolute steal. For Windows, you need to get an EV code signing certificate, and the cheapest option is $150 US per year, but you ALSO need a >$100 token device. Typical prices for a certificate are 300-500 USD per year.
Figuring out how to do production code signing on Windows, and where to go to get your app trusted after signing, is also way harder on Windows. In contrast, implementing Apple's code signing is both cheap and easy.
The token requirement is a pain. We settled on using Azure Key Vault and AzureSignTool [1]. It costs $5 a month for an HSM key and you can sign things from anywhere.
That's horrible and dastardly, but at least it's far easier for users to bypass SmartScreen on Windows than the block on Macs. I wonder how many Mac users actually know how to.
If you just get a regular (cheaper) code signing certificate I realise SmartScreen will still block you anyway until enough people have installed it, but how many is "enough"?
> Linux accounts for less than 1% of the total players
I think some types of games (think factory building, Zachtronics-style puzzle games, ...) might have a bigger percentage, though that's only a feeling I have; I don't have numbers myself, I just see such games more often having a Linux release on Steam that seems appreciated.
I published a not-super-popular (edit: but free) esoteric programming game on Steam and of the < 100 (Steam) hardware survey responses for people who have played my game, I'm seeing closer to 5% on Linux (edit to add: my game doesn't support Steam Deck).
I'm sure the hardware survey is biased towards Linux, but it's still a surprising result! Especially so, given that I originally only released the game on Windows (and later added Linux support after getting multiple requests to port the game).
Feel free to name your game so that I and others can check it out. It's not frowned upon to link to your own paid content when it's relevant to the topic/comment thread discussion.
Thanks! Honestly, comments like this probably have a more positive impact on me than the small amount of money I could have made by selling the game :)
I found your game a couple of months ago here on HN and put it on my backlog of things to check out. But rest assured that I told a couple of work colleagues who went down that rabbit hole and didn't show up again for days due to being fascinated and nerd sniped. Their responses were quite positive and I'm looking forward to finding the time to follow!
Downloaded the game based on this comment. The Zero-Instruction computer achievement was one of the most fun "sidequests" I've played in this type of game. Thanks!
ninja edit: the lack of vim mappings is killing me
I always thought the hardware survey was biased towards Windows. Running both Windows and Linux, I seem to never get the hardware survey on Linux but get it yearly on Windows.
That would make sense; the programmer crowd is more common on Linux and these are programming-like games. Looking at the author's games, they seem like the type of thing that would be more likely to do well on Linux (factory / automation). So the fact that he's still leaning towards it not being worth it financially seems notable.
Loved the article. My beef is debugging iOS Safari. It's so tragic that it's The Next IE™, but I can't debug it without a Mac. There have always been ways to do it, but talk about jumping through hoops just to see a debug window.
In this case they're shooting themselves in the foot. By requiring potential developers to own a Mac, they sharply limit their developer audience. Thus, fewer apps being made for their platform, thus fewer apps to drive using the platform, thus fewer sales of their precious hardware.
They showed time and again that they don't want a too-wide developer audience. They already have too many apps, and see the choice fatigue in users. They want big companies to professionally produce polished stuff that commands, say, $14.99 in the App Store, of which $5 is Apple's share. They want stuff like BeatMaker or Procreate.
Small fry need not apply; if they insist, they should at least clear the threshold of owning a Mac and paying $100/year for the App Store license.
The real world runs on Linux (from servers, data centers, microcontrollers, ...), business runs on Microsoft/Office, and creative makers (from designers to OSS creators) do it on a Mac.
Of course it's not 100% true, but wherever I look it's like >50% true at least.
Not to mention being the most valuable company in the world, sitting on _not_ the majority of users _but_ the majority of user revenue/wallet sizes, and having more cash to spend at hand than nearly every other company in the world has in total revenue, to the point of surpassing small/medium countries' GDP.
So far, "shooting themselves in the foot" I would not 100% agree with. And depending who you're asking, the Mac is the #1 platform, and is even growing since M1 chip releases.
They've been at it for over 40 years and grew to the most valuable company in the world. In many computer fields they still are THE platform (anything design for example) so clearly the software they do have is plenty.
Can you elaborate what you mean by shooting themselves in the foot?
Do you see them as a centipede who sacrificed one foot in exchange for all the riches in the world?
Or do you mean that if they were more open to developers they'd somehow be even bigger? What's bigger than a trillion dollar company? Would they be intergalactic?
Because they haven't achieved any serious success with the Mac. If they didn't have their portable device line they would be a bit player in the tech industry.
Not really. I'm a long-time Apple user. I started with Mac OS 9 (but my first personal laptop ran OS X).
I don't consider them successful in computing. In fact, my first job was at an Apple Service Provider, and it was already hard to run everything useful on Macs. Now I wonder how much technology is actually running on Macs. Their computers are becoming glorified clients for other companies' technologies.
They make a ton of money selling a lot of small devices that are more about fashion / social status than technological relevance nowadays.
As it is there isn't a whole lot of good reasons to start using a mac instead of anything else.
I don't think they are currently successful in the computer business. Them making a shit-ton of money on overpriced mobile fashion doesn't really change that.
I saw a message on here suggesting otherwise, but you can actually run the latest Safari on Windows. I had to search a bit, but in the end I got it working. Basically, the latest WebKit can still be used in a rudimentary browser.
One issue with publishing a browser-based game that this article glosses over is: managing save data.
The browser provides Local Storage, but that isn't reliable as a "source of truth" since browsers (mostly on iOS, I think) may delete the data periodically (thanks, iOS, for deleting my Wordle history!) or when clearing browser history. Aside: the worst situation is when publishing on itch.io and playing on an iPad--your data appears to get saved to Local Storage but is actually wiped immediately to prevent cross-site tracking (itch.io hosts HTML5 games in a frame with a different domain).
Publishing on Steam, on the other hand, gives you durable file system storage and you can add cross-device syncing via Steam Cloud very easily. I'm holding out hope that something like remoteStorage will eventually catch on for browsers, but for now I don't really see a convenient solution.
If anyone has ideas for managing save data in a browser game that don't involve hosting user data myself (and dealing with account recovery, GDPR, etc.), let me know!
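One stopgap pattern, sketched below under the assumption that you're willing to treat Local Storage as a cache rather than the source of truth: keep the canonical save as a JSON blob the player can export and re-import as a file, so an evicted Local Storage is annoying but not fatal. The `SaveStore` helper is hypothetical (my names, not a library API); the storage backend is injected so you can wrap `localStorage` in a browser (adding a `keys()` helper) or fall back to an in-memory `Map` elsewhere. Separately, `navigator.storage.persist()` is a real API that asks the browser not to evict the origin's data, though support and behavior vary by browser.

```javascript
// Hypothetical save-data helper: storage is a cache; the canonical save
// round-trips through an exported JSON string (offered as a download).
class SaveStore {
  constructor(backend) {
    // backend needs setItem/getItem/keys; wrap localStorage or use a Map.
    this.backend = backend;
  }
  save(slot, data) {
    this.backend.setItem(slot, JSON.stringify(data));
  }
  load(slot) {
    const raw = this.backend.getItem(slot);
    return raw == null ? null : JSON.parse(raw);
  }
  export() {
    // Serialize all slots so the player can keep a durable copy.
    const out = {};
    for (const key of this.backend.keys()) out[key] = this.backend.getItem(key);
    return JSON.stringify(out);
  }
  import(blob) {
    for (const [key, value] of Object.entries(JSON.parse(blob))) {
      this.backend.setItem(key, value);
    }
  }
}

// Map-backed stand-in with the same setItem/getItem shape as localStorage.
const memoryBackend = () => {
  const m = new Map();
  return {
    setItem: (k, v) => m.set(k, String(v)),
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    keys: () => [...m.keys()],
  };
};
```

This doesn't solve cross-device sync, but it sidesteps hosting user data yourself: the player owns the file.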
So this is happening because itch.io hosts the game on a CDN subdomain such as v6p9d9t4.ssl.hwcdn.net which may change at any time? If it doesn't change, it sounds to me like there should be no problem, but I suspect I misunderstand. And sounds like it's Safari, not just iOS?
I'm about to start letting users export their games to html5, almost all of which will end up on itch.io, so really need to solve this too.
I'm running up against this pain in a hobby project. The best options I've come up with so far are:
* WebRTC and NAT traversal to let the user sync files (complicated but viable for my use case but pretty dumb for gaming).
* Cloud storage and anonymized social logins with E2E encryption (cue key management horror stories)
* Download a file on exit, upload on resume (terrible UX)
Web-first app, Electron for desktop, WebView for mobile.
I wonder how that compares to using an engine like Godot, Unity, or UE for cross-platform support? I love Electron and WebView is nice too, but neither would be my first choice for creating a video game. Personally, I'd prefer a native app that compiles to WASM for web support.
I noticed that too. Yes, many games can be made this way, but it's not what I think of when I think of making multi-platform games. At the same time, if we could continue advancing web apps it would be more appealing. But do PWAs even work everywhere?
"Saturated", as in the first usage under "Web", refers to market saturation. The word is questionably appropriate in the first case, and wrong in the second.
I suspect that what he means (in both cases), is that sales are low on both platforms, and they are not worth the effort to port to.
Does anyone know if, when you release a Windows game on Steam and it is played through Proton (eg the Deck or really anything, but the Deck is notable since it's a common platform), the dev would have any indication that the game is not in fact running on Windows? In a way, it doesn't matter, but I always wonder how these kinds of stats color how devs look at things.
But it seems like a lot of the other Mac issues are related to Steam.
A normal Mac app sold online or through the store wouldn't need special entitlements for dylib loading, and Xcode could easily handle all the signing for you. Xcode Cloud could help with builds too.
I don’t think the game being Electron would matter there.
> It turns out that unless the game is explicitly marked (by Valve reviewers), Steam Deck will use the Windows build + Proton even if a Linux version is available.
I found this which sounds like it's not the default, but is in fact a result of compatibility testing:
> If your game has gone through Steam Deck compatibility testing and the testers reported that the native Linux version didn't work (because of #579), then it might have been flagged to run the Windows binaries via Proton by default, instead of the native Linux version.
For those interested in cross platform game development, don't forget https://haxe.org/! The usefulness / popularity ratio is very high on this one :).
As the post mentions, the de-facto official way to distribute games on linux is as a windows binary run via wine/proton. AFAIK this mainly stems from issues in how linux deals with graphics drivers - you can static link everything _except_ opengl. This is similarly an issue with containerizing graphical apps: last I read you needed to have the same version graphics libraries both on the host system and _within_ the container due to the weird linking issues.
Does anyone know if there's any efforts to improve this situation? Is there some ideological issue in linux that prevents standardizing or presenting a generic interface?
(Or more background, my details are very vague here)
> As the post mentions, the de-facto official way to distribute games on linux is as a windows binary run via wine/proton.
It absolutely isn't unless you hate your users.
> you can static link everything _except_ opengl
You don't need to statically link OpenGL - it has a stable ABI and most functions need to be loaded at runtime anyway.
> This is similarly an issue with containerizing graphical apps: last I read you needed to have the same version graphics libraries both on the host system and _within_ the container due to the weird linking issues.
You don't need to worry about containers when distributing native Linux games.
> Does anyone know if there's any efforts to improve this situation? Is there some ideological issue in linux that prevents standardizing or presenting a generic interface?
Excellent post. This is why I’ve given up on trying to monetize my web game (https://www.farmhand.life/) and just give it away for free as OSS. The money I’d make would not justify the effort needed to actually capture it.
Surprised to see "Code signing with a hardened runtime is almost a must." under Steam (Mac OS). I released a game on steam and never had my distributables signed and never had a single complaint. Every player launches Steam games through Steam, and when that happens there's no need for signing.
> All the revenue from the platform does not even cover 10% of the cheapest Mac Mini. And then there’s the $100 per year developer membership fee for notarization
Same experience for my iOS games.
Apple should be more friendly to solo indie developers. (But maybe they really don't need to care about this now?)