
So if I'm reading this right, there is still no such thing as a standard graphics API. There's OpenGL, there's Direct3D, there are other hardware-specific APIs (Metal and Vulkan are mentioned), and now Qt is inventing its own API to abstract over all of these.

How is having a standard API to draw 3D graphics not a solved problem at this point?



The game devs ruined it for everyone else ;)

The revisionist history: AAA game developers are used to console APIs, which do exactly what you tell them to at the hardware level, no sugar coating or drivers in the way. Having shipped a few games, I can say drivers are indeed a terrible curse. One vendor's D3D11 drivers are pretty notorious for being incredibly invasive to your game in the name of speed, making it difficult to ship content that behaved consistently across devices. Another vendor, sick of having to turn their driver into a rocket engine, found common ground with some game developers and made a prototype, Mantle, which showed real-world performance gains on game content. But it was pretty specific to how their hardware and GPUs worked. Given that Xbox wanted performance, Microsoft was intrigued and worked closely with that vendor to design the next-generation Direct3D API.

Meanwhile, the vendor with the super fancy drivers had a pretty major chokehold on OpenGL, which is just an absolutely terrible mess of a bad API. It's an API so backwards and difficult that nobody likes it, from either the driver or the application perspective, but given that this vendor had the best drivers for it, they didn't want to lose it. But the Mantle vendor wanted to shake things up and submitted Mantle to Khronos, the organization that standardizes OpenGL, to form the basis for the "gl-next" initiative. Mobile GPU vendors got involved and turned the design into a mess. The original engineers who designed Mantle left their parent company over design disagreements to go join Apple, who also had a vested interest in getting rid of OpenGL but had little desire to use gl-next. This is Metal.

Ultimately, D3D12 is Microsoft's turf. Nobody likes Vulkan but it's there on Android and Stadia and Linux and is pretty mediocre there. And Metal is pretty well-rounded but is the domain of Apple.

3D rendering has always been a mess, but now it's even worse because tile-based GPUs exist, data bandwidth is crazy expensive, and game devs need more fidelity and FPS than ever before. Synchronizing between coprocessors is now the job of the application.
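
As a minimal hedged sketch of what that looks like in Vulkan (the semaphores, fence, command buffer and queue are assumed to have been created during setup; error handling omitted): the application itself tells the GPU queue what to wait on and what to signal, nothing is implicit anymore.

    // Explicit synchronization: the app, not the driver, wires it up.
    VkPipelineStageFlags waitStage =
        VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT;
    VkSubmitInfo submit = {};
    submit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    submit.waitSemaphoreCount = 1;
    submit.pWaitSemaphores = &imageAvailable;   // wait for swapchain image
    submit.pWaitDstStageMask = &waitStage;
    submit.commandBufferCount = 1;
    submit.pCommandBuffers = &cmd;
    submit.signalSemaphoreCount = 1;
    submit.pSignalSemaphores = &renderFinished; // signal when rendering ends
    vkQueueSubmit(queue, 1, &submit, inFlightFence);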


> Microsoft was intrigued and worked closely with that vendor to design the next-generation Direct3D API.

I've heard "accusations" that it was actually the inverse.

Microsoft and that vendor were working together on next-gen consoles, including the new D3D12 API. But Microsoft wasn't planning to release it on Windows until long after the Xbox One shipped.

The vendor wanted something to show off now, so they pulled together Mantle in a reasonably short time-frame and announced/launched it before D3D12 was even in an announceable state.

I have no idea if these "accusations" are true; it's just what I've heard.


> Nobody likes Vulkan but it's there on Android and Stadia and Linux and is pretty mediocre there.

Mediocre on Linux? Since when?


Didn't Vulkan omit a way to query for active displays so you just have to pick a random display index when your application starts up?

Other than that I thought Vulkan was pretty decently comparable to Metal. I don't have any first hand experience with it though.


That sounds more like a window-system-specific issue? As far as I know OpenGL didn't have that kind of API either, and you had to use X11 and the glX bindings for that kind of information.


To complicate the mess, the other vendor's DX11 drivers are so bad that some Windows users have started using DXVK (really designed for Linux, but works just fine on Windows) to translate DX11 to Vulkan and get better performance in many games.


The differences between Mantle and Vulkan are not that big; some things are super similar. The biggest difference between Mantle and Vulkan is renderpasses, and that can feel like a mess if one is used to immediate renderers, but as a concept it's very much core in Metal, and even DX12 has it.
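
For reference, a minimal hedged sketch of the renderpass concept in Vulkan (renderPass, framebuffer, extent and cmd are assumed to have been created earlier):

    // Draws are recorded inside an explicit begin/end renderpass scope.
    VkRenderPassBeginInfo rpInfo = {};
    rpInfo.sType = VK_STRUCTURE_TYPE_RENDER_PASS_BEGIN_INFO;
    rpInfo.renderPass = renderPass;
    rpInfo.framebuffer = framebuffer;
    rpInfo.renderArea.extent = extent;
    vkCmdBeginRenderPass(cmd, &rpInfo, VK_SUBPASS_CONTENTS_INLINE);
    // ... vkCmdDraw* calls go here ...
    vkCmdEndRenderPass(cmd);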


Vulkan is actually only on Android after version 10.

It was introduced as an optional API on Android 7; besides Google and Samsung flagship phones, no one else cared.

Also, the drivers are so buggy that Google introduced the concept of updatable graphics drivers via the store.

Now look at Android update rates and version 10's market share, and enjoy Vulkan.


I didn't know that Johan Andersson, of DICE (Battlefield) fame, who created Frostbite (the EA engine) and the Mantle spec, joined Microsoft. Is your data correct?


No, he co-founded Embark studios with other DICE guys and is doing Rust.


Politics mainly.

We could have had OpenGL, but Microsoft wanted their own API, so they made Direct3D. ATI and Intel were unable to make both Direct3D and OpenGL work properly (most people targeted Direct3D anyway, so why bother?). After SGI bankrupted themselves, Khronos was formed and decided to define a new OpenGL from scratch to help ATI and Intel claim they had OpenGL (they pay money after all), so they built the core profile. But ATI-now-AMD and Intel failed to make that work too (people still targeted Direct3D, so why bother?). So to help them again, Khronos decided to ignore OpenGL and make Vulkan, which kinda seems to work for now. But AMD and Intel took their time producing certified drivers, and Vulkan doesn't look that popular (Direct3D 12, which is essentially the same thing, isn't that popular either, but the alternative tends to be Direct3D 11, so there's still no reason to bother fixing OpenGL). So who knows for how long this will work?

Apple was on the OpenGL train initially, but like their Java and X11 support, that was so they'd have easy ports of important stuff; once they got a sniff of popularity they ditched anything non-Apple, because why bother maintaining something others control?

That is my interpretation of the story so far, anyway. And I didn't mention OpenGL ES, which is OpenGL in name only but not in anything that really matters - yet thanks to iOS and Android it became popular, even though both of those platforms had more than enough power to handle the proper thing instead of a cut-down version originally designed for J2ME feature phones that couldn't even do floating point math.

On the positive side, OpenGL is still the API with the widest availability and, of all APIs, the one with the most reimplementations on top of other APIs - though in both cases you'll want to stick to older versions. TBH, versions were always a fluid thing with OpenGL anyway; you're not supposed to think in versions but in extensions.


> Microsoft wanted their own so they made Direct3D

Not quite; it was somewhat of a necessity at that point. 3D graphics was in an abysmal state back in the mid-to-late 90s. Hardware vendors shipped proprietary APIs (Glide) and half-assed OGL ports just for Quake (MiniGL), but very few offered full OpenGL implementations.

In order to migrate gaming from DOS to Windows, Microsoft was in dire need of a reliable and widely available API that they could ship with the OS for game developers to use.

OpenGL wasn't exactly great, since it wasn't controlled by MS, and the whole extension system was a huge mess and horrible to work with (I don't care what Carmack thought about it!).

Direct3D, on the other hand, offered a stable API and, most importantly, a REFERENCE RENDERER! - something that OpenGL was lacking, which led to super annoying bugs, since every driver behaved differently...

The latter is still relevant today - OpenGL lacks a defined reference implementation, so "OpenGL support" on the box means very little in practice. This is why certain software packages require certified drivers: CAD vendors would never be able to ship a finished product if they had to support every single quirk of every vendor, hardware or driver revision...


In the open source world, Mesa is effectively the reference renderer. It's not hard to get it running on other platforms either.


As an addendum, since the poor API keeps getting forgotten: before DirectX there was WinG on Windows, a 2D API for gaming.

It was the first attempt to move PC game devs from MS-DOS to Windows:

https://en.wikipedia.org/wiki/WinG


When Direct3D was introduced there was no Glide nor MiniGL, and OpenGL provided more than enough of the functionality games needed at the time. Microsoft was in control of their OpenGL implementation, which allowed both a full replacement (what drivers do nowadays) and a partial replacement, where a driver would only implement a tiny subset of the API and the rest (e.g. transformation, clipping, lighting) would be handled by Microsoft's code.

> In order to migrate gaming from DOS to Windows, Microsoft was in dire need for a reliable and widely available API that they could ship with the OS for game developers to use.

Yes, the rest of DirectX provided that and OpenGL games used it too.

> OpenGL wasn't exactly great, since it wasn't controlled by MS

Which was the only real problem for Microsoft, not anything else.

> and the whole extension system was a huge mess

During the mid-to-late 90s there were barely any extensions, and OpenGL 1.1 provided more than enough functionality for the games of the time. The main extension that would be needed in the very late 90s was multitexturing, which amounted to importing a few function pointers - nothing "messy".
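
For the curious, a hedged sketch of what that importing looked like on Windows - the PFN typedefs come from glext.h, and error checking is omitted:

    // Grab the ARB multitexture entry points at runtime; that's roughly
    // all an application had to do to use the extension.
    PFNGLACTIVETEXTUREARBPROC glActiveTextureARB =
        (PFNGLACTIVETEXTUREARBPROC)wglGetProcAddress("glActiveTextureARB");
    PFNGLMULTITEXCOORD2FARBPROC glMultiTexCoord2fARB =
        (PFNGLMULTITEXCOORD2FARBPROC)wglGetProcAddress("glMultiTexCoord2fARB");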

> and horrible to work with

Compared to early Direct3D, OpenGL was much easier to work with - early Direct3D required you to build execute buffers, manage texture loss yourself, and deal with other nonsense, whereas OpenGL let you essentially say "use this texture, draw these triangles". This was such a big usability issue that Microsoft eventually added similar functionality to Direct3D in versions 5 and 6, and they killed execute buffers pretty much instantly. Even then OpenGL still provided more functionality that drivers could take advantage of as new GPU features became available (e.g. Direct3D 7 introduced hardware transformation and lighting, but OpenGL had T&L in the API essentially from day one, so all games that used OpenGL got it for free when drivers added support, whereas games that used Direct3D had to explicitly enable it).
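
For illustration, the "use this texture, draw these triangles" style looks roughly like this in classic OpenGL 1.x immediate mode (tex is assumed to have been created earlier):

    // Fixed-function OpenGL: bind a texture, emit vertices, done.
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_TRIANGLES);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
    glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
    glEnd();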

> Direct3D on the other hand offered a stable API and most importantly a REFERENCE RENDERER! - something that OpenGL was lacking and lead to super annoying bugs

This is wrong; Microsoft had a software rasterizer for OpenGL 1.1 that behaved very close to the spec, and SGI had also released their own software rasterizer.

> since every other driver behaved differently...

This was the case with Direct3D too, and in fact it was a much more painful experience. Direct3D tried to alleviate this by introducing capability flags, but in practice no game made proper use of them, and games had all sorts of bugs and issues (e.g. DF Retro had a video where they tested a bunch of 90s 3D cards with Direct3D games, and pretty much all of them showed different visual glitches).

> OpenGL lacks a defined reference implementation, so "OpenGL support" on the box means very little in practise

This is an issue indeed, though it is largely a problem of driver developers not caring to provide consistent behavior rather than a problem with the API. If driver developers cared, they'd try to do things the same way as other drivers whenever a difference was spotted between implementations.

Though that is a modern issue, for pretty much the entirety of the 90s and early 2000s there were official software rasterizers from both Microsoft and SGI.


> When Direct3D was introduced there was no Glide nor MiniGL

You must be from a different universe: MiniGL was released in 1996 - the very same year Direct3D 4.0 and Direct3D 5.0 shipped... As for Glide - that started also in 1996 and was commonly used until 3dfx went defunct.

> During the mid to late 90s there were barely any extensions and OpenGL 1.1

Again - in which timeline was that the case? Certainly not in this one: in 1996 (!!!) there were about 90 (!!!) vendor-specific extensions [1]. This is not a question of whether you in particular were aware of them or of their particular usefulness; they did have use cases and were supported across vendors - sometimes with varying levels of support...

> Microsoft had a software rasterizer for OpenGL 1.1 that behaved very close to the spec and SGI had also released their own software rasterizer.

Neither of those were references that you could reliably run pixel-level A/B tests against to verify your drivers.

There never was an official reference implementation and there probably won't be any either.

> The main extension that would be needed during the very late 90s was multitexturing

Unless you were porting software from other systems like SGI workstations, which I did at the time. And believe me - it wasn't fun and having half a dozen code paths to work around that depending on the target hardware wasn't "clean" either.

I won't comment on your "which API is better" drivel since your arguments didn't age well anyway. We're back to execute buffers and manual texture management for performance reasons, so I could just as well argue that early Direct3D was actually ahead of its time... But that's a matter of opinion and not a technical issue.

[1] https://www.khronos.org/registry/OpenGL/index_gl.php


> You must be from a different universe: MiniGL was released in 1996 - the very same year Direct3D 4.0 and Direct3D 5.0 shipped... As for Glide - that started also in 1996 and was commonly used until 3dfx went defunct.

Only the year is the same, but not the dates. Direct3D was introduced in DirectX 2.0 on June 2[0]. Voodoo 1, for which Glide and MiniGL were made, was released after Direct3D, on October 7[1].

It would be impossible for Microsoft to make Direct3D as an answer to APIs like MiniGL since MiniGL didn't exist at the time the first release of Direct3D was made!

> Again - in which timeline was that the case? Certainly not in this one: in 1996 (!!!) there were about 90(!!!) vendor-specific extensions [1]

I'm not sure what you refer to in "[1]", there isn't any date information in there. Regardless, from [2] (which is from 2000, when there were many more extensions than in the mid-90s) you can easily see that the vast majority of extensions are for hardware that is irrelevant to desktop PCs running Windows (e.g. the SGI-specific and GLX stuff).

In addition, new OpenGL versions are essentially bundles of previous extensions, so a lot of these extensions are functionality you got with 1.1 (e.g. GL_EXT_texture is basically the ability to create texture objects, which was introduced as an extension in OpenGL 1.0 and made part of the core API - and available to anyone with OpenGL on Windows - in version 1.1).

Of all the extensions listed even in the 2000 list, only a handful would be relevant to desktop PCs - especially for gaming - and several of them (e.g. Nvidia's extensions) weren't available in the mid-90s.

> Neither of those were references that you could reliably run pixel-level A/B tests against to verify your drivers.

At the time that was irrelevant as no GPU was even able to produce the exact same output at a hardware level, let alone via APIs.

Also, Direct3D didn't have a reference rasterizer until Direct3D 6, released in August 1998. The Direct3D 5 (released in 1997) software rasterizers were very limited (one didn't even support color) and were meant for performance, not as a reference.

> There never was an official reference implementation and there probably won't be any either.

That doesn't matter; Microsoft's OpenGL software rasterizer was designed to be as close as possible to what the spec described and was much more faithful to it than the software rasterizers available for Direct3D up to and including version 5.

> Unless you were porting software from other systems like SGI workstations, which I did at the time.

Yes, that could have been a problem, since 3D GPUs at the time pretty much sucked for anything unrelated to fast-paced gaming. But those uses were very rare and didn't affect Direct3D at all - after all, Direct3D was in an even worse state with all the caps and stuff you had to take care of that OpenGL didn't require.

> We're back to execution buffers and manual (texture-) managing for performance reasons so I could just as well argue that early Direct3D was actually ahead of its time

Yeah, and IMO these modern APIs are a PITA to work with, more so than anything that came before, with the improvements not justifying the additional complexity - especially when OpenGL could have been extended to deal with performance instead.

[0] https://en.wikipedia.org/wiki/Direct3D#Direct3D_2.0_and_3.0

[1] https://en.wikipedia.org/wiki/3dfx_Interactive#Products

[2] http://web.archive.org/web/20000818012212/http://oss.sgi.com...


> Microsoft wanted their own so they made Direct3D

Bill Gates "wanted his own" because this would limit software portability, making his near-monopoly in OSes even stronger. Good move for his bottom line, but a dick move for humanity.


Yeah, because Apple (pre-OS X), Sony, Nintendo, 3dfx were so open to mankind's future, really.


Don't be a cynic. We can blame Apple (not just pre-OSX), Sony, Nintendo… and blame Microsoft too.


The ones that keep mentioning Microsoft alone are the cynical ones, selling their FOSS agenda the best way they see fit, usually with zero experience of the games industry.


As much as the games themselves may not be able to be open source, I'm still baffled that the infrastructure isn't. Open source infrastructure dominates the server space; I don't see why it couldn't dominate the desktop as well.

I see why it doesn't: it's mostly hardware vendors refusing to provide free drivers and refusing to hand over the specs of the hardware they sell (I mean the ISA). There may have been good reason 20 years ago, but it's been some years now that hardware tends to be mostly uniform, and could possibly stabilize its ISA. It has been done for x86 (for better or worse), it could be done for GPUs, printers, web cams, and everything else.

Hardware used to come with a user manual. Then it all stopped, around the time Windows 95 took over. Instead of a manual, they provided opaque software that worked with Windows. That has been the new tradition since, and changing it is hard. For instance it's only very recently that FPGA vendors started to gradually realise that open source toolchains could actually help their bottom line.

My dream, really, would be for hardware vendors to agree on an ISA, so we don't have to put up with drivers any more. https://caseymuratori.com/blog_0031


It dominates the server because many FOSS users who refuse to pay for tooling don't have any option other than paying subscriptions to keep their servers running, or at the very least they need to buy hardware.

FOSS desktop doesn't scale to keep a company running under such premises, because a large majority refuses to pay, and living off Patreon and donations only goes so far.

Which is why everyone who wants to make money with desktop FOSS software has either moved it behind a paywall served via browsers or into mobile OS stores.

From my point of view, FOSS friendliness is a marketing action, where underdog companies play nice and use non-copyleft licenses, and as soon as they get rescued thanks to the positive vibes, they hop back into dual licensing to keep their business rolling.


> FOSS Desktop doesn't scale to keep a company running under such premises, because a large majority refuses to pay, and living from patreons and donations only goes as far.

Okay, how complex does an OS need to be, really? Let's see: it needs to schedule and run your programs, interface with the hardware, manage permissions… that's about it. Why would it need to scale? What's so impossibly complex about an OS that it couldn't be done by 5 highly competent engineers in 2 years?

Oh, right, the hardware. With the exception of CPUs, hardware vendors don't publish their specs, and don't agree on a set of interfaces. So you end up having to write a gazillion drivers, dozens of millions of lines of code, just so you can talk to the hardware.

Solve that problem, and OSes won't need to scale. Perhaps even to the point that game devs will be able to ship their own custom OS with their games. As was done in the 80s and early 90s.


Having a stable driver ABI and being microkernel-based helps with scaling - which, fun fact, is what the PlayStation with its heavily customised FreeBSD, or the Switch with its in-house OS, do.

As for portable specs, if the Open Group and Khronos have taught us anything, it is that there is a big difference between paper and real hardware/platforms.

Yep, we shipped our custom OSes, which also had our custom workarounds for faulty, undocumented firmware bugs - the ones we occasionally took advantage of for demoscene events.


> As for portable specs, if Open Group, Khronos have taught anything, is that there is a big difference between paper and real hardware/platforms.

But… they don't even specify ISAs, they specify APIs. I'd wager the big difference is only natural. Another way would be for a vendor to design their ISA on their own, then make it public. If a public ISA gives them an advantage (and I think it could), others would be forced to follow suit. No more unrealistic consortium. :-)

> faulty undocumented firmware bugs

I hope that today, any hardware bug would trigger an expensive recall, and firmware bugs would just be embarrassing. CPUs today do have bugs, but not that many. We could generalise that to the rest of the hardware.


> That is my interpretation of the story so far, anyway. And I didn't mention OpenGL ES, which is OpenGL in name only but not in anything that really matters

You might be mistaking OpenGL ES 1.0 for anything modern.

ES 2.0 and above is a true subset of desktop OpenGL; some limits are lower and support for things like geometry shaders is optional, but that's pretty much it.


OpenGL and OpenGL ES are two completely different APIs with their own specs and implementations. Some versions do have an overlap in functionality in that you can write code that can work with both with minimal (mainly shader) differences, but that's about it.

But IMO OpenGL ES 2.0 was pointless; the devices powerful enough to support it were also powerful enough to support full OpenGL, so Khronos should have pushed that instead of fragmenting the driver and API ecosystem.


Not really. 4.3 made ES a true subset. As in, you cannot write spec-conformant ES 3.0+ software that would not run on a GL 4.3+ implementation.

This was very intentional by Khronos, so they could bring the two closer together. In the ES 2.0 days what you said would have been true, as ES 2.0 had some annoying differences, especially on the shader side, but it's been 8 years since 4.3 came out.


> but it’s been 8 years since 4.3 came out.

But macOS will never support anything past 4.1, so for cross-platform dev you cannot rely on that.


And yet OpenGL 3.3 is still what most would advise on GL forums for the best market coverage on consumer hardware.

Which is why it is the chosen version for the ongoing GL-on-Vulkan and GL-on-DirectX projects.


Yeah, that is why shaders need conditional compilation.


ES3 shaders (with the modern in/out qualifiers) compile as-is on GL 4.3+. It is a true subset. As an example, 4.3 brought precision qualifiers to desktop GL. And now that fp16 is in desktop HW, they are actually useful there.
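
To illustrate, a hypothetical minimal fragment shader (stored here as a C++ string) that is valid GLSL ES 3.00 and, thanks to the ES3 compatibility functionality folded into core GL 4.3, is also accepted by desktop 4.3+ drivers:

    // Same source compiles under GLES 3.0 and desktop GL 4.3+;
    // note the in/out qualifiers and the precision qualifier.
    const char *fragSrc = R"(#version 300 es
    precision mediump float;
    in vec2 v_uv;
    out vec4 fragColor;
    uniform sampler2D u_tex;
    void main() { fragColor = texture(u_tex, v_uv); })";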


Pity that GL 3.3 is still what is mostly widespread on consumer desktops.


I also recall the early days of D3D (immediate mode). Although it came after OpenGL, immediate mode allowed better integration with the cards at the time (notably 3dfx), and OpenGL did not have hardware drivers, which meant it was limited to software rendering. So if you were into game dev, D3D was your only option early on.

My recollection was that it wasn't until NVidia started up (and broke 3dfx by poaching engineers) that OpenGL started to become 'better'. Intel was left in the dust until motherboard support for DMA (around DX5?), which allowed cards to gain quick access to RAM - vital for texturing (prior to that you always had to 'upload' textures to the card itself). It was the final nail in the coffin for 3dfx at that point, who still hadn't released a new card in ages, and OpenGL was finally on par with D3D. D3D had a retained mode which began to be really useful by about that time too.

At the time, many people wanted to use OpenGL because it was loads easier than immediate mode and a lot more intuitive to grok. I recall a certain prominent Doom developer (John Carmack) berating D3D loudly on a private email list about this very fact. Ironically, a few months later some guys released a demo for a game called "Unreal" using D3D and everyone was blown away (circa 1995-6). More ironically, it wasn't for another year that GLQuake came to fruition.


Carmack loved OpenGL because GPU vendors could (and would) release proprietary extensions that exposed all the new functionality of new GPUs.

He would rewrite custom rendering paths for various GPUs and common sets of extensions, allowing him to improve performance and/or improve graphics.

With Direct3D, Microsoft defines a common feature set that all GPUs supporting that version of Direct3D are required to support, and any extra functionality that GPUs might provide on top of that are locked away, completely inaccessible.

Checking the Doom 3 source code, he has the main ARB and ARB2 pixel shader paths (equivalent to DX8 shaders). Then, for the older GPUs that mostly support pixel shaders but not in a standards-compliant way, he has an NV10 code path and an NV20 code path.

Then he has an R200 code path, which I think just improves performance on R200 graphics cards over regular ARB2.
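
For a sense of what that looks like, a simplified sketch of the backend selection (identifiers modeled on the Doom 3 source's glConfig/backend names, not copied verbatim):

    // Pick the most capable path the driver exposes, fall back to ARB.
    if (glConfig.allowARB2Path) {
        backEndRenderer = BE_ARB2;      // standards-compliant shader path
    } else if (glConfig.allowR200Path) {
        backEndRenderer = BE_R200;      // ATI R200 fast path
    } else if (glConfig.allowNV20Path) {
        backEndRenderer = BE_NV20;      // NVIDIA register combiners
    } else if (glConfig.allowNV10Path) {
        backEndRenderer = BE_NV10;
    } else {
        backEndRenderer = BE_ARB;       // baseline path
    }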


Extensions are a good thing since they allow developers to take advantage of new functionality and provide it to consumers pretty much immediately - a win-win for everyone involved: programmers use cutting-edge functionality and consumers actually get to use the fancy GPUs they paid money for.

Direct3D programmers disliking extensions makes me think of the sour grapes fable.

But OpenGL providing extensions doesn't mean Direct3D programmers were free of having to implement different code paths - if anything, during the 90s, with the proliferation of 3D cards and different capability flags, programmers had to account for a lot of different possibilities in their rendering code (and most failed to do so, with games having visual glitches everywhere).


That was only in the beginning.

> Direct3D is now better than OpenGL Says John Carmack

https://www.bit-tech.net/news/gaming/pc/carmack-directx-bett...


> and broke 3dfx by poaching engineers

I don't like the term "poaching". It implies that the engineers who took better paying jobs did something wrong, whereas they were just trying to retain a bigger portion of the enormous value that they were creating.


GPU hardware is varied and changes frequently. Many API features are partially or entirely implemented in software (drivers or the API runtime), complicating things. Console, mobile and desktop parts differ in functionality. OpenGL is probably the closest to "runs on everything". BGFX, sokol, WebGPU and other abstraction layers might be good fits as well. I think the options are pretty good if you just want rasterized triangles, vertex and fragment shaders (and maybe compute). As you start to need better performance or other parts of the pipeline, the options are less clear-cut.


> How is having a standard API to draw 3D graphics not a solved problem at this point?

It seems like it mostly is? The problems described in the article appear to be due to a mismatch between how Qt and Krita do things under the hood.

OpenGL has broad support (except Apple). The OpenGL ES subset adds web browsers and lots of embedded devices. ANGLE provides translation of ES to D3D, Vulkan, and soon even Metal.

Vulkan seems to work pretty much everywhere (except Apple of course). MoltenVK provides translation of v1.0 to Metal.

If departing from native hardware APIs is acceptable, gfx-rs appears to work today. WebGPU is well on its way to being fully implemented. Plus (as the article mentions) Qt 6 is apparently intending to introduce its own custom abstraction layer?


OpenGL works badly on macOS and even worse on Windows unless translated through ANGLE. On Windows, using OpenGL directly will give you performance problems and crashes _all the time_. And weird bugs, like red and blue swapping on some combinations of AMD GPUs and driver versions.


> On Windows, using OpenGL directly will give performance problems and crashes _all the time_.

Didn't the original (Java) version of Minecraft use OpenGL? It seems to have done well enough...


Except PlayStation, Android until version 10, Win32 and UWP sandboxing.


> PlayStation

True, but Sony has some sort of completely custom thing going on there so it seems that there's zero chance of standardization in that case.

> Android until version 10

I thought Android got VK support way back in version 7 (Nougat)? And hasn't it always supported GL ES? Not being a closed platform, support for any API is of course dependent on the underlying GPU and associated driver.

> Win32 and UWP sandboxing

TIL. I hadn't realized Microsoft restricted access to Vulkan and GL from within the sandbox. It looks like ANGLE has supported GL ES for Windows Store apps since 2014? Regardless, I was under the impression that UWP wasn't very popular with developers anyway.


Android got optional Vulkan support in version 7, yes. And being optional meant most OEMs cared about it as much as they care about optional updates.

Hence why in Android 10 Google took a set of actions: made Vulkan mandatory, started the path to having OpenGL ES run on top of Vulkan, and introduced GPU driver updates via the Play Store, as a means to force OEMs to actually care about Vulkan.

Just like many devices still don't ship with OpenGL ES 3.x, because it is optional as well.

There's a reason why I mentioned Win32 and UWP sandboxing and not just UWP. Yes, the pure UWP model, although quite nice to program for (for me it is what .NET 1.0 should have been all along), failed to gather the mass adoption Microsoft hoped for, which is why over the last two years they have changed course to merge both worlds, now officially known as Project Reunion.

As a matter of fact, here is the application model for the upcoming Windows 10X, where pico processes are used to sandbox Win32, WSL style, https://youtu.be/ztrmrIlgbIc

ANGLE for UWP was contributed by Microsoft themselves, and now they are working together with Collabora to support OpenGL and OpenCL.

https://github.com/microsoft/angle

https://devblogs.microsoft.com/directx/in-the-works-opencl-a...


Regarding Android, yeah I get that it's optional (I actually didn't know that recent versions had made it mandatory). If you view Android as analogous to Windows though then I think it makes sense. There's lots of different devices running Android, some of which aren't even phones. My point is that, similar to desktop GPUs, you can choose a "lowest common denominator" API based on the maximum age of the physical devices that you want to support.

I don't see a problem with that approach. In fact it seems to be about the best you can hope for when it comes to hardware APIs in general (unless you're Apple and control all the hardware) since things are constantly being revised and redesigned.

Regarding Windows, what are you saying here? That Windows Store apps will be getting Win32 support in the near future because the sandbox will finally be able to accommodate it (but VK and GL will still be blocked)? Or that native (unsandboxed) Win32 apps will become sandboxed (and thus restricted) in the near future? (I suspect the former, which is neat but doesn't change anything regarding graphics APIs.)

I did learn a few new things here but am still left with the general impression that Vulkan has reasonably broad (and increasing) support while OpenGL ES 2.0 can target pretty much everything worth supporting (including most web browsers). (Of course I'd strongly prefer to use a more modern API than ES 2.0 if it's available.)


The difference is that desktops get updates, while on Android targeting the latest version is more wishful thinking, unless one just targets Pixel and Samsung flagship devices.

Win32 sandboxing is orthogonal to the store.

UWP is not about the store; people keep mixing this up, as it unfortunately refers to multiple things across the Windows stack.

UWP is also known as WinRT, UAP, or just modern COM. And sandboxing UWP applications doesn't necessarily require delivery via the store; any MSIX package will do.

What Microsoft is now doing (officially as of Reunion) is detaching all this tech from the kernel and offering it as userspace APIs across multiple Windows versions, and merging the UWP and Win32 stacks into one, hence Project Reunion.

Sandboxed Win32 applications don't need to be store only.


> Vulkan seems to work pretty much everywhere (except Apple of course).

There's MoltenVK.


Indeed, which is now managed by Khronos itself.


It is mostly a solved problem. OpenGL is the old high-level standard, Vulkan is the new standard, though it's closer to the hardware.

WebGPU is the main contender for a new high-level standard, which will probably mostly run on Vulkan.

Only Apple doesn't support Vulkan, but nobody can force them to.


WebGPU is not high-level, no? It is closer to Vulkan than to OpenGL.

And I am not sure Vulkan runs everywhere... what about consoles?


WebGPU is very similar to Metal. While it is closer to Vulkan than to OpenGL, it's still higher level than Vulkan and easier to use.


> And I am not sure Vulkan runs everywhere... what about consoles?

The Nintendo Switch supports it. PlayStations have had their own custom proprietary APIs since forever, and Microsoft peddles their own API on the Xbox.


Switch supports it alongside NVN and OpenGL 4.5, and most engines end up using NVN. Unity alone accounts for 50% of the games.


> WebGPU is not high-level, no?

Right - what I read is that WebGPU performance is more consistent than Vulkan's between GPU vendors, and I understood that to mean higher-level. But perhaps it's just the promise of a better design.


Only on the Switch, and even there they have their own API, NVN, which is what most AAA titles and middleware like Unity actually use.


That means that any project that wants to be multi-platform and support OSX will need some sort of graphics API abstraction anyway. Of course it's still nice to only have to implement Metal and Vulkan. But you're not getting away with depending on a single open standard.


You don't have to implement Metal; Valve had MoltenVK open-sourced specifically to fix the Apple hole in Vulkan coverage.


> Only Apple doesn't support Vulkan, but nobody can force them to.

Why, does Microsoft support Vulkan? Does Google on Android?


> Why, does Microsoft support Vulkan?

Microsoft just uses whatever the vendors throw over the wall. So yes.

> Does Google on Android?

Yes.[0]

[0]: https://developer.android.com/ndk/guides/graphics


Marketing sales pitch.

Windows only supports Vulkan via the ICD driver model used by OpenGL; the Win32 and UWP sandboxes only support DirectX.

Microsoft is now pushing for vendors to just implement them on top of DirectX.

Vulkan is only guaranteed to be available on Android devices since version 10. The market share of Android 10 is left as an exercise for the reader.


A couple of corrections, if you don't mind...

Vulkan is not hardware-specific; it is an open, cross-platform standard like OpenGL.

Direct3D is not a standard, but it is so well supported and used (even on Linux!) that you can consider it one.


This sentence from the article offers a clue. Of course Apple isn't the only bad actor here, but basically, it's company politics.

> So at some point, the engine stopped working on macOS. Mostly because Apple doesn’t want people to use OpenGL, so they’re starving their drivers from development


WebGPU is going to be dope. It goes beyond the web. https://github.com/gfx-rs/wgpu-rs


> How is having a standard API to draw 3D graphics not a solved problem at this point?

The problems come from having proprietary GPU hardware ISAs. And baseless optimism about working around that by the magic of software.


You need an abstraction over different hardware implementations to have a standard API, and abstraction comes with a cost - while 3D graphics applications want to utilize the hardware to its fullest capability. So basically it's not a simple problem, especially considering that hardware is constantly evolving and inventing new approaches. For example, I learned OpenGL about 15 years ago and never used any shaders. Now, AFAIK, shaders are everywhere.

Probably the best thing you can do at this point is to use a game engine like Unity or UE as the abstraction.


Meanwhile, we have been compiling the same C (and higher-level) code for MIPS, ARM, POWER, x86, and RISC-V for decades in portable ways while still getting performant code.

Now, to be fair, GPUs have a legacy of slowly becoming generic processors, whereas 10+ years ago they were largely fixed-function hardware. Writing generic code for a GPU today is totally sensible because it supports most arithmetic, primitive types, branching, etc.

But there is nothing stopping you from optimizing your GLSL compiler any more than optimizing your platform-specific custom vendor compiler for C.


You still write performance-critical code in assembly language; the C compiler is not ideal.


The problem here is that abstraction comes at a cost, and that cost is so great that we need even crazier abstractions to work around it... and now the suggestion is, as you say, to work with a whole engine. Ah, now I can draw a pixel on screen! I see it as a lot of wasted potential (even if maybe not that much wasted commercial potential).


As a data point, I've been developing the visual side of https://ossia.io with the new Qt RHI and it's been a breeze.


I'll check it out... Does it use QML and the scenegraph, or does it have a custom C++ canvas widget implementation?


The software's "main editor" uses a very traditional QGraphicsScene with a QGLWidget - no QML / QtQuick (QtQuick is a very good tool for a lot of use cases, but not the right tool for this particular "traditional big desktop app" job IMHO).

The RHI-using part is for creating custom visuals (think applying various shaders to video & camera inputs for VJ), so I wrote my own scene / render graph leveraging it, which renders in separate windows through QRhi.


That sounds interesting. I've cloned score and libossia, but I haven't found that code yet. Could you point me at it?


95% of the RHI code is in there: https://github.com/OSSIA/score/tree/master/src/plugins/score...

- Window: https://github.com/OSSIA/score/blob/master/src/plugins/score...

- Renderer: https://github.com/OSSIA/score/blob/master/src/plugins/score...

- Example of a very simple node which renders a texture generated by a std::function: https://github.com/OSSIA/score/blob/master/src/plugins/score...

It was mostly written at my Ballmer peak during last year's Christmas / New Year's Eve though :-) so the code quality is lacking a fair bit.

There's a graph of nodes. The graph is walked from every output node (screen surfaces) to create a matching "rendered node" (pretty much a render pass). For every node "model", a node "renderer" will be created, which contains uniforms, GPU buffer (QRhiBuffer) & texture (QRhiTexture) handles, etc., and a QRhiGraphicsPipeline (the state the GPU must be in to render a node: the VAOs, shader programs, layout, culling, blending, and so on).

Then, every time the output node renders due to vsync or whatever, the associated chain of nodes (render passes) is executed in order.
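
In case it's useful, a minimal hedged sketch of creating those per-node GPU resources (Qt 6 QRhi naming assumed; in the Qt 5.14-era RHI the create() calls were named build(); rhi and vertices are assumed to exist from setup):

    // Per-node resources, roughly as described above.
    QRhiBuffer *vbuf = rhi->newBuffer(QRhiBuffer::Immutable,
                                      QRhiBuffer::VertexBuffer,
                                      sizeof(vertices));
    vbuf->create();

    QRhiTexture *tex = rhi->newTexture(QRhiTexture::RGBA8, QSize(512, 512));
    tex->create();

    QRhiGraphicsPipeline *ps = rhi->newGraphicsPipeline();
    // ... set shader stages, vertex input layout and render pass
    // descriptor here before creating the pipeline ...
    ps->create();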

I recommend looking at the RHI manual tests in the qt source, they show the usage in a very straightforward manner:

https://code.qt.io/cgit/qt/qtbase.git/tree/tests/manual/rhi?...

In particular, I started with this one : https://code.qt.io/cgit/qt/qtbase.git/tree/tests/manual/rhi/...


Thanks! Now I've got a place to start at least!


There are abstraction layers utilizing either, but many implement their own.


Could we just get 2D figured out before adding another dimension?


And LibGNM, LibGNMX, GX, NVN, the APIs that game consoles actually use.



