That's pretty uncharitable spin. The bug linked says in the first issue
> macOS' OpenGL implementation never supported compute shaders
which is a pretty valid reason to drop it. Nothing to do with Chrome. Further, OpenGL is dead; every graphics developer knows this. All the browser vendors are working on WebGPU. There's no reason to waste time implementing outdated things. Resources are finite, and no other browser vendor has announced any plans to support webgl-compute. So stop with the ridiculous spin.
Not at all. After all, Apple now supports WebGL 2.0 on top of Metal, and WebGL has always run on top of DirectX on Windows.
Nothing in the standard requires an OpenGL backend, only that the semantics are preserved; GL ES, Metal, and DirectX 11 all support compute shaders.
OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree in GPU design and shader compilers.
WebGPU is years away from being usable. Even if it happens to be released during 2022, it will be a 1.0 MVP, years behind what Metal, Vulkan, DX 12, NVN, and LibGNM are capable of in 2021.
> In order to reclaim code space in Chromium's installer that is needed by WebGPU, the webgl2-compute context must be removed.
I bet the Chrome team wouldn't fuss over installer code space for other Google-critical features that are part of Project Fugu.
Disclaimer: I work on the team at Google responsible for the open source translator engine that runs under WebGL (ANGLE)
WebGL on Linux requires the OpenGL backend until Vulkan is eventually supported well enough.
Apple's WebGL2-on-Metal approach involves the same translator engine I work on, and was very much a collaborative effort between Google, Apple, and external contributors. Chrome will adopt this in the future after integration work is finished.
I can confirm that browser GPU programmers are definitely spread pretty thin just ensuring compatibility and reliability across all platforms.
It would be even further away if all the devs working on it were instead spending years on WebGL-Compute. It's not like you snap your fingers and it works. It takes years of cross platform work and testing. That time is better spent moving forward.
As further proof, the only names on the compute spec are Intel and one Google name. Compare to the WebGPU spec which clearly has buy in from all the browsers.
So if Chrome had shipped it, you'd have a gallery of comments bitching about Chrome going too fast and implementing too many things no other browser plans to implement.
WebGL 2 in Safari does run on Metal. WebGL 1 also runs on Metal. Apple contributors added support to ANGLE to use a Metal back end and it’s enabled by default in the latest Safari.
Raph says Metal fails to support an important synchronization primitive that is therefore also not available in WebGPU, and limits performance in his font rendering pipeline.
As a counterpoint, I've been using WebGPU (through wgpu-rs) for the past 1.5 years. It's been a pleasure to use. For instance, here's the CPU-side code for a glow post-process shader using 4 render passes https://github.com/JMS55/sandbox/blob/master/src/glow_post_p....
Yes, definitely, that is the approach taken by AAA game studios.
A plugin framework for 3D APIs is probably the easiest part to implement in a game engine, especially when APIs like OpenGL and Vulkan already require implementing one due to extension spaghetti.
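To make the point concrete, here's a minimal sketch of what such a backend abstraction layer looks like. This is hypothetical illustration code, not taken from any real engine: the `RenderBackend` trait, the two backend types, and `pick_backend` are all invented names. Real engines abstract over devices, command buffers, and pipelines the same way, just with far more surface area.

```rust
// Hypothetical sketch: an engine codes against one trait, and each
// graphics API is a plugin implementing it.
trait RenderBackend {
    fn name(&self) -> &'static str;
    fn supports_compute(&self) -> bool;
}

struct GlBackend;
struct MetalBackend;

impl RenderBackend for GlBackend {
    fn name(&self) -> &'static str { "OpenGL" }
    // Compute shaders exist in GL 4.3+/GL ES 3.1+, but notably not
    // in Apple's OpenGL implementation.
    fn supports_compute(&self) -> bool { true }
}

impl RenderBackend for MetalBackend {
    fn name(&self) -> &'static str { "Metal" }
    fn supports_compute(&self) -> bool { true }
}

// The engine selects a backend once at startup; everything downstream
// only sees the trait object, never the concrete API.
fn pick_backend(prefer_metal: bool) -> Box<dyn RenderBackend> {
    if prefer_metal { Box::new(MetalBackend) } else { Box::new(GlBackend) }
}

fn main() {
    let backend = pick_backend(cfg!(target_os = "macos"));
    println!("using {} (compute: {})", backend.name(), backend.supports_compute());
}
```

The same dispatch mechanism an engine already needs for optional OpenGL/Vulkan extensions generalizes naturally to switching whole APIs.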
> OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree on GPU design and shader compilers.
Why would this ever happen? It seems like there is nothing in the works, nothing on the horizon, and no demand for a higher-level, less-performant API to become a new standard. Even OpenGL itself has been getting lower level and more detailed ever since version 1. People can build/use a wrapper API or game engine or something else if they want easy. It seems weird to say this right after defending Apple’s use of Metal to implement WebGL. Apple’s moves to get rid of OpenGL from the Mac ecosystem are one of the strongest forces pushing OpenGL out.
That's something of a sweeping generalization. Zink has managed to beat a native OpenGL driver in some particular benchmarks.
In many other benchmarks, it loses. That said, it still manages decent performance, which is extremely impressive for a one-man project using only the Vulkan interface. It wouldn't surprise me if it eventually becomes the default OpenGL driver in the open source driver stack (for hardware capable enough to support a Vulkan driver, obviously).
Autodesk is using Vulkan for some products (including using MoltenVK for the Mac version).
As for using middleware: GPUs are vastly more capable and complicated today than 30 years ago when OpenGL 1 appeared. In most cases it makes sense to use a higher-level interface specialized for the particular type of application you're writing, be it ANARI, a game engine, some scenegraph library, or whatever.
> OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree on GPU design and shader compilers.
That isn't going to happen because everyone is, in fact, moving away from the idea of a "graphics API" altogether and simply allowing the compute systems to calculate everything.