Hacker News

That's pretty uncharitable spin. The bug linked says in the first issue

> macOS' OpenGL implementation never supported compute shaders

which is a pretty valid reason to drop it. Nothing to do with Chrome. Further, OpenGL is dead; every graphics developer knows this. All the browser vendors are working on WebGPU. There's no reason to waste time implementing outdated things. There are finite resources. No other browser vendor has announced any plans to support webgl-compute. So stop with the ridiculous spin.
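(For what it's worth, a page can already feature-detect WebGPU and fall back to WebGL2 today. A minimal sketch: `navigator.gpu` and `canvas.getContext("webgl2")` are the real entry points, while the helper name is mine.)

```javascript
// Hedged sketch: prefer WebGPU when the browser exposes it, else fall back
// to WebGL2. `navigator.gpu` is the standard WebGPU entry point; the helper
// name `pickGraphicsApi` is illustrative, not from any library.
function pickGraphicsApi(nav) {
  if (nav && nav.gpu) {
    return "webgpu"; // a real page would continue with nav.gpu.requestAdapter()
  }
  return "webgl2"; // a real page would continue with canvas.getContext("webgl2")
}
```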




Not at all. After all, Apple now supports WebGL 2.0 on top of Metal, and WebGL has always run on top of DirectX on Windows (via ANGLE).

Nothing in the standard requires an OpenGL backend, only that the semantics are preserved; GL ES, Metal, and DirectX 11 all support compute shaders.

OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree in GPU design and shader compilers.

WebGPU is years away from being usable. Even if it happens to be released during 2022, it will be a 1.0 MVP, years behind what Metal, Vulkan, DX 12, NVN and LibGNM are capable of in 2021.

> In order to reclaim code space in Chromium's installer that is needed by WebGPU, the webgl2-compute context must be removed.

I bet the Chrome team wouldn't worry about installer code space for other Google-critical features that are part of Project Fugu.


Disclaimer: I work on the team at Google responsible for the open source translator engine that runs under WebGL (ANGLE)

WebGL on Linux requires the OpenGL backend until eventually Vulkan is supported well enough.

Apple's WebGL2-on-Metal approach involves the same translator engine I work on, and was very much a collaborative effort between Google, Apple, and external contributors. Chrome will adopt this in the future after integration work is finished.

I can confirm that browser GPU programmers are definitely spread pretty thin just ensuring compatibility and reliability across all platforms.


Unfortunately, to the outside world it looks like WebGL has been put on ice because WebGPU is going to sort everything out, someday.

No wonder the focus is on making streaming work for 3D rendering, given the state of current APIs on modern hardware.


> WebGPU is years away

It would be even further away if all the devs working on it were instead spending years on WebGL-Compute. It's not like you snap your fingers and it works. It takes years of cross platform work and testing. That time is better spent moving forward.

As further proof, the only names on the compute spec are Intel and one Google name. Compare to the WebGPU spec which clearly has buy in from all the browsers.

So if Chrome had shipped it, you'd have a gallery of comments complaining about Chrome going too fast, implementing too many things no other browser plans to implement.

BTW: WebGL2 in Safari does not run on Metal (yet)


WebGL 2 in Safari does run on Metal. WebGL 1 also runs on Metal. Apple contributors added support to ANGLE to use a Metal back end and it’s enabled by default in the latest Safari.


Raph says Metal fails to support an important synchronization primitive that is therefore also not available in WebGPU, and limits performance in his font rendering pipeline.


If it's not available in Metal, how would it be magically available in WebGL running on top of Metal?


Evidently it is also not available in WebGL. Like many other things.


Here is a tip: it doesn't need to be the same team.


Here's a fact: there is a limited number of developers, period. You seem to have magically conjured a team out of thin air.


On the contrary, if there had been interest, the team would have been ramped up; there are plenty of candidates in the games industry, period.


> WebGPU is years away to become usable

As a counterpoint, I've been using WebGPU (through wgpu-rs) for the past 1.5 years. It's been a pleasure to use. For instance, here's the CPU-side code for a glow post-process shader using 4 render passes https://github.com/JMS55/sandbox/blob/master/src/glow_post_p....
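For anyone who hasn't seen wgpu-rs, the per-frame flow in a pipeline like that looks roughly as follows. This is a pseudocode sketch: the method names match the wgpu API, but exact signatures and descriptor types vary between wgpu releases, so it is illustrative rather than copy-pasteable.

```
// one-time setup
instance = wgpu::Instance::new(backends)
adapter  = instance.request_adapter(options)          // async
(device, queue) = adapter.request_device(descriptor)  // async

// each frame (repeated once per pass in a 4-pass glow pipeline)
encoder = device.create_command_encoder(descriptor)
pass    = encoder.begin_render_pass(color_attachments)
pass.set_pipeline(render_pipeline)
pass.draw(vertex_range, instance_range)
drop(pass)                                            // ends the pass
queue.submit([encoder.finish()])
```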


It isn't in the browser, and WGSL is only halfway specified.


But as an OpenGL replacement on native, it absolutely is.


Middleware engines have already solved that problem for about 20 years now; there's no need for WebGPU outside the browser.


Not everyone wants to use existing engines. Or are you saying you want to embrace vendor lock-in in that regard?


Yes, definitely; that is the approach taken by AAA game studios.

A plugin framework for 3D APIs is probably the easiest part of a game engine to implement, especially when APIs like OpenGL and Vulkan already require implementing one due to extension spaghetti.


> OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree on GPU design and shader compilers.

Why would this ever happen? It seems like there is nothing in the works, nothing on the horizon, and no demand for a higher-level, less-performant API to become a new standard. Even OpenGL itself has been getting lower level and more detailed ever since version 1. People can build/use a wrapper API or game engine or something else if they want easy. It seems weird to say this right after defending Apple’s use of Metal to implement WebGL. Apple’s moves to get rid of OpenGL from the Mac ecosystem are one of the strongest forces pushing OpenGL out.


> Apple’s moves to get rid of OpenGL from the Mac ecosystem are one of the strongest forces pushing OpenGL out.

FWIW as someone who exclusively uses OpenGL for 3D graphics, this actually makes me push Apple out :-P


Oh yeah, same here. I loved my Mac right up to the point that I could no longer run any of my own code on it with reasonable effort.


On platforms that support OpenGL, it is the Python 2 of Khronos APIs.

Regarding Metal, indeed those that leave OpenGL are more likely to move into middleware than forcing Vulkan upon themselves.

Hence why Khronos started ANARI, as most visualisation products and CAD/CAM people couldn't care less that Vulkan in its present state exists.


Now Zink (GL over Vulkan) runs faster than OpenGL itself on supported platforms.


That's something of a sweeping generalization. Zink has managed to beat a native OpenGL driver in some particular benchmarks.

In many other benchmarks, it loses. That said, it still manages decent performance, which is extremely impressive for a one-man project using only the Vulkan interface. It wouldn't surprise me if it eventually becomes the default OpenGL driver in the open source driver stack (for HW capable enough to support a Vulkan driver, obviously).


Autodesk is using Vulkan for some products (including using MoltenVK for the Mac version).

As for using middleware: GPUs are vastly more capable and complicated today than 30 years ago when OpenGL 1 appeared. In most cases it makes sense to use a higher-level interface, specialized for the particular type of application you're writing, be it ANARI, a game engine, some scenegraph library, or whatever.


> OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree on GPU design and shader compilers.

That isn't going to happen because everyone is, in fact, moving away from the idea of a "graphics API" altogether and simply allowing the compute systems to calculate everything.

See: the Nanite renderer from EPIC: https://www.youtube.com/watch?v=eviSykqSUUw

To a first and second order approximation, no one cares about graphics that aren't related to games.



