
We could have had compute shaders in WebGL, given that they are part of GL ES 3.1, but Chrome refused to support them, and since Chrome == Web, WebGPU is now the only way to get compute shaders.

https://www.khronos.org/registry/webgl/specs/latest/2.0-comp...

https://www.khronos.org/assets/uploads/developers/presentati...

https://bugs.chromium.org/p/chromium/issues/detail?id=113199...
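
For the curious, the dropped webgl2-compute context was regular WebGL2 plus the GL ES 3.1 compute entry points. A rough sketch based on the spec linked above (the prototype only ever shipped behind a flag, so details may have differed):

    // Doubles 1024 floats on the GPU via the experimental context.
    const canvas = document.querySelector('canvas');
    const gl = canvas.getContext('webgl2-compute');
    const buf = gl.createBuffer();
    gl.bindBufferBase(gl.SHADER_STORAGE_BUFFER, 0, buf);
    gl.bufferData(gl.SHADER_STORAGE_BUFFER, new Float32Array(1024), gl.DYNAMIC_COPY);
    const cs = gl.createShader(gl.COMPUTE_SHADER);
    gl.shaderSource(cs, `#version 310 es
      layout(local_size_x = 64) in;
      layout(std430, binding = 0) buffer Data { float v[]; };
      void main() { v[gl_GlobalInvocationID.x] *= 2.0; }`);
    gl.compileShader(cs);
    const prog = gl.createProgram();
    gl.attachShader(prog, cs);
    gl.linkProgram(prog);
    gl.useProgram(prog);
    gl.dispatchCompute(1024 / 64, 1, 1);              // 16 workgroups of 64
    gl.memoryBarrier(gl.SHADER_STORAGE_BARRIER_BIT);  // make writes visible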




That's a pretty uncharitable spin. The linked bug says, in its very first comment:

> macOS' OpenGL implementation never supported compute shaders

which is a pretty valid reason to drop it. Nothing to do with Chrome. Further, OpenGL is dead; every graphics developer knows this. All the browser vendors are working on WebGPU. There's no reason to waste time implementing outdated things; there are finite resources. No other browser vendor announced any plans to support webgl-compute. So stop with the ridiculous spin.


Not at all; after all, Apple now supports WebGL 2.0 on top of Metal, and WebGL has always run on top of DirectX on Windows.

Nothing in the standard requires an OpenGL backend, only that the semantics are preserved; GL ES, Metal, and DirectX 11 all support compute shaders.

OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree in GPU design and shader compilers.

WebGPU is years away from becoming usable. Even if it happens to be released during 2022, it will be a 1.0 MVP, years behind what Metal, Vulkan, DX 12, NVN and LibGNM are capable of in 2021.

> In order to reclaim code space in Chromium's installer that is needed by WebGPU, the webgl2-compute context must be removed.

I bet the Chrome team wouldn't fuss over installer code space for other Google-critical features that are part of Project Fugu.


Disclaimer: I work on the team at Google responsible for the open source translator engine that runs under WebGL (ANGLE).

WebGL on Linux requires the OpenGL backend until Vulkan is eventually supported well enough.

Apple's WebGL2-on-Metal approach involves the same translator engine I work on, and was very much a collaborative effort between Google, Apple, and external contributors. Chrome will adopt this in the future after integration work is finished.

I can confirm that browser GPU programmers are definitely spread pretty thin just ensuring compatibility and reliability across all platforms.


Unfortunately, to the outside world it looks like WebGL has been put on ice because WebGPU is going to sort everything out, someday.

No wonder the focus is on making streaming work for 3D rendering with current APIs on modern hardware.


> WebGPU is years away

It would be even further away if all the devs working on it were instead spending years on WebGL-Compute. It's not like you snap your fingers and it works; it takes years of cross-platform work and testing. That time is better spent moving forward.

As further proof, the only names on the compute spec are from Intel, plus one from Google. Compare that to the WebGPU spec, which clearly has buy-in from all the browsers.

So if Chrome had shipped it, you'd have the gallery of comments bitching about Chrome going too fast, implementing too many things no other browser plans to implement.

BTW: WebGL2 in Safari does not run on Metal (yet)


WebGL 2 in Safari does run on Metal. WebGL 1 also runs on Metal. Apple contributors added support to ANGLE to use a Metal back end and it’s enabled by default in the latest Safari.


Raph says Metal fails to support an important synchronization primitive, which is therefore also not available in WebGPU, and that this limits performance in his font-rendering pipeline.


If it's not available in Metal, how would it be magically available in WebGL running on top of Metal?


Evidently it is also not available in WebGL. Like many other things.


Here is a tip: it doesn't need to be the same team.


Here's a fact: there is a limited number of developers, period. You seem to have magically conjured a team out of thin air.


On the contrary, if there were interest, the team would have been ramped up; plenty of candidates in the games industry, period.


> WebGPU is years away from becoming usable

As a counterpoint, I've been using WebGPU (through wgpu-rs) for the past 1.5 years. It's been a pleasure to use. For instance, here's the CPU-side code for a glow post-process shader using 4 render passes: https://github.com/JMS55/sandbox/blob/master/src/glow_post_p....
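
For those who haven't used it: wgpu-rs closely mirrors the browser WebGPU API, so a pass chain like that has roughly this shape. A browser-side sketch, using the current loadOp/clearValue spelling (earlier drafts differed), with the pipeline and per-pass targets/bind groups assumed to be set up elsewhere:

    const encoder = device.createCommandEncoder();
    for (const { target, bindGroup } of passes) {   // e.g. 4 glow passes
      const pass = encoder.beginRenderPass({
        colorAttachments: [{
          view: target.createView(),
          loadOp: 'clear',
          clearValue: { r: 0, g: 0, b: 0, a: 1 },
          storeOp: 'store',
        }],
      });
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);  // samples the previous pass's output
      pass.draw(3);                     // fullscreen triangle
      pass.end();
    }
    device.queue.submit([encoder.finish()]);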


It isn't the browser, and WGSL is only halfway specified.


But as an OpenGL replacement on native, it absolutely is.


Middleware engines have already been solving that problem for about 20 years; no need for WebGPU outside the browser.


Not everyone wants to use existing engines. Or are you saying you want to embrace vendor lock-in in that regard?


Yes, definitely, that is the approach taken by AAA game studios.

A plugin framework for 3D APIs is probably the easiest part to implement in a game engine, especially when APIs like OpenGL and Vulkan already require implementing one due to extension spaghetti.
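
For anyone who hasn't seen one, the seam is genuinely small. A hypothetical sketch, not any particular engine's API:

    // Hypothetical backend seam; real ones (Unreal's RHI, bgfx,
    // sokol-gfx) are richer, but it's the same idea.
    interface GpuBackend {
      createBuffer(data: ArrayBuffer): unknown;
      createPipeline(shaderSource: string): unknown;
      beginFrame(): void;
      draw(pipeline: unknown, vertexBuffer: unknown, vertexCount: number): void;
      endFrame(): void;
    }
    // A WebGL2 backend, a Vulkan backend, a Metal backend, etc. each
    // implement GpuBackend; the rest of the engine never touches the raw API.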


> OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree in GPU design and shader compilers.

Why would this ever happen? It seems like there is nothing in the works, nothing on the horizon, and no demand for a higher-level, less-performant API to become a new standard. Even OpenGL itself has been getting lower level and more detailed ever since version 1. People can build/use a wrapper API or game engine or something else if they want easy. It seems weird to say this right after defending Apple’s use of Metal to implement WebGL. Apple’s moves to get rid of OpenGL from the Mac ecosystem are one of the strongest forces pushing OpenGL out.


> Apple’s moves to get rid of OpenGL from the Mac ecosystem are one of the strongest forces pushing OpenGL out.

FWIW as someone who exclusively uses OpenGL for 3D graphics, this actually makes me push Apple out :-P


Oh yeah, same here. I loved my Mac right up to the point that I could no longer run any of my own code on it with reasonable effort.


On platforms that support OpenGL, it is the Python 2 of Khronos APIs.

Regarding Metal, indeed those that leave OpenGL are more likely to move to middleware than to force Vulkan upon themselves.

Hence why Khronos started ANARI: most visualisation products and CAD/CAM people couldn't care less that Vulkan in its present state exists.


Now Zink (GL over Vulkan) runs faster than OpenGL itself on supported platforms.


That's something of a sweeping generalization. Zink has managed to beat a native OpenGL driver in some particular benchmarks.

In many other benchmarks, it loses. That being said, it still manages decent performance, which is extremely impressive for a one-man project using only the Vulkan interface. It wouldn't surprise me if it eventually becomes the default OpenGL driver in the open source driver stack (for HW capable enough to support a Vulkan driver, obviously).


Autodesk is using Vulkan for some products (including using MoltenVK for the Mac version).

As for using middleware, GPUs are vastly more capable and complicated today than 30 years ago when OpenGL 1 appeared. In most cases it makes sense to use a higher-level interface, specialized for the particular type of application you're writing, be it ANARI, a game engine, some scenegraph library, or whatever.


> OpenGL is not dead until Khronos comes up with an API that is actually usable without an EE degree in GPU design and shader compilers.

That isn't going to happen because everyone is, in fact, moving away from the idea of a "graphics API" altogether and simply allowing the compute systems to calculate everything.

See the Nanite renderer from Epic: https://www.youtube.com/watch?v=eviSykqSUUw
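
For reference, the compute primitive such approaches build on is already quite compact in WebGPU. A sketch with illustrative names, assuming the shader module, bind group, and buffer sizes are set up elsewhere (dispatchWorkgroups is the current spelling; older drafts used dispatch):

    const pipeline = device.createComputePipeline({
      layout: 'auto',
      compute: { module: shaderModule, entryPoint: 'main' },
    });
    const encoder = device.createCommandEncoder();
    const pass = encoder.beginComputePass();
    pass.setPipeline(pipeline);
    pass.setBindGroup(0, bindGroup);
    pass.dispatchWorkgroups(Math.ceil(pixelCount / 64));  // one thread per pixel
    pass.end();
    device.queue.submit([encoder.finish()]);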

To a first and second order approximation, no one cares about graphics that aren't related to games.


A great comment to refer to the next time some young coder who never lived through IE6 says it's great to have one single engine everywhere.


> A great comment to refer to the next time some young coder who never lived through IE6 says it's great to have one single engine everywhere.

OP is suggesting that Chrome should have spearheaded a spec that no one but Intel and Google ever showed any interest in. How would that not have been literally "they like a standard (or propose one), they implement it, use it in their projects like Google Docs, and let the competition deal with the mess"[1]?

Instead they're implementing a spec with broad involvement and support across browsers, it's just taking longer. Seems like exactly how we'd want things to go.

[1] https://news.ycombinator.com/item?id=29405716


> they like a standard (or propose one), they implement it, use it in their projects like Google Docs, and let the competition deal with the mess

Yes. Exactly what they've been doing with plenty of other "standards"


Which is what the OP is suggesting Google should have done in this case


As someone who has lived through the browser wars, I have no idea what you’re trying to say? The situation today is vastly better than it was back then. It’s not hyperbole to say that porting JS from, say, IE to Mozilla took just as much time as it took to write it in the first place. Today, it’s expected (and often the truth) that something you developed in one browser works in the others.

Also, none of the specific reasons people opposed Microsoft's policies with IE, like its attempts to lock people into Windows-only APIs (ActiveX etc.), apply today.


> I have no idea what you’re trying to say?

In this case, Google can pull off something like what the OP describes because it dominates the market.

> Also, none of the specific reasons people opposed Microsoft's policies with IE, like its attempts to lock people into Windows-only APIs (ActiveX etc.), apply today.

Chrome implements plenty of Chrome-only APIs. They like a standard (or propose one), they implement it, use it in their projects like Google Docs, and let the competition deal with the mess.


Sure, there is a long tail of web APIs like this, but they tend to be very specialized. They are easily avoided and ignored by most web developers, who likely have no need for them in the first place. (Both WebGL and WebGPU are arguably in this category: you are unlikely to need them for your website.)

This is nothing like the situation was with IE6. Back then basic functionality was half-broken and it was hard to get anything done without a pile of hacks.


> This is nothing like the situation was with IE6. Back then basic functionality was half-broken and it was hard to get anything done without a pile of hacks.

The causes are not 1:1, but the consequences are the same: monopoly leads to abuse, and abuse leads to some sites not working in Firefox, consumers' needs being ignored by Google, and Google abusing its dominant position to push what it wants as a standard, or just destroying APIs it doesn't like (see the latest ad-blocking scandal).


Compute shaders (and a lot of other useful features) are part of GLES 3.1, while WebGL2 stopped at GLES 3.0 (those compute shader experiments were just that, experiments; my guess is that the effort wasn't continued because WebGPU was already on the horizon).

edit: GLES3.2 => GLES 3.1


Intel showed it working. Chrome abandoned it because they didn't want to spend resources implementing it, and because "In order to reclaim code space in Chromium's installer that is needed by WebGPU, the webgl2-compute context must be removed."

WebGPU will still be on the horizon for the next couple of years.


Apple abandoned OpenGL and refused to implement newer versions that would include compute shaders. WebGL implementations were based on OpenGL at the time. Intel's prototype did not and could not work on Mac. WebGL on Metal was not even started and there was no indication that Apple would ever work on it.

Now, years later, Apple actually implemented WebGL on Metal, so today we could think about implementing WebGL compute on Mac. However WebGPU is now in origin trials. It's very unlikely that Apple would put any effort into a compute shader implementation for WebGL now. And Chrome is not going to go implement a major WebGL feature that has no prospect of ever being supported in Safari.


So for some APIs, Google does whatever they feel like, e.g. Project Fugu, Houdini, and PWAs.

But for WebGL it matters what Apple does?


If Apple was completely blocking progress in web graphics then maybe Chrome would have tried to do something about it. But that's not the case at all. Everyone is aligned on WebGPU as the path forward. It's unfortunate that Apple delayed WebGL for years but there's nothing to do about it now.


Why doesn't Google drop PWAs given the same reasoning?


... that would only be an analogous situation if Apple was collaborating in a W3C working group with Google and Mozilla and Microsoft and others to make a more capable standard to replace PWAs, and was already in the process of implementing it. The situations really couldn't be more different.


Yeah, well, I don't see Firefox supporting it either, given their usual practice of not even supporting what Chrome bothers to do.


Firefox supports CSS subgrid. I wish Chrome did, since it would solve some layout problems I have.



