Compute shaders (and a lot of other useful features) are part of GLES 3.1, while WebGL2 stopped at GLES 3.0 (those compute shader experiments were just that: experiments; my guess is the effort wasn't continued because WebGPU was already on the horizon).
Intel showed it working, but Chrome abandoned it because they didn't want to spend resources on it: "In order to reclaim code space in Chromium's installer that is needed by WebGPU, the webgl2-compute context must be removed."
And WebGPU will remain on the horizon for the next couple of years.
Apple abandoned OpenGL and refused to implement newer versions that would include compute shaders. WebGL implementations were based on OpenGL at the time. Intel's prototype did not and could not work on Mac. WebGL on Metal was not even started and there was no indication that Apple would ever work on it.
Now, years later, Apple has actually implemented WebGL on Metal, so today we could think about implementing WebGL compute on Mac. However, WebGPU is now in origin trials. It's very unlikely that Apple would put any effort into a compute shader implementation for WebGL now, and Chrome is not going to implement a major WebGL feature that has no prospect of ever being supported in Safari.
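For anyone who wants compute on the web today, the practical consequence is a runtime check: `navigator.gpu` is the real WebGPU entry point, while the `'webgl2-compute'` context string (from the removed Chromium experiment) now just returns null. A minimal sketch (the `pickComputeApi` helper and its parameter shapes are mine, for illustration):

```typescript
// Sketch: choose a GPU compute path at runtime.
// `navigator.gpu` is the WebGPU entry point; the 'webgl2-compute'
// context string is the abandoned Chromium experiment and returns
// null in current browsers.
type ContextSource = { getContext(id: string): unknown };

function pickComputeApi(
  nav: { gpu?: unknown },
  canvas: ContextSource
): "webgpu" | "webgl2-compute" | "none" {
  if (nav.gpu) return "webgpu";                              // WebGPU available
  if (canvas.getContext("webgl2-compute")) return "webgl2-compute"; // legacy experiment
  return "none";                                             // fall back to fragment-shader tricks
}
```

In a browser you'd call it as `pickComputeApi(navigator, document.createElement("canvas"))`; in practice the third branch is where most of the WebGL-era GPGPU hacks (render-to-texture "compute") live.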
If Apple were completely blocking progress in web graphics, then maybe Chrome would have tried to do something about it. But that's not the case at all. Everyone is aligned on WebGPU as the path forward. It's unfortunate that Apple delayed WebGL for years, but there's nothing to do about it now.
... that would only be an analogous situation if Apple were collaborating in a W3C working group with Google and Mozilla and Microsoft and others to make a more capable standard to replace PWAs, and were already in the process of implementing it. The situations really couldn't be more different.
edit: GLES3.2 => GLES 3.1