Cool visualization, but it focuses so much on the ``'-.,_,.-'`waves`'-.,_,.-'`` and not on the actual coverage pattern of the 6-12.5 cm waves, so it's not as useful for showing coverage as other tools. Fun as an art project, though.
If you like that, you might also love NestDrop for music visualization tailored for VJs and with special features to support projecting inside domed surfaces https://nestimmersion.ca/nestdrop.php
Hey, if you click through to one of the example simulations you'll be able to change the visualization from waves to time-averaged power density, which should be closer to what you're looking for.
#1: if you spam the "add a new source" button, you eventually get a JavaScript exception logged to the screen due to an array with a fixed max size of 128 elements overflowing.
#2: this could be graphics-card or driver specific (I have an AMD card), but scrolling just right can break the simulation via the text boxes; for example by quickly paging up and down, or scrolling all the way to the bottom and then wiggling the scroll position up and down. Once this happens the bad data propagates until the entire thing is filled with noise, solid black, or solid white. If you then scroll up to 3D mode the screen will be filled with a mess of polygons.
I once wanted to make something similar, for sound. I wanted to create active noise cancelling "in the room", instead of via headphones. I pictured devices combining a microphone and a speaker that you could set up in strategic locations. After thinking about it for a bit, I realized that interference would create areas with silence and others where the sound volume would be doubled. Less than ideal, but still possibly interesting. But then I thought about it some more and realized that I needed to think in 3D, which makes the setup orders of magnitude more complex.
It seems like that would apply here as well, at least when looking at the effects of refraction?
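That doubling is easy to see with a quick phasor sketch (a hypothetical two-source setup in 2D, amplitude falloff ignored): a second speaker driven in anti-phase cancels the sound exactly where the path lengths match, but half a wavelength of extra path difference puts the two signals back in phase and the amplitude doubles.

```python
import cmath
import math

def amplitude_at(x, y, sources, wavelength):
    """Phasor sum of point sources given as (x, y, phase) tuples.
    Amplitude falloff with distance is ignored so the interference
    pattern stands out."""
    total = 0j
    for sx, sy, phase in sources:
        r = math.hypot(x - sx, y - sy)
        total += cmath.exp(1j * (2 * math.pi * r / wavelength + phase))
    return abs(total)

wavelength = 1.0  # metres; roughly a 343 Hz tone in air
# Noise source at the origin, "cancelling" speaker 2 m away, driven in anti-phase.
sources = [(0.0, 0.0, 0.0), (2.0, 0.0, math.pi)]

quiet = amplitude_at(1.0, 0.0, sources, wavelength)   # equal paths: cancellation
loud = amplitude_at(1.25, 0.0, sources, wavelength)   # half-wave path difference: doubling
print(f"midpoint amplitude: {quiet:.3f}, off-centre amplitude: {loud:.3f}")
```

The same phasor picture extends to 3D; the quiet and loud zones just become surfaces, which is part of why the multi-speaker setup gets complicated so quickly.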
If you had a "sound laser", you could control the waves when entering a column (think a "sound umbrella") to cancel out sounds that enter. However, this only works for "regular sounds" (as in non-directional sound, like sound from distant sources, like speakers). So, it wouldn't cancel out things like people speaking. It would be pretty useful for clubs, where at your table you can have regular speaking conversations.
I hate to mention that this is all currently patented. It's pretty "easy" to build, but you'll have to wait a bit for all the related patents to expire or pay some licensing fees.
It can work (look up beamforming). However, it takes a 3D array of speakers with a lot of elements positioned at specific points in space, as well as very accurate position tracking for both ears. It's also very, very unlikely ever to beat high-end noise-cancelling headphones in terms of performance.
If the listener can be fixed in space then the problem gets easier but in that case what you have is actually a very large, room sized headphone that you enter and sit down into.
Source: I indirectly consulted a high-end furniture company on this exact project, they decided to pivot after a while.
Badass on the visualization side. The multiple-emitter portion at the end of the scroll reminded me of https://apenwarr.ca/beamlab/ which demonstrates beamforming (adjusting the phase of adjacent transmitters to focus power towards a specific receiver). To play with that one, the easiest way to see what's going on is to go to the right-hand menu, untick "3" so you just have 2 transmitters, click the "r" button, and then click (or click and drag) to see how the beam gets optimized (you can see some stats about the received power gain on the right-hand side).
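A minimal sketch of the phase-steering idea that demo shows (hypothetical array geometry and frequency, not beamlab's actual code): give each transmitter a phase offset that cancels its own path delay to the receiver, so all contributions arrive in phase there.

```python
import cmath
import math

def steered_amplitude(elements, target, wavelength, steer=True):
    """Phasor sum at `target` from a list of (x, y) transmitter positions.
    With steer=True, each element gets a phase offset that cancels its
    path delay to the target, so all contributions arrive in phase."""
    k = 2 * math.pi / wavelength
    total = 0j
    for ex, ey in elements:
        r = math.hypot(target[0] - ex, target[1] - ey)
        phase = -k * r if steer else 0.0
        total += cmath.exp(1j * (k * r + phase))
    return abs(total)

wavelength = 0.125  # metres; roughly 2.4 GHz Wi-Fi
# Four transmitters spaced half a wavelength apart along the x axis.
elements = [(i * wavelength / 2, 0.0) for i in range(4)]
receiver = (3.0, 4.0)

print(steered_amplitude(elements, receiver, wavelength, steer=True))   # 4.0: fully coherent
print(steered_amplitude(elements, receiver, wavelength, steer=False))  # well under 4: incoherent
```

With steering on, the four unit phasors add to amplitude 4 at the receiver (a 16x power gain over a single element); with steering off, the random-looking path-length phases largely cancel.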
While this has to be the most fun to watch demonstration I've seen, something like the free tier of Hamina will likely be many times more useful to those wanting to optimize home Wi-Fi layout https://www.hamina.com/planner. The downside being they force you to make an account whereas this one lets you use it locally with the power of your own browser. The upside being Hamina gives multiple distilled views of the simulation as focused on common Wi-Fi statistics + features and less focus on displaying the wave simulation itself.
I just tried to signup for a new Hamina account... and I've failed after spending 15 minutes on it. It seems they have a bug in the last step of the signup process that makes the final "continue" button always be disabled. Forcibly enabling it by modifying the DOM appears to work at first and lets me submit the form, but it doesn't result in a new account being created.
I have a suspicion it's flagged my signup as potentially abusive or something due to my various privacy enhancements...
When I run the Waveguide Simulator demo on my Alienware M15 Ryzen Ed. R5 (has an RTX 3070; Windows 11 Pro, Chrome v129), I hear a distinct high-pitched flutter noise emanating from my laptop. I thought it was from the speakers, but no: with my volume down it was still present as long as the simulator was playing. Weird, but very cool demo (probably my hardware; I never hear this during games or other WebGPU demos). The realistic house simulation yields a different signature in the sound.
As others have said, it's probably the GPU power supply circuits making the sound; if the pattern of power consumption has frequencies in the audible range, it can cause components like inductors and capacitors to mechanically vibrate at those frequencies and emit sound. The reason you don't hear it in games is either due to the game audio being much louder or the power pattern not having those audible frequencies.
CPU power circuitry can do the same, but given this is using the GPU, it's a safe assumption that it's the GPU making the noise.
Coil whine (or capacitor whine) from the GPU running at too high a frame rate. The easiest fix would be to use the Nvidia control panel to add an FPS cap at something like 2x your monitor's max refresh rate for the browser (or globally). It's pretty common with any workload above roughly 600 fps.
It's mostly broadband noise that can be simulated by simpler methods, but visualizing possible resonance patterns for the low-frequency emissions from the compressor (which typically runs at 20Hz, 40Hz, ..., 120 Hz) would be good to know.
Although I am not sure how the 2d simulation result carries over to the 3d world...
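For the lowest frequencies, a back-of-the-envelope check against the rigid-wall rectangular-room mode formula can flag likely resonances before reaching for any 2D or 3D simulation; the room dimensions below are made up for illustration.

```python
import itertools
import math

C_SOUND = 343.0  # speed of sound in air at roughly 20 C, m/s

def room_modes(lx, ly, lz, f_max):
    """Resonant frequencies of an idealised rigid-walled rectangular room:
    f = (c/2) * sqrt((nx/Lx)^2 + (ny/Ly)^2 + (nz/Lz)^2)."""
    modes = []
    for nx, ny, nz in itertools.product(range(4), repeat=3):
        if (nx, ny, nz) == (0, 0, 0):
            continue
        f = (C_SOUND / 2) * math.sqrt((nx / lx) ** 2 + (ny / ly) ** 2 + (nz / lz) ** 2)
        if f <= f_max:
            modes.append((round(f, 1), (nx, ny, nz)))
    return sorted(modes)

# Hypothetical 5 m x 4 m x 2.5 m room vs. compressor harmonics at 20..120 Hz.
modes = room_modes(5.0, 4.0, 2.5, 130.0)
harmonics = range(20, 121, 20)
for f, n in modes:
    close = [h for h in harmonics if abs(f - h) < 3]
    if close:
        print(f"mode {n} at {f} Hz sits near the {close[0]} Hz harmonic")
```

A mode landing near a compressor harmonic is where you'd expect the booming; the 2D-vs-3D question still stands, since real rooms add furniture and lossy walls that this idealised formula ignores.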
I have a Wifi 6E router, so I wonder if 6 GHz vs 5 vs 2.4 acts noticeably differently here? Is the overall shape the same or does the frequency make a big difference?
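For a rough intuition: the interference pattern keeps the same overall shape but scales with wavelength, while free-space path loss grows with frequency (and wall attenuation, not computed here, gets worse too). A small sketch of the numbers, using the standard free-space path loss formula:

```python
import math

C = 299_792_458  # speed of light, m/s

def wavelength_cm(freq_ghz):
    """Wavelength in centimetres for a frequency in GHz."""
    return C / (freq_ghz * 1e9) * 100

def fspl_db(freq_ghz, distance_m):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_ghz * 1e9 / C)

for f in (2.4, 5.0, 6.0):
    print(f"{f} GHz: wavelength {wavelength_cm(f):.1f} cm, "
          f"free-space loss over 10 m: {fspl_db(f, 10.0):.1f} dB")
```

So going from 2.4 GHz to 6 GHz shrinks the fringes by a factor of 2.5 and costs roughly 8 dB over the same free-space distance, before walls make it worse.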
I wonder if there is something like this but open-source, so I could customize this. I'm looking for a tool to solve UWB positioning (for indoor navigation) - to be able to know where to optimally place various UWB anchor points.
A while ago I was trying to find realistic examples of what WiFi "looks like", to try to get an intuitive sense of how it operated in a house or outside a building -- to what extent it spreads in the same way as a normal lightbulb, or to what extent its vastly larger wavelength complicated the picture.
At the time, literally the only visualization I was able to find was this artistic-looking nonsense:
So I'm very happy to see this tool. I'd be even more curious to see a non-animated version that lets you drag your router around and watch the "illumination" of the overall signal change, still taking into account how reflections confuse and degrade the usable signal, etc., instead of the animation of slow wave propagation. Maybe that exists somewhere?
Thanks! If you navigate to one of the example simulations, you'll be able to change the instantaneous field visualisation to one of time-averaged power density, which sounds closer to what you're looking for.
The cool thing about the speed of WebGPU is that you can drag things around and watch changes in real-time, even if you have to average lots of simulation steps per rendered frame.
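For a sinusoidal field that averaging converges quickly: the mean of the squared instantaneous value over one full cycle is half the squared amplitude. A toy illustration of the averaging step (not the simulator's actual code, which presumably accumulates this on the GPU):

```python
import math

def time_averaged_power(amplitude, steps=1000):
    """Mean of the squared instantaneous field over one full period,
    sampled at `steps` points; for a sinusoid this converges to
    amplitude**2 / 2."""
    total = 0.0
    for i in range(steps):
        t = i / steps  # one normalised period
        e = amplitude * math.cos(2 * math.pi * t)
        total += e * e
    return total / steps

avg = time_averaged_power(2.0)  # about 2.0, i.e. amplitude**2 / 2
```

That's why the "Power" view looks stable even while the underlying field oscillates every frame.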
Ah yes, I've found it -- changing "Signal" from "EM field" to "Power". That is fascinating to look at.
Is there a way to move the router around to see how the field changes in response?
As far as I can tell you can do that in the paid version, and I totally understand gating that for people modeling their own home/office layout. But it would be pretty cool as a free educational demo if you could move the router in the otherwise fixed example.
Especially frustrating in the case of Safari, which only needs to support one native API backend (Metal) on a pretty narrow set of hardware and drivers. Firefox has a much bigger task, needing to support everything like Chrome does, but with far fewer resources than Google or Apple can afford to throw at it.
For Safari that's par for the course, but Firefox is surprisingly far behind on anything GPU-related in the browser.
I've recently been shocked trying out the WebGL aquarium demo [0] on Chrome and Firefox after running into some really odd performance issues on a project. You'd expect them to behave about the same with GPU acceleration, but FF barely gets half the framerate at the same load. Like, what?! On Linux FF is also several times slower at canvas rendering.
Even Chrome only supports it officially on Windows, macOS and Android, no GNU/Linux (as stable).
And when it becomes widespread, just like WebGL 2.0, it will be a decade behind what native APIs are capable of.
And in both cases, good luck debugging, other than replicating the content on native APIs as a means to use proper GPU debuggers, because even Chrome has yet to offer any developer tooling for GPU debugging.
ChromeOS is a Linux kernel with some software bundled on top to make a usable system; it is by definition a Linux distro. Curiously, it's even a GNU/Linux distro, since it uses glibc and GNU coreutils, unlike e.g. Android, which is a non-GNU distro.
I will grant that it's a slightly odd distro, but is it any weirder than NixOS or Fedora Silverblue?
User-facing apps are browser instances and VMs, and under that is a pretty normal userland. It's still a distribution of software on a Linux kernel. And for that matter, is Silverblue any less a Linux distro with its read-only root and apps in Flatpak/Distrobox? Are Qubes OS or Proxmox, with everything in VMs?
Why do you need WebGPU? It's unfortunate that people use technology billed as "state-of-the-art techniques to run simulations at interactive speeds" without fully understanding what it's for. General-purpose compute on the GPU is what WebGPU is for. To simulate basic waves like in this demo you absolutely do not need it; in fact it's an indication the author implemented the solution in a non-optimal way. WebGL is fully supported by all browsers and by well-maintained libs like three.js, yet here we are, people writing a sine function with basic interference patterns, one of the most elementary 3D primitives, in WebGPU and arguing that's using "state-of-the-art" techniques.
Good question! This is actually a numerical solver for a few coupled partial differential equations - the method in this context (electromagnetism) is called FDTD. It's implemented as a WebGPU compute shader.
You absolutely could do this with WebGL2 as well (via fragment shaders rendering to textures, since WebGL2 never shipped compute shaders), but I thought it would be fun to try the newer API.
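For anyone curious what the finite-difference idea looks like, here is a toy scalar-wave leapfrog in Python. The actual demo solves Maxwell's equations (proper FDTD) in a WebGPU compute shader, so treat this only as a structural sketch of the update rule, not the author's method.

```python
# Toy 2D scalar-wave leapfrog on a square grid with reflecting (zeroed) edges.
def step_wave(u_prev, u, c=0.4):
    """u_next = 2u - u_prev + c^2 * laplacian(u). The Courant number c
    must stay below 1/sqrt(2) in 2D for the scheme to be stable."""
    n = len(u)
    u_next = [[0.0] * n for _ in range(n)]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            lap = u[i + 1][j] + u[i - 1][j] + u[i][j + 1] + u[i][j - 1] - 4 * u[i][j]
            u_next[i][j] = 2 * u[i][j] - u_prev[i][j] + c * c * lap
    return u, u_next  # shifted pair, ready for the next call

# Point excitation in the middle of a 64x64 grid; the ripple spreads outward.
N = 64
u_prev = [[0.0] * N for _ in range(N)]
u = [[0.0] * N for _ in range(N)]
u[N // 2][N // 2] = 1.0
for _ in range(20):
    u_prev, u = step_wave(u_prev, u)
```

The GPU version runs the same kind of stencil update, just with one thread per cell instead of two nested Python loops, which is why it can do thousands of steps per second.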
and then a bunch of other GPU code. You can find this with little effort from the bundle, if you care, by base64-decoding the Pt("xxx") parts.
Though I do imagine it could indeed be implemented with WebGL shaders, but I also wouldn't start a new compute-based project on it unless I had a particular need to support older systems. And this I say as a Firefox user.
To run this on Chromium on Linux you need to enable the "enable-unsafe-webgpu" and "enable-vulkan" flags in "chrome://flags". Best to disable them again afterwards.
And of course, I can't mention that without shouting out projectM (open-source Milkdrop) that supports WebGL https://github.com/projectM-visualizer/projectm/blob/master/... and one of the OGs, Geisswerks https://www.geisswerks.com/