This made me realize that my Pathfinder (vector graphics on GPU) work may have an unexpected benefit—unifying font and vector rendering paths across OS's to mitigate a bunch of these fingerprinting techniques.
(My font rendering reads the OS settings to determine which rendering mode to use to match the underlying OS, but I think there's no need to do that for canvas.)
> (My font rendering reads the OS settings to determine which rendering mode to use to match the underlying OS, but I think there's no need to do that for canvas.)
Not sure I understand. Are you saying that HTML and SVG text would match the OS's font rendering but HTML5 canvas text would not?
highp is technically specified as only a minimum of float16, right? (Though given the perf cost of a larger float, it seems unlikely any implementation would go beyond that?)
Last time I spoke to gw there were fair differences in WebRender, down to rounding error, between different GPUs (and drivers, IIRC), though almost certainly you'll know the situation nowadays better than I do (and wrt Pathfinder).
Wouldn't that just put you in the "no fingerprint" category? Presumably fairly few people use such techniques, so couldn't that make you more trackable, not less?
I'm actually shocked, though maybe I shouldn't be. I'd always assumed that canvas fingerprinting was some theoretical technique that nobody would be evil enough to use, not one in active use in thousands of top websites.
Any good defence against this for Safari on macOS?
(There is this javascript blocker, JS Blocker, but when I last used it, the Safari memory usage would explode every other day, to the extent of rendering the machine unusable unless you managed to kill the process very quickly.)
Hi, I am the author of the post. I run some fingerprinting tests sometimes for research purposes. Nevertheless I don't see which of the tests would ask for any audio or photos permissions. Which browser were you using?
I'm on Firefox mobile (Fennec F-Droid) and didn't get any permission request, on this page or a few of the other recent articles on that website...
The site doesn't use HTTPS, perhaps someone is tampering with your connection? Some ISPs have been known to do that, but not for very nefarious purposes AFAIK.
It can't be changed. You can fake the user agent and JavaScript-accessible properties (Tor Browser does this so every Tor user is indistinguishable), but canvas rendering depends on your actual OS, GPU, etc. Browsers could implement a software renderer to draw canvases consistently on all devices, but then you lose half the reason to use canvas: speed.
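For concreteness, here's a minimal sketch of the technique being discussed. The specific drawing calls and the hash function are illustrative, not taken from the article; real trackers vary both:

```javascript
// Simple FNV-1a string hash, standing in for whatever hash a real
// fingerprinting script uses to compress the canvas output.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// Hypothetical fingerprinting routine (browser-only): draw text and
// shapes, then serialize the rasterized pixels with toDataURL(). Any
// OS/GPU/font-rendering difference changes the pixels, and therefore
// the resulting hash.
function canvasFingerprint() {
  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');
  ctx.textBaseline = 'top';
  ctx.font = '14px Arial';
  ctx.fillStyle = '#f60';
  ctx.fillRect(50, 1, 62, 20);
  ctx.fillStyle = '#069';
  ctx.fillText('fingerprint, <canvas> 1.0', 2, 15);
  return fnv1a(canvas.toDataURL());
}
```

Two machines with identical browsers but different GPUs or font stacks will typically produce different hashes, which is exactly what makes this useful for tracking.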
GPU differences don't show up very well—the OpenGL spec and the D3D "spec" have standardized behavior across GPUs fairly well at this point. What does show up are text rendering differences and the different software vector renderers (Core Graphics vs. Skia vs. Cairo vs. Direct2D).
A lot of Canvas is still in software. I'm working to change that, but it's a work in progress…
Is anisotropic filtering standardized by the D3D or OpenGL specs now? I know in the last decade or so, it was still the case that if you did an image diff of the same scene being rendered with say, 8xAF on both NVIDIA and AMD GPUs, the result would be different since both vendors used a mix of heuristics to get good results. I'm guessing they might've dropped the hacks by now and converged on a known 'good' algorithm. I wouldn't be surprised if mobile GPUs are still cheating even if desktop ones have stopped, though, so you'd be able to detect that with WebGL.
There's also the 'detect driver bugs' approach: there are test suites out there that detect various driver-stack bugs using special shaders, and you could run a set of those to identify the GPU (or even driver) being used to rasterize the WebGL content.
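The image-diff comparison mentioned above could be sketched roughly like this. The helper below compares two RGBA buffers of the kind `gl.readPixels` returns; the function name and tolerance parameter are illustrative assumptions:

```javascript
// Hypothetical sketch: count pixels that differ between two renders of
// the same scene (e.g. 8xAF on NVIDIA vs. AMD). pixelsA and pixelsB
// would be RGBA Uint8Array buffers from gl.readPixels on each machine.
function countDifferingPixels(pixelsA, pixelsB, tolerance = 0) {
  let diff = 0;
  for (let i = 0; i < pixelsA.length; i += 4) {
    // A pixel counts as different if any RGBA channel deviates by more
    // than `tolerance`.
    for (let c = 0; c < 4; c++) {
      if (Math.abs(pixelsA[i + c] - pixelsB[i + c]) > tolerance) {
        diff++;
        break;
      }
    }
  }
  return diff;
}
```

A nonzero count on a scene that the specs nail down exactly would point at vendor-specific heuristics or driver bugs, which is the fingerprinting signal being described.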
Don't we still live in an age where GPU drivers need game-specific fixes in order to work correctly? Being bug-free is a different question from being free of any noticeable difference in output, and we don't seem to have even reached the bug-free stage yet.
I mean, even Firefox's WebRender is only available on NVIDIA GPUs at the start, no?
The driver fixes are usually performance improvements where the vendor rewrites shaders to make the game run better, or adds support for weird vendor features (like SLI). If the vendor had to hack around the game to make it work at all, the developers wouldn't have been able to even test the game during development.
WebRender was locked to NVIDIA as a known target with a known set of driver bugs. They could've chosen AMD or Intel as the target instead and worked to figure out all the relevant driver bugs and worked around them, but for whatever reason they picked NV.
Actually, on Firefox, the 2D canvas is drawn using Skia, a CPU vector renderer. I think Chrome does the same, but I'm not sure. The WebGL canvas uses the GPU hardware, though.
Firefox and Chrome have been using hardware acceleration for Skia for ages. Even if you have HW acceleration shut off, Chrome is typically using SwiftShader to rasterize which basically emulates a GPU instead of doing traditional CPU rasterization.
If you switch off hw accel in Chrome and Firefox your canvas performance will tank.
Can anyone explain how the website reads out the information that the browser fingerprint provides? I get that the first step is to create a specific canvas image, but how does the server know what the result was?
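As I understand it, the drawing happens client-side, JavaScript serializes the pixels, and an ordinary HTTP request carries the result back to the server. A rough sketch, where the function name and the `/collect.gif` endpoint are made up for illustration:

```javascript
// Hypothetical read-out step: turn a canvas data URL into a beacon URL
// the server can log. Real trackers usually hash the data URL first to
// shrink the payload; a raw tail of the string stands in for that here.
function buildBeaconUrl(dataUrl) {
  return '/collect.gif?fp=' + encodeURIComponent(dataUrl.slice(-32));
}

// In a browser, it would be used roughly like this:
// const dataUrl = document.querySelector('canvas').toDataURL();
// new Image().src = buildBeaconUrl(dataUrl); // classic 1x1-pixel beacon
```

Nothing special happens server-side: it just logs the query parameter of an ordinary request, same as any analytics pixel.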