My hunch is that this browser isolation product is a lot like an experiment I built solo some years ago and eventually got bogged down in.
Is the browser isolation product a "modern web proxy" that presents, say, a social media management site (like Tweetdeck) or an internal app via a rewritten URL, rewriting the HTML, JS, and HTTP headers to hide authentication from the user?
If so - wow, getting that to work performantly requires deep understanding at every level of the HTTP stack. I had a web proxy written in Haskell some years back that implemented online, nested HTML and JS rewriting and injection, and I was so close to getting modern web apps like Facebook and Gmail to render perfectly. Server-side-rendered sites were feature complete, but sites using a virtual DOM required carefully wrapping DOM builtins like createElement and attribute setters, and that surface area is quite large.
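To give a flavor of what wrapping those builtins looks like: here's a minimal sketch of injected client-side code that intercepts createElement and setAttribute so URL-bearing attributes get routed through the proxy. The proxy endpoint and attribute list are hypothetical, and a real implementation would also have to cover innerHTML, property setters like el.src, fetch, XHR, and much more.

```javascript
// Hypothetical proxy endpoint that fetches and rewrites a target URL.
const PROXY_PREFIX = "https://proxy.example.com/fetch?url=";

function rewriteUrl(url) {
  // Only rewrite absolute http(s) URLs; relative URLs resolve against the
  // proxied document's (already rewritten) base.
  return /^https?:\/\//.test(url) ? PROXY_PREFIX + encodeURIComponent(url) : url;
}

// Attributes that carry URLs (a small subset, for illustration).
const URL_ATTRS = new Set(["src", "href", "action"]);

function proxyDocument(doc) {
  const origCreate = doc.createElement.bind(doc);
  doc.createElement = function (tag) {
    const el = origCreate(tag);
    const origSet = el.setAttribute.bind(el);
    el.setAttribute = function (name, value) {
      origSet(name, URL_ATTRS.has(name.toLowerCase()) ? rewriteUrl(value) : value);
    };
    return el;
  };
  return doc;
}
```

In a real deployment this snippet would be injected at the top of every proxied page, before any of the site's own scripts run, so that virtual-DOM frameworks building elements programmatically still go through the wrappers.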
For sites using chunked transfer encoding, it worked on a chunk-by-chunk basis, as opposed to blocking until the entire document had been transferred to the proxy. The additional latency was just the time to parse a chunk. Orders of magnitude better perf than any of the web proxies that many of us on HN used as kids to subvert school network restrictions, and much better compatibility.
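The chunk-at-a-time approach can be sketched roughly like this: rewrite each chunk as it arrives, holding back only a small tail in case a tag is split across a chunk boundary. The regex-based rewrite and the proxy prefix are simplifying assumptions (my actual proxy used a streaming HTML parser, not a regex), but the buffering shape is the point.

```javascript
// Hypothetical proxy endpoint, as before.
const PREFIX = "https://proxy.example.com/fetch?url=";
// Toy rewrite rule: absolute URLs inside common URL-bearing attributes.
const URL_ATTR = /\b(src|href|action)="(https?:\/\/[^"]*)"/g;

function makeChunkRewriter() {
  let carry = ""; // unfinished tail held over from the previous chunk
  const rewrite = (s) =>
    s.replace(URL_ATTR, (_, attr, url) => `${attr}="${PREFIX}${encodeURIComponent(url)}"`);
  return {
    write(chunk) {
      const text = carry + chunk;
      // Hold back everything after the last '>' -- it may be a split tag.
      const cut = text.lastIndexOf(">") + 1;
      carry = text.slice(cut);
      return rewrite(text.slice(0, cut));
    },
    end() {
      const rest = rewrite(carry);
      carry = "";
      return rest;
    },
  };
}
```

Each call to write() returns output that can be forwarded to the client immediately, so latency is bounded by the size of one chunk rather than the whole document.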
It's a little sad my project never saw a release, but I would be very excited to see a company ship something very similar or even better.
> S2 Systems NVR technology intercepts the remote Chromium browser’s Skia draw commands, tokenizes and compresses them, then encrypts and transmits them across the wire to any HTML5 compliant web browser (Chrome, Firefox, Safari, etc.)
Oh, so this is very, very different, and it would be unfair to say one is better than the other. They're transmitting raw drawing commands, à la RDP or X forwarding, which gives pixel-perfect rendering and full website compatibility by running the JS in a remote browser instance.
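Conceptually, the record-and-replay side of that pipeline might look something like the sketch below: draw calls are tokenized into compact [opId, args] tuples on the server, serialized across the wire, and replayed against a local canvas context. The command set, the JSON encoding, and the function names here are my assumptions for illustration, not S2's actual protocol (which presumably uses a far denser binary encoding plus compression).

```javascript
// A tiny, assumed command vocabulary; real Skia has vastly more.
const OPS = ["fillRect", "strokeRect", "fillText", "beginPath", "lineTo", "stroke"];
const OP_ID = Object.fromEntries(OPS.map((op, i) => [op, i]));

// Server side: a recording stand-in for the drawing context.
// Every supported method call is tokenized as [opId, args].
function makeRecorder() {
  const log = [];
  const ctx = {};
  for (const op of OPS) {
    ctx[op] = (...args) => log.push([OP_ID[op], args]);
  }
  return { ctx, serialize: () => JSON.stringify(log) };
}

// Client side: replay the wire format against a real context,
// e.g. a CanvasRenderingContext2D from an HTML5 <canvas>.
function replay(wire, target) {
  for (const [opId, args] of JSON.parse(wire)) {
    target[OPS[opId]](...args);
  }
}
```

The appeal of shipping commands instead of pixels is that a frame of vector-ish draw calls is usually far smaller than a video frame, and the client can rasterize at its native resolution.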
I wonder how that impacts accessibility, screen readers, and ARIA. If they're rendering into a Canvas element used as a framebuffer, I have to imagine that support is lost.
This sounds a lot like gameplay recording software I've worked on in the past, built on apitrace[1]. You can tokenize, compress, and transmit recordings of graphics API calls, then re-render them at variable resolution in post. I hope S2 Systems/Cloudflare haven't attempted to patent this concept.