It doesn’t become true just because you keep repeating it. The point was always to allow for an API where you only had to think about how to render one snapshot, and not care about what the previous snapshot looked like. Virtual DOM made that possible, but it was never more performant than just writing the resulting DOM manipulations manually.
Then let an old fogey make it clearer: browsers recalculate layout (reflow) if you read after a write. This required batching writes separately from reads if you wanted decent performance on older browsers, and especially on older machines.
Reading an element property, adding/updating a nearby element, and then reading another element's property took FOREVER. Enter the virtual DOM. Since it did not engage the actual rendering engine, the reads and writes did not trigger reflow. At the end of the code segment, the actual DOM actions became effectively write-only. Even though the virtual DOM was slower per access than the actual DOM, the end result was a MASSIVE speed up.
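The batching idea above can be sketched in a few lines of plain JS, independent of any framework. This is a minimal illustration of the technique (in the spirit of libraries like fastdom), not any virtual DOM's actual implementation; the names `scheduleRead`, `scheduleWrite`, and `flush` are made up for this example.

```javascript
// Separate queues for reads and writes. In a browser, flush() would
// typically run inside requestAnimationFrame, so the layout engine is
// touched once per frame instead of once per read/write pair.
const readQueue = [];
const writeQueue = [];

function scheduleRead(fn) { readQueue.push(fn); }
function scheduleWrite(fn) { writeQueue.push(fn); }

function flush() {
  // All reads run first (layout is computed at most once for all of them),
  // then all writes run, leaving the DOM in a "write-only" state.
  readQueue.splice(0).forEach((fn) => fn());
  writeQueue.splice(0).forEach((fn) => fn());
}
```

Even if a write was queued before a read, the read still executes first at flush time, which is exactly the reordering that avoids the read-after-write reflows described above.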
This message brought to you by someone who honed their skills for a decade to batch their reads and writes in vanilla JS only to have those new-fangled frameworks take care of it (and data binding) for you. Jerks.
So what you're saying is that at the granularity of a single tick a VDOM increases performance significantly due to not having to wait for the browser to recompute the DOM after writes, correct? It effectively batches writes, and thus the need for the renderer to get involved, which increases read throughput, because reads block until after the DOM is recomputed. And the DOM is recomputed on every write that's followed by a read.
Makes a lot of sense, thanks for the input; I was completely unaware of this case. Any idea if this is still the case? Do you happen to remember which browsers and/or hardware saw dramatic improvements (CPU gen would be great)? I'm thinking of doing some deeper perf investigation/spelunking on the subject to satisfy my curiosity. I remember things one way, but a lot of people seem to think the opposite.
The view is recalculated/re-rendered/repainted. The DOM is the single-threaded-access data structure that the rendering engine ties into.
Part of the browser API is querying all current CSS properties of an element, e.g., getComputedStyle(…). The only way to get this is by having the layout engine do all its work, so properties like height and width can return accurate info.
Most virtual DOM implementations just skip parts of the API like this. At best, they make an educated guess without hitting the actual renderer. Or they allow a pass-through to getComputedStyle(…) and warn you away from using it for performance reasons.
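When you do have to measure elements, the same discipline applies: do all the measuring before any mutating, so the layout engine is forced to run at most once rather than once per element. A tiny sketch of that pattern, with a hypothetical helper name; in a browser, `measure` might call getComputedStyle(el) and `mutate` might set el.style properties, and interleaving them per element would force a reflow on every iteration.

```javascript
// Hypothetical helper: run a read phase over all elements, then a write phase.
function measureThenMutate(elements, measure, mutate) {
  // Read phase: all measurements happen before any DOM mutation.
  const measurements = elements.map((el) => measure(el));
  // Write phase: mutations can now use the cached measurements freely.
  elements.forEach((el, i) => mutate(el, measurements[i]));
  return measurements;
}
```

The point is the shape, not the helper itself: two passes over the same collection, with the renderer only invalidated during the second pass.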
It's all smoke and mirrors atop a bed of lies.