Exactly. Think of it as the classic compiled vs. interpreted languages debate. One is clearly faster than the other, but the slower one may have different advantages.
Ahead-of-time vs. just-in-time compilation is a more accurate description.
In this case, there's no evidence that precompiling is faster in theory -- let alone in practice. In the JS framework benchmark suite, SolidJS and InfernoJS performance is almost identical (with SolidJS having a much larger margin for error in most tests).
This is the BEST possible case for precompiling, too. In the real world, JITs take a long time to warm up (a couple hundred executions before all the optimizations kick in). With the vDom, you warm up ONE set of code that then runs forever. With precompiled output, EVERY new component has to warm up separately, along with potentially slightly different codepaths within the same component.
The JS framework benchmark reuses the same components for everything, which is a huge advantage for precompiled frameworks while having little impact on vDom ones (the actual components in both cases usually won't optimize much anyway, since they're polymorphic).
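To make the warm-up point concrete, here is a rough sketch in TypeScript. It is purely illustrative -- neither framework's actual code -- but it shows the shape of the two approaches as a JIT sees them: one shared diff path versus one emitted updater per component.

```typescript
// Illustrative only -- this is neither InfernoJS nor SolidJS code, just the
// shape of the two approaches as a JIT sees them.

type VNode = { tag: string; props: Record<string, unknown>; children: VNode[] };

// vDom style: ONE generic function handles every component. After a few
// hundred calls the JIT has optimized this single code path, and it stays
// hot no matter how many distinct components flow through it.
function diff(prev: VNode, next: VNode): void {
  if (prev.tag !== next.tag) { /* replace the node */ }
  for (const key of Object.keys(next.props)) {
    if (prev.props[key] !== next.props[key]) { /* patch the attribute */ }
  }
  next.children.forEach((child, i) => {
    if (prev.children[i]) diff(prev.children[i], child);
  });
}

// Precompiled style: the compiler emits a SEPARATE updater per component.
// Each one is a distinct function to the JIT, so each must run a few
// hundred times on its own before it gets the same optimizations.
function updateHeader(el: HTMLElement, title: string): void {
  el.querySelector("h1")!.textContent = title;
}
function updateRow(el: HTMLElement, label: string, selected: boolean): void {
  el.querySelector("td")!.textContent = label;
  el.classList.toggle("selected", selected);
}
```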
> In this case, there's no evidence that precompiling is faster in theory -- let alone in practice.
It’s absolutely not, because it requires far greater computational effort. The benefit has nothing to do with performance but instead with simplified state management.
I know people prefer certain frameworks for how they perform state management. I have never really understood that motivation myself, though, because managing state is incredibly simple. Here is a basic outline of how simple it is (sketched in code after the list):
1) realize there are exactly two facets to every component: data and interface.
2) all components should be stored in common locations: a single object stores all component data, and a common DOM node stores all component interfaces.
3) pick one component facet to update, either data or interface, and never update the other directly. The other should be updated automatically by your application logic (reflection).
4) store your data on each change. That could be dumping it into local storage. In my current app I send the data to a local node instance to write into a file so that state can be shared across browsers in real time.
5) be able to restore state. On a fresh page, or even a page refresh, simply grab the stored data and rebuild the component interfaces from it.
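For anyone who wants that outline concretely, here is a minimal sketch in TypeScript. All names are illustrative, and persistence uses localStorage rather than a node instance, but the shape is the same:

```typescript
// Minimal sketch of the outline above. Everything here is illustrative,
// not taken from any framework or from the parent's actual app.

// 1) exactly two facets per component: data and interface
type ComponentData = { id: string; state: Record<string, unknown> };

// 2) common locations: one object for all data, one DOM node for all interfaces
const store: Record<string, ComponentData> = {};
const root = document.getElementById("components")!; // assumed container node

// 3) update the data facet only; the interface follows by reflection
function setState(id: string, patch: Record<string, unknown>): void {
  const component = store[id] ?? (store[id] = { id, state: {} });
  component.state = { ...component.state, ...patch };
  render(component); // reflect the data change into the DOM
  persist();         // 4) store the data on every change
}

function render(component: ComponentData): void {
  let node = document.getElementById(component.id);
  if (!node) {
    node = document.createElement("div");
    node.id = component.id;
    root.appendChild(node);
  }
  node.textContent = JSON.stringify(component.state); // stand-in for real markup
}

// 4) persistence -- localStorage here; posting to a local node instance
// that writes a file works the same way, just asynchronously
function persist(): void {
  localStorage.setItem("app-state", JSON.stringify(store));
}

// 5) restore: on a fresh page or a refresh, rebuild every interface from the data
function restore(): void {
  const saved = localStorage.getItem("app-state");
  if (!saved) return;
  Object.assign(store, JSON.parse(saved));
  Object.values(store).forEach(render);
}
```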
My current application is a peer-to-peer, Windows-like GUI that works in the browser and exposes the file system (local device and remote devices) in Windows Explorer-like interfaces. Managing state is the least challenging part of this. The slowest-executing parts are long-polling operations against large file system trees (it’s about as slow in the native OS interface).
> In the JS framework benchmark suite, SolidJS and InfernoJS performance is almost identical (with SolidJS having a much larger margin for error in most tests).
I mean, I agree with most of your post, but I'm not sure I would necessarily make that highlighted claim from the benchmark results. The +- seems to be pretty run-dependent for most libraries on there. And while I agree that the difference in performance is negligible, there is one: Solid is clearly faster in most tests, even if by a small amount. Anyone interested can look at https://krausest.github.io/js-framework-benchmark/current.ht... then isolate Solid and Inferno and compare one against the other. It will color-highlight the degree of certainty that the difference between the libraries is statistically significant.