My company also rolled our own framework to handle the case of 'very large web applications'.
In our case, every component is a file (which is becoming more typical) -- but it differs in that every file dependency is automatically resolved and downloaded in a lazy, on-demand fashion. With HTTP/2, this means you can update a single file without a build process -- and everyone receives the update as a minimal change.
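A minimal sketch of that lazy, on-demand resolution -- the `makeLazyLoader` name and the injectable loader are hypothetical, not our actual API; the point is just that each file is requested independently (cheap over one multiplexed HTTP/2 connection) and only once:

```typescript
type Loader = (path: string) => Promise<unknown>;

// Wrap any loader (e.g. the native dynamic import()) so that each
// file is fetched at most once, on first demand.
function makeLazyLoader(load: Loader): Loader {
  const cache = new Map<string, Promise<unknown>>();
  return (path: string) => {
    if (!cache.has(path)) {
      // First request for this file: kick off the download.
      // Over HTTP/2 each file travels as its own stream, so no
      // bundling step is needed to keep this efficient.
      cache.set(path, load(path));
    }
    return cache.get(path)!;
  };
}

// In a browser you would typically pass the native importer:
//   const lazy = makeLazyLoader((p) => import(p));
```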
Also, all paths are relative or built dynamically using something like a C #define, so it's easy to refactor by moving entire directories around.
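Something like this, roughly -- `APP_ROOT` and `modulePath` are illustrative names, but they show the #define-style idea: one constant to edit when a directory tree moves.

```typescript
// Single point of truth for where the app lives, analogous to a C
// #define. Relocating the whole tree means changing one line.
const APP_ROOT = "/static/app";

// All module references are built relative to APP_ROOT rather than
// hard-coding absolute paths throughout the codebase.
const modulePath = (rel: string): string => `${APP_ROOT}/${rel}`;
```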
Most of the popular frameworks generate a single JavaScript bundle that comprises an entire application. Some people break these bundles into multiple parts manually, in a process called 'code-splitting', to get a fairly coarse-grained kind of lazy loading.
With HTTP/2, you can send files down independently, without creating a large bundle, so each file can be cached. You couldn’t obfuscate function names across files though unless you made the process deterministic based on module name.
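To make the deterministic part concrete, here is one way it could work (a sketch, not any particular minifier's scheme): derive the obfuscated name from a hash of module name plus original symbol, so independently built files all agree on the same short name.

```typescript
// Deterministic short-name generation: the same (module, symbol)
// pair always maps to the same identifier, with no shared build
// state, so files minified separately stay link-compatible.
function stableShortName(moduleName: string, symbol: string): string {
  const input = `${moduleName}#${symbol}`;
  // FNV-1a 32-bit hash: simple and deterministic.
  let h = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    h ^= input.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  // Base-36 with a leading underscore so it is a valid identifier.
  return "_" + h.toString(36);
}
```

The trade-off versus a conventional minifier is name length: globally counted frequency lets a bundler hand out one-letter names, while a hash-based scheme pays a few extra characters for independence.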
ETags could be used to cause the browser to revalidate the contents without too much overhead, since the connection is already established, but good point.
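The server side of that revalidation is a one-line check; this sketch (the `etagResponse` name is made up for illustration) shows why the revalidation round-trip is so cheap on an already-open connection -- a match sends back a bodyless 304:

```typescript
// Decide how to answer a request carrying an If-None-Match header.
// A matching ETag means the client's cached copy is still current:
// respond 304 Not Modified with no body at all.
function etagResponse(
  currentEtag: string,
  ifNoneMatch: string | null,
): { status: number; sendBody: boolean } {
  if (ifNoneMatch !== null && ifNoneMatch === currentEtag) {
    return { status: 304, sendBody: false }; // tiny response, cache reused
  }
  return { status: 200, sendBody: true }; // changed (or first fetch): send the file
}
```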