
My company also rolled its own framework to handle the case of 'very large web applications'.

In our case, every component is a file (which is becoming more typical) -- but it differs in that every file dependency is automatically resolved and downloaded in a lazy, on-demand fashion. With HTTP/2, this means you can update a single file without a build process -- and everyone receives the update as a minimal change.
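
For illustration, the lazy resolution could look something like the sketch below (the loader and cache here are hypothetical, not our actual API); each import() maps to its own small, independently cacheable HTTP/2 response:

    // Hypothetical sketch: each component lives in its own file and is
    // fetched only the first time something asks for it.
    const componentCache = new Map<string, unknown>();

    async function loadComponent(path: string): Promise<unknown> {
      // Serve repeat requests from the in-memory cache.
      const cached = componentCache.get(path);
      if (cached !== undefined) return cached;

      // import() pulls in the file's own dependencies the same way,
      // so the dependency graph is walked lazily, on demand.
      const mod = await import(path);
      componentCache.set(path, mod);
      return mod;
    }

    // Usage: nothing under 'checkout/' is downloaded until this runs.
    const checkout = await loadComponent("./components/checkout/Form.js");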

Also, all paths are relative or built dynamically using something like C's #define, so it's easy to refactor by moving entire directories about.
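
One way to picture the #define analogy (the names below are invented): build every path from a few exported constants, so moving a directory is a one-line change:

    // Hypothetical path "defines": all paths derive from these roots.
    export const COMPONENTS_ROOT = "./components";
    export const WIDGETS_DIR = `${COMPONENTS_ROOT}/widgets`;

    export function widgetPath(name: string): string {
      return `${WIDGETS_DIR}/${name}.js`;
    }

    // widgetPath("DatePicker") -> "./components/widgets/DatePicker.js"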




Don't the popular web frameworks handle lazy loading?


Most of the popular frameworks generate a single JavaScript bundle comprising the entire application. Some people break these bundles into multiple parts manually, in a process called 'code splitting', to get a fairly coarse kind of lazy loading.
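
For reference, with webpack the split point is just a dynamic import(); the module path below is made up, and the chunk-name comment is webpack-specific:

    // A dynamic import() marks a split point: webpack emits the admin
    // code as a separate chunk, fetched only when this function runs.
    async function openAdminPanel(): Promise<void> {
      const { AdminPanel } = await import(
        /* webpackChunkName: "admin" */ "./admin/AdminPanel"
      );
      new AdminPanel().render();
    }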


Code splitting via webpack is not a manual process.

Does your hand-rolled framework handle minification and obfuscation? Probably not, because any change would require a rebuild.

Are your assets cached? If they are, I'm curious how you handle the cache busting when a single file gets updated.


With HTTP/2, you can send files down independently, without creating a large bundle, so each file can be cached. You couldn’t obfuscate function names across files, though, unless you made the process deterministic based on the module name.
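
A sketch of what "deterministic based on module name" could mean (the hashing scheme here is invented for illustration): derive each mangled name from a stable hash of (module, original name), so files minified independently still agree:

    import { createHash } from "node:crypto";

    // Illustrative only: both the defining file and every caller can
    // recompute the same short name without a whole-program build.
    function mangle(moduleName: string, originalName: string): string {
      const digest = createHash("sha256")
        .update(`${moduleName}:${originalName}`)
        .digest("hex");
      // "_" prefix plus hex digits is always a valid JS identifier.
      return "_" + digest.slice(0, 6);
    }

    // mangle("cart/totals.js", "computeTotal") -> e.g. "_a3f9c1"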


But you need some cache-busting functionality anyway (otherwise, how does the client know that the file "something.js" has changed?).


ETags could be used to make the browser revalidate the contents without much overhead, since the connection is already established, but good point.
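
As a concrete sketch (a bare Node server; serving a single file for simplicity): hash the file's contents into an ETag and answer a matching If-None-Match with 304, so revalidating an unchanged file costs one round trip on the already-open connection:

    import { createServer } from "node:http";
    import { createHash } from "node:crypto";
    import { readFileSync } from "node:fs";

    createServer((req, res) => {
      // Hypothetical: serve one module file with a content-based ETag.
      const body = readFileSync("./something.js");
      const etag = '"' + createHash("sha1").update(body).digest("hex") + '"';

      if (req.headers["if-none-match"] === etag) {
        // Unchanged: the browser keeps its cached copy; no body re-sent.
        res.writeHead(304, { ETag: etag });
        res.end();
        return;
      }
      res.writeHead(200, { ETag: etag, "Content-Type": "text/javascript" });
      res.end(body);
    }).listen(8080);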



