Actually, it’s a story in itself that a problem noted by an older article is still relevant enough to be discussed. In the 6 months since, has the industry even looked at this or is it even worse?
A fast-moving industry needs equally fast solutions. If 6 months can pass and people are still talking about a problem as if it were new, then not enough has happened. Some ideas for more that could be done: talk to your web-developer friends; have them talk to their managers; start publicly shaming bloated web sites; submit issues and/or pull requests to web browser projects with ideas for how to fix HTML; or something.
The web isn't nearly as fast-moving as you imply it is.
Websites have always been too big. We've been using huge JavaScript libraries for the last ten years to add minor functionality.
Even using Flash for websites was a common standard!
The problem is that speed and usability are almost always considered less important than flashy animations and other bullshit that no end user actually wants.
I think this post will be relevant for years to come. I don't think web developers will ever learn.
Perhaps, but HN's rule on dupes is clear and this is well within the margin. "If a story has had significant attention in the last year or so, we kill reposts as duplicates."
While a nice article that I do agree with, it's not exactly accurate.
90% of website size is media (images, audio, video, etc). Discounting that and saying that a 1MB page is so much bigger than just the text it delivers is silly and doesn't really make a point. Maybe it's unnecessary, maybe not, but that's subjective and most people would rather have images and decent UI.
It's true that ads are another factor, and it's slowly being solved, but ultimately comes down to the wrong incentives across the industry. The more annoying/heavy ads work better and pay more so everyone from advertisers to agencies to publishers will continue to optimize toward them until something changes (which might be adblocking). This isn't a technical problem so much as business and politics.
However, the publishers these days are pretty short on tech talent and mostly use off-the-shelf CMSes like WordPress (which are already bad software) and then just layer on their plugins, themes, and ads to create the mess we have today. Again, there's no easy way to solve that without talent either on the pub side or in the platforms themselves. Some progress is being made here with things like Facebook Instant Articles and medium.com hosting.
Overall, the web definitely has a cruft problem, but it's not really that bad considering all the various channels of information access, and it's all slowly getting better.
> 90% of website size is media (images, audio, video, etc).
That's a commonsense assumption that, in my understanding, used to be true. But the article goes to great lengths to point out that it isn't true anymore. Some of the bloated multi-megabyte examples don't include any media at all.
If you look at what the unblocked version pulls in, it’s not just videos and banner ads, but file after file of javascript. Every beacon, tracker and sharing button has its own collection of scripts that it needs to fetch from a third-party server.
The javascript alone in "Leeds Hospital Bosses Apologise after Curry and Crumble On The Same Plate" is longer than Remembrance of Things Past.
There are lots of bloated and slow ad networks. It's not that they couldn't make faster tech, but that it's not a priority. Same with publishers, who don't have enough skills/resources to get it done and would rather focus what little they have on making sure the site and ads work in the first place.
> Maybe it's unnecessary, maybe not, but that's subjective and most people would rather have images and decent UI.
The rise and success of cruft-stripping-as-a-service offerings like Readability, Pocket, Instapaper, etc., seems to indicate that a not-insignificant segment of consumers think that a "decent UI" is _less_ UI.
Those have several attractions. Stripping the cruft from pages is a large part of it.
Simply having a consistent typography, font selection, size, and readable contrast is a huge benefit. Web design isn't the solution, Web design is the problem.
(And asstastic browser defaults are another problem: the fact that an absolutely unstyled page isn't perfectly readable and acceptable is a huge flaw. I blame the browser vendors and W3C for this.)
A tool which manages a large written archive is another hugely useful aspect. Tabs and bookmarking both suck.
Also: best I can tell, Readability seems to have died as far as any visible development or company activity goes. There's been none in years, though the service itself works for now. I'm in the process of moving 2000 articles to Pocket (Instapaper lacks necessary features IMO).
It's worth noting that Pinboard.io itself addresses some of these needs.
It is, but many times I end up using Pocket or Instapaper because the text is presented in a way that is simply unreadable for me, and neither Firefox's Reader Mode nor Clearly (when I use Chrome) helps.
That's the only way I've ever used them and there are a bunch of send-to-Kindle services based on them so I feel like I can't be alone here. I'm not sure how you'd go about quantifying this though.
For what it's worth, iOS Safari has a directly available "Reader View" button that bumps you to a cleaned up view immediately with a single tap. The built in "Add to Reading List" is buried behind the share sheet.
I usually have a fast connection and reasonable data limits available, so I don't worry too much about how much stuff my browser needs to pull down to render a website. However, there's another type of bloat that absolutely kills my browser and that is all of the JavaScript.
One of my computers is a netbook-ish Lenovo x131e from a few years ago. It runs everything I need it to very well, except for web browsing in Firefox. Many pages cause the browser to completely choke for a few seconds while the JavaScript does its thing.
I finally broke down and installed NoScript and it feels like I'm browsing the web on a brand new computer! Pages load fast and render quickly. If JavaScript is required, I can enable it on a site-by-site basis. There's also the privacy and security benefits, but my main issue was performance.
I used to think that people who browse with JavaScript disabled were being silly, but now I understand why someone would want to do that.
I know that part of the problem is Firefox and part of it is the under-powered computer. Browsing the same sites in Safari on a 6-year-old MacBook Pro isn't nearly as painful and, generally speaking, I leave the JavaScript alone on that computer.
As someone with a lightweight Acer, I do exactly the same. Amazon, for example, literally brings my machine to a halt. Turn off JS and bam, it's fast again.
In fairness, your 6yo MBP probably has a much faster CPU and more RAM than the newer netbook... It's a hard sell, especially now that even newer cell phones are starting to have beasts of CPUs...
But I agree, JS is a huge issue... I notice it most on click-bait type sites on my cell phone. It's really pretty sad, all things considered.
You're absolutely right. It's certainly not a fair comparison at all, they're very different machines and I was satisfying different needs when I purchased each one.
Great stuff, I watched the video presentation too. The audience clapped at the part where he shows the pyramid illustration of the web - HTML followed by huge chunk of crap, then surveillance at the top. Obviously he is not alone in recognizing the problem.
I'm glad to hear Google are planning to label slow sites. We need this. Bloated websites need to be held accountable.
On mobile devices too, we need a browser option to stop loading any further data for a given site after a defined point. So if the browser has received 3 megabytes of a page so far, it stops and asks the user whether to continue with downloading. It might say how many cents this one website has cost them so far (if the user has setup this feature).
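A rough sketch of how such a per-site data budget could behave (the function names and thresholds here are purely hypothetical; no current browser exposes anything like this):

```javascript
// Sketch of the per-site byte budget idea above. All names are
// hypothetical -- this is not a real browser API.
const MB = 1024 * 1024;

function makeByteBudget(limitBytes, onLimit) {
  let used = 0;
  return {
    // Called with the size of each chunk as it arrives;
    // returns true to keep downloading, false to pause.
    consume(chunkBytes) {
      used += chunkBytes;
      if (used > limitBytes) {
        // Over budget: pause and ask the user instead of
        // silently continuing to burn their data plan.
        return onLimit(used);
      }
      return true;
    },
    get used() { return used; },
  };
}

// Example: stop after 3 MB unless the user opts in.
const budget = makeByteBudget(3 * MB, (used) => {
  console.log(`Paused after ${(used / MB).toFixed(1)} MB; continue?`);
  return false; // default to stopping
});
```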
Fair enough "modern web, modern features" but most of the modern features we enjoy are improvements to browsers, servers and javascript. There's no reason why this can't mean keeping page size steady while enjoying new features.
A lot of it is library cruft... Angular + jQuery + jQuery UI, and all its components (used or not) and all the related dependencies... and then the iframes for ads, loading in their own different version of jQuery, and related... etc, etc...
Calling this a "Crisis" is definitely blowing things out of proportion. True, websites have grown a lot over the past decade, but so has functionality: take a look at your favourite websites on the Wayback Machine and see how god-awful most of them looked just 6-7 years ago, and compare the functionality they provided with what we have now.
Yes, I agree some websites really do need to go on a diet; loading unused CSS, JS, and multimedia has to stop. But 1-2MB is not a reason anyone should be throwing their hands in the air screaming CRISIS!! I would definitely applaud anyone who spends their time optimising their websites to be as small as they can be, but Jesus Christ, 2MB and we have a crisis... no way!
I for one think this is a sign we are making progress, and a very good thing: we are improving as a community and we are getting more out of our web.
Aside from the fact that it's impressive to make flat-styled animations that big in gif form, I think the people behind medium.com are just as much to blame for not giving feedback to the uploader about the size of the attached images, nor offering a non-destructive optimising pass or serving the gifs reencoded as .WebM for browsers that support it.
I just got a text this morning from my phone company: "You've exceeded your monthly 500MB limit". It's probably not a crisis, but it has some quite real economic consequences if I have to download 2MB worth of nothing whenever I click a link.
Yes, Adblockers to the rescue. But you know, Apple doesn't support Adblockers on an iPhone 5 for whatever stupid reason.
See, the main problem then is not whether websites are obese or paper thin; the question is whether websites and our browsers are doing a good job of using already-existing features to help users.
I think we should be complaining about developers and admins who don't enable gzipping; we should be angry that some websites don't even bother setting headers to cache static resources. We should not be throwing our hands up in the air screaming crisis because our web is becoming more useful.
"Useful" doesn't need to mean bloated. You seem to miss the point that many websites these days are bloated even basic articles. We don't need a V8 engine and oversize tyres to go buy milk and bread.
You're forgetting that mobile devices quickly run out of cache memory. This includes iPads. You might visit a news site once every few days, but quite often the same resources need downloading again and again because your iPad has not kept them in cache memory because it ran out already. Desktop browsers have a lot more capacity to keep web resources cached.
Not only that, but RAM memory is also limited on mobile devices. Forget about expecting multiple websites to remain in memory across numerous tabs. My iPad3 can pretty much handle only one tab open. If I switch tabs to another website, then switch back, the whole page loads again. A less bloated web would help this issue.
And then there's accessibility for low-speed areas. Did you even read the article?
Most static servers support the If-Modified-Since header, as well as other revalidation mechanisms... Cache sizes, specifically on mobile, are very small and ineffective. More often than not it's a very small HTTP request to recheck whether a cached file is still valid.
Optimizing images is a pretty huge thing, as is enabling gzip compression on the streams... these two alone add up to far more savings than setting cache headers, not that those can't or shouldn't be set.
That's a problem with your phone company, not websites; 500MB really isn't enough for using the modern web.
Also, since when exactly did websites have a mandate to economise to the scale of mobile data? It's always been an edge case unless you are specifically targeting it.
> Also, since when exactly did websites have a mandate to economise to the scale of mobile data? It's always been an edge case unless you are specifically targeting it.
Trends towards mobile-first have been a thing for a while. Especially in developing countries where mobile users outweigh desktop users.
Targeting desktop is slowly becoming the edge case for anything targeting a mass audience. Look at a lot of the recent high-value tech companies. Do Uber/Snapchat/Whatsapp/Instagram target desktop or mobile first?
Where are users in general spending the majority of their time? On a desktop/laptop, or on a phone/tablet?
Mobile-first is designing for a mobile interface. I quite clearly stated "economise to the scale of mobile data". That is, supporting users on heavy mobile data restrictions.
Users, regardless of device used, are primarily on wifi, and that is what is designed for. In modern countries most users don't have data restrictions low enough to be hit by ordinary websites, so going into the future this will not be a problem.
Which countries would that be? As far as I know there are strict data restrictions basically everywhere (apart from legacy plans). You get about 10 of those "modern" websites a day with 1 GB which is for example what US's Sprint "Unlimited" plan offers you.
Most of Asia and Europe. The American plans are so expensive and restrictive they don't reflect the current state of the technology at all. That's what's causing all this 'crisis' scaremongering. 1GB is a fairly basic plan over here and for that you can expect to have to watch your usage.
I was in Hong Kong last year and I was looking at limited data plans then.
I'm in the UK and the only network offering unlimited data is so slow you'll struggle to use more than 3GB in a month.
I've since given in and am just paying through the nose for 16GB/month, so that I don't have to watch my data usage. (Un?)fortunately, the network is so fast I've found my data usage is already around 9-10GB/month, so I'm going to have to start watching it again soon.
Then please tell that to my German and French mobile providers. While cheaper than their US counterparts, I still have to watch my usage. China also has basically the same prices (not adjusted for income parity!). I know that, especially north and east of Germany, the plans are significantly less restrictive, but that's far from universal.
Apart from laziness there is no reason for those websites to be so large.
Apart from greediness there is no reason for data to be so expensive...
Sure, there's always room for a certain amount of optimization and good practice, but it's insane to rely on that instead of fixing the problem. The growth in page size has been fairly linear and predictable.
The real headline here should be "German and US carriers cannot keep up with natural growth in technology"
So tell me: Out of a 2 MB website with 300 HTTP requests, what do you usually get as a visitor?
You make it sound as if the natural growth in technology is what makes these websites huge. I run a web app which is up to date technology-wise. It has a responsive design with @2x and @3x images. It uses JS in a sensible manner. It's fast and easy to use. Top-notch technology. Yet the average request sums up to 200KB, and 60KB of that is a custom font.
This is not about the natural growth in technology; it is about the fact that 50-70% of the 2MB is entirely worthless to the user, because it's usually ads, trackers, wrongly compressed images (ImageOptim does wonders here) and 10 JS frameworks used in parallel.
It's almost as if you're saying: See, img tags now support the width attribute, so let's upload a 10MB photo and just tell the browser to resize it to 100px, because technology.
I'm not picking through all that hyperbole and condemnation of every dev that isn't you to try to maintain a reasonable argument.
So I go with a "those in glass houses" approach:
The only reason I can't see the terrible quality of your website's photos on my phone is that all the content is so damn small and zoomed out that I can barely make out the text, never mind the picture pixels. I don't really want to zoom in because the shade of lime green you've chosen is so jarring it'll likely give me a migraine if it fills the screen.
500 MB could be enough. As I said, if I need a bigger plan, unoptimized websites have economic consequences. I hear mobile-first is a thing. It doesn't seem too far-fetched to assume that users, and thus potential customers, might be interested in fast-loading websites.
See, what this boils down to is this: people use adblockers because they're tired of all that shit consuming their bandwidth (at least on mobile). Content publishers complain about people using adblockers, but they're not willing to get the technical side of their content platform right in order to accommodate their own visitors. They cry foul but are obviously part of the problem.
Unfortunately, I don't get to see what links to avoid (size-wise) before I download all the bloat and waste my bandwidth.
Yeah, 500 MB could be enough if technology regressed (which it isn't going to). You're going to need a bigger plan regardless, as 500 MB is nothing and mobile internet features are becoming richer and more ubiquitous.
The growth of website size has been fairly steady, if anything the growth has flattened over the last year or so. It is up to the infrastructure to keep up with that.
In the last year or two we've had the introduction of 4G, with 2-4 times the speed of 3G, and you expect to have the same usage allowance?
For people in the developed world with good technology infrastructure, yes.
But it's also quite inconsiderate to tell people to upgrade their dataplan (which for some people might be a financial limit) instead of trying to think of better ways to reduce cruft and unnecessary data transfers.
It's quite inconsiderate to tell developers to optimize for outdated bandwidth caps (which for some may mean extra time, cost, etc. that they don't have).
You should expect to have to increase your data plan every few years (because technology); whether that cost is absorbed by you or your phone company is a conversation to have with them.
> True, websites have grown a lot over the past decade, but so has functionality: take a look at your favourite websites on the Wayback Machine and see how god-awful most of them looked just 6-7 years ago, and compare the functionality they provided with what we have now.
Well, the article disagrees that this is always what is going on, for example:
> Here's the PayPal website as it looks today. The biggest element on the page is an icon chastising me that I haven't told PayPal what I look like. Next to that is a useless offer to 'download the app', and then an offer for a credit card. I can no longer control the sort order, there are no filter tools, and you see there are far fewer entries visible without scrolling.
and
> It's like we woke up one morning in 2008 to find that our Lego had all turned to Duplo. Sites that used to show useful data now look like cartoons. Interface elements are big and chunky. Any hint of complexity has been pushed deep into some sub-hamburger.
Of course many things have improved; today you can make more with less, a lot more. But all of that can easily be undone by just piling on a bunch of hi-res stock photographs or even video, 20 ad networks and 5 tracking scripts. And one might also consider the memory footprint and CPU usage of multiple open tabs, and how lean, efficient pages and bloated crap alike get put into the browser cache, which is of finite size and flushes out older things to make room. Then multiply that by the number of visitors. Crisis or not, it's a LOT of waste if you really think it through.
That is true. However, that does not describe all webpages, and I think implies a false dichotomy. There are plenty of sites which use modern functionality to deliver an experience that did not exist a decade ago.
It strikes me that my browsing experience has not seen a net improvement in the past 6-7 years. While the functionality of websites is undoubtedly better, the ad-swamp is definitely greater too. I call it a draw. For all the content/UI improvement, there is a corresponding increase in advertising encroachment. If we assume that in tech things are supposed to improve over time, 6-7 years of stasis is a regression versus expectations.
Not to mention... I remember the dial-up days. We were all worrying about pages being too big and taking too long to load then, although "too big" was obviously a different size and had more to do with images than scripts. It's not surprising that people are going to use the resources available to them.
Just took over a website for a paying client, a local small business. Found out their previous developer was some rockstar guy; the thing was packed with frameworks and nuttiness.
Built a scraper and converted it to something using Markdown in a weekend; I barely used any of the previous dev's "code."
Nice. There's certainly money to be made for making crappy slow websites better and faster. The trick is convincing clients their site looks and performs like someone's homework assignment.
All of this is unnecessary, because it's actually possible to build lightweight, fast, and yet beautiful websites. The available tools aren't the problem. The problem is the people who design their garbage collectors called websites.
Someone in another thread suggested to colorize the address bar based on bytes transferred and/or requests made. Maybe it'll wake people up if they visit a website and it has some nice dark red color to show how much bandwidth is wasted.
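The byte-to-colour mapping could be as simple as something like this (thresholds are arbitrary and purely illustrative; no browser implements this):

```javascript
// Sketch of the "colour the address bar by page weight" idea.
// The thresholds are made up for illustration.
function weightColor(bytes) {
  const MB = 1024 * 1024;
  if (bytes <= 0.5 * MB) return 'green';  // lean page
  if (bytes <= 2 * MB) return 'orange';   // getting heavy
  return 'darkred';                       // bloated
}

console.log(weightColor(300 * 1024));      // → green (a lean article)
console.log(weightColor(5 * 1024 * 1024)); // → darkred (a typical offender)
```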
Your observation that it's possible to build xyz type of websites does not adequately discount the necessity of such features.
I'm talking about improving those lightweight and beautiful websites even further. I implement some of those features on every project, to deliver the maximum content & experience quality in every network and device condition possible.
You can't seriously try to shoehorn the entire platform into whatever your vision of beauty is. Please keep an open mind about a complex future medium that can effectively accommodate Netflix, YouTube, Facebook, Wikipedia, Photoshop, OpenOffice, Google Maps, Skype, AutoCAD, Grand Theft Auto VR, ...
Point taken, and I'm sure you (and I, for that matter) would benefit from your suggestions. But it wouldn't change anything for the existing bloated websites, because they could be written in a lightweight way but aren't.
Fun fact: Despite the hyper-cautious (2015) in the headline here, the AMP page described, the one that Google said they were going to fix, still redownloads the same video file every 30 seconds, thus making it, still, theoretically unbounded in page size.
The NPR page about ad-blockers which was 12 megabytes without an adblocker and 1 megabyte when using an adblocker is now 1.4MB with an adblocker and "only" 1.9MB without -- to display about 1,100 words.
The medium article that was over a megabyte seems to have removed the pointless 0.9MB invisible image.
The website loads in 9 seconds (Chrome 50); the processing of the (peg$parseKeyword) functions takes 4.5 seconds of that loading time, which causes the gap in the waterfall.
One look at the function would cause severe ingestion in most people.
(You wanted the word “indigestion”. Ingestion means something quite different. Though I suppose if someone was prone to eating when depressed, it could work.)
I think it may be worth specifically excluding images from some of these checks, or adjusting for what can be compressed in the images themselves. Though it's fair to say they can be a huge portion of a site.
That said, I was able to boilerplate Preact + Redux to create a control that will be used stand-alone, and the payload is about 16K of JS (min+gz) and under 1K of CSS [1]. The methodology I used could very well carry over to more "full" applications. There's very little reason most modern web applications can't be way under a 500KB payload (excluding images). In this case I wanted more modern tooling and workflow but a fairly light datepicker modal... I feel most datepickers suck. Could it be lighter? Yes[1], but I wanted a slightly different approach. In the end it works...
All of that said, the biggest point of code bloat is usually bringing in an entire library instead of only the pieces needed, which is especially bad with UI controls... I really wish more people would use/extend Bootstrap from source here. It's really easy to do: usually I copy the variables file, copy the Bootstrap base file, create a mixins file, and then update all the references in the copied base. From there, I can comment things out or in as needed and stay fairly light.
Of course, fonts are another source of bloat. I'd suggest people start leaning towards SVG resources embedded in JS strings, and only the icons needed... all modern browsers support SVG well enough for this. Other web fonts should be limited to 2-3 includes of 1-2 families... that's it. Any more and your design is flawed anyway.
With webpack + babel, it isn't so hard to keep your applications structured, and much more minimal.
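The inline-SVG suggestion above might look roughly like this (the icon markup here is made up for illustration; a couple of ~150-byte strings beat a multi-KB icon font):

```javascript
// Sketch of "only the icons you need, as inline SVG strings".
// The icon shapes below are illustrative, not a real icon set.
const icons = {
  search:
    '<svg viewBox="0 0 24 24" width="16" height="16">' +
    '<circle cx="10" cy="10" r="6" fill="none" stroke="currentColor"/>' +
    '<line x1="14" y1="14" x2="20" y2="20" stroke="currentColor"/></svg>',
  close:
    '<svg viewBox="0 0 24 24" width="16" height="16">' +
    '<path d="M4 4 L20 20 M20 4 L4 20" stroke="currentColor"/></svg>',
};

// In the browser you would do: element.innerHTML = icon('search');
function icon(name) {
  if (!(name in icons)) throw new Error(`unknown icon: ${name}`);
  return icons[name];
}
```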
Perhaps we need smarter tools and better cooperation. For instance, I bet a large part of the code in large websites is shared with some other website. Why can't our tools figure out whether this is the case, and then create some kind of "shared library" to be used by both websites?
Maybe. That seems like it should be relatively easy to measure with a caching proxy, right?
In my experience, most site slowness is related to slow loading ads. Since installing an ad blocker a few months ago, I really haven't had any complaints about web speed.
Just take the time to make your website slimmer. Do not worry about the others, and reap the benefits of having more visitors than them. It is selfish, but this is the only way companies are going to move: if the competition is getting more visitors because of slimmer websites. If this doesn't change anything in practice, then this is a false problem.
For my website (chemical databank) I was able to measure the benefits of reducing the page size with more visitors from countries with poor Internet connectivity.
So, just do it and enjoy the competitive advantage as long as you have it! This is the best way to get things moving.
I was reading an astronaut's biography a few days ago, talking about the transition from the moon missions (where every half-hour was planned and accounted for) to Skylab (where people were working in the same place for long enough that it became necessary for them to have free time). And on one level it's a huge waste to have someone on the ISS where it's costing $x000/minute to keep them up there playing candy crush or whatever. But it's also a sign of maturity, that we are no longer desperately squeezing out every minute we can possibly get up there.
I think we're somewhere similar with the web. Internet is plentiful enough that we don't need to scrape and squeeze every last kilobyte. Maybe medium takes 3mb to render a simple plain page. That's ok.
(I do wish a lot of sites would up their information density though. Above all I wish I could get a full-width page, not a phone-width segment down the middle of my widescreen display. At home I've started using a stylesheet that disables max-width on all elements, and I've yet to see a site that looks worse that way)
> Internet is plentiful enough that we don't need to scrape and squeeze every last kilobyte.
For you and I, perhaps. For the people on 2G/3G or worse, trying to read a simple article, modern web pages are hilariously heavy. The whole point of the web was openness. This is why Facebook, Apple and Google have all developed their own solutions to this problem: FB Instant Articles; iOS News app; Google Accelerated Mobile Pages.
If these companies are taking it seriously enough to develop solutions, there must be a real problem or serious sections of the market being missed/obstructed by the failure of publishers to build their websites responsibly.
And in general, no matter that heavy pages can still load quickly for me on my privileged broadband connection, it still pisses me off when it takes 3-5 seconds for a page to load when I know it should fundamentally take a second (or less) to show me text and some associated images whenever they're in the viewport.
Google AMP is the best solution for the web so far, but it's a step backwards to XHTML/WAP and mobile subdomain sites. The correct thing would have been to hit websites with genuine SEO penalties for shit mobile performance. This is especially true because AMP doesn't even work on older mobile browsers.
The biggest reason why the web needs to care about every single byte of overhead? Because native apps barely have this problem. Neither side, native or web, is going to simply be killed by this, but the value of websites and web apps will decrease if they do not prove to be viable vehicles for content.
As much as I want to agree with the article I really can't. All of the things that people claim as "bloat" are basically necessary for a website to work in a modern fashion...
Layout frameworks (eg Bootstrap) tackle the problem of developing a site that's readable at resolutions from 320px wide on a two year old phone up to a 5K iMac. That is not a trivial task, especially if you need UI components.
Application frameworks (eg Angular) make it much faster, and consequently a nicer experience, to maintain user state, load content, navigate around template driven pages, etc.
Media content has grown with resolutions and pixel densities. 10 years ago we were looking at websites on 1280x1024 displays, with no rich media. Today a consumer facing website has retina quality video. That's going to impact the page weight.
Being wasteful is a minor problem; everyone has fast broadband. Everything is cached at various layers from the browser to the service worker to the CDN to the origin server. Browsers are really fast. With some clever "cutting-edge-even-though-its-been-around-for-years" stuff like http2 you can fetch things in parallel.
Obviously websites should be optimised. No one should be downloading media or libraries that aren't used. Animations should be hardware accelerated. Sites shouldn't be running in debug mode (ahemAngularJSahem). But all in all, what we get in a browser these days is far better, far faster, and far more functional than websites were a decade ago. We could go back to the "works without JS, stateless on the clientside, roundtrip to the server and a whole new page for every click" world I learnt web development in, but I really don't want to. It was rubbish.
I'm face-palming at your comment, and I'll take the bait.
Firstly, there is no such thing as "retina quality video". If you mean 4K, then say 4K. "Retina" is Apple marketing spin for their high density displays that you seem to be using to describe a standard of video. More to the point, websites are not pre-loading 4K video that is counted towards page weight.
To your claims about needing Bootstrap to present a page on small to large screens: there are numerous easy methods for making a site readable on small and large screens that don't require shoving in layout frameworks and relying on them to make things look good. Learn HTML. Learn CSS.
> "Being wasteful is a minor problem"
That statement would be great as an ironic t-shirt, but otherwise is yet another facepalm.
> "everyone has fast broadband"
Great news. With that assumption out of the way, we can get on with pre-loading retina video and throwing page-weight caution to the wind.
> "Everything is cached"
The utopian world you live in sounds great, but mobile devices are known to be very bad at both long term resource caching and short term memory caching - as in multiple open browser tabs.
The point of the article is that even basic articles are bloated whales that waste precious bandwidth on mobile. On desktop, common sense design is replaced with "touch friendly mobile-first impressionism" for actually less accessibility. When spinning globe videos and 2000 pixel auto-playing slideshow presentations in a vast landscape with hamburger menu in corner is considered the "modern web", we need to take a hard look at what modern should be.
The improved functionality you're talking about does not need to mean bloat or excessive amounts of data to drive simple things on websites.
> Firstly, there is no such thing as "retina quality video". If you mean 4K, then say 4K. "Retina" is Apple marketing spin for their high density displays that you seem to be using to describe a standard of video.
I don't mean 4K. I mean transcoding video assets to specific resolutions that match the website's responsive breakpoints, with 2X resolution versions for high pixel density displays (eg retina). Pushing a 4K video down to someone whose browser viewport is 800px wide is the sort of wastefulness people complain about.
I think you mean images. Nobody is doing that with video.
Streaming video quality might be dynamically adjusted according to client bandwidth, but the topic here is unreasonable page weight and bloat. Streaming media has nothing to do with page weight.
I think I see the point you were trying to make though. Our devices have larger screens and more pixels, so fill every last one of those pixels up to the max with content, and then some. Many marketing and advertising folks would agree with you, while many developers are scratching their heads wondering why they just built a "chicken shit minimalist" website. Beer and a paycheck dissolves the care factor.
> Layout frameworks (eg Bootstrap) tackle the problem of developing a site that's readable at resolutions from 320px wide on a two year old phone up to a 5K iMac.
Displaying information readably has been solved by HTML for twenty years. It may not be pretty, but it will be readable.
> Application frameworks (eg Angular) make it much faster, and consequently a nicer experience, to maintain user state, load content, navigate around template driven pages, etc.
You know what's a better way to maintain state? HATEOAS. And there are no client-side libraries involved!
You know what's a better way to load content? Letting the highly-optimized browser do it!
You know what's a better way to navigate around a page? Don't take over my space bar and cursor keys, and let me use my browser to browse!
> Displaying information readably has been solved by HTML for twenty years. It may not be pretty, but it will be readable.
I would add to this that you can make it pretty with less than 1K of CSS. You can make it REALLY pretty, and mobile-ready as well. The fact that so few people in my profession know how to do so saddens me deeply.