Hacker News

What is going on here? Why have none of the commenters read the article? Perhaps because it's phrased as a question and people didn't realize it's a link?

Anyways, this matches my expectations--people tend to be overly negative and only remember the good parts. The mobile web as a whole has gotten faster due to network speeds+cpu improvements.

It is worth noting that pages are doing more after loading now than they used to, though. This won't show up in onload or first meaningful paint, etc. So the first paint is fast, but then if you try to scroll immediately afterwards you'll probably hit some jankiness while the rest of the page loads asynchronously (but only kind of asynchronously, since there's a single main thread).

Some other things that could cause the regression are that more people own a budget Android phone now than before. People may not realize how slow these phones are. The single core performance of the top budget phone, the Samsung A50, is comparable to an iPhone 6 which came out in 2015.



> It is worth noting that pages are doing more after loading now than they used to, though. This won't show up in onload or first meaningful paint, etc. So the first paint is fast, but then if you try to scroll immediately afterwards you'll probably hit some jankiness while the rest of the page loads asynchronously (but only kind of asynchronously, since there's a single main thread).

The question is whether those pages are doing more for me, or whether they are doing more to me. When I load a page that would have been a normal hypertext document 10 years ago, instead I get a clown show filled with "we have cookies" pop-ups, tracking scripts, ads, ad-blocker-blockers, and more.

> Some other things that could cause the regression are that more people own a budget Android phone now than before. People may not realize how slow these phones are. The single core performance of the top budget phone, the Samsung A50, is comparable to an iPhone 6 which came out in 2015.

If I owned an iPhone 6 in 2015 and own a Samsung A50 today, and the web is slower and jankier today than it was five years ago, then isn't it fair to say that the web got slower?


> The question is whether those pages are doing more for me, or whether they are doing more to me.

The best part is that I'm paying for that privilege when I'm not on WiFi.


Remarkable, isn’t it? A bit like paying for cable and getting more ads than content. It puts that recent post on autoplay videos in a new light, too.

It really sickens me how advertisers are allowed to tell us whatever they want if they just pay. Not surprisingly, the CCP took out a full-page ad in my country’s most prominent newspaper to spread propaganda about the Hong Kong protests.


>The best part is that I'm paying for that privilege when I'm not on WiFi.

Data caps are some of the best examples of the worst kind of rip-off.

How they're tolerated is beyond my imagination. Here I can consistently get 100/50 Mbps on 4G for ~20€/mo, and the operators have no complaints.


> If I owned an iPhone 6 in 2015 and own a Samsung A50 today, and the web is slower and jankier today than it was five years ago, then isn't it fair to say that the web got slower?

Yes. But what do you suggest as an alternative? Refuse to take advantage of technological progress to cater to the small portion of the population who haven't updated their phones in half a decade?


Then a surprisingly large segment of the population will passively or actively refuse to visit sites with bloated webpages. Goes both ways.


If only there were technological progress to take advantage of! "Now, here, you see, it takes all the running you can do, to keep in the same place."


I have, and the article starts off with the assumption that it hasn't and then ties itself in knots to reach that conclusion.

Network speeds have increased, network latency has decreased, the hardware has gotten faster, and we're at best stuck in the same place we were in 2010.

>Page weight has increased over time, but so has bandwidth. Round-trip latency has also gone down.

>Downloading a file the size of the median mobile website would have taken 1.7s in 2013. If your connection hasn't improved since then downloading this much data would now take 4.4s. But with an average connection today it would only take 0.9s.
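A back-of-the-envelope check of the quoted figures (a sketch; only the three download times come from the quote, the growth factors are derived from them):

```python
# Derive relative growth factors from the quoted download times.
t_2013 = 1.7       # s to download the 2013-median site on a 2013 connection
t_same_conn = 4.4  # s for today's median site on that same 2013 connection
t_today = 0.9      # s for today's median site on an average modern connection

page_growth = t_same_conn / t_2013        # how much heavier the median page got
bandwidth_growth = t_same_conn / t_today  # how much faster the average connection got

print(f"median page grew ~{page_growth:.1f}x")             # ~2.6x
print(f"average bandwidth grew ~{bandwidth_growth:.1f}x")  # ~4.9x
```

So by these numbers, bandwidth growth (~4.9x) has outpaced page-weight growth (~2.6x), which is the article's point.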

The problem today is that the average website sends a couple dozen to a couple of hundred requests to complete a load. The average website 10 years ago sent a couple to a couple of dozen requests for the same thing.
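Why request count matters even when bandwidth improves: each batch of requests costs at least one round trip of pure latency. A rough illustration with hypothetical numbers (100 ms RTT and 6 parallel connections are assumptions, not measurements; transfer time is ignored entirely):

```python
# Latency floor for a page load, given a request count.
rtt = 0.100   # seconds per round trip (assumed)
parallel = 6  # concurrent requests (typical HTTP/1.1 per-host limit)

def latency_floor(requests: int) -> float:
    # ceil(requests / parallel) rounds of pure latency
    return -(-requests // parallel) * rtt

print(f"{latency_floor(12):.1f}s minimum for 12 requests")
print(f"{latency_floor(120):.1f}s minimum for 120 requests")
```

Ten times the requests means ten times the latency floor, no matter how fat the pipe gets.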

So after 10 years of constantly improving technology, and spending $5000 on phones to keep up with the latest CPUs, the performance is pretty much the same.

Imagine if you had to buy a new car every 5 years to drive at the speed limit. That is the situation we are in.


The article is really broken.

It's not about "the web"; it's about various schools of web building.

This is most noticeable when a property fundamentally changes its approach (Reddit), or when Twitter did it a while back and then (sensibly) retreated.

A better, more sensible approach than, say, graphing CPU clock speeds would be to fragment web development into these various schools, give them names, and then characterize them accordingly.

There's really only two ways to talk about this problem: one is hopelessly divisive and factional and the other is irrelevant and useless.

That sounds unpleasant? Correct! That's why it's still a problem and getting worse.

When the "make things better" axe falls on the fingers of the "mostly harmless" it's the passions of the axe wielder that get the focus and the blame. So instead we all slide into mediocrity together. It's the path of human institutions and the web isn't immune from the pattern.


> The mobile web as a whole has gotten faster due to network speeds+cpu improvements

Making something faster by throwing more hardware at it doesn't meaningfully count as making it faster, IMO; you can make the most inefficient piece of software "fast" by throwing the biggest CPU and network you can find at it.

The real issue is why should an otherwise capable CPU from 5 years ago struggle to render the average website today, when it really shouldn't be that hard. Scrolling through someone's marketing website _should_ be a painless experience on even a low-end budget phone.


I don't know the direct answer, but I have a deduction in my head to work with, so I'll just put it out there: I think speed is deeply impacted by high-level frameworks that parse or compile at runtime. React Native is a framework on Android, which is a framework on Java, which compiles at runtime. I don't know if anything in that chain compiles down to assembly or machine code before you open an app made in React Native. That, tied with the bloat of just-in-case background services sitting idle, eats bandwidth. Garbage collection checks operate in a loop, checking again and again whether all these unused but loaded processes still exist at their addresses. And when you pile those on top of each other, it seems relatively easy to see how modern CPUs don't seem much faster than chipsets from 5-6 years ago.

I stopped programming around 8 years ago because I hate the current MVC model most software is created and maintained with. What got me interested in dipping back in recently was a video on branchless programming. I love the idea of unit testing at the machine-code level for efficiency, and then figuring out how to trick the compiler or runtime and the chipset into making quick, predictive outputs to reduce idling on branches, or taking 15 steps for something doable in as little as 4.

That feels like a completely opposing direction to take, given the current priorities of engineers across almost all industries, even long-established ones like gaming.


Android applications have been compiled to native code on install since Android 5 (2014).


> The real issue is why should an otherwise capable CPU from 5 years ago struggle to render the average website today, when it really shouldn't be that hard. Scrolling through someone's marketing website _should_ be a painless experience on even a low-end budget phone.

Because marketing/product/design decides to add bells and whistles. Optimization is also not a zero-cost effort; the business has to pay for it, and I would assume most businesses don't think it's worth it.


>The real issue is why should an otherwise capable CPU from 5 years ago struggle to render the average website today, when it really shouldn't be that hard.

That depends on what counts as a capable CPU. I would say even the iPhone from 5 years ago doesn't have a capable CPU by today's standards. (That was the iPhone 6; the iPhone 6s wasn't released yet.)

And it is even worse on Android, and the current state of things isn't much better. Hopefully ARM will catch up in the next 5 years, but that means it will take another 5 years to filter down to the market.

i.e., not looking good.


The iPhone 6 came out in 2014; the iPhone 6s on this day five years ago. Both are capable smartphones, the latter being capable enough to support the latest version of iOS.


iPhone 6s was first sold on September 25, 2015.


> The mobile web as a whole has gotten faster

There are actually cases where faster infrastructure slowed down a system significantly. E.g., British railways (in various organisational forms over the years) operated rolling post office trains, which grabbed mail bags on the go, sorted the mail, and dropped it off again without any halts, starting in 1838. This played quite a role in the evolution of fast delivery of national newspapers, up to 8 deliveries of mail per day in urban centers, etc. By the 1960s the procedure had become too dangerous at the increased speed of trains (with several firemen losing their heads in accidents involving the scaffolds for handing over the mail bags), and the last Travelling Post Office ceased operations in 1971. Moral: by speeding up the network by a few miles per hour, mail delivery slowed down by a day.

Similarly, as mobile network speeds increased, expectations of what could be done with them rose faster than the actual speed of the infrastructure. Add high-res resources with previously unheard-of page weights and you've established a system of ever-increasing expectations and visions, which will always be bound to significantly outclass the real-life capabilities of the infrastructure. As long as we stick to this paradigm, increasing network speed will always result in a slower web, due to the Wirth's-law factor involved. I'm afraid this will be even more true for any further significant speed-ups, like those promised by a fully operational 5G network. (Also, visions and concepts that are apt to exploit and even challenge the capabilities of 5G will probably pose a new challenge to any hardware on the endpoints, which may eventually prove financially challenging for an average user, thereby introducing yet another significant gap and respective drops in average real-world performance.)


That sounds fascinating. Is there an article or book about it?


You can see the (preserved) system in action in this video ("Absolute History" YT channel) together with a bit of backstory on it: https://www.youtube.com/watch?v=GeMkOruNht8


Fascinating. I know what I'm doing for the next hour.

It does make an interesting comparison to the internet today. The Victorians had the wired telegraph for information, but without the trains most of the advancements of the 19th century would not have been possible. Looking naively at it, it seems the closest technology we have today would be flying drone swarms.


Another interesting aspect of Victorian railways: originally, passenger coaches had compartments spanning the full width of the coach, with doors on both sides, typically with room for 8 passengers. While this puts the maximum number of passengers in a coach, each compartment is totally isolated and there is no shared infrastructure, like bathrooms or a chance to get any sort of food, etc. Hence the train has to make more halts at stations to allow for passenger needs (which may also collect a bit of extra profit at the stations). At some point, coaches with corridors were introduced, now offering room for just 6 passengers in each compartment. A drop of 25% in capacity! On the other hand, based on the average speeds and frequency on your network, you may more than compensate for this with less frequent and shorter stops, thereby increasing the overall throughput of the system. Where is the exact point in the evolution of technology, of your system, and of market acceptance at which this becomes a viable option? (Include any losses on side business at the stations in your considerations.)


I once saw a humble but quite astounding artefact on TV: a box for mailing eggs, of course, Victorian. Behind this hides an entire system of postal service and mail train delivery. A, say, Cornish farmer would put fresh eggs for an individual customer in said box in the early morning. Those boxes were then collected by the postal service and shipped by train to London, where they were delivered to the customer's home, just in time for breakfast, the very same morning. (Amazon next-day delivery pales in comparison.)


I’m confused, how did increasing speeds slow down service? There’s a safety issue there but I don’t see the connection to service.


Trains were too fast to safely hand off bags of mail between stations and trains while in motion, so instead they had to stop the trains entirely, which was slower than before. (I'm not sure why they couldn't just slow the trains down; maybe it was a combination of higher speeds and changing priorities.)


These were just special coaches added to high-speed express trains, so the system was interconnected with the high-speed passenger network as shared infrastructure.


Ok so high speed passenger service was slowed to make mail transfer safe?

I’m still having trouble figuring out how speeding up trains delayed mail service. Was it just a scheduling issue?


The network is an interconnected system relying on average speeds and throughput. Slowing down or adding halts for the safety of one particular service probably wasn't an option. Hence the end of the service. (The speed of that particular service consequently dropped back to stationary infrastructure and transit between those hubs, roughly what it had been before 1838.)


I think I’m getting it now. When high speed passenger service ran at 50mph it could also carry mail and deliver 8 times a day. When passenger service increased to 75mph it was too dangerous to carry mail the same way so mail service dropped back to once a day, probably on dedicated or slower trains.


I remember that the web was very usable with a 133 MHz CPU, 32 MiB of RAM, a 3600 RPM hard disk, and a 1.5 megabit (0.128 megabit up) connection.

That budget Android phone blows away the hardware that I was using. A quick search for the Samsung A50 tells me: "on Verizon's network in downtown Manhattan [...] average data speeds of 57.4Mbps down and 64.8Mbps up". That is 38 to 506 times faster. It has an absurdly fast 8-core CPU running at 2300 MHz. Ignoring the fact that MHz is a terrible benchmark, that is a factor of 138 faster. The RAM is bigger by a factor of 128 or 192. There isn't really any hard drive latency on the phone.
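Those ratios check out arithmetically (a sketch using only the figures given in this comment; the A50 measurements and old-machine specs are the commenter's, not independently verified, and core×MHz is an admittedly terrible proxy for CPU speed):

```python
# Rough speedup factors, old desktop vs. budget phone (numbers from the comment).
down = 57.4 / 1.5          # 57.4 Mbps down vs a 1.5 Mbps line   -> ~38x
up = 64.8 / 0.128          # 64.8 Mbps up vs 0.128 Mbps          -> ~506x
cpu = (8 * 2300) / 133     # naive cores*MHz vs one 133 MHz core -> ~138x
ram_lo = (4 * 1024) / 32   # 4 GiB vs 32 MiB                     -> 128x
ram_hi = (6 * 1024) / 32   # 6 GiB vs 32 MiB                     -> 192x
print(f"{down:.0f}x down, {up:.0f}x up, {cpu:.0f}x CPU, "
      f"{ram_lo:.0f}-{ram_hi:.0f}x RAM")
```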

Yes, the web is slow.

The trouble is that browsers make no attempt to stop web sites from using infinite resources. The assumptions are that web sites will politely cooperate to share my computing resources, and of course I couldn't possibly want to actually use tabbed browsing to access lots of web sites, and we all discard our hardware as electronic waste after just a few years.


It's impossible to set any sane limits that would universally apply to web pages. An isolated environment for arbitrary applications is the web's purpose now, not just loading a text document.

You can actually play hardware-accelerated Doom 3 in a browser today, no problem. No add-ons, no nothing needed.


You could require user consent for resource increases.

For example, start the RAM at 12 times the number of CSS pixels. When the limit is hit, freeze the allocations until the user authorizes a doubling of the limit. Web sites would need about 10 authorization clicks to go from 4 MiB to 4 GiB. That goes for everything on the page, all sharing the limit.

Web sites would quickly change to minimize that, out of fear that users might not keep accepting the resource usage.

CPU usage could be similar, probably based on threads. The default is that only a single tab in a single window gets any time at all. Everything else is suspended. Users can grant permission for stuff like music players.

Network usage would also need to be limited, though limiting the RAM and CPU will tend to limit network usage as a side effect.


It's more likely users would be mad that their browser update makes them play cookie-clicker to get to their sites.

By and large, users are unaware of how many resources something uses, or should use. They don't really care about anything but getting from A to B as fast as possible, with as few interruptions as possible.

Computer resources are just like any other resource: expendable. Users will always use more if that means more convenience. Human time is very valuable; accepting multiple dialogs would take even more time than loading a fat page.

As an addendum: there is no single resource you can bind the multiplier to; many sites use no CSS but lots of JS, or WebGL, Wasm, tables... There is simply no way to foresee what will be slow and what won't.


The "CSS pixel" is just a pixel, unscaled for high-dpi displays. The point is to avoid revealing the hardware while allowing a bigger starting amount for a bigger window. If you prefer, just pretend I wrote "32 MiB".

Human time is valuable. That is the whole point of this. My time is wasted when my computer gets so slow that it takes 10 seconds for the Caps Lock light to respond. My time is wasted when the mouse lag is so awful that it takes me half an hour to kill a few tasks. My time is wasted when I have to walk away from an unusable computer, checking back every few hours to see if the OS might have killed the biggest task.

Web browser resource consumption is why I have a fresh new HN account. I had to power cycle the computer today, and it seems that Chromium won't save passwords over a restart unless I upload them all to Google.


All problems are fractal in nature; they require a problem scope in order to be solvable.

In this case, it would be "how fast is fast enough?", which would be fast enough that the human operating the application doesn't lose focus.

For most pages I've viewed with my 150€ phone, submit-to-interactive is between 1-3 seconds for the first non-cached load, and much faster when revisiting. This is a sufficient worst case for any conceivable task performed in web apps today.

Some exceptions that come to mind (like Reddit) have ulterior motives to force slowness so that people have to use the native app instead (which isn't much better, from what I've heard).


You say yourself that websites are now janky. That is also my experience.

In fact, despite new tech, it is a relatively recent phenomenon that my phone feels slowed down by a freaking website.

Furthermore, phones may be faster, but websites load slower and are janky due to unnecessary async loading.

So what’s the point of faster phones? The point is that webdev is terrible.

The article goes into all sorts of hoopla to claim that the web isn't slower. Thing is, I can still open those websites of yore on my phone right now! And to no one's surprise, the new web tech is indeed slower and, IMO, also worse in user experience. So yeah.


> The mobile web as a whole has gotten faster due to network speeds+cpu improvements

This point is agreeable, though if you browse the web on the same phone for 5+ years (my dad still uses his iPhone 6S), you may notice a difference over time.

Also, it's not a good sign if consumers are forced to go along with planned obsolescence just to keep their Internet browsing experience from becoming increasingly slow. The principle of progressive enhancement means that websites built today should work well on phones made years ago.


I have a plan for this, and it involves scraping content and using the web that way. This is something I have started work on, on a small scale for the stuff I care about the most.

Ideally (well, really ideally, companies themselves would provide APIs for accessing the content, but unfortunately that makes it difficult for them to make money, both directly through loss of ad revenue when clients don't show their ads and indirectly by making it easier to pirate stuff), we'd have a joint open effort to do this on a massive scale. For now I am doing it on a small scale on my own, writing tools for my own use only.

In addition to this I have also started work on retrieving the content that is walled off in apps that I use. For example, there are some magazines that I used to subscribe to for a while, and I’d rather be able to keep my access to the content indefinitely than to see it disappear whenever the publisher decides that the magazine has run its course and subsequently stops updating the app and then shuts down the servers that host the data.

On top of this, I have for a longer period of time (years now) been using, for example, Facebook as little as possible. I mostly only use it for Messenger and for upcoming events. Meanwhile, Instagram, also owned by Facebook, is worth it for me to continue using much more actively for now. But I am also slowly working on something of my own to host content that I myself produce, with the intent of continuing to consume content shared on Instagram while cutting back on posting there, posting my stuff on my own server instead. It's not like any of my stuff gets much attention anyways, so for me it will not be a big difference in terms of engagement. Mostly the way I use Instagram, in terms of the content I myself post, is that I post pictures and videos I have made that I think are worth sharing, and then, when in conversation with friends and acquaintances, I sometimes pull up my phone to show them in person something I did or made recently. A self-hosted service could serve the same purpose.

As for the increasingly slow experience of browsing the web, I come to realize that this might in fact contribute to what the parent to yours said, about people on HN not reading the linked article. At least, for myself I find that I often don’t click through to the linked articles, and I think the experience of slow-scrolling, megabytes heavy pages is contributing to this. I try, however, to not comment on the story itself unless I have read it first. Meanwhile, HN itself is lightweight and comfortable to be on. And often the comments will encourage one to click through to the linked story if it is worth reading, either directly stating that it is worth reading or indirectly stating it by quoting something good from the page or talking about some good data points or novel information from the linked page. (Novel to me, I should note.)


I went down this road once and ultimately didn't have the patience for what a nightmare scraping the modern web is; parts of what I built I still use myself. I wish you luck, though.


> scraping content and using the web that way

Weboob: https://news.ycombinator.com/item?id=24022671


> The mobile web as a whole has gotten faster due to network speeds+cpu improvements.

What's distinctly lacking in that assessment is web applications getting more resource-efficient, or more conservative with storage. I'd argue that hardware and CPU improvements are enabling bad tech stacks, like a friend might enable an alcoholic. Sure, you can minify, tree-shake, etc., but with sufficient hardware you don't strictly have to.

I also don't see TFA as an actual rebuttal of the hypothesis, since it focuses on the US. Only half the planet has an uplink at all, so you're going to end up with skewed results if you focus only on the top end of the technology distribution. While a rural connection in the US might be just as bad as a connection in rural India, I'd wager the mean and median connection speeds and latencies are still way better in the US. The Internet is for everyone, not just Silicon Valley engineers on a MacBook Pro connected through fibre optics.


I did, I just have a different interpretation of it. For instance, this sentence:

"Still, I don't think the mobile web – as experienced by users – has become slower overall."

To me it's hardly a positive: people are paying for the fastest speeds, fastest phones, fastest CPUs, and yet they get nothing in return.

The fact that hardware is faster is no excuse for the very real web bloat, especially when most of this bloat is due to stuff that adds zero value for customers, like tracking scripts, annoying pop-ups, overly complex and intentionally confusing tracking disclaimers, etc.


>People may not realize how slow these phones are. The single core performance of the top budget phone, the Samsung A50, is comparable to an iPhone 6 which came out in 2015.

Yes. And also:

The single-core performance of the entry-level iPhone, the iPhone SE, is faster than that of flagship Android phones.

And that's not even counting system and software efficiency.


Not the first time I've seen this statement get downvoted on HN.

It is an unpopular statement, but you can't deny it.


I use a Note 3. Not sure what you're talking about; it's super zippy.



