Up until around 2008, clock speed was increasing rapidly and the bottleneck in system performance kept shifting. Software was being obsoleted in a matter of years. Then things started changing, evidenced perhaps by Windows upgrades suddenly becoming a lot less relevant and Apple starting to steamroll more modular computer builders like Dell.
Mark my words, there will come a day when continuously upgrading software will have a clearly higher cost than sticking with old, boring, and stable software. 2018, to me, is the first year that a software platform can have had a true decade to stabilise in an environment that isn't constantly shifting and changing at the hardware level.
Maybe trends like the ones in graphics cards will shake things up, but I believe the advantage is going to move to stable, slowly upgrading, well-supported platforms with large bases of well-trained developers. Shrinking LTS windows will be a disadvantage for the Java ecosystem.
The drive to move quickly with software wasn't about keeping up with ever-changing hardware; it was about responding quickly to the market. Treat every release like an experiment. The more experiments you can run, the more likely you are to understand and solve the needs of your users.
Why would someone pick software that gets feature releases and bug fixes on a yearly cycle when similar software is available with weekly or daily releases?
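For what it's worth, here is a minimal sketch of what "every release is an experiment" can look like in practice: users are hashed into stable buckets and only a fraction sees the new release, so its effect can be measured before the rollout widens. The function names and the 10% threshold are purely illustrative, not any particular product's API.

    # Illustrative-only sketch of percentage-based rollout / release experimentation.
    import hashlib

    def bucket(user_id: str, experiment: str, buckets: int = 100) -> int:
        """Deterministically map a user to one of `buckets` slots for this experiment."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return int(digest, 16) % buckets

    def sees_new_release(user_id: str, experiment: str, rollout_pct: int) -> bool:
        """True if this user falls inside the rollout percentage."""
        return bucket(user_id, experiment) < rollout_pct

    # Expose ~10% of users to this week's release, then compare metrics between
    # the exposed and unexposed groups before widening the rollout.
    exposed = sum(sees_new_release(f"user{i}", "weekly-release", 10) for i in range(10_000))
    print(f"{exposed} of 10000 users see the new release")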
> The more experiments you can run, the more likely you are to understand and solve the needs of your users.
There is an interesting chapter on this in Donella Meadows' Thinking in Systems:
Oscillations! A single step up in sales causes inventory to drop. The car dealer watches long enough to be sure the higher sales rate is going to last. Then she begins to order more cars to both cover the new rate of sales and bring the inventory up. But it takes time for the orders to come in. During that time inventory drops further, so orders have to go up a little more, to bring inventory back up to ten days' coverage.
Eventually, the larger volume of orders starts arriving, and inventory recovers—and more than recovers, because during the time of uncertainty about the actual trend, the owner has ordered too much. She now sees her mistake, and cuts back, but there are still high past orders coming in, so she orders even less. In fact, almost inevitably, since she still can't be sure of what is going to happen next, she orders too little. Inventory gets too low again. And so forth, through a series of oscillations around the new desired inventory level. As Figure 33 illustrates, what a difference a few delays make!
Faster is not always better, especially in systems with delays. By reacting to initial cues prematurely, you can easily end up in permanent disarray, a chaotic feedback loop. It is often worthwhile to upgrade patiently, so one can actually gauge long-term effects instead of reacting to noise. Changing the consumer experience is a conversation.
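To make the mechanism concrete, here is a toy sketch of the car-dealer loop the excerpt describes: a stock of inventory drained by sales and refilled by orders that only arrive after a delay. The numbers and the dealer's ordering rule are made up for illustration; the point is just that the delay plus an impatient ordering rule is enough to produce the oscillations Meadows describes.

    # Toy stock-and-flow simulation of the car-dealer example (illustrative only).
    from collections import deque

    def simulate(days=60, base_sales=20, delivery_delay=5, coverage_days=10):
        inventory = base_sales * coverage_days  # start at the desired coverage
        pipeline = deque([0] * delivery_delay)  # orders placed but not yet delivered
        sales = base_sales
        history = []
        for day in range(days):
            if day == 10:
                sales = base_sales * 1.2        # a single step up in sales
            inventory += pipeline.popleft()     # today's deliveries arrive
            inventory -= sales                  # today's sales leave
            desired = sales * coverage_days
            # The impatient rule: replace today's sales AND close the whole perceived
            # gap at once, ignoring the cars already on order -- the classic mistake.
            order = max(0, sales + (desired - inventory))
            pipeline.append(order)
            history.append(round(inventory))
        return history

    # Inventory overshoots, then undershoots, around the new desired level.
    print(simulate())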
We're talking about two completely different things here. I'm talking about responding to direct customer requests, which strikes me (potentially naively) as a different thing from managing a supply chain.
Eh, I think we still have some time for consumer performance improvements. Graphics cards and HDDs/SSDs seem to have been improving steadily (and noticeably), but consumer CPUs and RAM, which are major bottlenecks, have not.
My top-of-the-line late 2013 rMBP has 4 cores, 16GB of RAM, and a 512GB NVMe SSD. rMBPs only recently moved up to 6 cores and 32GB of RAM (more storage too, but that isn't a big deal nowadays, since speed was the biggest optimization for storage and it's still NVMe, IMO). Server CPU and RAM, however, have increased: you can now rent an AWS x1e.32xlarge with 64 cores and 4TB of RAM, and thanks to AMD, that competition finally seems to be flowing back to consumer computers (at least for CPUs).
Just look at people being amazed at Threadripper's 32 freaking cores on a desktop (which matches the AWS x1 when you consider the x1 has dual sockets), while we still haven't passed down the gains in RAM (2TB of RAM if it matched the x1).
When those performance gains pass down to consumer computers, users will have up to 6.33 times the amount of CPU and 64 freaking times the amount of RAM!
> When those performance gains pass down to consumer computers, users will have up to 6.33 times the amount of CPU and 64 freaking times the amount of RAM!
This will only happen once there is a use case for that much RAM in a consumer machine that justifies the massive increase in power consumption. I don't see such a use case yet. (The only thing I can think of that pushes hardware requirements on consumer machines is VR/AR, but that's mostly a GPU problem.)
Many notebook models (including the previous MacBooks, I think) were stuck at 16 GB of RAM because that's the maximum for low-power DDR3.