I don't think this has anything to do with the hardware. I think we have entered an age where users in general are not upgrading. As such, software can't demand more and more performance. The M1 came out at a time when hardware innovation had mostly stagnated. Default RAM in a laptop has been 16 GB for over 5 years; 2 years ago, you couldn't even get more than 16 in most laptops. As a result, software's hardware requirements haven't changed, so any modern CPU is going to feel overpowered. This isn't unique to the M1.
That’s because today’s hardware is perfectly capable of running tomorrow’s software at reasonable speed. There aren’t huge drivers of new functionality that would need new software. Displays are fantastic, cellular speeds are fast enough to stream video, battery life is excellent, UIs are smooth with no jank, and cameras are good enough.
Why would people feel the need to upgrade?
And this already applies to phones. Laptop upgrade cycles have been slowing for even longer.
Until everything starts running local inference. A real Siri that can operate your phone for you, and actually do things like process cross-app conditions ("Hey Siri, if I get an email from my wife today, notify me, then block out my calendar for the afternoon.") would use those increased compute and memory resources easily.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
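To make the "cross-app conditions" idea concrete, here's a rough sketch in plain Swift of the kind of structured rule an on-device model would have to produce from that spoken request. The types (Trigger, Action, Rule) and the email address are made up for illustration; this isn't any real Apple API.

    import Foundation

    // Hypothetical types only -- not a real framework. The point is that a
    // local model has to turn free-form speech into something like this,
    // entirely on-device.
    enum Trigger {
        case emailReceived(from: String, before: Date)
    }

    enum Action {
        case notify(message: String)
        case blockCalendar(start: Date, end: Date)
    }

    struct Rule {
        let trigger: Trigger
        let actions: [Action]
    }

    // "If I get an email from my wife today, notify me, then block out my
    // calendar for the afternoon." parsed into a rule:
    let startOfDay = Calendar.current.startOfDay(for: Date())
    let rule = Rule(
        trigger: .emailReceived(from: "wife@example.com",
                                before: startOfDay.addingTimeInterval(24 * 3600)),
        actions: [
            .notify(message: "Email from your wife just arrived"),
            .blockCalendar(start: startOfDay.addingTimeInterval(13 * 3600),
                           end: startOfDay.addingTimeInterval(17 * 3600))
        ]
    )
    print(rule)

The hard part, of course, is the natural-language-to-rule step, and that is where the neural hardware and the extra compute and memory would actually get spent.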
That's a very big maybe. The local LLM experience is currently very, very different from the hosted models most people play with. The future is still very uncertain.