I was a little surprised that it seems to outperform the M1 chip. The M1 has been an absolute joy to use since the moment I started using the Mac Mini. After five minutes I decided I was totally willing to burn the bridge: let's drop x86 as the legacy arch that it is and not look back. The M1 Mac Mini was so snappy that it took me back to the mid 2000s, when new computers were extremely exciting again. It provided so much delight and joy.
To think they are just casually outperforming their desktop chip in the benchmarks. Man I can't wait for the next M1X/M2 or whatever they call it.
I agree, it’s been a pleasure to use my M1 MacBook. I got the minimum storage (256GB) and made a workstation/file server for the first time in ages. That’s proven way more useful than more local storage: accessing all of my files at a slightly slower speed wherever I am beats accessing just a bit more of them locally.
>"Give up replaceable components and buy entire SoCs for all of your upgrades. Also forget about incremental upgrades"
I used to think like that, until I analyzed all my PC builds over the last 12 years and realized that, with Intel changing sockets every generation, the whole incremental-upgrade thing was a fond memory from the 80s, 90s, and early 2000s.
Once I realized incremental upgrades weren't a hill worth dying on, my choices increased, and I've never been happier with the size/weight/speed/battery life that the previous dogma prevented me from even evaluating.
>We could also stop writing shit software.
We could but people don't. Sentiments often don't survive contact with reality :p
> I used to think like that, until I analyzed all my PC builds over the last 12 years and realized that, with Intel changing sockets every generation, the whole incremental-upgrade thing was a fond memory from the 80s, 90s, and early 2000s. Once I realized incremental upgrades weren't a hill worth dying on, my choices increased, and I've never been happier with the size/weight/speed/battery life that the previous dogma prevented me from even evaluating.
Yep. I still have desktop machines and build another every 4 years or so. I used to upgrade machines, but it's just not realistic anymore. If I'm very lucky I'll manage to reuse the case and power supply, but even that's iffy. Stuff's changing too fast, including, as you mentioned, processor sockets. Memory, too. Man, I had some memory in the 2000s that lived through like 3 different machines. Not these days. If I upgraded video cards more often I guess I might manage to do that once per machine, but that'd be it.
I mostly miss the ability to add or replace defective storage modules, and the ability to add more onboard storage to things like Mac minis. External storage sucks in many ways, from desk clutter to unreliability (jostled cables damaging themselves and their ports) to extra expense, especially in today's age of small M.2 drives. Even the pro laptops could use some storage slots for media professionals, so you don't see goofy-ass shit such as SSDs velcroed to MacBook screens, which is a thing I've seen video professionals do sometimes.
Hardwired storage is also horrible from a data recovery standpoint: you should be able to take out your storage chip and try to recover data from it in a standalone fashion, as long as you know the encryption key, even if your main board or CPU has been damaged.
Their storage and RAM prices being double the market price is also irritating.
My thoughts exactly. I don't know if this person has actually spent serious time with the M1, but my suspicion is probably not.
To this day there still exists a cadre who think Apple users are idiots for spending so much money on a computer. Many of them have not seriously taken a look at what the platform offers and they do not fully comprehend why people choose this platform over others.
Regarding the "We could also stop writing shit software." quote, it seems as if there has been a wave of optimization over the last decade just out of necessity (Electron notwithstanding).
It is not enough; components written in fast languages, such as the OS itself, have generally gotten slower. We are forced to upgrade due to security, and I don't see any incentive for companies to improve this part of the software stack. You can use other desktop apps that are faster, but you only really have 2-3 choices of OS, and that dictates everything else you can do with your computer.
I was running Windows 11 in Parallels Desktop on the M1 and it seems snappy, but not as snappy as macOS. I noticed the same behavior with Windows 10 running on a Core i7. It seems as if the design of the OS has some bloat that I cannot explain clearly but can definitely feel. We should consider that maybe Apple has also been optimizing its software to be more performant.
> We should consider that maybe Apple has also been optimizing its software to be more performant.
They absolutely do. You can see it in all their bundled programs, too (except Xcode... sigh). Safari, Pages/Numbers/Keynote, Preview, Maps. All snappy and light on resources. None of them prone to slowing down unrelated things on your system. You can leave them open in the background for weeks and forget they're there.
There's also iOS and Android. Android's gotten better, but it's always needed more energy and faster hardware to approach iOS performance and UI responsiveness. The software's simply better, by that metric, anyway.
Another instance of this was visible in BeOS vs. Windows/Linux, back in the day. The difference was not subtle and was present even when running cross-platform programs, so it must have been BeOS itself (and its libraries, SDK, etc.) responsible for everything feeling way faster and more responsive.
> There's also iOS and Android. Android's gotten better, but it's always needed more energy and faster hardware to approach iOS performance and UI responsiveness. The software's simply better, by that metric, anyway.
A significant portion of this is just the result of requiring more efficient developer tools:
- Generational GC vs ARC memory management accounts for a large (50-100%) increase in memory usage
- JIT vs compiled code performance impacts
- Swift value types vs Java (and Obj-C) class types remove further memory-management pressure as well as allowing for significant memory savings (there's a rough sketch of this below).
- The underlying OS model Apple has restricts running code from multiple applications at one time to a small set of extension points. Android does allow for background processing (although you can't guarantee consistent/persistent processes).
Of course you give all this up when you are just wrapping your JavaScript + web content in Electron or a similar framework. In that case, you are running at a fraction of the efficiency of either platform.
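To make the value-type point concrete, here is a rough sketch in Swift (illustrative only, not anyone's real code): a struct is stored inline and copied on assignment, with no reference counting involved, while a class instance is heap-allocated and retain/release-managed, which is the default situation for objects in Java or Obj-C.

    struct PointValue {            // value type: stored inline, copied on assignment,
        var x: Double              // no reference counting or GC tracing needed
        var y: Double
    }

    final class PointObject {      // reference type: heap-allocated, retain/release-managed
        var x: Double
        var y: Double
        init(x: Double, y: Double) { self.x = x; self.y = y }
    }

    var a = PointValue(x: 1, y: 2)
    var b = a                      // independent copy; mutating b never touches a
    b.x = 10

    let c = PointObject(x: 1, y: 2)
    let d = c                      // a second reference to the same object
    d.x = 10                       // the change is visible through c as well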
M1 snappiness partly comes from the efficiency cores and a better QoS algorithm. x86 could start adding efficiency cores and do similar kinds of scheduling to get some of the benefits.
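For what it's worth, here is roughly what that looks like from an app's side (a minimal sketch; the QoS classes are standard GCD API, but which cores actually run the work is entirely up to the scheduler):

    import Foundation

    // Low-QoS work is a candidate for the efficiency cores; the scheduler decides.
    DispatchQueue.global(qos: .utility).async {
        // latency-tolerant work: indexing, syncing, generating thumbnails
    }

    // User-interactive work gets priority and tends to land on the performance cores.
    DispatchQueue.global(qos: .userInteractive).async {
        // work the user is actively waiting on right now
    }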
I’ve upgraded piecemeal several times this decade: hard drives -> SATA SSD -> NVMe SSD. The power supply, fans, and case are all 10 years old and continue to be great. I’ve upgraded the graphics card 4 times, and the CPU/mobo twice. I’ve upgraded or extended the RAM three times.
There is definitely value in an SoC for the CPU/mobo (maybe memory), but the other components work better upgraded at different rates.
Just because you keep falling for the Intel scam doesn't mean we all do. My current build has components dating from 2012, and I've been able to upgrade my Ryzen easily. My previous build used an LGA 1155 socket and still reused components from the build before it. I have hard drives from 2012 in my machine, and if I had the space for them, some 2009 drives would be in it too.
Buying entire new computers is the most wasteful thing you can do, both financially and environmentally.
Surprised to see you down-voted, I fully agree. I regularly update components in my systems each year including RAM, GPU, SSDs, WiFi (just upgraded to 6e), and more depending on needs.
Processors/Motherboards not as much, maybe once every 5 years or so.
You might not, but someone else will, thereby preventing your computer from becoming e-waste. The only way your non-upgradeable computer won't become e-waste is if you commit to using it for as long as an upgradeable computer would be used. Let's say 10 years. Will you commit to not contributing to e-waste by using your non-upgradeable computer for the next 10 years?
I buy Apple computers precisely because I want a machine I can use for that long. I used my last MacBook Air for 11 years without having to replace anything, so it was less wasteful than a person swapping out components every year. Its screen is toast, so it's a headless server now.
If everyone were like you then I would have no qualms about Apple's anti-environmentalism. Personally, I upgraded a garbage 2011 MBP from an unusable state with an SSD (it came with an HDD) and RAM, extending its life. I find it ridiculous that I can upgrade an ancient MBP's RAM to more than comes with the current base spec.
Each of my MBPs has been used for at least a decade (though not by me after ~6 years with most), so, sure, I'll commit to that since it's already happened with all the others.
I gave my son the 2012 MacBook Pro when I purchased the M1 MBP this year. He uses it for everything now.
I have tried giving older Macs to other family members, that didn't work out so well. So there are limits.
In general, used Macs have crazy high resale value, and if they work out ok, no problem with that. Both ends of the transaction receive value from the exchange.
I am in the process of clearing out quite a bit of e-waste, but we are talking fried PowerPC Macs and some iPhone 4 devices that I can't get working again. I have noticed that the Macs tend to be physically smaller than their counterparts from the same era in my collection; that might suggest less e-waste overall, but of course the variation is all over the place.
The oldest operational x86 machines in my setup are two tower Mac Pros, one from 2007 and another from 2009. These are somewhat upgradeable: you can use SSDs instead of rotating rust (duh), max out the RAM (easy), put in newer GPUs (drivers and BIOS permitting), or max out the CPUs (you need a moderately weird tool to get at the heat sink screws).
The fate of our electronics is indeed worthy of concern, but I don't see Apple's approach necessarily leading to more trash.
(Oh yes: I am typing this on an iPad 4. I'd love to keep using it for another 14 months, when it will turn ten years old, but it's getting a bit unreasonable. Works for Hacker News though.)
I recycle them at Best Buy or hand them down to a family member (often the case, except SSDs, as those are usually dead). Far from perfect, and I'm sure some of it makes it to a landfill, but probably better than nothing.
Just to clarify though I don't replace all those components every year, it usually ends up being one or maybe two things a year. But if I do one component a year my system can end up lasting indefinitely.
So is anything in your system, for example, 10 years old? A MacBook can easily last 10 years for the right person. Just because I replace mine often doesn't mean the people who buy it from me will.
My “desktop” of the last 8 1/2 years is an MBP. Perfectly usable: it does mail and web and Zoom and a couple more things perfectly well. Only 8GB of RAM, but Safari seems to become faster and require fewer resources with each update (Firefox … less so).
It still received the most recent OS updates.
Battery life is not as good at 8 years, but it still has a few hours of juice in it.
Overlooking the butterfly keyboard fiasco, Apple makes robust long lived hardware, and software to support it.
Its market does not include the 3% of “piecemeal” upgraders. But I suspect that in “amount of time a component is in use” metrics it handily beats the Wintel world (and the Android world).
For what it's worth, all of the systems that Apple has shipped with the M1 are replacing systems which also did not have replaceable components and could not manage incremental updates.
The main difference that I see is that the M1 chip is much faster and more power efficient than the systems it replaced (as well as much more expensive systems, for that matter; see M1 Air vs. Macbook Pro). I suspect that means that people who buy the M1 systems that Apple is selling won't replace their laptops as soon as people who bought x86 versions.
Your point is taken, but this is a trend that Apple has been moving towards regardless; it's not something that's new to the M1 Macs. You haven't ever been able to swap out your motherboard or processor in a Macintosh with very few exceptions (including the Xeon Mac Pros), and we've yet to see what an M1 (or M2 or whatever) Mac Pro would look like, expandability-wise, anyway.
It's almost like Apple stuck with a CPU vendor that was trapped at 14nm while they were able to buy access to TSMC 5nm, a full 3 generations ahead of what was in older Macs, with each generation providing significant double-digit improvements in density and power usage reduction.
Ah yes, I forgot to mention in my original comment that I also have a Core i9 MacBook Pro, and it seems slower than the M1 Mac. I can hardly believe it, but I love using the ~$700 Mac Mini while the $2,800 laptop stays in the bag.
The M1 is hobbled by having circuitry devoted to supporting the stricter x64 memory consistency model (it's toggled on only when emulating x64, but it's still there taking up die space when not). The A chips can be relaxed-consistency only.
The A15 is also on a newer process node (TSMC 5nm gen 2) and, it looks like, has 2x the L2 cache.
Well, let's see what the M1X/M2 brings. They will probably retake the performance crown by squeezing even more onto the die.
Given Apple's history, I wonder if, like 10 years from now, they just drop this additional bloat. In that case we really didn't lose the performance, we just gave it up temporarily. :)
Why not both? I was shocked at what I could get to run on my M1 Macbook Air - a lot more than I expected!
The Windows 10 ARM beta also did a shockingly good job of running x86 code in Parallels. Way faster than one would assume is reasonable for what was going on under the hood!