The Apple A15 SoC Performance Review: Faster and More Efficient (anandtech.com)
191 points by vanburen on Oct 4, 2021 | 193 comments



Even though I am no longer an Apple fan, I was surprised at the sentiment on the A15 after the keynote and reviews.

Do people really expect revolutionary improvement every single year? ~20% improvement YoY within the same power envelope is god damn impressive. And if this is not good enough? This is just laying the groundwork for next year's 3nm and LPDDR5. (Or 4nm, depending on circumstances.)

The E cores are interesting because they are a preview of what to expect in the next-generation Apple Watch.

I was hoping there would be some investigation of the new Display Engine and Video Encoder / Decoder, especially on power usage. But it looks like this kind of interest is in the minority.


I was a little surprised that it seems to outperform the M1 chip. The M1 has been an absolute joy to use since the moment I started using the Mac Mini. After 5 mins I decided that I was totally willing to burn the bridges: let's drop x86 as the legacy arch that it is and not look back. The M1 Mac Mini was so snappy that I was taken back to the mid 2000s, when new computers were extremely exciting. It provided so much delight and joy.

To think they are just casually outperforming their desktop chip in the benchmarks. Man I can't wait for the next M1X/M2 or whatever they call it.


I agree, it's been a pleasure to use my M1 MacBook. I got the minimum storage (256GB) and made a workstation/file server for the first time in ages. That's proven way more useful than more local storage (accessing a bit more of my files locally wherever I am vs accessing all of my files at a slower speed).


"Give up replaceable components and buy entire SoCs for all of your upgrades. Also forget about incremental upgrades"

No, thank you, I'll keep a flawed x86 architecture. We could also stop writing shit software.


>"Give up replaceable components and buy entire SoCs for all of your upgrades. Also forget about incremental upgrades"

I used to think like that - until I analyzed all my PC builds over the last 12 years and realized that, with Intel changing sockets every generation, the whole incremental upgrade thing was a fond memory from the 80's, 90's and early 2000's. Once I realized incremental upgrades weren't a hill worth dying on, my choices increased, and I've never been happier with the size/weight/speed/battery life that the previous dogma prevented me from even evaluating.

>We could also stop writing shit software.

We could but people don't. Sentiments often don't survive contact with reality :p


> I used to think like that - until I analyzed all my PC builds over the last 12 years and realized that, with Intel changing sockets every generation, the whole incremental upgrade thing was a fond memory from the 80's, 90's and early 2000's. Once I realized incremental upgrades weren't a hill worth dying on, my choices increased, and I've never been happier with the size/weight/speed/battery life that the previous dogma prevented me from even evaluating.

Yep. I still have desktop machines and build another every 4 years or so. I used to upgrade machines, but it's just not realistic anymore. If I'm very lucky I'll manage to reuse the case and power supply, but even that's iffy. Stuff's changing too fast, including, as you mentioned, processor sockets. Memory, too. Man, I had some memory in the 2000s that lived through like 3 different machines. Not these days. If I upgraded video cards more often I guess I might manage to do that once per machine, but that'd be it.


I mostly miss the ability to add or replace defective storage modules, and the ability to add more onboard storage to things like Mac Minis. External storage sucks in many ways, from desk clutter to unreliability (jostled cables damaging themselves and their ports) to extra expense, especially in today's age of small M.2 drives. Even the pro laptops could use some storage slots for media professionals, so you don't see goofy ass shit such as SSDs velcroed to MacBook screens, which is a thing I've seen video professionals do sometimes.

Hardwired storage is horrible from a data recovery standpoint too: you should be able to take out your storage chip and try to recover data from it in a standalone fashion, as long as you know the encryption key, even if your main board or CPU has been damaged.

Their storage and RAM prices being double the market price is also irritating.


Cloud storage is kinda cheaper if you look at iPhones


My thoughts exactly. I don't know if this person has actually experienced the M1 in a thorough sense but my suspicion is probably no.

To this day there still exists a cadre who think Apple users are idiots for spending so much money on a computer. Many of them have not seriously taken a look at what the platform offers and they do not fully comprehend why people choose this platform over others.

Regarding the "We could also stop writing shit software." quote, it seems as if there has been a wave of optimization over the last decade just out of necessity. (electron notwithstanding)

It is not enough; even components written in fast languages, such as the OS, have generally gotten slower. We are forced to upgrade due to security, and I don't see any incentive for companies to improve this part of the software stack. You can use other desktop apps that are faster, but you only really have 2-3 choices in OS, and that dictates everything else you can do with your computer.

I was running Windows 11 in Parallels Desktop on M1 and it seems snappy, but not as snappy as macOS. I noticed this same behavior with Windows 10 running on a Core i7. It seems as if the design of the OS has some bloat that I cannot explain clearly but can definitely feel. We should consider that maybe Apple has also been optimizing software to be more performant.


> We should consider that maybe Apple has also been optimizing software to be more performant.

They absolutely do. You can see it in all their bundled programs, too (except Xcode... sigh). Safari, Pages/Numbers/Keynote, Preview, Maps. All snappy and light on resources. None of them prone to slowing down unrelated things on your system. You can leave them open in the background for weeks and forget they're there.

There's also iOS and Android. Android's gotten better, but it's always needed more energy and faster hardware to approach iOS performance and UI responsiveness. The software's simply better, by that metric, anyway.

Another instance of this was visible in BeOS vs. Windows/Linux, back in the day. The difference was not subtle and was present even when running cross-platform programs, so it must have been BeOS itself (and its libraries, SDK, etc.) responsible for everything feeling way faster and more responsive.


> There's also iOS and Android. Android's gotten better, but it's always needed more energy and faster hardware to approach iOS performance and UI responsiveness. The software's simply better, by that metric, anyway.

A significant portion of this comes down to requiring more efficient developer tools.

- Generational GC vs ARC memory management accounts for a large (50-100%) increase in memory usage

- JIT vs compiled code performance impacts

- Swift value types vs Java (and ObjC) class types remove further memory management pressure, as well as allowing for significant memory savings (see the sketch after this list).

- The underlying OS model Apple has restricts the need to run code from multiple applications at one time to a small set of extension points. Android does allow for background processing (although you can't guarantee consistent/persistent processes).
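A minimal Swift sketch of the value-types point (the Point types here are hypothetical, purely for illustration):

    // A struct is a value type: instances can be stored inline (on the stack
    // or inside their container) with no object header and no reference
    // counting on assignment.
    struct Point {
        var x: Double
        var y: Double
    }

    // A class is a reference type: each instance is a separate heap
    // allocation, and every assignment goes through retain/release (ARC).
    final class PointRef {
        var x = 0.0
        var y = 0.0
    }

    // An array of structs is one contiguous buffer of plain doubles...
    let inline = [Point(x: 0, y: 0), Point(x: 1, y: 1)]

    // ...while an array of class instances is an array of pointers to
    // separately allocated, reference-counted objects - closer to how a
    // Java ArrayList of objects behaves.
    let boxed = [PointRef(), PointRef()]
    print(inline.count, boxed.count)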

Of course you give all this up when you are just wrapping your javascript + web content up in an electron or similar framework. In that case, you are running at a fraction of the efficiency of either platform.


M1 snappiness partly comes from the efficiency cores and a better QoS algo. x86 could start adding efficiency cores and do similar kinds of scheduling to get some of the benefits.
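For context, a minimal sketch of the QoS machinery macOS exposes to apps through Dispatch (the closure bodies are placeholders); on Apple silicon the scheduler can use these hints to steer work between the performance and efficiency cores:

    import Dispatch

    // High-QoS work is prioritized and eligible for the performance cores.
    DispatchQueue.global(qos: .userInteractive).async {
        // e.g. work needed to keep the UI responsive right now
    }

    // Background-QoS work can be parked on the efficiency cores,
    // trading latency for power.
    DispatchQueue.global(qos: .background).async {
        // e.g. indexing, prefetching, cleanup
    }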


>Efficiency cores

Alder Lake 2022.


Great - there's half of the solution - hardware. Got the software to integrate and leverage it to the hilt too?


I've upgraded piecemeal several times this decade: hard drives -> SATA SSD -> NVMe SSD. The power supply, fans, and case are all 10 years old and continue to be great. I've upgraded the graphics card 4 times, and the CPU/MoBo twice. I've upgraded or extended the RAM three times.

There is definitely value in an SoC for CPU/MoBo (maybe memory), but the other components are better upgraded at different rates.


You should give AMD a try.

They've been on the same couple sockets for a long time now.


Just because you keep falling for the Intel scam doesn't mean we all do. My current build has components dating from 2012, and I've been able to upgrade my Ryzen easily. Previous build made use of an LGA 1155, and still reused components from a previous build. I have hard drives from 2012 in my machine, and should I have the space for it, some 2009 drives would have been in it.

Buying entire new computers is the most wasteful thing you could do, both money wise and environmentally wise.


How often are you replacing your PSU? Every time Intel changes socket?


I only replaced PSUs on my MacBooks when Apple switched from MagSafe to USB-C.


Surprised to see you down-voted, I fully agree. I regularly update components in my systems each year including RAM, GPU, SSDs, WiFi (just upgraded to 6e), and more depending on needs.

Processors/Motherboards not as much, maybe once every 5 years or so.

Apple really seems all about throw away culture.


>...I regularly update components in my systems each year including RAM, GPU, SSDs, WiFi...

Then go for it - you have choices, and no one is forcing you to buy a Mac.

I personally got shit to do and have no interest in replacing all of that at that type of frequency.


You might not, but someone else will, thereby preventing your computer from becoming e-waste. The only way your non-upgradeable computer won't become e-waste is if you commit to using it for as long as an upgradeable computer would be used. Let's say 10 years. Will you commit to not contributing to e-waste by using your non-upgradeable computer for the next 10 years?


I buy Apple computers precisely because I want a machine I can use for that long. I used my last MacBook Air for 11 years without having to replace anything, so it was less wasteful than a person swapping out components every year. Its screen is toast, so it's a headless server now.


> Past Performance Is No Indicator of Future Performance

Just because Apple computers have lasted a long time before doesn't mean it will be the same with their current computers.

Although my personal bet is that the 2016-2019 MBP will degrade faster than the 2020 M1 Macs.


Desktop systems last even longer than laptops. Except, of course, Apple's desktop systems because they'll stop letting you upgrade the OS.


If everyone were like you, then I would have no qualms about Apple's anti-environmentalism. Personally, I upgraded a garbage 2011 MBP from an unusable state with an SSD (it came with an HDD) and RAM, extending its life. I find it ridiculous that I can upgrade an ancient MBP's RAM to more than comes with the current base spec.


Each of my MBPs has been used for at least a decade (though not by me after ~6 years with most), so, sure, I'll commit to that since it's already happened with all the others.


I gave my son the 2012 MacBook Pro when I purchased the M1 MBP this year. He uses it for everything now.

I have tried giving older Macs to other family members, that didn't work out so well. So there are limits.

In general, used Macs have crazy high resale value, and if they work out ok, no problem with that. Both ends of the transaction receive value from the exchange.

I am in the process of clearing out quite a bit of e-waste, but we are talking fried PowerPC Macs and some iPhone 4 devices that I can't get working again. I have noticed that the Macs tend to be physically smaller than their counterparts from the same era in my collection; that might suggest less e-waste overall, but of course the variation is all over the place.

The oldest operational x86 machines in my setup are two tower Mac Pros, a 2007 one and another from 2009. These are somewhat upgradeable: you can use SSDs instead of rotating rust (duh), max out the RAM (easy), put in newer GPUs (drivers and BIOS permitting), or max out the CPUs (you need a moderately weird tool to get at the heat sink screws).

The fate of our electronics is indeed worthy of concern, but I don't see the Apple approach necessarily leading to more trash.

(Oh yes: I am typing this on an iPad 4. I'd love to keep using it for another 14 months; it will then be ten years old. But it's getting a bit unreasonable. Works for Hacker News though.)


What do you do with your old components?

My assumption is that your approach is far more wasteful, at the cost of getting top-notch performance. Nothing about the bleeding edge is ecological.


I recycle them at Best Buy or hand them down to a family member (often the case, except SSDs, as those are usually dead). Far from perfect; I'm sure some of it makes it to a landfill. But probably better than nothing.

Just to clarify though I don't replace all those components every year, it usually ends up being one or maybe two things a year. But if I do one component a year my system can end up lasting indefinitely.


So is anything in your system, for example, 10 years old? A MacBook can easily last 10 years for the right person. Just because I replace mine often doesn't mean the people who buy it from me will.


Yes all the components last well over 10 years except maybe an SSD because I work with a lot of data. Like I said, I usually hand them down to family.

If you can get by on a decade old machine all the more power to you. Professionally I cannot.


My “desktop” of the last 8 1/2 years is an MBP. Perfectly usable; does mail and web and Zoom and a couple more things perfectly well. Only 8GB of RAM, but Safari seems to become faster and require fewer resources with each update (Firefox… less so).

It still received the most recent OS updates.

Battery life is not as good at 8 years, but it still has a few hours of juice in it.

Overlooking the butterfly keyboard fiasco, Apple makes robust long lived hardware, and software to support it.

Its market does not include the 3% “piecemeal” upgraders. But I suspect in “amount of time a component is in use” metrics it handily beats the Wintel world (and the Android world).


For what it's worth, all of the systems that Apple has shipped with the M1 are replacing systems which also did not have replaceable components and could not manage incremental upgrades.

The main difference that I see is that the M1 chip is much faster and more power efficient than the systems it replaced (as well as much more expensive systems, for that matter; see M1 Air vs. MacBook Pro). I suspect that means that people who buy the M1 systems Apple is selling won't replace their laptops as soon as people who bought the x86 versions did.

Your point is taken, but this is a trend that Apple has been moving towards regardless; it's not something that's new to the M1 Macs. You haven't ever been able to swap out your motherboard or processor in a Macintosh with very few exceptions (including the Xeon Mac Pros), and we've yet to see what an M1 (or M2 or whatever) Mac Pro would look like, expandability-wise, anyway.


It's almost like Apple stuck with a CPU vendor that was trapped at 14nm while they were able to buy access to TSMC 5nm, a full 3 generations ahead of what was in older Macs, with each generation providing significant double-digit improvements in density and power usage.


Their last Intel MacBook Air used Ice Lake, so the jump is from Intel 10nm: about a 1.5-2 generation jump?


Ah yes, I forgot to mention in my original comment that I also have a Core i9 MacBook Pro, and it seems slower than the M1 Mac. I can hardly believe it, but I love using the ~$700 Mac Mini while the $2800 laptop stays in the bag.


The moment I bought the M1 MBA 3 months ago is when I knew I had fucked up purchasing the i9 MBP in mid-2020.


M1 is hobbled by having to have circuitry devoted to supporting the stricter x64 memory consistency model (they toggle it on only when emulating x64, but it is still there taking up die space when not). The A chips can be relaxed consistency only.
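A sketch of what that consistency-model difference means in practice, using the swift-atomics package (the data/ready pair here is hypothetical, just to illustrate the ordering issue):

    import Atomics

    let data = ManagedAtomic<Int>(0)
    let ready = ManagedAtomic<Bool>(false)

    // Writer thread. Under x64's TSO, these stores become visible in
    // program order even with relaxed orderings. Under ARM's weaker model,
    // the hardware may reorder them, so publishing needs a releasing store.
    data.store(42, ordering: .relaxed)
    ready.store(true, ordering: .releasing)

    // Reader thread. The acquiring load pairs with the releasing store.
    // Rosetta 2 sidesteps having to insert such barriers everywhere by
    // flipping the core into its TSO mode while running translated x64 code.
    if ready.load(ordering: .acquiring) {
        _ = data.load(ordering: .relaxed)
    }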

A15 is also on a newer process node (TSMC 5nm gen2) and has 2x the L2 cache it looks like.


Well, let's see what M1X/M2 bring. They will probably retake the performance crown by squeezing even more onto the die.

Given Apple's history, I wonder if, 10 years from now, they just drop this additional bloat. In that case we didn't really lose the performance, we just gave it up temporarily. :)


I would expect so, they dropped 32-bit x86 support.


It is very likely that the A-cores also include TSO.


I prefer x86 compatibility over improved M1 speeds.


Why not both? I was shocked at what I could get to run on my M1 Macbook Air - a lot more than I expected!

The Windows 10 ARM beta also did a shockingly good job of running X86 code in Parallels too. Way faster than one would assume is reasonable for what was going on under the hood!


Totally agree. More than happy with the performance. I would only worry about it if I could pair a 4K monitor with the iPhone and it became a Mac, or at least an iPad. Until this killer feature, I prefer a longer time between recharges.


That would honestly be amazing and probably very feasible with the new chipsets.

Imagine having a desktop setup and an "empty" laptop case with slide in dock.

Everything in one place. Same system for the iPad.

Will probably never happen though, because that's fewer products to sell?


Samsung has the "phone to desktop" application for some of its devices, called "DeX":

https://en.wikipedia.org/wiki/Samsung_DeX

Makes sense when these devices are already more powerful (and expensive) than your Raspberry Pi.


I would love that! The desktop part could still be another product with a more powerful CPU and GPU, using the iPhone just as a bootable drive.


Microsoft already did that with Continuum. No one used it.

https://www.microsoft.com/en-us/windows/continuum


That's because:

1. No one used Windows Phone either

2. The use-case wasn't really clear.

"Run Office apps on a regular computer-sized monitor!" I can already do that, using the computer-sized computer connected to my computer-sized monitor.

"Setup is simple using Continuum-compatible accessories" So I have to buy a new keyboard/mouse/monitor/something else? (No, but it sounds like I might have to)

The pitch was "You can use your phone kind of like a PC, assuming every app you need runs on Windows phone" (which, for almost everyone, is absolutely not the case). If there's one thing Windows phone was famous for, it was not having apps.

Essentially, Microsoft was trying to pitch you a feature that let you use a phone + monitor and keyboard as a replacement for some of the apps on your PC. Not a compelling pitch, except for presentations.


Why did they not use it? Was it that the idea was inherently flawed or that the implementation didn’t live up to the promise? Remember how people said the iPod was just another music player, the iPhone wasn’t much different than earlier touchscreen devices or PDAs, etc.?

Circa 1998-1999 I was using a mobile device with a touchscreen and wireless broadband to surf the web and send email while sitting outside. Clearly the idea was good even if Handspring and Ricochet never managed to deliver it with few enough compromises to see mainstream success.


I think it's a normal reaction to the marketing. When every product launch is promoted as the second coming of Jesus Christ people become cynical and overly critical.


And yet you don't see this with any other competitor, even Intel, who kept dropping the ball year after year. In contrast, Apple has been consistently realistic about their performance figures, and has outperformed all their direct competitors for about 10 years.

It's just a knee-jerk reaction to about anything related to Apple. There is a lot to dislike about them, to be sure, but these reactions to independent benchmarks are just blind cynicism. It's cool to be a contrarian, and it is making the Internet a much worse place.


People love to ding Tim Cook for Apple's lack of innovation, but I'd argue the M1 is a revolutionary product on the same level as the first iPhone. Of course, it's not as sexy and in-your-face as the first iPhone, but people in the industry and consumers immediately felt the impact of the shock wave.

When you think about it, the M1 perfectly encapsulates Apple under Tim Cook. It didn't come out of nowhere. They have slowly improved their A-series in the iPhones over the last decade, and it culminated in the M1. That's exactly how Apple is run today. Inch by inch.


Revolutionary is always going to be controversial. The first iPhone was dismissed as nothing special. There were touch screens before, after all, and it didn't even have a proper keyboard. The first 64-bit ARM SOC was argued not to be revolutionary because it offered only evolutionary levels of performance comparable to previous Apple CPUs.

I don't think you can argue that the M1 wasn't a game changer, though. The M1 firmly established Apple's design prowess in the laptop/desktop class chip game. It is no longer arguable that mere phone chips aren't comparable to desktop class CPUs. That claim is dead. It also enabled performance and power efficiency levels years ahead of the competition, giving tangible immediate benefits to customers.

So revolutionary? It's a matter of opinion, but I think it was a watershed moment and permanently changed the technical landscape for CPUs running 'full blown' desktop class software.


The iPhone revolutionized touchscreen smartphones. It was the first very successful capacitive touchscreen smartphone. After that, UIs had to get ready for capacitive touchscreens. Before that, resistive smartphones existed, but didn't get popular. Websites, for example, hardly optimized for these. They did optimize for the new capacitive touchscreen paradigm, as did smartphone 'apps'.


The unlimited data plan included with iPhones was revolutionary as well.


Prior to the iPhone you could buy a $5 a month unlimited data plan addon from AT&T. The revolution was AT&T increasing unlimited data to $20 a month for the OG iPhone and $30 a month for the iPhones thereafter.

It's weird to look at this in the context of Google and T-Mobile almost shipping the first Android devices with a $10 a month unlimited data only plan...


> The first 64-bit ARM SOC was argued not to be revolutionary because it offered only evolutionary levels of performance comparable to previous Apple CPUs.

It sounds like you are conflating AArch64 with Apple. Apple had nothing to do with AArch64, it was developed by ARM in-house; Apple simply released the first AArch64-compatible core (outside of ARM's Cortex series).


That's not what I've heard. The rumor is that AArch64 is as much Apple's as it is ARM's. That's why they're able to flaunt the normal restrictions of the architectural license: creating custom instructions, and not implementing parts of the ISA.


There is no rumor. AArch64 was designed by and is owned by ARM, as is all of the ARM IP.

Apple is able to do what they want because they have an architectural license; not a core license. It's the same reason Nvidia, Cavium, etc are able to design their own cores irrespective of the Cortex series. That's what an architectural license is for. While MediaTek, HiSilicon, etc are limited to using the Cortex SIP.


I’m fully aware of that. Apple released the first SOC implementing it.


The Cortex-A53/A57 were the first cores to implement it, and a vanilla Mali+Cortex-A53 SOC was the first implementation.

Then AppliedMicro had a hardware version to demo.

Apple simply had the first to market consumer product with one. Them buying up a good chunk of the fab space needed to produce them at the time probably contributed to that.


I'm not sure it's revolutionary but it is a remarkable device. The Air is the first laptop I've owned that I actually like using rather than tolerate.


[flagged]


> Otherwise this comes off as a thinly veiled "Wintel ew, Apple good!" regurgitated bit of fanboy wank.

No, the M1 Air is just a substantially better product than the Intel variants. You can find dozens of HN threads where people explain exactly why.


And before the M1 variants ever existed, people moaned about how they only tolerated the Intel MacBook/Air/Pro variants?

Bullshit. It's fanboy revisionism.


Ryzen is where it's at. Zero loss of general computing freedom, 12 hour battery time on laptops and faster than Intel.

You lose so much right now with an M1 and later you'll lose even more. Honestly, you'd have to be a fan boy to get behind this.


What do you lose? You get iOS apps, much better battery life, a fanless design, a Unix OS with the ability to install Linux. What is the loss you refer to?


Pffft. If you can't think of a single thing, I won't even bother arguing with you.

Enjoy the dystopian future you're creating.


Openness.

Apple is the #1 most restrictive company in the world. "The ability to install Linux" with zero commits and no support from Apple themselves, all reverse engineered. Meanwhile, Intel and AMD both contribute to the Linux kernel.

How much kool aid do you have to drink to think that 'ability' is actually something you can say is a feature with Apple devices?

Come back when I can unlock the bootloader of an iDevice and we'll talk about the 'ability' to install Linux.


> Apple is the #1 most restrictive company in the world

You really will not gain much credibility here on HN by throwing in some random, made-up ranking. And I can guarantee that 99.99% didn't even read past that sentence /s


Way to blow me over with facts and citations.

Apple: Prevents sideloading without a host MacOS computer.

Sideloading is temporary.

Blocks third party app stores.

Blocks any and all browsers using a non-Safari engine.

Not one iPad or iPhone has ever been bootloader unlockable.

They've locked out iMessage people from being able to port their number out of the service for YEARS.

Tell me what part of this screams openness.

Even the most restrictive devices are less restrictive than Apple's. If the "ability to install Linux" is the benchmark here, guess what... that's nearly every single fucking device in the world. Except the majority of Apple devices, of course.

Apple explicitly has gone out of their way to prevent this.

https://www.phoronix.com/scan.php?page=news_item&px=Apple-T2...

https://www.forbes.com/sites/jasonevangelho/2018/11/06/booti...

https://www.zdnet.com/article/what-must-be-done-to-bring-lin...

Apple does ZERO to allow people to use an OS outside of their sandboxes. With the Intel-based systems, Boot Camp is still restricted and limits hardware access necessary to run Windows properly on their systems.

Yes. Apple is restrictive as fuck. As far as Technology companies go, yes, I'd put them up at the top.

Your turn with sources and citations.


Your claim is hyperbolic because it revolves around consumer tech choices that most people don't care about. They locked people out of porting their number from iMessage? Scary - wait until you find out what happens when Google locks you out of your Google account.

And #1 restrictive tech company? Did Oracle disappear while I slept?


At least Oracle supports open source.

Apple does not.


FoundationDB would like a word.


That's pretty much the only product they support, no? I am no fan of Oracle (probably nobody is), but they do support some very heavy projects, the obvious ones being OpenJDK, MySQL, and the Linux kernel (mostly btrfs IIRC).

https://lwn.net/Articles/867540/

Edit: I am not really talking about WebKit, Darwin, and the other projects they only developed for themselves.


They didn’t really “only develop WebKit for themselves”: all of the non-Firefox browsers now descend from WebKit one way or another.

But, you can also look at their GitHub repositories: Swift, LLVM, and CUPS are at least three open source packages they maintain.


That sure don't look like commits to open source projects managed by other people, bucko.


M1 is dramatically better than the top tier AMD mobile offering.

https://www.notebookcheck.net/Merciless-Apple-M1-Mac-mini-ta...


Nice dishonestly selected single benchmark screenshot.

https://nanoreview.net/en/cpu-compare/apple-m1-vs-amd-ryzen-...

The Ryzen 5900HX is more than 70% faster than the M1 in Cinebench Multicore... and 1% slower in single core.

You know. Real world shit.

Also: max 16GB RAM for the M1. L Oh fuckin' L. Even my Celeron laptop could do more than that.


You're responding to a power efficiency result by citing a pair of raw performance results. That doesn't make a whole lot of sense, especially when discussing mobile hardware. Differences in power efficiency usually matter a lot more to mobile hardware than differences in raw performance.


The one you're defending... linked a benchmark of the Mac Mini. Not an M1 based Macbook.

Try again.


Using the Mac Mini form factor and a comparable Ryzen mini PC to judge the processors' suitability as mobile chips is certainly less than ideal, but it's not as worthless as you seem to be implying. In particular, this kind of comparison makes it far easier to get a more apples-to-apples comparison of the chips themselves, without other system design decisions as serious confounding variables. You won't find AMD's top mobile chip in any form factor that remotely resembles a MacBook Air, for example. And when trying to compare a MacBook Air against a Ryzen notebook that's 3-4x as thick, you wouldn't be able to make as strong a conclusion that it's the chip itself and not merely the heatsink responsible when the Ryzen does deliver higher performance.


What do you lose? You get iOS apps, much better battery life, a cool and fanless computer, a Unix OS with the ability to install Linux. What is the loss you refer to?


You are totally wrong. I too love my M1 MBA and I have a huge pile of other notebooks. The weight, the battery lifetime, and all this with stellar performance. It is literally the best notebook I have ever owned.


The M1 Air is a lot faster than its Intel predecessor, the battery runtime is at least 50% longer in real use, and the thing is fanless and silent; I haven't burned a single body part yet. That alone is revolutionary.

The machine may not have changed in appearance (and sadly neither has its so-so display), but the chip difference alone makes it a whole different ballgame.


Fanboys have driven the word 'revolutionary' into the ground so deeply that it ceases to have any actual meaning anymore.

Meanwhile, through temporal prescience, the previous Intel based Macbooks were simultaneously revolutionary and merely tolerated before the M1 variants ever existed.

Don't you guys ever get tired of twisting your Apple apologetics into a Gordian knot all the time?


By what metric do you consider the M1 revolutionary? "Inch by inch" and "revolutionary" are a bit contradictory, no?


It is indeed revolutionary, because it is at the heart of the first widely popular fully-functional powerful general-purpose ARM-based laptop ever being produced on a massive, industrial scale. If that's not being revolutionary, I don't know what is.


> first widely popular fully-functional powerful general-purpose ARM-based laptop ever being produced on a massive, industrial scale.

I submit if you need to qualify your use of the word "laptop" with fully eight adjectives or adjectival phrases to avoid ambiguity with pre-existing products, that it's probably not "revolutionary" by definition.

The M1 is a good CPU. It's incrementally better than the Intel CPUs used in earlier versions of the same product.

It is very notable in that it was produced in-house and not purchased from Intel. And that says important things about the business climate in which Apple finds itself. But that's not the same thing as "revolutionary" in a technological sense. It does what competing chips do, somewhat better.


> I submit if you need to qualify your use of the word "laptop" with fully eight adjectives or adjectival phrases to avoid ambiguity with pre-existing products, that it's probably not "revolutionary" by definition.

I count five: It's the first "widely popular", "fully-functional", "powerful", "general purpose" ARM-based laptop ever being "produced on a massive, industrial scale"

My counterpoint: it's revolutionary because it's the first ARM-based laptop which doesn't have to make a ton of compromises to make it out the door.

It's got mass-market appeal ("widely popular", unlike the PineBook Pro).

It's not limited in functional scope ("fully functional", unlike Chromebooks).

It's not limited in computing power ("powerful", unlike any other ARM laptop).

It's not restricted to a subset of tasks ("general-purpose", unlike the iPad Pro when treated as a laptop).

It's being mass-produced for retail, i.e. it's not a limited-run or a prototype ("produced on a massive, industrial scale").

Every other attempt at an ARM laptop has made one or more of these compromises; the M1 Macbooks (and the M1 Mini) don't have any of these compromises, meaning that it's fit-for-purpose for the vast majority of laptop users (those for whom a Macbook would have sufficed before the M1 line).

I think being good and useful is pretty revolutionary; I haven't seen another ARM laptop offer that, and those are pretty important features.


>My counterpoint: it's revolutionary because it's the first ARM-based laptop which doesn't have to make a ton of compromises to make it out the door.

This should not be understated. You could hand an M1 Mac to anyone and unless they were technical and understood what you were handing them, they would neither know nor need to know that it has an entirely different CPU architecture.

It's totally transparent. Not one application I tried did anything other than just run as I would have expected. Heck, the vast majority of x86 Windows software I tried - either under Crossover (a commercially supported version of WINE) or in the Windows 10 ARM beta under Parallels - worked just fine.

And very speedily too. It was not apparent that emulation was going on - at all.

It really is quite astonishing. As they say, seeing is believing - in this case, using is believing. The overall feel of the system is just not something that's easy to articulate. It's not just about running a benchmark or application quickly; the whole thing is just more responsive.

The only reason I took my M1 MacBook Air back on the last day of the return window was that it turned out it could do everything I want, if only it had more RAM. Which was NOT my starting position; the M1 ran so well that I ended up realizing it could do everything I wanted and more if I could get at least 32GB of RAM - 64GB would be perfect.

So I'm rather impatiently awaiting the next round. As soon as I can get more RAM I'm so getting another Apple Silicon laptop!


> The M1 is a good CPU. It's incrementally better than the Intel CPUs used in earlier versions of the same product.

I suppose everything can be called incrementally better from some viewpoint, but calling it incrementally better than the Intel CPUs used in earlier versions is wild to me. I specifically switched to my first ever Apple product because of how revolutionary the generational step was compared to what we've been getting on x86 laptops year to year; no equivalent x86 laptops of the year came close. Not only is it comparable to my overclocked desktop in tasks using 1-4 cores, it does so without kicking on fans, and the battery life is astounding while it does it. It even runs emulated x86 software faster than the native x86 version from the same year.


Here is what I find confusing. When people say the M1 is revolutionary, it really seems to suggest that the playing field is "available CPUs on the market". But, then some mean it to be "no, just compared to old CPUs in the previous generation of the same product".


It trounces the majority of either of those - which is something a revolutionary product would do.

It's not until you get to the highest end desktop CPUs from Intel or AMD that you see competition with the M1 on a per core basis.

Keep in mind this is Apple's lowest end, low power, mobile part. It wipes out all but the highest end desktop processors - not just on a per-core basis, but even on a multicore basis - again, for all but the highest end.

Nope, I don't think revolutionary is hyperbole at all. It also makes me wring my hands with glee thinking about what a part designed for the desktop, where higher power and cooling are readily available, will look like. Where adding more cores makes a lot of sense. How big of a gap is there going to be with that SOC?

The prospects of new hardware are once again exciting. CPUs have been stagnant for a LONG time now. I was simultaneously relieved and disappointed to learn my i7-7700K was still not that far off from the latest CPUs - Hardware Unboxed had a great video: https://www.youtube.com/watch?v=wAX1lh985do

At best - on average a 30 FPS difference between my lowly 7700K and the highest end parts? Yeah, newer parts have more cores - but most games are not well optimized for multiple cores. On one hand I'm relieved - no need to spend money on upgrades since it just isn't worth it. On the other hand - it isn't worth it to upgrade! And what nerd doesn't like having the latest tech?

So yeah, it's nice to have someone pushing the boundaries and something to be truly excited about. Also stark contrasts like this tends to light a fire under others, if nothing else. Rising tides raising all boats and all that...


> It trounces the majority of either of those - which is something a revolutionary product would do.

This isn't even remotely true. Computers are really good at doing more than one thing at a time. I find it mind-boggling that there is such a focus on the performance of a single core; this hasn't been all that relevant for the last 15 years. For what it matters, the single core difference isn't all that big regardless (one M1 core performs about the same as one 5980HS thread). When you want to know how powerful a CPU is, how about comparing... you know... what it can do?

And when it is asked to do everything it can, I can reiterate that the M1 is nowhere near the competition. The 5980HS mobile CPU will get the same multi-threaded job done in half the time, which is a huge difference. Which is why discussion threads like these feel... chilling. Like I stepped into a cult meeting, where we praise normalcy as revolutionary. It's not even close to the last generation of CPUs in terms of compute power. It's just a step up from the Intel CPUs Apple used in the previous MBPs.

And, it's not that the M1 is a bad CPU. It's actually pretty good for what it does, and it is exceptionally good at power efficiency. The 5nm process and low thread count do contribute significantly to the power performance, so it will be interesting to see how future M1 processors perform.

Imagine if the M2 doubled core count, and thus doubled multi-core performance compared to the M1. It would then be just as powerful as the current 5980HS. Another revolution no doubt.


While I agree, I think you're not giving credit to RISC + Apple's vertical integration here. What does it do? It uses a lot less energy per instruction.

I think that other manufacturers will look into their energy usage as well. That's quite revolutionary, because it seems to me that it has shown a proof of concept with a lot of room to grow. Less energy means less heat, and less heat means you can crank up clock speeds.

My thinking is simplistic but I think someone with more understanding of it will tend to agree with my high level view and tell you exactly why this is revolutionary.

Note: I'm not giving Apple full credit here, I'm giving it half credit. The other half goes to the invention of ARM and RISC micro-architecture in general.


That's still evolutionary, not revolutionary. Every CPU intended for laptops over the last 20 years has increased performance and reduced power consumption compared to the generation(s) before it. Apple didn't buck a trend; they didn't do anything new. They "just" had a really good execution of the standard, tried & true improvement path.

This was the whole marketing push behind projects like Intel Centrino - to drive even lower system power consumption by mandating certain combinations of parts that worked well together. Which continued with things like Intel's Project Athena & Evo.

Nothing Apple did with the M1 changed the game. It's the same game, it's the same race it's always been, they just are now in the lead of that race with their car. Which is impressive in its own right, but definitely not "revolutionary"

> Less energy means less heat, and less heat means you can crank up clock speeds.

This goes the other way around. To improve efficiency you reduce the clock speed and increase IPC instead. That was the M1's advancement over the status quo: a significant increase in IPC. Clock speeds didn't change - in fact, they regressed by a tremendous amount. This regression in clock speed is how the M1 consumes so little power by comparison.
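A toy sketch of that tradeoff (the numbers are illustrative assumptions, not measurements): performance is roughly IPC × frequency, while power grows superlinearly with frequency, so a wide, lower-clocked core can match a narrow, high-clocked one at far lower power.

    // Performance ≈ IPC × clock frequency.
    struct Core {
        let ipc: Double   // instructions per cycle
        let ghz: Double   // clock frequency
        var perf: Double { ipc * ghz }
    }

    let highClock = Core(ipc: 2.0, ghz: 5.0)  // clocks for speed
    let wide      = Core(ipc: 3.2, ghz: 3.2)  // width (IPC) for speed

    // Similar throughput (10.0 vs 10.24), but the lower-clocked wide core
    // avoids the voltage penalty of sustaining 5 GHz.
    print(highClock.perf, wide.perf)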


No, it doesn't. If it did, we'd be seeing similar performance from other ARM CPU vendors, and we aren't. Apple pulling far ahead of Intel is largely independent of them using ARM and more a function of how much investment they poured into their own chips.


IMHO for every invention that's the definition of "revolutionary" that matters.

The first lightbulb isn't revolutionary, it's a nifty toy invention that doesn't affect anything, much less cause a revolution like the first practical efficient general purpose mass produced lightbulbs did.

The first radio isn't revolutionary, the first practical efficient general purpose mass produced radios make the revolution.

The same for computers, the same for smartphones - iPhone was revolutionary because it impacted the world; while its competing predecessors that had almost the same tech did not and so were not revolutionary.


lol - if you had actually used one, rather than pontificating in an Internet comment thread, you would not be describing the M1 as an "incremental" improvement.

It's night and day. Readily noticeable by non-technical people. It's not just faster, it's snappier, more responsive and just has an all around different feel.


> If that's not being revolutionary, I don't know what is.

Nothing changes in the way people use their computers. It's a performance leap, but there have been many such in the past. Back when PCs and laptops got 2 cores instead of 1, "real" multitasking was finally possible and performance doubled almost overnight. That was IMHO a bigger leap than M1, but I don't hear people calling Core 2 Duo and Athlon X2 "revolutionary".


It's not just a performance leap. The whole machine feels a hell of a lot more responsive. Or tight. Or like it's reading your mind - it's hard to describe. It's just snappier, and it's not a subtle difference either.

It also delivers incredible battery life, and runs a lot cooler so you don't have annoying fans or face throttling as much, if at all, under load. No more cases hot enough to burn skin.

If you think you understand the M1 just by looking at benchmarks or reviews, you are sadly mistaken. Use one of these machines for a couple of days and you will more than likely not want to give it up.

Heck, there are developers ditching their iMac Pros in favor of M1 Mac Minis for everything but long compiles. When Apple releases desktop versions of their SOC, things should get stupid spicy. And I'm very much looking forward to it.


I love my M1 Air, but ultimately it's a moderately faster version of what I had before. The real benefit, so far, is that I have a fanless machine with the same kind of power as the previous pro level machines. Otherwise, it hasn't exactly changed what the computer is on a fundamental level.


The fact that I can forget my power cord and not care is a game changer.


> If that's not being revolutionary, I don't know what is.

Then you might want to look up the definition. A revolution is "radical" and makes "fundamental changes in the socioeconomic situation".

Life after the printing press / steam engine / transistor fundamentally changed. Even if tomorrow, all x86 chips were replaced with ARM chips, what radical and fundamental changes to society would take place?


Even if that were true, it would still just be an incremental improvement. We've been mass marketing laptops for decades. Switching to a different architecture while a big engineering task isn't revolutionary.

But it's not true. ARM based Chromebooks have been on the market for years already and cornered educational markets and had massive consumer success overseas.


One interesting thing about the M1 is that a single chip (currently) powers Apple's tablet, entry level laptop, entry level desktop, most recent all-in-one desktop, and high end pro laptop. The differentiation is done elsewhere. What is missing is the high end pro level hardware, but if the rumors are true, those will have the same but "more of it". It's the first chip I know of that answers "should we optimise for speed or for power consumption?" with "yes".


Simply by application of existing cutting-edge technology (mobile chipset) to a different use-case (laptops/desktops).

The writing has been on the wall for almost a decade, but the execution required to achieve it is definitely remarkable.

My M1 MBA is better than my 2020 Intel MBP (which cost over 2x) in almost all ways that matter: quieter, cooler, battery life, and incredible responsiveness.

And x86 will never catch up. That's why it's revolutionary - it has destabilized the existing regime.


In which particular area are you suggesting that the M1 is ahead for x86 to catch up?


The A15 performance cores seem to be equivalent in performance to AMD Ryzen 5950x desktop cores, at 1/5th of the power budget.

True, AMD has a few tricks up their sleeve, and is a node behind Apple at TSMC, but on a purely technical level it's clear that Apple is ahead of both AMD and Intel.


The power consumption is very impressive indeed. But I think there are many compromises to be made when scaling up, so we'll see how the M2 performs in this regard. However, aside from power consumption, I find it interesting that people refer to it as such a revolution. Maybe I'm just imagining an unspoken "... for what is to come". Because as is, the only thing the M1 does better than its competition is arguably power consumption. The 5950X you mention, although an apples-to-oranges comparison, will outperform it on highly parallelizable tasks by a factor of 4. Throw in a top end GPU and it becomes a factor of 14.

Now, power consumption may be the be-all and end-all for some people. If your work tasks don't actually require a lot of computational power, then I can see it being much more valuable to have the flexibility of a cool laptop with a long battery life.

But touting the M1 as revolutionary, because all that matters is power consumption, feels like a disconnect to me. I'm all for people liking stuff they like. But I don't get what's revolutionary at all. I was interested in hearing what people thought about it, though.


quoting my earlier post:

> in almost all ways that matter: quieter, cooler, battery life, and incredible responsiveness.


The reason why I asked was because you say x86, but you don't mean x86 competitors; you mean the older generation of Apple's x86-based hardware. Also, three of those things you mention are arguably the same: quieter, cooler, and battery life are all tied to power efficiency, which the M1 excels at. So it is kinda like every discussion thread on the M1, everyone kinda suggesting it is a powerful CPU, while it is just power efficient. From a computational point of view, where it matters, it is far behind the competition. "Catching up to x86" in this regard makes very little sense, but you also explicitly omit computational power, which of course is accurate. I certainly would not think this is revolutionary for a CPU in any sense of the word, but to each their own.

This discussion also comes up again and again on HN regarding the M1. People say weird things, like that they never experienced such a jump in computational power from one generation to the next. And what they tend to mean is from older Apple hardware to newer Apple hardware.


By the metric that I can have an entry level MacBook outperform maxed-out MacBooks of a previous era. Oh, and do that with a ~20 hour battery life. And no fan.


It is still the fastest single-threaded performance available on any chip you can buy today: Laptop, desktop, server.

https://www.cpubenchmark.net/singleThread.html

(Available in a fanless ~$1k laptop, no less!)


At Passmark, anyway, in a chart that seems to have some rather questionable orderings to it. Like how the Ryzen 5900 is, according to that chart, AMD's fastest single core CPU. But that's certainly not the case, as it's just a power-restricted & clock-restricted version of the 5900X, and the 5950X has an even higher clock ceiling.

Similarly on the Intel side they have the 11900 ranked higher than the 11900K. Which again, there's no way that's true since the 11900K is literally an 11900 with a higher power limit & higher turbo frequency. And then the 11900KF, which is just an 11900K but with the iGPU disabled, is then somehow a lot faster than both?

Actual binnings & achievable frequencies vary, of course, but these orderings aren't passing a sniff test, either.

https://www.anandtech.com/bench/product/2687?vs=2673, https://www.anandtech.com/bench/product/2687?vs=2637, and https://www.anandtech.com/bench/product/2687?vs=2787 are better comparison points here. The M1's single-core performance is of course very strong, but I'm not sure why you're surprised that it's "still" the fastest single-core CPU since these CPUs are all the same generation. There hasn't really been anything newer.


In fairness though, the first link you posted had the M1 beating the 5950X in Geekbench 5 single threaded, so that confirms the Passmark result under certain workloads.

In any event, you'll notice the M1 achieves an extremely competitive level of performance with a peak power of 22% of that used by the Ryzen 5950X and 10.5% of the 11900K. That's crazy!

In terms of efficiency there really is no comparison; the M1 is a remarkable product.

We'll have to see how the M1X/M2 goes, but given what we've seen so far, we should see some stellar results from an M series chip designed for a workstation class laptop.


At peak power the M1 is also struggling to achieve 1/2 the performance of the 5950X, if that. The peak power draw is for the multi-threaded workloads, where the M1's 4+4 obviously loses badly to the big 16c/32t 5950X.


Put another way, the M1 achieves half the multithreaded performance of the 5950x while using less than a quarter of the energy. That's pretty impressive.

We'll see how they scale up, with the new MacBook Pros.


> Put another way, the M1 achieves half the multithreaded performance of the 5950x while using less than a quarter of the energy. That's pretty impressive.

That isn't impressive, no. Power scales non-linearly with clock speeds (and thus with performance across a particular CPU design). Compare a 3700X vs. a 3800X, for example: https://www.anandtech.com/bench/product/2665?vs=2613 - huge increase in power draw, barely any performance increase.
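A back-of-the-envelope sketch of that non-linearity, using the classic dynamic-power relation P ≈ C·V²·f; the 1.3x scaling factors below are illustrative assumptions, not measured values:

    // Dynamic power ∝ capacitance × voltage² × frequency. Sustaining a
    // higher frequency also needs higher voltage, so power grows roughly
    // with f³ along a given design's voltage/frequency curve.
    func relativePower(freqScale f: Double, voltScale v: Double) -> Double {
        return v * v * f
    }

    let base    = relativePower(freqScale: 1.0, voltScale: 1.0)
    let boosted = relativePower(freqScale: 1.3, voltScale: 1.3)
    print(boosted / base)  // ≈ 2.2x the power for a 1.3x clock bump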

You'd have to compare at equivalent power levels to draw a meaningful conclusion like that. Especially since the 5950X is paying power for things the M1 just doesn't have at all, like the large amount of PCI-E lanes. But otherwise for a wall-powered system quadrupling the power for doubling the performance is a quite straightforward "yes please!" tradeoff to be made. Often the gains are much smaller than that.

Of course the M1 also isn't designed to excel at heavily multithreaded workloads having "only" 4 big CPU cores, so it's an unfair fight in that direction as well.


The revolutionary aspect of the M1 is that the processor for a computer is once again under the control of the computer's designers. For thirty years it has been outsourced as a commodity and optimized for the interests of the CPU producers. The centralization of CPU design had benefits of scale, but maybe at this point Apple's scale is big enough that taking control of the design is more advantageous.


I see no contradiction.

The M1 powers by far the best ARM-based laptop available today - all ARM laptops henceforth will be compared to it. Revolutionary.

Apple's iterations on ARM have been regular and consistent for a long time.


Not just ARM. The M1 is a point of reference for any portable computer.


Well, that's not true. The M1 is great for certain workloads, but I'd take my G14 any day of the week, with its Nvidia RTX 3060.


And, I might add, all laptops of any architecture.


When you look at the history behind the M1, it's been an inch-by-inch progression, grinding away at that CPU architecture. The culmination of that work is having one single CPU power iPads, macbooks and iMacs, which I think can reasonably be called revolutionary.


its the first ARM based PC that doesn't suck. Microsoft Windows ARM didn't move people to ARM based PC while Apple succeed.


Early Acorn Archimedes users would like a word about that "first" claim, it misses by a few decades!


ARM was a company created by Acorn, Apple and VLSI. The ARM was the Acorn RISC Machine, and I owned one, but the first ARM chip was by Apple as much as Acorn. They didn’t decamp dozens of engineers to Cambridge for no reason…


Not sure who downvoted you - I completely agree. People have been talking about Arm as the next generation for a long time. I would suggest that Amazon made it successful with Graviton in the cloud, and Apple made it successful on the laptop/desktop with the M1.


Wouldn’t you imagine every revolution is really a catalyst moment that follows years of hardly-noticed incremental changes that set the stage for it?


It was developed gradually: The A series is already at number 15!

Macs suddenly switched to these ARM processors. That's the revolution, and it's numbered M1, but it's based on the gradually developed A series.


Do revolutions typically start over night or do they build up until there's a tipping point? Apple has been working towards a revolution, the M1 is the tipping point.


Tipping towards what? How will having this laptop change your life over having another laptop? What can you do now that you couldn't do before?


Good question. It completely untethers me from my desk. The battery life is so good I can get through an entire work day without worrying about power at any point. This frees me up to leave (e.g. for a meeting, to work in the park or just on a couch, or simply being frazzled and in a hurry) without even considering a power brick or when I'll be able to plug in again.

To have that freedom in a 3 lb laptop with virtually no performance compromises is incredible.


I would take an M1 MacBook Pro over my Intel MacBook Pro in a heartbeat. It's an overheating, battery-dying piece of shit, to put it lightly.


ARM replacing X86 as the desktop computing processor of choice.

A revolution doesn't have to benefit you. This benefits Apple financially, any benefits you receive are secondary.


How so? If I build a house with a revolutionary blueprint, but I build that house brick by brick to make sure it's the best representation of that blueprint possible, is it still not a revolutionary home?


It’s not. The release was somewhat revolutionary (it’s not all it’s made out to be, but there are many remarkable things about it).

Before releasing it though, they worked up to it (in secret), 2.54 centimeters by 2.54 centimeters.


The M1 is evolutionary. What I think will be truly revolutionary is the AR glasses that Apple makes in a couple of years using the next generation of that silicon.


Revolutionary incremental improvement?


Or rather "evolutionary"?


I believe anyone who read AnandTech's A-series analysis articles knows that their SoCs have been really, really great since around the A11 era. The A7 was also the very first ARMv8 implementation.


100%. The M1 reminds me of the leap from a BlackBerry to my first iPhone. With my M1 laptop, I never have to worry about bringing my charger. My Intel MBP always had slowdowns; with my M1, I've never noticed one.

The M1's compute-per-watt will let designers shrink laptop dimensions to be even smaller and more compact. If the iPhone's A15 starts to approach M1-level compute power, we could actually see the iPhone become the device powering our monitors and keyboards. The new possibilities of such a revolutionary technology are endless!


>> we could actually see the iPhone merge to become the device powering our monitors and keyboards.

I highly doubt that, because it would mean Apple sells zero MacBooks, and they don't want fewer sales.


M1 is an implementation detail.


* ...an implementation detail that makes the implementation possible.


It makes them predictable and vulnerable.


Yes and:

Riding Wright's Law.

> I'd argue the M1 is a revolutionary product on the same level as the first iPhone.

This was first obvious with Apple's move to 64-bit for mobile, and it should be more obvious with the Apple Watch. While the M1 is quite the competitive advantage, there's nothing on the horizon to compete with Apple's wearables; Apple dominates both the profits and the market share.

I don't know what superlatives to use. Nor do I know how to summarize, much less describe, Apple's strategy. But it's working.

Apple focused on profitability. Capture the highest margin market segments, then ride the price curve down. This sucked all the oxygen out of the competition.

Similarly, Apple's monopsony strategy repeatedly boxed out competitors.

Most infuriatingly, Apple laser-focused on select differentiators and competitive advantages to the exclusion of many others. For example, thinness for thinness's sake, at the expense of reliability and repairability. Why? Because they could, and copying them was prohibitively expensive. Again, boxing out competitors.

--

Contrast and comparison sometimes helps understanding.

Two other examples of Apple's Druckeresque strategy are Tesla and SpaceX. Capture the profits, plow the capital back into R&D, relentlessly drive Wright's Law.
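
For anyone unfamiliar: Wright's Law (the experience curve) says unit cost falls by a fixed fraction every time cumulative production doubles. A minimal sketch in Python, with entirely made-up numbers, of what riding that curve looks like:

    import math

    def unit_cost(cumulative_units, baseline_units, baseline_cost, learning_rate=0.85):
        # Wright's Law: each doubling of cumulative volume cuts unit cost
        # by (1 - learning_rate), here 15%. All figures are illustrative.
        doublings = math.log2(cumulative_units / baseline_units)
        return baseline_cost * learning_rate ** doublings

    # A hypothetical $100/kWh battery pack cost at 1M cumulative packs:
    for n in (1e6, 2e6, 4e6, 8e6):
        print(f"{int(n):>9,} packs -> ${unit_cost(n, 1e6, 100):.2f}/kWh")
    # -> $100.00, $85.00, $72.25, $61.41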

My hunch is the core strategic focus is on lowest cost of capital. What moat most protects the dominance of AWS, Apple, Tesla, SpaceX? I think, compared to their competitors, it's their lower cost of capital?

Duh, right?

So while I think it's fine to speculate and cheer, it's hard to get excited by competitors who a) don't have cheaper capital and b) aren't driving Wright's Law for a crucial competitive technical advantage.

Toyota used to have these advantages. But somehow lost the narrative. I think the case studies will argue that Toyota missed two opportunities. First, they punted on owning their own digital hardware and software, whereas Tesla treated the car as an iPhone on wheels, a new platform. Second, most importantly, Tesla jumped into EVs at the very moment Li-ion became feasible, and decided to relentlessly drive Wright's Law for battery tech. The other manufacturers simply waited too long to jump in. Tesla's Li-ion flywheel (access to capital, scale, Wright's Law) will kick in within a few years, cementing their dominance.

Of course, Tesla's story isn't finished. Like all hard-driving efforts balanced on a knife edge, Tesla could implode. But if they avoid a cash crunch, it's hard to imagine any EV competitor catching up. In that way, Apple's access to cheap capital is a great deal more robust.

FWIW, I anticipate Starlink will become SpaceX's Apple style money printing machine. And once that revenue kicks in, the existing cell and cable carriers will be buried. The Mars mission and national defense parts of SpaceX's story are fun and legit. But unlike all their competitors, SpaceX will be self-sustaining on Starlink.

--

Thanks for reading this far. I write to understand. And this narrative has been eluding me.

--

PS- The outlines of next technical turf battles are pretty obvious.

Apple's next big challenge is baseband chips. They know this, of course. But for whatever reason, eliminating their dependence on Qualcomm is very hard. And whoever productizes the successor to 5G, whatever the satellite uplink chipset shapes up to be, will be sitting very pretty.

The Li-ion story wrt BEVs now seems pretty obvious. Everything from light trucks to skateboards will be almost fully BEV by around 2030. Tesla will produce about 1/10th of the world's batteries, and maybe 1/10th of its Model 3-sized automobiles. And like Apple, they'll capture most of the profits.

The next frontier will be hydrogen. The hydrogen story today looks like Li-ion around 2005, +/- 3 years. Hydrogen's first use cases will be all the stuff BEVs missed, like trucks, transportation, and utility-scale energy storage. Someone like Toyota could survive long enough to see their hydrogen bet pay off. But I think it's more likely that new entrants with access to cheap capital will prevail.

Anyway.


>Apple focused on profitability.

Quite the contrary. Apple focuses on experience - and knows that if you deliver a kick ass experience, profit will naturally follow.

Apple also is one of the few tech companies that seems to be willing to say "no" and not be all things to all people. It's often misinterpreted as only targeting the profitable markets, as if that's a bad thing. Not pursuing unprofitable markets seems to be common sense to me - call me crazy!

Microsoft of the '80s and '90s focused on profitability and relentlessly chasing backwards compatibility, and it worked - for a while. Until you had the Windows division actively undermining other portions of the organization it saw as competition. Memes like this are no accident: https://www.globalnerdy.com/2011/07/03/org-charts-of-the-big...

Then the next wave of computing - mobile - happened, and they suddenly were on the outside looking in. Much like DEC, Prime, and heck, even Sun were when the PC revolution happened and dethroned minis.

As for baseband chips - radio is freakishly unforgiving. Apple did buy Intel's baseband modem operation lock, stock, and barrel, so in three to five years - who knows? They certainly are motivated and have the resources to make a solid go of it. And they have certainly proven they can expand their expertise rather competently when they are motivated to. I don't think the Swiss are as dismissive of the Apple Watch as they initially were; some even admit the band designs Apple came up with were an improvement. That you can take segments in and out of the metal segmented band with just your thumbnail is, dare I risk saying it, revolutionary. If you don't think so, I have a Timex band that flummoxed experienced jewelers - let's see how you do with it.


While seeing these deep dives on A-series processors is fascinating, I find how powerful they are to be highly disappointing.

Here you have undoubtedly some of the finest processor technology available, manufactured in the hundreds of millions, wasted by the constraints of its platform. If the Pi Foundation had access to cores, processes, and even pieces of this level of technology, the overall computing world would be so much more capable.

Even beyond that, the iPhones themselves? Severely limited by the I/O they possess. As these devices age they could serve several recycled general-computing purposes, but they're constrained by the single Lightning port. iPhones use NVMe storage; they have PCIe lanes!

All this work, effort, and engineering for a platform that simply seeks to make more off the revenue of whales in its app store.

Edit: I think my comment around Pis is a little misleading. What I meant is the general educational Linux community, not specifically the Pi Foundation. If, after X years, Apple unlocked the bootloader and you could install other operating systems to better leverage the hardware, I'd feel better about the state of it.


You've got a point. Back in the iPhone 4S days, I speculated that some day there could be a really interesting market for second-hand iPhone "motherboards" to be put to use in… oh I dunno… robotics, OLPC-type things, even blade servers… Of course I knew that Apple would fight rooting (and then the secure enclave came along) but it was fun to imagine.

The article mentions that several leads from the Apple chip division bailed for Nuvia, so at least the future of non-Apple boards looks a bit brighter.


Reading your comment: we certainly have much more computing power, but it's no longer general-purpose. It's interesting to see cars waiting to get built while, at the same time, yesterday's iPhone gets tossed as e-waste. If only there were a way to make these commodities transferable between industries.


>wasted by the constraints of its platform

I mean, ignoring that there are some fairly serious number-crunching things to do on iPhones (I think the lidar and basic 3D scanning features introduced with the iPhone 12 Pro are already quite useful professionally, for example, like smartphone cameras themselves), how can you say this following Apple's switch of Macs to their chips too? Obviously the economies of scale and constraints generated by iPhone development have fed Apple's move up the stack in classic disruptive fashion, first to the iPad and then, last year, to the Mac. However you feel about smartphones themselves, it seems more than a little odd to see this comment now, in 2021, given that this silicon is going straight into more classic personal computers as well.


I'm excited about the M1 Macs and the fact that they have bootloaders which can be unlocked. This doesn't change the fact that there are vastly more iPhones in existence that will never be used to their full potential.

Everyone is buying a magical teleporter, but it'll only take you down to the corner store. It doesn't matter if 95% would only use it for that anyway; the fact that the other 5% can't is what makes me sad.


>This doesn't change the fact that there are vastly more iPhones in existence that will never be used to their full potential.

Depends, I guess? Having a lot of power on tap can show up in transparent ways as well as in obvious apps. Apple is definitely leaning heavily on its SoC for a lot of its computational photography. Pros may prefer to just shoot RAW, for decent reason, but having a system that makes "good shots" fairly point-and-shoot at the cost of a ton of compute is a real value-add for lots of regular people. A lot of people also do some real gaming on their iPhones these days, and those games can absolutely push the GPU. Maybe that doesn't fall into your definition of serious, but I don't think it's quite fair to dismiss any use case valued by customers. There are also extremely practical energy-saving techniques like the race to sleep: often, the faster a chip can do a job and then return to hibernation, the better the energy efficiency. A lot of normal usage is very bursty, and, like it or not, on the modern web there are tons of sites that can hammer SoCs pretty hard. Everyone cares to some degree about battery life and responsiveness in handhelds.
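
The race-to-sleep point is easy to see with back-of-envelope arithmetic. A toy Python comparison (all numbers hypothetical) of how a core that burns more power but finishes sooner can still win on total energy:

    def energy_joules(active_w, active_s, idle_w, window_s):
        # Total energy over a fixed window: busy burst plus idle remainder.
        return active_w * active_s + idle_w * (window_s - active_s)

    WINDOW_S = 10.0  # hypothetical window in which the job must finish
    IDLE_W = 0.05    # hypothetical sleep-state power draw, in watts

    # Fast core: 4 W for 1 s. Slow core: 1.5 W for 5 s. Same work done.
    fast = energy_joules(4.0, 1.0, IDLE_W, WINDOW_S)  # 4.0 + 0.45 = 4.45 J
    slow = energy_joules(1.5, 5.0, IDLE_W, WINDOW_S)  # 7.5 + 0.25 = 7.75 J
    print(f"fast: {fast:.2f} J, slow: {slow:.2f} J")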

If there was some big cost for this that'd be one thing, but there really isn't given the economics of modern silicon design and fabrication. Apple amortizes R&D big time across the iPhone, iPad, Mac, and even stuff like the Apple TV. Relentlessly pushing forward the units on the phones feeds directly into everything else. And even if most people only use the full potential a fraction of the time that may be a very valuable fraction, and what will take off in the future that might use it isn't always clear.

Though one thing that is clear is that wearable displays and serious AR/VR are the next big disruption/extension event for computing, and Apple, like every other player, needs to be ready. That's going to take a ton of compute power along with highly evolved environmental sensor usage and fusion. They have to have been working towards that for years and years beforehand, and clearly are.


I don't understand the comment wrt the MacBooks either.

It looks more like sour grapes that one of the most prestigious computing companies on the planet is out-engineering the "foundation" spin-off of Broadcom, created (very successfully, I might add) to breathe life into an aging chip foundry.


"It looks more like sour grapes"

He wants that chip in a Raspberry Pi. Not sure what you call that, but it's quite the opposite of sour grapes.


"sour grapes" (2): "feeling or expressing resentment, disappointment, or anger"

Seems appropriate to me.


I updated my post to reflect my feelings better.


> If the Pi foundation had access to cores, processes, and even pieces of this level of technology the overall computing world would be so much more capable.

Even if they had access to it, I don't think it would be anywhere near as affordable as the Pi Foundation wants their devices to be. An M1-powered Raspberry Pi would be amazing, but also probably several hundred dollars.


Which Intel sells as NUC.


> Severely limited by the I/O they possess

My iPhone 11 and iPad Pro have more, and more varied, I/O devices available than any of the ~6 computers and dozen or so video game consoles in my house. Like, I could buy more for the others to make them compete, but the i-devices just come with tons of stuff, plus can connect to lots of the same USB devices my other computers can (including, notably, anything MIDI), plus most of my bluetooth peripherals. And they can use external monitors/TVs (mirroring only, but still).

I guess there's a chance my Nintendo Switch comes close, but then, it can't connect to as many BT and USB devices.


This shows how important TSMC and the fabrication-node advantage are for Apple. The A15 seems to be an all-around good chip, just without the magnitude of performance jumps (20%+) Apple has shown since the A7. For reference, this chip is on an improved TSMC 5nm node (N5P), but it's still 5nm nonetheless, like last year's A14.

Last year's M1/A14 cores had performance gains from the combination of architecture improvements and the shift from TSMC 7nm to 5nm fabrication. At the time it was difficult to determine what share of the gains was attributable to each; now it seems we have a better idea that the node advantage could account for the majority.
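
One way to frame the attribution problem: generational speedups compound multiplicatively, so if you can estimate the node's contribution you can back out the architecture's. A rough Python sketch with hypothetical numbers:

    # Speedups compound multiplicatively: total = node * architecture.
    # The node figure below is a guessed estimate, not a measured value.
    total_gain = 1.20  # an observed ~20% generational uplift
    node_gain = 1.07   # hypothetical uplift attributable to the process node

    arch_gain = total_gain / node_gain
    print(f"implied architecture uplift: {(arch_gain - 1) * 100:.1f}%")  # ~12.1%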

It'll be very interesting to see what happens with the next M-series chip announcement, e.g. the M1X or M2. I still think it's plausible Apple moves the M-series chips to 3nm first in order to cement their superiority over Intel. That said, porting architectures across nodes is no small feat (as we saw with Intel's Rocket Lake, though that was small -> big).


ARM SoCs developed for Android devices, on the other hand, are consistently way behind in performance and are nearly all based on stock ARM designs.

I've been waiting for years for someone other than Apple to make a fully custom ARM SoC that can compete with Apple.

I suspect there isn't a lot of incentive for Qualcomm to do it, as they have nearly a monopoly in the Android space.

I've been hoping that AMD or Intel would step up and make an ARM core that crushes Qualcomm.

I just got a used iPhone 7 for my SO, and despite being 5 years old it's still really smooth running the latest iOS 15. It's the only iPhone I own (the rest are Android) and I'm a bit jealous.


Qualcomm dominates at radios, which matters far more than single-core CPU performance. So there's no pressure on Qualcomm to be better, and really all they do is take off-the-shelf ARM designs for the CPU cores anyway. So it'd really be ARM themselves falling behind. Perhaps Qualcomm is pressuring them to keep die sizes small, so ARM just isn't willing to make a proper "big" CPU core. Or maybe ARM themselves just don't have the right incentives to do so. Who knows.

But the end result is that nobody really wants to compete against Qualcomm in the mobile phone SoC space, since Qualcomm's radios mean they win by default. Then in the laptop & server space, Intel & AMD have no incentive to switch to ARM instead of sticking with x86. At least not at this point in time.


> Qualcomm dominates at radios, which matters far more than single-core CPU performance.

That is nonsense. I switched from an iPhone to a Pixel 4 post-CSAM scandal, and the difference is night and day. My phone now makes me wait. Every single fucking day I wait. Wait wait wait. Slow slow slow. And this is a phone without the bloatware. Fucking miserably slow. Not in absolute terms necessarily - it would have been competitive 10 years ago. But in 2021, Qualcomm is fucking over every single Android user, and their management should be ashamed.

I assume this nonsense about single-core performance being good enough is how the industry (Apple aside) has ended up in such a sorry state.


Sorry for not being clear. I meant that radios matter far more to carriers & OEMs, the people who are actually Qualcomm's customers.

For carriers this lets them get out that first 5G phone and all that nonsense. For OEMs this drastically reduces engineering complexity (and therefore improves margins) by avoiding a separate radio chip.


Ah, that makes sense. I think I'm on a hair trigger about people saying Qualcomm chips are fast enough for consumers, mostly because I'm frustrated and upset by their chips every single day. But I'm not a Qualcomm customer, just a lowly user.


I will admit it's super frustrating to read a review of a flagship Android phone, even from reputable places like AnandTech. They'll rave about the latest Snapdragon and how it's a 10-20% improvement year over year. And yet it's 40% slower than last year's iPhone. I understand Android users don't have a choice. We can't get the A15. But journalists should be telling the world the truth, over and over, in every flagship Android review: "Your $1500 Galaxy S21 Ultra has a worse SoC than a 3-year-old iPhone."

I've been an Android user since the original G1, and every year I wonder why I don't switch to an iphone. Maybe this year.


“I switched from iPhone to a Pixel 4 … And this is a phone without the bloatware.”

Mutually exclusive statements.


What distro?


GrapheneOS. A degree of the slowness might be the extra security hardening. Not that that's an excuse, because, aside from the built-in spyware, iOS is just as locked down, if not more so.


iOS is probably not nearly as security-hardened as Graphene (or even plain AOSP). iOS relies mostly on "security through obscurity" rather than actual hardening. Apple hopes to maintain security by making the system so complicated and proprietary that it's difficult to understand, let alone find bugs in. Graphene/AOSP hopes to maintain security through real open-source improvements that can be independently verified. It's hard to find bugs there not because the sources are unavailable, but because there are usually fewer bugs in the first place. So, which approach is better? Well, critical exploits are discovered in iOS left and right, so that probably tells you something.

Also, I don't think it's fair to assume that Graphene plays no role in the slowness (though I don't believe most of the currently implemented hardening is expected to create a dramatic slowdown). Perhaps you should try running the stock ROM for a day or two (with the Google apps disabled in Settings, of course).


The next Pixel (6) might have a non-generic SoC, according to the rumors. It will obviously cost more than a Pixel 4 today, though.


I hope so. I got the 4 because I needed a cheap temporary replacement, and only Pixels are supported by Graphene. I guess the risk is that Google does something funky to prevent forks from working or from taking advantage of the custom silicon. Competition for Qualcomm is way overdue.

I'll probably suck it up and buy a Pixel 6 once Graphene is available for it.


> I suspect there isn't a lot of incentive for Qualcomm to do it as they as nearly a monopoly in the Android space.

That may change soon! Samsung and MediaTek are hot on their tail.

https://www.notebookcheck.net/MediaTek-Dimensity-2000-to-fea...


Qualcomm acquired Nuvia, which is meant to be their custom-CPU attempt. But it's not clear whether they'll bring Nuvia's expertise to mobile SoCs or keep it for desktops & servers.


I updated my old first-gen iPhone SE to iOS 15 and it's still fast. Not nearly as snappy as the newer generations, but it's not laggy or anything.


I wonder if there are any reviews that measure their progress on the camera ISP, which occupies a large amount of real estate on these smartphone SoCs. Clearly Apple invests a lot in this area, and it shows in their presos, yet almost no review mentions how it progresses over time.

Same for media codecs and, to a lesser extent, the NPU kind of stuff.


Interesting SPEC 2017 performance figures for the A15 performance cores vs x86 that the story's author posted in the story comments:

>Comparative subsets would be: 5950X, 7.29 int / 9.79 fp; 11900K, 6.61 int / 9.58 fp; versus 7.28 int / 10.15 fp on the A15.


Is that per core?


That's single-core/thread performance, which isn't quite the same as per-core performance, because you usually won't get N times the score by running N copies of the workload across N cores.
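
A minimal Python sketch of that distinction, using the A15 int figure quoted above and a completely made-up contention curve (shared caches and memory bandwidth erode per-copy speed as copies are added):

    import math

    SINGLE_THREAD_SCORE = 7.28  # the A15 SPECint figure quoted above

    def aggregate_score(n_copies, per_copy=SINGLE_THREAD_SCORE, efficiency=0.85):
        # efficiency=0.85 means each doubling of copies keeps ~85% of
        # per-copy speed -- an illustrative curve, not a measurement.
        per_copy_scaled = per_copy * efficiency ** math.log2(n_copies)
        return n_copies * per_copy_scaled

    for n in (1, 2, 4):
        ideal = n * SINGLE_THREAD_SCORE
        print(f"{n} copies -> {aggregate_score(n):.2f} ({aggregate_score(n) / ideal:.0%} of ideal)")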


Those efficiency cores are something to behold. They positively blow away the A55.

The A510 is the next best thing. Like Bulldozer, it pairs two integer cores sharing a single floating-point unit. I doubt companies will ship 8 A510 cores, which means that overall little-core performance will likely not improve by a huge amount.

In fact, it's looking to me like the A710 is going to be the "little" core while the X2 will be the big core, and a couple of A510s will be strictly limited to secure low-level processes.


I get that these Apple chips are fast, but has anyone spent any time comparing how fast the operating system is versus other OSes?

macOS/iOS vs Windows or Linux.

Usain Bolt is the fastest person on earth. But if he's wearing 5-kilogram shoes, he's not going to be any faster than a normal person.


Alternative operating systems like Asahi Linux are apparently not far enough along to give you anything meaningful, but here are some benchmarks of Linux virtualized under QEMU: “vastly, hugely, mind-bogglingly fast”

https://www.sevarg.net/2021/01/09/arm-mac-mini-and-boinc/


Intel and AMD should consider how much goodwill they might gain by just removing the IME and PSP. The Framework and Librem laptops show that there is a market of people willing to pay for upgradable devices which are independent of their makers and free from out-of-the-box backdoors.


> Intel and AMD should consider how much goodwill they may gain by just removing IME and PSP.

On a very practical scale: none. It'd make some nerds happy, but it's a subset of a subset of a niche population.

> The framework and librem laptops show that there is a market of people willing to pay for upgradable devices which are independent from makers and free from out-of-the-box backdoors.

All it shows is that there is some market for it, but I think you're either severely overestimating Framework's and Librem's scale or severely underestimating Intel's and AMD's. I'm extremely skeptical that either company cares about that niche, nor do I think they'd rework all their processors just to satisfy it.


> Intel and AMD should consider how much goodwill they may gain by just removing IME and PSP.

Too small to measure: the subset of privacy activists who are concerned about it is orders of magnitude smaller than the number of IT workers who use those features, and that's a tiny fraction of the number of people buying computers.

Calling it a backdoor is your choice, but consider that using a term outside its standard definition is not likely to be effective, or good for your credibility, if you're trying to draw attention to something more important in the future.


I'll consider the advice.


For the siblings commenting that the Librem and Framework markets are small: you're right, it is small indeed. What is harder to measure is the impact of the goodwill they could gain. It is not very different from what they gain by sponsoring events. Of course, changing behavior could be more expensive or complicated, but it may be worth it in the long run.

IME features may be useful for some people, but there are people who don't want them. I don't think it would be expensive to make it possible to disable the IME via eFuses.

Remember that Linux was once much smaller than any commercial UNIX. So were Apache and Nginx. Linux is still very unpopular among gamers; that didn't prevent Valve from investing in what is now their lifeboat.

A small market should not be seen as an excuse not to invest in something that will certainly bring goodwill.


Compared to the markets Intel serves, the Librem and Framework markets are inconsequential. There's no good financial incentive for them to invest in such a move.



