
The A12 is clearly at parity now vs. Intel's existing mobile offerings, probably somewhat ahead given the long delays with 10nm.

The question upthread is whether or not switching architectures makes financial sense, not whether it's a (mild) technical win.

Switching to their own chips cuts Intel out of the loop, but as far as business risk goes, that simply replaces one single-source manufacturer with another (TSMC).

It probably saves money per-part, which is good. But then Apple is still drowning in cash and immediate term savings really aren't much of a motivator.

> By 2021 Intel can probably deliver a 1.1-1.2x speed up tops for Macbook airs, while Apple can probably deliver a 2x speed up for 2021 with the A15 and another one for 2023 with the A16.

That's going to need some citation. Moore's law is ending for everyone, not just Intel. TSMC has pulled ahead of Intel (and Samsung has caught up) for sure, but progress is slowing. That kind of scaling just isn't going to happen for anyone.




> Moore's law is ending for everyone, not just Intel.

I am by no means "in the know" on chip design and this whole bit is probably a fair bit of speculation, but I remember Jim Keller talking about the end of Moore's law on a podcast in February[1]. If I remember correctly, his argument boiled down to the theory that Moore's law is in some sense a self-fulfilling prophecy. You need every part of your company believing in it, or else the parts stop meshing into one another well. I.e. if a team doesn't believe it will reach a density/size improvement that would allow it to use more transistors in its design, it will cut down and adjust its plans to that new reality. If this distrust in improvement spreads inside a company, it in turn leads to a steeper slowdown in overall improvement.

And while there may be an industry-wide slowdown at the current point in time, perhaps this dynamic is exacerbated at Intel, causing them to lose their competitive edge over the past few years.

[1] https://youtu.be/Nb2tebYAaOA?t=1805 (timestamped to the beginning of the topic of Moore's law slowing down)


~ignore bad info~


Intel 10 nm does not use EUV.


Intel's 10nm strategy was basically to do everything they could to advance their fabrication process without having to use EUV. Some of those changes turned out to be bigger risks than EUV. TSMC was a bit less aggressive with their last non-EUV nodes, but it actually worked and now they have EUV in mass production (though few if any end-user products have switched over to the EUV process at this point).


Thanks for the correction.


> That's going to need some citation.

So this is Geekbench 5 showing the difference between a 2012 MacBook Air i5 and a 2020 MacBook Air i5 - both base models at a similar price: https://browser.geekbench.com/v5/cpu/compare/2613713?baselin...

And this is a thread in the Rust subreddit about compilation speed on MacBooks, where some users report the performance increase for different generations of MacBook Pros and MacBook Airs, if you want a more "realistic benchmark" to calibrate the Geekbench results: https://www.reddit.com/r/rust/comments/gypajc/macbook_pro_20...

This is the iPad Pro 2020 crushing the MacBook Air 2020: https://browser.geekbench.com/v5/cpu/compare/2612714?baselin...

And this is the improvement from the previous-generation iPad's A10 to the iPad Pro's A12Z - a 2x speed-up in a single generation: https://browser.geekbench.com/v5/cpu/compare/2613991?baselin...

You are definitely right that Moore's law is hitting Intel hard. But AMD is still doing quite well, Nvidia and "ATI" are doing incredibly well, and Apple chips have been doing extremely well over the last couple of generations.

Maybe you are right, and Apple won't be able to deliver 2x speed-ups in the next 2 generations. I'd expect that, just like for Intel, things won't abruptly change from one gen to another, but that this will happen over a longer period of time. Right now, only Apple knows what perf their next 2 gens of chips are expected to deliver.

The only thing we know is that Apple ARM chips are crushing their previous generation both for iPads and iPhones year after year, and now they are betting on them for MacBooks, and potentially Mac Pros, probably for at least the next 10-15 years.


> This is the iPad Pro 2020 crushing the MacBook Air 2020

You keep coming back to that citation. It's more than a little spun. The parts have a comparable semiconductor process (Intel 10nm vs. TSMC 7nm) and die size (146.1 vs. 127.3 mm²). But the A12Z in the iPad is running as fast as Apple can make it (it's basically an overclocked/high-binned A12X), whereas the Intel part is a low-power, low-binned variant running at about half the base clock of the high-end CPUs, with half the cores and half the L3 cache fused off.

A more appropriate comparison would be with something like the Core i7 1065G7, which is exactly the same die and can run in the same 12W TDP range but with roughly double the silicon resources vs. Apple's turbocharged racehorse.


But the A12Z (based on the A12) is not using the latest TSMC process as used for the A13 (which is almost certainly in higher volume production than Intel 10nm).

Plus, if Apple can afford to put an overclocked/high-binned TSMC chip in the lower-cost iPad but has to put a low-binned i5 in the Air, doesn't that say something about the relative economics/yields?

For what it's worth, I have a Core i7-1065G7 and it's decently fast, but it gets very hot and definitely needs a fan (which the iPad doesn't), and it has good battery life (though not as good as the iPad's).

The advantage still seems to me to be very strongly with the Apple parts.


> if Apple can afford to put an overclocked/high-binned TSMC chip in the lower-cost iPad but has to put a low-binned i5 in the Air, doesn't that say something about the relative economics/yields?

Potentially. It probably also says more about the relative product positioning of the iPad Pro (high end, max performance) vs. the MacBook Air (slim, light, and by requirement slower than the MBP so that the products are correctly differentiated).

The point is you're reaching. The A12 is a great part. TSMC is a great fab. Neither is as far ahead of the competition as Apple's marketing has led you to believe.


Both products (the iPad Pro 2020 and the MacBook Air 2020 i5) are similarly priced (the iPad Pro being ~25% cheaper), yet the iPad Pro has longer battery life while having a much better display and much better raw performance (~1.6x faster!).

The first i7 option from Apple, in the MacBook Air 2020, starts at ~1.6x the price of the iPad Pro, yet comparing the A12Z with it shows that, performance-wise, nothing really changes: https://browser.geekbench.com/v5/cpu/compare/2626721?baselin...

I'd suspect the reason is that these benchmarks are long enough that the Intel CPUs just end up completely throttled down after the first ~30-60s, while the A12Z does not.

Either way, the trade-offs here have multiple axes, and honestly die size is not something I as a user care about. I care about performance/$ and battery life/$. The A12Z seems much better along these two axes than either of the Intel CPUs in the Air 2020.

To find something competitive in terms of performance (but worse battery life) from Intel, one does need to go to the Core i7 1065G7 that you mention, which delivers approximately the same performance as the A12Z: https://browser.geekbench.com/v5/cpu/compare/2627162?baselin...

However, the first machine with that CPU is the 13" MacBook Pro 2020, and that starts at ~2x the price of the iPad Pro. From looking at the upgrade prices, the cost of that i7 alone might be ~50% of the whole iPad Pro's cost, or more.

So while you are right that these two are comparable in terms of raw power, I doubt they are comparable in terms of performance/$ and battery life/$. Without knowing the exact costs of each we can only speculate. If that Intel i7 is 2x more expensive than the A12Z, then perf/$ would be half as good, and since raw battery life is also worse, the battery-life/$ axis would be more than twice as bad.


> I'd suspect the reason is that these benchmarks are long enough that the Intel CPUs just end up completely throttled down after the first ~30-60s, while the A12Z does not.

The reverse is most likely true. The MacBook has a bigger aluminum body to absorb and dissipate heat. In addition, it has an active cooler with heat pipes to conduct away that heat for ejection from the system.


> And this is the improvement from the previous-generation iPad's A10 to the iPad Pro's A12Z - a 2x speed-up in a single generation

Single product generation, not single chip generation. There is a new A SoC every year. The A11 was a thing for the iPhone X & 8.

Apple doesn't claim to "tick/tock" like Intel did. Apple is also dealing with significantly less mature technology, which enables exponential gains like the sort they've been delivering since the A4.

The technology is maturing. The hockey-stick growth doesn't work forever for any metric. Given enough time, it will always flatten out. Apple is already seeing that too. They've gone from year-on-year 2x performance to (per your above) 2x every 2 years.

That timeline will continue to stretch. The only major advantage Apple's A chips have over Intel's is that they can be optimised for Apple users' typical use cases (and I do mean the large aggregate, not the niche coders) to maximise battery life. There are no other customers Apple cares about beyond that large majority.


We're also talking about two different ISAs: x86-64 vs A64.

In addition to the underlying microarchitectures that implement them.

It's not inconceivable that Intel simply made enough arch / process / design mistakes that they've ended up at the end of a performance box canyon.

It's happened before. The Pentium 4 was a dead end, while the Pentium M-derived architectures continued to improve performance.

So "Apple realizing continued gains by optimizing a less mature architecture" is one hypothesis.

But "Apple realizing continued gains by optimizing a more evolveable architecture" is a valid alternative.


It could well be that macOS is a slower OS

It's been the case for years in comparison to every other competitor

Let's put Linux on both those chips and see what happens


There is no point in benchmarking anything: Macs have long shipped with CPUs one to two generations behind. If performance or efficiency (they're interchangeable in this regard) on the desktop/notebook fronts mattered to Apple as a company, that would never have happened. Clearly there are other factors beyond the performance/efficiency curve at play, and so it's not a matter of benchmarking anything.


The benchmarked Mac in his post is running the latest 10th gen Intel that everyone else is using.


You missed my point. I have no problem with the machine he's benchmarking; my point is rather that if performance or efficiency were the be-all and end-all for Apple, they'd never have shipped a non-Pareto, behind-the-curve choice. Clearly there are other factors that lead to their choices.


Yes, the 10th generation of the Facebook/email CPU

Nobody would use that for real work. Apple won the laptop market when developers started using it; now they are killing laptops in favor of tablets, which are gadgets, not work machines.


I recently got an upgrade, but I spent quite a long time working as a professional data scientist and data engineer on a 2013 MacBook with one of those Facebook/email CPUs. Quite happily, too. It was never going to hack it training any deep neural nets, of course. But that is what our data center is for. I wouldn't want that kind of stuff running locally anyway, for a whole host of reasons. And it turns out that analyzing data in Jupyter or R can easily be a lighter workload than displaying Facebook ads or doing whatever it is that Google has recently done to Gmail.

I will admit that our front-end developers do all have nice high-end machines, multiple high-resolution monitors, all the good stuff that Jeff Atwood tells us we should buy for all developers because They Deserve the Best. I attribute our site's sub-par (in my opinion) UX on the Facebook/email CPUs and 13-15" monitors that our users typically have, in part, to the fact that my colleagues' belief that high-end kit is necessary for getting real work done is a bit of a self-fulfilling prophecy. There's not much intrinsic incentive to worry about user experience when your employer is willing to spend thousands of dollars on insulating you from anything resembling a user experience.


> Nobody would use that for real work

It always amuses me how warped many HN commentators' perspectives are on what is and isn't "real work". At times it borders on the "no true Scotsman" fallacy.

I assure you that plenty of people are using those computers and getting paid for the work they do on them.


Most of these people do their work without writing any JS too, enabling them to cope without 32TB of RAM to manage their 16 tabs of JavaScript, each running in its own Chromium instance.

Someday, we'll re-learn some old lessons about the need for efficient utilisation of resources. In the meantime, check out the latest old game re-implemented at a fraction of the frame rate in your browser.


To be fair, my laptop at work has 8 cores and 64 GB of RAM, and it's already 3 years old

I'm gonna change it soon not because it's not good anymore, but because some of the hardware has become too slow for the things that are necessary nowadays

Mind you, it's not only my choice; things are more complex, deadlines aren't any longer, quarterly reports are still every 3 months, but the things we have to do have become more complex and computationally heavy and need new hardware to keep up

I'm a hobbyist musician; a 300-dollar laptop is good enough, and even if I were a pro it would be enough

Truth is, the computational power of a 100-dollar smartphone would be enough

So a new chip from Apple doesn't change the fact that their mobile line is already overpowered for the average use case and the laptop line is underpowered and overpriced


I think the assumption is that users on HN are devs. I mainly do ops and have a huge vCenter installation available to me, as well as AWS and Azure test accounts. My specced-out work MacBook Pro mainly runs iTerm; nothing that my personal MacBook Pro from 2013 can't do.

There are only two things that I use at home that could benefit from an upgrade: Apple TV hobby projects and YouTube. Not the video playback, that's fine, but the pages load slowly.


> I think the assumption is that users on HN are devs.

That's a fair assumption. My problem is with the other poster using the term "real work" to imply that Apple's devices are underpowered or useless. And even then, if they are, there's a lot of dev work that can still be done on machines a decade old performance-wise.


I think that's the point: a new chip with mobile-like performance does not qualify as a working machine

For what it's worth, when I do code review I could do it from my phone, if only the mobile GitLab interface were a bit better.

The i5 MacBook Air is $999, and that's not gonna change with the A13

I don't see any advantage for the end user


> I don't see any advantage for the end user

If it had the same price, the same performance, and twice the battery life, that would be an advantage to many users


Counterpoint: imagine finishing the job in half the time by drawing twice the power from the same battery

I would make that trade; I would work fewer hours and spend half of every day off

Anyway

My MateBook 13 does 15 hours of a normal workload on battery while also being very thin and light (and 350 euros cheaper than an equivalent MacBook Air)

It runs an AMD Ryzen

I seriously doubt that I will need 30 hours on battery in the near future

Unless I'm stranded on a desert island, but I guess I'd have to face more serious problems then...


> I think that's the point: a new chip with mobile-like performance does not qualify as a working machine

For whom?


Plenty of people do real work with pen and paper, or chalkboards, or just with their hands; "real work" doesn't really mean anything


I'm starting to think I'm in an episode of The Twilight Zone: we are in an alternate world where people suffer from severe cognitive dissonance and can't argue properly

Real work, in reference to computers, means heavy load

The original post said "Apple is killing laptops to sell more tablets which are gadgets and you can't do real work on gadgets"

Which is true

There are a lot of people driving push scooters; you can't do real work with them. You need a proper vehicle if your job requires moving things and/or people all day

Paper and pen have nothing to do with laptops, and, usually, the brain is more powerful than an i5

Unless you're an Apple zealot


> Real work, in reference to computers, means heavy load

I think this is perhaps the source of your problem. You are assuming that others interpret the phrase "real work" to mean exactly what you think of when you hear the phrase.

For many people, when you say someone is not doing "real work", you are implying that their work is not important or not valid. If someone says to you "why don't you go get a real job", it's the same kind of thing. There are plenty of jobs in our industry writing CRUD apps for businesses, for example. Those ARE "real work", no matter how common or unglamorous they might be. However, many of those jobs can easily be done on a machine with very modest resources.

Yes, there are jobs where the demand on computer hardware is much more resource-intensive. But it is a mistake to assume that those scenarios are what people will think of when you use the phrase "real work".


Here on HN real work is not something an i5 can handle

And if an i5 can handle your workload, then the A13 won't make a difference in either price or performance (because you already don't care about performance); it only matters to Apple's profits

So no, an i5 is not enough for doing real work in technology

Any developer here on HN would tell you that with $1k you can find much better deals for the money

If your job is not technology-related you can do it with a 5-year-old laptop of any brand

We buy new machines because we always need more power. To do everything else that is not real work, I own a 3-year-old 12-inch Chinese laptop with an i3 that runs Ubuntu and is just perfect

You might be amused, but you wouldn't accept it as a work laptop if your company gave you one

I'm sure you wouldn't


> If your job is not technology-related you can do it with a 5-year-old laptop of any brand

That really depends on what technology you're working with. In general, if you're not working with a bloated JS project or a multi-million line C++ codebase, a computer from the last decade will do just fine as long as it has 8-16 GB of RAM.

I mean, these days the difference between an i5 and an i7 is almost non-existent to me, as when possible I disable hyperthreading out of an abundance of caution.
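
For what it's worth, on a Linux box the kernel exposes a runtime switch for this; a minimal sketch, assuming a reasonably recent kernel (this is not the macOS mechanism):

  # check whether SMT/hyperthreading is currently enabled
  $ cat /sys/devices/system/cpu/smt/control
  on
  # turn it off until the next reboot (sibling threads are taken offline)
  $ echo off | sudo tee /sys/devices/system/cpu/smt/control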

There's a lot of "real work" in tech that can be handled on an i5.

Most embedded programming work could easily be done on an i5 from a decade ago.

> We buy new machines because we always need more power

We need more power because people keep developing bloated software for newer machines.

---

Can you define to me what real work is? Without simply saying "it's work that needs more than an i5 to handle", that is.


> computer from the last decade will do just fine as long as it has 8-16 GB of RAM.

That was my point as well: if you aren't working on anything that requires power, you don't need a "working machine"

But if you do, the A13 is not a solution

> We need more power because people keep developing

I'm not the one putting hundreds of ML/AI models in production

But I do enjoy having a system that makes it possible to test an entire stack on a single machine, something that just a few years ago required multiple VMs and complex setups

Even if you're developing puzzle games for low-end Android phones, an i5 is not enough

You may not believe it, but it's the truth

> Can you define to me what real work is?

Of course I can, even though you can't define what it is that can be done with a baseline i5 that qualifies as "real work"

A typical dev will do some, all, or more of these things:

- open up an editor, with multiple files open, with integrated language server, linter/formatter, background error checker

- open up an IDE (JetBrains, Android Studio, Xcode) on a middle-sized project with a few tens of dependencies and start working on it

- launch a Maven/Gradle compile

- launch a docker build

- launch a docker-compose up with a main application and 2 or 3 services (DB, Redis, API backend); a minimal sketch of such a setup follows after this list

- launch training with any of the ML/AI frameworks available. Of course you'll launch it on a very limited subset; it's gonna be slow anyway

- process gigabytes (not even in the tens of gigabytes) of data

- on my i3 even apt upgrade is slow. That's why I use it as a media player and not as a work machine.
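
To make the docker-compose item above concrete, here is a minimal sketch of the kind of stack I mean (the service and image names are just illustrative, not from any real project):

  $ cat docker-compose.yml
  version: "3"
  services:
    app:                        # the main application under development
      build: .
      depends_on: [db, redis, api]
    db:
      image: postgres:12
    redis:
      image: redis:6
    api:
      image: example/backend:latest
  $ docker-compose up --build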


If your definition of a working machine requires something that was nearly impossible a few years ago, your standards are probably too high.

These machines have become very capable. If we don't move the goalposts, they can be working machines.


I really doubt they are; I'm an average programmer

My laptop is a working tool for professionals; if I use it as a dumb typewriter I don't really need modern-generation CPUs, a Pentium II would be enough

RAM is a more pressing issue these days, given the amount of bloated software one has to run just to page a colleague about something (yes, Slack, I'm talking about you, but not only you...)

When I am at my computer working, I want it to do things for me while I do something else, effortlessly, without noticing that something else is going on

If I have to watch it while it finishes the job, it's just a glorified washing machine

And it means it is underpowered for my workload

That's why people usually need more power than the baseline: the baseline is the human using it. The computer's job is not just to display pixels on command; it's much more than that

Imagine you are an administrative employee typing a report on your laptop. You're doing real work, BUT you're not doing real work on your laptop; or, to put it better, your laptop is sitting idle most of the time, which is not real work for it

Work is a physics concept; real work means there is force involved and energy consumed

If the energy is minimal or the force applied almost zero, there is almost zero work done

Simple as that


> real work means there is force involved and energy consumed

See my reply earlier in the thread. I think the primary source of contention here is that you assume everyone should only think of the definition you give for the phrase "real work".

Obviously many of them don't.


> Even if you're developing puzzle games for low-end Android phones, an i5 is not enough

  $ sysctl -n machdep.cpu.brand_string
  Intel(R) Core(TM) i5-5287U CPU @ 2.90GHz
Works just fine for Android development.


Hey, do you mind me asking what Mac you're using? (Or Hackintosh?)


MacBook Pro (Retina, 13-inch, Early 2015).


Nice, that's a good model. I have a lot of coworkers with the same setup who are dreading the upgrade (mostly because of the Touch Bar).


Yup, I bought that one within days of the October 2016 announcement because the new ones were outrageously expensive compared to what I was used to and I was not willing to give up the inverted-T ;)


It really does not

There are tools that work and tools for professionals; tools work, but professional tools are for people who do this as a daily job and whose job depends on them


Your opinion of "what works" is not universally shared by everyone.

You don't know the details of saagarjha's work developing on Android (unless, perhaps, you actually know them in real life, which seems unlikely given the way you responded), and neither do I. saagarjha is the one best able to determine what works for them.

If your point is that having more resources available to you can make you more productive, that's fine. It's always nice to have as beefy a machine as possible.

However, not everyone has that luxury. Businesses have budgets, and most developers I know don't get to set their own budget for equipment. Sometimes you are lucky, and the money is there, and your management is willing to spend it. Sometimes that is not the case. Regardless, my primary development machine right now is a 5-year-old laptop, and I get plenty of development work done with it.

The way you worded this latest response makes it sound as if you are saying that I am not a professional, and my tools are just "toys", because I don't work on an 8-core machine with 64 GB of RAM. I don't know if that is your intention, but if it is, it is both inaccurate and insulting.


> Your opinion of "what works" is not universally shared by everyone.

The Earth orbiting the Sun wasn't either.

See, the problem is not whether you are a professional, but whether the tool is.

If I do the laundry and the washing machine takes 4 hours to complete a cycle, I'm still washing my clothes, but I'm not doing it using a professional tool

There's no place where I implied that people using less-than-optimal tools are not professionals; I'm talking exclusively about tools.


I agree with every single point you made and have been stating something similar.

>It probably saves money per-part, which is good. But then Apple is still drowning in cash and immediate term savings really aren't much of a motivator.

The only possible reason I can think of is to lower cost and lower the selling price (while retaining the same margin). A MacBook 12" (or will it be a MacBook SE?) that costs $799, the same price as the iPad Pro.

It is basically Apple admitting that tablets with touch computing will never take over from PCs with keyboard and mouse. Both will continue to coexist for a long time, if not indefinitely. And this isn't a far-fetched statement. Most enterprises have absolutely no plan to replace their office desktop workflow with tablets. The PC market is actually growing. There are still 1.5B PCs in the world, of which only 100M belong to Apple.

I still don't understand how they will give up x86 compatibility in the Pro market though. They could make the distinction that every Mac product with "Pro" in the name uses x86 and every non-Pro uses ARM. At least that is my hypothesis.


The iPad Pro not taking over the PC is a self-fulfilling prophecy as long as Apple does not allow it. With the mandatory App Store and its restrictive rules, there are many things you just cannot do on an iPad. I wouldn't consider an MB Air if the iPad had the same capabilities. That it doesn't is purely a software limitation.


For consumers and business users, you have Excel and email. The iPad already does 95% of what most Mac users do on their desktop, if not more, and yet it hasn't taken over. It has not taken over by the numbers; there isn't even a trend, projection, or glimpse of hope that anything has started.

The tablet and the PC are simply different paradigms, each best suited to its own purpose.

It is the same narrative as smartphones taking over most of your computing needs. At first it seemed obvious: nations that hadn't been through the PC era would go straight to smartphones. And yet five years later, the biggest growth area for PCs is these smartphone nations.


> replaces one single-source manufacturer with another (TSMC)

Sure, except it's Apple that designs the chip then. A pretty huge advantage. Apple loves vertical integration.


> But then Apple is still drowning in cash and immediate term savings really aren't much of a motivator.

Why would you think so? Apple is a for-profit company. Apple consistently makes more than 35% or so in margins. Why wouldn't it make sense to increase that wherever possible (to increase profits or offset some discounted pricing it's offering elsewhere, like a free one-year subscription to Apple TV+ on purchasing a new device)? Also consider the impact of COVID-19 for the next year or so.


Because the price of iPads has been falling over time.


That could be because of cheaper production costs, or because Apple has seen iPad sales dropping and traded some of its margins to sell them at lower prices. As a for-profit company with very good margins, it only makes sense that Apple would continue to maximize that and not let it slip a lot, even if the gains may seem minimal to an outsider. It's also the same reason why Apple continues to sell Macs at the same price as at launch even years later without any hardware updates, even though Macs are a small percentage of its total revenues.



