I really respect Apple's privacy-focused engineering. They didn't roll out _any_ AI features until they were capable of running them locally, and before doing any cloud-based AI they designed and rolled out Private Cloud Compute.
You can argue about whether it's actually bulletproof or not, but the fact is nobody else is even trying; the rest have lost sight of privacy-focused features entirely in their rush to ship anything and everything on my device to OpenAI or Gemini.
I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
> to purchase a machine that feels like it really belongs to me
How true is this when their devices are increasingly hostile to user repair and upgrades? MacOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
Of course I wish the hardware were somehow more open, but to a large extent the lockdown exists directly because of hardware-based privacy features.
If you allowed third-party components without restraint, there'd be no way to prevent someone from swapping in a compromised component.
Lock-in and planned obsolescence are also factors, and I'm glad the EU (and others) are pushing back on those. But it isn't as if there are no legitimate tradeoffs.
Regarding screw tightening... if they ever completely remove the ability to run untrusted code, yes, then I'll admit I was wrong. But I am more than happy to have devices be locked down by default. My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
Think of some common sense physical analogies: a hidden underground bunker is much less likely to be robbed than a safe full of valuables in your front yard. A bicycle buried deeply in bushes is less likely to be stolen than one locked to a bike rack.
Without obscurity it is straightforward to know exactly what resources will be required to break something- you can look for a flaw that makes it easy and/or calculate exactly what is required for enough brute force.
When you add the element of well-executed obscurity on top of an already strong system, it becomes nearly impossible to even identify that there is something to attack, or to even start to form a plan to do so.
Combining both approaches is best, but in most cases I think simple obscurity is more powerful and requires fewer resources than non-obscure, strength-based security.
I’ve managed public servers that stayed uncompromised without security updates for a decade or longer using obscurity: an archaic old Unix OS of some type that does not respond to pings or other queries, runs services on non-standard ports, and blocks routes to hosts that even attempt scanning the standard ports. Obviously, also using a secure OS with updates on top of these techniques is better overall.
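To make the "blocks routes to hosts that even attempt scanning the standard ports" part concrete, here is a minimal sketch of that trap-port pattern in Python. It assumes a Linux box with iproute2 and root privileges; it illustrates the idea only, not the actual configuration of the servers described above.

```python
import socket
import subprocess
import threading

TRAP_PORTS = [22, 23, 80, 443]  # standard ports we never actually serve on

def blackhole(ip: str) -> None:
    # Null-route the probing host (Linux iproute2; needs root).
    subprocess.run(["ip", "route", "add", "blackhole", ip], check=False)

def trap(port: int) -> None:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("0.0.0.0", port))
    s.listen(5)
    while True:
        conn, (ip, _) = s.accept()  # any connection here is a probe, not a user
        conn.close()
        blackhole(ip)

for port in TRAP_PORTS:
    threading.Thread(target=trap, args=(port,), daemon=True).start()

threading.Event().wait()  # keep the trap threads running
```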
I think the scenario where security through obscurity fails is when the end user relies on guarantees that don't exist.
Take Intel's Management Engine, for example: it was obscured very well and wasn't found for years. Eventually people did find it, and you can't help but wonder how long it took bad actors with deep pockets to find it first. It's this obscured cubbyhole in your CPU, but if someone could exploit it, it would be really difficult to even find out, because of Intel's secrecy on top of the feature.
It seems like people are really talking about different things with obscurity. Some are referring to badly designed weak systems, where secrecy and marketing hype is used to attempt to conceal the flaws. Others, like my comment above, are talking about systems carefully engineered to have no predictable or identifiable attack surfaces- things like OpenBSD's memory allocation randomization, or the ancient method of simply hiding physical valuables well and never mentioning them to anyone. I’ve found when it is impossible for an external bad actor to even tell what OS and services my server is running- or in some cases to even positively confirm that it really exists- they can’t really even begin to form a plan to compromise it.
> where secrecy and marketing hype is used to attempt to conceal the flaws.
That's literally the practical basis of security through obscurity.
> Others, like my comment above, are talking about systems carefully engineered to have no predictable or identifiable attack surfaces- things like OpenBSD's memory allocation randomization,
That's exactly the opposite of 'security through obscurity' - you're literally talking about a completely open security mitigation.
> I’ve found when it is impossible for an external bad actor to even tell what OS and services my server is running- or in some cases to even positively confirm that it really exists- they can’t really even begin to form a plan to compromise it.
If one of your mitigations is 'make the server inaccessible via public internet', for example - that is not security through obscurity - it's a mitigation which can be publicly disclosed and remain effective for the attack vectors it protects against. I don't think you quite understand what 'security through obscurity[0]' means. 'Security through obscurity' in this case would be you running a closed third-party firewall on this server (or some other closed software, like macos for example) which has 100 different backdoors in it - the exact opposite of actual security.
You're misrepresenting my examples by shifting the context, and quoting a Wikipedia page that, at the very top of the article, gives two of the main examples I mentioned as key examples of security through obscurity: "Examples of this practice include disguising sensitive information within commonplace items, like a piece of paper in a book, or altering digital footprints, such as spoofing a web browser's version number"
If you're not understanding how memory allocation randomization is security through obscurity- you are not understanding what the concept entails at the core. It does share a common method with, e.g., using a closed 3rd-party firewall: in both cases direct flaws exist that could be overcome with methods other than brute force, yet identifying and specifying them well enough to actually exploit them is non-trivial.
The flaw in your firewall example is not using obscurity itself, but: (1) not also using traditional methods of hardening on top of it - obscurity should be an extra layer not an only layer, and (2) it's probably not really very obscure, e.g. if an external person could infer what software you are using by interacting remotely, and then obtain their own commercial copy to investigate for flaws.
> You're mis-representing my examples by shifting the context,
Specific example of where I did this?
> quoting a Wikipedia page that, at the very top of the article, gives two of the main examples I mentioned as key examples of security through obscurity: "Examples of this practice include disguising sensitive information within commonplace items, like a piece of paper in a book, or altering digital footprints, such as spoofing a web browser's version number"
I mean, I don't disagree that what you said about changing port numbers, for example, is security through obscurity. My point is that this is not any kind of defense against a capable and motivated attacker. Other examples, like the OpenBSD mitigation you mentioned, are very obviously not security through obscurity though.
> If you're not understanding how memory allocation randomization is security through obscurity- you are not understanding what the concept entails at the core.
No, you still don't understand what 'security through obscurity' means. If I use an open asymmetric key algorithm - the fact that I can't guess a private key does not make it 'security through obscurity' it's the obscuring of the actual crypto algorithm that would make it 'security through obscurity'. Completely open security mitigations like the one you mentioned have nothing to do with security through obscurity.
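To make that distinction concrete, here is a minimal sketch using Python's `cryptography` package: the algorithm (ECDSA over P-256) is completely public and specified down to the last bit, and the only secret is the private key. None of the security depends on hiding how the scheme works.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# The curve and algorithm are public knowledge; only this key object is secret.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

message = b"attacker knows the algorithm, the curve, and the public key"
signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

try:
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("signature valid")
except InvalidSignature:
    print("signature forged or message altered")
```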
> The flaw in your firewall example is not using obscurity itself, but: (1) not also using traditional methods of hardening on top of it
Sooo... you think adding more obscurity on top of a closed, insecure piece of software is going to make it secure?
> if an external person could infer what software you are using by interacting remotely,
There are soooo many ways for a capable and motivated attacker to figure out what software you're running. Trying to obscure that fact is not any kind of security mitigation whatsoever. Especially when you're dealing with completely closed software/hardware - all of your attempts at concealment are mostly moot - you have no idea what kind of signatures/signals that closed system exposes, you have no idea what backdoors exist, you have no idea what kind of vulnerable dependencies it has that expose their own signatures and have their own backdoors. Your suggestion is really laughable.
> not also using traditional methods of hardening on top of it
What 'traditional methods' do you use to 'harden' closed software/hardware? You literally have no idea what security holes and backdoors exist.
> if an external person could infer what software you are using by interacting remotely, and then obtain their own commercial copy to investigate for flaws.
Uhh yeah, now you're literally bringing up one of the most common arguments for why security through obscurity is bullshit. During WW1/WW2, security through obscurity was common in crypto - they relied on hiding their crypto algos instead of designing ones that would be secure even when publicly known. What happened is that enough messages, crypto machines, etc. were recovered by the other side to reverse-engineer these obscured algos and break them - since then crypto has pretty much entirely moved away from security through obscurity.
You are operating on a false dichotomy that the current best practices of cryptographic security, code auditing, etc. are somehow mutually exclusive with obscurity, and then arguing against obscurity by arguing for other good practices. They are absolutely complementary, and implementing a real-world secure system will layer both- one starts with a mathematically secure, heavily publicly audited system, and adds obscurity in the real-world deployment of it.
If there are advantages to a closed-source system, they come not in situations where the source is closed to you and contains bugs, but when it is closed to the attacker. If you have the resources and ability to, for example, develop your own internally used but externally unknown system that is still heavily audited and cryptographically secure, that is going to be better than an open-source tool.
> They are absolutely complementary, and implementing a real-world secure system will layer both- one starts with a mathematically secure, heavily publicly audited system, and adds obscurity in the real-world deployment of it.
Ok, let's start with a 'mathematically secure heavily public audited system' - let's take ECDSA, for example - how will you use obscurity to improve security?
> If you have the resources and ability to, for example, develop your own internally used but externally unknown system that is still heavily audited and cryptographically secure, that is going to be better than an open-source tool.
Literally all of the evidence we have throughout the history of the planet says you're 100% wrong.
> Literally all of the evidence we have throughout the history of the planet says you're 100% wrong
You are so sure you’re right that you are not really thinking about what I am saying, and how it applies to real world situations- especially things like real life high stakes life or death situations.
I am satisfied that your perspective makes the most sense for low stakes broad deployments like software releases, but not for one off high stakes systems.
For things like ECDSA, as with anything else, you implement obscurity on a one-off basis tailored to the specific use case- know your opponent and make them think you are using an entirely different method and protocol that they’ve already figured out and compromised. Hide the actual channel of communication so they are unable to notice it exists, and over that you simply use ECDSA properly.
Oh, and store your real private key in the geometric design of a giant mural in your living room, while your house and computers are littered with thousands of wrong private keys on ancient media that is expensive to extract. Subscribe to and own every key wallet product or device, but actually use none of them.
> You are so sure you’re right that you are not really thinking about what I am saying, and how it applies to real world situations- especially things like real life high stakes life or death situations.
Nah, you're just saying a lot of stuff that's factually incorrect and just terrible advice overall. You lack understanding of what you're talking about. And the stakes are pretty irrelevant to whether a system is secure or not.
> For things like ECDSA, as with anything else, you implement obscurity on a one-off basis tailored to the specific use case- know your opponent and make them think you are using an entirely different method and protocol that they’ve already figured out and compromised.
You're going to make ECDSA more secure by making people think you're not using ECDSA? That makes so little sense in so many ways. Ahahahahaha.
I very well may be wrong, but if so you are not aware of how, and I will need to find someone else to explain it to me. I’ve been interested for a while in having a serious debate with someone that understands and advocates for the position you claim to have- but if you understood it you would be able to meaningfully defend it rather than using dismissive statements.
> Think of some common sense physical analogies: a hidden underground bunker is much less likely to be robbed than a safe full of valuables in your front yard. A bicycle buried deeply in bushes is less likely to be stolen than one locked to a bike rack.
That's not what security through obscurity is. If you want to make an honest comparison - which is more likely to be secure: an open system built based on the latest/most secure public standards, or a closed system built based on (unknown)? The open system is going to be more secure 99.999% of the time.
> Without obscurity it is straightforward to know exactly what resources will be required to break something- you can look for a flaw that makes it easy and/or calculate exactly what is required for enough brute force.
The whole point of not relying on obscurity is that you design an actually secure system even assuming the attacker has a full understanding of your system. That is how virtually all modern crypto that's actually secure works. Knowing your system is insecure and trying to hide that via obscurity is not security.
> it becomes nearly impossible to even identify that there is something to attack
That's called wishful thinking. You're conflating 'system that nobody knows about or wants to attack' with 'system that someone actually wants to attack and is defending via obscurity of its design'. If you want to make an honest comparison you have to assume the attacker knows about the system and has some motive for attacking it.
> but in most cases I think simple obscurity is more powerful and requires less resources than non obscure strength based security.
Except obscurity doesn't actually give you any security.
> I’ve managed public servers that stayed uncompromised without security updates for a decade or longer using obscurity: an archaic old Unix OS of some type that does not respond to pings or other queries, runs services on non-standard ports, and blocks routes to hosts that even attempt scanning the standard ports.
That's a laughably weak level of security and does approximately ~zero against a capable and motivated attacker. Also, your claim of 'stayed uncompromised' is seemingly based on nothing.
You are begging the question- insisting that obscurity isn't security by definition, instead of actually discussing its strengths and weaknesses. I didn't "say so"- I gave specific real-world examples, and explained the underlying theory- that being unable to plan or quantify what is required to compromise a system makes it much harder.
Instead of simply labeling something you seem not to like as "laughably weak", as in your last example- do you have any specific reasoning? Again, I'd like to emphasize that I don't advocate obscurity in place of other methods, but on top of additional methods.
Let's try some silly extreme examples of obscurity. Say I put up a server running OpenBSD (because it is less popular)- obviously a recent version with all security updates-, and it has only one open port- SSH, reconfigured to run on port 64234, and attempting to scan any other port immediately and permanently drops the route to your IP. The machine does not respond to pings, and does other weird things like only being physically connected for 10 minutes a day at seemingly random times only known by the users, with a new IP address each time that is never reused. On top of that, the code and all commands of the entire OS have been secretly translated into a dead ancient language so that even with root it would take a long time to figure out how to work anything. It is a custom secret hacked fork of SSH only used in this one spot that cannot be externally identified as SSH at all, and exhibits no timing or other similar behaviors to identify the OS or implementation. How exactly are you going to remotely figure out that this is OpenBSD and SSH, so you can then start to look for a flaw to exploit?
If you take the alternate model, and just install a mainstream open-source OS and stay on top of all security updates the best you can, all a potential hacker needs to do is quickly exploit a newly disclosed vulnerability before you actually get the update installed, or review the code to find a new one.
Is it easier to rob a high security vault in a commercial bank on a major public street, or a high security vault buried in the sand on a remote island, where only one person alive knows its location?
> Instead of simply labeling something you seem not to like as "laughably weak", as in your last example- do you have any specific reasoning?
'without security updates for a decade or longer' - do I really need to go into detail on why this is hilariously terrible security?
'runs services on non-standard ports,' - ok, _maybe_ you mitigated some low-effort automated scans, but it does not address service signatures at all; the most basic nmap service-detection scan bypasses this already.
'blocks routes to hosts that even attempt scanning the standard ports ' - what is 'attempt scanning the standard ports', and how are you detecting that? Is it impossible for me to scan your server from multiple boxes? (No, it's not; it's trivially easy.)
> Say I put up a server running OpenBSD (because it is less popular)- obviously a recent version with all security updates-, and it has only one open port- SSH,
Ok, so already far more secure than what you said in your previous comment.
> only being physically connected for 10 minutes a day at seemingly random times only known by the users
Ok, so we're dealing with a server/service which is vastly different in its operation from almost any real-world server.
> only known by the users, with a new IP address each time that is never reused
Now you have to explain how you force a unique IP every time, and how users know about it.
> On top of that, the code and all commands of the entire OS have been secretly translated into a dead ancient language so that even with root it would take a long time to figure out how to work anything
Ok, so completely unrealistic BS.
> It is a custom secret hacked fork of SSH only used in this one spot that cannot be externally identified as SSH at all
It can't be identified, because you waved a magic wand and made it so?
> and exhibits no timing or other similar behaviors to identify the OS or implementation
Let's wave that wand again.
> How exactly are you going to remotely figure out that this is OpenBSD and SSH, so you can then start to look for a flaw to exploit?
Many ways. But let me use your magic wand and give you a much better/secure scenario - 'A server which runs fully secure software with no vulnerabilities or security holes whatsoever.' - Makes about as much sense as your example.
> Is it easier to rob a high security vault in a commercial bank on a major public street, or a high security vault buried in the sand on a remote island, where only one person alive knows its location?
The answer comes down to what 'high security' actually means in each situation. You don't seem to get it.
Obfuscation is not security. So there can't be "security through obscurity".
Widely deployed doesn't mean it's a positive thing, and effective? It just can't be, as it's not security. People really need to pay more attention to these things, or else we DO get nonsense rolled out as "effective".
Where did you come up with “security through obscurity” in that previous comment? It said nothing about using an obscurity measure. He was talking about hardware-based privacy features.
What do you mean by considered bad practice? By whom? I would think this is one of the reasons that my Macs since 2008 have just worked without any HW problems.
As far as I can see, it's not possible to connect to a device that uses the same Apple account, which is how I have it set up in my case. It has to be a different one.
Also, it only seems to work on a local network with hostnames.
Is that just openssh with an obscured name?
It would be nice to simply use the correct basic tool instead of a company's strange rebranding of a much better understood and trusted standard.
A different one: it’s called “Screen Sharing.app” and it’s under Utilities.
After you open it, press “Connections -> New” and start typing a contact name.
They get a little notification if they are online, and if they accept you have a seamless screen sharing experience ready to go. It’s honestly magic for the “my parents have an error message and don’t know what to do” situation.
I assume they have you in their contacts as well for it to work.
For my parents: “under utilities” means it’s mostly impossible to use. What does under mean? What are utilities, are they different from apps? Can I Google search that? Where is connections? What is a connection, is that like a friend?
Glad to see your parents are tech savvy, but this reads like you live in a very different reality from mine.
Some of us are old enough to remember the era of the officially authorised Apple clones in the 90's.
Some of us worked in hardware repair roles at the time.
Some of us remember the sort of shit the third-party vendors used to sell as clones.
Some of us were very happy the day Apple called time on the authorised clone industry.
The tight-knit integration between Apple OS and Apple Hardware is a big part of what makes their platform so good. I'm not saying perfect. I'm just saying if you look at it honestly as someone who's used their kit alongside PCs for many decades, you can see the difference.
> My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
“Hacker News” was always the arm of Valley startup mentality, not Slashdot-era Linux enthusiast privacy spook groupthink. It is unfortunate that this change has occurred.
Startups sometimes want to build hardware products, and in that case they mostly can't rely on __consumer__ products like Apple sells.
Apple gobbling up supply chains and production capacity is not something a hardware startup should be happy with.
Also, startup engineers don't necessarily like "alien technology", which is what Apple is becoming by developing everything behind closed doors and with little cooperation.
Startups don't like to pay 10-30% of their revenue just for running their software on a device someone has already paid for.
There are more reasons to dislike Apple than you can find on Slashdot.
> Apple gobbling up supply chains and production capacity is not something a hardware startup should be happy with.
You start with a very good and fair point.
> Also, startup engineers don't necessarily like "alien technology", which is what Apple is becoming by developing everything behind closed doors and with little cooperation.
> Startups don't like to pay 10-30% of their revenue just for running their software on a device someone has already paid for.
You ended with one that is pretty much at odds with my experience. Startups value distribution channels, and the App Store has been fantastic for this.
Furthermore, it's one thing to dislike Apple. It's another thing for people to veer off into conspiracy theory, which is what half the threads on here have started to do.
> Slashdot-era Linux enthusiast privacy spook groupthink
This is what the vast majority of discourse on these topics has been dominated by on every platform, not just HN. I wonder if there's a shorter term for this, none of 4chan /g/'s crass terms cover this kind of depiction.
But very distinctly, not all. Apple deliberately makes customers buy more than what they need while refusing to sell board-level ICs or allow donor boards to be disassembled for parts. If a $0.03 Texas Instruments voltage controller melts on your Macbook, you have to buy and replace the whole $600 board if you want it working again. In Apple's eyes, third party repairs simply aren't viable and the waste is justified because it's "technically" repaired.
> You can install whatever OS you want on your computer
Just not your iPhone, iPad or Apple Watch. Because that would simply be a bridge too far - allowing real competition in a walled garden? Unheard of.
> You can disable the system lockdowns that "tighten the screws" you refer to and unlock most things back to how they used to be.
And watch as they break after regular system upgrades that force API regressions and new unjustified restrictions on your OS. Most importantly, none of this is a real option on Apple's business-critical products.
Yeah, I know you can't buy every component. But five years ago you couldn't buy any.
We're clearly talking about Macs for the software parts so I'm not sure why you're bringing in iPhone/iPad/Apple Watch where the status quo has remained unchanged since they were introduced. I'd love those to be opened up but that's another conversation.
Regarding system restrictions on macOS (putting aside the fact it fully supports other operating systems on Apple hardware), the ability to disable the system restrictions hasn't changed for years. System Integrity Protection is still a toggle that most users never need to touch.
> Most importantly, none of this is a real option on Apple's business-critical products.
It was mandated by right-to-repair laws, it provides the absolute minimum, and they've attempted to price out people wanting to do repairs. The only way it could be more hostile to users is by literally being illegal.
They could go out of their way to make things actually easy to work on and service, but that has never been the Apple Way. Compare to Framework, or building your own PC, or even repairing a laptop from another OEM.
What you see as hostile to repair, I see as not worth stealing. What you see as macOS dictating what you can run from where, I see as infiltration prevention.
What you see as anticompetitive payment processing on iOS, others may see friendly and harmless business model. HNers, be respectful when criticizing bigger companies like John Deere and Apple - it's important you don't hurt these customer's feelings and scare them off.
Your tractor doesn't (I would hope) contain your banking details and all your emails, contacts, browsing history, photos, etc. It deserves to be treated as the tool that it is.
Apple taking your data privacy seriously seems a worthy exception to me. You're free to disagree, and buy an Android.
Apple can take my privacy seriously while also allowing me to fix my hardware. You are promoting a false dichotomy that could be used to excuse almost any form of irrational behavior.
> MacOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want. Maybe I'm not the average user, but I use mostly open-source Unix tooling and have never had a problem with permissions or restrictions.
Are you talking about packaged applications that are made available on the App Store? If so, sure have rules to make sure the store is high-quality, kinda like how Costco doesn't let anyone just put garbage on their shelves
> Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want.
Try sharing a binary that you built but didn't sign and Notarize and you'll see the problem.
It'll run on the machine that it was built on without a problem; the problems start when you move the binary to another machine.
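If you want to see which state a binary is in before handing it to someone else, a quick check is to wrap the two relevant macOS command-line tools, `codesign` and `spctl`. A small sketch only; it checks status, it does not perform signing or notarization (those go through `codesign` and `xcrun notarytool`):

```python
import subprocess
import sys

def check_gatekeeper_status(path: str) -> None:
    # Is there a valid, intact code signature at all?
    sig = subprocess.run(["codesign", "--verify", "--strict", path],
                         capture_output=True, text=True)
    print("signature :", "ok" if sig.returncode == 0 else sig.stderr.strip())

    # Would Gatekeeper accept it if it were downloaded onto another machine?
    gk = subprocess.run(["spctl", "--assess", "--type", "execute", path],
                        capture_output=True, text=True)
    print("gatekeeper:", "accepted" if gk.returncode == 0 else gk.stderr.strip())

if __name__ == "__main__":
    check_gatekeeper_status(sys.argv[1])
```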
Just to clarify, Asahi Linux is working on M3/M4 support. As far as I can tell nothing changed in the boot loader that makes this work more difficult, it just takes time to add the new hardware.
You mean you have bugs on a system that isn’t announced as production-ready yet?
I tested Asahi and I genuinely love it, and I’ll probably be happy to use it as my daily driver as soon as it is mature enough. And I’m impressed by how well it works (outside of what still doesn’t work at all).
But buying an ARM Mac hoping to run Linux on it today without issues is just the wrong move. Just buy a classic PC if you want to be productive on Linux today.
I’m pretty confident it will happen though since the team itself looks pretty confident about supporting what is currently missing and in the past, achieved more than I hoped.
edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
> edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
"You need some cloud-based identity, and this is the best one," even granting its premises, doesn't make being forced into this one a good thing. I'm an Apple user, but there are plenty of people I need to message and share files with who aren't in the Apple ecosystem.
EDIT: As indicated in the reply (written before I added this edit), it sounds like I was ignoring the first part of the post, which pointed out that you aren't forced to use it. I agree that that is a sensible, and even natural and inevitable, reading. I actually wasn't ignoring that part, but I figured the only reason to include this edit was to say "that isn't true, but if it were true, then it would be OK." (Otherwise, what's the point? There's no more complete refutation needed of a false point than that it is false.) My argument is that, if it were true, then that wouldn't be OK, even if you need a cloud-based identity, and even if iCloud is the best one.
I had to set up a Windows computer for the first time in a decade recently, and holy shit did they make it difficult to figure out how to do it without a Microsoft account.
My MacBooks are built like a tank and outperform/outlive everything else easily for a decade. I don’t need more than 128GB of RAM or 2TB of storage… and I don’t need to repair what doesn’t break. It would be nice to have the option, but the time I save using an OS that just works like MacOS is worth more to me. And the best software in the world always runs on it. It’s a no-brainer for me.
You can also repair their devices yourself using their repair program, but only if you have the capabilities as an individual, so I don’t see your point.
> I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
You either have very low standards or very low understanding if you think a completely closed OS on top of completely closed hardware somehow means it 'really belongs' to you, or that your data/privacy is actually being respected.
Me too. So stable that I'm becoming less and less tolerant of annoying issues in Windows, and all the more motivated to make consumer Linux even more available and reliable.
It's not that bad anymore (e.g. with system 76), but I understand the point.
I disagree with OP celebrating Apple to be the least evil of the evils. Yes, there are not many (if any) alternatives, but that doesn't make Apple great. It's just less shitty.
It feels like a lot of people in these threads form their opinions of what desktop Linux is like these days based on one poor experience from back in 2005.
This is an Apple zealot thread. They aren’t reasonable or actually interested in computing outside of Apple. They will talk shit about Linux all day in this thread or any other one, but if you criticize Apple it’s to the guillotine for your comment. I’m surprised yours is still standing.
yesterday I booted into my Kubuntu installation. it's on a rusty spinning disk on purpose. even after half of a century since the mother of all demos we still cannot preload the fucking "start menu" or whatever is the more correct technical term for that thing that _eventually_ shows the applications when you click on it.
You hit the nail on the head. And it’s something virtually everyone else replying to you is completely missing.
Apple isn’t perfect. They’re not better at privacy than some absolutist position where you run Tails on RISC V, only connect to services over Tor, host your own email, and run your own NAS.
But of all the consumer focused hardware manufacturers and cloud services companies, they are the only ones even trying.
You miss the point. It's not that I enact authority over my system in every detail all the time, but I want the ability to choose authority on the aspects that matter to me in a given circumstance.
They just have really good marketing. You fell for their pandering. If you really care about privacy use Linux. But Apple ain't it. Closed source and proprietary will never be safe from corporate greed.
If you're using the web, your privacy is about your browser and your ISP, not your OS.
At times, it's even about how you use your browser. No browser will save you from telling google too much about yourself by using gmail, and viewing youtube videos, and using search. The AI's and algorithms collating all that information on the backend see right through "incognito" mode.
Telling people they can get security and privacy by using Linux, or windows, or mac just betrays a fundamental misunderstanding of the threat surface.
You missed the point completely. The problem with a user-hostile closed OS like Windows is that they collect a lot of data from your computer even if you never open a web browser. You have no clue what they collect and what they do with the data.
If you're so focused on privacy why don't you just use Linux? With Linux you'll actually get real privacy and you'll really truly own the system.
Apple takes a 30% tax on all applications running on their mobile devices. Just let that sink in. We are so incredibly lucky that never happened to PC.
As much as anyone can say otherwise, running Linux isn’t just a breeze. You will run into issues at some point, you will possibly have to make certain sacrifices regarding software or other choices. Yes it has gotten so much better over the past few years but I want my time spent on my work, not toying with the OS.
Another big selling point of Apple is the hardware. Their hardware and software are integrated so seamlessly. Things just work, and they work well. 99% of the time - there’s always edge cases.
There’s solutions to running Linux distros on some Apple hardware but again you have to make sacrifices.
Even on the machines most well-supported by Linux, which are Intel x86 PCs with only integrated graphics and Intel wifi/bluetooth, there are still issues that need to be tinkered away like getting hardware-accelerated video decoding working in Firefox (important for keeping heat and power consumption down on laptops).
I keep around a Linux laptop and it's improved immensely in the past several years, but the experience still has rough edges to smooth out.
> Even on the machines most well-supported by Linux, which are Intel x86 PCs with only integrated graphics and Intel wifi/bluetooth
Uhh, this is just untrue. I have it running on three different laptops from different vendors and Fedora, pop_OS!, and Ubuntu were all pretty much drop-in replacements for Windows, no problems.
You "keep around a Linux laptop" but I daily drive them and it's fine. Sure, there's the odd compatibility problem which could be dealbreaking, but it's not like MacOS is superior in that regard.
I'm just speaking from personal experience. That example was real, Firefox did not work with hardware accelerated video decode out of the box for me under Fedora, which was pretty high impact with that machine being used for studying. I got it working and it's kept working since, but like I said, it took some tinkering.
macOS has its own oddities of course, but they don't impede such basic usage as video playback.
I have used several distributions and daily driven linux for long periods of time (2-3 years) since 2008. Even today multimedia apps have issues, these can be solved by going through online forums, but it's always a frustrating start. Usually upgrades to software will re-introduce these issues and you will need to follow the same steps.
> "Today we’re making these resources publicly available to invite all security and privacy researchers – or anyone with interest and a technical curiosity – to learn more about PCC and perform their own independent verification of our claims."
They did say that people can come look at the hardware too, but we can all understand why that's not a fully open competition for logistical reasons if nothing else.
They've certainly engaged in a lot of privacy theater before. For example
> Apple oversells its differential privacy protections. "Apple’s privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," says USC professor Aleksandra Korolova, a former Google research scientist who worked on Google's own implementation of differential privacy until 2014. She says the dialing down of Apple's privacy protections in iOS in particular represents an "immense increase in risk" compared to the uses most researchers in the field would recommend.
Does that mean you just don't bother encrypting any of your data, and just use unencrypted protocols? Since you can't inspect the ICs that are doing the work, encryption must all also be security theater.
That's a fine bit of goalpost shifting. They state that they will make their _entire software stack_ for Private Cloud Compute public for research purposes.
Assuming they go through with that, this alone puts them leagues ahead of any other cloud service.
It also means that to mine your data the way everyone else does, they would need to deliberately insert _hardware_ backdoors into their own systems, which seems a bit too difficult to keep secret and a bit too damning a scandal should it be discovered...
Occam's razor here is that they're genuinely trying to use real security as a competitive differentiator.
The approach that the big platforms have to producing their own versions of very successful apps cannibalizes their partners. This focus on consumer privacy by Apple is the company's killer competitive advantage in this particular area, IMO. If I felt they were mining me for my private business data I'd switch to Linux in heartbeat. This is what keeps me off Adobe, Microsoft Office, Google's app suite, and apps like Notion as much as possible.
Apple isn't privacy focused. It can't be, at this size, with this leadership.
Privacy puts user interests first. Apple doesn't.
Try exporting your private data (e.g. photos) from any modern Apple device (one that you paid for and fully own) to a non-Apple device that is an industry standard, like a USB stick or another laptop. Monitor some network traffic going out from your laptop. Try getting replacement parts for your broken iDevice.
Others aren't pretending to put your interests first, Apple though...
>I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
At least these days - it means asking for less trouble. It really is improving by leaps and bounds. But I still dual boot on my gaming PC, though I run a lot of games on Linux in compatibility mode and it works well a reasonable amount of the time.
Of late I have been imagining tears of joy rolling down the face of the person who decides to take it upon themself to sing the paeans of Apple Privacy Theatre on a given day. While Apple has been gleefully diluting privacy on their platforms (along with quality and stability of course). They are the masters at selling dystopian control, lock in, and software incompetence as something positive.
It's most dangerous that they own the closed hardware and they own the closed software and then they also get away with being "privacy champions". It's worse than irony.
It's only 'bulletproof' in PR and ad copy, because for as long as the US is capable of undermining any tech company that operates within its purview with NSLs, the 'perception of security' is a total fallacy.
In other words, the technology is not bulletproof, no matter how hard the marketing people work to make it appear so - only the society within which the provider operates can provide that safety.
For some, this is an intolerable state of affairs - for others, perfectly tolerable.
Mac OS calls home every time you execute an application.
Apple is well on its way to ensuring you can only run things they allow via the App Store; they would probably already be there if it wasn't for the pesky EU.
If you send your computer/phone to Apple for repair you may get back different physical hardware.
Those things very much highlight that "your" Apple hardware is not yours and that privacy on Apple hardware does not actually exist. Sure, they may not share that data with other parties, but they definitely do not respect your privacy or act like you own the hardware you purchased.
Apple marketing seems to have reached the level of indoctrination where everyone just keeps parroting what Apple says as an absolute truth.
They send a hash of the binaries/libraries, and generate a cache locally so it's not sent again. That helps stop you from running tampered-with binaries and frameworks. No user-personal data is sent.
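For what it's worth, the "hash it, cache it, only ask once" pattern being described looks roughly like the sketch below. This is a generic illustration of that caching pattern, not necessarily Apple's exact protocol (other comments below describe it as an OCSP check on the notarization/developer certificate):

```python
import hashlib
import json
from pathlib import Path

CACHE_FILE = Path("verdict_cache.json")  # hypothetical local cache

def binary_digest(path: str) -> str:
    # Hash of the file contents only; nothing user-specific is involved.
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def remote_lookup(digest: str) -> str:
    # Placeholder: a real system would ask a reputation/revocation service here.
    return "ok"

def check_binary(path: str) -> str:
    cache = json.loads(CACHE_FILE.read_text()) if CACHE_FILE.exists() else {}
    digest = binary_digest(path)
    if digest not in cache:          # only unseen binaries trigger a network call
        cache[digest] = remote_lookup(digest)
        CACHE_FILE.write_text(json.dumps(cache))
    return cache[digest]             # "ok" or a revocation verdict
```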
There is no evidence at all that they are trying to ensure you can only run things from the App Store - I run a whole bunch of non-app-store binaries every single day. To make that claim is baseless and makes me de-rate the rest of what you write.
There is always a trade-off between privacy and security. This still falls well under the Google/Android/Chrome level, or indeed the Microsoft/Windows level with its targeted ads, IMHO.
My understanding is that they keep a local file with known malware signatures, just like the malware scanners on every other platform.
> macOS includes built-in antivirus technology called XProtect for the signature-based detection and removal of malware. The system uses YARA signatures, a tool used to conduct signature-based detection of malware, which Apple updates regularly
Xprotect is a blacklist that runs locally and is rarely used.
The phone home functionality is notarization, where apple does a network call to check that the signature on an executable actually came from apple’s notarization process. It is in essence a reputation system, where developers must be on good terms with apple to have the ability to notarize and get a smooth install experience.
From what I recall, notarization is only done on the developer's side before publishing. Client-side, it's just a check against Apple certificates to verify that the binary hasn't been tampered with since notarization; no phoning home should be involved. (Or maybe just to update Apple certificates.)
They also check the developer certificate in the OCSP stage.
Both of these are mechanisms where apple can effectively lock out developers from having a smooth install experience for their software at their discretion.
1. Most users are not capable of using general purpose computing technology in a wild, networked environment safely.
2. Too many people who matter to ignore insist, "something must be done."
3. And so something shall be done.
4. Apple is navigating difficult waters. As much as I disapprove of how they have chosen a path for iOS, the fact is many people find those choices are high value.
5. I do, for the most part, approve of their choices for Mac OS. I am not sure how they prevent malicious code without maintaining some sort of information for that purpose.
6. We are arriving at a crossroads many of us have been talking about for a long time. And that means we will have to make some hard choices going forward. And how we all navigate this will impact others in the future for a long time.
Look at Microsoft! They are collecting everything! And they absolutely will work with law enforcement anytime, any day, almost any way!
I sure as hell want nothing to do with Windows 11. Most technical people I know feel the same way.
Screenies every 3 to 5 seconds? Are they high? Good grief! Almost feels like raw rape. Metaphorically, of course.
Then we have Linux. Boy am I glad I took the time way back in the 90's to learn about OSS, Stallman, read words from interesting people, Raymond, Perkins, Searles, Lessig, Doctorow, many others!
Linus did all of tech one hell of a solid and here we are able to literally dumpster dive and build whatever we want just because we can. Awesome sauce in a jar right there
, but!
(And this really matters)
...Linux just is not going to be the general answer for ordinary people. At least not yet. Maybe it will be soon.
It is an answer in the form of a crude check and balance against those in power. Remember the "something shall be done" people? Yeah, those guys.
And here we are back to Apple.
Now, given the context I put here, Apple has ended up really important. Working professionals stand something of a chance choosing Mac OS rather than be forced into Windows 11, transparent edition!
And Apple does not appear willing to work against their users best interests, unless they are both compelled to by law, and have lost important challenges to said law.
If you want that, your choices are Apple and Linux!
7. Open, general purpose computing is under threat. Just watch what happens with ARM PC devices and the locked bootloaders that will follow, just like on mobile devices.
Strangely, I find myself wanting to build a really nice Intel PC while I still can do that and actually own it and stand some basic chance of knowing most of what it is doing for me. Or TO ME.
No Joke!
As I move off Win 10, it will be onto Linux and Mac OS. Yeah, hardware costs a bit more, and yeah it needs to be further reverse engineered for Linux to run on it too, but Apple does not appear to get in the way of all that. They also do not need to help and generally don't. Otherwise, the Linux work is getting done by great people we all really should recognize and be thankful for.
That dynamic is OK with me too. It is a sort of harsh mutual respect. Apple gets to be Apple and we all get to be who we are and do what we all do with general purpose computers as originally envisioned long ago.
We all can live pretty easily with that.
So, onward we go! This interesting time will prove to be more dangerous than it needs to be.
If it were not for Apple carving out a clear alternative, things would look considerably more draconian - I could, and maybe almost should, say fascist - and, to me, completely unacceptable.
As someone who cut his teeth on computing in the era you refer to, I have a small disagreement about Linux (especially Ubuntu) in your statement.
Apple is priced beyond the reach of many "ordinary people", especially outside the western markets. A cheap (perhaps after-market) laptop with Ubuntu on it (often installed by the seller) is something that has been getting a lot of traction among regular users. Most of the things they do are via a browser, so as long as Chrome/FF works, they're good. They often install software that undermines the security that the platform natively offers, but still, it's a pretty decent compromise.
You know I decided to take my old note 8 for a test drive as a PC of sorts. Went ahead and purchased one of those USB 3 port bricks so I could hook up a nice display, keyboard, mouse, removable storage.
Samsung Dex popped up and it works mostly!
I found one could do quite a lot.
That is not the way I would go, but if I had to? Bring it! Plenty can be done, good skills learned.
> I run a whole bunch of non-app-store binaries every single day
if you are in the US, you need to either register as a developer, or register an apple id and register your app to run it for a week. that's how you run non-app store code. Both of those require permission from apple.
This is completely incorrect. You can download a random binary and execute it. You will get a warning dialog saying it’s not signed by a known developer. You are free to ignore that though.
Depends what you mean by fiddling. But I'm in the process of switching to mac from Linux because my new job has forced it upon me.
I tried installing "Flameshot" via homebrew and it wouldn't run until I went into Finder, right clicked it and clicked open. Luckily it's mentioned in their docs [0] or I would have never guessed to do this.
I use homebrew every day and have never encountered this. Sounds like an issue with how the software has been packaged.
I also notice two other installation options in your link that do not come with those additional instructions - which to me suggests the issue is with whatever they’re doing on homebrew.
If I were you, I would relax. At least you are not being shoved onto Win 11.
And then think about that. Seriously. I did. Have a few times off and on over the years as we sink into this mess.
I bet you find an OS that does a bit more than you may otherwise prefer to prevent trouble. If so, fair call in my book.
Just how big of a deal is that?
Compared to Android, Windows 10, and tons of network services and such, and what they do not do FOR you, and instead do TO you.
And you can run a respectable and useful installation of Linux on that spiffy Apple hardware when it gets old. So make sure it gets old, know what I mean?
As someone that just got out of a gig where I had to run Docker on MacOS - for the love of god, I would have done almost anything to use Windows 11.
Look - if I'm going to be treated like garbage, advertised to and patronized, at least let me use the system that can run Linux shells without turning into a nuclear reactor.
It’s not “a big deal” if the user knows about it, but the phrasing in macOS is maliciously bad - I sent a build from my machine to a coworker and when they “naively” ran it, the pop-up that came up didn’t say “this program is unsigned”, it said “this program is damaged and will now be deleted” (I don’t remember the exact phrasing but it made it sound like a virus or damaged download, not like an unsigned program).
> If you send your computer/phone to Apple for repair you may get back different physical hardware.
I happen to be in the midst of a repair with Apple right now. And for me, the idea that they might replace my aging phone with a newer unit is a big plus. As I think it would be for almost everyone. Aside from the occasional sticker, I don't have any custom hardware mods to my phone or laptop, and nor do 99.99% of people.
Can Apple please every single tech nerd 100% of the time? No. Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
Why not both? Why can’t we have a good usability experience AND control? In fact, we used to have that via the Mac hardware and software of the 1990s and 2000s, as well as NeXT’s software and hardware.
There was a time when Apple’s hardware was user-serviceable; I fondly remember my 2006 MacBook, with easily-upgradable RAM and storage. I also remember a time when Mac OS X didn’t have notarization and when the App Store didn’t exist. I would gladly use a patched version of Snow Leopard or even Tiger running on my Framework 13 if this were an option and if a modern web browser were available.
NeXT was great and Mac OS X was also nice and had a lovely indie and boutique app ecosystem during the mid-to-late 2000s. Sadly, iOS stole the focus. However, the OP argues Linux usability is bad, which I think is an outdated POV. It really depends on your setup and usecases. For many development usecases, Linux is superior to macOS.
I run NixOS on a plain X11 environment with a browser, an editor and a terminal. It's really boring. For my favorite development stacks, everything works. Flakes make workflow easy to reproduce, and it's also easy to make dramatic setup changes at OS level thanks to declarativeness and immutability.
If you're interacting with other humans, or with the consumer internet, you'll run into thousands of situations where my default setup (macOS, Chrome) "just works," and your setup will require some extra effort.
You may be smart enough to figure it out, but most people (even many smart tech people) get tired of these constant battles.
Here's an example from earlier this evening: I was buying a plane ticket from Japan Air Lines. Chrome automagically translates their website from Japanese to English. Other browsers, e.g. Firefox, and even Safari, do not - I checked. Is there a workaround or a fix? I'm sure you could find one, given time and effort. But who wants to constantly deal with these hassles?
Another very common example is communication apps. Or any time you're exchanging data in some proprietary format. Would it be great if no one used proprietary formats? Yes! Is that the world we live in? No. Can I force the rest of the world to adopt open standards, by refusing to communicate with them? No.
The world has moved on from desktop environments to multi-device integration like Watch, Phone, AirTags, Speakers, TV and in that way Linux usability is certainly worse than MacOS.
Oh sort of. That is for sure a thing, but not THE thing.
I would argue people are being tugged in that direction more than it being simply better.
You can bet when people start to get to work building things --all sorts of things, not just software, they find out pretty quickly just how important a simple desktop running on a general purpose computer really is!
It could help to compare to other makers for a minute: if you need to repair your Surface Pro, you can easily remove the SSD from the tray, send your machine and stick it back when it comes repaired (new or not)
And most laptops at this point have removable/exchangeable storage. Except for Apple.
> remove the SSD from the tray, send your machine and stick it back when it comes repaired
Apple has full-disk encryption backed by the secure enclave, so it's not bypassable.
Sure, their standard question set asks you for your password when you submit it for repair.
But you don't have to give it to them. They will happily repair your machine without it because they can boot their hardware-test suite off an external device.
I get your point, but we can also agree "send us your data, we can't access it anyway, right ?" is a completely different proposition from physically removing the data.
In particular, if a flaw were revealed in the Secure Enclave or the encryption, it would be too late to act on it after machines have been sent in for years.
To be clear, I'm reacting to the "Apple is privacy focused" part. I wouldn't care if they snooped on my bank statements on disk, but as a system I see them as behind what other players in the market are doing.
I hear the point you're making and I respect the angle, it's fair enough, but ...
The trouble with venturing into what-if territory is that the same applies to you...
What if the disk you took out was subjected to an evil-maid attack?
What if the crypto implementation used on the disk you took out was poor?
What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years?
The trouble with IT security is that you have to trust someone and something, because even with open source you're never going to sit and read the code (of the program AND its dependency tree), and even with open hardware you still need to trust all those parts you bought that were made in China, unless you're planning to open your own chip fab and motherboard plant?
It's the same with Let's Encrypt certs; every man and his dog is happy to use them these days. But there's still a lot of underlying trust going on there, no?
So all things considered, if you did a risk assessment, being able to trust Apple? Most people would say that's a reasonable assumption?
> even with open-source, you're never going to sit and read the code (of the program AND its dependency tree)
You don't have to. The fact that it's possible for you to do so, and the fact that there are many other people in the open source community able to do so and share their findings, already makes it much more trustworthy than any closed Apple product.
I hope you bring that up as an example in favor of open source, as an example that open source works. In a closed-source situation it would either not be detected or never reach the light of day.
In a closed source situation people using a pseudonym don't just randomly approach a company and say "hey can I help out with that?"
It was caught by sheer luck and chance, at the last minute - the project explicitly didn't have a bunch of eyeballs looking at it and providing a crowd-sourced verification of what it does.
I am all for open source - everything I produce through my company to make client work easier is open, and I've contributed to dozens of third party packages.
But let's not pretend that it's a magical wand which fixes all issues related to software development - open source means anyone could audit the code. Not that anyone necessarily does.
> What if the disk you took out was subjected to an evil-maid attack?
Well, have fun with my encrypted data. Then I get my laptop back, and it's either a) running the unmodified, signed and encrypted system I set before or b) obviously tampered with to a comical degree.
> What if the crypto implementation used on the disk you took out was poor?
I feel like that is 100x more likely to be a concern when you can't control disk encryption in any meaningful way. The same question applies to literally all encryption schemes ever made, and if the feds blow a zero-day to crack my laptop, that's a victory through attrition in anyone's book.
> What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years?
What if aliens did it?
Openness is a response to a desire for accountability, not perfect security (because that's foolish to assume from anyone, Apple or otherwise). People promote Linux and BSD-like models not because they cherry-pick every exploit like Microsoft and Apple do, but because deliberate backdoors must accept that they are being submitted to a hostile environment. Small patches will be scrutinized line by line; large patches will be delayed until they are tested and verified by maintainers. Maybe my trust is misplaced in the maintainers, but no serious exploit developer is foolish enough to assume they'll never be found. They are publishing themselves to the world, irrevocably.
What if the disk could be removed, put inside a thunderbolt enclosure, and worked on another machine while waiting for the other? That's what I did with my Framework.
Framework has demonstrated in more than one way that Apple's soldered/glued-in hardware strategy is not necessary.
It's also possible to say "nothing" and just leave it at that. A lot of people are desperate to defend Apple by looking at security from a relative perspective, but today's threats are so widespread that arguably Apple is both accomplice and adversary to many of them. Additionally, their security stance relies on publishing whitepapers that, to my knowledge, have never been independently verified, and on perpetuating a lack of software transparency on every platform they manage. Apple has also attempted to sue security researchers for enabling novel investigation of iOS and iPadOS, something Google is, by comparison, comfortable with on Android.
The fact that Apple refuses to let users bring their own keys, choose their disk encryption, and verify that they are secure makes their platforms no more "safe" than BitLocker, in a relative sense.
I suppose so they can do a boot test post-repair or something like that. I have only used their repair process like twice in my life and both times I've just automatically said "no" and didn't bother asking the question. :)
With Apple FDE, you get nowhere without the password. The boot process doesn't pass go. This catches people out when they reboot a headless Mac: the password comes before boot, not after, even if the GUI experience makes you feel otherwise.
You need to trust the erasure system, which is software. It also requires you to have write access to the disk, whatever the issue is; otherwise your trust rests on the encryption and on nobody having the key.
That's good enough for most consumers, but it's a lot more sensitive for enterprises IMHO. It usually gets a pass by having the contractual relationship with the repair shop cover the risks, but I know some roles that don't get MacBooks for that reason alone.
>And for me, the idea that they might replace my aging phone with a newer unit, is a big plus. As I think it would be for almost everyone.
except that isn't generally how factory repairs are handled.
I don't know about Apple specifically, but other vendors (Samsung, Microsoft, Lenovo) will happily swap your unit for a factory-refurbished or warranty-repaired unit as long as it was sufficiently qualified beforehand -- so the 'replaced with a newer unit' concept might be fantasy.
I've seen a few Rossmann streams with officially "refurbished" MacBooks that were absolutely foul inside. Boards that looked like they had been left on a preheater over lunch, rubber wedges to "cure" a cracked joint, all sorts of awful shit. The leaked stories from the sweatshop that did the work were 100% consistent with the awful quality.
Admittedly this was a few years ago. Has apple mended their ways or are they still on the "used car salesman" grindset?
Are these Apple refurbished, or bought from a third party like Best Buy or Amazon? I’ve bought plenty of Apple refurbished products (directly from Apple) over the years and they always look like new (including 100% battery health).
Third parties and resellers, though, I'm convinced just call their returned/open-box units that appear to be in decent condition "refurbished."
You have a phone with a real, but subtle fault. Something not caught by the normal set of tests. You return it for repair, get sent a new one, they replace the battery in your old one and put into stock as 'reconditioned'.
My phone is perfect, save for a worn out battery. I send it in for battery replacement, they send me yours. Now I've swapped my perfect phone for your faulty phone - and paid $70 to do so.
It would depend on a country's consumer laws. I used to work for AASPs in Australia and they definitely used refurbished phones for replacements and refurbished parts for Mac repairs. Not everyone who uses this site lives in America...
Further, there is a CRL/OCSP cache — which means that if you're running a program frequently, Apple are not receiving a fine-grained log of your executions, just a coarse-grained log of the checks from the cache's TTL timeouts.
Also, a CRL/OCSP check isn't a gating check — i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
Remember that CRLs/OCSP function as blacklists, not whitelists — they don't ask the question "is this certificate still valid?", but rather "has anyone specifically invalidated this certificate?" It is by default assumed that no, nobody has invalidated the certificate.
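To make that fail-open ("fail-soft") behaviour concrete, here's a minimal sketch of the decision logic in plain Python. It only illustrates the semantics described above; check_ocsp() and the cache structure are hypothetical stand-ins, not Apple's trustd internals:

    import time

    CACHE_TTL = 4 * 60 * 60      # placeholder TTL; the real value is Apple's choice
    cache = {}                   # cert serial -> (allowed?, timestamp)

    def is_allowed(cert_serial):
        entry = cache.get(cert_serial)
        if entry and time.time() - entry[1] < CACHE_TTL:
            return entry[0]                    # fresh cached verdict: no network traffic
        try:
            revoked = check_ocsp(cert_serial)  # hypothetical network call to the responder
        except OSError:
            if entry:
                return entry[0]                # error: fall back to the stale cached verdict
            return True                        # no cache at all: allow the app (fail open)
        allowed = not revoked                  # blacklist semantics: allowed unless revoked
        cache[cert_serial] = (allowed, time.time())
        return allowed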
> i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
> Last week, just after we covered the release of Big Sur, many macOS users around the world experienced something unprecedented on the platform: a widespread outage of an obscure Apple service caused users worldwide to be unable to launch 3rd party applications.
Scroll down a little further on your link for confirmation of what the parent said:
> As was well-documented over the weekend, trustd employs a “fail-soft” call to Apple’s OCSP service: If the service is unavailable or the device itself is offline, trustd (to put it simply) goes ahead and “trusts” the app.
Even at the time people quickly figured out you could just disconnect from the internet as a workaround until the issue was fixed.
Presumably because you have Gatekeeper set to "Allow applications from: App Store" rather than "Allow applications from: App Store & Known Developers".
This is just Gatekeeper asking you which code-signing CA certs you want to mark as trusted in its kernel-internal trust store (which is, FYI, a separate thing from the OS trust store): do you want just the App Store CA to be trusted? Or do you also want the Apple Developer Program's "Self-Published App" Notarization CA to be trusted?
Choosing which code-signing CA certs to trust will, obviously, determine which code-signed binaries pass certificate validation. Just like choosing which TLS CAs to trust determines which websites pass certificate validation.
Code-signing certificate validation doesn't happen online, though. Just like TLS certificate validation doesn't happen online. It's just a check that the cert you have has a signing path back to some CA cert in the local trust store.
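For intuition, that offline check boils down to walking the chain locally. A conceptual sketch (the certificate objects and verify_signature() here are hypothetical, not Gatekeeper's real API):

    def chain_is_trusted(leaf_cert, local_trust_store):
        cert = leaf_cert
        while cert.issuer_cert is not None:
            # Each certificate must be validly signed by its issuer's key.
            if not verify_signature(signed=cert, signer=cert.issuer_cert):
                return False
            cert = cert.issuer_cert
        # The chain must terminate at a CA cert already present in the local store.
        return cert in local_trust_store       # note: no network access anywhere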
I have the latter Gatekeeper option, and I often have to click "Allow anyway". I don't see how being forced to click an extra button in a preference pane makes things more secure.
If you're getting the Gatekeeper dialog with the "Open anyway" button (the "Apple cannot verify that this app is free of malware" alert), then this is a specific case: you're on Catalina or later, and the app you're using has a valid code-signature but hasn't been notarized.
This warning only triggers for legacy releases of apps, published before notarization existed. Since Catalina, notarization has been part and parcel of the same flow that gets the self-published app bundle code-signed by Apple. AFAIK it is no longer possible to create a code-signed but non-notarized app bundle through Xcode. (It's probably still possible by invoking `codesign` directly, and third-party build systems might still be doing that... but they really shouldn't be! They've had years to change at this point! Catalina was 2019!)
Thus, the "Open anyway" option in this dialog is likely transitional. This warning is, for now, intended to not overly frighten regular users, while also indicating to developers (esp. the developer of the app) that they should really get out a new, notarized release of their app, because maybe, one day, this non-notarized release of the app won't be considered acceptable by Gatekeeper any more.
I'm guessing that once a sufficient percentage of apps have been notarized, such that macOS instrumentation reports this dialog being rarely triggered, the "Open anyway" option will be removed, and the dialog will merge back into the non-code-signed-app version of the dialog that only has "Cancel" and "Move to Trash" options. Though maybe in this instance, the dialog would have the additional text "Please contact the app developer for a newer release of this app" (because, unlike with an invalid digital signature, macOS wouldn't assume the app is infected with malware per se, but rather just that it might do low-level things [like calling private OS frameworks] that Apple doesn't permit notarized apps to do.)
Both Windows and MacOS require that developers digitally sign their software, if you want users to be able to run that software without jumping through additional hoops on their computer.
You can't distribute software through the Apple or Microsoft app stores without the software being signed.
You can sign and distribute software yourself without having anything to do with the app stores of either platform, although getting a signing certificate that Windows will accept is more expensive for the little guys than getting a signing certificate that Macs will accept.
On Windows, allowing users to run your software without jumping through additional hoops requires you to purchase an Extended Validation Code Signing Certificate from a third party. Prices vary, but it's going to be at least several hundred dollars a year.
It used to be that you could run any third-party application you downloaded. Then for a while you'd have to right-click and select Open the first time you ran an application you'd downloaded, and then click through a confirmation prompt. And in macOS 15, you have to attempt to open the application, be told it is unsafe, and then manually approve it via System Settings.
That's just your extremely limited experience (2 stores): Homebrew runs a special command to clear the quarantine bit so you don't get that notification, which you do get if you download apps directly.
Huh? It hashes the binary and phones home doesn’t it? Go compile anything with gcc and watch that it takes one extra second for the first run of that executable. It’s not verifying any certificates
When I first run locally-built software I tend to notice XProtect scanning each binary when it is launched. I know that XProtect matches the executable against a pre-downloaded list of malware signatures rather than sending data to the internet, but I haven't monitored network traffic to be sure it is purely local. You can see the malware signatures it uses at /private/var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara if you're curious.
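If you want to poke at those signatures yourself, here's a rough sketch with the third-party yara-python package (purely illustrative: the scanned path is made up, reading the bundle may need elevated permissions, and Apple's rules may declare externals that need extra compile arguments):

    import yara

    RULES = ("/private/var/protected/xprotect/XProtect.bundle"
             "/Contents/Resources/XProtect.yara")

    # Compile the local signature file and match it against a binary.
    # Everything here happens on the local machine; nothing goes over the network.
    rules = yara.compile(filepath=RULES)
    matches = rules.match("/usr/local/bin/some_binary")   # hypothetical target
    print(matches or "no signature matches")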
With the sheer number of devs who use Macs, there is a 0% chance they’re going to outright prevent running arbitrary executables. Warn / make difficult, sure, but prevent? No.
The strategy is to funnel most users onto an iPad-like platform where, at most, they have basic productivity apps like Word or Excel but no ability to run general-purpose programs.
Meanwhile you have a minimal set of developers with the ability to run arbitrary programs, and you can go from there with surveillance on MacOS like having every executable tagged with the developer's ID.
The greater the distance between the developer and the user, the more you can charge people to use programs instead of just copying them. But you can go much further under the guise of "quality control".
> The strategy is to funnel most users onto an iPad-like platform where, at most, they have basic productivity apps like Word or Excel but no ability to run general-purpose programs.
And you know this how?
This reads like every macOS fan’s worst nightmare, but there’s zero actual evidence that Apple is going in this direction.
> The strategy is to funnel most users onto an iPad-like platform
They make the best selling laptop in the world, and other most-popular-in-class laptops. If their strategy is to have people not use laptops, they are going about it funny.
> not share that data with other parties but they definitely do not respect your privacy
not sharing my data with other parties, or using it to sell me stuff or show me ads, is what I would define as respecting my privacy; Apple checks those boxes where few other tech companies do
Their repair policy, from what I can see, is a thinly veiled attempt to get you to either pay for Apple Care or to upgrade. I got a quote to repair a colleague's MacBook Pro, less than 2 years old, which has apparent 'water damage' and which they want AUD $2,500 to repair! Of course that makes no sense, so we're buying a new one ...
The problem with many self-repair people is they effectively value their time at zero.
I value my time realistically, i.e. above zero and above minimum wage. It is therefore a no-brainer for me to buy AppleCare every ... single ... time. It means I can just drop it off and let someone else deal with messing around.
I also know how much hassle it is. Like many techies, I spent part of my early career repairing people's PCs. Even in big PC tower cases with easy access to all the parts, it's still a fucking horrific waste of time. Hence these days I'm very happy to let some junior at Apple do it for the cost of an AppleCare contract.
> The problem with many self-repair people is they effectively value their time at zero.
Back in 2010 Apple quoted me €700 for a topcase replacement because of shattered display glass. Instead I paid €50 for a third party replacement pane and did 15 minutes of work with a heat gun.
What's more, they fold most of the cost of the repair into the price of parts. So you can either get a replacement screen for €499 and install it yourself, or have it officially repaired for €559, only €60 more. This effectively subsidizes official repairs and makes DIY repairs comparatively expensive.
Apple does extreme gouging with repairs; it's hogwash to claim anything else.
My hope is that the machine will work for a long while, like most of them do. In my case it’s a ~$1200 machine so I prefer to self-insure. I’m taking the chance that if it goes bad, I’ll pay to fix or replace it.
This makes sense, for me, when I do it on everything that I buy.
A big problem with AppleCare, here in Thailand anyway, is that you need to give them your computer for a few weeks. You have to wait a week for them to even look at it; they won't let you keep using it and bring it back in a week.
How often do you actually need a repair from Apple? I used to buy AppleCare but stopped in the last few years and have yet to need any repairs done except a battery replacement on a 14 Pro that I was giving to family.
Even with small children, I haven’t really found a need for AppleCare. They don’t touch my devices, and their devices are older iPads that aren’t worth that much to begin with, sheathed with big chonky cases that have survived a few trips down stairs unscathed.
Because it feels like extortion. There was almost certainly no water damage caused by external factors: the user didn't spill anything on it and has literally no idea where the so-called water damage could have come from. I have heard anecdotally that this is their go-to for denying claims and it is difficult to argue against.
Agree. I recently went to an Apple store in Tokyo to buy an accessory. The Apple employee pulled up their store iPhone to take my payment (Apple Pay) and then asked me to fill out a form with my email address, and there was a message about how my info would be shared with some company. I thought about going back and pretending to buy something else so I could film it. I questioned the store person: "Isn't Apple supposed to be privacy first?" If it was privacy first, they wouldn't have asked for the info in the first place, and they certainly wouldn't be sharing it with a third party.
At the very least Apple are better than Microsoft, Windows and the vendors that sell Windows laptops when it comes to respecting user experience and privacy.
I switched to iPhone after they added the tracker blocking to the OS.
Everything is a tradeoff.
I’d love to live in the F droid alt tech land, but everything really comes down to utility. Messaging my friends is more important than using the right IM protocol.
Much as I wish I could convince everyone I know and have yet to meet to message me on Signal or whatever, that simply isn’t possible. Try explaining that I am not on Whatsapp or insta to a girl I’ve just met…
Also it is nice to spend basically no time maintaining the device, and have everything work together coherently. Time is ever more valuable past a certain point.
But why do we have to choose between convenient and open? Why are these companies allowed to continue having these protected "gardens"? I don't believe a free and truly open ecosystem for mobile devices would actually be less convenient than iOS or Android. If anything it would be vastly better.
Has it occurred to you that the stronger control of the ecosystem is a feature that supports the convenience and integration that's possible?
This is just the "Why not Linux desktop" argument from the past two decades. Sure, in theory it can be configured to do a lot of different things. But you're probably gonna have to work out the details yourself because the downside of theoretically supporting everything is that it's impossible to just have it work out of the box with every single scenario.
They have big numbers. Those numbers tell them that 95% of people need to be in closed, protected gardens rather than getting slaughtered by open-source wolves.
> Apple is well on its way to ensure you can only run things they allow via app store, they would probably already be there if it wasn't for the pesky EU.
People have been saying this ever since Apple added the App Store to the Mac in 2010. It’s been 14 years. I wonder how much time has to go by for people to believe it’s not on Apple’s todo list.
Genuinely asking: are there any specifics on this? I understand that blocking at the firewall level is an option, but I recall someone here mentioning an issue where certain local machine rules don’t work effectively. I believe this is the issue [1]. Has it been “fixed”?
They're probably referring to the certificate verification that happens when you open any notarized application. Unless something changed recently, the system phones home to ensure its certificate wasn't revoked.
It does kind of suck if the binary is frequently updated, is big, and you have a slow internet connection. A program which normally takes seconds to open can take 20 or more seconds after an update. And if you don't use that program frequently, you always get a very slow start.
Yeah, because what's being sent is not analytics but related to notarization, verifying the app's integrity (aka: is it signed by a certificate known to Apple?)
This came to light a few years ago when the server went down and launching apps became impossibly slow…
> Around one year ago, after joining the Blender Development Fund and seeding hardware to Blender developers, Apple empowered a few of its developers to directly contribute to the Blender source code.
I'm assuming similar support goes to other key pieces of software, e.g., from Adobe, Maxon, etc... but they don't talk about it for obvious reasons.
The point being Apple considers these key applications to their ecosystem, and (in my estimation at least) these are applications that will probably never be included in the App Store. (The counterargument would be the Office Suite, which is in the App Store, but the key Office application, Excel, is a totally different beast than the flagship Windows version, that kind of split isn't possible with the Adobe suite for example.)
Now what I actually think is happening is the following:
1. Apple believes the architecture around security and process management that they developed for iOS is fundamentally superior to the architecture of the Mac. This is debatable, but personally I think it's true as well for every reason, except for what I'll go into in #2 below. E.g., a device like the Vision Pro would be impossible with macOS architecture (too much absolute total complete utter trash is allowed to run unfettered on a Mac for a size-constrained device like that to ever be practical, e.g., all that trash consumes too much battery).
2. The open computing model has been instrumental in driving computing forward. E.g., going back to the Adobe example, After Effects plugins are just dynamically linked right into the After Effects executable. Third party plugins for other categories often work similarly, e.g., check out this absolutely wild video on how you install X-Particles on Cinema 4D (https://insydium.ltd/support-home/manuals/x-particles-video-...).
I'm not sure if anyone on the planet even knows why, deep down, #2 is important, I've never seen anyone write about it. But all the boundary pushing computing fields I'm interested in, which is mainly around media creation (i.e., historically Apple's bread-and-butter), seems to depend on it (notably they are all also local first, i.e., can't really be handled by a cloud service that opens up other architecture options).
So the way I view it is that Apple would love to move macOS to the fundamentally superior architecture model from iOS, but it's just impossible to do so without hindering too many use cases that depend on that open architecture. Apple is willing to go as close to that line as they can (making those use cases more difficult, e.g., the X-Particles video above), but not actually willing to cross it.
> Apple is well on its way to ensure you can only run things they allow via app store, they would probably already be there if it wasn't for the pesky EU
What has the EU done to stop Apple doing this? Are Apple currently rolling it out to everywhere but the EU?
>Apple is well on its way to ensure you can only run things they allow via app store
That ship has well and truly sailed. This conspiracy might once have held water, but Apple's machines are far too commercially ubiquitous for them to have any designs on ring-fencing all the software used by all the industries that have taken a liking to the hardware.
The EU is center-right-wing, and laughs all the way to the bank whenever someone like you falls for their "we externally pretend to be the good guys" trope. Leyen is pretty much the worst leadership ever, but they still manage to convince the politically naive that everything is fine because of GDPR, AI laws, and huge penalties for big tech. It's sad how simple it is to confuse people.
I mean, the security features are pretty well documented. The FBI can't crack a modern iPhone even with Apple's help. A lot of the lockdowns are in service of that.
I'm curious: what hardware and software stack do you use?
Edit: I have not posted a source for this claim, because what sort of source would be acceptable for a claim of the form "X has not occurred"?
If you are going to claim Apple's security model has been compromised, you need not only evidence of such a compromise but also an explanation for why such an "obvious" and "cheap" vulnerability has not been disclosed by any number of white or grey-hat hackers.
"Since then, technologies like Grayshift’s GrayKey—a device capable of breaking into modern iPhones—have become staples in forensic investigations across federal, state, and local levels."
"In other cases where the FBI demanded access to data stored in a locked phone, like the San Bernardino and Pensacola shootings, the FBI unlocked devices without Apple’s help, often by purchasing hacking tools from foreign entities like Cellebrite."
> Apple is well on its way to ensure you can only run things they allow via app store
I'm very happy to only run stuff approved on Apple's app store... ESPECIALLY following their introduction of privacy labels for all apps so you know what shit the developer will try to collect from you without wasting your time downloading it.
Also, have you seen the amount of dodgy shit on the more open app stores?
It's a reasonable choice to do so and you can do it now.
The problem starts when Apple forbids it for people who want to install whatever they want on their own computers.
> Apple is well on its way to ensure you can only run things they allow via app store
I am totally OK with this. I have personally seen Apple reject an app update and delist the app because a tiny library used within it had a recent security concern. Forced the company to fix it.
No one is stopping you from using only the App Store if you value its protection, so you need a more relevant justification for forcing everyone else to do so.
If I had 1.4B active users I would want to mitigate the ability of almost all of them to accidentally fuck up their devices instead of worrying about irritating a few tech folk because they can’t load broken apps on it.
Your stat is an order of magnitude type of fantasy, the apps aren't broken, and the inability to install also affects everyone, not a few folks, so again you're left with nothing but your personal desire for controlling other people
> Your personal desire for controlling other people.
Well that’s just childish, pouty, and not a very well thought out train of thought on the subject.
The control isn't over people, it's about finding a solution to creating and preserving market share via device reliability on the platform. There are 1.4B iPhone users (and that's a real number, not a fantasy), and not every one of those people is savvy enough to vet their applications before installation. If installation of any app were wide open, a large portion of those 1.4B would accidentally install crap. They may have 100 apps on their phone, but if 1 is a broken piece of shit (and yes, conservatively at least 1% of apps out there probably have a bug bad enough to wreak some havoc) and it renders the reliability of the phone to shit, that's bad. If the market perceives that the reliability of the device is shit, Apple loses at either increasing or preserving market share for the device. Apple needs those devices to work reliably, and it feels that one way to do that is vetting the apps that will run on them. The hardware is great, the OS does its job making the hardware platform operational, but the one place where there is an opportunity to introduce instability is in the apps. So you do your best to control that area of instability on your platform.
Here is the beautiful thing for you… there are plenty of other phones out there that will allow you to install whatever the hell you want. Apple only has 16% of the worldwide smartphone market share.
Man, talk about crashing trains of thought: you fail to grasp that the conversation is about macOS, not iOS, that there is no contradiction between "blah platform" and control over people, and even that the existence of other phones doesn't negate the deficiencies of this specific phone.
> conservatively at least 1% of apps
That's another made up number of yours, with a similarly made up qualifier
> the market perceives that the reliability of the device is shit
Since the vast majority of devices aren't so locked down, isn't "the market" yelling at you that you're wrong?
I was talking about iOS so yes, I missed that the conversation was about Mac. Shame on me. In a sense the use case for a Mac is less ubiquitous than a smartphone, so the need for vetting may not be as great because users of the device don’t perceive the apps running on it as the device itself.
However, I stand firm in my argument about why the iPhone is locked down and why it's a good thing. Even if you look at other smartphone manufacturers like Samsung, you still find similar attempts to control the lay user's ability to install unvetted apps on the devices. It may be even more important for them to do that, since they don't fully control the OS on their devices.
> That's another made up number of yours, with a similarly made up qualifier
Obviously it was made up, and obviously it was set as an intentionally low bar for software quality, because who would argue (especially on HN) that 100% of available software out there is bug-free? But if you want to believe that all available software is 100% safe to use, I encourage you to download and install everything you come across, no matter whether the device is a smartphone, a Mac, or any other device you use and rely upon. I am sure you will be fine.
Sure, though it doesn't mean what you want it to mean since you just ignore the $$$ elephant in the room that explains the desire for more control. For the same reason, you "stand firm" in ignorance as to "why the iPhone is locked down"
> Obviously it was made up
Glad you realise that.
> intentionally low bar
Intentionally appearing like one
> if you want to believe ... software is 100% safe to use
Again with your fantasies. I believe the justification should be grounded in reality, both in terms of the % estimate and in terms of the severity (so no, "bug free" is irrelevant; you need severe, billions-affecting bugs that can only be eliminated by hard-forcing the App Store, which you can't have, since reality doesn't align with you).
And as to your standing firm in your argument "why it’s a good thing", well, you don't really have an argument, just a desire for one with made up stats and corporate motivations
Alrighty, so I guess what we have learned is that apparently some number at 100%, or perhaps less, of all software is released bug-free. However, we don't know for sure "the perhaps less", despite all the numerous historical examples of shit software being released that has wreaked havoc that we or others have experienced. And since we don't know that precise number, we are not allowed to state any estimate, no matter how modest, that is below that 100% of software perfection. Therefore, a device manufacturer would never need to, nor should, do anything that attempts to protect the consumer and its market share by protecting the device's perceived reliability by preventing buggy software from being installed, because buggy software doesn't exist.
Thanks for the education in the importance of precision and the rejection of experience in determining reality. I’ll ignore my decades of having to clean up all the messes that apparently non-existent buggy shit software managed to do to novice and lay users who willy-nilly installed it…or maybe didn’t install it, since it was imaginary.
By the way…before you respond again you might read up a bit on situational irony. You seemed to have missed it on my prior comment…and this one is dripping with it.
Your drips don't land because you can't make a valid argument; you ignore what I said and resort back to your fantasy land, again fighting your imaginary 100%s and do-nothings.
Sure – Apple are trying to stop people who don't know what they're doing from getting hurt. Hence the strong scrutiny on what is allowed on the App Store (whether it's reasonable to charge 30% of revenue is an entirely different question).
People who are installing things using a terminal are probably (a) slightly computer savvy and (b) therefore aware that this might not be a totally safe operation.
Privacy is the new obscenity. What does privacy even mean to you, concretely? Answer the question with no additional drama, and I guarantee you that either Apple doesn't deliver what you are asking for, or you are using services from another company, like Google, in a way whose actions show you don't really care about what you are asking for.
> End to end encryption by default, such that the cloud provider cannot access my data.
The App Store stores a lot of sensitive data about you and is not end-to-end encrypted. They operate it just like everyone else. You also use Gmail, which is just as sensitive as your iMessages, and Gmail is not end-to-end encrypted, so it's not clear you value that as much as you say.
I think "could a creepy admin see my nudes" or "can my messages be mined to create a profile of my preferences" are much more practical working definitions of privacy than "can someone see that I've installed an app".
End-to-end encryption is certainly the most relevant feature for these scenarios.
App store DRM is a red herring, as a developer I can still run as much untrusted code on my MBP as I want and I don't see that going away any time soon.
You are saying a lot of words but none of them negate the point that Apple has a better security posture for users than any of the other big tech cos. For any meaningful definition of the word "security."
Sure I use gmail, I've been locked in for 15 years. Someday I'll get fed up enough to bite the bullet and move off it.
> Apple has a better security posture for users than any of the other big tech cos. For any meaningful definition of the word "security."
Apple can push updates and change the rules on your device at any time. Rooted Android works better in that regard: you can still use Google stuff on rooted devices. Also I don't think Apple's security posture for users in China is better than every "other big tech co."
The takeaway for me is that Apple's storytelling is really good. They are doing a good job on taking leadership on a limited set of privacy issues that you can convince busy people to feel strongly about. Whether or not that objectively matters is an open question.
There's some weird[1] laws around privacy in Australia, where government departments are blocked from a bunch of things by law. From my perspective as a citizen, this just results in annoyance such as having to fill out forms over and over to give the government data that they already have.
I heard a good definition from my dad: "Privacy for me is pedestrians walking past my window not seeing me step out of the shower naked, or my neighbours not overhearing our domestic arguments."
Basically, if the nude photos you're taking on your mobile phone can be seen by random people, then you don't have privacy.
Apple encrypts my photos so that the IT guy managing the storage servers can't see them. Samsung is the type of company that includes a screen-capture "feature" in their TVs so that they can profile you for ad-targeting. I guarantee you that they've collected and can see the pictures of naked children in the bathtub from when someone used screen mirroring from their phone to show their relatives pictures of their grandkids. That's not privacy.
Sure, I use Google services, but I don't upload naked kid pictures to anything owned by Alphabet corp, so no problem.
However, I will never buy any Samsung product for any purpose because they laugh and point at customer expectations of privacy.
[1] Actually not that weird. Now that I've worked in government departments, I "get" the need for these regulations. Large organisations are made up of individuals, and both the org and the individual people will abuse their access to data for their own benefit. Many such people will even think they're doing the "right thing" while destroying freedom in the process, like people that keep trying to make voting systems traceable... so that vote buying will become easy again.
That was the result of social engineering though, not iCloud being compromised. AFAIK it was a phishing scam, asking the victims for their usernames and passwords.
> asking the victims for their usernames and passwords.
This should illuminate for you that there is nothing special about iCloud privacy or security, in any sense. It has the same real weaknesses as any other service that puts a UI in front of normal people.
Never said there was. No system is foolproof, and a lot of today's security is so good that the users are usually the weakest links in a system. Still, some are more secure than others, and there's a difference between a person being tricked into giving up their credentials and a zero-day.
In my experience, single-core CPU is the best all-around indicator of how "fast" a machine feels. I feel like Apple kind of buried this in their press release.
These numbers are misleading (as in, not an apples-to-apples comparison). The M4 has a matrix-multiply hardware extension which can accelerate code written (or compiled) specifically for this extension.
It will be faster only for code that uses/is optimized for that specific extension. And the examples you give are not really correct.
If you add a supercharger you will get more power, but if the car's transmission is not upgraded, you might just get some broken gears and shafts.
If you add more solar panels to your roof, you might exceed the inverter power, and the panels will not bring benefits.
It's true that you will benefit from the changes above, but not just by themselves - something else needs to change so you can benefit. In the case of the M4 and these extensions, the software needs to be changed and also needs to have a use case for them.
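As a rough illustration of that last point: the same matrix multiply only benefits if it is routed through a library that dispatches to the hardware. Assuming a NumPy build backed by an optimized BLAS (Accelerate on macOS is an assumption here, and whether that path uses the new extension is up to Apple's libraries), the contrast looks like this:

    import time
    import numpy as np

    n = 256
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    t0 = time.perf_counter()
    c_fast = a @ b                  # dispatches to whatever optimized BLAS NumPy was built with
    t_blas = time.perf_counter() - t0

    t0 = time.perf_counter()
    c_slow = [[sum(a[i, k] * b[k, j] for k in range(n))   # naive triple loop:
               for j in range(n)] for i in range(n)]      # no hardware assist at all
    t_loops = time.perf_counter() - t0

    print(f"BLAS path: {t_blas:.4f}s, plain Python loops: {t_loops:.1f}s")

The naive loop is the extreme case, but the same principle applies to compiled code that was never built to target the extension.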
That is an example of the kind of hardware and software synergy that gives good performance on Apple Silicon for apps on iOS and macOS. Apple execs have given interviews where they talk about this kind of thing. They look at code in the OS and in their application libraries that can benefit from hardware optimization, and they build in hardware support for it to improve performance overall. This helps all kinds of apps running on the OS and using the standard libraries.
Are you implying there are no use cases for matrix multiply?
In any case, the two main deep learning packages have already been updated so for the place this change was almost certainly targeted for, your complaint is answered. I'm just stunned that anyone would complain about hardware matrix multiplication? I've wondered why that hasn't been ubiquitous for the past 20 years.
Everyone should make that improvement in their hardware. Everyone should get rid of code implementing matrix mult and make the hardware call instead. It's common sense. Not to put too fine a point on it, but your complaint assumes that GeekBench is based on code that has implemented all those changes.
> Are you implying there are no use cases for matrix multiply?
The whole point is that these highly specialized operations only feature in very specialized use cases, and don't show up in overall performance.
We've been dealing with the regular release of specialized processor operations for a couple of decades. This story is not new. You see cherry-picked microbenchmarks used to plot impressive bar charts, immediately followed by the realization that a) in general this sort of operation is rarely invoked with enough frequency to be noticeable, b) you need to build code with specialized flags to get software to actually leverage the feature, and c) even then it's only noticeable in very specialized workloads that already run in the background.
I still recall when fused multiply-add was supposed to be such a game changer because everyone used polynomials and these operations would triple performance. It wasn't the case.
And more to the point, do you believe that matrix multiplication is a breakthrough discovery that is only now surfacing? Computers were being designed around matrix operations way before they were even considered to be in a household.
I'm not complaining, I'm just saying that the higher numbers of that benchmark result do not translate directly to better performance for all software you run. Deep learning as it is right now is probably the main application that benefits from this extension (and probably the reason why it was added in hardware at this point in time).
Well you're really just describing benchmarks- if the benchmark doesn't represent your standard workflow then it probably isn't a good reference for you. But Geekbench includes a bunch of components based on real-world applications like file compression, web browsing, and PDF rendering. So it probably isn't perfect, but it's likely that the M4 will feel a bit faster in regular use compared to an older generation MacBook Pro.
Beware that some of the Geekbench numbers are the result of Geekbench suddenly gaining support for streaming SVE and SME just when Apple implements them.
I'm not doubting the numbers represent real peak throughput on the M4. The timing lining up so well just leaves a certain taste. Also, they don't take advantage of fully SVE2-capable ARM cores to compare how much a full SVE2 implementation would help, especially at accelerating more algorithms than those that neatly map to streaming SVE and SME.
The single-core performance gains of the M4 variants over their predecessors are distorted, because streaming SVE and SME are apparently implemented by combining what used to be the AMX units of four cores.
> I feel like Apple kind of buried this in their press release
The press release describes the single core performance as the fastest ever made, full stop:
"The M4 family features phenomenal single-threaded CPU performance with the world’s fastest CPU core"
The same statement is made repeatedly across most of the new M4 lineup marketing materials. I think that's enough to get the point across that it's a pretty quick machine.
The article has all of the "x times faster than M1" notes but the video shows graphs with the M3 whenever they do that and it is usually ~1.2x in the CPU on that. I think it's probably a smart move this page (and the video) focused so much on 2x or greater performance increases from the M1 generation. After all, so what if it's 20% faster than the M3? As in: how many customers that weren't already interested in just buying the latest thing before reading your marketing material are you going to convince to upgrade from the M3 just because the M4 is ~20% faster vs trying to convince M1 users to upgrade because it's over twice as fast.
I think the point is to try to convince MacBook users who haven't looked to upgrade yet rather than something trying to make a comparison to other non-mac models you could buy today. From that perspective it's perfectly valid, even if it doesn't tick the boxes of what others might want from a different comparison perspective.
> Results are compared to previous-generation 1.7GHz quad-core Intel Core i7-based 13-inch MacBook Pro systems with Intel Iris Plus Graphics 645, 16GB of RAM, and 2TB SSD.
Not really. Intel CPU performance hasn't changed by orders of magnitude in the last ten years. My ten year old Windoze 10 desktop keeps chugging along fine. My newer 2022 i7 Windows machine works similarly well.
However, attention to keeping Intel Macs performant has taken a dive. My 2019 16" MBP died last week, so I fell back to my standby 2014 MBP and it's much more responsive. No login jank pause.
But it also hasn't been eligible for OS updates for 2 or 3 years.
My new M3 MBP is "screaming fast" with Apple's latest patched OS.
My god, it's ridiculous. I really prefer Linux desktops. They've been snappy for the past 30 years, and don't typically get slow UIs after a year or two of updates.
This is the same CPU tier, just a later generation.
Passmark scores:
6700K: 8,929
14700K: 53,263
Yeah, that's practically the same performance.
But hey, that newer i7 has way more cores. Let's pick something with a closer core count for a fairer comparison: the Core i3-14100, with 4C/8T and a 4.7GHz turbo. Even then, its Passmark score is 15,050.
I get it, an old CPU can still be useful. I'm still using an Ivy Bridge CPU for a server in my closet hosting various services for my home, but it is vastly slower than my Ryzen 7 3700x on my current gaming desktop and was even slower than the previous Ryzen 5 2600 I had before and sold to a friend.
> Yeah, better than the glaring, 10x better than i7 Intel Mac. Like that's even a valid point of reference.
The last Intel MacBook Pros were released 4 years ago. Their owners are starting to shop around for replacements. Their question will be "will the expense be worth it?"
I maybe don't understand, but isn't an Intel at 5.5GHz faster in terms of bits processed than the 4.4GHz M4? Wouldn't that be fastest, as more data can be processed?
GHz represents the number of cycles per second, not the number of bits actually processed per second. On different CPUs the same instruction can take a different number of cycles, a different number of the instruction can be in flight at the same time, a different number of several different instructions can be issued at once, a different amount of data can be pulled from cache to feed these instructions, a different quality of instruction reordering and branch prediction, and so on.
As an example a single thread on an x64 core of an old Pentium 4 661 @ 3.6 GHz benchmarks at 315 with PassMark while a single x64 core of a current 285k @ 5.7 GHz turbo benchmarks at 5195. Some of that also comes down to things like newer RAM to feed the CPU but the vast majority comes down to the CPU calculating more bits per clock cycle.
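Working through those numbers (ignoring RAM and the other factors mentioned above), the clock accounts for only a small slice of the gain; the rest is work done per cycle:

    # Worked example using the PassMark figures quoted above.
    p4_clock, p4_score = 3.6, 315        # Pentium 4 661
    new_clock, new_score = 5.7, 5195     # the 285K at its turbo clock

    clock_ratio = new_clock / p4_clock             # ~1.58x from frequency alone
    score_ratio = new_score / p4_score             # ~16.5x overall single-thread
    per_cycle_gain = score_ratio / clock_ratio     # ~10.4x more work per clock

    print(f"{clock_ratio:.2f}x clock, {score_ratio:.1f}x score, {per_cycle_gain:.1f}x per cycle")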
I don't know much about modern Geekbench scores, but that chart seems to show that M1s are still pretty good? It appears that the M4 is only about 50% faster. Somehow I would expect more like a 100% improvement.
Flameproof suit donned. Please correct me because I'm pretty ignorant about modern hardware. My main interest is playing lots of tracks live in Logic Pro.
Some of it depends on which variant fits you best. But yeah, in general the M1 is still very good--if you hear of someone in your circle selling one for cheap because they're upgrading, nab it.
On the variants: An M1 Max is 10 CPU cores with 8 power and 2 efficiency cores.
M4 Max is 16 cores, 12 + 4. So each power core is 50% faster, but it also has 50% more of them. Add in twice as many efficiency cores, that are also faster for less power, plus more memory bandwidth, and it snowballs together.
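Roughly, as a back-of-the-envelope that ignores the efficiency cores, bandwidth, and thermals:

    # Back-of-the-envelope P-core scaling from the figures above.
    m1_p_cores, m4_p_cores = 8, 12
    per_core_speedup = 1.5               # "each power core is 50% faster"

    aggregate_gain = per_core_speedup * (m4_p_cores / m1_p_cores)
    print(f"~{aggregate_gain:.2f}x aggregate P-core throughput")   # ~2.25x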
One nice pseudo-feature of the M1 is that the thermal design of the current MacBook Pro really hasn't changed since then. It was designed with a few generations of headroom in mind, but that means it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move, while an M3 Max is easier to make (slightly) audible.
> it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move,
I routinely get my M1 fans spinning from compiling big projects. You don’t have to get the GPU involved, but when you do it definitely goes up a notch.
I read so much about the M1 Pros being completely silent that I thought something was wrong with mine at first. Nope, it just turns out that most people don’t use the CPU long enough for the fans to kick in. There’s a decent thermal capacity buffer in the system before they ramp up.
Huh! I regularly max CPU for long stretches (game development), but I found I could only get the fans to move if I engaged the neural cores on top of everything else. Something like a 20+ minute video export that's using all available compute for heavy stabilization or something could do it.
The M3 is much more typical behavior, but I guess it's just dumping more watts into the same thermal mass...
The M1 was pretty fast when it debuted. If you own an M1 Mac its CPU has not gotten any slower over the years. While newer M-series might be faster, the old one is no slower.
The M1s are likely to remain pretty usable machines for a few years yet, assuming your workload has not or does not significantly change.
I wonder how reliable Geekbench tests are. AFAIK it's the most common benchmark run on Apple devices, so Apple has a great interest in making sure their newest chips perform great on the test.
I wouldn't be surprised to hear that the geekbench developers are heavily supported by apple's own performance engineers and that testing might not be as objective or indicative of real world perf as one would hope.
Wouldn't it be more surprising to you if Apple had been selling 4 generations of the M series to great acclaim on performance, but it all turned out to be a smoke and mirrors show because the hardware is optimized for one particular benchmark they didn't even reference in their comparisons?
The only areas where the M series "lags" are the high-end workstation/server segment, where they don't really have a 96+ core option, and spaces where you pop in beefy high-end GPUs. Everywhere else, the M4 tends to lead right now.
Okay, the linked benchmarks proved me wrong. I trust cinebench numbers a whole lot more than a geekbench score.
My bad.
And just to be clear: I didn't speculate that Apple tunes its chips to Geekbench, I speculated that Geekbench was overly optimized towards Apple's latest chip.
> (...) but it all turned out to be a smoke and mirrors show because the hardware is optimized for one particular benchmark they didn't even reference in their comparisons?
I know iOS developers who recently upgraded their MacBooks and they claim they now feel more sluggish. I wouldn't be surprised if it was due to RAM constraints instead of CPU though.
So, take those artificial benchmarks with a grain of salt. They are so optimized that they are optimized out of the real world.
Always look at the benchmarks for the actual workload types you'll be using. Some of the above, e.g. Cinebench, are actual production rendering programs. Doesn't mean dick if you aren't going to use it that way.
It's kinda true but also not at all.
In general, having better single-thread performance means a more reactive UI and a snappier feeling, because blocking operations get executed more quickly. On the other hand, a lot of modern software has been heavily optimized for multi-threading, and not all categories of software benefit that much from a faster UI thread. If parallelization isn't too expensive, throwing more cores at something can actually make it faster.
And the big thing you leave out is that it all depends on how well the software is optimized, how many animations it uses, and things like that.
My iPhone has MUCH better single-thread performance than my old PC, yet it feels much slower for almost everything.
And this is exactly how I feel about Apple Silicon Macs. On paper, impressive performance. In actual practice it doesn't feel that fast.
It's not really buried... their headline stat is that it's 1.8x faster than the M1, which is actually a bigger improvement than the actual Geekbench scores show (1.8x the M1 would be a score of 4354).
Call me cynical, but when I see headlines like "up to 2x faster", I assume it's a cherry-picked result on some workload where they added a dedicated accelerator.
There's a massive difference between "pretty much every app is 80% faster" and "if you render a 4K ProRes video in Final Cut Pro it's 3x faster."
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade, except the increase in memory is great (I don't want to have to shut down apps to be able to load some huge LLMs), and I dinged the top case a few months ago, so now there's a shadow on the screen in that spot in some lighting conditions, which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
You'll be glad you did. I loved my 2015 MBP. I even drove 3 hours to the nearest Best Buy to snag one. That display was glorious. A fantastic machine. I eventually gave it to my sister, who continued using it until a few years ago. The battery was gone, but it still worked great.
When you upgrade, prepare to be astonished.
The performance improvement is difficult to convey. It's akin to traveling by horse and buggy. And then hopping into a modern jetliner, flying first class.
It's not just speed. Display quality, build quality, sound quality, keyboard quality, trackpad, ports, etc., have all improved considerably.
The performance jump between a top-of-the-line Intel MBP (I don't remember the year, probably 2019) and the M1 Max I got to replace it was rather like the perf jump between spinning disks and SSDs.
When I migrated all my laptops to SSDs (lenovos at the time, so it was drop-dead simple), I thought to myself, "this is a once-in-a-generation feeling". I didn't think I would ever be impressed by a laptop's speed ever again. It was nice to be wrong.
> The battery was gone, but it still worked great.
A family 2018 Macbook Air got a second life with a battery replacement. Cheap kit from Amazon, screwdrivers included, extremely easy to do. Still in use, no problems.
My 2015 15" MBP is also still kickin, is/was an absolutely fabulous unit. Was my work machine for 3-4 years, and now another almost-6 years as my personal laptop. My personal use case is obviously not very demanding but it's only now starting to really show its age.
I also have a M1 from work that is absolutely wonderful, but I think it's time for me to upgrade the 2015 with one of these new M4s.
Honestly, my Thinkpad from 2015 was still used in my family until recently. The battery was pretty bad, same as on my 2015 MBP, but other than that, I put Fedora on it, and it was still really fast.
Longevity is not exclusive to MBPs, though. OTOH, IIRC, some 2017-2019 MBPs (before the Mx switch) were terrible for longevity, given their problematic keyboards.
I was also a mid-2012 MBP user. I eventually got the M2 MBA because I was investing in my eyesight (modern displays are significantly better). I was never impressed with the touchbar-era macs, they didn't appeal to me and their keyboards were terrible.
I think these M-series MacBook Airs are a worthy successor to the 2012 MBP. I fully intend to use this laptop for at least the same amount of time, ideally more. The lack of a replaceable battery will probably be the eventual killer, which is a shame.
That is amazing. Mine lasted for a super long time as well, and like you, I upgraded everything to its max. I think it was the last model with a 17 inch screen.
Sold mine last year for $100 to some dude who claimed to have some software that only runs on that specific laptop. I didn't question it.
I still have my 2015, and it lived just long enough to keep me going until the death of the touch bar and horrible keyboard, which went away when I immediately bought the M1 Pro on release day.
I used that same model for 5 years until I finally upgraded in 2017, and I totally regretted it; the upgrade was not worth it at all, and I would have been just as happy with the 2012. I quickly replaced it again with the "Mea Culpa" 2019 where they added back ports, etc. That one would have been just about worth the upgrade over the 2012, seven years later, but again, not by a big margin.
The 2012 MBP 15" Retina was probably the only machine I bought where the performance actually got better over the years, as the OS got more optimized for it (the early OS revisions had very slow graphics drivers dealing with the retina display)
The M1 Pro on the other hand, that was a true upgrade. Just a completely different experience to any Apple Intel laptop.
I was considering upgrading to an M3 up until about a month ago when Apple replaced my battery, keyboard, top case, and trackpad completely for free. An upgrade would be nice as it no longer supports the latest MacOS, but at this point, I may just load Ubuntu on the thing and keep using it for another few years. What a machine.
I've just replaced a 2012 i5 MBP that I used for dev work and presentations into 2018.
It has gotten significantly slower over the last 2 years, but the more obvious issues are the sound, the inability to do virtual backgrounds, and now the lack of software updates.
But if you had told me I'd need to replace it in 2022, I wouldn't have believed you.
Ah, my 2013 MBP died in 2019. It was the GPU. There was no way to repair it cheaply enough, so I had to replace it with a 2019 MBP, which was the computer I kept for the shortest time (I hated the keyboard).
How do you justify this kind of recurring purchase, even when selling your old device? I don't get the behaviour or the driving decision factor beyond the obvious "I need the latest shiny toy" (I can't find the exact words to describe it, so apologies for the reductive description).
Over the years I have either assembled my own desktop computers or purchased ex-corporate Lenovos, with a mix of Windows (for gaming, obviously) and Linux, and only recently (4 years ago) was I given an MBP by work, as IT cannot manage Linux machines the way they do macOS and Windows.
I have moved from an Intel i5 MBP to an M3 Pro (?) and it makes me want to throw away the dependable ThinkPad/Fedora machine I still use for personal projects.
I easily spend 100 hours a week using them, split not-as-balanced-as-it-should-be between the two.
I don't buy them because I need something new; I buy them because, in the G4/Intel era, the iterations were massive, and even a 20 or 30% increase in speed (which could be memory, CPU, disk -- they all make things faster) results in me being more productive. It's worth it for me to upgrade immediately when Apple releases something new, as long as I have issues with my current device and the upgrade is enough of a delta.
M1 -> M2 wasn't much of a delta and my M1 was fine.
M1 -> M3 was a decent delta, but, my M1 was still fine.
M1 -> M4 is a huge delta (almost double) and my screen is dented to where it's annoying to sit outside and use the laptop (bright sun makes the defect worse), so, I'm upgrading. If I hadn't dented the screen the choice would be /a lot/ harder.
I love ThinkPads too. Really can take a beating and keep on going. The post-IBM era ones are even better in some regards too. I keep one around running Debian for Linux-emergencies.
There are 2 things I have always spent money on if I felt they were not close to the best achievable: my bed and my laptop. Even the phone can be a 4-year-old iPhone, but the laptop must be the best and fast. My sleep is also pretty important. Everything else is just "eco".
In my country you can buy a device and write it off over 2 years, with the VAT reimbursed, then scrap it from the books and sell it, tax already paid, to people who would otherwise pay a pretty hefty VAT. This cuts your loss of value roughly in half.
It's tax avoidance, not evasion. If it's fully legal then I don't know why you wouldn't recommend it. If you are against it, you can easily pay more in taxes than required yourself.
Apple has a pretty good trade-in program. If you have an Apple card, it's even better (e.g. the trade-in value is deducted immediately, zero interest, etc.).
Could you get more money by selling it? Sure. But it's hard to beat the convenience. They ship you a box. You seal up the old device and drop it off at UPS.
I also build my desktop computers with a mix of Windows and Linux. But those are upgraded over the years, not regularly.
>I've never kept any laptop as long as I've kept the M1
What different lives we live. This first M1 was in November 2020. Not even four years old. I’ve never had a [personal] computer for _less_ time than that. (Work, yes, due to changing jobs or company-dictated changes/upgrades)
Exactly my thoughts. I don't understand whether I'm really spoiled, or whether the crowd here is weird about upgrading for some reason. If you have a laptop from 4-5 years ago, a new one would be 2-5x faster in the vast majority of things. Even if that's not critical for your workflow, it would feel SO MUCH nicer - so if it's something you use for 100h / week, shouldn't you try to make it as enjoyable as reasonably possible?
Another example: I'm by no means rich, but I have a $300 mechanical keyboard. It doesn't make me type faster and it doesn't have any additional functionality over a regular $30 Logitech one, but typing on it feels so nice, and I spend so much of my life doing it, that to me it's completely justified and necessary to have this one.
That’s a feature, not a bug, for some. When I upgraded to an M series chip MacBook, I had to turn up the heat because I no longer had my mini space heater.
> I've never kept any laptop as long as I've kept the M1.
I still have a running Thinkpad R60 from 2007, a running Thinkpad T510 from 2012, and a modified running Thinkpad X61 (which I re-built as an X62 using the kit from 51nb in 2017 with an i7-5600U processor, 32 GB of RAM and a new display) in regular use. The latter required new batteries every 2 years, but was my main machine until 2 weeks ago when I replaced it with a ThinkCentre. During their time as my main machine, each of these laptops was actively used around 100 hours per week, and was often running for weeks without shutdown or reboot. The only thing that ever broke was the display of the R60, which started to show several green vertical bars after 6 years, but replacement was easy.
I've spilled liquid on my MacBooks once every 10 years on average. Last in 2014, then again last month. Accidents happen.
As I've noted in a sibling comment, I'll probably stop purchasing mobile Macs until the repair story on Macbooks is improved -- the risk for accidents and repairs is simply much higher on portable machines. That's only going to happen through third-party repair (which I think would simultaneously lead Apple to lower their first-party repair costs, too).
Interesting. I have found occasion to use it for pretty much every Mac I've owned since the 1980s! I'm not sure how much money it's saved compared to just paying for repairs when needed, but I suspect it may come out to:
1) a slight overall savings, though I'm not sure about that.
2) a lack of stress when something breaks. Even if there isn't an overall savings, for me it's been worth it because of that.
Certainly, my recent Mac repair would have cost $1500 and I only paid $300, and I think I've had the machine for about 3 years, so there's a savings there and considerably less stress. That's similar to the experience I've had all along, although this recent expense would have probably been my most expensive repair ever.
"SSD is soldered on" is a bit of glossing over of the issue with the M-series Macs.
Apple is putting raw NAND chips on the board (and yes, soldering them), and the controller for the SSD is part of the M-series chip. Yes, Apple could use NVMe here if you ignore the physical constraints, ignore the fact that it wouldn't be quite as fast, and ignore the fact that it would increase their BOM cost.
I'm not saying Apple is definitively correct here, but, it's good to have choice and Apple is the only company with this kind of deeply integrated design. If you want a fully modular laptop, go buy a framework (they are great too!) and if you want something semi-modular, go buy a ThinkPad (also great!).
I need macOS for work. Now that the writing is on the wall for Hackintosh (which I used to do regularly while purchasing a Mac every few years, most recently in 2023 and 2018, because I love that choice), I don't have a choice. I used to spend 10-20 hours per third party machine for that choice.
I don't truly mind that they solder on the SSD and embed the controller into the processor -- you're right that it's great we have choice here. I mind the exorbitant repair cost _on top of_ Apple's war on third-party repair. Apple is the one preventing me from having choice here: I have to do the repair through them, or wait until schematics are smuggled out of China and used/broken logic boards are available so that the repair costs what it should: $300 to replace 2 chips on my logic board (still mostly labor, but totally a fair price).
I love Apple for their privacy focus and will continue to support them because I need to do Mac and iOS development, but I will likely stop buying mobile workstations from them for this reason, the risk of repair is simply much higher and not worth this situation.
Yeah, I always have AppleCare. I view it as part of the cost of a mac (or iPhone).
And yeah, this incident reminded me of why it's important to back up as close to daily as you can, or even more often during periods when you're doing important work and want to be sure you have the intermediate steps.
Mine fell off from the roof of a moving car at highway speeds and subsequently spent 30 mins being run over by cars until it was picked back up. Otherwise no complaints.
I had a 2019 i9 for a work laptop. It was absolutely awful, especially with the corporate anti-virus / spyware on it that brought it to a crawl. Fans would run constantly. Any sort of Node JS build would make it sound like a jet engine.
That was the worst laptop I've ever had. Not only did it turn the jet engines on when you tried to do something more demanding than moving the mouse around, it throttled thermally so much that you literally could not move that mouse around.
I have the OG 13" MBP M1, and it's been great; I only have two real reasons I'm considering jumping to the 14" MBP M4 Pro finally:
- More RAM, primarily for local LLM usage through Ollama (a bit more overhead for bigger models would be nice)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how Displaylink works. :(
Not quite sold yet, but definitely thinking about it.
Yep. That's roughly 20% per generation improvement which ain't half-bad these days, but the really huge cliff was going from Intel to the M1 generation.
M1 series machines are going to be fine for years to come.
It feels like M1 was the revolution, subsequent ones evolution - smaller fabrication process for improved energy efficiency, more cores for more power, higher memory (storage?) bandwidth, more displays (that was a major and valid criticism for the M1 even though in practice >1 external screens is a relatively rare use case for <5% of users).
Actually, wasn't the M1 itself an evolution / upscale of their A-series CPUs, which by now they've been working on since... before 2010? The iPhone 4 was the first one with their own CPU, although the design was from Samsung + Intrinsity; it was only the A6 that they claimed was custom-designed by Apple.
In early 2020, I had an aging 2011 Air that was still struggling after a battery replacement. Even though I "knew" the Apple Silicon chips would be better, I figured a 2020 Intel Air would last me a long time, since my computing needs from that device are light, and who knows how many years the Apple Silicon transition would take anyway?
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
Exact same boat here. A friend and I both bought the 2020 Intel MBA thinking that the M1 version was at least a year out. It dropped a few months later. I immediately resold my Intel MBA seeing the writing on the wall and bought a launch M1 (which I still use to this day). Ended up losing $200 on that mis-step, but no way the Intel version would still get me through the day.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
When I saw the M1s come out, I thought that dev tooling would take a while to work for M1, which was correct. It probably took a year for most everything to be compiled for arm64. However I had too little faith in Rosetta and just the speed upgrade M1 really brought. So what I mean to say is, I still have that deadweight MBA that I only use for web browsing :)
Oh yes, my wife bought a new Intel MBA in summer 2020... I told her at the time that Apple was planning its own chip, but that it couldn't be much better than the Intel one, and surely Apple would increase prices too... I was so wrong.
Bought the same machine from the same store last December, been using it ever since with a big grin on my face. I'll probably consider upgrading in 2028 or later.
I still use my MacBook Air M1 and given my current workloads (a bit of web development, general home office use and occasional video editing and encoding) I doubt I’ll need to replace it in the coming 5 years. That’ll be an almost 10 year lifespan.
It's a very robust and capable small laptop. I'm typing this on an M1 MacBook Air.
The only thing to keep in mind is that the M1 was the first CPU in the transition from Intel CPUs (+ AMD GPUs) to Apple Silicon. The M1 was still missing a bunch of things compared to the earlier setups, which Apple added over time via the M1 Pro and later chips. The graphics in particular were sufficient for a small laptop, but not for much beyond that. Better GPUs and media engines were developed later. Today, the M3 in a MacBook Air or the M4 in the MacBook Pro has all of that.
For me the biggest surprise was how well the M1 Macbook Air actually worked. Apple did an outstanding job in the software & hardware transition.
I switched from a 2014 MacBook Pro to a 2020 M1 MacBook Air. Yeah, the CPU is much faster, but the build quality and software are a huge step backwards. The trackpad feels fake and not nearly as responsive, and the keyboard also doesn't feel as solid. But by now I'm used to it.
They also feel very bulky/inelegant while still being fragile for the most part, and not really hitting workstation-level territory.
I don't understand how people are enamored with those things; sure, it's better in some ways than what came before, but it's also very compromised for the price.
With Whisky I feel like I'd never need anything else. That said, the benchmark jump in the M4 has me thinking I should save up and grab a refurb in a year or two.
The M1 Pro compared to Intel was such a big step ahead that I suppose we are all still surprised and excited. Quiet, long battery life, and better performance. By a lot!
I wonder if the M4 really feels that much faster and better - having an M1 Pro I'm not going to change quickly, but maybe a Mac mini will land here some day.
Honestly it was a game changer. Before I'd never leave the house without a charger, nowadays I rarely bring it with me on office days, even with JS / front-end workloads.
(of course, everyone else has a macbook too, there's always someone that can lend me a charger. Bonus points that the newer macbooks support both magsafe and USB-C charging. Added bonus points that they brought back magsafe and HDMI ports)
Yep. Bought an M1 Max in 2021 and it still feels fast, battery lasts forever. I’m sure the M4 would be even quicker (Lightroom, for example) but there’s little reason to consider an upgrade any time soon.
I have the same one, and everyone I know with an M-series Mac says the same thing. These are the first machines in a long time built not only to last a decade but to be used for it.
I have the M1 MacBook Pro with MagSafe and I still charge it via USB C simply because I can't be bothered to carry around another cable when all of my other peripherals are USB C.
That's interesting, I don't use USB A at all whatsoever anymore, and frankly I don't use the SD card reader or HDMI port either. Maybe I should just get a MacBook Air 15" but I do like the 120 hz screen of the Pro models, that's the main reason I'm holding out.
It's the other way around, isn't it? MagSafe was removed in the 2016-2019 model years (not sure why; maybe to shave off another bit of thickness?), and then brought back in 2021 on the MacBook Pro and 2022 on the MacBook Air.
Personally, I practically never use MagSafe, because the convenience of USB C charging cables all over the house outweighs the advantages of MagSafe for me.
I too bought a 2020 M1 MBA. It was great initially, but now it seems like it's getting throttled; same goes for my iPhone X. I used to love Apple, but it's just pathetic that they throttle older devices just to get users to upgrade.
Apple is using LPDDR5 for M3. The bandwidth doesn't come from unified memory - it comes from using many channels. You could get the same bandwidth or more with normal DDR5 modules if you could use 8 or more channels, but in the PC space you don't usually see more than 2 or 4 channels (only common for servers).
Unrelated but unified memory is a strange buzzword being used by Apple. Their memory is no different than other computers. In fact, every computer without a discrete GPU uses a unified memory model these days!
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now it's showing signs of age (as in not being faster than the average laptop).
The new idea is having 512-bit-wide memory instead of the PC limitation of 128 bits. Normal CPU cores running normal code are not particularly bandwidth-limited. However, APUs/iGPUs are severely bandwidth-limited, thus the huge number of slow iGPUs that are fine for browsing but terrible for anything more intensive.
So Apple manages decent GPU performance, a tiny package, and great battery life. It's much harder on the PC side because every laptop/desktop chip from Intel and AMD uses a 128-bit memory bus. You have to take a huge step up in price, power, and size with something like a Threadripper, Xeon, or Epyc to get more than 128-bit-wide memory, and none of those are available in a laptop or Mac-mini-sized SFF.
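To make the bus-width point concrete, peak bandwidth is just bus width times transfer rate. A rough sketch with approximate, illustrative configurations (not measured figures):

```python
# Peak memory bandwidth ~= (bus width in bytes) * (transfers per second).
def peak_gb_per_s(bus_width_bits: int, megatransfers_per_s: int) -> float:
    return bus_width_bits / 8 * megatransfers_per_s / 1000

configs = [
    ("Typical PC, 128-bit DDR5-5600",       128, 5600),
    ("M4 Pro class, 256-bit LPDDR5X-8533",  256, 8533),
    ("M4 Max class, 512-bit LPDDR5X-8533",  512, 8533),
    ("12-channel server, DDR5-6000",    12 * 64, 6000),
]
for name, width, rate in configs:
    print(f"{name}: ~{peak_gb_per_s(width, rate):.0f} GB/s")
```

The last three rows land at roughly 273, 546, and 576 GB/s, which is why the wide-bus parts discussed below are in a different league from a two-channel desktop.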
> The new idea is having 512 bit wide memory instead of PC limitation of 128 bit wide.
It's not really a new idea, just unusual in computers. The custom SoCs that AMD makes for PlayStation and Xbox have wide (up to 384-bit) unified memory buses, very similar to what Apple is doing, with the main distinction being Apple's use of low-power LPDDR instead of the faster but more power-hungry GDDR used in the consoles.
Yeah, a lot of it is just market forces. I guess going to four channels is costly for the desktop PC space and that's why it didn't happen, and laptops just kind of followed suit. But now that Apple is putting pressure on the market, perhaps we'll finally see quad-channel becoming the norm in desktop PCs? Would be nice...
The memory interface width of modern CPUs is 64 bits per channel (DDR4) or 32+32 bits (DDR5).
No CPU uses a 128-bit memory bus per channel, as it would result in overfetch of data, i.e., 128 B per access, or two cache lines.
AFAIK Apple uses 128 B cache lines, so they can do a much better design and customization of the memory subsystem since they do not have to use DIMMs -- they simply solder DRAM to the motherboard, hence the memory interface is whatever they want.
> Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
Sure, per channel. PCs have 2x64 bit or 4x32 bit memory channels.
Not sure I get your point, yes PCs have 64 bit cache lines and apple uses 128. I wouldn't expect any noticeable difference because of this. Generally cache miss is sent to a single memory channel and result in a wait of 50-100ns, then you get 4 or 8 bytes per cycle at whatever memory clock speed you have. So apple gets twice the bytes per cache line miss, but the value of those extra bytes is low in most cases.
Other, bigger differences are that Apple has a larger page size (16KB vs 4KB) and ARM supports a looser memory model, which makes it easier to reach a large fraction of peak memory bandwidth.
However, I don't see any relationship between Apple and PCs as far as DIMMs. Both Apple and PCs can (and do) solder DRAM chips directly to the motherboard, normally on thin/light laptops. The big difference between Apple and PC is that Apple supports 128-, 256-, and 512-bit-wide memory on laptops and 1024-bit on the Studio (a bit bigger than most SFFs). Getting more than 128 bits with a PC means no laptops and no SFFs: generally large workstations with Xeons, Threadrippers, or Epycs, with substantial airflow and power requirements.
FYI cache lines are 64 bytes, not bits. So Apple is using 128 bytes.
Also important to consider that the RTX 4090 has a relatively tiny 384-bit memory bus. Smaller than the M1 Max's 512-bit bus. But the RTX 4090 has 1 TB/s bandwidth and significantly more compute power available to make use of that bandwidth.
The M4 max is definitely not a 4090 killer, does not match it in any way. It can however work on larger models than the 4090 and have a battery that can last all day.
My memory is a bit fuzzy, but I believe the M3 Max did decently in some games compared to the laptop Nvidia 4070 (which is not the same as the desktop 4070). But it highly depended on whether the game was x86-64 (requiring emulation) and whether it was DX11 or Apple-native. I believe Apple claims improvements in Metal (Apple's GPU library) and that the M4 GPUs have better FP for ray tracing, but no significant changes in rasterized performance.
I look forward to the 3rd party benchmarks for LLM and gaming on the m4 max.
Eh… not quite. Maybe on an Instinct. Unified memory means the CPU and GPU can do zero-copy and use the same memory buffer.
Many integrated graphics segregate the memory into CPU owned and GPU owned, so that even if data is on the same DIMM, a copy still needs to be performed for one side to use what the other side already has.
This means that the drivers, etc., all have to understand the unified memory model; it's not just hardware sharing DIMMs.
Yes, you could buy a brand new (announced weeks ago) AMD Turin. 12 channels of DDR5-6000, $11,048 and 320 watts (for the CPU) and get 576GB/sec peak.
Or you could buy a M3 max laptop for $4k, get 10+ hour battery life, have it fit in a thin/light laptop, and still get 546GB/sec. However those are peak numbers. Apple uses longer cache lines (double), large page sizes (quadruple), and a looser memory model. Generally I'd expect nearly every memory bandwidth measure to win on Apple over AMD's turin.
AnandTech did bandwidth benchmarks for the M1 Max and was only able to utilize about half of it from the CPU, and the GPU used even less in 3D workloads because it wasn't bandwidth limited. It's not all about bandwidth. https://www.anandtech.com/show/17024/apple-m1-max-performanc...
Indeed. RIP Anandtech. I've seen bandwidth tests since then that showed similar for newer generations, but not the m4. Not sure if the common LLM tools on mac can use CPU (vector instructions), AMX, and Neural engine in parallel to make use of the full bandwidth.
You lose out on things like expandability (more storage, more PCIe lanes) and repairability though. You are also (on M4 for probably a few years) compelled to use macOS, for better or worse.
There are, in my experience, professionals who want to use the best tools someone else builds for them, and professionals who want to keep iterating on their tools to make them the best they can be. It's the difference between, say, a violin and a Eurorack. Neither's better or worse, they're just different kinds of tools.
I was sorely tempted by the Mac studio, but ended up with a 96GB ram Ryzen 7900 (12 core) + Radeon 7800 XT (16GB vram). It was a fraction of the price and easy to add storage. The Mac M2 studio was tempting, but wasn't refreshed for the M3 generation. It really bothered me that the storage was A) expensive, B) proprietary, C) tightly controlled, and D) you can't boot without internal storage.
Even moving storage between Apple studios can be iffy. Would I be able to replace the storage if it died in 5 years? Or expand it?
As tempting as the size, efficiency, and bandwidth were I just couldn't justify top $ without knowing how long it would be useful. Sad they just didn't add two NVMe ports or make some kind of raw storage (NVMe flash, but without the smarts).
> Even moving storage between Apple studios can be iffy.
This was really driven home to me by my recent purchase of an Optane 905p, a drive that is both very fast and has an MTBF measured in the hundreds of years. Short of a power surge or (in California) an earthquake, it's not going to die in my lifetime -- why should I not keep using it for a long time?
Many kinds of professionals are completely fine with having their Optanes and what not only be plugged in externally, though, even though it may mean their boot drive will likely die at some point. That's completely okay I think.
I doubt you'll get 10+ hours on battery if you utilize it at max. I don't even know if it can really sustain the maximum load for more than a couple of minutes because of thermal or some other limits.
The 14" MBP has a 72 watt-hour battery and the 16" has a 100 watt-hour battery.
At full tilt, an M3 Max will consume 50 to 75 watts, meaning you get 1 to 2 hours of runtime at best.
That's the thing I find funny about the Apple Silicon MBP craze: sure, they are efficient, but if you use the thing as a workstation, battery life is still not good enough to really work unplugged.
Most people claiming insane battery life are effectively using the thing as an information appliance or a media machine. At that game, PC laptops might not be as efficient, but the runtime is not THAT different given the same battery capacity.
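For what it's worth, the runtime estimate above is just watt-hours divided by watts; a tiny sketch using the figures quoted in this thread (approximate, and ignoring the display and the rest of the system, so real numbers come in a bit lower):

```python
# Runtime ~= battery capacity (Wh) / sustained power draw (W).
for battery_wh in (72, 100):        # 14" and 16" MacBook Pro batteries
    for watts in (50, 75):          # quoted M3 Max full-tilt package power
        print(f"{battery_wh} Wh @ {watts} W -> ~{battery_wh / watts:.1f} h")
```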
FWIW I ran a quick test of gemma.cpp on M3 Pro with 8 threads. Similar PaliGemma inference speed to an older AMD (Rome or Milan) with 8 threads. But the AMD has more cores than that, and more headroom :)
"Unified memory" doesn't really imply anything about the memory being located on-package, just that it's a shared pool that the CPU, GPU, etc. all have fast access to.
Also, DRAM is never on-die. On-package, yes, for Apple's SoCs and various other products throughout the industry, but DRAM manufacturing happens in entirely different fabs than those used for logic chips.
It's mostly an IBM thing. In the consumer space, it's been in game consoles with IBM-fabbed chips. Intel's use of eDRAM was on a separate die (there was a lot that was odd about those parts).
Yeah memory bandwidth is one of the really unfortunate things about the consumer stuff. Even the 9950x/7950x, which are comfortably workstation-level in terms of compute, are bound by their 2 channel limits. The other day I was pricing out a basic Threadripper setup with a 7960x (not just for this reason but also for more PCIe lanes), and it would cost around $3000 -- somewhat out of my budget.
This is one of the reasons the "3D vcache" stuff with the giant L3 cache is so effective.
For comparison, a Threadripper Pro 5000 workstation with 8x DDR4 3200 has 204.8GB/s of memory bandwidth.
The Threadripper Pro 7000 with DDR5-5200 can achieve 325GB/s.
And no, manaskarekar, the M4 Max does 546 GB/s, not Gb/s (which would be 8x less!).
Thanks for the numbers. Someone here on hackernews got me convinced that a Threadripper would be a better investment for inference than a MacBook Pro with a M3 Max.
> So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
Not quite, as it depends on the number of channels and not on the number of DIMMs. An extreme example: put all 16 DIMMs on a single channel and you will get the performance of a single channel.
If you're referring to the line you quoted, then no, it's not wrong. Each DIMM is perfectly capable of 64GiB/s, just as the article says. Where it might be confusing is that this article seems to only be concerning itself with the DIMM itself and not with the memory controller on the other end. As the other reply said, the actual bandwidth available also depends on the number of memory channels provided by the CPU, where each channel provides one DIMM worth of bandwidth.
This means that in practice, consumer x86 CPUs have only 128GiB/s of DDR5 memory bandwidth available (regardless of the number of DIMM slots in the system), because the vast majority of them only offer two memory channels. Server CPUs can offer 4, 8, 12, or even more channels, but you can't just install 16 DIMMs and expect to get 1024GiB/s of bandwidth, unless you've verified that your CPU has 16 memory channels.
It's not the memory being unified that makes it fast, it's the combination of the memory bus being extremely wide and the memory being extremely close to the processor. It's the same principle that discrete GPUs or server CPUs with onboard HBM memory use to make their non-unified memory go ultra fast.
No, unified memory usually means the CPU and GPU (and miscellaneous things like the NPU) all use the same physical pool of RAM and moving data between them is essentially zero-cost. That's in contrast to the usual PC setup where the CPU has its own pool of RAM, which is unified with the iGPU if it has one, but the discrete GPU has its own independent pool of VRAM and moving data between the two pools is a relatively slow operation.
An RTX4090 or H100 has memory extremely close to the processor but I don't think you would call it unified memory.
I don't quite understand one of the finer points of this, under caffeinated :) - if GPU memory is extremely close to the CPU memory, what sort of memory would not be extremely close to the CPU?
I think you misunderstood what I meant by "processor", the memory on a discrete GPU is very close to the GPUs processor die, but very far away from the CPU. The GPU may be able to read and write its own memory at 1TB/sec but the CPU trying to read or write that same memory will be limited by the PCIe bus, which is glacially slow by comparison, usually somewhere around 16-32GB/sec.
A huge part of optimizing code for discrete GPUs is making sure that data is streamed into GPU memory before the GPU actually needs it, because pushing or pulling data over PCIe on-demand decimates performance.
> CPU trying to read or write that same memory will be limited by the PCIe bus, which is glacially slow by comparison, usually somewhere around 16-32GB/sec.
If you’re forking out for H100’s you’ll usually be putting them on a bus with much higher throughput, 200GB/s or more.
I thought it meant that both the GPU and the CPU can access it. In most systems, GPU memory cannot be accessed by the CPU (without going through the GPU); and vice versa.
CPUs access GPU memory via MMIO (though usually only a small portion), and GPUs can in principle access main memory via DMA. Meaning, both can share an address space and access each other’s memory. However, that wouldn’t be called Unified Memory, because it’s still mediated by an external bus (PCIe) and thus relatively slower.
This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
That seems unlikely given the mismatched memory speed (see the parent comment) and the fact that Apple uses LPDDR which is typically 16 bits per channel. 8800MT/s seems to be a number pulled out of thin air or bad arithmetic.
Heh, ok, maybe slightly different. But Apple's spec claims 546GB/sec, which works out to 512 bits (64 bytes) * 8533 MT/s. I didn't think the point was 8533 vs 8800.
I believe I saw somewhere that the actual chips used are LPDDR5X-8533.
Effectively the parent's formula describes the M4 Max, give or take 5%.
Fewer libraries? Any that a normal LLM user would care about? PyTorch, Ollama, and others seem to have the normal use cases covered. Whenever I hear about a new LLM, it seems like the next post is some Mac user reporting the tokens/sec. Often about 5 tokens/sec for 70B models, which seems reasonable for a single user.
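Those tokens/sec reports line up with a simple bandwidth-bound estimate: for single-stream decoding, each generated token has to stream roughly the whole set of weights from memory once, so memory bandwidth divided by model size gives an upper bound. A rough sketch (bandwidths and sizes approximate; real throughput comes in lower due to KV-cache traffic, compute, and framework overhead):

```python
# Crude upper bound on single-stream decode speed for a bandwidth-bound machine:
# tokens/sec <= memory bandwidth / bytes of weights read per token.
def max_tok_per_s(bandwidth_gb_s: float, params_billion: float, bytes_per_param: float) -> float:
    return bandwidth_gb_s / (params_billion * bytes_per_param)

cases = [
    ("70B @ 8-bit on ~546 GB/s (M4 Max class)",   546, 70, 1.0),
    ("70B @ 4-bit on ~546 GB/s (M4 Max class)",   546, 70, 0.5),
    ("70B @ 8-bit on ~819 GB/s (M2 Ultra class)", 819, 70, 1.0),
]
for name, bw, params, bpp in cases:
    print(f"{name}: <= {max_tok_per_s(bw, params, bpp):.0f} tok/s")
```

A reported ~5 tok/s for 70B models is a plausible fraction of those ceilings.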
Is there a normal LLM user yet? Most people would want their options to be as wide as possible. The big ones usually get covered (eventually), and there are distinct good libraries emerging for Mac only (sigh), but last I checked the experience of running every kit (stable diffusion, server-class, etc) involved overhead for the Mac world.
A 24gb model is fast and ranks 3.
A 70b model is slow and 8.
A top tier hosted model is fast and 100.
Past what specialized models can do, it's about a mixture/agentic approach and next level, nuclear power scale. Having a computer with lots of relatively fast RAM is not magic.
Thanks, but just to put things into perspective, this calculation has counted 8 channels which is 4 DIMMs and that's mostly desktops (not dismissing desktops, just highlighting that it's a different beast).
Desktops are two channels of 64 bits, or with DDR5 now four (sub)channels of 32 bits; either way, mainstream desktop platforms have had a total bus width of 128 bits for decades. 8x64 bit channels is only available from server platforms. (Some high-end GPUs have used 512-bit bus widths, and Apple's Max level of processors, but those are with memory types where the individual channels are typically 16 bits.)
The vast majority of x86 laptops or desktops are 128 bits wide. Often 2x64-bit channels until a year or so ago, now 4x32-bit with DDR5. There are some benefits to 4 channels over 2, but generally you are still limited to 128 bits unless you buy a Xeon, Epyc, or Threadripper (or Intel equivalent) that is expensive, hot, and doesn't fit in SFFs or laptops.
So basically the PC world is crazy behind the 256-, 512-, and 1024-bit-wide memory buses Apple has offered since the M1 generation arrived.
> This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
I don't think so? That PDF I linked is from 2015, way before Apple put focus on it through their M-series chips... And the Wikipedia article on "Glossary of computer graphics" has had an entry for unified memory since 2016: https://en.wikipedia.org/w/index.php?title=Glossary_of_compu...
For Apple to have come up with using the term "unified memory" to describe this kind of architecture, they would've needed to come up with it at least before 2016, meaning A9 chip or earlier. I have paid some attention to Apple's SoC launches through the years and can't recall them touting it as a feature in marketing materials before the M1. Do you have something which shows them using the term before 2016?
To be clear, it wouldn't surprise me if it has been used by others before Intel did in 2015 as well, but it's a starting point: if Apple hasn't used the term before then, we know for sure that they didn't come up with it, while if Apple did use it to describe A9 or earlier, we'll have to go digging for older documents to determine whether Apple came up with it
There are actual differences but they're mostly up to the drivers. "Shared" memory typically means it's the same DRAM but part of it is carved out and can only be used by the GPU. "Unified" means the GPU/CPU can freely allocate individual pages as needed.
We run our LLM workloads on an M2 Ultra because of this. 2x the VRAM; the one-time cost of $5350 was the same as, at the time, 1 month of an 80GB-VRAM GPU in GCP. Works well for us.
Can you elaborate: are those workflows queued, or can they serve multiple users in parallel?
I think it's super interesting to know real-life workflows and the performance of different LLMs and hardware, in case you can direct me to other resources.
Thanks!
About 10-20% of my company's GPU usage is inference dev. Yes, a horribly inefficient use of resources. We could upgrade the 100-ish devs who do this dev work to M4 MBPs and free up GPU resources.
At some point there should be an upgrade to the M2 Ultra. It might be an M4 Ultra, it might be this year or next year. It might even be after the M5 comes out. Or it could be skipped in favour of the M5 Ultra. If anyone here knows they are definitely under NDA.
They aren't going to be using fp32 for inferencing, so those FP numbers are meaningless.
Memory and memory bandwidth matter most for inferencing. 819.2 GB/s for the M2 Ultra is less than half that of an A100, but having 192GB of RAM instead of 80GB means they can run inference on models that would require THREE of those A100s, and the only real cost is that it takes longer for the AI to respond.
3 A100 at $5300/mo each for the past 2 years is over $380,000. Considering it worked for them, I'd consider it a massive success.
From another perspective though, they could have bought 72 of those Ultra machines for that much money and had most devs on their own private instance.
The simple fact is that Nvidia GPUs are massively overpriced. Nvidia should worry a LOT that Apple's private AI cloud is going to eat their lunch.
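As a quick sanity check on the rent-vs-buy math above (using the prices quoted in this thread, not current market rates):

```python
# Rent-vs-buy figures as quoted in the thread, not current market prices.
a100_per_month = 5_300     # one 80GB-class GPU instance in GCP, per the comment above
months = 24                # "the past 2 years"
rented = 3 * a100_per_month * months
m2_ultra = 5_350           # quoted one-time cost of the 192GB M2 Ultra

print(f"3x A100 rented for {months} months: ${rented:,}")             # $381,600
print(f"That buys roughly {rented / m2_ultra:.0f} M2 Ultra machines")  # ~71
```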
High availability story for AI workloads will be a problem for another decade. From what I can see the current pressing problem is to get stuff working quickly and iterate quickly.
I'm curious about getting one of these to run LLM models locally, but I don't understand the cost benefit very well. Even 128GB can't run, like, a state of the art Claude 3.5 or GPT 4o model right? Conversely, even 16GB can (I think?) run a smaller, quantized Llama model. What's the sweet spot for running a capable model locally (and likely future local-scale models)?
You'll be able to run 72B models w/ large context, lightly quantized with decent'ish performance, like 20-25 tok/sec. The best of the bunch are maybe 90% of a Claude 3.5.
If you need to do some work offline, or for some reason the place you work blocks access to cloud providers, it's not a bad way to go, really. Note that if you're on battery, heavy LLM use can kill your battery in an hour.
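For a rough sense of what fits in a given amount of unified memory: the weights alone take roughly parameter count times bytes per parameter, and you still need headroom for the KV cache, the OS, and whatever else is running. A hedged sketch:

```python
# Approximate weight size only; the KV cache (which grows with context length)
# and the OS need additional headroom on top of these numbers.
def weights_gb(params_billion: float, bits_per_param: int) -> float:
    return params_billion * bits_per_param / 8   # 1B params at 8 bits ~= 1 GB

for params in (8, 32, 72):
    sizes = ", ".join(f"{bits}-bit ~{weights_gb(params, bits):.0f} GB"
                      for bits in (16, 8, 4))
    print(f"{params}B: {sizes}")
```

By that arithmetic, a lightly quantized 72B model lands in the 36-72 GB range, which is why 128GB of unified memory leaves comfortable room for context.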
Having 128GB is really nice if you want to regularly run different full OSes as VMs simultaneously (and if those OSes might in turn have memory-intensive workloads running on them).
At least in the recent past, a hindrance was that MacOS limited how much of that unified memory could be assigned as VRAM. Those who wanted to exceed the limits had to tinker with kernel settings.
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
you'd probably save money just paying for a VPS. And you wouldn't cook your personal laptop as fast. Not that people nowadays keep their electronics for long enough for that to matter :/
This is definitely tempting me to upgrade my M1 macbook pro. I think I have 400GB/s of memory bandwidth. I am wondering what the specific number "over half a terabyte" means.
I am always wondering if one shouldn't be doing the resource intensive LLM stuff in the cloud. I don't know enough to know the advantages of doing it locally.
The weird thing about these Apple product videos in the last few years is that there are all these beautiful shots of Apple's campus with nobody there other than the presenter. It's a beautiful stage for these videos, but it's eerie and disconcerting, particularly given Apple's RTO approach.
Incidentally, when I passed through the hellscape that is Cupertino/San Jose a few years ago, I was a little shocked that as a visitor you can't even see the campus; it's literally a walled garden. When I was initially curious about the campus design during its build, I assumed that at least a single part, maybe the orchard, would be accessible to the public. Based on the surrounding urban development, though, the city isn't exactly interested in being livable.
I used to think the videos with all of the drone fly-bys were cool. But in the last year or so, I've started to feel the same as you. Where are all the people? It's starting to look like Apple spent a billion dollars building a technology ghost town.
Surely the entire staff can't be out rock climbing, surfing, eating at trendy Asian-inspired restaurants at twilight, and having catered children's birthday parties in immaculately manicured parks.
Oh I think they're very well done and very pretty! But lately this discomfort has started to creep in, as you note. Like something you'd see in a WALL-E spinoff: everyone has left the planet already but Buy n Large is still putting out these glorious promo videos using stock footage. Or, like, post-AI apocalypse, all the humans are confined to storage bins, but the proto-AI marketing programs are still churning out content.
You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
It kind of just comes off as one of those YouTube liminal space horror videos when it's that empty.
You can even go back to 1983 "Two kinds of people": a solitary man walks into an empty office, works by himself on the computer and then goes home for breakfast. https://youtu.be/4xmMYeFmc2Q
It's a strange conflict. So much of their other stuff is about togetherness mediated by technology (eg, facetime). And their Jobs-era presentations always ended with a note of appreciation for the folks who worked so hard to make the launch happen. But you're right that much of the brand imagery is solitary, right up to the whole "Here's to the crazy ones" vibe.
It's weirdly dystopian. I didn't realize it bothered me until moments before my comment, but now I can't get it out of my head.
In a strange way, yes, with the premise that the world is otherwise the realm of uncreative, uninspired people. But the comment was addressed more at the odd lifelessness of the imagery.
They could, if only in some shots, but they are such a valuable company that they simply cannot afford the risk of, e.g., criticism for the choice of people they display, or inappropriate outfits or behaviour. One blip from a shareholder can cost them billions in value, which pisses off other shareholders. All of their published media, from videos like this to their conferences, is highly polished, rehearsed, and designed by committee. Microsoft and Google are the same, although at least with Google there's still room for some comedy in some of their departments: https://youtu.be/EHqPrHTN1dU
> You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
I would think that a brand that is at least trying to put some emphasis on privacy in their products would also extend the same principle to their workforce. I don’t work for Apple, but I doubt that most of their employees would be thrilled about just being filmed at work for a public promo video.
There are legal issues with it too, or at least they think there are. They take down developer presentations after a few years partly so they won't have videos of random (ex-)employees up forever.
I interviewed there in 2017 and honestly even back then the interior of their campus was kind of creepy in some places. The conference rooms had this flat, bland beige that reminded me of exactly the kind of computers the G3 era was trying to get away from, but the size of a room, and you were inside it.
That by itself raises an interesting editorial question. Apple (like most big companies) doesn't do things randomly re: high impact public communications like this. I'm curious what made the Mac mini a product that merited showing people doing things, with a computer that is tied to one location, vs. a Macbook Pro's comparative emptiness, for a computer that can go anywhere and be with anyone. It could be as simple as fun vs focus.
The Mac mini is really small now; if it could be powered via USB PD, I think it would be no problem to put it in a backpack with a keyboard and trackpad and bring it home and then to the office. I say this because I notice many people just bring their MBP to the office and then plug it into a big screen. The downside is just that you can't work anywhere like you can with an MBP, but the usability is mostly the same (to me).
> MacBook Pro with M4 Pro is up to 3x faster than M1 Pro (13)
> (13) Testing conducted by Apple from August to October 2024 using preproduction 16-inch MacBook Pro systems with Apple M4 Pro, 14-core CPU, 20-core GPU, 48GB of RAM and 4TB SSD, and production 16-inch MacBook Pro systems with Apple M1 Pro, 10-core CPU, 16-core GPU, 32GB of RAM and 8TB SSD. Prerelease Redshift v2025.0.0 tested using a 29.2MB scene utilising hardware-accelerated ray tracing on systems with M4 Pro. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
So they're comparing using software that relies on the hardware-accelerated ray tracing present in the M3 and M4, but not in the M1. This is really misleading. The true performance increase for most workloads is likely to be around 15% over the M3. We'll have to wait for benchmarks from other websites to get a true picture of the differences.
Edit: If you click on the "go deeper on M4 chips", you'll get some comparisons that are less inflated, for example, code compilation on pro:
14-inch MacBook Pro with M4 4.5x
14-inch MacBook Pro with M3 3.8x
13-inch MacBook Pro with M1 2.7x
So here the M4 Pro is 67% faster than the M1 Pro, and 18% faster than the M3 Pro. It varies by workload of course.
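Since Apple's footnote multipliers are all expressed against the same baseline, the chip-to-chip gains above fall out by simple division; a quick sketch using the numbers from that chart:

```python
# Multipliers from Apple's compile-time chart above, all relative to one baseline.
multipliers = {"M4": 4.5, "M3": 3.8, "M1": 2.7}

print(f"M4 vs M1: +{multipliers['M4'] / multipliers['M1'] - 1:.0%}")  # ~+67%
print(f"M4 vs M3: +{multipliers['M4'] / multipliers['M3'] - 1:.0%}")  # ~+18%
```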
I'm pleased that the Pro's base memory starts at 16 GB, but surprised they top out at 32 GB:
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
It seems you need the M4 Max with the 40-core GPU to go over 36GB.
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for roughly 36-48GB of memory, here are the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro could get you a lot of memory, but less GPU cores. Not sure how much those GPU cores factor in to performance, I only really hear complaints about the memory limits... Something to consider if looking to buy in this range of memory.
Of course, a lot of people here probably consider it no big deal to throw an extra 3 grand at hardware, but I'm a hobbyist in academia when it comes to AI; I don't make big 6-figure salaries :-)
Somehow I got downvoted for pointing this out, but it's weird that you have to spend an extra $800 USD just to be able to surpass 48GB, and "upgrading" to the base-level Max chip decreases your RAM limit, especially when the M4 Pro in the Mac mini goes up to 64GB. Like... that's a shitload of cash to put out if you need more RAM but don't care for more cores. I was really hoping to finally upgrade to something with 64GB, or maybe 96 or 128 if prices came down, but they removed the 96 option and kept 64 and 128 severely out of reach.
Do I get 2 extra CPU cores, build a budget gaming PC, or subscribe to creative suite for 2.5 years!?
I haven't done measurements on this, but my Macbook Pro feels much faster at swapping than any Linux or Windows device I've used. I've never used an M.2 SSD so maybe that would be comparable, but swapping is pretty much seamless. There's also some kind of memory compression going on according to Activity Monitor, not sure if that's normal on other OSes.
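On the compression point: macOS does compress memory under pressure (Activity Monitor's "Compressed" figure). If you're curious how much is being compressed at a given moment, here's a rough sketch that parses `vm_stat`; the field names are as they appear on recent macOS releases, so treat it as illustrative:

```python
# Rough look at how much memory macOS currently holds in the compressor,
# by parsing `vm_stat` output (field names as seen on recent macOS; illustrative).
import re
import subprocess

out = subprocess.run(["vm_stat"], capture_output=True, text=True, check=True).stdout
page_size = int(re.search(r"page size of (\d+) bytes", out).group(1))

def pages(label: str) -> int:
    match = re.search(rf"{re.escape(label)}:\s+(\d+)", out)
    return int(match.group(1)) if match else 0

stored = pages("Pages stored in compressor")      # logical pages held compressed
occupied = pages("Pages occupied by compressor")  # physical pages actually used

print(f"Compressed (logical): {stored * page_size / 2**30:.2f} GiB")
print(f"Compressor footprint: {occupied * page_size / 2**30:.2f} GiB")
if occupied:
    print(f"Approximate compression ratio: {stored / occupied:.1f}x")
```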
Yes, other M.2 SSDs have comparable performance when swapping, and other operating systems compress memory, too — though I believe not as much as MacOS.
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM and don't need swapping at all.
Degrading the SSD is a good point. This is thankfully a work laptop so I don't care if it lives or dies, but it's something I'll have to consider when I eventually get my own Mac.
It looks like there are different versions of the 'Pro' based on core count and memory bandwidth. I'm assuming the 12-core Mac mini M4 Pro has the same memory bandwidth/channels enabled as the 14-core MBP M4 Pro, enabling the 64GB. My guess is it's related to binning and/or TDP.
> All MacBook Pro models feature an HDMI port that supports up to 8K resolution, a SDXC card slot, a MagSafe 3 port for charging, and a headphone jack, along with support for Wi-Fi 6E and Bluetooth 5.3.
No Wifi 7. So you get access to the 6 GHz band, but not some of the other features (preamble puncturing, OFDMA):
The iPhone 16s do have Wifi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
I was quite surprised by this discrepancy as well (my new iPhone has 7, but the new MBP does not).
I had just assumed that for sure this would be the year I upgrade my M1 Max MBP to an M4 Max. I will not be doing so knowing that it lacks WiFi 7; as one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have WiFi7 access points), and I download terabytes of data in some weeks for the work I do, and not having to Ethernet in at a fixed desk to do so efficiently will be a big enough win that I will wait another year before shelling out $6k “off-cycle”.
Big bummer for me. I was looking forward to performance gains next Friday.
Hm why? Is 6E really so much worse than 7 in practice that 7 can replace wired for you but 6E can't? That's honestly really weird to me. What's the practical difference in latency, bandwidth or reliability you've experienced between 6E and 7?
I don't have any 6E devices so I can't really tell for sure, but from what I read, 6E gets you to a bit over 1Gbit in real-world scenarios. 7 should be able to replace my 2.5GbE dongle, or at least get much closer to it. I already have WiFi 7 Eero routers on a 2.5GbE wired backbone.
I guess it makes sense if what you do is extremely throughput-focused... I always saw consistency/reliability and latency as the benefits of wired compared to wireless, the actual average throughput has felt fast enough for a while on WiFi but I guess other people may have different needs
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
> It looks like few people only are using Wifi 7 for now.
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
Yeah, this threw me as well. When the iMac didn't support WiFi 7, I got a bit worried. I have an M2, so I'm not going to get this, but my spouse needs a new Air, and I figured that everything would have WiFi 7 by then; now I don't think so.
Faster is always nice, makes sense. But do you really need WiFi 7 features/speed? I don't know when I would notice a difference (on a laptop) between 600 or 1500 Mbit/s (just as an example). Can't download much anyhow as the storage will get full in minutes.
wifi 6 can’t do shit given that there’s crazy overlap if you go for 80MHz channels. Most situations tap out around 700Mbps over usually a 40MHz channel.
Wifi 6e/7 6GHz band with a bunch of non overlapping 160MHz channels is where the juice is at. But even then a lot of devices are limited to smaller channel widths.
Can anyone comment on the viability of using an external SSD rather than upgrading storage? Specifically for data analysis (e.g. storing/analysing parquet files using Python/duckdb, or video editing using divinci resolve).
Also, any recommendations for suitable ssds, ideally not too expensive? Thank you!
Don't bother with Thunderbolt 4; go for a USB 4 enclosure instead - I've got a Jeyi one. Any SSD will work; I use a Samsung 990 Pro inside. It was supposed to be the fastest you can get - I get over 3000MB/s.
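If you want to sanity-check throughput numbers like that on your own enclosure, here is a minimal sequential read/write sketch (the mount path is hypothetical; F_NOCACHE is the macOS hint to bypass the file cache so the read isn't served from RAM):

```python
# Minimal sequential throughput check for an external drive.
# TARGET is a hypothetical mount point -- point it at your own drive.
import fcntl
import os
import time

TARGET = "/Volumes/ExternalSSD/throughput_test.bin"
BLOCK = 8 * 1024 * 1024                      # 8 MiB per write
BLOCKS = 256                                 # 2 GiB total
F_NOCACHE = getattr(fcntl, "F_NOCACHE", 48)  # 48 on Darwin

buf = os.urandom(BLOCK)

# Sequential write, flushed to the device before stopping the clock.
fd = os.open(TARGET, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
start = time.perf_counter()
for _ in range(BLOCKS):
    os.write(fd, buf)
os.fsync(fd)
write_secs = time.perf_counter() - start
os.close(fd)

# Sequential read with caching disabled so we hit the drive, not RAM.
fd = os.open(TARGET, os.O_RDONLY)
fcntl.fcntl(fd, F_NOCACHE, 1)
start = time.perf_counter()
while os.read(fd, BLOCK):
    pass
read_secs = time.perf_counter() - start
os.close(fd)
os.remove(TARGET)

total_mb = BLOCK * BLOCKS / 1e6
print(f"write: {total_mb / write_secs:.0f} MB/s, read: {total_mb / read_secs:.0f} MB/s")
```

It's only a rough measure (no queue depth control, single thread), but it's enough to tell a ~1000 MB/s enclosure from a ~3000 MB/s one.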
With a TB4 case with an NVME you can get something like 2300MB/s read speeds. You can also use a USB4 case which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
I'm a little sus of OWC these days: their drives are way expensive, never get any third-party reviews or testing, and their warranty is horrible (3 years). I previously swore by them, so it's a little disappointing.
Basically any good SSD manufacturer is fine, but I've found that the enclosure controller support is flaky with Sonoma. Drives that appear instantly in Linux sometimes take ages to enumerate in OSX, and only since upgrading to Sonoma. Stick with APFS if you're only using it for Mac stuff.
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
The USB-C ports should be quite enough for that. If you are using a desktop Mac, such as an iMac, Mini, or the Studio and Pro that will be released later this week, this is a no-brainer - everything works perfectly.
I go with the Acasis Thunderbolt enclosure and then pop in an NVMe of your choice, but generic USB drives are pretty viable too ... Thunderbolt can be booted from, while USB can't.
I tried another brand or two of enclosures and they were HUGE, while the Acasis was credit-card sized (except thickness).
I've used a Samsung T5 SSD as my CacheClip location in Resolve and it works decently well! Resolve doesn't always tolerate disconnects very well, but when it's plugged in things are very smooth.
Hopefully in the next days/weeks we’ll see TB5 external enclosures and you’ll be able to hit very fast speeds with the new Macs. I would wait for those before getting another enclosure now.
Afaik the main OEM producer is Winstars, though I could only find sketchy-looking AliExpress sellers so far.
With a thunderbolt SSD you'll think your external drive is an internal drive. I bought one of these (https://www.amazon.com/gp/product/B0BGYMHS8Y) for my partner so she has snappy photo editing workflows with Adobe CC apps. Copying her 1TB photo library over took under 5 min.
I had a big problem with Crucial 4TB SSDs recently, using them as Time Machine drives. The first backup would succeed, the second would fail, and the disk would then be unrepairable in Disk Utility, which would also refuse to format it to non-APFS (and an APFS reformat wouldn't fix it).
I edit all my video content from a USB-attached SSD with Resolve on my MBP.
My only complaint is that Apple gouges you for memory and storage upgrades. (But in reality I don't want the raw and rendered video taking up space on my machine).
This is the first compelling Mac to me. I've used Macs for a few clients, but my muscle memory is very deeply ingrained for Linux desktops. With local LLMs finally on the verge of usability, along with sufficient memory... I might need to make the jump!
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
Asahi Linux won't have all the niceties / hardware support of macOS, but it seamlessly coexists with macOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
IIRC one of the major factors holding back M3 support was the lack of a M3 mini for use in their CI environment. Now that there's an M4 mini hopefully there aren't any obstacles to them adding M4 support
How? What cloud providers offer it? MacStadium and AWS don't.
I guess you could have a physical MBP in your house and connect it to some bring-your-own-infrastructure CI setup, but most people wouldn't want to do that.
How do you imagine that a cloud computing platform designed around running Macs with macOS would work for testing an entirely different OS running on bare metal on hardware that doesn't have a BMC, and usefully catching and logging frequent kernel panics and failed boots?
It's a pretty hard problem to partially automate for setups with an engineer in the room. It doesn't sound at all feasible for an unattended data center setup that's designed to host Xcode for compiling apps under macOS.
Getting M4 Mac mini CI might take a while considering how much they've changed the package: too tall to go into a 1U, smaller so the rack frames need to be redesigned, and the power button is now on the bottom so switch actuation needs finagling.
I miss Linux, it respected me in ways that MacOS doesn't. But maintaining a sane dev environment on linux when my co-workers on MacOS are committing bash scripts that call brew... I am glad that I gave up that fight. And yeah, the hardware sure is nice.
IIRC brew supports linux, but it isn't a package manager I pay attention to outside of some very basic needs. Way too much supply chain security domain to cover for it!
It does, but I prefer to keep project dependencies bound to that project rather than installing them at wider scope. So I guess it's not that I can't use Linux for work, but that I can't use Linux for work and have it my way. And if I can't have it my way anyway, then I guess Apple's way will suffice.
In general for local LLMs, the more memory the better. You will be able to fit larger models in RAM. The faster CPU will give you more tokens/second, but if you are just chatting with a human in the loop, most recent M series macs will be able to generate tokens faster than you can read them.
That also very much depends on model size. For 70B+ models, while the tok/s are still fast enough for realtime chat, it's not going to be generating faster than you can read it, even on Ultra with its insane memory bandwidth.
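As a rough illustration of why RAM is the main constraint here, the quantized weights alone have to fit in memory (plus KV-cache and OS overhead). The numbers below are back-of-the-envelope assumptions, not measurements:

    # Rough sketch: approximate RAM needed just for the model weights.
    def approx_weights_gb(params_billion, bits_per_weight):
        # params * bits-per-weight, converted to gigabytes
        return params_billion * 1e9 * bits_per_weight / 8 / 1e9

    for params in (8, 70):
        for bits in (4, 8, 16):
            print(f"{params}B model @ {bits}-bit ~= {approx_weights_gb(params, bits):.0f} GB")

So a 4-bit 70B model already wants roughly 35 GB before any context, which is why the high-memory configs matter for that class of model.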
NeXTSTEP, which macOS is ultimately based on, is indeed older than Linux (first release was 1989). But why does that matter? The commenter presumably said "Linux" for a reason, i.e. they want to use Linux specifically, not any UNIX-like OS.
Sure. But not everybody. That’s how I ended up on a Mac. I needed to develop for Linux servers and that just sucked on my windows laptop (I hear it’s better now?). So after dual booting fedora on my laptop for several months I got a MacBook and I’ve never looked back.
BSD is fun (not counting MacOS in the set there), but no, my Unix experiences have been universally legacy hardware oversubscribed and undermaintained. Not my favorite place to spend any time.
It seems they also updated the base memory on the MacBook Air:
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
Wow, I didn't expect them to update the older models to start at 16GB and no price increase. I guess that is why Amazon was blowing the 8GB models out at crazy low prices over the past few days.
Costco was selling MB Air M2 8 GB for $699! Incredible deal.
I’ve been using the exact model for about a year and I rarely find limitations for my typical office type work. The only time I’ve managed to thermally throttle it has been with some super suboptimal Excel Macros.
See that’s the thing. Given that somehow you need 1TB to get the matte screen, I feel like Apple is using it as a way to upsell. It would indicate that perhaps Apple won’t offer a matte MacBook Air.
They did? The tweet that announced stuff from the head of marketing did not mention 3 days.
That said, I believe you. Some press gets a hands-on on Wednesday (today) so unless they plan to pre-announce something (unlikely) or announce software only stuff, I think today is it.
"This is a huge week for the Mac, and this morning, we begin a series of three exciting new product announcements that will take place over the coming days," said Apple's hardware engineering chief John Ternus, in a video announcing the new iMac.
That's disappointing. I was expecting a new Apple TV because mine needs replacement and I don't really feel inclined to get one that's due for an upgrade very soon.
The current-gen Apple TV is already overpowered for what it does, and extremely nice to use. I can think of very few changes I would like to see, and most of them are purely software.
Mine has 128GB of onboard storage... but Apple still bans apps from downloading video, which annoys me.
The streaming apps virtually all support downloading for offline viewing on iPhone, but the Apple TV just becomes a paperweight when the internet goes out, because I'm not allowed to use the 128GB of storage for anything.
If they're not going to let you use the onboard storage, then it seems unlikely for them to let you use USB storage. So, first, I would like them to change their app policies regarding internal storage, which is one of the purely software improvements I would like to see.
I use a dedicated NAS as a Plex server + Plex app on Apple TV itself for local streaming, which generally works fine. Infuse app can also index and stream from local sources.
But there are some cases like e.g. watching high-res high-FPS fractal zoom videos (e.g. https://www.youtube.com/watch?v=8cgp2WNNKmQ) where even brief random skipped frames from other things trying to use WiFi at the same time can be really noticeable and annoying.
It would make more sense to discontinue the smaller model along with some other updates to the line. Or in other words, Air won't receive any other updates this week unfortunately.
It'll be interesting to see the reaction of tech commentators about this. So many people have been screaming at Apple to increase the base RAM and stop price gouging their customers on memory upgrades. If Apple Intelligence is the excuse the hardware team needed to get the bean counters on board, I'm not going to look a gift horse in the mouth!
It wouldn't surprise me if people typically use more storage on their phone than their computer. The phone should probably have a higher base storage than the base storage of their laptops.
Extremely interesting point. My initial reaction to your comment is that it is a crazy thing to say, but the more I think about it the more I agree with you. On my phone is where I have tons of 4k 30/60FPS videos, high resolution photos (with live), and videos downloaded on Netflix and YouTube.
On my Mac I don't have any of these things, it's mostly for programming and some packages. I'm almost always connected to Wi-Fi (except on planes) so I don't really need any photos or videos.
The only people that I see have high storage requirements on Macs are probably video/media creators? As a programmer I'm totally fine with 512GB, but could probably live with 256GB if I wanted to be super lean.
The only older configs that Apple sells are the M2 and M3 Airs, which were bumped. Everything else is now on M4, or didn't have an 8GB base config (Mac Studio, Mac Pro).
Ohh, good catch. Sneaking that into the MBP announcement. I skimmed the page and missed that. So a fourth announcement couched within the biggest of the three days.
Well, the issue for me with memory on these new models is that the base Max ships with 36GB and NO option to configure more. Getting more memory is gated behind a $300 CPU upgrade (plus the memory cost).
I've seen a lot of people complaining about 8GB but honestly my min spec M1 Air has continued to be great. I wouldn't hesitate to recommend a refurb M1 8GB Air for anyone price conscious.
Thanks to your comment, I persuaded my friend who recently purchased an M3 Air 24GB to ask for the price adjustment, and we got $200 back (where we live, price-drop remuneration is valid for 14 days after the date of DELIVERY).
I think spec-wise the Air is good enough for almost everyone who isn't doing video production or running local LLMs, I just wish it had the much nicer screen that the Pro has. But I suppose they have to segregate the product lines somehow.
Yeah, this one caught me off guard. We just purchased a MacBook Air in the last month and if we bought the same one now, we would save $200. Apple support would not price match/correct that, so we will be returning it and purchasing anew.
It's suspicious that your account created 1 day ago and you say there are more powerful and more battery efficient offers available but don't give an example.
P.S. writing to you from a six year old ThinkPad running Linux. I'm not an Apple fanboy. It is my opinion that Apple's products are leagues ahead of the rest of the industry and it's not even close: performance, efficiency, and build quality are incredible.
I'm torn between the new M4 MBP and a Framework laptop with Linux for my personal computer. Can you share some deciding factors for you? Does it mostly come down to the OS? How is the battery life with the Framework?
I dunno. Four years ago the MacBook M1 Pro was able to compile WebKit nearly as fast as the Mac Pro, and nearly twice as fast as the Intel-based MacBook Pro, and still had 91% battery versus 24% in the Intel-based MacBook Pro. Incredible.
My one concern is that nano-texture Apple displays are a little more sensitive to damage, and even being super careful with my MBPs I get little marks from the keyboard when I carry the laptop with my hand squeezing the lid and bottom (a natural carry motion).
A piece of paper also kind of works, but it's a bit ridiculous to constantly keep that around with you and try to reach 95% consistency. Eventually I stopped trying because it was just more infrastructure to protect the thing than I'd prefer.
Love the nano-texture on the Studio Display, but my MacBooks have always suffered from finger oil rubbing the screen from the keys. Fingerprint oil on nano-texture sounds like a recipe for disaster.
For my current laptop, I finally broke down and bought a tempered glass screen protector. It adds a bit of glare, but wipes clean — and for the first time I have a one-year-old MacBook that still looks as good as new.
I put a thin screen cleaner/glasses cleaner cloth on the keyboard whenever I close the lid. That keeps the oils off the screen as well as prevents any pressure or rubbing from damaging the glass.
If your goal is to sell more MBPs (and this is marketing presentation) then, judging by the number of comments that have the phrase "my M1" and the top comment, it seems like M1 vs M4 is the right comparison to make. Too many people are sticking with their M1 machines. Including me.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case the machine performance is not my productivity bottleneck. HN on the other hand... That one needs to be attenuated. :)
It does and it gets even worse when you realize those stats are only true under very specific circumstances, not typical computer usage. If you benchmarked based on typical computer usage, I think you'd only see gains of 5% or less.
Anyone know of articles that deep-dive into the "snappiness" or "feel" of computer experiences?
Everyone knows SSDs made a big difference in user experience. For the CPU, normally if you aren't gaming at high settings or "crunching" something (compiling or processing video etc.) then it's not obvious why CPU upgrades should be making much difference even vs. years-old Intel chips, in terms of that feel.
There is the issue of running heavy JS sites in browsers but I can avoid those.
The main issue seems to be how the OS itself is optimized for snappiness, and how well it's caching/preloading things. I've noticed Windows 10 file system caching seems to be not very sophisticated for example... it goes to disk too often for things I've accessed recently-but-not-immediately-prior.
Similarly when it comes to generating heat, if laptops are getting hot even while doing undemanding office tasks with huge periods of idle time then basically it points to stupid software -- or let's say poorly balanced (likely aimed purely at benchmark numbers than user experience).
So far I’m only reading comments here from people wowed by a lot of things the M3 pretty much also had. Not seeing anything new besides “a little bit better specs”.
The M4 is architecturally better than the M3, especially on GPU features IIRC, but you’re right it’s not a total blow out.
Not all products got the M3, so in some lines this week is the first update in quite a while. In others like MBP it’s just the yearly bump. A good performing one, but the yearly bump.
I have to admit, 4 generations in, 1.8x is decent but slightly disappointing all the same.
I'd really like to justify upgrading, but a $4k+ spend needs to hit greater than 2x for me to feel it's justified. 1.8x is still "kind of the same" as what I have already.
wot, m8? Only Apple will call a 12 megapixel camera “advanced”. Same MPs as an old iPhone 6 rear camera.
Aside from that, it’s pretty much the same as the prior generation. Same thickness in form factor. Slightly better SoC. Only worth it if you jump from M1 (or any Intel mbp) to M4.
Would be godlike if Apple could make the chip swappable. Buy a Mac Studio M2 Ultra Max Plus. Then just upgrade SoC on an as needed basis.
Would probably meet their carbon neutral/negative goals much faster, and reduce e-waste. Unfortunately this is an American company that's got to turn a profit. Profit over environment and consumer interests.
I feel like if they pushed Win32/gaming on Apple Mx hardware, it'd give people at least one reason to adopt or upgrade their devices to new models. I know for sure I'd be on board if everything that runs on my Steam Deck ran on a Mac, game-wise; that's what's holding me back from dropping the cash. I still think I'll get a mini though.
Valve is trying to obsolete Windows, so they can prevent Microsoft from interfering with Steam. Apple could team up with them, and help obsolete Windows for a very large percentage of game-hours.
There will always be a long tail of niche Windows games (retro + indie especially). But you can capture the Fortnite (evergreen) / Dragon Age (new AAA) audience.
My only explanations for the lack of gaming support (see historical lack of proper OpenGL support) while still supporting high end graphics use cases (film editing, CAD, visual effects) are:
1) Either Apple wants to maintain the image of the Macbook as a "serious device", and not associate itself with the likes of "WoW players in their mom's basement".
2) Microsoft worked something out with Apple, where Apple would not step significantly on the gaming market (Windows, Xbox). I can't think of another reason why gaming on iOS would be just fine, but abysmal on MacOS. Developers release games on MacOS _despite_ the platform.
Steve Jobs was historically against gaming on apple devices and, I believe, went so far as to try to remove them from the Apple Store. Apple is only recently starting to introduce gaming seriously back into the platform.
Would be incredibly fascinating to consider what if Bungie had never been bought by Microsoft and Halo had ended up a Mac title first. It would've severely capped the influence of the game (and maybe its quality), even after it was eventually ported to PC. Would Halo have even been ported to Xbox? On the flip side, if it had somehow managed to capture considerable success, would it have forced Jobs and Apple to recognize the importance of the gaming market? Either way, the entire history of video games would be altered.
Yeah, for quite some time I played a lot of WoW because it was one of the very few games I liked / had for MacOS, and what I had was a mac. That and a really old version of heroes of might and magic N. (Maybe 3?)
You’re comparing cameras against different product segments.
Laptop cameras are significantly smaller in all dimensions than phone cameras. Most laptop cameras are 1-4MP. Most are 720p (1MP), and a few are 1080p (2MP). The previous MacBook was 1080p
For reference, a 4k image is 8MP.
12MP is absolutely a massive resolution bump, and I’d challenge you to find a competitive alternate in a laptop.
Especially because pixel count is a meaningless metric by itself. 12MP is the same resolution as a Nikon D3; if this camera could replicate that camera's results, I would be happy!
Absolutely, I have a 1080p Logitech webcam that gives worse quality in anything other than perfect lighting, whereas my 720p Intel Mac Air webcam gives grainier (but at least usable) quality in most conditions. Sensor quality and software have to be good too or it's just a spec pissing contest.
Megapixels is nothing more than the number of sample points. There's so much more to image quality than the number of samples.
I blame the confusion on PC & Android marketing people, who pushed for years and years the idea that the higher the megapixel count, the better the camera. Non-Apple customers should be really pissed off about the years of misinformation and indoctrination on a false KPI.
The marketing gimmicks pushed generations of devices to optimize for meaningless numbers. At times, even Apple was forced to adopt those. Such a shame.
More megapixels on a tiny sensor does not make it more advanced. At a certain point it only makes it worse. That doesn't tell you anything about the quality of the image. There is way more to digital cameras than pixel count.
I’m really excited about the nano-texture display option.
It’s essentially a matte coating, but the execution on iPad displays is excellent. While it doesn’t match the e-ink experience of devices like the Kindle or ReMarkable, it’s about 20-30% easier on the eyes. The texture also feels great (even though that's less relevant for a laptop), and the glare reduction is a welcome feature.
I prefer working on the MacBook screen, but I nearly bought an Apple Studio Display XDR or an iPad as a secondary screen just for that nano-texture finish. It's super good news that this is coming to the MacBook Pro.
Do you actually have to wipe the screen with the included special cloth? The screen on all of the macbooks that I've had usually get oily patches because of the contact with keycaps, so I have to wipe the screen regularly.
I have Pro Display XDR with nano coating and the manual definitely says to only use their special cleaning cloth (or with some isopropyl alcohol). The standard coating might not need it though.
How is the contrast? The HDR content? Any downsides?
I will upgrade to the M4 Pro and I really hate the glare when I travel (which I do a lot), but at the same time I don't want to lose any of the quality that the MBP delivers, which is quite excellent imho.
I have a 16" M1 Pro with 16 gigs of ram, and it regularly struggles under the "load" of Firebase emulator.
You can tell not because the system temp rises, but because suddenly Spotify audio begins to pop, constantly and irregularly.
It took me a year to figure out that the system audio popping wasn't hardware and indeed wasn't software, except in the sense that memory (or CPU?) pressure seems to be the culprit.
This kind of sounds like someone is abusing perf cores and high priority threading in your stack. iirc, on MacOS audio workgroup threads are supposed to be scheduled with the highest (real time) priority on p cores, which shouldn't have issues under load, unless someone else is trying to compete at the same priority.
There is some discussion online on whether this happens when you have a Rosetta app running in the background somewhere (say a util you got via Homebrew, for example).
Even when I remove all "Intel" type apps in activity monitor, I still experience the issue though.
This happens whenever I load up one of our PyTorch models on my M1 MBP 16gb too. I also hate the part where if the model (or any other set of programs) uses too much RAM the whole system will sometimes straight up hang and then crash due to kernel watchdog timeout instead of just killing the offender.
This is almost completely undocumented outside of this HN post (check Google). Can you tell us more about what this does and how it is intended to be used? Thanks!
I’ve had something similar happen as a bug when I was using a Python audio library and calling NumPy functions inside its stream callback. It took me a long time to figure out that NumPy subroutines that drop the GIL would cause the audio stream to stall.
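For anyone curious, here is a minimal sketch of that failure mode and the usual fix. The comment above doesn't name the library, so the sounddevice package here is an assumption: heavy NumPy work happens on a worker thread, and finished blocks are handed to the realtime callback through a queue, so the callback itself only copies data.

    # Sketch (assumes the sounddevice and numpy packages): keep NumPy crunching
    # out of the realtime audio callback by precomputing blocks on another thread.
    import queue
    import threading
    import numpy as np
    import sounddevice as sd

    SAMPLE_RATE = 48_000
    BLOCK = 1024
    blocks = queue.Queue(maxsize=8)

    def producer():
        # Heavy NumPy work belongs here, off the realtime audio thread.
        t = 0
        while True:
            tone = np.sin(2 * np.pi * 440 * (np.arange(BLOCK) + t) / SAMPLE_RATE)
            blocks.put(tone.astype(np.float32).reshape(-1, 1))  # blocks if the queue is full
            t += BLOCK

    def callback(outdata, frames, time, status):
        # Realtime callback: only copy precomputed data, never crunch numbers here.
        try:
            outdata[:] = blocks.get_nowait()
        except queue.Empty:
            outdata.fill(0)  # underrun: emit silence instead of stalling the stream

    threading.Thread(target=producer, daemon=True).start()
    with sd.OutputStream(samplerate=SAMPLE_RATE, blocksize=BLOCK, channels=1, callback=callback):
        sd.sleep(2000)  # play a 440 Hz tone for two seconds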
Whoa! I've been so annoyed by this for years, so interesting that you figured it out. It's the kind of inelegance in design that would have had Steve Jobs yelling at everyone to fix, just ruins immersion in music and had no obvious way to fix.
That sounds like an app issue, it might be doing non-realtime-safe operations on a realtime thread. But generally speaking, if you have an issue, use feedback assistant.
To run LLMs locally (Ollama/LLM Notebook), you want as much memory as you can afford. For actually training toy models yourself for learning/experiments in my experience it doesn't matter much. PyTorch is flexible.
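As an aside, the code side of this is trivial; the hardware question is almost entirely about fitting the model in unified memory. A minimal sketch against a local Ollama server, assuming it is running on its default port and that a model such as llama3.1 has already been pulled:

    # Sketch: one-shot generation against a local Ollama server.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's local HTTP API
        json={
            "model": "llama3.1",  # assumes `ollama pull llama3.1` was run beforehand
            "prompt": "In one sentence, why does unified memory help local LLMs?",
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["response"])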
I love mine; it has a fresh OEM battery as well. Runs the latest OS with OpenCore Legacy. But it's starting to get a bit annoying: usable, but starting to feel slowish, and the fan kicks up frequently.
I might still keep it another year or so, which is a testament to how good it is and how relatively little progress has happened in almost 10 years.
If I still had my 2015 I would have applied some liquid metal TIM by now, I did a paste refresh and that worked very well to get the fan under control.
Did you disassemble the fan assembly? And apply new thermal paste?
I'm not using mine any more, but I noticed a big difference when I replaced the battery and got all the dust out this spring. Still quite a hard sell on the used market :)
If it's got a full function row, it will probably work just fine under Linux. My 2014 MBP chugged pretty hard with OpenCore but handles modern Linux distros much better.
Which MacOS version? I upgraded to a newer one and it crawled to a halt, it's unusable now. UI is insanely laggy. It's sitting in a drawer gathering dust now
That's annoying. I really want to fully remove lightning connectors from my life, but, my existing magic* devices work fine and will probably work fine for another decade or two.
I don’t think it will “feel” much faster like the Intel -> M1 where overall system latency especially around swap & memory pressure got much much better.
If you do any amount of 100% CPU work that blocks your workflow, like waiting for a compiler or typechecker, I think M1 -> M4 is going to be worth it. A few of my peers at the office went M1->M3 and like the faster compile times.
Like, a 20 minute build on M1 becoming a 10 minute build on M4, or a 2 minute build on M1 becoming a 1 minute build on M4, is nothing to scoff at.
I guess it’s only worth it for people who would really benefit from the speed bump — those who push their machines to the limit and work under tight schedules.
I myself don’t need so much performance, so I tend to keep my devices for many, many years.
I have a MBP M1 16GB at home, and a MBP M3 128GB at work. They feel the same: very fast. When I benchmark things I can see the difference (or when fiddling with larger LLM models), other than that, the M1 is still great and feels faster and more enjoyable than any Windows machine I interact with.
I do a lot of (high-end mirrorless camera, ~45MP, 14 bits/pixel raw files) photo processing. There are many individual steps in Photoshop, Lightroom, or various plug-ins that take ~10 seconds on my M1 Max MBP. It definitely doesn't feel fast. I'm planning to upgrade to one of these.
No support for M3 or M4 powered machines currently.
> All Apple Silicon Macs are in scope, as well as future generations as development time permits. We currently have support for most machines of the M1 and M2 generations.[^1][^2]
Btw, there is a recent interview with an Asahi dev focusing on GPUs, worth a listen for those interested in Linux on Apple Silicon. The reverse-engineering effort required to pin down the GPU hardware was one of the main topics.
For many years I treated Windows or macOS as a hypervisor - if you love Linux but want the Mac hardware, instant sleep & wake, etc, putting a full screen VM in Parallels or similar is imo better than running Linux in terms of productivity, although it falls short on “freedom”.
I do the same thing, but there are two big caveats:
1. Nested virtualization doesn't work in most virtualization software, so if your workflow involves running stuff in VMs it is not going to work from within another VM. The exception is apparently now the beta version of UTM with the Apple Virtualization backend, but that's highly experimental.
2. Trackpad scrolling is emulated as discrete mouse wheel clicks, which is really annoying for anyone used to the smooth scrolling on macOS. So what I do is use macOS for most browsing and other non-technical stuff but do all my coding in the Linux VM.
$ swift repl
Welcome to Apple Swift version 6.0.2 (swiftlang-6.0.2.1.2 clang-1600.0.26.4).
Type :help for assistance.
1> import Virtualization
2> VZGenericPlatformConfiguration.isNestedVirtualizationSupported
$R0: Bool = false
Has anyone tried it recently, specifically the trackpad? I tried the Fedora variant a few months ago on my M1 MacBook and the trackpad was horrible to use; it felt totally foreign and wrong.
I feel you, but Apple's trackpad prowess is not an easy thing to copy. It's one of those things I never expect anyone else to replicate, given the level of deep integration between the hardware and the software.
It's 2024, and I still see most Windows users carrying a mouse to use with their laptop.
It pains me deeply that they used Autodesk Fusion in one of the app screenshots. It is by far the worst piece of software I use on Mac OS.
Wish the nano-texture display was available when I upgraded last year. The last MacBook I personally bought was in 2012 when the first retina MBP had just released. I opted for the "thick" 15" high-res matte option. Those were the days...
Nice to see they increased the number of performance cores in the M4 Pro, compared to the M3 Pro. Though I am worried about the impact of this change on battery life on the MBPs.
Another positive development was bumping up baseline amounts of RAM. They kept selling machines with just 8 gigabytes of RAM for way longer than they should have. It might be fine for many workflows, but feels weird on “pro” machines at their price points.
I’m sure Apple has been coerced to up its game because of AI. Yet we can rejoice in seeing their laptop hardware, which already surpassed the competition, become even better.
I'm curious why they decided to go this route, but glad to see it. Perhaps ~4 efficiency cores is simply just enough for the average MBP user's standard compute?
In January, after researching, I bought an apple restored MBP with an M2 Max over an M3 Pro/Max machine because of the performance/efficiency core ratio. I do a lot of music production in DAWs, and many, even Apple's Logic Pro don't really make use of efficiency cores. I'm curious about what restraints have led to this.. but perhaps this also factors into Apple's choice to increase the ratio of performance/efficiency cores.
> Perhaps ~4 efficiency cores is simply just enough for the average MBP user's standard compute?
I believe that’s the case. Most times, the performance cores on my M3 Pro laptop remain idle.
What I don’t understand is why battery life isn’t more like that of the MacBook Airs when not using the full power of the SOC. Maybe that’s the downside of having a better display.
> Curious how you're measuring this. Can you see it in Activity Monitor?
I use an open source app called Stats [1]. It provides a really good overview of the system on the menu bar, and it comes with many customization options.
It's hard to imagine any reason why I would not want to keep upgrading to a new MBP every few years -- my M3 MBP is by far the best laptop I've owned thanks to the incredible battery life.
Of course I'm rooting for competition, but Apple seems to be establishing a bigger and bigger lead with each iteration.
I don’t see the yearly releases as saying you have to upgrade. Rather, having a consistent cadence makes it easier for the supply chain, and the short iteration time means there’s less pressure to rush something in half-baked or delay a release.
My M1 laptop from early 2022 is too good for me to care about upgrading right now, I loaded it up with 64GB ram and it's still blazing. What benefit would I really notice? My heavy apps loading a couple of seconds faster?
I agree. I also have an M1 mini with 8GB of RAM and while I wish I'd bought one with more RAM, it works nicely when the task doesn't require much RAM. The incremental benefits of the M2-M4 are nice, but they do not obsolete the M1 by any stretch of the imagination.
Insane cost for the amount of storage and RAM. I mean, year over year for Apple, awesome! Compared to the rest of the brands... so ridiculously expensive. Watching the price climb to 5K as you add in the new normal for hardware specs is absurd.
What’s amazing is that in the past I’ve felt the need to upgrade within a few years.
New video format or more demanding music software is released that slows the machine down, or battery life craters.
Well, I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically of course. But this is the first machine that even after several years I’ve not caught myself once wishing that it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
Apple's M1 came at a really interesting point. Intel was still dominating the laptop game for Windows laptops, but generational improvements felt pretty lame. A whole lot of money for mediocre performance gains, high heat output and not very impressive battery. The laptop ecosystem changed rapidly as not only the Apple M1 arrived, but also AMD started to gain real prominence in the laptop market after hitting pretty big in the desktop and data center CPU market. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile too in the meantime. Their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
> Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows of course, but this is a much more modern machine than my MacBook Air (M1), and at times it feels almost 10 years old in comparison, despite being 3-4 years newer.
It's true that Linux may be a bit better in some cases, if you have a system that has good Linux support, but I think in most cases it should never make a very substantial difference. On some of the newer Intel laptops, there are still missing power management features anyways, so it's hard to compare.
That said, Intel still has yet to catch up to AMD on efficiency unfortunately, they've improved generationally but if you look at power efficiency benchmarks of Intel CPUs vs AMD you can see AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can also confirm that these devices are rarely good showcases for the chipsets inside of them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, I wouldn't compare a MacBook, even a MacBook Air, a laptop, with a Surface Pro, a 2-in-1 device. Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
I changed the wording to be "booting directly" to clarify that I'm not including VMs. If I have to explain why that matters I guess I can, but I am pretty sure you know.
If the roles were reversed would you still need an explanation? e.g. If I could run macOS inside of a VM on Windows and run things like Final Cut and XCode with sufficient performance, would you think there's no benefit to being able to boot macOS natively?
Booting natively means you need real drivers, which don't exist for Windows on Mac as well as for macOS on PC. It'd be useless. Just use the VM, it's good.
And it's not the same - running Windows natively on Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use best of both.
The question was a hypothetical. What if the macOS VM was perfect? If it was perfect, would it then not matter if you couldn't just boot into macOS?
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac, just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability to play any random Windows game. I have a Parallels and VMware Fusion license (well... had, anyway), and I'm a long time (20 years) Linux user, I promise that I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
I agree. I would like to be able to use any hardware to its full potential with any OS, even if the OS is running as a VM inside another OS. That's more difficult to pull off due to needing to then run both OSs at once. So then at least let me install the OS I want directly on the hardware and legally use any other OS in a VM with as much performance as possible.
There is nothing stopping you, technically or legally, from replacing the OS on a Mac. Apple went out of their way to make it possible (compared to devices with Qualcomm chips, for example) and the platform is reasonably compatible with PC.
The point of this whole thing is that practically speaking, it matters to the person deciding to buy a computer as to whether they can feasibly install their OS of choice on it. It stands to reason then, that a downside of buying a Mac computer is that you can not practically run Windows natively on a modern Mac. In practice, it does not matter who's fault this is.
Aside: Really, it's a combination of factors. First, Apple uses a bespoke boot chain, interrupt controller, etc. instead of UEFI and following ARM SystemReady standards like virtually all of the other desktop and server-class ARM machines, and didn't bother with any interoperability. The boot process is absolutely designed just to be able to boot XNU, with tiny escape hatches making it slightly easier to jam another payload into it. On the other hand, just out of pure coincidence, Windows apparently statically links the HAL since Windows 10 version 2004, making it impossible for a straight port to be done anymore. In any case, the Apple Silicon computers are designed to boot macOS, and "went out of their way to make it possible" is an absurd overstatement of what they did. What they did was "do the absolute minimum to make it possible without doing anything to make it strictly impossible." Going out of their way implies they actually made an effort to make it possible, but officially as far as I know Apple has only ever actually acknowledged virtual machines.
I think it would be fair to argue that the reverse is true, too: If you choose to buy a PC, you will be stuck with Windows, or an alternative PC operating system. (Of course, usually a Linux distribution, but sometimes a *BSD, or maybe Illumos. Or hell, perhaps Haiku.) That said, objectively speaking Windows has more marketshare and a larger ecosystem, for better or worse, so the number of people who strictly need and strictly want Windows is going to naturally be higher than the comparative numbers for macOS. This doesn't imply one is better than the other, but it still matters if you're talking about what laptop to buy.
> the platform is reasonably compatible with PC.
Not sure what you mean here. The Apple Silicon platform has basically nothing in common with the x64 PC. I guess it has a PCI express bus, but even that is not attached the same way as any typical x64 PC.
The Apple Silicon platform is actually substantially similar to the iOS platform.
> compared to devices with Qualcomm chips, for example
Also not sure what this is meant to mean, but with the Snapdragon X Elite platform, Qualcomm engineers have been working on upstream Linux support for a while now. In contrast I don't think Apple has contributed or even publicly acknowledged Asahi Linux or any of the Linux porting efforts to Apple Silicon.
If we're mostly concerned about CPU grunt, it's really hard to question the Ryzen 7040, which like the M1, is also not the newest generation chip, though it is newer than the M1 by a couple of years. Still, comparing an M1 MacBook Pro with a Framework 16 on Geekbench:
Both of these CPUs perform well enough that most users will not need to be concerned at all about the compute power. Newer CPUs are doing better but it'd be hard to notice day-to-day.
As for other laptop features... That'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for sake of argument, I think generally the strongest options have been from ASUS. Right now, the Zephyrus G16 has been reviewing pretty good, with people mostly just complaining that it is too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
Ultimately it'll be subjective, but the fans don't really spin up on my Framework 16 unless I push things. Running a game or compiling on all cores for a while will do the trick. The exact battery life, thermals and noise will be heavily dependent on the laptop; the TDP of modern laptop CPUs is probably mostly pretty comparable so a lot of it will come down to thermal design. Same for battery life and noise, depends a lot on things other than the CPU.
Our Thinkpads are definitely hotter; the fans spin up routinely on the Ryzens but never on the M1. Battery life is infinitely better on the M1.
The other thing I hate about the Thinkpads is that the build/screen/trackpad quality sucks in comparison to the Apple stuff. And for all the griping about Mac OS on this site, Windows is way worse - you can tell MS's focus is on linux in the cloud these days. All the ancillary stuff Apple is good at is underappreciated.
I only do coding & browsing so maybe I'm a weak example but I find this even with my pretty old Intel laptops these days.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
Depends. I used to offload almost all compilation tasks, but now I only really do this if it's especially large. If I want to update my NixOS configuration I don't bother offloading it anymore. (NixOS isn't exactly Gentoo or anything, but I do have some overrides that necessitate a decent amount of compilation, mainly dogfooding my merge requests before they get merged/released.)
>Laptops in general are just better than they used to be, with modern CPUs and NVMe disks.
I've had my XPS 13 since 2016. Really the only fault I have with it nowadays is that 8GB of RAM is not sufficient to run IntelliJ anymore (hell, sometimes IntelliJ even bogs down my 16GB MBP).
Now, I've also built an absolute beast of a workstation with a 7800X3D, 64GB RAM, 24GB VRAM and a fast SSD. Is it faster than both? Yeah. Is my old XPS slow enough to annoy me? Not really. YouTube has been sluggish to load/render lately, but I think that's much more Google making changes to degrade the Firefox/uBlock experience than any fault of the laptop.
Regarding Youtube, Google is also waging a silent war against Invidious. It's to the point that even running helper scripts to trick Youtube isn't enough (yet). I can't imagine battling active and clever adversaries speeds up Youtube page loads as it runs through its myriad checks that block Invidious.
I am on Intel TGL currently and can't wait for Strix Halo next year. That is truly something else, it's nothing we have seen in notebooks before iGPU wise.
I've had a couple of Tiger Lake laptops, a Thinkpad and I believe my Surface Laptop 4. Based on my experience with current AMD mobile chipsets, I can only imagine the Strix Halo will be quite a massive uplift for you even if the generational improvements aren't impressive.
I've owned an M1 MBP base model since 2021 and I just got an M3 Max for work. I was curious to see if it "felt" different and was contemplating an upgrade to M4. You know what? It doesn't really feel different. I think my browser opens about 1 second faster from a cold start. But other than that, no perceptible difference day to day.
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously if opening a browser is the most taxing thing your machine is doing the difference will be minimal. But video or music editing, application-compiling and other intensive tasks, then the upgrade is PHENOMENAL.
FWIW I think that's more the core count than anything. I have a M1 Max as a personal machine and an M3 Max at work and while the M3 Max is definitely faster, it isn't world-beating.
I think most of that difference is going to be the huge increase in performance core count between the base chip and the Max (from 4 to 12). The RAM certainly doesn't hurt though!
My current work machine is M1 Max 64Gb and it's the fastest computer I've ever used. Watching rust code compile makes me laugh out loud it's so quick. Really curious what the newer ones are like, but tbh I don't feel any pressure to upgrade (could just be blissfully ignorant).
I went from 12 to 15 pro max, the difference is significant. I can listen to Spotify while shooting from the camera. On my old iPhone 12, this is not possible.
Test Spotify against YouTube Music (and others) - I personally see no reason for Spotify when I have YouTube Premium, which performs with less overhead.
> I wonder what it will take to make Mac/iOS feel faster
I know, disabling shadows and customisable animation times ;) On a jailbroken phone I once could disable all animation delays, it felt like a new machine (must add that the animations are very important and generally great ux design, but most are just a tad too slow)
16 pro has a specialized camera button which is a game changer for street / travel photography. I upgraded from 13 pro and use that. But no other noticeable improvements. Maybe Apple intelligence summarizing wordy emails.
I think the only upgrade now is from a non-Pro to Pro, since a 120Hz screen is noticeably better than a 60Hz screen (and a borderline scam that a 1000 Euro phone does not have 120Hz).
I realize this isn't your particular use case. But with newer iPhones, you can use USB-C directly for audio. I've been using the Audio Technica ATH-M50xSTS for a while now. The audio quality is exceptional. For Slack/Team/Zoom calls, the sidetone feature plays your voice back inside the headphones, with the level being adjustable via a small toggle switch on the left side. That makes all the difference, similar to transparency/adaptive modes on the AirPod Pro 2s (or older cellphones and landlines).
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
> Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
I have custom scripts, ad blocking without VPNs, and application firewalls.
I've found compile times on large C++ code bases to be the only thing I really notice improving. I recently upgraded my work machine from a 2017 i7 to a shiny new Ryzen 9 9950x and my clean compile times went from 3.5 minutes to 15 seconds haha. When I compile with an M2 Max, it's about 30s, so decent for a laptop, but also it was 2x the price of my new desktop workstation.
Can confirm. I have an M2 Air from work and an M1 Pro for personal, and tbh, both absolutely fly. I haven't had a serious reason to upgrade. The only reason I do kind of want to swap out my M1 Pro is because the 13" screen is a wee small, but I also use the thing docked more often than not so it's very hard to justify spending the money.
The biggest difference I’ve seen is iPad Sidecar mode works far more reliably with the M3 Max than the M1 Max.
There have been incremental improvements in speed and nits too, but having Sidecar not randomly crash once a day once on M3 was very nice.
On the other side, as someone doing a lot of work in the GenAI space, I'm simultaneously amazed that I can run Flux [dev] on my laptop and use local LLMs for a variety of tasks, while also wishing that I had more RAM and more processing power, despite having a top of the line M3 max MBP.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, but thinking it would be fun to play with nonetheless. After 2 years of daily work with these models, I'm increasingly convinced they are going to be disruptive. No AGI, but they will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
> I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP.
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
> I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
I can fairly easily get my M1 Air to have thermal issues while on extended video calls with some Docker containers running, and have been on calls with others having the same issue. Kind of sucks if it's, say, an important demo. I mostly use it as a thin client to my desktop when I'm away from home, so it's not really an issue, but if I were using it as a primary device I'd want a machine with a fan.
I try to avoid docker in general during local dev and luckily it has worked out for me even with microservice architectures. It reduces dramatically CPU and RAM needs and also reduces cycle time.
YouTube shows a small red "HDR" label on the video settings icon for actual HDR content. For this label to appear, the display must support HDR. With your M3 Pro, the HDR label should appear in Chrome and Safari.
You can also right-click on the video to enable "Stats for nerds" for more details. Next to color, look for "smpte2084 (PQ) / bt2020". That's usually the highest-quality HDR video [2,3].
You can ignore claims such as "Dolby Vision/Audio". YouTube doesn't support those formats, even if the source material used it. When searching for videos, apply the HDR filter afterward to avoid videos falsely described as "HDR".
Keep in mind that macOS uses a different approach when rendering HDR content. Any UI elements outside the HDR content window will be slightly dimmed, while the HDR region will use the full dynamic range.
I consider Vivid [4] an essential app for MacBook Pro XDR displays.
Once installed, you can keep pressing the "increase brightness" key to go beyond the default SDR range, effectively doubling the brightness of your display without sacrificing color accuracy. It's especially useful outdoors, even indoors, depending on the lighting conditions. And fantastic for demoing content to colleagues or in public settings (like conference booths).
> With your M3 Pro, the HDR label should appear in Chrome and Safari.
Ahh. Not Firefox, of course.
Thanks, I just ran a random nature video in Safari. It was pretty. The commercials before it were extremely annoying though. I don't think it's even legal here to have so many ads per minute of content as Google inserts on youtube.
Hah, I tried skimming through a 2 hour YouTube video in Safari and every time I fast-forwarded a couple of minutes, Google inserted two ads. Basically I watched ads more than the video.
How can people use anything that doesn't run ublock origin these days?
For me faster refresh rate is noticeable on phone or ipad where you scroll all the time. On a laptop you don't have that much smooth scrolling. For me it's a non issue on laptop, not even once I wished it had faster refresh. While I always notice when switching between Pro and non Pro iPad.
I find 60Hz on the non-Pro iPhone obnoxious since switching to 120Hz screens. On the other hand, I do not care much about 60Hz when it comes to computer screens. I think touch interfaces make low refresh rates much more noticeable.
No doomscrolling at all. Even switching between home screens looks like it's dropping frames left and right (it's not, of course, but that's what it looks like coming from 120Hz). A Galaxy A54 that we still have in the house, which cost just over 300 Euro, feels much smoother than my old iPhone 15 that cost close to 1000 Euro, simply because the A54 has a 120Hz screen.
Even 90Hz (like on some Pixels) is substantially better than the iPhone's 60Hz.
The Galaxy must be new. In my experience Android phones get extremely laggy [1] as they get old and the 120 Hz refresh won't save you :)
I just noticed that I don't really try to follow the screen when I scroll down HN, for example. Yes it's blurry but I seem not to care.
[1] Source: my Galaxy something phone that I keep on my desk for when I do Android development. It has no personal stuff on it, it's only used to test apps that I work on, and even that isn't my main job (nothing since early spring this year for example). It was very smooth when I bought it, now it takes 5+ seconds to start any application on it and they stutter.
A lot of my work can be easily done with a Celeron - it's editing source, compiling very little, running tests on Python code, running small Docker containers and so on. Could it be faster? Of course! Do I need it to be faster? Not really.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
I still use Ivy Bridge and Haswell workstations (with Linux, SSD and discrete GPU) as my daily drivers and for the things I do they still feel fast. Honestly a new Celeron probably beats them performance wise.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
It's so nice being able to advise a family member who is looking to upgrade their intel Mac to something new, and just tell them to buy whatever is out, not worry about release dates, not worry about things being out of date, and so on.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
My 2019 i9 flagship MBP is just so, so terrible, and my wife's M1 MacBook Air is so, so great. I can't get over how much better her computer is than mine.
I would normally never upgrade so soon after getting an M1 but running local LLMs is extremely cool and useful to the point where I'd want the extra RAM and CPU to run larger models more quickly.
I'm bumping from a still-excellent M1 MAX / 64GB to M4 MAX / 128GB, mostly for local GenAI. It gives me some other uplift and also enables me to sell this system while it's still attractive. I'm able to exhaust local 7B models fairly easily on it.
Yep, the same, M1 Pro from 2021. It's remarkable how snappy it still feels years later, and I still virtually never hear the fan. The M-series of chips is a really remarkable achievement in hardware.
I don't think this has anything to do with the hardware. I think we have entered an age where users in general are not upgrading. As such, software can't demand more and more performance. The M1 came out at a time when hardware innovation had mostly stagnated. Default RAM in a laptop has been 16GB for over 5 years; 2 years ago, you couldn't even get more than 16 in most laptops. As such, software hardware requirements haven't changed, so any modern CPU is going to feel overpowered. This isn't unique to M1s.
That’s because today’s hw is perfectly capable of running tomorrow’s software at reasonable speed. There aren’t huge drivers of new functionality that needs new software. Displays are fantastic, cellular speeds are amazing and can stream video, battery life is excellent, UIs are smooth with no jankiness, and cameras are good enough.
Why would people feel the need to upgrade?
And this applies already to phones. Laptops have been slowing for even longer.
Until everything starts running local inference. A real Siri that can operate your phone for you, and actually do things like process cross-app conditions ("Hey Siri, if I get an email from my wife today, notify me, then block out my calendar for the afternoon.") would use those increased compute and memory resources easily.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
That's a very big maybe. The LLM experience locally is currently very very different from the hosted models most people play with. The future is still very uncertain.
I think regretting Mac upgrades is a real thing, at least for me. I got a 32G Mac mini in January to run local LLMs. While it does so beautifully, there are now smaller LLMs that run fine on my very old 8G M1 MacBook Pro, and these newer smaller models do almost all of what I want for NLP tasks, data transformation, RAG, etc. I feel like I wasted my money.
Which ones in particular? I have an M2 air with 8GB, and doing some RAG development locally would be fantastic. I tried running Ollama with llama3.2 and it predictably bombed.
I always catch myself in this same train of thought until it finally re-occurs to me that "no, the variable here is just that you're old." Part of it is that I have more money now, so I buy better products that last longer. Part of it is that I have less uninterrupted time for diving deeply into new interests which leads to always having new products on the wishlist.
In the world of personal computers, I've seen very few must-have advances in adulthood. The only two unquestionably big jumps I can think of off hand are Apple's 5K screens (how has that been ten years?!) and Apple Silicon. Other huge improvements were more gradual, like Wi-Fi, affordable SSDs, and energy efficiency. (Of course it's notable that I'm not into PC gaming, where I know there have been incredible advances in performance and display tech.)
I've had Macs before, from work, but there is something about the M1 Pro that feels like a major step up.
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
I agree with you about not needing to upgrade, but it still stands that IMHO Apple would be better off if there were competition creating a real need to upgrade. (Also, it's really good that Macs now have 16GB of RAM by default.) Since I have my 14.2-inch M1 Max, I believe the only reason I would want to upgrade is that the new ones can be configured with 128GB of RAM, which lets you load newer AI models on device.
The new MacBook Pro does seem to have some quality-of-life improvements: Thunderbolt 5, a 14-megapixel Center Stage camera (it follows you), three USB-C ports on every model, and battery life claims of 22-24 hours. Regardless, if you want a MacBook Pro and you don't have one, there is now an argument for not just buying the previous model.
Same. I used to upgrade every 1.5 years or so. But with every Apple Silicon generation so far I have felt that there are really no good reasons to upgrade. I have a MacBook M3 Pro for work, but there are no convincing differences compared to the M1 Pro.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
And M1 from 4 years ago instead of M3 from last year; while a 2x speed improvement in the benchmarks they listed is good, it also shows that the M series CPUs see incremental improvements, not exponential or revolutionary. I get the feeling - but a CPU expert can correct me / say more - that their base design is mostly unchanged since M1, but the manufacturing process has improved (leading to less power consumption/heat), the amount of cores has increased, and they added specialized hardware for AI-related workloads.
That said, they are in a very comfortable position right now, with neither Intel, AMD, or another competitor able to produce anything close to the bang-for-watt that Apple is managing. Little pressure from behind them to push for more performance.
Their sales pitch when they released the M1 was that the architecture would scale linearly and so far this appears to be true.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of it not being able to maintain the performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
Same feeling. The jump from all the previous laptops I owned to an M1 was an incredible jump. The thing is fast, has amazing battery life and stays cold.
Never felt the need to upgrade.
Probably the next upgrade wave will come from AI features needing more local memory and compute. The software just isn't there yet for everyday tasks, but it's only a question of time, I guess. Of course there will be pressure to do it in the cloud as usual, but local compute will always remain a market.
And it's probably good that at least one of the big players has a business model that supports driving that forward.
IMO Apple Silicon is just that good; I've played No Man's Sky on a new MB Air 13" at 1080p and mid-to-high settings.
An MB Air with an M3 and no fan out-gamed my old GTX 1080 box, which stuttered on NMS-sized games all the time.
Shows just how poorly Intel has done. That company should be razed to the ground figuratively and the infrastructure given to a new generation of chip makers; the last one is clearly out of their element
I also have an M1 Pro MBP and mostly feel the same. The most tempting thing about the new ones is the space black option. Prior to the M1, I was getting a new laptop every year or two and there was always something wrong with them - butterfly keyboard, Touch Bar etc. This thing is essentially perfect though, it still feels and performs like a brand new computer.
I have an MBP M1 Max and the only time I really feel like I need more oomph is when I'm doing live previews and/or rendering in After Effects. I find myself having to clear the cache constantly.
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
> Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
I feel the same about my laptop from 2011, so I guess it is partly age (not feeling the urge to always have the greatest) and partly that non-LLM, non-gaming computing is not demanding enough to force us to upgrade.
I think the last decade had an explosion in the amount of resources browsers needed and used (partly workloads moving over, partly moving to more advanced web frameworks, partly electron apps proliferating).
The last few years Chrome seems to have stepped up energy and memory use, which impacts most casual use these days. Safari has also become more efficient, but it never felt bloated the way Chrome used to.
But this ad is specifically for you! (Well, and those pesky consumers clinging on to that i7!):
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
I feel the same way about my M1 Macbook Air ... it's such a silly small and powerful machine. I've got money to upgrade, I just have no need. It's more than enough for even demanding Logic sessions and Ollama for most 8b models. I love it.
The only reason I'd want to upgrade my M1 Pro MBP is because I kind of need more RAM and storage. The fact that I'm even considering a new laptop just for things that before could have been a trivial upgrade is quite illuminating.
Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
I have a Macbook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 hertz and Wi-Fi 7, a pretty reasonable request I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
So the intel era is not Apple products? Butterfly keyboard is not an Apple invention?
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resell value.
It's expected Intel-based Macs would lose value quickly considering how much better the M1 models were. This transition was bigger than when they moved from PowerPC to Intel.
One complicating factor in the case of the Intel Macs is that an architectural transition happened after they came out. So they will be able to run less and less new software over the next couple of years, and they lack most AI-enabling hardware acceleration.
That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
Similar for me. MacBook Air M1 (8 cpu / 8 gpu; 16 GB RAM)...running in or out of clamshell with a 5k monitor, I rarely notice issues. Typically, if I'm working very inefficiently (obnoxious amount of tabs with Safari and Chrome; mostly web apps, Slack, Zoom, Postman, and vscode), I'll notice a minor lag during a video call while screen sharing...even then, it still keeps up.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16 GB ram) is really an amazing chip. I'm with you, outside of a repair/replacement? I'm happy to wait for 120hz refresh, faster wifi, and longer battery life.
> Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
I have exactly the same experience, usually after 3 years I'm desperate for new Mac but right now I genuinely think I'd prefer not to change. I have absolutely no issues with my M1 Pro, battery and performance is still great.
I feel exactly the same. The one thing that would get me to pull the trigger on a newer one is if they start supporting SVE2 instructions, which would be super useful for a specific programming project I’ve been playing with.
100% agree on this. I've had this thing for 3 years and I still appreciate how good it is. Of course the M4 tingles my desire for new cool toys, but I honestly don't think I would notice much difference with my current use.
Same boat—I'm on a lowly M1 MacBook Air, and haven't felt any need to upgrade (SwiftUI development, video editing, you name it), which is wild for a nearly 4 year-old laptop.
I am replacing a Dell laptop because the case is cracking, not because it's too slow (it isn't lightning fast, of course, but it sure is fast enough for casual use).
I’m using the M3 Air 13 in (splurged for 24 GB of RAM, I’m sure 16 is fine) to make iOS apps in Xcode and produce music in Ableton and it’s been more than performant for those tasks
Only downside is the screen. The brightness sort of has to be maxed out to be readable and viewing at a wrong angle makes even that imperfect
That said it’s about the same size / weight as an iPad Pro which feels much more portable than a pro device
Tbf, the only thing I miss with my M2 MacBook is the ability to run x86_64 VM’s with decent performance locally.
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
Yup, honestly the main reason I'd like to upgrade from my M1 MBA is the newer webcams are 1080p instead of 720p, and particularly much better in low light like in the evening.
If you're in the ecosystem, get an iPhone mount - the image quality is unreal compared to anything short of a fancy DSLR setup. It takes a bit of setup, but not much, thanks to the magnets in the iPhone.
- RAM: 32GB is just not enough. I have Firefox open with a mere 60-ish tabs and RSS is already at 21GB. Add my IDE and a C++ build at -j10 and I start getting OOMs left and right; I have to close my browser whenever I build.
- Graphics: the GPU and compute capabilities are definitely not there when comparing to NVidia mid-range offering, it's more akin to a laptop 2060.
- Can only do 3 video outputs at most while there are enough hardware outputs for 5.
> I have firefox open with a mere 60-ish tabs and RSS is already at 21GB.
Isn't that just Firefox deciding to let things stay in memory because the memory is there? Anyway, Safari seems to run fine with 8GB and will unload inactive tabs.
> building a c++ app at -j10
Strange you're getting OOM errors instead of more swapping. But why insist on running 10 tasks at the same time then?
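If the real constraint is jobs times per-job memory, a rough sketch like this picks a -j value that fits in available RAM (assumes the psutil package and a guessed ~2 GB per heavy C++ compile job; both numbers are placeholders to tune for your codebase):

    # Rough sketch: choose a make -j value that fits in available RAM.
    # PER_JOB_GB is a guess for template-heavy C++; measure and adjust.
    import os
    import subprocess
    import psutil

    GB = 1024 ** 3
    PER_JOB_GB = 2

    def pick_jobs() -> int:
        avail_gb = psutil.virtual_memory().available / GB
        by_mem = max(1, int(avail_gb // PER_JOB_GB))
        return min(os.cpu_count() or 1, by_mem)

    if __name__ == "__main__":
        jobs = pick_jobs()
        print(f"building with -j{jobs}")
        subprocess.run(["make", f"-j{jobs}"], check=True)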
when the hardware wait time is the same as the duration of my impulsive decisions i no longer have a hardware speed problem, i have a software suggestion problem
I got an MBP M1 with 32GB of RAM. It'll probably be another 2-3 years or longer before I feel the pressure to upgrade. I've even started gaming again (something I dropped nearly 20 years ago when I switched to Mac) thanks to GeForce Now. I just don't see the reason.
Frankly though, if the mac mini was a slightly lower price point I'd definitely create my own mac mini cluster for my AI home lab.
I hate to say it but that's like a boomer saying they never felt the need to buy a computer, because they've never wished their pen and paper goes faster. Or a UNIX greybeard saying they don't need a Mac since they don't think its GUI would make their terminal go any faster. If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade. A Macbook M1 can't run half the stuff posted on Hugging Face these days. Even my 128gb Mac Studio isn't nearly enough.
> If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade.
That's me, I don't give a shit about AI, video editing, modern gaming or Kubernetes. The newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most things that are new since Docker and VSCode have not contributed massively to how I work, and most of the things I do could be done just fine 8-10 years ago.
I think the difference is that AI is a very narrow niche/hobby at the moment. Of course if you're in that niche having more horsepower is critical. But your boomer/greybeard comparisons fall flat because they're generally about age or being set in your ways. I don't think "not being into AI image generation" is (currently) about being stuck in your ways.
To me it's more like 3d printing as a niche/hobby.
Playing with them locally? Yes, of course it's a niche hobby. The people doing stuff with them that's not either playing with them or developing not just an "AI" product, but a specific sort of AI product, are just using ChatGPT or some other prepackaged thing that either doesn't run locally, or does, but is sized to fit on ordinary hardware.
If something accounts for less than 1% of all engagement with a category, then yes, it's a niche/hobby.
I get that you're probably joking, but - if I use Claude / ChatGPT o1 in my editor and browser, on an M1 Pro - what exactly am I missing by not running e.g. HF models locally? Am I still the greybeard without realising?
Using the term "bro" assumes that all AI supporters are men. This erases the fact that many women and nonbinary people are also passionate about AI technology and are contributing to its development. By using "AI bro" as an insult, you are essentially saying that women and nonbinary people are not welcome in the AI community and that our contributions don't matter. https://www.reddit.com/r/aiwars/comments/13zhpa7/the_misogyn...
Is there an alternative term you would prefer people use when referring to a pattern of behavior perceived as a combination of being too excited about AI and being unaware (perhaps willfully) that other people can reasonably be much less interested in the hype? Because that argument could definitely benefit from being immune to deflections based on accusations of sexism.
When I see that someone is excited about something, I believe in encouraging them. If you're looking for a more polite word to disparage people who love and are optimistic about something new, then you're overlooking what that says about your character. Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
> Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
That's thoroughly unconvincing. That kind of talk is exactly what so many people are tired of hearing. Especially if it's coming from technically-minded people who don't have any reason to be talking like PR drones.
What makes you think I care about convincing you? These days every shot caller on earth is scrambling to get piece of AI. Either by investing in it or fighting it. You come across as someone who wants to hate on AI. Haters aren't even players. They're NPCs.
So people who aren't obsessed with AI to your liking are:
- boomer luddites
- primitive single-celled organisms
- NPCs
And even people who are enthusiastic about AI but aren't fanatical about running it locally get scorn from you.
I can understand and forgive some amount of confirmation bias leading you to overestimate the importance and popularity of what you work on, but the steady stream of broad insults at anyone who even slightly disagrees with you is dismaying. That kind of behavior is wildly inappropriate for this forum. Please stop.
That’s interesting because I would’ve thought having strong local compute was the old way of thinking. I run huge jobs that consume very large amounts of compute. But the machines doing the work aren’t even in the same state I’m in. Then again maybe I’m even older as I’m basically on the terminal server / mainframe compute model. :)
I work with AI models all day every day, keep up with everything, love frontier tech, I love and breathe LLMs. And I, like OP, haven't seen the need to upgrade from the M1 MBP because it runs the small 1-7B models just fine, and anything bigger I want on some GPU instance anyway, or I want a frontier model which wouldn't run on the newest and biggest MBP. So it's not just us Boomers hating on new stuff, the M series MacBooks are just really good.
Given that models are only going to get larger, and the sheer amount of compute required, I think the endgame here is dedicated "inference boxes" that actual user-facing devices call into. There are already a couple of home appliances like these - NAS, home automation servers - which have some intersecting requirements (e.g. storage for NAS) - so maybe we just need to resurrect the "home server" category.
I agree, and if you want to have the opportunity to build such a product, then you need a computer whose specs today are what a home server would have in four years. If you want to build the future you have to live in the future. I'm proud to make stuff most people can't even run yet, because I know they'll be able to soon. That buys me time to polish their future and work out all the bugs too.
So every user of a computer that doesn't create their own home-grown ML models is a boomer? This can't possibly be a generational thing. Just about everyone on the planet is at a place in their life where they don't make their own AIs.
Eventually as the tools for doing it become better they'll all want to or need to. By then, most computers will be capable of running those tools too. Which means when that happens, people will come up another way to push the limits of compute.
Most retailers have had the older models on closeout for a few weeks now. Best Buy, Amazon and Costco have had the M3 models for a few hundred off depending on models.
Watch SlickDeals. I think it was this time last year where lots of refurbs/2 generation old machines were going for massive discounts. Granted they were M1 machines, but some had 64GB RAM and 4TB drives for like $2700. Microcenter and B&H are good ones to watch as well.
The M-series macbooks depreciate in value far slower than any of the Intel models. M1 base models can still sell for nearly $1k. It's difficult to find a really good deal.
Got the money, are in the consumerism camp: Switch to latest model every year because the camera island changed 5mm.
Got the professional need in games or video and your work isn't covering your device: Switch to new model every couple of generations.
Be me: I want to extend the lifecycle of things I use. Learn how to repair what you own (it's never been easier), be aware of how you can work in today's world (who needs laptop RAM if I can spin up containers in the cloud) - I expect not to upgrade until a similarly stellar step up in the category of Intel to Apple Silicon comes along.
All past Mx versions being mostly compared to Intel baselines: Boring.
M4 1.8 times faster than M1 Pro: Nice, but no QoL change. For the few times I might need it, I can spin up a container in the cloud.
Pick a daily cost you’re comfortable with. If you’re contracting at say $500/day, how much are you willing to spend on having a responsive machine? $10? $20?
Multiply it out: 220 work days a year * $10/day is $2200 a year towards your laptop.
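The same back-of-the-envelope math as a throwaway script, every number a placeholder to swap for your own:

    # Illustrative laptop-budget arithmetic; all figures are placeholders.
    day_rate = 500            # what a contracting day bills
    spend_per_day = 10        # what a responsive machine is worth to you per day
    work_days_per_year = 220

    yearly_budget = spend_per_day * work_days_per_year   # 2200
    print(f"${yearly_budget} a year towards the laptop, "
          f"{spend_per_day / day_rate:.0%} of daily billings")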
On major remodels or with compelling features. I had an i9 MacBook Pro and then upgraded to an M1 MacBook Pro because it was a major leap forward. However, I will wait until the MacBook Pro is redesigned yet again (maybe thinner and lighter, as I travel a lot and carry-on weight is limited), apparently in 2026 or so with OLED and other features, rumors say.
Depends if it is a personal machine or paid by your company. 5+ years is what I generally expect from an apple laptop (been using them since around 2007-2009) if I own. For an M1-3 that could be a bit longer. If it is paid by your company, then whenever you have the budget :)
I update largely based on non performance criteria:
- new display tech
- better wireless connectivity
- updated protocols on ports (e.g., support for higher-res displays and newer DisplayPort/HDMI versions)
- better keyboard
- battery life
Once a few of those changes accumulate over 4+ generations of improvements that’s usually the time for me to upgrade.
My laptops so far: first 2008 plastic macbook, 2012 macbook pro, 2015 macbook pro, and M1 pro 16 currently. I skipped 2016-2020 generation which was a massive step backwards on my upgrade criteria, and updated to 2015 model in 2016 once I realized apple has lost their marbles and has no near plans on making a usable laptop at the time.
Also getting a maxed out configuration really helps the longevity.
The 2014 model I bought in early 2015 still works, though the battery is dodgy. I did get the motherboard replaced in 2020 which was pricey, but much cheaper than a new machine.
Is there some reason your current computer isn't working for you? If not, why upgrade? Use it as long as you can do so practically & easily.
On the other extreme, I knew someone who bought a new MBP with maximum RAM specs each year. She'd sell the old one for a few hundred less than she paid, then she always had new hardware with applecare. It was basically like leasing a machine for $400/yr.
My previous MacBook was a Pro model from 2015. I waited 6 years to finally upgrade to an M1 Air because of the awful Touch Bar models they had in between (though I'm still using the 2015 Pro for personal stuff, in fact right now; it's upgraded to the latest macOS using OpenCore and it still runs great). But I would say upgrade every 3-5 years, depending on how heavy a professional user you are.
Because it made the esc key useless for touch typists and because, as a vi user, I hit esc approximately a bazillion times per day I mapped caps lock to esc.
Now my fingers don't travel as far to hit esc.
I still use that mapping even on my regular keyboards and my current non-touch-bar macs.
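For anyone who wants the same remap without third-party software: recent macOS versions can map Caps Lock to Escape in the Keyboard settings under Modifier Keys, and the built-in hidutil tool can do it from a script. A sketch, assuming the standard HID usage IDs for Caps Lock and Escape; the mapping does not survive a reboot unless you rerun it (e.g. from a LaunchAgent):

    # Sketch: remap Caps Lock -> Escape with macOS's built-in hidutil.
    # 0x700000039 = Caps Lock, 0x700000029 = Escape (standard HID usage IDs).
    # Not persistent across reboots; rerun at login if you want it to stick.
    import subprocess

    mapping = (
        '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,'
        '"HIDKeyboardModifierMappingDst":0x700000029}]}'
    )
    subprocess.run(["hidutil", "property", "--set", mapping], check=True)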
I really like the touchbar Macs because changing the volume to exactly what I want is really easy. All the others have increments that are too large, so I have to try to remember if Shift + Opt + Volume Up/Down is what I want, or some other combination.
People have different passions, I like computers. If I feel a new Mac is going to be fun for whatever reason, I consider upgrading it.
Performance wise they last a long time, so I could keep them way longer than I do, but I enjoy newer and more capable models.
You can always find someone to buy the older model. Macs have a great second hand market.
I'm using them for several years - I still have a Mac mini (from 2012) and an iMac Pro (from 2017) running. I also get a company Macbook which I can upgrade every three years.
But there is also another strategy: get a new Mac when they come out and sell it before/after the next model appears. There is a large market for used Macs. A friend of mine has been doing this for quite some time.
Tethering to an iPhone is so easy though - just select it in the Wifi menu. I'm not sure if I'd ever pay for an LTE modem option. I'm sure it would be better efficiency and performance to have it built-in, but I wouldn't think many people care enough about that small difference to offer it as an option.
It's not about efficiency or performance, it's about not having to own the iPhone in the first place. Just put a SIM card inside the laptop and forget about it. Windows laptops can even seamlessly switch between wifi and LTE depending on which one is available. But of course Apple would never allow that because they want to force you to own the full set of Apple devices. Laptop being self-sufficient would be against their policy.
Not to mention that in the US the cell phone carriers artificially limit tethering speed or put data caps on it when you tether from your phone. You have to buy a dedicated data-only plan and modem.
I use the tethering quite often. I have for years. It is flaky and burns two batteries instead of one. I agree that many people do not care. Some of us who are traveling a lot are willing to pay for more options.
I wonder if one of the obstacles is the amount of data that would likely be used.
Most cellular carriers offer unlimited on-device data plans, but they cap data for tethering. Integrating an LTE modem into a laptop essentially requires a mobile data plan with unlimited tethering - which, AFAIK, doesn’t exist at the moment. I’m not sure why.
Integrating an LTE modem into an iPad requires a mobile data plan, and thats about it. It's not "tethered" if its built into the device.
I've always heard that patent disputes were at the root of the lack of a modem option. Apple had a prototype MacBook Pro back in the early Intel days IIRC but it was never released.
Maybe if Apple ever gets their in-house modems working, we'll see them on all of the product lines, but until then, it's a niche use case that likely isn't causing them to lose a ton of sales.
> It's not "tethered" if its built into the device.
I understand that. My point is that I think an LTE modem in a laptop might reasonably use far more data than an LTE modem in a phone or tablet. Most people who download and/or upload very large files do so on their computer rather than their mobile devices.
There is no reason macOS cannot have some option for throttling usage by background updates when connected over LTE. iPads have an LTE option.
That carriers have not figured out how to charge me by the byte over all my devices instead of per device is really not a big issue to me. I would like to pay for an LTE modem and the necessary bandwidth.
My intuition is that when Apple has their own LTE modem and is not dependent on Qualcomm, a MacBook Pro will have an option similar to that for Dell power users.
The industry as a whole is trying its best to not rely on Qualcomm, given its extremely litigious past. Apple already tried once to avoid using their chips for the iPhone's modem, which I seem to recall failed. When it comes to devices for enterprise, it's less of a gamble because the cost can be passed on to orgs who are less price sensitive.
I really like these new devices, but I’ve found that the latest MacBook Air (M3) is sufficient for my needs as a manager and casual developer. My MacBook Pro M1 Max has essentially become a desktop due to its support for multiple monitors, but since the Mac Mini M4 Pro can also support up to three external displays, I’m considering selling the MacBook Pro and switching to the Mini. I’ve also noticed that the MacBook Pro’s battery, as a portable device, is less efficient in terms of performance/battery (for my usage) compared to the MacBook Air.
Regarding LLMs, the hottest topic here nowadays, I plan to either use the cloud or return to a bare-metal PC.
My parents simply refuse to consider anything but Windows, even though the ever-changing Windows UI keeps pissing them off. It's like a mental block or something.
Question without judgement: why would I want to run LLM locally? Say I'm building a SaaS app and connecting to Anthropic using the `ai` package. Would I want to cut over to ollama+something for local dev?
Data privacy-- some stuff, like all my personal notes I use with a RAG system, just don't need to be sent to some cloud provider to be data mined and/or have AI trained on them
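For the local-dev question above, one low-friction pattern is to keep the same chat-shaped call and just point it at Ollama's local HTTP endpoint while developing. A minimal sketch, assuming Ollama is running on its default port with a model already pulled (the model name is just an example):

    # Sketch: call a locally running Ollama server instead of a hosted API.
    # Assumes `ollama serve` is running and e.g. `ollama pull llama3.2` was done.
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default endpoint

    def local_chat(prompt: str, model: str = "llama3.2") -> str:
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        }
        req = urllib.request.Request(
            OLLAMA_URL,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["message"]["content"]

    if __name__ == "__main__":
        print(local_chat("Summarize this note: meeting moved to Friday."))

In production you'd point the same shape of call at the hosted provider; the point is only that during development none of your notes or test data leave the machine.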
For normal web dev, any M4 CPU is good as it is mostly dependent on single core speed. If you need to compile Unreal Engine (C++ with lots of threads), video processing or 3D rendering, more cores is important.
I think you need to pick the form factor that you need combined with the use case:
- Mobility and fast single core speeds: MacBook Air
- Mobility and multi-core: MacBook Pro with M4 Max
> Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
> Up to two external displays with up to 6K resolution at 60Hz over Thunderbolt, or one external display with up to 6K resolution at 60Hz over Thunderbolt and one external display with up to 4K resolution at 144Hz over HDMI
> One external display supported at 8K resolution at 60Hz or one external display at 4K resolution at 240Hz over HDMI
Does anyone understand this claim from the press release?
> M4 Max supports up to 128GB of fast unified memory and up to 546GB/s of memory bandwidth, which is 4x the bandwidth of the latest AI PC chip. This allows developers to easily interact with large language models that have nearly 200 billion parameters.
Having more memory bandwidth is not directly helpful in using larger LLM models. A 200B param model requires at least 200GB RAM quantized down from the original precision (e.g. "bf16") to "q8" (8 bits per parameter), and these laptops don't even have the 200GB RAM that would be required to run inference over that quantized version.
How can you "easily interact with" 200GB of data, in real-time, on a machine with 128GB of memory??
Wouldn't it be incredibly misleading to say you can interact with an LLM, when they really mean that you can lossy-compress it to like 25% size where it becomes way less useful and then interact with that?
(Isn't that kind of like saying you can do real-time 4k encoding when you actually mean it can do real-time 720p encoding and then interpolate the missing pixels?)
Yes the size is much reduced, and you do have reduced quality as a result, but it isn't as bad as what you're implying. Just a few days ago Meta released q4 versions of their llama models. It's an active research topic.
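To put numbers on that disagreement, a quick sizing sketch (pure arithmetic over the weights; it ignores KV cache and runtime overhead, which push real usage higher):

    # Rough memory footprint of model weights at different quantization levels.
    # Weights only: KV cache, activations, and runtime overhead come on top.
    def weights_gb(params_billions: float, bits_per_param: float) -> float:
        return params_billions * 1e9 * bits_per_param / 8 / 1e9

    for name, bits in [("bf16", 16), ("q8", 8), ("q4", 4)]:
        print(f"200B @ {name}: ~{weights_gb(200, bits):.0f} GB")
    # -> ~400 GB at bf16, ~200 GB at q8, ~100 GB at q4

So a 128GB machine only fits a ~200B model at roughly 4-bit quantization, and during memory-bound decoding the tokens/sec ceiling is roughly memory bandwidth divided by the bytes read per token.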
> Up to 4.6x faster build performance when compiling code in Xcode when compared to the 16‑inch MacBook Pro with Intel Core i9, and up to 2.2x faster when compared to the 16‑inch MacBook Pro with M1 Max.
OK, that's finally a reason to upgrade from my M1.
Does anyone know if there is a way to use a Mac without the Apple bloatware?
I genuinely want to use it as my primary machine, but with this Intel MacBook Pro I have, I absolutely dislike FaceTime, iMessage, the need to use the App Store, and Apple always asking me to have an Apple username and password (which I don't have and have zero intention of creating). I want to block Siri and all the telemetry stuff Apple has baked in, stop the machine calling home, etc.
This is to mirror the tools available on Windows to disable and remove Microsoft bloatware and the built-in ad tracking.
There is zero iCloud account requirement. You do not need to use the App Store. Gatekeeper can be disabled with a configuration profile key. Telemetry (what little there is) can be disabled with a configuration profile key. Siri can be disabled, all of the generative AI crap can be disabled, yadda yadda yadda, with a configuration profile key. Every background service can be listed and disabled if you disable authenticated-root. Hell, you could disable `apsd` and disable all push notifications too, which require a phone home to Apple.
IIRC Apple is a lot less heavy-handed wrt service login requirements when compared to Microsoft’s most recent Windows endeavors. And depending on the developer, you can get around having to use the App Store at all. Since you're on an Intel Mac, have you considered just using Linux?
There used to be this whole contingent of people who were adamant that Apple's software was too opinionated and bloated, that you couldn't adapt its OS to your needs, and that Apple was far too ingrained in your relationship with your device. That Linux was true freedom, and that at least Windows respected its users.
I belong to that contingent, and I still stand by the assertion that Apple's software is too opinionated, configurability is unreasonably low, and you have to stick to the Apple ecosystem for many things to get the most out of it.
My primary desktop & laptop are now both Macs because of all the malarkey in Win11. Reappearance of ads in Start and Windows Recall were the last straws. It's clear that Microsoft is actively trying to monetize Windows in ways that are inherently detrimental to UX.
I do have to say, though, that Win11 is still more customizable overall, even though it - amazingly! - regressed below macOS level in some respects (e.g. no vertical taskbar option anymore). Gaming is another major sticking point - the situation with non-casual games on macOS is dismal.
You can use OSX without an Apple account, and paired with a third-party host-based firewall (Little Snitch), the OS usually stays out of your way (imo). Bundled apps can be removed after disabling SIP (System Integrity Protection), but there are downsides/maintenance to that route.
At a linux conference I saw many macbooks. Talked to a few, they just ran linux in a VM full screen for programming and related. Then used OSX for everything else (office, outlook, teams, work enforced apps, etc). They seemed very happy and this encouraged them to not task switch as often.
I gave up on macos when they started making the OS partition read-only. A good security feature in general, but their implementation meant that changing anything became a big set of difficulties and trade-offs.
That, combined with the icloud and telemetry BS, I'd had enough.
Not only good security, but it also makes software updates a lot faster because you don't have to check if the user has randomly changed any system files before patching them.
The single most annoying thing about this announcement for me is the fact that I literally just paid for an Asus ProArt P16 [0] on the basis that the Apple offerings I was looking at were too expensive. Argh!
> MacBook Air with M2 and M3 comes standard with 16GB of unified memory, and is available in midnight, starlight, silver, and space gray, starting at $999 (U.S.) and $899 (U.S.) for education.
At long last, I can safely recommend the base model macbook air to my friends and family again. At $1000 ($900 with edu pricing on the m2 model) it really is an amazing package overall.
Upgraded to an M1 Pro 14 in December 2021, and I still rock it every day for dev purposes. Apple makes great laptops.
The only downside is that I see a kind of "burnt?" transparent spot on my screen. Also, when connecting over an HDMI cable, the sound does not output properly to the TV, and it makes the video I play laggy. Wondering if the Apple Store would fix it if I went in?
I'm not sure we can leverage the neural cores for now, but they're already rather good for LLMs, depending on what metrics you value most.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
It is interesting they only support 64gb and then jump to 128gb. It seems like a money play since it's $1,000 to upgrade for 128, and if you're running something that needs more than 64 (like LLMs?) you kind of have no choice.
As for the section where they demoed Apple Intelligence assisting the researcher with creating an abstract and adding pictures to their paper: is doing this better or worse? People are already complaining heavily about dead internet theory, with the 'AI voice' being so prominent.
I wish Apple would include a USB-C data port or two on their big charger brick, for the same single-cord bliss that the iMac enjoys (while plugged in). The little USB hubs that I carry around can't sufficiently power the MacBook Pro.
What's the deal with running Linux on these anyway? Could one conceivably set up an M4 mini as headless server? I presume Metal would be impossible to get working if MacOS uses proprietary drivers for it...
To be fair, the link in this story is to a press release. Arguably there are probably many things in it that can be considered "misleading" in certain contexts.
I have an M2 Max now, and it's incredible. But it still can't handle running xcode's Instruments. I'd upgrade if the M4s could run the leaks tool seamlessly, but I doubt any computer could.
For the gamers amongst you, do any of you game on your MacBook Pro Ms? If so, which one? Is there a noticeable difference in game quality between the M1 Pro vs M3 Pro for example?
Once they get a MacBook Air with an M4, it will become a viable option for developers and other users that want/need 2 external monitors. Definitely looking forward to that happening.
Saw very little discourse about the fact that Apple Silicon is following a "tick-tock" approach, very much like Intel in the 2010s.
For the sake of annual releases we get a new number, but besides increased silicon area, the major architectural changes seem to come every couple of years.
About time 16GB was the default on something that costs four figures. The on-device AI craze in this lineup has finally pushed the company to give adequate memory.
You can just turn it off. macOS lets you change the resolution to use just the screen below the notch, and because it's mini-LED, the now unused "flaps" to the sides of the notch are indistinguishable from the rest of the bezel.
I recently switched back to using homemade desktops for most of my work. I've been running Debian on them. Still have my Mac laptop for working on the go.
When I have a full team of people with 1080p webcams and a solid connection I can notice the quality. Most of the time not everyone fulfills those requirements and the orchestrator system has to make do
I mean, you can easily create your own fully meshed P2P group video chat in your browser using just a little bit of JS that would support everyone running 4K, but it will fail the moment you get more than 3-8 people, as each person's video stream is eating 25 Mbps for every side of a peer connection (or 2x per edge in the graph).
A huge part of group video chat is still "hacks" like downsampling non-speaking participants so the bandwidth doesn't kill the connection.
As we get fatter pipes and faster GPUs streaming will become better.
edit: I mean... I could see a future where realtime video feeds never get super high resolution and everything effectively becomes a relatively seamless AI recreation where only facial movement data is transmitted, similar to how game engines work now.
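A quick sketch of why full mesh falls over so fast; the 25 Mbps per-stream figure is the parent comment's number, taken here as an assumption:

    # Full-mesh group-call bandwidth, back of the envelope.
    # Each participant uploads its stream to every peer and downloads one from each.
    STREAM_MBPS = 25  # assumption from the comment above

    def per_client_mbps(n: int) -> tuple[float, float]:
        up = (n - 1) * STREAM_MBPS
        down = (n - 1) * STREAM_MBPS
        return up, down

    for n in (3, 5, 8):
        up, down = per_client_mbps(n)
        links = n * (n - 1) // 2
        print(f"{n} people: {up:.0f} Mbps up / {down:.0f} Mbps down per client, "
              f"{links} P2P links total")

Which is why real services put a forwarding server (an SFU) in the middle and downsample non-speakers instead of meshing everyone.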
4K for videoconferencing is nuts. The new camera should be an improvement over the old one. Plus, being able to show your actual, physical desktop can be handy too. Using your iPhone as the webcam will still probably give you the best quality, especially if you are in a lower-light situation.
Poor. My M3 Max/128GB is about 20x slower than 4090. For inference it's much better, still much slower than 4090 but it enables working with much larger LLMs albeit at ~10t/s (in comparison, Threadripper 2990WX/256GB does like 0.25t/s). M4 Max is likely going to be ~25% faster than M3 Max based on CPU perf and memory bandwidth.
Disingenuous to mention the x86 based MacBooks as a basis for comparison in their benchmarks; they are trying to conflate current-gen Intel with what they shipped more than 4 years ago.
Are they going to claim that 16GB RAM is equivalent to 32GB on Intel laptops? (/sarc)
Lots of people don't upgrade on the cadence that users on this forum do. Someone was mentioning yesterday that they are trying to sell their Intel Mac [edit: on this forum] and asking advice on getting the best price. Someone else replied that they still had a 2017 model. I spoke to someone at my job (I'm IT) who told me they'd just ordered a new iMac to replace one that is 11 years old. There's no smoke and mirrors in letting such users know what they're in for.
Yup, I'm a developer who still primarily works on a 2018 Intel Mac. Apple's messaging felt very targeted towards me. Looking forward to getting the M4 Max as soon as possible!
Given that they also compare it to an M1 in the same aside, I'd say you're wrong.
> Up to 23.8x faster basecalling for DNA sequencing in Oxford Nanopore MinKNOW when compared to the 16-inch MacBook Pro with Core i9, and up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro.
Ben Bejarin said that around 50% of the installed base is still using Macs with Intel chips. You’ll keep hearing that comparison until that number goes down.
The base M4 Max only has an option for 36gb of ram!? They're doing some sus things with that pricing ladder again. No more 96gb option, and then to go beyond 48gb I'd have to spend another $1250 CAD on a processor upgrade first, and in doing so lose the option to have the now baseline 512gb ssd
I'd add that although I find it a bit dirty, the computers are obviously still amazing. It's just a bit bizarre that the lower spec cpu offers the customer the option to change the ram quantity. More specifically, going from the M4 Pro to the M4 Max removes the option to change the ram from 36gb, whereas sticking with the Pro lets you select 48gb or 24gb, unless you choose the max Max. If I pre-order the Mac Mini with the same processor, I can select 64gb for the insane price of an additional $750cad, but it's just not available on the macbook pro M4 Pro.
It would indeed have been nice to see a faster response rate screen, even though I value picture quality more, and it also would have been nice to see even vaguely different colors like the iMac supposedly got, but it seems like a nice spec bump year anyway.
I think any idea that Apple doesn't thoroughly understand the capacity, value, market, price tradeoff is untenable.
The most obvious view is that Apple price gouges on storage. But this seems too simplistic.
My conjecture is that there's an inescapable tension between supply (availability/cost), sales forecasts, technological churn, and roadmaps that leads them to want to somewhat subsidize the low end and place a bit of back-pressure on consumption at the high end. The trick is finding the tipping point on the curve between growth and overcommitment by suppliers. Especially for tightly vertically integrated products.
The PC industry is more diffuse and horizontal and so more tolerant of fluctuations in supply and demand across a broader network of providers and consumers, leading to a lower, more even cost structure for components and modules.
In real terms, Apple's products keep costing less, just like all computer products. They seem to make a point of holding prices on an appearance point of latest tech that's held steady since the first Macs: about $2500 for a unit that meets the expectations of space right behind the bleeding edge while being reliable, useful and a vanguard of trends.
Seems plausible enough to me, but whether there's a business case or not isn't my concern as much as how it feels to price something out knowing that I'm deliberately being gouged on arbitrary components instead of the segmentation being somewhat more meaningful. They're already reaping very high margins, but by tightly coupling quantities of those components to even higher-margin builds, it feels a bit gross, to the point where I just have to accept that I'd have to spend even more excessively than in previous years on similar models. As in, I'm happy to pay a bit more for more power if I find it useful, likewise with RAM, but not being able to get more RAM without first getting something I have no way to put to use seems a bit silly, akin to not being able to get better seats in a car unless I first get the racing-spec version, otherwise being stuck with a lawn chair.
Does iCloud mitigate this? It’s always confused me that iCloud is intended only as a data sync and not back-up… If one device goes down and the rest still work, could you still access data from the dead device?
Looking at how long the 8gb lasted it's a pretty sure bet that now you won't need to upgrade for a good few years.
I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
I'd say the one incentive a MacBook Pro has over the Air is the better screen and better speakers. Not sure if it's worth the money.
My hypothesis is Apple is mostly right about their base model offerings.
> I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
If an HN user can get along with 16gb on their MacBook Air for the last X years, most users were able to get by with 8gb.
It's just a tactic to get a higher average price while being able to advertise a lower price. What makes it infuriating is memory is dirt cheap. That extra 8GB probably costs them $10 at most, but would add to utility and longevity of their hardware quite a bit.
They are supposed to be "green" but they encourage obsolescence.
They align need with more CPU and margin. Apple wants as few SKUs as possible and as much margin as possible.
8GB is fine for most use cases. Part of my gig is managing a huge global enterprise with six figures of devices. Metrics demonstrate that the lower quartile is ok with 8GB, even now. Those devices are being retired as part of the normal lifecycle with 16GB, which is better.
Laptops are 2-6 year devices. Higher end devices always get replaced sooner - you buy a high end device because the productivity is worth spending $. Low end tend to live longer.
People looking for low prices buy PC, they don't even consider Mac. Then they can have a computer with all the "higher numbers", which is more important than getting stuff done.
I bought a framework back in 2020 or so and really wish I just waited a little longer and spent a few hundred bucks more on the M1.
It's fine, but the issue is linux sleep/hibernate - battery drain. To use the laptop after a few days, I have to plug it in and wait for it to charge a little bit because the battery dies. I have to shut it down (not just close the screen) before flying or my backpack becomes a heater and the laptop dies. To use a macbook that's been closed for months I just open it and it works. I'll pay double for that experience. If I want a computer that needs to be plugged in to work I have a desktop for that already. The battery life is not good either.
Maybe it's better now if I take the time to research what to upgrade, but I don't have the time to tinker with hardware/linux config like I did a few years ago.
I don't mind spending a thousand bucks every 7 years to upgrade my laptop. I've had this MacBook Air since 2020 and, besides the speakers not being the best... I have no complaints.
I don't really see a world where this machine doesn't last me a few more years. If there's anything i'd service would be the battery, but eh. It lasts more than a few hours and I don't go out much.
Even the "few thousand" for a bit of extra RAM and SSD are highway robbery.
Anyways, to each their own. I also had things break, and repairability isn't really a thing with Apple hardware (while it easily could be if they wanted, even if difficult and done by certified technicians instead of myself).
Been using Pro products for almost 15 years and I just switched to a Lenovo ThinkPad and Ubuntu. This is so much more fun and innovative. Apple reached a plateau, whether you like it or not.
I'm just some dude, looking at a press release, wondering when Tim Apple is gonna be a cool dude and release the MBP in all of the colors that they make the iMac in.
APPARENTLY NOT TODAY.
C'mon mannnnn. The 90s/y2k are back in! People want the colorful consumer electronics! It doesn't have to be translucent plastic like it was back then but give us at least something that doesn't make me wonder if I live in the novel The Giver every time I walk into a meetup filled with MacBook Pros.
Would it make sense to upgrade from an M2 Pro 16 to an M4 Pro 16 (both base models)?
I mean in terms of numbers: more cores, more RAM, but everything else is pretty much the same. I am looking forward to seeing some benchmarks!
Have they published this ahead of other pages or is it just me?
The linked Apple Store page says "MacBook Pro blasts forward with the M3, M3 Pro, and M3 Max chips" so it seems like the old version of the page still?
I used a Surface Pro for 6 years and haven't missed the touch screen once since switching back to the MBP 3 years ago. I would have missed the handwriting input, but that's what a low-end iPad is for.
This is the first time they have not obscenely limited their starter offerings with 8GB RAM. The time it took them to do that is just pathetic. Now I guess this will happen until how long? Maybe 2034 - and starting RAM 16GB. I wish I could say it's a welcome change but in 2024 for such overpriced machine if they are starting with 16GB RAM then that's anything but special. Also, I am sure the SSDs and RAMs are still soldered tight :)
If only they allowed their iPads to be used as a Mac screen natively, I might buy a Mini and an iPad and cover two use cases with that. But why would Apple want users to be able to do that without extra expense?
I find it very odd that the new iMac has WiFi 7 but this does not... Also it is so aggravating they compare to 3 generations ago and not the previous generation in the marketing stats. It makes the entire post nearly useless.
It is very aggravating, but if they advertised a comparison to last year's model and showed you small performance gains you might not want to buy it.
A more charitable interpretation is that Apple only thinks that people with computers a few years old need to upgrade, and they aren't advertising to people with a <1 year old MacBook Pro.
If you're willing to buy from a retailer, you can usually get two- or three-year financing terms. Sell it at the end of the payment term for half (or more) of what you paid in total and get a new one on a similar plan if you want.
I don't think it's wise, though. I bought a base M1 Pro MBP when it launched and don't feel a need to upgrade at all yet. I'm holding off for a few more years, to grab one whenever the next major increase in local LLM capability and battery life comes.
To all these comments: I'm not talking practically, for myself; I'm talking about what a revolutionary "think different" company would do for its users and its projection into the world. Taking away the friction of changing models, and the impact of all these product lines that sometimes instantly become less valuable (2023's "8GB is more than enough"), would be a good start, and Apple, if anyone, could amortize this on behalf of their user base.
Another observation: I've travelled the world and rarely see the people who could use robust, secure products the most (vulnerable people) actually using Apple products. They're all packing second-tier Samsung or LG Androids and old Windows notebooks (there are decent Samsung, LG, Android, and Windows products, but that's not what they have access to).
If you're willing to play, there are plenty of lenders who will finance this purchase.
If it affects your earning power to that extent, you should probably pony up and save in the long run; it'll likely take just a few years to see returns.
A caste system usually can't be bypassed by paying a monthly subscription fee.
I will note that making it a subscription will tend to increase the overall costs, not decrease them. In an environment with ready access to credit, I think offering it on a subscription basis is worse for consumers?
I find it amusing how you answer your own "question" before asking it. Why would they target the marketing material at people who already know they aren't going to need to upgrade?
Roopepal, did someone piss in your coffee this morning?
I had no questions. I’m merely saying that it’s funny they’re comparing to old technology instead of last year’s.
It’s a valid criticism.
Take a breath.
Apple knows an M4 is a hard sell for M2/M3 owners. Unless you have specific workflows that can take advantage of the newer silicon, you'll spend a lot of money on something you probably don't need. I have an M1 with 32GB and multiple software packages running, and I see no reason to replace this machine.
This is why Apple is comparing against M1: M1 owners are the potential buyers for this computer. (And yes, the marketing folks know the performance comparison graphs look nicer as well :)
The software stack has gotten so bad that no amount of hardware can make up for it.
The compile times for Swift, the gigabytes of RAM everything seems to eat up.
I closed all my apps and I'm at 10 GB of RAM being used, with nothing open.
Does this mean the 8 GB MacBook Air model I had 10 years ago would basically be unable to run even the operating system alone?
It's disconcerting. Ozempic for the terrible food and car-centric infrastructure we've created, cloud super-computing and 'AI' for coping with this Frankenstein software stack.
The year of the Linux desktop is just around the corner to save the day, right? Right? :)
Activity Monitor counts everything from I/O buffer caches to frame buffers in the "memory used" metric. Add in that macOS won't free pages until it hits a certain memory pressure, and you get high usage with no desktop apps open.
This also means that a cleanly booted machine with 16 GB will show more memory used than a machine with 8 GB.
Apple suggests you use the memory pressure graph instead to determine whether you're low on memory for this reason.
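If you want that same signal outside Activity Monitor, here's a minimal Swift sketch, assuming a small command-line tool on macOS, that subscribes to the kernel's memory-pressure notifications via DispatchSource. The printed messages are just illustrative.

```swift
import Dispatch

// Minimal sketch: listen for the memory-pressure transitions that the
// Activity Monitor pressure graph reflects. Run as a command-line tool.
let source = DispatchSource.makeMemoryPressureSource(
    eventMask: [.warning, .critical],
    queue: .main
)

source.setEventHandler {
    let event = source.data
    if event.contains(.critical) {
        // Free memory is genuinely scarce; the system is aggressively reclaiming.
        print("Memory pressure: critical")
    } else if event.contains(.warning) {
        // The system has started compressing/evicting pages; apps still run fine.
        print("Memory pressure: warning")
    }
}
source.resume()

// Keep the process alive so the handler can fire.
dispatchMain()
```

Roughly speaking, .warning means the system has started reclaiming and compressing, and .critical is the point where you're actually short on RAM; until then, a big "memory used" number mostly just means the RAM isn't sitting idle.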
It's very hard to measure memory use because it's reactive to how much RAM you have; if you have more, the system is going to use it. That doesn't necessarily mean there are any issues.
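To make that concrete, here's a rough Swift sketch that reads the kernel's VM statistics via host_statistics64 and prints the page buckets that get lumped into "used". The labels and grouping are my own approximation for illustration, not Activity Monitor's exact formula.

```swift
import Foundation

// Rough sketch: read the kernel's 64-bit VM statistics and print the page
// buckets. The grouping is an approximation, not Activity Monitor's formula.
func vmStatistics() -> vm_statistics64? {
    var stats = vm_statistics64()
    var count = mach_msg_type_number_t(
        MemoryLayout<vm_statistics64>.size / MemoryLayout<integer_t>.size)
    let kr = withUnsafeMutablePointer(to: &stats) { ptr in
        ptr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            host_statistics64(mach_host_self(), HOST_VM_INFO64, $0, &count)
        }
    }
    return kr == KERN_SUCCESS ? stats : nil
}

var pageSize: vm_size_t = 0
host_page_size(mach_host_self(), &pageSize)

if let s = vmStatistics() {
    let mb = { (pages: UInt32) in UInt64(pages) * UInt64(pageSize) / 1_048_576 }
    print("free:       \(mb(s.free_count)) MB")
    print("active:     \(mb(s.active_count)) MB")
    print("inactive:   \(mb(s.inactive_count)) MB")   // largely reclaimable caches
    print("wired:      \(mb(s.wire_count)) MB")
    print("compressed: \(mb(s.compressor_page_count)) MB")
}
```

On a machine with plenty of RAM, a lot of what shows up outside "free" is cache the kernel will happily drop under pressure, which is why the absolute "used" number on its own says so little.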