I like the attitude of the author of this gist. Just make it work.
My own practice is to use both a very fine System76 Linux laptop and a super fast, if constrained, M1 MacBook Pro. Which one depends on what I am doing, or sometimes just what I feel like using.
To be perfectly open though, I had been considering switching to just using Linux until the M1 was released. The thing that held me back was having spent $3500 on a commercial Common Lisp implementation 11 months ago; going all in on Linux would have cost me even more money.
I use Common Lisp daily on macOS and Linux, but use SBCL and Emacs with Sly. I have never tried a commercial implementation, so I cannot compare one to Emacs and SBCL. Are they worth the money vs. the open source tools?
How could they be worth it? Vendor lock-in has a huge cost IMO. You're going to have to keep paying for these insanely expensive licenses for years, maybe decades?
Sigh. I wish Apple would just release these drivers as a sign of goodwill to the rest of the community. I think these machines could make decent development workstations if they had an M.2 port and native Linux support on day 1. It really feels like the least they can do considering their transgressions against the open source community over the past few years...
This is way beyond wishful thinking. Apple will do whatever makes the most sense for its business strategy. Given their past transgressions against the open source community, what makes you think that community fits into their business strategy?
I'm not sure what transgressions you're referring to, but releasing some specs would be nice, or updating the open source copy of XNU to a version with M1 in there.
Some of these have been pain points on QEMU (such as Android emulation) for a really long time. The coreaudio fixes and sparse block storage alone are fantastic.
What's the story with OpenGL on macOS? I know it is abandonware in favor of Metal, but are there reasons Apple will at least keep it in the OS/drivers for the foreseeable future? Do any of the ANGLE-type things help when they finally take it away?
It still works, and somebody still seems to work on it, judging by the somewhat random GL-related fixes and regressions in macOS betas since GL was declared deprecated.
AFAIK ANGLE (and MoltenGL) only provide GLES2.x and GLES3.x over Metal, so out of the box those wouldn't be very useful for providing desktop-GL compatibility.
Apparently Zink on top of MoltenVK on top of Metal works, but that looks like a Jenga tower of emulation layers.
Ideally, Apple would rewrite their OpenGL framework on top of Metal (if it hasn't happened yet), and keep that working at least into the 2030s.
The Virgil 3D renderer provides some desktop GL emulation, so it is still somewhat usable, but a serious application expecting desktop GL would see glitches.
My recommendation is to use Wayland, and to play serious games on macOS (although such games are not available on Linux AArch64 anyway...)
Virgil 3D works well enough for my workload. I saw some problems in the GLES backend support, but they are minor and I have already written patches for them.
However, as feature_list in src/vrend_renderer.c shows, some features are unavailable and cannot be emulated. It is good enough but not perfect (and I think that is a reasonable trade-off).
>> AFAIK ANGLE (and MoltenGL) only provide GLES2.x and GLES3.x over Metal, so out of the box those wouldn't be very useful for providing desktop-GL compatibility.
I thought Wayland implementations considered one of those a base requirement.
Thank you for your work and reply. I have already submitted some; a few were merged and one is in review.
The other patches are not submitted yet, either because they require a small change to Epoxy that is not merged yet, or because I'm being lazy. Maybe I should open an MR as an RFC.
That dovetails with my own experience. I installed Tomb Raider on my M1 Air, and while it was playable, I was not impressed with its graphical abilities. The M1 is great for integrated graphics, but not on par with dedicated graphics. CPU-wise it is the fastest computer I have, though, noticeably faster than the i5-9600K in my desktop.
The Apples to Apples comparison (if you will) is to other integrated GPUs.
>The first Apple-built GPU for a Mac is significantly faster than any integrated GPU we’ve been able to get our hands on, and will no doubt set a new high bar for GPU performance in a laptop. Based on Apple’s own die shots, it’s clear that they spent a sizable portion of the M1’s die on the GPU and associated hardware, and the payoff is a GPU that can rival even low-end discrete GPUs.
What the hell, enough with this! My main dev machine until a few months ago was a Dell laptop with 8 GB of RAM on Ubuntu. I ran RubyMine on it all day, some Sublime, Spotify, Slack, MariaDB, Thunderbird, LibreOffice, and I had no problems whatsoever. The only reason it ever ran out of RAM was a memory leak in Firefox.
Things really go haywire when you are developing C++ with clangd (the de facto code analyzer for Visual Studio Code, CLion, etc.). It sometimes takes over 8 gigs of RAM for just the editor. It also takes a ridiculous amount of memory just to compile your code, especially if you're using Eigen.
That's good for you and I'm glad you're happy, but for some of us, considering the work we are doing, sometimes even 32GB is not really enough.
Different people, different needs.
That's fine, but people like these are mostly outliers in the dev community. They exist on Linux, Windows, and even macOS, and they definitely KNOW perfectly well what their requirements are and why.
The post I was replying to said something very different and much weirder.
The M1 is Apple’s low-end chip, currently available only on their cheapest computers. It’s like complaining that a Corolla can’t keep up with a Ferrari on a race track. The most charitable interpretation is that the person doesn’t understand the market position of the two items.
I find it easy to sneak up on 16GB of memory when running basic apps (editor, a bunch of browser tabs, Zoom, Spotify) plus a complete environment for a mid-sized web app (multiple services, DBs, message queue, caches, JS bundler, workers, etc.), especially if I'm running tests and have mobile device VMs up. Pretty cheap insurance over the life of a machine to get 32GB of RAM and never have to think about it.
Perhaps you're correct that I don't need it in the sense that I only very occasionally find myself with a dataset or something that doesn't fit in memory, and could just run a high memory cloud instance for a bit to process it.
Even still, my experience of desktop Linux under memory pressure has been frustrating, even with fast SSDs, and overspeccing the memory is an inexpensive guarantee that my system will never start thrashing during an important screenshare demo or something, so it's an obvious choice if I'm shopping for a new computer.
When I built this computer last year the cost difference between 16 and 32GB was like $40...easy to justify a 2-3% premium on the overall cost of the machine to never have to give a second thought to conserving memory. That said, Apple charges $400 for the same upgrade (in their machines that support 32GB), so the calculus there is a bit different.
Desktop Linux tends to be a bit of a memory hog compared to MacOS. I think a lot of Linux users would be surprised how usable even the 8GB macs are for most tasks.
My work desktop is 32GB and it falls over any time I try to create really big R data sets for others. I have to use a cloud machine with 64GB, and I run that out of memory most of the time when trying to optimize the production pipeline. They refuse to give me anything larger, so that's my upper limit. If anyone knows how to create giant .rds files without storing everything in memory first, I'd love to hear it.
That's fine; it's not like it keeps them all open. I've had a couple hundred tabs open in Firefox while researching a project, using Tree Style Tab to organize them. Modern browsers cache pages to disk after a certain high-water mark. People actually seem to think all those tabs stay in memory.
It was meant slightly in jest; but looking at Activity Monitor on my 32GB RAM MacBook Pro, it looks like I'm currently using ~28GB. I have Docker and a few Node.js processes (webpack builds, TypeScript compiler, language server, etc.) taking about 10GB between them, and then a sea of "Google Chrome Helper (Renderer)" processes each taking between 100 and 900MB. There are at least 20 of these, and then also the usual suspects: Slack, Skype (yes), Finder, etc.
Honestly, I could probably do with 16GB right now, but I'm planning on keeping this machine for at least 5 years; it was worth the few hundred bucks extra to future-proof it.
Browsers will quickly eat up all of your available RAM if you open enough tabs. The thing is, if you had less RAM, they'd be keeping fewer tabs alive in RAM. So you can't really infer from "I'm using X amount of RAM now" to "I need at least X amount of RAM". If you upgraded to 64GB you'd probably end up 'using' a lot more than 28GB for the exact same workflow.
Simply open a blank page, quit the browser, and restart it. Now load only the two or three pages/sites you really need. It's as simple as that to bring memory use under 500MB, with Firefox at least. Repeat this once a day.
I personally close my browser at night and load it in the morning.
Re: the "usual suspects": I got an M1 Air late last year, and I've decided to keep it completely separate from my work laptop, so all it has installed is basically Firefox, a couple of code editors, and whatever came with it. It absolutely screams compared to my other laptop, but I wonder how much of that is the M1 processor and how much is because I don't have all this garbage running all the time.
Apple brands one of these machines as a "Pro" system and previously offered 32GB in that model's option range.
The person isn't the one failing to understand the market position of the two items; Apple is the one that failed to brand it appropriately. It should have been a MacBook Air and a MacBook, not a MacBook Pro.
Although realistically, at this point "Pro" has lost nearly all meaning in Apple's lineup. It's like an R badge on a car: it used to mean something specific, and now it just means a generically higher-premium option.
> Apple brands one of these machines as a "Pro" system and previously offered 32GB in that model's option range.
They still do. The Intel, 4-port, 13” MacBook Pro is still available, and can be configured with 32GB of RAM. I don’t think it would be a sensible purchase at this point though.
That 2-port Pro has no reason to exist IMO. Even on Intel it used a chip with a TDP closer to the Air than the 4-port models; now on ARM they’re using the exact same part. Yeah it has a fan but most workloads will never turn the thing on.
The first laptop I saw running a VM was a Dell Precision M50 (?). It was 2" thick and almost 10 lbs, and it had something ridiculous (at the time) like 1 or 1.5GB of RAM.
A sales guy was demoing a product to us, and it spun up a Windows 2000 VM for IIS and another for SQL Server. It probably took 10 minutes to get started, but he could then demo the app through the browser. Sounds silly, but it worked.
"But can't you just run it on a server somewhere"
Yes. That'd be cheaper and faster. But at this high end of the market, it might make more sense for some people.
I work on a medium-to-large sized project that has a Go backend and a React/TypeScript frontend. Having our full development environment running (a handful of Go processes, Webpack, PostgreSQL, Redis, an indexing server, and probably other stuff I'm forgetting) and trying to edit code in both simultaneously (so having LSP servers running for both Go and TypeScript) is usually enough to cause my 16 GB laptop to swap heavily. It's not a pleasant experience: think multi-second UI lockups, music skipping, that kind of thing.
So, realistically, at this point I need a 32 GB machine to effectively do my job. Minority? Sure. But I think there are enough people in my boat that it's a legitimate market to have an option for.
Not the OP, but I fill 32GB pretty decently when I run my work's customer-facing web stack, database server, ancillary services, and a couple of search engine instances. Prior to moving to locally hosted containers, my main memory gripe was Electron. I much prefer having all of my development localized, though. It's a lot closer to the good parts of when I wrote "shrink-wrapped" desktop software.
Boy, isn't that the truth. I actively avoid them, but sometimes you just can't, like with signal-desktop. I actually have to keep it running because I don't want to reach for my phone every 10 minutes. Something as simple as that interface doesn't need Electron, but oh well. Maybe a lite version with just text and contacts would be acceptable. I don't need stickers, GIFs, and emojis.
Is it? Up until now most devs still go with 16 or even 8 GB laptops. 32GB laptops are few and far between (Macs or not), and far fewer people use desktops...
And we used to run VMware with Windows (on Linux) or Linux (on Windows) in 2005 just fine, on laptops with a couple GB at best...
Dozens of devs I know in a company I work with have 8GB Macs (and 2-3 years old at that), and run Vagrant and Docker with no issue. And those are mostly Airs with shared GPU too...
I think people with 32GB laptops hugely overestimate how many other devs are also on 32GB...
I have a 2019 MBP with 32GB of RAM. I bought it thinking that I would be running a Windows VM under Parallels, on which I would then let my company install the execrable Microsoft "mobile iron" access control. (You know, the one where they can wipe the machine remotely.) But then they changed it so that even machines with this enabled cannot access the company resources on Azure; it must be one of their own machines. (So what's the freaking point?) Anyway, if it had worked out, that's definitely a use case for more than 16.
Could this be because they don't understand how much faster it can be -- the slowness is an accepted cost of doing work?
I've recently run VirtualBox/Vagrant/Docker on an Air from ~3 years ago, and it's painfully slow compared to my 3-year-old system with 32GB of RAM. It works perfectly fine, but it's slow.
> I think people with 32GB laptops
That may be true. It may also be true that people with 8GB laptops vastly underestimate the benefit of having more memory available for development workloads.
I think the people who claim that 16 GB isn't enough are insane, but it should be said: at 4 GB it does become a bit problematic to both keep browsers with a bunch of tabs open and do other things smoothly. If we keep modern website hogs out of the picture, I completely agree with you.
I tally about 500-1000 MB doing my job – if I can close the browsers that I do admittedly need to read documentation.
> I think the people who claim that 16 GB isn't enough are insane
The problem is containers. Some development workflows appear to require running a container for every piece of software needed, and that adds up (and feels incredibly wasteful).
None of my development workflow requires containers locally, so I get by just fine with 16GB.
I wonder when the pendulum will swing back and people will stop this insanity. I can already picture the buzzwords people will invent to describe "apps running on the OS itself".
On the other hand, it's apparently no insanity to run millions of lines of code on a computer connected to the internet, hopefully sharing all the projects you need to work with and the dependencies you need to make them work, with full access to your filesystem, and usually requiring administrator rights to be installed or even used.
This would all be needless if there were an OS that allowed you to switch to another "env", wiping the RAM in the process (say, storing a snapshot of it for when you switch back), with guaranteed isolation at the filesystem level (perhaps still able to link read-only into common libs), and able to install things without touching other "envs". If you're working in any webdev-related environment, that is itself the definition of insanity.
It's like buying a hammer from the carpenter at the end of the road and giving him access to your whole house in the process, including your wife. Everything becomes a nail.
Maybe I'm just a simplistic person, but classical Linux distributions make this problem non-existent for 99% of what goes on in my world.
Their role is precisely to orchestrate the cooperation and interdependency of those millions of lines of code. I don't understand why people have started turning those distros into glorified delivery vehicles for containers.
Simplistic is actually good, but I don't think any (?) of the current OSes is able to contain a program once it has run as administrator, right? If I'm recalling correctly, there are BSD jails, Linux cgroups, user filesystem permissions, Docker, VMs... and that's just at the surface.
That would require people to clean up. Sometimes I squint at the installation requirements and then head over to hub.docker.com. I think it will get worse (i.e. more layers on top) before it gets better.
Yeah, when you see "install this Docker" OR "follow this 10,000-line install and compile guide", and then have to explain to your boss what you've been up to today.
Yeah, containers or VMs up the count indeed, but still.
Two Parallels VMs (the test version for M1) running Ubuntu Server: one with two SSH connections (for X and tunnelling), running Emacs onto my desktop along with a web server, REPL, and environment; the other running PostgreSQL. With Safari (8 tabs open), 4 terminal windows, and XQuartz, the biggest memory footprint reported is the VM with Emacs/server/REPL at 3.9GB, followed by the other VM at 2.8GB, yet the overall figure says 6GB used(?).
Among the tabs, Gmail was sitting at over 1GB: a browser email client consuming more than a third of the memory of a VM running a full OS plus a database.
I sometimes have another VM, another two SSH connections, and more tabs open (including YouTube), and I have never noticed any slow or sluggish behaviour.
I also feel like running VMs is wasteful, but it seems we are not able to create OSes with proper native tools for isolation.
Why would containers do anything to memory requirements? I guess you can't share libraries, but otherwise I don't see why memory use should be any different running the same process in a container or not.
Even this makes little sense unless your containers are really poorly sized. I used to run a mini Kubernetes cluster on my desktop with 6 nodes, some of them running big Java projects, Postgres, Elasticsearch, and Ruby. And I still had enough memory to run RubyMine or IntelliJ along with Slack and whatever other local stuff I needed.
Tab-discard extensions go a long way toward keeping browsers under control; just pin the tabs that you absolutely can't wait 3 seconds to reload, or that you need notifications from.
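For the curious, here is a minimal sketch of what such an extension does under the hood. It assumes Chrome's chrome.tabs and chrome.alarms APIs; the function name, alarm name, and ten-minute interval are just illustrative, and a real extension would need the "tabs" and "alarms" permissions in its manifest.

    // Minimal sketch of a tab-discard extension's background script (TypeScript).
    // Discard every tab that is not active, not pinned, and not already
    // discarded. Discarding frees the tab's renderer process; the page is
    // reloaded the next time the tab is focused.
    function discardBackgroundTabs(): void {
      chrome.tabs.query({ active: false, pinned: false, discarded: false }, (tabs) => {
        for (const tab of tabs) {
          if (tab.id !== undefined) {
            chrome.tabs.discard(tab.id);
          }
        }
      });
    }

    // Sweep for idle tabs every ten minutes.
    chrome.alarms.create("discard-sweep", { periodInMinutes: 10 });
    chrome.alarms.onAlarm.addListener((alarm) => {
      if (alarm.name === "discard-sweep") {
        discardBackgroundTabs();
      }
    });

Note the pinned: false filter, which is exactly why pinning the tabs you care about keeps them alive.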
Since when is compiling large sources a desktop environment?
There are projects that you cannot link with under 32 GB, but that is not enough to say that the desktop environment you use while compiling is unusable under 32 GB.
I mean, really, how many people are recompiling their entire code base every single day? Is their make/build environment so bad that they can't trust incremental compiles? I'm getting along just fine on my 8GB laptop. I know some people work with big datasets and video and such, but holy cow. Most of us are probably just writing code for a browser/backend/library/test, not pushing around terabytes of data.
My 5-year-old laptop is an i5/8GB with Fedora/GNOME (my primary machine is a desktop) and that is usually ok, but there are certainly things I can't do (like use Google Earth while having slack or a web browser open).
You won't, but I think some HN people have to push around a lot of data, and the fastest way to do that is to get as much of it in memory as possible. Others love their Electron apps, Docker suites, and VMs.
For a regular user, probably not. Contemporary dev work with spurious containerization on the other hand…
Maybe some companies should give their employees two systems: your average way-too-expensive bragging-rights laptop, and a smaller, older one which is the only one you're allowed to do your code reviews/productive commits on.
People, for some reason, really like their laptops. They also have more brand recognition, hence the "bragging rights" part, when you get your new X1/MacBook every 2 years.
And if you still work on your laptop but have your personal server box as a CPU slave, that still leaves some issues. You might be able to do your build steps on the machine, but your IDE still runs on your laptop, so that needs to be beefy enough -- never mind auxiliary applications like browsers, Photoshop, CAD etc.
Also, it's quite hard to get a slow laptop with decent ergonomics, unless you buy used (not much of an option for companies).
Also, once you go headless server, it's better to just put it online anyway, as you don't need 24/7 access.
Are you joking?! That depends entirely on what you're doing! I could do my day to day work on 16 GB just fine. I could probably live fine with 8 GB if I actively closed web pages that I didn't actually need to keep open.
I suppose it depends on what you do with it. I use about 6GB of an 8GB system, with the rest allocated to buffer cache. Firefox, VS Code, terminals, mapping software... I can't use 16GB unless I spin up a VM.
For me, it is completely fine. I just develop some software with little graphics load, use web applications, and watch some videos.
You're out of luck here if you are going to play modern games, but modern proprietary game engines do not support AArch64 Linux anyway. It would be a difficult time for the Linux desktop if other PC vendors migrated to AArch64.
It's still a really large discrepancy. I would guess something is broken somewhere. What is the WebGL frame rate on native macOS? Have you tried running those WebGL demos with Chromium?
Note that I haven't really optimized it yet. Certainly there should be low-hanging fruit, but I haven't bothered to pick it, since it is good enough for me: GNOME is no longer laggy, and I know some overheads cannot be eliminated.
If I have to run a graphics-intensive workload, I will simply run it on macOS or buy another accelerator, as the M1 is not the best-performing graphics accelerator on the market anyway.
It could be related to how many CPUs you allocate for the VM. I use a single CPU, and networking is extremely fast, with very low ping times. (I mainly use the headless server version, no X11.)
You could, but then you would still be in the macOS desktop environment. I think he doesn't just want to run Linux applications; he wants the Linux desktop environment.
Desktop environments on Linux basically just draw into a full-screen X window, don't they? (And then the programs you run are children of that root window.) I wonder how hard it would be to hack up XQuartz to support that...
I haven't tried XQuartz at all because I didn't think that kind of software has much in the way of resources behind it or is well maintained. Retina support is just one problem caused by the lack of maintenance.
There are only two options: hack Virgil 3D or hack XQuartz. You cannot "just" run XQuartz and solve the problems. I chose to hack Virgil 3D because it should have less communication overhead and work with Wayland.
It's a total PITA, but it can be done as long as you are ok with full screen. I use it to run PixInsight on FreeBSD and view it on macOS: https://xw.is/wiki/HiDPI_XQuartz
/edit: oh, and yeah, if you patch xrandr not to return, I suspect that the resolution change will persist on switching between macOS and the unix desktop, though I haven't tested that yet.
Why is this such a big issue? Can't you install any ARM Linux distro from a flash drive? Does that not work? I get that device drivers may not be perfect for any new laptop, but it should mostly work.
Linus Torvalds: ARM has a lot to learn from the PC:
"I think ARM is a very promising platform," he said. "At the same time, the ARM community has never had the notion of a standard platform. ARM never had the PC."
In a way, they themselves became a manufacturer of "IBM compatibles"; they just locked their OS to them.
A Jobs-less alternate Apple universe might've been interesting, though. He both canned the licensed Apple clones and made OpenStep into the next major Mac OS revision. If Copland had shipped as OS 8, or BeOS as OS X, there could have been viable clones, and thus maybe enough critical mass to keep PowerPC alive a bit longer.
There's a lot of proprietary hardware inside the M1 MacBooks. The only reason Linux device drivers exist is reverse engineering efforts. For this reason, no, you can't just install any vanilla ARM distro and expect things to work.