Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.
macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should still be able to run Xcode 27, though dropping the Intel platform may break the usual pattern of the previous year's OS staying supported).
I am being facetious. You'll have a PC for gamedev because that's your biggest platform unless you are primarily Switch or PS5, in which case you'll have a devkit as well as a PC. But the cost of an Apple device is insignificant compared to the cost of developing the software for it.
So it really comes down to the market size and _where they are_. The games I play are either on my PS5, or on my Mac, never both. For any specific game, they are on one or the other. Ghost of Tsushima is on the PS5. Factorio is on my Mac. If I were an indie game developer, I'd likely be developing the kind of game that has a good market on the Mac.
> E.g., in Windows apps, menu items are keyboard-addressable by default. This is brilliant for accessibility, and for accustomed power users. MacOS has no _by default_ equivalent.
Cmd-Shift-? (really, Cmd-?)
You can begin using arrow keys from there, or start typing to search the menu items of the foreground app
You can also assign arbitrary hotkeys to any application's menu items in the OS system preferences
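For that last route, the same mapping System Settings creates can also be set from Terminal via the NSUserKeyEquivalents default. A minimal sketch, with the app and menu-item title as placeholders (restart the app afterwards for it to take effect):

    defaults write -app "Preview" NSUserKeyEquivalents -dict-add "Adjust Size…" "@~r"

Here @ is Command, ~ is Option, $ is Shift, ^ is Control, and the string has to match the menu item's title exactly, ellipsis included.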
Yeah, I know about this; it’s not the same. In Windows apps following the standard (which all good ones do) _every menu item_ is keyboard addressable. Something several submenus in is trivially accessible by muscle memory: ALT-I-R-C to resize an image without constraints, e.g.
MacOS allows easy navigation of the menu, but does not guarantee that each item is hotkey-addressable.
It would work even better. From the linked support page:
"Motion is detected based on the amount of signal disruption taking place between the Xfinity Gateway and your selected WiFi-connected devices, so motion from small pets (around 40 pounds or less) can be filtered out while keeping you notified of large movements more likely to be caused by humans."
Moving the mouse pointer to the middle of the bottom edge of a screen will permanently move the Dock there. As far as I know, there is no way to disable this behavior. It's terrible.
For simultaneous multi-monitor, you're probably forced into using a full Dock replacement. There are a few options out there, but none have ever stuck with me personally.
Except Excel's ribbon menu items. As far as I know there's no way to hotkey those like ALT+[<letter>] on Windows. Same for Outlook's categorize-email function.
So the i5-14600K is 1.57x on multi-core and slightly worse on single-core. That's $235 for the CPU versus $599 for a whole system. You could maybe match the total price, but Intel won't come anywhere close on power efficiency.
The 120GB/s bandwidth of the unified memory helps especially with video, I guess. The M4 CPU isn't really stressed most of the time; it only maxes out on multicam work and HLG conversions.
I did once patch my old Dell T1650's BIOS for ReBAR support, yet the iGPU of an i5-1135G7 still had similar video performance to the Intel Arc A380 in that desktop; the old PCIe 3 speed limited the card. I've heard others report smoother playback on Apple silicon than even an RTX 4090.
I get some delays when fast-scrubbing through a 9-angle multicam 720p timeline with only 360p proxies. Still, I'm impressed compared to what I was used to. Video editors may be surprised by the performance for the price.
There is a GitHub project [1] with detailed instructions. The ancient i5-3570 only allowed 2GB of ReBAR, BTW. GPU-Z says ReBAR / 4G is activated and working; the Intel Arc driver does not see it but seems to use it. Some part of the BIOS had to be manually fixed, AFAIR.
The PC was given to me for free and the CPU cost €11, yet overall I wouldn't recommend the process just for the result; the benefit is small, if any, though it was fun. On that occasion I also added an NVMe driver, which works well, as demonstrated for the similar Dell OptiPlex [2][3].
4x vs. the old i5, not the M4. They are trying to say that comparing to a CPU released four years ago is pointless because the newer CPU is obviously much better.
Some of it depends on which variant fits you best. But yeah, in general the M1 is still very good--if you hear of someone in your circle selling one for cheap because they're upgrading, nab it.
On the variants: an M1 Max has 10 CPU cores, 8 power and 2 efficiency.
The M4 Max is 16 cores, 12 + 4. So each power core is 50% faster, and there are also 50% more of them. Add in twice as many efficiency cores, which are also faster at lower power, plus more memory bandwidth, and it snowballs together.
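Back-of-envelope, treating that 50% per-core figure as a rough assumption: 1.5x per core times 12/8 the core count is roughly 2.25x the multi-core throughput from the power cores alone, before counting the extra efficiency cores or the bandwidth.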
One nice pseudo-feature of the M1 is that the thermal design of the current MacBook Pro really hasn't changed since then. It was designed with a few generations of headroom in mind, but that means it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move, while an M3 Max is easier to make (slightly) audible.
> it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move,
I routinely get my M1 fans spinning from compiling big projects. You don’t have to get the GPU involved, but when you do it definitely goes up a notch.
I read so much about the M1 Pros being completely silent that I thought something was wrong with mine at first. Nope, it just turns out that most people don’t use the CPU long enough for the fans to kick in. There’s a decent thermal capacity buffer in the system before they ramp up.
Huh! I regularly max the CPU for long stretches (game development), but I found I could only get the fans to move if I engaged the neural cores on top of everything else. Something like a 20+ minute video export using all available compute for heavy stabilization could do it.
The M3 shows much more typical behavior, but I guess it's just dumping more watts into the same thermal mass...