
> Mac ARM laptops mean cloud ARM VMs.

What is the connection here? ARM servers would be fine in a separate discussion. What do they have to do with Macs? Macs aren't harbingers of anything. They have set literally no trend in the last couple of decades, other than thinness at all costs. If you mean that developers will use Gravitons to develop Mac apps, why and how would that be?




To quote Linus Torvalds:

"Some people think that "the cloud" means that the instruction set doesn't matter. Develop at home, deploy in the cloud.

That's bull*t. If you develop on x86, then you're going to want to deploy on x86, because you'll be able to run what you test "at home" (and by "at home" I don't mean literally in your home, but in your work environment)."

So I would argue there is a strong connection.


> If you develop on x86, then you're going to want to deploy on x86

I can see this making sense to Torvalds, being a low-level guy, but is it true for, say, Java web server code?

Amazon are betting big on Graviton in EC2. It's no longer just used in their 'A1' instance type; it's also powering their M6g, C6g, and R6g instance types.

https://aws.amazon.com/about-aws/whats-new/2019/12/announcin...


I agree about Java. I use Windows to write Java but deploy on Linux, and it works. I used to deploy on Itanium and also on some little-endian IBM POWER machines, and I never had any issues with Java. It's very cross-platform.

Another example is Android app development. Most developers use an x86_64 CPU and run the emulator with an Intel Android image. While I don't have vast experience, I did write a few apps and never had an issue caused by the arch mismatch.

High-level languages have mostly solved that issue.

Also note that there are already some ARM laptops in the wild, running either Windows or Linux. But I don't see every cloud or Android developer hunting for those laptops.


It works until it doesn't. We had cases where classes were loaded in a different order on Linux, causing issues that we could not repro on Windows.


Interesting, but in that case you changed OS rather than changing CPU ISA, so not quite the same thing.


No, it's exactly the same thing. The more variables you change, the harder it will be to debug a problem.


I've deployed C++ code on ARM in production that was developed on X64 without a second thought, though I did of course test it first. If it compiles and passes unit tests, 99.9% of the time it will run without issue.

Going from ARM to X64 is even less likely to have issues as X64 is more permissive about things like unaligned access.
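
For instance, a minimal sketch of the unaligned-access point (function names made up for illustration): the raw pointer cast is undefined behavior on both architectures, but it usually "just works" on x86, while some ARM cores will fault or take a slow path; memcpy expresses the same read portably.

    #include <cstdint>
    #include <cstring>

    // UB, but typically tolerated on x86; may trap on ARM cores that
    // don't allow unaligned 32-bit loads (or when the compiler assumes alignment).
    uint32_t read_u32_unsafe(const unsigned char* p) {
        return *reinterpret_cast<const uint32_t*>(p);
    }

    // Portable: memcpy describes an unaligned read without UB, and
    // compilers lower it to a single load where the ISA permits it.
    uint32_t read_u32_portable(const unsigned char* p) {
        uint32_t v;
        std::memcpy(&v, p, sizeof v);
        return v;
    }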

People are making far too big a deal out of porting between the two. Unless the code abuses undefined behavior in ways that you can get away with on one architecture and not the other, there is usually no issue. Differences in areas like strong/weak memory ordering are hidden behind APIs like POSIX mutexes or std::atomic and generally don't have to be worried about.
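
To illustrate the memory-ordering point with a minimal sketch (names invented, not from the parent): a hand-rolled flag handoff is a data race that x86's strong ordering often lets you get away with, while ARM's weaker ordering is more likely to expose it; the std::atomic version below is correct on both.

    #include <atomic>
    #include <thread>

    int payload = 0;
    std::atomic<bool> ready{false};

    void producer() {
        payload = 42;
        // Release: the payload write becomes visible before 'ready' does,
        // on x86 and ARM alike.
        ready.store(true, std::memory_order_release);
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire)) { /* spin */ }
        int seen = payload;  // guaranteed to observe 42 on any architecture
        (void)seen;
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
    }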

The only hangup is usually vector intrinsics or ASM code, and that is not found in most software.
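
That hangup tends to look something like this hypothetical snippet: the SSE intrinsics simply don't exist on ARM, so arch-specific paths get fenced off behind feature macros with a scalar fallback.

    #include <cstddef>
    #if defined(__SSE2__)
    #include <emmintrin.h>
    #elif defined(__ARM_NEON)
    #include <arm_neon.h>
    #endif

    // Adds two float arrays; n is assumed to be a multiple of 4 for brevity.
    void add_arrays(float* dst, const float* a, const float* b, std::size_t n) {
    #if defined(__SSE2__)
        for (std::size_t i = 0; i < n; i += 4)   // x86 path
            _mm_storeu_ps(dst + i, _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    #elif defined(__ARM_NEON)
        for (std::size_t i = 0; i < n; i += 4)   // ARM path
            vst1q_f32(dst + i, vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
    #else
        for (std::size_t i = 0; i < n; ++i)      // portable scalar fallback
            dst[i] = a[i] + b[i];
    #endif
    }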

For higher level languages like Go or Java, interpreted languages like JavaScript or Python, or more modern languages with fewer undefined edge cases like Rust, there is almost never an issue.

This is just not a big deal unless you're a systems person (like Linus) or developing code that is really really close to the metal.


I've developed for x86, and deployed on x86. Some years later we decided to add ARM support. Fixing the ARM-only bug made our x86 software more stable: it turns out some one-in-a-million issues on x86 happen often enough on ARM that we could isolate and then fix them.

Thus I encourage everyone to target more than one platform, as it makes the whole better. This even though there are platform-specific issues that won't happen on the other (like the compiler bug we found).


Apparently Apple had macOS working on x86 for years before they switched their computers to Intel CPUs. The justification at the time was exactly this: by running their software on multiple hardware platforms, they found more bugs and wrote better code. And obviously it made the later hardware transition to Intel dramatically easier.

I would be surprised if Apple didn't have internal prototypes of macOS running on their own ARM chips for the last several years. Most of the macOS / iOS code is shared between platforms, so it's already well optimized for ARM.


They've had it deployed worldwide on ARM -- every iPhone and iPad runs it on an ARM chip.


To add to your example, everyone who targets mobile devices with native code tends to follow a similar path.

Usually, making the application run on the host OS is much more productive than dealing with the emulator/simulator.


Most of my work outside of the day job is developed on x86 and deployed on ARM.

Unless you're talking about native code (and even then, I've written in the past about ways this can be managed more easily), then no, it really doesn't matter.

If you're developing in NodeJS, Ruby, .NET, Python, Java or virtually any other interpreted or JIT-ed language, you were never building for an architecture; you were building for a runtime, and the architecture is as irrelevant to you as it ever was.


> Python

Well I can't speak to some of the others... but Conda doesn't work at all on ARM today (maybe that will change with the new ARM Macs, though), which is annoying if you want to use it on, say, a Raspberry Pi for hobby projects.

Additionally, many scientific Python packages use either pre-compiled binaries or compile them at install-time, for performance. They're just Python bindings for some C or Fortran code. Depending on what you're doing, that may make it tricky to find a bug that only triggers in production.


Sorry, yes this is an exception.

Also one I've come across myself so I'm a bit disappointed I didn't call this out. So... kudos!


If you're on a low enough level that the instruction set matters (i.e. not Java/JavaScript), then the OS is bound to be just as important. Of course you can circumvent this by using a VM, though the same can be said for the instruction set using an emulator.


But that's the other way round. If you have an x86 PC, you can develop x86 cloud software easily. You don't develop cloud software on a Mac anyway (i.e., that's not Apple's focus); you develop Mac software on Macs for other Macs. If you have to develop cloud software, you'll do so on Linux (or WSL or whatever). What is the grand plan here? You'll run an ARM Linux VM on your Mac to develop general cloud software which will be deployed on Graviton?


> If you have to develop cloud software, you'll do so on linux (or wsl or whatever).

I think you are vastly underestimating how many people use Mac (or use Windows without using WSL) to develop for the cloud.


I can say our company standardized on Macs for developers back when Macs were much better than other laptops, but now most of the devs are using them begrudgingly. The BSD userland is a constant source of incompatibility, and the package systems are a disaster. The main reason people are not actively asking for alternatives is that most use their Macs as dumb terminals to shell into their Linux dev servers, which takes the pressure off the poor dev environment.

The things the Mac is good at:

1) It powers my 4k monitor very well at 60Hz

2) It switches between open lid and closed lid, and monitor unplugged / plugged in states consistently well.

3) It sleeps/wakes up well.

4) The built in camera and audio work well, which is useful for meetings, especially these days.

None of these things really requires either x86 or ARM. So if an x86-based non-Mac laptop appeared that handled points 1-4 and could run a Linux closer to our production environment, I'd be all over it.


I think you've hit the nail on the head, but you've also summarised why I think Apple should genuinely be concerned about losing market share amongst developers now that WSL 2 is seriously picking up traction.

I started using my home Windows machine for development as a result of the lockdown and in all honesty I have fewer issues with it than I did with my work MacBook. Something is seriously wrong here.


I think Apple stopped caring about dev market share a long time ago and is instead focusing on the more lucrative hip and young Average Joe consumer.

Most of the teens to early-twenty-somethings I know are either buying or hoping to buy the latest Macs, iPads, iPhones and AirPods, while most of the devs I know are on Linux or WSL. But devs are a minority compared to the Average Joes who don't code yet are willing to pay for nice hardware and join the ecosystem.


Looking at the architecture slide of Apple's announcement about shifting Macs to ARM, they want people to use them as dev platforms for better iPhone software. Think Siri on chip, Siri with eyes and short-term context memory.

And as a byproduct perhaps they will work better for hip young consumers too. Or anyone else who is easily distracted by bright colours and simple pictures, which is nearly all of us.


> I think you are vastly underestimating how many people use Mac (or use Windows without using WSL) to develop for the cloud.

The dominance of Macs for software development is a very US-centric thing. In Germany, there is no such Mac dominance in this domain.


To be fair, in the UK Macs are absolutely dominant in this field.


Depends very much on what you're doing; certainly not in my area (simulation software), at least not for anything other than use as dumb terminals.


Yes, in Germany it's mostly Linux and Lenovo / Dell / HP desktops and business-type laptops. Some Macs, too.


I have no idea where in Germany you're based, or what industry you work in, but in the Berlin startup scene, there's absolutely a critical mass of development that has coalesced around macOS. It's a little bit less that way than in the US, but not much.


Berlin is very different from the rest of Germany.


This. According to my experience and validated by Germans and expats alike, Berlin is not Germany :)


In Norway, where I live, Macs are pretty dominant as well. Might be that Germany is the outlier here ;-)


When I go to Ruby conferences, Java conferences, academic conferences, whatever, in Europe, everyone - almost literally everyone - is on a Macintosh, just as in the US.


Most people don’t go to conferences.


Ruby conference goers don't represent all the SW devs of Europe :)


Why do you think not?

And why not Java developers?

They seem like pretty polar opposites in terms of culture, but all still turn up using a Macintosh.


Because every conference is its own bubble of enthusiasts, and software engineering is a lot more diverse than Ruby, from C++ kernel devs to firmware C and ASM devs.

Even the famous FailOverflow said in one of his videos that he only bought a Mac because he saw everyone at conferences had Macs, so he figured that must mean they're the best machines.

Anecdotally, I've interviewed at over 12 companies in my life and only one of them issues Macs to its employees; the rest were Windows/Linux.


True, but it is full of developers using Windows to deploy on Windows/Linux servers, with Java, .NET, Go, Node, C++ and plenty of other OS-agnostic runtimes.


Given that the US has an overwhelming dominance in software development (including for the cloud), I think the claim that this is only a US phenomenon is somewhat moot. As a simple counterpoint, the choice of development workstation in the UK seems to mirror my previous experience in the US (i.e. Macs at 50% or more).


My experience in Germany and Austria mirrors the GP's, with Windows/Linux laptops being the majority and Macs being present in well-funded, hip startups.


Same in South Africa (50% Mac, 30% Windows, 20% Ubuntu) and Australia.


> You don't develop cloud software on a mac anyway

I've got anecdata that says different. My backend/cloud team has been pretty evenly split between Mac and Windows (with only one Linux on the desktop user). This is at a Java shop (with legacy Grails codebases to maintain but not do any new development on).


Mac is actually way better for cloud dev than Windows is, since it's all Unix (actual Unix, not just Unix-like). And let's be honest, you'll probably be using docker anyway.


Arguably now, with WSL, Windows is closer to the cloud environment than macOS. It's a true Linux kernel running in WSL 2, no longer a compatibility shim over the NT kernel.


Yep. WSL 2 has been great so far. My neovim setup feels almost identical to running Ubuntu natively. I did have some issues with WSL 1, but the latest version is a pleasure to use.


Do you use VimPlug? For me :PlugInstall fails with cannot resolve host github.com


I do use VimPlug. Maybe a firewall issue on your end? I'm using coc.nvim, vim-go, and a number of other plugins that installed and update just fine.


That is just utter pain though. I've tried it and I am like NO THANKS! Windows software interoperates too poorly with Unix software due to different file paths (separators, mounting) and different line endings in text files.

With Mac all your regular Mac software integrates well with the Unix world. XCode is not going to screw up my line endings. I don’t have to keep track of whether I am checking out a file from a Unix or Windows environment.


Your line-ending issue is very easy to fix in git:

`git config --global core.autocrlf true`

That configures git to check out files with CRLF line endings and convert them back to plain LF when you commit.


Eating data is hardly a fix for anything, even if you do it intentionally.


If the cloud is mostly UNIX-like and not actual UNIX, why would using “real UNIX” be better than using, well, what’s in the cloud?


Agree, although I think this is kind of nitpicking, because "UNIX-like" is pretty much the UNIX we have today on any significant scale.


macOS being certified UNIX makes no sense in this argument. It doesn't help anything, as most servers are running Linux.


I develop on Mac, but not mainly for other Macs (or iOS devices), but instead my code is mostly platform-agnostic. Macs also seem to be quite popular in the web-frontend-dev crowd. The Mac just happens to be (or at least used to be) a hassle-free UNIX-oid with a nice UI. That quality is quickly deteriorating though, so I don't know if my next machine will actually be a Mac.


True, but then the web-frontend dev stuff is several layers away from the ISA, isn't it? As for the Unix-like experience, from reading other people's accounts, it seemed like that was not really Apple's priority. So there are ancient versions of utilities due to GPL aversion and stuff. I suppose Docker, Xcode and things like that make it a bit better, but my general point was that it didn't seem like Apple's main market.


> So there are ancient versions of utilities due to GPL aversion and stuff.

They're not ancient, but are mostly ports of recent FreeBSD (or occasionally some other BSD) utilities. Some of these have a lineage dating back to AT&T/BSD Unix, but they are still the (roughly) current versions of those tools found on those platforms, perhaps with some Apple-specific tweaks.


It works great though, thanks to Homebrew. I have had very few problems treating my macOS as a Linux machine.


> You don't develop cloud software on a mac anyway

You must be living in a different universe. What do you think the tens of thousands of developers at Google, Facebook, Amazon, etc etc etc are doing on their Macintoshes?


> What do you think the tens of thousands of developers at Google ... are doing on their Macintoshes?

I can only speak of my experience at Google, but the Macs used by engineers here are glorified terminals, since the cloud-based software is built using tools running on Google's internal Linux workstations and compute clusters. Downloading code directly to a laptop is a security violation (with an exception for those working on iOS, Mac, or Windows software).

If we need Linux on a laptop, there is either the laptop version of the internal Linux distro or Chromebooks with Crostini.


They individually have a lot of developers, but the long tail is people pushing to AWS/Google Cloud/Azure from boring corporate offices that run a lot of Windows and develop in C#/Java.

edit: https://insights.stackoverflow.com/survey/2020#technology-pr...


>What do you think the tens of thousands of developers at Google, Facebook, Amazon, etc etc etc are doing on their Macintoshes?

SSH to a Linux machine? I get that cloud software is a broad term that includes pretty much everything under the sun. My definition of cloud dev was a little lower level.


This is the same Linus who's recently switched his "at home" environment to AMD...

https://www.theregister.com/2020/05/24/linus_torvalds_adopts...


Which is still x86...?

What point are you trying to make?


So? Linus doesn’t develop for cloud. His argument still stands.


Because when you get buggy behaviour from some library that was compiled for a different architecture, it's much easier to debug if your local environment is similar to your production one.

Yeah, I'm able to do remote debugging in a remote VM, but the feedback loop is much longer, hurting productivity, morale and time-to-fix: a lot of externalised costs that all engineers with reasonable experience are aware of. If I can develop my code on the same architecture it'll be deployed on, my mind is much more at peace; when developing on x86_64 to deploy on ARM, I'm never sure whether some weird cross-architecture bug will pop up. No matter how good my CI/CD pipeline is, it will never account for real-world usage.


On the other hand, having devs on an alien workstation really puts stress on the application's configurability and adaptability in general.

It's harder in all the ways you describe, but it's much more likely the software will survive migrating to the next Debian/CentOS release unchanged.

It all boils down to the time scale of the project.


I'd say that in my 15-year career I've had many more tickets related to bugs that I needed to troubleshoot locally than issues with migrating to a new version/release of a distro. To be honest, it's been 10 years since the last time I had a major issue caused by a distro migration or update.


> "Macs aren't harbingers of anything."

I have to agree. It's not like we're all running Darwin on servers instead of Linux. Surely the kernel is a more fundamental difference than the CPU architecture.


ARM Macs mean more ARM hardware in the hands of developers. That means ARM Docker images that can be run on the hardware at hand, and easier debugging (see https://www.realworldtech.com/forum/?threadid=183440&curpost...).


> They have set literally no trend in the last couple of decades, other than thinness at all costs

Hahaha, then you have not been paying attention. Apple led the trend away from beige boxes. The style of keyboard used. Large trackpads. USB. First to remove the floppy drive. Hardware, software and web design have all been heavily inspired by Apple. Just look at the icons used, first popularized by Apple.

The Ubuntu desktop is strongly inspired by macOS. Shipping an operating system with drivers preloaded through the update mechanism was pioneered by Apple; Windows finally seems to be doing this.


Because if you're developing apps to run in the cloud, it's preferable to have the VM running the same architecture that you're developing on.


Maybe what he means is: if Macs are jumping on the trend, man, that must be a well-established trend; they're always last to the party.



