- Cannot handle multiple outputs for a single rule
- Does not rebuild when flags change
- Make rules may contain implicit dependencies
- Slow for large codebases
- Does not understand how to build for multiple platforms, sucks for cross-compiling
- Recursive make sucks (job control does not work across recursive invocation boundaries)
- You must create output directories yourself
- You must create your own rule to clean
This adds up to a fragile & slow build system, where you have to do a full rebuild to have a reasonable level of assurance that your build is correct—incremental builds aren’t safe.
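To make the “does not rebuild when flags change” item concrete: Make only compares timestamps of files, so changing CFLAGS on the command line leaves stale objects in place. The usual hand-rolled workaround is to record the flags in a file and make objects depend on it, something like this sketch (the `.cflags` name is arbitrary):

```makefile
CFLAGS := -O2 -Wall

# Rewrite .cflags only when the flags actually differ, so its
# timestamp changes exactly when CFLAGS changes.
.cflags: FORCE
	@echo '$(CFLAGS)' | cmp -s - $@ || echo '$(CFLAGS)' > $@

# Every object now rebuilds when the recorded flags change.
%.o: %.c .cflags
	$(CC) $(CFLAGS) -c $< -o $@

FORCE:
```

The point isn’t that this is hard to write, it’s that you have to know to write it, for every kind of flag, in every Makefile.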
There have been a few attempts to make a “better make” over the years, like Jam, SCons, Ninja, Tup, etc., each with its own pros and cons.
The new generation of build tools (Bazel, Buck, Pants, and Please) are all attempts to redesign build systems so that the design is resistant to the flaws that plague Make build systems. You can use a shared cache (shared between different users) fairly easily, and you have reasonable assurance that incremental builds are identical to full builds (so your CI pipeline can move a lot faster, developers almost never have to "make clean" when they change things, etc.).
Personally I’m working on a project right now that uses Bazel (which is similar to Please) and is for an embedded system. It’s been a great experience: I can pass a flag to Bazel to tell it to build for the embedded target using the cross compiler, or for the native host, which makes it easy to share code between tools and the target system, and I can do things like write tests that run on both the target & host. Anyone who does any cross-compiling is missing out if they are using Make—but, do note that setting up a cross-compiling toolchain in Bazel isn’t exactly a cakewalk.
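For a sense of what “pass a flag” means here: Bazel’s `--platforms` flag selects the target platform per invocation (the platform label below is hypothetical; yours will depend on how you define your platforms):

```shell
# Same source tree, two targets: cross-compile the firmware,
# build the host-side tooling natively.
bazel build //app:firmware --platforms=//platforms:my_embedded_target
bazel build //tools:flasher

# Run the shared library's tests against the embedded toolchain too.
bazel test //lib:shared_tests --platforms=//platforms:my_embedded_target
```

No per-target Makefile gymnastics; the toolchain selection is resolved from the platform definition.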
Make does support multiple outputs, though the syntax sucks. Most of what you are annoyed with, though, is like being annoyed at C for the same reasons: Make is a programming language with a built-in dependency mechanism, and as such you can use it to build whatever you want... now, does it already come with whatever you want? No. I can appreciate wanting something which does, but such systems usually then give you what they want. I don't want my programming language to solve half of these things you want, and yet somehow I have built things on top of Make that solve even the problem of building for arbitrary embedded cross-compile targets. (And hell: if cross-compile toolchains are what you care most about, the king of that space is autotools, which not only correctly supports compiling to everything, always, every time, out of the box, but somehow manages to make that possible with nothing more than you passing some toolchain configuration as command-line arguments to the generated configure script... and one might note that it generates Make, though without taking much advantage of what makes Make great.)
> Make does support multiple outputs, though the syntax sucks.
No, it doesn’t. There’s NO syntax for multiple outputs. If you can show me what the syntax is and prove me wrong, I’d love to see it. At best, there are workarounds for the lack of multiple output support, with various tradeoffs.
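For the record, the workarounds in question look roughly like this stamp-file sketch (file names are illustrative), and the tradeoff is that Make never checks the real outputs, so a deleted or corrupted output goes unnoticed until the stamp is touched:

```makefile
# Nominal target is the stamp; the two real outputs "ride along"
# via empty rules that depend on it.
parser.c parser.h: parser.stamp ;

parser.stamp: parser.y
	bison -d -o parser.c $<
	touch $@
```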
> Make is a programming language with a built-in dependency mechanism, and as such you can use it to build whatever you want...
Make is NOT a programming language. End of story. You can… kind of… build things with it, given that variables in Make can be used like functions, but it’s goddamn horrible and you shouldn’t do it because nobody will want to use it and nobody will maintain it.
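(For the curious, “variables in Make can be used like functions” refers to GNU Make’s `$(call)` mechanism; this is essentially the example from the GNU Make manual:)

```makefile
# A variable expanded via $(call); $(1) and $(2) are the arguments.
reverse = $(2) $(1)
both := $(call reverse,first,second)   # expands to "second first"
```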
At most, you can build something on top of Make, but you’re still facing the limitations of Make and working around them. If you are interested in building something on top of a low-level build system, you want Ninja instead of Make, because Ninja is good at that. Make isn’t.
Make is, at best, something you would want to use for a small C program where all the files are in one directory. Once you grow past that use case, my rule is that you shouldn't be using Make any more, because there are just too many things that Make does wrong.
Make is successful because eh, it’s good enough for small things, you can suffer through the pain if you need to use it for big things, and it was the first tool to solve this problem. We have better tools now. We had better have better tools now! Make is in its fifth decade… if we didn’t improve on Make in that many years, that would be a damning indictment of the entire field.
GNU Make does support multiple outputs, but the feature is very new (it's in the latest release, which came out earlier this year), so if you didn't happen to catch that release announcement you probably missed it. The support is called 'grouped targets', documented in the second half of this page: https://www.gnu.org/software/make/manual/html_node/Multiple-... ; the syntax uses &: in the rule line.
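A grouped-target rule looks like this (per the manual page above; it requires GNU Make 4.3+):

```makefile
# &: tells make that ONE run of this recipe produces BOTH outputs,
# instead of running the recipe once per output.
parser.c parser.h &: parser.y
	bison -d -o parser.c $<
```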
(One point you don't mention in your list of reasons why Make is successful is that it's reliably available everywhere. For projects that ship as source for others to build, that matters, and it delays uptake of fancy new build systems in that segment.)
The big advantage of make to me is I understand it and can figure out what happens when things go wrong. When something doesn't work the way I want with cmake or autotools (I haven't used Bazel etc.), I have to randomly start googling things. Sometimes I literally resort to editing the generated cmake Makefiles because I have no idea how to tell cmake what I want it to do...
The documentation is not so great. What I’m doing is enabling platforms (https://docs.bazel.build/versions/master/platforms-intro.htm...), defining the OS and CPU target for my system, copying the configuration out of Bazel’s “local_config_cc”, and modifying it to fit my use case. This didn’t take me very long, but it’s also not the first time I’ve done it.
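Defining the OS and CPU target boils down to a `platform()` rule in a BUILD file, roughly like this (the name here is made up, and the right constraint labels depend on your chip; `@platforms//os:none` is the bare-metal OS constraint):

```python
platform(
    name = "my_embedded_target",
    constraint_values = [
        "@platforms//os:none",   # bare metal, no OS
        "@platforms//cpu:armv7",
    ],
)
```

The fiddly part isn’t this definition, it’s adapting the copied “local_config_cc” toolchain config to actually honor those constraints.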