I didn’t go into great detail about the reasons, but the jobserver doesn’t address the problem except in the most trivial cases; the core problem is that you can’t cross to a recursive make invocation via multiple edges.
This is fairly common in larger projects, so you end up having to do some hackery to manually sequence make invocations if you want to use recursive make (which is pretty awful).
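To make that concrete, here’s a minimal sketch of the pattern (directory and target names are hypothetical): two rules that both need output from the same sub-make, each expressing that need the only way the parent can, by recursing.

    # app and tests both need lib/libfoo.a, which only lib/Makefile knows
    # how to build, so each rule recurses into lib/ on its own.
    .PHONY: all
    all: app tests

    app: app.o
    	$(MAKE) -C lib          # edge #1 into the sub-make
    	$(CC) -o $@ app.o -Llib -lfoo

    tests: tests.o
    	$(MAKE) -C lib          # edge #2 into the same sub-make
    	$(CC) -o $@ tests.o -Llib -lfoo

    # Under `make -j`, both recipes can run at once, so two sub-makes can
    # walk lib/ at the same time; the jobserver shares job slots between
    # them, but it does nothing to stop the duplicated (and possibly
    # racing) traversal. The usual workaround is to serialize by hand,
    # e.g. give both targets an order-only prerequisite on a single phony
    # "build lib first" rule.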
Honestly, for large projects, Make is an insane choice, notwithstanding the fact that people who are sufficiently savvy at Make can sometimes make it work. (If your tools are bad, you can make up for it with extra staff time and expertise.)
> the core problem is that you can’t cross to a recursive make invocation via multiple edges.
I’ve never had that issue, and used to heavily use recursive make. I carefully benchmarked those make files, and am sure this wasn’t an issue.
I suggest reading the paper “Recursive Make Considered Harmful” before attempting to use make for a large project. It describes a lot of anti-patterns, and better alternatives.
I’ve found every alternative to make that I’ve used to be inferior, and they almost always advertise a “killer feature” that’s a bad idea, or already handled better by make. It’s surprising how many people reimplemented make because they didn’t want to RTFM.
Anyway, the next build system on my list to check out is bazel. Some people that I’ve seen make good use of make say it’s actually an improvement.
> I’ve never had that issue, and used to heavily use recursive make. I carefully benchmarked those make files, and am sure this wasn’t an issue.
So, you’ve never had two different targets that depend on something in a subdirectory? If you’ve just solved this by building subdirectories in a specific order, or building entire subdirectories rather than the specific targets you need, what you’re really doing is constructing an incorrect dependency graph in order to work around make’s limitations. These kinds of decisions make sense for full builds, but they interfere with incremental builds.
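Concretely, the workaround tends to look something like this; a sketch with made-up names, not any particular project’s makefile:

    # Coarse-grained: build whole subdirectories in a hand-written order,
    # then hang real targets off phony "that directory is built" nodes.
    .PHONY: subdirs lib-all gen-all
    subdirs: lib-all

    lib-all: gen-all            # manual ordering between directories
    	$(MAKE) -C lib
    gen-all:
    	$(MAKE) -C gen

    tool: tool.o | subdirs      # "everything over there, first"
    	$(CC) -o $@ tool.o -Llib -lfoo

    # Fine for a full build, but an incremental `make tool` always
    # re-enters lib/ and gen/, because the edge you actually mean
    # (tool -> lib/libfoo.a -> gen/foo.h) never appears in the parent's
    # graph.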
Bazel & family (Buck, Pants, Please, etc.) are among the few build systems that solve this problem well. It’s not an accident that they all use package:target syntax for specifying targets, rather than just using a path, because this allows the build system to determine exactly which file contains the relevant build rule without having to probe the filesystem.
I would love to simply recommend Bazel, but the fact is there is a bit of a barrier to entry, depending on how standard or nonstandard your build rules are and on how you think third-party dependencies should be pulled in. Depending on your project, you could convert to Bazel in an hour just by looking at a couple of examples, or you could have to dive deep into Bazel to figure out how to do something (custom toolchains, custom rules, etc.)
As you observed, the alternatives to make are often inferior, and it’s often because they’re solving the wrong problems. For example, sticking a more complex & sophisticated scripting system in front of make.
If there’s a requirement to use an intermediate target from a subdirectory, and that’s not expressible in the parents or siblings, you could run into issues. I thought you meant it failed to parallelize multiple unrelated targets because they are in multiple subdirectories.
Anyway, the solution to that specific problem is written up in “Recursive make considered harmful”. Make handles it elegantly.
The title is kind of a spoiler, but if you’re using recursive make and care about fine grained dependencies across makefiles, you’re doing something wrong.
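For anyone who hasn’t read it: the paper’s recommendation boils down to a single make instance that includes per-directory makefile fragments, so cross-directory edges become ordinary file dependencies. A rough sketch, with made-up fragment names:

    # Top-level Makefile: include one fragment per directory.
    include lib/module.mk
    include app/module.mk

    # lib/module.mk
    lib_objs := lib/foo.o lib/bar.o
    lib/libfoo.a: $(lib_objs)
    	$(AR) rcs $@ $^

    # app/module.mk
    app/app: app/main.o lib/libfoo.a
    	$(CC) -o $@ $^

    # One make process sees the whole graph, so `make -j app/app` rebuilds
    # exactly the stale files that target depends on, across directories,
    # in parallel.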
As an aside, I wonder if recursive make misuse is why people claim ninja is faster than make. I’ve benchmarked projects before and after converting to ninja, and before the conversion, make was generally using < 1% CPU vs 1000%+ for clang.
Afterwards, ninja was exploiting the same underlying parallelism in the dependency graph as make was. Thanks to Amdahl’s Law, there was no measurable speedup. The only thing I can figure is there’s some perceived lack of expressivity in make’s dependency language, or there are an absurd number of dependencies with trivial “build” steps.
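To put a rough number on that, reading the < 1% CPU figure above as make’s share f of the total work: if a replacement tool speeds that part up by a factor s, Amdahl’s Law caps the overall speedup at

    S = \frac{1}{(1 - f) + f/s} \le \frac{1}{1 - f} \approx 1.01 \quad \text{for } f \le 0.01

i.e. about one percent even if the build tool’s own work became free.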
My experience is that "larger projects" make the investment in their build infrastructure to maximize parallelism and reduce build times, because it pays dividends in terms of turnaround time and thus productivity.
One of the joys of working with someone who has been building large projects for a while is that they just design the build system from the start to be as parallel as practical.
Make isn't great, but if you look at the open-source world, the vast majority of large projects are make-based. The backbone of your average distro is a thin layer on top of what is mostly automake/autoconf in the individual projects. That is because while I can create a toy project that builds with make in a half dozen lines, big projects can extend it to cover those odd edge cases that break a lot of other "better" build systems, particularly when a project starts including a half dozen different languages.
So, while I'm not a make fan, I'm really tired of people pissing on solutions (C also comes to mind) that have been working for decades, because of edge cases or problems of their own creation that come from not understanding the tool.
A well-understood tool is one where people know where the problems lie and work around them. Most "perfect" tools are just project-crashing dragons hiding under the pretty marketing of immature tooling.
> …I'm really tired of people pissing on solutions (C also comes to mind) that have been working for decades, because of edge cases or problems of their own creation that come from not understanding the tool.
You would have to be extraordinarily cynical to think that there’s been no progress in making better build software in the past forty years or so.
Yes, there are definitely plenty of build systems out there that were built from scratch to “fix” make without a sufficient understanding of the underlying problem. I’m not going to name any; use your imagination. But make was never that good at handling large projects to begin with, and it was never especially good at ensuring builds are correct. This is not “pissing on make”, for Chris’s sake, make is from 1976 and it’s damn amazing that it’s still useful. This is just recognizing make’s limitations, and one of those limitations is that make doesn’t handle large projects well in general, specific cases notwithstanding. Part of that is performance issues, and these are tied to correctness: as projects which use make grow larger, it becomes more likely that they run into problems which interfere with either the ability to run make recursively or the ability to run make incrementally.
The “poor craftsman who blames his tools” aphorism is a nice way to summarize your thoughts about these kinds of problems sometimes, and it does apply to using tools like make and C, but it’s also a poor craftsman who chooses tools poorly suited to the task.