Show HN: A Modern C/C++ build tool (Simple, Fast, Cross-platform) (xmake.io)
89 points by waruqi on Oct 25, 2019 | 56 comments



I so badly want to get off the C/C++ build system roller coaster. I've gone from make to autoconf to various IDEs to CMake to Meson, and have looked at a bunch of others but never made the jump.

Eventually fatigue sets in, you pick a tech, and stick with it even as the tech fades into obscurity and nobody can figure out how to build your project anymore, let alone integrate it :/


I got off the C/C++ roller coaster a long time ago. Currently staying on D. Others get off at the Rust station. I don't intend to ever go back unless forced by my employer.


I have it easy, MSBuild, and when not on Windows, CMake.

Then again, I only need it to the extent of Java/.NET mixed builds with C++.


> MSBuild

OMG don't say the name, it causes me physical pain. That XML crap which can't decide whether to be a shitty static GUI-editable build format or a proper description language that you can actually use. It ended up being neither (to be fair the first can't really be achieved for non-trivial projects). It's one of the worst designs I've had to deal with.

(I'm still in the process of cleaning up hundreds of .vcxproj files, each of which has multiple thousands of lines of auto-generated boilerplate in it, using the crutch that is .props files).

> and when not on Windows, CMake.

Even better, let's add another abomination, in the form of CMake, on top of MSBuild. And give up the last bit of control over your build that you could hope to have.


But, let's not forget to mention, MSBuild the "execution engine" is absolutely fantastic... Working in Visual Studio, it reliably detects the minimal amount of things to rebuild.


> it reliably detects the minimal amount of things to rebuild

So they managed to implement graph traversal, which is a basic interview question everywhere? Every dumb build system does it, but for Microsoft that's an achievement!


It's not about walking a dependency graph, but about getting the dependencies 100% right. In C, that's indeed very difficult, or at least a lot of work.

I don't know if you have tried MSBuild, but I've spent countless hours cleaning up .vcxprojs, shifting things around, factoring things out into external .props files, etc. Even on major changes I was often surprised how little rebuilding was necessary. The granularity at which the system detects changes is not just per-file. It has per-file caches for the set of preprocessor variables, the set of include paths, all the little compilation and link settings, and so on.

So if you change the build file, it will re-execute the build file processor and regenerate all the necessary build information for each C file. Then it only rebuilds a file if a build dependency has actually changed in a way that is relevant for that file. For example, the order in which the preprocessor variables are defined does not matter (as long as they don't overwrite each other), but the ordering of include paths does. Also, if any of the files that were #include'd (transitively) in the last build have changed since then, a rebuild is needed.
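Conceptually it boils down to hashing a normalized view of each translation unit's build inputs and rebuilding only when that fingerprint changes. A rough sketch of the idea (not MSBuild's actual code), written in Lua for brevity:

  -- Sketch only: fingerprint one translation unit's build inputs.
  local function file_hash(path)
    local f = assert(io.open(path, "rb"))
    local data = f:read("*a")
    f:close()
    return #data .. ":" .. data           -- placeholder; a real tool would use a proper content hash
  end

  local function fingerprint(defines, include_dirs, included_files)
    local parts = {}
    for _, d in ipairs(defines) do table.insert(parts, d) end
    table.sort(parts)                     -- -D order is irrelevant, so normalize it
    for _, dir in ipairs(include_dirs) do -- -I order matters, so keep it as given
      table.insert(parts, "I:" .. dir)
    end
    for _, f in ipairs(included_files) do -- headers pulled in (transitively) in the last build
      table.insert(parts, f .. ":" .. file_hash(f))
    end
    return table.concat(parts, "\0")
  end

  -- rebuild foo.c only if fingerprint(...) differs from the value cached after the previous build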

Compare that to the average crappy Makefile setup. The best you could achieve with Make would never rebuild as little, or reliably rebuild all the stale build products. I like Make for other reasons, but MSBuild is better and more robust.


I have spent a large amount of time working with MSBuild while migrating an MSBuild build into a different build system (including working on the implementation of a two-way converter, from and to MSBuild).

What you described (a cache based on preprocessor flags) is really cool, but in practice it does not give a huge win compared to a proper modern build system like Bazel, and a decent user interface (including the build language) and extensibility are much more important. Especially because it's super hard to configure an MSBuild project with proper dependencies between modules.

I have a friend who worked at MS. He told stories about how, in their VS projects, dependencies between modules (projects) were not configured; developers had to manually invoke compilation of dependencies. In theory it is possible to configure MSBuild properly; in practice even MS employees failed to do so.

But I don't consider MSBuild a build system. For me it's just a project format for VS which some developers commit to VCS. It's great that MSBuild allows you to have a proper C++ project definition with a custom command to actually build it (an external build system), so VS sees the proper project structure but developers don't have to deal with vcxproj files.


> He told stories about how, in their VS projects, dependencies between modules (projects) were not configured

Yes that's still an issue at my workplace as well. When I joined, there were no dependencies - all files were compiled redundantly for each project. So there were no issues with regards to reliably rebuilding dependencies, but of course it takes a lot longer to build. And there are serious maintainability issues with de-facto internal libraries, because adding or removing files to/from each "library" means that each dependent project file has to be updated. I've started to split out a few "library projects", and now we're starting to get the rebuilding issue. I think dependencies can be configured in *.sln files, but that probably means that they have to be configured separately for each project, i.e. it probably can't be done automatically when simply importing the library's .props file into a .vcxproj.

A nice middle ground could be to make .props files that just include .c files, meaning the library's files will be part of each project, so no rebuilding issues and no maintenance issues since the list of files that belongs to a library is maintained in one isolated place (the .props file). But that again means the files are built redundantly in each project.

But the long-term solution will be to maintain the build descriptions in better-suited in-house data structures (yeah like CMake, just not that scary) and to generate the MSBuild cruft from that.


I'm not sure how it differs from all the existing tools really, it looks like it does more or less the same stuff, just slightly differently. Also it does look like it requires that xmake is itself installed on the target system.

Personally I'd like something similar to autotools, only much saner, in that you write some sort of script (or definition file or whatever) that describes your project and its requirements, and then the tool generates a configure shell script for Unix and Windows (I mean two configure scripts, one for Unix and one for Windows) that itself generates a makefile and/or project file.

So, like autotools, the recipient of the code will not need to have your magic build tool installed, just the common tools available on their system (shell, make, cc for Unix/MinGW, Visual Studio or whatever on Windows). This can be very useful especially for libraries (it is annoying when every library wants its own special snowflake build tool).


xmake can also generate other project files, e.g. makefile, vsproj, cmakelists, compile_commands, etc.

  xmake project -k makefile
  xmake project -k vs2019
  xmake project -k cmakelist
  xmake project -k vsxmake 
  ...


This still requires xmake to be installed. CMake does that too, as does Premake and a bunch of other tools, but they all need to be installed - except Autotools.

What I'm talking about is generating a script for something that is part of the OS itself (like a shell script, which is what Autotools does) that itself generates the Makefile, which uses a widely available standard tool (Make). So a user (I include library users - as opposed to library developers - as "users" here) won't need to install xmake/cmake/whatever just to build the program/library; they only need to have a shell and make, which are available everywhere. On Unix at least; on Windows it'd need to generate a batch file for MSVC (this is where Autotools fall short, since they're made for Unix only).


Xmake has its own package management repository: https://github.com/xmake-io/xmake-repo

And it also supports self-built distributed repositories and third-party repositories (e.g. vcpkg, conan, clib, homebrew).

  add_requires("libuv master", "ffmpeg", "zlib 1.20.*")
  add_requires("tbox >1.6.1", {optional = true, debug = true})
  target("test")
    set_kind("shared")
    add_files("src/*.c")
    add_packages("libuv", "ffmpeg", "tbox", "zlib")


Just as a heads up, you should probably remove the

  yaourt xmake
section of the install instructions (the Arch Linux part).

Yaourt is no longer actively maintained and poses security risks. Just linking to the AUR package is usually enough.


Ok, I will look at it. Thanks!


Using third party package repositories is a really, really smart move. This way you can expand the universe of available C++ packages by combining different package management systems. In my experience, Conan is already pretty solid. But if a package is not available in Conan, chances are it's available in Vcpkg.


Interesting.

Visually, it looks a lot like Meson's Mesonfile syntax, in some ways.

Does the indentation have semantic meaning at all, or is it just to convey a hierarchy?


No, the indentation is for human readability only.
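For example, since xmake.lua is just Lua, these two spellings should describe the same target (a minimal sketch):

  -- indented, as usually written
  target("test")
    set_kind("shared")
    add_files("src/*.c")

  -- flat: the indentation carries no meaning, only the target()/set_*/add_* calls do
  target("test")
  set_kind("shared")
  add_files("src/*.c")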


Yes.


Unfortunately no build tool can ever really improve productivity in absolute terms. They can only ever "improve" relatively, i.e. by sucking less and reducing productivity less than some other competing build tools.

The time that is spent dicking around with these silly build systems/tools is always time that is wasted and sunk into useless activities without any value added in the end product.


I completely disagree. Do you want to go back to typing g++ commands by hand for each source file? That's not a build tool and yet it is objectively worse than even a crappy build tool.


No, ofc nobody wants to do that and that's not really what I'm saying. I'm saying that all current build tools suck and can only ever decrease your software's value and decrease productivity.

Let me illustrate. Your software is a product X. It has some value V (by some measure of value, perhaps $ earned). The product X is the ultimate output of some build tool T.

Now imagine you had this incredible build tool that produced X without any programmer doing any build-related work ever at all. Your build output is X and value is V. 100% of your developer effort can go towards working on X and increasing V!

Now in reality you spend some amount of time writing build files, working with the build, fixing bugs in the build files, generally maintaining it. This is typically non-zero effort, and the cost grows more than linearly with the number of build configurations/platforms supported, etc.

However the output of the build tool is still the same X and value is still V. So your build tool added 0 value!

In fact any build tool can only ever decrease your product's value. Why? How? Because the cost of messing with the build is non-zero, and you spend time on it that could otherwise be spent working on actual things that produce more value (such as adding new features or fixing bugs or whatever). I.e. you can only put 100% minus the "build effort" amount of effort towards working on X and increasing V.


> Now imagine you had this incredible build tool that produced X without any programmer doing any build-related work ever at all. Your build output is X and value is V. 100% of your developer effort can go towards working on X and increasing V!

That is only the case for the simplest of software though. "Build systems" are often much more than just build systems: they are project management tools.

e.g. quite often I have to perform some preprocessing or code generation.

I could :

- A: write a $cross_platform_scripting_language script and integrate it into my build system & CI, ensure that the script interpreter is correctly found, that the build dependency graph is correct, etc. etc. But just getting the same version of Python to work reliably on 5+ different platforms is already quite a pain.

- B: write the preprocessor directly in my build system language (in my case, most of the time CMake).

Option B takes wayyyyy less time in my experience than option A, even though it looks more "build-related" than option A.


Right, like I said, some build tool can improve your productivity relative to some other build tool. But did it create any absolute value in the end product?

Let's say you first build your product with autotools and you sell it for X$ per unit. If you later port your application to CMake, what value does CMake create in the end product? Can you sell it for more than X$ on the basis of having been built with CMake? Usually no. The build system created 0 absolute value gain. You can claim that, oh, CMake is simpler than autotools and will save some time in the future, thus creating a gain relative to autotools. Yet it's not capable of generating absolute value added. In absolute terms every build tool can only take effort away from the activities that produce the value, i.e. software features.


> What value does CMake create in the end product? Can you sell it for more than X$ on the basis of having been built with CMake? Usually no.

That is true for every technological choice.

> In absolute terms every build tool can only take effort away from the activities that produce the value, i.e. software features.

Again you seem to think that build systems don't contain software features. I don't think that this is true. My build system helps me to refactor things that I would have to do by hand otherwise, and which are pretty project-specific. For instance scanning subprojects for their license information and generating a .cpp with that info, looking for all uses of a particular token and listing the files using that token for them to be included somewhere, etc etc.


Looks cool, but is there a tl;dr for why I’d use this instead of something like Bazel?


I'm not related to xmake, but it seems a much simpler project than Bazel, which is much more opinionated and contains many more bells and whistles. Bazel may be suited for large, structured (read 'enterprise') code bases, but I can see as big a market for something like this as well.


I haven't compared it to Bazel, but if you like xmake's project description style, or if you like Lua, you can try it.
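For reference, a minimal xmake.lua looks roughly like this:

  -- build one executable from all C sources under src/
  target("hello")
    set_kind("binary")
    add_files("src/*.c")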


You should definitely take a look at Bazel. This is not to say Bazel is better than the one you wrote, but clearly Bazel is becoming an industry standard, making other build tools obsolete.


The de facto industry standard is cmake; I have never even seen bazel in the wild, and I work across many companies on big C++ code bases. I primarily use meson, and have even seen some cmake projects moved to meson, but I am under no illusions about it replacing cmake.


cmake is mediocre. Bazel is the future.


OK, I will look at it. As far as I know, cmake seems to be more popular than bazel; at least I rarely see projects that use bazel, although I don't like the syntax of cmake very much.


Yes, I guess comparing with the de facto standard solution (cmake) would be more beneficial.


I'm sure once it works, bazel is great and all, but Windows seems to have been an afterthought (which makes that whole "industry standard" thing a bit "complicated").

The hoops one has to jump through to get it running on Windows are a joke (TBH, the list of prerequisites and potential issues looks like a UNIX programmer was confronted for the first time with Windows):

https://docs.bazel.build/versions/1.1.0/install-windows.html


Bazel, like most Google open source projects, has a strong focus on Google's internal needs. Another similar Bazel surprise is how many hoops you have to jump through for the rare task of... debugging a cc_binary on macOS: https://github.com/bazelbuild/bazel/issues/2537


I'd say it is Apple's fault, not Google's: Bazel correctly does not include absolute paths in binaries (Bazel focuses on build reproducibility, which is very good for caching and debugging of builds), and Apple's toolchain does not provide a convenient way to work with such binaries.


As a user, I couldn't care less. My goal is to build my application in debug mode. If the build system's view of the world is incompatible with the actual world, is that a failure of the world or the build system? At the end of the day, I still need to get my debug build working...


> Bazel, like most Google open source projects, has a strong focus on Google's internal needs

This is only partially true: internally they have Blaze, which is a different version of Bazel. I think Bazel is just not yet mature enough, and they released version 1.0 too early.


Bazel is an obscure tool mostly used by Google.

The industry standard is CMake.


CMake today is what autotools was yesterday: too buggy, too unreliable, fragile, slow. Bazel is a properly made build system.

Basically, CMake does not guarantee correct incremental builds, Bazel does (I had to kill the CMake cache thousands of times to make sure everything was rebuilt correctly after changes in the CMake definitions). CMake is incompatible with effective distributed building (you have to use cpp and distcc), Bazel is designed with distributed builds in mind. CMake is incompatible with effective caching (you have to use the suboptimal ccache), Bazel has proper caching based on input checksums. CMake macros are a pain, Bazel macros and custom rules are nice. And so on.


This tool is (again) just rehashing the same old build methods only with different syntax.

I'd really like to see a build tool that would take things to the next level.

- As a developer, if I include "foobar.h" in my code, why do I have to work out the include paths myself? Why can't the build system search for the file and resolve the include paths? If there are any unresolvable ambiguities, then ask me.

- As a developer, if I use for example std::thread, why do I need to manually add -pthread to my build/linker flags? Why can't the build system do this for me?

- As a developer if I use let's say Win32 API in my code why do I have to manually add the linker libs and flags. Why can't the build system do this for me?

- As a developer if I use c++14 features in my code, why do I need to manually add the -std=c++14 whatever flag in my build files? Why can't the build system do this for me?

etc.

Everything else is just re-hashing the same old tiring build methodologies with different syntax/problems/bugs/shortcomings/portability bugs.
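To make the complaint concrete, even with a newer tool you typically still spell these out by hand, e.g. in an xmake.lua (a rough sketch; I'm assuming set_languages/add_syslinks/add_includedirs are the relevant xmake directives):

  target("app")
    set_kind("binary")
    add_files("src/*.cpp")
    set_languages("c++14")      -- the -std=c++14 flag, declared manually
    add_syslinks("pthread")     -- the pthread link dependency, declared manually
    add_includedirs("include")  -- the include paths, declared manually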


> If there are any unresolvable ambiguities, then ask me.

What if there is no ambiguity, only a single wrong answer, and now it "works" but you include the wrong thing?

> - As a developer, if I use for example std::thread, why do I need to manually add -pthread to my build/linker flags? Why can't the build system do this for me?

Because adding -pthread and -lpthread have different semantics (-pthread defines _REENTRANT, which may change things - grep your /usr/include for that ;p), and you may want one or the other regardless of whether you use std::thread.

> - As a developer if I use let's say Win32 API in my code why do I have to manually add the linker libs and flags. Why can't the build system do this for me?

Which linker libs? On my computer I must have 5 versions of those right now, not counting UWP and various MinGW versions. Also, sometimes I want the debug standard library and sometimes not.

> - As a developer if I use c++14 features in my code, why do I need to manually add the -std=c++14 whatever flag in my build files? Why can't the build system do this for me?

determining if you are using C++14 features sounds like a corollary of the halting problem


I think Borland C++ did all of the above in the 90s; having to manually tell GCC which libraries to link against (and even in the "correct" order!) was something I found baffling back when I first tried MinGW.

Though it did help that all libraries and headers are part of the compiler itself. And perhaps it was using something like MSVC's #pragma comment(lib, "libname") (which I really wish GCC/Clang would implement at some point, as it is very convenient) in every header it came with to properly associate it with the relevant library. Though I'm almost sure I replaced the OpenGL headers it came with and recreated the .lib file to be more complete (the default had only OpenGL 1.0 symbols), yet it still found the new stuff.


There are so many nice build tools for C, but unfortunately very few users will be happy to install and learn yet another tool.

And no matter which build system you choose, you'll find users who think your choice is awful and you should have picked their one true build system.


Great tool, good job!


Why not Python? Syntax is much cleaner and there’s nothing you can’t achieve. Even performance is on par.


Xmake has LuaJIT built in, so it has no external dependencies and can be used right after installation. Although most distributions have Python built in, there are still many systems that do not install Python by default, such as Windows.

Also, I don't like the syntax of Python.


Python as in Scons? Otherwise I don't understand your comment.


Scons is effectively dead, and I wish people would start removing it from the Python build systems web pages.

Meson is written in Python and isn't dead. However, Meson requires a partner to actually build things--something like cmake or ninja.

However, I think the original poster meant "Why Lua instead of Python?" And the only real answer is "Because that's what the author wanted to use."


I only ever used SCons once, but it's had a handful of releases this year (containing what looks like some quite significant updates and improvements), the last being in August. In what way is it effectively dead?


It doesn't have any marquee products using it since Blender, IIRC, dumped it.

For auxiliary things that you don't become an expert in, like source control, build systems, etc., it's vitally important to have a vibrant community around it so you can simply look up the answer and get back to your job. Or, you can have an emergent AI robot like Randal Schwartz, Wietse Venema, or Armin Rigo handle that support (seriously, do those guys ever sleep?). But you need to have one or the other, and SCons never seemed to have either.

I really don't understand why SCons never took off. I think it was simply that it was too early. Nobody actually cared about cross-platform in the sense of Windows/OS X/Linux. Of course, Blender did care about that, and still dumped it. So, YMMV.

I think people only started to genuinely care about "cross platform" when developing both cloud code and client code simultaneously became a thing. And then that accelerated with the polyglot of languages layered on top of Javascript.


BTW: there's also Waf, written in Python, which also isn't dead.


We are also happy with waf, which gives us a lot of flexibility (e.g., using Python libs that are not build-related) and uses a language that the team already knows. (https://waf.io)


It’s pining for the fjords.


yes.


There's also Snakemake, which is mostly used in bioinformatics pipelines, but it's sort of like a Makefile program with inline python enabled.

It's nice and I could imagine it getting some adoption because of its flexibility, but my C++ projects tend to have their needs met with a Makefile.



