My issue with C/C++ code bases in 2023 is the same issue from 2013 and 2003: tooling. With Go, I can download any random code from at least 2018 and do this:
go build
and it just works. All the needed packages are automatically downloaded and built, and fast. Same process for Rust, and even Python to an extent. My understanding is that C++ has never had a process like this, and it's up to each developer to streamline it on their own. If that's no longer the case, I'm happy to hear it. I worked on C/C++ code for years, and at least a third of my development time was wasted on tooling and build issues. I don't ever want to deal with that again.
Tooling is but one reason I hope Rust eats C/C++'s lunch. I think everyone has some story of banging their head against an asinine build process. I am certainly sick of ./configure, make, a huge wait, and then a missing libXYZ error; install XYZ, repeat, missing libABC.
>" I think everyone has some story of banging their head against an asinine build process"
Not my case. I have simple CMake files for my projects and have no problems building on Linux and Windows (I do not do Mac). I do use a few libraries (Postgres, networking - HTTP/TCP/UDP - and some others). So far, no troubles. I do remember it being way more complicated, though. So I am not really missing "Rust eating C++'s lunch". I do understand there are some real-life, huge, insanely complex projects with a bazillion options. Luckily I do not have to deal with that part.
I do not want it to fetch dependencies. I download those once and include a copy per project, updating on an as-needed basis. Over the years I have accumulated enough. To start a new project, all I have to do is clone a template directory and I am ready to go.
> CMake doesn't fetch your dependencies for you and ensure they're the version that your code is compatible with.
That’s a little too harsh. You can certainly make it do that via a combination of git tags (to build from source) and CMake modules/Find*, but yes, it’s much more cumbersome than cargo/npm/go
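For readers who haven't used it: modern CMake's FetchContent module is one way to get the "build from a pinned git tag" behavior described above. A minimal sketch (the project name, library, and tag are placeholders):

```cmake
# Fetch and build a dependency from source, pinned to an exact git tag.
include(FetchContent)
FetchContent_Declare(
  fmt
  GIT_REPOSITORY https://github.com/fmtlib/fmt.git
  GIT_TAG        10.2.1            # an exact tag keeps the version reproducible
)
FetchContent_MakeAvailable(fmt)

add_executable(myapp main.cpp)
target_link_libraries(myapp PRIVATE fmt::fmt)
```

This pins the version, but unlike cargo/npm/go there's no lockfile or transitive resolution; each dependency must be declared and pinned by hand.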
But building something more complex in Rust (than just grabbing a bunch of libraries from Cargo) becomes as tedious as what you typically expect in C++ land. I've read anecdotes of Rust devs struggling with build.rs (Cargo's build-script mechanism) much like people bang their heads against CMake.
This is why other C++ build systems like Bazel and Meson added Rust support, since Cargo/build.rs itself isn't sufficient for all use cases (especially when used alongside other systems languages). And it's not too wild to think of adding Rust support to CMake as well...
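For those who haven't hit it yet: a build.rs is just a Rust program that Cargo compiles and runs before building the crate, communicating with Cargo through `cargo:` lines on stdout. A minimal sketch (the file names and env-var value here are made up for illustration):

```rust
// build.rs -- minimal sketch of Cargo's build-script hook.
// Cargo runs this binary before compiling the crate and parses stdout
// lines starting with "cargo:" as build directives.

fn directives() -> Vec<String> {
    vec![
        // Rerun this script only when these files change.
        "cargo:rerun-if-changed=build.rs".to_string(),
        "cargo:rerun-if-changed=src/shim.c".to_string(),
        // Expose a value to the crate at compile time via env!().
        "cargo:rustc-env=SHIM_BUILT=1".to_string(),
    ]
}

fn main() {
    for d in directives() {
        println!("{d}");
    }
}
```

Real-world build scripts compile C shims, probe system libraries, and generate code here, which is where the CMake-like pain starts.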
So far the best thing that I've seen in the wild for C/C++ code is just outsourcing everything into a Docker container with the build tools, as QMK does: https://docs.qmk.fm/#/getting_started_docker. It has its downsides, but it's just two commands to get a built binary, one of which is cloning the git repository.
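Per the linked QMK docs, the workflow is roughly the following (the `keyboard:keymap` argument is a placeholder for your actual board and keymap):

```shell
# Clone the firmware repository.
git clone https://github.com/qmk/qmk_firmware.git
cd qmk_firmware

# Run the whole toolchain inside a container; no local compilers needed.
util/docker_build.sh keyboard:keymap
```

The container image pins the compilers and build tools, which is what makes this reproducible across machines.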
I really had high hopes for Bazel: that it would either package or reuse some standard build-tool configurations and let you use them by referencing them in a single place in the workspace, with the rest handled by the build system. But that doesn't seem to be how it works now, or how it will work in the foreseeable future. If you want it that hermetic, you need to do it on your own.
That is great for simple projects, but a real toolchain for C++ needs to handle complex real-world code. That means you have a mix of Fortran, Ada, C, Rust, some custom internal corporate language, and so on, all mixed with the C++. If your build system cannot handle that mess, I'm not interested.
> That means you have a mix of Fortran, Ada, C, Rust, some custom internal corporate language, and so on, all mixed with the C++. If your build system cannot handle that mess, I'm not interested
In that case I would just use the Rust toolchain, which I'm sure can handle this stew.
Never going to happen. The standards group doesn't define the tooling. Every vendor has its own set of compiler flags, error messages, and ABI. It's also used on architectures where code for the target hardware has to be compiled on a completely different architecture, i.e. embedded devices.
It's too complex a problem to be solved by a committee with competing interests.
Never say never, there is P2656. We'll see! It's more about making things interoperable than mandating a new tool, but it's still something useful in this space. It also links to previous work.
The C++ community has de facto standardized on CMake, which solves half the problem. It's not pretty, but it gets the job done.
Dependencies can mostly be managed with vcpkg and its CMake integration. The only real problem is when you need something that's not in vcpkg and have to write your own overlay port.
It's not perfect, but the situation is vastly better than it was ten years ago.
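Concretely, vcpkg's manifest mode means dropping a `vcpkg.json` next to the top-level CMakeLists.txt (the project name and dependencies below are placeholders):

```json
{
  "name": "myapp",
  "version": "0.1.0",
  "dependencies": [ "fmt", "libpq" ]
}
```

Configuring with `cmake -B build -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake` then makes vcpkg install the listed ports automatically, and the usual `find_package(...)` calls resolve against them.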
There’s also Conan, which supposedly solves all of these problems, but we weren’t successful at migrating a large CMake project to it. Maybe it would work if you used it from the very beginning.
I remember trying to use Go; it certainly wasn't that easy back then. The GOPATH folder was a really confounding concept. I think they got rid of that, though (been a while).
Sounds like a golden opportunity for C++-next, whatever that will be (cppfront? Circle?), to standardize on a single build and package-management solution.
That errors out when a library is missing, so I figure out where to get it, how to install it, and rerun configure, just to see that something else is missing. That then leads to a version conflict, and then a mess... it only works well for projects with few dependencies, and especially no uncommon dependencies.