Circle is irrelevant because it's closed source. It's been out there since 2019(?) and never gained any adoption or mind share among C++ developers. I looked at it when it first appeared, and it seemed interesting, but why would I spend any time on it when the creator of the language can lose interest tomorrow and shut down the project for good? There is no reason to devote any time or effort to a closed source project when plenty of interesting open source projects exist.
I wouldn't use a closed source C++ compiler either. But with C++ compilers, if one is abandoned, you can switch to another. With Circle, there is nothing else.
The best outcome the Circle author can hope for (and this is likely the purpose of the project) is that some company either buys it, or hires him to work on Circle full time. Circle benefits only one guy, it doesn't benefit the C++ community.
As a lifelong native C++ software engineer, I find cppfront/cpp2 one of the few efforts in C++ that interest me these days.
I am also the hiring manager for our teams, and it's shocking how many applicants proudly include C++20 on their resumes but cannot answer the intentionally open-ended question: tell me about std::move().
IMO: If C++ is to thrive, it is in desperate need of the "10x simpler and safer" vision that is at the core of cppfront.
sure, people should be able to say what std::move does (basically nothing), but people asking the question need to be aware of what they are asking about - most C++ programmers are not writing library code.
Rust is a move-first language. Move is standard; you have to deliberately make anything non-trivial copyable. Rust strongly prefers to be single-assignment and to pass things as read-only references or moves. Those are the right defaults. In C++, the good defaults are all harder, because they were added as afterthoughts to the language.
C++ has been gradually back-porting Rust features, but it provides help only for carefully written new code. You can still get raw pointers out, which breaks safety. To make forward progress with the C++ model, you have to throw stuff out of the language. That breaks too much old code. So they're stuck.
Many, many people, including me, have tried to make C/C++ safer without breaking backwards compatibility too much. It doesn't seem to be possible to do both. You have to disallow some things or the exercise is pointless.
Did this new proposal include array slices for C++? If you have slices, most of the need for pointer arithmetic goes away. But you need either a garbage collector, as in Go, a borrow checker, as in Rust, or restrictive scope rules to track when the underlying array goes away.
> C++ has been gradually back-porting Rust features,
I would hardly say that move semantics, or reusing temporary objects instead of deep copying them, is something invented by Rust.
Also, move semantics were introduced in C++ with C++11, which was in the works since the early 2000s. Rust 1.0 was only released in 2015. Are you actually trying to claim that C++ in 2011 specified in an international standard features that it back-ported from a language that only reached 1.0 in 2015?
> To make forward progress with the C++ model, you have to throw stuff out of the language.
No, not really. Mindlessly removing features only breaks backward compatibility for no reason. We make progress by offering improvements, and it's up to developers to decide how to manage their own projects. Mindlessly breaking compatibility prevents those same developers from benefiting from improvements for no reason whatsoever.
> Many, many people, including me, have tried to make C/C++ safer without breaking backwards compatibility too much. It doesn't seem to be possible to do both. You have to disallow some things or the exercise is pointless.
This is an absurdly silly thing to say. It's akin to complaining that making Rust safe is pointless because Rust still supports unsafe.
The safety problems of C/C++ come from the old features, such as null-terminated strings, arrays without size information, and pointer arithmetic. Until there is a will to kill those off, there will not be much forward progress in safety.
zabzonk, thank you for raising this point and the discussion it teases out.
std::move is nothing but a cast - but it means that for every new class, we should be actively considering whether a move constructor is appropriate for that class. The consequence of a missing move constructor is falling back to the copy constructor... pretty dry stuff, I agree, about as exciting as discussing pass by value versus pass by reference... but for certain domains of programming (embedded), just as critical. Nicolai M. Josuttis, a 20-year veteran of the standards committee, can be quoted as saying "Move semantics, introduced with C++11, has become a hallmark of modern C++ programming." [1]
But if we take a step back and consider the broader picture I think of "the average language proficiency of the team" as being a very real pressure. If the complexity of the code base creeps above the average proficiency of the team, the health of the code base struggles.
So, here's the thing. As a hiring manager for embedded products, if I can't get applicants who know about std::move(), then the scope of the C++ language has outpaced its own talent pool, and something like cppfront becomes all the more critical.
You’ll never find two C++ programmers with the same skill set. Every C++ project has its own peculiar subset of the language. Forget rvalue references, some code doesn’t even use references.
This is a result of backwards compatibility. Features can only be added, not removed (or at least only removed if nobody is using them, like export templates and GC).
As a result, C++ hoovered up many different types of users over the years, who all had their own idea of what they wanted from the language.
For a C++ programmer, navigating this by learning and adapting your skill set for a new project is simply a part of the job. You’ll always be able to find missing spots in even the most grizzled C++ vet’s knowledge.
What level are you hiring at? I could well imagine e.g. fresh graduates not having a good handle on some of the nuances of the language. You can get pretty far without having to refine your understanding of rvalue semantics.
my issue with C/C++ code bases in 2023 is the same issue from 2013 and 2003: tooling. With Go, I can download any random code from at least 2018 and do this:
go build
and it just works. all the needed packages are automatically downloaded and built, and fast. same process for Rust, and even Python to an extent. my understanding is C++ has never had a process like this, and it's up to each developer to streamline it on their own. if that's no longer the case, I am happy to hear it. I worked on C/C++ code for years, and at least a third of my development time was wasted on tooling and build issues. I don't ever want to deal with that again.
Tooling is but one reason I hope Rust eats C/C++ lunch. I think everyone has some story of banging their head against an asinine build process. I am certainly sick of ./configure, make build, huge wait, and then a missing libXYZ error, install XYZ, repeat, missing libABC.
> "I think everyone has some story of banging their head against an asinine build process"
Not my case. I have simple CMake files for my projects and have no problems building on Linux and Windows (I do not do Mac). I do use a few libraries (Postgres, networking - http/tcp/udp - and some others). So far, no troubles. I do remember it was way more complicated though. So I am not really missing "Rust eating C++'s lunch". I do understand there are some real-life, huge, insanely complex projects with a bazillion options. Luckily I do not have to deal with that part.
I do not want it to fetch dependencies. I download those once and include a copy per project, updating on an as-needed basis. Over the years I have accumulated enough. To start a new project, all I have to do is clone a template directory and I am ready to go.
> CMake doesn't fetch your dependencies for you and ensure they're the version that your code is compatible with.
That’s a little too harsh. You can certainly make it do that via a combination of git tags (to build from source) and CMake modules/Find*, but yes, it’s much more cumbersome than cargo/npm/go
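For reference, one way to make CMake fetch and pin a dependency itself is the FetchContent module; a minimal sketch (the fmt library and the tag here are just an arbitrary example, not from the thread):

```cmake
cmake_minimum_required(VERSION 3.14)
project(demo CXX)

include(FetchContent)
# Pin the dependency to an exact tag so builds are reproducible.
FetchContent_Declare(
  fmt
  GIT_REPOSITORY https://github.com/fmtlib/fmt.git
  GIT_TAG        10.1.1
)
FetchContent_MakeAvailable(fmt)

add_executable(demo main.cpp)
target_link_libraries(demo PRIVATE fmt::fmt)
```

This builds the dependency from source at configure time, which is the core of the cumbersomeness: every project re-solves caching, version resolution, and transitive dependencies that cargo/npm/go handle centrally.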
But building something more complex in Rust (than just grabbing a bunch of libraries from Cargo) becomes as tedious as what you typically expect in C++ land. I've read anecdotes of Rust devs struggling with build.rs (the official build scripting in Rust) similarly to how people bang their heads at CMake.
This is why other C++ build systems like Bazel and Meson added Rust support, since Cargo/build.rs itself isn't sufficient for all use cases (especially when used alongside other systems languages). And it's not too wild for someone to think of adding Rust support to CMake as well...
So far the best thing that I've seen in the wild for C/C++ code is just outsourcing everything into a Docker container with the build tools, which is what you get with QMK: https://docs.qmk.fm/#/getting_started_docker. It has its downsides, but it's just two commands to get a built binary, one of which is cloning the git repository.
I really had high hopes for Bazel: that they would either package or reuse some standard build tool configurations and just let you use them by referencing them in a single place in the workspace, with the rest handled by the build system. But it doesn't seem like that's how it works now, or will work in the foreseeable future. If you want it that hermetic, you need to do it on your own.
That is great for simple projects, but a real toolchain for C++ needs to handle complex real-world code. That means you have a mix of Fortran, Ada, C, Rust, some custom internal corporate language, and so on, all mixed with the C++. If your build system cannot handle that mess, I'm not interested.
> That means you have a mix of Fortran, Ada, C, Rust, some custom internal corporate language, and so on, all mixed with the C++. If your build system cannot handle that mess, I'm not interested
In that case I would just use the rust toolchain which I'm sure can handle this stew.
Never going to happen. The standards group doesn't define the tooling. Every vendor has its own set of compiler flags, error messages, and ABI. It's also used on architectures where the target hardware has to be compiled for from a completely different architecture, i.e. cross-compiled for embedded devices.
It's too complex a problem to be solved by a committee with competing interests.
Never say never, there is P2656. We'll see! It's more about making things interoperable than mandating a new tool, but it's still something useful in this space. It also links to previous work.
The C++ community has de facto standardized on CMake, which solves half the problem. It's not pretty, but it gets the job done.
Dependencies can mostly be managed with vcpkg and its CMake integration. The only real problem is when you need something that's not in vcpkg and have to write your own port overlay.
It's not perfect, but the situation is vastly better than it was ten years ago.
There’s also Conan, which supposedly solves all problems, but we weren’t successful at migrating a large CMake project to it. Maybe if you use it from the very beginning, it would work.
I remember trying to use go, and it certainly wasn't that easy - the go folder was a really confounding concept. I think they got rid of that though (it's been a while).
sounds like a golden opportunity for c++next, whatever that will be (cppfront? Circle?), to standardize on a single build and package management solution.
That errors out when a library is missing, so I figure out where to get it, how to install it, and rerun configure, just to see something else is missing. And that then leads to a version conflict, and then a mess... it only works well for projects with few dependencies, and especially no uncommon ones.
I feel conflicted about this. I find that the best speakers live are dynamic in their speaking rate: sometimes slow, sometimes rapid-fire. It really helps with the pacing of a talk, and there's something about being transfixed in person and giving over a block of your life to the speaker. It feels sacred, even if it's a mediocre talk. However, when watching or listening to recorded talks, I prefer a consistent speaking rate so I can choose an optimal speed.
I kinda feel like this is DOA until modules are more widespread. Any new cppfront code in an existing C++ codebase can't include any existing header files which automatically excludes it from any C++ codebase I've worked on (or any I expect to in the next few years) without a significant amount of developer time spent making the migration to modules-only.
Would love to try it out though when that happens, I do hope that modules will eventually get mainstream enough to make cppfront viable.
I still can't use C++20 modules with a macOS app (Clang). I tried, and compile times balloon 10-30x on some files.
Maybe I should give up and try Rust. Does Rust give you _maximum_ control over performance the way C and C++ do? I need that, and that's why I used C++ in the first place.
> Does Rust give you _maximum_ control over performance the way C and C++ do?
Yes, with a small asterisk. In general, this is true. In practice, with the way optimizations go, sometimes one or the other may be faster. Or people playing with definitions. But the answer to your question in spirit is "yes."
It's hard to say what I mean exactly, because I don't know what I will need to make my program fast, but I know I want the ability to adapt and control low level details. That would include memory layout and allocation, and skipping potential safety checks if I can prove my code works, beyond what the type system and compiler can prove.
I had some Swift code a while back. It was optimized, but when I rewrote it in C++ it got a lot faster. Something bad was happening that wasn't "visible in the code", as you said. The code was pretty complex algorithm and data structure stuff related to computational geometry and computer graphics. I suspect if I tried it in Rust I'd be fighting with the borrow checker and that is not appealing.
Only if you dance around the baggage of the borrow checker and restructure your performant data structures to use integer indices. That's how most Rust DS libs do this - get rid of references and replace them with integer indices acting as pointers.
I want to do cross platform native C++ dev. Maybe I should switch to Microsoft-land. I've been using a MacBook for years. I see Visual Studio docs talking about C++ cross-platform mobile (iOS, Android), but nothing about desktop (e.g., Linux, macOS).
I suspect Visual Studio on Windows works better than what I've been using – VS Code or Emacs with clangd, which sometimes runs amok, takes 20GB of RAM, and kills my laptop.
It depends on what you mean by maximum performance. In general Rust is pretty fast, but safety checks do cause slow down, and even with the unsafe escape hatch there are code patterns and use cases Rust currently can't address that well.
For whatever reason, Herb Sutter decided to ignore this language in the presentation.
https://www.circle-lang.org/
This is the only one with syntax based on C++, incrementally changing the features via #pragma settings.
"Circle Fixes Defects, Makes C++ Language Safer & More Productive"
https://www.youtube.com/watch?v=x7fxeNqSK2k
"Circle Evolves C++"
https://www.youtube.com/watch?v=P1ZDOGDMNLM