As the other commenter responded there, your example isn't about editions at all. It's about mixing ABIs and mixing multiple versions of the language runtime. Those are entirely separate issues.
You're correct that the possible changes in editions are very limited. But editions don't hinder interoperability in any way. They are designed not to. Today, there are no interoperability problems caused by editions specifically.
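As a concrete illustration of that design (crate and dependency names here are hypothetical): the edition is a per-crate setting in `Cargo.toml`, so a single dependency graph can freely mix editions. Cargo compiles each crate under its own declared edition and links them all together.

```toml
# Hypothetical application pinned to the newest edition.
[package]
name = "my_app"
edition = "2024"

[dependencies]
# This crate may internally declare edition = "2015" or "2018";
# its source is compiled under that edition, yet it links and
# interoperates with my_app without any compatibility shims.
old_parser = "1.0"
```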
> compromises will be required, specially regarding possible incompatible semantic differences across editions.
That's just an assumption in your head. 4 editions later, it still hasn't manifested in any way.
Four editions later, Rust has yet to be used at the scale of C and C++ across the industry. My point in that comment is what editions will look like after 50 years of Rust history.
ABIs and multiple versions of the language runtime are part of what defines a language ecosystem, which is why editions don't really cover as much as people think they do.
Editions will look like band-aids that don't fully solve the cruft accumulated over those 50 years. That's not hard to predict. It's still a very useful mechanism that slows down the accumulation of said cruft. I have yet to see a language with a better stability/evolution story.
In any case, while I, as a language geek, enjoy these kinds of discussions, given the current progress in AI systems my point of view is that we will get the next evolution in programming systems, so it won't matter much whether it is C, C++, Rust, C#, Java, Go, or whatever.
We (as in the industry) will eventually get reliable ways to generate applications directly to machine code, just as it took optimizing compilers a couple of decades to beat hand-written assembly and generate reliable, optimized code.
So editions, regardless of what they offer, might not be that relevant on a timeframe of a couple of decades from now.
The machine code will always be generated from "something". A one-line informal prompt isn't enough. There will always be people who write specs. Even the current languages are already far from "machine code" and could be considered "specs", albeit low-level ones.
As the response to that comment points out, you are confusing editions with ABI changes. Editions are purely a source-level change in the language. All editions are ABI-compatible.
Not really, because one thing I apparently haven't gotten across is that they don't cover semantic changes, only grammar ones for the most part.
What is the compiler supposed to generate if code from edition X calls code in edition X + 10, passing a lambda written in X + 5 that uses a specific language construct whose semantics have changed across editions, maybe even more than once?
> one thing that apparently I haven't gotten across is that they don't cover semantic changes, only grammar ones for the most part.
You have gotten it across just fine.
We're trying to get across that no one is ever going to do "global" incompatible semantic changes in editions. That's been understood from the start. Exactly because of the problems that you describe.
It would work as intended? I'm not sure what problem you are trying to point out. The lambda would compile in the edition in which it is written, resulting in a code unit passed to the other crate(s). As mentioned, the ABI is stable across editions. To put it in your words, there are no semantic changes to the ABI across editions.
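To make that concrete, here is a minimal sketch of the scenario in the question, with modules standing in for crates built under different editions (the module names and the edition assignments in the comments are illustrative, not real crates). The closure is compiled under the capture rules of the crate where it is *written*; the older crate just receives a finished code unit implementing `Fn`, so no cross-edition semantic question arises at the call site.

```rust
mod old_crate {
    // Imagine this crate declares edition = "2015".
    // It only sees an opaque value implementing `Fn` — it neither
    // knows nor cares which edition's rules built that closure.
    pub fn run_twice<F: Fn(u32) -> u32>(f: F, x: u32) -> u32 {
        f(f(x))
    }
}

mod new_crate {
    // Imagine this crate declares edition = "2021".
    // The closure body is compiled under *this* crate's edition
    // (e.g. 2021's disjoint field-capture rules), then handed over.
    pub fn make_adder(n: u32) -> impl Fn(u32) -> u32 {
        move |x| x + n
    }
}

fn main() {
    let add = new_crate::make_adder(5);
    // add(add(0)) = add(5) = 10
    let result = old_crate::run_twice(add, 0);
    println!("{}", result); // prints 10
}
```

The point of the sketch is that edition-sensitive decisions (what the closure captures, how its body desugars) are all settled when the defining crate is compiled; what crosses the crate boundary is ordinary, edition-agnostic compiled code.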
https://news.ycombinator.com/item?id=26966151