Yes, and I don't want a backward-incompatible Rust 2.0 either, but the slowness of stabilizing `!` (named for when it's going to be stabilized), specialization, TAIT, const-eval, `Vec::drain_filter`, `*Map::raw_entry`, ... is annoying. The lack of TAIT also causes real inefficiency for async traits: async trait methods currently have to return heap-allocated, dynamically dispatched futures instead of concrete types. Same for `Map::raw_entry`, without which you have to either do two lookups (`.get()` + `.entry()`) or always create an owned key even if the entry already exists (`.entry(key.to_owned())`). Both are sketched below.
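A minimal sketch of that workaround, with a made-up `Fetch` trait: without TAIT (or `async fn` in traits), the trait method has to name a concrete return type, so the usual escape hatch is a boxed trait object (this is also roughly what the `async-trait` crate generates):

```rust
use std::future::Future;
use std::pin::Pin;

// Hypothetical trait: the boxed return type is the workaround described above.
trait Fetch {
    fn fetch(&self, url: String) -> Pin<Box<dyn Future<Output = String> + Send + '_>>;
}

struct Client;

impl Fetch for Client {
    fn fetch(&self, url: String) -> Pin<Box<dyn Future<Output = String> + Send + '_>> {
        // Allocates on every call, and every poll goes through a vtable.
        // With TAIT the return type could instead be a named concrete type:
        //   type FetchFut<'a> = impl Future<Output = String> + 'a;
        Box::pin(async move { format!("response from {url}") })
    }
}

fn main() {
    let fut = Client.fetch("https://example.invalid".into());
    // Driving `fut` requires an executor, e.g. futures::executor::block_on.
    let _ = fut;
}
```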
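And a sketch of the `raw_entry` dilemma, using an illustrative word-count map (function names are invented):

```rust
use std::collections::HashMap;

// Option A: two hash lookups, but allocates only when the key is new.
fn bump_two_lookups(counts: &mut HashMap<String, u32>, key: &str) {
    if let Some(v) = counts.get_mut(key) {
        *v += 1;
    } else {
        counts.insert(key.to_owned(), 1);
    }
}

// Option B: one lookup, but allocates an owned String on every call,
// even when the entry already exists.
fn bump_always_alloc(counts: &mut HashMap<String, u32>, key: &str) {
    *counts.entry(key.to_owned()).or_insert(0) += 1;
}

// The unstable raw_entry API (or hashbrown's `entry_ref`) gives one lookup
// with an allocation only on insert, roughly:
//   *counts.raw_entry_mut().from_key(key)
//       .or_insert_with(|| (key.to_owned(), 0)).1 += 1;

fn main() {
    let mut counts = HashMap::new();
    bump_two_lookups(&mut counts, "a");
    bump_always_alloc(&mut counts, "a");
    assert_eq!(counts["a"], 2);
}
```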
If you think that's bad, look at the C/C++ standardization bodies, where stuff is eternally blocked because of ABI compatibility.
---
Problem is, lots of in-flight features are vying for inclusion. And many things people want either aren't sound, or block (or are blocked by) possible future changes.
For example, default/named arguments were blocked by a parsing ambiguity with type ascription, iirc (`f(x: y)` could be a named argument or an expression ascribed with a type). And not having default arguments makes adding allocator support quite a bit more cumbersome, as sketched below.
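To make that concrete: with the (unstable, nightly-only) `allocator_api`, every constructor needs an `*_in` twin because the allocator parameter can't be defaulted. A sketch:

```rust
// Nightly-only: both `allocator_api` and std::alloc::Global are unstable.
#![feature(allocator_api)]

use std::alloc::Global;

fn main() {
    // Today: two constructors per collection.
    let a: Vec<u32> = Vec::new();                  // implicit Global allocator
    let b: Vec<u32, Global> = Vec::new_in(Global); // explicit allocator
    assert_eq!(a.capacity(), b.capacity());

    // With default arguments this could hypothetically collapse into one
    // constructor (invented syntax):
    //   Vec::new(alloc: impl Allocator = Global)
}
```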
Plus, Rust maintainers are seeing some common patterns and are trying to abstract over them, like keyword generics / an effect system. If they don't hit the right abstraction now, things will be much harder later; if they over-abstract, it's extremely hard to remove.
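The duplication keyword generics targets looks like this today; the trait names here are illustrative, and the `?async` syntax is only a proposal from the keyword generics initiative, not settled:

```rust
// Today: one trait per "color" of function.
trait Load {
    fn load(&self) -> String;
}

trait AsyncLoad {
    // async fn in traits has been stable since Rust 1.75.
    async fn load(&self) -> String;
}

// The keyword generics initiative has sketched roughly this instead: one
// definition usable from both sync and async code (syntax not final):
//
//   trait Load {
//       ?async fn load(&self) -> String;
//   }

fn main() {}
```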
---
The slowness of stabilizing the never type (`!`) and specialization has to do with the issues they cause, mainly unsoundness and orphan-rule problems, iirc; I haven't checked on them in a while.
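For specialization specifically, the long-standing soundness hole is that impls can effectively specialize on lifetimes, which the compiler erases before codegen picks an impl. A hedged sketch of the classic demonstration (nightly; the feature flag itself warns it is incomplete for exactly this reason):

```rust
// Deliberately unsound sketch: this pattern is why `specialization` is
// still unstable (the narrower `min_specialization` forbids it).
#![feature(specialization)]
#![allow(incomplete_features)]

trait Bad {
    fn bad(self) -> &'static str;
}

impl<T> Bad for T {
    default fn bad(self) -> &'static str {
        ""
    }
}

// Specializing on a lifetime: accepted by type checking...
impl Bad for &'static str {
    fn bad(self) -> &'static str {
        self
    }
}

fn extend<'a>(s: &'a str) -> &'static str {
    // ...but lifetimes are erased before monomorphization, so codegen can
    // end up selecting the `'static` impl here, forging the lifetime.
    s.bad()
}

fn main() {
    let owned = String::from("not actually 'static");
    // Type checking uses the blanket impl; after erasure the specialized
    // impl can be chosen, handing back `&owned` as &'static str.
    let _forged = extend(&owned);
}
```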
Having strong backwards compatibility does that to a language; the alternative is arguably worse (see Python 2 vs 3).