I mostly agree, but I also feel it hasn't really been attempted properly, by which I mean exploiting the type checker to its fullest. For instance, write a stdlib where all functions reading input return "unsanitized" data types that must pass a validation check before the sanitized value can be read out. Add safer strings and arrays, possibly references instead of pointers, explicit error return values rather than errno, etc. Then require programs to be written only against this safelib.
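A minimal sketch of that idea in C, with all names hypothetical: input lands in an `unsanitized_str` wrapper, and the only way to get a plain `char *` back out is through a validator that can fail.

```c
#include <assert.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical safelib sketch: raw input is wrapped in an "unsanitized"
 * type. In a real library the struct would be opaque; callers would only
 * obtain a usable char* after a validation check succeeds. */
typedef struct {
    char data[256];
} unsanitized_str;

/* Read a line of input; the result is deliberately unusable as-is. */
static bool safelib_read_line(unsanitized_str *out) {
    return fgets(out->data, sizeof out->data, stdin) != NULL;
}

/* A validator turns unsanitized input into a usable string, or fails.
 * This example policy accepts only alphanumeric characters. */
static const char *sanitize_alnum(const unsanitized_str *in) {
    for (const char *p = in->data; *p && *p != '\n'; p++) {
        bool ok = (*p >= 'a' && *p <= 'z') ||
                  (*p >= 'A' && *p <= 'Z') ||
                  (*p >= '0' && *p <= '9');
        if (!ok)
            return NULL;  /* validation failed: no sanitized view exists */
    }
    return in->data;      /* the only escape hatch for the raw bytes */
}
```

The type checker then does the enforcement: any function that wants a `const char *` simply cannot be handed an `unsanitized_str` by accident.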
I suspect most C programmers just wouldn't like the constraints, or would be concerned with performance degradation, and that's the real problem.
Edit: another avenue is to also use Frama-C to check your code.
You can do this. You can ban all pointer arithmetic, use array types that have attached sizes and automatically introduce bounds checks, require various compiler extensions and annotations for lifetimes, require initialization immediately upon declaration, have the compiler introduce nullptr checks when it cannot prove a pointer is nonnull, ban std::variant and reinterpret_cast<T>, and more.
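The "array types with attached sizes" idea can be sketched in a few lines of C (names illustrative, not standard): the length travels with the pointer, and every access goes through a checked accessor.

```c
#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical "sized array" type: the length is carried alongside the
 * pointer, so bounds checks cannot be forgotten at call sites. */
typedef struct {
    int *data;
    size_t len;
} int_array;

static int_array int_array_make(size_t len) {
    int_array a = { calloc(len, sizeof(int)), len };
    return a;
}

/* Bounds-checked access: an out-of-bounds index aborts instead of
 * silently corrupting memory. The cost is one compare per access. */
static int *int_array_at(int_array a, size_t i) {
    assert(i < a.len && "index out of bounds");
    return &a.data[i];
}
```

Rust and C++'s `std::span`/`at()` bake this shape in; in C it has to be a discipline enforced by a linter or code review.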
You end up with something that resembles the set of programming requirements from Rust (if you leverage lifetime annotations) or you end up with something resembling a refcounting GC (if you demand the use of shared_ptr everywhere).
This cannot really be "attempted properly" at scale. To implement this with C you need to both superset and subset the language. To implement this with C++ you need to harshly subset the language. Both communities hold backwards compatibility as a huge goal, making it impossible to move in this direction. A project like Carbon seeks to move incrementally in this direction, but it required breaking off from the C++ community. Projects like Rust just rip the band-aid off entirely at the beginning.
The "don't worry, C is fine if you just don't suck at programming" folks don't tend to push for these extreme changes.
I'm not even thinking about compiler extensions, just standard opaque pointers and fat pointers, macros/functions that operate on them, and a linter that flags any references to unsafe C stdlib functions or uninitialized locals. This won't be as safe as Rust, but you'll at least still be in C. I just think we can be a lot further along the safety spectrum in C than we currently are; it's just that C programmers still use some outdated practices and idioms that could be safer.
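The opaque-pointer half of this needs no extensions at all, just an incomplete type in the header. A minimal sketch (both "header" and "implementation" shown in one file, all names hypothetical):

```c
#include <assert.h>
#include <stdlib.h>

/* --- public header (sketch): buf is an incomplete type, so callers
 * cannot size it, index into it, or poke at its fields; all access is
 * funneled through the API below. ---------------------------------- */
typedef struct buf buf;
buf   *buf_new(size_t cap);
int    buf_push(buf *b, char c);   /* 0 on success, -1 when full */
size_t buf_len(const buf *b);
void   buf_free(buf *b);

/* --- implementation (sketch): the layout is private to this file --- */
struct buf {
    size_t len, cap;
    char   data[];                 /* C99 flexible array member */
};

buf *buf_new(size_t cap) {
    buf *b = malloc(sizeof *b + cap);
    if (b) { b->len = 0; b->cap = cap; }
    return b;
}

int buf_push(buf *b, char c) {
    if (b->len == b->cap) return -1;  /* the bounds check lives in ONE place */
    b->data[b->len++] = c;
    return 0;
}

size_t buf_len(const buf *b) { return b->len; }
void   buf_free(buf *b)      { free(b); }
```

Because callers never see the layout, every bounds check is centralized in the implementation file, which is exactly the kind of invariant a linter can then enforce per translation unit.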
The point is that you can get some safety guarantees. Nowhere near the degree of guarantees available in safer languages, but still better than the status quo.
Edit: one example of a pattern, whose name I can't remember, is to switch from returning pointers to data structures to returning offsets as handles. Using these handles you can then track more information about validity and handle "null pointers" sensibly rather than risking undefined behaviour. On superscalar processors the offset calculation costs basically nothing, but the added safety can be considerable. I believe I read about this pattern in the context of game engines; let me know if you know its name!
You can get some safety guarantees. Many of these things are very good choices. I own a very large C++ codebase that uses this approach you mention all over the place.
But you can still see important limitations. Chrome, for example, has custom smart pointers (MiraclePtr) and still has UAFs galore.