C can add a whole alphabet of str?cpy functions, and they will all have issues, because the language lacks the expressive power to build a safe abstraction. It's all ad-hoc juggling of buffers without a reliably tracked size.
The language is expressive enough to have a good string library. It has string.h instead for historical reasons: when it was introduced, the requirements for a string library were very different from today's.
The types that C knows about are the types that the assembly knows about. Strings, especially Unicode strings, aren't something that the raw processor knows about (as far as I know). At the machine level, it is all ad-hoc juggling of buffers without a reliably tracked size, until you impose constraints like using only length-prefixed protocols and structures. And that "only" is difficult for humans to achieve. One slip-up and wham.
C, with its notion of an object, type-based alias analysis (TBAA), and pointer provenance, is already disconnected from what the machine is doing.
The portable assembly myth is a recipe for getting Undefined Behavior. C is built on an abstract machine described in the C spec, not any particular machine code or assembly.
Buffers (objects) in C already have an identity and a semantically important length. C just lacks features to keep track of this explicitly and enforce error handling.
Languages exist to provide a more useful abstraction on top of the machine, not to naively mirror everything even where it is unhelpful and dangerous. For example, BCPL did not have pointer types, only integers, because they were the same thing for the CPU. That was a mess that C (mostly) fixed by creating "fictional" types that didn't exist at the assembly level.
The people who define C's abstract machine are well aware of what real hardware is like. The standard of course doesn't mention real hardware, but what is in there is guided by real hardware behaviour. They add to the spec when a change would aid real implementations.
The committee members have been aware of SIMD for a long time, and people have been asking for it. So far they have either not agreed, or seen no need because autovectorization has shown much promise without it. (That is, both of the above are true, though not always for the same people.)
Multi-core is where languages have had to change, because the language model of 1970 wasn't good enough.
How should they? In some cases they have decided that isn't where they want C to go; in others the model of 1970 is still good enough; and in others they are being slow (possibly intentionally, to avoid making a mistake).
You must be thinking of C++? There are no objects in C, just structs, which are just a little bit of organization of contiguous memory. C exists to make writing portable CPU-level software easier than assembler. It was astonishingly successful at this niche; many more people could write printer drivers. While pointer types may not formally exist in assembly, the variety of addressing modes using registers or locations that are also integers has a natural resonance with pointer and array types.
I would say C precisely likes to mirror everything even where it is unhelpful and dangerous. The spirit is captured in the Hole Hawg article: http://www.team.net/mjb/hawg.html
It is the same sort of fun one has with self-modifying code (JIT compilers), or with setting 1 to have a value of 2 in Python.
ed: https://en.cppreference.com/w/c/language/object is what is being referred to. I'm still pretty sure in the 80s and 90s people thought of and used C as a portable low-level language, which is why things like Python and Linux were written in C.