Strings in C are more like a lie. You get a pointer to a character and the hope that there's a null byte somewhere before you hit a memory-protection wall, or a buffer holding something completely unrelated to your string.
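Concretely, a minimal sketch of that failure mode (the array and its contents are made up for illustration; running it is undefined behavior, which is the point):

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Five bytes, zero null terminators: this is not a C string,
       it just looks like one. */
    char not_a_string[5] = {'h', 'e', 'l', 'l', 'o'};

    /* Undefined behavior: strlen() scans past the array until it
       happens upon a zero byte in adjacent memory, or walks into
       an unmapped page and crashes. */
    printf("length: %zu\n", strlen(not_a_string));
    return 0;
}
```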
And that's with ASCII, where a character fits inside a byte. Don't even think about UTF-8 or any other variable-length character representation.
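A tiny sketch of the byte-versus-character mismatch (the UTF-8 bytes are written as hex escapes so it works regardless of source-file encoding):

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    /* "héllo": five characters, six bytes, because 'é' is the
       two-byte UTF-8 sequence 0xC3 0xA9. */
    const char *s = "h\xc3\xa9llo";

    /* strlen() counts bytes, not characters: prints 6, not 5. */
    printf("bytes: %zu\n", strlen(s));
    return 0;
}
```

Counting the characters instead takes locale machinery (mbstowcs and friends) or a real Unicode library, which is rather the point.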
In fairness, the moment you realize ASCII strings are a tiny subset of what a string can be, you also understand why strings are actually very complicated.
> In fairness, the moment you realize ASCII strings are a tiny subset of what a string can be, you also understand why strings are actually very complicated.
Oh absolutely, but it's a pretty reasonable expectation that any contemporary language should handle that complexity for you. The entire job of a language is to make the fundamental concepts easier to work with.
Sadly, strings are at once complicated enough to be left outside the fundamental concepts of a language, yet far too useful to be left outside the fundamental concepts of any realistically viable language.