What you would likely do is use an extension bit (a high bit in each length byte meaning "more length bytes follow", i.e. a variable-length integer encoding), so there would be no fixed maximum length for the strings. This would of course add decoding overhead that you may or may not make up through the other advantages of knowing the length of strings up front.
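As a minimal sketch of what that might look like, here is one way to encode a length 7 bits per byte with the high bit as the extension bit (the scheme and the `encode_len` name are my assumptions, not anything standardized):

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical sketch: encode a string length as a variable-length
 * integer, 7 bits per byte, with the high bit set when more length
 * bytes follow (the "extension bit"). Returns bytes written. */
size_t encode_len(uint64_t len, unsigned char *out)
{
    size_t n = 0;
    do {
        unsigned char b = len & 0x7f;   /* low 7 bits of the length */
        len >>= 7;
        if (len)
            b |= 0x80;                  /* extension bit: more follows */
        out[n++] = b;
    } while (len);
    return n;
}
```

Lengths up to 127 cost one byte, up to 16383 cost two, and so on, which is where the questions below start.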
* If you're appending data to a string, and the variable-length length field grows by a byte, will you have to memmove() the entire existing string down 1 byte?
* Is every programmer responsible for detecting this condition?
* (This will make manual string manipulation very complicated and dangerous.)
* Suppose you're concatenating two strings, and the sum of the lengths requires an additional length byte. How would a strcat() function avoid overflowing the destination buffer here?
* Does every string need a maximum-length counter too?
* Can you access a random element in the string without having to dereference and decode the length?
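To make the first couple of bullets concrete, here is a hypothetical sketch of an append that crosses the 127-to-128 boundary, assuming a prefix where a set high bit means another length byte follows; the prefix widens from 1 to 2 bytes, so the entire payload has to be memmove()d (the layout and `append_byte` are made up for illustration, and only the first boundary is handled):

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical layout: [varint length][payload]. Caller guarantees
 * buf has room. Returns the new total size in bytes. */
size_t append_byte(unsigned char *buf, unsigned char c)
{
    /* decode a 1- or 2-byte varint length */
    size_t hdr = (buf[0] & 0x80) ? 2 : 1;
    size_t len = (hdr == 1) ? buf[0]
                            : (buf[0] & 0x7f) | ((size_t)buf[1] << 7);

    if (len == 127) {
        /* prefix must widen: shift the whole payload down a byte */
        memmove(buf + 2, buf + 1, len);
        hdr = 2;
    }
    len++;
    if (hdr == 1) {
        buf[0] = (unsigned char)len;
    } else {
        buf[0] = (unsigned char)(0x80 | (len & 0x7f));
        buf[1] = (unsigned char)(len >> 7);
    }
    buf[hdr + len - 1] = c;
    return hdr + len;
}
```

Note that appending one byte to a 127-byte string costs a copy of all 127 existing bytes, and every string routine has to contain this boundary logic.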
On the other hand, if you use a constant-sized length,
* What happens when you overflow the maximum length?
* Can you erroneously create a shorter string by appending text?
* How should string libraries handle this condition? By abort()ing? By returning a special error code? Does every string manipulation then need to be wrapped in an if() to detect that error? How should the programmer handle it?
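One possible policy for the fixed-width case, sketched with a Pascal-style 1-byte length prefix and an error return instead of silent wrapping (`pstrcat` is a made-up name, not a real library function):

```c
#include <string.h>

/* Hypothetical layout: [1-byte length][payload]. Returns 0 on
 * success, -1 if the combined length would not fit in one byte. */
int pstrcat(unsigned char *dst, const unsigned char *src)
{
    unsigned dlen = dst[0], slen = src[0];
    if (dlen + slen > 255)
        return -1;              /* would overflow the length byte */
    memcpy(dst + 1 + dlen, src + 1, slen);
    dst[0] = (unsigned char)(dlen + slen);
    return 0;
}
```

This answers the overflow question but creates exactly the problem above: every call site now has a return code it is free to ignore.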
In either case,
* Can you tokenize a string in-place?
* Can an attacker read a program's entire address space by finding a memory location whose bytes begin 0xffffffff and treating it as a length-prefixed string?
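For comparison, in-place tokenization is trivial with NUL terminators: overwriting each delimiter with '\0' leaves every token a valid string aliasing the same buffer, whereas a length-prefixed format would need room for a length field in front of each token. A sketch (`split_inplace` is a made-up helper, in the spirit of strtok()):

```c
#include <stddef.h>

/* Tokenize s in place by overwriting delimiters with NULs.
 * Fills out[] with pointers into s; returns the token count. */
size_t split_inplace(char *s, char delim, char **out, size_t max)
{
    size_t n = 0;
    while (*s && n < max) {
        out[n++] = s;            /* token starts here, in the buffer */
        while (*s && *s != delim)
            s++;
        if (*s)
            *s++ = '\0';         /* delimiter becomes a terminator */
    }
    return n;
}
```

No allocation, no copying; every token shares storage with the original string.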
I think most of these are answered by the observation that no one would have designed strings this way without also providing a matching library that handled the concerns raised here. Honestly, char[] strings are much less useful in a UTF-8 world anyway.