Isn't that the whole point of this discussion? We're talking about conventions programmers treat as absolutes that are detrimental to the maintenance and security of the programs.
So I don't see how this has much bearing on the ultimate point I'm making, which is that yes, conventions can be bad haha.
It’s funny: C never actually had a string type, and null-termination is more a convention than a primitive data type.
Nothing prevents us from having Pascal-like strings in C, provided we accept that we’ll need to reimplement all the string handling we rely on.
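To make that concrete, here’s a minimal sketch of a Pascal-like (length-prefixed) string in C. All the names (`PString`, `pstr_from_cstr`, `pstr_concat`) are made up for illustration; the point is just that the length lives in the struct instead of a terminator byte, so length lookup is O(1) and embedded NULs are legal, at the cost of reimplementing routines the standard library gives you for free:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical Pascal-like string: length stored explicitly,
 * data is NOT necessarily NUL-terminated. */
typedef struct {
    size_t len;
    char  *data;
} PString;

/* Build a PString by copying from a conventional C string. */
PString pstr_from_cstr(const char *s) {
    PString p;
    p.len = strlen(s);
    p.data = malloc(p.len);
    memcpy(p.data, s, p.len);
    return p;
}

/* Concatenate two PStrings into a newly allocated one.
 * With null-terminated strings this would be strlen + strcat,
 * each an O(n) scan; here the lengths are already known. */
PString pstr_concat(PString a, PString b) {
    PString p;
    p.len = a.len + b.len;
    p.data = malloc(p.len);
    memcpy(p.data, a.data, a.len);
    memcpy(p.data + a.len, b.data, b.len);
    return p;
}

void pstr_free(PString p) {
    free(p.data);
}
```

Of course, the moment you call `printf("%s", ...)` or any other libc routine that expects a terminator, you’re back to converting, which is exactly the "reimplement everything" tax.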