Whatever, dude. Go on living in your magical bubble (in Redmond, perhaps...?). You've seen the arguments here for why your earlier analysis is wrong, and you're hellbent on insisting that writing more explicit code is not a good idea because you're such an elite h4x0r.
They made a big bet on 16-bit chars before UTF-8 existed, and a lot of software from the same era stuck with the same choice (Java is one example). I'm neither a fan nor an opponent of UTF-16; UTF-8 works well, but in many places a wider char type is simply a reality. On Windows it's the only choice that makes sense: by the time you reach a syscall you need 16-bit strings, and anything else has to be converted first (the non-Unicode paths are considered legacy).
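To make that last point concrete, here's a minimal sketch (plain C++ against the Windows SDK) of what "conversion at the boundary" looks like: UTF-8 text kept internally gets widened with MultiByteToWideChar before it can be handed to a W-suffixed call like CreateFileW. The helper name and the path are made up for illustration; the narrow "A" variants of these APIs just do a similar conversion for you internally (from the ANSI code page).

    #include <windows.h>
    #include <string>

    // Hypothetical helper: widen a UTF-8 std::string into the UTF-16
    // std::wstring that the Windows "W" APIs expect (wchar_t is 16-bit here).
    std::wstring Utf8ToUtf16(const std::string& utf8) {
        if (utf8.empty()) return std::wstring();
        // First call reports how many wchar_t units the result needs.
        int len = MultiByteToWideChar(CP_UTF8, 0, utf8.data(),
                                      static_cast<int>(utf8.size()), nullptr, 0);
        std::wstring utf16(static_cast<size_t>(len), L'\0');
        // Second call performs the actual conversion into the buffer.
        MultiByteToWideChar(CP_UTF8, 0, utf8.data(),
                            static_cast<int>(utf8.size()), &utf16[0], len);
        return utf16;
    }

    int main() {
        // Whatever UTF-8 text the app carries around internally...
        std::string utf8_path = "C:\\temp\\example.txt";  // hypothetical path
        // ...has to become UTF-16 before it reaches the kernel via CreateFileW.
        std::wstring wide_path = Utf8ToUtf16(utf8_path);
        HANDLE h = CreateFileW(wide_path.c_str(), GENERIC_READ, FILE_SHARE_READ,
                               nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
        if (h != INVALID_HANDLE_VALUE) {
            CloseHandle(h);
        }
        return 0;
    }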
I'm not even saying this is the only way to structure a code base, or that I'm unwilling to use, or haven't seen or worked with, something else. I'm talking about what the sane conventions for a Windows app would be. When in Rome, and all that. I wouldn't advocate UTF-16 on Unix (even if that's what millions of people using, say, Java end up getting).
With any luck I won't have the pleasure of sharing a codebase with you.
EDIT:
Also, UTF-16 is bad and you should feel bad.