I think the complaint was about the cast to LPSTR at the end.

    typedef char *PSTR, *LPSTR;
After the addition, you've already got a char * - casting to a char * again by another name is unnecessary.
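
A minimal sketch of the point (take_lpstr is a hypothetical function standing in for any API that takes an LPSTR):

    #include <stddef.h>

    typedef char *PSTR, *LPSTR;      /* as in <winnt.h>, simplified */

    void take_lpstr(LPSTR s);        /* hypothetical API taking an LPSTR */

    void demo(char *base, size_t offset)
    {
        char *p = base + offset;     /* arithmetic on a char* yields a char* */
        take_lpstr((LPSTR)p);        /* the cast is a no-op: LPSTR is char* */
        take_lpstr(p);               /* compiles identically without it */
    }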



So, at that point, it gets to be a little more philosophical, right?

I err on the side of "Oh, the API requests this type (which I'll pretend I don't know is actually just a char*), so I will explicitly cast to that". At least that way it's clear in the future that there is some explicit changing of types going on if, say, LPSTR ever changes.

Silent "Oh, well, we all know that it's really going to be a <whatever> pointer here anyways" is a good way to get subtle bugs.


PSTR and PWSTR have a size implicit in their name. They are not going to change. PTSTR (so far not discussed) does change based on a macro (UNICODE), but I would not recommend using it in this century - it's easier to build all your Windows apps as UTF-16 and pretend everything is PWSTR (which could be a whole other topic).
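
For reference, this is roughly how the names are defined in <winnt.h> (heavily simplified - the real headers go through CHAR/WCHAR and carry SAL annotations):

    #include <stddef.h>                  /* for wchar_t in plain C */

    typedef char    *PSTR,  *LPSTR;      /* always 8-bit units */
    typedef wchar_t *PWSTR, *LPWSTR;     /* always 16-bit on Windows */

    #ifdef UNICODE
    typedef LPWSTR PTSTR;                /* PTSTR follows the UNICODE macro */
    #else
    typedef LPSTR  PTSTR;
    #endif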

It sounds more like you err on the side of inserting lots of pointer casts into the code without considering or really understanding what the types mean, in order to "shut up compiler warnings" that might not even exist. This is pretty common, but it is often a telltale sign of someone who doesn't know what they are doing: they are fighting the compiler warnings in their own head instead of solving real problems. (It's really easy for a pointer cast to mask a bug, too.)

Frivolous pointer casts are always suspicious. It's much better to let your compiler generate the warnings, listen to and understand them, and in many cases fix the underlying issue rather than mindlessly putting in a cast.
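
To illustrate the kind of bug a cast can mask, a minimal sketch using DeleteFileW (which takes a wide string):

    #include <windows.h>

    void oops(void)
    {
        const char *name = "report.txt";

        /* DeleteFileW expects an LPCWSTR (const wchar_t *). Without the
           cast the compiler flags the pointer mismatch; with it, the call
           compiles and then reads the narrow bytes as UTF-16 garbage at
           runtime - the file is not deleted. */
        DeleteFileW((LPCWSTR)name);      /* compiles, but is a bug */

        DeleteFileW(L"report.txt");      /* correct: a real wide string */
    }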


Whatever, dude. Go on living in your magical bubble (in Redmond, perhaps...?). You've seen the arguments here for why your earlier analysis is wrong, and you're hellbent on insisting that writing more explicit code is a bad idea because you're such an elite h4x0r.

With any luck I won't have the pleasure of sharing a codebase with you.

EDIT:

Also, UTF-16 is bad and you should feel bad.


They made a big bet on 16-bit chars before UTF-8 existed. Lots of software from the same period stuck with the same choice (Java is one example). I am neither a fan nor an opponent of UTF-16; UTF-8 works well, but in many places a larger char type is simply a reality. On Windows it's the only way that makes sense: by the time you get to a syscall you need 16-bit strings, and support for anything else without conversion is considered legacy.
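
Concretely: the narrow -A entry points are generally thin wrappers that convert their arguments (via the current ANSI code page) and forward to the wide -W ones. A minimal sketch:

    #include <windows.h>

    int main(void)
    {
        /* The wide call takes UTF-16 directly; the -A variant converts
           its narrow strings and then calls the wide path internally. */
        MessageBoxW(NULL, L"Hello", L"Wide", MB_OK);
        MessageBoxA(NULL, "Hello", "Narrow", MB_OK);
        return 0;
    }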

I'm not even saying this is the only way to structure a code base, or that I'm unwilling to work with (or have never seen) anything else. I'm talking about what the sane conventions for a Windows app would be. When in Rome, and all that. I would not advocate UTF-16 on Unix (even if that's what millions of people using, say, Java end up getting).


Right.

For the record, I wasn't criticizing - I mostly agree with the side you're erring on here - I was just trying to clarify the criticism.


Gotcha, thanks. :)



