Hacker News

Dude worked at RAD, so I'm willing to bet he's got some experience.

In your LPSTR example, that could've been to silence compiler warnings about doing pointer arithmetic.

EDIT:

From bio:

"The most significant project I’ve created to date has been The Granny Animation SDK, a complete animation pipeline system that I first shipped in 1999 and which, 15 years later, still is in active use at many top-tier game studios."

So, uh, yeah. Maybe do some reading before spouting off on another's presumed abilities?

EDIT2:

Downvote all you want, but what code of yours has been in production for 15 years?




You don't ever need to cast from char * to char *. Period.

In 16-bit land the "L" meant something, but it hasn't meant anything since long before the API he is calling existed.


Right, but you're talking past me. Look at the code:

  (LPSTR)((char*)pSessionProperties + pSessionProperties->LoggerNameOffset)
He's casting pSessionProperties to a char* so that he can do the arithmetic on it--otherwise, the compiler would assume "Oh, golly, I should increment the pSessionProperties pointer by LoggerNameOffset times sizeof(SessionProperties)".

He has to make that conversion in order to do byte offsetting correctly. He then casts the result back to what the function wants (an LPSTR), to match the required argument type.

It's completely reasonable code, so stop complaining about it as though it weren't.


I think the complaint was about the cast to LPSTR at the end.

    typedef char* PSTR, *LPSTR;
After the addition, you've already got a char * - casting to a char * again by another name is unnecessary.


So, at that point, it gets to be a little more philosophical, right?

I err on the side of "Oh, the API requests this type (which I'll pretend I don't know is actually just a char*), so I will explicitly cast to that". At least that way it's clear in the future that there is some explicit changing of types going on if, say, LPSTR ever changes.

Silent "Oh, well, we all know that it's really going to be a <whatever> pointer here anyways" is a good way to get subtle bugs.


PSTR and PWSTR have a size implicit in their name. They are not going to change. PTSTR (so far not discussed) happens to change based on a macro, but I would not recommend using it in this century - it's easier to build all your Windows apps as UTF-16 and pretend everything is PWSTR (that could be a whole other topic).

It sounds more like you err on the side of inserting lots of pointer casts into the code without considering or really understanding what the types mean, in order to "shut up compiler warnings" that might not even exist. This is pretty common, but it is often a really good sign of someone who doesn't know what they are doing: they are fighting the compiler warnings in their own head instead of solving real problems. (It's really easy for a pointer cast to mask a bug, too.)

Frivolous pointer casts are always suspicious. It's much better to let your compiler generate the warnings, listen to them and understand them, and in a lot of cases, fix issues without mindlessly putting in a cast.


Whatever, dude. Go on living in your magical bubble (in Redmond, perhaps...?). You've seen the arguments here for why your earlier analysis is wrong, and you're hell-bent on insisting that writing more explicit code is not a good idea because you're such an elite h4x0r.

With any luck I won't have the pleasure of sharing a codebase with you.

EDIT:

Also, UTF-16 is bad and you should feel bad.


They made a big bet on 16-bit chars before UTF-8 existed. Lots of stuff from the same time period stuck with the same choice (Java is one example). I am neither a fan nor an opponent of UTF-16; UTF-8 does work well, but in many places a larger char type is a reality. In Windows it's the only way that makes sense: by the time you get to a syscall you need 16-bit strings, and support for anything else without conversion is considered legacy.

I'm not even saying this is the only way to structure a code base, or that I'm unwilling to use, haven't seen, or haven't worked with something else. I'm talking about what the sane conventions for a Windows app would be. When in Rome, and all that. I would not advocate UTF-16 on Unix (even if that's what millions of people using, say, Java end up getting).


Right.

For the record, I wasn't criticizing - I mostly agree with the side you're erring on here - I was just trying to clarify the criticism.


Gotcha, thanks. :)


The docs remain silent on the subject, but since ControlTrace takes a TCHAR *, I guess the logger name in the struct could be a TCHAR[] too. So perhaps LPTSTR was intended.



