In case it helps, I've noticed what I'd consider a wart in C that seems to be a common stumbling block with this - including when I first learned C a couple of decades ago: a star, when used on types, has the "opposite" effect from when it's used on values/expressions.
int* foo; // The addition of a star here, on a type, means we're converting the type /into/ a pointer-to-int /from/ a plain int.
int i = *foo; // The addition of a star here, on a value, means we're converting the type /from/ a pointer-to-int /into/ a plain int (a dereference).
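If it helps to see both directions in one runnable piece, here's a minimal sketch (the names i, p, and j are just for illustration; it compiles as either C or C++):

#include <stdio.h>

int main(void) {
    int i = 42;
    int* p = &i;   // star on the type: p's type becomes pointer-to-int
    int j = *p;    // star on the value: *p goes back from pointer-to-int to plain int
    printf("%d %d\n", i, j);   // prints "42 42"
    return 0;
}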
I found it much easier to keep the two straight once I segregated their use cases like this. There's a similar duality with & in C++ when using references, although not quite as clean:
int& foo = ...; // The addition of & here, on a type, means we're converting the type /into/ a reference /from/ a plain int.
int* p = &foo; // The addition of & here, on a value, means we're converting the type /from/ a reference /into/ a pointer-to-int (taking its address).
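Likewise, a minimal C++ sketch for the reference side (again, the names are just illustrative):

#include <cstdio>

int main() {
    int i = 42;
    int& r = i;    // & on the type: r's type becomes reference-to-int (an alias for i)
    int* p = &r;   // & on the value: &r takes the address, yielding a pointer-to-int
    *p = 7;        // writing through the pointer...
    std::printf("%d\n", i);   // ...changes i: prints "7"
    return 0;
}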
Regardless of "fault", I hope this helps :)
(edit: C++ style comments used even in 'C' code to try and prevent HN from treating stars as formatting... replaced a few instances with 'a star'... changed emphasis to use slashes because it's still eating asterisk characters...)