
You're thinking in abstract terms; I'm talking about the concrete implementation details. If we take C, just as an example, an int can never be NULL. It can be 0, and compilers will sometimes tell you it's "uninitialized", but it can never be NULL. All possible combinations of bit patterns are meaningful ints.

Pointers are different in that we've decided that the pattern where all bits are 0 is a value indicating that the pointer is not valid. Note that there's nothing in the definition of the underlying hardware that requires this. 0 is an address just like any other, and we could have decided to treat all pointer values alike, but we didn't.
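
A minimal sketch of that point, assuming a typical implementation where the null pointer happens to be represented as all-bits-zero (the C standard doesn't actually require that): the all-zero bit pattern is a perfectly ordinary int, and it only means "no object" for pointers because the language says so.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int zero = 0;      /* all-bits-zero int: a perfectly valid value */
        int *p = NULL;     /* "no object" purely by language convention */

        /* Check whether the null pointer is all-bits-zero on this platform
           (an assumption, not a guarantee of the standard). */
        unsigned char raw[sizeof p];
        memcpy(raw, &p, sizeof p);
        int all_zero = 1;
        for (size_t i = 0; i < sizeof p; i++)
            if (raw[i] != 0) all_zero = 0;

        printf("int 0 is usable: %d\n", zero + 1);
        printf("null pointer is all-bits-zero here: %d\n", all_zero);
        return 0;
    }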

NULL is just a language construct, and as a language construct it could be defined any way you want. You could define your language such that dereferencing NULL always returns 0. You could decide that pointer arithmetic on NULL yields another NULL. Once you realize that this is just language semantics and not fundamental computer science, you realize that the definition is arbitrary, and any other definition would do.
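
To make the "alternative semantics" idea concrete: in C, dereferencing NULL is undefined behavior, but a language could just as well define it to yield 0, and define pointer arithmetic on NULL to yield NULL. Here's a hedged sketch modeling those rules as hypothetical helpers (deref_or_zero and advance_or_null are names I made up, not anything standard):

    #include <stdio.h>
    #include <stddef.h>

    static int deref_or_zero(const int *p) {
        return p ? *p : 0;          /* "NULL dereferences to 0" as a rule */
    }

    static const int *advance_or_null(const int *p, ptrdiff_t n) {
        return p ? p + n : NULL;    /* "NULL + n is NULL" as a rule */
    }

    int main(void) {
        int x = 7;
        printf("%d %d\n", deref_or_zero(&x), deref_or_zero(NULL)); /* 7 0 */
        printf("%d\n", advance_or_null(NULL, 3) == NULL);          /* 1 */
        return 0;
    }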

As for sum types: you can't fundamentally encode any more information into an int. It's already completely saturated. What a sum type does, at a fundamental level, is bundle your int (which has a default value) with a boolean (which also has a default value) indicating whether your int is valid. There are some optimizations you can do with a "sufficiently smart compiler", but like auto-vectorization, that's never going to happen.
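
Spelled out in C, that "int bundled with a boolean" looks roughly like the struct below. The names (opt_int, opt_some, opt_none) are illustrative, not from any particular library; it's a sketch of the representation, not of what an optimizing compiler might collapse it into.

    #include <stdbool.h>
    #include <stdio.h>

    typedef struct {
        bool valid;   /* the discriminant: is the payload meaningful? */
        int  value;   /* the payload; every bit pattern is already a real int */
    } opt_int;

    static opt_int opt_some(int v) { return (opt_int){ .valid = true,  .value = v }; }
    static opt_int opt_none(void)  { return (opt_int){ .valid = false, .value = 0 }; }

    int main(void) {
        opt_int a = opt_some(42);
        opt_int b = opt_none();
        if (a.valid)  printf("a = %d\n", a.value);
        if (!b.valid) printf("b is empty\n");
        return 0;
    }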

I guess my point can be boiled down to the dual of the old C++ adage: Resource Acquisition Is NOT Initialization. RAINI.




Then your point is tangential to the question of zero values, and even more so to the abstract concept of zero values spilling over into protobuf.



