Hacker News

Would it have been trivial and obvious for Java (and would Java still have been "not scary") back in the 90s when it came out?



It wouldn't have been particularly hard from a language, standard library, and virtual machine perspective, but it would have made converting legacy C++ programmers harder (scarier). Back then the average developer had a higher tolerance for defects because the consequences seemed less severe. It was common to intentionally use null values to indicate failure or other special meanings. It seemed like a good idea at the time.
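That null-as-special-meaning idiom is still baked into Java's standard library; a minimal sketch of null as the "not found" signal in `Map.get`:

```java
import java.util.HashMap;
import java.util.Map;

// The old idiom in action: Map.get uses null as its "not found" signal.
class LookupDemo {
    public static void main(String[] args) {
        Map<String, Integer> ages = new HashMap<>();
        ages.put("ada", 36);
        Integer hit = ages.get("ada");   // present: the stored value
        Integer miss = ages.get("bob");  // absent: null, by design
        System.out.println(hit + " " + miss); // prints "36 null"
    }
}
```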


> It would have made converting legacy C++ programmers harder (scarier).

And that, right there, is all the reason they needed back then. Sun wanted C++ developers (and C developers, to some extent) to switch to Java.


It would have been trivial for record types to be non-nullable by default.

Record types are three years old and are already obsolete with regard to compile-time null checking. This is a big problem in Java: a lot of new features have become legacy code and are now preventing future features from being included out of the box.
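To make the gap concrete: nothing stops a null from reaching a record component at compile time, so the best a record can do today is reject it at runtime in a compact constructor. A minimal sketch (the `Label` record is illustrative):

```java
import java.util.Objects;

// Records accept null components unless you reject them yourself at runtime.
record Label(String text) {
    Label {
        Objects.requireNonNull(text, "text must not be null");
    }
}

class RecordNullDemo {
    public static void main(String[] args) {
        System.out.println(new Label("ok").text());
        try {
            new Label(null); // compiles fine; fails only at runtime
        } catch (NullPointerException e) {
            System.out.println(e.getMessage());
        }
    }
}
```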

This is why the incremental approach to language updates doesn't work. You can't change the foundation, and the foundation grows with every release.

I am awaiting the day Oracle releases class2 and record2 keywords for Java with sane defaults.


Tony Hoare (the guy who originally introduced the concept of null for pointers in ALGOL W) gave a talk on it being his "billion dollar mistake" in 2009: https://www.infoq.com/presentations/Null-References-The-Bill...

Now, this wasn't something that just dropped out of the blue - the problems were known for some time before. However, they were considered manageable, treated similarly to other cases where some operations are invalid on valid values, such as division by zero triggering a runtime error.
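From the language's point of view the two failure modes really do look alike: a valid value, an operation invalid for it, and a runtime error. A small illustration:

```java
// Null dereference was treated like other runtime errors on valid values,
// e.g. integer division by zero.
class RuntimeFailureDemo {
    public static void main(String[] args) {
        try {
            int n = Integer.parseInt("0"); // avoid a constant-folded divisor
            System.out.println(1 / n);
        } catch (ArithmeticException e) {
            System.out.println("arithmetic: " + e.getMessage());
        }
        try {
            String s = null;
            System.out.println(s.length());
        } catch (NullPointerException e) {
            System.out.println("null deref caught");
        }
    }
}
```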

The other reason why there was some resistance to dropping nulls is because it makes a bunch of other PL design a lot easier. Consider this simple case: in Java, you can create an array of object references like so:

   Foo[] a = new Foo[n];  // n is a variable so we don't know size in advance
The elements are all initialized to their default values, which for object references is null. If Foo isn't implicitly nullable, what should the elements be in this case? Modern PLs generally provide some kind of factory function or equivalent syntax that lets you write initialization code for each element based on index; e.g. in Kotlin, arrays have a constructor that takes an element initializer lambda:

   val a = Array(n) { i -> Foo(...) }
But this requires lambdas, which were not a common feature in mainstream PLs back in the 90s. Speaking more generally, it makes initialization more complicated to reason about, so when you're trying to keep the language semantics simple, this is a can of worms that makes it that much harder.
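For comparison, once lambdas did land in Java 8, the same per-element initialization pattern became expressible there as well; a sketch contrasting the default-null behavior with an explicit initializer via `Arrays.setAll`:

```java
import java.util.Arrays;

// Default initialization vs. per-element initialization (Java 8+).
class ArrayInitDemo {
    public static void main(String[] args) {
        int n = 3;
        String[] a = new String[n];             // every element starts as null
        System.out.println(a[0]);               // prints "null"

        // The Kotlin-style initializer, via Arrays.setAll and a lambda:
        Arrays.setAll(a, i -> "item-" + i);
        System.out.println(Arrays.toString(a)); // [item-0, item-1, item-2]
    }
}
```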

Note that this isn't specific to arrays, either. For objects themselves, the same question arises with respect to not-yet-initialized fields. Suppose:

   class Foo {
      Foo other;   
      Foo() { ... }
   }
What value does `this.other` have inside the constructor, before it gets a chance to assign anything there? In this simple case the compiler can look at control flow and forbid accessing `other` before it's assigned. But what if the constructor instead makes a method call on `this` that is dynamically dispatched to some unknown method in a derived class, which might or might not access `other`? (Coincidentally, this is exactly why C++ classes "change" their type as their constructors run: virtual calls always dispatch to the implementation that will only see the initialized base-class subobject, even in cases like using dynamic_cast to try to get a derived-class pointer.)
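Java itself demonstrates the hazard: a virtual call from a superclass constructor runs before the subclass's field initializers, so the override observes the field's default value, null. A minimal sketch:

```java
// A virtual call from a superclass constructor runs before the subclass's
// field initializers, so the override sees the field's default value: null.
class Base {
    Base() {
        describe(); // dynamically dispatched to Derived.describe()
    }
    void describe() {
        System.out.println("base");
    }
}

class Derived extends Base {
    String name = "ready"; // assigned only after Base() finishes

    @Override
    void describe() {
        System.out.println(name); // prints "null" during construction
    }

    public static void main(String[] args) {
        new Derived();
    }
}
```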

Again, you can ultimately resolve this with a bunch of restrictions, checks, and additional syntax to work around some of that, but it complicates the language significantly, and back then this amount of complexity was deemed rather extreme for a mainstream PL, and hard to justify just for nulls.

So we had to learn that lesson from experience first. And, arguably, we still haven't fully done that, when you consider that e.g. Go today makes this same exact tradeoff that Java did, and largely for the same reasons.



