I'm familiar with what a C64 is, but I have never seen one in my life. The majority of my fellow undergrad students know next to nothing about the history of computing. But in reality, that's just fine. If our goals are to become functioning software engineers, all that really matters is what's in use now. Granted, a little historical context can help us make better decisions, but you don't see people coding in ed just so they appreciate their IDE.
> But in reality, that's just fine. If our goals are to become functioning software engineers, all that really matters is what's in use now.
I used to agree with you, but I've done a full 180 on this. I now believe that 'functioning software engineers' need a thorough understanding of what has already been done in the past to avoid endlessly re-inventing that past (usually worse). The amount of resources squandered because present-day engineers have no idea how we got to where we are today easily outweighs the extra cost of requiring a class in computer history before becoming a functioning engineer.
If we don't do that we'll be stuck forever at the present 'local maximum', which is mostly a rehash of old ideas with some eye candy thrown in, and usually more latency and waste. I suspect that we could do a lot better with our current hardware than we actually are.
Java 5: 2004. Java 6: 2006. Acquisition: 2009-2010. Java 7: 2011. You're kind of right, but my impression had always been that the acquisition delayed all decision-making and effectively halted any release of Java until they got the acquirer's approval on the contents and scope of the release.
And even in the case of SUN, a 20-year-old programmer today was only 14 or so when the company was sold to Oracle, and probably wasn't following the tech industry.