Hacker News

In 1986 I not only had to know 65C02 assembly language to get any performance out of my 1 MHz Apple //e, I had to know that I could get 50% more performance on every memory access that hit the zero page rather than any other page. If I spent time doing that type of micro-optimization today, I would be fired. I couldn't have imagined then doing the types of things I can do today with modern technology.

In 1995, when I wrote my first paid application in college, the Internet was already a thing at most colleges; I did some work on a HyperCard-based Gopher server (long story). That wouldn't have been possible ten years earlier.

In 2006, I was writing field service software for ruggedized Windows Mobile devices. Architecting for semi-connected smart devices is a completely different mindset from terminal programming or desktop programming. That wasn't feasible before hardware became cheap and at least 2G was ubiquitous.

Even then, what we could do pales in comparison to the type of field service implementation I did in 2016, when mobile computing was much more capable and much cheaper, you could get cheap Android devices, and 3G/4G was commonplace.

But people who think cloud computing is just "sharing mainframes" and don't rearchitect either their systems or their processes are how we end up with "lift and shifters" and organizations spending way too much money on infrastructure and staff.

Also, anyone who equates managing AWS to a "GUI" kind of makes my point: if you're managing your AWS infrastructure from a GUI, you're doing it wrong. 10-15 years ago you didn't set up your entire data center by running a CloudFormation template or any other type of infrastructure as code.
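To make that contrast concrete, here's a minimal sketch of what infrastructure as code looks like; this is a hypothetical CloudFormation template, and the resource and file names are purely illustrative:

```yaml
# template.yaml -- declares a resource as code instead of clicking through a console.
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal illustrative stack (resource names are hypothetical)
Resources:
  AppDataBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
```

Deployed with something like `aws cloudformation deploy --template-file template.yaml --stack-name app-data`, the same file can stand up an identical stack in any account or region, be code-reviewed, and live in version control; that repeatability is exactly what pointing and clicking in a GUI can't give you.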


