Ask HN: How different was the software industry when you first started out?
1 point by armaizadenwala on March 8, 2020 | 3 comments



When I started, TCP/IP was in the process of vanquishing other network stacks.

C - or possibly Fortran or Pascal - was the main language, although skill with assembly was often necessary to work around buggy or non-performant compilers.

Information was shared on dial-up BBSs - access to the internet was rare or expensive.

If you wanted to know something, you bought a book or got a job at a place that did that thing and learned from others.

x86 was just one of many possible hardware choices.

Companies were so desperate for people who could write software that I knew people who were hired right out of high school because they could program. They instantly made more money than their teachers.

When I discovered UNIX and X11, a whole new world opened up. Learning that ecosystem has really paid off, as I've been able to more or less ride that wave for decades.


In the early 90's, everything was client/server where I worked, and the expectation was that if you couldn't fix your problem from the command line, there was something really wrong with the software itself. If the database broke, it was perfectly normal to fire up a command-line client and manually enter SQL queries to find the bug.
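
To make that concrete for anyone who never worked this way, here's a minimal sketch of the kind of session I mean; the syntax is modern psql, and the database, table, and column names are invented for illustration:

    $ psql orders_db
    -- hypothetical database and schema, purely for illustration
    -- did last night's batch load actually commit?
    SELECT COUNT(*) FROM orders WHERE loaded_at >= '1994-06-01';
    -- find the rows the GUI client chokes on
    SELECT id, status FROM orders
      WHERE status NOT IN ('open', 'shipped', 'closed');

Nothing fancy - just the stock client and a few ad-hoc queries, which is exactly why it was such a normal thing to do.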

Before the decade was up, the prevailing belief was that if you had to use the command line to fix a problem, there was something really wrong with the software itself - and you weren't expected to use the command line even to debug it. There is a reason penetration testers still use the command line so extensively to this day. I pity developers who never get to learn what you can do from "underneath", since it is often so much more efficient.

The funny thing was, it was all client/server, but both the server and the client were powerful. Then things got monolithic and power was taken away from the client. Now, as we continue to reap the benefits of Moore's Law, our clients are super powerful again, but we've moved the server portion to "the cloud", which may or may not have simplified things. It's starting to look like the mainframe days again - just more efficient.

The takeaway is that no matter how you see things right now, it'll be deconstructed and reconstructed forever -- even if it doesn't really need to be. There is a lot of money being made on change for change's sake.

ADDITIONAL: A big thing back then was how information got shared so freely in the 80's and early 90's. It was easy to get into conversations about Artificial Life (for example) directly with folks at the various national laboratories. Once the Internet started to become commercialized, it was all about keeping secrets and profits - even on things that were mundane or reasonably well known to technical folk. Patent trolling really messed things up. There is still some openness, but it isn't as casually free as it was then.

Having said that, I really love how technology is evolving. Look at the Raspberry Pi. In the 90's, you couldn't afford to build your own supercomputer to learn parallel coding, and CPUs were single-core, so you couldn't really even fake it all that well. Now you can build some really amazing things on the cheap, with all the available sensors, etc., and of course, Arduino.

Perhaps sharing didn't really die. It just became more focused.


When I started (1985), there was no web/internet to speak of. Development was for desktop apps and embedded systems.



