
If it were possible to send a modern software engineer back in time to that era, would they have any usable or employable skills in such an environment? (besides the uncanny ability to correctly predict the future)



Of course they would have usable skills: a general understanding of computing, superb analytical minds, 36 years of advancement in education, excellent problem-solving ability.

At most they would have to learn a new lexicon of programming languages and operating systems, which are FAR less complicated than the current range of languages and design patterns software engineers must master today.

A few weeks (days?) of studying and figuring out how to pre-optimize for much smaller workloads and they'd be able to keep pace easily.


I agree with this for the top n% of programmers, but keep in mind there are orders of magnitude more "software engineers" employed today than back then, and many of them are employed in very narrow niches with very powerful tools. Certainly I've seen a great number of so-called programmers that would be utterly unemployable in such low-level environments as we had in the early 80s.


Oh the sweet innocence of youth!


Some of us are from that era.

Yes, you would manage so long as you can handle a shell prompt and can grok simple assembler.


Today's CS graduates are much better educated than people in those days. Early programmers wasted much of their energy fighting the platform, reverse engineering undocumented features, and haggling with vendors for better access.

Today, you just focus on the problem and take the underlying platform for granted. Free Software has liberated computing from vendor-reliance.


The education system tried to keep itself above petty implementation details. Sure you needed to worry about it to get your assignments done, but that's not what the classes were all about.


"fighting the platform, reverse engineering undocumented features, and haggling with vendors for better access."

Whereas today we have Android.


Any "modern software engineer" that wouldn't have plenty of usable and employable skills back then should perhaps take a good long look in the mirror; do they really see a "software engineer" looking back?

Perhaps cheating, since I started programming in that era, but I'm sure I'm not the only one that still remembers assembly-language techniques (if not even the mnemonics for one or more 8-bit chips), as well as remembering at least a handful of BASIC dialects, the finer points of bumming cycles and bytes to fit in tight spaces, etc.

In fact, though I certainly don't want to give up my current tech, my primary emotion looking at those pictures was wistful nostalgia.


I don't think some modern engineers would be successful back then, because today many rely on the abundance of online resources, code schools, and such. I was only 2 at that time, but when I learned programming on an Apple II+ a few years later I was entirely dependent on having a mentor (my father). I wouldn't have known where to go for help or what books to read otherwise. There was no concept of going online as far as I was concerned, and the only way to get basic information about computing was through the network of people you knew, of whom only a small percentage had any interest in computers.


Old manuals go a long way towards understanding the computers they describe, though. I've been nothing short of impressed with any manual I've found that was made before 1990.

Maybe that's because computers were not so complex yet, so you could actually fit their basic operation and commands in a few hundred pages.


I got my CS degree in 1984 and had been programming on my own since 1977. I'm still in development, so I can be your time traveler.

Yes and no? If you've taken the GUI, drag-and-drop-components route, I think you'd have trouble. Another big sticking point would be if you've only worked in languages with automatic garbage collection. My opinion is that you need malloc and free (or the equivalent in assembly, etc.) ingrained in you to get it right, to make sure your routines free up memory no matter the exit points. Of course, you could be a modern Linux C programmer using vi and you'd be their ninja rockstar on day one.

Comparing the two eras, I'd say you could do a lot less back then, but you were also expected to do a lot less. Standing up a database, using a connector to it in your language, getting input from the user to put in the database: 1-2 days now, weeks or more back then.
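For anyone who grew up on garbage collection, a minimal sketch of the discipline being described, in C with made-up names: every allocation is paired with a release, and all error paths funnel through one cleanup point so nothing leaks no matter where the routine exits.

    #include <stdio.h>
    #include <stdlib.h>

    /* Illustrative only: read a whole file into a malloc'd buffer,
       making sure every exit path releases what it acquired. */
    char *read_file(const char *path, long *len_out)
    {
        FILE *fp;
        char *buf = NULL;
        long len;

        fp = fopen(path, "rb");
        if (fp == NULL)
            return NULL;

        if (fseek(fp, 0L, SEEK_END) != 0)
            goto fail;
        len = ftell(fp);
        if (len < 0)
            goto fail;
        rewind(fp);

        buf = malloc((size_t)len + 1);
        if (buf == NULL)
            goto fail;
        if (fread(buf, 1, (size_t)len, fp) != (size_t)len)
            goto fail;
        buf[len] = '\0';

        fclose(fp);
        *len_out = len;
        return buf;

    fail:               /* one exit for every error path */
        free(buf);      /* free(NULL) is a no-op */
        fclose(fp);
        return NULL;
    }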


It depends on what you call a modern software engineer. Anyone who can program a small embedded system would probably be OK. People would miss IDEs and online help, the corpus of free software we enjoy today, and lots of other nice things we have now (IRC, Google, StackOverflow), but apart from learning to live in a more frugal environment, I see no big difficulty.

Think about developing for Arduino with software running on an Arduino.


Would they want to be employed in such an environment?

Imagine spending long hours trying to fit your code into 128k?


Try 64K on a PDP-11. Or writing assembly to fit into (say) 16K of ROM on a Z-80 or 6502; that's terrifying, because ROM is forever and you do a 6-8 week spin if you make a mistake. (EEPROMs? Sure, at about 8X the cost of a ROM).

Mostly you'd be up against:

- No Internet to look up reference material. For that you have books. I'm not sure how much of a revolution the online Unix man pages were, but I'd not worked on any other system that had that kind of documentation. Hope you have lots of bookshelf space (I did :-) ).

- No GUIs anywhere. There was Emacs, kind of. Mostly you got along with ed and regexprs.

- Frustrating toolchains: pre-ANSI C, with external symbol names significant to only 7 characters on earlier systems (a sketch of what that dialect looked like follows this list). I don't remember if any Unix debuggers had symbol information, but they were all command-line driven at the assembly level, with no source information. That's okay, you could pretty much tell where you were by the assembly, because the optimizers were terrible.

- Email? Hoo boy. Might as well just go across the hall to talk to somebody, because unless you were on ARPANET that's about all the farther your email would get.
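Per the toolchain point above, a hedged sketch of what old-style (K&R, pre-ANSI) C looked like: no prototypes, parameter types declared between the parameter list and the body, functions defaulting to int, and no declaration needed before calling printf. Illustrative code, not from any particular system.

    /* Old-style C: add() and main() are implicitly int, and printf()
       needs no declaration before use. */
    add(a, b)
    int a;
    int b;
    {
        return a + b;
    }

    main(argc, argv)
    int argc;
    char *argv[];
    {
        printf("%d\n", add(2, 3));
        return 0;
    }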

It'd be frustrating, but kind of fun.

Nice things:

- Tinier software. You've got skillz dealing with hundred thousand line programs. Things were smaller back then, mostly.

- No security worries. I don't know whether to laugh or cry, but DES was pretty controversial (the whole 56-bit key thing) and US citizens couldn't say anything to foreigners about crypto. No network, no crypto, right? (Unix passwords were hashed with something like a simulated rotor machine, and I think that salts came later; see the crypt(3) sketch after this list.)

- Boot times are about the same then as now. :-)
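On the salt remark above: a minimal sketch of the later, salted crypt(3) interface, assuming a Unix-like system where the classic DES-based crypt is still available (on glibc, link with -lcrypt). The two-character salt perturbs the hash so identical passwords don't produce identical password-file entries.

    #include <stdio.h>
    #include <crypt.h>   /* glibc; historically <unistd.h> with _XOPEN_SOURCE */

    int main(void)
    {
        /* Same password, two different salts -> two different hashes. */
        const char *h1 = crypt("hunter2", "ab");
        const char *h2 = crypt("hunter2", "zx");
        printf("%s\n%s\n", h1 ? h1 : "(crypt unavailable)",
                           h2 ? h2 : "(crypt unavailable)");
        return 0;
    }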


128k?!

Heh. Heh heh.

As if.

Try less than 16k. Less than 8k in many cases. Certainly well less than 64k.

Oh, and it was an absolute blast fitting code into that space.

In fact, my first computer of my very own was an 1802-based single-board machine I wire-wrapped myself. I had loads of fun coding up programs in the 1K of RAM I had at first.


Well, it's better than going back in time only to be unemployed.

And besides, there are people who enjoy it: the entire 64k and 4k demoscene categories live because it's an interesting challenge.


You can still find jobs today where you need to fit all code into 512 instructions and you only have 32 or 64 bytes of RAM (e.g. a low-end PIC12 IIRC).
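As an illustration of the budgeting that implies (generic C, not real PIC register code; all names are made up): with a few dozen bytes of RAM, every boolean shares a bit in one flags byte and state variables are single bytes rather than ints.

    #include <stdint.h>

    #define FLAG_HEATER_ON   0x01
    #define FLAG_FAULT       0x02
    #define FLAG_BUTTON_HELD 0x04

    static uint8_t flags;      /* 1 byte: all booleans          */
    static uint8_t temp_raw;   /* 1 byte: last ADC reading      */
    static uint8_t setpoint;   /* 1 byte: target temperature    */
    static uint8_t tick;       /* 1 byte: 8-bit timer counter   */

    void control_step(void)
    {
        tick++;
        if (temp_raw < setpoint)
            flags |= FLAG_HEATER_ON;
        else
            flags &= (uint8_t)~FLAG_HEATER_ON;
    }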


I could probably crank out a workalike clone of DOS 1.0 in a week or two :-)


Back then it took a week just to compile DOS 1.0.


You both exaggerate a bit. Sure, you could knock out a proof of concept for DOS in a week, but it would take months to get the nitty-gritty details right. And assemblers were pretty spry back then (they had to be), and you didn't write an OS in anything higher level.


MASM 1.0 was hardly spry: it had a linear symbol table, so lookups got slower and slower and assembly time grew roughly quadratically as the program got bigger. Plus, it crashed a lot, and I mean a lot. Every crash meant a tedious reboot, making development very slow.
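For the curious, a minimal sketch of the kind of linear symbol table being described (illustrative only): each lookup scans every symbol defined so far, so assembling n symbols costs on the order of n^2/2 string comparisons, which is why large programs ground to a crawl.

    #include <string.h>
    #include <stddef.h>

    #define MAX_SYMS 4096

    static char names[MAX_SYMS][32];
    static long values[MAX_SYMS];
    static int  nsyms;

    long *lookup(const char *name)
    {
        int i;
        for (i = 0; i < nsyms; i++)            /* O(n) scan per lookup */
            if (strcmp(names[i], name) == 0)
                return &values[i];
        return NULL;                           /* not found */
    }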

I'd write a cross-assembler that ran on the PDP-10. That's probably what Microsoft did, as I find it hard to believe that MASM 1.0 could have been used to compile DOS.

I'd also write the DOS clone in C, running it on the 10 with an emulator. Then I'd hand-translate it to assembler.

Remember, DOS 1.0 fit on a 160K floppy, including the numerous utilities thrown in. That isn't much code, even assembler code.


Paul Allen confirmed (in his book) that in the early years Microsoft used PDPs a lot to emulate smaller hardware. In fact the very first BASIC was developed without ever having the physical Intel CPU, only the Intel manuals.


There were fast assemblers. The SC assembler on the Apple II did maybe 1,000 lines a second. The x86 assembler from Digital Research was pretty fast too. You could play games with RAM disks and so forth.

The thing about cross-development on a mini or mainframe that killed you was the download time. I did cross-assembly at Atari, and it was always the download time that took soooo longggg. Wrote a few smart downloaders while I was waiting for dumber ones to finish. 9600 baud sucks hard.
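To put a rough number on that wait (my assumptions, not the poster's: 9600 baud with 8N1 framing, and a 48 KB image that roughly doubles in size when encoded as Intel HEX):

    #include <stdio.h>

    int main(void)
    {
        /* 8N1 framing = 10 bits per byte, so 9600 baud moves ~960 bytes/s.
           A 48 KB ROM image encoded as Intel HEX (~2.3x expansion) is
           roughly 110 KB, i.e. about two minutes per download. */
        double bytes_per_sec = 9600.0 / 10.0;
        double image_bytes   = 48.0 * 1024.0 * 2.3;
        printf("~%.0f seconds per download\n", image_bytes / bytes_per_sec);
        return 0;
    }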


He'd mop the floor with them. Systems, languages, environments, etc., etc., were trivial then compared to today. I worked back then in a few "labs" that looked a lot like that.


If you work in assembly, you'd be in good shape, though you'd probably pull your hair out over the hardware speed, memory constraints, etc.!



