Hacker News
Photos at Microsoft Corp (1979) (sound-photo.com)
200 points by colinprince on Aug 25, 2015 | 83 comments



And this is what these folks are up to now.

http://www.businessinsider.com/microsoft-1978-photo-2011-1?o...

Some of them have done reasonably well.


After that, it doesn't look like she did much else. She raised her children and became a volunteer

Not a great way to phrase things...


Or we've all just become hypersensitive to something that doesn't necessarily mean what you are implying.

...is what I was going to reply with, with examples of how replacing the component parts of the sentence may change how it's interpreted. Then I looked at the source article, and saw that it's a followup sentence after describing how she sued Microsoft for sexual discrimination. Regardless of how it was meant to be interpreted, that's some pretty poor awareness of context on the author's part.


Phrased as perceived. The author is probably too young to appreciate what his mother likely did for him. Becoming a parent has a tendency to change your perspective on parenthood.


I agree, that is shameful. Each year I take a week off from work to care for my 2 children when my wife goes out of town. Hardest week of my entire year!


Yep, I think it's just poor phrasing.

I just read it as "it doesn't look like she pursued her career further"

I don't think the author meant to downplay the difficulty of child rearing.


Could someone put this into a pastebin or something please? Somehow I can't go beyond the "we decided to investigate this" first page, either through the 'One page' or through 'Slides' (and there's no 'next'-'prev' either anywhere).


Thank you for that. I've tried disabling uBlock and running it in FF, but nothing seems to work. I was already coming up with conspiracy theories wrt my ISP. Guess their site is broken.

P.S. From what I can understand, the site tries to load the page via AJAX from a different domain but is denied access. So while the top URL is www.businessinsider.com, the content is being loaded from www.businessinsider.in (at least in my case), which your browser blocks since no 'Access-Control-Allow-Origin' header is present. Here is a mirror, by the way [1]

[1] https://filetea.me/t1sizFxkd5jQVyhAq9LjUUkDw


It annoys me mildly that they switched the order of then/now halfway through.


One of the guys (white shirt and brown corduroys) is Mike Courtney. He and I worked together at Seattle's Retail Computer Store, alongside Bob Wallace (Quickwrite & Shareware) and Tim Paterson (MS-DOS).


For those of us who've spent our entire careers in the networked, computerized office, it's interesting to realize that at the time Microsoft was setting the goal of having a computer on every desk, there wasn't even a computer on every desk at Microsoft.


Given that we still haven't achieved the paperless offices that many want, sometimes I start feeling like not much has changed over the last couple decades, but this album really illustrates the difference. It is rather humorous to look at just how much PAPER they used in development. Developers using paper binders of documentation, tractor feed prints of code, etc... it's nice that at least those days are mostly over!


If it were possible to send a modern software engineer back in time to that era, would they have any usable or employable skills in such an environment? (besides the uncanny ability to correctly predict the future)


Of course they would have usable skills - general understanding of computing, superb analytical minds, 36 years of advancement in education, excellent problem solving ability.

At most they would have to learn a new lexicon of programming languages and operating systems which are FAR less complicated than the current range of languages and design patterns software engineers must master today.

A few weeks (days?) of studying and figuring out how to pre-optimize for much smaller workloads and they'd be able to keep pace easily.


I agree with this for the top n% of programmers, but keep in mind there are orders of magnitude more "software engineers" employed today than back then, and many of them are employed in very narrow niches with very powerful tools. Certainly I've seen a great number of so-called programmers that would be utterly unemployable in such low-level environments as we had in the early 80s.


Oh the sweet innocence of youth!


Some of us are from that era.

Yes, you would manage so long as you can handle a shell prompt and can grok simple assembler.


Today's CS graduates are much better educated than people in those days. Early programmers wasted much of their energy fighting the platform, reverse engineering undocumented features, and haggling vendors for better access.

Today, you just focus on the problem and take the underlying platform for granted. Free Software has liberated computing from vendor-reliance.


The education system tried to keep itself above petty implementation details. Sure you needed to worry about it to get your assignments done, but that's not what the classes were all about.


fighting the platform, reverse engineering undocumented features, and haggling vendors for better access.

Whereas today we have Android.


Any "modern software engineer" that wouldn't have plenty of usable and employable skills back then should perhaps take a good long look in the mirror; do they really see a "software engineer" looking back?

Perhaps cheating, since I started programming in that era, but I'm sure I'm not the only one who still remembers assembly language techniques (if not even the mnemonics for one or more 8-bit chips), as well as at least a handful of BASIC dialects, the finer points of bumming cycles and bytes to fit in tight spaces, etc.

In fact, though I certainly don't want to give up my current tech, my primary emotion looking at those pictures was wistful nostalgia.


I don't think some modern engineers would be successful back then, because today many rely on the abundance of online resources, code schools, and such. I was only 2 at the time, but when I learned programming on an Apple II+ a few years later, I was entirely dependent on having a mentor (my father). I wouldn't have known where to go for help or what books to read otherwise. There was no concept of going online as far as I was concerned, and the only way to get basic information about computing was through the network of people you knew, of whom only a small percentage had any interest in computers.


Old manuals go a long way towards understanding the computers they describe though. I've been nothing short of impressed with any manual I've found that was made before 1990.

Maybe because computers were not so complex yet so you could actually fit their basic operation/commands in a few hundred pages.


I got my CS degree in 1984 and had been programming on my own since 1977. I'm still in development, so I can be your time traveler. Yes and no? If you've taken the GUI, drag-and-drop-components route, I think you'd have trouble. Another big sticking point would be if you've only worked in languages with automatic garbage collection. My opinion is you need to have malloc and free (or the equivalent in assembly, etc.) ingrained in you to get it right, to make sure your routines free up memory no matter the exit points. Of course, you could be a modern Linux C programmer using vi, and you'd be their ninja rockstar on day one.

Comparing the two eras, I'd say you could do a lot less back then, but you were expected to do a lot less. Standing up a database, using a connector to it in your language, getting input from the user to put in the database: 1-2 days now, weeks or more back then.


It depends on what you call a modern software engineer. Anyone who can program a small embedded system would probably be OK. People would miss IDEs and online help, the corpus of free software we enjoy today and lots of nice things we have now - IRC, Google, StackOverflow, but, apart from learning to live in a more frugal environment, I see no big difficulty.

Think about developing for Arduino with software running on an Arduino.


Would they want to be employed in such an environment?

Imagine spending long hours trying to fit your code into 128k?


Try 64K on a PDP-11. Or writing assembly to fit into (say) 16K of ROM on a Z-80 or 6502; that's terrifying, because ROM is forever and you do a 6-8 week spin if you make a mistake. (EEPROMs? Sure, at about 8X the cost of a ROM).

Mostly you'd be up against:

- No Internet to look up reference material. For that you have books. I'm not sure how much of a revolution the online Unix man pages were, but I'd not worked on any other system that had that kind of documentation. Hope you have lots of bookshelf space (I did :-) ).

- No GUIs anywhere. There was Emacs, kind of. Mostly you got along with ed and regexprs.

- Frustrating toolchains; pre-ANSI C, with only 7 significant characters in symbols on earlier systems. I don't remember if any Unix debuggers had symbol information, but they were all command-line driven at the assembly level, with no source information. That's okay, you could pretty much tell where you were from the assembly, because the optimizers were terrible.

- Email? Hoo boy. Might as well just go across the hall to talk to somebody, because unless you were on ARPANET that's about as far as your email would get.

It'd be frustrating, but kind of fun.

Nice things:

- Tinier software. You've got skillz dealing with hundred thousand line programs. Things were smaller back then, mostly.

- No security worries. I don't know whether to laugh or cry, but DES was pretty controversial (the whole 56-bit key thing) and US citizens couldn't say anything to foreigners about crypto. No network, no crypto, right? (Unix passwords were encrypted with something similar to a rotor machine, and I think salts came later.)

- Boot times are about the same then as now. :-)


128k?!

Heh. Heh heh.

As if.

Try less than 16k. Less than 8k in many cases. Certainly well less than 64k.

Oh, and it was an absolute blast fitting code into that space.

In fact, my first computer of my very own was an 1802-based single board I wire-wrapped myself. I had loads of fun coding up programs in the 1K of RAM I had at first.


Well, it's better than going back in time only to be unemployed.

And besides, there are people who enjoy it: the entire 64k and 4k demoscene categories live because it's an interesting challenge.


You can still find jobs today where you need to fit all code into 512 instructions and you only have 32 or 64 bytes of RAM (e.g. a low-end PIC12 IIRC).


I could probably crank out a workalike clone of DOS 1.0 in a week or two :-)


Back then it took a week just to compile DOS 1.0.


You both exaggerate a bit. Sure, you could knock out a proof of concept for DOS in a week, but it would take months to get the nitty-gritty details right. And assemblers were pretty spry back then; they had to be, and you didn't write an OS in anything higher level.


MASM 1.0 was hardly spry. It had a linear symbol table, meaning assembly time grew quadratically as the size of the program increased. Plus, it crashed a lot, and I mean a lot. Every crash meant a tedious reboot, making development very slow.

I'd write a cross-assembler that ran on the PDP-10. That's probably what Microsoft did, as I find it hard to believe that MASM 1.0 could have been used to compile DOS.

I'd also write the DOS clone in C, running it on the 10 with an emulator. Then I'd hand-translate it to assembler.

Remember, DOS 1.0 fit on a 160K floppy, including the numerous utilities thrown in. That isn't much code, even assembler code.


Paul Allen confirmed (in his books) that in the early years Microsoft used PDPs a lot to emulate smaller hardware. In fact, the very first BASIC was developed without their ever having the physical Intel CPU, only the Intel manuals.


There were fast assemblers. The SC assembler on the Apple II did maybe 1,000 lines a second. The x86 assembler from Digital Research was pretty fast too. You could play games with RAM disks and so forth.

The thing about cross-development on a mini or mainframe that killed you was the download time. I did cross-assembly at Atari, and it was always the download time that took soooo longggg. Wrote a few smart downloaders while I was waiting for dumber ones to finish. 9600 baud sucks hard.


He'd mop the floor with them. Systems, languages, environments, etc., etc., were trivial then compared to today. I worked back then in a few "labs" that looked a lot like that.


If you work in assembly you'd be in good shape. Though you probably would pull your hair out with the hardware speed, memory constraints, etc!


Ah, the good old days, when interacting with a computer often meant printing out 20 pounds of paper and then laboriously going through said printouts.


Gordon Letwin went on to troll OS/2 users in the early 1990's on Usenet. He had some sort of bet that Windows would have a specific feature (multitasking?) before OS/2. The bet was the loser would fly the winner to any city the winner desired (probably US) for dinner.


I was the one on the other side of that bet (not under this name.) It was multiprocessing. I said OS/2 would be there first and he said NT would. I'm not sure who won but we were pissed off at each other enough by the end of it, with some weird threats having been made, that dinner was pretty much out of the question.

Odd that you would remember that. I thought I was probably the only one who did.


I was there as 'melling'. Trolling the Internet was more fun when you didn't have to worry about down votes and karma. :-) I tried to stay reasonable, but the OS/2 guys were rabid. Those were the days before Microsoft was "evil" and people rooted for them against IBM. It might be time to root for Microsoft again.


I have one of the pictured terminals (the ADDS Regent 20) set up for my daughters to play Hunt the Wumpus on a Raspberry Pi (mounted inside the voluminous case; it even gets decent wifi with a USB dongle).

Oh, were there people in the pictures? The hardware overshadows them!


Towards the bottom of the page, there are two pictures of the mainframe that they used. The desktop computers in the other pictures were probably just terminals that were connected to it. It was a DECSYSTEM 2020. It ran the TOPS-20 operating system.

The text of the link calls it a minicomputer, but the links that I found called it a mainframe.

http://research.microsoft.com/en-us/um/people/gbell/digital/...

https://en.wikipedia.org/wiki/DECSYSTEM-20

The fans of TOPS-20 were nearly as anti-Unix as the ITS partisans. This may help explain why Microsoft has not been so fond of Unix and the Unix way of doing things, at least in its early days.


TOPS-20 on a DEC-2060 was an amazing adventure. These were the origin days of the ARPANET and machines had hostnames like SUMEX, SCORE, MITXX, STORK, LOTS-A, LOTS-B, and SANDIA. I'll let you guess where that last hostname was located.

Many, many hours were spent in Margaret Jacks Hall and CERAS at Stanford writing code and playing with the "net." Perhaps one of the best articles of the time was in Rolling Stone magazine, February 1982, entitled "Hackers in Paradise":

http://www.designersnotebook.com/Scrapbook/Hackers_in_Paradi...

It was indeed a rich and yeasty environment in which I learned many things about computers, programming, networks, and friendship. Alas, VMS ruled the roost at DEC and I never really enjoyed that OS, switching to Unix in the mid-1980's and never looking back. That Unix adopted the COMND JSYS style of command completion made one feel right at home.

Oh, and if you have a mind to, you can fire up an emulator with code from http://klh10.trailing-edge.com/

Remember, FisK.


I would give quite a bit to own a DECSystem 2020. You couldn't really run a KL-10 in your garage but man you could run one of those. These days simh can run all your old TENEX, TOPS-10, and TOPS-20 code but nothing quite like the real iron. The last working one I knew of was running in Mark Crispin's garage.


Microsoft developed a popular Unix (Xenix), so it sounds like there was some fondness.


If I'm not mistaken, to create NT, Microsoft hired a bunch of DEC people and put Dave Cutler at the helm. Like you say, none of those people were particularly fond of Unix.


NT was the weird mutant love child of VMS and Windows.

There was a lot to like about VMS, including early clustering, solid security, and an impressive file system.

NT managed to lose most of it, IMO.


A lot changed in 14 years, but this reminds me of the video of a visit to id Software in 1993: https://www.youtube.com/watch?v=Q65xJfVkiaI


In one of the photos there's a shot of a TRS-80 Model I, Level II, with expansion interface and three drives. Microsoft did the BASIC ROM for that machine in Z80 assembly, as they did for a number of micros. Full floating-point version in 12K. I cut my programming teeth on that in 1979, in BASIC and assembler.

They also did a cut-down version of BASIC for the Model I, Level I machine, without floating point and with single-letter variables A-Z(!). It had just two string variables, A$ and B$, and precisely three error messages: "How?", "What?" and "Sorry". They squeezed that into a 4K ROM.


I love the 70s computing aesthetic along with the fashion. So unapologetically idiosyncratic.


Just curious, what do you find idiosyncratic about the 70s computing aesthetic?


I dig the gigantic, Starship Enterprise-like interface. LEDs and buttons, geek nirvana: http://www.technikum29.de/shared/photos/rechnertechnik/dec/p...


He just likes to use idiosyncratic in a sentence.


Bingo.


In case anyone gets curious about the big computer with a large tape drive, https://en.m.wikipedia.org/wiki/DECSYSTEM-20


Since the author wants to sell the original photos, you'd think someone at Microsoft's corporate archives department (I assume they have something like that) or PR would snap these right up.


Looks like that will happen now.


So much paper.


Those noisy line printers must have been running constantly all day long.


Actual line printers were pretty quick and wouldn't have been running for long periods of time, and were likely to be tucked away in a back room somewhere. Now get a few 30-character-per-second DECWriters buzzing... And those were a vast improvement over the 10-CPS Teletypes that were the rage before video terminals were affordable.


I sort-of miss nursing a flotilla of chain printers for IBMs now. Sort of.


A former coworker had a story from when he used to service those things. He was doing a cleaning and was tossing the solvent-soaked rags into a garbage can on the other side of the printer door. Someone came by and tossed a cigarette into the can - managed to blow out quite a few ceiling tiles. He was always a bit hard of hearing but I never determined if it was connected to this incident or not.


I LOVE the sound of the 10 cps teletypes. The cadendes him while at rest and the monotonous printing...


Argh... Spell checkers... "Cadenced hum".


I like how every office has a cork board instead of a whiteboard.


I like how every office is an office.


How did they accomplish anything without a daily scrum meeting???!?!? They were so waterfall and not agile.


I wonder... how did they collaborate?


99% of their dumb ideas died a natural death before they managed to utter them to another human. The way work should be done.


And how were they ever able to develop a culture?


Also, there are no post-it notes.


Post-it notes were just gaining initial popularity around that time; in fact, they weren't even called "Post-it notes" until 1980.


It feels like a 'Mad Men' episode


More like a Halt and Catch Fire episode


I love the giant banner font used in the printouts on Matt McConaughey's desk. And the Trash-80s.


It's fascinating just how much printed material there is in these photos, and how much of the work they were doing back then was not digital. It's an interesting contrast to the modern era, where a very high percentage of the work in software development is done on a computer.


Is it just me or are they not smiling in any of the pictures :(


That is true, but you can't infer they didn't enjoy their jobs.


So much paper.


The man with the bushiest beard is Gordon Letwin. He single-handedly cost IBM a trillion dollars by making Windows do multi-tasking, killing OS/2 with less than 100 lines of code.

I saw his photo in the 90s, and it still haunts me. I'm an ex-Windows API programmer. I gave up when they changed their database access APIs too many times.


Sorry man, but that's not even close to what happened.



