Half an operating system: The triumph and tragedy of OS/2 (arstechnica.com)
329 points by jorgecastillo on Nov 25, 2013 | 189 comments



> Long before operating systems got exciting names based on giant cats and towns in California named after dogs, most of their names were pretty boring.

Ah, yes. Mavericks, California. It's a great little offshore town, just off Pillar Point. I love that town.

Kidding aside, this is a great article.

Related to this story, the Windows 3.0 visual shell was originally not supposed to be Program Manager and File Manager. It was going to be a program called Ruby that I worked on with Alan Cooper and our team.

Ruby was a shell construction kit with a visual editor to lay out forms and components, which we called gizmos. You would drag arrows between gizmos to connect events fired by one gizmo to actions taken on another.

The shell was extensible, with an API for creating gizmos. A really weak area was the command language for the actions to be taken on an event. It was about on the level of batch files, if that. But we hoped the API would allow for better command languages to be added along with more gizmos.
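(To make the wiring model concrete: here is a minimal, hypothetical sketch in Python of the idea described above, where gizmos expose named events and an event fired by one gizmo is wired to an action on another. None of the names below come from the actual Ruby API; they are purely illustrative.)

    # Hypothetical illustration of the gizmo/event-wiring idea -- not the real Ruby API.
    class Gizmo:
        def __init__(self, name):
            self.name = name
            self._wires = {}          # event name -> list of (target gizmo, action name)

        def wire(self, event, target, action):
            """Connect an event fired by this gizmo to an action on another gizmo."""
            self._wires.setdefault(event, []).append((target, action))

        def fire(self, event):
            """Fire an event: run every action wired to it."""
            for target, action in self._wires.get(event, []):
                getattr(target, action)()

    class Button(Gizmo):
        def click(self):
            self.fire("click")

    class Launcher(Gizmo):
        def launch(self):
            print(self.name + ": launching program")

    run_button = Button("Run")
    notepad = Launcher("Notepad")
    run_button.wire("click", notepad, "launch")   # the "drag an arrow between gizmos" step
    run_button.click()                            # prints "Notepad: launching program"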

BTW, this project was where the phrase "fire an event" came from. I was looking for a name for the process of one gizmo sending a message to another. I knew that SQL had triggers, but for some reason I didn't like that name. I got frustrated one night and started firing rubber bands at my screen to help me think. It was a habit I had back then, probably more practical on a tough glass CRT than it is today.

After firing a few rubber bands, I knew what to call it.

(As one might guess, I've always been curious to know if the phrase "fire an event" was used before that. I wasn't aware of it, but who knows.)

Anyway, Ruby didn't become the Windows 3.0 shell after all. They went with ProgMan and FileMan instead. To give Ruby a better command language, they adapted Basic and the result was Visual Basic. Gizmos were renamed "controls" (sigh), and my Gizmo API became the notorious VBX interface (sorry about that).

And we still don't have a programmable visual shell in Windows.


Inadvertently, you could say I owe my entire programming career to you specifically. Yes, I can expound upon Visual Basic's retarded conceptions, but playing around and making "Forms" was how I got really interested in programming as a career choice. One could say DOS batch scripting had a hand before that, but it wasn't until VB3 that I really pursued it in earnest. It wasn't until a CS 101 course teaching Pascal that everything really gelled into a cohesive unit in my brain. I still can't deny the VB underpinning even if I laugh when I admit it. Had Ruby been around with my batch scripting background, things likely would've taken the same turn either way. Regardless of how you see it now, there are likely a lot more people like myself with a similar story.

Having said all that, what you describe would still be interesting today. I just don't know how feasible it would be to implement. I had a file manager/shell idea in the form of a game construct. Something like crates to open or destroy as a delete mechanism. It was never more than a fleeting concept but I find it interesting that I'm not the only one lamenting about what is now Explorer.exe.


Wow, thank you, it's great to hear your story.

Actually, the apology at the end of my post wasn't about Visual Basic as a whole, but the VBX interface specifically. People did use it to build a lot of nifty controls and extensions, but the interface itself wasn't the best-designed thing in the world, and Microsoft eventually replaced it with COM/OCX.

Hmm... Not sure if that was an improvement!


A books.google.com search found "fire an event" in the 1979 "La Conception des systèmes répartis" (p.115) at http://books.google.com/books?id=1VogAQAAIAAJ&q=%22fire+an+e... . The quote snippet is:

"Control refers to any set of rules describing conditions under which processes may fire an event or switch to a new state."

Other sources confirm that there is a 1979 book with that title. (Google Books sometimes has a dramatically wrong year.) However, I don't have access to the book to verify it for myself.

There's also an interesting reference from particle physics: "Figure 6 shows the number of tubes that fire an event vs. the number of photoelectrons.", Proceedings of Workshop on "Weak Interactions and Related Topics", December 13-15, 1979. While not the same, it feels like a similar construction.

Otherwise, the only relevant hits for that phrase are post-1990.


Cool! That is very interesting, thank you for that research.

I guess I can't really claim to be the true originator of that term then. But it still makes a good story...


Yes, it does!

BTW, here are a couple of rule-based references to firing an event, in:

1) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.81.... -- "An event can fire, i.e. it is active, ..." -- Representing procedural knowledge in expert systems: An application to process control (1985).

2) http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.12.... -- "The object manager identifies fireable events, and fires the rules of each of the participants of the event." -- Using Objects to Implement Office Procedures (1983).


> And we still don't have a programmable visual shell in Windows.

Very true, there isn't one. But there are several. None of them all that good, though. Windows Workflow, System Center Runbooks, BizTalk: they've all made their mark. I've seen serious systems built using all of them; ultimately I have to wonder whether visual programming on that scale is even a good idea.


Not exactly desktop-focused, but I'm working on something in this space: http://noflojs.org/


No, it's not. Visual programming is basically bad for anything but toys.


Indeed, it is not a good language. It doesn't allow you to express proper abstractions or anything particularly interesting.

But it did allow you to get things done very quickly, and that's why you see it was used a lot in several industries for small things that don't require much maintenance and still do useful things.


I don't think we used that exact terminology, but back in the '70s in the real-time interrupt-driven world, the concept was there. In fact, if you had used that phrase at that time, everyone would have grasped the meaning.


Indeed, when I was firing those rubber bands, I didn't think I'd coined some truly novel piece of terminology. It seemed like a fairly obvious word, especially given the use of "trigger" in the SQL world.

Ah, it starts to come back! In SQL, "trigger" is a noun, not a verb. It refers to a stored procedure that's executed in response to some event.

I knew "trigger" could also be used as a verb, of course, but it seemed that it might be confusing to use it that way given its meaning in SQL. So that's when I started searching for words and firing rubber bands.

I might have fired up something else too, but that's a story for another day...


I knew about Alan Cooper's legacy in VB, but not the Ruby project. Is VB the only program at Microsoft that used this idea? Even another internal project?

Were you from Tripod too?


I wasn't working with Alan when he wrote the original Tripod prototype. He hired me, Gary Kratkin, Mark Merker, and Frank Raab to rewrite the whole thing in a more maintainable and extensible way.

Alan was a good coder before he decided to concentrate on UX design, but like many prototypes where a number of different ideas have been explored, the code got a bit messy and he felt a fresh start was called for.

There's more of the story here:

http://www.cooper.com/alan/father_of_vb.html

I don't know of any other Microsoft projects that used the Ruby code or ideas, just Visual Basic.


I bought OS/2 for work to run on some DEC PC (not the damn Rainbow, the decent 486 DEC sold). The graphics card (S3) wasn't supported out of the box, so I called IBM and got nowhere other than an acknowledgement that the driver existed.

I called DEC and they too believed it existed, so they (while I was still on the line) called their contact at IBM. After being transferred twice, we arrived at the person who could mail me the driver, but I would have to sign an NDA. The DEC rep and I explained we didn't want source or a beta driver, just the release one. He insisted every customer had to sign. I said I'd think about it. After hanging up, the DEC rep couldn't stop laughing. He asked if I wanted a free copy of NT compliments of DEC. I took it and it had the correct driver.

I tried, but they had no chance.


> not the damn Rainbow

Hey, I had a Rainbow! It was pretty amazing. IF it had reached its full potential it would have been the best computer until the Amiga came out.

Rainbow: CPUs - a Z80 (8-bit) and an Intel 8080. OS - CP/M and MS-DOS. Could be upgraded to a 286 later.

Would have been perfect BUT they didn't get things set up correctly and I was really stuck in Z80 CP/M land.


One man's innovation is another man's Frankenstein Monster.

It was actually an 8088 not an 8080.

Anyone who had to deal with people using those damn Rainbow floppy disk drives has my eternal sympathy. I really, really want to know what they were thinking on both the format and how you inserted those disks.


Had to look up what that was about. According to http://en.wikipedia.org/wiki/Rainbow_100#Floppy_disk_drives, they could do away with one drive motor, with only a 'minor' disadvantage: "Of note was the single motor used to drive both disk drives via a common spindle, which were arranged one on top of the other. That meant that one disk went underneath the first but inserted upside-down."


If anyone wants an insider's view, here's a Usenet post from one of the early Microsoft employees, Gordon Letwin:

http://gunkies.org/wiki/Gordon_Letwin_OS/2_usenet_post

Somewhere in the Usenet archive is Gordon trolling the OS/2 users for weeks (or months?) on end. I can't remember the exact details, but he had a bet with several people that Windows would have multitasking, or that OS/2 wouldn't have some sort of multitasking before Windows. The bet was to fly the winner to any city of their choice and buy them dinner.

The discussions were quite heated and it was particularly memorable because he was one of the first 12 employees at Microsoft.

http://en.wikipedia.org/wiki/Gordon_Letwin


Letwin was also the guy for OS/2 on the Microsoft side while the alliance lasted. His Inside OS/2 is a detailed account of trying to squeeze a real multitasking, protected memory OS into a '286:

http://www.amazon.com/Inside-OS-2-Gordon-Letwin/dp/155615117...

The cover photo is also classic.


The photo should be in a National Geographic captioned "A Unix beard in his natural habitat".


In its natural habitat. A Unix beard is genderless; it merely seems to exhibit a preference for male hosts.


I thought Unix greybeards were like Tolkien's dwarves in exhibiting little sex-based dimorphism.


"Foreword by Bill Gates"


So, did Letwin ever pay up?


I think Letwin turned out to be right. I can't find the Usenet thread. It was from the early 1990's.


Was he though? This is someone speaking from the outside, with fuzzy memories of release dates, but I'm pretty sure Windows (excepting NT) didn't have preemptive multitasking until 2000, and I think OS/2 was out before NT. I still have fond memories from the late 90's of a friend who adminned Windows and had a CD for when someone would set the password on the screensaver on the Windows boxes: the CD had an autorun that would pluck the password out of the registry and put it in the clipboard so you could paste it into the password box (no memory protection). That and the memory shotgunner: a program that would write random data to random memory addresses. No Windows machine would stay up for more than about 5 minutes running that, while Linux would kill the program and merrily continue on its way. Or the packet of doom to lock up Windows remotely.


Win95 had preemptive multitasking. Win 3.1 had cooperative multitasking. The "multiuser" aspect of Win95 wasn't secure at all, it was more for letting multiple users have separated settings (e.g., browser bookmarks). However, even that often failed because software would assume a single user and store all the settings in a common location. Since there was no real security in FAT32 filesystems that was pretty easy to do.


I may be splitting hairs at this point, but I think we're both partly right; according to Wikipedia (https://en.wikipedia.org/wiki/Preemption_%28computing%29), Windows 95, 98 and ME had preemptive multitasking for 32 bit programs, but not 16 bit (which would have been the majority at the time of release of Win95). My confusion may come from remembering running multiple DOS and Windows programs at the same time in OS/2 with nary a hiccup and being able to tweak settings (such as RAM allotments).


As I mentioned in a comment on an os2museum blog post, keep in mind Letwin left in 1993, before MS made many of the unethical attacks against OS/2 that are not mentioned in this article.


However, I got the impression he was asked to leave as a result of his confrontational behavior.


Up until March of last year a lot of ATMs in the US were still running OS/2 . . . I "upgraded" a lot of them to Windows XP. Yuck.

When I would take the OS/2 system offline and replace it with a Windows cage, the payment network would sometimes tell me the uptime on the deprecated machines . . . one network operator claimed 8 years of uptime at one particular machine. I have no way of confirming that, but I definitely felt the OS/2 machines were rock solid, especially compared to the vulnerable Windows machines. Most small banks with NCR machines are running two software packages (APTRA Edge or Advance) with default admin passwords and are really behind on the monthly bug patches. Eek.

The OS/2 machines required you to input config info in hex though, so I was glad I didn't have to work on them in the field too much.


When you think about it, 8 years of uptime isn't special for a machine running on a UPS and not needing a reboot for software updates. There simply isn't a reason for a well-built operating system running on solid hardware to crash.


No UPS for a lot of machines. I can't imagine the OS/2 machines getting an update, simply because core ATM features haven't changed since the 80's. But it was still impressive uptime, because ATMs are generally cantankerous creatures.

A software upgrade on the XP machines was ridiculous. It usually involved 3-4 hours of loading and booting and restarting with several different CDs. The other techs and I were convinced that NCR padded the installs/updates to take longer since their certified field techs billed out at around $300/hour.


Cosmic rays, but yeah, that's solid.


ATMs usually do not have a UPS.


Then I'm more impressed by the local utilities than the OS :)


I actually bought a copy of OS/2 Warp when it came out because I was interested in its preemptive multitasking, and it was a decent operating system for what it's worth. It was definitely more stable than Windows 3.11, but its real problem was compatibility. Back in the early 90s, everything was about compatibility, and while OS/2 had good compatibility, it didn't have perfect compatibility.

As well, I worked at a bank, and as the article correctly stated, the entire bank ran on OS/2, most notably the ATMs, except the ATMs I worked with were using OS/2 2.0.

However, when Windows NT 3.51 came out, that was the game changer. I was the only person I knew who even knew what it was (I read about it in a magazine at the time), and I was able to get a student-priced copy at my college bookstore. I started using it, and it was awesome; everything just worked, except for some games. You couldn't even compare NT 3.51 to OS/2, it wasn't even on the same level. The look and feel of NT was exactly the same as Windows 3.11, and all the programs worked.


My computer had 8mb RAM at the time and NT required 16 to install so I "borrowed" another 8 from one of the school computers, installed NT and then returned the memory. It ran well enough on 8.


I was lucky enough to have a 486DX-50 (not a DX2-66) with 16 MB of RAM, so it ran perfectly for me.

Even with what was bleeding-edge hardware at the time, I still remember trying to finish writing my senior project using Corel Draw and Microsoft Word on Windows 3.11, and having my computer crash every 30-45 mins, and my project partner eventually broke down and started crying because of how frustrating it was.


If you ever work with an older person who reflexively saves a document every few minutes, you know why.... habits established back when crashes were frequent and sadly, accepted.


Yeah, mine was a Pentium 90! (I decided to scrimp on RAM and get the 90 instead of the 66.) It was a huge jump from the 286 I had before. (Upgrading memory meant socketing rows of individual DIPs.)


Wow, I remember the Pentium 90s coming out and being amazed. I of course had rubbish old late 1980s British computers to use whilst everyone else was on 16 bit machines, but when the Pentium II came out, I finally got hold of a 486 DX2 66Mhz. It is strange how you learn to "make do" with hardware and systems, and learn patience. I finally bought an Ivybridge i7 last year after "making do" with machines for a long time. I fondly remember spending hours and hours tinkering with slow systems, but being just as productive!


There are times when you need to have high CPU performance, but the reality is that all semi-current machines are amazingly fast and capable of productive work.


I remember buying a copy of OS/2 Warp from Grey Matter in the UK for £5 as a promotional version (it required a version of Windows to upgrade from, IIRC).

Sadly the big problem was that it swapped like mad in 4MB, which Windows 95 didn't. That doesn't sound like a big problem today, but the extra memory would have been expensive at the time. In particular, 4MB was a popular option on 386/486 machines, arranged as 4 x 1MB, which often filled all the SIMM slots on the motherboard, so upgrading to 8MB meant buying 4 x 2MB SIMMs, which doubled the memory cost.

For most of us at the time 4MB was a good usable amount of memory with EMM386 but the upgrade past that didn't give much extra functionality.


3 years ago I did consulting for a UK bank on a project that involved migrating off OS/2.


I worked at a large computer chain (R.I.P. Softwarehouse / CompUSA) from 1993 - 1996 and had been building clone computers for businesses from 1990-1993. I remember how this played out very well.

At the time, IBM had sent in scores of company reps to train up our floor staff on the advantages of OS/2 over the always-soon-to-be-released Chicago. They did a good job getting all of us to "drink the Kool Aid". I received a free (not pirated, promotional) copy of blue OS/2 Warp 3.0. It was a fantastic operating system for running a DOS based multi-node Telegard BBS and it did well with Win16 applications.

The impact of Windows 95 coming on the scene, though, is difficult to fully appreciate unless you were there. We had been selling pre-orders for months and there were a myriad of promos. I remember some of those preorders were sold under the threat that there wouldn't be enough copies to go around on release day. I had been playing with pirated copies of the betas of Windows 95 for the prior two months. Even in its beta form, it ran circles around Windows 3.0/3.1 in terms of reliability. I even remember reloading my PC with the most recent beta after release because a DOS application I used ran more reliably in it than in the RTM code.

Then launch day came. It was unlike anything I had ever seen in terms of a software release. We closed up at 9:00 PM and re-opened at 12:00 midnight to a line of customers that went around the building --- A line of customers ... for an operating system. We joked at the time that "Windows really was that bad". There were tons of additional promotions to ensure people came and lined up -- some RAM / hard disks selling under "cost" and others. And the atmosphere of the store felt like a party. We had theme music playing (start me up?) and some Microsoft video playing on our higher-end multi-media PCs. It was obvious to us, on the floor, trained by IBM's marketing machine, that Warp died that day.

As an anecdote to the stories about IBM's marketing being a little off: I remember around the release of Warp 4.0 I saw an advertisement at a subway station that said something along the lines of "Warp Obliterated my PC!" -- that tagline, evidently, meant to be some hip new use of the word obliterated.


> We closed up at 9:00 PM and re-opened at 12:00 midnight to a line of customers that went around the building --- A line of customers ... for an operating system.

I grew up in Dallas, SoftWarehouse (somebody else remembers that name, awesome)/CompUSA's original stomping grounds and this story kept coming back to me throughout the whole article. Windows 95 was considered revolutionary at the time, even to those of us lined up at the Lewisville, Texas store at 10:30PM to buy an operating system. (Incidentally, the first time I ever talked my dad into taking me to an overnight release of anything.) Windows 95 was the first operating system I ever saw non-technical people set out intentionally to buy and I spent months installing it for friends and family.

> We had theme music playing (start me up?) and some Microsoft video playing on our higher-end multi-media PCs.

Yep, it was "Start Me Up" by the Rolling Stones. If I remember correctly, the video was a demo reel of everything new in Windows 95 and was highlighted by the huge Start button popping in at the end of the video, then fading to black. It even had a snippet of the waving Windows 3.1 flag and the bear from the "Help / About" Easter Egg hidden in 3.1.


I bought my first PC from that Softwarehouse in the late 80's. Man, I so loved that store.


If you liked this article, you should read Show Stopper. It's out of print, but there are ample used copies via Amazon.

http://www.amazon.com/gp/aw/d/0029356717/


It's not quite at the level of Soul of a New Machine, but I still think every programmer should read this book. A great story from the trenches. It gave me so much respect for Dave Cutler, and confirmed so many things I suspected about Microsoft at the time.


Yeah? My reading of that book was that there was such an inhuman marriage/life destroying crunch getting NT out the door that nobody covered themselves in glory, much less showed themselves worthy of respect.

And it explained to me, in clear terms, why Windows was such a buggy pile of shit. It was created of its culture.


Seems to be everywhere as an eBook.

http://ereads.com/ecms/book_title/Showstopper (warning, will set affiliate codes on Amazon links)


"FoxTales" by Kerry Nietz is also a good read. Kerry talks about his experiences joining FoxPro development as a freshout college grad. I never cared all that much for FoxPro/xbase, but the book is a good read.

http://www.amazon.com/FoxTales-Behind-Scenes-Fox-Software/dp...


OS/2 had many flaws but its multitasking was unseen on a PC at the time. I remember formatting a floppy while running two 16-bit Windows sessions (which were communicating with each other) and multiple DOS windows, thinking I was in the future.

Even Windows 95 was limited by many system calls being funneled through single threaded BIOS or DOS 16-bit land.


That's my reason for switching to Linux, back in the Slackware days.

Back when CD burners were still uncommon, I got as a gift a Japanese, SCSI-based one. With my hardware, I'd lose a CD if I forgot to disable the screensaver - the number of disk seeks required to load the screen saver executable was enough to starve the buffer, and I'd get a buffer underrun every single time.

So, one time I was trying a linux system, and had to do a last-minute presentation, which required files on a floppy drive. For some reason, I had no usable floppies and had to format one, and I couldn't wait for the burn to finish.

So I inserted the floppy and, fully expecting to lose the CD, called fdformat. cdrecord didn't even flinch; the buffer was still full when the format finished.

I only ever booted Windows from then on to play games.


Couldn't the Amiga already do all that multitasking on a 256KB machine, years before?


Sinclair QDOS, which ran on the Sinclair QL, did preemptive multitasking on a 128KB machine (buggy as heck when it came out, but it did precede the Amiga by almost a year and a half).


I don't think the Amiga had real multi-tasking, from what I understand. I may be wrong.


Yes, the Amiga had proper honest multitasking. A normal boot of an Amiga system would typically result in over twenty processes running in the background. In fact, its particular style of multitasking (static absolute priorities) was well-suited to real-time operation. Back in the days when CD writers could create coasters from buffer under-runs, I had more success writing CDs using my 25MHz Amiga than my 400MHz PC. Also, it was a microkernel system with things like device drivers and filesystems as separate processes, which had some fairly nifty consequences. For instance, it took Linux ages to lose its single kernel spinlock, which was a problem because of the huge amount of stuff done in kernel space. The thing the Amiga didn't have (mostly because of lack of hardware capability) was memory protection.


It had real preemptive multitasking, but lacked memory protection.


Yes, but it was 512 MB, not 256KB.

Plus it had a heterogeneous architecture, with dedicated chips for sound and graphics.

Just set the required data structures and let the chip do its work alone. Sounds familiar?


256KB ROM, 512KB RAM. This was a time when hard drives were around 40 MB; 512MB of RAM was unheard of.


Yes, 512 KB, sorry. I wanted to "fix" your 256 remark and ended up mistyping MB instead of KB. My first computer used tapes.

You also needed to load the Workbench and related libraries from floppy, so ROM firmware alone wasn't enough.


The Amiga 1000 first came out with 256KB of RAM. It didn't have much space left after the OS had booted up though. It was the Amiga 500 that came out a little later that had 512KB.


Ah, my circle of friends only had Amiga 500, 600 and 1200.

I don't remember if I ever saw a 1000.


He said for a PC, i.e. an IBM PC clone.


Many of the people I knew in the beginning of the 1990's ended up running either DESQview or OS/2. And that was because they were running either a one-node bulletin board system on their only desktop computer or a multi-node BBS on their dedicated BBS computer.

Around that time I had upgraded from a 1200 bps to a 2400 bps to a 14400 bps modem. Mostly the multitasking ran smoothly enough to provide a reasonably speedy BBS experience even for the 14k4 caller. The multitasking was mostly visible when the system was building and compressing the QWK archive for offline message reading (Blue Wave FTW).


In the very beginning of the 90's I programmed a BBS and had it running under DESQview. In 1994 I had to build a data bank accessible by modem, and based it on my BBS, but used OS/2 as the multitasker, which made a big difference in stability.

In 1996 I had to provide access to that data via the internet, and used the same OS/2 box for both modem access and an internet server with most of the usual services (mail/web/dns/proxy).

It was very reliable (I really don't remember reboots; maybe a few, mostly for upgrades). It impressed me when I realized I had created a 16mb vector on that 16mb server with everything still running normally, and I loved WPS (I'm not sure which of today's desktops has such good integration with the fs) and REXX (parsing without regexes was good, but I later learnt how limited I was).


I remember running DESQView '386 for exactly that purpose. A TAG BBS on one session, and then whatever else in another session.

It was reasonably amazing at the time to be able to run more than one thing concurrently on your PC.


I was constantly listening to modules (first with DMP and later with Cubic Player) and DOS shelling from the player to do something else. That something else being editing ascii/ansi art or programming in Turbo Pascal. Because the module playing was done in a TSR, the players at the time could not switch to the next song in the playlist unless you returned to the player. When the music ended I had to quit the application I was working in and "exit" from the DOS shell to give control back to the player software. And then back to the DOS shell and into the application.


One of the cool things about OS/2 is that it changed the sound the floppy drive made as it track-seeked. Rather than a chunk-chunk sound, it was more of a buzzing noise.

The small things one remembers...


It almost seems too early to reminisce about this stuff. My first computers were a TRS-80 clone and a Commodore 64, but this is the era in which I really started to get into computers: the Solaris and NeXT ads in Byte and PC Magazine, with impressive looking screenshots of a busy computer; the Mac IIfx briefly taking the clock speed crown at a whopping 40 MHz; wondering if I'd take an Amiga or a Mac to college, then settling on a 486SX from Gateway.

In college I really wanted to like OS/2 2.0 (and later 2.1), but driver problems with the Diamond video card were constant. (If only we'd sprung for the ATI Graphics Ultra Pro!) I had a copy of DeScribe; later sold it to someone through the ISCA BBS.

My impression at the time was that Microsoft executed so much better than its competitors, offsetting its weaker Office products with a better UI, which in turn gave you a reason to run Windows. I later attributed its success much more to its ruthless business practices.

This article brings the focus on the strategic vision: betting big on clones; belatedly embracing the Internet; hammering away at PDAs and tablets, yet losing big to the iPod, iPhone, et al. Sometimes we predict the future, and sometimes we make it.


No, there were some unix-like systems running at that time, three or four years before OS/2.


"Some" is an understatement and "unix-like" is kind of funny. There were several strains of unix available when OS/2 was released in 1985.

System V: https://en.wikipedia.org/wiki/System_5

BSD: https://en.wikipedia.org/wiki/BSD

SunOS: https://en.wikipedia.org/wiki/SunOS

Xenix: https://en.wikipedia.org/wiki/Xenix (which was created by the eventually infamous Santa Cruz Operation - SCO - and licensed by Microsoft)

Also VMS qualifies as a fully memory-protected, preemptive OS. The real innovation of Windows and OS/2 was to take fully preemptive OSes and put them on marginal hardware (for the time) like PCs.


Xenix wasn't created by SCO; it was developed by Microsoft (initially for the PDP-11) and was based on licensed AT&T Bell Labs code (and some BSD).

SCO ported Xenix to a few processors for Microsoft, starting with the 8086/8088. It wasn't until 1987 that SCO took ownership of Xenix.


You are correct, but I was referring to desktops, and the particular example is Coherent from Mark Williams Company which was demonstrated running on an IBM PC at the National Computer Conference show in Houston in 1983. Minix came in 1987, if I recall correctly.


There was also Minix at around that time (late '80s/early '90s).


About 1985 IBM had a product called TopView that ran a multitasking supervisor on top of DOS and could do real multitasking with the right disk drivers. No protected memory, though. TopView was also really memory heavy -- you gave up at least 160KB just with TopView. Two guys created a TopView clone called Mondrian that ran in just 40K, and ran much faster. Those two guys were Nathan and Cameron Myhrvold. Mondrian was sold to Microsoft and those two guys went with it. Mondrian was a really clever engineering feat.


Oh, you made me remember again. I had forgotten that little detail of formatting. In those days formatting was a common practice; nowadays, unless you want to do a wipeout clean install or prepare a USB stick with another file system, you don't need to format :).


That's basically why we tried OS/2 at my workplace at the time: to compile and continue working at the same time.

However, we very soon switched to simply having two PC's on our desktops.


I remember installing it from the 15 billion floppies it came on and thinking much the same thing.

It seems like it was on 5-1/4" disks. Can that be right?

And REXX! Ha. REXX.


3 1/2"

The IBM PS/2 only came with the smaller disk, so bigger wasn't necessary


For me, the worst thing is that Caldera was able to continue suing MS due to this Win9x dependency on DOS.


You mean MS-DOS. Caldera and others had their own version of DOS, and MS went out of their way to make sure no one used it.


I remember my dad getting a promotional shirt for OS/2 with the caption "Flight 4.0 to Chicago has been delayed, I'm taking off with OS/2"

The idea being that Windows 95 was internally called Windows 4.0 with the codename Chicago.

I keep on searching for it but can't find it anywhere.

And Bill Gates on OS/2 in 1987: "I believe OS/2 is destined to be the most important operating system, and possibly program, of all time."


You can find many of Microsoft's codenames here:

http://en.wikipedia.org/wiki/List_of_Microsoft_codenames#Win...

My favourite codename sniping had to do with Windows NT (codename Cairo) and NeXT. When announcing NeXTSTEP 4.0 (codename Mecca), Jobs quipped, "Why stop at Cairo when you can go all the way to Mecca?"

http://www.paullynch.org/NeXTSTEP/Expo.1994.htmld/


I remember seeing this article in popsci way back when: http://www.popsci.com/archive-viewer?id=vypUfjzMwlAC&pg=52

It talks about Windows 4.0 and other contemporary "next gen" operating systems.


I have fond memories of OS/2 from the summer of 1995. At the time, I was an undergraduate at the University of Texas at Austin, and IBM needed summer intern testers for a product they were calling "OS/2 Lan Server Enterprise". OS/2 LSE was IBM's effort to re-platform OS/2 LAN Server on top of OS/2 DCE (in development in the lab next door to LSE). The general idea was to provide a way to scale up OS/2 so that it would interoperate with other DCE-based systems (mainly RS/6000 AIX, IIRC).

Anyway, the machine IBM gave me to use was a PS/2 Model 80. This was a 1988-era machine that had been brought into the semi-modern era with 20MB of RAM installed via several MCA expansion cards. Against my best expectations, the machine ran well, despite the fact that its CPU was at best 10% the speed of the then state of the art.

From what I remember, the OS/2 LSE product itself was fairly solid. However, the biggest memory I have from that summer was the afternoon we spent playing around with the Microsoft Windows 95 beta disk we received for compatibility testing. Towards the end of the afternoon, we tried to DriveSpace (compress) the disk. We got bored during the wait for the compress, so we pulled the power on the machine thinking that would be the end of it. However, once we powered the machine back up to install OS/2, Windows 95 just resumed compressing away like nothing happened. A few weeks later, a friend and I went to CompUSA for the Windows 95 launch. Even at midnight, there was a line out the door, winding past the Windows 95 boxes, then the Plus Pack, then Office 95, and then memory upgrades... Didn't hear much about OS/2 after that...


Incredible that companies like Apple (Mac), Atari (ST) and Commodore (Amiga) weren't able to fully capitalise on their leading position in GUI based OSes of the time, which were miles ahead of both MS and IBM.


This is slightly counterfactual. The Mac today has a solid unix heart and runs on the best processors and technology out there, but that was not always so, and they were weakest precisely when PCs grew strongest.

The internals of pre OS X Mac OS are horrific and disturbing. They are in no way superior to the internals of Windows 95 and are at best only perceived superior due to a different user experience peppered with a healthy dose of self-delusion, certainly not through stability or performance. They are infantile compared to OS/2. Moreover, over the next few years after Windows 95 was released the PC improved greatly while the Mac mostly stagnated. Around the Power Mac era things were roughly even in terms of capabilities and performance though the Mac was significantly more expensive, by the Pentium and especially Pentium II era the PC began to become objectively more powerful than the Mac.

This put Apple into a freefall that they were only rescued from by the return of Steve Jobs, who established style as a foundation the company rested on and pushed them into digital media and mobile devices, as well as forcing a hardware architecture migration (to the formerly hated x86 from the PowerPC architecture developed by a consortium of which Apple was a huge part) and a complete OS rewrite (transforming Mac OS from an antiquated bucket of kludges into NeXTSTEP in Apple clothing).

At the time in question though Apple was pushing old, slow hardware at a price premium in a market that was rapidly passing them by.


Very spot on! Not many seem to remember, but OS/2 truly was technically superior to just about everything else out there in consumer computer land, excepting maybe things like 386BSD (which many of us didn't know about at the time, and probably would have peed our pants at setting up X ;). OS/2 had pre-emptive multitasking and memory protection; Apple and Microsoft didn't even come close in their offerings. Say what you will about how IBM fumbled the marketing, licensing, etc, or how the upstarts(!) won against the Evil Empire, but OS/2 was the better operating system, and Microsoft honed its FUD by attacking IBM and OS/2, whilst offering a truly awful alternative.


Oh god. I remember setting up X back in the dark days of the early linux era. Peeing ones pants would have been preferable, it's astounding how far we've come.

As for Windows vs. OS/2, it's at best complicated. It's a bit like a microcosm of the PC vs. mainframe debate. The raison d'etre of Windows 95 was backward compatibility with a low memory footprint.

That may seem like a small thing, or at best not a thing to make so many enormous compromises over, but back then it was everything. The problem with doing multi-tasking "right" was that it imposed a ~4mb RAM requirement per 16-bit application being run at the same time. That is nothing today but back then 4mb was the minimum requirement for installing Windows 95, and it represented a cost of around $100 in 1995. Owning a computer powerful enough to run even a handful of 16-bit apps while running OS/2 or NT was simply above the economic means of a lot of people. And by the time technology caught up and RAM became cheap enough to make proper multi-tasking cost effective there was too much Windows 9x network effect for competitors to make much headway.
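(A rough back-of-the-envelope using only the figures above; the app count is an arbitrary "handful", purely for illustration:)

    # Back-of-the-envelope from the figures above (illustrative only).
    dollars_per_mb = 100 / 4          # ~$100 for 4 MB in 1995
    base_win95_mb = 4                 # minimum install requirement for Windows 95
    per_16bit_app_mb = 4              # per concurrently running 16-bit app, "done right"
    apps = 4                          # "a handful" of 16-bit apps
    total_mb = base_win95_mb + apps * per_16bit_app_mb
    print(total_mb, "MB ~ $", total_mb * dollars_per_mb)   # 20 MB ~ $500 of RAM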


Remember setting up X11 and needing to manually set signal timings for your monitor? Good times.

That experience ranked up there with writing my own PPP init scripts and endless hours tweaking my fvwmrc. I didn't play video games during that period, because I was quite seriously having more fun learning every nook and cranny of any unix or unix-like system I could get my hands on.


Haha, setting up X, very true, really made me laugh, thanks. And then the same pain with ndiswrapper because I bought cheap wifi cards that there were no drivers for....

And then folks complain today about the tiniest things!


Those of us around then know exactly why this was: PC Clones were too cheap and got better too quickly for other platforms to compete.

Back then we all thought "computers" was a hardware game. Only Microsoft realised the hardware didn't matter, software was the main game. And yes, I realise we have swung back to the "integrated hardware/software platform" thing being important again. Picking winning strategies in platform wars is hard.


To add to what nl said: Not only were MS-DOS compatible PCs cheaper, but also MS-DOS apps had a significant install base by the time OS/2 came out. The concerns of running legacy software on the then-new 286 and the "just over the horizon" 386 CPUs were heavily weighted. Also, GUIs on all platforms then took up enough system resources (memory and CPU) to make enough folks pause, thinking they could get better performance/cost running console-based apps.


> Only Microsoft realised the hardware didn't matter, software was the main game.

Yes. This was one of Bill Gates' many strokes of genius. After just one year of Traf-O-Data (Microsoft's precursor), Gates saw that the future was in software, not hardware, and he created Microsoft centered around this very vision (while Apple bet the farm on hardware).


Apple tried allowing cloners to copy the hardware and being the software company for a while; it nearly killed them. I don't recall if the clones were ever price-competitive with x86 clones though.


To be fair, almost everything Apple did at that time "almost killed them".

The clones did expand the Mac market, but Apple was so incompetent at that point that it didn't know how to handle that.


Around 1989 I traded my Amiga 500 for an IBM AT clone running DOS. Each was worth around AUD$1000 at the time. It felt like I was going back in time 10 years - a mono screen and no mouse. The reason was that I needed it for uni study purposes (Turbo Pascal, Turbo C++, etc.) ...


Also, Atari didn't get marketing (Jack Tramiel was very reluctant to spend money there). Atari really didn't get systems software, either, and never hired to the extent needed to make a good impact there. The world was basically safe from Atari being anything other than a low-end consumer computer company.

And the thing that probably saved Apple in the late 80s was the desktop publishing business.


Also, Atari didn't get marketing (Jack Tramiel was very reluctant to spend money there).

I know that is the conventional story (and pretty similar for the Amiga). I'm not convinced.

I think that both the Amiga & the Atari ST were too far ahead of their time. They were multimedia workstations, without anywhere to play that multimedia (except on other Ataris and Amigas).

Like you said, the Mac managed to hit the desktop publishing wave, which was exactly right for the paper-centric late 80's and early 90's.


I worked on the OS for the Atari ST. It sure wasn't ahead of its time. We looked at the Mac and felt jealous; that was some real engineering, while we had a bunch of crap from Digital Research (and a lot of it was unadulterated junk).

I never used things like MiNT, but those weren't supported by Atari anyway.

Atari just didn't have the resources to sink into an OS that could compete. They knew how to make cheap hardware, but after a while the PC ate their lunch. Nobody wanted to do biz with the Tramiels, so games were pretty much off the table.


Yeah, the Amiga seems like it completely dominated the multimedia niche, but the multimedia niche of the time essentially consisted of the demoscene. Which was great, except that the demoscene was not that large.


Huh, Amiga (1000 onwards) was big in video. Especially smaller TV stations that couldn't afford SGIs. Amiga (and SGIs) were light years ahead of Macs in graphics. People often forget this fact.


Amiga 2000 (mostly) + NewTek Video Toaster[1] were the driving force behind the TV graphics back then.

[1] http://en.wikipedia.org/wiki/Video_Toaster


My earliest forays into CG were on amiga and earliest paid jobs were with video toaster / lightwave and softimage on SGIs. Great times. With tools we have today they seem so primitive in comparison.


> And the thing that probably saved Apple in the late 80s was the desktop publishing business.

...and education, and government, and academia. Apple was a pretty safe bet in the 80s (The Mac IIfx was designed to government specs and was the fastest desktop PC around). They only really started to lose market after Windows 95.


One reason: stigma.

All I recall from my childhood/teen years is "IBM compatible PC" as a mantra in every TV commercial IBM or others produced.

A PC was for business and you were rich or a fool to spend thousands on something so useless to a regular Joe.

Apple was even more expensive and even less common than a PC, and even so Apple was quirky and drew pictures; Atari and Commodore were for games.

They were in totally different worlds. Back then I wouldn't have even thought to have one computer that did everything.


I dunno about that. At least where I grew up, if your family had a Mac or an Amiga, it was like you had domesticated a unicorn. Everybody wanted to see it, play with it. It marked you out as a member of a family possessed of either great sophistication or enormous wealth, either of which translates easily to status.

If there was any stigma, it was for being on the opposite end of the spectrum -- having a computer whose primary selling point was that it was cheap, like a Commodore 64. The C64 was a fine machine for the price, but nobody was going to ooooh and aaah over it the way they would if you took them into your Dad's study and showed them MacPaint.


Yeah.

I was the only one, among my circle of friends owning computers, that had a PC instead of an Amiga.

My dad thought the Amiga was only good for playing games; for anyone serious about computers, the PC was the way to go.

So I was left reading 68000 Assembly manuals, some Amiga reference books, and playing with them on computer parties we used to organize.

In any case, the only way to buy any of them in my home country was on credit.


Agreed, Atari ST and Amiga especially never caught on much with businesses ... they were definitely seen as games machines, even though their capabilities were certainly on par with the IBM clones.


And Tandy? :)


An excellent book that studies that is The Future Was Here http://www.amazon.co.uk/Future-Was-Here-Commodore-Platform-e...


> IBM licensed Commodore’s AREXX scripting language and included it with OS/2 2.0.

I find this hard to believe, given that Rexx was developed by IBM.

http://en.wikipedia.org/wiki/Rexx


Indeed. And IBM do, in fact, still make other OSes, contrary to the subhead of the story: zVM/CMS, zOS (MVS), AIX, and whatever they're calling AS/400 these days.


OS/400 is a very interesting OS for those interested in OS architectures.

The complete userspace is bytecode based, regardless of the language.

Applications are compiled at installation time, or when the generated code is deemed to no longer be valid.

When there was an architecture change for the PowerPC, many installations only required a regeneration of the installed software.

A concept that Microsoft tried with Longhorn and Windows Phone 7. Or we could even say, the model Android almost has as well.

Native Oberon and Inferno also tried a similar approach, to a certain extent.


> When there was an architecture change for the PowerPC, many installations only required a regeneration of the installed software.

Unfortunately it wasn't quite that clean in practice. Regen required TIMI to have access to the compilation templates for each program. These were intermediate compilation stages ( bytecode, I suppose ) that it could then translate to the new machine architecture.

However, these templates were often missing, deleted or out-of-sync. So we did an awful lot of recompiling from source, when we could find it...


Thanks for the input.

I only did a bit of AS/400 administration back in the summer of 1994: I just logged into the system and started the backup procedure.

I was actually doing Clipper development, and the company where I had a summer job used AS/400 systems for accounting.

Most of what I know about the OS/400 bytecode system I found while looking into compiler technologies a couple of years later.


The thing I find interesting about OS/400 (or whatever its name is this month) is the single-level store. When your program needs access to the contents of a file, it's just a memory offset. The OS relies on the swapping mechanism to bring those pages into available memory. Which, when you have flat 64 (one could argue 65) bit addressing … why not?
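(For readers on a conventional OS, a rough analogy is mmap: you opt in, file by file, to the "contents are just memory offsets" model that the single-level store applies to everything by default. A small hypothetical Python sketch, assuming a file named data.bin exists and is at least 4 bytes long:)

    # Rough analogy only: on OS/400 all objects are addressable this way by default;
    # on Linux/Windows you opt in per file with mmap.
    import mmap

    with open("data.bin", "r+b") as f:
        view = mmap.mmap(f.fileno(), 0)   # map the whole (non-empty) file
        first_byte = view[0]              # file contents read via a plain memory offset
        view[0:4] = b"ABCD"               # writes flow back through the paging machinery
        view.close()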


"Everything's a file!"

"No, everything's memory!"


Wait wait wait...so it effectively memory maps all the things?

Mainframe environments are weird.


And OS/400 files are libraries that you need to open, as another example of strangeness.

Or the OS/360, which uses virtualization for all OSes, like Hyper-V does on Windows.

The first OS to boot from the hypervisor has master rights, but all OSs are virtualized.


> Or the OS/360 which uses virtualization for all OSs, like Hyper V does on Windows

No, the hypervisor was called VM. According to WP it most frequently ran CMS guests, which was a lightweight single-app OS. But it could also run OS/360 guests (which predated VM).


AS/400 isn't mainframe. It's midrange; smaller than a mainframe, and not as flexible.


That is why I always call it by the original name. :)

Yep that is also a nice feature.


Android with ART sounds very much like that. They compile the bytecode to native code at installation or major arch change.


AS/400 is now "IBM i".

Which leads us to the great set of names where we have: IBM i, Apple iOS, Cisco IOS


I am wondering if the tagline got changed--it now says IBM doesn't make consumer, desktop operating systems anymore for a reason.


Originally it stated "IBM don't make operating systems for a reason". It's hard to have full confidence in the rest of the article when that's what it leads with.


Ars Technica isn't exactly big on fact checking. These days most of their short articles are rewritten press releases, and most of their long articles are personal opinion/recollection written as if it was reportage.


While they may have got the part about the history of AREXX wrong, most of the article is spot on.


The AREXX error was my mistake. I just read the technology transfer part backwards when I was researching. I've already fixed it in the article and updated it.


The original tagline was "IBM doesn't make operating systems anymore for a reason" which could have been debunked by knowing anything about IBM's product line or about 5 seconds of Googling.

Disclaimer: I'm an Ars subscriber and <3 Ars but when my bullshit detector is already off the charts before the article begins ... it makes me wonder how good the article is.


They probably misread the Wikipedia on OS/2, which says:

> In addition, IBM once made a deal with Commodore to license Amiga technology for OS/2 2.0 and above in exchange for the REXX scripting language.

In other words, IBM licensed REXX to Amiga in return for something else (we don't know what).

But who knows if this is true. Apparently IBM and Commodore already had an IP cross-licensing agreement at the time, and had access to each other's patents. And AREXX apparently did not contain any IBM code.


    In other words, IBM licensed REXX to Amiga in return for
    something else (we don't know what).
I think it was for technology used in the workplace shell (WPS).


Great read. In the end, it seems like a classic failure to resolve the innovator's dilemma. IBM decided that the future of computing would revolve around mainframes because they liked mainframes, not because that's where the facts led them. And ultimately they paid the price for it.


I think it's a bit more complicated than that.

Put yourself in the shoes of someone who'd grown up on mainframes. The systems you're used to are fabulously expensive, sure, but their operating systems have sophisticated architectures that allow them to easily juggle multiple users running multiple programs while maintaining the security of the system overall. They are the completest expression ever realized of forty years of progress in the field of information technology.

Then, one day, someone drops a PC in your lap. You are horrified. It's a single-user, single-tasking system with absolutely nothing stopping the user from trashing everything by running the wrong program.

To you, this new... thing, whatever it is, is barely worth the name "computer." It feels more like a toy -- like something you would give to a child to play with. Certainly nobody would ever run critical systems on it, you think.

And here's the thing. You're absolutely right! All your concerns are one hundred percent valid. But it turns out that nobody cares; the PC is much, much cheaper, and it lets everyone have their own dedicated hardware right on their desk running any software they care to install, instead of time-sharing a mainframe and begging for permission each time they want to try a new program. Or, put more bluntly, it lets them escape from having to deal with the corporate IT priesthood (i.e. you) anymore.

The market speaks! You are derided as a pointy-headed nerd and swept into the dustbin of history.

Now fast forward ten or fifteen years. People start taking all those PCs they bought and hooking them together into networks... and suddenly all those things you were worried about back in the day come roaring back to bite them. Users discover that their machine stops talking to the network when they click and hold the mouse button, because the OS can't walk and chew gum at the same time. And the complete lack of security makes their machines super easy to compromise.

The PC vendors panic. They scramble to rewrite their old systems into systems that can live comfortably on a network. And when they're done, they roll out systems that look an awful lot like what you were insisting the baseline for a "real computer" was fifteen years ago. The world cheers and lines up to buy back all the sophistication they had happily thrown away before.

In other words, it's not so much that the IBMers were wrong, it's that they were early. When OS/2 arrived, the world didn't understand yet why it needed something like OS/2. And by the time it did, OS/2 didn't exist anymore. But in this business, being early is effectively the same thing as being wrong. The market doesn't give out points for foresight.


This is the cycle of our industry.

A beautiful thing is created. It is very expensive. It is sold to very few customers at very high margins. The very high margins subsidise the further perfection of the beautiful thing.

An ugly thing is created. It is very cheap. It is sold to very many customers at very low margins. The very large revenues subsidise the further papering-over of the flaws of the ugly thing.

Eventually, the ugly thing utterly supplants the beautiful thing. A few wistful old high priests mutter about the beautiful thing. Meanwhile, billions of dollars and millions of hours are wasted working around the flaws of the ugly thing and reinventing, poorly, the features of the beautiful thing.


The market doesn't give out points for foresight.

Indeed; in fact I'd argue the market (and society in general) flat out punishes foresight, especially when it's right. What's that old Heinlein quote about Cassandra? And I think this is why so many nerds/engineers are "unsuccessful", at least by market definitions, or why they end up bitter: they'd rather be right than rich. Just look at the deriding RMS gets, and yet "Right to Read" has more or less come to pass.


The funny thing is, replace PC with tablet, replace mainframe with PC, replace Windows with Android and your story still holds true.

History doesn't repeat itself. But it sure does rhyme.


This is very very true. I saw the "multiple users" feature they introduced a few dot versions ago in Android and thought it was useful (perhaps?), but then they are implementing all of Linux's features on top of Linux.... makes me chuckle


There is a good reason for this though - to some extent Android treats every "app" as a separate unix user running in its own account space, as I understand it.


Ah yes, it does. So I suppose it is good that Android is implementing this on top of it, but it still seems like a massive effort to duplicate lower-level features due to lack of foresight during the initial Android system design.


To some extent I prefer the Android model. Cryptolocker has got me thinking about the security of my backups, and one uncomfortable realization is that it's quite difficult on any remote system to create a folder and say "I need to re-issue my password to use this folder" without making it owned by root.

(Although I'd be really happy to know if a "user-level sudo" is possible).


The same thing happened with digital cameras. I remember hearing people scoff at them and thinking "wait and see, folks."


The way we native (mainframe) folks keep talking about those (PC folks) trying to stuff applications inside an application created for reading hypertext documents.


> But in this business, being early is effectively the same thing as being wrong. The market doesn't give out points for foresight.

Unless you manage to survive and keep at it until the time turns out to be ripe. NeXT kinda-sorta did, 20 years down the road. It's the exception, though; graveyards are full of great OSes that didn't have the time to wait.


This reminds me of MS's PX00307 that I found and linked in https://news.ycombinator.com/item?id=3441885 with my comments pointing out the flaws in the thinking.


Regarding vulnerabilities, which many wrongly think are not present in mainframes: because mainframes aren't networked nearly as much and have massively fewer programs running on them, the vulnerabilities don't show up as much. But they are there.


Certainly, but it's also worth noting that the environments in which programs execute on most mainframes are much more restricted than those of even the typical servers of today, and have been by default for decades.


Somehow I'm not seeing the entire world's financial system as being either un-networked or an unappealing target.

If you think script kiddies on the other side of the planet are a difficult adversary, try inside jobs by corrupt fellow employees; now that's a worthy adversary.


It also strikes me as a story about artificial limitations. They were scared another program could do more for less. IBM is doing well today, I think; I wonder if they'd fare better adopting the Apple/Google "making people's lives better" myth/motto, even at the ~sacrifice of their traditional comfort zone / cash cow.


When OS/2 Warp came out, I remember it being insanely cheap ($20?), so in a what-the-hell mood I bought it. Took it home and tried to install it. It was a hopeless mess of disks, both optical and floppy, and I never got it to run.

One of my cow-orkers at Apple had worked on the OS/2 Presentation Manager at IBM. I tried talking with her about it, but she said the experience had been "absolutely awful" and she didn't want to say much else.

IBM never had a chance.


I have OS/2 Warp 3 and 4, both boxed, at home. They both worked on my computer at the time, ran my tiny BBS, and allowed me to work on the computer flawlessly. This was much better than my peers, who had to dedicate a computer to the BBS. But then, I had a loaner modem and a shared voice/BBS line, and they had a private line for the BBS too.

It worked nicely and the install wasn't bad. It was quite a few disks, but then Windows didn't fare much better in that regard; it was also a mess of many floppy disks to install.

I remember fondly the Team OS/2 meetings where we could geek out over our love for OS/2 and mourn IBM marketing's inability to push it.

And then I found Linux.


And not a single mention of Babylon 5.

The special effects were created on Amigas: http://www.midwinter.com/lurk/making/effects.html

Also, while looking at Video Toaster's entry on Wikipedia, I found this gem:

"An updated version called Video Toaster 4000 was later released, using the Amiga 4000's video slot. The 4000 was co-developed by actor Wil Wheaton, who worked on product testing and quality control.[6][7] He later used his public profile to serve as a technology evangelist for the product.[8] The Amiga Video Toaster 4000 source code was released in 2004 by NewTek & DiscreetFX."

http://en.wikipedia.org/wiki/Video_Toaster




Why aren't the unix-family OSs of the era part of the story? Why didn't IBM even consider porting a unix-family OS to the PCs instead of paying an unproven company like Microsoft to write an OS?

(...all the events from this stretch of computing history seem so weird to me, like from a steampunk-like alternate reality movie. There's surely lots of context missing and stories that nobody will ever tell, since most of the decisions taken by all the key players seem so anti-business. Computers may have changed a lot from back then, but business is still business and all the decisions made seem either "irrational" or based on "hidden information" that is not part of the story.)


I guess because, as the article describes, IBM was all for protecting its workstation business as well as the lucrative mainframe business.

A workstation back then was defined as an expensive high-end computer designed to be used by only one user at a time (i.e. not a multi-user mainframe), yet suitable for high-performance applications. They were intended mostly for corporations and academia that needed the extra juice and could afford to buy workstations (think scientific computing, CAD, and graphic design in the 1980s). IBM did already manufacture workstations with UNIX as the operating system at this time (see e.g. http://www.old-computers.com/museum/computer.asp?c=867&st=1 ). But they were way too expensive for the home user; the PC (personal computer) was the low-end product that you could afford on a normal salary.

I understand UNIX back then was a mainframe and workstation operating system. Licensing was expensive and the hardware requirements were beyond those of a PC. Few people had access to UNIX, mostly at universities and at big corporations. These were the very reasons why GNU and Linux were born - to provide a mostly-compatible UNIX clone for home users with an affordable IBM PC compatible.

So my theory is that IBM was protecting its mainframe business: it did not want to put the powerful UNIX on the PC because it wanted to sell more expensive, special hardware to those who wanted UNIX. So it hired a maverick company (Microsoft) to write a low-end, feature-poor operating system for the PC (DOS). It was (and continues to be) a business strategy to bundle better software with better hardware, so that you can charge the customers who want only the superior software a higher price (still essentially the business model of a certain Cupertino, California based manufacturer).


I enjoyed the article. It was a nice trip down memory lane. Regarding development tools, there were two commercial IDEs based on REXX: Watcom VX-REXX and VisPro REXX. I used Watcom's VX-REXX, and it was a joy to use and allowed for incredibly fast and powerful application development. I heard the same about VisPro REXX. IBM's early tools, C/2 and C Set++, were a bit painful to use. VisualAge C++ 3.0 was a decent toolset once you got over its weirdness. For a while, if you wanted to write C or C++ code using sockets, you had to purchase IBM's TCP/IP SDK for $100.

The SIQ was the "synchronous" input queue, and the problem has been understated in the article and comments; it was really bad. Because all GUI input flowed through that single queue, one application that stopped processing its messages could lock up input for the entire desktop. The base OS was incredibly stable, but the GUI shell, not so much, because of the SIQ problem.
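
A loosely analogous sketch (mine, Java Swing, not OS/2 PM): when every window shares one input/event queue, a single handler that blocks stalls input for everything served by that queue. Here, blocking Swing's single dispatch thread freezes both windows in the process, which is the same failure mode in miniature:

    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.SwingUtilities;

    public class BlockedQueueDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame a = new JFrame("Window A");
                JButton freeze = new JButton("Block the shared event queue for 10s");
                // Sleeping inside the event handler blocks the single dispatch thread,
                // so Window B (and every other window here) stops responding too.
                freeze.addActionListener(e -> {
                    try { Thread.sleep(10_000); } catch (InterruptedException ignored) {}
                });
                a.add(freeze);
                a.pack();
                a.setVisible(true);

                JFrame b = new JFrame("Window B");
                b.add(new JButton("Try clicking me while A is blocking"));
                b.pack();
                b.setLocation(50, 200);
                b.setVisible(true);
            });
        }
    }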

There were a number of Unix and Unix-like systems in addition to the ones already listed: Coherent, Interactive, and SCO are some that come to mind. They were pretty expensive IIRC, around $1000 to license.


I remember an internal training class at a large consulting firm in the mid 90s that was using OS/2. I thought, "This is an awful sign. Are we doing this just to get some business with IBM?" They were a big user of Lotus Notes 2. You never know...


Interesting read. This part really brought the current Windows 8 push by Microsoft to mind.

"These machines were meant to wrestle control of the PC industry away from the clone makers, but they were also meant to subtly push people back toward a world where PCs were the servants and mainframes were the masters. They were never allowed to be too fast or run a proper operating system that would take advantage of the 32-bit computing power available with the 386 chip. In trying to do two contradictory things at once, they failed at both."

Not quite the same situation, but they have many similarities.


Thank you for sharing, a great article indeed.

I'm glad it ended up the way it did. Microsoft at the time was betting on openness as a feature, and I think they helped move the computer and software industries in the direction they have since gone: toward greater openness (and thereby professionalism).

People associate Microsoft with closed source, but it is of course relative; in their day they were the vendor banking on openness and courting developers harder than the others.


The article says that the Mac OS was the only OS to ship that ran on PowerPC CPUs. This is not true - later versions of the Amiga OS ran on PowerPC.


Here's a perspective from the founder of a successful bootstrapped software startup that began by developing native OS/2 applications:

http://www.stardock.com/stardock/articles/article_sdos2.html

I own a copy of the OS/2 Galactic Civilizations 2.


One of my fondest memories of OS/2 (there weren't many, sorry) was finding a media file on one of the diskettes called IBMRALLY.MID which was a little piano rendition of "Ever Onward, IBM" from way back when in the Way Back When Days of IBM.


Hah, I was thinking of asking HN these days about the availability of exotic jobs, including OS/2 (or eComStation) programming jobs. I also wouldn't mind taking Motif jobs. Feel free to contact me if you have any ;)


Motif was recently released under the LGPL, as it happens.


As this article admits, it is just a rewrite of "Triumph of the Nerds".


Yeah, it's typical of Ars these days. Not particularly artful rehashing of existing info.


"Version 3.0 was going to up the graphical ante with an exciting new 3D beveled design (which had first appeared with OS/2 1.2)"

I think NeXT got there first on this.


I seem to recall UPS widely deploying OS/2 internally back in the 90s, with custom (internal) apps written for it.

Probably all Windows by now...


This was a great article, thanks for posting!


>Finally, and most importantly for the future of the company, Bill Gates hired the architect of the industrial-strength minicomputer operating system VMS and put him in charge of the OS/2 3.0 NT group. Dave Cutler’s first directive was to throw away all the old OS/2 code and start from scratch. The company wanted to build a high-performance, fault-tolerant, platform-independent, and fully networkable operating system. It would be known as Windows NT.

A couple of decades later, Dave Cutler is still around at Microsoft and worked on the hypervisor for the Xbox One at the ripe young age of 71, allowing games to run seamlessly beside apps.

From http://www.theverge.com/2013/11/8/5075216/xbox-one-tv-micros...

>Underneath it all lies the magic — a system layer called the hypervisor that manages resources and keeps both platforms running optimally even as users bounce back and forth between games, apps, and TV.

>To build the hypervisor, Multerer recruited the heaviest hitter he could find: David Cutler, a legendary 71-year-old Microsoft senior technical fellow who wrote the VMS mainframe operating system in 1975 and then came to Microsoft and served as the chief architect of Windows NT.

>It appears his work bridging the two sides of the One has gone swimmingly: jumping between massively complex games like Forza Motorsport 5, TV, and apps like Skype and Internet Explorer was seamless when I got to play with a system in Redmond. Switching in and out of Forza was particularly impressive: the game instantly resumed, with no loading times at all. "It all just works for people," says Henshaw as he walks me through the demo. "They don’t have to think about what operating system is there."


Yes, many like to bash Windows, but the NT family is actually VMS at heart, let down by a not-so-good userland experience.

I once attended a session he gave about the Windows kernel design; it was quite interesting.


VMS + 1 = WNT, in the same way as IBM - 1 = HAL.

(So far as I know, both of these are just amusing coincidences.)
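
For anyone who hasn't seen it spelled out: each letter is simply shifted by one place in the alphabet. A throwaway sketch (mine, purely for illustration):

    // Shifting each letter forward by one gives VMS -> WNT; back by one, IBM -> HAL.
    public class LetterShift {
        static String shift(String s, int k) {
            StringBuilder out = new StringBuilder();
            for (char c : s.toCharArray()) {
                out.append((char) ('A' + ((c - 'A' + k + 26) % 26)));
            }
            return out.toString();
        }

        public static void main(String[] args) {
            System.out.println(shift("VMS", 1));   // WNT
            System.out.println(shift("IBM", -1));  // HAL
        }
    }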


I once went through the VMS documentation and could recognize at least some of the design ideas I knew from Windows NT.

I think it is good that the industry enjoys different types of OS architectures and designs.

Just because UNIX managed to spread as it did doesn't mean it is the be-all of OS design. After all, its creators tried to fix UNIX; the industry just did not adopt it.


In the 2001 audiobook, Arthur C. Clarke has a 15-minute intro in which (among other things) he explicitly says it is a coincidence and cites the page in the book explaining what HAL stands for.


Thanks for the tale; heartwarming to hear that he is still at Microsoft. In my view, some of the best work in our industry has come from lifers (or at least long-timers).


I would love to see a Kickstarter to buy OpenVMS from HP, since they are retiring it. It was a quirky but solid OS with some great clustering and security features.


One thing VMS was not was quirky: DEC went to considerable effort to make everything consistent. If you didn't know a command you could guess it, and it would take the same switches as any other command. Contrast that with Unix, whose commands really are quirky...


Given my background was UNIX, it certainly felt quirky, but it was very consistent. The "why the heck are there 4 of the same file" moment (VMS keeps numbered versions of each file) is a little odd. It was a good OS.


A friend of mine (who is a huge VMS fan) wrote this: http://www.freevms.net/


The sad thing is that I have never seen an article on the entire MS OS/2 2.0 fiasco that I would call complete and detailed; many omit, for example, the unethical attacks MS made against OS/2, such as the "Microsoft Munchkins". I tried with my own blog article, but I admit it is not very good either.


Wikipedia has some interesting things to say about it.

http://en.wikipedia.org/wiki/Team_OS/2


> So the new System/360 mainframe line would run the also brand-new OS/360.

Bad example. Really bad example: Not even IBM could standardize on a single OS for the System/360.

The System/360 went through a few OS iterations before OS/360 came along: OS/360 was late, as recounted in The Mythical Man-Month, so DOS/360 came along, then BOS/360, then TOS/360, and even PCP, which didn't support multiprogramming. Other OSes were CP-67 (which became VM), MFT, MVT, and still more on top of that.

To this day, there are multiple OSes for the architecture descended from the System/360, including Linux.


Don't outsource for the sake of a few months; if things take time, then they take time.



