My father was a lead in hardware engineering and manufacturing at NeXT. I have tons of footage of the amazing assembly technology that they used. I wonder if people would be interested in it.
It was heavily top secret at the time, but NeXT built their own custom robotics system (called thor) and had all sorts of amazing in-house manufacturing tech that was incredibly automated. The hardware itself was just beautiful, too.
I'd like to suggest having your dad and an Ars Technica journalist sit down for an interview. Get archive.org and Ars together to archive the footage and release a nice video edit in the article.
> NeXT built their own custom robotics system (called thor)
Thor! I remember reading a little bit about it, and finding it incredibly exciting, but finding anything from NeXT's projects that never quite went public has been really hard.
Just in case you're counting replies, yes! Bring your dad too, it'd be an honor for a ton of people who haven't met anyone from that age, let alone had a parent with stories like that :)
There's an old video on Youtube called 'The Machine to build the Machines' which goes in-depth on their automated assembly line for mainboards. Could be a nice intro.
One of my prized possessions is my NeXT Cube[1], finally acquired in 2014. I'd wanted one ever since I saw the "Actual Size" marketing brochure that came out in '89 or '90.
I owned a series of slabs (mono, color, turbo color) throughout the years, but never managed to get my hands on a Cube. A friend took pity on me in '14 and sent me one, and then I got the bits (special cable) needed to hook up a VGA flat panel through the non-ADB soundbox.
NeXT steps: finding enough 4M 30-pin parity SIMMs to max out the memory, then replacing the internal 18G SCSI disk (an SCA drive with an adapter) with a SCSI2SD board and MicroSD card in order to reduce the number of moving parts.
I chuckle when I realize that I spend most of my day in front of a MBP at work and a Mac Mini at home - both running what is basically the descendant of NeXTstep.
One time when I worked at NeXT I accidentally kicked my cube with my tennis shoe - it was under my desk. Sparks flew. I told the fellow sitting in the next office what had happened - and he wanted to see it too. We both laughed as we watched sparks flying. Before long there was a crowd around my desk - maybe six people - incredulous that it was so easy to create fireworks. Well, what we didn't know was that Steve was sitting about 3 feet away - behind an office divider. His head peeked around the divider - and he silently watched the sparks flying. We all stopped laughing and returned to our work.
Next morning everyone in the company had a paper memo on their desks (the only one I ever remember - we all used email.) It said that cubes were to be properly displayed on top of desks only. No mention of the real reason for the new policy.
I recall the journalist wanting a pic for the cover of a "NeXT is dead" edition, and even in an industrial oven he had a hard time igniting the magnesium. I think he never did; it was probably a magnesium alloy.
Not by a long shot! Put in there by the friend who threw together a working system for me from the ones he had lying around.
A lot of those original sub-gigabyte drives in the slabs eventually went out with bad bearings (oh my the screeching sounds...)
We had a few Turbo Color slabs at an ISP that I worked for in the late 90s that had been acquired from a surplus place; they had "PROPERTY OF THE NATIONAL SECURITY AGENCY" stickers on them and had been stripped of RAM and HDs of course.
I still don't understand why they strip memory from machines that leave a classified setting. Once powered off, the contents of DRAM are only recoverable for a short time.
This reminds me of 4MB 30-pin SIMMs based on 350 mil 4Mbit chips, which were hard to fit into the NeXT Cube. The early 16Mbit chips were 400 mil, which was even worse, and probably part of why 16MB 30-pin SIMMs were never very popular.
If you've read the Steve Jobs biography (or biopic), you might recall Guy Kawasaki's accidentally-prescient spoof Apple press release[1] from 1994 about a (at the time) hypothetical Apple-NeXT acquisition. Choice quote:
> As a cofounder of Apple and the father of Macintosh, Jobs brings back to Apple the type of visionary leadership that enabled Apple to create three of the four personal computer standards (Apple II, Macintosh, and Windows).
"Red Box, the planned PC Environment for Rhapsody on Intel hardware, never saw the light of day, although today’s Intel-based Macs have virtualization solutions from other vendors which allow them to run Windows or Linux alongside Mac OS X."
I can imagine a distorted future reality where the natural progression of WebObjects was based on early successes with NeXT's OOP stuff with AT&T and Chrysler.
While iTunes, the App Store and the whole "Digital Life" stem from a Web-based Minivan Configurator, looking back at WebObjects as a product to enable that for others seems so contrary to what Apple is today.
That Old World of Software Product SKUs has been turned on its head to great effect, and I think it really is better this way.
OSX is one of Steve Jobs' least-discussed successes.
The original Mac, Pixar, the iPod, the iPhone are all talked about -- but without OSX as a Unix-based OS, application-development on Macs, and adoption of Macs in the '00s would have been much slower and more sparse.
Technically it was a Mach based OS, with a POSIX compatibility layer from BSD.
The distinction is important IMO, because I believe part of the reason they could support multiple platforms so easily is that it was all built around a microkernel.
Now Apple has more of a hybrid with XNU, but many of the core concepts of Mach are still there.
NeXTWorld Magazine in the early 90s was my favorite publication. Every issue showcased apps with elegant and surprising capabilities, like the first spreadsheet with pivotable rows and columns. John Carmack wrote Doom on NeXT, and claimed that NeXTStep made development ten times faster. Other developers agreed. People were excited about programming on NeXT, and not because of any reality distortion field. Reading NeXTWorld felt like seeing into the future! (Here's an archive, rather incomplete: https://simson.net/ref/NeXT/nextworld/ The pdf scans convey more of the joie de vivre surrounding NeXT than the plain text files.)
Interesting parallel that both Jobs and Jean-Louis Gassée started computer hardware companies after leaving Apple, and that each retreated to operating systems, which they then tried to sell to Apple.
I don't know if I'd say Lisp machines really died; like NeXT, their rival architectures+platforms of the era only outcompeted them by absorbing them and becoming them. Modern processor architectures look a lot more like that of the Lisp machine than they look like its competitors of the era.
Instruction pipelining and memory caching to allow for "cheap" dereferences; a "flat" virtual memory; loadable modules that are source on disk and get JITed into native objects when loaded into memory... these are just generic features you could expect out of most architectures+platforms today. But those were the differentiating factors for the Lisp machine.
I remember people wondering how the hell we were going to ever use the 50 million transistors per die that Moore's law assured us was coming straight at us in just a few years...
Yep... Lisp machines were thoroughly in the "mini"computer (cabinet-sized) category and completely missed out on the microcomputer revolution. IBM PCs were cheaper and got the job done. To an extent.
> Lisp machines were thoroughly in the "mini"computer (cabinet-sized) category and completely missed out on the microcomputer revolution.
Symbolics and TI both developed microprocessors for Lisp in the 80s. TI promoted their Lisp chip as the 'Megachip' enabling the 'Compact Lisp Machine'. It was the first microprocessor which integrated around a million transistor functions.
Both put the Lisp Machine on small Apple NuBus cards inside a Mac II. The TI MicroExplorer and the Symbolics MacIvory were popular Lisp Machines.
This is a MacIvory board which contains a Lisp processor, a Weitek numerics chip and memory:
it occurs to me that the depth of the AI winter was proportional to the excessive investment. by the end I think everyone was sick of the AI group at every lab and university that had an endless stream of DoD funding and never seemed to go anywhere.
and now we know that at least on the NN side, what seemed like another silly toy was just a little more work and a little more hardware away from being demonstrably useful.
so anyways, I'm positing a maximum after which additional funding does active harm.
I only played with one on a VMEBus on a Sun, so I wasn't aware of their stand-alone form factor (i.e. I was guessing)
> The AI winter didn't help Lisp's popularity.
Really? I wouldn't have thought a symbolic machine company would have been noticeably affected by the Minsky-triggered AI winter. Or are you talking about the post Japanese Fifth Generation project slow down? The MCC project in Austin Texas (and Doug Lenat's Cyc) took some hits from that one...
(I wasn't alive then, so I'm only informed by what I've read.)
A few short years after Minsky's comment, microcomputers were in full bloom and expensive Lisp machines were no longer viable. Cheap PCs could run Lisp code faster than the dedicated hardware!
But Lisp never really gained a strong foothold on microcomputers. Existing code continued to run... but development slowed down as Lisp companies went belly-up, unable to sell their hardware. The standardization of Common Lisp in 1984 did help somewhat.
The Japanese Fifth Generation project was oriented around logic-programming, was it not? I had the impression they focused on technologies like Prolog rather than Lisp.
> I had the impression they focused on technologies like Prolog rather than Lisp.
It was a "full stack" thing. From silicon on up. And it definitely got fuzzier the higher up in the stack you went. Prolog-ish was big in the upper tiers of the stack (esp. as it had to do with "modularized knowledge" paks), but "support-all-symbolic-approaches" were motivating requirements on the lower tiers.
I lived in Japan, doing systems engineering for what are now called SOCs, before I moved to Austin during this period, and the PR around the 5th Generation Project had both the Japanese officials feeling high-on-life and the American officials a little deer-in-the-headlights panic-ey. IBM and Fujitsu (and Toshiba, and Motorola and AT&T) all made some bank off of frightened (or overly ambitious) politicos.
Windows 95 supported preemptive multitasking for 32-bit applications. Mac OS kept using cooperative multitasking for an embarrassingly long time, through all versions of 7, 8, and 9. Macs didn't get preemptive multitasking until OS X came along, although OS 9 had a half-hearted API that some apps could opt into. By that time, a lot of Windows users were already on Windows 2000 or moving up to XP, both of which made classic Mac OS look downright archaic. Make no mistake: Windows 95 was a lot more advanced than classic Mac OS in a lot of important characteristics.
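The difference can be illustrated with a minimal Python sketch (generators standing in for apps; this is an analogy, not Mac OS code). Under cooperative multitasking a task runs until it voluntarily yields, so one task that refuses to yield mid-work starves everyone else, much as a busy classic Mac OS app that stopped calling WaitNextEvent froze the whole machine:

```python
def polite(name, log):
    # Cooperates: yields control back to the scheduler after each unit of work.
    for _ in range(3):
        log.append(name)
        yield

def greedy(name, log):
    # Misbehaves: does all its work before yielding even once.
    for _ in range(3):
        log.append(name)
    yield

def cooperative_run(tasks):
    # A toy round-robin cooperative scheduler: each pass, every live task
    # runs until it yields. Nothing can interrupt a task that won't yield.
    log = []
    gens = [make(n, log) for n, make in tasks]
    while gens:
        for g in list(gens):
            try:
                next(g)
            except StopIteration:
                gens.remove(g)
    return log

# Two polite tasks interleave nicely...
print(cooperative_run([("a", polite), ("b", polite)]))
# → ['a', 'b', 'a', 'b', 'a', 'b']

# ...but a greedy task monopolizes the "CPU" until it is done.
print(cooperative_run([("hog", greedy), ("app", polite)]))
# → ['hog', 'hog', 'hog', 'app', 'app', 'app']
```

Under preemptive multitasking the kernel interrupts the hog on a timer tick, so other tasks keep making progress regardless of how it behaves.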
Don't forget the per-process heap size bar that you had to adjust. If you were used to any 386 version of Windows, it was like going back to a very well-manicured version of the dark ages.
I'll freely admit that it was, in the main, not worse than MS-DOS, most of the time.
> I don't think that there was ever a point where Systems 7, 8 or 9 were worse than Windows 95.
Systems 7, 8 and 9 had better font handling, prettier icons, a more comprehensive GUI experience (eg. drag and drop worked everywhere), and a richer desktop publishing ecosystem.
But with respect to multitasking, 32-bit application support, protected mode, and networking, including support for, you know, the Internet -- Windows 95 was way, way, way better than Mac OS.
Apple had an 11-year head start with the Mac. With Windows 95, Microsoft had finally caught up, and then some.
The only people who thought Win 95 = Mac 84 were Apple fanboys.
Agreed, watching a Photoshop filter lock up an entire machine was painful in those days. Not to mention all of the other random beach balls one would get.
I actually came over to the Mac (for my home machine) for a little while after the writing was on the wall for the Amiga, and it felt like a painful step backwards. At work we had been running NeXT boxes for a while, which made its deficiencies all the more apparent.
I was not a big fan of Win95, but it would be a far stretch to argue that System 7/8/9 were anywhere near it in terms of actual use without regular crashing, which plagued MacOS at the time. Everyone was familiar with the bomb icon back then.
We always called the black and white spinner the beach ball and the colored one from OSX the pinwheel. May be a terminology thing, but I was talking about the old black and white spinner in MacOS. If one did any kind of video, image or 3d editing, it was not far behind and pretty much locked the computer up until whatever processing was done.
While Apple was faffing about with Taligent and Copland, Microsoft shipped something that provided preemptive multitasking and memory isolation for 32-bit apps.
Windows 95 was a collection of hacks that never should've worked but it put Apple behind the 8 ball to bring its OS tech into the present.
They also shipped the Lisa prior to that, which also had protected memory and pre-emptive multitasking. This died for other reasons however (too expensive, upstaged by the mac, etc).
Microware's OS-9 Level 1 had preemptive multitasking IIRC - this was late-1970s, early-80s - it was distributed by Radio Shack for the TRS-80 Color Computer line, which used the 8-bit 6809 processor at something like 0.89 MHz (pretty f'in amazing when you think about it).
System 9 still lacked preemptive multitasking, due to a pragmatic decision from the very beginning of MacOS. (Perhaps even predating the Mac?) That alone would be enough to say that it was showing its age. As someone who was using System 9 at the time, I would say that this is indeed what knowledgeable users were thinking.
It's not entirely true. Copland microkernel was in (hidden) in System 8.5 onward, and provided preemptive multitasking, file mapping, timers etc. It's just that the only real 'task' on it was the old cooperative OS.
They had done a LOT of work on that, and on the driver model, by the time System 9 arrived. What really failed in Copland was the 'userland' equivalent. At the time, Apple had gone down a completely bizarre path of designing complex APIs for everything, and most of them had no use whatsoever. There were heaps of crap, like OpenDoc and many others, and Copland was 'trying' (and failing miserably) to integrate all of that.
How do I know? I have a collector t-shirt with 'Copland Driver Kitchen' on it ;-)
> It's not entirely true. Copland microkernel was in (hidden) in System 8.5 onward, and provided preemptive multitasking, file mapping, timers etc. It's just that the only real 'task' on it was the old cooperative OS.
So not entirely true -- it's "only" true from the standpoint of typical User Experience.
When it came out, Windows 95 blew everything out of the water, including AmigaOS. Which was itself vastly superior to anything else at the time, including the various Mac OS.
It took years for Apple to even support preemptive multitasking, they were easily 5 if not 10 years behind (and still kind of are today).
>When it came out, Windows 95 blew everything out of the water
We had two NeXT color slabs at the time when Win95 came out. We used them to help MS launch Win95 in Hong Kong. NeXT by then was gone as a company and the slabs were getting long in the tooth. We switched to SGI -- the SGI Indy wasn't graphically as slick as NeXT Step, but it blew away ANYTHING out there. IRIX was a very nice BSD Unix. We built one of the first ISPs in Hong Kong on IRIX. Solaris had just come out, which was a buggy mess, and to me wasn't nearly as nice to work with as SunOS on Sparc 10s. We were running IRIX, SunOS, Solaris and not long afterwards, early versions of Slackware (which ran our Usenet servers). But even today, my fondest memories are using NeXT and SGI -- very cool, very expensive.
> When it came out, Windows 95 blew everything out of the water
Surely not in a technical sense. OS/2 Warp and NeXT already existed, and BeOS would be released a few months later, able to play multiple videos in real time (playing an MP3 on Windows would preclude doing anything else with the machine for a few more years, unless you liked skipping).
Windows 95 was the best mass-market OS at that time. Yes, OS/2 Warp and NeXT existed, but they were not popular. The Mac and Amiga OSs were inferior at that point.