I was working at Pixar at the time and a couple of us went out to lunch when the news broke to figure out What It All Meant. We passed Steve's office on the way out and one of our party, who had worked with him since early Apple days, congratulated him. The big question was whether they would keep the Unix plumbing visible and useful, or would it be buried under some API layer of no use to us Unix graybeards. Fortunately, the plumbing wasn't hidden in the walls.
Didn't Steve have an email that was in your mailbox by default when you got a new account on a NeXT machine? It had some embedded images and an audio file of him talking.
Anyway, I sent whoever it was, pretty sure Steve, an email with read receipt enabled and he bitched me out for invading his privacy. This was ~93 or so.
Honestly, I find read-receipt requests to be in very poor taste, if not a sign of outright hostility. And no, having them on by default is not an acceptable excuse, any more than leaving caps lock on would be.
The funny thing is that I never even thought about it; it was my first real email account. But ever since then I've been a vigilant anti-read-receipt person, and to this day I disable clients from replying to them whenever possible.
Those things annoy me too, but thankfully it's been years since anyone sent me one (unless my mail clients have been silently reporting back to people other than the NSA, of course). I assume the read receipt request button has been made harder to find in newer versions of Outlook or something.
I reckon the correct way to handle these receipt requests is to configure the mail client to send a modified receipt email which also has a read receipt request, and keep replying in kind until your correspondent gets the message (or the mail servers break down due to two bots telling each other that they read the notification that they just read each other's mail telling them they read their mail, of course).
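Purely to illustrate the loop, here's a toy sketch in Python; the client names and auto-reply behaviour are invented, since no sane mail client actually does this:

~~~
# Toy simulation of two mail clients that each answer a read-receipt
# request with a receipt that itself requests a read receipt.

def simulate(max_rounds=6):
    # Each message is (sender, recipient, subject, wants_receipt).
    inbox = [("alice", "bob", "original mail", True)]
    rounds = 0
    while inbox and rounds < max_rounds:
        sender, recipient, subject, wants_receipt = inbox.pop(0)
        if wants_receipt:
            # The recipient's client auto-replies with a receipt that
            # *also* requests a receipt, keeping the loop alive.
            receipt = (recipient, sender, "Read: " + subject, True)
            inbox.append(receipt)
            print(recipient, "->", sender + ":", receipt[2])
        rounds += 1
    print("(the mail servers presumably give up around here)")

simulate()
~~~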
I don't use Outlook; what I mean is that it seems that other email users aren't sending so many read receipt requests for some reason (such as changes to Outlook's user interface). Either that, or I correspond with a better class of email user these days.
Apple must have had lots of emails over the years with people wanting to return a 'Next' shirt or asking if they are going to restock such-and-such jeans again.
There must be someone at Next clothing (next.co.uk) dreaming that Apple will sell that domain name to them one day.
I sent him an email from a NeXTSTEP-configured PC around 1996/97/98, stating I had some window GUI ideas that would give him a distinct advantage over Windows.
As I recall, his reply was terse, along the lines of:
~~~
What are they?
sj
~~~
I was doubtful it was actually the man himself responding. I thought it was some email administrator trying to steal my ideas (I was 20yrs younger then). I proceeded cautiously and don't recall if he replied again.
A decade later I learned terse replies were his MO.
To my young mind, it was more about someone else taking credit for my wonderful ideas. My naive thinking was that Steve Jobs would be dazzled for sure. He'd probably want to hire me or pay me a million dollars, but some other person checking his email or administering the email server... how could I trust them? I was young and foolish and had spent too much time acquiring and configuring a NeXTSTEP-compatible SCSI drive and SVGA card.
Like most every idea I came up with at the time, it had the word “smart” in it.
Smart Windows.
What are “Smart Windows”? So glad you asked. At the time both NeXTSTEP and MacOS only had resizing handles on the bottom of windows (if I remember correctly). What I proposed was a full-perimeter border on windows, like what was seen in the Motif Window Manager, CDE, et al.
But wait there’s more...
If you double-clicked the border, the window would jump-move in that direction until it hit either another window border or the edge of the screen. If you held the shift key while double-clicking the window would instead expand in that direction. All together it was a precise way to quickly move and arrange a full screen panel of windows without ever having to drag a window or window border.
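Since SJ never got the details, here's a rough sketch of the jump-move half in Python. This is my own reconstruction from the description above, not code from the original VB project; the rectangle format and the rightward-only simplification are mine:

~~~
# Hypothetical sketch of the "Smart Windows" jump-move: slide a window
# in one direction until it hits another window's border or the screen
# edge. Windows are (left, top, right, bottom) rectangles; only the
# rightward jump is shown, the other directions are symmetric.

SCREEN_WIDTH = 1920

def jump_right(win, others):
    left, top, right, bottom = win
    # Obstacles: the left edges of windows that overlap us vertically
    # and lie at or beyond our right edge; otherwise the screen edge.
    obstacles = [o[0] for o in others
                 if o[0] >= right and not (o[3] <= top or o[1] >= bottom)]
    stop = min(obstacles, default=SCREEN_WIDTH)
    dx = stop - right
    return (left + dx, top, right + dx, bottom)

# Window A jumps right until it meets window B's left edge.
a = (0, 0, 400, 300)
b = (900, 100, 1200, 500)
print(jump_right(a, [b]))  # -> (500, 0, 900, 300)
~~~

The shift-double-click expand variant would grow the right edge out to the stopping point instead of translating the whole rectangle.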
I eventually tested out the idea on Windows NT 4.0 inside a Visual Basic project. It was super dope. SJ missed out.
It actually reads like a solid fiction book, except that it isn't. You can skip a few dozen "romance" pages here and there and then it's really worth your while.
I found Becoming Steve Jobs by Schlender & Tetzeli to be a much more informative read than Isaacson's official biography, particularly on the intersection of Jobs/Next/Pixar/Apple.
Just wanted to second this, totally agree. Becoming Steve Jobs' authors interviewed Tim Cook, Jony Ive, Eddy Cue, Pixar’s John Lasseter, Disney CEO Bob Iger, and Jobs’ widow, Laurene Powell Jobs.
Definitely not true. As a concrete example disproving your fallacious post, I refer to Mr. John Gruber who called Becoming Steve Jobs "the book about Steve Jobs that the world deserves".
Jony Ive says of Isaacson's book, "My regard couldn’t be any lower."
Yeah, sure. I worked with Jobs (not for him) when he was at NeXT and I was at a closely affiliated company. I asked Jobs once in a meeting if he would ever consider moving the NeXT software onto non-NeXT hardware to give it more reach. He exploded, as was his way, and answered my question by pointing out that he drove a very expensive Porsche while my car was just a cheap econobox. That was his entire counter argument to me. Many of us had read his first biography "The Journey is the Reward" and knew that the guy thumping his chest about how his expensive car demonstrated his superiority had fathered and abandoned a daughter who had been living nearby on welfare checks that were paid for by us little people who paid taxes and drove economy cars.
Talk about "my regard couldn't be any lower".... I think Isaacson's book pretty much described the guy I remember.
Thank you for this. It is definitely good, in my opinion, not to forget all his flaws. He was an asshole. But he was also a genius. Most great people are like this, imperfect in some way, and I think it is naive not to recognize that.
No, it's 100% true. It's the book Steve Jobs had written as his official biography, so it's de facto the official biography. It doesn't matter if there is a better book, and your aggressive tone is not helping your argument.
When I joined Apple Software Engineering in 2001 as a writer, the first thing I noticed was that former NeXT people ran the company, made the important decisions, and believed fervently that the new Apple would be wildly successful. Oh, how right they were.
Yeah, Pixar totally "bought" Disney Animation and, to a lesser extent, the theme parks. I'm told John Lasseter has been deeply involved in them. That would be going home for him, he worked at Disneyland as a youth.
That's no longer the case. ESPN has seen its profitability roughly cut in half in the last four years. Simultaneously, they're rapidly bleeding subscribers, and they've overpaid for sports rights at exactly the wrong time.
ESPN is at about $9.5b to $10b in sales for 2016 and $1b in operating income. Those numbers will decline until operating income is closer to $300m-$500m in the next few years.
By comparison, Disney did $9.3b in net income for 2016 ($14.2b operating income). ESPN is down to closer to 10-15% of Disney's profits and about 18% of sales. As a business it's a looming disaster of value destruction compared to its former peak.
To be fair, you could consider the animation/film unit to be the marketing wing of Parks & Resorts for the purposes of this discussion. Why would anyone go to the parks without Mickey, Leia, and Iron Man?
To an extent... but their swansong vision of portable OPENSTEP was quickly shut down by Apple, which is a crying shame, because if OPENSTEP were still around it would be keeping Microsoft competitive on the desktop front - instead of today's languishing desktop developer story, which remains frozen in 1998 (for Win32) or 2005 (for .NET).
Now that Swift is open-sourced, I wonder if we'll see a good Cocoa port to Win32 and Linux - because that would be NEXTSTEP's legacy.
I sometimes wonder if we need anything really new as a desktop paradigm. I know you say that systems are stuck in the early 00s, and I know that the underlying APIs are going through slow changes (they are mature after all).
But the ability to take a Mac from nearly 20 years ago and take one from yesterday and be able to use them in nearly identical ways (mobileme instead of icloud eh) is quite impressive.
I have seen the attempts to invent a new desktop paradigm in Linuxland with Unity, and Microsoft's Metro attempt, but I firmly believe that the windowing systems we have at the moment are really as good as it is going to get, and more importantly, as good as it needs to be.
The eye candy is all very nice (animations, fading in/out), and new compositors are great too, but from a usability point of view I don't think we really need to add anything new.
Am I alone in this? I mean Amiga got it right years ago and we've been trying to catch up since.... :-)
I don't think Linux will ever take the desktop, per se. The only opportunity is to leapfrog over the whole paradigm.
Android is obviously that. And the upcoming transition to conversational UI (not just voice but stuff like news feeds, Snapchat, etc) and also VR will I think be the end of the desktop paradigm.
I realize this is often overstated, blah blah blah. But both mobile and VR are still kind of operating in a pseudo-desktop UI paradigm where they can't really reap their advantages. At some point in the next 10 or 20 years I think both mobile and VR will have their WIMP-equivalent* UI breakthrough and we will have a real horse race, and Linux will be in a different position. Speculation of course.
* "Windows, icons, menus, pointer" was the big breakthrough discovery of the meat and potatoes of desktop UI
Think about how today's businesses generally use their employees: let's not beat about the bush: they're using Microsoft Office for producing documents and spreadsheets, doing research with Chrome, and (if applicable) creating business value using either line-of-business software ("Bob's Car Repair shop database in MS Access") or specialised content-creation software (Photoshop, AutoCAD).
None of those use-cases can be fulfilled by recent interaction paradigms (i.e. smartphones, tablets, or voice-control). VR's useful for visualisation, but that's a very limited use-case, and VR has been used for architectural and civil-engineering visualization for decades anyway (e.g. https://www.youtube.com/watch?v=T2CYLlSn1gA ).
Granted, when considering an iPad paired with a keyboard - we see some of those use-cases become almost workable again (namely word-processing), but it's essentially the same thing as a traditional laptop - just with considerably more painful precision object manipulation due to the lack of a mouse cursor.
I feel the biggest paradigm shift over the past 10 years is the movement of database business applications away from Access, VB6, Delphi, etc. to web applications, usually hosted locally, and more recently to multi-tenant, subscription-based services ("the cloud"). This has now become acceptable to businesses: the trade-off of no longer being in exclusive control of their data is increased uptime, reliability, and significantly reduced start-up and (usually) running costs. Being browser-based has also meant a shift away from WIMP, where an activity is contained within a small-ish, often fixed-size "window", to the "page", which has its own interaction model (e.g. no more modal windows, because you can always open a new browser tab).
The next iteration of human/computer interaction we'll see that impacts our (society's) actual working lives will be something that enables me to be more expressive without resigning any degree of control over what I produce - and right now nothing does that better than a large display which lets me interact with multiple applications simultaneously, a keyboard with which I can enter commands or prose almost as quickly as I can think it (with a very low error ratio too), and a mouse for quick, precise, (gorilla-arm-syndrome-free) content/command selection.
Until something comes along that enables me to be more productive for my employer the desktop interaction model will still reign.
> right now nothing does that better than a large display which lets me interact with multiple applications simultaneously
Well, the command line is already more powerful, if less accessible. Piping streams together, writing code: these are very powerful things that would give you superworkers if your staff could do them. People stopped developing command-line UIs because there was a gold rush on WIMP, so command-line usability stopped getting funded. It's also just proven a hard nut to crack: people in academia have been trying and failing to build a more usable command line for a long time. Seems like just a matter of time to me though. It's only been 20 years.
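To make the piping point concrete, here's a toy sketch in Python of that composition style; the log lines and stage names are invented:

~~~
from collections import Counter

# Each stage consumes a stream and yields a new one, so stages compose
# the way `grep pattern | sort | uniq -c` does in a shell.

lines = [
    "GET /index.html 200",
    "GET /missing 404",
    "GET /index.html 200",
]

def matching(stream, needle):   # roughly: grep needle
    return (l for l in stream if needle in l)

def counted(stream):            # roughly: sort | uniq -c
    return Counter(stream).items()

for line, n in counted(matching(lines, "200")):
    print(n, line)              # -> 2 GET /index.html 200
~~~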
Mobile is interesting, not because it is particularly better at anything, but just the opposite: there's no space on the screen, so it forces designers to reconsider more stream-based UIs. I.e., return to the command line paradigm and pick up where we left off in the 80s. WIMP and big screens give designers a bunch of idioms to rely on, and lots of space to fill up and they get lazy.
VR doesn't inherently do a whole lot more than a rectangular screen in terms of productivity. I think most VR UIs will be things you could make a rough approximation of on a desktop screen, minus immersion. But it's a playground in which to explore interaction ideas that will become the fundamentals of AR....
AR is where there is a new idea, which is that you can leave things lying around in space. A desktop is a window, and a relatively small one at that. You have to keep tidying it up. You waste a lot of time context shifting. Most workplaces are much larger than the few square feet a desktop provides. The killer feature of AR is basically just "spread out".
So, combine those three things... stream-based UI, with programming features (iteration, DSLs, pluggable objects), and then spread that out into space and I think you get something that will make desktops seem quaint.
Lots of human-computer interaction PhD dissertations between here and there though.
I happily used an openstep-inspired (or directly derived?) desktop environment named WindowMaker in Linux somewhere around 1999. I also remember LiteStep for Windows being available around that time. So there were attempts at using that approach on other platforms.
(Funnily, my current Xfce setup still visually resembles my OpenStep-like setup of that time.)
Interestingly, I used it on Apple's MkLinux on a Powermac. When I returned to that office, all the retired Powermacs were running it on the operations staff desktops. Felt good.
> Now that Swift is open-sourced, I wonder if we'll see a good Cocoa port to Win32 and Linux - because that would be NEXTSTEP's legacy.
Never say never, but none of the UI parts of Cocoa are written in Swift, and porting them to Windows/Linux would be way more than just writing them in Swift anyway.
Curious, in what way is Windows's development story stuck in 2005 and the Mac is not similarly "stuck" in 2001 (or perhaps 1995)?
Apple pretty much said that Cocoa would be the basis for all future work on Mac OS X back in 2001, and they've continually improved it: adding new widgets, adding functionality to existing widgets, high-DPI support, and so on. There's a lot of DRY (Don't Repeat Yourself) going on: there is only one GUI platform and UI toolkit for macOS, Cocoa. While the groundwork isn't new, it is kept up-to-date.
Compare that to Windows, where the "original" widget toolkit (window classes and the Common Controls library) hasn't been significantly updated since Windows XP - Windows 7 did add some new controls, but they aren't exactly easy to use unless you're already a very experienced C++ Win32 GUI developer.
The old low-entry-barrier dev story was VB6, which had a mix of Common Controls and ActiveX reimplementations; then it moved over to VB.NET and WinForms, which is a wrapper around the same - but the last significant update to WinForms was in .NET 2.0 in 2005. Microsoft recently announced that they would not be adding true high-DPI support to WinForms, advising desktop app developers to use WPF or UWP instead - and therein lies a problem: WPF and UWP are completely separate UI and application stacks. WPF is not perfect and, like WinForms, hasn't seen much of an update for a while (it still runs on DirectX 9), while UWP is just confusing: it's arbitrarily limited to a restricted sandbox environment, the standard widget library is very anemic (e.g. still no tree-view!), and it shares nothing with the rest of the Windows ecosystem.
If you want to write a "real" Win32 desktop application you're best off sticking with MFC, but there's so much legwork necessary, and so much legacy cruft (I swear the MFC codebase is still C++98-compatible), that it just doesn't make sense for small applications, especially line-of-business systems that face constant change, to be built with it. Compare that with Cocoa on macOS, where everything, from Microsoft Office and Adobe Photoshop to the built-in Jigsaw Puzzle mini-game, is built with the same framework.
The port was already done in the 1990s. You could run Enterprise WebObjects on Windows NT, and it came with several NeXTStep apps on the CD including Mail.app.
GNUStep[0] is a thing that exists, and it's mostly Cocoa-compatible (at least for old versions of Cocoa)? I don't think it is very widely used -- it never got much desktop traction compared to KDE or Gnome. But you can write apps that compile and run on both GNUStep and macOS -- the "macOS" version of Emacs actually targets GNUStep, for example.
In the early 90s, I remember reading in NeXTWorld Magazine many articles about the speed of development under NeXTStep. Ten times faster, according to John Carmack, who was creating Doom. Every developer agreed. Each issue showcased apps with surprising and elegant capabilities, really new features available nowhere else. The proposition was that NeXT Cubes were really worth the price, that they enabled magical solutions, and that there was still time left to join the other pioneers. Alas, I could only afford the magazine subscription. (In 2016 dollars, NeXT Cubes cost from $11,000 to $17,000.) When the Apple-NeXT merger was announced, people who had been following the NeXT saga, who had been aware of the benefits of NeXTStep and OpenStep, could easily imagine something great happening, and were especially glad when things worked out well in the end. (Here’s an archive of old NeXTWorld magazines. The PDF scans are actually easier on the eyes. http://www.nextcomputers.org/NeXTfiles/Articles/NeXTWORLD/)
Computing has changed. For CAD, games, a/v production, you still need a workstation. But the workstation is now obsolete in major fields that it once dominated.
Industrial simulation: these days the optimal approach is to crunch on headless servers, ideally in the cloud, expanding and contracting computing power on a needs basis.
Front-office development: Objective-C and Interface Builder were much better for prototyping ideas than C. Modern Python/Perl/Racket/Go/C#/etc. are much further ahead. There were no mainstream dynamic programming tools more powerful than awk when NeXT was launched; now the most common form of hacking is dynamic programming on fabulously powerful free tools. High-quality operating systems are also free. Documentation is free. And it's all far better than what people used to pay for.
Deployment: interactive user interfaces are now delivered through a web browser. Previously, a lot of workstations were sold to run custom internal apps.
Save your money! Get a digital ocean account, and run your macbook until the power supply dies. Perhaps in another ten years google-style web-browser laptops will be ubiquitous.
Having dug around a bit in Awk, that might be a true statement. Awk has an entire ocean of features that mere mortals are completely unaware of, hidden behind a 1960s-teletype-optimized interface. So the statement that there were no tools more powerful than Awk might be true, but only because we don't know whether there's a full-fledged AI hidden under the layers of impenetrable interface.
What are your thoughts on the parent question? Is there a machine in the $15k price range that would give you a remarkable edge over a MacBook Pro? I don't know anyone in these professions. Perhaps an NT desktop box with niche I/O cards?
The "standard" for high end audio processing is Pro Tools HD/HDX, which does cost this much in its fully-fledged form.
Users can run a CPU-only version of Pro Tools called HD Native, which (along with competitors like Logic, Sonar, etc.) is a useful and powerful program. However, these lack the power and depth of plugins available for the hardware-accelerated version of Pro Tools (and other hardware-accelerated platforms like UAD).
I think the Holy Grail is and has always been to "go native" and get rid of these sorts of accelerator cards as money is better spent on the general-purpose devices that retain their utility better. But we still aren't there yet - when you load up even the best CPUs with lots of convolution processing, too much latency is introduced. So yes, I think there's at least a niche market for the $15K general-purpose computer that truly represents a "bottomless pit" of audio processing power.
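To put rough numbers on the latency point, here's a back-of-the-envelope sketch in Python; the buffer sizes and 48 kHz rate are illustrative assumptions, not measurements:

~~~
# Buffer-based native processing adds at least one buffer of latency
# per pass; heavy convolution work forces larger buffers to avoid
# dropouts, which is why dedicated DSP cards could still win.

sample_rate = 48000  # Hz
for buffer_size in (64, 256, 1024, 2048):
    latency_ms = buffer_size / sample_rate * 1000
    print("%5d samples -> %5.1f ms per buffer" % (buffer_size, latency_ms))
~~~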
I haven't even touched on video editing or 3D rendering, both of which can be even more CPU intensive. When you add all these niches together, they still look like the original power-creative niche Apple was targeting 10 years ago with the Mac Pro and the Macbook Pro, and we still want more power.
Most of the advantage was the software. NeXT gave up on hardware by 94 or so. Oh, and then the web happened, setting us back 20+ years in terms of application development productivity.
If the web set us back 20 years, why weren't application programming frameworks able to bolt on some kind of networking/browsing capabilities and take the crown away from the web, with all of their l33t "application" abilities?
My answer: your application programming wasn't actually that valuable to real humans. The web leapt 20 years ahead on the things that actually mattered to the world, and we lost some glitz that really didn't.
I think the web actually has far too many capabilities for most applications, and application frameworks are mostly a racket to keep application programmers and professional application users in a lucrative dance of wasting each other's time.
The vast majority of people can't even string together two functions, which is quite possibly the most powerful thing computers can do. For all of our effort we can't even give people that most basic of capabilities, something five-year-olds can do with a little guidance. Because programmers are obsessed with GUI animations and digital cockpits and enabling professional app users to bend to the arbitrary whims of their supervisors. Because when we imagine "power" we don't picture collective control of outcomes that matter to us; we think of the sensation we get when we can individually make a fancy tool do arbitrary things.
I realize that's just a rant, so I'll ask: what outcomes matter to you, that you feel are impossible on the web?
The advantage is already there, over Windows machines. It could be argued that the merger of NeXT and Apple democratised Unix.
In the 90s, most people were packing Windows 95 and if you wanted a Unix box you were going to pay for it. Silicon Graphics, NeXT, etc, were out of the reach of the general population and most developers. Only the CAD guys had them.
But many Windows users still complain today that the Mac is too expensive. So in that sense, if you want a Unix development machine you still have to pay a premium, just thankfully about an order of magnitude lower.
Of course, the democratisation would go much further: it is interesting that around the same time NeXT and Apple were merging, I first heard that some guy called Linus was busy porting Unix to the Intel architecture, and - unbelievably - he was doing it for free!
"I’d argue that this is probably the single most important tech acquisition of all time."
However, it was not Apple's first choice. They first attempted to acquire Be Inc. (BeOS), but Jean-Louis Gassée held out for $275 million and lost the deal in a surprise move when Apple bought NeXT instead.
I don't know. A BeOS foundation for MacOS would have made it even better at multimedia production. The trade-off, of course, is that it probably would have been late to the whole Internet thing, which was becoming something of a big deal around that point. BeOS's network stack was different enough from Berkeley sockets that porting network apps was a challenge; it lacked a version of Netscape at a time when even FreeBSD had a native version, for example. In the late 90s this was not acceptable. Apple's engineering culture wasn't one of embracing open standards either; they could very well have stuck with the Be network paradigm and made it a headache to port any POSIX application.
Be's pervasively multithreaded kernel would have been quite a boon for Apple when off-the-shelf processors started going wide, though. They could have really reaped the benefits from Core 2 Duos and beyond.
> They first attempted to acquire Be Inc. (BeOS) but Jean-Louis Gassée held out
And completing the circle: Gassée was partially responsible for Jobs leaving Apple, and was promoted to Jobs' former position after Jobs resigned. He founded Be after also being forced out of Apple.
Silicon Valley really can be thoroughly incestuous at times.
I knew at the time that Gassée was making a huge mistake. Be had some fantastic tech, but it wasn't otherworldly.
A MacOS based on Be would have been amazing, at the time.
OS X ended up being a better way to go. The other changes made when Jobs came back turned me off of Apple's products, but I can appreciate the overall improvements.
I still remember sitting there with my mouth hanging open watching 4 quicktime movies play at the same time after I installed the BeOS Preview Release on my 6400.
What I particularly liked was how BeOS would run on any PCI PowerMac. The Rhapsody DR1 release supported a much smaller set of hardware, and it came later than BeOS.
Gassee overplayed his hand and lost big time.
I LOVED Apple stuff. A BeOS based MacOS would have been phenomenal.
Nooooo! BeOS was really great. If only the wxWidgets port for it continued and we'd have easy cross platform software for it.
I see the package manager addition to Haiku and wonder if this was what was really intended for BeOS - it always struck me that it was more akin to the Mac and DMGs on there instead of a Linux-style package system.
I was applying for jobs back in 1997, and during that time there was a huge career fair called the Brass Ring fair. Employers from around Silicon Valley would have booths and hire people on the spot. I distinctly remember Apple's booth being completely barren; they were practically dead to most engineers. How the tables have turned since then. The big question is whether Apple is now following the same path it did after Steve Jobs left the first time, with the muddied product picture, no innovation, etc. I hope history doesn't repeat itself twice.
It would be great if someone wrote an article about the alternate histories where Apple chose other strategies. For instance, what an Apple-Be merge would have resulted in, what kind of products and tech. And how NeXT would have lasted without Apple.
A reason nobody has might be that the actual outcome was more unlikely and unexpected than the alternative histories one might come up with. They'd be short and conclude with both companies being sold for scrap and absorbed uselessly into some behemoth or another.
Wow, I haven't heard someone mention BeOS in over a decade. That was a pretty OS, but I'd take that black NeXT case any day. Those NeXT boxes were so cool.
There have been a few late nights where I found myself searching eBay and other places for a used BeBox. Sadly, they're hard to come by (and expensive). I used to have a bunch of NeXT hardware and it is indeed cool. I got rid of it in a move a few years ago because it was just taking up space in my closets, but I still miss it....
None of the tech really mattered, not much anyway. What Apple really got with NeXT was Steve Jobs' vision, along with a more seasoned CEO better capable of realizing that vision. Whether Apple had gotten BeOS or NeXT didn't really matter all that much in comparison.
This is ridiculous pedantry, but the "NS" prefix actually means "Next / Sun". It was introduced in OpenStep [1], which was supposed to become Sun's GUI layer on Solaris until Java happened.
Before the OpenStep revamp, the class prefix used in NeXTSTEP was "NX". I think there are still some lingering NX* classes/functions you might see in AppKit stack traces.
I think people forget that NeXT and its OS had some time to evolve. NeXT shipped 0.8 in 1988 and 1.0 in 1989. So we are talking about eight-ish years of evolution and API changes, going from the pre-web era to WebObjects and its influence on the API.
That's an amazing factoid. "NS" standing for NeXTSTEP would then be a popular misconception. It also introduces an interesting hypothetical: what if Sun had ended up adopting OpenStep, had Java not been developed in time or fallen through?
The OpenStep spec was a collaboration between NeXT and Sun. OpenStep was available as a full OS (for 68k NeXT hardware and 486 beige PCs) as well as running on top of Solaris (SPARC), HP-UX, and Windows NT. Sun went with Java instead, but Microsoft has a WinObjC project. Go figure. Until 10.7 or so, OS X still included Windows NT theme graphics [1].
NeXTStep 3.0 used the NS namespace; 2.0 and earlier used NX.
Nice one. Just checked your source and you're absolutely correct. I'm not a Mac OS developer, so I had no idea how Cocoa methods were prefixed, but I recall listening to a replay of a session from the Apple developer conference around 2007 in which a fairly senior Apple dev, while giving a brief summary of the history of Cocoa, said more or less that "NS" stands for NeXTSTEP.
Yes, at the time all the major Mac magazines were acknowledging that it looked like Copland would never ship, but the question was whether to replace it with NeXTSTEP or BeOS. Both had strong ties to former Apple execs; Jobs was obviously the bigger draw at NeXT, but Jean-Louis Gassée at Be was not to be discounted either.
Jobs and Gassée were both forced out by Sculley, whom a lot of people credit with driving Apple toward the mediocrity that defined it in the mid 90s.
The other big factor at the time was the fact that BeOS ran on PowerPC, and NeXTSTEP did not. NeXT had done some work to produce their own hardware with dual 601s, but never shipped it and abandoned that work to focus on the IBM PC hardware. This meant that the first few years of the Apple/NeXT merger were spent just getting NeXTSTEP running on PowerPC.
The biggest buzz of the time was that even though Steve's return to Apple had killed the growing Mac clone industry, maybe Rhapsody would run on x86 hardware. There were rumors all over that they would make this happen, although they were many years too early, as Apple wouldn't make that switch until 2005.
NeXT was an actual operating system that people used. You could run word processing apps from which you could print and email. There was a fully fledged development platform with real developers building real third party apps. BeOS was more like a giant tech demo.
If we focus on just the technologies choosing BeOS would have meant at least a few years more engineering effort. And you still wouldn't have been able to replicate the ground breaking nature of OSX in the short time frame required.
I suspect both financially and in consumer mind share they would have never recovered from selecting BeOS.
BeOS/BeBox was demoed at a MacWorld. There was a cool display of the utilization of each CPU and you could disable each CPU by unchecking a checkbox.
The demo app didn't prevent you from disabling the last CPU.
The booth-guy wasn't very happy with me.
If it was even possible. I liked BeOS, but it was an RTOS with a GUI on it. No real network stack, no real print stack; I don't even remember the state of internationalization. No multiuser story or security story. It's not clear that you could add that stuff to Be.
Maybe not from a business/product standpoint, but the merger meant that OS X had NeXT roots that exist to this day, even with the ascension of Swift over Objective-C. I'd say the history of technology there is interesting, especially when you consider the troubled history getting to OS X:
I'll copy from a comment of mine from last week. In case you wanna procrastinate on a fascinating story about how Apple transitioned to NeXTSTEP's stack without completely annihilating their already weak developer base, here are a few links:
You get a corporate drama, a human drama and a tech drama all wrapped up in one series of presentations:
Jobs' return (as a consultant) in 1997, promoting technologies from NeXT. Watch basically everyone asleep at the wheel except Jobs, the man with a plan.
Not just a plan but the Plan, which Apple has followed ever since. It's fascinating to watch this section of Jobs's 1997 comeback presentation, knowing how successful this strategy was with the iPhone and the App Store: https://youtu.be/4QrX047-v-s?t=7m40s
For a while I was thinking about the scenario where Apple bought NeXT but Gil Amelio, and later Ellen Hancock, stayed as CEO, and CHRP actually happened. While PC margins were declining, I was thinking that Apple could mostly focus on the higher-end PC and workstation markets, especially as OpenStep/Rhapsody was a UNIX.
According to Jobs, they were at times 90 days away from bankruptcy. If that's true, I'm thinking nothing but Jobs' full-steam turnaround, including getting Microsoft's investment money, could have saved the company. It doesn't seem to me like Amelio had either the energy or the connections to do that.
Whenever I start to think like this, I remind myself that it's about as useless as thinking "if only I bet it all on 00" at the roulette table.
Crystal balls that see the future are certainly in short supply. But if I had one there's a lot better ways to make money than using it to buy 10 shares of Apple 20 years ago.
>Crystal balls that see the future are certainly in short supply.
Yes, but Apple, from 1999 until Jobs died at least, and for some time more, gave people opportunity after opportunity to buy a constantly sky-rocketing stock. If someone didn't see it with the first iMac (1999 IIRC), they should have seen it with the iPod. If not, then with the iPhone. They had many chances to buy that stock.
Yes, but again, this is in hindsight. For all you knew back then, it could crash at any time, and any other company could also skyrocket.
The point is, right now, any of the hundreds of startups you see just starting /could/ potentially be the next Facebook. Statistically, most won't be. But of course in 20 years, you won't remember thinking about buying stock in all the companies that failed. You'll only remember how you once considered buying stock in this one company that is huge now.
>Yes, but again, this is in hindsight. For all you knew back then, it could crash at any time, and any other company could also skyrocket.
That's the same for every company, so it's a moot point. What would the lesson from that be? Nobody has a crystal ball, so nobody should invest in any stock?
Since investing does happen, and is based on signs and trends, what I'm saying is that Apple gave many of those repeatedly, and if someone didn't see a revived company when the original iMac came out, and then OS X 10.0, and then the iPod, and then the iPhone, and then the iPad, etc., then they have only themselves to blame.
You seriously can't fathom that knowing financials, track record, ceo, products, pending deals, etc. etc. gives you more information than random chance? How do you make any informed choice about anything at all then?
I can fathom it, but not on a 20-year timeline. The market prices stocks with all the information available. It's not perfect, obviously, but it's better than any other price-generation algorithm that lacks a crystal ball. If you happen to have a unique perspective that the rest of the market lacks, then you can invest in that stock and make a good return.
But 20 years out, the hypothetical "I wish I had bought AAPL in 1996" is just as useless as the "I wish I had bet on green" comparison. In this scenario, it's an apt comparison.
Anyone making an informed choice in 1996, knowing financials, track record, ceo, products, pending deals, etc. etc. would have stayed the hell away from AAPL
Actually, a lot of people believe stock movements are essentially random. There is a great book, "A Random Walk Down Wall Street", which explains it.
The basic idea is that the market is efficient (the Efficient Market Hypothesis) and all information about the stock is already priced in. Picking a stock to beat the market average is basically random chance.
Sure, across all investments. Pro tip: don't invest in all investments.
Real-life markets are very clearly not perfectly efficient. Nor is a stock's price a perfect reflection of available information: much (most? all?) stock is priced on expectation, i.e. guesses, not facts. Then there are all sorts of shenanigans like astroturfing and illegal manipulation/trading.
My dad remembers his broker calling him up at various times and trying to get him to buy Apple because it was basically worthless (which is also why he didn't buy it).
Wouldn't have taken much to make a lot of money.
Of course, most people, even if that lucky, would have sold early in the iPod era thinking "This can't continue", and wouldn't have made the profits they would have if they'd held until the last year or two.
What I really don't understand is why I didn't select the right lottery numbers at the last draw. I could have made millions.
You should only have regrets for decisions that would have made sense with the knowledge at that time. Buying Apple stock in the 90s wouldn't. But when I first saw an iphone, my first thought was "Wow, I need to buy one". My second thought then should have been "and I need to buy Apple's stock".
I bought some Apple stock in 2000 or so (then sold and re-bought it in 2001 when I switched brokerages). Maybe 7 or 8 years ago I put in a limit order to sell off some of it but it fortuitously expired. At this point, the annual dividend payout exceeds my purchase price.
I bought a bit of Apple stock in 2001, when it was in a rut after the dot-com crash... The next year, I sold the stock to buy a frigging 10GB iPod, of all things!
That might have been the world's most expensive iPod.
>My father's most regretted financial move is ignoring his weird thought to buy 10 shares of Apple stock at this time.
And how long would he have held on to them before selling?
That's the really fun part of trading: it's one thing to see a profit, and quite another to bet on whether it's as good as you're going to get, or go 5x - 10x - 100x higher.
Personally, I was planning to buy at $12 in the dark days when the Book Value was about Market Cap (~4B). Then it popped to 19ish, and I decided that the easy/big money was done.
There was a magazine called MacAddict which I'm sure some here would remember. One of their mid-90s Christmas gift recommendations was Apple stock! I sometimes wonder how many small fortunes those few paragraphs created.
I remember being in college and Apple stock was at $7. I had some money in an E-Trade account, but Apple was in such dire straits that I figured a better use for $7 was to buy lunch at Chick-Fil-A. There was a good chance the lunch would be around longer than Apple would! I don't even want to think about how much a couple thousand $$ of Apple stock would have been worth at its peak.
Mine, too. I was working my first appdev job at a startup in 1999, and as they talked about IPO and such I thought "apple, that would be some cool stock to own" based on my IIc nostalgia from school. I had no idea how to actually buy any at the time, though, and not a strong enough desire to pursue it further.
That $0.75/share price is already adjusted for the splits. So the stock price back in 1996 would have been around $20/share. So that's $200 in 1996 that would be ~$32k today. A 15000% return is still nothing to sneeze at and that's before you count the dividends they started issuing after Jobs died.
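Sanity-checking that arithmetic in Python (the ~$120 late-2016 share price is my assumption; the other figures come from the parent):

~~~
invested = 200          # dollars in 1996
adj_price_1996 = 0.75   # split-adjusted 1996 price
price_2016 = 120        # assumed late-2016 price

shares = invested / adj_price_1996   # ~266.7 split-adjusted shares
value = shares * price_2016          # ~$32,000
gain_pct = (value - invested) / invested * 100
print("$%.0f today, a %.0f%% return" % (value, gain_pct))  # ~15,900%
~~~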
In 1995 my parents talked about letting me pick a couple hundred bucks of any stock as a high school graduation present. I loved our Mac LC and suggested Apple, but they never went through with it. I'm pretty sure they thought I would just be throwing the money away with that pick.
Eyeballing the chart, it would have been right around the last time they paid a dividend until 2012 and the stock price would have been around 1.5. Also, there were three stock splits between then and now.
Oh well, I probably would have lost track of the stock in the many many moves I made during my college years.
It was no doubt an important acquisition, but 'the most important'? Someone who makes such a bold claim has gotta be someone who really knows computer history extremely well.
This is just one of those 'what if' scenarios.
If Apple hadn't bought NeXT, someone else would've innovated a good capacitive-touchscreen device. Maybe it would've been Nokia; they were busy experimenting with that long before the first iPhone was released (see the Nokia 770 and Maemo, or actually check its successor, the Nokia N800). Perhaps Elop would never have become the CEO of Nokia. Perhaps Nokia would still be around. Perhaps not? We'll never know. Everything's connected.
> I called up Bill and said, “I’m going to turn this thing around.” Bill always had a soft spot for Apple. We got him into the application software business. The first Microsoft apps were Excel and Word for the Mac. So I called him and said, “I need help.” Microsoft was walking over Apple’s patents. I said, “If we kept up our lawsuits, a few years from now we could win a billion-dollar patent suit. You know it, and I know it. But Apple’s not going to survive that long if we’re at war. I know that. So let’s figure out how to settle this right away. All I need is a commitment that Microsoft will keep developing for the Mac and an investment by Microsoft in Apple so it has a stake in our success."
Microsoft got into applications before DOS existed. Microsoft applications like Multiplan were available for a variety of platforms, running CP/M and a variety of other operating systems. You could get Multiplan for systems as small as the TI-99/4 and as large as Xenix!
> Microsoft got into applications before DOS existed.
Not actually true. DOS was already done when Microsoft hired Charles Simonyi from Xerox Parc to start its application division in 1981.
Multiplan and Word were the first results, released in 1983, by which time it was obvious that Lotus 1-2-3 was going to win.
Simonyi had big ideas about "metaprogramming" and using what was basically a VM to make applications portable. (I ran Multiplan on a Tandy 100-style portable, where the program came on a chip!)
In an interview, Simonyi said:
"Multiplan was done on a byte-coded interpreting system, much like Java. It was probably the most ported system ever deployed. We thought that the market would be fractured for a long time and that we would be on all of those machines -- which we were.
"Interestingly enough, MS-DOS changed that and created a unified market. And, of course, Lotus 1-2-3 made their bet on creating a single, optimized, direct implementation for MS-DOS, and they cleaned up. We learned a lot from that failure. And then of course, when the next shift came to GUIs [graphical user interfaces], we cleaned their clock with Excel."
Supposedly, Microsoft Office used a bytecode language for a long time afterward, at least into the 1990s. Old time Mac users probably remember MS Office 4.2, which was an identical clone of the Windows version and ran like molasses.
"The first Microsoft apps for the Mac were Excel and Word" would, of course, be correct.
Microsoft was the Mac's biggest supporter, and Bill Gates appeared on stage at the Mac launch. For which he has been richly rewarded with decades of Apple fanboy bile ;-)
Microsoft provided emergency funding. It's quite possible that they could have raised the money some other way, though. However, wherever the money came from, Apple would have been doomed had they kept on their original path, so it doesn't seem that unreasonable to claim that the NeXT purchase saved them.
Microsoft gave a similar infusion of cash to Corel at around the same time. Through very adept dealing they managed to leverage that investment, essentially forcing the board to accept a buyout from Vector (a VC company partly owned by Paul Allen). MS took a huge loss on the deal ($125 million spent, sold their shares to Vector for $13 million), but Vector made an absolute killing.
I have no doubt in my mind that this was the plan for Apple as well. Whoever managed that deal on the Apple side must have been on their toes. Of course, the investment paid off handsomely for MS, but I'm quite sure that this wasn't their intent.
So while I think it is true that Apple required the cash infusion they got, they managed the situation masterfully. Whether or not they could have secured funding some other way, we'll never know because they took MS's money.
I enjoy watching the later videos on YouTube where Steve brings up Bill Gates on a video feed and announces that Internet Explorer will be the default browser on MacOS.
The disbelief in the room is stunning, including the shouts of "NO!!!!!!". Very amusing today, perhaps not then.
Jobs brought in a great team from NeXT - like Jon Rubinstein, who oversaw the development of the iMac and the iPod, and later came as close to rescuing Palm, Inc. (with the Pre) as was possible.
Steve Jobs, the team he assembled, and a willingness to cut. Apple at the time was making the Newton, which, although I loved programming it, was a totally wrong direction. It just wasn't connected, as Mr. Jobs explained in his Q&A [1]. He had built a much better OS at NeXT than the Mac had at that time.
Palm couldn't build the iPhone with Palm OS and had to build webOS. QNX is amazing, but its UI and client developer library weren't there, or else RIM/BlackBerry would have used it. Apple had used Linux before, and frankly, there was no way Apple would have done a Linux-based system after the whole GNU thing at NeXT.
I do think Objective-C, the developer tools, and NS class libraries don't quite get the credit they should. They were very good (go read the Taligent documentation). Truthfully, if Adobe and Microsoft hadn't insisted on Carbon[2], the Mac would have been much better off.
Apple had done an Intel translation internally and had the institutional knowledge already from the 68K to PowerPC transition.
> Truthfully, if Adobe and Microsoft hadn't insisted on Carbon[2], the Mac would have been much better off.
It wasn't just them. Pretty much nobody wanted to write two different versions of their apps.
You would have had a vicious circle: users wouldn't upgrade to Mac OS X because most of their apps weren't native, so everything would run in Classic, which was just as crashy as Mac OS 9 but slower. Then developers would hold off on porting their apps to Cocoa, because hardly anyone was running OS X. So the 9-to-X transition would have taken much longer.
There was a sort-of kerfuffle between the FSF and NeXT with regard to what was required for NeXT to be GPL-compliant. This is RMS's account:
"I say this based on discussions I had with our lawyer long ago. The
issue first arose when NeXT proposed to distribute a modified GCC in
two parts and let the user link them. Jobs asked me whether this was
lawful. It seemed to me at the time that it was, following reasoning
like what you are using; but since the result was very undesirable for
free software, I said I would have to ask the lawyer.
What the lawyer said surprised me; he said that judges would consider
such schemes to be "subterfuges" and would be very harsh toward
them. He said a judge would ask whether it is "really" one program,
rather than how it is labeled.
So I went back to Jobs and said we believed his plan was not allowed
by the GPL.
The direct result of this is that we now have an Objective C front
end. They had wanted to distribute the Objective C parser as a
separate proprietary package to link with the GCC back end, but since
I didn't agree this was allowed, they made it free."
I'd be curious to hear more firsthand details from any parties involved. There's a widely held perception that this was a really divisive and drawn-out fight that left a lot of bad blood on both sides, but based on RMS's account, it wasn't. It sounds like Jobs contacted him and asked "is it OK if we ship it this way?", RMS checked with his lawyer and responded "no, you need to ship it this other way to comply with the GPL", and Jobs replied "OK, we'll do it that way then".
I'm not a first-hand observer, but I was certainly around and following the issue at the time. My impression was that a lot of people watching the issue got upset, but I never saw anyone from NeXT or the FSF complain about anything. Just like today, there were people who thought Apple was evil for not embracing software freedom; similarly, there were people who thought the FSF was evil for not letting people do whatever they want. To be honest, acceptance of software freedom as a legitimate idea is far more advanced today than it was back then, and you still see how much vitriol is spewed about 'zealots' and whatnot.
To be fair, I don't think Apple was ever interested in software freedom. This is pretty obvious from their actions since that time (moving away from utilising software under the GPL). But I've never heard anyone other than developers complain about it. From a business perspective, it makes sense -- a license is a license. You don't want to pay the price, you don't get the license. I got the sense that Apple's management understood this principle completely.
From the FSF's perspective, they got an Objective-C front end for GCC and they were particularly happy about that. It was a kind of triumph because the GPL did its job. Objective-C programmers had a free platform to work with, which never would have happened if GCC had not existed and was not GPLed. One can argue that these days more and more companies understand the benefits of open source development and might contribute large pieces of code willingly, but that certainly wasn't the case in those times.
I would also be interested in first hand views, but from my perspective, it was always a non-issue.
I don't understand the Taligent reference. Taligent happened while Jobs was out of Apple -- it was a competitor to NeXT back when everything was about "object-oriented desktops", and Jobs didn't like them, much like he didn't like OpenDoc, for pretty good technical reasons. It's not surprising that Taligent went nowhere, same as OpenDoc.
I was comparing NeXTSTEP with one of its contemporaries to show how innovative and polished it was. Taligent was supposed to be the standard and it was a true pain in the butt to program. Apple and HP bailed in 1995.
I did like the concept of People, Places, and Things. It is a shame that little pieces of a good idea can be lost because of the overall failure.
Don't forget the 1988-1995 FSF boycott of all things Apple, in which they refused to accept patches targeting MacOS, thus driving hobby developers to DOS/Windows and helping cement Microsoft's dominance.
Steve Jobs, without any doubt. Apple's renaissance wasn't so much about technology as about an entrepreneur at the top (instead of a manager) who could turn the ship around.
A solid UNIX operating system with a well-designed UI certainly helped in terms of technology. As did Objective-C. Technologically, Apple's achievements would've been difficult to realise with their classic Mac OS (which Apple knew quite well as they had been shopping around for a new operating system technology for quite some time at that point).
NeXTSTEP and Objective-C probably hit a sweet spot there, albeit one that would only come to fruition years later. I doubt Apple's current success would've been possible (in the way it turned out to be at least) with BeOS (another serious contender at the time) or Linux.
I think the fact that Palm's technology was designed for mobile and NeXT's wasn't turned out to be a huge advantage in the long run.
If Apple had planned to build the iPhone in 1996, they probably would have bought Palm and ended up with something a far cry short of the iPhone as we know it, for obvious reasons.
Palm's OS was designed for what mobile hardware was capable of in the 90s, whereas NeXT's OS was designed for what workstation hardware was capable of in the 90s. By the time the iPhone was released, Moore's Law had made mobile hardware capable of running the superior NeXT software (full unix kernel under the hood, Objective-C and its APIs, etc.) and you end up with a better end product.
The totality of Nextstep (kernel, os, obj-c runtime/component model, frameworks, gui layer, tooling, expert teams who built and maintained these things, etc, etc) was much greater than Palm or QNX, let alone something yet unbuilt 'based on Linux'.
Speaking of NeXT, I've yet to read a good history that explains how that came together. There's so much innovation there -- the NeXTStep kits (which became Cocoa, among other things), Objective-C, the fact that they used a Unix kernel. It was a lot of great ideas, but I don't think I have heard about who designed it. It's interesting to note, for example, that NeXT didn't invent Objective-C; they licensed it from the two guys who created it, Tom Love and Brad Cox, neither of whom I think were involved in development of NeXTStep. I'd love to learn more.
Yes! Maybe eventually something like a NeXT version of folklore.org will emerge since the stories one hears from old NeXT hands easily give Andy Hertzfeld's a run for their money.
(incidentally it wasn't just a Unix kernel but also the Mach microkernel. Interface Builder? Display Postscript? Sticking a DSP in the original machine? Lots of design decisions with potentially interesting history)
Brad Cox was a visionary, and his Planning the Software Industrial Revolution paper [1] from 1990 stands up well today. Especially worth referencing to any developer taking a dogmatic stance on static vs dynamic binding.
IMHO Steve's vision of small and beautiful devices finally came at a time when the technology was allowing for it to be possible. CPU, storage and battery improvements along with good/clean design allowed the iPod to become a real product and not just a design in a notebook. Then the iMac, unibody laptops, iPhone, etc.
Steve always thought big whereas Bill (Gates) thought practical. Steve got back into Apple at the perfect time to finally take advantage of the technology for something that a non-geek wanted.
I love BeOS and had NeXTSTEP 3.3 (& later OpenStep) and BeOS running in my office. There was no real comparison, NeXTSTEP was better in every way except doing video demos. It was easier to program and had a decent UNIX. NeXTSTEP was multi-user. BeOS was superior to quite a lot of the other OSes at the time though.
We will really never know, but I suspect there were some serious problems when Palm bought the code and couldn't build a phone / PDA OS out of it.
Apple's OS of that era was still single-threaded and an antique next to Windows. NeXT gave them a modern OS. Performance was falling further and further behind Intel-based systems. NeXTSTEP ran on Intel, Sparc and PowerPC. And most of all, NeXT gave them Steve Jobs.
The other way to look at it is Palm, QNX, and Linux technology never became as successful as NeXT technology in the mobile space. They could have bought those technologies and made them successful or maybe NeXTStep was the key.
My observation of the smartphone market is that Moore's law always wins. Microsoft knew that from the start; they built a mobile OS (Windows CE) designed more for future devices than for current devices (as Palm did). So ultimately, when the technology matured enough, Microsoft was ready while Palm languished trying to modernize. Ultimately, Apple did one better than Microsoft and simply ported their desktop OS to a smartphone and beat everyone.
If only I had had money to buy Apple stock in '98, lol; I'd be well retired now. Instead of going to graduate school, taking out a loan to buy tons of Apple stock would have been a far smarter move.
It's uncivil because of the personally abrasive stuff like "Your comment is not relevant or appreciated. Not only are you not adding to the discussion" and (below) "You are a troll." Please edit that kind of thing out when posting here.
Your comment is not relevant or appreciated. Not only are you not adding to the discussion, this comment is simply a personal opinion coupled with an undependable generalization about what "many people" consider relevant or appreciate without even addressing the topic of discussion.
I read that the NeXT, let's call it, "adventure", is the reason Apple went with a Linux distribution as basis for their new OS and thus, we're all using OSX now?
You're getting downvoted to hell for a small misunderstanding, and the replies aren't necessarily helping.
NeXT is the reason Apple went with a UNIX-type OS that evolved into OS X, yes.
Linux is also a UNIX-type OS, but is not an ancestor of OS X.
Linux is so popular now that it's easy to identify it with UNIX but historically it's only one of several implementations that work in roughly similar ways.
Other replies are telling you which code OS X was actually descended from.
I never heard that one before. I thought Linux was just a conflation of "Linus" and "Unix", while GNU was a recursive abbreviation of "GNU is Not Unix".
Linux means "Linus's Unix" and the name came from the sysadmin at funet.fi who had to give the kernel a home subdirectory when Linus asked him to host the kernel on an FTP site ..
"MkLinux started as a project sponsored by Apple Computer and OSF Research Institute, to get "Linux on Mach" ported to the Macintosh computer and for Apple to explore alternative kernel technologies on the Mac platform. At the time, there was no officially sponsored PowerPC port of Linux, and none specifically for Macintosh hardware."
Apparently a Linux-based system was a contender for iPhone OS:
> Around 2005, Jobs faced a crucial decision. Should he give the task of developing the device’s software to the team that built the iPod, which wanted to build a Linux-based system? Or should he entrust the project to the engineers who had revitalized the software foundation of the Macintosh? In other words, should he shrink the Mac, which would be an epic feat of engineering, or enlarge the iPod? [1]
When this article first came out, I assumed this statement meant iPod OS was Linux-based (I also feel like I heard that somewhere else), but I have not found any sources to confirm it. Anyone else know?
Richard Stallman announced that he was working on a Unix clone on September 27, 1983. [1] This was the GNU Project from GNU's Not Unix.
Stallman was a prodigious programmer and wrote GCC (the GNU C compiler), a debugger, and Emacs. He also created the GPL and founded the Free Software Foundation (from which the "open source" movement later split off).
GNU was most of an OS by 1990 but didn't have a proper kernel (which was intended to be the ambitious Hurd).
In an unrelated move, Linus Torvalds used Stallman's tools to write a mini-OS and released it under Stallman's license in 1991. It naturally got incorporated into what some people (including Stallman) then called GNU/Linux. It was, in code terms, roughly 97% GNU and 3% Linux.
The naming is arguable because Linux was never a GNU or FSF development. However, Stallman's GPL allowed Linux operating system packagers (eg Debian) to use things that had been developed by or adapted for GNU.
Otherwise, Stallman told me he wouldn't have bothered with GNU if he'd known that BSD Unix was going to be available as free software. Unfortunately, FreeBSD wasn't released until 1993.
UNIX is an open standard with a certification program, and Linux certainly tries to follow the standard.
The only real reason a Linux distribution hasn't been UNIX-certified is that Red Hat and IBM found there was very little marketing value in calling it UNIX; that is, Linux is a stronger brand.
And I suspect the only reason MacOS is UNIX-certified is because they advertised it as "UNIX" and were sued by The Open Group, who owns the trademark. [1]