I agree with Gruber, but I'd like to add that these are not small errors on Walter Isaacson's part. They're huge errors. The biggest errors any biography author could make about Steve Jobs.
Why? Why are we even interested in reading a biography about Steve Jobs to begin with? Because he was a narcissistic asshole? Really? Because that's the part Isaacson nailed. There are plenty of assholes, and that characteristic alone does not make for a best-selling biography. No, the reason anyone is interested in reading Steve Jobs's biography is because of his work.
And yet, Steve's work is the part Isaacson doesn't get. Isaacson falls into the same traps the media falls into with regularity: thinking Apple's design obsession is about veneer, about marketing, about fooling people, about lying. It's not. That might sell a few products, but it doesn't sell record quantities of products or earn top customer-satisfaction ratings.
You'd think a person with full access to Steve Jobs and the people close to him would be able, at the very least, to ask a few questions about what he saw that others could not, the insight that led to the successes of, e.g., the iPhone. Recall the other industry bigwigs laughing it off, from RIM to Nokia to Microsoft. The iPhone was a joke to them. What did Steve see that they did not? What was his thought process? What made Steve Jobs so different that he could upset industry after industry? These are things I'd have wanted to know, and I can't help feeling a bit sad that now we never will. Because Isaacson squandered the only chance we got.
You shouldn't really get upset at CEOs trash-talking their competitors. It's their job. In fact, it's a compliment: trash-talking something means it has registered as a threat. How many of them trash-talked OpenMoko? (Conversely, what did Jobs trash-talk? Mainly the Kindle and Android.)
Ironically enough, Gruber is a big fan of taking offense at these entirely predictable comments from CEOs. He's basically trolling himself by taking obvious talking points seriously, and trolling his massive readership by continually re-broadcasting comments that are entirely without merit or interest.
"You shouldn't really get upset at CEOs trash-talking their competitors."
There's nothing wrong with that in general, but if you have special, unique access to an important industry figure, who is not going to be around for very long because he's dying, and you're writing what ought to be the canonical biography of the man, you really shouldn't be wasting your time putting false competitor trash-talking on the page, and certainly not without adding "but in fact Gates is wrong about this" and similar qualifiers.
I'm not even a little bit upset about CEOs trash-talking each other. I'm sure Gruber doesn't care either. Not sure what led you to believe that.
Observing something happening and writing about that doesn't mean that it upsets you, or that you like it, or that you dislike it. It only means you find it interesting or telling in some way.
>What computer would you rather use? A MacBook running Windows 7, or, say, a Lenovo ThinkPad running Mac OS X 10.7?
Seeing as I run a MacBook Air with Windows 7 and don't even remember what OS X looks like, my answer is pretty obvious. I've never used a laptop that feels as good (the touchpad is the best).
No, maybe not exactly. But keep in mind that we here in the tech world (or maybe this is just my experience) pretty much concede that Apple wins the design award. I have several non-tech friends who still love Windows and are all about Windows 7. I know people who are thinking about switching to a Mac but are afraid of not knowing what the hell is going on. (I was honestly concerned at first.)
I guess what I'm trying to say is that other software out there has a huge presence with non-techie people, who are tempted by Apple only because of the "ooo shiny" effect of its slim devices, backlit keyboards, and the fanboyism surrounding them. It's decidedly the hardware that is converting people. Windows users are more than happy to just use Windows forever. Until their devices don't look as pretty as the one next to them in the coffee shop.
(Results may be different for mobile, but only because I think Android hasn't got its act together with its vendors yet. Some aspects of the business take longer to sort out than technical development, unfortunately.)
I hate to go off-topic, but you don't have any contact info in your profile... what's your typical battery life like with that setup? I was under the impression it was 5-ish hours on the 13", but The Verge said 4 this week... which made me rethink adopting the Win7/MBA setup.
Well, I run a pretty no-frills Windows 7 installation (no animations, black background, and I use Black Viper's website to make sure I'm running the bare minimum of services and processes). No idea if that even does anything; I'm not a hacker. But if I'm stingy with the backlighting, I'll get between 5 and 5.5 hours of "normal" internet usage.
So it's not as good as advertised, and doesn't seem as good as when OSX is running. That said, it's still relatively great hardware; there are quite a few Windows 7 laptops around my office, and nobody is bragging about how great their battery life is.
A nit: hardware isn't just industrial design. Apple manages to squeeze a lot more performance out of the same components as others because they design in an integrated, interdependent way rather than a modular one. This gives less flexibility to customize and mix-and-swap, but better performance (for whatever you want to optimize: speed, weight, size, power consumption, etc.). This was extremely important in the early days of the iPhone, but now that components have improved so dramatically, we are nearing the point where there's performance to spare and it needn't be optimized.
The upshot is that "iOS on an Android" with the same specs wouldn't have performed as well. It would have been less smooth, less responsive, etc. So, at least then, hardware was crucial for the experience.
The same was true for the iPod and especially Woz's Apple computer. It's still true for the iPad. I believe it will be true for Apple's next product category, because (hopefully) they'll continue to move to the edge of what is possible - where optimization is absolutely essential to be the first to get over that edge.
Can you give more technical details? I don't really understand how iOS would run worse on a similarly spec'd phone. I can understand how hardware component design would affect the physical build, but I don't see how it makes the processor or memory faster.
It's things like bandwidth between components (EDIT: and, in general, "fit" between components, and fit with the end goal rather than with intermediate interfaces). I don't know Apple's internal design details, but they consistently outperform competitors with higher specs. A good example is the Transformer Prime (quad-core) performing about the same as the iPad 2 (dual-core), even though the Prime has a higher clock rate.
I think this is the review: http://www.anandtech.com/show/5163/asus-eee-pad-transformer-...
While it's true that Apple has its own SoC, this is built from existing components: it's just that the components are packaged together, instead of distributed across a mobo.
When you have a very limited set of hardware and work side by side with the people who made it, it's far easier to optimize your software than it is for one company writing software for 20 other hardware companies they don't have direct contact with.
Months ago on Hypercritical [1], John Siracusa had a series of episodes where he talked about the weaknesses of programming in Objective-C versus more dynamic, managed runtimes like C#/Java: garbage collection, dynamic typing, etc.
One of the advantages he cited, and partially dismissed, though, was that since Objective-C compiles directly to native code, performance and battery life are better on RAM- and CPU-limited phones than with the added overhead of Dalvik or another virtual machine or JIT. (His theory was that this bought Apple time in the short term but is still a long-term issue; I'm not sure I agree, but I don't disagree that Apple needs to be evaluating this stuff.)
It probably helped more in 2007 than now; modern phones have a bit more horsepower, to the point where they can afford to shrug off a VM. The narrow field of hardware helps significantly in terms of /feeling/ fast, if not actually being faster. You'll note that Windows Phones are fairly snappy despite running on effectively year-old hardware and running everything* (including games; WinPhone doesn't support native code) on top of the CLR. Since WinPhone only supports two SoCs (the Snapdragon 8x50 and 8x55), MS can pour all their energy into optimizing every last drop of performance out of that chipset.
* There's at least one exception in Trident (the IE rendering engine). I'm assuming most of the stock apps are CLR-based unless proven otherwise.
One marginal example: rumor has it that the A5 ARM chip used in the iPhone 4S integrated technology from a company named "Audience" to better handle voice recognition for Siri.
As time goes by, I'd expect "third party" SoC vendors to offer their customers an ever wider range of custom silicon, but there will likely always be advantages to being the "first party".
Reports indicate Siri computes primarily in the cloud (the whole sound file is sent), and that it's not on the iPhone 4 for other reasons (e.g., reducing server load, increasing 4S sales).
That is true: people have ported Siri to the iPod Touch and iPhone 4, and it performs exactly the same. Therefore, either they emulated a whole chip (highly unlikely) or everything is processed in the cloud/on the A4/5/#.
The A5 has several integrated processors for specific purposes, specifically relating to audio and video. The Audience chip is likely one, but there are also processors specifically for the purposes of handling video encoding or decoding, and other things, probably like cryptography.
Apple was not specific about what all the coprocessors are or what they do, which is why I'm being vague, but they have publicly announced that these are integrated into the SoC.
So the "Rumor" part is that one of them is from Audience. That there are custome co-processors has been revealed publicly.
I don't know much about iPhone hardware, but a good example of this is on the 2010 13-inch MacBook. Apple has been a leader in enabling their software to offload tasks to the GPU (see OpenCL, etc). This enabled them to stay a generation behind on their CPUs for MacBooks (keeping the Core 2 Duo alive longer than expected, giving several advantages) without falling too far behind on overall performance.
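To make "offload tasks to the GPU" concrete: in the OpenCL model you compile a small C-like kernel at runtime and enqueue data-parallel work on whichever device you ask for. Here's a minimal, self-contained sketch of that pattern (my illustration, not Apple's code; the vec_add kernel and all names are invented, and error checking is omitted for brevity). On OS X it builds with "cc vecadd.c -framework OpenCL":

    /* vecadd.c: add two vectors on the GPU via OpenCL. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    /* Kernel source, compiled at runtime: one work-item per element. */
    static const char *src =
        "__kernel void vec_add(__global const float *a,\n"
        "                      __global const float *b,\n"
        "                      __global float *out) {\n"
        "    size_t i = get_global_id(0);\n"
        "    out[i] = a[i] + b[i];\n"
        "}\n";

    int main(void)
    {
        enum { N = 1024 };
        float a[N], b[N], out[N];
        for (int i = 0; i < N; i++) { a[i] = i; b[i] = 2.0f * i; }

        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        /* The offload decision: ask for a GPU instead of the CPU. */
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Build the kernel for whatever device we got. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vec_add", NULL);

        /* Copy inputs into device memory; allocate the output there. */
        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem dout = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof out, NULL, NULL);

        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dout, &dout);

        /* Launch N work-items, then read the result back. */
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dout, CL_TRUE, 0, sizeof out, out, 0, NULL, NULL);

        printf("out[42] = %.1f (expect 126.0)\n", out[42]);
        return 0;
    }

The same work runs on the CPU if you pass CL_DEVICE_TYPE_CPU instead, which is the point: code written this way moves off the CPU whenever a capable GPU is present, so a machine with a modest Core 2 Duo can still keep up.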
It would not run worse. The stuttering nature of Android comes from software architecture decisions made early in development. The same is true of iOS: iOS places a much higher priority on rendering and handling user input, which is why it always appears so smooth.
That theory was debunked by members of the Android team shortly after that G+ post made its way around the internet. It has more to do with the number of drawable contexts and the limitations (still) of how they're handled in mobile processors.
Interesting, but as a software guy I disagree entirely on the hardware/software side. I'd much rather have a 4S running ICS than one of the others running iOS, and I'd much rather have my MBP run Linux than even OS X (stupid EFI...). Granted, I have different tastes and needs than most, but I view Apple products not as the OS "in a pretty box", as Jobs put it, but rather as a pretty box with a good-not-great OS in it.
Correct me if I'm wrong, but you're not disagreeing with the conclusion that software is more important; you're just arguing the details of which software is superior.
If I could mix and match, here is what I'd take:
Software: iOS
Hardware: Nokia
Customer Service: Apple
Ecosystem: Amazon
Carrier: none of the above (the US sucks so bad...)
That's basically right. Apple makes fantastic hardware. None of the other manufacturers come close on design and execution. As far as software goes, I'm more concerned about what I can do with it than the polish.
I'm curious as to why you would prefer a Nokia handset to an Apple one. I've found their designs to be uninspiring.
Pretty much every review of the N9/Lumia series has mentioned the superior hardware. My main criticism of the 4/4S's edges is on this page somewhere.
I don't claim any special experience here; Samsung, HTC, and Motorola can all make really swell phones, and it's hard to blame them when Verizon forces them to put a juice-hogging 45 nm LTE chip in all their top-end phones. Fujitsu phones are gorgeous but not widely reviewed in the English tech blogosphere.
All my Nokia phones still work or should work (Nokia 5110 from 1999, Nokia 5125 from 2001, Nokia 1100 from 2004, Nokia 5200 from 2007, Nokia N86 from 2009).
My carrier had to convince me to leave my 5125 as they were turning off the analog service (it's now used as an alarm clock), and I still use the 1100 sometimes.
However, I finally switched (partway) to Samsung, entirely because of the software (I wanted an Android phone), even though my N86 is better built and has a better camera (and a hardware camera button, which I prefer).
I agree with the Linux-on-a-MBP part... indeed, I'm planning to do the same when I buy my MBP in a few months. However, I do recognize that I'm not the typical use case for this device, since I spend most of my time on the command line, etc. For actually developing something, there's a much better ecosystem on Linux.
On the other hand, I'd take iOS running on, say, the Galaxy Nexus. If I jailbreak iOS, I can easily run iSSH, install inetutils, vim, etc., and come halfway to an amazing development device with access to the whole app ecosystem. For me, that's the dealbreaker (or dealmaker, depending on the perspective).
I jailbroke my 3GS a couple years ago with the same idea in mind. A development platform it is not. I liked the idea of being able to connect to a server from the bus and restart a service or what have you, but for the frustration involved it's just not worth it. I'd far rather haul an Air around and tether it to my phone.
I'll disagree on this point. I've used my 3GS for two years, and it served me well as a quick-and-dirty way to develop in... unusual places. I'm not even talking about SSHing into another computer; just SSHing to localhost, quickly using vi to change a file, then running a git commit is ridiculously convenient for someone who doesn't own an Air and doesn't want to lug his laptop around. The only problem, really, is the speed, which I agree isn't the fastest, but the iPhone 4 is really usable. When I upgrade to the 4S in a year or so, it'll probably be ready for most stuff I want to do in a pinch.
In fact, I have an 8-10 year old Dell laptop and a MBP bought within the last year or two. I planned to make the switch but I've kept using the old Dell and the MBP sits at home as a net-browsing machine for my wife.
It's hard to dispute the excellent hardware Apple makes, but quite easy to do that to their software. My MBP dual-boots into Ubuntu so I can actually do my work and the 4S would make a great Android phone, though it still wouldn't be my choice due to its small screen.
What it really comes down to is that their software is too opinionated. At every turn I get frustrated by one inanity or another until I give up and just use something that works without requiring mental contortions on my part.
Good design is always opinionated. That you don't share those opinions doesn't make it poor design, any more than it makes you wrong for not sharing them.
I spent a long time thinking about whether the hardware or software is more valuable to me, and, in the end, I agree with Gruber. The software really does make the platform.
This doesn't actually mean that the software isn't the major thing that makes Apple great, though. Just because you find it frustrating doesn't mean that the 99% of 'average' users don't find it vastly superior, usually without even realising it.
I feel the opposite. One example: when I use Linux, I'm immediately met with the contortion of Ctrl-C/V/X not working as copy/paste across the board, most notably in the terminal. Cmd-C/V/X works across the board.
This is just a side-effect of Apple using a non-standard modifier key. I don't think that the CMD modifier was chosen so that it wouldn't collide with the use of ctrl- sequences in terminal emulators.
I think that cmd actually was designed so that it wouldn't collide with ctrl. If you look at the keyboard for the Apple II, it had a ctrl key, and I believe it was used for control characters. On later keyboards, they introduced the open apple key (ancestor of command) while retaining the ctrl key. Apple manufactured their own keyboards, so it was relatively easy for them to have separate keys for control characters and keyboard shortcuts. Microsoft on the other hand, did not have this luxury. They were inclined to base their software around existing keyboards, and so they co-opted the ctrl key for shortcuts.
One cool thing I learned while researching this is that Emacs-style keybindings work in TextEdit or any native OS X text area.
^A go to beginning of line
^E end of line
^L center line vertically
^K kut (cuts text till end of the line and stores it in a separate buffer from the clipboard)
^Y yank (pastes from the kut buffer)
^D forward delete
(^ = Ctrl)
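These bindings come from the Cocoa text system's keybinding dictionary: AppKit ships the Emacs-style defaults in its StandardKeyBinding.dict, and you can add your own in a per-user file. A minimal sketch, assuming the usual path and two standard NSResponder selector names (the particular bindings here are my illustrations, not shipped defaults):

    /* ~/Library/KeyBindings/DefaultKeyBinding.dict */
    {
        /* Ctrl-U: delete back to the beginning of the line */
        "^u" = "deleteToBeginningOfLine:";
        /* Ctrl-W: delete the word before the insertion point */
        "^w" = "deleteWordBackward:";
    }

Cocoa apps read the file at launch, so the additions show up in TextEdit, Mail, and any other native text area; apps with their own text engines won't pick them up.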
Yes it was, at least indirectly. The cmd key is designed to invoke shortcuts in the GUI layer and not conflict with anything that's a legacy from the text-mode style of interaction. That's why it's a special different key.
This is an accurate way to think about the relationship between the command key and the control key on the Mac today.
Historically, the Mac initially didn't have a control key. It was introduced later to accommodate terminal emulation. There was no "legacy" to accommodate for the Mac of 1984; it was a new platform with a completely different UI from most other computers of the time. Since then Control has been increasingly used as a general modifier key -- though still limited compared to command, for exactly the reason you state.
Not quite... it's a side effect of Apple's Interface Guidelines. In Linux, there are a few different sets of guidelines (if you're lucky), so God only knows what was in the mind of the developer at the time they worked on a GUI. To top it off, most Mac devs think a lot about the design of the GUI and concentrate on human/device interaction. A Linux dev is more likely to be focused on just getting the damn thing to work.
Being able to cut and paste in the terminal without thinking about it too much was one of my favourite features of Mac OS X, but it's nonsense to claim it's because of Apple's Interface Guidelines. It was just an unlucky coincidence that Apple avoided by chance.
Meanwhile, the shortcut for quit is right next to the key for closing a window (at least on QWERTY). That's kind of stupid, but let's not go building any grand theories to confirm our biases just based on that factoid.
"It was just an unlucky coincidence that Apple avoided by chance."
The fact that they were not specifically thinking of the terminal (which you are assuming) does not mean that it was purely by chance. That's a non sequitur.
I know it's just "one example", but is it really that hard to remember that a few programs have that particular combination reserved? In most terminals I know, you just do Ctrl+Shift+x/v/c. Pressing one more key because you are in a special environment is hardly a big deal.
Funny how the iPhone suddenly has a "small screen"... When it first came out, the phone and screen were criticized as impractical due to their large size.
Yeah, I just figured that it wasn't to scale... Looking at http://phone-size.com, it seems like it is to scale. So that diagram is clearly biased and hurts his argument, but I think what he says is still valid.
I don't see any advantage to having a bigger screen, especially when the bigger screen has fewer pixels (i.e., the Galaxy S II). In that case, you actually have less pixel space for UI elements, text, video, etc. than you would with the smaller, denser screen.
Every time I see this article mentioned, I cringe. I am a man with relatively small hands, and I can easily reach the right edge and even the upper right corner of my Motorola Atrix 2 (almost as big as a Galaxy S II). Have you actually tried this? Unless he has tiny, child-like hands, those graphics have to be inaccurate.
I can comfortably reach the upper right corner of my phone. Maybe I have larger hands than I think I do, or maybe I hold my phone differently than most people. But I still don't see how that blog post linked to in the parent comment is at all accurate.
There are still quite a few Android phones out there with keyboards. I miss my BB keyboard on occasion, but I can usually just wait till I get to a proper computer if I want to type anything particularly lengthy.
Sure. My point is simply that when the iPhone came out, it was widely criticized for not having a keyboard. These days, not having a keyboard is the norm, and those few phones which still have them are really unusual. What was once a target for criticism is now standard, and vice versa.
True, what's interesting though is that the trend hasn't quite held up in the tablet space. Go to a coffee shop and it's entirely normal to see tablet users using a keyboard of some sort.
I was going to say that tablet users who want keyboards at least buy them as an add-on rather than buying tablets with keyboards built in. Then I realized that a tablet with a built-in keyboard is called a "notebook computer", and those are already quite popular.
Seems to me that it's all about the difference in size. A full-sized physical keyboard is far superior to a virtual one. But on a tiny phone, either way is going to be painful, so it matters much less.
Did you have much difficulty getting Ubuntu running? I couldn't get Fedora to boot from USB or HD, even using rEFIt. I suppose I could check out Ubuntu again.
Macs have trouble booting legacy operating systems off external devices, but I've never had trouble booting anything off the internal drive, or booting off an external drive using EFI. At the moment, I'm running a Linux system off a firewire hard drive, with my bootloader, kernel, and initrd stored on the EFI protective partition of my internal drive.
Hmmm. I've never been able to get Fedora installed from USB to the point it will let me boot off the Fedora partition. I'll try again with Ubuntu and see if I can get that working, then move from there to another distro.
You won't be able to install via USB because rEFIt/EFI won't enable USB booting. You can use the internal DVD drive for installation, however, or replace the drive with a hard drive containing the installation medium.
The best way to do this is to partition your first drive into three partitions with Disk Utility before installing OS X. Install OS X, Boot Camp, Windows, then Linux. After that you're free to remove your DVD drive if you prefer using a hard drive in that bay, but you'll need the drive for OS installations.
I am probably in a very small minority in this regard, but in this case, for me, and contrary to Gruber's point, it's the hardware, not the software.
The hardware is an aesthetically superior design, but I run Windows 7 on it. I find Windows 7 to be a far superior user experience to OS X: faster and, at least on this hardware, more stable. Apple provides the easiest Windows driver-install procedure of any vendor. I just find OS X itself... rather primitive, most likely (IMO) due to Apple's insistence on providing a hermetic user experience.
No tech company, no matter how smart, has all the answers in one box.
I disagree with Gruber's interpretation here. Apple wants to do everything well; excellent software is a by-product. But as a focus? Not really.
This is not to say Apple doesn't make good software, but there is very little in the actual output of the company that supports Gruber's notion.
Back in the early iPhone days, one of the biggest complaints was the lack of multitasking in iOS: you could only ever have one app running at a time. There were no push notifications, etc. The Apple explanation, per Jobs, was that they consciously chose to exclude that capability. A few versions later, voila: iOS supports multitasking. This sort of cycle, explaining why a feature didn't exist due to some chosen policy or belief and then including it in a later revision, became a pattern for Apple.
Flip to the hardware side, and the story is different. When has Apple hardware, since Jobs's return in the nineties, ever been a compromise? It hasn't, because Jobs focused on the hardware. While the software is important, it is really a means to an end. The hardware meets this condition too, but it is much higher in the pecking order of consideration than software.
Well, people complained about the lack of 3G, and now LTE. Some also complained about the lack of radio or full Bluetooth support. I think you're right: Apple wants to do everything well, not just "everything". If they can't do something well (copy-paste, multitasking...), they don't do it until they can.
I think we have yet to see the full spectrum of Apple's focus on software. Until the iPhone, they were hardware oriented (the Mac, the iPod). But since then, they are moving towards software. They make some of the best Mac and iOS apps after all.
Isaacson writes fluidly and put in the research and reporting (while also rehashing a lot of other people's), but he doesn't know the technology or the tech business. In fact, I don't get a sense that he likes them, or that he 'got' Steve Jobs.
Hatchet job might be strong. But he dwells a lot on the charismatic and narcissistic and mercurial personality and not on why so many great people loved Jobs and worked so hard for him. Or what his insights about products and the business were (besides being a control freak and perfectionist).
The book is a good read and a creditable first draft of history; it contains some first-hand material I never saw before about the genesis of the iPod, iPhone, and iPad.
Isaacson gives the who, what, when, where, but doesn't really explain why. To his credit, he lets the people speak for themselves.
Jobs could have picked a lot of other people, but he picked a non-tech, non-business writer. I guess he wanted someone to just tell the story, not the strategy or product vision that makes Apple great.
Maybe Gruber should interview a bunch of people and give it a shot. It's not what Isaacson set out for or was in a position to do.
> Isaacson gives the who, what, when, where, but doesn't really explain why.
Most of the time. But sometimes he tries to explain and fails miserably.
It wasn't a newspaper article; it was a book that everyone knew would sell tens of millions of copies (even if the subject hadn't died a few weeks before). If he had sought the counsel of one Apple observer (about his explanations, and whether they were right or catastrophically wrong) before wrapping up the book and sending it to the publisher, the book wouldn't be such a mess.
> Jobs could have picked a lot of other people, but he picked a non-tech, non-business writer. I guess he wanted someone to just tell the story, not the strategy or product vision that makes Apple great.
Or, as John Siracusa said, he might've chosen the wrong guy.
Perhaps Isaacson does not explain why because it is rather impossible to do so. The most honest type of journalism is to let people speak for themselves.
On the other hand, someone like Gruber would be the absolute worst person to write a biography. He idolizes Jobs and Apple and is the farthest thing you can get from an unbiased observer. The only truth you would find in a Gruber biography of Jobs is the truth of how Gruber himself sees the world.
The only person who could really answer the Why was Jobs himself, and even though this was an authorized bio, I never really felt that there was much personal insight from Jobs himself. Did Isaacson really not ask the questions, or did Jobs not know the answers himself?
Or maybe Jobs was a control freak who didn't want to give them, and wanted to get everyone to read a somewhat shallow historical treatment and suck all the air out of the mass market for books about him before someone wrote something more serious.
I had two problems with this post. First:
'NeXTStep was not “just warmed over UNIX”.'
It was, and so was Mac OS X. What Gruber doesn't seem to get is that warmed-over Unix provides a much more stable OS than Windows NT or DOS. He should be proudly admitting it's warmed-over Unix.
Second:
"It’s almost impossible to overstate just how wrong Bill Gates is here, but Isaacson presents Gates’s side as the truth."
It should be mentioned more clearly that Gates was saying this on a sales call, his ultimate goal being to have every consumer computer run Windows NT. If he stretched the truth a bit, he shouldn't be blamed for being ignorant, only ambitious.
This is what often irks me about Gruber - he makes disagreeing with Apple out to be an act of incompetence. Most engineers who don't like Apple products simply want greater customization of their tech, something Apple denies its users to promote ease of use.
It wasn't, unless you're completely ignoring the one thing that made OS X and NeXTStep unique: the OpenStep frameworks and the Display PostScript/PDF GUI engine. Being able to use these components made it possible to provide an operating system with a nice, well-performing GUI running on a stable Unix foundation.
Had NeXTStep been only a warmed-over Unix, wouldn't it have been better for Apple to just use Linux and X11, or better yet A/UX, which already had a Mac-like interface?
Gruber is not really blaming Gates; he's blaming Isaacson for not doing proper research. He could have asked literally anyone for more information about this, and the answers he got would have provided more insight into what really happened and why Apple succeeded.
It enabled Apple to have OSX running on Intel from day one, and it made launching the iPhone and iPad possible without reinventing the wheel (which is what Nokia, Microsoft, RIM and Palm all had to do in response to the iPhone).
No, not necessarily, but do you think Android or webOS fits the description "a warmed-over Linux distro"? The fact that they both run Linux is not what makes either of them unique or interesting.
"This is what often irks me about Gruber - he makes disagreeing with Apple out to be an act of incompetence."
No, you're missing the point. The incompetence is not "disagreeing with Apple", it's "presenting Gates' assertion as fact when it demonstrably is not."
Isaacson's task was not to convey what Gates said on a sales call. It's not a book about Gates.
Isaacson was using the quote from Gates to make a point about Jobs and Apple, and because Gates was wrong (even if justifiably so, given his motivation in context), Isaacson misinformed the reader.
Isaacson ought to have checked out Gates's claims with someone who would know, like Avie Tevanian or Glenn Reid, whoever.
I think it bears mentioning here that Isaacson has acknowledged there may be some places the book could be improved, and he may be putting out a version 2 soon (or maybe a 1S? I'll show myself out...). The whole thing was a bit rushed to press.
I don't have any particular feeling toward Gruber's work, but (apart from the odd timing) I really liked this piece.
But I had this nagging "hey, I've read this before!" feeling in the back of my head, and I was right: I'd heard similar complaints before, voiced by John Siracusa (you know, the guy who writes 10+ page reviews of new versions of OS X on Ars Technica? That's him) on his 'Hypercritical' podcast. [1] It's long (1h15m, and that's only the first part), but in my opinion absolutely worth listening to.
If you have free time, or have nothing to listen to while commuting - give this one a shot.
Both Hypercritical and The Talk Show had in-depth discussions of the book around that time; I believe Gruber started with "I mostly agree with Siracusa."
I was also a little surprised by the timing of this piece, because I also felt like I'd seen/heard some of it before -- but I realize the audience for DF is bigger than the audience for The Talk Show and Hypercritical, so it makes sense and is well sourced.
That would really be something. Sadly, I think there was barely a sufficient market for a deeply geeky coffee table book on the Mac's creation. I'm sure the market for an even more deeply geeky look at the development of a modern microkernel-based OS is even smaller.
I don't know what I'm more surprised at: Isaacson's piss-poor job of doing Jobs's life justice, or that Jobs chose him to write the book. Either way, I walked away very disappointed, ready to never think about the book again.
This is what puzzles me the most about Jobs fans, such as Gruber: they think Jobs is a genius who just can't get anything wrong, except... when he picked the person to write his biography.
Eh. I agree with the Siracusian critique (echoed by Gruber) that Jobs "picked the wrong guy", at least for us hackers. But I don't think Jobs really gave a damn what Isaacson wrote about what went on under the kimono at Apple. If anything, he'd probably have preferred a biographer who would have left the TV "cracked it" quote on the cutting room floor, lest Samsung know what was about to hit them.
He wanted a bio that would help his kids get to know him better, and that's what he got. It's not a coincidence that the most intimate moments in the book all revolve around Jobs outside of Apple. That's what he talked to Isaacson about, and that's what Isaacson put to paper.
"I don't think Jobs really gave a damn what Isaacson wrote about what went on under the kimono at Apple."
You nailed it.
That's the opinion I heard from Gruber on The Talk Show podcast and it makes sense - that Jobs would've picked someone like Steven Levy if he really wanted to explain the inner workings of Apple and alternatively, Isaacson was perfect for a human interest puff piece.
All of the stuff I wish Isaacson had written about well (the years in the NeXT wilderness, the lessons learned from Pixar, the inner workings of Apple from '97 on) is written about and examined in exactly one place on the planet, where it can help only Apple: the secretive internal executive training program known as Apple University, formed in 2008 and headed up by Joel Podolny, the former dean of the Yale School of Management. You didn't think someone who planned things at the scale Jobs did would have a succession program that stopped at Tim Cook taking over, did you?
Jobs' greatest creation wasn't any one product, it was Apple itself, a company engineered to innovate on a regular and ongoing basis for years to come.
Honestly curious: what was wrong with it, and how do you think it could have been improved? I really enjoyed it, though admittedly I'm a complete fanboy.
I'm a fanboy, but thought the book went far too deeply into personal relationships, lacked objectivity, and diminished Jobs' accomplishments in an effort to make him seem more human.
I know quite a few people who run Windows on MacBooks as their primary OS. Of course, the different mac clones that run the OS on generic hardware were pretty successful too (before being shut down by Apple's legal department).
What computer would you rather use? A MacBook running Windows 7, or, say, a Lenovo ThinkPad running Mac OS X 10.7?
Gruber, like everyone else, knows that the ThinkPad is a legendary design and that there are many people who prefer it over everything else. Picking it to serve as his example of inferior hardware was his signal that only true Mac fans should read on, so I didn't. Kudos to him for letting me know up front that the rest of the article wasn't my cup of tea.
Wouldn't it be nice if my MacBook Pro wasn't... didn't... was less... I'll spare you the complaints, and the praise for the ThinkPad T-series. They could both learn from each other.
I realize it's a matter of opinion which piece of hardware is superior. That's the point. Gruber threw up a billboard in paragraph four that says, if you think it's at all unclear that the MacBook Pro is the greatest laptop design of all time, read no further. If even he doesn't think this bit of hagiography ought to be read by a broader audience, who are we to contradict him and post it to a broader audience on HN?
I think you misinterpreted it. The ThinkPad was grouped along with "top-of-the-line HTC, Samsung, or Nokia handset running iOS 5" as being the best non-Apple hardware. I think it was a compliment to the ThinkPad that he didn't enumerate any other PC laptops like he did with phones.
You missed the point of the paragraph and cherry picked a sentence out of context. The first two sentences of the paragraph you cite are:
"For me, the answers are easy. It’s the software that matters most to me. "
He then pairs iOS and Mac OS software with competitor hardware because it is the software that matters most to him. He isn't using the ThinkPad as an example of "inferior hardware".
"That's the point. Gruber threw up a billboard in paragraph four that says, if you think it's at all unclear that the MacBook Pro is the greatest laptop design of all time, read no further."
The point is that we, he, and Jobs would pick our preferred software on an inferior hardware device over our preferred hardware running inferior software, not the particulars of the choice.
This doesn't require a Steve Jobs mind-meld though.
When everyone stopped buying NeXT's elaborately designed hardware, Jobs slapped the OS onto beige 486 clones faster than you could press the turbo button.
Perhaps. But that flexibility is a tradeoff. That's the point here. The reason the pieces fit so well together is that they were designed to fit each other without regard for compatibility with a larger ecosystem (I'm being deliberately vague about whether I'm talking about the relationship of hardware to its case, of software to hardware, or of subsystems to each other). If you want flexibility, then you have to accept some wiggle room in how the pieces work.
The particulars of the choice are telling. The context called for a piece of clearly inferior hardware, and he picked a design legend. There are many outstandingly crappy examples of laptop hardware, and instead he picked a polarizing but very highly regarded design. He did it on purpose -- you don't accidentally choose the only laptop on display at the Museum of Modern Art as your exemplar of bad design (though, if you're a bit arrogant and poncey, it's a clever mistake to pretend to make). You do it to polarize the debate and filter your readership to the folks who are likely to agree with you.
"The context called for a piece of clearly inferior hardware"
No, you misunderstood the article. The context called for a piece of SUPERIOR hardware. Your entire complaint here is based on a fundamental misunderstanding of the point Gruber was trying to make.
What scratches? My five – soon six – year old MBP doesn't have any scratches. It has dents, and some sort of strange chemical reaction has happened where my palm usually rests, leaving a spot with small black pits.
Aluminium is prone to denting. That’s what you have to worry about.
Isaacson leaving those Gates quotes unremarked upon doesn't imply agreement; he's just relaying interesting details.
For example, the post-NeXT-acquisition rant, which comes by way of Amelio, is effectively refuted by the whole life story that follows. So there's no need to spoon-feed a conclusion to the reader ("look how wrong Gates was!"). Everyone gets it just about as well as Gruber does.
It does imply agreement if it's the only viewpoint being presented.
When he says that OS X used "some of the software that Apple had bought from NeXT", that's not a quote from anyone, and it's still wrong (or grossly misleading at best, given that the main reason to buy NeXT was to get the operating system). He could have asked anyone familiar with the topic and gotten the correct answer: OS X is a direct descendant of NeXTStep.
It does imply agreement if it's the only viewpoint being presented.
Not at all; the entire rest of the book demonstrates the truth more richly than any sort of immediate-pairing-with-an-alternate-take would. It's not a compact newspaper story or a children's textbook: take it as a whole. Does it demonstrate the truth of Gates' quotes? Clearly not.
Even where Gates says, "let's be frank, the NeXT OS was never really used", I don't see that as being presented as gospel by Isaacson. It's just another accurately quoted viewpoint. (The presence of puffery like "let's be frank" and weasel words like "really" is a clue to any reader that this assessment is very perspective-dependent.)
Gruber is probably right that Isaacson doesn't quite appreciate software or NeXT's technologies. I think Gruber was also right to refute the Gladwell 'tweaker' label, interpreted from Isaacson's work. But Gruber is wrong that leaving Gates' quotes dangling at the end of "this section of the chapter, with no additional commentary" leaves the average reader "to believe that the above is an accurate description of Apple’s NeXT acquisition." The average reader knows it's just an accurate quote of Gates' opinion, to be interpreted along with all the other info in the book, before and after.
Can you point out to me how the rest of the book demonstrates the truth about the origins of OSX? My impression is that Isaacson just got this wrong. The quote is just part of that. Let a random person read the last few pages of chapter 28, and I'd think that they would draw the same conclusion.
I'd recommend just listening to Siracusa's podcast to get all the details on what he got wrong, from small nitpicks to big issues.
If you liked this piece (and didn't like Isaacson's book at all), don't miss John Siracusa's great critique of the book on Hypercritical, episodes 42 and 43. Well worth listening to...
The rhetorical questions at the start of this article were easy for me too - but surprisingly, they were the complete opposite of Gruber's. I primarily run Windows 7 on my MBA.
After listening to Hypercritical's take [1], I have to agree with Gruber on this one as well. Some of the errors in this book aren't your run-of-the-mill misinterpretations or things lost in translation. They are glaring, fundamental errors regarding how Apple was run as a company, Steve Jobs himself, and the people in his life.
When you write a book about a technology giant's CEO and you can't even get the name of the company right ("Apple Computers"), you have to wonder what else is wrong.
Allow me to disagree. Apple is a systems company. They, of course, use software and hardware, but those are made to match each other. It's also Jobs' company - and it is what NeXT was probably meant to be.
If software were the only priority, OSX (and iOS) would be more modular, easily customizable and extensible - and it would be much more advanced than it is and than what its Unix roots allow it to be. And it would run on PCs since the 286 days (maybe with a decent graphics board). If hardware were the priority, they would have designed their own CPUs, embedded memory management functionality within the memory itself. By now, you would probably be able to SHA1 a block of memory without it ever touching the CPU data bus.
Much like a glass cockpit of a plane or your in-car entertainment system, you don't care what OS it runs or what types of CPUs are built into it. A Mac, an iP*d or an iPhone are devices you buy to cover a specific need - you want to write, crunch numbers, make phone calls, read books, listen to music, even write software... Of course, Macs are more flexible and allow a lot of customization, but it only goes that far. If you boot a Mac with Linux or Windows, is it still a Mac? Hasn't it lost something in the process? If you install OSX on an HP Envy, is it a Mac?
Jobs was a very flawed person, but he also saw differently, and did a lot of amazing things less flawed people failed at.
> If software were the only priority, OSX (and iOS) would be more modular, easily customizable and extensible - and it would be much more advanced than it is and than what its Unix roots allow it to be.
OS X is very customizable, and you mention the proof of this yourself: iOS. Apple was able to take the fundamentals of OS X, adapt and maintain them within a few years, and move code from iOS back into OS X. I don't think the evidence supports your premise here.
> And it would run on PCs since the 286 days (maybe with a decent graphics board).
NextStep did run on Intel processors, from the get-go.[1]
> If hardware were the priority, they would have designed their own CPUs,
Starting with the iPhone 4, Apple did just this with its A4 chip.
> NextStep did run on Intel processors, from the get-go.
No. It ran originally on Motorola 68K processors (030 and 040) on NeXT's own hardware. It was then ported to other platforms.
> Starting with the iPhone 4, Apple did just this with its A4 chip.
But Macs ran PowerPC (which was heavily influenced by Apple) and then switched to commodity Intel processors. Apple did, for some time, design its own exotic hardware, but that didn't go much beyond rendering expansion very difficult.
In retrospect, the biggest problem I had with Isaacson's book was that he really seemed to dumb down his subject. I realize that Isaacson may have had to do this to appeal to a non-tech audience and to fit an entire complex lifespan into one book, but the result is that Jobs becomes a flat cartoon character of sorts and everything gets oversimplified. And maybe that's what Joe and Jill Average want to read, but as a fanboy and geek it left me feeling a bit empty and uninspired.
>Isaacson clearly believes that design is merely how a product looks and feels, and that “engineering” is how it actually works.
The author doesn't seem to understand that Isaacson isn't writing for a HN audience. To the vast majority of people "design" does mean only aesthetics, so the author is to some extent justified in following the same route.
Same thing with the de-emphasis of software. It is pretty much impossible to explain in words, to a non-programmer audience who may never even have seen an Apple device, why a certain piece of software is good. I'd have glossed over software too, especially since everyone associates Apple with brushed-aluminium hardware anyway.
Just because some aspect isn't discussed in the book doesn't mean the author is ignorant of it.
These types of books are meant for mass-market entertainment, not a technically literate HN crowd. Of course the book fails miserably if you measure it against the wrong bloody benchmark. And yet, somehow, after pages of doing exactly that, the author manages to highlight his own mistake in the final two sentences:
>Isaacson’s book may well be the defining resource for Jobs’s personal life — his childhood, his youth, his eccentricities, cruelty, temper, and emotional outbursts. But as regards Jobs’s work, Isaacson leaves the reader profoundly and tragically misinformed.
" To the vast majority of people "design" does mean only aesthetics, so the author is to some extent justified in following the same route."
If he's writing about Jobs, then it's pretty much inexcusable to fail to put it the way Jobs would, rather than how "the vast majority of people" would.
I've read the book in its entirety, and while I don't dispute the inaccuracies in some of the technical details covered, I don't really think they're all that important. As a reader, I was not interested in the technology aspect and technical details: that is already well documented. If that is what you're looking for as a reader, then this is not the book for you.
Isaacson was the perfect writer for this biography, in my opinion, thanks to his lack of technical knowledge. When you know the technology, it's easy to get lost in the things that don't matter. Isaacson has a fresher and often more objective perspective than any tech writer could. The details surrounding which kernel was used in Mac OS X and how much of it NeXT was responsible for really don't bring much value to me as a reader. Like I said, if I cared deeply about this, it's well documented already and easy to get from other sources.
What I got out of the book was a remarkably intimate look at the man himself: What made him tick, what his philosophies were, what the politics were and what the major obstacles were that he had to overcome. All of this, wrapped in an enthralling narrative and surprisingly intimate detail.
Isaacson may not have understood the technology, but he definitely understood Jobs' humanity, or sometimes lack thereof.
But why couldn't we have had _both_? The technological history of Apple after Steve's return is very poorly documented, due to the company's secretive nature.
Even so, Isaacson did a really poor job of analyzing Steve's personality, and he never really confronts him about it. The closest we get is Steve saying "well, that's just who I am", and a theory by Ive (or Hertzfeld?) about Steve's motivations for being so cruel at times.
Couldn't Isaacson have confronted Steve about this theory? How about asking him how it fits with his relationship to Buddhism? Or maybe that would be too technical..?
Isaacson said that the book "wrote itself". As someone else pointed out, books rarely make good authors.
This strikes me as a bit hand-wavey ("don't pay attention to the glaring technical inaccuracies that somehow made it into the 'official' biography of a technical icon; focus on the human story"). It is certainly possible that Isaacson got the "human" side right, but when you're talking about a human whose iconic achievements are inseparable from the technologies he created, that's a pretty weak defense.
> But, as a thought experiment, which is more important to you? What phone would you rather carry? An iPhone 4S modified to run Android or Windows Phone 7? Or a top-of-the-line HTC, Samsung, or Nokia handset running iOS 5?
This is a fascinating question to me, because though I agree with Gruber on preferring OS X on the PC hardware (for now anyway, at least vs. Windows rather than Linux), I think I actually would prefer Android on an iPhone. My biggest gripe with Android is the shitty hardware and the seeming inability of any manufacturer to make a touchscreen that is not glitchy as fuck. When it comes to software, I concede iOS has more polish, and there tend to be better-designed apps. But on the other hand, Android has the more powerful apps. For instance, I use DoggCatcher for podcasts on Android, and I've tried a half-dozen iOS podcast apps, many of which are more elegant but extremely underpowered feature-wise. Apple's philosophy of having only a home button is elegant and serves discoverability, but I don't think it's inherently better, and for power users I think it can be a disadvantage.
I'm a big fan of Apple's products. Have been since the Apple II. But when I look dispassionately at the core capabilities, I do not see uniform excellence.
Apple clearly excels at: marketing/brand, hardware, and partner/supply chain management. But Apple's software quality is all over the map. Further, Apple does not "get" the internet (and never has).
Since "hardware vs software" was the focus of JG's post, I'll briefly state my case around those two elements.
Apple gets hardware. I doubt that anyone would argue otherwise. Fabulous objects-of-desire emerge from amazing industrial designs. I can't even think of a laptop I'd consider in the same league as the Air. Ditto the iPad, iPod, and Airport. (The iPhone is in a much closer race with the Samsung gear.)
Apple also sports price-performance advantages in certain key areas. iPods have held more memory per dollar since the earliest days of MP3 players, for example. HP was unable to match the iPad. And now the Air and other "computer products" have closed the gap. This is an under-appreciated aspect of Apple's game.
Apple sometimes gets software too. I personally loathe the one-button, ultra-modal aspect of iOS. But the myriad brilliant features (e.g., pinch, scroll, etc.) blow me away, in both a design and an execution sense. Apple is great at UX-in-the-small. But at the application level, things aren't so balmy.
iTunes and its syncing model are frustrating at best. Mail, iCal, and Address Book are only now getting better than (elegant) toys; all three have had serious bugs for years. iWork? Forget it. App uninstall is incomplete, leaving many remnants. OS X's underlying file system is a joke, as is MacPorts. Lion's desire to mimic iOS is frustrating at best.
Apple is the greatest show on earth based mainly on their brand development and their ability to produce must-have objects.
Oh, and I'd (reluctantly) take Win7/Air and iOS/Samsung, based on the strengths of the hardware in each case. A split decision.
What computer would you rather use? A MacBook running Windows 7, or, say, a Lenovo ThinkPad running Mac OS X 10.7?
I'd take the ThinkPad running 10.6 thanks. 10.7 is a total clusterfuck. It pisses me off (a seasoned developer) and it confuses the fuck out of my wife (who isn't).
Not related to the article, but this is what I had to say to my dad after I lent him my copy: writing biographies about living people is weird. Writing a biography of someone who asked you to is F*ING weird. Apparently Isaacson's other books are better (though I haven't read them myself), but I'm sure the future holds some better-researched, if not much less personal, bios.
I am disappointed with Gruber's "shades of grey" conclusion. He is of course right that there are good parts as well as bad parts. Maybe I am too "Jobsian", but to me that equates to total crap. The abysmal failures of the book leave room for someone else to write the definitive biography. Isaacson had his chance and he blew it. He sold a lot of copies, but the people who misunderstood Jobs won't be the ones spending their time telling the next generation about him. A hundred years from now, the book people quote regarding Jobs will certainly be written by someone who properly understood the man. Someone who writes that "greatest book ever". This superlative attitude might seem overblown in everyday life, but it's what society values. Second place is, in the end, the first loser (or at least the first forgotten).
And that author could be one of us. It can only be someone with the perspective to set it straight. It certainly won't be a writer thinking more about himself than his subject. In history, perspective matters more than profession.
I don't think there are such things as 'the definitive biography/history/story'. All such endeavours are filtered tellings of actual fact through prisms of bias, limited knowledge and the limitations of condensation.
There is a limited amount of attention that we pay to events of the past. Eventually one of them wins, and the others are mentioned only when the first one is notably wrong. There will be other biographies written, and at least one of them will be better than this one.
As I read it, Gates is defending his statement that buying NeXT was stupid given the facts known at the time the deal was done. He does so by downplaying the NeXT software lineage in Mac OS X, but also by claiming that the real gem Apple got from NeXT was Steve Jobs, whom Amelio couldn't have known would go on to be a great CEO, because he was well known at the time to be a maniac.
It seems like something Gruber would agree with if phrased slightly differently (e.g., "the most important thing Apple got from NeXT was Steve Jobs"), so I don't know why he's getting so bent out of shape about a quote from another book, one whose truth Bill Gates himself immediately questions in the Jobs bio.
(And is it just me or is it a stretch to attribute the iPod interface to NeXT? Choosing an item from a list and going to a sublist isn't something I remember them inventing.)
He doesn't have a problem with Gates's statement; he has a problem with Isaacson's lack of research. Gates didn't write the book, Isaacson did. This is just one example where Jobs says something true, Isaacson thinks he's lying, and he instead trusts someone else who is either lying or doesn't know the truth.
Who is Isaacson trusting? Bill Gates? If so, why publish text whose truth Gates himself disputes? Amelio? If so, why publish the fact that Gates disputes that version? It sounds like he doesn't really "trust" either account, and why should he?
Gruber presents this as the best example of Isaacson not trusting something true Jobs said and "trusting" the lies of others (as do you), but really it's just nerdy nitpicking about how far you should emphasise and editorialise the subjectivity of third-party accounts. There's no quote from Jobs being disproved in this example; it's just Amelio (who in the anecdote had just picked NeXT over NT!) describing Gates's angry reaction at the time, and Gates commenting about it later. Gruber's reaction is a total non sequitur. He can't cope with an angry outburst by a rival that's just lost a business deal being left unchallenged, when the context is clear.
This whole "what do you think Steve Jobs would have done" is beginning to look a lot like a "What Would Jesus Do" kind of following. I get it that he was a visionary but come on…
I'm critical of John Gruber on many topics, but in this essay he is exactly right. Further, there are many examples, just like the Bill Gates one, where Jobs says something that is true[1] and Isaacson assumes it's the "reality distortion field" and then quotes someone else, like Gates, who has an agenda, telling a lie as "proof" that Jobs is lying.
Let's talk about this "Reality Distortion Field". People claim that Jobs can make you believe things that aren't true by simple application of charisma. Is anyone here willing to admit to being swindled in this way? I am not. I am not aware of Jobs ever saying something that was actually false (though I'm quite aware of the manifold lies told about Jobs.)
For instance, remember the introduction of the iPhone? How about the introduction of the iPad? Everyone here should be old enough to remember one or both of these keynotes. Surely Jobs's "Reality Distortion Field" would be deployed to maximum effect at such keynotes-- and after both of them I remember much derision, and claims that Jobs had the RDF on maximum, that those products were going to be complete failures, that everyone needed a keyboard on their phone, that the iPad was a terrible, terrible name inspired by feminine hygiene products, etc. etc.
If you go back and watch these, can you find a single lie? Can you find any reality that was distorted? Sure, Steve Jobs called the iPhone revolutionary. That's obviously a characterization based on an opinion, but that opinion seems to have held up-- before it, there were only feature phones, really, and now every phone that isn't an iPhone is some sort of iPhone counterfeit (e.g. it has a touch screen). It clearly revolutionized the phone category, and created the app ecosystem. Similar things happened with the iPad.
Because Apple is successful, and because Apple does things its own way, people feel the need to attack Apple. And of course, they attack Jobs.
Most of these attacks have clear motivations-- people who bought another product and want to feel it is superior, or people who work for a competitor, or-- and this is the biggest source, I believe-- hack journalists who want to create a sensational story. (I still remember a claim that Apple switched from ATI to NVIDIA chips in laptops the week before they were announced because of a leak from ATI... as if Apple could even do that so quickly for a product that was about to ship.... but people believe it. The story was "Steve got really mad and now the new MacBooks will ship with NVIDIA chips!" I know for a fact this is false because you can't change production that fast... but people believe those kinds of lies.) After all, they've been told for years that Steve Jobs is an asshole, and, despite his never showing this side of himself in public, they believe it.
[1] True either because I witnessed it, or because I'm more informed on the issue than Isaacson is. I've been an Apple watcher for 20 years, and I have noticed that much of what people believe about Apple is based on oft-repeated myth without substantiation in fact. I remember Apple trivia fairly well, and the specifics of things that often happened before the people writing about them now were out of grade school. (E.g.: just this weekend I read in "Inside Apple" the long-refuted claim that Apple "stole" Xerox technology for the Mac. Amazing kind of theft that was-- Apple paid for a license to use that technology with stock which, if held to the present, would be worth billions of dollars. Quite the heist!) Another example: for quite a while there, many Windows fans believed that Bill Gates owned Apple, because to them $150M was a big "investment" and they thought Microsoft bought Apple in 1997. (They didn't know that Apple already had a lot more than that in cash, and that part of the deal-- the bigger part-- was burying the hatchet on all the patents Microsoft was violating, to the tune of several billion dollars a year paid by Microsoft to Apple for several years. This latter bit was reported, but kept quiet because Apple didn't care and Microsoft wanted to save face... so it's not widely known.)
> I still remember a claim that Apple switched from ATI to NVIDIA chips in laptops the week before they were announced because of a leak from ATI... as if Apple could even do that so quickly for a product that was about to ship.... but people believe it. The story was "Steve got really mad and now the new MacBooks will ship with NVIDIA chips!" I know for a fact this is false because you can't change production that fast... but people believe those kinds of lies.
You're remembering the story wrong. It wasn't about laptops, and the switch took 6 months rather than a week. What happened was that in July 2000, when the Mac G4 Cube was about to be launched, ATI accidentally pre-announced the product with their press release about their Radeon chips being used in the Cube.
Six months later it was time for another MacWorld and another set of Mac updates. All the new G4 Macs used NVIDIA graphics boards instead of ATI. This was the first time in several years that Apple wasn't using ATI GPUs at all, and people speculated that this change might be due to ATI's PR slip-up.
Your posts in support of Apple are so breathless. I don't want to go all ad hom, but this is a little...much.
In any case, as a long-time Apple observer, you'll remember that the RDF was coined by Apple engineers (probably Bud Tribble, at least according to this story). It was at once a characterization, an expression of admiration and, yes, frank criticism.
"Your posts in support of Apple are so breathless. I don't want to go all ad hom, but this is a little...much."
At the time of writing, nirvana has written comments on this post totalling 3931 words, most of which defend either Apple or Jobs. Don't play chess with pigeons.
"Your posts in support of Apple are so breathless. I don't want to go all ad hom, but this is a little...much."
You're talking to the person, rather than to the point. I'm not the point.
"Steve Jobs was a man, and that's OK."
Translation: Whenever someone says something false about Steve Jobs, the only reason someone might correct them is because they're a koolaid drinking cultist who cannot tolerate the idea that Steve Jobs was anything other than the second coming of christ.
Yeah, I'm glad you didn't "go all ad hom".
Edit:
I'm sorry if this post feels like it is making you the point. It actually is not intended to be that way. I'm trying to illuminate the tactic. Just as I'm trying to illuminate a tactic in my original post, and Gruber is illuminating the tactic Issacson used to discredit Jobs. The fact that the RDF was coined during the Mac project by the Mac team was known to me, and is a very different use of the term than the popular one I'm addressing.
> Translation: Whenever someone says something false about Steve Jobs, the only reason someone might correct them is because they're a koolaid drinking cultist who cannot tolerate the idea that Steve Jobs was anything other than the second coming of christ.
Not only does that not even slightly resemble anything I wrote (let alone what I think), frankly it's a ridiculous strawman that you shouldn't even employ for rhetorical points. Do you really not see any irony in the stated purpose of your posts in this thread and the content you've filled them with?
My point was only that yes, there was a very real basis for the RDF talk, and that yes, it appears that Steve Jobs was definitely problematic to work with and for. I've had that impression for a long time, especially since reading all the Folklore stories (the treatment of Jef Raskin was particularly unfortunate, whether or not the Mac ended up a better product for the changes to the project).
My understanding is that lots of people still loved doing so.
You asked
> People claim that Jobs can make you believe things that aren't true by simple application of charisma. Is anyone here willing to admit to being swindled in this way?
They might not be here, but the stories are right there. If you have a point other than that someone on the internet who isn't me is wrong, and you know it because you've seen a lot of keynotes in your time, let's talk.
I'm a fan of the work of Steve Jobs. He was not a magician, though, because there's no such thing. Personally, I think lionizing him reduces his accomplishments.
As for Isaacson's book, I read a few of the excerpts as they came out and they seemed like terrible writing, so I haven't read any of it since. Same reason I skipped Twilight.
You made those points sufficiently with your citation of the origin of the term, which I didn't dispute. It is your comments about me that are problematic, and you've continued them here. Since you spend more time talking about me than the "only" point you wanted to make, I'm not ashamed of pointing it out.
"You're talking to the person, rather than to the point. I'm not the point."
You actually are an expression of the point, though. Most of your posts in this topic are as exemplary of resonance with the RDF as any I've ever seen.
Like the OP, I've been an "Apple watcher for 20 years", and it's always funny to see people harping on this Xerox transaction while missing most of the actual facts.
For what it's worth, Microsoft gave Xerox money too, and also hired key people like Charles Simonyi from PARC. So it wasn't exactly like the movie where Bill Gates was yelling "I got the loot, Steve! I got the loot!"
Yep, that made-for-TV movie has really ruined the debates on how the events actually unfolded. It was a pretty terrible movie in a lot of ways, and it got a lot of facts wrong.
Part of Xerox's problem suing Apple would have been that it got its ideas elsewhere too (e.g. Alan Kay "discovered" OO programming after reading the source code of a Simula compiler; most of the GUI concepts came from Engelbart's lab).
> I am not aware of Jobs ever saying something that was actually false (though I'm quite aware of the manifold lies told about Jobs.)
"You don't want a radio in your portable music player" was pretty damn false. Another is "people don't want porn on their machines" - like it or loathe it, the popularity of porn is pretty clear.
There have been a few things over the years that Apple told us "aren't in our interest" or that "we don't really want", which either are things we want, or suddenly become "you want this" right after Apple starts providing that product.
Then there's the whole walled-garden thing, which isn't really about what users want (as it is presented), but about Apple wanting to shape users' expectations into something they can make money from. One clear example of this is the Mac: once upon a time it was "any colour you want, to suit who you are!", now it's "you get one choice, regardless of who you are!".
Mostly the 'lies' are just regular marketing stuff (and Apple does marketing well), but to paint Jobs as some uber-honest man is doing everyone a disservice.
I've never interpreted the claims of "Reality Distortion Field" as being part of his marketing spin. I'd always read about it as a management style, that he used on his team (internally). (This seems to be backed up by his recent biography).
Perhaps in the popular press the RDF has been bandied about as a name for his marketing spin. But popular press say a lot of things...
The RDF as a management tool is not unique to Steve Jobs; there are many charismatic people around who can make their teams really believe in a mission without any fact at all to back up this belief (most cults start this way).
I'm accepting your first part for the sake of argument so we can focus on the second part:
"there are many charismatic people around who can make their teams really believe in a mission without any fact at all to backup this belief,"
I believe your perspective is in error when you get to the part about "without any fact at all to back up this belief". I believe that Steve Jobs was certainly charismatic, and probably one of the great people at getting his team to believe in a mission-- but I don't think this is "reality distortion", nor do I believe it was "without any fact at all to back up this belief".
In fact, I believe the reason he was so good was because he did have facts-- facts that the mainstream may not have been aware of-- but that were true. The thing is, many people still dispute these facts. (Eg: "The iPad is just a big iPod touch" disputes the killer app of the iPad, but the reality of iPad sales shows that they were wrong.)
Let's take some key products where Jobs got his team to believe in a mission to make something that was significantly different:
The Macintosh, NeXTSTEP & the iPhone.
For the Macintosh:
The facts: Most computers were difficult to use. Apple had strong experience with this from the Apple //, which was command-line based. The Mac team went to Xerox, saw some of the key technology working, and saw how it was more efficient (technology that Apple had a license to as part of the deal). Another fact: the Apple // was a very integrated computer for its time, but a competing company (I forget its name) had gone one step further and integrated the monitor with the computer. Thus the mission of the Macintosh-- an integrated computer with the footprint of a phone book, sold like an appliance, that anyone could use because of its GUI-- was not a distortion of reality, nor did it lack "any fact at all" to back up the mission. All the key elements existed elsewhere, though of course the schedule was completely unrealistic (but back then the fact that software is always late was not as widely accepted as it is now.)
NeXTSTEP:
The facts: Unix is powerful, multitasking is powerful. Object-oriented software allows for component re-use. The mission: build a Unix workstation at reasonable cost that allows for rapid application development using object-oriented software. True, NeXTSTEP was the first OO operating system (like the Mac was the first real GUI), so there was some leap of faith in thinking they could do it or that it could be successful, but this was not based on a distortion of reality. Pre-emptive multitasking is really useful, OO can allow for code re-use, and in the NeXT environment (and now OS X and iOS) it really is a force multiplier for developers. I don't see how he distorted reality or the facts there-- except, again, he set a deadline for delivery based on the fact that they were a startup. The deadline was unrealistic, because software takes too long, and they missed it.
The iPhone:
The facts: The phone market was a mess. People hated their phones. (I did some research in this area, and found the churn rate was something like 83% and the dissatisfaction with one's phone was something like 70%, though I may have those numbers reversed.) The software market for phones was locked down by carriers. The interfaces were terrible-- often just a numeric keypad, and if you had a full QWERTY keyboard it made the phone unwieldy. A touch interface would be better, obviously, right? Well, Apple bought Fingerworks. They knew touch interfaces could work because Fingerworks invented them. People hating their phones, the software being locked down by carriers, bad interfaces and limited usability due to physical keyboards are all things that you can't really dispute. There was a leap of faith in believing a completely touch-based phone would work, and they spent many years working on it (and the iPad project, which was started earlier). And again, the timing for when they thought they could ship it was unrealistic, and they had to bring in engineers from the OS X side of things to make their date. Did Jobs distort reality to get the team to work on the iPhone? I don't see why we should believe that. Did he get them to work on a mission without "any facts at all" to back up the belief that it could work? I don't think so-- that the phone industry was broken was obvious to a lot of people. I myself worked on a completely voice-driven phone project in the late 1990s, but stopped due to being unable to get sufficient horsepower in a battery-powered device to do the voice recognition.
In all three cases the market need was pretty clear. The technology precedents were visible. Both of these are facts that back up the belief in the mission. Neither of these rely on a distortion of reality.
All projects for new products require some faith. But getting people to believe something is possible, even when it hasn't been done before, doesn't necessarily mean doing it without any facts, and in these cases the facts to support the project were there.
If his crime is making people believe that the software won't take as long as it actually does, I can't fault him, and to be honest, he seems to be no worse in that regard than any manager I've ever had. (many of whom were deliberate about it.)
At Microsoft, for instance, when I worked there it was common practice to name the next release of Windows something like "Windows 93" so that the employees all knew it had to come out in 1993, even though management knew it wouldn't be ready til 1997. Didn't want them to slack off thinking they had 4 years to get it done!
... Okay. I'm pretty sure the "Reality Distortion Field" was coined by people on the Macintosh team[1]... to describe exactly what I was talking about with regard to Steve Jobs.
Sooo, now I just feel awkward responding to the rest of your post...
You made a specific allegation, and I provided three examples with a plethora of historical and technical facts that refute it. I've long been aware of where the term originated; the common use of it is not based on that original meaning (it was at least in part an honorific), and I was responding to the popular use. Further, you were not using it as an honorific either, but broadening it out to a very wide audience, to anyone who persuades a team to believe in a mission "without any facts at all to back it", so you obviously weren't using it in the same way that the Macintosh team was, and are simply now attempting to backpedal in the face of a fairly extensive refutation of your claim.
Eek, no. The awkwardness comes from empathy reading your responses. I don't quite know how to respond (in honest bafflement, not backpedalling or because I feel somehow convinced otherwise).
The RDF (as coined by the Mac team) was a very specific thing meant for Steve Jobs. I was more expanding on the fact that it isn't a unique skill/phenomenon and others use it.
Volpe-
I apologize, as it appears I completely misunderstood what you were saying when you talked about awkwardness. Further, it appears that we're not too far apart in our perspective, and that you were making a characterization in general that I interpreted as specific to Jobs. I missed that you were springboarding from what I said to a broader comment, and thought you were disagreeing with what I said and making a specific comment.
I think I agree with your broader comment, and agree that in the extreme it can be a dangerous talent.
You can look up some first hand accounts about the reality distortion field from folklore.org. Andy Hertzfeld has no secret motivation or an axe to grind.
I've read it extensively. The thing is, people act as if Jobs is the second coming and is, was and always will be the same person. He was young once, just like everyone else. Back in the Macintosh era, he often would come around to another person's way of thinking without realizing he'd co-opted their idea. (Or maybe he realized it and was too insecure.)
But honestly, which is the bigger reality distortion:
Jobs saying that the iPhone is "revolutionary".
Or Bill Gates saying that none of NeXTSTEP made it into OS X?
Why is Jobs confusing his opinion with reality such a crime, while flat-out fabrication is given a free pass?
It is far more common for people to criticize Jobs for "reality distortion" than it is for people to criticize Gates for telling lies (and Gates is almost pathological-- for years he managed to kill companies by lying about what Microsoft was going to do.)
The problem with your statement is that Isaacson interviewed Steve Jobs, his family members, and all his closest work associates and friends. His statements can't be dismissed as "lies about Steve Jobs" just because you or John Gruber don't like them or think so. What is reported in the biography comes from multiple eyewitness accounts of the people and events. What you are basically doing is libeling (since it is in written and not spoken word) Isaacson by accusing him without proof that he misreported events in his book. Please keep in mind that Steve Jobs specifically sought out Walter Isaacson to do the biography because he believed that he would tell the story completely and honestly, which is what Steve wanted.
"His statements can't be dismissed as "lies about Steve Jobs" just because you or John Gruber don't like them or think so."
Neither I, nor John Gruber have done so.
"What is reported in the biography come from multiple eye witness accounts of the people and events."
Really? Can you name any of these people who have witnessed none of the NeXT OS making it into OS X? Surely, there must be hundreds of Apple engineers who would have witnessed this. Surely there would be lots of corroboration. In fact, it should be obvious to any competent engineer who looks at the technical documentation about OS X or iOS.
The reality is, it is obvious to any engineer who looks at this technical documentation. The OS is completely NeXTSTEP; in fact, the classes still carry the prefix "NS", which is short for "NeXTSTEP". It's "NSScrollView", not "APScrollView" or "MCScrollView", etc.
If you review the technical literature, you'll find that OS X is more accurately described as NeXTSTEP with a Mac UI on top of it, than "Mac OS with a NeXT kernel" as Bill Gates describes it.
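To make the lineage concrete, here's a minimal sketch (in present-day Swift against AppKit; the frame size is arbitrary) showing those NS-prefixed classes in ordinary use:

    import AppKit

    // "NS" as in NeXTSTEP: the core view classes still carry the prefix.
    let scrollView = NSScrollView(frame: NSRect(x: 0, y: 0, width: 320, height: 240))
    let textView = NSTextView(frame: scrollView.bounds)
    scrollView.documentView = textView
    print(type(of: scrollView)) // prints "NSScrollView" -- not "APScrollView"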
But maybe I'm completely delusional. I've only been working with this software for 20 years. Please, show us some of these eyewitnesses to the fact that the NeXT operating system wasn't used.
"What you are basically doing is libeling (since it is in written and not spoken word) Isaacson by accusing him without proof that he misreported events in his book."
Well, the truth is a positive defense against libel, isn't it? The fact of the matter, as any competent engineer can confirm for you, is that OS X is essentially NeXTSTEP, evolved of course over the years, with a Mac-like UI on top. It got the menu at the top from the Mac, but it carried over the Dock from the NeXT days. The kernel, the frameworks, the operating system-- everything essential was NeXTSTEP. And most of the "Mac" things really were a new UI-- an evolution of the Mac UI called Aqua.
I don't know how many more examples to give you, but it's clear that for you to believe that I'm "without proof that he misreported events in his book", you're going to have to offer something more than assertion. I've cited many areas of the operating system that are direct descendants of NeXTSTEP and NOT of Mac OS. You've given none.
"Please keep in mind that Steve Jobs specifically sought out Walter Isaacson to do the biography because he believed that he would tell the story completely and honestly, which is what Steve wanted."
Yes, and I want you to please keep in mind that this was not the first time that a hack reporter that Steve trusted betrayed him because he wanted to sell more units of his writing. I don't know if Isaacson was the most honest biographer-- in some areas he was quite correct, accurate and fair, though this mostly seems to be because Andy Hertzfeld did the heavy lifting.
In Steve Jobs's earlier years he was more immature, more problematic and probably a lot less likeable. Notice also that you've not heard a peep from me (or Gruber) about an unfair portrayal of that time period.
The fact of the matter is that these kinds of lies-- like the claim that NeXTSTEP wasn't used, or that Apple stole from Xerox -- are repeated as articles of faith by people who wish to attack Apple. Yet they are very trivially refuted. And when refuted, the response is to pretend like the refuters are just unhappy that someone said something bad about Steve.
Well, lots of bad things were said about Steve that are true. We're not objecting to those. We're objecting to the lies.
Hell, if Steve engineered the purchase of NeXT by Apple even though NeXTSTEP wasn't useful at all, and thus had to be scrapped when building OS X (as Gates essentially alleges), that would be an example of Steve Jobs being one hell of a powerful salesman. I wouldn't even see that as something to be embarrassed about. Why would I object to that?
No, the objection is to the fact that it's a lie fabricated from whole cloth, and trivially disproven. Which both I, and John Gruber, have done.
There really is no other way to describe Isaacson's characterization of the relationship between NeXTStep and Mac OS X: It's false. OS X was not Mac OS with a kernel transplant — it was an evolution of NeXTStep with some accommodations for Mac users and developers. This is why you could get OpenStep programs running with little more than a recompile, but Mac programs that hadn't specifically been designed with compatibility in mind had to run in an OS 9 VM. The Finder is the most significant carry-over from OS 9 to OS X, but the OS X version was a Carbon program written completely from scratch to mimic the behavior of the OS 9 Finder because the OS was so fundamentally different. Meanwhile, NeXT's Mail, TextEdit and Preview chugged along just fine (not to mention third-party OpenStep apps like OmniWeb and Create).
One of the most frustrating examples of people telling lies about Jobs is the meme that he told an iPhone user that they were "holding it the wrong way" when they complained about the iPhone 4 antenna. What he actually said was (and this is the entire email): "Just avoid holding it that way."
Sure it's terse, but it's a long way from telling them they're doing it wrong. Jobs isn't known to be always nice, and given this person's situation his suggestion was the best way to fix the problem in the short term. Apple later made amends here, too; admittedly only because people complained, but if the success of the iPhone 4 is a good indication (you can decide), this was never really a problem.
There's not really much difference between the two quotes. People were pissed because Jobs's response wasn't an adequate answer to their problem.
Apple talks a lot about how "well-engineered" their products are, and yes, the 4 was cool, but Jobs's answer seemed like something from http://thereifixedit.com. Don't hold it that way? It's a phone.
People wanted a proper response and they didn't get it until they were big enough to make a big stink about it. That's not good customer service.
In my eyes there's a huge difference; they just look the same if you don't bother to listen to what he said and don't pay attention to Apple's response in the context of the way they run their business.
It wasn't a friendly response, but it was the solution to the problem. And when they only gave out cases later after complaints (announced during a special event to talk openly about this issue, after they had taken the time to look into it themselves) it was because the only problem was the media reaction. Millions and millions of people bought and loved the iPhone 4 even after they stopped giving out free cases a few months later. They all had the option of returning it for a full refund.
How many companies do you do business with that provide this level of service, big or small? I honestly can't think of any.
Big or small? There are tonnes of small companies that provide and exceed this level of service - the one I work for (14 people) is one of them. A lot of small companies rely on very high levels of service.
Another example - a netadmin for a bank told me that the first they know about a failing hard drive - even before their own heavy monitoring picks it up - is the vendor calling them up and confirming the correct datacentre to send it to.
To say that Apple is the pinnacle of service anywhere suggests that you are either inexperienced or wilfully one-eyed.
Further, notably, the 4S-- for which they could easily have changed the case design if needed-- uses this same essential external antenna design. I never owned a 4, but my 4S works fine.
Yes, it has two antennas and it also, unlike the iPhone 4, is both a CDMA and a GSM phone. Hence it needs multiple antennas for the different networks.
The antenna is still on the outside. You can still attenuate the signal by holding the phone with just the right death grip, as you could the iPhone 4, and every other phone on the market.
There was no problem to fix. This is a perfect example of how people spread lies-- with the help of media that want to latch onto anything that might show controversy about Apple-- to create these false impressions.
Reality: Every phone, if held in the right way, will have signal attenuation. This is basic physics.
Reality Distortion: There's a problem with the iPhone 4! It was fixed in the 4S!
What I can't understand is the persistence in constantly repeating these falsehoods, even after the truth comes out. I state the truth (the same design in the 4S) and then someone spouts the claim that it was "fixed" because the design did improve.
Well, every iPhone is going to improve on the last. That doesn't validate a false claim that the previous version was defective because it lacked future improvements!
With, critically, one important difference: it now has two cellular antennas. This way if one loses signal, say, due to being accidentally bridged, the phone switches to the other. This is why the gaps in the metal band are different between the 4 and the 4S. (Compare them.)
It's also why Consumer Reports endorsed the 4S but not the 4. Because it doesn't have the problem. Because they fixed it.
It is useless to argue with nirvana. He is like a conspiracy theorist. He made up his mind that the iPhone 4 and 4S have the same antenna design, and would rather declare PC World and Consumer Reports and you anti-Apple liars than admit he was wrong. This is precisely how the RDF works: the story that it's Apple against the world, and thus anything bad said about Apple is never true and is said only by haters.
> One of the most frustrating examples of people telling lies about Jobs is the meme that he told an iPhone user that they were "holding it the wrong way" when they complained about the iPhone 4 antenna. What he actually said was (and this is the entire email): "Just avoid holding it that way."
Wow. What's your take on whether Al Gore invented the Internet?
If you can look past the fact that Jobs didn't kiss the customer's ass you may realize that a CEO of a Fortune 500 company was personally responding to the customer's problem in the most effective way he could at the time.
> If you go back and watch these, can you find a single lie?
Jobs said the iPhone was the first phone with a full web browser.
Opera had a mobile browser using a proper "desktop" rendering engine (Opera Mobile, not Mini) on mobile phones before the iPhone.
Of course the iPhone was the first to add a capacitive touch screen and multi-touch gestures to the mix (rather than click-to-zoom with a stylus), and that blurry line between the stated "first full mobile browser" and the meant "first full mobile browser that we think is really cool" is the RDF.
I've attended some Apple keynotes. My first one, I was expecting to be mesmerized. Alas, Jobs was ok. Nailed the demo, said what he was going to say, nothing more, and then left the stage. A real pro. Grade 'A', but not life changing.
I've seen RDF. Ever watch a charismatic evangelical Christian minister in action? A motivational speaker? A politician with the gift on the stump?
Reading the early stories of RDF, I have the impression the RDF (charm, seduction) referred to a one-on-one phenomenon.
As for public appearances, what distinguished Jobs from his peers is execution. Sorry fellas, but most of us geeks are introverts and shouldn't be giving demos. And sales pukes who make pitches are too often clearly full of crap.
Further, most tech pitches that I've seen required way too much suspension of disbelief. Jobs @ Apple always had a clear vision he was selling. I didn't always buy in (too bad for me). But it was never like he was pulling a con.
"Lets talk about this "Reality Distortion Field". People claim that Jobs can make you believe things that aren't true by simple application of charisma."
I never saw the RDF in quite that light. It always seemed to me to be a way that Jobs could convince people that something was revolutionary, earth shattering, world changing...even when it wasn't (or at best, a well executed evolution built on previous ideas). It was the pinnacle of salesmanship -- turning customers into religious followers. Something to be admired and feared.
That's why the figurative iShitinabox was such an on-target joke. The idea was that Jobs could have put shit in a box, put a diminutive "i" before the name, and had people lined up around the block for a week before launch ready to buy it-- all swearing that it's going to change the world.
And always, nobody in that line was ever willing to just fess up that they were camped out in front of the Apple store because Jobs told them to be there and buy his stuff. On questioning, they would all say, and perhaps even think, that it was an original idea for them to go there and stand in that line -- and that they were especially smart and clever people for having arrived at that idea by themselves -- and this cleverness is supported by the other 500 clever smart people who are camped out a week before they can actually buy iShitinabox -- or at least an amazing coincidence. They're all different(ly thinking) in exactly the same way.
And then in the months following, there'll be some segment of that buying population that will, deep down, be dissatisfied in some way with their iShitinabox, but can't quite get the mental lens in focus to really notice it because the RDF has them in its grips, and they'll flood internet forums talking about how iShitinabox is the best thing in the history of things and will fight detractors to the death -- deflecting constructive criticism, covering over product flaws, giving testimonials about how their life has changed, start marginally successful businesses around the iShitinabox that would be more successful if they also sold almost their exact same product to the other 50% of the planet that doesn't think iShitinabox is a herald of the second coming.
That is what Jobs's magic power was, his Reality Distortion Field.
It's not blind loyalty, that's wrong. The iProducts really are very good-- you get a damn fine product when you buy one. But it's the religious fervor that Jobs could generate, the obedience and recitation of the doctrine Jobs set down-- bolstered by a tangible thing that you could point to.
What was it about Jobs's delivery that caused this to happen? I remember the first iPhone launch: people were lining up with stacks of thousands of dollars in cash so they could buy 20 or 30 of the phones at launch. People literally weeping in the street when the inventory was sold out and they couldn't get theirs. I've known at least a dozen people who bought every single iPod, iPhone and iPad like they were collecting Pokemon-- some even while barely making rent.
I know of at least one divorce over this phenomenon. After every keynote, my friend's husband would run out and buy pretty much one of everything that was put up for sale right after -- annihilating their savings for a new home.
I have a professor friend who literally can't control himself and spends thousands of dollars a month on subscriptions to various services and apps for his iProducts. In class, he will try and inappropriately push this stuff on his students like a born-again preacher pushing the good news.
It's not a Field-of-Lies that comprised Jobs's Reality Distortion Field, but it's not really an innocent Field-of-Dreams either.
The primary flaw with every RDF theory is that most Apple customers have never heard Steve Jobs speak. They only know new products are coming out when they see them on the shelves or when news hits CNN or by word of mouth. And if the media and word of mouth are so resoundingly bad at reporting on tech, how are they supposed to be so good at propagating the RDF?
How in the world could the RDF be so powerful as to bamboozle users over the utility and novelty of features, when most of the people buying new Apple products don't even know what the specific features of the next product are?
That's what's so damn offensive about people trying to push the idea of the RDF, even when they try to soft-pitch the idea by saying the products themselves aren't bad.
The entire theory insists that the person who just used an iPod and iTunes for two years doesn't know a thing about their own experience. That their high opinion of Apple is driven not by their actual experience but by the magical mind powers of a person they've never heard speak. That they're objectively wrong and some person who's never used an Apple product for more than five minutes is right, as evidenced by some techno-gibberish on a spec sheet. And that when the happy Apple customer decides to try a Mac, because the iPod thing went so well and the Dell PC thing went so poorly, that the only possible rationalization is that they've been brainwashed.
There was something there. It was even acknowledged in the bowels of Apple (RDF was coined inside of Apple). I'm not sure I know entirely how it's propagated, but I have a hunch it's through people who could best be called "evangelists".
They didn't know Jobs personally. But they took time off of work to watch every keynote and try and crash the MacWorld expos. They bought every sort of device they could get their hands on since the original iMac. And they insist to their nontechnical friends that the benefits of the platform ("it just works") outweigh any downside.
My nontechnical friends who have iPhones and Macs and such all seem to have gotten into the platform that way -- they all know somebody who evangelized the platform. And with the brilliant integration between the devices, once you dip a toe in, you may as well just jump in all the way.
Now, it may not be Jobs' RDF that initiated this; it may be Ive's design, I don't know. But there is a strange sort of pseudo-religious phenomenon there. You simply don't see this kind of behavior with any other technology.
> How in the world could the RDF be so powerful as to bamboozle users over the utility and novelty of features, when most of the people buying new Apple products don't even know what the specific features of the next product are?
That's the question, isn't it? Today the iProduct line is well known. You can guess, based on the prior pattern, that any new iProduct will be very good at what it does. It may even innovate in a few key areas and shake up the particular vertical it launches into.
But remember when the original iPhone came out? Or the iPad? I don't know of anybody who actually knew anything about them, but they sure as hell were motivated to stand in line for 8 or 9 hours to get one. Why? I sure as hell don't know. They all already had phones, so what was it about the iPhone that they just had to experience? There really wasn't any prior history there, and the original phones were pretty expensive for the time. I'd even argue that, to a rational observer, Apple's near-death not long beforehand would dissuade somebody from spending that kind of money on a device from a company that has had a very shaky history.
So what was it that moved people to empty their bank accounts and try to buy a dozen original iPhones? The iPod didn't have that sort of mania attached to it.
Why are they all there? What would it hurt to wait a couple of days until you next swung by the mall? Why are they hi-fiving and jumping up and down like they just won the lottery? What great thing did they just accomplish?
Remember, at this point in time (June 2007) all they know they've accomplished is to wait in line to buy a phone from a not-very-successful computer company whose other non-computer devices were by and large complete flops (sans the iPod). Remember the Apple printers? The Newton, the Pippin, the QuickTake, the Hi-Fi? In June of 2007, Apple was rumored to be up for sale (with Google a possible buyer), and the stock wasn't performing well.
Outside of the keynote and some coverage in the major media, there was literally no information on this device up to this point. Nobody in that line had ever used one, most had probably never seen one in real life, there were no owner testimonials, early reviews weren't exactly glowing-- this could have just been a weird Apple curiosity like the QuickTake.
But all of these people, at sites all around the U.S., lined up in massive groups to buy, sight unseen, and hi-five each other over, a device that they already pretty much owned as far as they knew.
The RDF is amazing, it's real and it speaks to its power that those most affected by it seem to not even be aware of its existence.
Maybe because it's a 'social event', like lining up to buy concert tickets, or a new video game, or going shopping on Black Friday, or getting to an amusement park early to be first in line for their favorite roller coaster.
Regular people do those sorts of things for products and experiences all the time and no-one feels a need to invent an "RDF" to explain it.
If you want to say "Jobs was persuasive" in meetings, that's one thing. What people who've worked at Apple and seen keynotes comment on and call an RDF is basically that. There is no doubt that he was a charismatic guy who prepared and presented well. But that's a far cry from saying it's at all reasonable to assert that his charisma motivates any statistically significant slice of people to buy Apple products without any regard for their quality or value.
I don't understand how Steve Jobs's personal charisma affects how the majority of Apple customers feel about their products.
Correct me if I'm wrong, but the majority of Apple's customers don't go around looking for Steve Jobs videos and sales pitches (if they even knew the name in the first place-- before he passed away, of course).
(RDF or NOT) --> (People around SJ) -/-> (Average Apple customer)
Well, Steve Jobs shaped the image of Apple as a company and its products, and ensured things worked the way he wanted them to, right or wrong, from the inside out...
There's a reason people go nuts over most Apple products, and it's not because Steve Jobs gets up on stage, and it's not because of technical specifications... it's because he managed to set up the company to produce and market products that carry a bit of that magic with them for some reason. The packaging, the styling, etc... We can argue that a box is just a box, but I can't help but notice, when I give someone an iPod for Christmas or something, the fascination they have with simply opening the box. They start delicately examining the box and protecting it, usually keeping it afterwards for no explicable reason, even before getting to the product at hand.
So a box is more than just packaging, obviously... if it has that effect on people, it bolsters the brand... pretty simple.
The great thing about humans in groups - once you are charming enough to sell the right 10-20% of a population on your own, with luck, other people will soon start to copy them, and pretty soon you are the Catholic church, or Facebook, or Apple.
You mean 10-20% of Apple customers were exposed to personal interactions with Steve Jobs? I'm not sure on that.
Example: Apple had spent almost nothing on adverts or anything in India until a year (or two) ago. And yet the iPod was super duper popular. (Other products not as much, as for the average Indian the price of the product plus the government's 30% import tax puts them out of reach.)
The millions of Apple customers today have little in common with the tiny hardcore group of Mac users who were hanging on for dear life back in the late 1990s. For those people, Jobs was the savior.
The "Shitinabox" thing dates back to the Mac Cube, which actually did sell well for a couple months after release. (Presumably mostly because it was Steve Jobs Approved.) I think it was popularized by John Dvorak.
Probably. But today's Apple isn't built on those hardcore Mac users. I wasn't one of them. If I go back in time, I would prefer XP over what Apple was shipping in 2002. I only have Apple stuff _after_ they became good again. (Too young to have had a chance to buy the originals - Apple II et al.) OS X is UNIX - love that - and it has nice proprietary apps which to me add lot of value to everyday life.
Didn't the FBI background check on Jobs conclude that he was prone to not being forthright and honest, with a tendency to distort reality? I don't think they got won over by charisma.
"IF you go back and watch these, can you find a single lie?"
I'm sure it will be rationalized as bad information or some such, but: open-sourcing FaceTime. The RDF is more of an exaggeration of the truth to the limit of the suspension of disbelief. If you call a product "Magical", is it not silly? What if the people who built the product actually believe it?
You seem to be a bigger fan of Apple than Gruber is, to the point of always accusing people of hounding Apple. I wonder what topics you're critical of him about.
>(E.g.: just this weekend I read in "Inside Apple" the long-refuted claim that Apple "stole" Xerox technology for the Mac. Amazing kind of theft that was-- Apple paid for a license to use that technology with stock which, if held to the present, would be worth billions of dollars. Quite the heist!)
In one earlier post you were claiming Xerox didn't invent the GUI, and that Apple did, and now you're saying it was properly licensed? If you're accusing others of thinking one-sidedly, you need to look in the mirror.
>Another example: for quite a while there, many Windows fans believed that Bill Gates owned Apple, because to them $150M was a big "investment" and they thought Microsoft bought Apple in 1997. (They didn't know that Apple already had a lot more than that in cash, and that part of the deal-- the bigger part-- was burying the hatchet on all the patents Microsoft was violating, to the tune of several billion dollars a year paid by Microsoft to Apple for several years. This latter bit was reported, but kept quiet because Apple didn't care and Microsoft wanted to save face... so it's not widely known.)
The biggest concession from Microsoft was that they made IE and MS Office for Mac, which kept Apple afloat for enough time to develop and launch OS X.
>was burying the hatchet on all the patents Microsoft was violating, to the tune of several billion dollars a year paid by Microsoft to Apple for several years. This latter bit was reported, but kept quiet because Apple didn't care and Microsoft wanted to save face... so it's not widely known.)
What? Why would Apple give up billions a year for $150 million? And that wasn't even a settlement payment-- it was exchanged for actual shares of the company. If what you claim is true, then Apple's management did a very crappy job and should be sued. And you say the RDF doesn't exist. Like the Matrix, the ones under its influence do not know they are in an RDF :)
On the contrary, I think your point is very relevant to the issue. One of the most relevant responses. I would like to point out that I'm not claiming that Steve Jobs never lied, or that he didn't use distraction-- a famous example being the claim that Apple was not working on an ereader[1] because "people don't read anymore". Many people think this is a lie, but I disagree. People don't read as much as they did in the past, and thus obviously if Apple were to do something in that space they'd have to do something other than replicate the traditional "reading" experience. You can call that a distraction... or you can claim he was wrong because you have stats showing that people are reading more (physical books) now than they ever have, but at best you'd have him with wrong facts.
However, I do wish you'd been specific about what statements about FaceTime you consider to be a lie. Now I'm forced to guess. The only issue about FaceTime that I can guess would be controversial was the claim that the protocol was going to be made open or a standard, or that something might be open sourced. (I'm thinking it was the former.)
I don't think this was a lie, but an error. I suspect that this was probably Steve Jobs's idea, and a decision he made relatively last-minute, which turned out to be problematic either because FaceTime uses technology that Apple had licensed and thus couldn't open source, or a protocol that was licensed and thus Apple couldn't standardize. Or it may be that they are in the process of doing this, possibly with some standards body, and are mired in bureaucracy?
I really don't know what happened (I don't even know the specifics of the alleged lie)... but in order for there to be a lie, I think there needs to be more than just saying something that is not correct.
There are lots of sources of error- misremembering, misunderstanding, misbelieving, or misspeaking. There needs to be an intent to deceive for it to be a lie, in my book.
This is a positive defense of Bill Gates too-- was he simply in error? Did he think that the Mac UI which did make it into OS X was the "real technology", and that the "kernel" of NeXTSTEP encompassed the entire Cocoa frameworks? From that viewpoint you could say that what Bill Gates said is strictly true, though the result of it-- Isaacson thinking that the NeXT acquisition was a waste of money other than getting Jobs-- is pretty deceptive.
It's possible Bill Gates is old and wasn't thinking clearly, or was deceiving himself, or Isaacson is completely misrepresenting what he said.
[1] Note that to date, Apple has never released a Kindle competitor. It has released very different things, the iPhone and iPad. They do allow you to read books, of course, with iBooks, but Apple wasn't chasing the Kindle, and the question was whether Apple was going to compete with the Kindle.
We don't know, because Apple never said another thing about FaceTime being submitted as an open protocol. Someone might even think they made the empty promise just so they wouldn't get flak about another proprietary chat service. But who knows... Apple should have addressed it one way or another.
Just as I suspected, this point is rationalized away with speculation. He said:
“And we’re going to take it all the way. We’re going to the standards bodies, starting tomorrow, and we’re going to make FaceTime an open industry standard.”
The reason this rings so hollow is because it is so outside Apple's MO. If Apple creates something they think is really great (and customers do too), it becomes part of what sets Apple apart. Apple doesn't license their technology; they use it as a key differentiator. The only real contribution Apple has made to open standards, in my mind, is WebKit; and they built that on top of KHTML, so I'm not sure they really had a choice.
Yeah, I've heard that response and I guess I don't know how I feel about it still; I don't see what prevents him from keeping his assurances or I guess, why he wouldn't? I've seen others point out that "standards" got a lot of support as they were bashing Adobe. (Frankly I also think there probably would have been a good FaceTime implementation on Android and it would impact the uniqueness/Appleness of the feature). I guess I'm speculating, maybe inappropriately.
Interesting, I couldn't have disagreed more with the initial choice. I love Linux and appreciate Android, but I drool over iPhone 4/4s and I love my MBP. I'm surprised that people still fawn over OS X as much as they do, frankly. Especially as it becomes increasingly annoying to use as a development machine (at least personally).
I also "drooled over the 4" for quite awhile waiting but when I got the 4S I found it didn't take me long to recall Edward Tufte's words:
"the elegant sharp edges that encase many touchscreens require users to desensitize their hands in order to ignore the physical discomfort produced by the aggressive edges. Last year in Cupertino, I yelled at some people about touchscreens that paid precise attention to finger touches from the user but not to how the device in turn touches the hands of the user (and produces divot edge-lines in the flesh)."
There are some great things about the 4/4S hardware. The screen quality, camera and battery life are well attested to all over. The A5 benchmarks lap the field. The voice quality is much improved. My reception is far more consistent when about. The look and build quality is amazing.
There's no need to make up features though. I have on a couple of occasions balanced my 4S vertically for all of a few seconds. It felt precarious and was hardly stable or intended.
What do you have around you that rests at eye level that you can set an iphone on? It's not like the camera aims up or anyone wants to FaceTime with my belly.
Unless you cling to the phone like your life depended on holding it tight, this is simply bogus. A full week? I find it almost impossible to believe that any grown person could be that feeble.
Sorry, it wasn't meant as an attack, and most of the time it is perfectly usable. That having been said, Lion has removed environment variables, such that variables set via plists or .bash_profile/.bashrc are not propagated to apps launched via Finder. This means that I have to manually launch SublimeText2 or Vim so that the rest of my tools work.
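If you want to see this for yourself, here's a minimal sketch in Swift (EXAMPLE_TOOL_PATH is a made-up variable name) that prints what the process actually inherited:

    import Foundation

    // Apps launched via Finder inherit launchd's environment, not the
    // shell's, so a variable exported in ~/.bash_profile typically won't
    // show up here unless the app was started from a terminal.
    let env = ProcessInfo.processInfo.environment
    print(env["EXAMPLE_TOOL_PATH"] ?? "EXAMPLE_TOOL_PATH not inherited")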
For example, find articles on how to build Android in Snow Leopard and Lion. Besides the tool (which one escapes me at the moment) that simply shipped buggy and broke builds for many projects, the steps for getting up and running with an Android build are far more drawn out than in any Linux environment. Much of that is due to getting Xcode, or the things that Apple only ships via Xcode, git, etc.
It surprises me how so many simply disbelieve what Isaacson has written because, in their heart of hearts, they can't believe it's true. Is it denial, or do they actually have personal insight into Jobs?
There has been a lot of vilification of Isaacson's book, much of it seeming to draw ire because it presents Jobs as a mere human.
It's pretty clear from the second-to-last paragraph that Gruber understands that Jobs was a fallible human being. The criticisms in this post revolve around Isaacson's handling of technical accuracy, and Isaacson's repeated implication that design and engineering are two adversarial ideas.
Gruber managed to do that without offering much evidence to the contrary.
Do Apple engineers and designers really work side by side? Or is it just that Apple prioritises design over engineering?
So rather than the engineers throwing a brick over the fence and saying "Designers, make it look good" it's designers throwing a sleek brick over the fence and saying "Engineers, make it work."
"Antennagate" was a REAL problem, one that Apple changed its design for (in the 4S)... Can we really accept that there is no tension in Apple over these things (as Gruber is suggesting)?
Regardless of whether it's a design or an engineering problem, it is pretty clear evidence that there is tension between the teams (design and engineering). Which is the opposite of what Gruber is arguing.
Mostly the same; the Volume/Silent switches are a little bit higher. And the steel rim now has more breaks in it (which was apparently a cause of the reception problems, when your finger went over the only break).
> It's pretty clear from the second-to-last paragraph that Gruber understands that Jobs was a fallible human being.
Gruber concedes an inch to take a mile. It's a tactic as old as time and it's hardly surprising that he uses it here.
However, it is the volume of the book dedicated to humanizing Jobs that turned many of Jobs's greatest fans against Isaacson. As Gruber says, "Isaacson got the self-absorbed hypocritical asshole right, but the world is full of self-absorbed hypocritical assholes."
Anyone with even a passing familiarity with Mac OS X (and iOS) history and internals knows that Gates' (and therefore the book's) claim that NeXT technology wasn't the basis of OS X is laughably false.
In fact I think Gruber himself continued to understate the role NeXT technology played. Gruber writes:
> It is in fact, completely and utterly wrong.
> NeXTStep was not “just warmed over UNIX”.
> Apple did get NeXT’s OS to run on Mac hardware.
> Mac OS X 10.0 was a hybrid of Mac and NeXT
> technology, but it was clearly the NeXT system
> with Mac technologies integrated, not the
> other way around.
Gruber should have gone further and pointed out that Mac OS X wasn't even a hybrid, but rather the latest iteration of NeXT technology combined with new code that reimplemented the Mac experience. The modest quantity of ported or migrated Mac technology (e.g. the Carbon APIs) existed purely to form a compatibility layer with existing Mac apps, and doesn't form the basis of any future development path.
It would also be fair to say that iOS has no legacy "Mac" technology in its stack, and shares its spiritual lineage with NeXT alone.
> The modest quantity of ported or migrated Mac technology (e.g. the Carbon APIs) existed purely to form a compatibility layer with existing Mac apps, and doesn't form the basis of any future development path.
I dunno. Post-Rhapsody, Apple worked hard to unify Carbon and Cocoa to provide a consistent user experience across applications. IIRC, in a surprising number of cases, unification was achieved by making the Cocoa element little more than a compatibility layer on top of Carbon. For example, I believe the save and open panels, print and page layout panels, and various dialog APIs worked this way.
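For reference, the open panel under discussion looks like this from the developer's side; a minimal modern-Swift sketch, purely illustrative. Whatever sat underneath it historically is invisible at this level:

    import AppKit

    // The Cocoa open panel the comment refers to. Whatever implementation
    // sat beneath it historically, the public API is pure Cocoa.
    let panel = NSOpenPanel()
    panel.canChooseDirectories = false
    panel.allowsMultipleSelection = false
    if panel.runModal() == .OK, let url = panel.url {
        print("Selected:", url.path)
    }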
It seems to me that if Carbon was viewed as a dead-end from the get-go, embedding it in many cases under Cocoa was a funny way of demonstrating that.
Carbon was a big project, and wasn't modest by any standard. Again IIRC, Apple rewrote the entire Finder in Carbon (and not in Cocoa) in order to test the Carbon library for bugs. Other things came from the Mac while their NeXTSTEP equivalents were abandoned: for example, UFS was blown away and replaced with HFS+. QuickTime was ported, as were 3D facilities and game APIs. NeXT's SoundKit was deprecated in favor of stuff from OS 9. Java, up through 1.3.x at any rate, came from OS 9. I think the font facility, always an ugliness in NeXTSTEP, was replaced with Apple's (including support for OpenType etc.). I think your depiction of this stuff as "modest" is not particularly accurate.
So anyway, Isaacson's depiction of OS X as being basically Mac technologies with a little NeXTSTEP kernel is ridiculous. But don't make the same mistake in the other direction.
> IIRC, Apple rewrote the entire Finder in Carbon (and not in Cocoa) in order to test the Carbon library for bugs.
This may well be true, but my own suspicion is that it had as much to do with Apple having hundreds of core system developers already familiar with the C++ APIs, and a limited timeframe to get Mac OS X released. Either way, both assertions point to Carbon being a compromise choice.
> UFS was blown away and replaced with HFS+
In order to maintain compatibility. And it wasn't just ported code -- HFS+ support was rewritten from the ground up as a UNIX file system. Surprisingly, the result wasn't a delicate hack, and the fact that we're still using it today (on iOS too!) speaks to the engineering capability of Apple. (And NeXT, since it's all one big family now.)
> QuickTime was ported
QuickTime was also ported to Windows.
> as were 3D facilities and game APIs
In order to maintain compatibility. The recommended way to write games on Mac OS X has always been the OpenGL APIs, and you can hardly describe OpenGL as a legacy Mac technology.
> Java, up through 1.3.x at any rate, came from OS 9
Java came from Sun. I have no knowledge of how Java was implemented in Mac OS X, but to the extent that any platform-specific or processor-specific hooks were lifted from the OS 9 distribution, that's hardly a Mac "technology".
> I think the font facility, always an ugliness in NeXTSTEP, was replaced with Apple's
Font handling in Mac OS X is part of Quartz, a new technology developed for OS X. Not only was it not taken from OS 9, the whole technology direction was abandoned. (Remember QuickDraw GX?)
There are a whole lot of factual errors here, or at least conclusions kind of pulled out of thin air. To wit:
> This may well be true, but my own suspicion is that it had as much to do with Apple having hundreds of core system developers already familiar with the C++ APIs, and a limited timeframe to get Mac OS X released.
Actually, NeXT already had a perfectly cromulent Cocoa-based "Finder", called the Workspace Manager. This was the program used in NeXTSTEP and later in Rhapsody. Apple threw it away and replaced it with a Carbon Finder built in-house. They were quite specific as to the reason: to guarantee that Carbon was bullet-proof, Apple built the Finder with it to "eat their own dog food" (the term didn't originate at Apple, but I think Steve started using it too).
> [regarding UFS] In order to maintain compatibility. And it wasn't just ported code -- HFS+ support was rewritten from the ground up as a UNIX file system. Surprisingly, the result wasn't a delicate hack, and the fact that we're still using it today (on iOS too!) speaks to the engineering capability of Apple. (And NeXT, since it's all one big family now.)
It wasn't just compatibility. Though it was only case-preserving rather than case-sensitive :-(, HFS+ had some big advantages over UFS. It supported Unicode. It supported metadata. It supported soft links that were preserved on removable media. It had much better networked file support. And so on.
Also: IIRC, HFS+ wasn't rewritten from the ground up. Apple had long had a version of it running on UNIX systems, developed in-house.
> Quicktime was also ported to Windows.
The point is, NeXT already had a multimedia, sound, and (limited) video system. It was famous for its sound system in particular. But (Carbon) QuickTime was better. So they used it instead.
> In order to maintain compatibility. The recommended way to write games on Mac OS X has always been the OpenGL APIs, and you can hardly describe OpenGL as a legacy Mac technology.
You absolutely can! Porting OpenGL involves a huge number of low-level ties to the operating system. NeXT had its own 3D facilities as well (OpenGL under NeXTSTEP, and Display Renderman), which were entirely tossed out in favor of the "legacy" OS 9 version.
> Java came from Sun.
On early OS X, the bulk of the Java facility came from Apple. Sun didn't support the Java port at all. NeXTSTEP, or more properly OpenStep, had a Java port from Sun which was entirely replaced with the OS 9 Java version.
> Font handling in Mac OS X is part of Quartz, a new technology developed for OS X.
Not correct. Font and typographic engine technology was derived from ATSUI, Apple's advanced typography system. And why wouldn't they? It was the best in the world.
Look, I don't dispute, by any stretch, the notion that the crucial parts of OS X were NeXTSTEP. I'm a NeXTSTEP guy! But your dismissal of Carbon and OS 9 technologies that found their way into OS X is both overly casual and in many cases simply false. To this day OS X still has a huge number of OS 9 technologies embedded in it, not because of compatibility (or not just because of compatibility), but because they were the best technology. Apple's not stupid.
"It was famous for its sound system in particular."
I don't recall it being a big deal after the black hardware and their DSPs were killed.
"NeXT had its own 3D facilities as well (OpenGL under NeXTSTEP, and Display Renderman), which were entirely tossed out in favor of the "legacy" OS 9 version."
I don't recall NeXT ever having OpenGL. And Display Renderman pretty much lost to OpenGL in the 90s, so that wasn't about to be resurrected.
" NeXTSTEP, or more properly OpenStep, had a Java port from Sun"
As I wrote one of the more popular sound editors as an undergraduate (Resound), them's fightin' words. SoundKit was still quite good, even if MusicKit sorta died with the DSP.
I think I'm mistaken about OpenGL, my memory is fuzzy. As to Java: what I was thinking of was OPENSTEP/NT and NEO both supporting Java (WebObjects had Java as early as '97), but I believe it was never released on NeXTSTEP. I guess that doesn't count.
I do miss Renderman on the desktop. Maybe not very practical, just really cool.
When I was contracting at Swiss Bank in Chicago in 1994, I noticed an HP on the network running NeXTSTEP. I telnet'ed to it from the NeXTStation on my desk and ran some renders to see how much faster it was. Got a stern email saying "Stop doing that."
The entire article is Gruber hand-picking Jobs quotes that serve his vision of Jobs, and discounting the quotes that Isaacson picked -- or holding them as misunderstandings/misinterpretations -- because they don't conform with Gruber's opinion. Gruber has commented on Isaacson's book numerous times with incredible disdain (because, Gruber thinks, it misrepresents Jobs), so this is just a continuation of that theme.
But it's not about the quotes at all, except insofar as the quotes that Isaacson chose leave the reader with an impression that is simply wrong. The Bill Gates quote about the NeXT acquisition stuck out for me, too.
Are you disagreeing with Gruber's thesis that Isaacson really didn't understand the complementary contributions of software to "design"? Why?
I'm typically not a defender of John's, but he seems to me to be on target here.
Gruber makes multiple references to Jobs' faults in that article. He is simply shredding the quotes that Isaacson used without actually thinking about what they meant or checking the facts.
What about this article do you think is Gruber twisting Isaacson's words? Every example has a well-documented reason why Isaacson is flat-out wrong.
Did we read the same article? Where does Gruber "shred" anything? Bill Gates says that "Instead the purchase ended up bringing in Avie Tevanian, who could help the existing Apple operating system evolve so that it eventually incorporated the kernel of the NeXT technology.", and Gruber says "Mac OS X 10.0 was a hybrid of Mac and NeXT technology, but it was clearly the NeXT system with Mac technologies integrated, not the other way around."
What Bill Gates said was absolutely true! Rhapsody was the NeXT Mach kernel, a pulled-in, non-NeXT BSD application layer, and then various other Mac layers (GUI, Cocoa, etc.). Technically there is nothing wrong with his statement.
But saying that only the microkernel was kept (which, granted, is a critical part of a system) doesn't sound as impressive to Steve Jobs, so better still to simply claim Gates is the liar?
> and then various other Mac layers (GUI, Cocoa, etc).
Incorrect. The GUI wasn't ported, it was rewritten for OS X. Cocoa's heritage is NeXT to the core. Nothing vaguely resembling Cocoa existed in the classic Mac environment.
> a pulled-in, non-NeXT BSD application layer
Incorrect. There is little of "BSD" in the Mac OS X environment, save for some FreeBSD userland components, most of which only matter on the command line. The application layer of Mac OS X is Cocoa.
(There is also the Carbon application layer which does partially derive from classic Mac, but that existed solely to ease application porting of existing Mac apps. But even Carbon is a hybrid, and was actually back-ported to Mac OS 9.)
> Technically there is nothing wrong with his statement.
Any Mac OS X software developer will know enough to confidently refute the accuracy of Gates' claims.
> What Bill Gates said was absolutely true! Rhapsody with the NeXT Mach kernel, a pulled-in, non-NeXT BSD application layer, and then various other Mac layers (GUI, Cocoa, etc). Technically there is nothing wrong with his statement.
Actually, technically there's everything wrong with that statement. It's just not the case.
Cocoa is NeXTStep. Take a look at the APIs. Everything is prefixed with NS.
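A quick illustration of that prefix (in modern Swift rather than the Objective-C of the era, purely to show the naming that survives to this day):

    import Foundation

    // The NS prefix (commonly read as "NeXTSTEP", or NeXT/Sun) is still
    // on the Foundation classes decades later.
    let s = NSString(string: "NeXT lives on")   // NSString, not MacString
    let d = NSMutableDictionary()               // the NSDictionary family
    d["kernel"] = "Mach"
    print(s.uppercased, "/", d.count, "entry")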
The old Carbon APIs were completely reimplemented--those original Mac OS Classic APIs were mostly written in 68000 assembly with a liberal amount of Pascal thrown in--totally unsuitable for the new flagship OS.
I used OpenStep for about a year before the first OS X betas were released. I can tell you first hand that Rhapsody was 90% NeXT and 10% veneer to make it "look like a Mac". "Terminal.app" barely changed from OpenStep to Mac OS X. Several other apps looked identical too. I'm pretty sure the miniaturize button iconified windows the way OpenStep used to.
Don't forget that NeXTStep/OpenStep had a full BSD subsystem too. That's not new or unique to Mac OS X. Go to your terminal and "man open". Notice the "First appeared in NextStep" part.
The fact is, Mac OS X 10.0 is NeXTStep/OpenStep with some extra compatibility layers for the Carbon APIs (plus a VM for running OS 9 apps). Saying that they "just pulled the kernel out" is patently false.
Yes, I have. And I can confidently say that Mac OS X, even today after a decade of evolution and sweeping changes, is clearly a direct descendant of NeXTStep and has little in common with System 7. I'll also reiterate what others said -- there is everything wrong with Gates' statement.
I do iOS development now, and I spent the 90s doing NeXTSTEP/OpenStep. I still own two NeXT machines.
The main differences now are 1) cheaper, faster hardware, 2) menubar at the top instead of a floating vertical menu, 3) I keep the Dock on the left instead of on the right side of the screen, 4) fancier development tools.
It's hard when you find out that your hero was in many ways a contemptible person. And it's not a noble reaction, but certainly a human one, to go looking for some fault with the source of this news.