Jobs’s software team took the graphical interface a giant step further. It emphasized “direct manipulation.”
And this right here is a major point of contention.
The standard WIMP GUI that is taken for granted today (and that evidently DE designers have a hard time shaking off even when supposedly attempting to "break new ground" - see GNOME 3 and Unity) might have ended up being the lesser approach.
The PARC conception of a GUI (later emulated by Niklaus Wirth/ETH Zuerich's Oberon, as well as Rob Pike's 8 1/2, rio, help and acme) had a real knack for actually enforcing composability.
In present GUIs, windows are mostly dumb, isolated and unable to talk to one another. They are also very difficult (if not outright unfit) to automate. Acceptance testing frameworks like Selenium, which can drive browsers headlessly, show this isn't a problem when there's a common serializable representation (HTML, the DOM...), but desktop GUIs have no equivalent.
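To make that concrete, here's a minimal sketch of driving a page through the DOM, assuming Selenium 4 with a Chrome driver; the URL and element names are invented placeholders, not a real site:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    opts = webdriver.ChromeOptions()
    opts.add_argument("--headless=new")  # no visible window needed
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get("https://example.com/login")  # hypothetical page
        # Every widget is reachable through one serializable tree, the DOM:
        driver.find_element(By.NAME, "user").send_keys("alice")
        driver.find_element(By.NAME, "pass").send_keys("hunter2")
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
        assert "Welcome" in driver.page_source
    finally:
        driver.quit()

Nothing comparable exists for an arbitrary desktop window: there's no serializable tree to query, so automation tools end up poking at screen coordinates and matching pixels.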
In those systems, all text on screen is modifiable, regardless of where it is, and text is executable. A lot of common scenarios where people cook up quick scripts for task automation are effectively made obsolete, because the desktop itself is one big programming environment without the user ever being told it is. Task launchers/run dialogs are no longer needed.
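As a toy illustration of the idea (how it might be sketched, not how Oberon or acme actually implement it): treat any selected string as a command, resolve its first word against a registry of verbs, and fall back to the shell, roughly in the spirit of acme's middle-click execute. The Date verb here is invented:

    import shlex
    import subprocess
    from datetime import date

    # Invented built-in verbs; Oberon resolves Module.Procedure names instead.
    BUILTINS = {
        "Date": lambda args: print(date.today()),
    }

    def execute_text(selection):
        """Run a piece of on-screen text as if it were a command."""
        words = shlex.split(selection)
        if not words:
            return
        if words[0] in BUILTINS:
            BUILTINS[words[0]](words[1:])  # dispatch to a built-in verb
        else:
            subprocess.run(words)          # fall through to the shell

    execute_text("Date")          # "clicking" the word Date runs it
    execute_text("ls -l /tmp")    # any text on screen is a latent program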
More recent research, like Bluebottle OS (de facto successor to Oberon?), has experimented with zoomable interfaces. These make virtual desktops entirely obsolete, because you have infinite (by the colloquial definition of infinity, of course) space to work on.
It's this drive to move beyond WIMP that has motivated a lot of people toward tiling window managers. Ironically enough, those are also the most primitive kind and the simplest to create, which really says something. They still pale in comparison to the classic PARC-style GUI, though.
>The PARC conception of a GUI (later emulated by Niklaus Wirth/ETH Zuerich's Oberon, as well as Rob Pike's 8 1/2, rio, help and acme) had a real knack for actually enforcing composability.
Sounds really interesting. Can you point to some resources where I can read about this in greater detail?
Installing Plan 9 is about as close as you can get: not just reading about those concepts but actually using them. There is quite a strong link between all those projects.
I recommend anyone interested in PARC read "Dealers of Lightning" by Michael Hiltzik.[1]
It's been argued that the profits from the laser printer repaid the money spent on PARC a hundredfold. I'd agree with that. That said, I don't think Xerox could have been the new IBM/Microsoft/Apple combined, simply because their sales force was... addicted? ...to the per-imprint commission model, and moving to a different model would have been a huge change for them. So while PARC could have invented it, and they could possibly have gotten it into production, I don't think their existing sales and support force understood enough about how to sell and service it.
Reminds me of Kodak. They had the future, but it didn't have meaning in the mind of the company. Large thriving systems seem to lack enough schizophrenia to understand what they need to do to survive.
I was at Kodak when the terminal decision was made. Among other factors... Film sales were driven by retailers/drugstores, which were hooked on the model of customers visiting to buy a roll, another visit to drop off the roll, and a third visit to get the prints - and odds were that they'd buy something on each trip. When digital photography started catching on, those stores made clear that if Kodak did anything to disrupt that pattern, they'd drop Kodak products fast and destroy the company. Ergo, Kodak was reluctant to dive into digital. Yeah, there was a long-term plan to cope, but nowhere near as fast a transition as customers made.
I tried to push the idea of a cell-phone-equipped camera which would immediately upload images and mail you prints, but I was a mere peon, so that went nowhere.
Don't lose sight of who your REAL customers are. Bulk buyers who resell product aren't.
It reminds me of Clayton Christensen's books on disruptive innovation, where he describes the mechanism by which companies are disrupted as their "being held hostage by their customers". Basically, what makes an innovation disruptive is that it is unappealing or threatening to your most valuable customers; thus, you always have a strong incentive to ignore it, even if you know strategically that it's important. Doing anything else would punish your revenues and stock price, basically assuring that you're removed as CEO.
Those are good insights. You could probably add to that the fact that, exhortations to avoid "marketing myopia"[1] aside, it's not all that clear what distinctive capabilities Kodak brought to the digital photo game. It certainly wasn't their chemical supply chain (though they sold off Eastman Chemical a long time ago). Nor was it especially their distribution channels, many of which, as you suggest, were actually negatives in the context of a digital shift.
Furthermore, consumables businesses--especially one as rich as the film business--are hard to replace. Printing had some of the same characteristics for a while and Kodak was somewhat in there. They did the online photo service too. But nothing like film sales and processing. Kodak could have done better, but it would have been a tough navigation. Fujifilm seems to have done OK, but by a very circuitous route taken by a smaller company.
> it's not all that clear what distinctive capabilities Kodak brought to the digital photo game
At one end, they had hands-down the best & fastest imagers. 13 years ago they had a fantastic camera at a manageable size & price which could, in time and with effort, have been worked into a consumer product. They had a great start toward dominance as a digital capture company.
At the other end, at core Kodak was a very large scale chemical consumables company.
Alas, it's very hard for a company to transition from a core competence to its polar opposite.
> 13 years ago they had a fantastic camera ...

What model?
A documentary said Eastman/Kodak was a strong influence on Apple, which is now its successor as an extremely successful company, following a similar philosophy: simple, focused products plus a distribution scheme, albeit a virtual one now.
Pretty amazing comment too. Do companies ever spin up startup children to explore current trends (being small means quicker moves), while feeding their discoveries back to the mothership so it can transition faster?
Back in the late 70's, I worked at a company called Aph. They had developed, in house, everything needed to build a PC, and the stuff was way ahead of anything else around at the time. The company was composed of numerous brilliant people (Hal Finney was one).
But what was lacking was somebody with vision to notice what we had. And so it was all for naught.
"You're ripping us off!", Steve shouted, raising his voice even higher. "I trusted you, and now you're stealing from us!"
But Bill Gates just stood there coolly, looking Steve directly in the eye, before starting to speak in his squeaky voice.
"Well, Steve, I think there's more than one way of looking at it. I think it's more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it."
Steve went to the neighbour's and paid to use the TV; Gates just came and took it. If you were a developer in the 90's, Gates is still more evil than Jobs ever was, and philanthropy years later doesn't wash away the sins. Ruthless, but with enough money, it appears, you can make people forget.
> philanthropy years later doesn't wash away the sins.
You could argue that they do, in the sense that stealing software/design isn't particularly "evil", while Gates' recent philanthropic efforts are most certainly "good". I'm not saying what Gates did back then was right, but it really only led to damage localized to an industry; it didn't endanger people's lives or destroy culture. As a counterexample, I'd call the RIAA/MPAA evil, in that they promote copyright extension and censorship, both of which directly damage culture. Back to the argument: improving the quality of life for millions of people is something I put squarely in the "good" category. In other words, Gates' previous actions didn't harm the quality of life for anyone, but his recent actions improve quality of life. So I call that "washing away the sins."
But please don't take that argument too seriously, because I don't. It's myopic and doesn't take into consideration the other, worse things that Microsoft has done (things that actually harmed people's quality of life). And really, quantifying these things is quite difficult. I'm just presenting an alternative viewpoint.
> Gates' previous actions didn't harm the quality of life for anyone
Disputable. Getting people to use pre-XP Windows, and also strong-arming the PC manufacturers into not bundling other operating systems such as BeOS, definitely count as harm to me.
> If you were a developer in the 90's, Gates is still more evil than Jobs ever was, and philanthropy years later doesn't wash away the sins.
I was, and maybe it doesn't wash them away, but Gates has become a powerful force for good in this world and I'm grateful for that. I don't see much point to nursing old grudges.
The key difference is that Apple gave Xerox $1 million in shares as a payment for them to demo the Alto to Apple engineers with the understanding that they'd implement the ideas in Lisa/Mac.
Apple didn't give Xerox $1 million in shares, they gave Xerox the opportunity to buy $1 million in stock, which Xerox did.[1]
There was no agreement that Apple was allowed to use the technology. Ten years later, Xerox tried unsuccessfully to sue Apple.[2]
Microsoft did pay Apple for the rights to use Macintosh-style interfaces in Windows, but there were disagreements as to whether it covered later versions of Windows. Apple tried to claim extremely broad protection for the "look and feel" of a program, covered by 95-year copyright! Those claims would have been terrible for the software industry if upheld, not just in this particular case, but many others.
>Apple didn't give Xerox $1 million in shares, they gave Xerox the opportunity to buy $1 million in stock, which Xerox did.
Well, yeah, it was an offer of 100,000 shares of pre-IPO stock for $1 million that wound up being worth $16 million.
>There was no agreement that Apple was allowed to use the technology.
Right, they agreed to give a demo of their tech. Apple didn't use their technology; they saw ideas and created their own unique implementation that went far beyond the Star interface (and they hired some PARC employees). Xerox wasn't given what amounted to $16 million just so Apple could visit the cafeteria; they knew what they were doing. After Xerox got a new CEO, they filed their suit as a follow-up, hoping to exploit Apple's suit against MS in case Apple won.
>Microsoft did pay Apple for the rights to use Macintosh-style interfaces
MS paid Apple for some rights (the license really being tied to giving MS early access to develop apps for the Mac), and Apple filed suit over the MS implementation: they interpreted the license as limited, while MS exploited it as unlimited, and Apple had screwed up the dates in the agreement by expecting an earlier ship date. MS completely screwed Apple in that deal. The eventual suit Apple filed was what triggered Xerox to file suit.
>Those claims would have been terrible for the software industry if upheld, not just in this particular case, but many others.
Yes, it's a good thing those claims weren't upheld. Though, had MS not pulled their Machiavellian stunt the suit wouldn't have been filed in the first place.
The Alto UI was revolutionary for its time, but it was not the same as the Macintosh UI: no overlapping windows, for example, with regions in QuickDraw being the big differentiator.
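For the curious, the gist of the region trick can be sketched in a few lines: a window's visible area is its own rectangle minus the rectangles of every window stacked above it. Real QuickDraw regions are far more general (arbitrary shapes, compact encodings); this toy only subtracts rectangles:

    # Toy sketch of region-style clipping for overlapping windows.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Rect:
        left: int
        top: int
        right: int
        bottom: int

    def subtract(r, cut):
        """Return the parts of r not covered by cut (up to 4 rects)."""
        l, t = max(r.left, cut.left), max(r.top, cut.top)
        rt, b = min(r.right, cut.right), min(r.bottom, cut.bottom)
        if l >= rt or t >= b:
            return [r]                                   # no overlap
        out = []
        if r.top < t:    out.append(Rect(r.left, r.top, r.right, t))
        if b < r.bottom: out.append(Rect(r.left, b, r.right, r.bottom))
        if r.left < l:   out.append(Rect(r.left, t, l, b))
        if rt < r.right: out.append(Rect(rt, t, r.right, b))
        return out

    def visible_region(window, windows_above):
        region = [window]
        for w in windows_above:                          # clip one by one
            region = [p for r in region for p in subtract(r, w)]
        return region

    # A back window half-covered by a front one leaves an L-shaped region:
    print(visible_region(Rect(0, 0, 100, 100), [Rect(50, 50, 150, 150)]))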
There's a big difference between invention and innovation. Both are important, but we do less and less of the former. The idea of a mouse before PARC must have seemed a little crazy, whereas the idea of the iPhone before 2008 seems a little inevitable.
Alan Kay's recent talk "The Future Doesn't Have To Be Incremental" goes over this yin and yang of the two 'i' words of our industry with authority. Worth a watch.
This seems like a great time to post the phenomenal Everything is a Remix [1]. Part one discusses music, part two film, and part three interfaces for computers, with special emphasis on Xerox PARC, Apple, and the Mac.