Over all of history, there is no accounting for what "the mainstream" decides to believe and do. Many people (wrongly) think that "Darwinian processes" optimize, but any biologist will point out that they only "tend to fit to the environment". So if your environment is weak or uninteresting ...
This also obtains for "thinking", and it took a long time for humans to even imagine thinking processes that could be stronger than cultural ones.
We've only had them for a few hundred years (with a few interesting blips in the past), and they are most definitely not "mainstream".
Good ideas usually take a while to have and to develop -- so when the mainstream has a big enough disaster to make it think about change rather than more epicycles, it will still not allocate enough time for a really good change.
At Parc, the inventions that made it out pretty unscathed were the ones for which there was really no alternative and/or that no one was already doing: Ethernet, the GUI, parts of the Internet, the laser printer, etc.
The programming ideas, on the other hand, were -- I'll claim -- quite a bit better, but (a) most people thought they already knew how to program, and (b) Intel and Motorola thought they already knew how to design CPUs, and were not interested in making the 16-bit microcoded processors that would have allowed the much higher-level languages at Parc to run well in the 80s.
It seems that barriers to entry in hardware innovation are getting higher and higher because of the high-risk industrial efforts required, while barriers to entry in software are getting lower and lower thanks to improving tooling in both software and hardware.
On the other hand, because of the exponential growth of software dependencies, "bad ideas" in software development are getting harder and harder to remove, and the social cost of "green field" software innovation is also getting higher and higher.
How do we solve these issues in the years ahead?
But, e.g., the possibilities for "parametric" parallel computing solutions (via FPGAs and other configurable HW) have not even been scratched (too many people trying to do either nothing or just conventional stuff).
Some of the FPGA modules (like the BEE3) will slip into a blade-server slot, etc.
Similarly, there is nothing to prevent new SW from being done in non-dependent ways (meaning the initial dependencies needed to hook into the current world can be organized to be gradually removable, and the new stuff need not have the same kind of crippling dependencies).
For example, a lot can be done -- especially in a learning curve -- if, e.g., a subset of JavaScript in a browser (etc.) can really be treated as a "fast enough piece of hardware" (of not great design) -- and just "not touch it with human hands". (This is awful in a way, but it's really a question of "really not writing 'machine code'".)
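To make the idea concrete, here is a minimal sketch (in TypeScript) of what "treating JavaScript as a target machine" might look like: a toy expression language is compiled down to JS source, which is then executed but never written or edited by hand. The language, the names (Expr, compile, load), and the use of the standard Function constructor are illustrative assumptions for this sketch, not anything Parc actually built.

```typescript
// Sketch: treat the browser's JS engine as "fast enough hardware" by
// compiling a tiny higher-level expression language to JS source.
// Humans work only in the Expr layer; the emitted JS is "machine code".

// A toy expression language: numbers, variables, and binary operators.
type Expr =
  | { kind: "num"; value: number }
  | { kind: "var"; name: string }
  | { kind: "bin"; op: "+" | "-" | "*" | "/"; left: Expr; right: Expr };

// Compile an Expr to JavaScript source text.
function compile(e: Expr): string {
  switch (e.kind) {
    case "num": return String(e.value);
    case "var": return e.name;
    case "bin": return `(${compile(e.left)} ${e.op} ${compile(e.right)})`;
  }
}

// Hand the generated source to the JS engine for execution.
// (Function is standard JS; some pages' security policies may block it.)
function load(params: string[], e: Expr): (...args: number[]) => number {
  return new Function(...params, `return ${compile(e)};`) as
    (...args: number[]) => number;
}

// Usage: define f(x, y) = (x + y) * 2 above JS, execute it on JS.
const body: Expr = {
  kind: "bin", op: "*",
  left: { kind: "bin", op: "+",
          left: { kind: "var", name: "x" },
          right: { kind: "var", name: "y" } },
  right: { kind: "num", value: 2 },
};
const f = load(["x", "y"], body);
console.log(f(3, 4)); // 14
```

The point is the layering: people work in the higher-level layer (here a toy, but it could be something far richer), and the emitted JS is treated as disposable compiler output, so the dependency on JS's design stays isolated and, in principle, gradually removable.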
Part of this is to admit to the box, but not accept that the box is inescapable.
Thank you, Alan, for your deep wisdom and crystal-clear vision.
It is the best online conversation I have ever experienced.
It also reminded me of inspiring conversations with Jerome Bruner at his New York City apartment 15 years ago. (I was working on a project with his wife's NYU social psychology group at the time.) As a physics Ph.D. student, I never imagined I could become so interested in the Internet and education in the spirit of Licklider and Doug Engelbart.
You probably know that our mutual friend and mentor Jerry Bruner died peacefully in his sleep a few weeks ago at the age of 100, and with much of his joie de vivre beautifully still with him. There will never be another Jerry.
However, since the PC revolution, the mainstream seems to have taken the "data" path, for whatever technical or non-technical reasons.
How do you envision the "coming back" of the image path, either by bypassing the data path or by merging with it, in the not-so-distant future?