A lot of what's kept us "stuck in the 20th century" lies in just how many human details need to be accommodated to fully computerize the workflow.
Through the early 1990s, hardly anyone was doing electronic file transfer regularly in their personal workflows. There were plenty of examples of phoning in remotely to get updates or do certain kinds of work, but it was a per-industry thing. The larger changes came to pass only as email and office networking gained widespread adoption. So...you didn't need computers everywhere in everyday life. They were a nice addition if you were writing frequently or you wanted a spreadsheet, but the net outcome was that you could run a smaller office with fewer secretarial staff - and not a lot more.
In the '90s and '00s, the scope expanded to cover more graphics and messaging workflows. But it was still largely a 1:1 replacement of existing workflows in industry, with an import/export step that went to paper. And when you have the "go to paper" bottleneck, you lose a lot of efficiencies. Paper remained a favored technology.
It really wasn't until we had smartphones and cloud infrastructure that we could rely on "everything goes through the computer" and thus start to realize Engelbart's ideas with more clarity. That's also where the "social media" era really got going. So in fact, we've barely started.
All the prior eras in computing amounted to a kind of statement of "it'll be cool when". The future was being sold in glimpses, but predominantly, the role of the computer was the one it had always had: to enhance bureaucratic functions. And the past decade has done a lot to challenge the paradigm of further enhancement toward bureaucratic legibility. In the way that urbanists joke about "just one more lane, bro" as the way to fix traffic, we can say "just one more spreadsheet, bro" has been the way we've attempted to satisfy more and more societal needs.
But there is a post-Engelbart context appearing now: instead of coding up discrete data models, we've started strapping machine learning to everything. It works marvelously, and the cost of training is a fraction of the cost of custom development. That changes the framing of what a UI has to be, and thus how computers engage with learners, from a knobs-and-buttons paradigm to "whatever signals or symbols you can get a dataset for."