(background: I implemented Algol68 as an MSc project in the early 80s)
So Wirth was a big part of the Algol68 working group but grew disillusioned by its resulting complexity - he went off and invented Pascal, essentially taking the easy-to-implement parts of Algol68 and putting them into a language of his own.
Almost everything new (or newly borrowed) in Algol68 is common today, often reinvented many times over: structs, unions, pointers, a full type system, threads, a standard library, procedure pointers, a heap with garbage collection, etc.
The things we don't have from it are largely the different fonts for reserved words and/or types (something largely forced on them by using punched cards with only upper case), and A68's standard, which I guess is what happens when you let the maths guys go wild - they invented a language to describe the language and its semantics, a language that's largely incomprehensible to most mortals (and then reinvented it for the revised standard) ... it made the spec really hard to read.
All in all it was a great effort considering the times; people were still figuring out how to do this stuff, so it's not surprising they made some mistakes. It was also a time when compilers and OSs were largely owned by hardware companies, and it was much harder for a language to get traction outside of academia (you expected to pay tens of thousands for a compiler - a free one couldn't be any good, right?)
Actually when Wirth split from the Algol68 working group, he first created his own alternate successor to Algol60 - AlgolW (W=Wirth), which was still in use in 1979 when I took CS in college.
It seems odd for Wirth to be arguing so much for user needs and then come out with a language like Pascal which favored minimalism over practicality.
In this talk he seems to have already anticipated the necessarily extensible nature of a simple language: the language core must be sound so that, when new needs arise, one can extend it - which is what did happen.
Well, Pascal had plenty of flaws too, not just excessive simplicity. From day one the only way to do almost anything non-trivial in Pascal was to abuse variant records as a way to do typecasts. The I/O library was horrible, but standardized as part of the language - e.g. there was no standard way to open a named file, and the only way to close any file was to reopen it (via reset/rewrite) as a temporary file, which had the side effect of closing the original one.
In the event there was no standardized (vs proprietary) meaningful improvement to Pascal (ISO Pascal basically just added poorly thought out "conformant arrays"), and Wirth himself just moved on and designed a new language instead - Modula-2, which was an improvement but also not great, then abandoned that in favor of Oberon.
Actually Algol 68 was a very good programming language, and many of its features were really better than their counterparts in most modern programming languages.
Unfortunately it was doomed from the beginning, because its scary formal description was not accompanied by an informal description and a tutorial with programming examples.
Therefore the majority of potential users only looked at the first few pages of the formal description, did not understand anything, and decided that the language must be unusable.
While Wirth made many important contributions to the evolution of programming languages and programming techniques, in his later languages, starting with Pascal, besides adding significant innovative features that were then widely borrowed by other languages, he also made excessive simplifications that placed unreasonable restrictions on their applications.
Most people today do not realize how bad Pascal was in comparison with Algol 68, or even C, because they have only used compilers like Turbo Pascal, which added a great number of extensions to the original Pascal to make it usable for practical problems.
The formal description of ALGOL 68 is in fact much simpler than it looks, but it is very hard to read for the first time on paper, due to the huge number of special terms that are defined and that need to be remembered during a linear reading. It would be quite easy to read if converted to hypertext with appropriate links, using a reader that shows definitions when hovering over words. Of course, such a technology did not exist in 1968.
In its recent standards (since R6RS), Scheme is divided into two: a core language, suitable for reasoning about, and a full language, aimed at economically expressing the kinds of solutions software engineers come up with. The full language is meant to be expressed in terms of the core language.
This seems analogous to the two-language approach you describe for Algol 68. Are you aware of recent developments in Scheme? Your description of it characterises it as a ball-and-chain for Algol 68.
> 30. In programming, everything we do is a special case of something more general -- and often we know it too quickly. — AJP
> 31. Simplicity does not precede complexity, but follows it. — AJP
> "[Algol 60] is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors." —CARH
Quoting: "The emphasis on a simple and didactically appealing language would automatically have resulted in the development of a nucleus of a self-extendible language, perhaps the most promising approach of development. Such a language provides a facility for programmers to define data structure patterns, and operators which apply to these patterns. The crucial point here is that the language nucleus must be of utmost simplicity, and every aspect of the nucleus language must be very efficiently implementable on any reasonable computer organisation."
Such a simple language, I suppose, would be LISP. And an application of extending the language could be, say, hashmaps, where ... what? ... something like:
* define new language keywords, types, and operators in LISP, plus a canonical implementation
* then have application programmers use those constructs?
This is similar in spirit to fixing it "in a library, not with new keywords", except that in addition to the canonical implementation, this new work would introduce new (valid) LISP tokens that programmers would re-use?
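Something like this, perhaps - a minimal Common Lisp sketch, where HASHMAP and GETK are invented names (not standard operators) and the built-in hash tables serve as the canonical implementation:

    ;; Hypothetical sketch of "extending the nucleus": new constructs
    ;; are macros over a canonical implementation (CL hash tables).
    (defmacro hashmap (&rest pairs)
      "Build a hash table from alternating key/value forms."
      (let ((h (gensym)))
        `(let ((,h (make-hash-table :test #'equal)))
           ,@(loop for (k v) on pairs by #'cddr
                   collect `(setf (gethash ,k ,h) ,v))
           ,h)))

    (defmacro getk (map key &optional default)
      "Look up KEY in MAP, returning DEFAULT when absent."
      `(gethash ,key ,map ,default))

    ;; Application programmers then reuse the new constructs as if built in:
    ;; (getk (hashmap "one" 1 "two" 2) "two")  ; => 2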
But LISP was already 10 years old at the time, so why did it not get traction?
Maybe the syntax was too minimal? I kind of liked LISP, but figuring out whether you were looking at the 5th or 6th closing parenthesis somewhere in the code was not fun. In other words, the code gets unnecessarily hard to read. A mandatory formatter might have helped a bit, but that came only with Python. (Still with too much freedom, because everyone chooses their own indentation level.)
LISP is interpreted, not strongly typed, so definitely another class anyway.
No, Lisp is not interpreted. Lisp compilers have existed for a long time. Lisp is dynamically typed. "Strong" typing is often wrongly used as a term for static typing.
When programming, Lisp certainly benefits from good editor support - as most other languages do too. And remarkably, the Lisp community pretty much agreed on a formatting and indentation style, so support in editors like Emacs is pretty good.
But its dynamic nature together with automatic memory management certainly limited performance, especially on the weak hardware back then. In comparison, it is almost trivial to write a small compiler for a language like Pascal that will produce efficient code.
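To illustrate how small such a compiler can be, here is a toy Lisp sketch (nothing like a real Pascal compiler; the expression language and the PUSH/LOAD instruction names are made up): a few lines suffice to translate arithmetic expressions into code for a stack machine.

    (defun compile-expr (e)
      "Compile an arithmetic s-expression to stack-machine instructions."
      (cond ((numberp e) (list (list 'push e)))    ; literal  -> PUSH n
            ((symbolp e) (list (list 'load e)))    ; variable -> LOAD x
            (t (append (compile-expr (second e))   ; left operand
                       (compile-expr (third e))    ; right operand
                       (list (list (first e))))))) ; then the operator

    ;; (compile-expr '(+ (* a 2) 3))
    ;; => ((LOAD A) (PUSH 2) (*) (PUSH 3) (+))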
Fun fact: the then-popular UCSD Pascal (I used it on the Apple II) compiled to bytecode that ran on a bytecode engine.
> But its dynamic nature together with automatic memory management
Generally Lisp was used with a resident interpreter and a (loadable) compiler, often interactively, and it kept lots of information about the program under development. PASCAL tended to be compiled to small programs with small runtimes by a batch compiler.
The actual performance was only a secondary problem. The main problem was that Lisp used lots of memory, and when the GC kicked in it touched a lot of that memory -> for example with a mark & sweep GC, which scans the whole heap. On a shared computer this could mean that the Lisp program alone used much of the memory, and during GC it was busy for some period of time. Other users would see a massive performance degradation of the computer.
That was one of the reasons to develop special computers: expensive single-user workstations for Lisp, which had lots of expensive memory just for one user, plus improved memory management. For example, they then had four megabytes (or more) of random access memory. ;-)
> UCSD Pascal (I used it on the Apple II) compiled to bytecode that ran on a bytecode engine
> "Strong" typing often is wrongly used as a term for static typing.
Minor contradiction: the distinction between strong and static typing is modern. From Wikipedia:
> In 1974, B. Liskov and S. Zilles defined a strongly-typed language as one in which "whenever an object is passed from a calling function to a called function, its type must be compatible with the type declared in the called function."
Lisp was not "statically typed", but it was often considered strongly typed because it type-checked at runtime. For example, if the Lisp system were given (+ 1 'a) for evaluation, it could check at runtime whether all arguments are numbers. If not -> error.
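A minimal Common Lisp sketch of that runtime check (assuming a conforming implementation, where + signals a TYPE-ERROR for non-numeric arguments):

    ;; Dynamic but "strong": the bad argument is caught at run time,
    ;; not silently reinterpreted.
    (handler-case
        (+ 1 'a)                      ; 'a is not a number
      (type-error (e)
        (format t "caught at runtime: ~a~%" e)))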
> Lisp code was data and could be formatted by a "pretty printer".
Sure, I have used it. Not in the 1960s, but in the 1980s. I don't doubt it was already old by then.
The problem is only that if the formatter is not mandatory, programmers will not use it and will develop their own styles. In theory the reader could run code through the pretty printer before looking at it, but few people/systems have the tooling ready for that.
Most people use automatic indentation in Lisp, though.
Thing is: Lisp is different; it's more fluid in its shape than a language where the code only lives in one source file. My own code and data get reprinted in many places during the development process. I have a source file, I may look at a pretty-printed macro expansion, I'll see a code fragment in the debugger, I'll re-flow some code in the Listener. Sometimes I use this IDE, then I use another one, and the code looks slightly different. But it always follows a set of rules.
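For example, any form can be handed to the standard pretty printer (PPRINT in Common Lisp), which reflows it according to the usual rules - a tiny sketch:

    ;; Code is data: take a form and reprint it with the standard
    ;; indentation rules, regardless of how it was originally typed.
    (let ((form '(defun clamp (x lo hi)
                   (cond ((< x lo) lo)
                         ((> x hi) hi)
                         (t x)))))
      (pprint form))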
"I pulled out my copy of the draft report on Algol 68 and showed it to her. She fainted."
This is that document in final form.[1] There is also an "Informal Introduction" and a "Very Informal Introduction", because the spec is incomprehensible. They had a terrible time talking about syntax back then. They were also trying to talk about the semantics of parallelism, but it didn't go well. This was the "do this to all that stuff" era of parallelism, which maps well to some number-crunching problems.
The funny thing is that the Algol-60 Report is very simple.
That is a good summary, but nevertheless I believe that you have not described in enough detail the proposals of McCarthy and Hoare, which were the most important innovations included in the discussions about the future of Algol and which had a great influence on the evolution of many programming languages.
In John McCarthy's proposal from December 1964, "Definition of new data types in ALGOL x", besides the overloaded operators mentioned by you, an even more important proposed feature was the "union" type, as a better language feature than the Fortran EQUIVALENCE.
Algol 68 incorporated union types and overloaded operators in a form close to McCarthy's proposal, but C, even though it took the keyword "union" from Algol 68, unfortunately used it only with a meaning similar to Fortran's EQUIVALENCE.
In C. A. R. Hoare's proposal from November 1965, "Record Handling", a large number of new programming language features were proposed, besides "records" and "record classes".
Both "record" and "class" are COBOL 60 keywords. COBOL 60 used "record" for what PL/I would later call "structure", a term inherited by C, and it used "class" for what ALGOL 60 called "type". SIMULA 67 took "records" and "record classes" from Hoare, but it abbreviated "record class" to "class", a term inherited from SIMULA 67 by all later OOP languages.
Hoare's proposal also included "Declared Reference Variables", which PL/I took from Hoare and renamed to "pointers", the term inherited from PL/I by most later languages.
Other new features in that proposal: "new" and "destroy" procedures for dynamically allocated records; a special reference value "null", i.e. the null pointer; what are now called constructors, for initializing the records; and "Finite Set Declarations", i.e. what are now called enumerations.
It is interesting that he elevates computer science education to the most important use case for a programming language to satisfy. It seems to presage his later works, including Oberon.
Universities have always been about correctly incubating the future practitioners of each field. The recent rush to relate each field's research projects to making commerce more efficient is putting (not in all cases, but in many) the cart before the horse.
I said this on another forum recently. There were good reasons for rejecting ALGOL 68, both because of the language's complexity, and because of the incomprehensible 2-level grammars of the Report. (Disclaimer: my grad supervisor was one of the editors of the Report, and I did a small amount of ALGOL68-related work then.) Languages like Oberon exemplify simplicity; at a different abstraction level, so does Scheme. But the complexity battle has been lost: for example, C++ is arguably far more complex than ALGOL68.
I was interested to check how various languages compare in complexity to ALGOL 68. As a rough proxy for complexity, let's look at the number of pages in their language specifications (excluding glossaries, examples, libraries, etc.):
* ALGOL 68 spec: 209 pages
* C: 418 pages
* C++: 457 pages
* Python: 148 pages (but doesn't include a lot of things in PEPs)
* Rust: doesn't have a specification but does have a 600 page book
* Lisp: 48 pages
* Go: 109 pages
Some of these have more detail than others - the C spec is very detailed, and the Go one is clearly intended to be more human-readable than detail-oriented. Also, I've excluded core library specs here, and they add a lot to some of these specs; but arguably some libraries are "more core" than others, so the numbers aren't really comparable.
The 2-level grammar may have been hard to read, but the language wasn't that hard to explain. Well, REF REF was, but the rest was pretty straightforward. I do agree that Pascal is easier to explain. OTOH, C was much harder. CS and EE students seem to have an intuitive grasp of pointers, but others need to be told to be very careful with them and to follow strict rules.
I printed this out and put it in the front of my old Turbo Pascal 7 book. Hopefully my daughter will find it when she's old enough and I nudge her in the direction of Pascal. :-)