Hacker News
Simula – The forgotten programming language (deusinmachina.net)
170 points by andsoitis on July 3, 2023 | 99 comments



Hah, as someone who studied at the University of Oslo, I can confirm that many of the lecturers make sure Simula isn’t forgotten, by constantly reminding us that it was the first language with classes etc.

Therefore, a few years back I decided to learn it a bit (I borrowed a really old book from the library) and solved one of the Advent of Code puzzles in Simula: https://github.com/judofyr/aoc2018/blob/master/day3prog.sim

I was actually very surprised at how “modern” it felt. However, neither this code nor the article touches on the really cool part of Simula: simulations based on coroutines.
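To give a flavour of what that looks like, here's a loose sketch of Simula-style discrete-event simulation in Python, with generators standing in for the coroutines. Everything here (the scheduler, the `machine` process) is my own illustration, not Simula or Demos code.

```python
import heapq

# Loose sketch: generators play the role of Simula's coroutines
# ("processes"). Each process yields how long it wants to hold
# (advance simulated time); the scheduler always resumes the process
# whose wake-up time comes next.
def simulate(processes):
    events = [(0.0, i, p) for i, p in enumerate(processes)]
    heapq.heapify(events)           # (wake_time, tie_breaker, process)
    trace = []
    while events:
        now, i, proc = heapq.heappop(events)
        try:
            hold = proc.send(None)  # resume the coroutine
        except StopIteration:
            continue                # this process has finished
        heapq.heappush(events, (now + hold, i, proc))
        trace.append((now, i))
    return trace

def machine(cycle_time, cycles):
    for _ in range(cycles):
        yield cycle_time            # like hold(cycle_time) in Simula

trace = simulate([machine(2.0, 3), machine(3.0, 2)])
```

Simula's real scheduler is richer (processes can activate and cancel one another), but the core trick, suspendable processes plus an event queue ordered by simulated time, is the same.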


The new informatics building (well, not so new anymore, they built it around 2010) at the University of Oslo is named after Ole Johan Dahl.

Dahl was a professor of computer science at the University of Oslo and is considered to be one of the fathers of Simula and object-oriented programming along with Kristen Nygaard.

https://en.wikipedia.org/wiki/Ole-Johan_Dahl

https://www.uio.no/om/finn-fram/omrader/gaustad/ga06/

Many of the rooms inside of that building are named after programming languages.

http://magnus.li/ifirooms/

Naturally there is a room named Simula. And there are some rooms named after some big languages like Python and Java. Then there are also rooms named after some more niche languages, like Scheme. And even some obscure languages, like Ada.


University of Oslo may have a room named after Simula, but that's nothing compared to Aalto University in Finland, which even has a computer science professor named Simula: https://people.aalto.fi/olli.simula


Ada isn’t that obscure, it outranks Scheme in the TIOBE index, for example.


If you are full-stack, it probably seems obscure, and Fortran seems outdated. But depending on your country, if you work on safety-critical embedded systems, it is potentially the only option. It is in the top 2-3 in any case.

Edit: to finish my thought, if you are in physics or meteorology, Fortran is still common as well. There are a lot of software niches, and full-stack web dev is just the largest, at least as represented on HN; embedded might have more people if you combine planes, trains, automobiles, industrial, manufacturing, and the more visible IoT.


As of today, Fortran is both the newest and the oldest programming language: its most recent standard appeared in 2023, and its first draft, as the first high-level programming language, appeared in 1954.


Ada was also used as the basis for Oracle PL/SQL, and the later ISO SQL/PSM standard.

These are very much alive, and implemented in major modern databases.

"SQL/PSM is derived, seemingly directly, from Oracle's PL/SQL. Oracle developed PL/SQL and released it in 1991, basing the language on the US Department of Defense's Ada programming language."

https://en.wikipedia.org/wiki/SQL/PSM


And the old(er) building right next to it is named after Kristen Nygaard.

Simula was used as the first programming language for informatics students at the University of Bergen until the mid 1990s, when they switched to Java. I was among the last batch of Simula students there.


Weren't Simula, Ada, Pascal and Oberon similar to each other syntax-wise? Begin-end etc.


There are similar syntax elements, since Algol 60 was the common ancestor of all these languages; but in detail there are significant differences in syntax and semantics.


Ada, Pascal and Oberon are all Wirth-languages (the latter two directly, and the former indirectly). So they're very similar, yes.

Simula is not and, beyond some ALGOL-descended similarities, is quite different from the other three.


Ada is not a Wirth-language and only marginally inspired by Pascal. Wirth published Modula about the same time as Ada came around, but Ada is much more complex than Modula, with completely different features.


Ada's designers directly referenced Pascal's design and Wirth's theses to create Ada. They themselves call it a Wirth-language, as it follows Wirth's principles.

Anecdotally, as someone who has coded extensively in Pascal (particularly the Object Pascal variant) and also, though less so, in Ada, I'm baffled as to how anyone could look at the two and say they're only "marginally related". They're so similar that they're grouped together in much the manner C/C++ are, though Ada took a much harder compatibility break than C++ did:

https://p2ada.sourceforge.net/pascada.htm

> Well, what is that Ada language ? For here, it suffices to write: a superset of Pascal with some syntactical differences. Or a Super-Pascal. Or, a Pascal on steroids. In fact it's a better, redesigned language strongly inspired by Pascal. You'll see what I mean in the examples.


Pascal preceded Ada, but does not allow for coverage of most of the specialized topics that Ada does. For nearly every Pascal concept Ada took a different approach. Similarities are rather superficial.

To quote from https://dl.acm.org/doi/10.1145/956653.956654:

"Pascal itself only meets a small part of the Steelman requirements. Merely to attempt to extend Pascal would have been neither feasible nor a desirable approach. [...] Hence, the goal in the design of the Green language was to retain the Pascal spirit of simplicity and elegance but not necessarily the form of each Pascal feature"

So it's neither a "superset of Pascal" nor a "Pascal on steroids", but a new language design which meets (or rather tried to meet, since it is much more complex than Pascal) the "spirit of simplicity" for completely different and much more complex requirements than Pascal did. It's definitely not a "Wirth language", and I don't know of any statement by either Ichbiah or Barnes where they call it that.


> In fact it's a better, redesigned language strongly inspired by Pascal.

Is the point you should have zeroed in on, not the throwaway comments. If you had, you would realize everyone is saying the same thing as you.

> It's definitely not a "Wirth language"

I specifically stated in my original comment that it was indirectly so, via its Pascal lineage. If you want to jump through hoops to disavow it of its clear and self-avowed forebear, go for it. There's nothing to be gained for me from continuing down this line of argumentation.


For some reason you seem to assume that taking inspiration from a language necessarily leads to a very similar language; there is no reason for this assumption. The language authors themselves have told us in the Rationale in what way they were inspired (see the quote in my previous comment). Those who consider Pascal and Ada to be very similar should study both languages more intensively, and not draw their conclusion from a few superficial features; furthermore, they fail to recognize the enormous difference in scope: Pascal offers a small fraction of the features of Ada and cannot be very similar for this reason alone.

Here is a simple example: both Pascal and its descendants suffered, and still suffer, from the dangling-else problem; Ada solved this by adding end if and elsif, i.e. by integrating structured statements with blocks (Wirth himself came to a similar solution in Modula). Superficially you still see the same Pascal keywords, but it's a different and much better approach.

Ada is not even indirectly a Wirth language; Wirth had nothing to do with it at all, and came to quite different solutions in his later languages than the Ada authors did.


I didn't assume anything, my point and reasoning was laid out and provided. You can disagree with it all you like, I don't care.


My opinion doesn't matter; it's what the facts say.


When I was at UiO ('93), they did more than that: The introductory CS course required us to use Simula. We had to do several projects in it.

Of course at the time there were people on staff who had written books about it.


NTNU still has at least one course that uses Simula for coursework: TTM4110 - Dependability and Performance with Discrete Event Simulation. An interesting course, but without anyone teaching it who had real experience with the language, it was a slog to get through.


Ha, that’s interesting. I used to TA parts of that course; I had to learn Simula and Demos (a simulation library). If I recall correctly it was the first run of the course, and I had to develop some of the labs.


That's amazing, thanks for pointing that out. Which compiler did you use to run Birtwistle's DEMOS?


I believe we used GNU Cim for the coursework, but it's some years since I took this class personally so I can't say for certain.


Thanks. Do you happen to remember whether the referenced course material was specifically describing the use of DEMOS and Simula as well?


Yes, the material was specifically describing DEMOS and Simula. The coursework was all about the dependability and performance of a simulation we had to program using DEMOS. Interesting course really, but a bit let down by teaching assistants who didn't know the material very well.

I believe CIM generates C and compiles that, but I may be misremembering.


Do you happen to have a PDF of the course material which you could share? I will contact the course admin otherwise.

Yes, CIM is generating C and uses the available C compiler behind the scenes.


I don't have a copy unfortunately.

The course admin might be on holiday but it's worth a try.

I can check tomorrow if I still have access to the course resources.


Do you remember how Kirkerud (the lecturer and textbook author) opened the semester stating with absolute certainty that AGI would never, ever be reached in our lifetime, or even the next several centuries?


I don't, but I mostly ignored the lectures as I'd been programming for most of my life already at that point, so I read the textbooks and showed up to the groups (the TA for my group was a member of Crusaders[1], which was fun). I think I attended a grand total of two lectures.

I do remember we found a really embarrassing algorithm choice in the Stack class, though (push and pop at the time were O(n), as it used a singly linked list that was iterated through recursively to find the top of the stack... we found that out on a group project where we kept running out of stack space because of the recursion...)

[1] https://demozoo.org/groups/12/
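For the curious, here's a hypothetical Python reconstruction of that flaw (not the original Simula code): keeping the top of the stack at the far end of a singly linked list means every push and pop walks the whole list, and doing the walk recursively burns one call frame per element.

```python
class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

def push(head, value):
    # Top of stack kept at the *end* of the list, so we recurse all
    # the way down: O(n) time and O(n) call-stack depth per push.
    if head is None:
        return Node(value)
    head.next = push(head.next, value)
    return head

def pop(head):
    # Same recursive walk to remove the last node.
    # Returns (new_head, popped_value).
    if head.next is None:
        return None, head.value
    head.next, value = pop(head.next)
    return head, value

# The textbook fix: keep the top at the head, so both ops are O(1)
# with no recursion at all.
def push_fast(head, value):
    return Node(value, head)

def pop_fast(head):
    return head.next, head.value
```

With the slow version, a deep stack exhausts the interpreter's call stack, which matches the group-project failure described above.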


> ...progenitor of the most popular, and reviled program paradigm, Object Oriented Programming. Designed by Ole-Johan Dahl, and Kristen Nygaard in 1962...

In 1962 the first version of Simula appeared which was quite different from Simula 67, not yet object-oriented, and dedicated to simulation.

Simula 67 was the first general purpose object-oriented programming language.

> Then, in 1991 another language would enter the playing field, Java, and the rest is history.

Java actually used the object model of Simula 67; James Gosling gave a talk about it at the 50th anniversary in Oslo. Here is the link: https://www.youtube.com/watch?v=ccRtIdlTqlU


The insight they had with Simula was similar to that of Lisp. Code/behaviour as a first-class thing:

"In Simula I, Dahl made two changes to the Algol 60 block: small changes, but changes with far-reaching consequences. First, “a block instance is permitted to outlive its calling statement, and to remain in existence for as long as the program needs to refer to it” [11]. Second, references to those block instances are treated as data, which gives the program a way to refer to them as independent objects. As a consequence of these changes, a more general storage allocation mechanism than the stack is needed: a garbage collector is required to reclaim those areas of storage occupied by objects that can no longer be referenced by the running program."

https://www.sciencedirect.com/science/article/pii/S089054011...
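The same two changes, transplanted into Python as a loose analogy (this is not Simula): a closure is a block instance that outlives its call, and the reference to it is ordinary data, which is exactly why a stack discipline no longer suffices and a garbage collector is needed.

```python
def make_counter():
    count = 0              # lives in make_counter's "block instance"
    def increment():
        nonlocal count
        count += 1
        return count
    return increment       # a reference to that instance, handed out as data

c1 = make_counter()        # the call has returned...
c2 = make_counter()        # ...but each activation record lives on
                           # independently, reclaimed only by the GC
```

Calling c1 and c2 shows the two records are separate objects, just as two Simula block instances would be.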


It's interesting that this somewhat implies the notion of a heap and garbage collection, but that this was not considered for strings.


Strings were really more thought of as arrays/buffers at the time. You could create a String class to serve the purpose of a “modern” string, I believe. However, Simula didn’t yet have operator overloading to make it look like a built-in type.


This gave me a chuckle:

> [case insensitivity... so] both beGIN and BEGIN are interpreted the same way. This was also characteristic of the time, as programmers programmed their computers by shouting.


Case insensitivity is only seen as "rustic" because C is case insensitive. Somehow, case insensitivity is seen as a modern thing but at the same time everyone recommends not choosing identifiers which are close to each other apart from case in style guidelines.


> Case insensitivity is only seen as "rustic" because C is case insensitive

One thing people discovered eventually - case-sensitivity is language-specific. It is easy so long as you assume everything is in English and 7-bit US-ASCII, but once you move beyond that assumption, it becomes significantly more complex. An easy way to avoid all that complexity is to not do it.

People also forget about the related notion of accent-insensitivity - again, if all you support is English, you can ignore that, since English hardly uses accents
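Two concrete illustrations of that complexity in Python (standard library only; note that Python's str methods use locale-neutral Unicode rules, which is itself part of the problem):

```python
# German: ß uppercases to "SS", so case mapping doesn't round-trip.
assert "straße".upper() == "STRASSE"
assert "STRASSE".lower() == "strasse"   # not "straße"

# Turkish pairs I with dotless ı, and İ with i. The locale-neutral
# lower() gives the English answer, which is wrong for Turkish
# identifiers.
assert "I".lower() == "i"               # Turkish would want "ı"
```

So "compare identifiers case-insensitively" is only a simple rule while identifiers are 7-bit ASCII English.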


Case insensitively is nowadays tied to how (static and dynamic) linking works. You’d either (a) have to use a case-insensitive and thus nonstandard linker, or (b) have to uniformly normalize identifiers before linking, making the normalized identifiers less readable, or (c) have to normalize identifiers based on the casing used for an object’s main definition, which means that changing only the case of that definition still breaks compatibility, so not quite case-insensitive after all, or (d) have at least FFI identifiers be case-sensitive (and use a custom linking runtime for intra-language identifiers).


>nowadays tied to how (static and dynamic) linking works

and this is the case because of C, which was my point. The fact that OS'es are written in C and thus have a C libraries and APIs is why you must respect C's rules for identifier names.

Also, yes, compilers for other languages essentially name-mangle anyway; that's how (modern) Fortran, which is case insensitive, and C++, which is case sensitive (but tacks a bunch of things onto function identifiers), work. But the point about naming conventions or recommendations still stands; like a lot of things, it is a fact of history favoring C, not that there is anything inherently better about it.


> and this is the case because of C

I believe it was already the case prior to C with assembly, and really because originally only one case (uppercase) was available. C just didn't change anything about the case-sensitivity of linking.


>Case insensitivity is only seen as "rustic" because C is case insensitive.

C is case sensitive, but it's also not seen as a very modern language these days, so I'm not sure I understand your argument.

I think case-insensitivity is seen as rustic because it's a remnant of the times when some systems simply did not have a notion of case and everything was uppercase all the time. Modern systems handle case just fine so having several character strings being distinct in terms of byte values but equal in the way the language interprets them seems clunky and error prone (because it is).


> Modern systems handle case just fine

Modern systems tend to support international character sets (UTF-8, etc.), where the idea of 'case' makes things more complicated than ASCII; and probably more trouble than it's worth. Even if we stick with latin characters, are 'uppercase' and '𝖀𝓟ℙ𝗘ℝℂ𝙰𝒮𝓔' the same identifier?


> C is case sensitive

Yes, thank you, I mixed up my prefixes.


C is case sensitive. I think most of you got my point but argh.


For anyone young enough not to have used an ASR-33 or similar: the UNIX login: process would assume you were on an uppercase-only terminal and "work" if you entered your username/password in uppercase. Guess it was an extra round or two of password hashing against the salt.

It remained in the "getty" process for some time, well into the {Free,Net,Open}BSD era.

CP/M and the like were pretty forgiving of commands being shouted too.

Some languages (IMP, maybe others) used %reserved-word% formatting, so you could write things pretty much however you wanted and the system would still know what was an instruction.


>It remained in the "getty" process for some time, well into the {Free,Net,Open}BSD era.

Still there in agetty: https://github.com/util-linux/util-linux/blob/master/term-ut... And, I imagine in other getty implementations.


Well CP/M was a mixed bag. It was case sensitive and the command line upshifted everything.

However programs like MS-BASIC would allow you to create files with lower case names that were then unmanageable from the command line.


And sadly the same (case insensitivity, not shouting) is the case in Nim today, IIRC.


Nim is partially case-insensitive:

https://nim-lang.org/docs/manual.html#lexical-analysis-ident...
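The rule from that manual section, sketched as a Python predicate (my own paraphrase of the spec, not Nim's actual implementation): two Nim identifiers are equal if their first characters match exactly and the remainder matches with case and underscores ignored.

```python
def nim_ident_eq(a: str, b: str) -> bool:
    # Nim's partial case-insensitivity: the first character is
    # compared case-sensitively; the rest ignores case and underscores.
    def norm(s: str) -> str:
        return s[0] + s[1:].replace("_", "").lower()
    return norm(a) == norm(b)
```

So fooBar, foo_bar and foobar all name the same thing, while FooBar (a type name, by Nim convention) stays distinct.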

Sadly, not every developer understands why that’s a good or bad design, i.e. the global pool of developer talent is partially stupid.


I don't, and I'm ok with that; means I can learn more.

I have played with nim for a bit though, and this rule I just don't care for.


Sure, that's fair. I didn't care for it during much of the first year I started working with Nim. Once I began wrapping C libs and working with Nim libs whose conventions differ from my preferences, the rationale (explained in the manual) for Nim's partial case-insensitivity "clicked" for me and I eventually embraced it. But I can see it remaining a sore point for some folks.

My reply above was tongue-in-cheek, maybe a bit too edgy. The person I was replying to characterized the language feature as "sad", and that's just silly. Even if it's an aspect of the language one doesn't appreciate, it's no more "sad" than e.g. Clojure involving a lot of parentheses or Haskell promoting monads. Likewise, no one is stupid or partially so for not liking or understanding Nim's partial case-insensitivity.


Both Algol and Simula were ported to the 68000-based Atari ST. I remember having a box of diskettes with implementations of Forth, Modula, Lisp, Prolog, Smalltalk.

I miss those times, when using a computer meant "programming on a computer", and programming on a computer meant trying to find your footing in any of a dozen languages popping up left and right.


There was Lisp on an Atari ST?


There's even one for 8-bit Ataris:

Interlisp – developed at BBN Technologies for PDP-10 systems running the TENEX operating system, later adopted as a "West coast" Lisp for the Xerox Lisp machines as InterLisp-D. A small version called "InterLISP 65" was published for the 6502-based Atari 8-bit family computer line. For quite some time, Maclisp and InterLisp were strong competitors.


Logo was a very popular language at the time, and is clearly Lisp at heart.


Cambridge Lisp was available from MetaComCo.

I ported Franz Lisp to the Atari ST.


Indeed there was. The first Lisp I saw was a port of David Betz's xlisp to the Atari ST (in the 1980s). The port was buggy and I didn't have the sources, so I needed to get started on writing my own interpreter. The rest is history, as they say.


Most 8 and 16-bit microcomputers at the time had a Lisp.


There was even a Lisp on DOS.


There is a lot more to Simula. I was a student at University of Oslo at the time it was still mandatory in intro classes.

A fellow student was the son of one of the people who wrote a compiler for Simula.

He knew about all sorts of little-documented features. Some gave precise control over when a "simulation" was executed. Thinking back, it makes me think of RTOS-like features.

Now, Simula was never the fastest language, and these features did not offer the guarantees that a real-time system would, but they were cool.

It was fun when you used language features that the teachers had never seen before.


A bit of a tangent, but it's interesting how many languages come from Nordic countries:

- Simula - Norway

- C++ - Stroustrup is Danish

- TypeScript / C# - Hejlsberg is Danish

- PHP - Lerdorf is Danish

- the V8 team, including Lars Bak, is Danish, and has developed several languages

- Erlang - from Ericsson in Sweden, though Joe Armstrong was British

Well now that I write it out, it seems like it's even more Danes than Nordic countries!

(Someone wrote pretty much the same comment here in 2016 - https://news.ycombinator.com/item?id=10833073)

Hm also from the same thread, L, M, and P from the LAMP stack all originated in Scandinavia! Didn't realize that https://news.ycombinator.com/item?id=10835311


Maybe spending long time in the dark every winter is good for thinking about programming languages.


IIRC, Linus may have said that somewhere about Linux.


That and it being too expensive to go out and get drunk ...


> Hm also from the same thread, L, M, and P from the LAMP stack all originated in Scandinavia! Didn't realize that https://news.ycombinator.com/item?id=10835311

Was going to mention that the Netherlands is not in Scandinavia until I remembered that the P is not Python :P

It is interesting that so much foundational tech comes from Europe when I think there is a common perception of that generation of tech as being largely driven by the US (MIT, Bell, etc).


> It is interesting that so much foundational tech comes from Europe when I think there is a common perception of that generation of tech as being largely driven by the US (MIT, Bell, etc).

The UK chronically undervalues its computing sector. Lots of the pioneering hardware/fundamentals work was done at Bletchley Park in the 1940s, but was destroyed or remained classified for decades. We had a disproportionately large slice of the software industry in the 80s, but that seemed to be dismissed as just gaming; and of course games themselves aren't taken as seriously as e.g. cinema, despite being a much larger market.


Also: SSH, MySQL, Qt. A bit more niche: Varnish, the Opera web browser...


In that case it could be worth mentioning CSS, by the Norwegian Håkon Wium Lie.

I often feel like the beginning of the 90s was a missed opportunity for Scandinavian companies. There were many great companies and individuals that were innovative, but they could not compete due to the lack of Silicon Valley-style VC funding and a smaller domestic market.


Norsk Data was one of the big guys producing minicomputers in the 70s and 80s. Apparently they held on to minicomputers for too long and quickly lost business to Sun and workstation manufacturers in the early 90s. https://en.wikipedia.org/wiki/Norsk_Data

There's a cool documentary as well: https://www.youtube.com/watch?v=CswjD3plsF8


curl


Denmark had the most C64 groups in the world relative to population size (the reason for this, I don't know). Most of the people involved with these demo/cracking groups in the '80s were most likely in their teens. Once the '90s arrived with the internet, I think it was obvious to most of them what they wanted to do with their lives.

Sweden, Norway and Finland came in 2nd, 3rd and 4th. I think the US was in 20th place.


COMAL, a then mildly popular alternative to BASIC, is/was from Denmark, too.


That was the first programming language I ever used (taught in school in Scotland).

https://en.wikipedia.org/wiki/COMAL

As a bonus, it's an anagram of OCaml.


> - PHP - Lerdorf is Danish

Not just your "standard" Dane, but born in Greenland. The only such person I can name.


The AVR microcontroller architecture, perhaps most widely known from Arduinos, is from Norway [1].

[1]: https://en.wikipedia.org/wiki/AVR_microcontrollers#History


Broadening the scope a bit beyond just languages and their creation alone, we can add:

Linux - Linus

Ruby on Rails - DHH

Turbo and Borland Pascal, and Delphi - Hejlsberg again


My theory is that the long winter months make people try nerdy things, like new programming languages.


Stroustrup once told the story of how he needed the best of both C and Simula, which resulted in the creation of C++.

Full interview: https://www.youtube.com/watch?v=ZO0PXYMVGSU


Prolog and ML (and also GHC) come from Scotland, while not Nordic, it's close.


Prolog stands for "programmation en logique", and as might be guessed from that comes from France (Alain Colmerauer in Marseille, to be precise). Now Kowalski, who also was an important pioneer in logic programming who collaborated with Colmerauer did work in Edinburgh, but from the horse's mouth:

I have tried to document as best I could our various contributions to that idea in an article published in CACM, 1988, and more recently in a History of Logic Programming, published in 2014. In summary, however, it is probably fair to say that my own contributions were mainly philosophical and Alain’s were more practical. In particular, Alain’s work led in the same summer of 1972 to the design and implementation of the logic programming language Prolog.

https://www.doc.ic.ac.uk/~rak/history.pdf


The most influential web framework, Rails, also came from Denmark, or at least its author, David Heinemeier Hansson, did.


Could be a cultural NIH syndrome. :)


'Many' is a word that gains its meaning in comparison to some (implied) quantity, and I'm not sure what you are comparing to: you list 5 programming languages but looking at Wikipedia's list of programming languages* five is clearly not very many at all.

Maybe you mean something like "as a share of top 10 most used languages" or "per capita"?

* https://en.wikipedia.org/wiki/List_of_programming_languages


They meant "many important and/or well known" - some things go without saying. They didn't mean to include INTERCAL or BRAINFUCK or OZ.


If we go by nationalities and this list, then that is 3 misses and maybe 1 hit in terms of language design for the Danes. Maybe I should check the creators of the next language I learn and avoid languages created by Danes?


Is Simula a forgotten language? While it is less used now, I thought it was generally acknowledged as a precursor to all OOP today, although Smalltalk is more often the pure OOP language that its defenders still cite.


The Erlang language was named after a Danish mathematician who "did the math" for the whole Erlang network thing I don't really understand.
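That mathematician is A. K. Erlang, and "the math" is early queueing theory: his B formula gives the probability that a call is blocked when a given traffic load is offered to a fixed number of lines with no queueing. A sketch of the standard textbook recurrence (general background, nothing Ericsson-specific):

```python
def erlang_b(traffic_erlangs: float, lines: int) -> float:
    # Erlang B blocking probability via the classic recurrence:
    #   B(E, 0) = 1
    #   B(E, k) = E*B(E, k-1) / (k + E*B(E, k-1))
    b = 1.0
    for k in range(1, lines + 1):
        b = traffic_erlangs * b / (k + traffic_erlangs * b)
    return b
```

For example, erlang_b(1.0, 1) is 0.5: offer one erlang of traffic to a single line and half the calls are blocked; adding lines drives the blocking probability down.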


Unfortunately he is thoroughly confused about OutImage.

> "OutImage is called to build the final Simula “Image”. From my digging it appears that OutImage is usually necessary at the end of a block, or a compiler error will be generated. I initially forgot OutImage though so, I imagine that maybe Portable Simula is just implicitly adding this if it’s missing, or it was changed in a later standard. Regardless I’ll leave it in from now."

In reality the "image" is just the current output line buffer, and OutImage prints it followed by a line feed. Very similar to Pascal later.

From the textbook:

> "OutImage causes the line which is currently being created - the current image - to be written to the printer or terminal and a new line to be started."
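The mechanics are easy to mimic (a loose Python analogy, not a Simula binding; the OutText/OutImage names are borrowed purely for illustration):

```python
class OutBuffer:
    # Simula builds each output line in a buffer called the "image".
    # out_text appends to it, like Simula's OutText; out_image emits
    # the completed line and starts a fresh one, like Simula's
    # OutImage (compare Pascal's write/writeln split).
    def __init__(self):
        self.image = []     # the line currently being created
        self.lines = []     # lines already emitted
    def out_text(self, s: str):
        self.image.append(s)
    def out_image(self):
        self.lines.append("".join(self.image))
        self.image.clear()
```

So forgetting OutImage doesn't break compilation; it just leaves the last partial line sitting in the buffer, which is why an implicit flush at program end is a plausible implementation choice.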


Simula still is less forgotten than its successor BETA.


Simula was the language my university teached object-oriented programming. This was around 1995 or so. Java had just entered public consciousness but it was too early to be considered. C++ was probably not considered pure enough to get the concepts across.

It was probably a good choice, for the time and for the course objectives. But I never used it since.


I was studying around the same time and it was all Scheme, Prolog, and C/C++, along with some obscure academic languages.


I am glad my first year at Uni (1994) they focused on SML a lot. Got a good base for the rest of our programming journey. Even now that I live in mostly Scala-land nearly 30 years later. And yes also a lot of other languages over the course's years. I still remember being forced to do bits of VHDL. We were taught the history of Simula (and Algol) but had to use C++ as our main object orientated language. They refused to teach my year Java so we had to learn that on our own.


>teached

Should be "taught". Just FYI.


Yeah I should have known that, thanks


At which university was that?


Lund University in Sweden


Ok, thanks. At Lund University, Simula 67 was used in education until 1997.


My first experience with an object-oriented language was with Simula in 1982, where it was used for one (if I remember correctly) course at the University of Twente.


A thread or blog post on more such forgotten programming languages would be interesting, and more so if it included historical anecdotes.


> ... By 1983 his “C with classes” programming language would be renamed to C++. Then, in 1991 another language would enter the playing field, Java, and the rest is history

Simula -> C++ -> Java.

Leaves out rather a lot of important object oriented programming languages.



