Ten influential programming languages (2020) (hillelwayne.com)
363 points by ksec on Dec 18, 2022 | 240 comments



So much is wrong with this and with the article linked in the first sentence.

Just a few notes.

To be pedantic, COBOL was invented by the Navy, mostly by Grace Murray Hopper (GMH). And to call it dead is not consistent with the humongous body of code that keeps many large institutions afloat.

And Fortran is very much alive in many scientific endeavors, including the amateur radio program WSJT-X, which implements various weak-signal protocols such as FT8, WSPR, and JT9. These were built by Joe Taylor, Nobel laureate in physics for work on pulsars and the decay of their orbital period due to gravitational radiation. These protocols are useful in weak-signal work such as amateur radio moonbounce.

And SQL was not invented by Oracle but by IBM.

There are too many other flaws in both articles to list here. The definition of death used here is suspect, and the reasons given are not well argued.

If one is truly interested in history, I suggest reading about the lives and influence of not only GMH and Jean Sammet but also the book covering the 120 programming languages in use in 1960.

Edit: Full disclosure: I’ve professionally programmed in a significant number of the languages here, and the beginning of my career predates the invention of several of the important languages noted in these articles. I have strong opinions about all of these languages and often share them with whoever wants to hear them and many who don’t. (Edit typo)


The author does acknowledge that COBOL and FORTRAN are still around in the ALGOL section:

> Of the four mother languages, ALGOL is the most “dead”; Everybody still knows about LISP, COBOL still powers tons of legacy systems, and most scientific packages still have some FORTRAN.

Which is kinda true. COBOL and Fortran are not "dead", but they're heavily "deprecated"; I don't think there are too many systems created today in COBOL vs just maintenance and support.

Maybe it's not the same for Fortran: it still powers a big chunk of Python's scientific stack (in the scipy package).


> COBOL and Fortran are not "dead", but they're heavily "deprecated"; I don't think there are too many systems created today in COBOL vs just maintenance and support.

I don't get the sense that they are except in some segments of programming. Do you have any reference that would show that they are deprecated?

And Fortran is much bigger than scipy.


I do know that Fortran is a lot bigger than Scipy, I was just mentioning that it's "so important and relevant (still)" that it powers the whole "Data Science" Python world.

I have no sources for my claim, just complete gut feeling. I'd bet my lunch that there are more new systems started in Java than COBOL. But I'm happy to be proven wrong and fast until dinner :)


Interesting, I assumed all high-performance packages for science and math used either C or C++!

Is there a reason to use Fortran instead?


From the standpoint of an optimizing compiler, Fortran is much easier to optimize. The existence of pointers in C makes that much more difficult. As Frances Allen said (paraphrasing), "The prevalence of C has set back programming language research by decades."
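
To make the aliasing point concrete, here is a minimal C/C++-style sketch (the function and variable names are made up for illustration, and __restrict is a non-standard extension):

    // The compiler must assume 'out' and 'scale' may point into the same
    // memory, so *scale is conservatively reloaded on every iteration and
    // vectorizing the loop is harder.
    void scale_all(double* out, const double* in, const double* scale, int n) {
        for (int i = 0; i < n; ++i)
            out[i] = in[i] * (*scale);
    }
    // Fortran dummy arguments are assumed not to alias, so the equivalent loop
    // needs no annotation; in C/C++ you have to add C99 'restrict' (or the
    // compiler-specific __restrict) to give the optimizer the same guarantee.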


Fortran has enough critical mass in that field. I.e. good libraries and applications are written in Fortran, so enough people in the field learn Fortran and use Fortran instead of spending time rewriting stuff that works and is well-tested in C++.


It's also that there is no recognition or other incentives for developing or maintaining software in these fields. There was an article a week or so ago about a growing panic in particle physics over an important C program maintained for 40 years by a single prof who will soon retire.


Basic also has tons of dialects that are still alive and is not just used for office macros. Visual Basic was a big language for desktop applications at one time.

Edit: Just looked it up. The last version of Visual Basic came out in 2019 so it's hardly dead.


Visual Basic .NET is as different from Visual Basic as C# from C++. They’re all different languages in their own right.

And Visual Basic is as different from BASIC as Java is from C.


I know dozens of mainframe devs (millennials) from my college still writing COBOL. Most of them would switch domains in a heartbeat if they could; I personally help two of them with courses for full-stack development using modern programming languages.

Morbidly, I was told that COVID has killed several senior COBOL developers and so the demand for those still writing COBOL is still very high in the financial space.


> the 120 programming languages in use in 1960

Geez, I think I'd struggle to name 120 programming languages in total right now.


Not 120, but a decent list of stuff I'd never heard of: https://en.wikipedia.org/wiki/Timeline_of_programming_langua...


And there have been more than a handful invented in the intervening 60 years.


I guess people reinventing the wheel with a new programming language isn't just some newer-age thing, but happened frequently decades ago too.


It was also common to invent "tiny languages" (e.g. PIC) and domain-specific languages (e.g. Awk, Yacc) to a greater extent than now. There was no real community to share work. I had heard of OO and C++ but didn't really understand them, so when I tried to make my own I made functions that dispatched on the type of the first argument, then another version which could dispatch on the types of any of the arguments. We would call this data-oriented now. I suspect that a lot of new stuff arose from such misunderstandings.


True, though I guess they were all working in more isolated ways than we are today too. Probably the compilers were very much tied to the machines they were written on as well (I know we have lots of architectures today but I get the impression it wasn't like today where x86 dominates).



> While SIMULA wasn’t the first “true” OOP language, it was the first language with proper objects and laid much of the groundwork that others would build on.

This is wrong. Simula 67 was the first object-oriented programming language, though the term is said to have been coined by Alan Kay, who specified Smalltalk-72. But he understood, and still understands, object orientation differently than we understand it today; our understanding today is closer to what Simula 67 introduced. Remarkably, Smalltalk-76 represented a significant departure from Smalltalk-72 toward the concepts already known from Simula 67; the primary differences of Smalltalk-76 from Simula were dynamic typing and the conception of even simple types as classes. Simula 67 was still actively used in the nineties (e.g. as a teaching language at Stockholm University until 1997). The performance was comparable to that of C++ or Pascal, and definitely faster than Smalltalk on comparable machines.


Surprised this doesn’t mention Smalltalk’s huge influence on two major living languages: Objective-C and Ruby, both of which use the Smalltalk object model, dynamic message passing paradigm and all, which is very different from how most other OOP implementations work.

In the long run: Objective-C will be gone in 20 years, but Ruby won’t. Outside of ObjC interop facilities, Swift will retain some of Objective-C’s Smalltalk heritage, as one of many influences.


Agreed. Leaving out Objective-C’s use of Smalltalk’s object model is a huge oversight.


The writing of history (historiography) is not meant to be a minutely detailed account of everything that happened. History is a narrative, a story about how we came to be here, and sees the past with a particular lens. I think this particular history was meant to bring appreciation of influential yet obscure and niche languages for the current generation of developers and software engineers. I doubt it was meant to be definitive or exhaustive.

In fact, at the very bottom of the article, there is a link to the Encyclopedia of Programming Languages with over 8000 entries. The influence of Smalltalk’s “everything is an object” on Objective-C and Ruby is probably in there somewhere.

Disclosure: I wrote Ruby professionally for over ten years, and learned the basics of OOP with ObjC back in the 90s. I get the significance of Smalltalk’s object model.


> The writing of history (historiography) is not meant to be a minutely detailed account of everything that happened. History is a narrative, a story about how we came to be here, and sees the past with a particular lens.

Of course I get all of that; however, the author also chronicled the influence of these languages, and Objective-C's use of Smalltalk's object model, in a language that gave macOS and iOS such a competitive advantage for many years, is probably worth a mention.


I once learned Ada for a NATO project. I was amazed at how simply you could start a thread and monitor it.

It's not dead, but you will never see it trending. It's just a niche language, as most of the languages in the post are.


Like some of the sibling posters, I'll use this opportunity to share my feelings on Ada. I checked it out a few years ago, and I was amazed at how much it has to offer. Despite being designed by committee in the late 70s, Ada has many features that remain novel by today's standards. I'd say Ada is a 'must see' for anyone interested in implementing new programming languages; there are many great lessons language designers can take away from Ada.


Bertrand Meyer's Eiffel programming language was heavily positively influenced by Ada.

https://en.wikipedia.org/wiki/Eiffel_(programming_language)

It was also heavily negatively influenced by C++. If you like reading lots of hilarious ideological, syntactic, and semantic ranting and raving about how terrible C++ is, pick up one of Bertrand Meyer's original Eiffel books (from before Java, which is another negative reaction to C++ in a different direction), you'll love it!

Where I think it broke down is with multithreading, because you can't reason about preconditions and postconditions of a method if there are other threads banging away at it. But maybe that's been solved since I read the Eiffel book.

Sather is an open source language that was originally based on Eiffel, but developed in its own direction. Named after the Sather Tower at Berkeley instead of the Eiffel Tower in Paris.

https://en.wikipedia.org/wiki/Sather

The Differences Between Sather and Eiffel

https://omohundro.files.wordpress.com/2009/03/omohundro91_sa...


One benefit I had from learning Ada was that it treats encapsulation, implementation hiding, inheritance, subclassing, message passing, etc. as separate mechanisms that you can opt into individually.

This is a huge difference compared to something like Java or Python where "everything is a class and a class is everything", and if you desire implementation hiding you sort of automatically also opt into encapsulation and subclassing and the rest of it.

When you learn the mechanisms individually, object-oriented programming starts to make a lot more sense! Ada is worth looking at just for that experience alone! Then the other things are a bonus.


This connects to another of Ada's advantages: language-integrated support for subsetting the language. In the C world, you can define a subset of the language, but if you want a tool to ensure compliance, you're on your own. With Ada, you can define a 'profile' and have the compiler check compliance.

This has been done with the Ravenscar profile (for real-time work) and, most famously, the SPARK profile (for formal verification).

(There isn't integrated support for user-defined style guide conformance though. From the perspective I've used here, MISRA C is both a subset and a style guide.)

Some ancient documentation on Ravenscar: https://gcc.gnu.org/onlinedocs/gcc-4.5.4/gnat_rm/Pragma-Prof...


Is there some good, free, open-source Ada implementation one can try out?


You can get going with the Alire package manager, which installs the Free Software Foundation's GNAT compiler (part of GCC) in a few minutes: https://ada-lang.io


> Despite being designed by committee in the late 70s

You say that like it stayed there. Ada has been updated and refined continually since the first standard in 1983. Ada 2012 is the latest approved, and Ada 202x is in review.

Just like Fortran (see Fortran 90, 95, 2003, 2008 & 2018).


My apologies. I was only referencing the typical criticisms people have of the language, most of which have absolutely nothing to do with the language itself. I'm very familiar with modern Ada. I've done lots of hobby programming in Ada 2012, and now Ada 202x.


You know what's a lot like Ada in a good way? Mesa, which evolved into Cedar, from Xerox PARC. I know people who really loved programming in it. They'd call it "Industrial Strength Pascal". It was a successful experiment in code reuse: a strongly typed language with strong separation between interfaces and implementations, which encouraged creating robust, hardened code.

https://en.wikipedia.org/wiki/Mesa_(programming_language)

>Mesa and Cedar had a major influence on the design of other important languages, such as Modula-2 and Java, and was an important vehicle for the development and dissemination of the fundamentals of GUIs, networked environments, and the other advances Xerox contributed to the field of computer science.

Demonstration of the Xerox PARC Cedar integrated environment (2019) [video] (youtube.com)

https://news.ycombinator.com/item?id=22375449

Computer History Museum: Eric Bier Demonstrates Cedar

https://www.youtube.com/watch?v=z_dt7NG38V4

Mark Weiser and others at Xerox PARC ported the Cedar environment to Unix, which resulted in the development of the still-widely-used Boehm–Demers–Weiser conservative garbage collector.

https://news.ycombinator.com/item?id=22378457

I believe that stuff is the port of Cedar to the Sun. Xerox PARC developed "Portable Common Runtime", which was basically the Cedar operating system runtime, on top of SunOS (1987 era SunOS, not Solaris, so no shared libraries or threads, which PCR had to provide). He demonstrates compiling a "Hello World" Cedar shell command, and (magically behind the scenes) dynamically linking it into the running shell and invoking it.

Experiences Creating a Portable Cedar.

Russ Atkinson, Alan Demers, Carl Hauser, Christian Jacobi, Peter Kessler, and Mark Weiser.

CSL-89-8 June 1989 [P89-00DD6]

http://www.bitsavers.org/pdf/xerox/parc/techReports/CSL-89-8...

>Abstract: Cedar is the name for both a language and an environment in use in the Computer Science Laboratory at Xerox PARC since 1980. The Cedar language is a superset of Mesa, the major additions being garbage collection and runtime types. Neither the language nor the environment was originally intended to be portable, and for many years ran only on D-machines at PARC and a few other locations in Xerox. We recently re-implemented the language to make it portable across many different architectures. Our strategy was, first, to use machine dependent C code as an intermediate language, second, to create a language-independent layer known as the Portable Common Runtime, and third, to write a relatively large amount of Cedar-specific runtime code in a subset of Cedar itself. By treating C as an intermediate code we are able to achieve reasonably fast compilation, very good eventual machine code, and all with relatively small programmer effort. Because Cedar is a much richer language than C, there were numerous issues to resolve in performing an efficient translation and in providing reasonable debugging. These strategies will be of use to many other porters of high-level languages who may wish to use C as an assembler language without giving up either ease of debugging or high performance. We present a brief description of the Cedar language, our portability strategy for the compiler and runtime, our manner of making connections to other languages and the Unix operating system, and some measures of the performance of our "Portable Cedar".

PCR implemented threads in user space as virtual lightweight processes on SunOS by running several heavyweight Unix processes memory-mapping the same main memory. And it also supported garbage collection. Mark Weiser worked on both PCR and the Boehm–Demers–Weiser garbage collector.

https://en.wikipedia.org/wiki/Boehm_garbage_collector

This is the 1988 "Garbage Collection in an Uncooperative Environment" paper by Hans-Juergen Boehm and Mark Weiser:

https://hboehm.info/spe_gc_paper/preprint.pdf

>Similarly, we treat any data inside the objects as potential pointers, to be followed if they, in turn, point to valid data objects. A similar approach, but restricted to procedure frames, was used in the Xerox Cedar programming environment [19].

[19] Rovner, Paul, ‘‘On Adding Garbage Collection and Runtime Types to a Strongly-Typed, Statically Checked, Concurrent Language’’, Report CSL-84-7, Xerox Palo Alto Research Center.

http://www.bitsavers.org/pdf/xerox/parc/techReports/CSL-84-7...

My guess is that the BDW garbage collector had its roots in PCR (pun intended, in fact this entire message was just an elaborate setup ;), but I don't know for sure the exact relationship between Cedar's garbage collector, PCR's garbage collector (which is specifically for Cedar code), and the Boehm–Demers–Weiser garbage collector (which is for general C code). Does anybody know how they influenced each other, shared code, or are otherwise related? Maybe there's a circular dependency!

https://news.ycombinator.com/item?id=24450970

Xerox Cedar “Viewers Window Package” (2018) (toastytech.com)

http://toastytech.com/guis/cedar.html

gumby on Sept 13, 2020:

This says “developed after the Star” but imho the Dandelion (marketed as the Star) was too slow for this environment and you needed one of the bigger machines (Dolphin or Dorado). Actually it’s kind of amazing to realize that two years later you could get a small Mac for about a fifth the price that sat on your desk (not rolled next to it on casters) and was much more responsive. It did less, but what it did it did well, and that was all that most people needed.

In addition to the Smalltalk and Mesa environments mentioned in the post, there was the Interlisp-D environment too, which saw much more use thanks to being available outside PARC.

pjmlp on Sept 13, 2020:

The Computer History Museum organized a session with Eric Bier, and several other folks demoing the Mesa/Cedar environment.

https://youtu.be/z_dt7NG38V4

The only modern environments that seem to have kept alive several of these ideas are Windows/.NET/COM, the ones designed by Apple/NeXT, and to a certain extent Android (although with a messed-up execution).

Even Linux could grasp many of these ideas, if D-BUS were properly taken advantage of and a specific development experience settled on.

Somehow it looks like we are still missing so much from Xerox PARC ideas.

----

The Cedar Programming Environment: A Midterm Report and Examination

http://www.bitsavers.org/pdf/xerox/parc/techReports/CSL-83-1...

C - Cedar/Mesa Interoperability

http://www.bitsavers.org/pdf/xerox/parc/cedar/C_-_Cedar_Mesa...

Describes Portable Common Runtime (PCR), and the PostScript and Interpress decomposers implemented in Cedar, and includes many other interesting document about Cedar.


I wish more language designers paid attention to how Ada did things. The language has some features that are very relevant to performance that even C++ or Rust lacks.

For example, Ada allows returning stack-allocated arrays with a variable number of elements. One can emulate that in C++ with various arena classes, but the resulting code is not that efficient and the usage is much more complex.
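
As a rough sketch of that emulation (the use of std::pmr::monotonic_buffer_resource and all names here are my own assumptions, not a claim about any particular codebase): the caller supplies stack-backed storage and the callee hands back a span into it, with sizing and lifetime managed by hand rather than by the language as in Ada:

    #include <cstddef>
    #include <memory_resource>
    #include <span>

    // The result is "on the stack" only in the sense that the arena's backing
    // buffer lives on the caller's stack; the language gives no help with
    // lifetimes, unlike Ada's unconstrained array returns.
    std::span<double> running_mean(std::span<const double> xs,
                                   std::pmr::monotonic_buffer_resource& arena) {
        auto* out = static_cast<double*>(
            arena.allocate(xs.size() * sizeof(double), alignof(double)));
        double acc = 0.0;
        for (std::size_t i = 0; i < xs.size(); ++i) {
            acc += xs[i];
            out[i] = acc / static_cast<double>(i + 1);  // running average so far
        }
        return {out, xs.size()};
    }

    int main() {
        std::byte buf[1024];                                 // caller's stack storage
        std::pmr::monotonic_buffer_resource arena(buf, sizeof buf);
        double data[] = {1.0, 2.0, 3.0, 4.0};
        auto avg = running_mean(data, arena);                // result lives in 'buf'
        return avg.size() == 4 ? 0 : 1;
    }

In Ada the equivalent is a plain function returning an unconstrained array; the compiler handles the secondary-stack bookkeeping, which is the point being made above.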


My only real exposure to Ada was a report for a class project in college. One tidbit that stuck with me is that there were a few DOD projects using Ada after it was first created which actually got done ahead of schedule and under budget. That always impressed me given the normal DOD software track record.

Also, I believe people place way too much emphasis on "popularity" of a language. If a language has enough traction to stay alive it can be valuable unless you're doing the most generic web dev stuff around. Heck there's cases where Perl still outshines modern competitors.


I’ll shout out for Ada as well. Really enjoyed working with it. The compiler was a nit-picking monster, but the rigour ensured that you really considered your program. Thanks for making me remember.


Unrelated, but how does one get a job at NATO?


I was offered a job at a company that implements NATO procedures. It's a company in Greece that builds radar software for airspace, among other defensive projects.

I stayed there only a month. I hated the objective of the job, I hated the custom UI library, I hated the way I had to test my code, and I hated how much I had to disengage from my morals to work on 'defense' projects.



Cobol is far from dead. It is not sexy and I don't know of any systems starting out today using it. But there are massive legacy systems still out there.

At least a couple years ago Cobol programmers were in high demand.

According to Microfocus [1 (2018)] over 2 million people worldwide are active full-time Cobol programmers, and it is running quite a few mission critical systems.

> Companies involved in keeping COBOL-based systems working say that 95 percent of ATM transactions pass through COBOL programs, 80 percent of in-person transactions rely on them, and over 40 percent of banks still use COBOL as the foundation of their systems.


Pascal similarly lives on with Delphi/Lazarus. I know plenty of currently supported commercial programs running on Delphi and considering a transition to Lazarus.

I've seen an uptick in APL and APL-derived array languages. "Niche" would have been a better term. I would put Smalltalk in the same category.

What about some commercial languages instead which are de-facto dead, such as ColdFusion, ActionScript, Lingo, and so on...


> and I dont know of any systems starting out today using it

I do... In a project I was working on for a major Dutch government organization, they were building stuff in COBOL. This was a project where we were using NLP for automated data extraction. I still find it hilarious that part of it was being built in COBOL.


It's not dead, ok.

Can we say it's mostly dead?

For example, is it more or less dead than perl?


>> Can we say it's mostly dead?

>> For example, is it more or less dead than perl?

== The current state of Perl 5 for Python fans ==

Perl 5: I'm not dead!

TIOBE: 'Ere! 'E says 'e's not dead!

Internet: Yes he is.

Perl 5: I'm not!

TIOBE: 'E isn't?

Internet: Well... he will be soon--he's very ill...

Perl 5: I'm getting better!

Internet: No you're not, you'll be stone dead in a moment.

TIOBE: I can't take 'im off like that! It's against regulations!

Perl 5: I don't want to go off the chart....

Internet: Oh, don't be such a baby.

TIOBE: I can't take 'im off....

Perl 5: I feel fine!

Internet: Well, do us a favor...

TIOBE: I can't!

Internet: Can you hang around a couple of minutes? He won't be long...

TIOBE: No, gotta get to Reddit, they lost nine today.

Internet: Well, when's your next round?

TIOBE: Next year.

Perl 5: I think I'll go for a walk....

Internet: You're not fooling anyone, you know-- (to TIOBE) Look, isn't there something you can do...?

Perl 5: I feel happy! I feel happy!


https://youtu.be/Jdf5EXo6I68?t=50

Monty Python is just pure genius :D


Less dead than perl by far. I've seen a lot of new COBOL going into production just this year. New versions of the compiler (the IBM one, the only one that really matters for COBOL) are still being released. A while back they went to the trouble of implementing full JSON serialization/deserialization. You would be surprised how many web services are pretty much just COBOL the minute the server needs to do anything with files or databases or whatnot.

As somebody that grew up on C and mostly doesn't work on the applications programming side of things, it was always tough to get my head around why so many people still wanted to write COBOL. Over the years, though, I've concluded that it's simply a better language for certain kinds of tasks than anything created since. And there has been no shortage of support in keeping it relevant and easy to integrate with modern systems.

It would be interesting to see the COBOL record IO and destructuring of files and whatnot implemented in a more modern language. I don't think anybody particularly cares for COBOL syntax, but the programming paradigm it supports doesn't have any other major players. Nobody has ever really even tried, probably because the types of people that usually create programming languages have no connections to the somewhat closed world of COBOL enthusiasts at large corporations. Certainly an interesting idea for a hobby project though; there are a lot of interesting places you could take a mixed COBOL/functional language type thing.

For instance, a function would take one file and output another, file structures and record structures could be interpreted as data types, and archiving files would essentially be treated as a side effect of the system (or potentially, if you get really crazy, make that some sort of functional abstraction as well). Just spitballing here, because it's a fantastic idea that I would love to see. There is no reason that working with files necessarily has to be treated as entirely external to the program/a side effect, so long as you take care to have some sort of way to deal with acquiring locks/contention issues and whatnot. Not a problem on mainframes, but there would need to be some systems features implemented to make that sort of paradigm effective in a Unix-style environment. I'm not sure it could be done entirely in a runtime; a lot of that state would really need to be global.


Microfocus COBOL also matters: there's a non-trivial amount of code that is still being worked on, in COBOL, that runs these days mainly on RHEL machines and is compiled with Microfocus.

As for handling the destructuring of files, its usefulness really shows when you consider how much easier many tasks become when you have access to such a facility, even if nowadays you might expect it to be handled by an extra compilation step that uses protocol buffers or something similar.


Modern programming languages are more than capable of providing an abstraction layer that can offer seamless file->struct and struct->file conversions.

It's basically the equivalent of ORM for relational database.

But nobody does it, not because it's hard or because the language doesn't support it. They don't do it because we no longer think of the file system as the main "database" that stores your application state.

Instead we think of the filesystem as a lower-level building block for higher-level abstractions (such as relational databases or whatnot).
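
As a hedged sketch of what that kind of abstraction could look like (the record layout, field names, and helpers below are invented for illustration, not taken from any real system), mapping a fixed-width COBOL-style record to and from a typed struct takes only a few lines in a modern language, e.g. C++:

    #include <charconv>
    #include <cstdio>
    #include <string>
    #include <string_view>

    // Hypothetical fixed-width layout: cols 0-9 id, 10-39 name, 40-49 balance (cents).
    struct CustomerRecord {
        long        id;
        std::string name;
        long        balance_cents;
    };

    // file -> struct: parse one fixed-width line into a typed record.
    CustomerRecord parse_record(std::string_view line) {
        CustomerRecord r{};
        std::from_chars(line.data(), line.data() + 10, r.id);
        r.name = std::string(line.substr(10, 30));
        std::from_chars(line.data() + 40, line.data() + 50, r.balance_cents);
        return r;
    }

    // struct -> file: format the record back into the same 50-column layout.
    std::string format_record(const CustomerRecord& r) {
        char buf[64];
        std::snprintf(buf, sizeof buf, "%010ld%-30s%010ld",
                      r.id, r.name.c_str(), r.balance_cents);
        return buf;
    }

In practice both functions would be generated from one declarative layout description, which is roughly what COBOL's record definitions and an ORM's column mappings each provide.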


At least in South America, I saw a lot of COBOL being used in almost every financial institution, even today.


Is someone who is no longer having children but is still a major team player at their employer “mostly dead”?


Since we're talking about languages, a better question would be "is a language in which no new works are written but which has a large catalog of past works that are still studied by thousands of people 'mostly dead'?"

Most would say yes.


The many COBOL programmers are not being paid to study a dead language, they are being paid to adapt, evolve, and maintain living systems written in that language.


> in which no new works are written

There's a lot of new code written in most of these languages. It might be existing systems, but the code being written is new.


Perl, in my opinion, will never be dead enough.


As I said above, I have opinions about each of the languages I have learned.


> BASIC was the first language with a real-time interpreter (the Dartmouth Time Sharing System), beating APL by a year.

Of course Steve Russell wrote the first Lisp interpreter years before that.

Not to take away from the significance of BASIC, but that wasn't it. I think, in fact, many of the "Significance" sections are rather condescending; COBOL was significant in being the first "mass" high-level language, transformational in the same way BASIC was (and more than BASIC).

FORTRAN was hugely influential on computing, is mentioned all through the article, yet isn't considered "influential"?


Fortran presumably didn't make the list because it's not dead. Scientific-computing labs still use it directly, and perhaps more importantly, if you write code today that does any kind of linear algebra, it's probably using foundational libraries that are written in Fortran and still actively maintained as such, even though most users call them through FFI from newer languages that offer a better developer experience.
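
A minimal sketch of that FFI pattern (assuming a BLAS implementation such as OpenBLAS is linked in, and the common trailing-underscore name mangling for Fortran symbols; both are platform-dependent assumptions): C++ calling the Fortran BLAS routine DGEMM directly:

    #include <array>
    #include <cstdio>

    // Declaration of the Fortran routine: everything is passed by reference,
    // and matrices are column-major, because that is what Fortran expects.
    extern "C" void dgemm_(const char* transa, const char* transb,
                           const int* m, const int* n, const int* k,
                           const double* alpha, const double* a, const int* lda,
                           const double* b, const int* ldb,
                           const double* beta, double* c, const int* ldc);

    int main() {
        // Two 2x2 matrices in column-major order; compute C = A * B.
        std::array<double, 4> a{1, 2, 3, 4}, b{5, 6, 7, 8}, c{};
        const int n = 2;
        const double alpha = 1.0, beta = 0.0;
        dgemm_("N", "N", &n, &n, &n, &alpha, a.data(), &n,
               b.data(), &n, &beta, c.data(), &n);
        std::printf("C = [%g %g; %g %g]\n", c[0], c[2], c[1], c[3]);
    }

Give or take a wrapper layer, this is essentially the call NumPy, SciPy, R, and Julia end up making under the hood.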


The only knowledge I have of Fortran not being dead is that its use in scientific computing is brought up in EVERY SINGLE ONE OF THESE PL CONVERSATIONS I've been in, dating back to the 1970s. (It was the first language I learned, if you don't count HP calculators.)

COBOL too.

Do the people maintaining these libraries get harangued on a weekly basis about how Fortran has undefined behavior and the PDP-11 and IBM 360 ISAs are not appropriate for modern multi level cached pipelined speculatively executed CPUs? How can their matrix multiplication libraries be as fast as they can be if they are assuming/relying on memory allocation layouts that don't exist any more in the hardware?


COBOL is not dead, even if few (if any) new projects are launched in it. APL is hardly dead; J is more an implementation than a descendant IMHO.


More like it's not considered mostly dead (still used and taught in high performance computing).


Maybe the "real-time" part means that it was not on a batch system, but worked on a computer that could be used interactively.


I wouldn't be so dismissive of Algol 68. It wasn't particularly successful in production, but then neither was Algol 60. OTOH conceptually the former feels closer to modern PL design, with a greater emphasis on consistency and orthogonality. And if we're talking about specific features that ended up in other languages, well - the C keywords "void", "struct" and "union" all came from Algol 68.


Algol 68 structs descend, at least partially, from JOVIAL (which was based on ALGOL 58). A language that is annoyingly still in use, from what I heard.


Not to mention "long" and "short". Also, yes, PL/I had a POINTER type, but just one. Algol 68 did what languages do now, i.e. specifies the type of thing the pointer points at (or the type of the thing the REF type references; speaking of Algol 68 vocabulary, "dereference" made it into common CS jargon).


Yup, I totally forgot "long" and "short". Although I'm not sure that counts as a lesson learned given that C still came up with "double" for what should have been "long float".

Typed pointers were an ALGOL-W thing originally, though.


I'd agree. I think the classification of Pascal as an Algol 60 derivative is a bit off: Wirth was on the committee that wrote Algol 68, and my impression is that Pascal is essentially all the easier bits of Algol 68.


Wirth was on the committee that worked on what was then referred to as "Algol X", but at the time there was no agreement on what exactly that would be. His (and Hoare's) proposal was more incremental relative to Algol 60, removing some features and adding others - most importantly, first-class strings, pointers, and structs - while retaining the overall syntax and semantics.

This proposal did not get enough support on the committee, though. Some people rejected it because they saw the additions as too minor, not warranting a whole new language so early. Others didn't like the fact that it relied on the same BNF + verbiage approach to defining the spec that had resulted in numerous ambiguities in Algol 60 (which was supposed to be rectified by the new formalism just developed by van Wijngaarden).

So Algol 68 ended up evolving in a different direction overall, which Wirth consistently criticized; he stepped down from IFIP after it published the first draft of the language. He took his own proposal and turned it into Algol W; and that, in turn, became Pascal.

So you could say that some of the concepts that made it into Pascal compared to Algol-60 - namely, pointers and records - had the same origin as those in Algol-68, but the point of divergence precedes either language.


When I read this:

> But everything ALGOL-68 did, PL/I did earlier and better.

I just wanted to write "Not so". As much as I appreciate the OP in general, his knowledge of Algol 68 is limited. Algol 68 was more than the introduction of first-class datatypes. Parallel programming comes to mind. In comparison to PL/I, there were other features such as automatic memory management. It helped to have had John McCarthy on the committee.


Learning Standard ML made me a better programmer (back in the day)... It was all about specifying how input and results are related instead of giving instructions to the CPU on what to do.


That chart with Java is amazing. And in my personal case, also true. Actually paid for something called SmalltalkAgents (which was pretty cool btw) just a bit before Java happened. Then Java happened.

On QKS' SmalltalkAgents: http://computer-programming-forum.com/3-smalltalk/fe67cb349c...

(amazed that there is not a single image of this software on the net.)


PL/1 was an IBM mainframe language; but Intel made a compiler for a stripped-down version targeting 8086, called PL/M (Programming Language for Microprocessors). PL/M was (I believe) the main language used in writing the CTOS/BTOS operating system (Convergent Technologies/Burroughs).

PL/M was a nice language. It was suited for system programming (structs, pointers). I first came across it in some introductory programming book I bought from a second-hand crate. A lot of my early programming lessons came from that book, even though I had no access to a PL/M compiler. I have no idea who wrote it, or what its title was.


PL/M was a nice language indeed; I used it on embedded 8085-based systems. But I don't think it had many overlaps with PL/I, though there were similarities.


My encounter with PL/I was with some 3270-style Wang terminals for their minicomputers. They were programmable by writing PL/I which got compiled and stored in the terminal's NVRAM. There wasn't much you couldn't do with it, I made macros that could copy blocks of text between two different terminal sessions as well as a bunch of cursor positioning and general clipboard operations. PL/I is quite a nice language.


Having programmed in both, the most similarity is in the name. PL/I is a vastly complicated language that needs a specialized grammar to describe, and PL/M works on microprocessors. Oh, and another flaw with the article is that PL/M was invented by Gary Kildall, not Intel.


Well, I said that Intel produced a compiler. I see from Wikipedia that Kildall did indeed develop PL/M; the same article says that Intel "promoted" it. The compiler that we used was branded Intel. It sounds like Kildall developed it for Intel.

[Edit] My notion that PL/M was a cut-down version of PL/I was clearly wrong; I never used PL/I, the closest I got was reviewing a few documentation pages.


I have a mild interest in obscure languages.

Pascal is almost dead, but there are still Delphi, FreePascal/Lazarus and Oxygene. They are all being actively developed. I just wish Embarcadero would stop charging an insane amount for Delphi.

Basic is even more nearly dead. There are still a few interesting things left like:

https://www.purebasic.com/

Up until a few years ago at least, and I think it is still the case, Epic (https://www.epic.com/) had a huge amount of their codebase in classic Visual Basic.


The problem with claims of Pascal being dead or almost dead, is there are competing interests and evangelists of rival languages who wish it to be dead, and Pascal/Object Pascal simply won't do them the favor. As partially shown by Object Pascal being consistently ranked around #15 (for many years) on the TIOBE index (sometimes a bit higher and sometimes a bit lower).

To say Pascal/Object Pascal is almost dead, is to ignore how much more used and taught the language is over known and hyped languages in the media such as Go, Rust, Swift, Julia, etc... Nobody says that those languages are "almost dead", yet Pascal/Object Pascal is as or more used than any of them.

One aspect of the confusion over whether Pascal is dead or not has to do with naming and marketing. Delphi is an IDE/compiler for the Object Pascal language. People may have heard the name Delphi, and know it's still very much alive, but have no idea or don't realize that the language is Object Pascal. In the same context, this goes for Oxygene and, to a lesser degree, for Free Pascal/Lazarus, PascalABC, etc... People aren't aware of how they connect as dialects of Pascal/Object Pascal, and get confused by the names and marketing.


Well, the real problem is that people are not writing native Windows applications anymore, and that's what Delphi was good at. And Embarcadero also priced themselves out of the market; they should have had a long hard look at JetBrains and how those guys thrived in an overcrowded market.


They certainly are, because not everything can be a Web app, and thankfully Electron is still a niche thing.

Borland priced themselves out of the market when they transitioned into Inprise; now what is left as customers are the people that pay for tooling, and sadly it is hard to go back in time to hobby prices.

See Qt as well.

I have the feeling JetBrains might fall into the same trap, as I see them trying to make a Delphi out of Kotlin.


> Basic is even more nearly dead

I would not say so. The BASIC used in the Office pack, the last child of the MS-BASIC/GWBASIC/QBASIC/QuickBasic/VisualBasic lineage, is still (unfortunately?) very much alive, and is probably powering a much bigger part of the modern world than we should be comfortable with.


VBA isn't that bad a language. It compiles to native and runs with impressive speed. There are lots of built-in libraries for numeric programming, among other things, and it's fairly easy to call into any DLL. The issue is its accessibility. As with JS and Python, people blame the language for what happens when you hand people with no software training, and an interest only in solving their immediate problem, a computer language they are productive in.

And how can we call BASIC dead when VB.NET is still in the TIOBE top 10?


I was also exposed to a variant of BASIC in high school through my scientific calculator. I wonder if that is still the case?


If you are thinking of TI calculators, it's very much still the case; at least until the Nspire range completely takes over from the older models.


Casio actually, but I am not surprised that TI would have done the same.


The "Why Pascal is not my favourite language" essay mentioned used to get cited frequently, but a lot of its criticisms are either irrelevant or just plain incorrect with respect to Delphi and co. e.g. "There is no 'break' statement" - In Delphi, there is. "There is no 'return' statement" - yes, there is.

It became a recurring circular argument at that point, with rounds of

"This essay is just wrong now."

"that just shows that Delphi is not Pascal, the essay is correct for _Pascal_ as standardised";

"then why does this essay keep being raised as if it's still relevant to Delphi, when it is not?"

"That's a misinterpretation",

"yes, one that newcomers make constantly. It's harmful."

The article is probably accurate that this essay was influential. "Was it correctly so?" is a different issue.


Discussed at the time:

Most(ly dead) Influential Programming Languages - https://news.ycombinator.com/item?id=22690229 - March 2020 (289 comments)

also Most(ly dead) Influential Programming Languages - https://news.ycombinator.com/item?id=24602741 - Sept 2020 (13 comments)


For reference, 4 of them are still in the current TIOBE top 50 (https://www.tiobe.com/tiobe-index/):

BASIC: 6 and 13

Pascal: 16 (Delphi/OP)

COBOL: 27

ML: 50

Mentioned as being in the next 50 (for reference, Elixir and Clojure hang out in this tier):

Smalltalk, APL

Not in TIOBE top 100:

ALGOL, PL/I, CLU, SIMULA 67


True facts regarding Java. When Java came out, it hindered the ability of other programming languages to grow. To date, Java remains one of the most popular languages.


Are you referring to this?

> Smalltalk wasn’t the only casualty of the “Javapocalypse”: Java also marginalized Eiffel, Ada95, and pretty much everything else in the OOP world

I think it stifled one set of languages and helped others, while helping the "enterprise software" C++ crowd acclimatize to 70s-era inventions like memory safety and garbage collection. Python, Ruby, and JVM languages like Clojure and Scala were helped, I think.


On that topic, there is a very good (and funny) talk by Bret Victor called "The Future of Programming" presented as if it were 1973.

It gives an interesting tour of dead languages and the things we lost.

https://www.youtube.com/watch?v=8pTEmbeENF4


This is one of the best talks I've seen, do you have any other recommendations?


"Simple made easy" [0] by Rich Hickey is a must-watch (even if you never touch Clojure)

[0] https://www.youtube.com/watch?v=SxdOUGdseq4


All talks by Bret Victor are great; they are listed on his website, but they are more concerned with the future of programming than with old languages: http://worrydream.com/

On the topic of language design, there is also this talk by Brian Kernighan which gives good practical advice if you're building a language and talks about his journey with awk: https://www.youtube.com/watch?v=Sg4U4r_AgJU


Stop Writing Dead Programs


+1

This talk explained to me why I feel better with languages with more dynamic features.


Most of these languages are neither dead nor dying. They are just niche.

I just started learning APL this year. It is still actively developed, and no, you don't need a special keyboard. Plus we have Unicode now, which makes the special symbols a non-issue these days.

Same with most of the other languages. There are Pharo and Squeak for Smalltalk, and the Pascal community has Free Pascal. Sure, those communities might not be huge, but they are not in acute danger of vanishing any time soon.

And COBOL is still carrying the economy.

Now some languages like ALGOL might actually be dying, sure.

Natural languages are considered dead when they lose their last native speaker. Similarly, when the last person able to use a programming language dies, we can consider that language dead.

Which means that we have lots of languages that died at childbirth, but once a programming language has managed to get over a certain popularity threshold, it is very hard to kill.

Languages don't need to win any popularity contests to be alive. Language maximalism in the sense that you need to be one of the most popular languages or you are considered a failure and dead is just silly.


> Natural languages are considered dead when they lose their last native speaker. Similarly, when the last person able to use a programming language dies, we can consider that language dead.

I don't think this is the right comparison. A native speaker would be more like someone who learned the language as their first or maybe second language, rather than someone who can use it at all. And by that metric, these languages are pretty much dead/dying, since they mostly have no new learners who aren't into PL history.


> since they mostly have no new learners who aren't into PL history.

They do have new learners; that was my whole point. Whether they learn out of historical interest, to become a better programmer in general, for a job, for research, or because they need it for a specific project does not matter. (And yes, all those reasons apply.)

Take a look at companies using APL: https://github.com/interregna/arraylanguage-companies

Or look how many people use it to solve Advent of Code.

As for learning them as a first language, if we applied that criterion then most programming languages would be born absolutely dead and stay there. I don't think anyone ever learned Elm or PureScript as their first language; are they dead?

I really don't get why people make such weird claims, declaring healthy and obviously alive communities to be dead. Again, things don't need to be popular to be alive.


I mean, I'm pretty sure most of those companies would view APL as a liability, just like companies that still use COBOL. The question really is about whether new, significant projects are being started in these languages.

People choose weird, novel, old, and esoteric languages for Advent of Code because it's fun and a way to stand out from the pack, or practice languages that might not ever get used otherwise. It's not evidence of a healthy community.

And yeah, I mean Elm and Purescript might not be "dead", but the fact that they are niche, largely non-general-purpose languages doesn't really mean that they are "healthy" either. I'd be willing to take a bet that both of these languages are effectively dead in 10 years.


There's a difference between learning a language as part of history, and actually being proficient at it. I know ALGOL 60 well enough (having tried to implement it), but I wouldn't consider myself a "native speaker" of it.


As a parallel, there are plenty of people that speak Latin.


Your first programming language isn't really analogous to your native language. There isn't actually a fair comparison between programming languages and natural languages. Active users of a programming language seems a fair enough measure of liveness.


Sure, but I think the question is what we mean by "active." I think "has anyone written code in this language recently" isn't a great metric, because by that measure pretty much any language that ever has had adoption is "alive" by virtue of being used in Advent of Code, as the OP points out. This just seems contrary to my intuition about what it means for a PL to be alive and have a healthy community. What I meant by "first language" was more about community growth, and there being people willing to mentor newbies, new projects being started for new users to hack on, etc.


Lazarus (Free Pascal) seems to be thriving, actually. APL is probably as big as it ever was. COBOL will shrink over time, but it may still outlast me. I have no insight into Smalltalk, but my university curriculum included it and it would not surprise me at all to see it used academically today. I was also taught Scheme and Fortran and those are still going strong. If I wanted to use any of these, I know I can easily find both free and commercial dev environments that run on platforms I still use.

Is ALGOL still in use anywhere though? Is there a compiler available that runs on anything modern? I'm genuinely curious.


My exposure to Smalltalk has basically been from professors who always referenced it as great and a pillar of OOP back in the day at university. Of course I was a contrarian student who was drinking the functional Kool-Aid at the time, so my response was usually "lol OOP" or something similarly clever. I investigated it a bit and my main takeaway was "elegant but slow". Being the mature student I was, I then proceeded to read up on OOP in detail the following semester and changed from "lol OOP" to "yeah, yeah, but it's so much better in Eiffel". I'm very thankful that I've since turned from a flamewar-trigger-happy person into a very pragmatic person that uses whatever language gets the job done while trying to find the value and neatness in each of them :)


Looks like an Algol special character was added to Unicode in 2009, so... maybe? It seems like Fortran and COBOL pop up here as dead languages still in use, but I haven't seen anyone mention Algol (or B, BCPL, or anything similar either).


Sure is lucky that the title of the post explicitly says "mostly dead" and includes a big disclaimer about how not all of them are dead, then!


Also, as was obscured by the title mangler, the "dead" in the title is just a side remark; the title without the parenthetical is "10 most influential programming languages", and the "(ly dead)" after "most" is just an aside.


Maybe we'll just revert to the linkbaity title in this case. I had it briefly as "Mostly-dead, influential programming languages (2020)" but that probably did more harm than good.

https://news.ycombinator.com/newsguidelines.html

Edit: that didn't work. I'm just going to take the whole 'dead' thing out.


Just curious: How do you determine whether the title change worked? Do you try it out and then check back later to see whether commenters are still getting hung up on the topic?


Yes, but less reliably than that.


Well, it's not a claim that these are the ten most influential programming languages. Lisp was clearly more influential than most of these, but it didn't make the list because it's still alive.

I think the title was mostly just a pun/riff on the title of the other post linked in the first paragraph, which did claim to be a list of the most influential languages, and doesn't really make sense if read as a specific concrete claim.


Think the author saw a way to sneak in a 'Princess Bride' reference and took it.


> Natural languages are considered dead when they lose their last native speaker. Similarly, when the last person able to use a programming language dies, we can consider that language dead.

Someone "able to use a programming language" more directly parallels someone able to speak a language at all, by which measure Latin would not yet be dead. If you're going to draw the line there, that would be what linguists term an "extinct language".

It's hard to draw parallels to natural language because there are no native speakers of programming languages. The closest parallel I can think of is that a language can be considered dead when no new projects are started in it. Once you've reached that point, the remaining work in the language is maintenance, which is comparable to people studying and translating old Sanskrit texts.

Alternatively, a language can be considered dead when it has stopped changing and frozen in its final form.

By either of these measures a lot of these still aren't dead, but my bet would be that COBOL counts.


> my bet would be that COBOL counts.

Didn't COBOL recently (in COBOL terms) get object oriented features?

See https://www.ibm.com/docs/en/cobol-zos/6.2?topic=programs-wri...


Always 30 years late on trends. I'm waiting for functional COBOL with linear types.


> And COBOL is still carrying the economy.

Not by choice but rather out of desperation and the necessity to continue running the mission-critical code, mostly in banking and finance; it is also occasionally encountered in government services, although its presence has greatly diminished in the last decade or so. Young developers have no interest in learning COBOL, and the old developers have already died several times, have been dug out of their graves several times and brought back to the frontlines, and there are no more of those to be had. So the mission-critical code continues to run on constantly upgraded mainframe hardware, sometimes having not been changed in a few decades.

Oddly enough, despite a few large-scale attempts to come up with automated COBOL -> whatever-language migration/translation tools, they seem to have been largely fruitless. I think it was IBM that hired a bunch of hardcore academics in the mid-2000s (?) to design and implement a COBOL to Java translator in Standard ML, and they even managed to produce something working. Unfortunately, the history then becomes murky and trails off, and the eventual fate of that creation is an enigma of sorts. And, then, earlier versions of Java (before Java 8 anyway) were a horror to behold and would probably require a Java < 8 into Java >= 8 translator, so…


I'd actually love to work with COBOL professionally; my problem is that COBOL jobs still pay substantially less than jobs using modern stacks.

For something so "critical" you'd think the base salaries would start at $250k/year; but I guess we aren't at that level of desperation yet.


Out of curiosity what about COBOL appeals to you? My first job out of college was working with RPG on an AS/400 (so I'd consider it at least a cousin) and the experience was awful.


Basically wanting to work with a true legacy system; colloquially I feel like "legacy" typically means an "abandoned php/java8/c#" project. COBOL projects are older than me!

I think it would really help me grow as an engineer to understand how these systems are maintained, how they are tested, and how they are updated; what better way than something several decades old?

I feel like this is hard to find in the modern era, especially post 2010 onward.


Ah, interesting. I will admit I learned a great deal in those 2 years, though not all of it was the things you said. For me it was that we had a lot of different codebases, because each client had their own needs (beyond everyone who used our payroll system, including printing checks, but that system rarely needed maintenance), so I had to learn to quickly read and understand a codebase to be able to make changes without spending days/weeks getting up to speed.


Interesting reading your comment as well! I honestly don't know what I would expect to learn; I was just writing what I think would happen. :D

Thinking about it more, I don't think I've met a single person that ever touched a COBOL code base throughout my career (talking from ICs up to VPs here). It seems the "oldest" was always some early Java project from the early 2000s.


Because at $250k/yr they'd have no trouble attracting engineers willing to learn COBOL. There isn't a current "emergency" a la Y2K to spike the demand for experienced COBOL programmers; they have time to hire anyone with a CS background and let them learn the language on the job so they can do low-priority legacy maintenance.


> […] and let them learn the language on-the-job so they can do low-priority legacy maintenance.

And that is a problem frequently faced with fresh graduates and young developers (no disrespect to them) whose common reaction to legacy programming languages and platforms is best characterised as «ew, this is gross». Even a heftier salary package does not do the job – they just don't want to learn legacy technologies.

As a personal anecdote, it is the situation I encountered with a few young devs (3-4 years of industry experience) in my team whom management had entrusted with learning and supporting a 4GL platform. They borderline flatly refused to learn it and left their employer shortly afterwards to code in React and other NodeJS frameworks. Which is regretful, as the 4GL language in question was a pretty modern and good design and could have been learned to get a glimpse into good design and coding practices, the intricacies of complex transaction processing, etc. All of which could have been reused pretty much anywhere outside 4GL. In my view, it was a sorely missed opportunity.


The '08 recession forced me into a job maintaining an old Ada system when I was a young developer. Toward the end of that experience I was put on a project that involved modernizing and updating said system by porting it from SPARC/Solaris to x86/RedHat. I also had to learn to read MATLAB because most of the new modules involved translating the work of scientists into Ada code.

A good learning experience, at least. I've long since moved on to more "modern" technologies.


What were your use cases for APL? I love it to bits in principle but have never used it seriously, despite all the efforts of Dyalog and Aaron Hsu.

Niche languages have a flavour that the mainstream spoils with feverish fads... I always like to read Perl or Tcl forums and am surprised by the ideas and productions.


Finance/quant people sometimes use J, which is a kind of APL:

https://en.wikipedia.org/wiki/J_(programming_language)

Edit: also K something something: https://en.wikipedia.org/wiki/Kx_Systems


APL is a great language for on-the-fly number crunching, which is pretty niche, like writing shorthand as a stenographer.


That's why he said MOSTLY dead. Mostly dead is slightly alive. With all dead, there's only one thing you can do: go through their pockets and look for loose change.


> the pascal community has free pascal

And lazarus.


and of course Delphi still has a lot of traction on Windows, though probably fading in favor of Lazarus


> Most of these languages are neither dead nor dying.

They were sent by Jack Sparrow to settle his debt. Technical debt of course.


It’s kinda amazing to me that the main discussion this article provoked is the “deadness” of the language, rather than the ideas and influence these languages have.

For example, buried in there were things about CLU and Argus, which got me reading the Argus paper to see if there was anything interesting I could learn about contemporary distributed or federated systems (like Mastodon).


That's mostly an effect of the title, which included the word 'dead' until I finally realized we needed to take it out.


Prior discussions:

https://news.ycombinator.com/item?id=22690229 (289 comments; March 26, 2020)

https://news.ycombinator.com/item?id=24602741 (13 comments; Sept 26, 2020)

Some other submissions but with 0 or 1 comment.


He does mention SNOBOL (and there was SPITBOL which was compiled, not interpreted). I'd like to think it influenced Awk, but I don't know if that's really true.

I had an interview with Rational Machines in the early 80s, and they were Ada fanatics. A friend of mine who also interviewed there said it was really more like a religious cult.


This was a great trip down memory lane for an old retired sometime programmer and software engineer. I learned programming in K & K BASIC on a GE time-share machine and FORTRAN II on an IBM 7090. I wrote production code in many of these languages (BASIC, COBOL, PL/I (not much), Pascal, Smalltalk (my favorite)). Among the odder "language" projects I ever worked on was a cross-compiler written in Pascal for a report generator. The compiler generated COBOL as its object code, which we chose because of its facility with reading database records and formatting output for fixed-spacing line printers.

I'm familiar with all but a couple of the languages in his list, and wrote code in most of them, although in some cases not code that ever went into a product or production environment. ML and CLU, though, I've never touched.


The article gives BASIC's cause of death as its being seen as a lesser, kids' language.

That sentiment may be true to some extent, but it fails to mention all of the problems of the BASIC language that made people look for alternatives, none of which were snobbery.

If that were a real reason, Python would have suffered a similar fate.


> The article gives BASIC's cause of death as its being seen as a lesser, kids' language.

Nah, its more that:

(1) The standardized, broadly compatible subset was a mostly unstructured, imperative language with very limited features, only global variable scope, named custom functions limited to a single expression (and a syntax that sharply limits what can be in a single expression), variables limited to numbers and strings, that was very tightly adapted to systems without modern capability (“line numbers” seem baroque now, but it really was a great way to have a simple, line-oriented text environment that worked as both REPL and simple dev environment.)

(2) While there were plenty of nonstandard versions with more modern features, they were mostly proprietary, often tied to very specific other pieces of software, and not particularly interoperable between them.

(3) Plenty of more modern languages, just as easy to use for simple cases, became available with broad cross-platform compatibility (sometimes, not always, with formal standardization) and without the limitations of BASIC’s broadly-compatible subset.

(4) Almost no computers came with “power on BASIC” anymore, so the unique role the ubiquitous use of BASIC in this way in the 8-bit home PC era gave it and which provided a support for its overall popularity went away.


> variables limited to numbers and strings

Not true. When I learned Applesoft BASIC in 1978–79 on an Apple ][+, it had arrays.


Correct, numbers, strings, and arrays of each of those would have been more accurate.


defined via 'dim'


BASIC illustrates the "one-screen problem," where a program becomes unreadable when it exceeds one screen, or perhaps one printed page. The one-screen problem is how I explain to beginners the value of things like subroutines with named arguments, local variables, and the like. Languages that survive have to work for the size of programs that people want or need to write.
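
As a small illustration (a made-up TypeScript fragment of my own, not anything from the article; the names are invented), a subroutine with named parameters and local variables lets a reader understand one screenful without holding the rest of the program in their head:

    // Hypothetical sketch: a self-contained subroutine with named parameters
    // and local variables, readable without scanning the whole file for globals.
    interface InvoiceLine {
      description: string;
      quantity: number;
      unitPrice: number;
    }

    function invoiceTotal(lines: InvoiceLine[], taxRate: number): number {
      // All state lives in locals; nothing leaks into (or depends on) globals.
      const subtotal = lines.reduce((sum, line) => sum + line.quantity * line.unitPrice, 0);
      return subtotal * (1 + taxRate);
    }

    console.log(invoiceTotal([{ description: "widget", quantity: 3, unitPrice: 2.5 }], 0.2));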


> BASIC illustrates the "one-screen problem," where a program becomes unreadable when it exceeds one screen, or perhaps one printed page.

Python has the same problem though, as do "scripting" languages in general. The limit may be some small number of "screens" rather than a literal single screenful of text, but either way it's quite impossible to program "in the large" with it. Even Go is hampered in this domain by its limited abstractions.


Oh, you really shouldn’t be comparing the unreadability of BASIC even to Unix shell scripts. Python is the pinnacle of readability and structure in comparison.

Let’s see…

Numbered lines as labels, GOTO everywhere. You got GOSUB if you were lucky.

All state propagated mostly via global variables. Subroutine parameters were present in dialects few and far between.

Now, variables. Maximum length of an identifier was what, 1? Or 2?

Inconsistent delimiters and syntax in general. Take GW-BASIC and its hot mess of graphical commands. Here you can’t have parentheses, but there you must. Semicolons here, commas there.

All the stuff that makes programs readable — consistent syntax, named procedures, reasonably long variable names, named labels if you must — all appeared very late, in QBASIC I think. But the rest of the BASIC world still had to cater to the lowest common denominator if your program was to be even mildly portable.


For starters, you didn't "get GOSUB if you were lucky". It dates back to the very first version of Dartmouth BASIC.

That aside, most of the limitations that you describe are applicable to BASIC as it was in 1970s (and some, like numeric-only labels and single-letter identifiers, are straight from the 60s). For example, GW-BASIC that you mention already had 40 significant characters in identifiers, WHILE loops, and single-expression user-defined functions.

I also don't recall anyone writing any serious amount of BASIC code while catering to the lowest common denominator for the sake of portability, simply because the dialects were different enough that it wasn't really feasible. Even on a single platform like DOS, going from e.g. Turbo BASIC to QBASIC required substantial changes.


That a language might overcome the "any hunk of code bigger than a screenful is too complex to manage" problem, other than by chopping the code into functions and classes or whatever, is news to me.

What are other methods for slaying this complexity? And what are the languages that use them?


I think it's all of the languages people complain are "too complex": to be able to deal with more of a program's complexity, you take on more language complexity.

e.g.:

• Encapsulation. Keeping implementation details private is pointless when you deal with a small program (the implementation is right there on your screen), but it becomes valuable when otherwise it'd be too hard to check which details you can change and which you can't.

• Strong type systems. All the types are obvious when the program fits in your head, and seem like unnecessary boilerplate. However, when the program is large, it off-loads a bunch of sanity checks from your head to the compiler.

Same goes for design patterns. Writing an `AbstractWidget` and `WidgetFactory` when you have two widgets to deal with is overcomplicating. But when you have 50 widgets, you need something to avoid drowning in copy-paste and spaghetti if/elses.
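
As a toy TypeScript sketch of that idea (all names are invented, not from any real codebase): with two widgets the factory looks like ceremony, but once there are dozens it's the one place that knows which concrete class to build.

    interface Widget {
      render(): string;
    }

    class Button implements Widget {
      render() { return "<button>OK</button>"; }
    }

    class Slider implements Widget {
      render() { return "<input type='range'>"; }
    }

    // A single registry maps a name to a constructor, so call sites never
    // grow a 50-arm if/else over concrete widget types.
    const widgetFactory: Record<string, () => Widget> = {
      button: () => new Button(),
      slider: () => new Slider(),
    };

    function createWidget(kind: string): Widget {
      const make = widgetFactory[kind];
      if (!make) throw new Error(`unknown widget kind: ${kind}`);
      return make();
    }

    console.log(createWidget("button").render());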


Ideally, the features needed for writing bigger codes should be optional. For instance, a beginner can write a Python script within a single screen that does something like graph some data. Encapsulation can be treated as an add-on, as can be the type system.

That's assuming you want scalability. If a language is absolutely never going to be used for scripting by beginners, it doesn't need to be used that way.


> Ideally, the features needed for writing bigger codes should be optional.

Newer languages like Rust, Swift and arguably Carbon don't make programming-in-the-large features "optional"; they make them as lean as possible, so that users can more easily apply them at the outset compared to older languages like Java/C#. That way even a screenful-length trivial script can potentially grow into a larger system without becoming unmaintainable in the process.


Visual Basic was very successful, until Microsoft turned it into a mere alternative syntax for C# [0]. There is a parallel universe in which classic Visual Basic would have continued to thrive as a win32 glue language. (It still does to a minor extent in the form of VBA.)

[0] http://catb.org/jargon/html/V/Visual-Fred.html


I agree. VB6 was actively executed by Microsoft. VB6 didn't have to die.

I'm still surprised to this day that nobody stepped into that vacuum and created a VB6 clone. There should have been a ton of money for doing so--I wonder what prevented it.

I still have clients who are struggling to upgrade millions of lines of VB6 code base into something more modern (generally Python). GUI construction sucks in every single language since VB6.

The nice part is that they are unanimous in not choosing a Microsoft language ever again.


I've done some porting of VB6 into C# with WinForms. It's actually a pretty direct mapping. I wrote a converter, and then spend a few minutes cleaning up the converted code per form. Took me a couple weeks to convert a 500KB program, which mostly involved going through and converting VB6-isms that failed to auto-convert into C#. I can't imagine trying to translate something like that into such a different language and UI environment as Python. Unfortunate that they'd reject such a thing just because C# came from Microsoft.


QBasic (actually QuickBASIC; QBasic being the IDE) fixed some of the more glaring deficiencies of the original BASIC and I think was pretty popular for quite some time. I tried it and didn't like it--as I recall, I found things like error handling awkward. But it was from Microsoft and stayed around for quite some time.

Although Turbo Pascal was also popular for a time, I'm not sure anything truly replaced these beginner-friendly languages until Python came along.


> QBasic (actually QuickBASIC; QBasic being the IDE)

QuickBasic and QBasic both included IDEs; QBasic was the later revision (1991, vs. QuickBasic in 1985).

(Weirdly, perhaps, Microsoft released Visual Basic the same year as QBasic.)


BASIC was very practical for 8 bit machines. You didn't even need an editor, because you could just retype a line. It was also very straightforward for a complete novice (as most people were with those machines).

In any other environment, it really doesn't make sense.


True. One feature that was missed about BASIC on those early computers: you’d turn on the Apple ][ and you were greeted by the BASIC prompt and you could just start typing your program.

This was before floppy disks… you could load a program from cassette tape.


In Atari BASIC and Commodore BASIC, you didn't even need to retype a line. You could cursor up and edit it right on the screen. This saved a ton of typing.


I would expand this list with: Perl (mostly dead, influential), and Ceylon and CoffeeScript (these last ones mostly dead and not so influential, let's be honest).

I also would say that ruby is a future mostly-dead language, but I don't want to sound too controversial.


CoffeeScript might just be one of the most influential languages. Tons of ES6 features were directly inspired by CoffeeScript, and ES6+ JavaScript is the most popular language in the world right now.
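
For anyone who hasn't compared the two, here are a few ES2015+ features commonly credited, at least in part, to CoffeeScript (my own TypeScript snippet, the CoffeeScript lines in comments are just for contrast):

    // CoffeeScript:  add = (a, b) -> a + b
    const add = (a: number, b: number): number => a + b;   // arrow functions
    // CoffeeScript splats and destructuring assignment became rest/spread
    const [first, ...rest] = [1, 2, 3];
    // CoffeeScript "#{...}" interpolation became template literals
    console.log(`add(${first}, ${rest[0]}) = ${add(first, rest[0])}`);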


I'm not sure I agree with you. I'm not denying that CoffeeScript might have inspired ES6+ features, but going from that to saying it's a very influential language, let alone one of the most influential, is a big leap. In my opinion, of course.


Each of the languages on the list influenced many subsequent languages, not just one.


> I also would say that ruby is a future mostly-dead language, but I don't want to sound too controversial.

Aren't you just saying it anyway? Why do you think this?


I developed a lot of projects with Ruby and yet I can't think of a single reason to opt for Ruby over Python or TypeScript. Ruby doesn't do anything better, in terms of either language or platform, than its already very well established competitors.

It's my understanding that Ruby rose to prominence mostly because of Ruby on Rails. Now that RoR is in a downward trend, I think Ruby will follow the same trend until it's reduced to a small community of enthusiasts, in the same way that happened to Perl.


I can think of one: Ruby has a very healthy ecosystem for web development, with a very dominant and popular framework. A lot of big companies seem to be invested in it to some degree. I don't think it has the brightest future, and it might have a long but steady decline ahead, but I don't see a risk of immediate death.

(This is not to talk about the merits of the language, which are really orthogonal to popularity.)


> That’s one reason I love studying history. To learn what we’ve lost and find it again.

Same here. There's a lot of amazing things which we have yet to learn from the 'also ran' systems of history.


Programming languages are more like machine tools. The old and really fun to watch metal planer is like assembler. It plows though things leaving a stress free flat surface, but it's less efficient to work with.

C is like a lathe, there are older and newer features, but they all work essentially the same.

Pascal is like a Bridgeport, easy to understand, and straightforward in use. Unfortunately the main vendor got greedy and drove up the price too high.

All the newer languages are like CNC machines, that come in all shapes and sizes.


... I fear then that Lisp (especially Common Lisp) is that insane custom-programmed set of 6dof robots with tool changers.


I worked at a bank and the truth of the matter is COBOL is never going away. Most large banks run on it.


It will go away one large bank bankruptcy at a time (perhaps several at a time in the case of a country bankruptcy). Who is going to use COBOL for a new project at a new bank?


"the question is why did C++ survive the Javapocalypse"...

I was thinking you can't run Java on AVR, but you can run C++ (Arduino..)

But here you go:

https://atmega32-avr.com/java-virtual-machine-for-the-atmel-...

Even so, this VM is written in C, with no JIT compiler on the 8K microcontroller, so C++ is much faster. Hmm, now I'm wondering if GCJ could work with avr-gcc...


That's just a suboptimal microcontroller choice. There are a lot of microcontrollers that directly execute the Java Micro Edition instruction set, for example the aJ-100.


The Windows Win32 API still uses the Pascal-style (stdcall) calling convention instead of the standard C one. A shadow from the dead.

And there was also the fourth-generation language (4GL) movement at that time: defined by code that reads like English sentences, without much explicit algorithm. It was more akin to our no-code/low-code trend. SQL was also a child of the 4GL era, while C/C++ were then seen as 3GL. And we have since watched 3GL ultimately win out over 4GL, at least until now, for half a century.

Tidbits.

Corrected: COBOL was 3GL.


COBOL came well before the 4th generation of programming languages - it was one of the earliest 3rd-gen languages. One could argue that like 4th-gen languages it was domain specific, but most programming languages back then were tailored to rather narrow domains.


As someone who still loves Perl I don't want to say Perl is mostly-dead, but it's dying and it influenced all popular scripting languages today.


Perl is not dying but in a process of metamorphosis, with hope to one day turn into the beautiful butterfly that is Raku.


It’s been dying for 15 years. First slowly then “all of a sudden”

What’s merlyn doing these days?

His response to the early warnings that “Perl is dying “ was “more people use Perl now more than ever”

The downfall of Perl should be a “case study”. Other communities should learn from the Perl community’s mistakes.

Java also had many problems for quite some time but its frequent update cycle seems to have helped.


>> What’s merlyn doing these days?

It looks like he's still doing Perl:

http://www.stonehenge.com/merlyn/

https://www.oreilly.com/library/view/learning-perl-8th/97814...


My personal list of dead but influential languages would include NewtonScript [1]:

    With the cancellation of the Newton project by Apple in 1998,[8] all
    further mainstream developments on NewtonScript were stopped. However,
    the features used in NewtonScript would continue to inspire other
     programming models and languages.
    
    The prototype-based object model of Self and NewtonScript was used in
    JavaScript, the most popular and visible language to use the concept so
    far.
    
    NewtonScript is also one of the conceptual ancestors (together with
    Smalltalk, Self, Act1, Lisp and Lua) of a general-purpose programming
    language called Io[9] which implements the same differential
    inheritance model, which was used in NewtonScript to conserve memory.
[1]: https://en.wikipedia.org/wiki/NewtonScript


Not sure why NewtonScript rather than Self. Self is the originator of prototype-based programming and differential inheritance (although the term differential programming was preferred). NewtonScript adopted them, as did JS and Io.


Another missing tidbit: when the Mac was introduced in 1984, Pascal (later Object Pascal) was the main language Apple supported for writing early Mac software [1].

Apple also shipped a version of UCSD Pascal for the Apple ][.

[1]: https://en.wikipedia.org/wiki/Object_Pascal


> That’s one reason I love studying history. To learn what we’ve lost and find it again.


I just want to say that I still enjoy BASIC, but via BlitzMax NG. It's a good language to quickly prototype things in - https://blitzmax.org/


Additional info about the influence of Simula on Smalltalk: http://worrydream.com/EarlyHistoryOfSmalltalk/


This is an excellent talk on designing successful programming languages by Brian Kernighan [1]. At the beginning of the lecture he shows a picture of the Tower of Babel depicting the plethora of programming languages, as it appeared on the cover of Communications of the ACM (CACM) in January 1961, and most of the programming languages mentioned in the article were already there!

[1] Brian Kernighan on successful language design:

https://youtu.be/Sg4U4r_AgJU


FORTH

Lives on in every stack machine


The whole point of this list is to chronicle influential (but "mostly dead") programming languages; if they were counted as alive because of their influences, then they'd still be alive.

But, also, Forth isn't on the list.


My point is FORTH should definitely be on the list. It is pretty much dead as an actual programming language in that no one programs in it anymore. But it was/is very influential... For example, PostScript is basically a dialect of FORTH.


> My point is FORTH should definitely be on the list. It is pretty much dead as an actual programming language in that no one programs in it anymore.

Forth—not dead!

The Forth community just had their 37th conference a few months ago [1].

It's obscure no doubt but it's still being used and dev tools are still being developed and updated. Apparently the FedEx handheld scanner runs on Forth and it's also used to create firmware and boot loaders [2] for various embedded systems [3].

[1]: https://forth-standard.org

[2]: https://www.freebsd.org/cgi/man.cgi?query=loader&sektion=8

[3]: https://www.forth.com/resources/forth-apps/


Yeah, FreeBSD is/was a big Forth user for its boot menu, but that’s being ported to Lua now.


Not just FreeBSD, but Sun SPARC, Apple PowerMac, Pegasos, IBM Power Systems, OLPC XO-1, and other systems.

https://en.wikipedia.org/wiki/Open_Firmware


> My point is FORTH should definitely be on the list. It is pretty much dead as an actual programming language in that no one programs in it anymore. But it was/is very influential...

Wow, I totally misunderstood your point. I'm sorry.


Pretty sure Algol (and not Forth) lived on in all the Burroughs large systems stack machines (B6700 and friends); they're still with us as the Unisys A series.


Yes, I agree FORTH should be probably be on the list, at least if the list was a few languages longer, for as influential as it's been.

WAForth for WebAssembly is beautiful and modern!

https://github.com/remko/waforth

It's lovingly crafted and hand-written in well-commented WebAssembly code, using Racket as a WebAssembly macro pre-processor.

I learned so much about WebAssembly by reading this and the supporting JavaScript plumbing.

The amazing thing is that the FORTH compiler dynamically compiles FORTH words into WebAssembly byte codes, and creates lots of tiny little WebAssembly modules dynamically that can call each other, by calling back to JavaScript to dynamically create and link modules, which it links together in the same memory and symbol address space on the fly! A real eye opener to me that it was possible to do that kind of stuff with dynamically generated WebAssembly code! It has many exciting and useful applications in other languages than FORTH, too.
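
To give a flavor of the trick (this is my own minimal TypeScript sketch, not WAForth's actual plumbing; the "env" import names and the "word" export are invented), each freshly compiled word becomes a tiny module that imports the shared memory and function table:

    // Shared state that every dynamically created module imports.
    const memory = new WebAssembly.Memory({ initial: 1 });
    const table = new WebAssembly.Table({ element: "anyfunc", initial: 64 });

    // `wordBytes` stands in for the bytecode of one freshly compiled word; the
    // generated module is assumed to import { memory, table } from "env" and to
    // export a single function named "word".
    async function linkWord(wordBytes: Uint8Array, slot: number): Promise<void> {
      const { instance } = await WebAssembly.instantiate(wordBytes, {
        env: { memory, table },
      });
      // Writing the export into the shared table makes the new word callable
      // (via call_indirect) from every module linked so far.
      table.set(slot, instance.exports.word as Function);
    }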

Lots more discussion and links in the reddit article.

But here's the beef, jump right in:

https://github.com/remko/waforth/blob/master/src/waforth.wat

Reddit /r/Forth discussion of WAForth:

https://www.reddit.com/r/Forth/comments/zmb4eb/waforth_wasmb...

remko:

Author here

If you can't be bothered to install VS Code, you can have a look at a standalone version of the example notebook (in a 26kB self-contained page).

And if you're planning to go to FOSDEM 2023, come say hi: I'll be giving a talk there on WebAssembly and Forth in the Declarative and Minimalistic Computing devroom.

DonHopkins:

I really love your tour-de-force design and implementation of WAForth, and I have learned a lot about WebAssembly by reading it. Never before have I seen such beautiful meticulously hand written and commented WebAssembly code.

Especially the compiler and runtime plumbing you've implemented that dynamically assembles bytecode and creates WebAssembly modules for every FORTH word definition, by calling back to JavaScript code that pulls the binary bytecode of compiled FORTH words out of memory and creates a new module with it pointing to the same function table and memory.

WebAssembly is a well designed open standard that's taking over the world in a good way, and it also runs efficiently not just in most browsers and mobile smartphones and pads, but also on the desktop, servers, cloud edge nodes, and embedded devices. And those are perfect target environments for FORTH!

What you've done with FORTH and WebAssembly is original, brilliant, audacious, and eye-opening!

I'd read the WebAssembly spec before, and used and studied the Unity3D WebAssembly runtime and compiler to integrate Unity3D with JavaScript, and I also studied the AssemblyScript subset of TypeScript targeting WebAssembly and its runtime, and also Aaron Turner's awesome wasmboy WebAssembly GameBoy emulator .

I first saw your project a few years ago and linked to it in this Hacker News discussion about Thoughts on Forth Programming because I thought it was cool, but it's come a long way in three years, and I'm glad I finally took the time to read some of your code, which was well worth the investment of time.

Until reading your code, I didn't grasp that it was possible to integrate WebAssembly with JavaScript like that, and use it to dynamically generate code the way you have!

Also, the way you used Racket as a macro assembler for WebAssembly was a practical and beautiful solution to the difficult problem of writing maintainable WebAssembly code by hand.

Even for people not planning on using FORTH, WAForth is an enlightening and useful example for learning about WebAssembly and its runtime, and a solid proof of concept that it's possible to dynamically generate and run WebAssembly code on the fly, and integrate a whole bunch of tiny little WebAssembly modules together.

Playing with and reading through your well commented code has really helped me understand WebAssembly and TypeScript and the surface between them at a much deeper level. Thank you for implementing and sharing it, and continuing to improve it too!

remko:

Wow, thanks a lot, I really appreciate that! It makes me very happy that I was able to get someone to learn something about WebAssembly by reading the source code, which is exactly what I was going for.

[More links and discussion of WAForth, WebAssembly, and Forth Metacompilers:]

https://www.reddit.com/r/Forth/comments/zmb4eb/waforth_wasmb...


Didn’t Pascal die due to some combination of C's and Java's popularity and the Borland team getting poached by Microsoft in the late '90s?


Pascal graduated to Delphi (Object Pascal with a built-in GUI designer) and had a very healthy run for a decade; it was my favorite language until Borland corporate stuff happened, along with a cross-licensing deal with Microsoft that turned Delphi into a sort of bastard sibling of .NET, and the new owner Embarcadero raised prices out of the reach of most of the small-time users that made up the bulk of the userbase. I was in a tiny factory-automation engineering group that used it to build user interfaces and data reporting for all the factory stations. Well, on that note, 3M corporate shut down the factory and moved it to China, so maybe other forces were in play to kill the userbase.


Was it the Delphi creator who left Borland, created C#, and then TypeScript?


Anders Hejlsberg.


In my youth, I learned BASIC and then Pascal before taking comp sci in high school. I was confident in both, and the course supported both languages, but by far the "not a programmer" types chose VB; that was their speed, and it was fine. Those of us who really enjoyed programming learned and loved Pascal/Delphi more, but I'd argue most of us transitioned to C/Java afterwards. Delphi took a middle-ground approach that wasn't entirely successful with either of these groups.

I still think the biggest mistake MS made with their tooling was burying VB6 and replacing it with VB.NET. They may have made VB a "real", full-featured language, but I think a bunch of soft-programming people just left and never came back. C# ate any of those that would've used VB.NET in this way anyway, IMHO.


When I was first exposed to C (perhaps in Byte Magazine), I was dead certain that Pascal would win out. I wonder if the minor optimizations achieved by programming closer to bare iron on early computers led to a minor performance advantage that outweighed the need for readable / maintainable code.


C did compile to more efficient code than Pascal, but the Delphi compiler processed code 10x faster than the C compiler and hit an amazing sweet spot for the 300 MHz workstations of the day, where you could iterate through edit/compile/run in Delphi just as fast as in Visual Basic, and when you were done the program ran an order of magnitude faster than VB (and maybe 90% as fast as C). Also, because the language was easier to parse, the tooling was able to interact with your code way better than Borland's equivalent product for C++ or any of Microsoft's products, so you were getting more accurate code completion, context-sensitive help, and all that.

I believe that Delphi was a vastly superior product to any of its contemporaries, and was killed purely by corporate forces.


C and Pascal were both fast, but C allowed programmers to dynamically allocate and free memory without having to deal with 255-character string limits and other limitations imposed by Pascal. It also allowed us to create some magnificent bugs, too.


My sense was Delphi was replaced more by Visual Basic, though it's entirely possible that's based on my personal experience and didn't represent a larger trend.


Turbo Pascal was popular because it was cheap at a time when buying a compiler could cost hundreds of dollars. But it really never got picked up widely beyond Borland products. BASIC options got a bit better. And, yes, C and Java were popular for "serious" programming.


Depends on what is your opinion about Delphi and Embarcadero


My friend is working on KlongPy (https://github.com/briangu/klongpy), which is a terse array-notation language similar to APL.

I'm curious: Is there anything interesting in APL that hasn't yet been implemented in NumPy, etc.?


Concise infix notation with combinators.


Surprised Prolog isn't mentioned. It greatly influenced Erlang (the first version was written in Prolog)


over decades there has been an enormous amount of discussion about the rise and fall of various programming languages, but somehow it forever remains a shallow analysis, more noise than signal. what do we really know about this phenomenon at the end of the day?

one problem is the difficulty of measuring what we are talking about: when is a language "dead", when the last program in production stops running? when the last programmer that can program in it retires? when the last school that is teaching it removes it from the curriculum? or maybe when the last computer scientist that can enhance it decides there are richer academic and/or commercial pickings elsewhere?

another problem is the objective measurement of any of the above definitions. besides the design and availability of surveys, objectivity was always a problem given that software is big business and hence various actors have incentives to skew the perceived state to support their aims. but it has become particularly problematic in the current interconnected universe of developers, which leads to very pronounced virality and network effects that may or may not persist.

finally, the scope of programming is ever expanding as more and more aspects of society and economic life go digital, but the process is still in full swing, materially incomplete and dominated by idiosyncratic oligopoly circumstances. think for example of the billion mobile devices around the planet that are currently practically impossible for their users to program. one could imagine, e.g., one of the "dead" languages (Basic?) suddenly returning from the grave as a ubiquitous mobile-device programming paradigm. a programming language that was useful in the past in a certain context may become useful again if a similar context reappears.

what would be interesting is a sort of "principal component analysis" of the programming language space: identifying the principal functionalities / capabilities provided by different languages. a dead language could then be defined as one that is completely dominated by (effectively a subset of) another one along all relevant principal component directions


> one problem is the difficulty of measuring what we are talking about: when is a language "dead", when the last program in production stops running? when the last programmer that can program in it retires? when the last school that is teaching it removes it from the curriculum? or maybe when the last computer scientist that can enhance it decides there are richer academic and/or commercial pickings elsewhere?

When no company willingly starts a greenfield project written in this language.


No Forth? Wow.


Can someone with real knowledge of the deployment and usage of these languages tell us where they are being used (beyond a dedicated hobbyist community)? Which industries? What systems? Volume of transactions? Genuinely interested and will appreciate it. Thank you in advance.


I don't have a ton of experience with any of them, but I've had a few run-ins with COBOL. Most notably, at my last job I was on a team actively working to replace a large legacy COBOL system for retail supply chain management. I never worked with the COBOL applications directly, but the stakeholders on my projects overlapped with users of that system. Less personally, but with more data points, there's a large employer in my city with a substantial code base in COBOL who actively recruit career changers and teach them COBOL. I've met several people who went that route while doing outreach and trying to mentor new developers in my community.

My general impression is that the large COBOL codebases are used across a lot of different parts of industry, and for a lot of different things. The most common feature is that COBOL applications continue to be used in organizations that are so dysfunctional and bureaucratic that they can't successfully agree on how to replace them. The general shape of the problem seems to be that business decisions were made decades ago and written into the codebase. Replacing the codebase would mean that someone at some level would need to sign off on requirements, which would mean taking responsibility for something, and the organizations that can't move off COBOL are so toxically risk averse that nobody will make an affirmative statement about any requirement. Instead, they manage to just barely get by with a combination of very experienced developers who can retire comfortably and DGAF and make decisions, and very junior (and poorly paid) developers too naive to push back or ask too many questions.


This was interesting, even as a non PL person. Everything before C and Basic is kind of hazy and distant to me. This is the first I’ve heard of the concept of four mother languages. I wonder if there is anything that explains it more.


Not mentioned in the article or in prior discussions, but Scala seems "dead" to me. Or at least extremely niche and unlikely to take off.

AFAICT, Scala has basically been replaced by Go for backends and Kotlin (or just plain Java) elsewhere.


Scala 3 was released like a year ago.

Yes, it is niche but not extremely niche. People either into functional programming or that are part of the JVM ecosystem have at least heard of it. It is one of the big alternative JVM languages with Clojure and Kotlin.

Sure, it won't ever overtake Java, and Kotlin is eating a bit of its lunch, but tech is not a popularity contest.


Scala was certainly more hyped before (for better or worse). If Scala is what you would call dead then other languages like Haskell, Clojure, Racket, OCaml all must be considered dead as well, which doesn't make much sense.



Pascal gives me flashbacks to the '90s in Germany; every prof was into it.


I think Pascal's sun set because Borland killed it with Delphi.

In my opinion, Borland was the 300 lb gorilla in the Pascal compiler arena, so when Borland decided to push Delphi over Pascal, there was no interest in filling the Pascal hole.

BTW, I wrote commercial products in Pascal that I thought were neat: a chat program that used IPX/SPX; a fax server using DESQview and MS Mail queues, both inbound and outbound; reverse-delta databases that stored all changes at the field level, overcoming the 640K memory barrier with a TSR that swapped the data in/out on demand; and a proto-PXE server using my insanely fast copy program over IPX. Those were the good old times.


I still use Delphi for my Windows desktop product and Lazarus for Linux desktop software


I have a very fond memories of Delphi, but if I need something custom nowadays (think utils, not something SAP-like) I just write it in PowerShell.


Pascal was the introductory computer language at my college in 1988. I don't remember much of it.


> Pascal was the introductory computer language at my college in 1988. I don't remember much of it.

I took Pascal as a college freshman in 1982. And even though I haven't programmed in it seriously since then, it stuck with me. It didn't hurt that it was a minimal language designed for teaching computer science. For example, when I learned Pascal, there was no string type; you had to create an array of characters.

I was checking out the Free Pascal website recently; although the language has been greatly expanded, Free Pascal source code still made sense to me.


A bit unrelated, but it relates to the features of different languages and who introduced them.

Does anyone else still remember how JavaScript prototypical inheritance was touted as a “good” idea?


I just posted a link about NewtonScript [1] in this thread, which had prototypical inheritance in the early 90s, which JavaScript later used.

Short answer: it's more memory efficient than a full-blown OOP system.

I was pretty active in the Newton MessagePad community back then and I knew lots of Newton developers; they loved NewtonScript and raved about prototypical inheritance.

[1]: https://news.ycombinator.com/item?id=34047775
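
A minimal sketch of what that looks like in today's JavaScript/TypeScript (my own example, not NewtonScript): a derived object stores only the slots that differ from its prototype, so many near-identical objects can share one set of defaults.

    // The prototype holds the defaults.
    const protoButton = {
      width: 100,
      height: 30,
      label: "OK",
      describe(this: { width: number; height: number; label: string }) {
        return `${this.label}: ${this.width}x${this.height}`;
      },
    };

    // The child stores only what differs; everything else is found by walking
    // the prototype chain at lookup time ("differential inheritance").
    const wideButton = Object.create(protoButton);
    wideButton.width = 200;

    console.log(wideButton.describe());    // "OK: 200x30"
    console.log(Object.keys(wideButton));  // ["width"], a single own slot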


That particular feature came from a language called Self, which was a prototype-based object dialect of Smalltalk invented at Xerox PARC in the late 1980s. https://en.wikipedia.org/wiki/Self_(programming_language)

It was one of several big language influences for Brendan Eich when he knocked out Javascript in a hurry a few years later.


When I was at 7th grade I only knew BASIC. Then someone told me about Pascal and it blew my mind, because it didn't have the GOTO statement.


Ada95 was good because it forced people to think about software and type safety.


Does anyone have book recommendation on learning COBOL?


Where is C++ on that list?


By what possible metric could you consider C++ "mostly dead"? I'm fairly sure that most, if not all AAA games nowadays are coded in C++, for example. Never mind such little known projects like GCC, Qt, and LLVM. It's not a language I care to use, but it's clearly still very healthy: widely used and regularly updated.


Not to mention that it's "influential" mostly by providing examples of what not to do in a sane programming language :-)


I am sure I saw its corpse on the road.

I kicked and kicked and it would not get up again.


Most programs you use and internet infrastructure are written in C and C++. Furthermore, it's actively used to develop new software everyday by most large companies.


I'd say your first comment is very true and your second is very untrue. There will be sectors that will generally gravitate to C/C++, and a huge boatload that won't touch them with a ten-foot pole. Corp languages are JVM-based, .NET-based, JS/TS, and maybe Go in some contexts now. None of the companies I've worked at in like 15 years has decided to create new C++ projects (clearly a subjective view).


None of the other programming language compilers in the world would build anymore if C++ were dead or dying.


There’s significantly less interest in implementing the next version of C++; Apple and Google don’t care about it. So someone’s going to have to update g++ and clang.


While many compilers are originally written in C or C++, most are soon rewritten in themselves.


And then they woke up.



