Some memories of Niklaus Wirth (odersky.github.io)
209 points by gautamcgoel 8 months ago | hide | past | favorite | 61 comments



I remember leaving a Scala event I was running and, from a cab, seeing Martin on a London bridge with a gaggle of guys following him. I wasn't aware he was a student of Niklaus Wirth -- but it makes a lot of sense.

It's a shame, though, that Wirth's philosophy was, imv, ditched from the 90s-00s (where we get: Python, JS, Ruby, ...Scala).

It seems language design is now rediscovering Wirth and his techniques (see, e.g., Google's Carbon, which IIRC seeks to be AST-free and single-pass).


There is something deeply attractive about writing an AST-free single-pass compiler. It's a great exercise to write a compiler that targets a scripting language like Lua or JS in a single pass. I think, though, that there are significant advantages to having an AST and multiple compilation passes, particularly on present-day computers that have effectively unlimited memory for the purpose of compiling a single source file.

Multiple passes allow you to break the problem down into separate orthogonal components. To support the same functionality in a single pass, you have to jam an enormous amount of functionality into one place, which can make it very difficult to see the forest for the trees, as well as introduce unintended coupling. It is also difficult to implement certain recursive features in a single pass (though this could be considered an advantage, depending on how powerful you intend the language to be).

Ideally, a compiler can also provide APIs that expose the AST at different phases. It is somewhat sad to me that tree-sitter has emerged as a lowest-common-denominator solution for syntax highlighting and semantic analysis. If a compiler could export, e.g., a JSON representation of the AST after parsing, that could be used for syntax highlighting instead of tree-sitter.
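To make that concrete, here's a minimal sketch of the kind of post-parse AST dump a compiler could expose. Everything here is invented for illustration (the mini-language, the JSON shape, the class names); it assumes Java 21 for sealed types and pattern matching in switch.

```java
// Hypothetical sketch: a tiny expression AST serialized to JSON, the kind of
// output a compiler could expose after its parse phase for editors/tools.
public class AstJson {
    sealed interface Expr permits Num, Add {}
    record Num(int value) implements Expr {}
    record Add(Expr left, Expr right) implements Expr {}

    // Serialize the AST to a JSON string by hand; no library needed for a demo.
    static String toJson(Expr e) {
        return switch (e) {
            case Num n -> "{\"kind\":\"Num\",\"value\":" + n.value() + "}";
            case Add a -> "{\"kind\":\"Add\",\"left\":" + toJson(a.left())
                        + ",\"right\":" + toJson(a.right()) + "}";
        };
    }

    public static void main(String[] args) {
        // AST for 1 + (2 + 3)
        Expr ast = new Add(new Num(1), new Add(new Num(2), new Num(3)));
        System.out.println(toJson(ast));
    }
}
```

An editor could consume this instead of re-parsing the language with a separate tree-sitter grammar.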


tree sitter seems to do pretty well in its limited role; why is this unfortunate in your opinion?


Not OP, but I think their point was clear - they seem to wish that compilers offered their own ASTs / tools for working with ASTs, so that tools like tree-sitter wouldn't exist at all.

It makes sense - tree-sitter is in large part a reimplementation of many, many parsers, and supporting a new language means duplicating yet another parser. The tree-sitter grammars are doomed by nature to forever chase the latest features in any evolving language, and reaching full compatibility turns out to be a ton of effort in many cases.

That said, language design being the freewheeling "if you can dream it, you can make it" thing it is, I doubt it's realistic to expect every language to have a fully reusable, standardized parser - people just won't ever be that consistent, IMO.

Those who want consistency will need to provide it themselves, in the form of tools like tree-sitter.


> Those who want consistency will need to provide it themselves, in the form of tools like tree-sitter.

Or in the form of radical simplicity. For the purpose of the parent's discussion, Lisp is the ultimate minimal language, since its syntax is its own AST. Other aspects of most Lisps are certainly not so simple.


Everything is a balance. Can you have the power of Python in a Wirth-like language? Pascal compiles fast into executables that are easy to distribute. Python is extremely flexible and easy to use for a wide array of problems, but is a packaging nightmare. Languages like C are blazing fast and easy to distribute, but require a lot of code to do what I can do in a few lines of Python.

I haven't used Swift, but I hear it may be a bit of a compromise, in that it compiles (albeit not in a single pass, I don't think) and using it is not that much more verbose than Python. I know Nim and Crystal are similar attempts, but Nim doesn't seem as easy to use to me.


The Jai, Zig, etc. approach is compile-time metaprogramming: make the compiler available to the program, and make the build system plug into the compiler.

Then I think you can do it. The compile-time is "a Wirth" and the runtime is "a Wirth".

You get 95% of the benefits of a Python, since 95% of the dynamics you want are actually just static transformations of the program.


I don't see how you can get 95% of the benefits of Python by programming in Zig. When I looked into Zig, it was like coding in C: not as many built-in data structures, and you have to manage memory manually, and all that. Zig seems like an attempt at a better C.


For most applications you don't need GC; you can just have a one-off temporary allocator. You only really need the malloc model, or the GC model, of memory management for long-running apps.

For Python-style apps, they all die quickly.


> ... but Nim doesn't seem as easy to use to me.

Hm... Can you elaborate why? For me, Nim is the easiest language to work with. It really feels like Python with static typing.

This one is probably a hot take, but I believe that static typing is vastly superior to, and easier to think about than, dynamic typing. In languages like Python, JS, Lua, etc. it's not like they don't have types; the types are merely hidden, and you still have to be aware of them. They just move the responsibility for handling type errors from the compiler to the programmer.

Or maybe because Nim is extensible? I know that a big selling point of Nim is macros, and macros in general are hard. But in my experience, I've never felt that I needed to write one when using Nim.


> (see, eg., google's carbon which iirc, seeks to be ast-free single-pass)

Carbon’s (mostly hypothetical) frontend is actually 3 passes: lexing, parsing, and a semantic analysis pass, each with separate IRs. One of their arguments is actually that they are more efficient as separate passes operating on packed data structures rather than e.g. a parser that calls into a lexer for each individual token. In a way, this is a claim that Wirth’s approach is inherently inefficient on contemporary CPU microarchitectures.
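A rough sketch of that packed-data idea (token codes and array layout are invented for illustration, not Carbon's actual IR): lex the whole input into flat int arrays up front, so the parser can then scan those arrays linearly instead of calling into the lexer one token at a time.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a lexer that emits a packed token stream as parallel int arrays
// (kind, start offset, length) rather than handing out one token object per
// parser call. Cache-friendly linear scans on top of linear scans.
public class PackedLexer {
    static final int NUM = 0, PLUS = 1; // invented token codes

    static int[][] lex(String src) {
        List<int[]> toks = new ArrayList<>();
        int i = 0;
        while (i < src.length()) {
            char c = src.charAt(i);
            if (c == '+') { toks.add(new int[]{PLUS, i, 1}); i++; }
            else if (Character.isDigit(c)) {
                int start = i;
                while (i < src.length() && Character.isDigit(src.charAt(i))) i++;
                toks.add(new int[]{NUM, start, i - start});
            } else i++; // skip whitespace and anything unrecognized
        }
        int n = toks.size();
        int[] kind = new int[n], start = new int[n], len = new int[n];
        for (int j = 0; j < n; j++) {
            kind[j] = toks.get(j)[0];
            start[j] = toks.get(j)[1];
            len[j] = toks.get(j)[2];
        }
        return new int[][]{kind, start, len};
    }

    public static void main(String[] args) {
        int[][] t = lex("12 + 34");
        System.out.println(java.util.Arrays.toString(t[0])); // kinds: NUM, PLUS, NUM
    }
}
```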


Hmm, I read it as "Wirthing" each layer, since, IIRC, the IR representation of each pass is in fact basically just a linear array of ints.

So you have linear scans on top of linear scans.


Beautiful semi-obituary + personal impact report.

Odersky has done very impressive work (I only recently learned about his work on TurboModula-2 - cool!), but his work is not focused on simplicity as Wirth's was - I wonder if he tried hard to adopt Wirth's simplicity mantra or not (given how he praises it in the article)?

Scala's type system is quite complicated, for instance, and the notion of "object-functional" itself, as a hybrid concept, lacks simplicity.


> [Odersky's] work is not focused on simplicity as Wirth's was - I wonder if he tried hard to adopt Wirth's simplicity mantra or not

I would beg to differ. I think he's very much aiming at simplicity. Btw, the essence of Scala really is simple. Take, for example, DOT (Dependent Object Types, the theoretical calculus Scala is based on); it is simple:

https://www.scala-lang.org/blog/2016/02/03/essence-of-scala....

as video https://www.youtube.com/watch?v=bZEWNKzhBoU

Or take his emphasis on simplicity in practical software engineering: https://www.youtube.com/watch?v=-qf8yteuxPs&pp=ygUObWFydGluI...

Then there is the question of practical concerns with Scala, like fitting onto the JVM, interoperability with Java, superficial features for programmer comfort (or the lack thereof), and the tendency of some people to use the most powerful features for the simplest of problems, which inevitably leads to messy codebases. And that certainly makes things complicated.

> the notion of "object-functional" itself, as a hybrid concept, lacks simplicity

I think this is misunderstanding about what Scala is all about. Scala is not supposed to be 50% functional and 50% object-oriented. It's supposed to be both 100% functional and 100% object-oriented (aka having powerful module system) _at the same time_. It's supposed to be a concrete proof that this dichotomy (FP vs OOP) is false and you can have both in the same language with the same features.

This is contrary to other languages which may have support for FP and OOP but have _separate_ support for each, like SML, OCaml or F#. Scala is in this tradition, but different: it supports FP and OOP with one set of features. And from this point of view it is _simpler_. Whether people find it _easy_ is another question.


> It's supposed to be both 100% functional and 100% object-oriented (aka having powerful module system) _at the same time_.

That is well said, and I agree.

And it certainly is an accomplishment; it's even in some ways elegant. But I would not call it "simple". Simple is a compiler book of less than 100 pages that _also_ includes the full source code of a compiler that can compile itself, or a language whose syntax graphs fit on a single paper page of ordinary size.

Odersky saw that to make an impact, you need to interact with an existing, large library ecosystem instead of "build it all yourself", so I'd say he is a realist-pragmatist whereas Wirth was a practicing (self-made) idealist-minimalist.

I think I will get out my old MODULA-2 book in honour of Wirth again and play with GNU Modula-2 a bit, let's see how that feels after 24 years of C++/Java/Python in-between.


Scala has accumulated a lot of cruft on its own over the years and it hasn't always succeeded, but at the same time I'd say that a core value has always seemed to be trying to unify Java's weird special cases ("why does every class member have to support being static, instead of just supporting first-class singletons?", "why are primitive types not part of the type hierarchy?", "why null?", or even "why are some type conversions hard-coded in the compiler?").


All I know about scala is it takes aeons to compile scala Minecraft mods, and few people know how to fix them.


They cleaned up a lot of warts (like implicits) with scala 3


I don't think implicits were a wart, and AFAIK in Scala 3 implicits effectively still exist, but with a different syntax (I've not touched it for a while, though, so I may be wrong).


Scala 2’s implicits have been a source of great pain in many Scala codebases at many different organizations. They are too powerful. The situation is comparable to languages that depend on gotos instead of structured control flow elements (for/while loops, if statements, function/subroutine calls).

I can’t speak for Scala 3 as I haven’t used it at all. If they’ve limited the power of implicits to a few more structured usages, it would be a great benefit to the language.


In Scala 3, implicits live on under a different name. They're called _given_s. But they're reduced to just one use case: propagating a context through (and deriving one given from others). And type classes are subsumed under this use case.

The other use cases have been essentially removed:

* Extension methods are now their own feature, relying on a completely different mechanism

* Automatic conversions have been severely curbed and de-emphasized; they are still there, but much more explicit


Too powerful? Or just ab/mis-used?


Interoperating with Java was a requirement. That immediately takes simplicity off the table if the goal is to bring FP to the JVM.


Admittedly this is from a distance, having only toyed a bit with each, but Clojure always seemed much simpler than Scala.


That follows from Clojure being dynamically typed, though. Different design constraints lead to different results.


Why's that?


Type interop on a VM that wasn't designed for a type system as complex as Scala's.

Collections interop between Java<->Scala. Scala's (immutable/mutable) collections are not built on the Java ones, and they provide helpers to translate between the two.

Calling conventions, name mangling, Java to Scala issues, etc.


FP doesn't need complex type systems though. And there's not a lot of type stuff on the JVM level, it's more dynamic than you'd think.


I’m not defending Scala's design, just describing the interop challenges given said design.

Scala was certainly envisioned with a strong type system in mind. The JVM doesn’t mandate that, true, but the compiler they wrote did.


Yeah, I too found the praise for Wirthian ideals funny in the context of Scala. But on the other hand, I believe Scala 3 did clean up a lot of things, so maybe there is still something there?


For the unaware: Odersky himself is a “modern day Wirth”. Amazing language designer. He’s responsible for generics in Java and the Scala language.


It’s interesting that Odersky helped put generics in Java and also made Scala. Don Syme helped put generics in .NET, and then made F#. It feels like two people on a very similar trajectory.


I didn't know Syme helped with generics in .NET. Very interesting indeed.

I think it makes sense for these people to create their own "perfect" languages. I don't know C#, but I know (or knew) Java very well. And generics were always a half-baked option for a broken language.

From a purist's perspective, in Java "everything is an object", except ints or floats, but then we have Integer or BigInteger, but then we have `null`, but then... if Integer extends Number, does that mean that a List<Integer> extends List<Number>? Nobody knows...
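The language does answer that last puzzle, for what it's worth: Java generics are invariant, so a List&lt;Integer&gt; is not a List&lt;Number&gt;, and wildcards are the escape hatch. A minimal illustration (the class and method names are mine):

```java
import java.util.List;

// Java generics are invariant: List<Integer> is NOT a subtype of List<Number>,
// even though Integer extends Number. A wildcard recovers read-only covariance.
public class Variance {
    static double sum(List<? extends Number> xs) {
        double total = 0;
        for (Number n : xs) total += n.doubleValue();
        return total;
    }

    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2, 3);
        // List<Number> nums = ints;   // does not compile: invariance
        System.out.println(sum(ints)); // the wildcard accepts List<Integer>
    }
}
```

Scala answers the same question differently, with declaration-site variance annotations (`List[+A]`).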

Those things are what I feel Odersky wanted to solve and couldn't with Java, so he went "Bender Style".


The question "how do we make functional languages popular (useful?) in the enterprise?" was in the air for, I'd say, decades. The 00s answer was to create a merger between enterprise langs (Java, C#, etc.) and functional langs in a new language.

The net result, as we know now, was to push those langs to adopt functional features -- rather than to shift to others.

I suspect the question for the niche programming community, of the next decade, will be "how do we make effect systems popular?" or some such thing.

Or, "how do we make AOT metaprogramming popular?" etc. (cf., jai, zig... carbon, circle, ...)

I suspect that if a post-C++ new wave gets popular, with structural typing and CT metaprogramming, we might see the same in C# quite easily.


You already get them with code generators and Roslyn, but it is a bit painful to use.


There is also a new compiler feature, "interceptors", that swaps in calls at compile time. https://learn.microsoft.com/en-us/dotnet/csharp/whats-new/cs...

This looks very hard to use well, but that's because "it's not for you" to use directly in app code; rather, it's a low-level building block for such AOT and code-generation patterns and toolkits.


Agreed. Roslyn analyzers, code generators, refactoring providers and other extension points are not designed for application developers but are rather building blocks.

We built a code generation and analysis framework based on these Roslyn extension points. It's called Metalama: https://doc.postsharp.net/metalama.


I hate it, as it is a bad approach to what should have been a proper AOP framework like PostSharp or Microsoft Fakes.

Instead we will end up with a special-cased mini-framework, designed to override reflection code for the purposes of AOT code generation.

Something that AOT toolchains for Java have solved much better with agents, PGO and config files.

From my point of view, it is neither a good approach to sort that out, nor a good AOP alternative.


Sorry, but Odersky is nothing like Wirth.

Scala failed hard because it is a kitchen sink of every half-baked feature someone wanted to write a PhD thesis about. The graphs of the interdependencies of the standard library are an excellent example of a totally insane design, and the Scala data structures have been at least an order of magnitude slower than the JVM native ones. Don't get me started about the tooling, which is too slow for any real-world project. (The only ones I regularly see using Emacs are Scala developers, because opening projects in an IDE like IDEA could take up to 30 min on high-end workstations.)

Wirth valued clean design, speed and simplicity. Odersky wants to compete with C++ for complexity. As the article stated, for Wirth a feature would have to pay for itself regarding complexity, speed and usability. If someone can demonstrate how Scala's features came to pay for themselves, I would appreciate a pointer.

Finally... generics in Java are a shit show, thanks to type erasure. Fair enough, Odersky was probably forced into this implementation thanks to backwards compatibility; still, nothing to be proud of.


Type erasure is an interesting discussion, because functional programmers love type erasure. Consider the function:

    def foo[T](t: T): T    
If the function is truly generic, then we know nothing about the implementation details of T. The only meaningful implementation of foo is identity:

    def foo[T](t: T): T = t
From an FP perspective, T should be erased, and the programmer should not cheat and use reflection. The type signature should tell you much about what the function does.

I've used Scala professionally for the last 8 years. The tooling is fine, assuming you avoid exotic libraries like Shapeless. Compile times are fast enough that I don't think about them much, using IntelliJ incremental compilation. A full recompile of a microservice might take 30 seconds; whatever it is, it's not a big deal IMO.

> opening projects in an IDE like IDEA could take up to 30 min on high end workstations

This is insanely wrong. Maybe you are operating on experiences from 10 years ago? I have a new M2 MacBook Pro, and opening projects is quite fast. The first time you do it, it will resolve the SBT dependencies and index files. That can be done in the background, and is probably less than a minute for a typical project. Even my previous laptop could open projects quickly despite being 3 years old.

> the Scala data structures have been at least an order of magnitude slower than the JVM native ones

They are slower, but not an order of magnitude slower. Perhaps you are remembering the infamous email from the Yammer CTO that got leaked, but that was a long time ago, and the compiler and libraries have improved greatly since then.

But yes, in performance critical code, you can just switch to an imperative style and use Java collections.

I don't have time to explain the benefits of Scala features, but I'll just point out that other than implicits, many ML inspired features of Scala have made their way into modern languages, including Java, C#, Rust and Swift. Scala didn't invent those ideas, but repackaged them in a novel way.


IMHO the keyword is 'microservice': compile times of 30 sec for a microservice are not acceptable to me, btw. The projects I am referring to were bigger, and the times to compile/open projects in IDEA were _not_ acceptable at all. (Btw.: I am speaking about Scala experts, not some new grads who didn't know what they were doing.)

Concerning your type erasure example, the id function obviously doesn't need to know anything about types. In the real world, types have traits/interfaces/expected attributes etc., and type erasure prevents the compiler from verifying this when using binary dependencies, which is obviously _not_ what one wants in a statically compiled language.


I just compiled our oldest, largest microservice; it took 17 seconds. This is a full recompile, which I rarely need to do, because IntelliJ has incremental compilation. If you have a giant mono-repo, then compile time becomes important.

I can only guess what these "Scala experts" were doing; there are certain language features that can slow your compile times, and libraries like Shapeless that leverage those features in amazing ways but can kill your compile times.

Scala is a toolkit that enables amazing things, but you have to understand the trade-offs, both in terms of compile time, and understandability of the code.

>In the real world, types have traits/interfaces/expected attributes etc. and type erasure prevents the compiler to verify this when using binary dependencies, which is obviously _not_ what one wants in a statically compiled language.

Scala has ways of recovering the information, such as using context-bounds and type manifests. Scala can actually verify an amazing amount of constraints at compile time, but the more exotic tricks will increase compile time.
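For readers coming from the Java side, the plain-Java analog of recovering erased type information is passing an explicit Class&lt;T&gt; token (roughly what Scala's ClassTag context bound generates for you). The helper below is a hypothetical sketch, not a library API:

```java
import java.lang.reflect.Array;

// Passing an explicit Class<T> token lets a generic method recover the erased
// type at runtime, e.g. to allocate a properly typed array. This is the manual
// Java equivalent of Scala's `def f[T: ClassTag]` context bound.
public class Tokens {
    @SuppressWarnings("unchecked")
    static <T> T[] newArray(Class<T> elem, int n) {
        return (T[]) Array.newInstance(elem, n);
    }

    public static void main(String[] args) {
        String[] xs = newArray(String.class, 3);
        System.out.println(xs.getClass().getComponentType().getSimpleName()
                + "[" + xs.length + "]");
    }
}
```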

What you are complaining about are foot-guns that have rarely come up in my 20 years of experience with JVM languages (I think generics were added in Java 5, but still...)


> it is kitchen sink of every half baked feature someone wanted to write a PhD thesis about.

Having been a PhD student in his lab between 2005-2010, I can attest that's not true. Language design was definitely Martin's prerogative. As a PhD student, a lot of time went into implementation efforts, and while several features did end up in PhD theses, the last word on what goes in the language was Martin's.

Scala was ultimately a research project, with ambitions to succeed outside of academia. As such, some ideas worked better than others (pattern matching and case classes vs specialization). As it became clear that the language was picking up in industry (led by Twitter around 2008), many of these experiments moved into compiler plugins, macro libraries or research forks.

> Sorry, but Odersky is nothing like Wirth.

I haven't known Wirth to venture into such broad statements, but given what's been written about the way he led his lab, I would say there were many similarities: a strong bias for solid implementation work, the (bootstrapping) compiler as a litmus test for features or performance work. And yes, a focus on simplicity.

Unfortunately, simplicity is often confused with familiarity. What's simpler: having statements and expressions as separate concepts that don't mix well, or only expressions? Most people coming from C and Java would have internalized the dichotomy and find it (or at least, back in the day, found it) "complex" to think of every expression as having a type ('void' vs 'Unit'). The same goes for the split between primitive and reference types vs a unified type hierarchy.

This is not to say Scala doesn't have its warts, and implicits (in particular, implicit resolution rules) combined with macros could lead to a lot of pain. Hopefully there are lessons learned there and Scala 3 is better.

> Scala failed hard

Far from obvious. Databricks probably has upwards of 10 MLOC of Scala code, and seems to be doing very well (https://www.lihaoyi.com/post/TheDeathofHypeWhatsNextforScala...). Plenty of other examples.

Could Scala have been more successful? Undoubtedly. But it's far from a "hard failure". New languages have adopted many Scala features, so nowadays Scala is believed to pay for itself only when using pure-FP libraries. That's very unfortunate if you ask me, since I believe there's a pragmatic sweet spot that lies around the style best illustrated by Haoyi Li's ecosystem of libraries.


Thanks for the inside story.

Let me elaborate: Scala failed hard in the sense that it was IMHO a far superior language compared to Java around 2010 (! JDK 7/8 times) and is now basically dead for new projects (unless there are some die-hard Scala fanatics on a team, and even they are moving to greener pastures); also see how Kotlin succeeds everywhere at the moment.

What I totally don't get is that Scala's failure was _not_ surprising at all, and there were a lot of kind people giving constructive feedback on why Scala fails in the industry (example: https://gist.github.com/alexo/1406271):

- Slow compile times

- No binary compatibility even between minor updates

- Every feature under the sun was stuffed into Scala, making it impossible to transfer projects to 'industry programmers' w/o too extensive training

- Tooling support (like IDEs) was extremely lacking/slow/bad

- Not to speak about the community infights about the right way(TM) to approach a problem

Personal experience from me: Scala was too slow/cumbersome to use, with subpar tooling. And I consider myself the target group: in love with FP but forced to deploy on the JVM. Besides my own experience, I saw teams of Scala developers fail to materialize any significant benefit in real-world projects over 'dumber' programming languages, not even speaking of transferring Scala projects to non-academic 'elite' teams.

I like some ideas in Scala 3, and IMHO it is sad that Kotlin (which is IMHO just syntactic sugar over Java) gets so much attention, but in the end Scala had plenty of years to fix its problems, and its failure comes as no surprise, because there was plenty of feedback. Are there still some Scala projects around? Yes, mostly Scala 2, because, surprise, libraries still don't have binary compatibility etc. For Scala 3 I have seen neither industry adoption nor any enthusiasm from a wider community.


This all sounds right to me - scala needs first class support for what it is doing in the JVM itself - which it doesn’t have. Trying to ASM scala is a pain.


Come on now, Scala has its issues, but statements like "because opening projects in an IDE like IDEA could take up to 30 min on high end workstations" are just not true unless your project is straight up broken.


So the modern-day Wirth is quite different from the antiquity Wirth. Java generics are quite complex; there are multi-hundred-page books to explain it all. Scala is a really clever language, perfect for 1 line of code doing the equivalent of 50 lines of Java. Slow compile times and immature tooling are other important features of Scala.


Current Scala compile times really aren't that bad. Yeah, Go will be 10x faster, but then you have to use Go...

For a typical micro service, I never think about compile time, thanks to incremental compilation. Even a full compile is not bad. Tooling is also fine, although Scala 3 set things back. IntelliJ now supports Scala 3 adequately.


Well, it's like one can always tolerate shortcomings when they like something/someone. Reasons can always be added later for liking something, but rarely the other way around.


Look, I've been using Scala professionally for 8 years. I don't notice that my compile times are any slower than Java's, although I haven't tried any meaningful comparison. There are some notorious libraries like Shapeless that can kill your compile times; the solution is to not do that, unless the trade-off is worth it to you.

It could be 1.2x slower, maybe even 2x, and I wouldn't notice, because it is fast enough. IntelliJ has incremental compilation. Compile times only matter when I do a full rebuild, and it is still fast enough.

If I had to build a giant mono-repo like at Google, then the compile speed of languages like Go become important.


There was a language called Pizza that Odersky wrote while in Karlsruhe (before moving to EPFL Lausanne), and that became part of "standard" (not as in ISO, but as in Sun-approved) Java as "generics".

https://en.wikipedia.org/wiki/Pizza_(programming_language) https://www.reddit.com/r/scala/comments/8c7h8f/history_the_p...


I learned programming mostly with Wirth's Pascal quite a few years ago. Now I am employed in a company whose primary language for backend is Odersky's Scala.

I have fond memories of the former and greatly enjoy the latter.

It's only now that I learn Odersky was Wirth's student. It clicks.


It'd be interesting to know what the other five PhD students did. I'd guess most of them were successful, not necessarily in the domain of programming language design.


It's easy to check, they seem to have done well: https://www.genealogy.math.ndsu.nodak.edu/id.php?id=61847&fC...


Modula-3 (not designed by Wirth, but tangentially mentioned in the article) was a practical disaster. Phenomenally verbose, and with a type system implementation that could easily confuse a complex class with an int.

It also had go-like structural interfaces, if that floats your boat. Pretty much everyone I knew hated working with it.


I liked it, have almost all the published books, and enjoy that Swift, Java and C# have taken several ideas from it; even better now that AOT is finally part of the whole package instead of a commercial offering.

D as well, but nowadays they are fighting with a dwellig community.


'fighting with a dwellig community'? i thought i spoke english but maybe i was wrong


Dwellig you know, where the bandersnatch is


Not everyone is a native English speaker on the Internet, what a surprise.


i thought i was, but this sentence is too advanced for me


I think it's a "dwindling" community.



