
The most recent paper mentioned here is from 2000. The most recent 'greatest of the great' is from 1978.

Is it simply that it takes a long time for us to decide what constitutes a significant advancement, or is something else going on?




The most recent paper mentioned here is from 2000.

Realize that the list was proposed in 2004.

Is it simply that it takes a long time for us to decide what constitutes a significant advancement, or is something else going on?

Seminal work is recognized by its influence, and influence becomes more apparent as impact accumulates over time.


"influence becomes more apparent as impact accumulates over time."

It's interesting to try to re-prioritize the papers given a decade of impact (or lack thereof) since the list was published. I have no useful results, but it was a good mental exercise.


I'm not sure anything interesting has happened in the PL world since 2004 besides PEGs. Please feel free to correct me if I'm wrong; I always prefer to be among the first to jump on a new bandwagon in language design trends.


Gradual and optional typing have become a pretty big idea over the last 6-7 years, as language designers have tried to retrofit type systems onto languages that weren't designed with them.

This idea has been showing up in industry (whether the designers call it "gradual typing" or not) with at least Hack, Flow, TypeScript, and Dart (which has optional type annotations).

You can find lots of references for useful underlying theory and research applications (the first paper to use the term "gradual typing" was Siek and Taha's in 2006) at https://github.com/samth/gradual-typing-bib.

Typed Racket (http://docs.racket-lang.org/ts-guide/) is probably the best-engineered exemplar of many of these ideas, especially 1. installing run-time checks at boundaries between typed and untyped code, and 2. tackling problems of typing previously untyped code that uses rich class and interface patterns.
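Mechanically, the boundary idea is simple to picture. Here's a minimal Haskell sketch (my own toy example using Data.Dynamic, not Typed Racket's actual contract machinery): values coming from the "untyped" side travel as Dynamic, and the typed side installs a run-time check the moment they cross over.

    import Data.Dynamic (Dynamic, toDyn, fromDynamic)

    -- Pretend this value arrives from an untyped module.
    untypedValue :: Dynamic
    untypedValue = toDyn (42 :: Int)

    -- The boundary: either the value really is an Int, or we report a
    -- contract violation and blame the untyped side.
    atBoundary :: Dynamic -> Either String Int
    atBoundary d = maybe (Left "boundary contract violated") Right (fromDynamic d)

    main :: IO ()
    main = print (atBoundary untypedValue)   -- Right 42

Typed Racket does far more than this (blame tracking, higher-order contracts), but the shape of the check at the boundary is the same.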


Thanks, that's fairly new.


A few:

* Fault Location via Dynamic Slicing

* Interprocedural Analysis and the Verification of Concurrent Programs

* Logics and Algorithms for Software Model Checking

* Verifying Low-Level Programs via Liquid Type Inference

Moreover:

* Agda

* LLVM


Thanks! Interesting. I'll try to dig out the recent papers, although I have a feeling that most of this is not that new. LLVM is essentially just basic SSA form, which dates from the early '90s. Agda builds on similar grounds to things like Coq, Minlog, even ACL2 and Isabelle, which are very old. And there's been a lot of buzz around concurrent-code verification ever since R. Milner's early pi-calculus papers.

But liquid type inference does seem to be genuinely new, and apparently I somehow missed it entirely. It even seems I unknowingly reinvented something similar by using arbitrary Prolog code as type attributes on top of basic Hindley-Milner.
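For anyone else who missed it, the flavor is roughly this. A minimal sketch in LiquidHaskell-style annotation syntax (a later tool I'm using purely for illustration; the paper above targets low-level code): ordinary types get refined with logical predicates, and an SMT solver discharges those predicates at compile time.

    {-@ divide :: Int -> {d:Int | d /= 0} -> Int @-}
    divide :: Int -> Int -> Int
    divide n d = n `div` d

    -- Accepted: the checker can prove the divisor is non-zero.
    ok :: Int
    ok = divide 10 2

    -- Rejected by the refinement checker (plain GHC would accept it):
    -- bad = divide 10 0

The inference part is the interesting bit: the refinement predicates are largely inferred from a set of qualifiers rather than written out by hand.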


Agda builds on similar grounds to things like Coq, Minlog, even ACL2 and Isabelle, which are very old.

… Coq, Minlog, even ACL2 and Isabelle build on similar grounds to Chrysippus' logic in the 3rd century BC …

i.e. there’s nothing new under the sun.


Dependent types, gradual typing, HoTT


HoTT?


The ones I've been paying attention to are data-parallel programming systems and type inference algorithms (simpler ones than MLF).


What are good examples of type inference algorithms simpler than MLF? You must still be looking for ones that infer sophisticated logics; otherwise that bucket will always be dominated by HM, no?


Boxy types, HMF, and bidirectional type checking (older, but more recently applied to parametric polymorphism) all leverage partial annotations to handle more flexible parametric polymorphism than HM itself allows.
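A tiny GHC Haskell illustration of why the partial annotations matter (a generic example, not specific to any one of those three systems): plain HM can never infer a rank-2 type for an argument, so you write that single annotation and the rest checks without further help.

    {-# LANGUAGE RankNTypes #-}

    -- The argument 'f' is itself polymorphic, which is outside HM's reach;
    -- the annotation is the partial information the checker propagates.
    applyBoth :: (forall a. a -> a) -> (Int, Bool)
    applyBoth f = (f 0, f True)

    main :: IO ()
    main = print (applyBoth id)   -- (0,True)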


PEGs date from 2002 :-)
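The defining trick in PEGs is prioritized choice. A throwaway sketch of it (my own toy combinators, not Ford's formalism): a parser either fails or returns a result plus the remaining input, and the second alternative of a choice is tried only if the first one fails, which is what makes PEGs unambiguous by construction.

    newtype Peg a = Peg { runPeg :: String -> Maybe (a, String) }

    char :: Char -> Peg Char
    char c = Peg $ \s -> case s of
      (x : xs) | x == c -> Just (c, xs)
      _                 -> Nothing

    -- e1 </> e2: try e1; only if it fails, try e2 on the same input.
    (</>) :: Peg a -> Peg a -> Peg a
    Peg p </> Peg q = Peg $ \s -> maybe (q s) Just (p s)

    -- runPeg (char 'a' </> char 'b') "bc"  ==  Just ('b', "c")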

Another brilliant recent discovery (2001, so still more than a decade old, I'm afraid) is the algebraic connection between differentiation and zippers. See Conor McBride's paper "The Derivative of a Regular Type is its Type of One-Hole Contexts".
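For the curious: the list functor is L(a) = 1/(1 - a), its derivative is 1/(1 - a)^2 = L(a)^2, so a one-hole context in a list is a pair of lists, which is exactly the familiar list zipper. A quick Haskell sketch (my own names, not McBride's):

    -- Derivative of the list type: the elements before the hole (reversed)
    -- and the elements after it.
    data ListCtx a = ListCtx [a] [a] deriving Show

    -- Plugging an element back into the hole rebuilds the original list.
    plug :: a -> ListCtx a -> [a]
    plug x (ListCtx before after) = reverse before ++ x : after

    -- Every way of focusing on one element, paired with its context.
    holes :: [a] -> [(a, ListCtx a)]
    holes = go []
      where
        go _      []       = []
        go before (x : xs) = (x, ListCtx before xs) : go (x : before) xs

    -- holes [1,2,3] ==
    --   [(1, ListCtx [] [2,3]), (2, ListCtx [1] [3]), (3, ListCtx [2,1] [])]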


Even in the parser corner, the parser combinator / graph-structured stack / GLL algorithm is both new and exciting. GLL was published in 2010: http://dotat.at/tmp/gll.pdf


That's very cool, thanks for posting it. I enjoyed Daniel Spiewak's paper on parser combinators for GLL too: http://www.cs.uwm.edu/~dspiewak/papers/generalized-parser-co... (Scala code here: https://github.com/djspiewak/gll-combinators)


The point that this was published in 2004 is certainly well taken.


A clearly practical and influential advance from the last 10 years is the application of tracing JITs to high-level languages (as in TraceMonkey and LuaJIT 2).



