Modern Language Wishlist (lispcast.com)
98 points by pchristensen on Feb 2, 2012 | 72 comments



Most languages have most of the listed features. The most interesting uncommon idea on the list, to me, is the model of time, but only if it allows reversible computing; I don't know how practical that is, though. Anyway, I've separated out the parts of the list that I think are less common, and listed examples off the top of my head, so the list below is by no means exhaustive.

The uncommon ones:

* Units - F# and Frink

* homoiconicity - Lisps, Prolog, Io, Factor, Pure?, ...

* macros and extensible syntax - the above list, Nemerle, Dylan

* Unification - Prolog, Mercury, and the other logic languages; anyone with type inference; anyone with predicate dispatch, though I don't know a full example of that. F# active patterns and Scala extractors are close.

* Error (managing numeric imprecision) - As best I understand it, I can only note that I have seen people treat similar concepts with monads. So Haskell and any language that makes monads easy: Scala, F#, Haskell, Clojure, Nemerle.

* Math types - Axiom/Aldor. To an extent, dependently typed languages - not practical yet?

* Polymorphism - Although it sounds more like structural typing. So OCaml and Scala; also F# and Haskell, partially.

* Aspects - Metaprogramming makes this relatively easy and clean to implement, so anyone with macros, too. Arguably, monads are another way to follow the same philosophy.

* term rewriting - not common, very niche, and not mentioned, but pure-lang allows this, and for abstract math programming it is a really fun and powerful concept.

* pluggable types - F# has these as type providers; Gosu has open types. I have used them in F#. For a start, they are a very awesome way to consume an API.

Less Uncommon:

* pattern matching - most functional languages; to a small extent, the Java.nexts

* Immutable values - most functional languages

* Parser - parser combinators or any language which implements PEGs e.g. Nemerle

* Design by contract - eiffel and as a library in most languages

* laziness - the usual functional suspects.


I think that the full numerical tower of Scheme is what the author had in mind for math types: bignums, true rationals, a strong sense of exactness, etc.

Also, CLOS (Common Lisp) has after, before and around methods for "aspect oriented".
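A quick Python sketch of two rungs of that numerical tower (bignums via Python's arbitrary-precision int, true rationals via the stdlib fractions module; the exact/inexact distinction itself has no direct Python equivalent):

```python
from fractions import Fraction

# Bignums: Python ints grow as needed, so there is no overflow.
print(2 ** 100)            # a 31-digit integer, computed exactly

# True rationals: division stays exact instead of rounding.
print(Fraction(1, 3) * 3)  # 1
```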


> homoiconicity

Python is not homoiconic but the ast module allows for wicked stuff.

> Error (managing numeric Imprecision)

The Python decimal module does not handle precision error but does handle significance, so that (from the docs) "for instance, 1.3 * 1.2 gives 1.56 while 1.30 * 1.20 gives 1.5600". This underrated module also does much more.

Anyway I feel this precision stuff might be a job for numpy/scipy.
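For reference, the significance behaviour quoted from the docs is easy to see directly:

```python
from decimal import Decimal

# Decimal carries the significance of its operands through
# multiplication: trailing zeros in the inputs survive in the result.
print(Decimal('1.3') * Decimal('1.2'))    # 1.56
print(Decimal('1.30') * Decimal('1.20'))  # 1.5600
```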


> Most languages have most of the listed features.

Most of the features are not hard to find, but you have to switch languages a lot if you want to use them all.


This is basically a description of Clojure, in particular:

- Explicit model of time (http://www.infoq.com/presentations/Value-Identity-State-Rich...)

- Homoiconic / extensible syntax (shared by most Lisps)

- Math-oriented numeric types (Clojure maths is arbitrary precision by default, and it has nice things like rational types)

- Immutable (Clojure data structures are all immutable)

- Garbage collection (inherited from the JVM)

- String manipulation (regular expressions are a built-in type)

Most of the other features actually seem more like libraries than language features, but given that Clojure can access the entire Java ecosystem I think you can do all of it from Clojure with relatively little pain.


Clojure is awesome, but does struggle on the "Good error messages" requirement, which is a problem when you're starting out.


I find it interesting that two of the most exciting languages that exist right now are built on top of the VM for one of the worst ones.


The strength of Java for many years has been the platform, and indeed the surrounding ecosystem, not the language. In that sense it's no surprise that there is a lot of effort going into providing better languages on that platform.


The reason I'm not using Clojure is that it's based on the Java platform. I keep hearing that JVM is nice because there are good libraries, but I haven't figured out what the good libraries are. Sure, there are some nice platform independent abstractions for common operating system interfaces like file and network I/O, but it still lacks lots of stuff.

The problem with Java libraries w.r.t. modern high level languages is that Java libraries are built on Java abstractions. It doesn't matter how high level the new language is, but when interfacing with Java libs, you have to stick to single dispatch object oriented programming. So in the end, there are many cases where you use the Java library through some wrapper layer written in Clojure or your Clojure code ends up looking a lot like Java.

Once you add a wrapper-layer shim between your preferred language and the platform, it doesn't matter what's underneath. That's why I like to stick to languages that are built on native code and libraries: C, POSIX, and Unix APIs, with some Linux and/or BSD additions. Of course, if you want to run on Windows, all the code has to be duplicated, since the Windows APIs are different.


This just seems like an everything and the kitchen sink list. I am not convinced supporting every possible use case leads to an approachable/efficient language. It seems akin to arguing my car should also be a boat and an airplane.


It's a heck of a mixed bag, but most of them are orthogonal and would coexist nicely, especially if libraries are allowed to seriously extend the syntax and type system. Seems hard but doable to me.


I think it's OK because he said

>(plus libraries)

So some of this would belong as a language feature and other stuff would be in the standard library (is what I'm imagining at least).

I'm a Cocoa guy, and when a lot of people rave about Objective-C, they usually are referring to niceties Cocoa provides. This is sort of what I was imagining as I read along.


How far is Python from this list? Macros are missing, but the rest of the list seems close to me.


Math sucks a lot in Python 2.7:

  3 / 2 == 1
Python 3 fixes that, though it still uses floating point numbers instead of rational numbers.


> it uses floating point numbers instead of rational numbers.

import decimal

import fractions


There's "from __future__ import division" to fix that in Python 2.
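To make the difference concrete (the __future__ import is a harmless no-op on Python 3):

```python
from __future__ import division  # Python 2: make / mean true division

print(3 / 2)   # 1.5 -- true division, no silent truncation
print(3 // 2)  # 1   -- floor division is still available, but explicit
```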


I'll add one that people don't seem to think about much: the ability to encapsulate an ongoing computation and grab new values from it at leisure.

Examples: Haskell has lazy lists; Python has generators; Unix shells have pipes; in Go you can whip this up pretty easily with a goroutine and a channel; etc. Does his ideal language allow for this?
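A minimal sketch of the generator case in Python -- an encapsulated ongoing computation you can pull values from at leisure:

```python
def fib():
    # The state of the ongoing computation lives inside the
    # generator; the caller grabs new values whenever it likes.
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

gen = fib()
print([next(gen) for _ in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```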


I should add controlled laziness to the list.


Is it just me, or did anyone else feel he was talking about Racket? I have never understood why Racket doesn't get the love that it deserves.


It's close, but:

  * Racket doesn't meaningfully have a good syntax for literal maps or sets
  * Racket doesn't have math-oriented types, AFAIK, even in Typed Racket
  * Racket doesn't have units.
  * Racket doesn't have aspects (you can obviously add them--see Swindle--
    but Swindle is very rare these days, and not the preferred way to write
    Racket)
  * Racket is not interface-based (ditto)
  * Racket *supports* immutable values, but mutable (via define) seems more common.
    The MLs or Clojure seem a lot closer here.
  * Racket is arguably not polymorphic.  I'm aware you can do weird stuff by
    playing with a bunch of hidden parameters on structures, but the description
    sounds a lot closer to OCaml functors to me.  (And look no further than
    how Racket has for/list, for/hash, for/gvector, etc., for a harsh example of the
    limits of that polymorphism.)
...actually, that doesn't seem that close.


I don't think that he was talking about Racket, but some of your points are mistaken.

You might not like the literal syntax for maps in Racket, but it certainly exists.

Typed Racket has lots of math-oriented types; we just wrote a paper about their design here: http://www.ccs.neu.edu/racket/pubs/padl12-stff.pdf

Comprehensions such as for/list and for/hash are polymorphic, in that they operate on arbitrary sequences, of whatever type. for/list constructs lists; for/hash constructs hashes. Clojure, a language that takes uniformity of interface much further than Racket, has similar operations.


You're completely right on the math. I had math-oriented types wrong; I assumed that the author wanted the ability to say, "This type is restricted to values defined by this set," which he didn't. Even if he did, I see that Typed Racket actually does support such types, which is awesome. So I'm completely wrong.

Whether e.g. #hash((key . value) (key . value)) counts as a literal hash syntax is interesting. If you want to argue it does, then I'll argue that C does, too, since I can trivially #define my way there through C99 struct assignments and a function that constructs a hash off an array of those structs, or that C# does because I can use an initializer (e.g., "new Dictionary<string, string> {{"Foo", "Bar"}, {"Baz", "Quux"}}"). Literal hashtables and vectors, to me, means something that's visually apart from base syntax forms, specifically so that it stands out to the coder. By this standard, Python, Ruby, Smalltalk, and Clojure would qualify, while Racket, Io, and C# would not. Whether that matters to you depends on what you want.


I don't think that's right about hash tables. In particular, the distinction I would make is that literal hash tables are part of the syntax of the language in Racket, like in Clojure, Python, etc, and not in C# or C. What this means in Racket is that (read (open-input-string "#hash()")) produces a hash table -- that's parsing, not running the hash table constructor.
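A rough Python analogue of that parsing-versus-constructing distinction: ast.literal_eval accepts only literal syntax, so the dict below comes from the reader, not from running constructor code.

```python
import ast

# Literal dict syntax is parsed into a value directly;
# no user-defined constructor code runs to build it.
d = ast.literal_eval("{'key': 'value'}")
print(d['key'])  # value
```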


Racket doesn't have math-oriented types, AFAIK, even in Typed Racket

As far as I know, it uses "machine-oriented" data representation when it can (sufficiently small integers, inexact numbers) and promotes to arbitrary-precision representation when it has to (larger integers, non-integer rationals). I don't know how much more OP expects out of "math-oriented" numbers than what's explicitly listed (no overflow, rational division).

There exists a matrix library, but it might be lacking some desired operations. I'm not sure what OP means by having "equation" as a type.


Is it just me, or did anyone else feel he was talking about Go? I have never understood why Go doesn't get the love that it deserves.

(No seriously, he described Go)


> Is it just me or did anyone else feel he was talking about Go?

That's really just you:

* Go does not ship with a set collection, no literal or convenient syntax

* Go only ships with doubly linked lists, and no literal or convenient syntax

* Go's literal syntax for the Array and Map builtins is significantly less convenient than that of most other languages (including but not limited to statically typed ones)

* Go is not homoiconic

* Go does not have an extensible syntax

* Go does not have math-oriented numeric types (quite the opposite), neither does it have precision errors (I am not even sure it can meaningfully interact with IEEE-754 error flags)

* Go does not have units (as far as I can tell)

* Go does not have pattern-matching, let alone unification (could have made error reporting good, can't have that)

* Go does not have aspects

* Go does not (as far as I can tell) have any special support for writing parsers

* Go has very little support for immutability

* Go does not have an explicit model of time

* I don't think I've seen any built-in structure serializer and deserializer (equivalent to Lisp readers and writers) in Go

> (No seriously, he described Go)

Only if you're completely delusional, skipped about 60% of his list and gave Go huge leeway on the rest.

Clojure, for instance, is a far better match on this.


Have a look at the "gob" package for serializer and deserializer support for Go types.

As for the syntax-related points, Go offers quite a bit. There's the goyacc tool, the scanner package, the template package (think quasiquote), and a bunch of packages for processing Go code: go/ast, go/scanner, go/parser, go/printer, go/build, go/doc, etc.

For math-oriented numbers, there's the "big" package. Many languages make such numerics much more convenient but you can get pretty far with little effort using just the "big" package.


This doesn't sound like Go to me, but I've never used it:

Homoiconic

Code can be manipulated as data.

Extensible syntax

I find I don't use macros much anymore. I do more with data-oriented programming. But it is nice to have when you need it.


A lot of this list sounds very much like PostScript:

* Code is stored in an array with the execute flag turned on, so functions can be easily edited (or assembled) as data. First-class functions all the way.

* The environment is a dictionary (associative array), which can be swapped with another dictionary at any time (not just at function entry points).

* Garbage collection is standard

I was thinking that it may be an interesting project to write a general-purpose (non-graphics-oriented) PostScript interpreter, with a decent library, full continuation-passing support, a different way of handling the "current dictionary" stack (to allow for static in addition to dynamic variable support), and a swappable parser to allow for additional program-definition styles (infix or prefix in addition to the default postfix notation). Just something that's been kicking around in the back of my head for a while.


Maybe Factor http://factorcode.org/ is what you want?

Also, the rather new Red language (Rebol-like) http://www.red-lang.org/ seems interesting.


I also had the same weird feeling (see my post elsewhere in the thread).

I think Go's just still too immature for most people to invest deeply in. Production code can have a lifespan measured in years or even decades, so you want to use a language that's stable and mature, and you want to know that it's going to be around at least as long as your application. Things will change once the 1.0 Stable release rolls around, and people will really start dipping their feet in (and those people who have been dipping their feet might take a dive).


Yeah. And of course Go doesn't satisfy everything on that list either. But it does satisfy most of it with just the standard library.

Go 1 should solve most of the big problems; currently, libraries need to keep up with weekly releases, and packages are still being moved around.


He also took a good shot at describing Algol-68.


I am a big fan of Go, but he was certainly not describing this language. A lot of the items on the wishlist were math and science items (arbitrary-precision ints, floats, and rational numbers; vectors; matrices; units). These are not in the language, and, while they can be in libraries (see big.Int and big.Rat), without operator overloading I don't think Go is a good fit for these types of applications.


Absolutely, that was the first impression I also got when reading the post.

Actually, it's Go's standard packages that match most of these expectations.


This list lacks a feature that Go has and without which I wouldn't even have considered it: native and complete UTF-8 support.


I think a lot of languages could be described by most of the items on the wishlist. I thought Clojure was an extremely close fit, but then I program a lot in Clojure :)

I don't think Go matches all his criteria, though. For instance, Go is not homoiconic.


Clojure is close. From what I know of it, Go is close, too.


The most obvious thing I can see that he wasn't explicit about was Unicode support. Should be native, just work and included from day one.


Good one.


If you're going to throw everything and the kitchen sink in a language, you may as well include Erlang-type multiprocessing and message passing.


+1 for easy serialization. Every language should have an equivalent of Python's pickle module.

Sometimes you need a specific file format, compatibility between languages, customization, etc. -- then pickle is not enough. But for my uses, pickle was good enough most of the time, and it's stupidly easy to use. No need to change your code in any way. That makes one-off caching of intermediate results to the filesystem manageable, and implementing save/load game is 3 lines of code (counting the import). I love it, and I miss it in every other language I use.
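For anyone who hasn't used it, the save/load case really is about three lines (the file name here is arbitrary):

```python
import pickle

# Any picklable object graph round-trips with no code changes.
state = {'level': 3, 'score': 1200, 'inventory': ['sword', 'potion']}

with open('save.pkl', 'wb') as f:
    pickle.dump(state, f)
with open('save.pkl', 'rb') as f:
    restored = pickle.load(f)

print(restored == state)  # True
```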

Javascript has JSON, but writing general code to serialize arbitrary object graph with cycles, functions as field values, properly storing prototype chains, etc is still hard.


http://colinm.org/language_checklist.html

This has never been more relevant.


This list is more about the default library a language should come with than about language design.


I am surprised it's not very Lisp-biased.

Anyway, no love for type inference or generics?


Except for homoiconicity.


Tcl, Forth, Io, and (traditional) Smalltalk are also all homoiconic. You don't have to have a Lisp to get there.


Wrong: the first item should be environments as first-class types. Much of the rest follows. And you can stick it to smarmy Common Lisp geeks.


This post does a really good job of analyzing first-class environments and explaining why "first-class environments are useless at best, and dangerous at worst": http://funcall.blogspot.com/2009/09/first-class-environments...


I found his argument a bit of a wash. It boiled down to the usual critique of macros, operator overloading, and inheritance: "You don't know what's going to happen."

Meh, most of the time it's not a problem; when it is, it's for a good reason; and if it's for a bad reason, you should probably choose a different software package that is written better.


Here's one he missed: a good package manager.

Yes, I realize that's not technically an aspect of a language, in the purest sense. Yes, I realize that this is something that the community can provide. However, I will argue that any new language that neglects to ship a package manager alongside the language implementation is doing a disservice to its users.

Go (goinstall) and Rust (cargo) had the right idea.


One thing that always bugs me is that modern languages that ship with fairly strong datetime libraries still make it a pain to deal with just dates. My DB can handle a plain date; why does my language need to treat date as some hack on datetime?

I want 12/01/1980 not 12/01/1980 00:00:00 -5
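For what it's worth, Python's stdlib does ship a standalone date type rather than a hack on datetime:

```python
from datetime import date

# A plain date: no time-of-day, no UTC offset attached.
release = date(1980, 12, 1)
print(release.isoformat())  # 1980-12-01
```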


The first need of a good language: a simple specification. The full reference manual should be very concise, complete, and readable. Even with a language as simple as JavaScript, the specification is a nightmare.


Yet, no mention of concurrency/parallelism primitives. sigh


Good idea. But they do kind of fall under a "model of time."


In fact, the article does mention concurrency. I must have read past it, thinking right away that they meant UTC/time zone issues again. D'oh!


hmm, what exactly does he mean by data-oriented programming?

iirc that's something where you don't store data chopped up into objects, but have arrays that keep all the data of one "aspect" of all "objects"... or something like that. something game programmers would use, helps avoid cache misses.

is that what he was talking about? if so, what does it have to do with macros?


I meant data-driven programming.

http://www.faqs.org/docs/artu/ch09s01.html

More data (and data structures) and less code. It's very common in Lisps and other homoiconic languages.
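A tiny Python sketch of the idea -- the handler names are made up for illustration, but the point is that behaviour lives in a data table rather than in branching code:

```python
def deposit(balance, amount):
    return balance + amount

def withdraw(balance, amount):
    return balance - amount

# The "program" is a data structure: adding an operation means
# adding a table entry, not editing control flow.
HANDLERS = {'deposit': deposit, 'withdraw': withdraw}

def run(balance, transactions):
    for op, amount in transactions:
        balance = HANDLERS[op](balance, amount)
    return balance

print(run(100, [('deposit', 50), ('withdraw', 30)]))  # 120
```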


thanks :)


REBOL is the only language I can think of that comes close to this list.

http://rebol.com/


All I want: Smalltalk with tail recursion and a process model like erlang (implies immutable values).


The more I read this article, the more I feel he is not talking about a _language_. He is talking about a _library_.


Yeah, he mentions language features like interface-based polymorphism, garbage collection, namespaces, first-class functions, closures, etc. But then he goes on to talk about I/O methods, string manipulation, CSV output, etc., by which I think he means standard library features. The post really should have been split into language features + standard library features to make it clearer.

I may be influenced by the fact that I'm into Go at the moment, but I think it covers most of the "features" he talks about, as well as many of the "libraries" (some may not be part of the standard library, but will be available through 3rd-party packages). If the author is reading this, I encourage him to have a good honest look at http://golang.org/

I don't know if i like his discussion of "math-oriented types" vs. "machine-oriented types". At least as far as I see it (and according to wikipedia[1]), in maths whole integers (Z) are a subset of rational numbers (Q). When in your high-school maths class you say 3 + 2.5 = 5.5, you're really doing an implicit conversion to 3.0 + 2.5 = 5.5 or 3/1 + 5/2 = 11/2. Most languages allow you to also get rid of the bit-size of numbers by just defining "int" or "float" numbers that default to some predetermined bit-width (e.g. 32-bit). Most languages also have libraries or mechanisms (see Go and Python) for large precision numerical calculation so you can have your theoretically infinite size "no-overflow" situation; but that comes at the cost of speed, so we only use these types when it is specifically needed. It's not impossible to use them in all cases, but it is impractical.

In the end, a language can't do everything. Most languages focus on providing a good core, along with mechanism for users to extend functionality in any way they see fit. In this case I think the author wants the language to just do everything for him out of the box, without any pesky libraries, and without being bloated or slow. _That_ most certainly _is_ impossible. There is a good reason why no language implements _all_ of these "features".

[1] http://en.wikipedia.org/wiki/Set_(mathematics)

Ed: Z is a subset of Q.


Actually, from a math perspective, it's kind of nonsensical to talk about type conversion at all. That is, 3, 3.0, and 3/1 are purely notational differences, and all three represent exactly the same object. Since math is theoretically infinite precision, the way you write a number has no impact on the way operations like division act on it, making whether it belongs to Z, or just Q or R a moot point. Now, if you restrict your problem domain to Z, then that affects the operations you're able to do to a number and remain in that domain. That's more like what happens with types in programming; we restrict our domain by default because leaving the integers changes the computer's representation of the number in a way that affects its behavior.


Kind of. But you might say that rational numbers are ordered pairs of integers and positive integers, real numbers are Cauchy sequences of rationals, and complex numbers are ordered pairs of real numbers, and the real number 3 can't be added to (2+i) without converting it to a complex number first.


Those are all valid ways of conceptualizing those sets, but I don't think it changes the point I was making. The real number 3 doesn't need to be "converted" to a complex number to be added to 2+i. 3 is always both a real number and a complex number, which may be represented as either 3 or 3+0*i, and either way gives 5+i when added to 2+i. All the latter notation really does is clarify what domain you're currently working in, and even so, I've never seen anyone write it out explicitly.

Type conversion is more like if you had the written number 3 and a picture of the point 5 on a number line and someone told you to add them. Naturally, you would write 5 as a number first, because you don't have a useful way to add a number and a picture. But this doesn't change the results of adding the quantities 3 and 5; it's purely an artifact of the way the information was presented to you.


A nice thing about John Conway's construction of the surreal numbers is that the integers are really a subset of the rationals, which are really a subset of the reals, which are really a subset of the surreals. I find this much more elegant than merely finding an embedding of the smaller structure inside the larger one.


The math-oriented types thing is known to schemers as the "numerical tower" and is seen as non-negotiable.


Wow, I didn't know any language had something like that. Can't say I'm surprised that it's a feature of Scheme/Lisp.

If OP had said "numerical tower" in the first place, I'd probably have looked it up and understood what he was talking about. Thanks for the info anyway. Learn something new every day on HN.


Or about languages designed in such a way that putting most of this stuff in the standard library feels natural, rather than bolted-on.


Clojure and Scala are both there, except on idiomatic libraries (you might have to use Java libraries).

Clojure wins on simplicity of syntax (Scala has a rule that operators ending with ':' associate to the right, which makes an incredible amount of sense once you understand the language but is annoying and arbitrary to beginners) and homoiconicity. Scala wins on pattern matching and robustness (static typing). I'd use Scala for a game, because it's fast (in terms of both human and CPU performance). Both are great languages.


Because Clojure is a Lisp, we have macros, and with macros comes a lot of power. Clojure has a full-featured pattern matching library. See this: https://github.com/clojure/core.match



