
While I mostly agree with your criticism of NKS, in Wolfram's defense, I have personally found Mathematica to be a joy to use. Under its unfortunate syntax, there's a great functional programming language. While you can do many of the same things in Matlab or Octave, I always prefer to use Mathematica.


There are many respects in which Mathematica is an exceedingly well designed language. It falls down in some areas. Perhaps the Part syntax [[..]] is a bit clunky, but I think for the most part the syntax is actually pretty great.

Here's a rough list of things that I think make it stand out, written for someone who might not be familiar at all with it.

0) Functional style

A common style of functional programming involves building up quite complex data structures in an ad hoc way using lists, symbols, rules, and so on, and then successively transforming these structures. Mathematica excels at this, having more 'neat' high-level functions to work with such ad hoc data structures than I've seen in any other language.

Of course you've got the usual Map and Fold and Nest, but then you've got FoldList, NestList, MapThread, MapIndexed, Scan, Cases, DeleteCases, Count, Select, Position, Replace, ReplacePart, ReplaceAll, SortBy, and on it goes (see http://reference.wolfram.com/mathematica/guide/FunctionalPro...). Many X also have a ParallelX variant that takes advantage of multiple cores.
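
To give a flavor, here's a quick sketch of a few of these in action:

    FoldList[Plus, 0, {1, 2, 3, 4}]           (* {0, 1, 3, 6, 10} *)
    Select[Range[10], PrimeQ]                 (* {2, 3, 5, 7} *)
    SortBy[{"ccc", "a", "bb"}, StringLength]  (* {"a", "bb", "ccc"} *)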

Also, many of these functions use the expressive pattern-matching 'sub-language' that Mathematica has. The pattern language was originally designed for writing rules that perform symbolic integration and differentiation; it turns out that the kind of flexibility you need for that is great for general-purpose use. You use patterns all the time to get at parts of your complex expression to process them, count them, remove them, etc.
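
For example, Cases pulls out the pieces of an expression that match a pattern, and ReplaceAll (/.) rewrites them in place:

    Cases[{1, "a", 2.5, 3, x}, _Integer]   (* {1, 3} *)
    {1, 2, 3, 4} /. n_?EvenQ :> n/2        (* {1, 1, 3, 2} *)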

Lastly, almost all of these functions support a 'depth spec' that lets you say at what level or levels inside your expression you want to operate -- also useful for operating on rich data structures.
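
For example, the optional third argument of Map is such a spec:

    Map[f, {{1, 2}, {3, 4}}]        (* {f[{1, 2}], f[{3, 4}]} *)
    Map[f, {{1, 2}, {3, 4}}, {2}]   (* {{f[1], f[2]}, {f[3], f[4]}} *)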

1) Uniform syntax

Like other Lisps, there is a nice and uniform s-expression syntax 'underneath' the more elaborate surface syntax. For example, to check the length of a list and conditionally increment a counter, we have If[Length[x] > 3, t++], whose FullForm is actually If[Greater[Length[x], 3], Increment[t]].

Of course, you can use more familiar syntax, but sometimes when you're not sure of the precedence of something it can help to drop down to the explicit form.
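
For example:

    FullForm[a + b c]   (* Plus[a, Times[b, c]] *)

which settles the precedence question at a glance.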

Another nice thing about the syntax is that it makes everything trivially serializable. ToString[..., InputForm] produces a string version of anything: a function, a graph, a distribution, whatever. ToExpression gets it back. And you can define your own forms, too, to join the many useful built-in ones like AccountingForm, ScientificForm, TeXForm, TableForm, etc.
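
A minimal round trip:

    s = ToString[Graphics[Circle[]], InputForm]   (* "Graphics[Circle[]]" *)
    ToExpression[s] === Graphics[Circle[]]        (* True *)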

2) Symbolic expressions

At a basic level everything is built out of lists, just as in Lisps. Well, lists with an interesting twist: the lists are 'colored', so to speak -- every list has a head, a symbol hiding behind the list, even if the list is empty. So Foo[] is the empty list with the symbolic head Foo. A list like {1,2,3,4} is just a list with the symbolic head List.

All symbols (heads) are inert, and are defined just by mentioning them. But you can attach rules to the symbols that say how they should evaluate -- that is in fact how one defines functions.
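
Defining a function is literally attaching a rewrite rule to a symbol, and you can inspect those rules afterwards. A classic sketch, memoized Fibonacci:

    fib[0] = 0; fib[1] = 1;
    fib[n_Integer] := fib[n] = fib[n - 1] + fib[n - 2]
    fib[10]            (* 55 *)
    DownValues[fib]    (* the rules now attached to fib, including cached values *)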

In fact, the ordinary control flow constructs are exactly this: special heads that evaluate their arguments in certain unusual ways. So If will prevent its other arguments from evaluating until it knows whether its first argument evaluates to True (another symbol).
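
You can get the same behavior for your own symbols with Hold attributes. A small sketch:

    Attributes[If]   (* {HoldRest, Protected} *)

    SetAttributes[trace, HoldAll]
    trace[e_] := (Print[HoldForm[e]]; e)
    trace[1 + 1]     (* prints the unevaluated 1 + 1, then returns 2 *)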

Now, this can be somewhat inefficient for tight loops, but luckily you can compile such things down (although you lose some of the fancier features, like pattern matching). Also, there is a lot of hidden optimization -- large lists of integers are transparently packed into arrays, as are matrices, and use special libraries for multiplication etc., even though you're using the same Times symbol at the top level.
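
A small sketch of Compile on a tight numeric loop:

    sumSq = Compile[{{n, _Integer}}, Module[{s = 0}, Do[s += i^2, {i, n}]; s]];
    sumSq[1000]   (* 333833500 *)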

Anyway, it's a unique model.

3) Homoiconicity

Code is just data that hasn't evaluated yet. For example, I could count all the global variables mentioned by the definition of a given function:

Count[DownValues[func], s_Symbol /; StringMatchQ[Context[s], "Global`*"], Infinity]

I regularly use Mathematica in this sort of way to analyze the code I'm writing, find bugs, and understand how something executes. It's quite neat.

You can also use the basic symbolic pattern matching ingredients to do quite weird and interesting things, like defining your own language constructs if you need something extra -- think Lisp macros on steroids.

In fact, it is quite possible to define object orientation by using the symbolic pattern matching functionality such that it looks and feels like it was built into the language. Typically, though, you end up thinking in a very different way and only occasionally find yourself needing OO-like code.
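
A toy sketch of that closure/object idiom, using downvalues attached to a Module-local symbol:

    makeCounter[] := Module[{n = 0, obj},
      obj["increment"] := ++n;
      obj["value"] := n;
      obj]

    c = makeCounter[];
    c["increment"]; c["increment"];
    c["value"]   (* 2 *)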

4) Batteries included

Mathematica has got a ton of stuff built-in.

* It has a really quite killer image-processing library, for example (it bundles OpenCV to handle a lot of the more sophisticated stuff).

* It has great support for graphs -- which are first-class objects, not matrices that are to be interpreted as graphs. By support I mean almost all of the textbook graph algorithms, for example.

* It's got hierarchical clustering, common statistical tests, linear and logistic regression, etc

* It's got almost all of the commonly used statistical distributions, a lot of computation on them, and a limited degree of inference.

* It's got all the pure math you can eat, and plenty of numeric math.

* It's got more visualization and plotting than you'll know what to do with.

* It's got nonlinear optimizers and constraint satisfiers.

* It's got quite an unusual debugging system that allows you to watch for certain patterns in your code as it executes (although no one ends up using it because it is a bit alien).

* It's got a pretty robust layout engine (originally for typesetting math).

* It has a more-than-adequate set of graphics primitives built in, both 2D and 3D. Of course you use the same s-expression syntax to build your scene graph (well, tree), which makes it extremely easy to use (Graphics[Circle[]] displays a circle in your notebook, for example). The renderer is of really high quality.

* It's got what amounts to a GUI toolkit, so you can very quickly build an interactive UI with sliders and buttons and interactive graphics, again thanks to the symbolic nature of the language. Anything too ambitious will cause the frontend to unpredictably crash, so don't try to write a word processor.
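
On that last point, a one-liner really does get you an interactive plot with a slider (a minimal sketch):

    Manipulate[Plot[Sin[a x], {x, 0, 2 Pi}], {a, 1, 5}]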

5) Notebook interface

You really do not know what you're missing until you've used it. Notebook interfaces are the future. I know IPython has an experimental one that my academic friends love.

Trust me, a REPL is a poor excuse for a notebook interface. The rest of the world will catch up, though; it's an idea that is too good to hide.

6) Help system

I suppose it's up there with Visual Studio-type products. But it does have the nice feature that the documentation is actually all executable. It's quite nice to look up some function, muck around with one of the examples live until it does what you want, and then copy-paste it into your notebook.

7) Good names

All the functions have rational, good names, following uniform conventions. You can often predict a function that should exist, type the first few letters, press Cmd-K to autocomplete, and be pleasantly surprised that it does exist.

You won't find 3 different versions of a function, you'll find one version with sane defaults and optional named parameters that can customize that behavior.

--------

The only real downside I can think of is that it doesn't have a first-class dictionary type, which can be annoying. Part of the reason is that it wouldn't be Lispy, and there also weren't performant pure data structures (until Clojure came along). There is a fairly easy way to fake dictionaries using downvalues on anonymous symbols, but it doesn't feel idiomatic. A little bird tells me that proper dicts are coming in version 9 (using Bagwell's tries for performance).
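
The downvalues-on-an-anonymous-symbol trick looks roughly like this:

    newDict[] := Module[{d}, d]   (* returns a fresh, unique symbol *)

    d = newDict[];
    d["foo"] = 1; d["bar"] = 2;
    d["foo"]        (* 1 *)
    DownValues[d]   (* the stored key-value rules *)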

Oh, and that it costs a chunk of change. I forget about that because I get it for free, because I work at Wolfram. Though in the R&D department at Wolfram|Alpha, not in the advertising department at Mathematica :).


NKS is BS, I fully agree. And Mathematica borrowed a lot from Macsyma, to put it mildly.

http://maxima.sourceforge.net/



