These posts are interesting, but I can't shake the feeling that something is fundamentally wrong. There have been decades of research into parallelism. Why are we groping in the dark like this at such an elementary level? I'm not picking on Tim Bray (in fact I applaud his experiments). But it feels like we're once again committing the perennial mistake of our industry: ignoring previous work. What we ought to do, instead of naively muddling around the multicore problem, is go back and understand what (for example) the Fortran 90 and SISAL guys did. Huge amounts of work went into these very questions before the advent of commodity hardware caused the money to run out.
I agree with you. I think the motivation is to take current, widely used languages and learn how to write parallel programs in them — in particular, how to do that on the Java infrastructure.
But despite all the previous work, the problem is not regarded as really solved, as indicated by Herb Sutter's ongoing columns (focusing on C++, shudder), Fortress, also coming out of Sun, and the new Go language backed by some Googlers. There seems to be a desire to find the right constructs in familiar languages to make the multicore problem go away. Knuth thinks this is not a good idea, or at least not tractable.
My own approach to multicore use is to divide the problem so that individual single-threaded processes can each do a chunk of it, then combine the results. Not every problem works well this way, however.
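A minimal sketch of that divide-and-combine approach in Java (the infrastructure mentioned above); the class name `ChunkSum`, the summing task, and the chunk count are all illustrative choices, not anything from the thread:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ChunkSum {
    // Split the input into chunks, sum each chunk on its own thread,
    // then combine the partial results on the calling thread.
    static long parallelSum(long[] data, int nChunks) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(nChunks);
        List<Future<Long>> parts = new ArrayList<>();
        int chunk = (data.length + nChunks - 1) / nChunks;
        for (int i = 0; i < data.length; i += chunk) {
            final int lo = i;
            final int hi = Math.min(i + chunk, data.length);
            // Each task touches only its own [lo, hi) slice, so no
            // shared mutable state and no locking is needed.
            parts.add(pool.submit(() -> {
                long s = 0;
                for (int j = lo; j < hi; j++) s += data[j];
                return s;
            }));
        }
        long total = 0;
        for (Future<Long> f : parts) total += f.get();  // combine step
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        long[] data = new long[1000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1;
        System.out.println(parallelSum(data, 4)); // prints 500500
    }
}
```

The point is the shape, not the arithmetic: the combine step (summing here) must be associative for the split to be valid, which is exactly the property that makes some problems fit this pattern and others not.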
Who's "we"? Bray has been muddling around naively because he's a regular guy with a popular blog. Academics are well aware of the literature. I was a lousy academic and even I can tell you the history as far back as the '70s. Frankly, I don't think it's that hard to write most concurrent code. Pick a concurrency pattern and stick to it. It's only in high-performance code that things get tricky.
Well, he is not exactly a regular guy. He arguably wrote the first successful web crawler, managed the project that brought the OED to CD-ROM, was part of two startups, and co-authored the XML specification. This Clojure series relates to his Wide Finder project at http://wikis.sun.com/display/WideFinder/Wide+Finder+Home, which he describes as follows: "Wide Finder is an attempt to answer this question: What is a good way to write computer programs to take advantage of modern slow-clock-rate/many-core computers, without imposing excessive complexity on programmers?"
I have written lots of concurrent code, and I think it is hard, as do Herb Sutter and Donald Knuth.