For decades, parallel programming was done only by people in the high-performance computing world. It was (is) difficult and often painful. The languages, APIs, and paradigms were broken in various ways. But they got the job done, and the relatively few people who actually needed to do it were able to.
It's only now that parallel programming is moving into the mainstream. Hence, it's only now that it's worthwhile for computing as a field and an industry to pour resources into making parallel programming "easier." The benefit from doing it before was marginal.
You get read-only access to stack variables, read-write access to stack variables declared with __block, and you have to copy the block explicitly to the heap (Block_copy) if you want to call it after its defining scope exits, and then release it manually (Block_release).
Depending on how pure the language is, /variables/ can be immutable. Nonetheless, the standard, default, run-of-the-mill variables that most programmers will run into most of the time are not. Same goes for closures.
Except those can neither read nor modify variables in their enclosing scope. Anonymous inner instances are more akin to C's nested functions, which GCC has had for a long time.
Correct me if I'm wrong though (which would only make me very happy); been years since I did Java.
They can read and modify object fields in their outer class's instance. But it's true they can't modify stack variables. This makes contextual sense: C blocks are stack-allocated and have to be explicitly copied to the heap, while Java inner instances get copied immediately, because everything lives on the heap in Java.
"Some simple algorithms can be developed that allow remote cores (other nodes) or even heterogeneous cores (different types of nodes) to be used by the program."
Does this mean blocks and GCD would allow easier cluster and cloud computing too? Or maybe use all the CPUs in a household to complete tasks?