Denotational semantics: the "meaning" of code is the mathematical object it denotes, independent of any particular machine.
This fits well with the functional paradigm, where values are immutable and therefore 'timeless'.
In contrast, imperative code, if it has any actual semantics at all, tends to be operational:
Operational semantics: the "meaning" of code is how it affects the behaviour of the machine that's interpreting it.
The problem with operational semantics is exactly that of imperative code: in order to understand what some code will do, you need to know everything which 'happened before' (ie. the current state of the machine).
>> The problem with operational semantics is exactly that of imperative code: in order to understand what some code will do, you need to know everything which 'happened before' (ie. the current state of the machine).
Wouldn't knowledge of everything within the current scope be sufficient?
In order to understand how it's actually going to play out on a real machine in real time, you have to be able to simulate the execution (and perhaps the compiler), either experimentally or in your head.
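To make the contrast concrete, here's a minimal Python sketch (my own hypothetical example, not from the thread): the same call can mean different things under operational semantics, because the answer depends on state the caller can't see.

```python
counter = 0

def next_id():
    """Imperative flavour: the result depends on everything that
    'happened before' (the current state of the machine)."""
    global counter
    counter += 1
    return counter

def successor(n):
    """Denotational flavour: the meaning is just the fixed mathematical
    function n -> n + 1, regardless of any prior history."""
    return n + 1

print(next_id(), next_id())        # 1 2 -- depends on call history
print(successor(5), successor(5))  # 6 6 -- same input, same output, always
```

Knowing the current scope isn't enough to predict `next_id()`; you also need the call history. `successor` needs neither.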
I have to confess I'm basically trolling here. I mean, I hope to get an answer, but my expectations are low. I feel like I'm asking a salesman why I shouldn't buy their product.
It isn't even that I don't think they have a good product. They certainly do. One need only actually read SICP to realize just how bloody amazing some of these ideas are. (Granted, they didn't use all of the modern names. But they certainly seem to have hit all of the modern ideas.)
It's somewhat true for imperative code, too: you still have to keep an eye on your call graph, et cetera.
But in Haskell, say, because of lazy evaluation, it's easier to lose track of how much CPU effort something is going to cost, and when you're going to have to pay for it.
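A rough Python sketch of the same effect, using generators as a stand-in for Haskell's thunks (an analogy, not Haskell semantics): building the lazy pipeline is nearly free, and the CPU cost lands later, when values are actually demanded.

```python
calls = 0

def expensive(n):
    """Stand-in for some costly per-element computation."""
    global calls
    calls += 1
    return n * n

# Building the lazy pipeline does no work yet...
lazy = (expensive(n) for n in range(1_000_000))
assert calls == 0  # nothing has been computed at definition time

# ...the effort is paid only when values are forced.
first_three = [next(lazy) for _ in range(3)]
assert calls == 3  # exactly as much work as was demanded, when it was demanded
```

The cost didn't go away; it just moved to wherever the values get forced, which is what makes it easy to lose track of.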
It is more than that. In many of the popular imperative languages, memory allocations and modifications are written out explicitly by the programmer, and often the processing work is too. When things get hidden behind an abstraction like "map", say, suddenly what you thought was a single function call is actually a crapload more.
This is becoming less true, of course. Java getting lambdas hides a TON of places where you can quietly allocate a ton of memory and/or perform a ton of operations.
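A small sketch of that hiding, in Python (a made-up illustration, not Java specifically): one innocuous-looking `map` call fans out into a function call, and an allocation, per element.

```python
work = 0

def transform(x):
    """Each 'single' application does real work and allocates a fresh list."""
    global work
    work += 1
    return [x] * 100  # one 100-element allocation per element

# Reads like one call; is actually ten thousand calls plus ten
# thousand allocations.
result = list(map(transform, range(10_000)))
assert work == 10_000
```

None of that cost is visible at the call site, which is the whole point of the abstraction, and also the trap.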
And note, I do think this downside can be oversold. So, don't take this as a condemnation of "functional" languages and methods. It definitely exists, though.
Granted, I am almost certainly simply infatuated with what is essentially an "anti immutable" algorithm. http://taeric.github.io/Sudoku.html
I should really learn what that means one day...