That beams you right back to the 1960s, when FEXPRs were common in Lisp. They died a slow and horrible death at the end of the '70s under the influence of Scheme and Maclisp.
Kernel makes the point that much of the vilification of fexprs was actually caused by dynamic scope. Combining them with lexical scope makes for a surprisingly reasonable language.
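For concreteness, here is a minimal sketch of what a lexically scoped fexpr looks like in Kernel's notation, assuming the $vau, eval, $if, and $sequence combiners described in the Kernel report; the $unless combiner below is my own illustrative name, not something from the thread:

    ; An operative receives its operands unevaluated, plus the caller's
    ; dynamic environment; free names in its body ($if, $sequence, eval)
    ; resolve in the environment where it was defined, not at the call site.
    ($define! $unless
      ($vau (test . body) env
        ($if (eval test env)                      ; evaluate test in the caller's environment
             #inert
             (eval (cons $sequence body) env))))  ; run the body only when test is false

    ; The operands arrive unevaluated, so (+ 1 2) is only evaluated
    ; because (>? 2 3) is false.
    ($unless (>? 2 3) (+ 1 2))

Because the body's free identifiers are resolved lexically, the caller can't rebind $if or eval out from under the combiner, which is exactly the fragility that plagued dynamically scoped fexprs.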
If you read Pitman's paper, you'll see that his critique is not so much about theoretical issues with dynamic binding as about practical ones. Pitman wrote and used a great deal of Lisp code that had real users (Macsyma, for example). Dynamic binding was a side issue; practical compilability was a much bigger one.
a) I have read Pitman; please stop insinuating that I haven't.
b) Pitman actually points out that dynamic binding makes compilation of fexprs harder. That was also one of the original selling points of Scheme: lexical binding simplifies compiler implementation.
c) Issues of practical compilability are extremely likely to be context-sensitive and hard to generalize from. That some Lisp 35 years ago had a hard time compiling some language involving fexprs doesn't help decide whether this language, here and now, can use them.
d) Finally, you're attacking a strawman of your own creation. Pitman didn't focus on dynamic binding, sure. Who said otherwise? That insight was one of Kernel's contributions: a world where Lisps are lexically scoped is actually quite a good fit for fexprs.
Pitman said that compilation of FEXPRs is difficult even with lexical binding, and he argued for a macro expansion phase. Etc. If you read the original Kernel paper, it doesn't discuss Pitman in more than two sentences. Since Pitman is still alive, Shutt could even have asked him. The Kernel paper had a separate addendum where vague ideas about compilation were discussed.
Pitman was working on large software systems where compilation was the norm and useful.
All but one of the references to lexical scope are annotations added decades later. The one early reference said that dynamic scope is harder than lexical scope.
In any case, the claim that fexprs make compilation challenging for large software systems is much more nuanced than what you were implying earlier: that Kernel is no different from "the 1960s".
Now that compilation is less challenging thanks to lexical scope, and with computers so much faster, it's worth considering whether fexprs can be compiled to be fast enough (say, to Ruby levels for a start). That would still be valuable. Right?
I think it's very superficial to claim that only 'two sentences' of Shutt's thesis were about 'Pitman'. He mentions the concerns in the abstract[1], for crying out loud. He addresses the issues throughout, even if he isn't constantly paying homage to some sort of Pitman deity.