Having done both Java and Lisp for years (professionally, in each case), I agree that in general Lisp requires more cognitive effort to read per line, but I disagree about the reason.
Cognitive load vs. verbosity can be visualized with something like a Laffer curve - sure, it's easier to read a couple lines of Lisp than one line of APL, but when you get into situations where one line of Lisp (or Python) expands into 30 lines of Java, the former is much easier. Terse languages make you pay a cognitive cost for unpacking meaning; verbose languages make you pay for keeping track of what all those lines of code are doing. Ever read two pages of a book and realized you didn't actually take anything in, so you had to re-read them (and the 3 pages prior) to figure out the point? That's Java in a nutshell, especially pre-1.8.
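To make the expansion factor concrete, a minimal sketch (the function name and numbers are invented for illustration): summing the even numbers in a list is one expression in Lisp, while the pre-1.8 Java equivalent needs a loop, an accumulator, and type ceremony just to say the same thing.

    ;; One expression: sum the even numbers in a list.
    (defun sum-evens (xs)
      (reduce #'+ (remove-if-not #'evenp xs) :initial-value 0))

    ;; (sum-evens '(1 2 3 4 5 6)) => 12
    ;;
    ;; The pre-1.8 Java version needs a method signature, an int
    ;; accumulator, a for loop, an if, and a return - not 30 lines
    ;; here, but the ratio only gets worse as the logic grows.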
As for Lisp and cognitive load, the difficulty comes from a) dynamic typing, and b) macros. The former makes bad naming choices much more painful, as you can easily lose track of what the code is doing, and to what data. The latter are fine when written well; when written badly, your problem suddenly explodes in complexity, because you have to peel off a layer of syntactic abstraction and fix the underlying mechanism.
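The classic version of the badly-written-macro problem is unintended variable capture - a hedged sketch, with invented macro names, of what peeling off that layer looks like:

    ;; Badly written: binds RESULT without a gensym, so it silently
    ;; shadows any RESULT the caller already has in scope.
    (defmacro with-doubled (expr &body body)
      `(let ((result (* 2 ,expr)))
         ,@body
         result))

    (let ((result 100))
      (with-doubled 5
        (print result)))  ; prints 10, not 100 - the caller's binding
                          ; was captured by the expansion

    ;; The fix lives a layer down, in the expansion itself:
    (defmacro with-doubled* (expr &body body)
      (let ((r (gensym "RESULT")))
        `(let ((,r (* 2 ,expr)))
           ,@body
           ,r)))

Until you macroexpand the call, the bug is invisible at the call site, which is exactly why a bad macro costs so much more to debug than a bad function.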