COBOL, and other PLs of that day, were heavily integrated with the ISAM file structure. ISAM was a precursor of SQL, and was basically a hash table implemented very efficiently in a file. If you have hash tables, you can do pretty much anything. (Actually, it was better than a hash table. Once you had indexed (I) into the file, you could sequentially read forward (S). The AM is Access Method.)
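A toy sketch of that "index in, then read forward" pattern, assuming a hypothetical layout of fixed-width records sorted by key (the field sizes and names here are made up for illustration, not real ISAM):

```python
import bisect

# Hypothetical record layout: 8-byte key + 24-byte payload, sorted by key.
RECORD_LEN = 32

def build_index(data: bytes):
    """One pass over the file, noting each record's key and byte offset."""
    keys, offsets = [], []
    for off in range(0, len(data), RECORD_LEN):
        keys.append(data[off:off + 8])
        offsets.append(off)
    return keys, offsets

def read_from(data, keys, offsets, start_key, count):
    """I: index into the file at start_key; S: read forward sequentially."""
    i = bisect.bisect_left(keys, start_key)   # the "indexed" part
    out = []
    for off in offsets[i:i + count]:          # the "sequential" part
        out.append(data[off:off + RECORD_LEN])
    return out
```

The key property is that after the one indexed lookup, the next N records come back in order with plain sequential reads, which on spinning disks of that era was dramatically cheaper than N more indexed lookups.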
I was thinking more about performance than space. If you hit the disk for every recursive call (not sure if that's the case, I can't read COBOL), the performance is probably going to be terrible even if the files are memory mapped. Optimized tail recursion would avoid most of that.
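To illustrate what tail-call optimization buys here: a tail-recursive accumulator can be mechanically rewritten as a loop, so no new frame (or, in a file-backed stack, no new record) is created per call. A quick sketch in Python, which itself does not do TCO, so the rewrite is by hand:

```python
def total_recursive(items, acc=0):
    # Tail-recursive: each call pushes a new frame (or, with a stack
    # kept in a file, writes a new record) before the next call runs.
    if not items:
        return acc
    return total_recursive(items[1:], acc + items[0])

def total_iterative(items):
    # What TCO effectively turns the above into: one frame, reused.
    acc = 0
    while items:
        acc, items = acc + items[0], items[1:]
    return acc
```

Both compute the same sum, but the iterative form touches constant space regardless of input length, which is exactly the saving that matters if each "frame" is a disk write.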
Nobody bothers about the 8080 these days. If it had been a disassembler for the Z80, it might be a different matter. But there are dozens of those around, and written in C to boot.
New parts of old systems, mostly. There are some huge COBOL application spaces out there, and quite often writing new modules for those with a more modern COBOL mind- and featureset is still easier than writing them with a different technology and then taking the trouble of connecting that to the existing part using e.g. CORBA.
The answer has been "no" for a long time. Once people got excited about Algol they started basing languages on the idea of inline math/expressions. FORTRAN survived by changing; COBOL was mostly abandoned.
I thought so as well. Nobody would call a Python-like language "a Python". I understand that the balkanisation of Lisp is responsible for the fact that anything s-expr based is called "a Lisp". But Common Lisp and Scheme are actual, specified languages.
True that, though a Common Lisp interpreter is not full Common Lisp, i.e. "Common Lisp plus its tons of standard libraries" -- just "an interpreter able to parse Common Lisp".
Does this mean that they are storing the stack as records in a file? Hopefully they will implement tail recursion as soon as possible :-)