In computing, "batch" connotes long-running processes, on top of "script"'s connotation of collapsing multiple commands into a single one. The seemingly redundant parsing by the Batch interpreter is a feature, not a bug.
1. The parser allows modifying a .bat file during its execution and having those changes execute without restarting the Batch interpreter. [1] This is in keeping with the rationale for batch processing -- facilitating serial execution of computationally expensive operations.
2. The Batch interpreter allows self-modifying code. [2] In the early 1980's, when Batch was designed, sophisticated COBOL programmers might have felt right at home. Lispers were probably more hit and miss.
This is a case where historical context is useful. Today it might be worth mentioning PowerShell in a discussion of the Windows command line. Batch was the DOS command line and exists in Windows for evolutionary reasons.
In the days when abundant RAM and fast CPU speeds were prefixed with "mega" and distributed computing often happened at baud rates, not restarting a process was a big deal. More importantly, then as today, the execution speed of the Batch interpreter was not a critical section of a batch process.
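A minimal sketch of that re-parsing behavior, in Python and purely illustrative: real COMMAND.COM tracks a byte offset rather than a line count, and `run_script` plus the toy `append` command are made-up names. The point is that the interpreter re-reads the file before each step, so a command that appends to the script sees its new lines executed.

```python
def run_script(path):
    """Execute a script one line at a time, re-reading the file
    before every step -- loosely the way COMMAND.COM picks up edits
    made to a .bat file while it is still running."""
    executed = []
    lineno = 0
    while True:
        with open(path) as f:
            lines = f.readlines()          # re-parse from disk each step
        if lineno >= len(lines):
            break
        line = lines[lineno].strip()
        lineno += 1
        executed.append(line)
        if line.startswith("append "):     # toy command that edits the script itself
            with open(path, "a") as f:
                f.write(line[len("append "):] + "\n")
    return executed
```

A two-line script of `append echo hi` then `echo done` ends up executing three commands, because the appended `echo hi` is seen on the next re-read.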
Bash first came out in 1989, but the typical shell of the time was sh, and it ran on Unix systems with more memory, storage, and CPU power than the typical PC. Thus it's no surprise that the sh family started with more features, while COMMAND.COM, from which cmd evolved, was extremely minimalistic. DOS 1.0's COMMAND.COM was just slightly more than 3 kilobytes and had neither conditional nor goto statements.
Even if the *nix machine was no bigger than a PC (e.g. Xenix on a '286; BSD on a PDP-11), it could still swap processes out, versus everything sharing a single real memory space.
But as you said, this allowed (at least the illusion, on some machines) more total memory to work with.
In the 1980's an 80286 would have been toward the high end of x86 systems. The 80386 started shipping in bulk in 1986, the Compaq DeskPro 386 and IBM PS/2 Model 80 were well north of $5000 in 1987, and the "prosumer" PS/2 when the line was released was the Model 60 with an 80286.
Even in the late 1980's, 8088-based systems were pretty typical, which is why IBM included the PS/2 Model 25 and Model 30 in its initial product line.
I remember using (sharing!) an XT (8088? 8086???) at work in 1985. One good thing about the 386 in 1986 was that it made the price of 286 (AT/clone) systems come down. I almost never saw an XT after 1986. We started seeing quite a few more clones (Compaq, etc) about that time, as well.
Which is of course a big tangent off of "why/whence command.com & .BAT files" :-)
OS/2 and Windows were a big deal in virtualizing memory use in PC land, with widespread Linux use still "a few years" in the future (and effective adoption of NeXTSTEP even further out).
I think the 80386SX was also a factor in lower 80286 prices after the 386DX came out. Those systems were really popular.
When I bought the Amiga 500 in 1988, 8088 Turbo machines were still the entry-level clone system. My vague recollection of the consumer and small-business market is that that state of affairs persisted for a couple more years.
bash came out in 1989. The Bourne shell it's backwards compatible with is from 1977. But both were designed to run on Unix on multi-user multi-process machines. DOS batch language was designed to run on a single-user single-process machine. There's a reason they used to make a distinction between "minicomputers" and "microcomputers".
> DOS batch language was designed to run on a single-user single-process machine.
This. My preferred explanation for the difference between Unix and Windows for a long time has been "Unix assumes a lot of people are doing different things on one computer, all at the same time. Windows was designed with the mental model of a calculator."
I apologize for writing in a manner that would mislead someone into believing that I claimed self-modifying code was a need for anyone, let alone systems programmers. It's all Turing complete, even if that completeness comes complete with a tarpit.
To a first approximation, I'd expect that 1980's Unix programmers generally worked in environments more accepting of restarting a batch process when there was an error in the code -- i.e. they were significantly less likely to hold a clerical or secretarial "TPS report" job in a regular commercial office. Though that split has narrowed, it's still pretty uncommon for a computer user with little control over their work environment and deadlines to be using *nix rather than Windows.
In the Savings and Loan recession I had one of those jobs with unpaid overtime and an employer who was comfortable abusing it. I'd start a batch print job and go home. When I came back next morning picking up the paper off the floor was pure pleasure relative to baby-sitting the machines until midnight or more.
I remember when I learned that command.com kept the current .bat file open while running it, rather than reading it into memory (like most other interpreters I used or learned at school). We had a client on a Novell network with a menu program that generated .bat files to start the selected program and then restart the menu. This turned out not to work very well on a multi-user network: user A would exit an app and continue running the .bat at some byte offset -- into a file that user B had by then caused to have different content, probably without even a line break at said offset into "THE" file.
Easy enough to fix by having the menu program put the run script (.bat file) into the Novell equivalent of each user's home directory, but it was a real WTF moment at first :-)
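A hypothetical sketch of that failure mode in Python (`next_command` is a made-up name, and real COMMAND.COM behavior is more involved): the interpreter remembers a byte offset into the file rather than a parsed command list, so resuming against regenerated content can land mid-line.

```python
def next_command(content: bytes, offset: int):
    """Return (command, new_offset), reading from a saved byte offset
    roughly the way COMMAND.COM resumes a .bat after a program exits."""
    end = content.find(b"\r\n", offset)
    if end == -1:
        return content[offset:], len(content)
    return content[offset:end], end + 2

# User A's interpreter runs the first command and remembers the offset:
old = b"RUN WORDPROC\r\nMENU\r\n"
cmd, saved = next_command(old, 0)       # cmd == b"RUN WORDPROC", saved == 14
# Meanwhile the menu program regenerates the shared file for user B:
new = b"RUN SPREADSHEET\r\nMENU\r\n"
resumed, _ = next_command(new, saved)   # resumed == b"T" -- a mid-line fragment
```

Putting each user's generated .bat in a per-user directory sidesteps this, because every interpreter then holds an offset into its own private file.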
[1]: https://stackoverflow.com/questions/906586/changing-a-batch-...
[2]: http://swag.outpostbbs.net/DOS/0019.PAS.html