I was at Apple when they were transitioning from Pascal (which had driven much of the Lisa and Macintosh) to C (and some C++).
With the exception of a few curmudgeons, C was the way to go for most people. You lost nested procedures, but the stuff you got back was immense. Pascal's big problem was that the extensions required to make it a "real" systems programming language (arbitrary memory access, I/O that didn't suck, etc.) were not standardized; good luck porting anything. Pascal strings were effectively broken (with a proliferation of Str255 / Str64 / Str32 types that were effectively incompatible unless you cheated).
C, while it still lacked a standard, had everything you needed out of the box, and the direction for a standard was clear.
Pascal became a legacy language and died at Apple in the early 90s.