A huge number of the limitations were "just" down to cost.
E.g. on the C64 the CPU and the graphics chip compete for memory cycles all the time. This is why the screen blanks when loading, for example - to prevent the graphics chip from "stealing" cycles from the CPU. A more expensive memory subsystem would have allowed working around that and sped up the entire machine. This is a recurring issue with many other architectures as well, for cost-saving reasons (e.g. the Amiga "chipram" vs "fastram" distinction).
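For illustration (a hypothetical sketch, not anything from the actual ROMs): in cc65-style C, the trick loaders used boils down to clearing the DEN bit (bit 4) of the VIC-II control register at $D011. With the display disabled, no "badlines" occur, so the VIC-II stops stealing CPU cycles for screen-memory fetches and the transfer loop runs at full, predictable speed:

    /* Hypothetical cc65-style sketch; the register address and bit are the
       standard C64 memory map, but the helper names are made up. */
    #define VIC_CTRL1 (*(volatile unsigned char *)0xD011u)

    static void screen_off(void) {
        VIC_CTRL1 &= (unsigned char)~0x10;  /* clear DEN: display blanked, no badline DMA */
    }

    static void screen_on(void) {
        VIC_CTRL1 |= 0x10;                  /* set DEN: display back on after loading */
    }

(Sprites still steal cycles if they're enabled, so loaders typically turned those off too.)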
From 1983, the 8/16-bit 65C816 (used in the Apple IIGS) would have been a more natural choice for the C128, or even for a "revised" C64 (at the cost of some compatibility), and it eventually reached clock rates of up to 14MHz.
A lot of it was also down to market pressure and R&D costs... All of the home computer manufacturers led precarious existences, as evidenced by most of them failing or transitioning out of that market (and often failing anyway afterwards). Even at its peak, Commodore was notorious for being tight-fisted and spending ridiculously little on R&D relative to its size, and that was probably what let it survive as long as it did despite serious management failures, while Apple largely survived by going after a less cost-sensitive market segment with much higher margins even then.
The list of R&D efforts at Commodore that were shut down not because they weren't technically viable, but because the company wouldn't spend money on them (or couldn't afford to), is miles long. I'm sure the same was true at many of the other companies of the era (but I was a C64 and Amiga user, so it's mostly Commodore I've read up on...)
We could certainly have had far more capable machines years earlier if a handful of these companies had had more money to complete more of these projects, and/or if there had been demand for more expensive machines at the time.
But then progress is very often limited by resource availability, not capability.