Hacker News

He's missing one key fact in his analysis which makes the conclusion incorrect, imho: early computers were really, really slow!

Early mechanical or relay computers, for example, had dedicated hardware to compute multiplications, divisions, or more complex operations on full-width floating point registers, and they still took minutes to complete a computation. Doing things in software sounds great when written from a multi-GHz modern computer, but it would have been impractically slow at the time. Something as seemingly simple as a division can take thousands of cycles when you need to do everything "in software" with only additions and subtractions. When your machine has a clock frequency on the order of 1 Hz, that translates to hours for a single operation.
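To make the cycle count concrete, here's a minimal sketch (in Python, with a function name of my own choosing) of the most naive software division, repeated subtraction, which is the kind of thing an add/subtract-only machine would have to do:

```python
def divide_by_subtraction(dividend, divisor):
    # Integer division using only subtraction and addition,
    # roughly what "division in software" means on hardware
    # that can do nothing but add and subtract.
    quotient = 0
    steps = 0  # each loop iteration is at least one machine cycle
    while dividend >= divisor:
        dividend -= divisor
        quotient += 1
        steps += 1
    return quotient, dividend, steps
```

Dividing 50000 by 7 this way takes 7142 subtractions; on a 1 Hz machine that's about two hours for one division. (Real software division would use shift-and-subtract long division, far fewer steps, but still hundreds of cycles.)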

So the issue was mostly technical: the hardware at their disposal was not capable of doing things quickly enough, so they did not design impossible things based on a future understanding of how things should be done. The process of invention is very iterative, building on technological progress. When advances in semiconductors allowed clock speeds millions of times faster, doing things "in software" came naturally.



