
> I would say that, if you make such a claim, you should state why it is a mistake and what should be done instead.

Others have explained the problem with the von Neumann bottleneck,[1][2] and ways to overcome it or even redefine it.[3]

The industry is evolving toward programming paradigms less affected by that original computation model, such as functional, functional-reactive, or agent-based programming(*), yet most of these are still compiled down to von Neumann-style runtimes for execution. And now that machine learning is increasingly being used to solve new problems, many computations may be done on dedicated non-von-Neumann hardware such as tensor processors or analog computers (which are coming back with a vengeance).

(*) Listed in increasing order of real-world expressivity, although the computational expressivity of all of them is mathematically equivalent to that of Brainfuck: they are all Turing-complete.
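To make the point concrete, here is a minimal sketch (in Python, chosen only for illustration) of how a functional-style computation, which declares no memory traffic at all, still gets executed as von Neumann-style updates to an accumulator cell:

```python
from functools import reduce

data = [1, 2, 3, 4, 5]

# Functional style: the sum is declared as a fold; the program text
# mentions no mutable storage.
functional_sum = reduce(lambda acc, x: acc + x, data, 0)

# What the runtime actually does is von Neumann-style: one accumulator
# cell, updated step by step, with each value fetched through the same
# processor-memory channel -- the bottleneck in question.
imperative_sum = 0
for x in data:
    imperative_sum = imperative_sum + x

assert functional_sum == imperative_sum == 15
```

The declarative form is easier to reason about, but until it runs on non-von-Neumann hardware, it pays the same memory-traffic cost as the loop.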

[1] https://dl.acm.org/doi/10.1145/1283920.1283933

[2] https://www.techtarget.com/whatis/definition/von-Neumann-bot...

[3] https://www.sigarch.org/the-von-neumann-bottleneck-revisited...


