> Any random access structured programming language completely breaks down for working with GBs of data, in parallel and on a cluster of machines.

Why would you need a cluster of machines to work on mere gigabytes? You can get single machines with terabytes of RAM. Even laptops come with 64GB now.
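For a sense of scale, here's a minimal sketch in Python (my own, not from the thread) of chewing through a multi-GB file in parallel on a single box: split the file into byte ranges and let one worker process sum each range. The path data.bin is a hypothetical placeholder.

    import os
    from multiprocessing import Pool

    PATH = "data.bin"            # hypothetical multi-GB input file
    WORKERS = os.cpu_count() or 4

    def sum_range(args):
        # Sum the byte values in [start, start + length) of PATH.
        start, length = args
        total = 0
        with open(PATH, "rb") as f:
            f.seek(start)
            remaining = length
            while remaining:
                chunk = f.read(min(1 << 20, remaining))  # 1 MiB reads
                if not chunk:
                    break
                total += sum(chunk)
                remaining -= len(chunk)
        return total

    if __name__ == "__main__":
        size = os.path.getsize(PATH)
        step = size // WORKERS + 1
        # One (offset, length) slice per worker, covering the whole file.
        ranges = [(i, min(step, size - i)) for i in range(0, size, step)]
        with Pool(WORKERS) as pool:
            print(sum(pool.map(sum_range, ranges)))

No cluster, no framework; the OS page cache and a process pool are plenty at this size.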



Yeah, "big data" these days should mean petabytes, or at the low end high hundreds of terabytes coupled with very intensive access patterns.


It's almost like the parent comment is just stringing keywords together instead of making a coherent argument!



