
It might sound trite, but we got "big disk", "big memory", "big cpu" and "big gpu" instead.

It's crazy how much you can do with one machine these days. Hence you often just have "data". And then Snowflake/BigQuery/Redshift if it literally can't fit on one machine (which is rare).



Not to mention big compression and big vectorization. I'm right this minute messing around with a trillion+ row dataset in ClickHouse. It runs fine on a VM with 36 vCPUs and 1.8 TB of storage (AWS c5.9xlarge instance type, EBS gp2 storage).
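
For anyone curious what poking at a table that size looks like from Python, here's a rough sketch using the clickhouse-connect client. The host, credentials, and the "events" table are placeholders I'm assuming, not details from the setup above:

    # Minimal sketch of querying a large ClickHouse table from Python with
    # clickhouse-connect. Host, credentials, and the "events" table name are
    # assumed placeholders, not details from the comment above.
    import clickhouse_connect

    client = clickhouse_connect.get_client(
        host="localhost",   # assumed: wherever your ClickHouse server lives
        username="default",
        password="",
    )

    # Aggregations like this scan compressed column data in parallel across
    # all cores, which is why one 36-vCPU box can chew through ~1e12 rows.
    result = client.query(
        """
        SELECT toDate(ts) AS day, count() AS rows
        FROM events
        GROUP BY day
        ORDER BY day
        """
    )

    for day, rows in result.result_rows:
        print(day, rows)
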


c5.9xlarge -> 72 GiB memory + 36 vCPUs (EBS storage) - $1.53/hour on-demand (N. Virginia).


It's not very expensive, as your note implies. This is probably all you need for most analytics projects. Well, plus a couple of replicas in case the first one catches on fire.
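
Rough back-of-the-envelope on that, assuming the $1.53/hour on-demand rate quoted above (reserved or spot would be cheaper):

    # Back-of-the-envelope monthly cost, assuming the $1.53/hour on-demand
    # rate quoted above and ~730 hours in a month.
    hourly = 1.53
    hours_per_month = 730

    single_node = hourly * hours_per_month    # ~$1,117/month
    with_two_replicas = 3 * single_node       # ~$3,351/month

    print(f"single node:       ${single_node:,.0f}/month")
    print(f"with two replicas: ${with_two_replicas:,.0f}/month")
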



