In my previous company, we had 63 petabytes of data in Snowflake.



That sounds great: the storage problem is solved.

What about large-scale reads via OLAP queries (y'know, the typical measures and dimensions)?
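
For illustration, that kind of read is a GROUP BY over a fact table: the dimensions become the grouping columns and the measures become the aggregates. In Snowflake you'd write it as plain SQL; since the thread turns to Spark next, here's a minimal PySpark sketch of the same shape (table, columns, and path are all hypothetical):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("olap-rollup").getOrCreate()

    # Hypothetical fact table: one row per sale.
    sales = spark.read.parquet("s3://bucket/warehouse/sales/")

    # Dimensions: region, product_category. Measures: revenue, order count.
    rollup = (
        sales.groupBy("region", "product_category")
        .agg(
            F.sum("revenue").alias("total_revenue"),
            F.countDistinct("order_id").alias("orders"),
        )
    )
    rollup.show()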


That's a respectable amount for a DW, true. Spark and its ilk are designed for much larger scales, though; multiple FAANG use cases run Spark in the petabytes-per-week range.
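
One common shape for a petabytes-per-week workload is an incremental batch job over date-partitioned storage rather than one giant scan. A hedged sketch, assuming a hypothetical event log partitioned by event_date (paths and columns are illustrative):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("weekly-batch").getOrCreate()

    # Partition pruning limits the scan to one week of directories,
    # even if the full table is petabytes.
    events = (
        spark.read.parquet("s3://bucket/events/")
        .where(F.col("event_date").between("2024-01-01", "2024-01-07"))
    )

    daily = events.groupBy("event_date").agg(F.count("*").alias("events"))
    daily.write.mode("overwrite").parquet("s3://bucket/metrics/daily_counts/")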



