geoduck14 on Nov 16, 2021 | on: Databricks response to Snowflake's accusation of l...
In my previous company, we had 63 petabytes of data in Snowflake.
hello_moto on Nov 22, 2021
That sounds great: the storage problem is solved.
What about large-scale reads via OLAP queries (y'know, the typical measures and dimensions)?
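For concreteness, an OLAP read of that kind just groups by dimension columns and aggregates measure columns. A minimal PySpark sketch of one such query (the table path and the region/product/revenue/customer_id names are hypothetical placeholders, not anything from the thread):

    # Group by dimensions, aggregate measures over a large Parquet table.
    from pyspark.sql import SparkSession
    import pyspark.sql.functions as F

    spark = SparkSession.builder.appName("olap-sketch").getOrCreate()

    # Hypothetical warehouse table path.
    sales = spark.read.parquet("s3://my-bucket/warehouse/sales")

    summary = (
        sales
        .groupBy("region", "product")  # dimensions
        .agg(
            F.sum("revenue").alias("total_revenue"),        # measure
            F.countDistinct("customer_id").alias("buyers"), # measure
        )
        .orderBy(F.desc("total_revenue"))
    )

    summary.show(20)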
blueglassfish on Nov 16, 2021
That's a respectable amount for a DW, true. Spark and its ilk are designed for much larger scales, though. Multiple FAANG use cases for Spark are in the petabytes-per-week range.