
Tangent: what’s the ideal data structure for this problem?

If there were 20 million hotel rooms in the world, each with a price for every day of the year, we'd be looking at around 7.3 billion prices per year. That'd be, say, 4 TB of storage without indexes.
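A quick back-of-envelope check of those numbers (the per-price row size is a hypothetical assumption; the comment thread only gives the ~4 TB total):

```python
# Sanity-check the figures above. bytes_per_price is an illustrative
# guess at row size (price + keys + per-row overhead), not a real measurement.
rooms = 20_000_000
days = 365
prices_per_year = rooms * days              # 7.3 billion price points
bytes_per_price = 550                       # hypothetical average row size
total_bytes = prices_per_year * bytes_per_price

print(prices_per_year)                      # 7300000000
print(round(total_bytes / 1e12, 2))         # ~4.02 TB, in line with the estimate
```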

The problem space seems to offer several natural ways to partition: by locality, by date, etc.
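One way to picture that partitioning is a composite key of (locality, date) so that a common query like "all prices in one city on one date" touches a single partition. This is a minimal in-memory sketch, not any particular database's layout; the names and key order are assumptions:

```python
# Hypothetical partitioning sketch: bucket prices by (region, date) so
# lookups for one locality on one day hit a single bucket. A real system
# would map these buckets onto table partitions or shards.
from collections import defaultdict

prices = defaultdict(dict)  # (region, date) -> {room_id: price_cents}

def set_price(region, date, room_id, price_cents):
    prices[(region, date)][room_id] = price_cents

def prices_for(region, date):
    # One partition scan answers "what does everything in this city
    # cost on this date?"
    return prices[(region, date)]

set_price("paris", "2024-07-01", 42, 18900)
print(prices_for("paris", "2024-07-01"))  # {42: 18900}
```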

I’m curious if there’s a commonly understood match for this problem?

FWIW, with that dataset size, my first experiments would be with a SQL server, because that data will fit in RAM. I don't know if that's where I'd end up, but I'm pretty sure it's where I'd start performance testing when grappling with this problem.



I think your premise is somewhat off. There might be 20 million hotel rooms in the world, but surely they are not individually priced: e.g. all king-bed rooms in a given hotel have the same price on a given day.
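That observation shrinks the dataset a lot. If prices attach to room *types* rather than individual rooms, the row count drops by roughly an order of magnitude. The hotel and room-type counts below are illustrative assumptions, not figures from the thread:

```python
# Pricing by room type instead of by room. Both inputs here are
# hypothetical averages chosen only to show the scale of the reduction.
hotels = 500_000            # assumed number of hotels worldwide
room_types_per_hotel = 5    # assumed average distinct room types per hotel
days = 365

type_priced_rows = hotels * room_types_per_hotel * days
per_room_rows = 20_000_000 * days

print(type_priced_rows)               # 912500000 — under a billion rows
print(per_room_rows // type_priced_rows)  # ~8x fewer rows than per-room pricing
```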



