On a related note, growing a hash table by a factor of two might not be optimal because of how it interacts with the memory allocator: each new allocation is larger than all previous allocations combined, so previously freed memory can never be reused. Starting with m buckets, one allocates m * 2^t buckets after t growing operations, m * (2^(t + 1) - 1) buckets in total, which still falls m buckets short of the m * 2^(t + 1) buckets required for growing again [1]. Whether this is actually an issue depends on the memory allocator and on other factors such as the presence of a compacting garbage collector. The golden ratio phi appears again when looking for a better growth factor; I will just leave a link to a Stack Overflow question [2] as a starting point, because I am unable to find the article I originally had in mind. The sketch after the footnotes makes the arithmetic concrete.

[1] This assumes you can somehow incorporate the current allocation as well; the memory actually freed before that point amounts to only m * (2^t - 1) buckets.

[2] https://stackoverflow.com/questions/2369467/why-are-hash-tab...
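Here is a minimal Python sketch (my own illustration, not taken from the linked question) that checks whether the space freed by earlier allocations can ever accommodate the next one, ignoring fragmentation and assuming the freed blocks could be coalesced into one contiguous region:

    # Can the next allocation ever fit into the space freed by earlier ones?
    # Fragmentation is ignored; freed blocks are assumed to coalesce.
    def can_ever_reuse(growth_factor, initial=16, steps=30):
        size = initial
        freed = 0  # total size of all blocks freed so far
        for _ in range(steps):
            new_size = int(size * growth_factor)
            if freed >= new_size:   # the new block fits into the old holes
                return True
            freed += size           # the old block is freed after copying
            size = new_size
        return False

    print(can_ever_reuse(2.0))   # False: the freed space is always too small
    print(can_ever_reuse(1.5))   # True: a factor below phi (~1.618) eventually allows reuse

With a factor of 2 the freed blocks never add up to enough; with a factor below phi they eventually do, which is where the golden ratio enters the picture.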
