Hacker News

Thank you for correcting this! (I get the same numbers as you for a million elements.)



Note that the overhead will vary a lot based on the number of elements you use. I believe Julia's Dicts currently resize to 4x their current size (to avoid hash collisions), so you should see anywhere from 30% extra to 300% extra memory depending on how many elements you have. There has been some effort recently to move to a more SwissDict-like approach, which should reduce the memory overhead.
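A rough way to see why the overhead swings so widely: right after a 4x resize most slots are empty, while just before the next resize the table is nearly full. Here's a minimal Python sketch of that effect (a hypothetical hash-table model, not Julia's actual implementation; `initial`, `growth`, and `max_load` are assumed parameters):

```python
def slots_needed(n, initial=16, growth=4, max_load=0.75):
    """Slot-array size after inserting n elements one at a time,
    quadrupling the array whenever the load factor exceeds max_load."""
    slots = initial
    for inserted in range(1, n + 1):
        if inserted / slots > max_load:
            slots *= growth
    return slots

for n in (10, 1_000, 1_000_000):
    slots = slots_needed(n)
    extra = (slots - n) / n * 100
    print(f"{n:>9} elements -> {slots:>9} slots ({extra:.0f}% extra slots)")
```

Depending on where the element count lands relative to the last resize, the fraction of unused slots ranges from near zero up to roughly 3x the live data, which is the source of the wide 30%–300% band once per-slot bookkeeping is added on top.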



