
It was certainly possible to run the algorithm on an array that large before the bug was actually hit, but did the original developer have that sort of space on their desktop at the time? Possibly not.

If a unit test only runs on a server and not on the developers' laptops, then it's not going to be written. Ideally someone would write a test that is tagged to run only on the server, but that's a lot of extra work if that isn't already a common practice on the project. Even now I would be quite wary of writing a max-size-input test for an array of ints, and especially of objects; that is going to raise some questions for being a slow and highly resource-consuming test.

If I were worrying about the same thing in aerospace programming, however, there's no question that test would get written, and we would ensure it ran on the hardware it was designed to run on. In typical business software it's less common to run potentially very long-running tests and simulations of machine states; everyone wants to move faster than that, especially for the sake of a max-input test.



Also, you can have checked math, as in Rust, and automatically just crash when you overflow a variable.

In this case, it's not a bug (you cannot get an incorrect result) but an unsupported input.
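Java has an opt-in equivalent of checked math in `Math.addExact` and friends, which throw instead of silently wrapping. A minimal sketch of the classic midpoint computation, assuming the overflow in question is `low + high` (class and method names here are illustrative, not from the original discussion):

```java
public class CheckedMid {
    // Classic buggy midpoint: silently wraps negative
    // when low + high exceeds Integer.MAX_VALUE.
    static int midUnchecked(int low, int high) {
        return (low + high) / 2;
    }

    // Checked midpoint: Math.addExact throws ArithmeticException
    // on overflow instead of wrapping, much like Rust's checked math.
    static int midChecked(int low, int high) {
        return Math.addExact(low, high) / 2;
    }

    // Overflow-free midpoint: the standard fix.
    static int midSafe(int low, int high) {
        return low + (high - low) / 2;
    }

    public static void main(String[] args) {
        int low = 1_500_000_000, high = 2_000_000_000;
        System.out.println(midUnchecked(low, high)); // wraps to a negative value
        System.out.println(midSafe(low, high));      // 1750000000
        try {
            midChecked(low, high);
        } catch (ArithmeticException e) {
            System.out.println("overflow detected");
        }
    }
}
```

With checked math, the unsupported input crashes loudly instead of returning a nonsense index.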


If you're worried about being able to make an array where a.length == Integer.MAX_VALUE (which is reasonable for an algorithm that does any math on the length), replace the array with a substitute whose size you can mock (e.g. in Java that would be possible against an ArrayList). You can test the algorithm separately from the execution of the algorithm.
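One way to sketch that idea without a mocking framework is a List that fakes its size, computing each element on demand so a maximum-size search costs almost no memory. The class and element scheme below are hypothetical, purely for illustration:

```java
import java.util.AbstractList;
import java.util.List;

// Hypothetical: a List that claims Integer.MAX_VALUE sorted elements
// but computes each one on demand (element i is simply i), so a search
// written against the List interface can be exercised at maximum size
// without allocating gigabytes.
public class HugeSortedList extends AbstractList<Integer> {
    @Override
    public Integer get(int index) {
        return index; // sorted by construction
    }

    @Override
    public int size() {
        return Integer.MAX_VALUE; // the "mocked" size
    }

    // A binary search written against List rather than int[].
    static int binarySearch(List<Integer> list, int key) {
        int low = 0, high = list.size() - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2; // overflow-safe midpoint
            int cmp = list.get(mid).compareTo(key);
            if (cmp < 0) low = mid + 1;
            else if (cmp > 0) high = mid - 1;
            else return mid;
        }
        return -1;
    }

    public static void main(String[] args) {
        List<Integer> list = new HugeSortedList();
        // Searching near the top of the range is exactly where
        // a (low + high) / 2 midpoint would have overflowed.
        System.out.println(binarySearch(list, Integer.MAX_VALUE - 2));
    }
}
```

Had `binarySearch` used the buggy `(low + high) / 2` midpoint, searching near the top of this list would trip the overflow, which is the whole point of the fake size.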


Small nitpick: you don't need 16GiB; 2^31 bytes is "just" 2GiB. Doesn't contradict your point though.


Each element of the input array is at least 4 bytes, bringing it to 8GiB.


Not necessarily. But, still, my work laptop from 2020 has 32GB of memory. So, not that implausible.


Swap in an ArrayList and mock size() and you need a few bytes...


You might not need 16GB of memory. There are systems where int is only 16 bits, and overflowing 16 bits is not that difficult.

But maybe it was uncommon to have arrays larger than 64K in the 80s due to segmented memory?


The core of Java was being written in the late 1990s. I had a machine in 1995 that had 16MB of memory but 8MB was more typical for Pentium machines. By 2000 the AMD Athlon and Pentium 3 were the latest and greatest and Anandtech was testing with 128MB of memory [1].

Java defines an int as 32-bit signed and nothing else; it calls 16-bit ints shorts. So with 2^31 four-byte elements, it definitely has to be 8GB for the array.

Sun was using Solaris machines rather than PCs, and those were higher spec on things like memory, but I still doubt they had 2GB, let alone the 8GB+ needed to run this test. That sort of memory didn't become routine for another decade, when Sandy Bridge was being tested with 8GB in 2011 [2].

Also goes to show how much things have stagnated: a desktop computer with 16GB has been standard for a long time. In the preceding decade we went from 128MB to 8GB as normal, while in the 15 years since then, normal has only moved to 16-32GB, which is nowhere near the same pace of progress in memory density.

[1] https://www.anandtech.com/show/566/6 [2] https://www.anandtech.com/show/4083/the-sandy-bridge-review-...


The correct way to test that code is to write a version that takes not an actual array but an index -> int function; then you wouldn't need to instantiate the array at all.
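That suggestion can be sketched in Java with `IntUnaryOperator` standing in for the index -> int function; the class and parameter names here are made up for illustration:

```java
import java.util.function.IntUnaryOperator;

// Sketch of the suggestion above: binary search over a "virtual array"
// described only by a length and an index -> int function, so even a
// maximum-length input requires no allocation at all.
public class VirtualArraySearch {
    static int binarySearch(int length, IntUnaryOperator elementAt, int key) {
        int low = 0, high = length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2; // overflow-safe midpoint
            int value = elementAt.applyAsInt(mid);
            if (value < key) low = mid + 1;
            else if (value > key) high = mid - 1;
            else return mid;
        }
        return -1;
    }

    public static void main(String[] args) {
        // A sorted "array" of Integer.MAX_VALUE elements where element i == i.
        int result = binarySearch(Integer.MAX_VALUE, i -> i, 123_456_789);
        System.out.println(result); // 123456789
    }
}
```

The same `i -> i` function lets a unit test drive the algorithm at its maximum supported length in milliseconds, with no server-sized machine required.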



