
>and a GC that takes O(1) time to free unlimited amounts of memory

If the unlimited amount of memory is all in one contiguous block, ALL of it is ready to be freed, and determining that it is all ready somehow takes O(1) time, then what you say is plausible.
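
For that one plausible case, a bump-pointer arena is the standard construction: freeing is a single pointer reset no matter how much was allocated. A minimal sketch in Python (the Arena class and its method names are mine, for illustration only):

    class Arena:
        # Bump-pointer arena sketch: everything lives in one contiguous
        # block, and "freeing unlimited amounts of memory" is one O(1)
        # pointer reset -- precisely because we know, by construction,
        # that ALL of it is dead at once.
        def __init__(self, capacity):
            self.buf = bytearray(capacity)  # one contiguous block
            self.top = 0                    # bump pointer

        def alloc(self, size):
            if self.top + size > len(self.buf):
                raise MemoryError("arena exhausted")
            offset, self.top = self.top, self.top + size
            return offset  # caller treats this as a handle into buf

        def free_all(self):
            self.top = 0  # O(1) regardless of how much was allocated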

In any other case it's fantasy. Suppose I have n variables and I point each of them at a distinct new object (not pointed to from anywhere else). Then I reset a random subset of them to null. Depending on which ones I set to null, there are 2^n possibilities for what the GC must do. If your claimed O(1) bound is long enough for the processor to take k conditional jumps, the GC has at most 2^k distinct execution paths, and hence at most 2^k possible results. For n > k, there are necessarily subsets in the above scenario that your GC fails to handle.
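
Just to make the counting concrete, here is the adversarial heap as a sketch (function name and structure are mine, not anything from a real GC):

    import random

    def adversarial_roots(n, seed=0):
        # n roots, each pointing at a distinct object reachable from
        # nowhere else; then a random subset is reset to null (None).
        # Which objects remain live depends on the coin flips, so a
        # correct GC must be able to produce any of 2**n outcomes.
        rng = random.Random(seed)
        roots = [object() for _ in range(n)]
        return [r if rng.random() < 0.5 else None for r in roots]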

>This time-space trade-off is always available to the end user of such a system

People routinely run servers on virtual machines with very limited RAM.




Your argument proves that the time taken must be at least linear in the number of object references in live objects, i.e. Omega(n). The (potentially billions of) unreachable objects are never touched.
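
That is the usual tracing argument; a toy mark phase makes it visible (the `children` attribute is an assumption about the object layout, not a real API):

    def mark(roots):
        # Toy mark phase: work is proportional to the live references
        # actually traversed. Unreachable objects never appear on the
        # stack, so billions of dead objects add nothing to the cost.
        marked, stack = set(), [r for r in roots if r is not None]
        while stack:
            obj = stack.pop()
            if id(obj) in marked:
                continue
            marked.add(id(obj))
            stack.extend(getattr(obj, "children", ()))  # assumed layout
        return marked

Running mark over the adversarial roots above touches only the surviving ones; the nulled-out objects cost nothing.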

I agree that limited RAM environments are important. Asymptotic complexity only matters when you're effectively operating at the asymptote.



