
Isn’t this basically a question about the halting problem? Whatever arbitrary cutoff you choose might not work for all inputs.


No, compression formats are not Turing-complete. You control the code that interprets the compressed stream, allocates the memory, writes the output, and so on, based on what it sees there, and you can simply choose to return an error after writing N bytes.
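As a minimal sketch of that idea in Python, assuming a plain zlib stream: `zlib.decompressobj` accepts a `max_length` cap on the output of each `decompress` call, so you can refuse to produce more than N bytes no matter what the stream encodes. The `safe_decompress` helper and the 10 MiB limit here are illustrative choices, not a standard API.

```python
import zlib

MAX_OUTPUT = 10 * 1024 * 1024  # arbitrary cap: 10 MiB of decompressed data

def safe_decompress(data: bytes, limit: int = MAX_OUTPUT) -> bytes:
    # decompressobj lets us cap output size via max_length,
    # instead of inflating the whole stream in one shot.
    d = zlib.decompressobj()
    out = d.decompress(data, limit)
    if d.unconsumed_tail:
        # input remained after hitting the cap: treat it as a bomb
        raise ValueError("decompressed size exceeds limit")
    return out
```

A stream whose decompressed size exceeds `limit` raises instead of exhausting memory, while ordinary small streams round-trip unchanged.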


Yes, and even if they were Turing-complete, you could still run your Turing-machine equivalent for at most n steps before bailing.
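The point above can be sketched as a step-budgeted interpreter loop. Everything here is hypothetical scaffolding (the `step_fn` callback and its `(state, done)` return shape are assumptions for illustration): undecidability only applies to unbounded execution, so a hard step budget always terminates.

```python
def run_bounded(step_fn, state, max_steps):
    """Run a single-step transition function, bailing after max_steps.

    step_fn takes a state and returns (next_state, done); we never
    need to decide whether it halts, because we stop regardless.
    """
    for _ in range(max_steps):
        state, done = step_fn(state)
        if done:
            return state
    raise TimeoutError("step budget exhausted")

# example: a counter that "halts" at 5
count_to_five = lambda n: (n + 1, n + 1 >= 5)
```

`run_bounded(count_to_five, 0, 100)` finishes normally, while a non-halting `step_fn` simply hits the budget.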


Not really. It's easy to abort after exceeding a limit on uncompressed bytes or files written. The problem is that the typical software for handling these files does not implement such restrictions.
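A sketch of what such restrictions could look like for zip archives, using the standard-library `zipfile` module. The `extract_checked` helper, the limits, and the dict-of-contents return shape are illustrative assumptions; the key detail is counting bytes actually written rather than trusting the sizes declared in the archive headers, which a crafted bomb can falsify.

```python
import io
import zipfile

def extract_checked(src, max_files=100, max_bytes=50 * 1024 * 1024):
    """Extract a zip into memory, aborting on too many files or bytes."""
    written = 0
    out = {}
    with zipfile.ZipFile(src) as zf:
        infos = zf.infolist()
        if len(infos) > max_files:
            raise ValueError("too many files in archive")
        for info in infos:
            chunks = []
            with zf.open(info) as f:
                while True:
                    chunk = f.read(64 * 1024)
                    if not chunk:
                        break
                    written += len(chunk)
                    # count real output bytes; header sizes can lie
                    if written > max_bytes:
                        raise ValueError("uncompressed size exceeds limit")
                    chunks.append(chunk)
            out[info.filename] = b"".join(chunks)
    return out
```

An honest archive extracts normally; one that inflates past the byte budget is rejected mid-extraction instead of filling the disk.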



