
While I agree with most of your points, I think these dithered images use less client-side power than regular image compression (e.g. JPEG or PNG). The dithered images are smaller (according to them), and I don't see why they would use more CPU per byte to decode than JPEG.
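For what it's worth, the size claim is easy to check yourself. A rough sketch in Python using Pillow (the input file name, palette size, and JPEG quality below are placeholders, not anything from their site) reduces an image to a small dithered palette, saves it as PNG, and compares the result with an ordinary JPEG of the same image:

    import io
    from PIL import Image

    img = Image.open("input.jpg").convert("RGB")  # hypothetical source image

    # Dithered, palette-reduced version (roughly what dithered web images are)
    dithered = img.quantize(colors=4, dither=Image.Dither.FLOYDSTEINBERG)

    png_buf, jpg_buf = io.BytesIO(), io.BytesIO()
    dithered.save(png_buf, format="PNG", optimize=True)
    img.save(jpg_buf, format="JPEG", quality=75)

    print("dithered PNG:", png_buf.tell(), "bytes")
    print("regular JPEG:", jpg_buf.tell(), "bytes")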



> The dithered images are smaller (according to them), and I don't see why they would use more CPU per byte to decode than JPEG.

Just try substituting something else in that statement and see if it makes sense. “The lzma-compressed files are smaller than the raw test files, so opening them in an editor shouldn’t use more CPU” or “HEVC-compressed videos are smaller than DivX videos, so shouldn’t they be more efficient to decode and play (sans hardware acceleration)?”


The examples you give compare new/good compression (lzma, HEVC) with old/bad compression (uncompressed, DivX). Yes, the new/good compression uses more CPU per byte.
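To make that point concrete, here is a small illustrative sketch using Python's standard-library codecs, with zlib standing in for the older/cheaper scheme and lzma for the newer/denser one (the test file path is a placeholder and the numbers will vary with the data):

    import lzma
    import time
    import zlib

    data = open("/usr/share/dict/words", "rb").read()  # any reasonably large test file

    for name, comp, decomp in [("zlib", zlib.compress, zlib.decompress),
                               ("lzma", lzma.compress, lzma.decompress)]:
        blob = comp(data)
        start = time.perf_counter()
        for _ in range(20):
            decomp(blob)
        elapsed = (time.perf_counter() - start) / 20
        print(f"{name}: {len(blob)} compressed bytes, "
              f"{elapsed / len(blob) * 1e9:.1f} ns of decode time per compressed byte")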

I see no reason to believe their website uses a compression scheme that costs more CPU per byte than PNG or JPEG. I don't think they're using anything advanced. Actually, I just checked, and their website is using PNG.
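If anyone wants to measure it rather than argue from analogy, a minimal sketch along these lines (Python with Pillow; the file paths are hypothetical) times the decode of each format and normalizes by file size:

    import io
    import time
    from PIL import Image

    def decode_ns_per_byte(path, repeats=50):
        data = open(path, "rb").read()
        start = time.perf_counter()
        for _ in range(repeats):
            im = Image.open(io.BytesIO(data))
            im.load()  # force the actual pixel decode
        elapsed = (time.perf_counter() - start) / repeats
        return elapsed / len(data) * 1e9

    print("dithered PNG:", decode_ns_per_byte("dithered.png"), "ns per byte")
    print("regular JPEG:", decode_ns_per_byte("regular.jpg"), "ns per byte")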



