
If you're CPU-bound it might make sense not to gzip. The same goes if you send a lot of small files (in that case you'd probably do better to restructure things so you don't need to send many small files, but in general a single web page requests tens if not hundreds of small resources these days).
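To make the trade-off concrete, here's a minimal sketch of size-gated compression. The `MIN_GZIP_SIZE` threshold and the `maybe_gzip` helper are hypothetical, not from any particular server; the point is just that tiny or incompressible payloads aren't worth the CPU.

```python
import gzip

# Assumed cutoff: below this, gzip's CPU cost and ~20 bytes of framing
# overhead outweigh any transfer savings. Tune for your workload.
MIN_GZIP_SIZE = 1024  # bytes

def maybe_gzip(body: bytes) -> tuple[bytes, dict]:
    """Return (payload, extra_headers), compressing only when it pays off."""
    if len(body) < MIN_GZIP_SIZE:
        return body, {}
    compressed = gzip.compress(body, compresslevel=6)
    if len(compressed) >= len(body):  # incompressible data, e.g. JPEG
        return body, {}
    return compressed, {"Content-Encoding": "gzip"}
```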



The BBC site's headers show heavy use of caching; they could easily have cached the compressed version and sent Vary: Accept-Encoding so the cache serves each user agent the variant matching its stated encoding preference.
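A minimal sketch of how a cache respects `Vary: Accept-Encoding`: the cache key incorporates the client's Accept-Encoding header, so the gzipped and identity variants are stored and served separately. The `cached_response` helper and the normalization rule are hypothetical simplifications (real caches normalize Accept-Encoding more carefully).

```python
import gzip

# key = (path, normalized Accept-Encoding) -> (body, response headers)
cache: dict[tuple[str, str], tuple[bytes, dict]] = {}

def normalize_accept_encoding(header: str) -> str:
    # Assumed normalization: only distinguish gzip vs identity, so the
    # cache doesn't fragment on every distinct header string.
    return "gzip" if "gzip" in header else "identity"

def cached_response(path: str, accept_encoding: str, render) -> tuple[bytes, dict]:
    key = (path, normalize_accept_encoding(accept_encoding))
    if key not in cache:
        body = render(path)
        headers = {"Vary": "Accept-Encoding"}
        if key[1] == "gzip":
            body = gzip.compress(body)
            headers["Content-Encoding"] = "gzip"
        cache[key] = (body, headers)
    return cache[key]
```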


That's how it should work, but until relatively recently it had some annoying hitches with clients that claimed to support gzip but were buggy[1], which meant you had to maintain a list of clients to never enable compression for. One of those clients was IE6, so I'm not terribly surprised that they put it off: for a site as widely visited as the BBC, even a small percentage of visitors translates into a potentially large number of complaints.

I still don't think that excuses not having implemented it by now, but I'd bet the explanation starts with some engineer having a bad week and not wanting to relive the experience.

1. Some major CDNs and caching proxies, nginx among them, also haven't bothered to implement Vary, but that doesn't matter for this particular scenario since the BBC doesn't appear to be using any of them.
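For what it's worth, the client-blocklist workaround described above survives in nginx to this day. A minimal sketch (real nginx directives; the threshold value is an arbitrary choice):

```nginx
# Enable gzip globally but skip user agents known to mishandle it.
# "msie6" is a special mask matching old MSIE versions with broken
# gzip support.
gzip            on;
gzip_vary       on;     # emit "Vary: Accept-Encoding"
gzip_min_length 1024;   # don't bother compressing tiny responses
gzip_disable    "msie6";
```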



