On my blog, my feed.xml is currently 62,654 bytes compressed and 307,726 bytes uncompressed. The compressed version is served with gzip_static, so if the client sends the HTTP header "Accept-Encoding: gzip" it gets the pre-compressed file at no additional CPU cost to me.
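For reference, the server side of this is tiny. A minimal sketch of the setup, assuming nginx (where the gzip_static directive lives); the location path is illustrative, and it presumes a feed.xml.gz sitting next to feed.xml:

```nginx
# Sketch: serve the pre-compressed feed when the client accepts gzip.
# Requires ngx_http_gzip_static_module and a feed.xml.gz on disk.
location = /feed.xml {
    gzip_static on;   # send feed.xml.gz instead of compressing on the fly
}
```

With this, compression happens once at build time instead of on every request.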
Looking at my access logs, some RSS readers are well-behaved and get served 304s. Some of them, however, not only request feed.xml without any caching headers, but don't even send "Accept-Encoding: gzip". They download feed.xml every time, and they download it uncompressed.
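Concretely, here is a sketch of the request a well-behaved reader would make. The feed URL and the validator values (ETag, Last-Modified date) are hypothetical; in practice a reader would echo back whatever the server sent on the previous fetch:

```python
import urllib.request

# Hypothetical feed URL and validators from the previous response.
req = urllib.request.Request("https://example.com/feed.xml")

# Accept a compressed response body.
req.add_header("Accept-Encoding", "gzip")

# Conditional-request validators: if the feed is unchanged, the server
# answers 304 Not Modified with an empty body instead of the full feed.
req.add_header("If-None-Match", '"abc123"')
req.add_header("If-Modified-Since", "Mon, 01 Jan 2024 00:00:00 GMT")
```

Either validator alone is enough for a 304; a reader that sends neither, and no Accept-Encoding, forces the full uncompressed transfer every single poll.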
I'm hosting my blog on a feeble 5-year-old Synology NAS located in my apartment. It's not seeing a lot of traffic, so it's currently not a problem, but these RSS readers are wasting orders of magnitude more network bandwidth and CPU (gotta encrypt it all for HTTPS) on my end than well-behaved RSS readers do.
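The arithmetic behind that claim, using the sizes from the top of the post; the size of a 304 response is an assumption (headers only, roughly a few hundred bytes):

```python
uncompressed = 307_726   # full feed.xml, no gzip
compressed = 62_654      # feed.xml.gz
not_modified = 300       # assumed size of a 304 response (headers only)

gzip_ratio = uncompressed / compressed        # ~5x saved by compression alone
vs_304 = uncompressed / not_modified          # ~1000x vs. a conditional fetch

print(round(gzip_ratio, 1), round(vs_304))
```

So a reader that sends neither caching headers nor Accept-Encoding costs roughly three orders of magnitude more bytes per poll than one that revalidates.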
It's just plain embarrassing.