In my experience it's typically less than 100 bytes. You'll struggle to squeeze the HTTP overhead below that.
Anecdotal evidence: I just base64+gzipped a 13668-byte JPEG. Result: 13713 bytes, a delta of 45 bytes (0.3%).
Command used:

    base64 -w 0 infile | gzip -9 -n > outfile
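
If you'd rather measure it programmatically, here's a minimal Python sketch of the same experiment (the filename is a placeholder; `mtime=0` mimics `gzip -n`, which omits the timestamp from the header):

    import base64
    import gzip

    raw = open("infile.jpg", "rb").read()   # already-compressed data, e.g. a JPEG
    b64 = base64.b64encode(raw)             # base64 inflates the size by ~33%
    packed = gzip.compress(b64, compresslevel=9, mtime=0)   # like `gzip -9 -n`

    delta = len(packed) - len(raw)
    print(f"raw: {len(raw)}  b64+gzip: {len(packed)}  delta: {delta} ({delta / len(raw):.1%})")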
For HTTP you can even use raw deflate, which does away with some of the gzip file format overhead. IIRC this saves you another 18 bytes for the header (10 bytes) and footer (8 bytes).
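
You can observe that 18-byte difference directly with zlib, which produces either framing depending on the window-bits parameter (a quick sketch: negative wbits means raw deflate, 16+15 adds the gzip wrapper; the deflate payload itself is identical):

    import zlib

    def deflate(data: bytes, wbits: int) -> bytes:
        c = zlib.compressobj(9, zlib.DEFLATED, wbits)
        return c.compress(data) + c.flush()

    data = open("infile.jpg", "rb").read()
    gz  = deflate(data, 16 + 15)   # gzip framing: 10-byte header + 8-byte trailer
    raw = deflate(data, -15)       # raw deflate: no framing at all
    print(len(gz) - len(raw))      # -> 18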
Tangent: the reason I'm such a gzip nerd is that I used it to implement compression for level assets in Battalion Wars 2 for the Wii. I didn't want to invent yet another format, and gzip was ideal save for one thing: the size of the raw data is stored at the end of the file. We needed that size up front to allocate memory, but seeking to the end would have added latency from the DVD drive mechanism. The solution was a proprietary header, which the gzip code and the rest of the toolchain simply ignore.
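
For what it's worth, the gzip format has a sanctioned place for that kind of extra data: the FEXTRA header field, which standard decompressors skip wholesale. Here's a sketch of the idea in Python; the "BW" subfield ID and its layout are hypothetical, for illustration only:

    import struct
    import zlib

    def gzip_with_size_header(data: bytes) -> bytes:
        """Wrap `data` in a gzip member whose EXTRA field carries the
        uncompressed size up front. Standard gzip tools skip the field."""
        # 10-byte header: magic, CM=8 (deflate), FLG with FEXTRA set,
        # MTIME=0, XFL=2 (max compression), OS=255 (unknown)
        header = struct.pack("<2sBBIBB", b"\x1f\x8b", 8, 0x04, 0, 2, 255)
        # One EXTRA subfield: 2-byte ID (hypothetical), 2-byte length, payload
        subfield = b"BW" + struct.pack("<H", 4) + struct.pack("<I", len(data))
        extra = struct.pack("<H", len(subfield)) + subfield
        c = zlib.compressobj(9, zlib.DEFLATED, -15)     # raw deflate body
        body = c.compress(data) + c.flush()
        trailer = struct.pack("<II", zlib.crc32(data), len(data) & 0xFFFFFFFF)
        return header + extra + body + trailer

    def peek_size(blob: bytes) -> int:
        """Read the uncompressed size from the EXTRA field: no decompression,
        no seeking to the end of the file."""
        assert blob[12:14] == b"BW"                     # subfield starts after XLEN
        return struct.unpack_from("<I", blob, 16)[0]    # its 4-byte payload

gzip.decompress() round-trips the result unchanged, because decompressors ignore EXTRA subfields they don't recognize, while your own loader can read the size from a fixed offset near the start of the file.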