Hacker News

It's not ideal, but in practice gzipped base64 is only marginally larger than gzipped binary



Indeed, good point, and worth clarifying. A lot of people assume the size overhead is the problem, and as you say, it usually isn't, because compression is fairly cheap and claws most of the overhead back.
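A quick stdlib sketch of why (a demonstration, not a benchmark): base64's 64-symbol alphabet carries only 6 bits of information per output byte, and DEFLATE's Huffman stage recovers most of that, so gzipping base64 text costs only a few percent even on incompressible data.

```python
import base64
import gzip
import os

# Worst case for the size argument: incompressible random bytes.
payload = os.urandom(64 * 1024)

gz_raw = gzip.compress(payload)                    # gzip the raw binary
gz_b64 = gzip.compress(base64.b64encode(payload))  # gzip the base64 text

# Base64 expands the input by 4/3, but DEFLATE's Huffman coding squeezes
# the 64-symbol alphabet back to ~6 bits per character, so the gzipped
# base64 ends up only a few percent larger than the gzipped binary.
print(len(gz_raw), len(gz_b64), f"{len(gz_b64) / len(gz_raw):.3f}")
```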

However, the main issue with big base64 blobs is that you should never assume JSON parsers are streaming. So you may need to load the whole document into memory, which of course isn't good.

Note that I'm not necessarily blaming JSON for this. My gut feeling is that crusading for streaming parsers is a Bad Idea. Instead, this is something that should probably be handled by a higher-level protocol, either by streaming chunks (à la gRPC) or by having separate logical data streams (see e.g. QUIC). JSON-RPC does not, afaict, solve these issues.
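For the "stream chunks at a higher level" option, here's a minimal sketch (function names and framing are hypothetical, just newline-delimited JSON) where the receiver only ever holds one small chunk in memory instead of one giant base64 blob:

```python
import base64
import io
import json

def write_chunked(binary_stream, out, chunk_size=3 * 1024):
    """Split a binary stream into small JSON messages, one per line."""
    # chunk_size is a multiple of 3, so every chunk base64-encodes without
    # padding and the receiver can simply concatenate the decoded pieces.
    while chunk := binary_stream.read(chunk_size):
        out.write(json.dumps({"data": base64.b64encode(chunk).decode("ascii")}) + "\n")

def read_chunked(lines):
    """Yield decoded chunks one at a time -- never the whole blob."""
    for line in lines:
        yield base64.b64decode(json.loads(line)["data"])

# Round-trip a 100 KiB payload; each JSON document stays small, so an
# ordinary non-streaming JSON parser is fine.
blob = bytes(range(256)) * 400
buf = io.StringIO()
write_chunked(io.BytesIO(blob), buf)
buf.seek(0)
recovered = b"".join(read_chunked(buf))
```

This is essentially what gRPC-style chunked messages buy you: each message parses with a plain JSON parser, and the "streaming" lives in the protocol layer, not in the parser.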


Base64 multiplies the GZIP size by 1.33x (4/3, or a 33.3% increase in size)

SO (https://stackoverflow.com/questions/4715415/base64-what-is-t...)


That's for Base64-encoded GZIP, not GZIP-encoded Base64 :)
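To make the direction concrete: base64 maps every 3 input bytes to exactly 4 output characters, so base64-encoding a gzipped payload always adds the full ~33% after compression, with nothing downstream to recover it, whereas base64 applied before gzip gets largely undone. A small check of the arithmetic:

```python
import base64
import gzip
import math

payload = b"reasonably compressible example payload\n" * 1000
gz = gzip.compress(payload)

# Base64-encoded GZIP: the 4/3 expansion applies in full, after
# compression, and is exact -- ceil(n / 3) * 4 output characters.
b64_of_gz = base64.b64encode(gz)
assert len(b64_of_gz) == math.ceil(len(gz) / 3) * 4

# GZIP-encoded Base64: the expansion happens *before* compression,
# so gzip largely claws it back.
gz_of_b64 = gzip.compress(base64.b64encode(payload))
print(len(gz), len(b64_of_gz), len(gz_of_b64))
```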


Have you tried zstd, now widely supported?



