Searched the article: no mention of gzip, or of how most of the time all that text data (HTML, JS, and CSS too!) you're sending over the wire will be automatically compressed to... an efficient binary format!
So really, the author should be comparing protobufs to gzipped JSON.
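To make that concrete, here's a minimal sketch of what gzip does to a JSON payload, assuming a Node.js environment (the payload shape is invented for illustration):

```typescript
// Sketch: how much does gzip shrink a typical JSON payload?
// The payload shape here is invented for illustration.
import { gzipSync } from "node:zlib";

const payload = JSON.stringify(
  Array.from({ length: 1000 }, (_, i) => ({
    id: i,
    name: `user-${i}`,
    active: i % 2 === 0,
  }))
);

console.log(`raw JSON: ${Buffer.byteLength(payload)} bytes`);
console.log(`gzipped:  ${gzipSync(payload).length} bytes`);
```

Repetitive keys and text compress extremely well, which is exactly what most JSON API responses are made of.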
Last time I was evaluating different binary serialization formats for an API, I was really hoping to get to use one of the cool ones, but gzipped JSON just beat everything, and it wasn't even close.
There are some compression formats that perform better than gzip, but it's very dependent on the data you're compressing and your timing requirements (is bandwidth or CPU more important to conserve?).
But in the end, compressed JSON is pretty good. Not perfect, but good enough for many, many things.
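For instance, a rough sketch of that tradeoff using Node's built-in zlib, comparing gzip against brotli (the payload is invented, and the real numbers depend entirely on your data):

```typescript
// Sketch: gzip vs. brotli on the same JSON payload. Brotli usually
// compresses smaller but costs more CPU at higher quality settings.
import { gzipSync, brotliCompressSync, constants } from "node:zlib";

const payload = Buffer.from(
  JSON.stringify(
    Array.from({ length: 5000 }, (_, i) => ({ id: i, tag: `item-${i}` }))
  )
);

function time(label: string, compress: () => Buffer): void {
  const start = process.hrtime.bigint();
  const out = compress();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${out.length} bytes in ${ms.toFixed(1)} ms`);
}

time("gzip  ", () => gzipSync(payload));
time("brotli", () =>
  brotliCompressSync(payload, {
    params: { [constants.BROTLI_PARAM_QUALITY]: 9 },
  })
);
```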
> This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON - JavaScript code uses JSON as its object literal format - it is amazing that Protobuf managed to be faster than its counterpart.
Presumably the difference would be much larger for languages that can actually represent a statically-typed structure efficiently.
Also, the tradeoffs have changed since Protobuf was invented. Network bandwidth has gotten cheaper faster than CPU time has, so encoding/decoding speed matters more than packet size in many situations. And if you don't use gzip, Protobuf is much faster (especially in non-JS languages, and especially if you use fixed-size integer types instead of varints).
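For context, here's a sketch of protobuf-style varint encoding (just the wire-format idea, not any library's actual implementation), which shows why fixed-width fields decode more cheaply:

```typescript
// Sketch: protobuf-style varint encoding. Each byte carries 7 bits of
// payload plus a continuation bit, so decoding has to branch per byte.
// That per-byte work is the CPU cost fixed32/fixed64 fields avoid.
function encodeVarint(n: bigint): Uint8Array {
  const out: number[] = [];
  while (n >= 0x80n) {
    out.push(Number(n & 0x7fn) | 0x80); // low 7 bits + continuation flag
    n >>= 7n;
  }
  out.push(Number(n));
  return Uint8Array.from(out);
}

console.log(encodeVarint(1n).length);        // 1 byte
console.log(encodeVarint(300n).length);      // 2 bytes
console.log(encodeVarint(2n ** 60n).length); // 9 bytes (vs. 8 for fixed64)
```

Small values stay tiny on the wire, but large values cost both extra bytes and extra decode work, which is why fixed-size types can be the faster choice when bandwidth is cheap.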
But in that case the server/CDN won't be able to cache the gzipped forms of the individual files, so it's probably a win for highly dynamic/user-specific content but a loss for static or infrequently regenerated content.
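One common workaround for the static case is to compress once at build time so the server or CDN only ever serves cached .gz files; a minimal sketch, assuming a Node.js build step (the file paths are invented):

```typescript
// Sketch: pre-compress static assets once so the server never gzips
// the same file twice. File paths here are invented for illustration.
import { gzipSync } from "node:zlib";
import { readFileSync, writeFileSync } from "node:fs";

for (const file of ["dist/app.js", "dist/styles.css"]) {
  // Write app.js.gz next to app.js; a server or CDN can then serve the
  // cached .gz directly to clients that send Accept-Encoding: gzip.
  writeFileSync(`${file}.gz`, gzipSync(readFileSync(file)));
}
```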
I would think that serialization/deserialization time would be the largest drawback of JSON (at least for serving APIs). Pretty much all the other pain points can slowly be ironed out over time, albeit with deeply ugly solutions.
It depends on what your data looks like. If your content is mostly UTF-8 text, with dynamic keys, then I wouldn't expect protobuf to have much of an advantage over JSON for parsing to an equivalent structure. On the other hand, if you have binary data that needs to be base64 encoded in JSON, then protobuf has a significant advantage.
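A quick illustration of that base64 overhead (a sketch; the 1 MiB blob size is arbitrary):

```typescript
// Sketch: base64 inflates binary data by ~33% before gzip even runs,
// which is where protobuf's raw `bytes` fields win. The data is random,
// so compression won't claw the overhead back either.
import { randomBytes } from "node:crypto";

const blob = randomBytes(1024 * 1024); // 1 MiB of binary data
const asJson = JSON.stringify({ data: blob.toString("base64") });

console.log(`raw bytes:      ${blob.length}`);
console.log(`base64 in JSON: ${Buffer.byteLength(asJson)}`); // ~4/3 larger
```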