
This is relevant to webdev, though -- binary data in JSON is often encoded as a Base64 string.
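For example, a minimal sketch of the pattern (Node-style Buffer API; in a browser you'd reach for btoa/atob instead):

    // Encode binary data to Base64 before placing it in a JSON payload.
    const bytes = new Uint8Array([0x00, 0xff, 0x10, 0x80]); // arbitrary bytes

    const payload = JSON.stringify({
      filename: "blob.bin",
      data: Buffer.from(bytes).toString("base64"),
    });

    // Decode on the other side; the bytes round-trip exactly.
    const parsed = JSON.parse(payload);
    const roundTripped = new Uint8Array(Buffer.from(parsed.data, "base64"));
    console.log(roundTripped); // Uint8Array [ 0, 255, 16, 128 ]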


Why use a 6-bit encoding if JSON can transport 8-bit data (minus the " character, which needs escaping)?


Byte values in the ASCII control range have to be escaped, and byte sequences that aren't valid UTF-8 can't be represented in JSON strings at all.
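A quick sketch of both problems (the byte values here are just illustrative):

    // 1. Control characters survive, but only escaped, at 6 output chars per byte:
    console.log(JSON.stringify("\u0000\u0001").length); // 14 chars for 2 bytes

    // 2. Arbitrary bytes are often not valid UTF-8. Decoding such a byte as
    //    text replaces it with U+FFFD, destroying the original value:
    const stray = new Uint8Array([0x80]); // a lone UTF-8 continuation byte
    const decoded = new TextDecoder("utf-8").decode(stray);
    console.log(decoded === "\uFFFD"); // true -- 0x80 is unrecoverable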


Would this always be valid UTF-8?
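If this refers to the Base64 output: yes. Its alphabet is A-Z, a-z, 0-9, "+", "/", plus "=" padding, all ASCII, so the encoded string is always valid UTF-8. A quick check:

    // Base64 output only ever contains ASCII characters, hence valid UTF-8.
    const encoded = Buffer.from([0x00, 0xff, 0xfe]).toString("base64"); // "AP/+"
    console.log([...encoded].every((c) => c.charCodeAt(0) < 128)); // true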


As an anecdote, I have a Phoenix LiveView project where I found that loading an (uncacheable) image, converting it to Base64, and sending it over the websocket connection feels quicker and smoother (I didn't benchmark it) than only updating the src path and letting the browser fetch it (that was over HTTP/1; I'd have to compare with HTTP/2).
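A hypothetical client-side sketch of that pattern (the endpoint, element id, and message shape are made up for illustration):

    // Receive a Base64-encoded image over a websocket and display it
    // as a data URL, avoiding a separate HTTP request for the src.
    const socket = new WebSocket("wss://example.com/live"); // hypothetical

    socket.onmessage = (event) => {
      // Assume the server pushes {"mime": "image/png", "data": "<base64>"}.
      const { mime, data } = JSON.parse(event.data);
      const img = document.querySelector<HTMLImageElement>("#preview");
      if (img) {
        // A data URL inlines the image bytes, so no extra fetch occurs.
        img.src = `data:${mime};base64,${data}`;
      }
    };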



