
To me, this is the equivalent of developers assuming everything is 7-bit ASCII because that works in their testing... with US English input only. Sure, they might use UTF-8 under the hood, but they never test with anything other than English, so all sorts of things end up broken in subtle and not-so-subtle ways: right-to-left text, word breaking, collation, and so on.
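As a hedged illustration of the "works with English input only" trap (my example, not the commenter's): code-point counts and case mappings that behave intuitively for ASCII stop doing so the moment real Unicode shows up.

```python
# The same visible word can be a different number of code points
# depending on normalization form.
precomposed = "café"            # 'é' as one code point
decomposed = "cafe\u0301"       # 'e' + combining acute accent

print(len(precomposed))         # 4 code points
print(len(decomposed))          # 5 code points, same rendered text

# Uppercasing is not a reversible byte-level trick either:
# German 'ß' uppercases to two letters.
print("straße".upper())         # 'STRASSE'
print("straße".upper().lower()) # 'strasse' -- the round trip loses 'ß'
```

None of this surfaces in a test suite that only ever feeds in ASCII.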

"AverageColorRGBA8" for example bakes in the assumption of RGB order, 8-bit, and an assumed linear color space. All three assumptions are regularly violated in real-world apps. For example, HDR formats generally use 10-bit integer or 16-bit float primaries. Some formats use YUV instead of RGB.
