Hacker News

You could easily design a stream parser that rejects duplicates before it passes them off to the user, by maintaining a set of already encountered keys within the parser state. The space overhead isn't a concern unless your map/set has millions of entries, but your non-streaming parser would have choked on the memory usage long before that point anyway.
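The seen-key set described above can be sketched with Python's standard `json` module: `object_pairs_hook` receives each object's key/value pairs in order, so tracking a per-object set of seen keys is enough to reject duplicates. This is a minimal illustration of the idea, not a true incremental stream parser (the hook name `reject_duplicates` is my own).

```python
import json

def reject_duplicates(pairs):
    """object_pairs_hook that refuses duplicate keys within one object.

    The hook runs once per object, so nested objects each get
    their own fresh `seen` set.
    """
    seen = set()
    for key, _ in pairs:
        if key in seen:
            raise ValueError(f"duplicate key: {key!r}")
        seen.add(key)
    return dict(pairs)

print(json.loads('{"a": 1, "b": 2}', object_pairs_hook=reject_duplicates))

try:
    json.loads('{"a": 1, "a": 2}', object_pairs_hook=reject_duplicates)
except ValueError as e:
    print("rejected:", e)
```

The same pattern carries over to a real streaming parser: the set lives in the per-object frame of the parser state and is discarded when the object closes, so memory scales with the largest single object, not the whole document.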


> You could easily design a stream parser that rejects duplicates before it passes them off to the user, by maintaining a set of already encountered keys within the parser state.

You could, but you are not allowed to. Protobuf parsing requires that when the same field appears more than once, the last occurrence wins.



