Requirements change. I’ve seen 10+-year-old proto files at Google still being used by services under active development.
Over those ten years your data model has evolved. Your first client, the one whose data model you were imitating when you added the required field, was deprecated five years ago and turned down last year. You have a dozen or a thousand client services, each using you as a backend in a slightly different way. Are you sure every single one of them is going to require that field? Are you sure the field will even be semantically meaningful for their use case?
Those are exactly the kinds of things I was thinking of when I suggested that proto is designed for Google scale.
It’s not that smaller companies never have these problems. It’s that, at smaller companies, the ways in which the problems manifest and the cost/benefit ratios tend to favor different solutions. For example, at the company I was at that stuck with proto2 so it could keep required fields, all the engineers worked in a single room, and they could resolve questions about the needs of all of a protocol’s consumers by simply standing up and saying, "Hey everybody, …"
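To make the trade-off concrete, here’s a minimal sketch of the kind of proto2 schema being discussed (the message and field names are hypothetical, not from any real service):

```proto
// proto2 supports `required`; proto3 removed the keyword entirely.
syntax = "proto2";

message UserRecord {
  // Marked required because the first client always supplied it.
  // Even after that client is turned down, every remaining producer
  // must still populate this field or the message fails to parse.
  required string legacy_account_id = 1;

  // Optional fields let producers and consumers evolve independently:
  // a missing value is simply unset, not a parse error.
  optional string display_name = 2;
}
```

The parse-failure behavior is the crux: a `required` field bakes one client’s assumption into the wire format for every future consumer, which is why proto3 dropped the keyword, and why a small co-located team can afford to keep it.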