When the purpose is to abuse your monopoly to further your business interests in another area, being obtuse and convoluted to gain plausible deniability is good engineering. This is just sloppy.
I think this is a good example of corporations being made up of people, rather than being cohesive, coordinated entities the way many of us sometimes think of them.
An engineer doing "good engineering" on a feature typically depends not only on them being a "good engineer" but also on them having some actual interest in implementing that feature.
I would imagine that in a well-coordinated company engaging in this kind of thing, the order wouldn't be "slow down Firefox", but something along the lines of "use XYZ feature that Firefox doesn't support, then use this polyfill for FF, which happens to be slow". Something that doesn't look too incriminating during any potential discovery process while still getting you what you want.
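A minimal sketch of what that might look like; the API and polyfill here are entirely hypothetical, not anything from YouTube's actual code:

    // Hypothetical sketch: prefer a new API that Firefox doesn't ship,
    // and fall back to a conveniently slow "polyfill" everywhere else.
    function renderPreview(el: HTMLElement): void {
      if (typeof (el as any).fancyNewRenderApi === "function") {
        (el as any).fancyNewRenderApi(); // fast native path (Chrome)
      } else {
        // The "polyfill": forces a synchronous reflow on every pass.
        for (let i = 0; i < 1000; i++) {
          void el.offsetHeight; // layout thrash
        }
      }
    }

The feature probe gives you the cover story: the code reads as a perfectly ordinary progressive-enhancement pattern.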
That's assuming a degree of engineering competency at the product-decision-making level that is usually absent in companies structured the way Google is, with pretty strong demarcations of competencies across teams.
Nah, that approach has its own risk profile: they could implement whatever feature your strategy relies on in the next release, so you aren't necessarily going to get the longevity of the naive approach.
Plus, a Firefox dev would discover that more easily, as opposed to this version, which they can just dismiss as some JavaScript bug on YouTube's part.
That's the beautiful thing: you make the polyfill contingent on the browser being Firefox rather than probing for the feature, and then you forget to remove it once they implement the feature.
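Roughly like this (again hypothetical; slowPolyfill and nativeFastPath are stand-in names):

    // UA-sniffing variant: keyed off the browser, not the feature,
    // so the slow path survives even after Firefox ships the API.
    declare function slowPolyfill(el: HTMLElement): void;
    declare function nativeFastPath(el: HTMLElement): void;

    const isFirefox = navigator.userAgent.includes("Firefox");

    function renderPreview(el: HTMLElement): void {
      if (isFirefox) {
        slowPolyfill(el); // conveniently never removed
      } else {
        nativeFastPath(el);
      }
    }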
But why do you have to be that clever? If you're caught, the consequences are the same either way, and both implementations would exhibit equivalent behavior.
The only superior approach here would be one that is consistent enough to be perceived but noisy enough to be robust to analysis.
Also, it should be hidden on the server side; something like the sketch below.
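A minimal sketch, assuming an Express-style Node server; the probability and delay numbers are made up:

    import type { Request, Response, NextFunction } from "express";

    // Jittered, probabilistic delay for one UA: consistent enough to
    // hurt the experience on average, noisy enough that no single
    // request or trace looks damning.
    export function jitteredSlowdown(req: Request, res: Response, next: NextFunction): void {
      const ua = req.headers["user-agent"] ?? "";
      if (ua.includes("Firefox") && Math.random() < 0.7) {
        setTimeout(next, 100 + Math.random() * 400); // 100-500 ms extra
      } else {
        next();
      }
    }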
Who knows, maybe there are a bunch of equivalent slowdowns on the server side in the Google property space.
Given this discovery, it would probably be reasonable to do some performance testing while varying the User-Agent header string of the request (see the sketch below).
Google Docs, image search, and Gmail operations would be the places to hide them.
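Something like this, run under Node (whose fetch allows setting the User-Agent header, unlike browsers); the URL and UA strings are just examples:

    // Time the same endpoint under different User-Agent strings and
    // compare mean latencies across many samples.
    const UAS = [
      "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:120.0) Gecko/20100101 Firefox/120.0",
      "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36",
    ];

    async function timeRequest(url: string, ua: string): Promise<number> {
      const start = performance.now();
      await fetch(url, { headers: { "User-Agent": ua } });
      return performance.now() - start;
    }

    async function compare(url: string): Promise<void> {
      for (const ua of UAS) {
        const samples: number[] = [];
        for (let i = 0; i < 50; i++) samples.push(await timeRequest(url, ua));
        const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
        console.log(`${ua.includes("Firefox") ? "Firefox" : "Chrome"} UA: ${mean.toFixed(1)} ms avg`);
      }
    }

    compare("https://example.com/"); // substitute the endpoint under test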
I dunno. How long has it been there without anybody noticing?
5 years? 7? Longer?
No matter how they approached it, you could still demonstrate the pattern through the law of large numbers. Might as well make the implementation straightforward.
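A toy simulation of why: even a noisy, probabilistic delay separates cleanly from baseline once you average enough samples (all numbers invented):

    // Law of large numbers: the sample mean converges, so the
    // injected delay shows up no matter how much jitter hides it.
    function sampleLatency(slowed: boolean): number {
      const base = 200 + Math.random() * 100; // normal latency
      const extra = slowed && Math.random() < 0.7
        ? 100 + Math.random() * 400           // noisy slowdown
        : 0;
      return base + extra;
    }

    function meanOf(n: number, slowed: boolean): number {
      let total = 0;
      for (let i = 0; i < n; i++) total += sampleLatency(slowed);
      return total / n;
    }

    for (const n of [10, 1000, 100000]) {
      console.log(`n=${n}: ${meanOf(n, false).toFixed(0)} ms vs ${meanOf(n, true).toFixed(0)} ms`);
    }

At n=10 the two means can overlap; by n=100000 the ~245 ms gap is unmistakable.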