At some point, you do have to update the interface to move things forward. In the early days of programming, you had to punch cards by hand. Then someone invented teletypes and you could program with ed. Then someone invented screens and you could use vi. No doubt each of these changes was incredibly jarring on the first day, but the change was worthwhile overall -- using a visual editor is much more productive than managing stacks of punched cards.
If everyone took the approach of never making anything that would have to be learned, our field probably wouldn't even exist.
The examples you mentioned brought usability benefits. Teletypes were obviously better than punch cards. Screens were obviously better than teletypes. Even GUIs were obviously better than CLIs for a lot of applications - they've essentially opened up new domains of work on the computer.
Here, we're talking about replacing a perfectly good interface with one that's strictly inferior in both usability and features. Or, in the Google Code Search example, changing a few colors and, in the process, significantly degrading usability by adding latency that wasn't there before.
Change is good if it brings something better or new, or opens the road for further improvements. The changes discussed here do none of those.