Some years ago Google had an option to remove a website from your search results. After a few months they discontinued it. It would be nice to know why: whether user adoption was too low, the computational cost too high, or the user experience too bad because users inadvertently removed good results.
Like most features that Google releases, I believe they use them to train their algorithms, and once they get enough human input on a feature from people using it, they remove the feature and turn it over to the machine. We see it on the SEO side all the time. Release a feature, call it a ranking factor, and thousands of SEOs jump all over it. The algo learns, and then the feature's importance is removed. NoFollow links are a great example of this.
How many Google ads are on GitHub or Stack Overflow compared to the other pages you find? They know exactly what every little change to the algorithm does to their bottom line.
Personal remove lists basically turn each query from each person with such a list into a completely distinct query, which breaks caching on multiple levels. If only a few people use such a feature, or if people occasionally add a couple of -site: terms to a query, it's no big deal, but if enough people used it with essentially unique site lists, the performance degradation would probably make a rollback of the feature inevitable.
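A minimal sketch of the caching argument above, in Python. The function name and key shape are illustrative assumptions (nothing here reflects Google's actual implementation): once a per-user blocklist is folded into the cache key, identical queries from different users no longer share a cache entry.

```python
# Hypothetical sketch of how per-user blocklists fragment a shared result cache.
# cache_key is an assumed helper, not a real search-engine API.

def cache_key(query: str, blocked_sites: frozenset) -> tuple:
    # Without personalization, the key is just the normalized query,
    # so every user searching "python tutorial" hits one cache entry.
    # Folding a per-user blocklist into the key means each distinct
    # list produces a distinct key, fragmenting the shared cache.
    return (query.strip().lower(), tuple(sorted(blocked_sites)))

shared = cache_key("Python tutorial", frozenset())
alice = cache_key("Python tutorial", frozenset({"example-spam.com"}))
bob = cache_key("Python tutorial", frozenset({"other-spam.net"}))

# Three users, the same query, three separate cache entries.
assert shared != alice and alice != bob and shared != bob
```

With millions of users carrying near-unique lists, the hit rate of any shared result cache collapses toward per-user caching, which is the performance cliff the comment describes.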
It was a tool to remove a site from your own results. If you searched and saw a site you didn't like, you could click to no longer get the results from that site. It wouldn't affect what anyone else saw anywhere.
Not a problem as long as it only affects the user themselves?
My guess is some UX person's head exploded when they realized someone might blacklist a site Google wanted them to see. Or maybe the person who built it got promoted and no one stepped up to maintain it?