Can you elaborate more on (or give an example of) "mechanical honesty"? From the outside looking in, it doesn't seem infeasible to build biases into mechanical systems (especially government departments!), but I haven't given it a lot of thought.
It's a public mapping of people and businesses to their contact information. Any dishonesty would be noticed, because everyone has the same phone book.
Contrast with getting a phone number from $large-company, be it Yelp, Google or otherwise. The only thing stopping them from selling that listing's placement to an intermediary is their reputation, and clearly, where Yelp is concerned, that isn't enough.
One could argue that Google has been exploiting this since the creation of AdWords: they will happily sell the right to advertise on a search for your exact company name to a competitor, a nice income stream which verges on extortion.
It's even possible to deliver personalized dishonest results: applications, search engines included, know who you are and can lie to you and only you.
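To make it concrete, here's a toy sketch (names and data entirely made up) of what per-user dishonesty can look like: the same query returns a different "truth" depending on who is asking, so people can't compare notes the way they could with a shared phone book.

    # Toy example: a favored (paid) listing is injected only for targeted users.
    FAVORED_LISTING = {"name": "Paid Partner Plumbing", "phone": "555-0100"}

    def search_results(user_id, honest_results, targeted_users):
        """Return results, quietly reordered for certain users only."""
        results = list(honest_results)
        if user_id in targeted_users:
            # Only this user sees the paid listing first; everyone else,
            # including anyone auditing the service, sees the honest list.
            results.insert(0, FAVORED_LISTING)
        return results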
It's certainly possible to be 'mechanically dishonest', but there's no question that contemporary technology makes it easier and more lucrative.
Another example is targeting job ads to particular demographics. You can put out an ad that only white men between the ages of 22 and 30 who come from an upper-class background will see. Women, minorities, and people from poor backgrounds not only won't see it; they won't even know it exists.
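The gatekeeping is nothing more exotic than a filter. A rough sketch (the field names and rule structure below are hypothetical, just to show the shape of it):

    # Hypothetical targeting rule: anyone outside the audience simply never
    # receives the ad, and there is no trace of the exclusion on their end.
    TARGETING = {
        "min_age": 22,
        "max_age": 30,
        "genders": {"male"},
        "backgrounds": {"upper_class"},
    }

    def should_show_ad(user, targeting=TARGETING):
        return (targeting["min_age"] <= user["age"] <= targeting["max_age"]
                and user["gender"] in targeting["genders"]
                and user["background"] in targeting["backgrounds"])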
I don't think Yelp has done anything other than "lie in the phone book" so far, so it's a bad example.
A better example might be when Uber made their app turn off a bunch of shady, privacy-violating behavior, but only if the phone was geographically located at Apple HQ.
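The trick amounts to a plain geofence check: the shady code path is disabled exactly where someone might catch it. A rough sketch (coordinates, radius, and function names are placeholders, not Uber's actual code):

    import math

    APPLE_HQ = (37.3318, -122.0312)  # approximate Cupertino coordinates

    def near(lat, lon, target, radius_km=2.0):
        # Crude equirectangular distance; accurate enough at city scale.
        dlat = math.radians(lat - target[0])
        dlon = math.radians(lon - target[1]) * math.cos(math.radians(target[0]))
        return 6371.0 * math.hypot(dlat, dlon) <= radius_km

    def tracking_enabled(lat, lon):
        # The privacy-violating behavior runs everywhere except near the
        # one place where it would be noticed.
        return not near(lat, lon, APPLE_HQ)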