The main issue here is the idiotic customers writing asinine comments and reviews. I can't count the number of times I've seen things like:
"I ordered that pair of running shoes, I really wanted a toaster, one star" or "The packaging was broken when I received my game, 0.5/10".
When you accumulate "reviews" like these, where people are totally unable to dissociate their internal state of mind from the quality of the product at hand (and are basically using the system to vent their personal frustration), the value you get out of them is basically non-existent.
I tend to prefer a binary (logistic) model and I wish more sites would use that instead. Giving users the power to rate things from 1-5 or 1-10 is arbitrary and mostly worthless.
Take Rotten Tomatoes critic reviews as an example. You get a wide variety of feedback, but ultimately each review boils down to a Yes or a No. The result is the percentage of Yes votes, and I find it a very reliable indicator of a "quality movie".
I think this sort of model could definitely work for restaurants, or even on Amazon for a bigger-platform example. The bottom line is that it forces reviewers to decide: whatever asinine thing they want to complain about, is it a yes or a no?
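To make that concrete, here's a minimal sketch of what the yes/no aggregation amounts to. The vote data is made up for illustration; this isn't Rotten Tomatoes' actual code, just the arithmetic their percentage boils down to:

```python
# Minimal sketch of the binary (yes/no) review model described above.
# The vote data here is hypothetical, purely for illustration.

def percent_positive(votes: list[bool]) -> float:
    """Return the percentage of Yes votes, Rotten Tomatoes style."""
    if not votes:
        raise ValueError("no votes yet")
    return 100 * sum(votes) / len(votes)

# Each reviewer is forced to a single decision: recommend or not.
votes = [True, True, False, True, False, True, True]
print(f"{percent_positive(votes):.0f}% positive")  # 71% positive
```

Whatever nuance a reviewer wants to add can live in the text; the score itself stays a simple fraction of recommendations.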
This isn't a problem just on Yelp; I've found useless reviews on other sites like Amazon too. A customer buys the wrong product (even though the product description is crystal clear) and writes a bad review instead of simply returning it. Another really annoying type of review is "product was amazing, but it came 15 days late - I know it's not the company's fault, but I'm still giving one star". If they knew it was the fault of USPS (or whatever shipping company), why leave a bad rating for the product/company?
Also, on all these review sites, how do we know if the reviewer actually used the product/service? Amazon seems to differentiate, but what about the others?
It would be awesome to have a review site where
1. All reviews older than 6 months (or 1 year or whatever timeframe makes sense) are not shown by default
2. Only confirmed users of the product/service are able to review
At least on Amazon you can spot the misleading reviews quite easily: just click to view the one-star reviews and see whether they're relevant or not. Personally, I find Amazon reviews about the best for actually deciding what to buy.
This doesn't mirror my own experience. The 5-star reviews typically fall into the category of "worked great for me" or fanboyism, and it's generally ambiguous how extensively they've tested the item, if they've tested it at all (though some reviews are exhaustive on this front). The 1-star reviews are often hyper-critical to excess, but any real issues tend to be highlighted there, which prompts me to read them first. Since reviews cluster at these extremes, and since "was this review helpful" is ironically not that helpful (it's more like "do I agree with this review"), I always end up just going with my initial gut feeling on whatever I'm looking to buy anyway.
Yes, that's part of the solution. But it has to be much more robust than "This review was useful". The meta-moderation needs to be recursive, e.g. like PageRank.
The other part of the solution is a review system designed around the fact that people have different tastes. Netflix at least tries to do that. On Yelp, a great hole-in-the-wall will average three stars because the people who don't like it vote the opposite of the people who love it. Three stars doesn't help either type of person.
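For the recursive meta-moderation idea, here's a toy sketch in the PageRank spirit. The data and the exact update rule are my own assumptions, not any real site's algorithm: a review's score comes from the trust of the users who marked it helpful, and a user's trust comes from the scores of the reviews they wrote, iterated to a fixed point:

```python
# Toy sketch of recursive meta-moderation (PageRank-flavored).
# All names and data below are hypothetical, for illustration only.

author = {"r1": "alice", "r2": "bob", "r3": "carol"}
helpful_votes = {"r1": ["bob", "carol"], "r2": ["alice"], "r3": ["alice", "bob"]}

users = set(author.values()) | {u for vs in helpful_votes.values() for u in vs}
trust = {u: 1.0 for u in users}

for _ in range(50):
    # A review's score is the total trust of the users who found it helpful.
    score = {r: sum(trust[u] for u in voters) for r, voters in helpful_votes.items()}
    # A user's trust is the total score of the reviews they authored.
    trust = {u: sum(s for r, s in score.items() if author[r] == u) for u in users}
    # Normalize so the numbers converge instead of blowing up.
    total = sum(trust.values()) or 1.0
    trust = {u: t / total for u, t in trust.items()}

print({u: round(t, 2) for u, t in trust.items()})
```

The point is that a "helpful" click from a trusted reviewer counts for more than one from a throwaway account, and trust itself is earned recursively rather than declared.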
I don't know how it works for Amazon, but my guess is that only products with a lot of traffic (e.g. the Kindle) get enough "honest" opinions to drown out the bias.
Smaller restaurants don't always have enough reviews to average away the bad ones, especially when people conscript their friends (like getting everybody in your Google+ circle to leave a one-star review for a restaurant that didn't let you wear your Glass).
The place has to have enough reviews for the bad ones to be rendered insignificant. If a restaurant has only 10 reviews and three of them are one-star "they only had one gluten-free option" type reviews, the rating will be skewed by bullshit reviews.
That is true, but it definitely seems broken that this can happen.
Personally, I always either read all the reviews or disregard them completely when the review count is under 10 or 20 on every "star-driven" website, precisely because of this. Maybe these sites shouldn't even show the average when there aren't enough reviews for it to be meaningful?
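One common fix along those lines is to show a Bayesian average instead of the raw mean, so low-count ratings get pulled toward the site-wide mean. A rough sketch; the prior weight and global mean here are made-up values, not anything a real site uses:

```python
# Sketch of a Bayesian average: the raw mean is shrunk toward the
# site-wide mean, as if every item started with m "phantom" reviews.
# global_mean and m are assumed values, for illustration only.

def bayesian_average(ratings: list[int], global_mean: float = 3.7, m: int = 20) -> float:
    """Mean shrunk toward global_mean; m controls how much weight the prior gets."""
    return (m * global_mean + sum(ratings)) / (m + len(ratings))

# The 10-review restaurant from above: 7 five-stars, 3 "gluten-free" one-stars.
skewed = [5, 5, 5, 5, 5, 5, 5, 1, 1, 1]
print(sum(skewed) / len(skewed))   # raw mean: 3.8, dominated by a handful of votes
print(bayesian_average(skewed))    # ~3.73, barely moved from the prior until more reviews arrive
```

With few reviews the displayed score stays near the global mean, so three bullshit one-stars can't tank a place; with hundreds of reviews the prior washes out and the real average shows through.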
"I ordered that pair of running shoes, I really wanted a toaster, one star" or "The packaging was broken when I received my game, 0.5/10".
When you accumulate "reviews" like these where people are totally unable to dissociate their internal state of mind and the quality of the product at hand (and basically using the system to vent their personal frustration), the value you get out of it is basically non-existent.