I've often wondered about this. I always suspected online reviews would tend to skew high. I think there are a couple of factors at work.
First, people are pretty good at predicting whether they'll like something. If it looks like they're not going to like a movie, they don't watch it. Hence, the majority of people who review something are people who expected to like it.
Second, 5-star reviews are kind of pointless for something like cotton balls or artificial sweetener. Either you got the product and it's the right thing in the quantity you wanted, or it's not. There's not a lot of room for subjective discrimination between one box of Splenda and another. Since the product lived up to your expectations in every way, there's really no reason to give it less than full marks.
I agree to a point, but quality can have an effect. I remember buying cotton swabs in France that were amazing quality: there were none with bare ends, they were rigid enough to actually swab, and they came in a convenient container. The ones I bought from Walmart for the same price (converted, not face value) were god-awful: easily 5% were bare-ended, they were made of the cheapest, thinnest plastic tubes, less rigid than a bendy straw and unable to swab at all, and they came in a cheap plastic bag.
I suppose these aren't the things usually purchased online though.
Having actually worked as a video game/movie reviewer, I've rarely bought or seen a movie that I knew I wouldn't like. I'm even cunning enough to get out of having to go see chick flicks with my wife (most of the time, though that doesn't spare me from rentals and TV runs). However, I believe that from being so immersed for several years I've developed the ability to see through marketing hype and know beforehand what's going to suck; for instance, I'm already trying to find a way to avoid seeing 2012.
To your second point I would just like to add that such a skewed distribution of reviews is actually not much of a problem. Most dog foods will get five stars because they just do their job, and only a few don't, hence the skewed distribution. But the customer has no problem at all avoiding bad products. If anything, such a clear dichotomy should make it rather easy.
(Just one little theory of mine: I think star ratings stop working when it comes to more complex things [more complex, say, than dog food]. For those you need to read the written reviews and ignore the stars. [And maybe look at the distribution, as in the sketch below. Finding out that a product is polarising, and why, can be helpful.] If someone rates a camera with one star because it doesn't film 1080p [a perfectly valid complaint], I don't care much for his opinion, because our preferences differ. His written information is nevertheless helpful [I could, after all, care about 1080p], but the information conveyed in his one-star rating is close to zero.)
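To make "look at the distribution" concrete, here's a toy sketch. The polarisation measure (how evenly the crowd splits between lovers and haters) is just a heuristic I'm making up for illustration, not a standard statistic:

    # Toy heuristic for spotting polarising products from their star
    # histograms. The measure below is an illustrative assumption,
    # not a standard statistic.

    def polarisation(histogram):
        """histogram: counts of [1, 2, 3, 4, 5]-star ratings.
        Returns ~0 when opinion is one-sided, ~1 when the crowd
        splits between loving and hating the product."""
        total = sum(histogram)
        lovers = (histogram[3] + histogram[4]) / total  # 4-5 stars
        haters = (histogram[0] + histogram[1]) / total  # 1-2 stars
        return 2 * min(lovers, haters)

    dog_food = [2, 1, 3, 10, 84]   # boring product: it just does its job
    camera   = [40, 5, 5, 8, 42]   # polarising: go read the written reviews

    print(polarisation(dog_food))  # ~0.06
    print(polarisation(camera))    # ~0.90

The one-star-because-no-1080p reviewer shows up in the camera's split histogram; the stars can at least tell you that the written reviews are worth reading.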
I wouldn't say ignore the stars, but use them as a filter. Read only the negative reviews. Any 5-star review is just a rabid fan; you can safely ignore those. 4-star reviews might be more interesting, but the real content is in the 1-, 2- and 3-star reviews.
Whatever issues those people have with the product might be genuine issues that you would also not be happy about, or they might be something you wouldn't mind. I've bought products based on negative reviews purely because whatever the reviewer didn't like was something I knew I would like.
Usually, though, the reviews won't tell you whether a product is any good, but assuming there are enough of them, they will let you determine that certain products really are crap.
Read only the negative reviews... the real content is in the 1-, 2- and 3-star reviews.
I always feel sad rating a movie three stars for "Liked it" on Netflix, because I know that Netflix's rating system is incompatible with the way people assign ratings. Any rating system has to assume that less than half of the scale will be used for neutral-to-positive reviews, and the rest of the scale is a big wasteland of varying degrees of "This sucks!" I know the recommendation system doesn't care whether I mean three stars to be positive or not, but it still bothers me.
Is this just an American thing related to the hundred-point scale used in schools, where 69/100 is a failing grade? Or is it universal?
I think the great thing about Netflix is that its recommendations try to predict what you would rate a movie.
So, if you consistently use three stars to indicate "liked it, reasonably good" and I use it to indicate "waste of time," then our respective recommendations from Netflix will be different.
The movies it predicts as 3 stars for me will be movies I think are a waste of time, while for you they would be movies you might find reasonably enjoyable.
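For the curious: a simple user-user collaborative filter can absorb exactly that difference by mean-centering each user's ratings before comparing them, so only deviations from your personal baseline matter. A minimal sketch with made-up data (this is not Netflix's actual algorithm):

    # Mean-centered user-user collaborative filtering, toy version.
    import numpy as np

    # rows = users, columns = movies; 0 means "not rated" (made-up data)
    R = np.array([
        [5, 4, 0, 1],
        [4, 5, 1, 0],
        [1, 0, 5, 4],
    ], dtype=float)

    def predict(R, user, movie, k=2):
        """Predict a rating as the user's own mean plus a similarity-
        weighted average of the neighbors' deviations from *their* means."""
        mask = R > 0
        means = np.array([R[u][mask[u]].mean() for u in range(len(R))])
        dev = np.where(mask, R - means[:, None], 0.0)

        # cosine similarity between the target user and everyone else
        sims = np.array([
            np.dot(dev[user], dev[v])
            / (np.linalg.norm(dev[user]) * np.linalg.norm(dev[v]) + 1e-9)
            for v in range(len(R))
        ])
        sims[user] = -np.inf  # never use yourself as a neighbor

        neighbors = [v for v in np.argsort(sims)[::-1][:k] if mask[v, movie]]
        if not neighbors:
            return means[user]
        num = sum(sims[v] * dev[v, movie] for v in neighbors)
        den = sum(abs(sims[v]) for v in neighbors)
        return means[user] + num / den

    print(predict(R, user=0, movie=2))  # low: the users who liked movie 2 have opposite taste

Because everything is expressed as a deviation from each user's own average, your generous 3 and my stingy 3 stop contradicting each other.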
Me too, especially on Amazon. However, you might discount the 1-star reviews for the same reason as the 5-star ones. Give me a good 2-star rating any day; 1s and 5s are easy, but it takes thought to come up with a 2. ;)
Incidentally, I was thinking of replacing my old phone and started reading the negative reviews on Verizon's site. Apparently there's no such thing as a reliable phone that's good at making calls (wish somebody would invent one).
My grandparents had this old rotary dial phone, it was black (available in one color!), metal and heavy, really built like a tank. The dial itself was metal and hard to move, and the finger holes weren't beveled so they were sharp and it actually hurt a little to dial the phone.
Eventually they ordered some new service and Bell (remember?) made them turn in the phone.
The funny thing about the five-star rating system is that it's completely the wrong model for many of the situations in which it's used. It originated on sites like Slashdot where, because everyone had to read the same front page of content, it made sense to pick the content democratically using a five-star system.
But systems like Netflix that use the five-star rating system have it all backwards! Their goal should be to establish the niche genres that attract each user. This goal has nothing to do with brute popularity. Sure, you want to assess how much someone likes a movie in order to steer them into the correct genre, but it doesn't make sense to attach an overall popularity rating to the movie.
Given a hugely diverse database such as the Netflix movie library, it's ridiculous to assume that individuals will like things in proportion to their average popularity. That's not how taste works. And the weird thing is, there is no reason for these databases to constrain themselves to an averaged popularity index; they're just accustomed to the five-star model is all. They should be using a micro-genre mapping scheme that steers you toward clusters of movies that have received attention from users with similar taste (see the sketch after this comment).
Meanwhile, by constraining your rating to a discrete number of stars (1, 2, 3, 4, or 5), they are killing the quality of their sample. (See the jellybean-guessing experiment in the link below.)
These algorithms should ditch their discrete-value database of how much users say they like something and instead use some continuous measure of how much attention is spent on each item. Hell, some people love to watch crappy movies and write bad reviews of them. Anyway, for more on this rant, see the link below. Cheers.
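Here's roughly the shape of what I mean, sketched with made-up numbers: describe each movie by who spends time on it (a continuous attention signal, minutes in this case) and let a plain k-means carve out the micro-genres. The data, the cluster count, and the choice of scikit-learn's KMeans are all assumptions for illustration:

    # Cluster movies into "micro-genres" from a continuous attention
    # signal instead of discrete star ratings. All data fabricated.
    import numpy as np
    from sklearn.cluster import KMeans

    movies = ["RomCom A", "RomCom B", "Disaster A", "Disaster B"]
    # rows = movies, columns = users; entries = minutes of attention
    attention = np.array([
        [95.0, 110.0,  2.0,  0.0],
        [88.0, 101.0,  0.0,  5.0],
        [ 3.0,   0.0, 90.0, 87.0],
        [ 0.0,   7.0, 99.0, 92.0],
    ])

    # Normalize each movie's profile so clusters reflect *who* watches
    # it, not how popular it is overall.
    profiles = attention / np.linalg.norm(attention, axis=1, keepdims=True)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
    for movie, label in zip(movies, labels):
        print(f"micro-genre {label}: {movie}")

Recommendation then becomes "find the clusters this user pours attention into and surface what's in them," with no popularity average anywhere in the loop.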
Granted, it's been a year or two since I ditched Netflix, but as I recall their recommendation system already weighted ratings by how similar the other reviewer's tastes were to yours (I imagine by finding other people who tended to rank movies the same way you did).
My impression was that they were far more concerned with doing well with recommendations than with the plain rankings.
That's true, but their entire model is still built on a database of ratings with only five discrete values. True, it does seem that they place an emphasis on reviewers with similar tastes, but they are still crippled by the poor resolution of their data.
They're spending so much energy tweaking their 5-star algorithm by tiny amounts, but it seems like they'd be much better off investing in a richer source of data, like the attention spent browsing various genres on their website while looking for the next movie...
I don't know, it just seems like the five-star system is so crude for a company willing to spend millions to improve its recommendation system by even a tiny amount.
"So you like this movie? Like, would you say, "4 stars" like it, or "5 stars" like it?" really? That's what your database is made of? - know what i mean?
That is a great point. Imagine a prof issuing grades that way. I'd love to see someone with a 100-point system do some A/B testing to see how much recommendations degrade with a 5-point system.
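You could even dry-run that test on synthetic data before touching real users: invent a latent taste score, add some self-report noise, quantize to 5 vs. 100 levels, and see how much signal survives. The data model and noise level below are pure assumptions:

    # How much does quantizing ratings hurt? A synthetic back-of-the-
    # envelope check, not a real experiment.
    import numpy as np

    rng = np.random.default_rng(0)
    true_taste = rng.uniform(0, 1, 100_000)              # latent preference
    observed = true_taste + rng.normal(0, 0.1, 100_000)  # noisy self-report

    def quantize(x, levels):
        """Snap a 0-1 score onto an integer scale with `levels` steps."""
        return np.round(np.clip(x, 0, 1) * (levels - 1))

    for levels in (5, 100):
        r = np.corrcoef(true_taste, quantize(observed, levels))[0, 1]
        print(f"{levels:>3}-point scale: correlation with true taste = {r:.4f}")

In this toy setup the 5-point scale only costs a few points of correlation over the 100-point one; whether that gap matters once it's fed through a real recommender is exactly what the A/B test would tell you.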