Web Rating Systems Devalued by Users?

Recently, I sold a number of items on eBay and bought many items from Amazon Marketplace – and from a web electronics dealer I had never used before. In each case, rating systems were an important part of the sale or purchase. Although they go by different names – eBay’s ‘Feedback Score’, Amazon Marketplace ‘Ratings’, and the ‘Review’ systems of any number of shop-comparison sites – they all serve the same purpose.

The problem is that whilst the eBay system (which, as far as I am concerned, is ‘the Original’) is rather low-resolution – Positive, Neutral or Negative – it also happens to provide the most realistic at-a-glance data of any system I’ve seen. Yet it remains low-resolution: you have only those three choices, and not really a great deal of extra space for comment.

The eBay Rating System

So, when I was selling on eBay (and I was selling high-value items), it was important to me to see that the buyer already had a reasonable number of positive ratings – ideally accumulated over months or years. It was also important, of course, that neutral or negative ratings accounted for a very low percentage of their feedback score – or, to put it the way eBay does, that their Positive Feedback approached 100%. As a buyer on eBay, I similarly look for a high Positive Feedback rating. But in both cases, it is the neutral and negative ratings that are of most interest.

Why?

Because if you are uncertain about a buyer or seller, it is best to understand how they behave when something does go wrong.

But do all those Positive Ratings really reflect entirely positive eBay experiences? I suspect not. A common bugbear of mine when selling is that an auction will often end with a flurry of activity, so you know for sure that the winning bidder was online when the auction completed and therefore knows they won the item. But do they pay straight away? Some do, of course, but it winds me up no end when the payment floats in a few days later, when I made sure I was available to post the items the day after the auction ended. Perhaps I am more conscientious than many sellers (and I know some make a point of saying they can only post on Saturdays, or similar). Nevertheless, I have always ended up leaving positive feedback even for these annoying ‘delayed payers’, because that is the trend and apparent expectation on eBay (after all, they did all pay within the timescales eBay sets out).

Conversely, as a buyer I get frustrated by sellers whom you pay the moment the auction ends, yet who send no word about when they will post your item… but still, on the eBay system, the trend is to give positive feedback (as long as all other aspects were OK, of course).

The result, in my view, is that eBay’s ‘Positive’ ratings are really ‘Tolerable’ or ‘Within Reasonable Expectations’ ratings.

Alternative Rating Systems

Some alternative rating systems are:

  • Amazon Marketplace: 5-star rating + text;
  • PriceRunner.com: 5-star ratings across 4 categories + text;
  • IMDb: 10-star rating + text.

Yet none of these systems allows a feedback score of zero stars. I believe it annoys a number of people that the lowest feedback they can give still appears to be a score! I mean, I didn’t get ‘1 out of 5’ (20%) when I didn’t turn up to that exam! And whilst the missus might give me a 20% success score for the meal I burnt to a crisp (because at least I tried!), you don’t really apply the same rules to a person or business selling something to you, or undertaking to buy something from you under certain terms and conditions.

The second, and in my view the most serious, failing is that many users seem to treat these scores as an all-or-nothing rating. I mean, when offered 1 to 5 (or 10, or 50) stars, I would expect the range to run from ‘very unhappy’ to ‘very happy’, not from ‘very unhappy’ to ‘OK’.

Real-World Ratings

Let’s consider how we might describe a single experience in a high-street shop to a friend, and whether we would go back:

  • I bought this gadget from them, but it broke the second time I used it, and by the time I went to take it back – they’d closed down!
  • I’ve had to return the item 3 times as faulty and the staff were not at all helpful until I really pushed them. I certainly would not use them again, as I don’t trust them or their products;
  • It took them quite a while to make my coffee, and it was really much weaker than it should have been. I complained and they corrected it, but I still found the whole experience disappointing. I would consider using them again, though, as I think the barista was a trainee, so it was probably a one-off;
  • You know, it’s just boring-old socks, but they fit well and the price was ok too;
  • I hate it when you ask an assistant for information, and they just read off the card in front of the product! I can read! But, overall, they delivered it when they said and the price was ok, so I suppose I’d recommend them;
  • I paid for the ‘deluxe’ model which was a few pounds extra but it really seems to work well;
  • I’m really thrilled! The staff were really helpful and attentive to my needs, and I got a great price;
  • After I left the shop, I realised I’d made a terrible mistake with the widget model I chose. The staff were very understanding and quickly helped me to pick a better model for me and exchange the item – I got a small refund too as the model I ended up with was cheaper! They couldn’t have made the process any easier;
  • They accidentally served me a double moccha-latte-chino with cream instead of skinny-soy-bean-curd; but they got me the correct drink as soon as I pointed this out… and they gave me a token for a free drink next time.

I have placed them in rough order, and although I am confident I would not ‘score’ these experiences in exactly this sequence, it is probably close.

The model I think applies here is:

  • Seriously below expectations;
  • Below expectations, but forgivable;
  • On a par with expectations, average, OK;
  • Perhaps a little better than expected;
  • Really amazing product / service, much better than expected.
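As a sketch (my own naming – no real site implements this), the five levels above map naturally onto a signed scale centred on ‘met expectations’:

```python
# Hypothetical mapping of the five-level "expectations" model onto a
# signed scale, where 0 means "on a par with expectations" rather than
# the misleading minimum-of-one-star that most sites force on you.
EXPECTATION_SCALE = {
    -2: "Seriously below expectations",
    -1: "Below expectations, but forgivable",
     0: "On a par with expectations, average, OK",
    +1: "Perhaps a little better than expected",
    +2: "Really amazing product / service, much better than expected",
}

def describe(score: int) -> str:
    """Return the label for a score on the signed scale."""
    return EXPECTATION_SCALE[score]

print(describe(0))   # -> On a par with expectations, average, OK
```

Note that an ordinary, unremarkable transaction scores zero here, so only genuinely good or bad experiences move the needle.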

How Review Schemes Actually Get Used

Here’s a screenshot from PriceRunner.com:

[Screenshot of a PriceRunner.com review form: priceRunnerReviewThumb.jpg]

Note how the two reviews probably do not reflect the model I have suggested. The person who was happy with the company seemed extremely happy with every aspect; the person who was unhappy selected almost exclusively the lowest marks.

Two points reveal themselves:

  1. Many people assign ratings based on their overall experience – even when they are asked to break their rating down into sections;
  2. Maximum positive ratings are rarely backed up with appropriate comments. I mean, surely 5/5 means ‘I can’t imagine it could be any better’ – not ‘it had everything I needed’.

Certainly, the number of reviews I have seen that actively distinguish scoring on different topics is very low.

Summary

I think the time has come for a better way to review products, services and companies.

Although it is probably impossible to entirely prevent trolling (the repeated posting of positive or negative reviews) on independent websites, the software should not make it too easy to enter extreme positive or negative responses without backing them up in some way – perhaps with a comment, or even just a simple confirmation. eBay is a winner in many respects here: its rating system is so integral to its operation that you are more likely to see ratings from people who, on other services, simply might not bother to rate the product or service at all.
How about a review system that actually uses positive and negative numbering?
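As a rough sketch of what I mean (the function names and summary fields are my own invention, not any real site’s API): if ratings are signed, averaging them keeps ‘met expectations’ at zero, so a seller whose buyers were merely satisfied shows a neutral score rather than a misleading near-100% positive figure.

```python
from statistics import mean

def aggregate(ratings: list[int]) -> dict:
    """Summarise signed ratings (-2..+2, where 0 = met expectations)."""
    return {
        "mean": round(mean(ratings), 2),
        "percent_positive": round(100 * sum(r > 0 for r in ratings) / len(ratings)),
        "percent_negative": round(100 * sum(r < 0 for r in ratings) / len(ratings)),
    }

# A seller whose buyers were mostly just satisfied: the mean sits at
# zero instead of drifting towards "100% positive".
print(aggregate([0, 0, 1, 0, -1, 0]))
```

Under this scheme, the all-or-nothing reviewers described above would at least have to travel past an explicit ‘as expected’ midpoint before reaching the extremes.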

The final thought is perhaps (as usual) this: perhaps the fault is not with the users, but with the people who designed and programmed these review systems?