Saturday 13 February 2010
Were we ever #1?
Were we ever #1? This feels like a question worth asking after the whipping at South Africa’s hands in Nagpur. Maybe the ICC ratings don’t actually mean anything.
For several years I have trusted the Rediff ratings more than the ICC ratings. The Rediff ratings suggest that India never were #1. The latest Rediff ratings I could find via Google, published in December 2009, show India at #2 behind Australia.
The nice thing about the Rediff ratings is that they place more value on wins against better teams and on wins away from home. They were developed back in 2001 by two geeky cricket fans, one of whom was the Director of the Economics Department at Bombay University. The good professor might have felt the need to develop an intelligent ratings scale because the official ICC ratings, developed earlier in 2001, were so bad. Those ratings were designed by a panel of distinguished cricketers, including Sunil Gavaskar and Ian Chappell, and treated all test wins as equally valuable. This is not a bad attitude for a player, who should play equally hard against any opposition. But from a fan's viewpoint this original ICC scale is asinine. I thought this post was going to be a rant about the stupidity of the ICC ratings.
However, it turns out that over time the ICC have improved their ratings methodology. They have now incorporated the best idea from the Rediff methodology, that wins against stronger teams matter more. With that improvement, the ICC ratings are not meaningless. India topped a meaningful table in 2009.
There still are interesting differences between the Rediff and ICC scales. The ICC scale gives extra weight to test series outcomes, which is nice. It does not weight up away wins, which is odd. But the biggest difference is that the ICC ratings give double the weight to wins in the last two years, while the Rediff scale treats an entire cycle of home-away tests as one equally important block.
For instance, the Rediff scale gives Australia's 5-0 whitewash of England in the 2006-07 Ashes as much weight as the 1-2 loss in England in 2009. Rediff's logic is that these are the two most recent home-away series. In the ICC ratings, the 5-0 hammering in 2006-07 gets only half the weight of the 1-2 loss in 2009, because it happened more than two years ago. Clearly, weighting up recent matches makes it harder to apply a home-away factor, because very few pairs of teams will have played both home and away matches within the most recent two years.
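To make the difference concrete, here is a toy sketch of the two weighting ideas applied to the Ashes example above. This is purely illustrative: the weights and the win-share formula are my own simplification, not the actual ICC or Rediff formulas.

```python
# Illustrative sketch (NOT the real ICC or Rediff formulas): how a recency
# weight changes the contribution of two Ashes series to Australia's record.
# Each entry: (label, Australia wins, England wins, rough age in years as of early 2010)
series = [
    ("2006-07 Ashes, home", 5, 0, 3),  # 5-0 whitewash, more than two years old
    ("2009 Ashes, away",    1, 2, 1),  # 1-2 loss, within the last two years
]

def weighted_win_share(results, recency_weight):
    """Australia's share of decided tests, with each series scaled by its recency weight."""
    wins = sum(recency_weight(age) * w for _, w, _, age in results)
    decided = sum(recency_weight(age) * (w + l) for _, w, l, age in results)
    return wins / decided

# Rediff-style: every series in the home-away block counts equally.
rediff_style = weighted_win_share(series, lambda age: 1.0)

# ICC-style: results older than two years count half as much.
icc_style = weighted_win_share(series, lambda age: 0.5 if age > 2 else 1.0)

print(f"equal weighting:   {rediff_style:.2f}")  # 6/8 = 0.75
print(f"recency weighting: {icc_style:.2f}")     # 3.5/5.5 = 0.64
```

Under equal weighting Australia won 6 of 8 decided tests; once the whitewash is discounted to half weight, the same record looks like a much closer contest.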
Neither approach is right or wrong; different scales serve different purposes. The ICC ratings respond more quickly to changes in performance. They will therefore have more predictive power, generate more rapid rating changes, and so generate more news. The Rediff ratings are probably a fairer and more comprehensive summing-up of a complete block of historical performance. The swapping of ranks suggests that there is probably no real (statistically significant) difference in the performance of the best test teams since Shane Warne and Glenn McGrath retired.
Rediff ratings don’t seem to have been updated and published on schedule. The most current Rediff ratings don’t reflect South Africa’s drawn series against England, or Australia’s annihilation of Pakistan. Unfortunately, this might be for a good reason. As a profit-maximizing brand, Rediff might not want to tell the Indian public things they don’t want to hear. Judging by the mean-spirited and jingoistic reader comments posted under the last Rediff update, this is a real concern.
Maybe the chest-thumping nationalism of a big chunk of Indian fans is much more worthy of a rant than the ICC’s rating methodology.