
New Ranking list

Greg   Posted: 18 February 2005, 8:21 AM
I think for a start rolling at least 12 months for Elite would be good.

Bryan   Posted: 19 February 2005, 12:01 AM
Results for the Oceania carnival will hopefully be loaded soon. The PDFs from the website take a long time to process into a database, so I've asked the organisers for an Excel spreadsheet. I also have to add badge credits for the A-level events and enter all event details such as distance (for km rates). It takes about one to two hours per event.

Bryan   Posted: 21 February 2005, 12:27 AM
I've experimented with changing the criteria for the ranking list.
I've run various configurations on a selection of M21E names. Here are the results:

M21E Ranking Comparison - Current - Average Best 4 Minimum 2
Name Ranking
Carsten Jorgensen 4.6
Chris Forne 7.22
Karl Dravitzki 9.39
Bruce McLeod 9.56
Neil Kerrison 10.92
Darren Ashmore 11.87
Mark Lawson 13.5
Rob Jessop 15.74
Stu Barr 17.39
Bill Edwards 18.9
Greg Flynn 21.52
Jamie Stewart 25.85
Brent Edwards 28.26
Jason Markham does not appear

M21E Ranking Comparison - 12 Months - Average Best 6 Minimum 4
Name Ranking
Chris Forne 10.87
Carsten Jorgensen 11.7
Darren Ashmore 13.22
Karl Dravitzki 13.76
Mark Lawson 16.42
Neil Kerrison 17.83
Rob Jessop 19.49
Bill Edwards 22.47
Greg Flynn 23.89
Jamie Stewart 25.85
Brent Edwards 32.86
Stu Barr does not appear
Jason Markham does not appear
Bruce McLeod does not appear

M21E Ranking Comparison - 14 Months - Average Best 6 Minimum 4
Name Ranking
Carsten Jorgensen 6.12
Chris Forne 9.16
Karl Dravitzki 11.37
Darren Ashmore 13.22
Neil Kerrison 13.85
Mark Lawson 15
Rob Jessop 19.49
Stu Barr 21.26
Bill Edwards 22.47
Greg Flynn 23.89
Brent Edwards 29.56
Jamie Stewart 30.79
Jason Markham does not appear
Bruce McLeod does not appear

M21E Ranking Comparison - 18 Months - Average Best 6 Minimum 4
Name Ranking
Carsten Jorgensen 4.44
Chris Forne 7.15
Neil Kerrison 9.72
Karl Dravitzki 11.37
Darren Ashmore 12.15
Mark Lawson 12.42
Bill Edwards 16.1
Rob Jessop 18.75
Greg Flynn 20.83
Stu Barr 21.26
Brent Edwards 26.94
Jamie Stewart 30.79
Jason Markham does not appear
Bruce McLeod does not appear

M21E Ranking Comparison - 24 Months - Average Best 6 Minimum 4
Name Ranking
Carsten Jorgensen 2.41
Rob Jessop 3.43
Chris Forne 4.65
Jason Markham 5.35
Darren Ashmore 8.86
Neil Kerrison 9.72
Karl Dravitzki 11.37
Mark Lawson 11.45
Bill Edwards 12.28
Stu Barr 12.83
Jamie Stewart 15.86
Greg Flynn 17.26
Brent Edwards 20.89
Bruce McLeod does not appear

- If the minimum counted is fewer than 4, Bruce appears in the ranking. (in my opinion this is bad)
- If period is 2 years, Jason appears. (bad)
- If period is 12 months, Stu disappears. (ok if you want a very current ranking but a longer period is probably better)

I decided not to use the weighting method because of the problems mentioned in this thread plus it would have taken extra programming time.

I'm leaning towards either the 18 months or the 14 months (Average best 6, at least 4). Any opinions?
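For concreteness, the windowed criteria Bryan is comparing can be sketched in Python. This is only a guess at the shape of the calculation, not his actual code; the function names and the sample results are invented. Lower points are better, and a runner with fewer than the minimum number of counting results "does not appear", as in the tables above.

```python
from datetime import date, timedelta

def window_ranking(results, as_of, months, best_n, min_count):
    """Average of the best `best_n` point scores inside a rolling
    window of roughly `months` months (lower points = better).
    Returns None if fewer than `min_count` results fall in the
    window, i.e. the runner "does not appear"."""
    cutoff = as_of - timedelta(days=months * 30)
    recent = sorted(pts for d, pts in results if d >= cutoff)
    if len(recent) < min_count:
        return None
    counted = recent[:best_n]          # best (lowest) scores count
    return sum(counted) / len(counted)

# Hypothetical results as (date, points) pairs:
runs = [(date(2003, 4, 18), 1.0),    # old Nationals run, outside a 12-month window
        (date(2004, 4, 10), 5.0), (date(2004, 4, 11), 7.0),
        (date(2004, 10, 30), 9.0), (date(2004, 10, 31), 11.0)]
print(window_ranking(runs, date(2005, 2, 21), 12, 4, 4))   # 8.0
print(window_ranking(runs, date(2005, 2, 21), 12, 6, 6))   # None
```

With the old 2003 run excluded by the 12-month window, only four results remain, so a stricter minimum drops this runner entirely - the same kind of effect that makes names "not appear" in the comparison lists above.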

Chris Forne   Posted: 21 February 2005, 8:59 AM
The problem with a non-weighted (rectangular) window is that old results count just as much as recent results. The ranking points will therefore not reflect recent change.

In reply to Michael

> Re the number to count, I would like a lowish number so the
> ranking is a better indication of recent change.

Lowering the number to count will not help the ranking indicate recent change any better; it may even do the opposite. The only way to make the ranking better reflect recent change is to shorten the windowed period or use a time-weighted approach.

Take for example Rob Jessop, using a 24 month window. By having some really strong runs during the 2003 Nationals and ANZAC carnival, he still has a current ranking of no.2, even though he has not performed as well over the last year and a half. Lowering the number of events to count would not help, as I'm reasonably sure he would still remain at no. 2.

All lowering the count does is favour the orienteers who occasionally 'pull one out of the bag' rather than the orienteers who are consistent but never really blitz the field.

Another problem with using a rectangular (non-weighted) window is that abrupt changes in the ranking may occur when results fall out of the windowed period. Assuming an 18 month window, Rob would have had a current ranking of no.2 up until about October last year, then within the space of about a week his ranking would have fallen to somewhere around its current position of no.8 even though no events had occurred.

Personally I like Neil's suggestion of having a uniform 6 month period after which ranking points are gradually increased. This way recent events will have a similar effect to at present, but old results will gradually fade away rather than abruptly falling off.

If a non-weighted approach is used, I'd suggest a 12 month period to keep the rankings relatively up-to-date, and average best 6, at least 4, to prevent one-off wonders, or people like Bruce who only run two events a year, from featuring quite so highly.

Also how about the idea of giving extra weighting to events that have stronger fields?
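Neil's gradual fade-out, as Chris describes it, could look something like the sketch below. Everything here is a placeholder assumption - the 6-month (183-day) flat period, the 12-month (365-day) linear fade, and the function name - it's just one way the idea might be implemented:

```python
def faded_points(points, age_days, flat_days=183, fade_days=365):
    """Results younger than `flat_days` count at face value; older
    results are linearly inflated (i.e. penalised, since lower
    points are better) until `flat_days + fade_days`, after which
    they no longer count at all. The 183/365-day figures are
    illustrative assumptions, not part of any proposal."""
    if age_days <= flat_days:
        return points
    if age_days >= flat_days + fade_days:
        return None                      # fully faded out
    fade = (age_days - flat_days) / fade_days   # grows 0.0 -> 1.0
    return points * (1 + fade)
```

Because a result's penalty grows a little each day instead of the result vanishing all at once, a runner in Rob's position would slide gradually down the list rather than dropping several places in a single week.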


Neil K   Posted: 21 February 2005, 11:59 AM
I agree wholeheartedly with Chris.

Also it's interesting that most of the methods clearly show that I'm either the same as or significantly better than you Darren and Brent suxs.

This message was edited by Neil K on 21 February 2005, 8:01 PM

addison   Posted: 21 February 2005, 12:15 PM
Chris, extra weighting is already given to events with stronger fields: ranking points are based not only on where you finish, but on who you beat and who else is racing. Bryan explained that earlier.

HeadHoncho   Posted: 21 February 2005, 10:36 PM
Chris, the problem you describe is a "problem" of most ranking systems. To use two examples:

The IOF rankings have a 12 month window, no weighting. Rankings can change quite markedly when events drop off - Tania was inside the top 50 after WOC 2003 but dropped well outside the top 100 (because she didn't have the opportunity to "replace" those ranking points, through non-attendance at WOC 2004).

ATP Tennis rankings are the same (although I'm not sure that they are non-weighted).

By comparison, World Golf Rankings have too long a period and are weighted for recent results - but it took over a year for last year's best golfer (VJ Singh) to get to No.1 in the rankings and in the early days Greg Norman was ranked No.1 for years past his best.

12 month non-weighted rankings work best when events are around the same time each year. So perhaps one can argue Tania deserved her ranking to drop because she didn't attend WOC 2004. Ditto Tennis where the same events are generally held at the same time every year.

NZ Orienteering events follow a similar pattern - Nationals at Easter every year, provincial champs during Oct-Nov so I believe a 12 month window is best. Then essentially your runs at an event are replacing the ones from the previous year.

Then it becomes a question of how many events, which comes down to whether you want to reward excellence or consistency. In the early days of the IOF rankings, Ben Rattray and Gareth Candy were ranked inside the top 100 because very few people ran the 6 events that could count. Now that the counting number of events has been reduced to 4, the rankings are probably better.

IMO a minimum of 2 events is too few - there are more events than that at Easter alone. To be ranked you should turn up to a reasonable number of events, so the minimum should be at least 4.

I also believe the number of events used for ranking and the minimum number should be the same. So it should be average of best 4, minimum 4, or average of best 6, minimum 6.

Jamie   Posted: 22 February 2005, 1:10 AM
It's not often I can say I support a posting by Rob C - but this makes sense.

Chris's and Neil's suggestions are what we would have in an ideal world, but it sounds like it's too much trouble to be worth it. After all, were it not for Bryan's fantastic voluntary effort we wouldn't be debating this at all!

Bryan   Posted: 22 February 2005, 10:50 PM
I'll go with 12 months, average best 4, minimum 4.

I ran these criteria on the same sample as above:

M21E Ranking Comparison - 12 Months - Average Best 4 Minimum 4
Name Ranking
Chris Forne 9.72
Carsten Jorgensen 11.7
Darren Ashmore 11.87
Karl Dravitzki 13.76
Mark Lawson 15.02
Neil Kerrison 16.13
Rob Jessop 15.74
Bill Edwards 18.91
Greg Flynn 21.52
Jamie Stewart 25.85
Brent Edwards 30.41

(no change in order from the one with average best 6, minimum 4)

Chris Forne   Posted: 23 February 2005, 2:32 AM
Looks pretty good to me.

addison   Posted: 23 February 2005, 3:12 AM
Bryan, is it changing for elites only or for everyone? I think the old system is best for age grades that are not open.

Neil K   Posted: 23 February 2005, 2:13 PM
Good work. It's good to see a positive solution come from constructive comments. First time ever in orienteering.

Bring back the Biff.

Bryan   Posted: 23 February 2005, 11:00 PM
The ranking system will only change for elites. The old system (average best 4, minimum 2) over a specific period (usually from the beginning of the year to when the rankings were compiled that year) has some advantages for the young and old grades:
- those grades have far fewer competitors than the elites
- people swap classes quite a lot (moving up or down)





© Ruffneck Productions