Hi all
As we always say, one of the key reasons we produce the Top 100 course rankings is to generate debate, so it certainly seems like it's 'job done' on that score!
Some of the more inflammatory comments were predictable, but at least they show a certain passion and enthusiasm for the subject.
Jezz has obviously responded on several things already, but there are a number of points I thought were worth making or restating...
As I said in my editor's letter (which I appreciate not everyone may have read yet), there simply can't be any 100% right or wrong positions when it comes to ranking courses, as the whole thing is, to a degree, based on subjective opinion.
Golf Monthly's Top 100 list is our opinion, Golf World's is theirs, and Dodger's, The Craw's and Snelly's (just three names pulled out completely at random, you understand) are theirs.
Our opinion is derived from the input of our course rankings panel, in the form of course assessments carried out against a set of criteria that you can read about in detail here.
The criteria, and the weighting between them, are broadly the same as in every major course ranking I've ever seen. We probably add a little more weight to the 'visual appeal' and 'ambience' categories than some other rankings do. That's because we want our rankings to be inspirational and useful not just to 'better golfers', who tend to focus very heavily on the test and the conditioning, but to all golfers, for whom breathtaking scenery and the overall experience you have as a visitor are very important... especially if you haven't played well!
Our panel is made up of 40 golfers (GM staff, readers and advisors) who cover all ages, handicaps and regions of the UK&I.
Everyone on the panel has played at least half of the current Top 100, with many very close to having played the complete Top 100 (and total courses played of around 300).
Jezz and Rob Smith (who make up the Senior Panel with me) have now played all 100 (and almost all of the next 100) and I’ve played 72 of the current Top 100 and 63 of the next 100.
We then have input from our senior advisory panel - all of whom have played in excess of 600 courses worldwide (including all or almost all of the Top 100) with one of them having played 3,000.
Since we published the last rankings in November 2010, I have played nearly 50 of this year's Top 100, and I suspect Jezz and Rob will have played even more. That means we have up-to-date experience of the courses we are ranking, which is important: it means we will have seen at first hand any improvements (on and off the course) and, equally, any significant drop in the standard of the playing surfaces.
Several clubs in the Top 100 that we have spoken to say they haven't been visited by anyone from a rival magazine's panel for a number of years.
I don't quote those figures to show off, but rather to underline the panel's combined experience; the more experience you have, the better placed you are to rank one course above another.
Between the members of the panel, we undertook over 1,000 course visits (covering 170 courses) from March 2011 to August 2012. Every course had at least one visit in this period, with many receiving around five (and some as many as ten), meaning we get a wide range of feedback and opinion on all the various criteria.
To give you an idea of the level of information we ask panellists for, and get back, here are some links to three sample assessment forms.
http://www.mediafire.com/view/?isiyhhispkh31v3
http://www.mediafire.com/view/?9farhw3am4hovze
http://www.mediafire.com/view/?4t2zfhvc15f90xf
Once we have all the assessments in, Jezz, Rob Smith and I sit down and go through them, noting the marks, the comments and, crucially, the benchmarking section where panellists suggest rankings and compare the course to others. Taking in all those comments, we then start to adjust positions according to the pooled feedback, adding in our own views. Where we have disagreements amongst ourselves, we debate it until we get at least a 2:1 majority. Sometimes the final call comes down to answering the question: 'where would you rather go and play tomorrow, course A or course B?'
On its own, that's not very scientific, but when it's the final element of an extended assessment process, I think it's a fair tiebreaker.
I'd call the above pretty comprehensive, and I hope it shows that our process is anything but lazy, as someone suggested.
By contrast, I have seen the communication a rival magazine sends to its panellists. It consists of a list of their last Top 100 and merely asks them to cut and paste courses up and down; if a panellist moves a course more than 10 places, they are asked to write a sentence explaining why. In my book, that's not very comprehensive.
So there you have it - the Golf Monthly Top 100 course rankings in a nutshell.
Hope the above has been of some interest.
Mike
PS We have a Top 100 Facebook app coming next week, so you can easily tally up which ones you've played and share your list.