Our team spent several months building and refining the Singletracks MTB Destinations ranking algorithm, and our plan is to continue tweaking it as we go along. There’s no such thing as a perfect algorithm, but there’s also something to be said for coming up with an objective ranking of mountain bike destinations based on key factors that are important to mountain bikers. I’ll go over the factors we’re using in this article for those who are curious.
Destination Defined
The Singletracks MTB Destination ranking considers trails within 25 miles of a city center (these city centers are predefined within a standard geocoded database). And by “trails,” we really mean trail systems. For example, we’re not counting all the individual named trails within a resort; instead, we’re aggregating information about all those trails within a system or resort and considering the system as a whole.
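As a minimal sketch, the 25-mile radius test can be done with a great-circle distance check. The field names and data shapes below are assumptions for illustration, not the actual Singletracks schema:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MI = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MI * asin(sqrt(a))

def systems_near(city, trail_systems, radius_mi=25.0):
    """Return the trail systems within radius_mi of a predefined city center."""
    return [s for s in trail_systems
            if haversine_miles(city["lat"], city["lon"], s["lat"], s["lon"]) <= radius_mi]
```

Note the check is against a trail system's single reference point, which mirrors the "consider the system as a whole" approach rather than testing every individual trail.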
While compiling the best destinations list, we realized early on that many destinations overlap, and so we’ve had to make some editorial decisions about which towns to keep and which ones to ditch. For example, Banff and Canmore are less than 10 miles apart and as you can see from the map above, there’s a ton of overlap between the two. In this case we chose to include Canmore instead of Banff on the list since there are more trails within the Canmore radius.
In Colorado we ran into similar issues and resolved them on a case-by-case basis. As another example, there is a lot of overlap between Leadville, CO and Breckenridge, and we chose to keep Breckenridge since the town itself offers more of the amenities bikers are looking for and is therefore more of a “destination.”
Quality AND Quantity
One of the key metrics many mountain bike destinations use to market themselves is trail mileage, and that factors heavily in our algorithm. The mileages we use are not just limited to singletrack trails since many important rides (like Porcupine Rim in Moab) are actually classified as 4×4 roads or even doubletrack. We also cap the length of each trail to 50 miles so that, for example, a trail like the Colorado Trail doesn’t add 500 miles to every town it passes through.
On the quality side, we’re using our existing MTB trail ranking scores to measure how good a ride really is. The trail ranking algorithm deserves an article of its own but at a very high level, trail scores are calculated based on the number and quality of reviews posted in our trail review database.
We take the scores for the 5 most popular trail systems within a 25-mile radius of a destination and add those scores together. Why just the top 5? Based on our latest survey, the average MTB trip lasts right around 3 days, so most visitors will only have time to ride a few of the best trail systems in a single trip. This way we’re also excluding places that have a lot of average trails but not any true stars.
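Sketched in code, the top-5 rule selects by popularity, then sums quality scores (the `popularity` and `score` field names are assumptions):

```python
def destination_quality(systems, top_n=5):
    """Sum the quality scores of the top_n most popular trail systems.
    Note the selection is by popularity, not by score, so a destination
    with many average trails but no standouts won't sneak in."""
    top = sorted(systems, key=lambda s: s["popularity"], reverse=True)[:top_n]
    return sum(s["score"] for s in top)
```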
Other Factors + Sorting
In addition to the quality scores and trail lengths, we factor in the diversity of trail difficulties in the area (more diverse = higher rank) and whether lift service is available. We also factor in the population of the destination: small towns get a boost over big cities like Denver, which tend to be popular workweek riding spots rather than true destinations. However, there is a floor on population size so tiny towns like Downieville, CA (population: 282) aren’t over-compensated.
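One way these adjustments could fit together is as multipliers on a base score. Every weight and constant below is an illustrative placeholder, not Singletracks' actual tuning:

```python
POP_FLOOR = 1_000  # hypothetical floor so tiny towns aren't over-boosted

def adjusted_score(base_score, difficulty_levels, has_lift, population):
    """Apply the extra ranking factors described above.
    All multipliers are made-up examples of the shape of the adjustment."""
    # more distinct difficulty levels -> higher rank
    diversity_bonus = 1.0 + 0.05 * len(set(difficulty_levels))
    # small bump for lift service
    lift_bonus = 1.1 if has_lift else 1.0
    # small towns beat big cities, but the floor keeps a town of 282
    # from being treated differently than a town of 1,000
    effective_pop = max(population, POP_FLOOR)
    population_factor = 1.0 / (1.0 + effective_pop / 1_000_000)
    return base_score * diversity_bonus * lift_bonus * population_factor
```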
At this stage we also throw out any destinations that fewer than 25 Singletracks members have visited. Admittedly this is an arbitrary number but we want to make sure we’re basing the rankings on the opinions of a decent number of riders. Many destinations on this list have been visited by thousands of Singletracks members, but we want to make sure up-and-coming destinations are captured, too.
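The visitor cutoff is a straight filter; a sketch (the `visitors` field name is an assumption):

```python
MIN_VISITORS = 25  # minimum Singletracks members who have visited

def eligible_destinations(destinations):
    """Drop destinations visited by fewer than MIN_VISITORS members,
    so rankings rest on a reasonable number of rider opinions."""
    return [d for d in destinations if d["visitors"] >= MIN_VISITORS]
```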
Finally, we apply weights to some of the ranking factors and calculate a Bayesian average of both the quality and quantity scores. This is essentially the same approach sites like Amazon.com are said to use; at a high level, it balances items that have just a few high ratings against items with many slightly lower ratings.
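A common form of the Bayesian average pulls an item's mean rating toward a prior mean, with the pull weakening as ratings accumulate. The pseudo-count and prior below are assumed values for illustration, not the weights Singletracks actually uses:

```python
def bayesian_average(ratings, prior_mean, prior_weight=10):
    """Bayesian average of a list of ratings.
    prior_weight acts like prior_weight phantom ratings at prior_mean,
    so items with few ratings stay close to the prior."""
    n = len(ratings)
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + n)
```

With a prior mean of 4.0 and a pseudo-count of 10, two 5-star ratings average to about 4.17, while fifty 4.8-star ratings average to about 4.67; the larger sample wins even though its individual ratings are lower.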
So why isn’t XYZ destination ranked higher?
See above. Ha!
Seriously though, we’ve been collecting information from hundreds of thousands of mountain bikers around the world for more than 15 years, but clearly not every mountain bike trail on the planet has been added to the Singletracks trail database yet. If a destination isn’t ranking highly, it could be because we don’t have much trail info for that area yet. Anyone can use this form to add a trail or trail system to help fill in the gaps.
If all the major trail systems in an area seem to be listed, the next step is to make sure reviews are up to date by adding your ratings and trail difficulty assessments. And if a trail system has lift service but the box isn’t ticked on the right side of the page, be sure to edit the trail info using the link in the “actions” menu.
The destinations list is automatically updated based on current info, though not all changes are reflected right away. We’re constantly watching the list and adjusting the algorithm to make sure the results are reasonable. And while the results don’t always line up exactly with our own editorial choices, it’s nice to have an objective measure to discover destinations we may have overlooked. Plus, it’s hard to argue with a computer. 🙂
Check out the top 10 mountain bike destinations here!
3 Comments
However, this raises the question: are we weighting mileage too heavily relative to quality? As it stands now, quality is weighted at up to 265% of quantity, depending on the diversity of difficulties offered and lift service. Saying quality is roughly 2.6 times more important than quantity is already pretty strong, but it would be interesting to see how the list changes if we go even higher.

So in the case of Fruita, it's penalized both for its lack of quantity AND for its lack of lift service (which plays into diversity).