Levels are calculated for every player after every match and are based on:
- The player's level before the match.
- Their opponent's level before the match.
- The actual result compared to the expected result. Given that we know both players' levels before the match, we can predict the result if both played as expected. If the actual result shows that one of the players played better than expected, their level will go up a bit and their opponent's level will go down a bit.
- We then take a number of factors into account and work out how big that 'bit' of change should be. There's a rough sketch of the comparison just below.
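To make that concrete, here's a minimal sketch in Python of the expected-vs-actual comparison. The function names and numbers are illustrative only, not the production code, but the idea is as described above: the expected ratio comes straight from the pre-match levels, the actual ratio from the points scored, and the comparison decides which way each level moves.

```python
# Hypothetical sketch of the expected-vs-actual comparison. The real algorithm
# then applies weighting, damping and behavioural modelling on top of this.

def expected_ratio(level_a: float, level_b: float) -> float:
    """Expected performance ratio of player A over player B, from pre-match levels."""
    return level_a / level_b

def actual_ratio(points_a: int, points_b: int) -> float:
    """How much better A actually played, from the points won across the match."""
    return points_a / points_b

# Example: A is rated 20% above B but only just edges the match on points.
expected = expected_ratio(3600, 3000)  # 1.20 -> points scores around 11-9 expected
actual = actual_ratio(33, 31)          # ~1.06 -> closer than expected
if actual < expected:
    print("A played worse than expected: A's level drops a bit, B's level rises a bit")
else:
    print("A played better than expected: A's level rises a bit, B's level drops a bit")
```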
The algorithm uses:
- Maths - this is at the heart of the algorithm and is used to determine how much a player's level should change if there were no damping or behavioural modelling. It's actually very straightforward: for example, if your level is twice that of your opponent then you'd be expected to win your games 11-5 or so, and if your level is 20% higher then you'll win 20% more of the rallies, with points scores around 11-9. We use a combination of points scores and games scores to work out how much better the winner played as a ratio, and that's where we start. The overall goal is that if you play twice as well as your opponent then your level will be double theirs. This works all the way up from beginner (<50) to top pro (>50,000).
- Weighting - the more important the match (e.g. a tournament), the greater the weighting. This allows you to play a box match without it having too much impact on your league standings.
- Damping - there is a reasonable amount of damping dialled into each match, trying to get the balance right between wild swings and slow progress. The algorithm does its best to reward every player for a good result and, correspondingly, to apply an appropriate level reduction if the result isn't so good. The intention is that a player's level is a reasonable assessment of their current playing level within a match or two. A rough sketch of how weighting and damping scale the raw change follows this list.
- Behavioural modelling - as it turns out, not everyone puts 100% effort into every match, and that's down to behaviour. There are many other cases too where player behaviour defies the maths and, based on the analysis of 1.6 million results on the system, we've built an extensive behavioural model that allows us to predict and make use of these behaviours. See the calibration FAQ for more detail on what behaviours we model.
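To show how the weighting and damping above might fit together, here's a rough sketch in Python of how they could scale the raw, maths-only change. The match weights and damping factor below are made-up placeholders for illustration, not the values the algorithm actually uses.

```python
# Hypothetical sketch only: scale a raw, maths-only level change by match
# importance (weighting) and damp it to avoid wild swings. All numbers are
# illustrative placeholders, not the real weights.

MATCH_WEIGHTS = {"box": 0.5, "league": 1.0, "tournament": 1.5}  # assumed importance weights
DAMPING = 0.3  # assumed fraction of the raw change that actually gets applied

def applied_change(raw_change: float, match_type: str) -> float:
    """Scale the maths-only change by match importance, then damp it."""
    return raw_change * MATCH_WEIGHTS[match_type] * DAMPING

# A raw result suggesting a 300-point rise becomes a smaller, damped adjustment,
# and it counts for more in a tournament than in a box match.
print(applied_change(300.0, "box"))         # 45.0
print(applied_change(300.0, "tournament"))  # 135.0
```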
At a very high level, it's quite straightforward as there's a direct correlation between the ratio of the players' levels and the expected points scores. You can do it in your head! E.g. 20% better - 11-9, 30% better - 11-8, 50% better - 11-7 and so on. If you play better than expected you'll go up a bit or, if worse, you'll go down a bit. That's it, really! The complexity is in how much that 'bit' is but, fundamentally, your level will adjust so that your level ratios match your results ratios - on average.
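The quoted scores all follow a simple pattern: the loser's expected points are roughly 11 divided by the level ratio. That's an approximation inferred from the examples above rather than the exact formula, but it reproduces them:

```python
# Inferred approximation, not the published formula: the loser's expected
# points are about 11 divided by the level ratio (stronger / weaker).

def expected_score(level_ratio: float) -> str:
    """Approximate expected points score to 11 for a given level ratio."""
    return f"11-{round(11 / level_ratio)}"

for ratio in (1.2, 1.3, 1.5, 2.0):
    print(f"{ratio:.1f}x better -> about {expected_score(ratio)}")
# 1.2x -> 11-9, 1.3x -> 11-8, 1.5x -> 11-7, 2.0x -> 11-6 (quoted above as '11-5 or so')
```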
We can work with game scores only, making assumptions around the average 3-0 result (based on our analysis of real 3-0 match results), but because we can only use averages we apply a lot more damping, so it takes quite a few results for the levels to become accurate. Not all 3-0 results are the same, obviously.
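As a hedged illustration of the games-only case, here's a sketch that treats each games score as the 'average' result for that scoreline and then damps it more heavily. The average ratios and the extra damping factor below are placeholder guesses, not the measured averages from real results.

```python
# Hypothetical games-only sketch. The assumed average points ratios per games
# score and the extra damping are placeholders, not measured values.

ASSUMED_AVG_RATIO = {(3, 0): 1.5, (3, 1): 1.3, (3, 2): 1.1}  # winner/loser points ratio, guessed
EXTRA_DAMPING = 0.5  # assumed extra damping when only game scores are known

def games_only_ratio(games_won: int, games_lost: int) -> float:
    """Estimate how much better the winner played when only the games score is known."""
    return ASSUMED_AVG_RATIO[(games_won, games_lost)]

def damped_ratio(raw_ratio: float) -> float:
    """Pull the estimate back towards 1.0 because a games-only average is less certain."""
    return 1.0 + (raw_ratio - 1.0) * EXTRA_DAMPING

# A 3-0 win is treated as the average 3-0 and then damped, so it takes several
# results for the level to settle.
print(damped_ratio(games_only_ratio(3, 0)))  # 1.25 instead of 1.5
```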
Note that on the match review page, you can scroll down and click on 'View full explanation' to see the algorithm output in verbose mode for that match, so you can see exactly why your level changed by the amount it did.