
    Explaining level changes

    David Duffett v Jason Baskin (Sun 16 Feb 2020)

    Match won by David Duffett. Result: 6-3.

    Starting level for David Duffett: 3,002, level confidence: 92%. Set manually.
    Starting level for Jason Baskin: 1,770, level confidence: 54%.
    David Duffett to win as he is currently playing 70% better than Jason Baskin.

    David Duffett won 67% of the points.
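
The two figures above come from simple ratios, shown here as a short sketch. The variable names are illustrative; the inputs are the starting levels and the 6-3 result quoted in the text.

```python
# Illustrative arithmetic only; variable names are hypothetical.
home_level, away_level = 3002, 1770

# "Playing 70% better": the ratio of the two current levels.
relative_level = home_level / away_level   # ~1.70, i.e. about 70% better

# "Won 67% of the points": from the 6-3 result, 6 of 9 points.
points_won, points_total = 6, 9
points_ratio = points_won / points_total   # ~0.667
```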

    As David Duffett has played below his allowed range at 2,476, his level reduction is 5.3% before damping. On the assumption that David Duffett would normally have been playing at level 2,774 (based on typical behaviour), Jason Baskin played better than expected and therefore gains a pre-damping level increase of 8.8%.

    Due to the difference in level between the players, the adjustments have been reduced to 4.1% and 6.7% respectively.
    As this is a best of 1 match rather than best of 5, these adjustments have been reduced to 3.3% and 5.4% respectively.
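
The two staged reductions above are consistent with each stage applying a multiplicative factor to the previous adjustment. The factors in this sketch are inferred from the rounded percentages quoted in the text; they are not the system's actual constants.

```python
# Inferred from the quoted (rounded) percentages - NOT the real constants.
duffett_pre, baskin_pre = 5.3, 8.8   # pre-damping adjustments (%)
duffett_s1, baskin_s1 = 4.1, 6.7     # after the level-difference reduction
duffett_s2, baskin_s2 = 3.3, 5.4     # after the best-of-1 reduction

# Each stage retains roughly the same fraction for both players:
stage1 = (duffett_s1 / duffett_pre, baskin_s1 / baskin_pre)  # ~(0.77, 0.76)
stage2 = (duffett_s2 / duffett_s1, baskin_s2 / baskin_s1)    # ~(0.80, 0.81)
```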

    Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for David Duffett changes to -2% and for Jason Baskin to +5.4%.

    After applying standard match damping, the adjustment for David Duffett becomes -1.3% and for Jason Baskin becomes +3.3%.

    Given David Duffett's level and the type of match played, an additional damping of 16% has been applied to his level change.

    Applying the match/event weighting of 50% for 'Redland Boxes', the adjustment for David Duffett becomes -0.6% and for Jason Baskin +1.7%.
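
As a rough check of that last step (variable names hypothetical): the 50% event weighting halves each post-damping adjustment. Because the inputs below are the rounded percentages quoted above, the results land close to, but not exactly on, the displayed -0.6% and +1.7%.

```python
# Approximate reconstruction using the rounded figures from the text.
event_weight = 0.5

duffett_adj = -1.3 * (1 - 0.16)   # after the extra 16% damping, ~-1.09%
baskin_adj = 3.3                  # no extra damping for Jason Baskin

duffett_final = duffett_adj * event_weight   # ~-0.55% (displayed: -0.6%)
baskin_final = baskin_adj * event_weight     # ~+1.65% (displayed: +1.7%)
```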

    Level confidence increases because one more match has been played. David Duffett: 96%, Jason Baskin: 73%. It is then reduced according to how unexpected the result was. David Duffett: 85%, Jason Baskin: 65%.

    A final adjustment of -0.2% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.
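
The calibration step can be pictured as scaling every level in a pool by the same small factor, which leaves all ratios between players (and hence all predictions) unchanged. This sketch uses a hypothetical function name and a made-up pool, with the -0.2% figure from the text.

```python
def calibrate_pool(levels, adjustment_pct):
    """Scale every level in a pool by the same factor so the pool
    stays comparable with other pools (illustrative sketch only)."""
    factor = 1 + adjustment_pct / 100
    return [level * factor for level in levels]

# A made-up three-player pool; all levels move by the same -0.2%,
# so the ratio between any two players is unchanged.
pool = [3002, 1770, 2500]
adjusted = calibrate_pool(pool, -0.2)
```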

    Final level for David Duffett: 2,994, level confidence: 85%.
    Final level for Jason Baskin: 1,786, level confidence: 65%.

    Notes

    • This calculation is done in two main parts: first work out the adjustment needed to match the result, then apply damping. This means that levels should always be 'about right', but how quickly they get there - the volatility - depends on the damping.
    • A level also has a 'level confidence', which drops if a player hasn't played for a long time or has had unexpected results. Because low-confidence levels adjust more quickly than high-confidence levels, these players can find their level more quickly without affecting their opponents' levels too much.
    • Point scores are used as well as game scores for accuracy - particularly important for 3-0 results - though we can work with game scores only too, albeit with more damping.
    • Mismatched players are allowed for - you don't have to hammer your opponent. See explanation above if this applies to this match.
    • The section on damping is where we still have some options. We have recently made a change to damp league matches more than tournament matches and box matches even more than that. This gives added weight to the more important matches.
    • There are occasional, very small adjustments made to all players to keep the averages constant which are not covered here.
    • You don't get a bonus just for winning - if you want to go up you have to play better than expected against your opponent.
    • We have spent more than 5 years fine tuning the level calculations based on tens of thousands of match results and a great deal of feedback from players, team captains and coaches. It's the most usable and accurate ranking system there is in any sport, let alone squash.
    • For a more complete explanation of how levels are calculated (on which this system is based) see the help file here.
    • If you have thoughts or opinions on the above, or any feedback on the way levels are calculated or updated, please contact us. We welcome all feedback, although we are keen squash players ourselves and would prefer to be on court than in front of a screen, so please be patient and do check whether your question has already been answered on the help page. We are unable to answer questions about how hard anyone played in their match - we only get to see the results - and if your level didn't increase as expected, please make sure you've read the explanation above before contacting us. If you want to go up the levels, train harder, listen to your coach and win more points. Or just be incredibly talented!
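
The two-part structure described in the first note (work out the adjustment, then damp it) can be sketched as below. The function name, the blending formula and the 50% damping value are purely illustrative; the real system chains several damping stages as described above.

```python
def damped_adjustment(undamped_factor, damping):
    """Blend an undamped level-change factor toward 1 (no change).
    damping = 0 applies the full change; damping = 1 applies none.
    Illustrative sketch only, not the system's actual formula."""
    return 1 + (undamped_factor - 1) * (1 - damping)

# e.g. an undamped +8.8% increase, damped by a hypothetical 50%:
factor = damped_adjustment(1.088, 0.5)   # ~1.044, i.e. about +4.4%
```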