Back in September, we shared our NHL contract projections as well as our methodology for the project. Now, after ~250 contracts and ~$400 million have been committed to players, it’s time to look back at how our projections performed. It’s one thing to share predictions and provide incremental updates as we have done on Twitter, but it’s another to take a step back and evaluate our efforts as a whole in this contract projections post-mortem. 

We encourage you to explore the contracts projections methodology and write-up in full, but in essence, we combined predictive modeling and statistics with in-depth knowledge of contracts and the CBA to project the length and value of player contracts. 

In this piece, we will assess the overall performance of our projections as well as highlight some differences and trends across various sub-groups of players as we look to improve our projections for future years. 

Overall Performance

1) Contract Length

We are really happy with the model’s performance, especially given the caveat we offered before embarking on this project: this offseason would be unlike any that came before. While less term and less money were given out league-wide than would be expected in a normal year, our projections still tracked signings closely without any arbitrary or ad hoc adjustments for this new financial reality. 

We predicted the correct length for 53% of all contracts signed, with a mean absolute error (MAE) of 0.70 years. By and large, players received less term than projected. This relationship is shown in the violin plot below, as the width of each band represents the proportion of players at a given projected length that actually received a given term.

[Figure: actual vs. projected contract length, 2020]

The bulk of players projected for one- or two-year deals received those short contracts, but most players projected to be signed for longer were forced to settle for shorter deals as well. In fact, just four players in our dataset were projected for longer than two years and actually ended up signing for longer than their projected term:

  • Torey Krug with St. Louis (projected 5 yrs vs. actual 7 yrs)
  • Brenden Dillon with Washington (projected 3 yrs vs. actual 4 yrs)
  • Josh Anderson with Montreal (projected 4 yrs vs. actual 7 yrs)
  • Chandler Stephenson with Vegas (projected 3 yrs vs. actual 4 yrs)

2) Cap Hit & Contract Value

In terms of contract value, our projections achieved an MAE of $538K AAV (or 0.66% of the $81.5M Salary Cap). When accounting for the actual length of contract signed, this figure drops to just $408K (0.50% of the cap). 

It’s unrealistic to expect that we would predict the value of every contract exactly, but there is still tremendous value in being “in the ballpark”. To that end, we were within +/- 10% of the actual AAV on 38% of all contracts signed, and within +/- 20% on 60% of all deals. 
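The “ballpark” metric is just the share of contracts whose projected AAV landed within a relative tolerance of the actual AAV. A minimal sketch with hypothetical dollar figures:

```python
# Hypothetical projected and actual AAVs, in dollars.
proj_aav   = [1.0e6, 3.5e6, 6.0e6, 9.0e6, 2.0e6]
actual_aav = [0.95e6, 3.0e6, 4.5e6, 8.5e6, 2.1e6]

def within(tol):
    """Share of contracts whose projection is within +/- tol of actual."""
    hits = sum(abs(p - a) / a <= tol for p, a in zip(proj_aav, actual_aav))
    return hits / len(actual_aav)

print(f"Within 10%: {within(0.10):.0%}, within 20%: {within(0.20):.0%}")
```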

Using this “known term” AAV projection, we achieved an r-squared score of 0.82—in other words, 82% of the variation in player contract value was explained by our model. This relationship is shown below. 

As indicated by the abundance of red in the above chart, players by and large made less money than expected this offseason—our projections were much more likely to miss high than low. Again, “expected” is a bit of a misnomer here, as we did anticipate that our projections would overestimate: past offseasons do not provide a good comparison for this unique year (i.e., previous free-agency contract commitments were made with the assumption of future cap increases). 

While the black dashed line represents a one-to-one relationship between projected and actual cap hit, the red dashed line corresponds to the relationship we actually observed—on average, players were paid approximately 84% of what was projected.
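The two summary statistics behind the chart can both be computed in a few lines: the red dashed line is a least-squares fit forced through the origin (its slope is the average actual-to-projected pay ratio), and the r-squared score compares residuals against total variance. A sketch with hypothetical cap hits rather than our real data:

```python
# Hypothetical projected and actual cap hits, in $M.
proj   = [1.0, 2.0, 4.0, 6.0, 8.0]
actual = [0.9, 1.6, 3.5, 5.0, 6.8]

# Least-squares slope for a line through the origin: sum(x*y) / sum(x^2).
slope = sum(p * a for p, a in zip(proj, actual)) / sum(p * p for p in proj)

# R^2 of the projections themselves: 1 - SS_residual / SS_total,
# where residuals are (actual - projected).
mean_a = sum(actual) / len(actual)
ss_res = sum((a - p) ** 2 for p, a in zip(proj, actual))
ss_tot = sum((a - mean_a) ** 2 for a in actual)
r2 = 1 - ss_res / ss_tot

print(f"slope={slope:.2f}, r2={r2:.2f}")
```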

Though players across the board were negatively impacted this offseason, the financial burden was not shared equally between asset classes: RFAs made closer to what they were projected than UFAs did, as shown below. 

[Figure: UFA actual vs. projected cap hit, 2020]

Inspired by a visualization Evolving Wild posted earlier in the offseason, the flatter blue line in the chart above indicates that UFAs were paid just 79% of what was projected, while the figure below demonstrates that RFA earnings were much more in line with projections (90%). This discrepancy suggests that RFA prices were not as affected by the flat cap, while UFAs were squeezed.

[Figure: RFA actual vs. projected cap hit, 2020]
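The per-group comparison amounts to fitting the same through-the-origin slope separately for each signing status. A sketch using hypothetical (status, projected, actual) tuples, chosen so the group ratios roughly mirror the 79%/90% split described above:

```python
# Hypothetical contracts: (status, projected cap hit $M, actual cap hit $M).
contracts = [
    ("UFA", 5.0, 4.0), ("UFA", 7.0, 5.5), ("UFA", 2.0, 1.6),
    ("RFA", 3.0, 2.8), ("RFA", 6.0, 5.4), ("RFA", 1.5, 1.3),
]

# Through-the-origin slope per group = average actual-to-projected pay ratio.
ratios = {}
for status in ("UFA", "RFA"):
    group = [(p, a) for s, p, a in contracts if s == status]
    ratios[status] = sum(p * a for p, a in group) / sum(p * p for p, _ in group)

for status, ratio in ratios.items():
    print(f"{status}: paid {ratio:.0%} of projection")
```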

Seen another way, our projections were in range for the majority of contracts across both statuses, but we were much more likely to miss high than low. 

                                              UFA    RFA
Too Low (projection > 20% below actual)        6%     4%
In Range (projection within 20% of actual)    52%    65%
Too High (projection > 20% above actual)      42%    31%

3) No-Movement Clauses
As an additional wrinkle on top of prior contract work, we also attempted to predict whether a player would receive a no-move or no-trade clause. Only eight of the 22 players we projected to receive a clause actually did so; however, every player who did receive a clause was among those we predicted. In other words, our clause predictions achieved a recall of 100% but a precision of just 36%. From this, we believe it’s reasonable to conclude that teams were less generous with movement protection, just as they were with term and dollars, this offseason. The economic uncertainty around hockey-related revenue, the financial stability of franchises and ownership, and the need for club executives to account for the impact on the salary cap all influenced decision-making this offseason. 
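The precision and recall figures above follow directly from the counts reported in this section: 22 predicted clauses, eight of which were real, and no clause recipients missed by the model. A quick sketch of the arithmetic:

```python
# Confusion-matrix counts from the clause predictions described above:
# tp = predicted a clause and the player got one
# fp = predicted a clause but the player did not get one
# fn = player got a clause we did not predict
tp, fp, fn = 8, 14, 0

precision = tp / (tp + fp)  # 8 / 22 ~ 36%
recall = tp / (tp + fn)     # 8 / 8 = 100%

print(f"precision={precision:.0%}, recall={recall:.0%}")
```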

Limitations and Future Work 
As we said, this was an offseason and a year like no other. 91 days passed between Alex Pietrangelo signing with Vegas and Mike Hoffman, another of our top ten projected free agents, inking his deal with St. Louis after going there on a PTO. 

This highlights one limitation of our approach: though we did include the timing of the signing as a feature, we simply treated all signings this offseason as though they were made in July of a normal year. This likely negatively impacted the performance of the model, as the amount of available money decreased and both high-end and depth players saw their leverage erode over time and had to settle for lower financial terms. 

All in all, we are happy with the results of our effort, but certainly not satisfied. It’s time to go back to the drawing board and start making improvements for this summer. Stay tuned!

For more of our work, check us out on Twitter @thehockeycode and @SamForstner.