Five weeks in, you start to hear NFL experts trade their preseason overconfidence for regular season overconfidence. It’s tempting to fall into the trap of thinking that, with over a quarter of the season in the books, we now have an idea of how the rest of the regular season will unfold.
It’s tempting, but it’s not really true. The best way to measure whether someone knows what they’re talking about is to see if their predictions come true. Fortunately for us historians, a group of experts already predicts what will happen in every game, every week: it’s called the point spread.
If we actually learn something each week, the point spreads should track the actual results more closely as the leaves change colors. But do they?
I looked at the point spread for every regular season game from 1988 to 2011. Now the question becomes: how do we measure whether a point spread was “right”? If the Texans are favored by 7 over the Bengals and win by 10, is that a “good” projection?
The simplest way to test this is to see the difference between the actual result and the projected result. The line that was most “off” in the database came two years ago. Likely due to the genius of Josh McDaniels, the Raiders were 7-point underdogs in Denver in 2010, but won the game 59-14. So in that game, the line was off by 52 points. Last year, the 49ers were 3-point favorites at home against Tampa Bay, but won by 45 points; that line was “off” by 42 points.
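To make that concrete, here is a minimal sketch of the error measure, with spreads expressed from the favorite’s perspective and the two games above plugged in as examples (the function name is mine, not from the original database):

```python
def line_error(spread, actual_margin):
    """Absolute difference between the projected margin of victory (the spread)
    and the actual margin, both from the favorite's point of view."""
    return abs(actual_margin - spread)

# Denver, a 7-point favorite, lost to the Raiders by 45 (14-59) in 2010:
print(line_error(7, -45))  # 52
# The 49ers, 3-point favorites over Tampa Bay in 2011, won by 45:
print(line_error(3, 45))   # 42
```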
One interesting sidenote: you might think that big lines would be off by bigger margins than small lines, but that’s not really the case. The standard deviation of “how much the line was off by” is roughly 8 points regardless of the size of the spread. It’s not exact, of course, but for our purposes we can work under the assumption that the distribution of line error is roughly the same no matter how big the spread is.
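Here’s a rough sketch of how you might check that, assuming a hypothetical `games` table with one row per game containing the closing spread and the favorite’s actual margin (the column names and buckets are mine, not from the original database):

```python
import pandas as pd

def error_spread_by_line_size(games: pd.DataFrame) -> pd.Series:
    """Standard deviation of (actual margin - spread), grouped into buckets
    by the size of the spread. If lines of all sizes miss by similar amounts,
    these numbers should all hover around the same value (roughly 8 points)."""
    err = games["actual_margin"] - games["spread"]
    buckets = pd.cut(
        games["spread"],
        bins=[0, 3, 6, 9, 13, 30],
        include_lowest=True,
        labels=["0-3", "3.5-6", "6.5-9", "9.5-13", "13.5+"],
    )
    return err.groupby(buckets).std()
```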
Anyway, back to the point of the post. How accurate are lines early in the year? In week 1, the spreads are generally a little tighter than they are the rest of the way; perhaps the oddsmakers are just as unsure as the rest of us. But no matter what week it is, the lines miss the actual result by an average of about 10 points per game. Take a look. The table below shows how much the lines were off by, on average, in weeks 1 through 17 from 1988 to 2011. In the last column, I’ve shown the percentage of games where the line was within 10 points of the actual result (a rough sketch of the calculation follows the table).
Week | Games | Avg Spread | Avg Line Error | Within 10 Pts
1 | 362 | 4.7 | 10.7 | 60%
2 | 361 | 5.2 | 10.5 | 58%
3 | 340 | 5.6 | 10.4 | 56%
4 | 322 | 5.5 | 10.9 | 56%
5 | 319 | 5.2 | 10.0 | 64%
6 | 315 | 5.4 | 10.0 | 60%
7 | 315 | 5.6 | 10.9 | 57%
8 | 315 | 5.5 | 10.6 | 59%
9 | 326 | 5.3 | 10.5 | 59%
10 | 339 | 5.6 | 10.1 | 57%
11 | 358 | 5.6 | 9.4 | 61%
12 | 362 | 5.7 | 10.0 | 62%
13 | 363 | 5.4 | 10.1 | 59%
14 | 360 | 6.0 | 10.8 | 59%
15 | 363 | 6.0 | 10.9 | 57%
16 | 363 | 5.7 | 10.9 | 57%
17 | 335 | 5.7 | 10.4 | 61%
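The figures in the table can be recomputed in a few lines of pandas. This is only a sketch, assuming a hypothetical per-game dataset with `week`, `spread`, and `actual_margin` columns (again from the favorite’s perspective):

```python
import pandas as pd

def weekly_accuracy(games: pd.DataFrame) -> pd.DataFrame:
    """For each week: number of games, average spread, average absolute
    line error, and the share of games where the line landed within
    10 points of the actual result."""
    err = (games["actual_margin"] - games["spread"]).abs()
    table = games.assign(line_error=err).groupby("week").agg(
        games=("spread", "size"),
        avg_spread=("spread", "mean"),
        avg_error=("line_error", "mean"),
        within_10_pts=("line_error", lambda e: (e <= 10).mean()),
    )
    return table.round(2)
```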
It’s tempting to think we know more once we’ve seen more, but that’s unfortunately not the case. Of course, if it were, it would be easy to make money gambling on football.
For those curious, week 12 of the 2003 season was as close as Vegas has ever come to perfection. Look at how close these lines were to the actual results (as always, the boxscores are clickable):
What about the other side of the coin? Vegas was really, really, really off in week 17 of the 1993 season. Just so you know, 1993 was the year the NFL tried the double-bye-week approach, resulting in an 18-week season, so this was really like week 16 in most other years: