Pythagorean expectation
Pythagorean expectation is a formula invented by Bill James to estimate how many games a baseball team "should" have won based on the number of runs it scored and allowed. The term is derived from the formula's resemblance to the Pythagorean theorem, which computes the length of the hypotenuse of a right triangle from the lengths of its other two sides.
The basic formula is:
- <math>Win\% = \frac{\text{Runs Scored}^2}{\text{Runs Scored}^2 + \text{Runs Allowed}^2}</math>
Win% is the winning percentage generated by the formula. Multiplying it by the number of games a team plays (a Major League season is currently 162 games) yields the number of wins the team would be expected to record given its runs scored and allowed.
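As a minimal sketch of the calculation above (the 800/700 run totals are hypothetical, chosen only for illustration):

```python
def pythagorean_win_pct(runs_scored, runs_allowed, exponent=2.0):
    """Estimate a team's winning percentage from runs scored and allowed."""
    rs = runs_scored ** exponent
    ra = runs_allowed ** exponent
    return rs / (rs + ra)

# Hypothetical season: 800 runs scored, 700 allowed, over 162 games.
win_pct = pythagorean_win_pct(800, 700)
expected_wins = win_pct * 162
print(round(win_pct, 4))        # 0.5664
print(round(expected_wins, 1))  # 91.8
```

A team that scores exactly as many runs as it allows comes out at .500, as one would expect.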
Empirically, this formula correlates fairly well with how baseball teams actually perform, although an exponent of 1.81 is slightly more accurate. This correlation is one justification for using runs as a unit of measurement for player performance. Efforts have been made to find the ideal exponent for the formula. The most widely known is the Pythagenport formula, invented by Clay Davenport, which sets the exponent to 1.5 log((R + RA)/G) + 0.45; a less well known but equally effective alternative, invented by David Smyth, uses ((R + RA)/G)^0.287, where R is runs scored, RA is runs allowed, and G is games played.
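The two variable-exponent formulas above can be sketched as follows (a base-10 logarithm is assumed for Davenport's formula, and the run totals are hypothetical):

```python
import math

def pythagenport_exponent(runs, runs_allowed, games):
    """Clay Davenport's exponent: 1.5 * log10((R + RA) / G) + 0.45."""
    return 1.5 * math.log10((runs + runs_allowed) / games) + 0.45

def smyth_exponent(runs, runs_allowed, games):
    """David Smyth's exponent: ((R + RA) / G) ** 0.287."""
    return ((runs + runs_allowed) / games) ** 0.287

# Hypothetical team: 800 runs scored, 700 allowed, 162 games.
print(round(pythagenport_exponent(800, 700, 162), 3))  # 1.900
print(round(smyth_exponent(800, 700, 162), 3))         # 1.894
```

For typical modern run environments (around 9 to 10 total runs per game) both formulas produce exponents near 1.9, close to the empirically fitted 1.81 mentioned above.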
It is widely believed that deviations from a team's Pythagorean expectation are primarily due to luck, the quality of the team's bullpen, and the situation in the game when runs are scored.
See also
- Baseball statistics
- Sabermetrics
Use in basketball
When noted basketball analyst Dean Oliver applied James' Pythagorean expectation to his own sport, the result was a similar formula with a much larger exponent:
- <math>Win\% = \frac{\text{Points For}^{16.5}}{\text{Points For}^{16.5} + \text{Points Against}^{16.5}}</math>
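Oliver's variant is the same calculation with a different exponent; a sketch (the point totals below are hypothetical):

```python
def basketball_win_pct(points_for, points_against, exponent=16.5):
    """Dean Oliver's basketball variant of the Pythagorean expectation."""
    pf = points_for ** exponent
    pa = points_against ** exponent
    return pf / (pf + pa)

# Hypothetical NBA season: 8000 points for, 7800 against, over 82 games.
win_pct = basketball_win_pct(8000, 7800)
print(round(win_pct, 3))       # 0.603
print(round(win_pct * 82, 1))  # expected wins
```

Because the exponent is so large, even a small edge in scoring margin translates into a substantial expected winning percentage.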
External links
- Applying the pythagorean expectation to Football (http://www.footballproject.com/story.php?storyid=122): Includes a discussion of why the exponent in the formula should grow as the number of points scored per game increases.