Foul Ball Rates Jumped in 1999-2000: Are Maple Bats the Reason?


The most recent updates to the historical foul ball rates (four more seasons' worth of data have been added, covering missing games and teams as well as three more full seasons) show the three-ball increase discussed in an earlier post becoming more pronounced.


What is also interesting with the updated numbers is the growing difference in rates between the American League and the National League. The AL outpaces the NL by .68 foul balls per game, the equivalent of about 1,650 more balls per season.
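The per-season figure follows from simple arithmetic. A quick sketch of the calculation (the 2,430-game season length is an assumption, since the post does not state which schedule it used, but it is the figure implied by the ~1,650 estimate):

```python
# Rough check of the per-season impact of the AL/NL foul ball gap.
# The 2,430-game season (30 teams x 162 games / 2) is an assumption;
# the post's "about 1650" estimate implies a season of roughly this size.
per_game_gap = 0.68          # AL minus NL, foul balls per game
games_per_season = 2430      # full modern MLB regular season

extra_fouls = per_game_gap * games_per_season
print(round(extra_fouls))    # ~1652, in line with "about 1650"
```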

1999 and 2000 Account for 99% of Foul Ball Rate Increase

The updated data (the addition of about four more seasons of information) didn’t significantly change the historical rate of 47.7 foul balls per game, but it does show that starting in the 1999 season and continuing through 2000, foul rates in the AL jumped to over 48 balls per game. The AL frequency leaped 1.23 balls in 1999 over the previous year, which had seen a minor dip in foul ball production. The AL rate added another .95 in 2000 before appearing to level off over the next decade and a half. In just two seasons, the American League saw an overall increase of 2.18 balls per game; when 2001 is thrown into the calculations, the AL frequency jumped 2.68 balls per game in only three years. Since interleague play started in 1997, there has been only about a three-ball increase across both leagues, and a 2.33 balls per game increase in the AL through 2014. These three seasons therefore account for essentially the entire jump in foul ball rates over the data collected to this point since the start of interleague games.

The NL rate during this same time ended up breaking the 47 foul ball mark, and that rate remained relatively consistent after the 2000 season. The 1999 season shows an increase of 1.96 balls per game, while the 2000 season added .63 fouls per game in the league. This is a surge of 2.59 foul balls in just two seasons, and a 2.95 increase over three. A pattern similar to the American League's is evident: a significant jump in the average number of foul balls slapped starting in the 1999 season, lasting about two more seasons before beginning to level off, with only a very gradual uptick afterward.
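The cumulative figures for both leagues can be checked by summing the single-season jumps. In this sketch, the 2001 jumps are inferred by subtraction (the three-season total minus the two-season total), since the post does not quote them directly:

```python
# Sanity check of the cumulative increases quoted in the post.
# 1999 and 2000 jumps are taken directly from the text; the 2001
# jumps are inferred: 2.68 - 2.18 = 0.50 (AL), 2.95 - 2.59 = 0.36 (NL).
al_jumps = [1.23, 0.95, 0.50]   # 1999, 2000, 2001
nl_jumps = [1.96, 0.63, 0.36]

print(round(sum(al_jumps[:2]), 2))  # 2.18 over two AL seasons
print(round(sum(al_jumps), 2))      # 2.68 over three
print(round(sum(nl_jumps[:2]), 2))  # 2.59 over two NL seasons
print(round(sum(nl_jumps), 2))      # 2.95 over three
```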



Switching from Hickory and Ash to Maple Bats: Possible Reason for Jump in 2000

This drastic rise in rates, one that accounts for nearly the entire increase over the last 15 years or so, could be the result of a couple of changes in the game during this time. The most notable is the type of wood used in bats; the other may be steroid use, which is thought to have peaked around this time.

While there is no clear-cut, definitive season during which players switched from the preferred hickory or ash to maple bats, the late 1990s did see a marked downturn in the use of ash bats. The 1999 and 2000 seasons may have marked the point at which most batters were swinging maple.

Assuming these seasons were the major transition years, it is reasonable to conclude, given that rates leveled off and remained fairly consistent over the next 15 years, that players were adjusting to the weight and feel of the new wood. There is always an adjustment period.

What is important to remember is that the bats are not themselves the reason for the foul ball changes. The bats don’t cause or create the fouls. What drives the increase is players having to adjust and get used to the feel, weight, and other nuances of a new type of bat wood (ash, for example, can flake after a few uses, which means the batter goes through more, and fresher, bats). This is similar to batters adjusting their stance or changing their swing in an effort to become better hitters. In this sense, the mass exodus from hickory and ash bats to maple would most likely account for batters learning to swing the new bats.

The adjustment hypothesis would also explain the nearly constant foul ball rates since the 1999 and 2000 seasons, when rates rocketed up nearly 2.5 balls per game. As players grew accustomed to maple bats, they swung more consistently, which is why rates have increased only marginally since those seasons.


Final Thought(s)

While a decade of data must still be entered, a clearer picture of foul ball rates is beginning to emerge. Changes, even ones that appear insignificant, can have a rather profound influence on rates and on how batters respond to pitches. The fact that roughly 95% of the foul ball rate surge over the last 20 seasons occurred in the space of two seasons indicates something happened that profoundly altered the game. The closer proximity of fans to the field, as new parks were being built around this time, wouldn’t account for any increases, particularly of this nature. The only plausible culprit, besides perhaps the weather, is that these years represent the transition from hickory and ash to maple bats. No rule changes of significance occurred during this time frame that might explain the rate changes.