MLB American League Foul Ball Rates Jump during Strike Years: Shortened Seasons Equal Higher Averages

One thing is certain about a short season: foul ball rates jump. It appears a larger strike zone does too. The American League data for the 1988-1996 seasons shows a reasonably even rate of foul balls at each park between the 1988 and 1993 seasons. The league average fluctuated by less than two fouls per game over that span (from a low of 43.82 in 1990 to a high of 45.73 in 1991). With some of the data incomplete, the true fluctuation could be close to zero.
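For readers who want to reproduce the season averages, here is a minimal sketch in Python. It assumes a hypothetical CSV (e.g., `al_fouls_by_game.csv`) with one row per game and columns `season`, `game_id`, and `fouls`, where the foul counts have already been tallied from Retrosheet play-by-play files; the file name and layout are illustrative, not part of the original data set.

```python
import csv
from collections import defaultdict

def season_foul_averages(path):
    """Average fouls per game for each season in a per-game CSV.

    The CSV layout (season, game_id, fouls) is a hypothetical
    pre-processed export of Retrosheet play-by-play data.
    """
    totals = defaultdict(int)  # season -> total fouls
    games = defaultdict(int)   # season -> games counted
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            season = int(row["season"])
            totals[season] += int(row["fouls"])
            games[season] += 1
    return {s: totals[s] / games[s] for s in sorted(totals)}

if __name__ == "__main__":
    for season, avg in season_foul_averages("al_fouls_by_game.csv").items():
        print(f"{season}: {avg:.2f} fouls per game")
```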

 

What the data does suggest, though not conclusively, is that a shorter season significantly raises foul ball rates.


In the strike-shortened 1994 MLB season, when teams averaged about 114 games and there was no postseason, the league average climbed nearly two fouls per game above the previous high of 45.73 set in 1991.

 

Legend: Red: highest total for season; Blue: lowest total for season; Orange: 1994 and 1995 were shorter seasons due to strike; Green: year strike zone was increased.

 

 

This 1.8 increase in the average indicates that more fouls are hit during the early part of the season than in the latter third. This makes sense: early on, many hitters are seeing a pitcher for the first time and are still adjusting their swings. As the season progresses, fewer fouls are hit because hitters develop a better understanding of each pitcher's tendencies and can lay off more close pitches than they could earlier.

 

By the same token, pitchers develop stronger pitches and a better read on batters' habits as the season goes on.


This conclusion is supported by the high rate of fouls in the 1995 season, which was also shortened by the strike, though teams played an average of 144 games. As the data shows, the gap between the 1995 average and the highest pre-strike average is still in excess of one foul, 1.06 to be precise.

 

Given the roughly one-foul drop in the average, the following seems like a reasonable breakdown of how the rates worked between the 1988 and 1995 seasons (a rough check of the arithmetic follows the list):

 

  • High rate through the first 114 games.
  • The next 30 games lower the average by about 1 foul.
  • The remaining 18 games drop the average by nearly 1 more foul.
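As a rough check on that breakdown, we can back out what the per-game rate in each later segment would have to be for the cumulative averages to move as described. This is only a sketch: it treats the approximate figures from this post (about 47.5 fouls per game over the first 114 games, 46.79 over 144 games, and the pre-strike high of 45.73 over a full 162) as cumulative averages within a single season, the same simplification the breakdown above makes.

```python
def implied_segment_rate(games_long, avg_long, games_short, avg_short):
    """Per-game rate implied over the additional games, given cumulative
    averages for a shorter and a longer stretch of the season."""
    extra_games = games_long - games_short
    return (games_long * avg_long - games_short * avg_short) / extra_games

# Approximate averages taken from this post (not exact Retrosheet figures).
AVG_114 = 47.5    # 1994: ~114 games, no postseason (45.73 + ~1.8)
AVG_144 = 46.79   # 1995: ~144 games (45.73 + 1.06)
AVG_162 = 45.73   # highest full-season average before the strike (1991)

mid = implied_segment_rate(144, AVG_144, 114, AVG_114)   # games 115-144
late = implied_segment_rate(162, AVG_162, 144, AVG_144)  # games 145-162

print(f"Implied rate over games 115-144: {mid:.1f} fouls per game")
print(f"Implied rate over games 145-162: {late:.1f} fouls per game")
```

The later segments have to run several fouls below the early-season rate to drag the cumulative average down by roughly one foul at each step, which fits the idea that foul rates fall off late in the year.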


When we consider that teams in contention will watch more pitches because they need to win, and that teams completely out of the postseason running will simply not work as hard and will let more rookies bat, the conclusion seems reasonable. Because the 1994 season had no postseason, no team truly cared about the season. It was, for all intents and purposes, a wasted, null season. When baseball returned in 1995 with a postseason, we see the rate drop by one foul on average, as batters became choosier about what they swung at.

 

It seems clear that while domes might not affect foul ball rates, shorter seasons do. The only way to know for sure is to cut the season down to 114 or 144 games and see whether the same pattern appears: if the earlier games show a higher rate and the later games a lower one, then we can confirm it.

 

Regardless, the facts are intriguing.

 

 

Note: All data is from retrosheet.org sources. Some teams' records are incomplete, so foul ball rates will be marginally inaccurate. Based on the missing data, I estimate the averages would shift by no more than +/-2 fouls if all statistics were available.