A few months back I was emailed a very intriguing question. This is the email:
I was trying to figure out a question having to do with foul balls, came across your site, and figured you might have some thoughts on the subject.
Beyond fan injuries from foul balls, a popular issue in sports is the increasing length of games. My father has an interesting theory that one of the causes might be that there is less foul territory. The smaller
the foul territory, the fewer foul ball outs. The fewer foul ball outs, the longer at bats last. The longer at bats last, the longer the game lasts.
Based on your article on foul territory I took the 7% difference in foul ball outs from pre-y2k and post-y2k and multiplied it by the number of average foul balls per game (40). From this I took that roughly 3 foul ball outs per game were being lost. Assuming that a foul ball into the stands as opposed to a foul ball out equates to an extra 2 pitches in that at bat, and that the average time between pitches is about 22 seconds, you’re looking at a couple extra minutes of game play. Nothing drastic.
What are your thoughts?
My dad [Donnie] started thinking about it because as Dodgers fans we’ve watched the foul territory disappear from that stadium through renovations and subsequently it has become less of a pitcher’s ballpark than it used to be. I cannot think of any other stadiums that have been similarly renovated (maybe the Angels). I don’t know if the numbers can be tracked, but stadium renovations may be skewing the amount of discrepancy down in foul ball outs between older and newer stadiums.
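Kevin's back-of-the-envelope math can be sketched in a few lines. All of the inputs here are his figures from the email (7% fewer foul outs, roughly 40 foul balls per game, 2 extra pitches per lost out, about 22 seconds between pitches), not mine:

```python
# Kevin's back-of-the-envelope estimate, using the figures from his email.
FOULS_PER_GAME = 40            # his average foul balls per game
LOST_OUT_RATE = 0.07           # the 7% pre-/post-Y2K difference
EXTRA_PITCHES_PER_LOST_OUT = 2 # his assumed extra pitches per lost out
SECONDS_PER_PITCH = 22         # his average time between pitches

lost_outs = FOULS_PER_GAME * LOST_OUT_RATE              # ~2.8, call it 3
extra_seconds = lost_outs * EXTRA_PITCHES_PER_LOST_OUT * SECONDS_PER_PITCH
extra_minutes = extra_seconds / 60                      # ~2 minutes per game

print(f"lost foul outs per game: {lost_outs:.1f}")
print(f"extra time per game: {extra_minutes:.1f} minutes")
```

That lands right where Kevin did: roughly three lost outs and a couple of extra minutes per game.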
Kevin and his father Donnie raise a very interesting question, and as far as I could find, nobody else has considered it.
When I finally returned to my computer (I was on vacation at the time), I looked into his numbers. This turns out to be a remarkably astute theory. Though it took me months to work through, Donnie and Kevin are definitely on to something: game times have increased as foul territory has shrunk.
While the numbers used in the initial analysis are marginally off, what matters is the total number of foul balls and how many of them end up in
the stands, based on standard trajectory models like the one put forth by Gil Fried in his work on foul balls and the baseball trajectory calculator developed by Alan M. Nathan of the University of Illinois. The current historical average I have is 46 foul balls slapped per game. Based on the trajectory models, at least 20 of those balls will find their way into the stands in parks that, unlike the Oakland Coliseum, don't have huge foul territories. These are balls hit into the seats as bounces, line drives, or pop-ups/flies. My calculations echo those Bob Gorman, author of Death at the Ballpark, made five years ago.
But to simplify things, I went back to my original foul territory piece, which I wrote in response to Eno Sarris' story, "Is Fouling Off Pitches a Skill?", on a similar point. In that post I wrote:
To determine whether newer stadiums have less foul territory than older parks, I averaged two stadium age groups (1900s and 2000s) to see if there’s an indication of significant changes. There was.
1900s Parks: The total number of foul outs for the 16 parks built prior to 2000 is 3430. This number divided by the total parks (16) equals an average of 214.38 foul outs per park.
Y2K Parks: In parks built in 2000 and later, the numbers are very telling. The total during the period Sarris' research covered is 2793 foul outs. Spread across those 14 parks, the average comes to 199.5 outs per park.
[My data] indicates that in pre-Y2K parks there is, on average, more foul territory than in parks built during and after Y2K. In fact, the data bears out a surprising 7% difference in space. I’d walked into this assuming the opposite—that new parks had more space to move.
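The averages quoted above can be re-derived directly. One caveat: the post-2000 park count isn't stated in the excerpt, so the 14 below is inferred from the quoted average (2793 / 199.5 = 14):

```python
# Re-deriving the quoted per-park averages and the ~7% gap.
# The 16 pre-2000 parks are stated; the 14 post-2000 parks are
# inferred from the quoted average (2793 / 199.5 = 14).
pre_2000_outs, pre_2000_parks = 3430, 16
post_2000_outs, post_2000_parks = 2793, 14

pre_avg = pre_2000_outs / pre_2000_parks     # 214.375 -> 214.38
post_avg = post_2000_outs / post_2000_parks  # 199.5

pct_diff = (pre_avg - post_avg) / pre_avg * 100  # ~6.9%, rounded to 7%
print(f"{pre_avg:.2f} vs {post_avg:.1f} foul outs per park "
      f"({pct_diff:.1f}% difference)")
```

The gap comes out to about 6.9%, which is the "surprising 7% difference" in the excerpt.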
When Kevin and Donnie did their analysis, they were right that more balls travel into the stands, and out of foul-out range, in the newer stadiums. With 7% less territory, that is an accurate assumption. After some additional analysis of their initial numbers against what I could ascertain from my research, it turns out those numbers DO contribute to longer games. But not nearly as much as one might think.
As Kevin asserts in his email, about three foul ball outs per game are lost at fields with less foul territory. The baseball fan in me wants to ardently agree with him on this point. Regrettably, the data doesn't appear to support that conclusion.
Based on the results from my previous piece, the difference in TOTAL foul ball outs, despite the number differences and the clear impact of smaller territory on foul-out rates, is marginal. In parks built before 2000, foul ball outs average 2.6 per game. In parks built in 2000 and later, the average is 2.5 foul ball outs per game.
My explanation for this statistically insignificant difference is that many foul balls that travel into the stands don't go deep enough to be out of reach of players who lean over into the seats to snag them. Essentially, players can still reach most of the balls in that 7% sliver of difference. The lost territory amounts to roughly three to four rows of seats, and balls landing in the first couple of those rows aren't completely out of reach. A player who is six feet tall or more can easily get to the second row in nearly all parks. That cuts the effective loss of territory to only about 3.5%. Simply put, because so many players can reach into the first two rows, there's no statistically significant difference in foul ball outs; the 7% difference, in this case, is insignificant.
Of course, this data also means that one-tenth (.1) of an out has to be made up some other way. As every baseball fan knows, three outs make a half inning, and a half inning can run 30 minutes or more. Thus the conclusion is simple: by building newer stadiums with roughly 7% less foul territory, teams have extended each game by .1 outs. Over the course of 81 home games, that means nearly three innings have to be made up. That's roughly an hour lost to the smaller foul territory in newer parks. Since there is no set time for an out, we can't know exactly how much any single game is extended by the .1 difference. But seeing as the average MLB game in 2014 ran 3:08, the data bears out that the difference adds essentially nothing, time-wise, to a game.
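The season-long arithmetic can be checked directly. The only inputs are figures from above: the 2.6 vs 2.5 foul outs per game, 81 home games, and the 2014 average game length of 3:08 over nine innings:

```python
# Season-long impact of 0.1 fewer foul outs per game,
# using the 2014 average game length of 3:08 for nine innings.
outs_lost_per_game = 2.6 - 2.5          # ~0.1 outs
home_games = 81
game_minutes = 3 * 60 + 8               # 3:08 -> 188 minutes
minutes_per_inning = game_minutes / 9   # ~21 minutes
minutes_per_out = minutes_per_inning / 3

season_outs = outs_lost_per_game * home_games    # ~8.1 outs
season_innings = season_outs / 3                 # ~2.7 innings
season_minutes = season_outs * minutes_per_out   # just under an hour
per_game_minutes = season_minutes / home_games   # well under a minute

print(f"{season_outs:.1f} outs ~ {season_innings:.1f} innings "
      f"~ {season_minutes:.0f} minutes over a season")
```

That works out to roughly 8.1 outs, or about 2.7 innings, across a season of home games, in line with the "roughly an hour" above.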
I decided to go a bit deeper with this number and break the .1 down another way, to figure out how much time a .1-out difference would add to a game. Based on those 2014 numbers, each inning takes roughly 21 minutes to complete, or about 7 minutes per out. This means those 8.1 outs (2-2/3 innings) add a total of about 57 minutes across the 81 home games. That works out to roughly .7 minutes per game; less than a minute.
In 2010, the average game was about 13 minutes shorter, in case you were wondering.
What this means is a minimal jump in souvenirs for fans, as well as marginally longer games.
I was sure that Kevin and his family were dead on and that they'd discovered a major culprit in the longer game conundrum. While the data bears out that the smaller territory DOES prolong the game, it's by less than a minute per game. The pitch clock more than makes up for that minor extension.
So, while there are more balls traveling into the stands, there’s no evidence that the 7% change in foul territory in new stadiums has any statistically significant impact on the game as it relates to foul ball outs.