One of the basic tenets of sports economics at this point is that there is zero evidence that whether a city plays host to sports teams has any measurable effect on its local economic health. That was the finding back in the 1990s when Joanna Cagan and I first started researching sports stadiums, and it’s continued to be the case in more recent studies — though as academic researchers have told me, it’s hard to get new studies approved when studying the economic impact of sports teams is like studying whether gravity makes things fall down.
So it was more than a little surprising when I was alerted (thanks, David!) to a new study out of Ball State University that claims that investing in luring and retaining pro sports teams has a long-term payoff for cities. According to the university’s press release, Census Bureau data show that if you look at certain long-term metrics, hosting a sports team “has a positive effect on the GDP and the population of the metropolitan area.”
There were some red flags, certainly — for starters, nobody on the study team appears to have any background in economics, the authors being instead an assemblage of IT professors and business students — but still I figured this would be worth checking into. And then I got a look at the actual study itself, and I knew that I had to head straight here to share it with you, because this is one of the most hilariously incompetent pieces of academic work you could ever expect to see. Let’s skip straight to the “methodology” section, which is really where you want to start in any economic research paper:
We argue that not all economic benefits are measurable with typical metrics. Nor are they tangible in terms of their visible impacts on the appearance of the city and its residents. Rather, the main impact of investing in large-scale sports facilities and also hosting professional sports teams appears in the long term.
Okay, so not “typical metrics,” but rather something “in the long term.” What’s your actual data, guys?
In order to see the effect of hosting professional sports teams on the local economy, the correlation between the sum of the number of teams and the GDP of the Metropolitan area is calculated which is given below.
Linear Correlation Table (Data Set #1)

                 GDP     SUM(Teams)
GDP             1.000       0.881
SUM(Teams)      0.881       1.000

Table 1. Correlation between Team Numbers and GDP
And … that’s it. This entire paper is based on the observation that if you look at which cities had the most economic growth from 2001 to 2018, and which cities have the most sports teams, they’re the same cities.
But, you know, of course they are. The whole reason you choose to put a sports team in a city is because it’s growing in relative size and economic activity — it’s why Rochester used to have an NBA team, but doesn’t anymore, and won’t anytime soon. Also, the last two decades have seen a historic rebound of the largest cities in particular, with well-off residents and companies alike recolonizing the metropolitan areas that they largely abandoned in the 1960s and ’70s. So if big cities are doing well overall, and big cities have the most sports teams, obviously there’s going to be a correlation there — it’s like observing that rich people tend to own the most yachts, and concluding that buying a yacht is a great way to get rich.
This is one of the most famous logical fallacies of all time, and is usually summed up as correlation does not imply causation. It has its own Wikipedia page, with some terrific examples, the best of which may be this one:
Sleeping with one’s shoes on is strongly correlated with waking up with a headache.
Therefore, sleeping with one’s shoes on causes headache.
The above example commits the correlation-implies-causation fallacy, as it prematurely concludes that sleeping with one’s shoes on causes headache. A more plausible explanation is that both are caused by a third factor, in this case going to bed drunk, which thereby gives rise to a correlation. So the conclusion is false.
I reached out to Ball State’s PR spokesperson to see if the authors would be available to answer some questions about their work, but I haven’t heard back. In the meantime, I talked to College of the Holy Cross economics professor Victor Matheson, who agreed with me on the correlation vs. causation error, and added:
It is unfortunate that they didn’t understand the huge statistical error they were making here, but that’s what being a student is all about – learning by doing and recognizing mistakes so you don’t make them again in the future. What is concerning here, however, is that two Ball State professors also put their name on a paper that is clearly wrong and allowed it to be issued in such a way that the general public could read it despite its obvious flaws. Perhaps they were just trying to do something out of their area of expertise and didn’t understand why the analysis was wrong, but I certainly wouldn’t put my name on a paper about computer technology or information systems unless I was very sure that I wasn’t doing something that would instantly make me look very foolish.
If there’s a silver lining here, it’s that at least this paper doesn’t seem to have resulted in a flurry of media coverage that grabs at the man-bites-dog nature of the headline without looking into whether the paper itself makes any damn sense. In fact, I thought twice about whether writing about this would be giving the paper more attention than it deserved — but it is still a fascinating case study in how bad ideas can take on a momentum of their own, eventually enlisting an entire public relations apparatus to put them out into the world and defend them. Thank goodness we have professional journalists out there to tell legitimate research from utter gibberish — oh wait…