Create, Replicate, Debate: Economic Research in Light of Reinhart & Rogoff

May 9, 2013

In recent weeks, the world of economics has been abuzz with news that a graduate student had discovered an error in an influential study by Harvard economists Carmen Reinhart and Kenneth Rogoff. This study, which initially indicated that economic growth stagnated when government debt reached very high levels, had been cited in support of austerity measures implemented across Europe. However, a recent attempt to replicate the study by a graduate student at the University of Massachusetts Amherst determined that the paper contained a number of flaws that, when corrected, appeared to overturn some of its key results.

The significance of these findings is currently being hotly debated.  However, while commentators have assessed the impact of this error as somewhere between having no effect and having caused mass unemployment, the reality is that understanding the effect of the study on the global economy, like understanding the effect of debt on growth, is complicated.

More generally, this controversy has raised important questions about quality control in economic research, as well as about how such research should be used in policy-making. On the first point, which is not a new issue, a quote from Henry Ford comes to mind: “quality means doing it right when no one is looking.” That is, there are no shortcuts to good empirical work. When a study is free of technical errors, relies on defensible assumptions, and applies relevant analytical tools to appropriate data, discussion can be focused on substantive areas where reasonable people can disagree.

The debate on the second point is more difficult. Many economic studies are theoretically or empirically complex and rely on large data sets and advanced analytical techniques. As such, policy-makers and stakeholders who rely on these studies to make decisions must assess them as non-experts. In the wake of Reinhart and Rogoff, Betsey Stevenson and Justin Wolfers, two economists at the University of Michigan, have published a useful set of guidelines to help non-economists assess economic research in the context of policy-making.

Good economic analysis should follow the scientific method and apply statistical tools to test a clearly specified hypothesis. However, it is important to keep in mind that no single piece of research provides a definitive explanation of economic behavior. Economic analysis can lend evidence for (or against) the existence of relationships between real-world phenomena, but it is limited in its ability to determine whether those relationships are causal.

For example, high levels of government debt may (or may not) be correlated with a slow-down in a country’s economic growth. However, even a study that finds a statistically significant relationship between these two variables should not be cited as having conclusively proven that high levels of borrowing cause stagnating growth. In fact, it may not necessarily even make sense to conclude that any single relationship between these two factors exists under all circumstances. For example, if two countries are at an identical level of debt relative to GDP, but one government spends wastefully while the other invests in education and infrastructure, it is reasonable to expect that the returns on the borrowed funds will differ.
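The point that correlation need not imply causation can be made concrete with a small simulation. The sketch below is purely illustrative and has nothing to do with the Reinhart and Rogoff data: two simulated series, labeled “debt” and “growth” only for the sake of the analogy, are each driven by a hidden common factor. Neither causes the other, yet they exhibit a strong negative correlation.

```python
import numpy as np

# Illustrative only: two variables that share a hidden common driver
# (for example, a financial crisis that raises debt AND lowers growth).
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)                          # unobserved common factor
debt = 0.8 * z + rng.normal(scale=0.5, size=n)  # responds positively to z
growth = -0.8 * z + rng.normal(scale=0.5, size=n)  # responds negatively to z

r = np.corrcoef(debt, growth)[0, 1]
print(f"correlation: {r:.2f}")  # strongly negative, yet neither variable causes the other
```

A naive reading of the output would conclude that “debt” depresses “growth,” when in fact both merely respond to the same underlying shock. This is exactly the trap the paragraph above warns against.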

Although rigorous empirical analysis is essential to guiding the debate on economic policy, it is important to remember that no single study holds the absolute truth about past—much less future—outcomes. Policy makers (and their economists) should objectively assess economic research before relying on it, asking critical questions about the assumptions and findings, evaluating the limitations of the research, and considering whether the assumptions and data apply to their specific facts and circumstances.
