Approximately six out of 10 economics studies published in the field’s most reputable journals — American Economic Review and the Quarterly Journal of Economics — are replicable, according to a study published today in Science.
The authors repeated the experiments of 18 papers published between 2011 and 2014 and found that 11 — approximately 61% — lived up to their claims. But the replicated effects were on average only 66% as large as those reported in the original studies, which suggests that the authors of the original papers may have overstated the trends they reported.
Colin Camerer, a behavioral economist at the California Institute of Technology in Pasadena, who co-authored the study, “Evaluating replicability of laboratory experiments in economics,” told us:
Four clearly failed to replicate, three were near misses (large effects but not highly significant by conventional p-value) and 11 replicated rather well.
As he and his co-authors note in the paper:
…replication in this sample of experiments is generally successful, though there is room for improvement.
Interestingly, before the replications were conducted, the authors had peers (working experimental economists) trade in prediction markets forecasting which studies would replicate. They also surveyed the traders on their beliefs about the probability of replication.
On average, the market prediction of the replication rate was 75.2%, and the survey belief was 71.1%, both of which turned out to be higher than the actual replication rate of 61.1%.
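The gap between the forecasts and the outcome is simple arithmetic. As a minimal sketch (the counts and percentages below are the ones reported above; variable names are illustrative):

```python
# Figures reported in the study: 18 studies attempted, 11 replicated,
# plus the prediction-market and survey forecasts of the replication rate.
n_studies = 18
n_replicated = 11

replication_rate = n_replicated / n_studies * 100  # observed rate, in percent
market_prediction = 75.2   # average prediction-market forecast (%)
survey_belief = 71.1       # average surveyed belief (%)

print(f"Observed replication rate: {replication_rate:.1f}%")
print(f"Market forecast exceeded outcome by {market_prediction - replication_rate:.1f} points")
print(f"Survey belief exceeded outcome by {survey_belief - replication_rate:.1f} points")
```

So 11 of 18 gives the 61.1% figure, and both forecasts overshot it by roughly 10–14 percentage points.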
Here is the link to the paper (the above link didn't work for me): http://science.sciencemag.org/content/early/2016/03/02/science.aaf0918.abstract. It's gated, so I don't know which studies replicated and which ones did not.
Here is what they said about the paper at Caltech:
The authors suggest that certain methodological practices in laboratory experimental economics contribute to the high replication rate. "It seems that the culture established in experimental economics—incentivizing subjects, publication of the experimental procedure and instructions, no deception—ensures reliable results. This is very encouraging given that it is a very young discipline," says Michael Kirchler, another coauthor and collaborator from the University of Innsbruck.
"As a journal editor myself, we are always curious whether experimental results will replicate across populations and cultures, and these results from multiple countries are really reassuring," says coauthor Teck-Hua Ho from the National University of Singapore.
Coauthor Magnus Johannesson from the Stockholm School of Economics adds, "It is extremely important to investigate to what extent we can trust published scientific findings and to implement institutions that promote scientific reproducibility."
"For the past half century, Caltech has been a leader in the development of social science experimental methods. It is no surprise that Caltech scholars are part of a group that use replication studies to demonstrate the validity of these methods," says Jean-Laurent Rosenthal, the Rea A. and Lela G. Axline Professor of Business Economics and chair of the Division of the Humanities and Social Sciences at Caltech.