Honesty and Transparency Are Not Enough
There has been a replication crisis in applied statistics (most notably in psychology, but also in social science and medical research), in which studies published in top scientific journals “fail to replicate”; that is, outside research teams are unable to reproduce published claims.
Sometimes replications are difficult because the underlying data and code are inaccessible. Data can be withheld for confidentiality reasons; because a researcher does not want to admit other interpretations or codings (as when primatologist Marc Hauser refused to let his research assistants view his monkey videos); because the data simply are not there (as with political science student Michael LaCour’s fake survey on attitudes toward same-sex marriage, or the survey on gun control that economist John Lott says was entirely lost in a computer crash); or for reasons that remain unclear (as in a much-publicized 2006 survey of mortality in Iraq; Spagat, 2014).
In political science and economics, it is common practice to work with data that are publicly available but take some effort to obtain and clean, effort that is hidden in the public record. This leads to problems such as the famous “Excel error” of economists Reinhart and Rogoff, whose conclusions from a much-publicized 2010 paper turned out to depend on an embarrassing data processing error that went uncaught for years, in part because data sharing is not the scientific norm. Any outside researchers who questioned Reinhart and Rogoff’s claims had to obtain the data themselves and reconstruct the analyses from scratch, so it took time for the failure to be discovered.