And, at the very least, every author should know that the data they submit to journals must not be made up:
Last week, [LaCour and Green's] finding that gay canvassers were in fact powerfully persuasive with people who had voted against same-sex marriage — published in December in Science, one of the world’s leading scientific journals — collapsed amid accusations that Mr. LaCour had misrepresented his study methods and lacked the evidence to back up his findings.
On Tuesday, Dr. Green asked the journal to retract the study because of Mr. LaCour’s failure to produce his original data. Mr. LaCour declined to be interviewed, but has said in statements that he stands by the findings.
The case has shaken not only the community of political scientists but also public trust in the way the scientific establishment vets new findings. It raises broad questions about the rigor of rules that guide a leading academic’s oversight of a graduate student’s research and of the peer review conducted of that research by Science.
New, previously unreported details have emerged that suggest serious lapses in the supervision of Mr. LaCour’s work. For example, Dr. Green said he had never asked Mr. LaCour to detail who was funding their research, and Mr. LaCour’s lawyer has told Science that Mr. LaCour did not pay participants in the study the fees he had claimed.
Dr. Green, who never saw the raw data on which the study was based, said he had repeatedly asked Mr. LaCour to post the data in a protected databank at the University of Michigan, where they could be examined later if needed. But Mr. LaCour did not.
“It’s a very delicate situation when a senior scholar makes a move to look at a junior scholar’s data set,” Dr. Green said. “This is his career, and if I reach in and grab it, it may seem like I’m boxing him out.”
But Dr. Ivan Oransky, a co-founder of “Retraction Watch,” which first published news of the allegations and Dr. Green’s retraction request, said, “At the end of the day he decided to trust LaCour, which was, in his own words, a mistake.” ...
Critics said the intense competition by graduate students to be published in prestigious journals, weak oversight by academic advisers and the rush by journals to publish studies that will attract attention too often led to sloppy and even unethical research methods. The now disputed study was covered by The New York Times, The Washington Post and The Wall Street Journal, among others.
“You don’t get a faculty position at Princeton by publishing something in the Journal Nobody-Ever-Heard-Of,” Dr. Oransky said. Is being lead author on a big study published in Science “enough to get a position in a prestigious university?” he asked, then answered: “They don’t care how well you taught. They don’t care about your peer reviews. They don’t care about your collegiality. They care about how many papers you publish in major journals.”
via www.nytimes.com
Here is what seems to have happened leading up to publication of this paper:
- Junior scholar approaches senior scholar with an idea
- Senior scholar is happy to be a co-author
- Junior scholar makes up data
- Senior scholar says post it at, I'm guessing, ICPSR
- Junior scholar does not post the data at ICPSR
- Senior scholar does not at any point in the process demand to see the data ("grab"? -- this suggests that collaborating scholars are thieves and there is no honor among thieves)
- Senior scholar isn't curious about who funded the study
- Senior scholar allows the paper to be submitted to Science, which isn't a second-tier political science journal
- The paper is discovered to be a fake, which was almost inevitable
These are not scholars at regional state universities (one of them found that potential outcome very unattractive). This is the big time, and the behavior is audacious.
I've requested data from authors several times. Here are responses I've gotten:
- Unanswered emails
- Outright refusals because "we are still mining the data"
- Receipt of the data and enough documentation to attempt replication
Number three is the only ethical response. The whole issue would be avoided if making the data available were a requirement for publication. JAERE doesn't have a data policy in its instructions for authors. Neither do Resource and Energy Economics, JEEM, nor EARE. Only Land Economics has a data policy:
It is the policy of Land Economics to publish papers only on the condition that the data used in the analysis are: (1) clearly and precisely documented; (2) readily available to any researcher for purposes of replication; and (3) sufficiently detailed in the specifics of computation to permit replication. Appearance of an article in Land Economics constitutes evidence that authors understand these conditions and will abide by the stated requirements.
Here is the AER's Data Availability Policy. In short:
As soon as possible after acceptance, authors are expected to send their data, programs, and sufficient details to permit replication, in electronic form, to the AER office.
I think that every economics journal should adopt this policy. The benefit is that data are made available, results can be replicated, and the social scientific endeavor is strengthened. The only cost, I think, is that authors must spend extra time putting their data into a format that someone else can understand. It is mostly an opportunity cost that will reduce the number of papers written in the long run. My guess is that the papers that don't get written would be the lowest-quality papers, so it really isn't much of a cost at all. The cost might even be a benefit (and, yes, I'm thinking of some of my own papers).