Replication, Replication

Being a good undergrad, I walk around with the smug face of a scientific zealot – it’s easy to criticise research practices when you aren’t running any studies yourself.1 The critical approach to experimental methods and statistical analyses, however, was also one factor that attracted me towards psychology. My Intro to Psych teacher started the first class with a picture of Sigmund Freud: “You know this guy? I’m gonna trash him!” And so he did, and the entire field of personality research with him. Other lecturers have been more subtle, but most have encouraged me to examine research findings carefully and with a critical eye.2

Last semester, being one of the rare people who find true joy in the depths of spreadsheets, I took a course called ‘Advanced Statistics’. It was taught by Dr. Maurits de Klepper, who is Amsterdam University College’s Teacher of the Year – and deservedly so. Maurits is not just an amazingly committed educator; he’s also a fervent defender of scientific integrity. So while the class covered some of the more advanced statistical methods, we teamed up to put our skills to use in replicating the analysis of a previously published paper of our choice.

There was a curious atmosphere about the class – who, after all, would take a non-compulsory applied statistics course if not the greatest geeks? – such that none of the groups stopped at merely replicating the analysis of their chosen target: all of them improved on it. And improvement was necessary! It turned out that none of the six papers – from psychology, economics, and public health – lived up to the standards Maurits had taught us (see my paper with my friend Zuzanna Fimińska here). If a bunch of undergrads with a liking for statistics can so easily find fault with the work of, in part, highly regarded researchers, doesn’t that point to a severe problem?

So this is what I had in the back of my mind when reading the latest issue of Perspectives on Psychological Science (no subscription required), which is entirely devoted to replication. The editors describe a dire situation of rising doubts in the discipline, fuelled by a series of highly publicised cases of fraud and studies showing just how widespread improper practices are:

“These doubts emerged and grew as a series of unhappy events unfolded in 2011: the Diederik Stapel fraud case, the publication in a major social psychology journal of an article purporting to show evidence of extrasensory perception followed by widespread public mockery, reports by Wicherts and colleagues that psychologists are often unwilling or unable to share their published data for reanalysis, and the publication of an important article in Psychological Science showing how easily researchers can, in the absence of any real effects, nonetheless obtain statistically significant differences through various questionable research practices (QRPs) such as exploring multiple dependent variables or covariates and only reporting these when they yield significant results.”3 The list continues.
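Just to see the mechanics for myself, here is a quick simulation of one such practice – the setup and numbers are my own sketch, not taken from the quoted paper: two groups are drawn from exactly the same distribution, four dependent variables are measured, and a ‘finding’ is declared whenever any one of them happens to come out significant.

```python
# Toy simulation (my own, not from the paper) of one questionable research
# practice: measuring several dependent variables and reporting whichever one
# turns out significant. There is no true effect anywhere in the data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_simulations = 10_000   # simulated "studies"
n_per_group = 20         # participants per condition
n_dvs = 4                # dependent variables peeked at per study
alpha = 0.05

false_positives = 0
for _ in range(n_simulations):
    # Both groups come from the same distribution: the null hypothesis is true.
    control = rng.normal(size=(n_per_group, n_dvs))
    treatment = rng.normal(size=(n_per_group, n_dvs))
    # Test every DV and count the study as a "finding" if any test is significant.
    p_values = [stats.ttest_ind(control[:, i], treatment[:, i]).pvalue
                for i in range(n_dvs)]
    if min(p_values) < alpha:
        false_positives += 1

print(f"Nominal alpha: {alpha}")
print(f"Observed false-positive rate with {n_dvs} DVs: "
      f"{false_positives / n_simulations:.3f}")
# With four independent DVs the rate lands near 1 - 0.95**4 ≈ 0.19 –
# almost four times the advertised 5%.
```

Twenty lines of code, and the nominal 5% error rate has quadrupled – and that is just one of the practices on the list.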

What makes a cure even more elusive is that psychology has its very own problem with replication: it is becoming ever rarer.4 Even for graduate students, merely replicating existing findings is a thankless endeavour, scarcely rewarded and discouraged by what Christopher J. Ferguson and Moritz Heene call an “aversion to the null” in psychology.5 If it ain’t significant, it ain’t gonna be published. The problem goes so far that it appears appealing to let undergraduates do the work.6 But when discussing my capstone project, even I was discouraged from merely replicating known findings.

It can be disheartening to see how far psychological research is from the ideal I am being taught – especially since many researchers don’t seem to want to change much about it. One paper by Heather M. Fuchs, Mirjam Jenny, and Susann Fiedler investigated how researchers feel about stricter requirements for studies.7 It is entitled “Psychologists Are Open to Change, yet Wary of Rules”, and it shows that many researchers endorse stricter good practices – but not rules for publication. But what is that change supposed to look like? It gets even worse if you look at the statements themselves. For instance, not even half of the respondents (n = 1292, so this is not just a quick asking-around-in-the-department) thought it should be good practice that “Authors must collect at least 20 observations per cell or else provide a compelling cost-of-data-collection justification.” This is one of those famous ‘you learn that in your first statistics class’ statements that you would hope to be deeply ingrained in the mind of anybody who made it as far as grad school, not to speak of tenure.8 Yes, it might not be feasible in some disciplines, especially neuropsychology9, but denying that it is good practice?
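To put a number on why that rule matters, here is a back-of-the-envelope power check of my own – the assumed effect size (a medium effect, Cohen’s d = 0.5) is my choice for illustration, not a figure from the paper.

```python
# Rough power check (my own sketch): with the "20 observations per cell"
# minimum and an assumed medium effect (Cohen's d = 0.5), how often does a
# two-sample t-test actually reach p < .05?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_simulations = 10_000
alpha = 0.05
effect_size = 0.5  # assumed medium effect, in standard-deviation units

def power(n_per_group: int) -> float:
    """Estimate power by Monte Carlo: fraction of simulated studies with p < alpha."""
    hits = 0
    for _ in range(n_simulations):
        control = rng.normal(0.0, 1.0, n_per_group)
        treatment = rng.normal(effect_size, 1.0, n_per_group)
        if stats.ttest_ind(control, treatment).pvalue < alpha:
            hits += 1
    return hits / n_simulations

for n in (10, 20, 40, 64):
    print(f"n = {n:3d} per cell -> power ≈ {power(n):.2f}")
# Roughly 0.18, 0.33, 0.60 and 0.80: even 20 per cell detects a medium effect
# only about a third of the time, so the rule is a floor, not a luxury.
```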

Then again, there are also signs of positive change, not least the current issue of Perspectives on Psychological Science and the ongoing debate that has motivated it. Elsewhere, the medical researcher John Ioannidis has made an iconoclastic career out of his claim that “most published research findings are false”.10 And the Internet, too, opens up new opportunities. The Reproducibility Project is one of them: an attempt at large-scale, open collaboration to reproduce prominent findings, pooling small efforts from what might become hundreds of institutes.

  1. Which of course is not exactly true in my case.
  2. This, by the way, was also what deterred me most strongly from the study of economics: there appears to be little space in undergraduate curricula for research methodology, not to speak of critical questioning. It is not just that assumptions are made rather implicitly; I have rarely found them defended with arguments when I asked.
  3. Pashler, H., & Wagenmakers, E.-J. (2012). Editors’ Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? Perspectives on Psychological Science, 7(6), 528-530. doi:10.1177/1745691612465253
  4. Ferguson, C. J., & Heene, M. (2012). A Vast Graveyard of Undead Theories: Publication Bias and Psychological Science’s Aversion to the Null. Perspectives on Psychological Science, 7(6), 555-561. doi:10.1177/1745691612459059
  5. ibid.
  6. Grahe, J. E., Reifman, A., Hermann, A. D., Walker, M., Oleson, K. C., Nario-Redmond, M., & Wiebe, R. P. (2012). Harnessing the Undiscovered Resource of Student Research Projects. Perspectives on Psychological Science, 7(6), 605-607. doi:10.1177/1745691612459057
  7. Fuchs, H. M., Jenny, M., & Fiedler, S. (2012). Psychologists Are Open to Change, yet Wary of Rules. Perspectives on Psychological Science, 7(6), 639-642. doi:10.1177/1745691612459521
  8. Another one that comes up time and again is statistical significance. How often have I read sentences à la “X was faster/higher/stronger than Y, but the difference was not significant”. If you are writing this, you’ve either slept through Stats 101, or you’re trying to imply something your data simply does not support.
  9. Then again, it might spare us certain neuropsychology studies and the associated tabloid headlines…
  10. Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine, 2(8), e124. doi:10.1371/journal.pmed.0020124

National Geographic Young Explorers Grant Workshop at McGill

Today I went to a workshop held by National Geographic at McGill about their Young Explorers grant program. Rolled out in 2006, the program has so far sponsored more than 200 researchers, journalists, and adventurers with nearly a million dollars.

National Geographic had brought three of their previous grantees to present their work. Andrea Reid talked about her research on the Nile perch in Lake Victoria, an invasive predator that has been described as “Darwin’s Nightmare” for its consequences for other species. Becca Skinner showed photos from a project in which she tried to trace the locations of pictures taken in the aftermath of the 2004 tsunami in Aceh, five years after the catastrophe.

I particularly loved the story of Amy Higgins. After graduating with a B.Sc. in Biology, she was working as a school teacher in the Ladakh region of Jammu and Kashmir when she heard a ‘myth’ about artificial glaciers. Being an avid hiker, she organised a school trip to the remote area and indeed found artificially created frozen lakes.

The artificial glaciers were built by Chewang Norphel, a retired Kashmiri engineer. Amy took the opportunity to intern with Mr. Norphel and started to learn from him. Creating an artificial glacier, she says, is “like building an ice skating rink in your backyard.”

With the help of two consecutive National Geographic Young Explorers grants, Amy started researching the impact of the artificial glaciers on local agriculture. They are used to store water for irrigation, which is otherwise scarce in the region. In the Master’s thesis she has just completed at Yale’s School of Forestry and Environmental Studies, Amy reports that Mr. Norphel’s constructions provide 20 to 40 additional days of irrigation and allow farmers to switch from barley to more profitable crops such as peas.

I loved Amy’s story because it’s a tale of immense serendipity, albeit one guided by her obvious curiosity and enthusiasm. Very, very cool.

The other speakers National Geographic had brought in included environmental anthropologist Kenny Broad, who was immensely funny one moment (“… if all else fails, go to Burning Man. […] If Burning Man fails, go surfing.”), only to talk about the dangers of field research and the friends he has lost on expeditions a minute later.

The day concluded with breakout sessions. In small groups, we could pitch our own project ideas to the old hands (I ended up with National Geographic’s Chris Thornton, who did an amazing job giving feedback, and Amy Higgins). There were a ton of really cool projects in my group – on the role of music in the identity of Sahrawi refugees in western Algeria, or on the lessons for conservation to be learned from Canada’s indigenous peoples, to name just two.

I can only encourage everybody working in a field science to have a look at National Geographic’s Young Explorers program. Grants of up to US$5,000 are available to applicants between 18 and 25 years of age. There are three sources of funding for different kinds of projects:

  1. The Committee for Research and Exploration funds scientific field research. Applicants may come from disciplines such as Geography and Biology, but also Anthropology.
  2. The Expeditions Council supports “explorations with story potential”. While these projects may have a scientific component, they should yield material for National Geographic’s many publications.
  3. The Conservation Trust funds on-the-ground conservation action. The emphasis of this program – the smallest of the three – lies heavily on innovative methods.

On occasion of the death of Elinor Ostrom

I was sad to read that Elinor Ostrom, political economist and 2009 Nobel Prize winner in economics, died this Tuesday. Since I came across her work years ago, she has been one of my favourite economists, and a great inspiration for my thinking about society. My decision to study behavioural sciences, including economics, I owe in part to her work.

With her studies of self-governing communities, Ostrom freed the commons of the attribute ‘tragic’, as the German newspaper taz writes. She showed that, between state control and privatisation, a third avenue for collective action exists. Bringing together rigorous economic theory and field observations collected by anthropologists, she mounted a powerful challenge to mainstream economics.

In recent years, Ostrom’s work gained reach beyond her initial case studies, which dealt with water basins, irrigation systems, and mountain meadows. Lawrence Lessig, in particular, expanded the idea of the commons to the digital realm. The licensing scheme he initiated, Creative Commons, is used on this blog – as on many others – to make virtual products available to us 21st-century kids and our read/write culture.

Despite my glowing reverence, and that of the many who unite under the banner of the commons in search of a better society, Ostrom’s work has always struck me as thorough and tentative. ‘Governing the Commons’, her major work and the one that established her reputation, reads not like the triumphal product of years of hard work, but like a research report. It combines the two attributes of a great researcher: a revolutionary vision and detailed empiricism.

For me as a social liberal and anti-surveillance activist, the role Ostrom attributes to monitoring and punishment in self-governing communities was an intellectual challenge when I first discovered her work. In the past months in particular, I have given much thought to the relationship between liberty and punishment. My academic work has recently focused intensively on altruistic punishment and its role in human cooperation, and I am looking forward to dedicating my senior thesis to this topic.

As much as Elinor Ostrom has influenced me academically, I also admire her life story. To this day, she is the only woman ever to have won the Nobel Prize in economics, and her biography betrays a sense of the hurdles she had to overcome on the way. Most notably, she received the prize although she never studied economics: when she applied for graduate school, she was turned down because women were deemed unfit for such a mathematical subject, and she ended up in political science.

Elinor Ostrom’s work has given me a vision of collective action and a society without hierarchies, but it has also warned me of the sacrifices such a community demands of its citizens. She has been an inspiration in many ways, and she will remain a model researcher to me in the future.
