Fieldnotes from Ghana

I am currently in Accra, Ghana for a cultural psychology project organised by the University of Amsterdam in cooperation with the University of Ghana at Legon. As part of the trip, I have been doing some data collection for the Ghana part of a large cross-cultural study on perceptions of power and leadership (run by Eftychia Stamkou of UvA). Here are three things I learned from collecting data for a survey experiment:

  • Participant recruitment is easy (at least sometimes): for our study, we recruited university students at the Legon campus of the University of Ghana. Much to my surprise, recruiting participants for an (unpaid) 20-minute survey was very easy: when we knocked on doors in the university dorms, most residents were happy to participate in our survey. (In contrast, a group recruiting teachers for an hour-long study had a much harder time finding willing participants.)
  • Asking about age can be sensitive: in discussing our questionnaire with our Ghanaian collaborators, we were told that questions about the participant’s age can pose some problems. For example, there is a perceived threshold marriage age – 25 years – and participants may feel uncomfortable indicating that they are older than that (although we did not ask about marital status). We resorted to letting participants indicate an age range instead, and they appeared happy with that.
  • Likert scales are not always intuitive: one of the most common survey response formats in psychological research is the Likert scale, often with five or seven points. Many of our participants seemed to have used our seven-point scales as three-point scales, however, with most or all answers on the options one, four, and seven. In a place where many participants may be unfamiliar with Likert scales, using a three-point scale (or a binary measure with a ‘don’t know’ option) may be preferable.
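Clustering like this is easy to screen for before analysis. A minimal sketch in Python – the function names and the exact recoding rule are my own illustration, not part of our study’s materials:

```python
from collections import Counter

def endpoint_midpoint_share(responses):
    """Fraction of 7-point answers falling on 1, 4, or 7.

    A value near 1.0 suggests the scale was used as a 3-point scale.
    """
    counts = Counter(responses)
    return (counts[1] + counts[4] + counts[7]) / len(responses)

def collapse_to_three(responses):
    """Recode 1-7 answers into a 3-point scale (1 = low, 2 = middle, 3 = high)."""
    return [1 if r <= 2 else (3 if r >= 6 else 2) for r in responses]

answers = [1, 7, 4, 4, 7, 1, 4, 2, 7, 1]
print(endpoint_midpoint_share(answers))  # 0.9
print(collapse_to_three(answers))
```

If the share of endpoint-and-midpoint answers is high across participants, collapsing to three points (or pre-registering a three-point scale to begin with) may be the safer analysis choice.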

Overall, my experience collecting data here in Accra has been quite positive – it certainly went much faster than expected. I suspect that collecting more representative data, or in more rural locations, will be more difficult; but that’s for some future experience. So far, I’ve learned some very useful things about doing research here, and about the research done here – more on that in my next post.


Replication, Replication

Being a good undergrad, I walk around with the smug face of a scientific zealot – it’s easy to criticise research practices when you aren’t running any studies yourself.1 The critical approach to experimental methods and statistical analyses, however, was also one factor that attracted me towards psychology. My Intro to Psych teacher started the first class with a picture of Sigmund Freud: “You know this guy? I’m gonna trash him!” And so he did, and the entire field of personality research with him. Other lecturers have been more subtle, but most have encouraged me to examine research findings carefully and with a critical eye.2

Last semester, being one of the rare people who find true joy in the depth of spreadsheets, I took a course called ‘Advanced Statistics’. It was taught by Dr. Maurits de Klepper, who is Amsterdam University College’s Teacher of the Year – and deservedly so. Maurits is not just an amazingly committed educator; he’s also a fervent defender of scientific integrity. So while the class covered some of the more advanced statistical methods, we teamed up to put our skills to use in replicating the analysis of a previously published paper of our choice.

There was a curious atmosphere about the class – who, after all, would take a non-compulsory applied statistics class, if not the greatest geeks? – such that none of the groups stopped at just replicating the analysis of their chosen target, but all improved on it. And improvement was necessary: it appeared that none of the six papers – from psychology, economics, and public health – lived up to the standards Maurits had taught us (see my paper with my friend Zuzanna Fimińska here). If a bunch of undergrads with a liking for statistics can so easily find fault with the work of (in part) highly regarded researchers, doesn’t that point to a severe problem?

So this is what I had in the back of my mind when reading the latest issue of Perspectives on Psychological Science (no subscription required), which is entirely devoted to replication. The editors describe a dire situation of rising doubts in the discipline, fuelled by a series of highly publicised cases of fraud and studies showing just how widespread improper practices are:

“These doubts emerged and grew as a series of unhappy events unfolded in 2011: the Diederik Stapel fraud case, the publication in a major social psychology journal of an article purporting to show evidence of extrasensory perception followed by widespread public mockery, reports by Wicherts and colleagues that psychologists are often unwilling or unable to share their published data for reanalysis, and the publication of an important article in Psychological Science showing how easily researchers can, in the absence of any real effects, nonetheless obtain statistically significant differences through various questionable research practices (QRPs) such as exploring multiple dependent variables or covariates and only reporting these when they yield significant results.”3 The list continues.

What makes a cure even more elusive: psychology has its very own problem with replication – it’s becoming ever rarer.4 Even for graduate students, merely replicating existing findings is a thankless endeavour, scarcely rewarded and discouraged by what Christopher J. Ferguson and Moritz Heene call an “aversion to the null” in psychology.5 If it ain’t significant, it ain’t gonna be published. The problem goes so far that it appears appealing to let undergraduates do the work.6 But when discussing my capstone project, even I was discouraged from merely replicating known findings.

It could be disheartening to see how far psychological research is from the ideal that I am taught – particularly as it appears that many don’t exactly want to change anything about it. One paper by Heather M. Fuchs, Mirjam Jenny, and Susann Fiedler investigated how researchers feel about stricter requirements for studies.7 The paper is entitled “Psychologists Are Open to Change, yet Wary of Rules”, and it shows that many researchers endorse stricter good practices – but not rules for publication. But what change is that supposed to be? It gets even worse if you look at the statements themselves. For instance, not even half of the respondents (n = 1292, so this is not just a quick asking-around-in-the-department) thought it should be good practice that “Authors must collect at least 20 observations per cell or else provide a compelling cost-of-data-collection justification.” This is one of those famous ‘you learn that in your first statistics class’ statements that you would hope to be deeply ingrained in the mind of anybody who made it as far as grad school, let alone tenure.8 Yes, it might not be feasible in some disciplines, especially neuropsychology9 – but denying that it is good practice at all?

Then again, there are also signs of positive change; not least the current issue of Perspectives on Psychological Science, and the ongoing debate that has motivated it. Elsewhere, the medical researcher John Ioannidis has made an iconoclastic career out of his claim that “most published research findings are false”.10 And the Internet, too, opens up new opportunities. The Reproducibility Project is one: an attempt at large-scale, open collaboration to reproduce prominent findings, pooling together small efforts at what might become hundreds of institutes.

  1. Which of course is not exactly true in my case.
  2. This, by the way, was also what deterred me most strongly from the study of economics: there appears to be little space in undergraduate curricula for research methodology, not to speak of critical questioning. Not only are assumptions made rather implicitly; when I asked, I rarely found them defended with arguments.
  3. Pashler, H. & Wagenmakers, E.-J. (2012). Editors’ Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? Perspectives on Psychological Science, 7(6), 528-530. doi:10.1177/1745691612465253
  4. Ferguson, C. J. & Heene, M. (2012). A Vast Graveyard of Undead Theories: Publication Bias and Psychological Science’s Aversion to the Null. Perspectives on Psychological Science, 7(6), 555-561. doi:10.1177/1745691612459059
  5. ibid.
  6. Grahe, J. E., Reifman, A., Hermann, A. D., Walker, M., Oleson, K. C., Nario-Redmond, M., & Wiebe, R. P. (2012). Harnessing the Undiscovered Resource of Student Research Projects. Perspectives on Psychological Science, 7(6), 605-607. doi:10.1177/1745691612459057
  7. Fuchs, H. M., Jenny, M., & Fiedler, S. (2012). Psychologists Are Open to Change, yet Wary of Rules. Perspectives on Psychological Science, 7(6), 639-642. doi:10.1177/1745691612459521
  8. Another one that comes up time and again is statistical significance. How often have I read sentences à la “X was faster/higher/stronger than Y, but the difference was not significant”. If you write this, you have either slept through Stats 101, or you are trying to imply something your data simply does not support.
  9. Then again, it might spare us certain neuropsychology studies and the associated yellow-press headlines…
  10. Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine, 2(8), e124. doi:10.1371/journal.pmed.0020124

National Geographic Young Explorers Grant Workshop at McGill

Today I went to a workshop held by National Geographic at McGill about their Young Explorers grant program. Rolled out in 2006, the program has so far sponsored more than 200 researchers, journalists, and adventurers with nearly a million dollars.

National Geographic had brought three of their previous grantees to present their work. Andrea Reid talked about her research on Nile perch in Lake Victoria, an invasive predator that has been described as “Darwin’s Nightmare” for its consequences for other species. Becca Skinner showed photos from a project in which she tried to trace the locations of pictures taken in the aftermath of the 2004 tsunami in Aceh five years after the catastrophe.

I particularly loved the story of Amy Higgins. After graduating with a B.Sc. in Biology, she was working as a school teacher in the Ladakh region of Jammu and Kashmir when she heard a ‘myth’ about artificial glaciers. Being an avid hiker, she organised a school trip to the remote area and indeed found artificially created frozen lakes.

The artificial glaciers were built by Chewang Norphel, a retired Kashmiri engineer. Amy took the opportunity to intern with Mr. Norphel and started to learn from him. On creating artificial glaciers, she says it is “like building an ice skating rink in your backyard.”

With the help of two consecutive National Geographic Young Explorer grants, Amy started researching the impacts of the artificial glaciers on local agriculture. They are used to store water for irrigation, which is otherwise scarce in the region. In the Master’s thesis she just completed at Yale’s School of Forestry and Environmental Studies, Amy reports that Mr. Norphel’s constructions provide 20 to 40 additional days of irrigation and allow farmers to switch from growing barley to more profitable crops such as peas.

I loved Amy’s story because it’s a tale of immense serendipity, although guided by her obvious curiosity and enthusiasm. Very, very cool.

The other speakers National Geographic had brought in included Environmental Anthropologist Kenny Broad, who was immensely funny in one moment (“… if all else fails, go to Burning Man. […] If Burning Man fails, go surfing.”), only to talk of the dangers of field research and the friends he has lost to explorations a minute later.

The day was concluded by breakout sessions. In small groups, we could pitch our own project ideas to the old hands (I ended up with National Geographic’s Chris Thornton, who did an amazing job giving feedback, and Amy Higgins). There were a ton of really cool projects in my group – on the role of music in the identity of Sahrawi refugees in western Algeria, or on lessons for conservation to be learned from Canada’s indigenous peoples, to name just two.

I can only encourage everybody working in a field science to have a look at National Geographic’s Young Explorers program. A grant of up to US$5,000 is available for applicants between 18 and 25 years of age. There are three sources of funding available for different projects:

  1. The Committee for Research and Exploration funds scientific field research. Applicants may come from disciplines such as Geography and Biology, but also Anthropology.
  2. The Expeditions Council supports “explorations with story potential”. While these projects may have a scientific component, they should yield material for National Geographic’s many publications.
  3. The Conservation Trust funds on-the-ground conservation action. The emphasis of this program – the smallest of the three – lies heavily on innovative methods.