News & Views

New article on mixed-methods evaluation

Any academic will tell you that one of the high points of their research is seeing work finally published; so I’m delighted (and relieved!) to be able to share the news that another paper building on QuIP experiences has just been e-published – this time in the Journal of Development Effectiveness. This adds to the list of papers tackling knotty methodological issues that we’ve published from QuIP work – others having addressed confirmation bias (with blindfolding), opaque qualitative data analysis (with causal mapping) and cherry picking (with a broadly Bayesian approach to purposive case/source selection). The new paper addresses two problems at once – the problem of dealing with complexity, and the problem of how to pass on the baton of impact evidence so that it actually gets put to good use. I can’t claim the new paper has comprehensively cracked either of these problems, but I think it does have useful things to say about them, by rethinking what we mean by mixed methods.

The full title of the paper is “Mixed methods impact evaluation in international development practice: distinguishing between quant-led and qual-led models”.

In brief, the quant-led model is centred on ‘variance-based’ causal attribution, supported by qualitative contextualisation and design, and (sometimes) by ‘process theory-based’ attribution to help explain findings. It fits with a more positivist approach to social science, and a relatively replicable, technical, and linear view of development practice informed by answers to relatively stable and narrowly defined causal questions. While costly to produce, it has the potential to deliver relatively easily understood and scientifically credible numbers for the magnitude of development impact that commissioners demand, even while leaving open the question of how relevant these findings are to other contexts.

The qual-led model combines quantitative monitoring with reliance on process theory-based attribution, drawing on multiple sources of evidence in an open-ended process of iteratively testing and updating theoretical understanding of causal mechanisms. It reflects an interpretive view of development that is more path-dependent, social, and complex. Findings tend to be less precise but can be broader in scope, informing reflection on their relevance to other contexts, picking up on unexpected causes and effects, and enriching understanding of underlying causal mechanisms.

Somewhat ironically, perhaps, the paper contrasts the two models in order to subvert lazier and more simplistic uses of the dichotomy between ‘quant’ and ‘qual’. It’s easy to join the ‘randomista’ camp and attack those who haven’t as less methodologically rigorous; or to join the ‘complexity’ camp and attack others as lacking real-world sophistication, for example. The problem is that this process of self-justification and ‘othering’ ends up entrenching differences and undermining promising possibilities for synthesis. In distinguishing between qual-led and quant-led approaches to mixed methods, the paper avoids claims about the general superiority of one over the other; rather it explores how the two traditions associated with each model have emerged, and how each can be strengthened and transcended. Humans are not algorithms, and most of our decisions rest on qualitative judgements, informed by an often complex tangle of quantitative and qualitative data collection and analysis. That’s what the paper sets out to clarify, in the vain hope of fostering more constructive understanding, discussion and practice.

And here’s the rub! In the moment of satisfaction at seeing a paper published after many months of cogitation, the seeds of doubt are already sown about how it could have been better, and whether adding another dichotomy to the discussion will clarify and connect, or just add to the confusion and polarisation. Let’s see! For the moment, what Albert Hirschman memorably called the capacity for ‘self-subversion’ can wait. Meanwhile, this is an opportunity to thank those of you who have contributed to the experimentation and debate that the paper tries to reflect.
