State-of-the-ART: Project Update 2
20 September 2014 | ART Project Updates
By Fiona Remnant (Originally posted at Devlog@Bath)
One year, two countries, four pilots and eight increasingly cunning versions of the Excel spreadsheet further on – it’s time for an update on what the Assessing Rural Transformations team has been up to. Part of the answer is that we’ve been reminded (not for the first time) that cost-effective evaluation hinges as much on how efficiently data is analysed as on how it is collected.
As the Research and Communications Officer for the ART project I have spent much of the last year buried deep in analysis of our first QUIP (Qualitative Impact Protocol) results from four projects in Ethiopia and Malawi. Put simply, the QUIP aims to ‘provide credible, timely and cost-effective evidence of impact based on the testimony of intended beneficiaries of rural livelihood interventions without the need for a control group’. The not-so-simple implications of trying to meet this aim whilst addressing problems of attribution, confirmation bias and generalizability are discussed in more depth in the first project update, and in a recent paper which presents the findings of the four pilot QUIPs.
The pilots were our first opportunity to test the QUIP questionnaire in the field, working with teams of independent field researchers who were commissioned by the university and (deliberately) told nothing about the activities being assessed. The QUIP asks open and closed questions about general changes in the lives and livelihoods of selected households over the past two years, with no specific reference to any project interventions. This deliberate ‘blinding’ is discussed in more detail in the paper, but essentially aims to reduce pro-project bias on the part of researchers and respondents as far as is possible, and hence elicit a more credible assessment of the relative importance of all the factors that have affected respondents’ lives and livelihoods during that period.
While James Copestake has continued to develop and explore the credibility and robustness of this methodological approach, I have been working on how to organise and analyse all the qualitative data we were collecting, and how to make the QUIP easily replicable. As is so often the way with the best solutions, a chance, unrelated meeting early in ART’s life with Morten Siersted of the Excel consultancy F1F9 has led to an exciting collaboration and the development of what we hope is becoming a simple, cheap and accessible qualitative analysis tool.
I share F1F9’s obsession with sharing best practice and tools – it’s why I work in communications. For the QUIP to have a future beyond ART, it is vital that we can freely share not just the guidelines and questionnaire but also a tool for processing the results easily and rapidly. Having accepted the gauntlet thrown down by Morten, we set about seeing whether Excel could indeed handle qualitative as well as quantitative data. After all, Excel – unlike NVivo – is relatively accessible and familiar to most of us. We analysed the first two sets of data in both NVivo and Excel to compare the outcomes and to cross-check our coding system, and found little wanting in the Excel version. With generous pro-bono support from F1F9 staff over the last year, we now have a prototype spreadsheet which enables speedy analysis and report writing – a matter of days rather than weeks.
Field researchers enter all the responses straight into a simple Excel version of the questionnaire. Responses to closed questions are then easily translated into graphs and tables using the quantitative capabilities of Excel that most of us are familiar with. The open-ended responses are split and coded within the spreadsheet, and then organised according to our attribution coding at the touch of a button, using macros designed by the F1F9 team in Delhi. This gives you a much more manageable transcript of responses, organised according to whether they (a) explicitly attributed impact to project activities, (b) were implicitly consistent with the project’s theory of change, or (c) referred to drivers of change that were incidental to project activities. These responses are further classified by the area of life/livelihood affected, and by whether the effect was positive or negative. The coded transcript enables the reader to skip straight to a desired section – for example, all statements which explicitly attributed a positive change in food production to project activities.
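For anyone who finds it easier to read the logic in code than to picture the spreadsheet, here is a minimal sketch in Python of what the attribution coding and filtering amount to. It is not the F1F9 macros themselves (those live inside Excel), and the field names, code labels and example statements are all illustrative assumptions.

```python
# Illustrative sketch only – not the ART project's actual F1F9 macros.
# Each split response carries an attribution code, a life/livelihood domain
# and a direction, so the coded transcript can be filtered straight to a
# section of interest. All field names and values below are assumptions.

def filter_statements(rows, attribution=None, domain=None, direction=None):
    """Return coded statements matching the requested combination,
    e.g. all explicit, positive statements about food production."""
    return [
        r for r in rows
        if (attribution is None or r["attribution"] == attribution)
        and (domain is None or r["domain"] == domain)
        and (direction is None or r["direction"] == direction)
    ]

# A tiny made-up transcript; real responses would come from the questionnaire spreadsheet.
coded_transcript = [
    {"household": "HH01", "domain": "food production", "attribution": "explicit",
     "direction": "positive", "statement": "Harvests improved after the new seed."},
    {"household": "HH02", "domain": "cash income", "attribution": "incidental",
     "direction": "negative", "statement": "Prices fell at the local market."},
]

for r in filter_statements(coded_transcript, attribution="explicit",
                           domain="food production", direction="positive"):
    print(f'{r["household"]}: {r["statement"]}')
```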
The coded text is used to produce a series of tables which present the distribution of statements across the different codes and the areas of life most affected, as well as the main drivers of change identified by respondents. The tables, even without the accompanying narrative, paint a fascinating visual picture of changes in the sample selected. The first table presents the responses to closed questions, giving an overall picture of where most households have experienced positive or negative change in each area of their lives (food production, cash income, cash spending, food consumption, assets and overall wellbeing). The second set of tables summarises how many statements attributed change to project or non-project causes, with the most populated cells showing where most positive or negative impact has been felt in each domain. The final tables show the distribution of named causes or drivers of change, building a detailed picture of the most significant events or interventions which have driven both positive and negative change. This detail, backed up by quotations, can help to explain the complexity of different drivers working with and against each other.
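The same point can be made in code: the summary tables are, in essence, cross-tabulations of the coded statements. The sketch below – again with assumed column names and toy data rather than the project’s real schema – counts statements by livelihood domain and attribution code.

```python
# Illustrative cross-tabulation of coded statements by domain and attribution
# code, mirroring the kind of summary table described above. The column
# names reuse the assumptions from the previous sketch.
from collections import Counter

rows = [
    {"domain": "food production", "attribution": "explicit"},
    {"domain": "food production", "attribution": "incidental"},
    {"domain": "cash income", "attribution": "implicit"},
]

counts = Counter((r["domain"], r["attribution"]) for r in rows)
domains = sorted({r["domain"] for r in rows})
codes = sorted({r["attribution"] for r in rows})

# Print a simple tab-separated table: one row per domain, one column per code.
print("domain\t" + "\t".join(codes))
for d in domains:
    print(d + "\t" + "\t".join(str(counts[(d, c)]) for c in codes))
```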
The next challenge for ART is to combine the QUIP data with the quantitative monitoring data that our colleagues at Self Help Africa and Farm Africa have been collecting using Evidence for Development’s Individual Household Method (IHM). EfD are working on a robust system for presenting household- and village-level data on changes to key indicators such as household disposable income. These results can then be combined with respondents’ own perceptions – gathered in the second round of the QUIP – of how their lives have changed and why, to provide a more complete assessment of impact.
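As a rough illustration of what that combination might look like, the sketch below pairs QUIP statements with an IHM-style indicator by matching on a shared household identifier. The field names, the join key and the figures are assumptions made for the example, not the actual QUIP or IHM data structures.

```python
# Hypothetical example: attaching an IHM-style indicator (change in household
# disposable income) to QUIP statements via a shared household identifier.
def merge_quip_with_ihm(quip_rows, ihm_rows, key="household"):
    """Return QUIP rows with the matching household's IHM fields attached."""
    ihm_by_household = {row[key]: row for row in ihm_rows}
    return [{**q, **ihm_by_household.get(q[key], {})} for q in quip_rows]

quip_rows = [{"household": "HH01", "domain": "cash income",
              "attribution": "explicit", "direction": "positive"}]
ihm_rows = [{"household": "HH01", "disposable_income_change": 120.0}]
print(merge_quip_with_ihm(quip_rows, ihm_rows))
```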
The QUIP spreadsheet and report are both still in development, but do contact me if you’d like to find out more – and perhaps try carrying out a QUIP yourself. We welcome all feedback to help us make adjustments as we embark upon the second round of impact assessment of the same projects over the next year.