Blog

Integrating Qualitative and Quantitative Methods for Impact Assessment

Qualitative methods have long been used in agricultural research to bring the perspectives and worldviews of farmers into the research process. By seeing the world through the eyes of farmers and understanding their contexts and motivations, qualitative researchers have tried to answer questions such as: How can farmer field schools incorporate indigenous knowledge and cultural norms? How can recommendations for farmers balance context-specific proposals that fit local needs with advice broad enough to be generalizable? What are the best ways of bridging the gap between how farmers define pest problems and how scientists define them? For a historical account, spanning thirty years, of the work done by anthropologists at IRRI, see Price and Palis 2016.

However, despite this long history, qualitative approaches are still viewed as "good to have" rather than as a necessary and integral part of basic scientific research, impact assessment, or policymaking. It is therefore worthwhile to consider some specific ways of fruitfully combining qualitative and quantitative approaches, to highlight the value of adding qualitative methods to the researcher's toolkit.

Importance of "cognitive empathy" in qualitative methods

The strength of qualitative research is that the interaction between researchers and respondents is open-ended. Qualitative researchers take great care to avoid leading questions, probe deeper for completeness, change questions that do not work, and remain open to exploring new or surprising lines of inquiry. The key for the researcher is to develop "cognitive empathy," i.e., to understand another's predicament from their perspective (Small and Calarco 2022). It is important to note that the word cognitive carries as much weight as the word empathy. The researcher eschews the idea that ways of thinking different from those of "experts" are somehow "irrational" or "illogical". Instead, the attempt is to uncover the respondent's logic, in the respondent's own words and voice, regardless of whether the researcher personally agrees with that logic.

Researching new or complex topics

Qualitative methods are especially useful when the research topics or issues are new or complex. When a topic is understudied or the context is unfamiliar, it is useful for the researcher to spend time doing fieldwork before collecting quantitative data. Qualitative fieldwork spans a range of methods, from observation to formal interviews and group discussions to informal conversations. The data collected through qualitative fieldwork can serve several purposes: developing survey questions, suggesting hypotheses, testing theories, providing alternative sources of evidence, and designing programs or policies. For examples of combining qualitative evidence with quantitative data, especially when starting new lines of inquiry, see Rao 1998 and Udry 2003.

An example from agriculture is SPIA's efforts to improve the measurement of the adoption of natural resource management (NRM) packages. Such packages have multiple components and include a complex set of practices, making it challenging to develop survey questions. Qualitative research can start the process of designing good close-ended questions by first gaining a detailed understanding of farmers' actual practices and probing the farmers' own logic for their actions. Once the researcher gains insights into why farmers do what they do, it is possible to design survey questions that pay careful attention to wording, sequencing, and terminology. For an example of the use of qualitative methods to design a survey instrument on a complex NRM package, see here.

Assessing survey misreporting

Self-reported data in surveys, particularly on sensitive issues, can be misreported, biasing causal estimates. While some behaviors, such as drug use or corruption, are universally sensitive, others may be sensitive only in particular contexts. For example, questions on the use of privately owned pumps that draw groundwater for irrigation may be unremarkable in some areas but sensitive in others, if the government restricts such pumps for fear of groundwater depletion. In such cases, researchers can use qualitative methods to assess misreporting in data gathered through quantitative surveys and to develop measurement error bounds.

Such a strategy rests on the researchers' ability to build rapport with respondents, which is only possible through open-ended conversations and informal interactions. For an example of the use of qualitative methods to validate survey results, see Blattman et al. 2016, where qualitative validation showed that the patterns of measurement error differed from the researchers' priors.
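To make the idea of measurement error bounds concrete, here is a minimal sketch, with entirely hypothetical data, of how a small qualitatively validated subsample could be used to estimate misreporting rates and correct a survey-based adoption estimate. The function names and the Rogan-Gladen-style prevalence correction are illustrative choices, not a method described in the studies cited above.

```python
# Illustrative sketch (hypothetical data): correcting a survey-based
# adoption rate for misreporting, using a small qualitatively
# validated subsample to estimate error rates.

def error_rates(self_reports, validated):
    """False-positive and false-negative rates from a validation subsample."""
    fp = sum(1 for s, v in zip(self_reports, validated) if s == 1 and v == 0)
    fn = sum(1 for s, v in zip(self_reports, validated) if s == 0 and v == 1)
    n_neg = sum(1 for v in validated if v == 0)
    n_pos = sum(1 for v in validated if v == 1)
    return fp / n_neg, fn / n_pos

def corrected_rate(observed_rate, fp_rate, fn_rate):
    """Rogan-Gladen-style correction of an observed prevalence."""
    sensitivity = 1 - fn_rate
    specificity = 1 - fp_rate
    raw = (observed_rate + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(raw, 0.0), 1.0)  # clamp to [0, 1]

# Validation subsample: survey answer vs. what in-depth interviews
# established (1 = uses a pump). Here respondents under-report pump use.
survey = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
truth  = [1, 0, 1, 1, 0, 1, 0, 1, 0, 1]

fp, fn = error_rates(survey, truth)
print(f"false-positive rate: {fp:.2f}, false-negative rate: {fn:.2f}")
print(f"corrected rate for an observed 30%: {corrected_rate(0.30, fp, fn):.2f}")
```

With half of true users denying pump use, an observed 30% adoption rate would imply a corrected rate of 60%, illustrating how large the gap between reported and actual behavior can be.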

Explaining quantitative results

In impact assessments, quantitative results provide evidence on whether change occurs, but to understand why the change occurred (or why it did not), it is best to combine forces with qualitative data collection. In-depth interviews can help confirm quantitative results, but also shed light on the mechanisms behind an intervention's success or failure (for the use of qualitative research to explain quantitative results in RCTs, see Branas et al. 2018; Muralidharan and Singh 2023). Qualitative results can also uncover subtle treatment effects which may be harder to capture through structured surveys (Rao et al. 2016). And even when quantitative and qualitative findings disagree, this need not result in tension; rather, it can lead to a far richer and deeper understanding of the issues (Doss et al. 2022).

Using coded interviews for selecting survey questions

Many complex and latent concepts, e.g., women's empowerment, cultural assimilation, or farmers' agency, are best measured with open-ended questions. However, there is a practical need to develop close-ended measures. Machine learning tools can be used to select survey questions based on their statistical correspondence to coded qualitative interviews. For an example of such an application, see Jayachandran et al. 2023.
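The selection step can be sketched very simply: rank candidate close-ended items by how strongly they correlate with a code that human coders assigned to each respondent's interview. The example below uses hypothetical item names and toy binary data; real applications such as the one cited above use richer models, but the principle of scoring items against the coded interviews is the same.

```python
# Hypothetical sketch: rank candidate close-ended survey items by how
# well they track a 0/1 code assigned to each respondent's qualitative
# interview (e.g., an "empowered" code). All names and data are made up.
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Code per respondent, assigned by human coders reading the interviews.
code = [1, 1, 0, 0, 1, 0, 1, 0]

# Candidate survey items, answered by the same respondents.
items = {
    "decides_on_crop_sales": [1, 1, 0, 0, 1, 0, 1, 1],
    "owns_mobile_phone":     [1, 0, 1, 1, 0, 1, 0, 0],
    "attends_coop_meetings": [1, 1, 0, 1, 1, 0, 1, 1],
}

# Items whose answers best mirror the qualitative code come first.
ranked = sorted(items, key=lambda q: abs(pearson(items[q], code)), reverse=True)
print(ranked)
```

In this toy example the item on crop-sale decisions tracks the interview code most closely, so it would be the strongest candidate for the close-ended instrument.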

Creating improved training sets for machine learning

Natural Language Processing (NLP) tools are making it possible to analyze open-ended interviews at scale. Most applications rely on unsupervised methods that reduce the dimensionality of the text to make it more analyzable. A potential drawback is that researchers end up interpreting simplified representations of the text rather than basing interpretation on a reading of the documents. Furthermore, because unsupervised models are typically "unguided," i.e., they do not seek to extract a particular signal from the text, the resulting measures are often not suited to the research question of interest. An alternative is to develop a "supervised" NLP method that draws much more on the strengths of the interpretive approach of qualitative analysis. In this approach, a sub-sample of the transcripts is coded by a small team of human coders to generate "labels". The human-coded sub-sample is then used as a training set to predict codes on the full, statistically representative sample. For a fuller description, see here, and for the code and resources necessary to implement the techniques described, see here.
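The supervised workflow just described can be sketched in miniature: train on the human-coded sub-sample, then predict codes for the uncoded transcripts. The sketch below uses toy transcripts and a deliberately simple bag-of-words nearest-centroid classifier standing in for a production NLP model; the codes ("adoption"/"disadoption") and all text are hypothetical.

```python
# Minimal sketch of a supervised coding workflow: human-coded transcripts
# serve as a training set, and a simple bag-of-words nearest-centroid
# classifier predicts codes for the remaining transcripts.
from collections import Counter
from math import sqrt

def tokens(text):
    return text.lower().split()

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency dicts."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train(labeled):
    """Build one term-frequency centroid per human-assigned code."""
    centroids = {}
    for text, code in labeled:
        centroids.setdefault(code, Counter()).update(tokens(text))
    return centroids

def predict(centroids, text):
    vec = Counter(tokens(text))
    return max(centroids, key=lambda c: cosine(centroids[c], vec))

# Human-coded sub-sample (the "training set").
coded_subsample = [
    ("we stopped using the pump after the borehole ran dry", "disadoption"),
    ("the new seed gave a much better harvest this season", "adoption"),
    ("we planted the improved variety on all our plots", "adoption"),
    ("the pump broke and we went back to rainfed farming", "disadoption"),
]

model = train(coded_subsample)

# Predict codes for the remaining, uncoded transcripts.
for t in ["the improved seed worked so we kept planting it",
          "we abandoned the pump because fuel became too costly"]:
    print(predict(model, t), "<-", t)
```

In practice the classifier would be a stronger supervised model and the training set would be checked for inter-coder reliability, but the division of labor is the same: humans supply interpretation on a sub-sample, and the model extends it to the full sample.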

In short, for qualitative researchers, the open-endedness, flexibility, and complexity that may be viewed as methodological obstacles in quantitative research are key to their craft. Rather than minimize, control, or eliminate them, qualitative research design at its best seeks to adapt to and capture them in its accounts of social worlds. Our next blog will provide more insight into this process with an overview of the essential indicators of what constitutes good qualitative research.

Impact SPIA
Oct 31, 2023

Written by

  • Monica Biradavolu

    SPIA Member

