Meeting & Workshop Reports

Workshop Reports: Evaluation Guidelines: Applying QoR4D Frame of Reference to Process and Performance Evaluations, February 2023, Rome


Abstract

IAES organized a 1.5-day hybrid workshop in February 2023 to launch the beta version of the Evaluation Guidelines, foster a common understanding among evaluators and subject matter experts (SMEs) of approaches and entry points for evaluating quality of science (QoS) in CGIAR, draw broader lessons from assessing and evaluating QoS, and identify opportunities to roll out and monitor the use and uptake of the guidelines in CGIAR and beyond.

The workshop brought together 72 key stakeholders, in person and remotely, from CGIAR (74%) and external organizations (26%), including funders, multilateral organizations, the private sector and non-governmental organizations. The workshop featured discussions across three main sessions and group activities. To better understand the guidelines' applicability, participants tackled three case studies, including the evaluation of the CGIAR Genebank Platform (forthcoming), and effectively used the guidelines to identify evaluation criteria, dimensions and methods. Members of IAES's Evaluation Reference Group also participated in the workshop.

FAQs on the guidelines were collated in preparation for the workshop and are maintained as a living document.

Please read the workshop report, accompanied by an annex with participant bios. Blogs and Q&As from participants include:

  • Reflections on workshop: Adaptive co-design of research for development (R4D) evaluations – the need for engagement, learning and ongoing reflection
  • Q+A: Former ICAR-NAARM head Dr D. Rama Rao on the QoR4D Frame of Reference workshop
  • Q+A: International project management expert Hasna Ziraoui on the QoR4D Frame of Reference workshop
  • Q+A: John Gargani on evaluation of research and innovation for development
  • Q+A: GEF IEO Director Juha Uitto on the QoR4D Frame of Reference workshop
  • Q+A: Emma Rotondo and Rodrigo Ybarnegaray, international evaluation specialists, on the QoR4D Frame of Reference workshop (in Spanish; also available in English)
  • Bibliometric analysis to evaluate quality of science in the context of One CGIAR
  • 2015 Evaluating quality of science workshop report

Soliciting Input and Feedback

IAES welcomes questions and feedback on learning from the application of the beta version of the Guidelines (contact IAES-Evaluation@cgiar.org, custodian of the document).

Citation

CGIAR Independent Advisory and Evaluation Service (IAES) (2023). Workshop Report: Evaluating Quality of Research-for-Development (QoR4D) in Process and Performance Evaluations, February 2023. Rome.


Evaluation
Issued in 2023
  • Workshop Report
  • Annex
  • Why Bibliometrics Matter for Assessing Quality of Science
  • Twelve evaluative reviews of CGIAR research programs with QoS evaluation criteria
  • Expertise Fused with Context: Strengthening Bibliometric Recommendations Through Co-Design
  • 2015 Workshop report

