Juha Uitto is the Director of the Global Environment Facility (GEF)’s Independent Evaluation Office (IEO) and has decades of experience in evaluation in environment and development contexts.
On February 27 and 28, 2023, he attended a workshop organized by the CGIAR’s Independent Advisory and Evaluation Service (IAES) in Rome, Italy. The workshop aimed to introduce the new guidelines on evaluating quality of science and research in process and performance evaluations. These build on the CGIAR Independent Science for Development Council (ISDC)’s Quality of Research for Development (QoR4D) Frame of Reference, and provide the framing, criteria, dimensions, and methods for assessing QoR4D – both within CGIAR and in other like-minded organizations. The hybrid online and in-person event was designed to help practitioners across and beyond the CGIAR system understand and apply the new guidelines in their own evaluative contexts.
We spoke to Uitto to find out more about his experience of the workshop and his aspirations going forward.
Q: How familiar were you with the CGIAR, the ISDC and the Evaluation Function prior to the workshop?
A: I was quite familiar with CGIAR because I’ve worked in international development for many years and had a number of interactions with several of the research centers. CGIAR research has been instrumental in promoting food security and sustainability innovation in agriculture, one of the goals of the GEF, as well. I had come across some evaluation work emanating from CGIAR, especially that conducted by CIFOR and ICRAF, but I had little knowledge of how the overall evaluation function is organized.
Q: How much experience did you already have on the topic of evaluating quality of science and QoR4D?
A: My profession for over two decades has been in evaluation, in which I have specialized in evaluating the nexus of environment and development. I also have some background in evaluating scientific endeavors, but it has not been something that I have focused on in my work.
Q: Were your expectations from the workshop met? Did you have any unanticipated learnings?
A: The workshop certainly met my expectations. I wouldn’t say I had unanticipated learnings, per se, but there were a number of revelations that definitely clarified my thinking about how to systematically assess research quality and impact.
Q: What did you find the most exciting? (e.g. sessions, people, case studies)
A: The most exciting aspect for me was the interaction between the scientists and the evaluators. We have so much in common, but we speak somewhat different languages – or at least different dialects. Scientists working with CGIAR also seek to answer questions about what works, where, and for whom, and how performance can be improved – but they don’t think of it as evaluation. On the other hand, evaluators have so much to learn from scientists. In my office, I have advocated for starting any evaluation by scanning what research has to say about the topic. This informs our theories of change and helps us avoid reinventing the wheel.
Q: What will you take forward into your work? Where/how?
A: The GEF is an organization that was set up to be innovative. To reach this goal, both science and evaluation are built into the system. We at the Independent Evaluation Office have a close relationship with the Scientific and Technical Advisory Panel. Our collaboration is quite fruitful: we coordinate our respective work programs and exchange lessons. The workshop helped me see even more clearly where and how we can collaborate.
Q: Is there a particular way in which you would like to continue engaging with IAES/Evaluation and CGIAR on this topic, and around the Guidelines? If so, what?
A: This is a fascinating area to me. I’d be very pleased to continue engaging in order to learn and to share lessons from my work in evaluating environment and development programs.