
Lessons Learned From Implementing Prospective, Multicountry Mixed-Methods Evaluations for Gavi and the Global Fund

Affiliation

PATH (Carnahan, Gurley, Mpanya, Salisbury, Shearer, Shelley); Infectious Diseases Research Collaboration (Asiimwe, Kamya, Nagasha); University of Eduardo Mondlane (Chilundo); University of Washington (Duber, Phillips); Institut de Santé et Développement/University Cheikh Anta Diop (Faye)

Summary

"...in line with calls for more participatory and collaborative models for learning and evaluation in the international development field..."

Gavi, the Vaccine Alliance (Gavi) and the Global Fund to Fight AIDS, Tuberculosis, and Malaria (the Global Fund) are large multilateral organisations that fund country governments and partners to implement complex interventions to improve public health. Each commissioned a mixed-methods evaluation to understand not only what happened as a result of the programmes but also why those changes occurred. Both evaluations were conducted in multiple countries to produce country-specific findings and cross-country syntheses. This article shares 5 lessons distilled from more than 7 years of experience implementing the evaluations.

PATH and the Institute for Health Metrics and Evaluation (IHME) at the University of Washington have served as the global evaluation partners (GEPs) leading a consortium of country evaluation partners (CEPs) for the Gavi Full Country Evaluations (FCE), conducted from 2013 to 2018, and the Global Fund Prospective Country Evaluation (PCE), conducted from 2017 to 2021. The FCE, for instance, aimed to identify drivers of vaccine coverage and equity, with an emphasis on Gavi's support of national immunisation programmes. The evaluations were carried out in 7 countries (FCE: Bangladesh, Mozambique, Uganda, Zambia; PCE: Democratic Republic of the Congo, Guatemala, Senegal, Uganda). The GEPs categorised insights according to the Framework for Evaluation in Public Health and compared their experience-based insights with existing best practices within the framework.

Summary of lessons learned from implementing the prospective, mixed-methods evaluations:

  1. For multistakeholder evaluations of complex interventions, the evaluation team and donors should include an inception phase to focus on stakeholder engagement and evaluation design (e.g., in the form of face-to-face meetings with individuals or small groups of stakeholders to introduce the evaluation and seek buy-in from key government officials). Support from high-ranking officials paved the way for easier access to other government officials and partners for evaluation data collection and contributed to the sense of legitimacy of the evaluation, thereby improving the likelihood that findings would be used.
  2. In a prospective process evaluation, the donor and evaluation team should align on the degree to which the evaluation is embedded in the programme implementation; a quasi-embedded approach can balance objectivity and learning. For example, for the PCE, the GEPs reframed the evaluation as a learning platform so that country stakeholders would not feel they were being audited. This opened the door to a more collaborative approach, in which the CEPs established close relationships with country stakeholders, who were able to share documents and data, extend meeting invitations, and serve as key informants. Such involvement has, however, made it challenging for CEPs to maintain evaluation independence (or a perception of independence). Thus, expectations for programme stakeholders' engagement in the evaluation should be clearly communicated by the evaluation team from the outset of the process.
  3. In evaluations of complex interventions in which the programmes, organisations, or contexts are constantly evolving, the evaluation team needs to continuously monitor changes and adapt the evaluation. The evaluation plan should be designed with enough flexibility to adjust evaluation questions and approaches to respond to changes; to support this, buy-in from and engagement of the donor organisation is essential.
  4. Evaluation teams should ideally include individuals with experience across methods or, at a minimum, co-locate individuals with different methodological backgrounds. Tools and approaches such as collaborative data review meetings, root cause analyses, and Tableau dashboards can help bridge any divide between quantitative and qualitative expertise. For example, the PCE visualised quantitative data from Global Fund grant revisions and then generated key informant interview questions to understand why the shifts occurred and how they were affecting implementation activities. Achieving a true mixing of methods and paradigms requires a well-integrated team with diverse expertise and established procedures and processes for dialogue and analysis; collaborative, interactive processes have proven valuable in facilitating mixed-methods analysis of the data and interpretation of findings.
  5. Evaluators should clearly communicate the strength of evidence underlying each finding and should consider engaging stakeholders in the process of refining findings and recommendations. For example, the FCE/PCE teams shared preliminary findings with stakeholders for review to ensure they were reporting full and accurate information. Occasionally, these reviews motivated stakeholders to share additional evidence for incorporation; a potential risk of sharing early findings, however, is that findings based on insufficient evidence could undermine evaluators' credibility.

Stakeholder engagement is a key theme that weaves many of these lessons together. To engender stakeholder buy-in and uptake of findings, the GEPs advise tailoring dissemination strategies and providing knowledge translation support. More formalised communication approaches have focused primarily on annual written reports and annual country-based meetings that bring together a diverse set of stakeholders to discuss evaluation findings and recommendations and to provide input on future evaluation priorities. However, the GEPs recommend multiple modes of disseminating findings and more timely sharing of emerging findings (e.g., through shorter policy briefs and evaluation team engagement in programme meetings) to take full advantage of the learning platform provided by collaborative-style evaluations.

Source

Global Health: Science and Practice December 2020, 8(4):771-82; https://doi.org/10.9745/GHSP-D-20-00126. Image credit: Gavi