Monitoring, evaluation and learning

GPE’s effectiveness as a partnership is grounded in our ability to learn from our shared experiences

GPE’s 2025 monitoring, evaluation and learning (MEL) framework aims to drive evidence-based learning and decision-making to improve performance across the partnership. This aim is achieved through:

  • Supporting learning and the use of evidence at the country level to enable adaptive management and strengthen the capacity to drive results.
  • Strategic monitoring, evaluation and learning for accountability, transparency and aggregating results at the partnership level.

Monitoring

Regular data collection, dissemination, and use are important for furthering our mission and achieving our goals. Robust data helps ensure all partners have key information readily available to monitor partnership effectiveness and support the effective implementation and outcomes of our grants. Monitoring includes the GPE results framework and an annual results report.

Results Framework

The Results Framework is aligned with the GPE 2025 strategy. It aims to support strategic decision-making and transparency by allowing the partnership to monitor progress in the main areas of its strategy.

The Results Framework includes sector-level indicators that are disaggregated by country and individual characteristics. These characteristics include fragility status for countries and sex and disability status for children (as available). The Results Framework also includes indicators related to GPE grants and programs.

The Results Framework structure follows the GPE 2025 strategic framework.
Results Report

GPE produces an annual results report that provides updates on the status of the results framework indicators, analyzes the trends, and consolidates information from evaluations and other evidence-based reviews.

The objective of the Results Report is to provide a comprehensive snapshot of GPE's achievements and areas for improvement. The report also aims to support decision-making by the GPE Board and committees and to promote evidence-based learning across the partnership.

Evaluations

Evaluations provide the foundation for evidence-based learning and actions across the partnership, and support transparency.

The evaluation program is underpinned by GPE’s evaluation policy, which outlines the key principles that all GPE-financed evaluations follow.

The five-year evaluation program (FY22–FY26) includes:

  • Country-level evaluations — FY22: design; FY23: inception phase; FY24: phase 1 (8 case studies and annual synthesis); FY25: phase 2 (15 case studies and annual synthesis); FY26: phase 3, final (15 case studies and final synthesis).
  • Thematic evaluations of GPE priority areas — FY22: design; FY23: inception phase; FY24: phase 1 (studies on 8 thematic areas); FY25: phase 2 (studies on 8 thematic areas); FY26: phase 3, final (studies on 8 thematic areas).
  • Process and programmatic evaluations and reviews — FY22: internal review of the Multiplier; KIX and Education Out Loud mid-term reviews; COVID formative evaluation. FY23: external Multiplier evaluation; KIX thematic evaluation on the applied research portfolio. FY24: COVID summative evaluation; Education Out Loud impact evaluation; KIX thematic evaluation on support for knowledge mobilization with stakeholders involved in hubs. FY25: KIX thematic evaluation on the country support mechanism; strategic capabilities. FY26: KIX and Education Out Loud end-of-program reviews.
  • Systematic reviews — teaching quality; grant completion reports; grant-funded country-led evaluations.
  • Final strategic evaluation — begins in FY25; complete by fall 2025.

Learning from pilots

A rapid learning approach is currently underway to help GPE better understand and learn from the experience of rolling out the new operating model. The approach focuses on systematically collecting, analyzing, sharing and acting on feedback from across the partnership. Information is collected through surveys, focus group discussions and webinars.