This blog is the first of a six-part series on education management information systems (EMIS), developed to share lessons learned and strategies employed by countries to enhance their data systems. This series is aligned with the agenda of the UNESCO-GPE EMIS conference held in April 2018 and provides an updated summary of the discussions on EMIS within the partnership over the past year. Details of the conference can be found here: GPE-UNESCO EMIS Conference
A great deal of research has been conducted on the production and use of statistical data in public policy in developing countries, and in the education sector in particular. Available literature on the subject suggests that education management information systems (EMIS) - understood as the systems governing the production and dissemination of statistical data on education - are too often dependent on external donor support, and that the demand for quality data is often generated by international organizations, and not by the governments of developing countries themselves.
In fact, large investments are made in education data, both by governments and across more than a dozen multi- and bilateral aid organizations, including GPE. The 2018 review of GPE's grant portfolio indicates that 29 of the 37 program implementation grants that were active or pending at the end of FY18 had a component dedicated to strengthening EMIS.
And these investments often represent a large amount of funds: a 2017 study published by the World Bank showed that the average cost of EMIS development and strengthening activities in World Bank projects ranged between US$1 million and US$7 million per project.
While millions of dollars are invested in strengthening EMIS every year, many countries still struggle with data-related issues, from lack of quality and timeliness to weak policies and data system architecture.
Recently, a team of World Bank researchers reviewed 20 education sector plan assessment reports of GPE partner countries in sub-Saharan Africa. The study, presented during a GPE-World Bank webinar in November 2018, highlighted that in all cases there was ineffective collection, presentation, analysis and use of data – in other words, problematic EMIS in spite of these large investments.
Conducting EMIS diagnostics to systematically assess the robustness of education data systems
To address the inefficiencies found in many education data systems, countries should be encouraged to conduct a diagnostic of their EMIS, through a holistic and systematic assessment of its different building blocks (legal framework, data architecture, methodological process, accessibility, etc.).
Diagnostic and evaluation tools exist in various forms, but in all cases these tools can help identify the strengths and the weaknesses of a system, and the bottlenecks preventing the data system from functioning well.
EMIS diagnostic tools are usually structured around a series of norms or standards reflecting good practices in various dimensions. When administered through a comparable methodology, EMIS diagnostics can help benchmark countries against each other and identify good practices that can be replicated across different countries, for instance on data collection or data utilization.
Experience has shown that conducting a diagnostic can be a good starting point for national policy makers: it helps them reach consensus on how to make the EMIS more effective, generates momentum for reforms, and supports the prioritization of EMIS strengthening activities in both the short and long term. EMIS diagnostics also remain relatively easy and inexpensive to undertake.
Navigating different diagnostic tools
In April 2018, GPE and UNESCO organized an international conference on EMIS, where developing countries and international development agencies gathered to share lessons learned on different EMIS-related aspects, from data quality and utilization to the challenges of EMIS in fragile or decentralized contexts.
A specific session was dedicated to the various EMIS diagnostic tools and indicators, and representatives of organizations that have developed these tools and countries who have used them had the opportunity to share their experiences, methodologies, and outcomes.
The session benefitted from the participation of:
- Husein Abdul-Hamid, Senior Education Specialist & Education Statistics Coordinator, World Bank, (starts at 00:03:17)
- Shem Bodo, Acting Executive Secretary, Association for the Development of Education in Africa (ADEA), (starts at 00:24:19)
- Ismaïla Berthe, Technical Advisor in charge of Partnership and Planning, and Tiéoulé Diarra, Head of the Documentation and Communication Center of the Planning and Statistics Unit of the Education Sector at the Ministry of Education of Mali (starts at 00:37:36)
- Matt Brossard, Senior Education Adviser (Systems, Innovations, Data and Evidence for Results), UNICEF (starts at 00:54:55)
- Joe Kai, Research Officer, EMIS Division, and Alton Kesselly, Deputy Minister of Planning, Research and Development, Ministry of Education of Liberia (starts at 01:37:50)
- El Haddad Ahmed, EMIS Specialist, and Oumou Saleme Mint Cheikh, Director of Strategy, Planning, and Cooperation at the Ministry of Education of Mauritania (starts at 01:44:42)
EMIS diagnostics based on the World Bank SABER tool
Among the best known of these tools is the World Bank's SABER-EMIS. The Systems Approach for Better Education Results (SABER) allows policy makers to assess the performance of a country’s data system and data utilization practices. For each of the policy areas evaluated under SABER-EMIS, the assessment framework includes a four-level scoring scale, from “latent” to “advanced”, which makes it possible to draw comparisons between countries.
To date, the World Bank has conducted SABER-EMIS diagnostics in 13 developing countries, including six GPE partner countries.
With financial support from GPE, Liberia collaborated with the World Bank to conduct a SABER-EMIS diagnostic. The exercise helped Liberian authorities identify the main challenges in data collection and processing, and laid the foundation for a project to transition to an electronic system for data collection, validation, and capture.
It is interesting to note that other partners also rely on the SABER-EMIS framework to measure the efficiency of education data systems in developing countries. As part of its Strategic Plan for 2018-2021, UNICEF measures the proportion of EMIS able to produce data disaggregated by gender, urban/rural location, wealth, and disability, on the basis of indicators aligned with the SABER-EMIS methodology.
Where data are available, we can see that the efficiency of EMIS has increased slightly over time, even if there are still large areas for improvement, particularly in data utilization and the availability of an inclusive EMIS for children with disabilities.
UIS Education Data Quality Assessment Framework
Another example of an EMIS diagnostic tool is that of the UNESCO Institute for Statistics (UIS), which was built upon the Data Quality Assessment Framework (DQAF) initially developed by the International Monetary Fund (IMF) in 2002 to assess the quality of economic data. In 2004, the World Bank and UIS adapted the IMF framework for the evaluation of education data, based on six dimensions of data quality.
Between 2004 and 2013, almost 30 diagnostics were conducted in sub-Saharan Africa. More recently, the UIS, in partnership with the Australian Department of Foreign Affairs and Trade and the Secretariat of the Pacific Community, conducted diagnostics in six Pacific island countries.
Making the EMIS diagnostic a peer learning exercise
The Association for the Development of Education in Africa (ADEA) has also developed an EMIS diagnostic tool, which looks at the performance of an EMIS through 17 norms and 117 standards. These norms and standards were developed and endorsed by African Union Member States through their respective Regional Economic Communities.
While the assessment framework was designed so that countries can use it for self-assessment, the value added of this exercise lies in the fact that EMIS diagnostics are usually conducted through a peer review methodology.
Officials from ministries of education from 3-5 countries are usually invited to travel to another country and meet with representatives from the government and the local education group to assess the performance of the EMIS.
This exercise allows participants to learn from their peers and discuss whether practices observed can be effectively replicated in other contexts. To date, EMIS peer reviews have been conducted in Angola, Botswana, The Gambia, Ghana, Mali, Mozambique, Swaziland and Uganda.
During the EMIS conference last April, participants from Mali presented their experience in conducting an EMIS peer review with support from ADEA. The exercise allowed the team to identify key strengths of the Malian EMIS, in particular that the country has a single agency in charge of all education sub-sectors, which inspired the team from Burkina Faso participating in the peer review mission. Following the exercise, Mali developed a costed action plan for 2017-2020 to strengthen its EMIS.
GPE will also be jointly funding an ADEA knowledge exchange activity with the World Bank in early 2019, facilitating the participation of Haitian ministry of education officials in a peer review of the EMIS in a west African country.
GPE investments and support to systems diagnostic
Over the past year, the GPE Secretariat has convened a task force on EMIS with several international development agencies, including ADEA, AFD, ECW, DFID, DFAT, UNESCO (including its institutes IIEP and UIS), UNICEF, UNHCR and the World Bank.
Through this task force, the international education community was given the opportunity to discuss and reflect on the experience of evaluating the performance of education management information systems.
Across the different diagnostic tools, we found that more than 50 EMIS diagnostics have been conducted over the past 10 years. During several rounds of consultations with various education stakeholders, we realized that countries need guidance to better navigate the different existing tools and to better understand their differences and added value.
GPE’s Knowledge and Innovation Exchange (KIX) funding mechanism will support global and regional initiatives that use knowledge exchange, evidence and innovation to help developing countries solve critical educational challenges.
As part of the data systems funding window in KIX, one identified priority is to better streamline and signpost EMIS diagnostic tools for use by ministry of education staff and members of local education groups.
This needs to be undertaken in parallel with the development of coherent and coordinated standards for data systems. Among other activities, this window will support partners interested in creating a unified EMIS diagnostic tool or toolkit.
The next blog in the series will discuss EMIS sustainability and data quality.
The GPE Secretariat would like to thank Matthieu Brossard, Ahmed Hadad, and Ismaila Berthe for their useful contributions to this blog.