How can data be used to improve teaching and learning?

Lessons from successful approaches to collecting data and using it in innovative ways to drive and improve teaching and learning practices.

January 31, 2025 by Clio Dintilhac, Gates Foundation
A teacher at the blackboard at the Nyamachaki Primary School, Nyeri County, Kenya. Credit: GPE/Kelley Lynch

Which schools perform better in your district? Which ones are lagging and why? Do all children have books? Are teachers teaching effectively and following the lesson plans? Did trainings happen and were they effective?

These were some of the questions I had on my interview guide with officials at the district level in a sub-Saharan African country a few months ago.

I will always remember their candid reply:

  • “We have limited data on which schools are lagging and limited means to support them.”
  • “Our ratio of inspectors/advisors per school is 1 to 200. It's physically impossible for them to visit every school even just once a year.”
  • “Our inspectors have little budget to go out in the field; if they want to go, it's by their own means.”
  • “When they go out in the field, they fill out a paper report, which takes time to review.”
  • “Even once we have the information, we have little ability to really act on the data.”

Collecting and using school data to monitor and improve performance is complex, even with motivated civil servants.

It is critical to better understand how to help district and sub-district offices and the ministry monitor whether learning is happening and whether the key activities that support effective instruction are taking place.

Key strategies for large-scale learning results

We have solid grounds for optimism. A great deal of evidence exists on how to improve foundational learning at scale. The Global Education Evidence Advisory Panel summary highlights two evidence-based approaches:

  1. The use of structured pedagogy in a whole-class setting: Teachers are provided with guides, student materials aligned with the guides, training, coaching and regular follow-up to support structured pedagogy.
  2. Teaching at the right level (TaRL) for remediation: This method groups students by skill level and adapts teaching to their needs.

However, these methods only work if teachers apply them daily. To monitor this, you need data. Data on:

  • Learning
  • Teacher practices
  • Whether key activities enabling good teaching practices are happening, such as books arriving or teacher training taking place (more on how to monitor key program activities here).

International lessons on data use in education

Successful systems use data in innovative ways to drive and improve learning. A review of some successful educational reforms suggests three promising approaches:

1. Using student-level learning data

The first approach is to use learning data collected at the student level to identify schools in need of additional support to reach learning benchmarks.

The TaRL approach in Zambia demonstrated large-scale results: after one year, reading proficiency increased from 34% to 52% (as measured by the ASER test in a 2016 pilot) for 1,200 learners in grades 3 to 5 across 80 schools in the Eastern and Southern provinces.

Teachers assessed students three times a year to allocate them to remediation groups so that they could catch up at a level and pace best suited to their needs. This assessment data was used directly to inform teaching decisions and was also consolidated at the district and regional levels to measure school progress.
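To make this concrete, here is a minimal sketch, in Python, of how such assessment data might flow from the classroom decision (grouping learners by skill level) to a district-level summary. The record fields, level labels and the data itself are illustrative assumptions, not the official TaRL or ASER instruments.

```python
# Minimal sketch (not the official TaRL tooling): allocate learners to
# remediation groups from a periodic assessment and roll the results up
# to the district level. Level labels and records are illustrative.
from collections import Counter, defaultdict

# Hypothetical assessment records: (student_id, school, district, reading_level)
assessments = [
    ("s001", "School A", "District 1", "beginner"),
    ("s002", "School A", "District 1", "letters"),
    ("s003", "School B", "District 1", "words"),
    ("s004", "School C", "District 2", "paragraph"),
    ("s005", "School C", "District 2", "story"),
]

# 1. Teaching decision: group students by current skill level, not by grade.
remediation_groups = defaultdict(list)
for student_id, school, district, level in assessments:
    remediation_groups[(school, level)].append(student_id)

# 2. Consolidation: share of learners at each level per district, so
#    officials can see which schools and districts are progressing.
district_summary = defaultdict(Counter)
for student_id, school, district, level in assessments:
    district_summary[district][level] += 1

for district, counts in district_summary.items():
    total = sum(counts.values())
    shares = {level: round(n / total, 2) for level, n in counts.items()}
    print(district, shares)
```

The point of the consolidation step is that the same records teachers use to group learners can, once aggregated, tell district officials where progress is and is not happening.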

One drawback to this approach is the amount of classroom teaching time it takes to collect learning data from each pupil several times a year. Thankfully, efforts are underway to reduce the time teachers spend on these assessments in the TaRL method.

2. Using implementation indicators in structured pedagogy programs

The second approach, used by many structured pedagogy programs in low- and middle-income countries, involves collecting key implementation indicators through district-level staff or headteachers, often on tablets.

Indicators are captured by questions such as, “Are the materials in the class?”, “Do teachers use them?”, “Do they follow the lesson plan?”, or “Do students engage directly with the skills at hand?”

This monitoring approach has been implemented in many structured pedagogy programs such as Tusome in Kenya.
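As a rough illustration of what this kind of tablet-based monitoring data can look like, the sketch below encodes a classroom visit as a set of yes/no indicators and computes a simple implementation score. The indicator names paraphrase the questions above, and the scoring rule is an assumption, not the actual Tusome data system.

```python
# Minimal sketch (not the Tusome data system): yes/no implementation
# indicators captured during a school visit, rolled up into a simple
# implementation score per classroom. Indicator names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ClassroomObservation:
    school: str
    district: str
    indicators: dict = field(default_factory=dict)  # indicator -> True/False

    def implementation_rate(self) -> float:
        """Share of observed indicators answered 'yes'."""
        if not self.indicators:
            return 0.0
        return sum(self.indicators.values()) / len(self.indicators)

visit = ClassroomObservation(
    school="School A",
    district="District 1",
    indicators={
        "materials_in_class": True,
        "teacher_uses_materials": True,
        "lesson_plan_followed": False,
        "students_practice_skill": True,
    },
)

print(f"{visit.school}: {visit.implementation_rate():.0%} of indicators met")
```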

The advantage is a comprehensive diagnosis of implementation issues, enabling data collectors to give feedback to teachers. The drawback is that there are often not enough staff at the district and sub-district levels to carry out this level of monitoring, and school staff already contend with a large workload.

Some countries use rotation systems to ensure schools struggling to maintain and improve learning outcomes get more visits for additional support. When government staff is unavailable, dipstick surveys are one way to still monitor the implementation of structured pedagogy programs through external technical assistance.

3. Adopting innovative uses of administrative learning data

This last approach, although less evaluated, is pragmatic and cost-effective. It uses examination and administrative data to better understand school performance.

In Nepal, dashboards based on administrative learning data have been used to identify disparities so that resources can then be focused on the most vulnerable areas.

Creating such dashboards requires relevant data to be collected through administrative systems or proxies, such as exam results. Although a promising approach, more evidence is needed to assess under which conditions administrative data dashboards can accurately target schools in need of more support.
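The underlying logic of such a dashboard can be sketched very simply: aggregate exam results by school and flag those below a benchmark for additional support. In the Python sketch below, the field names, the figures and the 50% cut-off are illustrative assumptions, not Nepal's actual system.

```python
# Minimal sketch of an administrative-data dashboard input: compute exam
# pass rates by school and flag those below an (illustrative) benchmark
# for follow-up. All field names, figures and thresholds are assumptions.
exam_records = [
    {"school": "School A", "district": "District 1", "passed": 38, "sat": 60},
    {"school": "School B", "district": "District 1", "passed": 51, "sat": 55},
    {"school": "School C", "district": "District 2", "passed": 20, "sat": 58},
]

BENCHMARK = 0.50  # illustrative cut-off for flagging schools

for record in exam_records:
    pass_rate = record["passed"] / record["sat"]
    flag = "needs support" if pass_rate < BENCHMARK else "on track"
    print(f'{record["district"]} / {record["school"]}: {pass_rate:.0%} ({flag})')
```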

These three approaches to using data to improve learning are not mutually exclusive and can be combined to good effect, as discussed in a panel at the FLEX conference (summarized here, with additional reflections on the conference shared here).

For example, Senegal uses both the first and second approaches to manage its system: quarterly assessments at the student level (approach 1), and data on the fidelity of implementation of its structured pedagogy programs, that is, how closely teaching materials are delivered to students as intended, measured with a new dedicated tool (approach 2).

Daara Serigne Mansour Sy, Tivaouane, Senegal. Credit: GPE/Chantal Rigaud

What we know and don't know yet

We've already learned a lot from existing models for using data at the district and sub-district levels to better understand teaching and learning in schools. However, much remains to be explored about the full range of benefits that more and better data can bring to helping education systems improve learning.

This blog illustrates different approaches to data collection and use, but equally important is understanding how best to support data use for teaching practice.

Decentralized levels often have a lot of administrative and management work to do that is not directly related to instructional support.

Helping them prioritize pedagogical support requires reprioritizing and redesigning their work. More evidence and advocacy are needed on effective approaches and data use to improve teaching and learning in classrooms.

I look forward to learning about different countries' approaches to these key issues at the dedicated panel at the conference "Apprendre pour demain".

In the words of 10-year-old learning champions Zawadi and Elvis from Kenya, we need more relevant data for improving learning: let’s ‘measure early, measure all and measure well.’
