How evaluation promotes evidence-based philanthropy

Zenex Foundation

Why is it necessary?

Like many education stakeholders, the Zenex Foundation invests time, expertise, and resources in education in the hope of shifting the needle and seeing improvements in learning outcomes. Because the problems are deep-rooted, almost intractable, and multi-faceted, both a sense of urgency and evidence-based funding in education are critical. Education interventions are complex, and solutions are equally complex, with multiple factors interacting in multiple ways – this makes the case for an evidence-based approach to grant-making.

What do we mean when we say we are evidence-based?

The term “evidence-based” features strongly in our strategy documents and in conversations that we have internally and with partners. It is an embedded approach in the way we work, what we do, and how we do it. The body of evidence about education interventions provides ongoing learning that informs our project design and rollout.

Over the years, we have used integrated evidence from local and international research, evaluation, and monitoring to inform our decision-making. Our evidence comes from evaluations and research, both our own and those of others, as well as from our own monitoring. There is a constant feedback loop in our strategy, creating a virtuous cycle in which we are continuously learning, monitoring, and using evidence to inform decisions. We embarked on this journey over 20 years ago with a high level of buy-in from our Board, who have been equally strong proponents of an evidence-based strategy, committing at least 15% of our annual budget to research, M&E, and knowledge sharing. This is a significant and long-term commitment.

This is a continuous cycle of generating new knowledge about what works, particularly in areas where little is known. We consciously consolidate lessons across our work in projects, research, and evaluation, and ensure that these lessons are shared both internally and with our partners. We believe that we can only advocate for policy and practice from an evidence base.

How does building evidence inform our projects?

We adopt a rigorous approach to designing both interventions and their evaluations. We are also purposeful in planning the implementation of the interventions we fund. The Zenex Foundation typically invests in three kinds of projects:

  • Pilot projects to learn where little is known. In some instances, there may be some preliminary evidence that an idea works, but it has not been tested. This is done on a small scale, in no more than 10 schools. We want to assess the feasibility of implementation with teachers and learners – is there take-up and traction? We undertake implementation or process evaluations using mainly qualitative data collection methods. Interventions are also tested in different contexts, for example urban and rural settings.
  • Proof-of-concept to inform scale and systemic interventions. Where there is sufficient evidence about feasibility, we design, implement, and evaluate interventions at a larger scale, with a sample large enough to generate rigorous evidence. Impact evaluations are undertaken with a counterfactual so that we can report on learner performance with confidence. This type of evaluation has approximately 50 schools in the implementation group and 50 schools in the control group. These impact designs include learner testing, and we also track implementation and the experiences of the target group. This helps us unpack the levers that positively or negatively affect learner performance. In proof-of-concept interventions, there is a level of control, making the process more manageable: there is a high degree of implementation fidelity and support from implementing partners.
  • Projects aimed at system change. Based on evidence, we can advocate and influence government to implement interventions at a large scale – in a district or a circuit, across more than 100 schools. These projects must be led by government and typically involve a co-funding partnership with government, in collaboration with other donors. This is where the rubber hits the road. It is also where we encounter challenges as a result of working within various government constraints, including a lengthy onboarding process to build relationships of trust, as well as challenges arising from joint accountability and our not having full control. For these projects, we also undertake impact evaluations, as we need to continuously learn about what works at scale at a systemic level. Increasingly, we are commissioning tracking and sustainability studies. System change is only effective if changes in routines, practices, and behaviours are embedded in the system.
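The counterfactual logic in the proof-of-concept designs above can be sketched as a simple difference in mean test scores between implementation and control schools, expressed in standard-deviation units (a common way impact evaluations report effect sizes). The scores below are invented for illustration only and do not come from any Zenex evaluation.

```python
import random
import statistics

random.seed(0)

# Hypothetical end-of-year test scores (percent) for a 50/50 design:
# 50 implementation schools and 50 control (counterfactual) schools.
treatment = [random.gauss(58, 10) for _ in range(50)]
control = [random.gauss(52, 10) for _ in range(50)]

# Difference in means: the raw estimate of the intervention's effect.
effect = statistics.mean(treatment) - statistics.mean(control)

# Express the effect in standard-deviation units so it can be
# compared across tests and studies.
pooled_sd = statistics.pstdev(treatment + control)

print(f"Difference in means: {effect:.1f} percentage points")
print(f"Effect size: {effect / pooled_sd:.2f} SD")
```

A real evaluation would additionally test the difference for statistical significance and adjust for baseline differences between schools; this sketch shows only the core comparison against a counterfactual.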

We have developed this typology for our projects, and donors can strategically fund at any level. An important consideration is to be deliberate and intentional about our work. Some donors have more of a risk appetite and can put funding behind innovation and testing; others aim to work at the systems level. No matter what type of project is funded, the Zenex model incorporates M&E and learning.

How do evaluation results feed into knowledge management?

Being evidence-based is one part of the puzzle. We also believe it is imperative to share learnings. Not all interventions have shown positive results; however, in the spirit of learning, we have shared these with the sector too. These findings have likewise informed our strategy on an ongoing basis. Strategy is a dynamic process, and we need to pivot based on evidence. We interrogate what works and what does not work so well, to understand what can be learnt. We believe it is imperative to share lessons so that we do not repeat mistakes or reinvent the wheel.

An evidence-based approach and systems change

Using an evidence-based approach is a necessary part of systems change. Uptake and utilisation by government is critical, and this requires multiple approaches to advocating and influencing. We believe we can best leverage our limited resources by contributing to knowledge building that informs evidence-based systemic interventions and policy.


Examples of how evidence informs our work over time

Learning Backlogs: Pilot Initiatives

In 2021, the Zenex Foundation published a series of research papers to consolidate the knowledge we have built over the years about the nature and scale of learning backlogs in Senior Phase Mathematics, Senior Phase English, Early Grades Mathematics, and Early Grades Literacy. We used evidence from numerous evaluation studies which highlighted learning backlogs as a contributing factor in poor learning outcomes in South African schooling.

The evidence is now being used to design projects to address learning backlogs. For example, we are currently working with partners to design pilot maths projects to address learning backlogs in Grade 4.

Foundational Mathematics Knowledge: Proof-of-concept to develop rigorous evidence

The Base-10 Mathematics Project started with a research-driven pilot, funded through the First Rand Foundation and the National Research Foundation (NRF) Mathematics Research Chair initiative, and tested in ten schools. The pilot showed that learners need a sound foundational understanding of number sense in the early grades to progress to higher-order mathematics learning in later grades. The pilot uses a particular pedagogical approach to mental mathematics. We have scaled the project up to 140 schools across Gauteng and KwaZulu-Natal, and have commissioned an impact evaluation to test whether the same outcomes can be achieved with a larger group of schools in different contexts, both urban and semi-rural.

Systemic Intervention: Grade R teacher training in Gauteng

Building on the implementation of this project in the Western Cape, a collaborative venture between funders, the Gauteng Department of Education (GDE), and Non-Profit Organisations (NPOs) was established with the aim of training all Grade R teachers in the province. Lessons from the Western Cape on implementation and evaluation design have been integrated into the Gauteng project. It is a province-wide initiative that is only possible through a co-funding partnership. The initiative is led by government not only in project management, but also in delivery: GDE provincial and district subject experts are delivering the training together with the partner NPOs. The project is in its second year of implementation, which follows two years of onboarding and materials development together with the province. An impact evaluation has been commissioned.
