Information about school performance can be powerful. Governments can use it to drive an accountability system, and schools can use data to identify areas for self-improvement. However, to be useful, the data need to provide fair and accurate information about school quality.
In the UK, Ark’s chain of academy schools is at the forefront of using data to ensure that all children maximise their potential. This experience is being used to drive innovative projects internationally: creating a new school information system which can be used in low-resource environments; improving the inspection of thousands of schools in India; and creating value added performance measures in Uganda.
What is the problem with school performance data and accountability in Uganda?
Each year, Ugandan newspapers publish ‘league tables’ to show the performance of schools. For a whole week every January, these league tables dominate the front pages of all the leading newspapers. The tables are based on the percentage of students in each school who achieve a “Division 1” ranking – the highest grade out of 4 in the Uganda Certificate of Education (UCE).
But this information can be misleading:
- It is relatively easy for schools with a high-performing intake to achieve good exam results.
- Equally, achievements of schools performing well under challenging circumstances are not recognised. Even schools with the best quality of teaching would struggle to obtain a high percentage of top grades if their students arrive without basic literacy and numeracy skills.
- Evaluating schools based on how many students achieve the best grades can encourage schools to focus largely on their highest performing students. Some teachers do not give sufficient time to helping those who are struggling to catch up. More than half of students drop out of school early, before they achieve their school leaving qualification.
By contrast, value added scores are calculated by comparing the results of each student at the end of secondary school to his or her primary exam scores. Schools get credit when a student performs better than expected, given their prior attainment. This helps to control for the ability of a school’s intake, and to gauge more accurately the quality of the education offered by each school. That’s why from this year schools in England will primarily be judged according to their value added score.
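The mechanics described above can be sketched in a few lines of Python. This is an illustrative toy rather than the official Ugandan model: the school names and scores are invented, and a real implementation would use proper statistical tooling. The idea is simply to regress secondary scores on primary scores, treat the regression line as each student’s “expected” result, and average the residuals per school:

```python
from collections import defaultdict

# Invented records: (school, primary score, secondary score)
records = [
    ("Nebbi High",  40, 70), ("Nebbi High",  45, 72),
    ("City School", 80, 82), ("City School", 85, 84),
    ("Rural Sec",   50, 55), ("Rural Sec",   55, 58),
]

def value_added(records):
    """Average per-school residual from a simple OLS fit of secondary on primary scores."""
    xs = [r[1] for r in records]
    ys = [r[2] for r in records]
    n = len(records)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least squares slope and intercept
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    # A school's value added = mean of (actual - expected) over its students
    residuals = defaultdict(list)
    for school, x, y in records:
        residuals[school].append(y - (intercept + slope * x))
    return {s: sum(v) / len(v) for s, v in residuals.items()}

scores = value_added(records)
# Schools whose students beat expectations rank highest, regardless of raw results
ranking = sorted(scores, key=scores.get, reverse=True)
```

With this made-up data, the school with the weakest intake but the biggest gains tops the ranking, even though its raw results are not the highest – exactly the reversal the league tables miss.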
A value added model in Uganda is useful and simple to develop
In Uganda, Ark collected examination results data from a representative sample of over 300 schools. We found that primary leaving exam results explain almost half of the variation in UCE scores, showing the importance of controlling for this factor when evaluating performance.
In addition, if value added measures were used, the newspapers’ league tables would look very different. The best school in our study was found in Nebbi district, which is in a remote region of Uganda normally associated with education underperformance.
Importantly, the value added model works well when it takes prior attainment into account alone; other factors add little. Adding a variable for socio-economic status increased the predictive power of the model only marginally.
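“Predictive power” here is the share of variation in outcomes a model explains – R². A minimal sketch of the comparison follows; the scores and the two sets of model predictions are invented to mirror the pattern described, and are not Ark’s data:

```python
def r_squared(actual, predicted):
    """Share of variance in outcomes explained by a model's predictions."""
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)                   # total variation
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))   # unexplained
    return 1 - ss_res / ss_tot

# Invented secondary scores, and predictions from two hypothetical models
actual              = [70, 66, 80, 55]
pred_prior_only     = [64, 60, 74, 64]   # using prior attainment alone
pred_prior_plus_ses = [64, 60, 74, 63]   # same model plus a socio-economic variable

gain = r_squared(actual, pred_prior_plus_ses) - r_squared(actual, pred_prior_only)
# The extra variable buys only a small increase in explained variance
```

If the gain in R² is this small, the simpler model is preferable: it needs no household survey data, only the exam records the government already holds.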
This means that the government can create the school value added scores using exam results only – data which they already hold. The examination board in Uganda is now working on producing this model for all the secondary schools in the country, by linking their primary and secondary examination databases.
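Linking the two examination databases amounts to a join on whatever identifier both hold for each candidate. A hypothetical sketch – the index numbers, and the assumption of a shared candidate ID across the two databases, are illustrative, not details from the source:

```python
# Hypothetical candidate index numbers mapped to exam scores
ple_results = {"U001": 62, "U002": 48, "U003": 75}   # primary leaving exam
uce_results = {"U001": 70, "U002": 66, "U004": 80}   # secondary (UCE)

# Only candidates present in both databases can enter the value added model;
# the dict-key intersection performs the join
linked = {cid: (ple_results[cid], uce_results[cid])
          for cid in ple_results.keys() & uce_results.keys()}
```

In practice the hard part is data quality – candidates who sit the two exams under different index numbers or name spellings drop out of the join – which is why the linkage work sits with the examination board.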
How will the government use the results?
Better information about school performance can give the government greater confidence to act on the data available.
At ‘system-level’ the examination board is interested in publishing value added scores next year to counterbalance the media’s misleading league tables. School inspectors also want to know the value added scores in advance of visits, so that they can challenge head teachers with more confidence. Schools would no longer be able to explain away poor results by blaming a low-potential cohort.
Both of these are promising avenues to explore. Ideally, they should be considered as part of a coherent accountability framework to make sure that the incentives for all people involved in education align (Pritchett, 2015). Further work would be beneficial, for example, to understand the extent to which parents can interpret and act on value added data, and the training required for school inspectors.
The Education Sector Plan in Uganda is due for renewal – perhaps this provides an opportunity to embed an intelligent accountability framework?
In the meantime, value added data can help refine individual projects, promoting efficiency and equity. For example, the Ministry of Education is planning to use value added data to identify the weakest schools in Uganda that need extra support.
What have we learnt?
The value added model works in Uganda, and it has the potential to improve secondary school accountability at low cost.
Value added data could benefit many other countries. The main requirement is a baseline assessment and a final assessment. Based on the list at the Education Policy & Data Centre, we count 27 developing countries with standardised tests for all primary and secondary school leavers, and there may be more.
There may also be some potential to develop value added measures at primary school level. England has a national assessment system in lower primary, which is used to form the baseline for a primary school value added measure. It would be worth exploring if any other countries have suitable assessments to adopt the same approach.
By Phil Elks from Ark.
Phil was the author of a recent Think Piece produced for DFID titled ‘Lessons learned from introducing value added performance measures in Uganda’.