We’re all overwhelmed with information these days, so we rely on those in the know – public bodies usually – to interpret stuff for us.
In the world of education, one of those public bodies is the Higher Education Funding Council for England (HEFCE), which has been researching how today’s youngsters do at university. Given the recent big boom in university student numbers, this is a particularly important question, and we must hear the true answer rather than the answer we want to hear.
Originally, their study startlingly found that a higher proportion of state-educated students gained good degrees than students who had been to independent schools. They found plenty else, by the way, of a far more worrying nature: ethnic minority, disabled and part-time students seem to do disturbingly less well, for example. But it was the school-background statistic that grabbed the headlines.
HEFCE had analysed all graduates from 2013-14, tracked back to their A-level (or other) entry qualifications, compared school backgrounds at each level of A-level result, and then looked at degree results. They found that for those with the very top A-levels, school background had no ‘effect’ (an interesting choice of words anyway) on degree outcome, but that among those with ‘medium’ qualifications, more from state schools did well. Overall, they found that 73% from independent schools gained good degrees as opposed to 82% from state schools. So, state school-educated students do better, was their overall conclusion.
Except that they were completely wrong.
Two key columns had been added up wrongly. In fact, 82% of independently educated students gained good degrees, against 73% of state-educated students. Well, anyone can make a mistake in adding up – even if they are working for a pretty august body.
But HEFCE’s reaction when the mistake was pointed out was illuminating. They corrected the percentages but stuck to their conclusion. Yes, you did read that correctly: they reversed the 73% and the 82% and then, bizarrely, said that this didn’t alter their overall finding. Their report does point to differences in university results between some state- and independently-educated undergraduates, but these relate to a tiny group among those who had good (but not the top) A-level results at school or college: 2,200 students, or one per cent of all state-funded entrants.
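To put that scale in perspective, a quick back-of-the-envelope calculation helps. This is a sketch only: the 2,200 and one-per-cent figures are those reported above, and the cohort size is simply what those two numbers imply, not a sourced statistic.

```python
# Illustrative arithmetic only. The subgroup size (2,200) and its stated
# share of entrants (1%) are the figures reported in the article; the
# total cohort below is merely what those two numbers imply.
subgroup = 2_200            # state-funded entrants in the 'good but not top A-level' band
share_of_entrants = 0.01    # the stated proportion: one per cent

implied_cohort = subgroup / share_of_entrants
print(f"Implied state-funded entrant cohort: {implied_cohort:,.0f}")
# A subgroup of 2,200 inside a cohort of roughly 220,000 is a very slim
# base for a national 'state students do better' conclusion.
```

In other words, the residual difference HEFCE leaned on concerns around one student in every hundred of the cohort it was generalising about.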
This provides evidence so slight against an overall picture so clear that it seems perverse to stick to the ‘state students do better’ theory. As Alan Smithers of Buckingham University asked: “What model can perform this kind of alchemy?”
The answer is a statistical model based on a presupposition of disadvantage suffered by those not in independent schools. It’s a classic circular argument: some students must be at an advantage, so assume they are, remove that advantage and they don’t do so well. So – goes the argument – the advantage must exist. HEFCE’s analysis both ignores the tiny numbers and sets aside the fact that entry qualifications are by far the best predictor of degree success.
As ever, with these stories, so what?
Well, so three things. First, we should continue to question the wisdom of spending public money researching ‘degree outcomes’ as if degree classifications were the same at every institution of higher education. Every student and every employer knows they are not.
Second, it is silly and intellectually lazy to investigate and then draw nationally published conclusions based on ‘school type’ as if that were a legitimate label for a student’s background. There are – as every parent knows – many different types and conditions of maintained schools, and many different types and conditions of independent schools. The overlap in quality, social clientele and style is considerable. Yes, it makes a good story, but it hardly makes good science.
Most of all, the reaction of HEFCE to its own research is worth putting under the microscope. That they should make a basic mistake in adding numbers is not the end of the world (though shouldn’t someone in a national, publicly-funded statistics agency have checked?). That they are so wedded to their original theory (‘state-educated do better’) as to not even reconsider it when they see that their statistics point to the exact opposite conclusion, points at best to myopia, at worst to groupthink.
We hear daily that politicians are so wedded to dogma that they will not listen to facts. We hear the President of FIFA admitting that the World Cup decision had been made before the evidence was heard. Does educational research operate by different principles to the rest of us? Are belief and prejudice what really matter here, rather than the facts?
Chris Ramsey is Headmaster of The King’s School, Chester, and speaks for HMC on Higher Education.