Stats spat over worldwide education rankings

The leaning tower of PISA indicators: is faith in statistics misplaced?

Newspapers are bursting with grim data after Britain’s poor performance in a study known as ‘education’s world cup’. But how useful are sweeping international statistics like these?

A study released on Tuesday has thrown Britain into spasms of shame and self-flagellation. Newspapers trumpet damning statistics while politicians squabble over who is to blame. And it’s not just the UK: the hand-wringing is echoed from France to Scandinavia to the USA.

The cause of all this fuss? A report from the OECD known as the Programme for International Student Assessment (PISA), which pits 15-year-olds around the world against each other in a series of tests to determine each country’s level of educational attainment.

The PISA report is released every three years and is based on data from 65 countries and economies. It is, according to one expert, ‘education’s equivalent of the football World Cup’. And many Western countries are being outplayed.

Britain has dropped out of the top 20 and made no discernible progress in maths, science or reading. The USA performed worse than Vietnam. And Finland, which once perched proudly at the top of the rankings, is mortified to find itself outside the top ten. Its place has been taken by a group of East Asian economies such as Shanghai, South Korea and Singapore.

Politicians and commentators brandish the findings at their ideological opponents, claiming that the results back up their opinions. But some statisticians have questioned whether the study can really be used in this way.

One problem is that PISA only tests certain skills (such as practical mathematics), which may suit some education systems better than others. Then there is the fact that not all students answer exactly the same questions: each is given a sample of the full test, and the data is then crunched in sophisticated and laborious ways. The resulting uncertainty means that the UK’s true rank could be anywhere from 23rd to 31st – and critics claim that the margin of error is wider still.

And even if the results are reliable, the cause of this variation is not easy to determine: a successful education system might be the result of anything from great teachers to deep cultural differences.

PISA express

So is there any point in statistics like this at all? Many will suspect not. And a few commentators go even further: the tyranny of statistics, they say, encourages policy makers to abandon what works in their own society to scramble for credit in a flawed international competition.

But the researchers who compile measures like PISA have an unshakeable faith in the power of data – as long as it is interpreted with care. The rankings alone can’t tell us everything, they admit, but read them with caution and you will find a wealth of revelations hidden beneath the statistical surface. Numbers may not tell the whole story, but they have a lot to say to those who know how to listen.

You Decide

  1. Do you trust statistics you see on the news? Why / why not?
  2. How much attention should political leaders pay to statistical comparisons between their countries and others?

Activity

  1. How do you think your country’s schools could improve their results? Think of one idea and present it to the class.
  2. Find a statistic in a newspaper or on a news website and do some background research into it. Where does it come from? Is it misleading? Is there any extra information that should be taken into account?

Some People Say...

“Numbers speak louder than words.”

What do you think?

Q & A

So can I trust statistics at all?
They can be very illuminating. But don’t accept them at face value: when you read a number in a headline, think carefully about exactly what is being measured, how and by whom. Often an eye-catching headline can hide a more complex reality.
Why do East Asian students do so well in these tests?
A number of theories have been put forward, but one factor stands out: they work unbelievably hard. In countries like South Korea it’s not uncommon for school children to study for 12 hours or more each weekday, followed by booster classes at the weekend.
That sounds like hell.
Maybe. But it also reveals something that might encourage you: academic success is possible regardless of your supposed natural ability. ‘Talent’, as one expert put it, ‘is a myth.’

Word Watch

OECD
The Organisation for Economic Co-operation and Development is one of the most important think tanks in the world. It collects and analyses data from 34 core countries (plus occasionally others) and uses its findings to make recommendations about government policy.
Shanghai
Most of the economies participating in PISA are countries, but China sends data from a handful of cities instead. Shanghai came top of this year’s table, followed by another Chinese city, Hong Kong. Students in these cities study for long hours in school and are set around 14 hours of homework each week.
Margin of error
Most statistics are gathered by testing or questioning a sample group and using the results to make inferences about the population as a whole. This can give a very good indication of the truth, but there is always some uncertainty. Statisticians deal with this by quoting a ‘margin of error’ for each statistic they generate: for a rigorous and extensive study the margin might be plus or minus two percent, but it can sometimes be far greater.
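The ‘plus or minus two percent’ idea can be made concrete with the standard textbook formula for a sample proportion. This is an illustrative sketch only – it is not the method PISA itself uses, and the sample sizes below are made up for the example:

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    Uses the normal approximation: z * sqrt(p * (1 - p) / n).
    proportion=0.5 is the worst case, giving the widest margin.
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# Hypothetical surveys: a sample of 1,000 people gives a margin of
# roughly +/- 3 percentage points; quadrupling the sample to 2,500
# only narrows it to about +/- 2 points.
print(round(margin_of_error(1000) * 100, 1))   # ≈ 3.1
print(round(margin_of_error(2500) * 100, 1))   # ≈ 2.0
```

Notice that halving the margin requires roughly four times as many respondents – one reason very precise international comparisons are so expensive to produce.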

