
PISA – Show’s over: international study exposes government standards charade

Final straw for government’s education record: world’s most comprehensive assessment of pupil knowledge and skills crushes UK government claims of rising school standards.

PISA results show declining standards between 2000 and 2006:

  • Reading: the mean score of UK 15-year-olds fell 28 points, from 523 in 2000 to 495 in 2006. That is a decline from 23 points above the OECD average to just 3 points above it, and a drop from 7th to 17th place in PISA's international rankings
  • Maths: the mean score of UK 15-year-olds fell 34 points, from 529 in 2000 to 495 in 2006. That is a decline from 29 points above the OECD average to 3 points below it, and a drop from 8th to 24th place in PISA's international rankings (the arithmetic is checked in the sketch after these lists)
In contrast, the government's own results show rising standards at all expected levels between 2000 and 2006:

  • English: 4 percentage-point rise at Key Stage 2 (age 11), 8 percentage-point rise at Key Stage 3 (age 14), 7 percentage-point rise at GCSE (age 15)
  • Maths: 4 percentage-point rise at Key Stage 2 (age 11), 12 percentage-point rise at Key Stage 3 (age 14) and 9 percentage-point rise at GCSE (age 15)
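
As a sanity check on the figures above, the point declines and the UK's position relative to the OECD average can be reproduced with a few lines of arithmetic. The sketch below is purely illustrative: the OECD averages it uses (500 for both subjects in 2000; 492 for reading and 498 for maths in 2006) are inferred from the gaps quoted in the bullets, not separately sourced.

```python
# Quick check of the PISA score arithmetic quoted above. The OECD
# averages are implied by the bullet points (UK score minus the
# stated gap), not stated directly in this release.
scores = {
    "reading": {"uk_2000": 523, "uk_2006": 495, "oecd_2000": 500, "oecd_2006": 492},
    "maths":   {"uk_2000": 529, "uk_2006": 495, "oecd_2000": 500, "oecd_2006": 498},
}

for subject, s in scores.items():
    decline = s["uk_2000"] - s["uk_2006"]        # fall in UK mean score
    gap_2000 = s["uk_2000"] - s["oecd_2000"]     # + above / - below OECD average
    gap_2006 = s["uk_2006"] - s["oecd_2006"]
    print(f"{subject}: {decline}-point decline; "
          f"{gap_2000:+d} vs average in 2000, {gap_2006:+d} in 2006")

# reading: 28-point decline; +23 vs average in 2000, +3 in 2006
# maths: 34-point decline; +29 vs average in 2000, -3 in 2006
```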

Learning, time and money squandered

Results from the OECD’s Programme for International Student Assessment (PISA), released today, provide conclusive evidence that whilst government exam scores have been rising, standards have in fact sunk under New Labour.

‘These devastating results provide a much-needed and much-anticipated exposé of the government’s standards “charade”,’ said Anastasia de Waal, Head of Family and Education at the independent think tank Civitas. ‘That standards have actually declined amongst UK pupils is unsurprising when we consider that much of the ostensible improvement has been achieved through harmful shortcuts.

‘Suspicions were aroused when the UK was not included in the 2003 PISA results: the UK’s response rate was too low to meet PISA’s sampling standards. Academic commentary at the time attributed this to schools with poor results declining to participate. PISA’s damning verdict for 2006 shows that those suspicions were warranted.’

The chasm between the government’s claimed improvement and independent measures has opened up because higher test scores have been achieved artificially: by drilling schools and pupils for the tests and by lowering exam standards.

The real picture

PISA’s results for the UK are the culmination of extensive evidence showing that government claims of constantly improving school standards are contradicted by independent measures.

Several robust independent measures have shown that New Labour’s reported progress in both primary and secondary school test and exam results is inflated. For example:

Primary

Government statistics: Since 1997, the percentage of pupils at Key Stage 2 reaching the expected level or higher has increased from 63% to 80% in English, from 62% to 77% in maths and from 69% to 88% in science.

Independent counter-evidence

  • Progress in International Reading Literacy Study (PIRLS): has found that England dropped from 3rd place to 19th place between 2001 and 2006.
  • Primary Review, Faculty of Education, Cambridge University: has found that literacy levels have remained static since the 1950s and that one in three Key Stage 2 test results is wrong.
  • David Jesson, York University: has found that one in six pupils achieves a higher level in the Key Stage 2 tests than their teachers think they merit.
  • The Curriculum, Evaluation and Management Centre (CEM), Durham University: has found in its own tests that between 1997 and 2002 there was no evidence of improvement in literacy and only meagre improvement in maths, despite significant rises in Key Stage 2 test scores.
  • National Foundation for Educational Research (NFER): has found no improvement in standards despite rising Key Stage 2 scores. The NFER standardises Key Stage 2 test scores against a fixed reference sample. If actual standards rise, the tests have to be re-standardised to stop all the standardised scores shifting upwards. However, despite four years of rising Key Stage 2 test scores, by 2002 the NFER had found no need to re-standardise, indicating no genuine change in achievement (the logic is sketched below).
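
To make the re-standardisation argument concrete, here is a minimal sketch. All the numbers in it (a standardised scale with mean 100 and sd 15, hypothetical norms with raw mean 60 and sd 12, and the cohort means) are invented to illustrate the logic; they are not NFER’s actual norms, data or procedure.

```python
# Illustrative sketch of the re-standardisation logic; all numbers
# are hypothetical, not NFER's actual norms or data.

def standardise(raw, norm_mean=60.0, norm_sd=12.0):
    """Map a raw score onto a standardised scale (mean 100, sd 15)
    using norms fixed when the test was first calibrated."""
    return 100 + 15 * (raw - norm_mean) / norm_sd

# If genuine attainment rises, a cohort's mean raw score rises, and the
# mean standardised score drifts above 100 under the old norms -- which
# is exactly what would force a re-standardisation.
for year, cohort_raw_mean in [(1998, 60.0), (2002, 66.0)]:
    print(year, standardise(cohort_raw_mean))

# 1998 100.0
# 2002 107.5   <- drift that would require new norms

# NFER saw no such drift by 2002: standardised scores stayed centred on
# 100 even as Key Stage 2 results rose, implying the rise in test scores
# did not reflect a genuine rise in achievement.
```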

Secondary

Government statistics:

Key Stage 3: Since 1997, the percentage of pupils reaching the expected levels or higher has risen from 57% to 73% in English, from 60% to 76% in maths and from 60% to 72% in science.

GCSE: Since 1997, the percentage of pupils achieving 5 or more A*-C grades has risen from 46.3% to 61.5%.

A-level: Since 1997, the A-level pass rate has risen from 87.2% of all entries to 96.9%.

Independent counter-evidence

Key Stage 3:

  • Trends in International Mathematics and Science Study (TIMSS): Despite significant rises in Key Stage 3 maths and science results between 1995 and 2003, TIMSS scores for 14-year-olds showed no improvement over that period.

GCSE:

  • Robert Coe, The Curriculum, Evaluation and Management Centre (CEM), Durham University: Using the Year 11 Information System (YELLIS) ability test and averaging across 26 subjects, Dr Coe found that pupils of the same YELLIS standard could generally expect to achieve approximately half a grade higher at GCSE in 2005 than in 1996. A higher proportion of pupils could therefore achieve A*-C grades without any rise in pupil standard (the comparison is sketched below).
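
A minimal sketch of this matched-ability comparison follows. The grade-point scale and the yearly means are hypothetical illustrations, not Dr Coe’s data: the point is only that holding baseline ability fixed isolates grade inflation from any change in pupil standard.

```python
# Hypothetical illustration (not Dr Coe's data) of the matched-ability
# comparison: hold the baseline YELLIS ability band fixed and compare
# the mean GCSE grade it translated into in each year.
GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3}
POINTS_TO_GRADE = {v: k for k, v in GRADE_POINTS.items()}

# Mean GCSE points for pupils in one fixed YELLIS ability band,
# averaged across subjects (invented values for clarity).
band_mean = {1996: 5.0, 2005: 5.5}

print(f"1996: mean grade {POINTS_TO_GRADE[band_mean[1996]]}")      # C
inflation = band_mean[2005] - band_mean[1996]
print(f"Gain for the same ability band: {inflation} of a grade")   # 0.5
# The same pupil standard earns half a grade more in 2005, so a higher
# proportion clears the A*-C threshold with no rise in attainment.
```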

A-level:

  • Robert Coe, The Curriculum, Evaluation and Management Centre (CEM), Durham University: Dr Coe used the Test of Developed Abilities (previously the ITDA) to compare pupils’ actual attainment from year to year with their paper qualifications. Averaging across 40 A-level subjects, he found that pupils scoring 50% on the test in 1997 would tend to achieve low C grades, but by 2005 were achieving low B grades. Again, pupils could achieve a better grade without any rise in pupil standard.

PISA’s results, supported by other independent measures, show not only that rising test and exam results in the UK do not equate with higher standards of knowledge and skills, but that achieving them has often involved lowering standards. The methods the government has encouraged in order to achieve many of its results show a far greater commitment to Public Service Agreement targets than to pupils.

Nowhere to hide

The government will doubtless argue that the 2006 PISA data cannot be compared with the data from 2000. However, PISA clearly states that its assessment is designed to allow countries to “…track their progress in meeting key learning goals”.

Whilst the science data between 2000 and 2006 are not comparable, the reading and maths data are valid measures of trends over time.

‘The government must finally take responsibility for the failure it has covered up until now,’ concludes Anastasia de Waal.

For more information, ring:

Anastasia de Waal, Head of Family and Education: 020 7799 6677 (w), 07930 354234 (mobile)
