Introduction

The Programme for International Student Assessment (PISA) is, as its name suggests, an international survey of students’ performance. The survey, which runs in three-year cycles, assesses the performance of 15-year-olds in reading literacy, maths and science. It also gathers information about students’ backgrounds and characteristics and about their schools.

PISA is administered by the Organisation for Economic Co-operation and Development (OECD). It ran for the first time in 2000 and participation has grown since then: more than 70 countries/economies now take part in the survey.

What PISA Assesses

The OECD states that PISA:

  • aims to evaluate education systems by testing the knowledge and skills of 15-year-olds; and
  • focuses on the knowledge and skills that are needed for full participation in society.

The primary purpose of the survey is to determine the extent to which students can apply their knowledge and skills in reading, maths and science to the challenges that they will encounter in adult life.

How PISA Assessments Work

PISA is a sample survey. A minimum of 150 schools in each country/economy takes part, and typically between 4,500 and 10,000 students in each country/economy sit the test.

Students from England, Wales, Scotland and Northern Ireland are included in the UK sample. Students from Wales, Scotland and Northern Ireland are sampled at a higher rate than students from England, which enables an individual report to be produced for each of the four nations.

Whilst PISA assesses performance in reading literacy, maths and science, only one of these domains, the ‘major domain’, is examined in detail in each cycle. For example, in 2006 the major domain was science, in 2009 it was reading literacy and in 2012, maths.

Students are required to sit a series of tests and complete a questionnaire. The questionnaire asks for information about the student and the school, including, for example, behaviour and discipline at the school. Parents are also asked to complete a questionnaire covering the family’s socioeconomic background, attitudes to education and learning, and education resources at home.

The headteacher of each school participating in the survey completes a questionnaire about the school, including information about teaching staff and school climate. PISA does not survey teachers; however, a teacher questionnaire will be included in the survey from 2015.

How the Results of PISA Surveys are Reported

PISA results are reported in several different ways. The most widely publicised and controversial is the publication of results in performance tables. Other PISA reports look at the relationship between students’ performance and social background, students’ attitudes and approaches to learning, and the relationship between students’ performance and education resources, policies and practice.

Country/economy test scores are standardised on a single scale with a mean of 500 and a standard deviation of 100. The scale for each subject was set in the cycle in which that subject was first the major domain: reading literacy in 2000, maths in 2003 and science in 2006.
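
As a simplified illustration (PISA’s actual scaling uses more complex statistical models, so the figures here are indicative only): if raw results are converted so that the OECD mean corresponds to 500 points and one standard deviation to 100 points, a student performing one standard deviation above the OECD mean would be reported as scoring 600 (500 + 100), and a student half a standard deviation below the mean as scoring 450 (500 - 50).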

Student performance is also reported through the use of proficiency levels on a seven-point scale ranging from 1b to 6 (1b, 1a and 2 to 6). The OECD considers level 2 to be the baseline of proficiency at which students demonstrate the knowledge and skills needed for adult life, and a score at level 4 or above is said to indicate strong performance. This information is used to indicate the spread of student performance within a country/economy. Combined with data about students’ backgrounds, it provides information about the relationship between performance and socioeconomic status.
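
As an illustration of why the spread matters (using hypothetical figures), two countries could both have a mean score of 500, yet in one most students might cluster at levels 2 and 3, while in the other a large group at levels 5 and 6 is offset by a long tail below level 2. The proficiency levels make this difference visible where the mean score alone would not.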

The Main Concerns About PISA

There are many criticisms of PISA. These include concerns about the survey’s educational and cultural limitations, its methodological limitations, what PISA actually measures, and political influence and interference.

Critics argue that PISA focuses only on reading literacy, maths and science and pays little attention to other aspects of the school curriculum. They also argue that the skills being assessed may not be consistent with the educational goals and ethos of some countries. Further, they argue that the need for questions to be relevant across cultures and education contexts limits the questions that can be included and risks creating a ‘one size fits all’ assessment that is not a good match for any single system.

Whilst the OECD claims that PISA tests the knowledge and skills that are essential for full participation in society, critics argue that a two-hour pencil-and-paper test is unlikely to assess these things. They also argue that ‘real life’ differs between countries and that this cultural bias will disadvantage students from some of them. Some suggest that PISA will lead to greater standardisation of education internationally; a particular concern is that systems will converge towards those of the countries with the greatest influence over the OECD and its work priorities (e.g. the USA, the UK and Australia).

Some argue that the format of PISA tests and students’ familiarity with the types of question will affect the test results. Students who are used to answering PISA-style questions and who regularly sit tests are likely to perform better than students who are not familiar with such questions or test environments.

A major criticism of PISA is that it is overtly political in nature and that the reporting of students’ results in the form of league tables encourages this, distorting education policies and priorities. For example, some governments seek to introduce policies that will improve performance in PISA (effectively teaching to the PISA test). Governments also use performance in PISA to justify the need for policy change and then ‘cherry pick’ policies from ‘high-performing’ systems that reflect their ideological position.

The Use of International Assessments

The NASUWT believes that evidence about students’ performance in international assessments, including evidence from PISA, can be very useful. For example, it enables policy makers and others to draw comparisons between education systems. It also enables them to examine elements of different education systems and consider the reasons for common problems, as well as why some policies are effective. The evidence also provides opportunities for policy makers to reflect on the implications of these analyses for education (and social and economic) policy and practice.

The key point is that evidence from international assessments such as PISA is useful as a tool for examining education policy and practice.

However, this is very different from using the evidence to claim that one education system is better or worse than another. It is not appropriate to use international evidence in this way, as the following example, which compares the results of PISA with those of another international assessment, the Trends in International Mathematics and Science Study (TIMSS), illustrates.

TIMSS assesses the performance of students aged 9-10 years and 13-14 years in maths and science and uses questions that are based on national curricula.

England participates in both PISA and TIMSS, as does New Zealand. In PISA 2009, New Zealand was one of the highest-performing countries and performed significantly better than England in reading literacy, maths and science. However, in TIMSS 2007, New Zealand’s results were below the international mean in maths and only just above it in science, and England’s results were significantly better than New Zealand’s in both subjects. In other words, choosing to use PISA rather than TIMSS to make a judgement about the quality of an education system could lead to very different conclusions about that system. Education systems are about much more than what international assessments test, so it is not appropriate for governments to adopt strategies simply to improve performance in international assessments.

The focus of policy reforms should be on agreed education priorities and on ensuring that policy is consistent with and contributes to the aims, purposes and values that underpin the education system. Policy should be based on a wide and rich range of evidence, including evidence from teachers and school leaders about effective practice and the issues that impact on learning and teaching. Of course, this may improve performance in international assessments such as PISA, and international assessments may be useful as one of the indicators that a policy is successful.

Education Policy and the Use of PISA and Other International Evidence

The Coalition Government places great emphasis on international evidence and uses it to justify many of its policy reforms. However, there are significant issues about the way in which this evidence is selected and used. The following two examples illustrate how the Coalition Government is misusing international evidence.

More School Autonomy and Parental Choice

The Coalition Government has used evidence about education policies in Sweden to justify introducing more school autonomy, greater parental choice and the creation of free schools. Speaking at a conference for school leaders, the Secretary of State for Education, the Rt. Hon. Michael Gove MP, said that results in Sweden improved fastest in the areas where schools exercised the greatest degree of autonomy and parents exercised the widest choice.

However, evidence from PISA shows that Sweden’s performance in PISA tests declined steadily from 2000 to 2009 in reading literacy, maths and science. Whilst Swedish students performed significantly better than the OECD average in all three domains in 2000, by 2009 they were performing only around the OECD average in reading literacy and maths and significantly below it in science.

Another study that looks at student performance internationally and in US states reaches similar findings. The study draws on a range of international evidence, including PISA, TIMSS and the Progress in International Reading Literacy Study (PIRLS). It reveals that students’ performance in Sweden declined over the period from 1995 to 2009 and that this decline is greater than that of any of the other 48 countries included in the study.

The decline in Sweden’s performance in international assessments corresponds with the introduction of greater school autonomy and parental choice, including the introduction of free schools.

A paper on Swedish free schools by Dr Susanna Wiborg draws on a range of research into these schools. The evidence suggests that children attending free schools are more likely to come from middle-class families and that free schools are concentrated in predominantly affluent, middle-class areas. The paper also finds that free schools exacerbate social and ethnic segregation.

This does not mean that the introduction of free schools, increased parental choice and greater school autonomy caused the decline in Sweden’s performance in international assessments, but it does raise significant concerns about these policies. It highlights the need for a careful examination of the reasons for the decline in students’ performance. It also emphasises the need to consider the impact of policy reforms on the education system as a whole, including different groups of children and all schools. For example, it is possible that improvements in standards in some schools could result in a decline in standards in other schools and across the system as a whole.

National Curriculum Reforms

When announcing plans to review the National Curriculum, the Coalition Government said that the review would ensure that ‘the construction and content of the new National Curriculum is based in evidence and informed by international best practice’. The Coalition Government established an Expert Panel to provide that evidence base.

The four-person Expert Panel published its report in December 2011 and, in June 2012, the Secretary of State for Education sent a letter to Tim Oates, Chair of the Expert Panel, setting out the Government’s next steps for the reforms.

Following the publication of the Secretary of State for Education’s letter, one of the members of the Expert Panel, Professor Andrew Pollard, published a blog outlining concerns about the way in which Ministers were interfering in the review process and about their failure to draw on the evidence provided by the Expert Panel. In the blog, Professor Pollard comments that Ministers are heavily influenced by the work of E. D. Hirsch and Hirsch’s ‘core knowledge’ curriculum. He says that when he was first appointed to the Expert Panel and met Nick Gibb MP, the Minister for Schools, the Minister had a copy of E. D. Hirsch’s book ‘heavily stickered with Post-It notes’ on his desk. Professor Pollard goes on to say that ‘Michael Gove’s instructions to Tim Oates…were to trawl the curricula of the world’s high performing countries, to collect core knowledge, and put it in the right order’.

Two other members of the Expert Panel, Professor Mary James and Professor Dylan Wiliam, also expressed concerns about the review process. Correspondence between Professor James, Professor Pollard and Ministers, published on the British Educational Research Association (BERA) website, reveals that both Professor Pollard and Professor James raised concerns about Ministers’ involvement in the review throughout their time as members of the Expert Panel. Similarly, Professor Wiliam has voiced his concerns about the National Curriculum review process in the media.

In his blog, Professor Pollard refers to evidence in the Expert Panel Report that emphasises the need for education to be ‘the product of interaction between knowledge and individual development’. He points to the need for ‘teachers to use their expertise to manage this interaction beneficially’, that ‘this is the real lesson of international evidence’ and that a review of the curriculum is ‘fundamentally flawed without parallel consideration of the needs of learners’.

Ministers refer to learning from high-performing nations such as Singapore. However, such countries emphasise the need for a very different curriculum to the ‘core knowledge’ curriculum that the Coalition Government is seeking to establish. For example, the Singapore Government’s Ministry of Education is seeking to develop a ‘21st century’ curriculum, designed to give students a more active role in their learning and to develop creative and entrepreneurial skills.

There is a wealth of national and international evidence about curricula, including national curricula. However, the Coalition Government is not drawing on this evidence; Ministers are using it very selectively to support or justify their plans. In doing so, they are ignoring evidence about how the proposed curriculum policies might affect education outcomes. The fact that the aims and purposes of the National Curriculum have not yet been discussed and agreed compounds the problem: the aims and purposes of the curriculum should guide both what information is needed and how evidence about policy and practice is assessed and evaluated.

Challenging the Inappropriate Use of PISA and Other International Assessments

The inappropriate use of international evidence must be challenged and its appropriate and effective use promoted.

The NASUWT has taken a leading role in highlighting some of the most significant abuses of international comparative data by politicians and by some commentators in the media. In particular, the Union has continued to emphasise that claims that PISA and other similar studies demonstrate that the education systems in the UK are in decline relative to other countries are entirely without foundation. The NASUWT has remained clear that assertions about the relative quality of the education systems in the UK made on this basis are frequently driven by ideological perspectives rather than a critical evaluation of the available evidence.

The NASUWT is also working with Education International, the global trade union federation for education, and with partner unions in other countries to raise awareness of the problems caused by the inappropriate use of PISA and other comparable assessments. The Union has engaged directly with the OECD on the use of PISA and, as a result of this work, has secured the inclusion of a questionnaire for teachers in the next round of PISA assessments in 2015.

A key priority for the NASUWT is to ensure that more research is undertaken on the strengths and limitations of PISA.

To this end, the Union has commissioned an independent expert investigation into the misuse of PISA by the Department for Education (DfE). The NASUWT is committed to working to ensure that public debate on PISA and its use by policy makers is informed by valid and reliable evidence.