Why is PISA getting such a bad rap lately?

I have been reading a great deal of educational comment recently questioning what is described as an obsession with PISA scores. PISA is the OECD’s three-yearly survey of the educational achievement of 15-year-olds in science, mathematics and reading. Let’s get one thing clear from the start.

The criticisms have more to do with us and how we are using PISA than with the PISA assessments and data themselves.

Most of the criticism of PISA seems to be coming from the USA, where an epidemic of standardized testing has swept across the country, causing a significant malaise in education. In the USA we are looking at PISA as if it were yet another high-stakes test. Viewed this way, the stakes are indeed very high, because it is the nation’s education system that is being assessed – and found wanting. In Australia there is similar criticism, perhaps because, although the nation’s 15-year-olds still perform well above the OECD average, their edge is slipping.

I guess it is understandable that when you see a test is not treating you well, the first thing you want to do is find fault with the test. The problem is that we are focusing on the wrong things.

Finland has been lauded internationally for some time now because of its student achievement levels on all three PISA measures – science, mathematics and reading. I listened recently to Dr Pasi Sahlberg, a lifelong educator who was traveling in Australia earlier this year. He is currently the Director General of the Centre for International Mobility and Cooperation (CIMO) in Finland; previously he was a Senior Education Specialist with the World Bank as well as the Director of the Centre for School Development, Helsinki.

Finland, it seems, has a very different attitude to PISA and, in fact, to all forms of standardized testing. PISA testing just happens to be something that they do. It is not viewed with anything like the significance that countries caught up in the Global Educational Reform Movement (GERM) give it. In Finland, students do not take any external standardized tests until they have finished high school – at around 18 years of age. They do not judge the quality of their education using PISA data, but they do acknowledge that the PISA data reinforces what they already knew about their education – that they had excellent teachers and successful schools.

And they knew all this without ever having resorted to widespread standardized testing.

The problem with our way of looking at PISA is that we focus on the numbers (are we above or below the OECD average?) and the rankings (who is beating us?). Our efforts are then directed towards trying to improve the score, competing with the others and moving up the ranks (the Prime Minister of Australia is determined to have Australia back in the top five of the PISA rankings by 2025). We start asking questions about how to tighten and standardize the curriculum, how to ensure that everyone is teaching the things that are going to be tested, and how to give our students the test-taking skills they will need to be successful in the tests.

But PISA is far more than ranking tables and scores. If only we would take the time to examine the data, we would learn much more about how to improve our education systems. We would discover that we will achieve nothing if we continue to focus on accountability and standardizing the curriculum.

We know from PISA that in the 2006 round of testing, less than 10% of the variation in student performance was explained by student background in five of the seven countries with the highest mean science scores (above 530 points). PISA demonstrates that equity and performance are highly related. In Finland, equity is considered more important than excellence. Dr Sahlberg tells us that “we have systematically focused on equity and equality in our education system, and not so much on excellence and achievement like many other countries have done.”
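
A brief aside on what “variation in student performance explained by student background” actually means, for readers who want the mechanics: it is essentially the R-squared of a regression of test scores on an index of socio-economic background. The short sketch below illustrates the calculation in Python; the numbers and variable names are invented for illustration and are not PISA data.

    import numpy as np

    # Hypothetical example only: science scores and a socio-economic
    # background index for a handful of students (not PISA data).
    background_index = np.array([-1.2, -0.5, 0.0, 0.4, 0.9, 1.5])
    science_score = np.array([489.0, 512.0, 530.0, 541.0, 538.0, 560.0])

    # Fit a simple linear regression of score on background.
    slope, intercept = np.polyfit(background_index, science_score, 1)
    predicted = slope * background_index + intercept

    # "Variation explained by background" is the R-squared of this fit:
    # the share of total score variance accounted for by the regression.
    ss_res = np.sum((science_score - predicted) ** 2)
    ss_tot = np.sum((science_score - science_score.mean()) ** 2)
    r_squared = 1 - ss_res / ss_tot

    print(f"Share of score variation explained by background: {r_squared:.1%}")

A figure under 10%, as in those high-performing countries, means that knowing a student’s background tells you very little about how well they will score – which is precisely the equity the data point to.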

What are we doing to ensure that all our young people have access to the same quality of education regardless of socio-economic circumstances? In Finland it is illegal to charge fees for any education program that leads to a qualification, because education is deemed to be a right of all its citizens. There are no private schools in Finland. If only we looked more closely at the PISA data, we would see that equity, not accountability or curriculum, is foundational to high achievement.

In Finland it is virtually impossible for a ‘bad’ teacher to enter the profession. The demand for teacher-training places is so high that last year 2,500 people competed for 120 positions, and the selection panel was able to cherry-pick the best of the best. Teachers are not paid dramatically high salaries, but they are highly respected. According to Dr Sahlberg, “it’s more difficult to get into primary school teacher education in Finnish universities than medicine.”

We could also learn much about student attitudes towards specific curriculum areas, as well as their views on the world. For example, a majority of students reported in PISA 2006 that they were motivated to learn science, but only a minority reported interest in a career involving science. This information needs to be fleshed out if we are hoping to create a nation of innovation and creative development in the immediate future. Why is it that one significant feature of a student’s background in terms of science achievement was whether they had a parent in a science-related career? What is the significance of the fact that PISA reveals US 15-year-olds did not do well in mathematics and yet feel very confident in their mathematical abilities?

It is disturbing to discover that there is some degree of pessimism among the students about the future of the natural environment. On average across OECD countries, only 21% of students reported that they believed the problems associated with energy shortages would improve over the next 20 years. Are we taking this into account when we review our educational priorities?

PISA is a powerful resource if we dig more deeply and use it to do something more than hijack the ranking tables to justify a test-taking industry that is capitalizing on our failure to think below the surface. In New York, Pearson Education currently has a five-year, $32 million contract to administer state tests, and it also creates and sells education programs seamlessly aligned with the high-stakes standardized student tests and teacher assessments it is selling. Pearson reported revenues of approximately $9 billion in 2010. This is big business.

What a tragedy if the most tangible outcome of a comprehensive review such as the OECD’s PISA were to be the creation of a multi-billion-dollar industry rather than a successfully educated generation of our nations’ children.


Filed under Language and literacy, Testing, Thinking

12 responses to “Why is PISA getting such a bad rap lately?”

  1. PISA assesses whether students can apply their learning, solve problems, etc. This is far more useful than the summative testing that takes place in most educational systems. Education which is geared to the real needs of students enables them to apply knowledge and skills in the real world and thereby become more capable of living lives that are independent, productive and creative. This is a very far cry from the type of education that simply aims at enabling students to achieve high scores in timed tests and exams. It’s also important to recognise that student-centred education happens to produce students whose motivation to learn makes them the most capable students when it comes to attainment in examinations at the age of 15 and 18. http://3diassociates.wordpress.com/2012/07/25/1850/

    • It is a sobering realization that US students do better in TIMSS than in PISA. TIMSS assesses the extent to which they are able to remember/recall/reproduce the taught curriculum, whereas PISA assesses the extent to which they can apply it.

  2. Read Now You See It by Cathy Davidson. A great book, grounded in neuroscience, that looks at the question of attention and makes a strong case that we are directing our attention to the wrong things.

  3. Pingback: This Week’s “Round-Up” Of Good Posts/Articles On Education Policy | Larry Ferlazzo’s Websites of the Day…

  4. Pingback: The Best Sites For Getting Some Perspective On International Test Comparison Demagoguery | Larry Ferlazzo’s Websites of the Day…

  5. “Most of the criticism of PISA seems to be coming from the USA”

    Do you read anything other than English?

  6. Criticism in French of PISA and Finland (or the lesson drawn from Finland).

    Note how Finnish university professors complain about the poor level of their science students…

    http://www.xn--pourunecolelibre-hqb.com/2010/12/les-traits-du-systeme-finlandais-que.html

  7. Finnish miracle: fata morgana?
    Finnish students’ achievement (at 15 years) has declined significantly: a study by the University of Helsinki
    University of Helsinki – Faculty of Behavioral Sciences, Department of Teacher Education, Research Report No. 347. Authors: Jarkko Hautamäki et al. Learning to learn at the end of basic education: Results in 2012 and changes from 2001.
    Since 1996, educational effectiveness has been understood in Finland to include not only subject-specific knowledge and skills but also the more general competences which are not the exclusive domain of any single subject but develop through good teaching along a student’s educational career. Many of these, including the object of the present assessment, learning to learn, have been named in the education policy documents of the European Union as key competences which each member state should provide their citizens as part of general education (EU 2006).
    In spring 2012, the Helsinki University Centre for Educational Assessment implemented a nationally representative assessment of ninth-grade students’ learning-to-learn competence. The assessment was inspired by signs of declining results in the past few years’ assessments. This decline had been observed in the subject-specific assessments of the Finnish National Board of Education, in the OECD PISA 2009 study, and in the learning-to-learn assessment implemented by the Centre for Educational Assessment in all comprehensive schools in Vantaa in 2010.
    The results of the Vantaa study could be compared against the results of a similar assessment implemented in 2004. As the decline in students’ cognitive competence and in their learning-related attitudes was especially strong in the two Vantaa studies, only six years apart, a decision was made to direct the national assessment of spring 2012 to the same schools which had participated in a respective study in 2001.
    The goal of the assessment was to find out whether the decline in results observed in the Helsinki region was the same for the whole country. The assessment also offered a possibility to look at the readiness of schools to implement a computer-based assessment, and how this had changed during the 11 years between the two assessments. After all, the 2001 assessment was the first in Finland where large-scale student assessment data was collected in schools using the Internet.
    The main focus of the assessment was on students’ competence and their learning-related attitudes at the end of comprehensive school education, but the assessment also relates to educational equity: to regional, between-school, and between-class differences, and to the relation of students’ gender and home background to their competence and attitudes.
    The assessment reached about 7,800 ninth-grade students in 82 schools in 65 municipalities. Of the students, 49% were girls and 51% boys. The share of students in Swedish-speaking schools was 3.4%. As in 2001, the assessment was implemented in about half of the schools using a printed test booklet and in the other half via the Internet. The results of the 2001 and 2012 assessments were aligned through IRT modelling to secure the comparability of the results. Hence, the results can be interpreted to represent the full Finnish ninth-grade population.
    Girls performed better than boys in all three fields of competence measured in the assessment: reasoning, mathematical thinking, and reading comprehension. The difference was especially noticeable in reading comprehension, even though girls’ attainment in this task had declined more than boys’. Differences between the AVI districts were small. The impact of students’ home background was, instead, obvious: the higher the education of the parents, the better the student performed in the assessment tasks. There was no difference in the impact of the mother’s education on boys’ and girls’ attainment. The between-school differences were very small (explaining under 2% of the variance), while the between-class differences were relatively large (9%–20%).
    The change between 2001 and 2012 is significant. The level of students’ attainment has declined considerably. The difference can be compared to a decline in Finnish students’ attainment in PISA reading literacy from the 539 points of PISA 2009 to 490 points, below the OECD average. The mean level of students’ learning-supporting attitudes still falls above the mean of the scale used in the questions, but that mean too has declined from 2001.
    The mean level of attitudes detrimental to learning has risen, but the rise is more modest. Girls’ attainment has declined more than boys’ in three of the five tasks. There was no gender difference in the change in students’ attitudes, however. Between-school differences were unchanged, but differences between classes and between individual students had grown. The change in attitudes – unlike the change in attainment – was related to students’ home background: the decline in learning-supporting attitudes and the growth in attitudes detrimental to school work were weaker the better educated the mother. Home background was not related to the change in students’ attainment, however. A decline could be discerned both among the best and the weakest students.
    The results of the assessment point to a deeper, ongoing cultural change which seems to affect the young generation especially hard. Formal education seems to be losing its former power, and the acceptance of the societal expectations which the school represents seems to be related more strongly than before to students’ home background. The school has to compete with students’ self-chosen pastime activities, the social media, and the boundless world of information and entertainment open to all through the Internet. To a growing number of young people, the school is just one, often critically reviewed, developmental environment among many.
    The change is not a surprise, however. A similar decline in student attainment had already been registered earlier in the other Nordic countries. It is time to concede that the signals of change have been discernible for a while and to open up a national discussion regarding the state and future of the Finnish comprehensive school that rose to international acclaim thanks to our students’ success in the PISA studies.

    • Thank you for providing this report. Of particular interest to me are the comments in the second-last paragraph. We ignore the influence of these cultural changes at our peril. We are working on a new book addressing precisely this cultural change and the changes in ways of thinking being experienced by our younger generation in response to the changes brought about by this digital, information-rich age.
