This post was originally published on www.thomashatch.org
Following the cascade of headlines on the release of TIMSS scores last week, the results of the 2015 PISA tests were announced yesterday. There were some cautions about putting too much weight on the rankings. In fact, both Valerie Strauss in the US (“Why Americans should not panic about international test results”) and Stewart Riddle and Bob Lingard in Australia (“PISA results don’t look good, but let’s look at what we can learn before we panic”) tried to stave off knee-jerk reactions. Nonetheless, as usual, the headlines around the world seem to focus primarily on who’s on top of the rankings and where individual countries place on one or more of the tested subjects of math, reading, and science. Times Higher Education put it succinctly (Pisa results 2016: Singapore sweeps the board) but noted that while East Asian countries dominate the rankings, “China loses ground in tables after new provinces are included for the first time.” While some reports about TIMSS noted declines in performance by Finnish 15-year-olds, Finland remained near the top on the PISA tests, and the high performance of neighboring Estonia was recognized as well (Finland and Estonia top of the class in EU for education, Euronews). Other countries beyond Asia, like Canada, also received some positive headlines (Canadian students rank fourth for science performance, The Globe and Mail).
Many headlines in Australia seized on bad results from PISA 2015 that echoed declines on TIMSS (as Teacher Magazine put it, “PISA 2015 brings more bad news for Australia,” while ABC Online highlighted “Australian schools are in ‘absolute decline’ globally, says PISA report”). At the same time, in headlines and on Twitter, concerns (and blame) over the poor performance of Wales and Scotland were also in evidence (Full Pisa results 2016 show Wales’ schools are still adrift of the rest of the UK, Wales Online; Scottish school standards in maths and reading slump in damning PISA survey, Herald Scotland).
There was also considerable controversy in Malaysia, where government officials touted what they viewed as improved results (PISA 2015: Malaysia shows significant improvement in Math, Science & reading, New Straits Times Online); however, critics pointed out that the OECD did not include Malaysia in the official results of PISA 2015. Quotes from one source cited the OECD’s concern about a response rate among sampled schools in Malaysia of roughly 50%, compared to the desired 85% response rate.
Beyond the headlines, reporting sometimes noted both good news and bad news. As a rough translation from Diario Perú21 put it: “The good news is that the level of Peruvian schoolchildren improved in the last three years – the fastest… in Latin America; the bad news is that Peru still ranks in the last place on the list.”
Meanwhile, Spiegel Online noted that German students were in the “upper middle” of the rankings but also highlighted that only two other countries scored lower when students were asked whether they could envision a career in science. In the US, however, stories headlined declines in math performance, with only a mention or two that the association between socio-economic status and student performance in science in the US has declined (American teens’ math scores fall on an international test, Los Angeles Times; Internationally, U.S. Students Are Falling, US News & World Report).
In a few cases, reports went beyond the basic rankings to highlight other aspects of the findings. Schools Week, for example, headlined “No improvement for a decade” but also highlighted what it called “10 other oddities,” including “White working class pupils are not doing worse than ethnic minority working class pupils” and “Second-generation immigrant children do as well as pupils with parents born in England.” Quartz also used the PISA 2015 release to headline gender issues (The origin of Silicon Valley’s gender problem). A number of reports also picked up on several other results that the OECD highlighted, including gender gaps (particularly in interest in and career aspirations in science) and countries that were both high performers and showed equity in education outcomes (like Canada, Denmark, Estonia, Hong Kong, and Macao). However, those issues usually did not make it into the headlines.
(It is worth noting, however, that this exercise of scanning the headlines continues to be limited by language abilities and the vagaries of online translations, which continue to produce hard to interpret results like “Science does not go into the forest! Polish students at the forefront of PISA 2015”)
— Thomas Hatch
Australia
PISA results show further decline in Australia’s education rankings, Canberra Times
Australia not preparing students for adult life, Sky News Australia
PISA 2015 brings more bad news for Australia, Teacher Magazine
Teenagers fall year behind internationally in maths, The Australian
Australian schools are in ‘absolute decline’ globally, says PISA report, ABC Online
PISA results don’t look good, but let’s look at what we can learn before we panic, The Guardian
Canada
Canadian students rank fourth for science performance, The Globe and Mail
England
Pisa: UK and England see performance drop in maths and reading, but climb rankings in science, TES News
Estonia
PISA 2015: Estonia’s basic education best in Europe, The Baltic Course
Finland
PISA: Finland only country where girls top boys in science, YLE News
Germany
Pisa-Studie: Deutschland hält sich im oberen Mittelfeld (PISA study: Germany holds steady in the upper middle of the field), Spiegel Online
Ireland
Irish students among ‘best at reading’ in developed world, Irish Times
Japan
Japan’s 15-year-olds perform well in PISA global academic survey, The Japan Times
Luxembourg
PISA results 2015: Luxembourg student test results remain below OECD average, Luxemburger Wort
Macau
Education | Macau students ‘score high’ on PISA 2015, Macau Daily Times
New Zealand
NZ students’ results decline, but still above OECD average – PISA …, New Zealand Herald
Malaysia
PISA 2015: Malaysia shows significant improvement in Math, Science & reading, New Straits Times Online
Norway
Norwegian 15 year olds climbing on the PISA rankings, Aftenposten
Peru
PISA 2015: Perú mejoró sus resultados pero sigue en los últimos … (Peru improved its results but remains among the last …), Diario Perú21
Poland
Nauka nie idzie w las! Polscy uczniowie w czołówce PISA 2015 (Learning doesn’t go to waste! Polish students at the forefront of PISA 2015), TVP Info
Scotland
Scottish school standards in maths and reading slump in damning PISA survey, Herald Scotland
Singapore
Singapore students top in maths, science and reading in Pisa international benchmarking test, The Straits Times
United States
American teens’ math scores fall on an international test, Los Angeles Times
Internationally, U.S. Students Are Falling, US News & World Report
Wales
Full Pisa results 2016 show Wales’ schools are still adrift of the rest of the UK, Wales Online
In the initial administration of TIMSS, Portugal set a dangerous precedent by exempting islands off its coast from testing, noting they were remote or difficult to reach.
Issues regarding representation and sampling continue to plague these international assessments, as evidenced in two quotes from the article above:
“China loses ground in tables after new provinces are included for the first time.”
“OECD’s concern of a response rate of sampled schools in Malaysia of roughly 50% compared to the desired 85% response rate.”
I contend that the validity of TIMSS and PISA data remains underproblematized. Writing from Indonesia, I am confident that most of the nonreporting Malaysian schools are on the island of Borneo, which Malaysia shares with Indonesia and Brunei Darussalam. It is likely that their scores would be lower than those of reporting schools, because the British presence was less felt on this island than on peninsular Malaysia, and thus fewer schools were constructed here during the period of colonization (and for much of the subsequent 50 years, as Kuala Lumpur focused more on itself than on the margins). What we cannot say is whether any attempt was made by the central government to discourage Sarawak schools from reporting scores. Thus, the OECD’s concern seems to me to be justified.
I’d like to see the OECD provide more data on which schools are reporting, which schools are being exempted, and why. This information is purportedly available by contacting the national office representatives, whose details are printed for every tested country in one of the many manuals. But this step away from full transparency leaves many such questions unanswered as governments rush to conclusions, which they often do within a week of the publication of new results.
Before we compare results, we should compare administration practices and review the survey methodology.
*Thus, the OECD’s concern seems to me to be justified, but I worry that their concern won’t result in any action beyond noting concern!