The challenges of widening participation in PISA

by Andreas Schleicher
Director, OECD Directorate for Education and Skills
Claudia Costin
Senior Director, Education Global Practice, World Bank

Since 2000, the OECD Programme for International Student Assessment (PISA) has been measuring the skills and knowledge of 15-year-old students in over 70 countries. PISA does not just examine whether students have learned what they were taught, but also assesses whether students can creatively and critically use what they know.

Of course, such international comparisons are never easy and they aren’t perfect. But they show what is possible in education, they help governments to see themselves in comparison to the education opportunities and results delivered by other education systems, and they help governments to build effective policies and partnerships for improving learning outcomes.

But as the number of countries joining PISA kept rising, it became apparent that PISA's design and implementation models needed to evolve to cater successfully to a larger and more diverse set of countries, including a growing number of middle- and low-income countries that want to participate in the assessment.

In response to these challenges, the OECD and the World Bank just released a report titled The Experience of Middle-Income Countries Participating in PISA 2000-2015, which provides valuable lessons and insights based on the experiences of more than 40 PISA-participating countries. It establishes a strong rationale and foundation for enhancing PISA to make it more relevant to a wider range of countries. It also provides insights for the World Bank and other development partners on how to better support countries to participate in these exercises and to analyse and use the data in effective ways.

The report shows that while demand for participation in PISA among middle-income countries is increasing, these countries face both financial and technical obstacles to participating, including the need to translate and manage the assessment and to code student responses. It also shows that the political, regulatory and cultural environment of these countries can affect whether, and how easily, the assessment can be conducted.

To maximize the benefits of participating in PISA, the report recommends that the OECD take five actions:

  1. Adjust the PISA test instruments to better measure differences between the highest- and lowest-performing students and, in particular, distinguish performance differences at the lowest levels of proficiency;
  2. Revise the contextual questionnaires so they are more relevant to low-income country contexts and policy issues;
  3. Evaluate the impact of PISA participation on middle-income countries’ capacity to conduct international assessments; 
  4. Tackle financial and technical challenges through partnerships with donors and through capacity building; and
  5. Extend outreach to local stakeholders in these countries.

Action is already being taken on these recommendations through the PISA for Development initiative, which is working to enhance the PISA instruments and will undertake field trials in seven developing countries during 2016. The final results of PISA for Development, expected in 2018, will provide local policy makers with new evidence to diagnose shortcomings in their education systems and inform new policies. In the meantime, the PISA for Development countries will benefit from peer-to-peer exchanges with other members of the PISA global community. The enhanced PISA instruments will be made available to all countries for the 2021 cycle of the assessment.

The OECD remains committed to working with the World Bank and other partners in maintaining and developing PISA as a global yardstick for measuring success in education. This is especially relevant in the context of the recently adopted Sustainable Development Goals as PISA provides valuable information about the level and distribution of quality and equity within a country’s education system.

Together, we will continue to contribute our expertise and platforms to encourage international collaboration on education through the PISA surveys, and to assist policymakers and practitioners throughout the world to use them more productively. 


How can we compare education systems that are so different?

by Dirk Van Damme
Head of the Innovation and Measuring division, Directorate for Education and Skills

Comparison of levels of education between ISCED 2011 and ISCED-97

Imagine three families meeting by chance in a hotel bar in a sunny tourist destination. They start discussing the schooling of their children and their professional futures. One Danish couple has young children aged 11 and 14, both attending the ‘Basic School’. The Dutch couple thinks that the oldest son has probably not performed well in school, because he seems to have been repeating some grades in primary school. Their own daughter is around 25 and is following a short programme at an institution of higher education, which the parents describe as an “associate degree”. The third couple, French, assumes that, given the girl’s age, this must be a kind of specialisation following the licence. Their own son has a licence, which the Danish and Dutch couples interpret as a master’s degree. The schooling of their respective children is clearly a sensitive topic, because none of the three couples wants to go into much detail: they’re afraid that the other couples would not fully appreciate the prestige and status of their child’s educational career. The result is confusion.

This is akin to what happened many years ago when education experts and policy makers started to meet and discuss education policies. A Babylonian confusion of terminology – with words comprehensible only to the citizens of a given country but incomprehensible to foreigners – made any reasonable discussion nearly impossible. In many cases, even experts did not understand that “Basic School” in Denmark covers the first nine grades, or that a Dutch “associate degree” is a short vocational programme of post-secondary but non-tertiary education that people sometimes pursue after some years of work experience, or that a French licence is equivalent to a bachelor’s, not a master’s, degree.

People quickly realised that if international collaboration in education was ever to succeed, they would need instruments to make their systems comparable – instruments that could translate the peculiarities of their own systems into a universally understandable “language”. Especially when pioneers started to collect statistical data on education systems, such tools became absolutely indispensable.

The first edition of the International Standard Classification of Education (ISCED) was developed by UNESCO in the mid-1970s. It was quickly adopted by other international organisations, such as the OECD, the World Bank and Eurostat. The classification was first revised by UNESCO, the OECD and Eurostat in 1997 (ISCED-97), and then again between 2009 and 2011 to create ISCED 2011, adopted in November 2011. The 2015 edition of Education at a Glance is the first major collection of data using the new classification. ISCED is the reference framework for classifying and comparing educational programmes and their “levels” – the tool that makes systems transparent and comprehensible across countries.
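To make this concrete, the translation ISCED performs can be sketched as a small lookup table. This is purely illustrative: the dictionary, function and level assignments below are shorthand for the examples in this post, not an official data structure or API; the real mappings are agreed between countries and maintained by UNESCO, the OECD and Eurostat.

```python
# A toy lookup showing the kind of translation ISCED performs:
# country-specific programme names become comparable levels.
ISCED_2011 = {
    ("Denmark", "Basic School"): [1, 2],          # grades 1-9: primary + lower secondary
    ("France", "licence"): [6],                   # bachelor's-level degree
    ("United States", "bachelor's degree"): [6],  # same level as the French licence
}

def comparable_levels(country: str, programme: str) -> list[int]:
    """Return the ISCED 2011 level(s) covered by a national programme name."""
    return ISCED_2011[(country, programme)]

# A French licence and an American bachelor's degree map to the same
# level, so the two systems can be compared directly.
print(comparable_levels("France", "licence"))  # [6]
```

Once national programmes are expressed on the same scale, statements like "a licence is a bachelor's, not a master's" become a simple level comparison rather than a guessing game.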

Of course, classifying education programmes is a sensitive topic. Not only families but also countries attach prestige and status to education programmes and the institutions that deliver them. This is what makes mapping such programmes so difficult. But using ISCED is also an activity of international understanding and peer learning. Mapping and classifying programmes is not something done by bureaucrats behind their desks in international organisations, but by peers from countries working together. In a global international labour market, where credentials define access to jobs, earnings and social status, it makes a difference how specific programmes are classified.

Education systems are not static; they change. There have been some important changes at both ends of the education ladder recently: in early childhood or
“pre-primary” education, at one end, and in tertiary or higher education at the other. It is precisely in these two areas that the most recent revision makes the greatest difference. The instrument now has a precise classification for early learning, which has become so important politically. And in higher education, there has been general adoption of the bachelor’s/master’s model.

Probably the families at the bar will not resume their discussion by referring to ISCED levels. But at least among experts, developers and users of educational statistics and indicators, the use of the new ISCED is a tremendously important step forward. If educators and policy makers want to understand each other and learn from each other, a common language is necessary. ISCED 2011 provides them with the tool to understand the various levels of education.

ISCED 2011 Operational Manual: Guidelines for Classifying National Education Programmes and Related Qualifications
Education at a Glance 2015: OECD Indicators

Got a question about education? Education at a Glance probably has the answer

by Andreas Schleicher  
Director, OECD Directorate for Education and Skills 

Does education really pay off? Has public spending on education been affected by the economic crisis? How are education and employment related?

You’ll find the answers to these and just about any other question you may have about the state of education in the world today in Education at a Glance 2015: OECD Indicators, published today. Did you know, for example, that
tertiary-educated adults earn about 60% more, on average, than adults with upper secondary as their highest level of educational attainment? Or that between 2010 and 2012, as countries’ GDP began to rise following the economic slowdown, public expenditure on education fell in more than one in three OECD countries?

This year’s edition of the annual compendium of education statistics includes more than 100 charts, 150 tables and links to another 150 tables online. It also contains more detailed analyses of participation in early childhood and tertiary education; data on the impact of skills on employment and earnings; gender differences in education and employment; educational and social mobility; adults’ ability and readiness to use information and communication technologies; how education is financed; and information on teachers, from their salaries and hours spent teaching to recess and breaks during the school day.

We invite you to take a good long look – and learn.

Press release
Education at a Glance 2015: OECD Indicators
Regards sur l'éducation 2015: Les indicateurs de l'OCDE

Follow #OECDEAG on Twitter: @OECDEduSkills

Are American students overtested? Listen to what students themselves say

by Andreas Schleicher 
Director, OECD Directorate for Education and Skills 

One claim heard frequently these days is that American students have no time for learning because they are permanently subjected to standardised testing, while Finnish students supposedly live in a paradise where everyone achieves high learning outcomes without any testing.

It is actually very hard to find comparative data on the prevalence of testing in OECD countries. So to explore this, we asked the principals of 15-year-old students who participated in the PISA assessment how frequently their students take part in standardised tests. Over the years, I have learned to trust the reports of students and principals about what actually happens in the classroom more than the claims of many experts.

Here is what I found: 34% of 15-year-olds in the Netherlands said they take a standardised test at least once a month, 21% of students in Israel said so, and on average across OECD countries 8% of students so reported. In the United States, only 2% of students said they took standardised tests at least once a month. By the way, that turns out to be exactly the same share as in Finland.

Some 97% of American 15-year-olds said they took a standardised test at least once or twice a year, again about the same share as in Finland. Across OECD countries, an average of 76% of students so reported. Some 40% of American
15-year-olds took such tests at least three to five times a year, compared with 16% in Finland and 17% on average across countries. Interestingly, on some other forms of assessments, like student portfolios, Finland comes out far ahead of the United States.

Now, some will say these data are from 2009 and things might have changed since then. But we asked the same question again in the 2015 PISA assessment; those results will be disclosed next year. So watch this space.


Now more than ever

by Andreas Schleicher
Director, OECD Directorate for Education and Skills

It is difficult for us here in Paris to think about much else besides the innocents who lost their lives last week during the senseless, brutal attack that shook our city. Our thoughts are with their families and loved ones; our spirit remains firmly fixed on the values we cherish: liberté, égalité, fraternité.

In the aftermath of these horrific events, fraternité becomes more than an ideal; it is the necessary glue that binds our societies together. It is in this context that we invite you to consider what PISA results show about the crucial role schools play in building our communities, particularly for immigrant students. A full report on this issue will be published in the near future.