Research Publication

Programme for International Student Assessment (PISA) 2015: Highlights from Scotland's results

Published: 6 Dec 2016
Part of:
Education
ISBN:
9781786526465

Report on Scotland's performance in the Programme for International Student Assessment (PISA) 2015, covering maths, reading and science.

62 page PDF, 1.3MB


1. Introduction and Methodology

What is PISA?

1. The Programme for International Student Assessment (PISA) is an assessment of 15 year-olds' skills carried out under the auspices of the Organisation for Economic Co-operation and Development (OECD). The programme runs every three years across all OECD members and a variety of partner countries. Scotland has participated in all six surveys since the first wave of testing in 2000.

2. Each survey cycle focusses on one of three domains: reading, mathematics and science. In 2015 the main domain was science, with maths and reading as subsidiary domains. Further data on student wellbeing and collaborative problem solving (the "innovative domain" in PISA 2015) will be published during 2017.

Who participates?

3. Around 540,000 students worldwide participated in the study, drawn from the 35 member states of the OECD and 37 "partner countries and economies".

Fig. 1.1: Global coverage of PISA 2015

Table 1.1: OECD states and partner countries and "economies" participating in PISA 2015 [1]

OECD countries: Australia, Austria, Belgium, Canada, Chile, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, Korea, Latvia, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States.

Partner countries and economies: Albania, Algeria, Argentina, Brazil, B-S-J-G (China) [2], Bulgaria, Chinese Taipei, Colombia, Costa Rica, Croatia, Cyprus, Dominican Republic, Former Yugoslav Republic of Macedonia, Georgia, Hong Kong (China), Indonesia, Jordan, Kazakhstan, Kosovo, Lebanon, Lithuania, Macao (China), Malaysia, Malta, Moldova, Montenegro, Peru, Qatar, Romania, Russian Federation, Singapore, Thailand, Trinidad and Tobago, Tunisia, United Arab Emirates, Uruguay, Viet Nam.

4. The United Kingdom is a member state of the OECD and its results are published in the main OECD publication. Scotland participates as an "adjudicated region", meaning that its results have full quality assurance from the survey contractors appointed by the OECD and can be published separately. Within the UK, England, Wales and Northern Ireland have boosted samples as "non-adjudicated regions", which means they are able to produce country-level analysis within their reports. Regional results are published as annexes to the main OECD volumes.

5. Survey fieldwork is carried out separately in each participating state by "National Centres" according to strict quality standards set by the OECD.

What does PISA measure?

6. PISA seeks to measure skills which are necessary for participation in society. Accordingly, it assesses how students apply the skills they have gained to the types of problem they may encounter in work or elsewhere. Pupils are assessed at the age of 15 as this is regarded as a reasonable point at which to test the impact of compulsory education throughout the developed world (most PISA 2015 participants in Scotland were attending S4). After this point students typically move on to more specialised studies or enter the labour market. Box 1.1 contains the definitions of the domains tested by PISA.

Box 1.1: The PISA domains and their definition

* Scientific literacy is defined as the ability to engage with science-related issues, and with the ideas of science, as a reflective citizen. A scientifically literate person is willing to engage in reasoned discourse about science and technology, which requires the competencies to explain phenomena scientifically, evaluate and design scientific enquiry, and interpret data and evidence scientifically.

* Reading literacy is defined as students' ability to understand, use, reflect on and engage with written texts in order to achieve one's goals, develop one's knowledge and potential, and participate in society.

* Mathematical literacy is defined as students' capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena. It assists individuals in recognising the role that mathematics plays in the world and to make the well-founded judgements and decisions needed by constructive, engaged and reflective citizens.

7. We have included some details on how science, the main focus of the 2015 PISA survey, was assessed in Chapter 2. Further details of how each domain was assessed can be found in the OECD volumes published on the PISA website, www.oecd.org/pisa.

8. The assessments are also supplemented by background questionnaires. Pupils are asked about their motivations for study, attitudes to school, beliefs about science, studying and their socio-economic background. Headteachers are asked about the challenges facing their schools, organisation and factors that they believe affect their students' performance.

The survey in Scotland

9. The survey was carried out in Scotland between 3 and 28 March 2015. The pupils tested are generally described as "15 year-olds" although the actual age range was 15 years and 2 months to 16 years and 2 months as of 1 March 2015. Students were mostly (87.5 per cent) in the S4 year group.

10. The PISA survey was managed by an international consortium led by ETS. The Consortium developed the tests, questionnaires and survey documentation and ensured that all participating countries met quality standards. In Scotland, the National Foundation for Educational Research (NFER) was the "National Centre", responsible for local adaptations to the surveys, and administering the test in schools.

11. The school sample was randomly selected by NFER following submission of sampling forms to the consortium. The sample was stratified on the basis of previous exam performance (split into five categories), whether schools were publicly funded or independent, urban/rural location and school size, and whether schools were single-sex or mixed.

12. In total, 109 secondary schools participated in the survey. One hundred and two of these were from the main sample (an 87 per cent response rate), and seven from the back-up samples (resulting in a 93 per cent participation rate after replacements were added in). This exceeded the OECD's minimum standard of 85 per cent participation.

13. Within each school 40 students were randomly sampled by NFER using software supplied by the Consortium. In total 4,283 students were drawn in the sample. Schools were able to withdraw a certain number of students where it was deemed that participation would be difficult due to additional support needs or language issues. Similarly, students who had left the school in the interim were not considered part of the target sample. In total 3,610 students were deemed eligible participants. Of these, 3,123 students took part, with the balance being those who did not wish to take part (both students and their parents were given the opportunity to opt out of the survey), those who were absent on the day of the test, or those withdrawn by the school because of their additional support needs.

14. The OECD had strict criteria for the level of exclusion that was acceptable, and the total exclusion rate of 6.52 per cent was deemed to be consistent with a robust sample. Similarly, the final weighted participation rate, calculated by the consortium, was 79.9 per cent, which was held to meet the OECD requirement of 80 per cent.
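The headline counts in paragraphs 12 to 14 can be approximately reproduced with simple arithmetic. The sketch below is illustrative only: the published 79.9 per cent figure is a weighted rate calculated by the consortium, so these unweighted ratios will not match it exactly.

```python
# Illustrative, unweighted response-rate arithmetic from the counts above.
# The consortium's published participation rate (79.9 per cent) is weighted,
# so these simple ratios are an approximation, not the official calculation.

schools_main_sample = 102
main_sample_response = 0.87      # 87 per cent response rate (paragraph 12)

students_eligible = 3610
students_tested = 3123

# Approximate size of the original school sample implied by the response rate
approx_schools_sampled = round(schools_main_sample / main_sample_response)

# Unweighted student response rate among eligible participants
unweighted_student_response = students_tested / students_eligible

print(f"Implied school sample size: ~{approx_schools_sampled}")
print(f"Unweighted student response: {unweighted_student_response:.1%}")
```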

15. For the first time the assessment was administered in Scotland by computer. This was achieved using the existing facilities in schools with the support of school and Local Authority ICT services.

16. The software delivery system was provided by the international consortium and rotated the assessment items in six clusters so that approximately half were science, with the remainder split between reading, maths and collaborative problem solving - the innovative domain in 2015.

17. The assessment was administered in two one-hour sessions, with a further 30 minutes for the background questionnaire.

18. As in all previous cycles, there was a survey of headteachers within schools, which asked about their views on school organisation, teaching staff and resources. Eighty-six headteachers responded - a rate of 78.9 per cent.

19. In 2015, Scotland also participated in the Parents' Questionnaire, sent to all parents of students who sat the PISA assessment, which asked additional questions about student background, the support that students received at home, career expectations and their engagement with the school. The response rate for this survey was 36.4 per cent.

Interpreting the results

20. It should be understood that PISA is a sample survey. Like all surveys of this type, it is subject to sampling error. Because only a sample of students is surveyed, there is a risk that the group tested, even when chosen at random, will not exactly reflect the larger population of students. We must therefore be cautious in assuming that the values found in the survey would be the same as those in the population.

21. This means that being confident that there is a difference between Scotland and the OECD average, or between groups and countries, will depend on both the size of the observed difference and the standard error associated with the sample sizes used. Significance tests are used to assess the statistical significance of comparisons made.

22. Therefore, it is not possible to produce individual country rankings based on the absolute (mean) score. Accordingly this report shows results divided into those countries whose scores are statistically significantly higher than, similar to or lower than Scotland. By "significant" we mean that we are 95 per cent certain that there is a difference (or similarity).
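The comparison logic described in paragraphs 21 and 22 can be sketched as a simple two-sided test on the difference between two mean scores, given each estimate's standard error. The scores and standard errors below are made up for illustration; the OECD's actual procedures also account for the survey's complex design.

```python
# A minimal sketch of the significance test described above. All values
# are hypothetical; real PISA comparisons use the consortium's own
# standard errors and design-based procedures.
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Two-sided z-test at the 95 per cent level for independent estimates."""
    se_diff = math.sqrt(se_a**2 + se_b**2)
    z = (mean_a - mean_b) / se_diff
    return abs(z) > z_crit

# Hypothetical example values
print(significantly_different(497, 2.4, 493, 2.9))  # small gap: not significant
print(significantly_different(497, 2.4, 505, 2.5))  # larger gap: significant
```

This is why a small gap in mean scores can still leave two countries classified as "similar": the observed difference must be large relative to the combined standard error before it is treated as real.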

Change over time

23. This report covers, as in previous publications, the position of Scotland relative to other countries, and how this has changed over time. The mathematics assessment changed radically in 2003, and the science assessment in 2006, as each became a "full domain" for the first time, so we are unable to make comparisons before those waves. The OECD average for science was normalised at 500 in the 2006 survey - the first survey in which it was the main domain.

24. One complication is that membership of the OECD has changed at various points. In 2010, Chile, Estonia, Israel and Slovenia were admitted to membership. This affected comparison of reading scores in 2009. [3] Scotland was above the OECD average when those four countries were included, but similar to the average of the pre-2010 membership. In 2016, Latvia also acceded to the OECD. When making comparisons with the OECD average, this report defines this as the average of member nations of the OECD at the time.

25. Further, the measurement of performance can be affected by new test items, the change of administration from paper- to computer-based assessment and the statistical treatment of data. While the scales have been equated to allow for expression on the same basis between cycles, the OECD provide a "link error" to quantify the uncertainty when comparing scores over different waves of data. All estimates in this report have taken this into account. A small number of countries were affected beyond this, as noted in the OECD volumes. We are unable to report whether this also applied to Scotland, as rescaled means are not available at regional level.
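The effect of the link error is to widen the uncertainty around any change over time: it is combined with the sampling errors of both waves before testing whether a change is significant. The sketch below uses hypothetical values to show how a change that looks significant on sampling error alone can become non-significant once the link error is included.

```python
# Sketch of how a link error widens the uncertainty in trend comparisons,
# as described above. All score values and errors here are illustrative.
import math

def trend_significant(mean_new, se_new, mean_old, se_old, link_error,
                      z_crit=1.96):
    """Test a change between cycles, adding the link error to both
    waves' sampling errors in quadrature."""
    se_total = math.sqrt(se_new**2 + se_old**2 + link_error**2)
    return abs(mean_new - mean_old) / se_total > z_crit

# Hypothetical 9-point change with typical standard errors
print(trend_significant(497, 2.4, 506, 2.6, 4.5))  # not significant with link error
print(trend_significant(497, 2.4, 506, 2.6, 0.0))  # would appear significant without it
```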

Further analysis of PISA

26. Much of this report focusses on changes to Scotland's headline score and the relative position internationally. However, PISA is not just a snapshot of student attainment, but a comprehensive data-gathering exercise which enables analysis, not only of how well school systems around the world perform, but the factors that are behind this. The OECD publications present international analysis of students' abilities, motivations, attitudes, background, support at home and confidence. In addition, information is gathered on school structure and management, and the OECD analyse how various aspects of school organisation may be related to attainment.

27. The OECD will also publish further volumes of PISA 2015 data on Student Wellbeing and Collaborative Problem Solving during 2017.

28. Periodically, the OECD also publish short reports in their "PISA in Focus" series at the following link: www.oecd.org/pisa/pisainfocus/

Other surveys of performance in Scotland

29. The Scottish Government, in partnership with Education Scotland, the Scottish Qualifications Authority (SQA) and the Association of Directors of Education in Scotland (ADES), also conducts the Scottish Survey of Literacy and Numeracy (SSLN), an annual survey which assesses student performance in numeracy and literacy in alternate years. The first numeracy survey was conducted in 2011 and the first literacy survey in 2012.

30. The SSLN provides Scotland-level performance data for pupils in primary stages 4 and 7 and in secondary stage 2. SSLN results can be found on the Scottish Government website using the following link: www.gov.scot/ssln. The 2016 survey of literacy represents the final wave of SSLN prior to the introduction of standardised assessments that support Teacher Professional Judgement.
