Behaviour in Scottish schools: 2016 research

This report is from the fourth (2016) wave of behaviour in Scottish schools research, first undertaken in 2006.


3 Methodology

3.1 The research comprised a quantitative survey and a programme of qualitative research. The quantitative survey provides data on how frequently different behaviours are experienced and allows changes over time to be tracked, while the qualitative research allows issues to be explored in more depth, adding context to, and aiding understanding of, the quantitative findings.

Quantitative survey of headteachers, teachers and support staff

Questionnaire development

3.2 The questionnaire was largely based on that used in the 2012 survey (which, in turn, was largely based on the questionnaires used in previous waves). A number of new questions were added, including questions on the impact of digital technologies, on the impact of behaviour on the overall atmosphere/ethos of the school, and on verbal abuse related to specific equalities characteristics (religion, disability and additional support needs). Some other questions and response categories were updated.

3.3 A small-scale cognitive testing exercise was conducted (to test new and amended questions) with staff in primary and secondary schools in January 2016: 17 members of support staff (11 online and 6 using the paper version of the questionnaire), 14 teachers, and 7 headteachers. Following this exercise, further amendments were made to the questionnaire. The final version of the online script and the paper version of the support staff questionnaire are attached at Annexes C and D.

Survey mode

3.4 For the first time, the survey was conducted online rather than on paper. Respondents could choose to complete the survey on a device (PC, laptop or tablet) at school, at home or elsewhere. Sampled staff were provided with a web link and a unique log-in code with which to access the survey.

3.5 The advantages of online completion include ease and speed of completion for respondents; higher quality data due to the greater control over routing and automatic checks where respondents miss out a response or enter an impossible/implausible response; cost savings on printing and postage; and environmental benefits through the reduced use of paper.

3.6 However, it was anticipated that some support staff might not have confidential access to a school computer within their normal working day. Unless the headteacher (or a survey liaison point nominated by the headteacher) was confident that all support staff would have such access, support staff were therefore also provided with a paper version of the questionnaire and given the choice of which mode they preferred.

3.7 It was also recognised that the move to online may have an impact on response rates (since staff would have to take the 'extra' step of logging on to the survey, rather than simply completing the paper questionnaire they had been given) and so every effort was made to publicise the survey in advance and encourage participation (e.g. through all the teaching unions).

Sampling and recruitment

3.8 All publicly funded, mainstream schools in Scotland were included in the sampling frame.

3.9 All secondary schools were sampled and invited to participate. Within each school, the headteacher was selected, and the number of teachers and support staff sampled was proportionate to the number of teachers in the school.

3.10 508 primary schools were sampled and invited to participate. To ensure that the sampled schools were representative, a stratified random sampling approach was used to select schools (stratification was by local authority, size of school, urban/rural category and the proportion of the school roll living in the 20% most deprived areas of Scotland).
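As an illustration of the proportionate stratified approach described above, the sketch below allocates a fixed total sample across strata in proportion to each stratum's share of the sampling frame, then samples at random within each stratum. This is our own Python illustration with invented data; the actual selection was carried out by Ipsos MORI, and the function and field names here are hypothetical.

```python
import random
from collections import defaultdict

def stratified_sample(schools, strata_key, n_total, seed=1):
    """Allocate n_total across strata in proportion to each stratum's
    share of the frame, then sample at random within each stratum.
    (Rounding means the achieved total can differ slightly from n_total.)"""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for school in schools:
        strata[strata_key(school)].append(school)
    sample = []
    for members in strata.values():
        n_stratum = round(n_total * len(members) / len(schools))
        sample.extend(rng.sample(members, min(n_stratum, len(members))))
    return sample

# Invented frame: 100 schools, three-quarters urban.
schools = [{"id": i, "urban": i % 4 != 0} for i in range(100)]
chosen = stratified_sample(schools, lambda s: s["urban"], 20)
```

In this toy example the 20 selected schools split 15 urban to 5 rural, mirroring the 75/25 split in the frame; in the survey itself the strata were local authority, school size, urban/rural category and deprivation.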

3.11 Within each sampled primary school, the headteacher was selected, and the number of teachers and support staff sampled was proportionate to the number of teachers in the school.

3.12 Secondary headteachers and primary headteachers in selected schools were sent an advance letter informing them about the survey and encouraging them to take part. They were then contacted by telephone (by Ipsos MORI telephone interviewers) to confirm their agreement to participate and obtain the name and contact details of the member of staff they wished to nominate as a liaison point for the survey. The liaison point was then sent full instructions on how to randomly select the appropriate number of teachers and support staff, together with survey invitation letters and (where required) paper versions of the questionnaires for support staff. The instructions are attached at Annex F and an example of an invitation letter is at Annex G.

3.13 Fieldwork was conducted between 9 February and 18 March 2016. Many of the questions ask about respondents' experiences over the last full teaching week. The experiences of individual respondents will, to a greater or lesser extent, vary from week to week (e.g. in some weeks they may experience more positive behaviours than in others). However, the large sample size means that these variations should, in effect, cancel each other out – those respondents who experienced more positive behaviours than they usually do in the previous week are balanced by those who experienced fewer positive behaviours than they usually do. So, while the reports from some respondents will be 'atypical' for them as individuals, the overall picture of behaviour in schools across Scotland will be accurate. There may be some seasonal fluctuation in behaviours (e.g. relating to the weather, the timing of exams, or whether it is towards the beginning or end of a term). However, any such fluctuations will not impact on trends over time as the fieldwork for this wave was conducted around the same time of year as for previous waves.

Response rates

3.14 The response rates are shown in Table 3.1 below. There was a notable increase in response rates between 2009 and 2012, which may have been due to a combination of pre-survey publicity and the efforts of local contacts to encourage schools in their area to take part (particularly Positive Behaviour Team link officers); the introduction of telephone calls to headteachers at the recruitment stage; and the introduction of key contacts in schools. The overall response rate in 2016 was 48%. While this is still a healthy response rate – similar to the 2009 rate and higher than in many other surveys of this nature – it represents a considerable drop from 2012. This may be due in part to the switch to online completion, but may also reflect competing demands on school staff (headteachers frequently cited this as a reason for their school being unable to participate in both the quantitative and qualitative elements of the research, and a common theme among staff who took part in the qualitative research was that they had experienced an increase in workloads) and reduced capacity (including the loss of some posts) at local authority level, which resulted in fewer local authority staff acting as survey 'champions'.

3.15 The profile of respondents was compared with the known profile of all staff (using Scottish Government data on size of school, the proportion of pupils who live in the 20% most deprived datazones in Scotland, staff gender, and whether their role was full-time/part-time and permanent/temporary). The profiles were very similar, indicating that the achieved sample was representative of all staff – at least in terms of those variables. The data were weighted to take account of the slight differences.
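The weighting step can be illustrated with a minimal post-stratification sketch: each cell's weight is the ratio of its population share to its achieved-sample share, so that weighted totals match the known staff profile. This is our own Python illustration; the cell names and figures below are invented, not taken from the survey.

```python
def post_stratification_weights(sample_counts, population_counts):
    """Return a weight per cell equal to population share / sample share,
    so the weighted sample profile matches the known population profile."""
    n_sample = sum(sample_counts.values())
    n_pop = sum(population_counts.values())
    return {
        cell: (population_counts[cell] / n_pop) / (sample_counts[cell] / n_sample)
        for cell in sample_counts
    }

# Invented example: the sample slightly over-represents full-time staff.
weights = post_stratification_weights(
    {"full-time": 650, "part-time": 350},   # achieved sample
    {"full-time": 600, "part-time": 400},   # known population profile
)
```

Applying these weights, the 650 full-time respondents count as roughly 600 and the 350 part-time respondents as roughly 400, restoring the known 60/40 split.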

Table 3.1: Response rates

Staff category | 2016 selected sample | 2016 achieved sample | 2016 response rate | 2012 response rate | 2009 response rate
Primary teachers | 1503 | 707 | 47% | 69% | 43%
Primary headteachers | 508 | 295 | 58% | 73% | 57%
Primary support staff | 1022 | 480 | 47% | 69% | 45%
Secondary teachers | 3907 | 1797 | 46% | 61% | 43%
Secondary headteachers | 362 | 193 | 53% | 70% | 65%
Secondary support staff | 1443 | 685 | 47% | 60% | 52%
Total | 8745 | 4157 | 48% | 64% | 47%
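The rates in Table 3.1 are simply the achieved sample divided by the selected sample, rounded to the nearest whole percent. A quick check in Python (figures copied from the table; the variable names are ours) reproduces the 2016 column:

```python
# Selected and achieved sample sizes from Table 3.1 (2016 wave).
samples = {
    "Primary teachers": (1503, 707),
    "Primary headteachers": (508, 295),
    "Primary support staff": (1022, 480),
    "Secondary teachers": (3907, 1797),
    "Secondary headteachers": (362, 193),
    "Secondary support staff": (1443, 685),
}

def response_rate(selected, achieved):
    """Response rate as a whole-number percentage."""
    return round(100 * achieved / selected)

total_selected = sum(s for s, _ in samples.values())
total_achieved = sum(a for _, a in samples.values())
print(response_rate(total_selected, total_achieved))  # 48, matching the table
```

Note that the overall 48% is computed from the totals (4157/8745), not by averaging the six category rates.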

Statistical significance

3.16 Where differences over time or between sub-groups are reported, they are statistically significant at the 5% level.
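The report does not specify which test was used; a standard two-proportion z-test is one common choice for comparing sub-group percentages of this kind. The Python sketch below is our own illustration, with invented figures, of testing a difference at the 5% level:

```python
from math import sqrt, erf

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two independent
    proportions; returns the p-value (normal approximation)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Invented example: a 5-point gap between sub-groups of 1797 and 707
# respondents is comfortably significant at the 5% level.
print(two_proportion_z_test(0.50, 1797, 0.45, 707) < 0.05)  # True
```

With the large achieved samples in this survey, even fairly small percentage-point differences between waves or sub-groups clear the 5% threshold.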

Qualitative research with headteachers, teachers, support staff, pupils and parents

3.17 Between November 2016 and February 2017, a programme of qualitative research was conducted to explore and build on elements of the quantitative survey findings.

3.18 It was agreed by the Association of Scottish Principal Educational Psychologists (ASPEP) that Educational Psychologists (EPs) would be involved in the qualitative research. This was mutually beneficial: the expert input of the EPs would enhance the research, and undertaking the research would contribute towards the EPs' professional development. Eight EPs, from eight different local authorities across Scotland, volunteered.

3.19 The qualitative research comprised visits to 11 primary schools (four visited by EPs and seven by Ipsos MORI) and 12 secondary schools (four visited by EPs and eight by Ipsos MORI). Each EP undertook a focus group with parents and a focus group with pupils at one school in their area. At seven primary schools and eight secondary schools, Ipsos MORI researchers undertook an in-depth interview with the headteacher, a focus group with teachers and a focus group with support staff[3]. At two of the primary schools and two of the secondary schools visited by Ipsos MORI researchers, we also undertook a focus group with pupils and a focus group with parents (Table 3.2). The number of participants who took part in the qualitative research is shown in Table 3.3.

Table 3.2: Fieldwork conducted at schools

 | Primary | Secondary | Total
Schools at which EPs conducted one focus group with parents and one with pupils | 4 | 4 | 8
Schools at which Ipsos MORI researchers conducted research with parents and pupils and with staff (headteacher, teachers and support staff) | 2 | 2 | 4
Schools at which Ipsos MORI researchers conducted research with staff (headteacher, teachers and support staff) only | 5 | 6 | 11
Total number of schools visited | 11 | 12 | 23
Total number of focus groups with pupils and parents | 6 | 6 | 12
Total number of depth interviews with headteachers and focus groups with teachers | 7 | 8 | 15
Total number of focus groups with support staff | 7 | 7 | 14

Table 3.3: Number of participants in the qualitative research

 | Number
Headteachers | 15
Depute headteachers | 5
Teachers | 73
Support staff | 53
Pupils | 57
Parents | 60
Total | 263

Year group/school selection and recruitment

3.20 Three year groups were selected for the research with pupils and parents[4]:

  • P5 (6 focus groups in total)
  • S1 (3 focus groups in total) (in order to explore the transition from primary to secondary school)
  • S4 (3 focus groups in total) (support with exams was one of the topics covered; while S5 or S6 pupils would be able to offer greater perspective on national exams, S4 was selected on the basis that there are likely to be fewer behaviour issues in S5 or S6, as those pupils have chosen to stay on at school).

3.21 EPs were assigned either the primary or secondary sector. They then selected a school from their caseload and approached it to participate. They did not know prior to selecting a school whether or not it had taken part in the quantitative survey.

3.22 The schools visited by Ipsos MORI researchers were sampled from those which had taken part in the quantitative survey and had agreed to be recontacted about further research on the subject. Sampling was conducted with the aim of achieving a spread of schools in terms of school size, deprivation (FSM eligibility), rurality and local authority.

3.23 Selected schools were recruited via an initial letter and follow-up phone calls and emails from the Ipsos MORI project team. The recruitment proved significantly more challenging than anticipated with large numbers of schools feeling unable to take part. When schools gave a reason, it tended to be linked to staff shortages – they were either too busy to release staff and/or did not want to ask staff to do anything else as they were already overstretched.

3.24 It is possible that the recruitment difficulties resulted in a sample of schools skewed towards those which were managing well, despite whatever pressures they were under. Indeed, it emerged during visits to schools as part of the qualitative research that, even in those schools reporting challenges in relation to behaviour and resources, staff tended to feel that the school itself had a positive ethos and was coping well in spite of these challenges. This was in contrast to the more negative experiences that some staff described in relation to other schools that they had worked in.

Fieldwork

3.25 All research was undertaken at the school over the course of a day. Each interview or focus group lasted around an hour and was structured around discussion guides (see Annex E), designed by Ipsos MORI in consultation with the Research Advisory Group. EPs also had input into the discussion guides for parents and pupils. In order that all of the desired topics could be covered in the research, some topics were only covered with a proportion of each audience.

3.26 Interviews were audio-recorded (with participants' permission). The transcripts of recordings and researcher notes were then analysed by the research team through use of an analysis template and a series of analysis meetings in order to identify the substantive themes which emerged in relation to each section in the discussion guides.

Interpreting the findings

3.27 Unlike survey research, qualitative social research does not aim to produce a quantifiable summary of population experiences or attitudes, but to identify and explore the different issues and themes relating to the subject being researched. The assumption is that issues and themes affecting participants are a reflection of issues and themes in the wider population concerned. Although the extent to which they apply to the wider population, or specific sub-groups, cannot be quantified, the value of qualitative research is in identifying the range of different issues involved and the way in which these impact on people.
