Publication - Research Publication

'What Works' in Drug Education and Prevention?

Published: 7 Dec 2016
Part of:
Children and families, Research
ISBN:
9781786526304

This literature review examines the evidence of effectiveness of different types of drug prevention and education for children and young people.

44 page PDF

860.4kB

4. Manualised and Licensed Evidence Based Prevention Programmes

Research has shown that a number of named prevention programmes are likely to be beneficial and cost effective (ACMD, 2015). These have been subject to high quality research and are known as 'manualised' interventions, having been standardised through the creation of manuals and protocols for those who implement them (ACMD, 2015). Manualised programmes are often highly structured (e.g. school based prevention programmes) and are often accompanied by training and implementation guidelines. Whilst many are available free of charge, particularly those developed in the UK, the EU and Australia, some well-known manualised interventions have licensing requirements that grant organisations the right to deliver the programme. Programme developers may also charge annual fees, with additional costs for official intervention materials, training, analysis of screening questionnaires, etc. (Sumnall, 2016). Other programmes that are available free of charge may have conditions on their use; for example, deliverers must undergo training in implementation, and delivery cannot be funded by the alcohol and tobacco industries.

One example of a manualised prevention programme that has shown positive results is the Good Behaviour Game (GBG) [18]. This is an evidence based early intervention programme delivered in primary schools which seeks to improve socialisation skills and behaviour in the classroom. Unlike many school based prevention approaches, the GBG is not a curriculum but is based on a social influence approach. The Game is played in the classroom several times a week, and teams are rewarded for adhering to classroom rules such as working quietly, being polite to others, not leaving their seats without permission, and following directions. Teachers monitor teams for rule-breaking, and good behaviour and team co-operation are rewarded with praise and small prizes such as stickers and badges. At the end of the Game the winning team is praised, and sometimes prizes are offered.

Although the programme does not directly mention drugs or substance misuse, its intended outcomes are to prevent substance misuse, risky sexual behaviour, and violent and anti-social behaviour. Evaluations of the GBG have shown significant benefits in the short term (reductions in aggressive behaviour and improvements in pupils' ability to focus and work independently) as well as notable long term effects in males. In one long-term trial in the USA, participation in this programme in primary school was associated at age 19-21 with significantly lower rates of drug and alcohol use disorders, delinquency and imprisonment for violent crimes, suicidal ideation, and use of school based services (ACMD, 2015).

Alongside the GBG, the ACMD paper also highlights the 'PreVenture' [19] and 'Strengthening Families' [20] programmes as of interest to the UK, having been trialled, piloted or implemented here. The Cochrane Drugs and Alcohol reviews highlighted 'Unplugged' and 'Life Skills Training' as showing positive effects and recommended these programmes for implementation. Some of these programmes aim to reduce all types of substance use, rather than focussing just on illegal drugs, and some also target other high risk behaviours (e.g. sexual health). Rather than exploring each of these programmes in turn, it is worth noting that a range of online databases list details of drug prevention programmes that demonstrate effective practice (with varying degrees of evidence supporting their effectiveness). For example, the UK Centre for Analysis of Youth Transitions (CAYT) repository of evidence based services and programmes for young people [21], the US National Registry of Evidence-Based Programmes and Practices [22], the EMCDDA Exchange on Drug Demand Reduction Action (EDDRA) examples of evaluated practices [23], and the National Institute on Drug Abuse (NIDA) [24] all list examples of evidence-based drug prevention programmes.

Despite showing evidence of success, programmes such as these cannot be guaranteed to be effective and can often fail to replicate their initial successful results. One notable example is the seven-nation European trial of the Unplugged programme, the largest European drug education trial ever conducted. At follow up, 15 months after the lessons ended, the results were disappointing: Unplugged probably had some of its intended effects, but the results were "patchy, modest and usually statistically insignificant" [25]. Some of the reasons why interventions that show evidence of effectiveness go on to fail in other contexts are explored below.

Challenges in successfully implementing evidence based programmes

The initial successful results of prevention programmes may not be replicated when the programmes are implemented more widely, particularly if they are not led by the programme's developers or not implemented as the designers intended.

Many commentators have emphasised that a nation's social context, its drug policies and the availability of high quality supporting structures have a significant influence on the effectiveness of programmes. An evidence based programme is necessary but not sufficient: structures must also be in place to support delivery and implementation (training of teachers, funding, support at national and local level, etc.).

'Implementation fidelity' is the degree to which an intervention is delivered as intended, and it is critical to the successful translation of evidence-based interventions into practice (Breitenstein et al., 2010). Manualised and highly structured programmes do not always transfer from one geographic or cultural setting to another, and the structures for delivering prevention programmes might not always be in place (Public Health England, 2015).

Diminished fidelity may explain why interventions that show evidence of efficacy in highly controlled trials may not deliver evidence of effectiveness when implemented in real life contexts or routine practice. For example, the mechanisms for delivery might differ, and the EDPQS stress that poorly trained staff members cannot deliver high quality prevention (EDPQS, 2015). Transferring programmes to substantially different contexts may require adaptation and re-evaluation (Faggiano et al., 2014). The ACMD briefing paper on prevention of drug and alcohol dependence emphasises that the difficulties and challenges of implementing manualised interventions in routine practice, with fidelity, and on a large scale are exacerbated because we do not have well established and robust national and local prevention systems in place (ACMD, 2015). In most cases, more research needs to be done to determine whether the success of these interventions can be replicated in real-world settings and routine practice, and how programmes and policies can be effectively implemented and disseminated (ACMD, 2015).

There are some steps that can be taken to maintain the important elements of programmes rolled out in the UK (James, 2011). For example, the content of the programme needs to be realistic for the time available in schools; in the past, teachers have found the volume and content to be overambitious and unrealistic. Flexibility and adaptability are also instrumental: while these can be positive in meeting the needs of different groups, programme developers should provide sufficient training and guidance to teachers on which parts of the programme can be adapted without compromising the core components (James, 2011).

The UNODC (2015) stress that when adapting evidence based programmes to different contexts, two steps should be taken: (i) "A careful and systematic process of adaptation that does not touch the core components of the programme, while making it more acceptable to the new socio-economic/cultural context: this would take place with the support of the developers of the programme..." and (ii) "A scientific monitoring and evaluation component in order to assess whether the programme is actually effective in the new socio-economic/cultural context".

