Independent review of hate crime legislation in Scotland: final report

Recommendations by Lord Bracadale to Scottish Ministers with analysis of his consultation exercise and an overview.


Chapter 6 - Online hate crime

Introduction

6.1. The internet is a powerful tool which enables communication on a scale which would have been unimaginable by previous generations. That communication has enabled many positive developments, but also allows negative behaviour to take place in new and different ways.

6.2. This chapter considers how well the current law operates in relation to the commission of hate crime and hate speech [47] online, and whether any changes are necessary. I discuss how recommendations made elsewhere in my report (in relation to stirring up of hatred and gender hostility) might impact on online behaviour.

6.3. I flag up various policy and legal developments which are taking place outside the context of this review which are likely to impact upon how harmful online behaviour is dealt with.

Summary of main themes from consultation responses

6.4. In the consultation paper, I reflected views which had been raised in the initial information gathering phase of the review that online activity is not taken as seriously as that which occurs 'in real life' and that the speed and potential anonymity of activity online mean that it can have an impact which is greater than similar offline activity. I noted steps that social media providers were being encouraged to take to deal with the commission of hate crime and hate speech online. I asked for views on whether the current law deals effectively with online hate, whether there were particular forms of online activity that required a different response, and whether this should be dealt with through prosecution of individuals, action by social media providers or both.

6.5. Consultation responses indicated a concern that the online environment was becoming increasingly hostile, with significant harm caused to individuals and groups as a result of online hate and harassment, and a perception that it is not taken as seriously as equivalent face-to-face conduct.

6.6. Areas where respondents felt the law does not respond at all, or responds inadequately, include: online bullying and harassment (including 'crowd-sourced harassment'); misogyny and incitement to misogyny; inciting self-harm or suicide; enabling pornography to be viewed by children; online paedophilia; publication of 'fake' news; expressions of hate through gaming platforms and sites; impersonating another person online; posting photographs or personal information without consent and with intention to harass, demean or degrade; threats to an individual's life, family or home. I would note here that some of the conduct described goes beyond what might be thought of as identity-based hate crime or hate speech. Respondents were concerned about more general forms of abuse and offensive communication.

6.7. There was recognition of some specific difficulties in prosecuting offences which can arise from online technology. This included the identification and location of suspects (who might not be located in Scotland), obtaining information from service providers and having offensive material removed from websites. There were suggestions for practical and technological measures which could be taken to tackle online hate and harassment.

6.8. The need to safeguard individuals' rights of freedom of expression was emphasised, as it had been in the responses to all the consultation questions. However, in the context of online behaviour in particular, there was a reflection that unfettered freedom of expression for some could result in a situation where others feel unable to express their views.

Identifying the nature and extent of the harm to be tackled

6.9. In approaching this issue, I have taken as a starting point the principle that what is unacceptable offline should also be unacceptable online. However, I found it useful to bear in mind four different categories of harm which arise from online hate crime. These were identified by the academic Chara Bakalis in a recently published article [48]. Although the article is focused on provisions which extend to England and Wales, her analysis is useful in considering the effectiveness of the provisions which exist in Scots law. She identified:

  • Harm caused to an individual when the harassment they experience takes place online but in a private form (for example, through emails or text messages). This may take the form of fear, alarm or distress.
  • Harm caused to an individual when hate is communicated on social media or another public forum. As well as fear, alarm or distress, a victim may suffer reputational harm (which may result in broken relationships, harm to their career and/or to their ability to maintain a presence on the internet).
  • Harm caused by speech that is not directed at any one person in particular, but involves generalised hateful comments which 'poison the atmosphere' and demonise particular groups of individuals who share a protected characteristic.
  • The potential radicalisation of individuals or the entrenching of global hate movements.

Potential routes to prosecute online behaviour

6.10. At present, prosecutors in Scotland could deal with online hate crime and hate speech under a number of different offences. The route chosen would obviously depend on the precise nature of the conduct in question. In this section, I set out the main options available to prosecutors and discuss issues which have been raised about how they operate in practice. I also consider some more general issues raised about prosecuting online behaviour under any of the offences: difficulties in obtaining evidence; dealing with 'crowd-sourced' behaviour; and territorial jurisdiction.

Section 127 of the Communications Act 2003

6.11. The main offence which is specifically directed at online communications is the improper use of a public electronic communications network, contrary to section 127 of the Communications Act 2003. The offence may be committed in two ways. The first alternative is if a person sends a message or other matter by public electronic communications network that is grossly offensive or of an indecent, obscene or menacing character, or causes such a message or material to be sent. The second is if a person sends a message by public electronic communications network that he or she knows to be false, or causes such a message or matter to be sent, or persistently makes use of a public electronic communications network, in each case for the purpose of causing annoyance, inconvenience or needless anxiety to another.

6.12. This offence is used in practice alongside the statutory aggravations to deal with instances of online hate. It is likely to be of particular significance in relation to the second and third forms of harm identified by Bakalis. Figures obtained from the Scottish Government Criminal Proceedings dataset show a steady increase in the number of prosecutions under section 127 which are accompanied by a statutory aggravation, with 70 such prosecutions (around 11% of the total number of section 127 prosecutions) in 2014-15 and 2015-16. The offence has been used in some high-profile instances of online hate crime, including the conviction and imprisonment of the 4th Viscount St Davids, Rhodri Philipps, in England for racist and menacing comments in relation to the anti-Brexit campaigner, Gina Miller [49].

Breadth of offence – meaning of 'grossly offensive'

6.13. The potential breadth of the 'grossly offensive' element of the offence is worth noting. There is no requirement in the offence that there is an actual recipient of the message who is grossly offended. The offence is committed when such a message or other matter is sent. Bakalis notes that this is therefore a 'conduct crime' rather than a 'result crime' and could catch, for example, an online but private conversation between two racists on Holocaust denial. She suggests that it might not be compatible with their rights of freedom of expression under article 10 of the ECHR to prosecute the sending of grossly offensive material where no harm had in fact been caused.

6.14. Bakalis notes that the CPS in England and Wales has published guidelines about prosecuting cases involving communications sent by social media, in part to ensure that such offences will only be prosecuted where that is compatible with Convention rights. The COPFS has published equivalent guidance in Scotland, and this was discussed in the review's consultation paper.

6.15. In its consultation response, the COPFS noted that there may be circumstances which would satisfy the evidential test but where, given the whole circumstances, which include the nature of the comments and their context, it would not be in the public interest for a criminal prosecution to take place.

6.16. I have considered the need to safeguard rights of freedom of expression at length elsewhere in this report (in particular, chapter 5 on stirring up of hatred). I conclude there that, while it can be difficult in the abstract to balance rights of freedom of expression against the rights of others not to be harmed, it is generally much easier to do this once the facts, context and language of a particular instance are considered. The same analysis applies to the prosecution of an individual for sending 'grossly offensive' material in terms of section 127. In deciding whether it is in the public interest to prosecute, the COPFS would of course need to take into account the impact of such a prosecution on the individual's rights under article 10 of the ECHR, and how those rights may be balanced against the rights of others. Likewise, the sheriff will need to take article 10 rights into account in deciding whether the offence has been committed. From the evidence which I have received in the course of this review, I am satisfied that the COPFS and courts are very aware of the need to do this. I do not think this points to any defect in the application of the section 127 offence in practice.

Forums to which the section 127 offence may be applied

6.17. When the Offensive Behaviour at Football and Threatening Communications (Scotland) Bill was introduced in 2010, the policy memorandum accompanying the Bill noted: "Case law has left some doubt about whether the Communications Act offence can be used to prosecute people who create offensive websites or 'groups' on social networks, as opposed to sending threatening emails or other communications." This was stated as part of the reason why the proposed offence of threatening communications with intent to stir up religious hatred (which became section 6 OBFTCA) was required.

6.18. The review has not been able to track down the specific case law, but it is possible that the policy memorandum was referring to a statement by Lord Bingham of Cornhill about telephone messages in DPP v Collins [50]. Some commentators thought the approach taken by Lord Bingham might imply that the section 127 offence could only be used in relation to direct messages such as emails or telephone messages, and not more indirect methods of communication such as posting a message on a forum. However, the offence is now regularly used in practice in relation to information posted on social media platforms such as Facebook or Twitter. In Chambers v DPP [51], the English Divisional Court considered arguments that a tweet should be considered as 'internet content' and not a message which had been 'sent' in terms of the section 127 offence. The Court appears to have considered this an unnecessarily technical argument: they considered what Lord Bingham had said in the earlier case but did not accept that it led to a narrow construction of the section. It was plainly capable of applying to internet content as well as emails: such content was a 'message sent' at the point that it was posted. The Court noted that "It is immaterial that the appellant may have intended only that his message should be read by a limited class of people, that is, his followers, who, knowing him, would be neither fearful nor apprehensive when they read it."

6.19. I am therefore satisfied that section 127 can be (and is) used in relation to a wide range of online content, and that the doubts expressed in the policy memorandum in 2010 do not require to be dealt with through a separate form of offence.

Sentencing limitations of section 127

6.20. The offence in section 127 may only be prosecuted summarily (i.e. before a sheriff sitting without a jury), and is subject to a maximum penalty of 6 months' imprisonment, or a fine of up to £5,000, or both. The COPFS have noted that the fact that section 127 can only be prosecuted summarily can lead to limitations on its application in practice. In their evidence to the Justice Committee on the Offensive Behaviour at Football and Threatening Communications (Repeal) (Scotland) Bill, the COPFS noted one case which had not been proceeded with under section 127 because it was considered so serious that it should be prosecuted on indictment. The accused in that case had used Twitter to express his hatred of Shias and Kurds and call for them to be killed as the Jews had been in Nazi Germany. It appears that he also sought information on how to join ISIS. He was instead prosecuted in respect of threatening communications with an intent to stir up religious hatred under section 6 OBFTCA [52], a prosecution noted in the stirring up chapter of this report at paragraph 5.12. He pled guilty and was sentenced to 16 months' imprisonment, a sentence which would not have been possible if he had been prosecuted under section 127.

6.21. There is an argument that section 127 should be amended to make it triable both summarily and on indictment. There is an offence in section 1 of the Malicious Communications Act 1988 which covers similar conduct but extends to England and Wales only. That offence was amended to become triable either way by the Criminal Justice and Courts Act 2015 and is now subject to a maximum penalty of two years' imprisonment, or a fine, or both. The amendment was proposed by Angie Bray MP in response to a particular constituency case which, she successfully argued, demonstrated the need for prosecution on indictment before a jury. It would appear that the arguments in favour of widening the prosecution options for the section 1 offence could also have been applied to section 127, but the MP's focus was specifically on the section 1 offence.

6.22. Section 127 is specifically concerned with public electronic communications networks, and telecommunications and internet services are matters which are reserved under the Scotland Act 1998. An amendment to the sentencing levels in section 127 in particular would probably not be within the legislative competence of the Scottish Parliament, and I therefore do not propose to make a recommendation about this in this report. However, as I explain in more detail below, the offence in section 127 is currently being analysed by the Law Commission in England and Wales in the context of a project looking at online offensive communications generally (i.e. not just offensive communication which amounts to hate crime). I anticipate that the Law Commission will consider sentencing limitations in the course of that project. I am confident that the UK and Scottish Governments will act to ensure that any amendments to reserved legislation as a result of that project take proper account of the way that they will apply in a Scots law context.

Other more general offences

6.23. The offences in sections 38 and 39 of the Criminal Justice and Licensing (Scotland) Act 2010 are also likely to be relevant to the type of hostile and harassing behaviour directed at individuals or groups. This, too, has been reflected in the consultation responses. Section 38 applies to threatening or abusive behaviour; section 39 is the offence of stalking. The nature of these offences is described in more detail in annex 3 (current law). Either offence can be charged with one of the existing statutory aggravations where it is motivated by, or involves the demonstration of, hostility based on one of the protected characteristics.

6.24. I consider that these offences are likely to be relevant to deal with a significant amount of the online abuse which I have been made aware of; in particular, the online abuse with an element of gender hostility which was highlighted in Amnesty International's '#Toxic Twitter' report, discussed at chapter 4 above. If the recommendation which I have made in that chapter to create a new statutory aggravation relating to gender hostility is accepted, I anticipate that might be used in conjunction with one of these baseline offences to deal with egregious online abuse which causes fear or alarm.

6.25. Stonewall Scotland raised a specific concern about cases which they feel are not properly covered by either section 127 or section 38. In their consultation response, they argued that section 38 is too restrictive:

The actor must have acted in a way that is 'threatening or abusive', and in a way that would cause reasonable people fear or alarm (or is reckless to whether they have done so). However, where online abuse causes distress, rather than fear, or incites hatred rather than violence, this abuse slides under the radar. Amending the requirement for actions to cause 'fear and alarm' in order to be criminalised to 'fear, alarm, or significant distress' would ensure that language that was abusive, caused distress (either deliberately or recklessly as to whether distress would be caused) would be considered criminal.

6.26. I have carefully considered the arguments raised about the degree of distress or alarm which is appropriate to lead to a criminal sanction. I understand that the distress caused by unpleasant, prejudiced online content may be exacerbated by the risk of reputational harm which it may cause, as discussed in Chara Bakalis' research above. However, I do not think it is necessary or appropriate to alter the threshold in section 38 or to create a new offence to apply in relation to 'lower-level' online behaviour. As I noted in chapter 2 (underlying principles), criminalising behaviour has significant consequences and should be done with care in order to target specific harm. I am satisfied that the Scottish Parliament gave very careful consideration to the degree of harm caused by behaviour falling under the section 38 offence and adopted language ('fear or alarm') which reflects agreed social norms. In any event, in the context of an offence charged with a statutory hate crime aggravation (i.e. one involving hostility based on a protected characteristic), I find it difficult to envisage realistic circumstances which would cause 'distress' but not also 'alarm'.

6.27. If there are instances where online hate behaviour causes distress, but no actual fear or alarm, there are other mechanisms (short of criminalisation) which may be appropriate to deal with it. These might include improved mechanisms to ensure that such material is removed from the online space quickly to avoid further reputational damage, and this is discussed further below. However, I do not consider a criminal response is needed.

6.28. Stonewall also raised a concern about incidents of incitement to hatred rather than violence. That concern would be met if my recommendations in relation to stirring up of hatred offences are accepted.

Offensive material online which is not directed at an individual but stirs up hatred based on a protected characteristic

6.29. Chapter 5 of this report considers offences relating to the incitement of hatred. These may be particularly relevant to the third and fourth categories of harm identified above (comments intended to demonise particular groups, online radicalisation and the entrenchment of global hate movements). As noted in chapter 5, Scots law includes offences in Part 3 of the Public Order Act 1986 related to the stirring up of racial hatred. However, there are at present no offences relating to the stirring up of hatred on other grounds.

6.30. The offences on the stirring up of racial hatred have been used successfully to prosecute online hate speech. For example, in one English case the two accused were convicted of stirring up racial hatred through the distribution of Holocaust-denial material on the internet [53]. I am satisfied from the evidence before the review that the internet is used in practice by people who wish to spread hateful attitudes and opinions in relation to a number of groups. In addition to the type of material which could be covered by the existing racial hatred offences, I was told about the extent of abusive material online which it would appear is intended to stir up hatred of certain religious groups and of women.

6.31. I have set out my reasons for recommending that there be a suite of stirring up offences fully in chapter 5, and do not repeat those here. Suffice it to say that I consider there is an important place in Scots law for an offence which allows the courts to mark out the particularly egregious behaviour of arousing hatred of a group as a whole in other persons. This goes beyond activity where harassment or threats are directed at individuals or groups with protected characteristics. If the recommendations which I have made in chapter 5 are implemented, I anticipate that the resulting offences will be of use in the context of online hate speech.

Specific challenges in bringing prosecutions under these provisions for online behaviour

6.32. As noted above, there are some specific features of online offending which have been raised with the review.

Obtaining appropriate evidence

6.33. Two consultation responses specifically highlighted difficulties posed in proceeding with a prosecution in their case because of the technology used to commit the offence. They highlighted the difficulties in proving who had actually sent the message in question and felt that the requirement for corroboration in Scotland posed particular challenges.

6.34. I discussed rules on corroboration in chapter 3 (current statutory aggravations). As a general rule, no person may be convicted of a criminal offence in Scotland in the absence of corroborated evidence. This means that there must be at least two sources of evidence in respect of each essential element of the crime. Scottish Ministers have considered abolishing the requirement of corroboration, and commissioned Lord Bonomy to carry out a review of the safeguards that might need to be put in place if this were to happen. Lord Bonomy and his reference group reported in April 2015 [54]. The question of whether corroboration should be abolished generally, and whether any safeguards would be needed if that were to happen, remains under consideration by Ministers.

6.35. I have not identified any element of hate crime offending which would justify a different approach to the question of corroboration in this context when compared with any other offence. Questions about whether baseline offences should require more than one source of evidence do not therefore fall within the remit of this review. While I recognise the practical challenges of establishing appropriate evidence in online cases, I am not persuaded that any change to the legislation is appropriate here.

Dealing with crowd-sourced harassment / 'virtual mobbing'

6.36. A particular feature of online communication is that it may involve correspondence on a 'many to many' rather than 'one to one' basis. This can result in the phenomenon of crowd-sourced harassment or virtual mobbing. The House of Lords Select Committee on Communications issued a report in session 2014/15 entitled Social Media and Criminal Offences [55]. The Committee concluded (in relation to the law of England and Wales) that existing offences were generally appropriate to deal with the nature of offending which had been identified, although there were certain aspects that may be adjusted and gaps filled.

6.37. The Select Committee recognised the concern about identifying and prosecuting individuals in cases where the initial harassment might be fairly innocuous, but becomes magnified through the sheer volume of abuse which develops over time. The Committee concluded that the English common law principle of joint enterprise could apply, enabling the prosecution of members of a group acting with common purpose and intention. The courts would determine whether joint enterprise catches instances in which the people involved did not know each other and acted at different times and in different places.

6.38. I agree with this general approach. It is possible in Scots law for concerted action to arise spontaneously and give rise to art and part liability for the offence. I therefore do not think any recommendation for change in the law is required at this stage.

Jurisdiction/extra-territorial application

6.39. The global nature of the internet can give rise to specific challenges in establishing the jurisdiction of the courts of any particular country over accused persons. These were discussed in the context of debate on the Bill to repeal the OBFTCA, and it was suggested by some witnesses and MSPs that a provision to found extra-territorial jurisdiction for the courts could be justified to ensure that offences committed on the internet could be prosecuted in Scotland.

6.40. The English case of Sheppard and Whittle [56] illustrates the challenges in the context of a prosecution for stirring up racial hatred through the distribution of Holocaust-denial material on the internet. Sheppard uploaded the material to a website which he had set up but was hosted by a server in California. The material was accessible within the jurisdiction of England and Wales and the accused were convicted of offences under Part 3 of the Public Order Act 1986. They appealed on the basis that the material was published in the USA rather than England and Wales. The Court of Appeal mentioned three possible theories in relation to how a court's jurisdiction might apply to publications on the internet:

  • that jurisdiction lies with the country in which the server is hosted;
  • that jurisdiction lies with the country in which the material is downloadable;
  • that jurisdiction lies with the country which was targeted by the material.

The Court of Appeal did not need to express a preference between these theories, as it considered that there was no question that it had jurisdiction in the case, since the defendants were based in England, the material was written, edited and uploaded there and the defendants had control of the website in question.

6.41. I have considered the case law of the Scottish courts relating to jurisdiction in similar cases. In the case of a common law crime, the Scottish courts have jurisdiction if an act done outside Scotland has a practical effect in Scotland [57]. This rule has been considered in relation to statutory offences, the key decision being Clements v HM Advocate [58], which related to the supply of drugs where the activities in question took place in both Scotland and England. The evidence led by the Crown was solely to the effect that both accused had been involved in collecting drugs in London and in giving them to a co-accused who had travelled from Scotland, and who thereafter returned to Scotland by train. Both accused were convicted and appealed on the ground that the Scottish courts did not have jurisdiction to try them because the only evidence against them related to what they had done in London and there was nothing in the Misuse of Drugs Act 1971 which overcame the presumption that a criminal statute was not intended to have extraterritorial effect. The High Court of Justiciary was satisfied that conduct which occurs in Scotland, or conduct abroad which has had its result in Scotland, should be treated as amounting to a crime committed in Scotland. The court was satisfied that this result followed from the application of the accepted rules governing questions of jurisdiction, and did not require the assertion of any extra-territorial jurisdiction.

6.42. Applying an equivalent reasoning to online hate cases, I am satisfied that the Scottish courts would have jurisdiction where the harm arising from the act occurs in Scotland, even if acts leading to that harm in fact took place elsewhere. I do not therefore see any need to recommend a provision to confer extra-territorial jurisdiction in relation to hate crime or hate speech which is committed online.

Measures short of criminalisation: the role of social media providers

6.43. If the online environment is to change, and to become less of a place where some people feel that the wanton abuse of others is not only acceptable but a way to demonstrate their superiority, then a shift in attitudes is required. The prosecution of individuals will help in serious cases. However, as we have seen in chapter 2 (underlying principles), criminalisation is just one way in which attitudes may be shifted. It is important to bear in mind the whole suite of potential responses. I therefore highlight here certain other developments which may be relevant.

6.44. I am not arguing for the sanitisation of the internet: freedom of expression is important, even when it offends. However, it is also important to recognise that gratuitously offensive comments can create an environment where freedom of speech is a reality for some but not others. The Westminster Home Affairs Select Committee concluded:

It is essential that the principles of free speech and open public debate in democracy are maintained—but protecting democracy also means ensuring that some voices are not drowned out by harassment and persecution, by the promotion of violence against particular groups, or by terrorism and extremism. [59]

6.45. The problem is pernicious and requires a wider approach to ensure that the material in question is removed speedily. In practice, this requires social media providers to act more proactively in removing unacceptable content.

6.46. The precise way in which social media providers should become aware of relevant content and be encouraged or required to deal with it goes beyond the scope of this review. The Home Affairs Select Committee at Westminster is continuing its inquiry into hate crime and its violent consequences (which it had started before the May 2017 general election) and has taken evidence from social media providers, academics and regulators about the use of social media to perpetrate hate crime and how this might be tackled.

6.47. The UK Government also published its Internet Safety Strategy green paper in October 2017 [60]. This discussed a number of measures designed to improve online safety, with a particular focus on protecting users from harm which does not reach a criminal threshold. Two policy developments which I consider might be particularly relevant here are a proposed voluntary code of practice for social media providers under section 103 of the Digital Economy Act 2017, and a possible annual internet safety transparency report.

6.48. Section 103 of the Digital Economy Act 2017 requires the Secretary of State to publish a code of practice giving guidance to social media providers. The guidance should concern the action which it is appropriate for social media providers to take against the use of their platforms for online conduct directed against individuals which involves bullying, insults or behaviour likely to intimidate and humiliate. The guidance must deal with arrangements allowing users to notify the provider of the conduct and processes for dealing with such notifications. Effectively, the code of practice is therefore intended to cover the relationship between social media users and providers when the platform is used for bullying and similar behaviour, which could include online expressions of hatred. The code of practice would not affect how unlawful conduct is dealt with but might provide an alternative means for users to deal with online hatred.

6.49. If the transparency reporting proposals are adopted, social media providers would be encouraged to produce an annual report with UK-wide data showing:

  • the volume of content reported to companies, the proportion of content that has been taken down from the service, and the handling of users' complaints;
  • categories of complaints received by platforms (broken down by groups and categories, including under-18s, women, LGBT people, and religious grounds) and the volume of content taken down;
  • information about how each site approaches moderation and any changes in policy and resourcing.

6.50. The green paper was consulted upon between October and December 2017, and a Government response is expected shortly.

Conclusions

6.51. Having reviewed the existing legislation, I consider that the current suite of offences (if supplemented in accordance with my recommendations for a gender hostility aggravation and stirring up offences) is capable of being used to prosecute all of the examples of online hate crime and hate speech drawn to my attention which justify a criminal response.

6.52. It is worth noting that some of the examples of online behaviour which were noted by respondents to the consultation, while undoubtedly harmful, distressing and offensive, would not amount to hate crime falling within the scope of this review. Examples include incitement to self-harm and suicide, online fraud and impersonating another person online. A number of the forms of harm identified by Bakalis could apply to online abuse which is not also hate crime.

6.53. I have mentioned above that the UK Government has requested the Law Commission of England and Wales to carry out a review of the law relating to online offensive communications. The review is not focused on prejudice/hate communications, but will cover all forms of trolling, harassment and cyber-bullying [61]. The first phase will run from April 2018 and lead to a report before the end of 2018 analysing the effect of the existing law. If deficiencies in the current law are identified, the Commission has agreed to carry out further work looking at potential options for reform. The Law Commission's role is limited to the law of England and Wales. However, it is recognised that various offences in this area also extend to Scotland: the conclusions of that review should therefore also inform UK Government policy development which applies across the UK in relation to reserved matters. They may also be relevant to provisions of Scots criminal law which apply across reserved and devolved matters.

Recommendation 17

Recommendations 9 (gender hostility) and 13 (stirring up) will form part of an effective system to prosecute online hate crime and hate speech.

I do not consider any further legislative change necessary at this stage. However, I would encourage the Scottish Ministers in due course to consider whether the outcomes of the Law Commission's work on online offensive communications identify any reforms which would be of benefit to Scots criminal law across reserved and devolved matters.
