The Department for Digital, Culture, Media & Sport (DCMS) has published its Online Harms White Paper.

It marks the start of a consultation, as DCMS gathers views on how it can regulate online services and tackle online harms. The aim is to “make companies more responsible for their users’ safety online, especially children and other vulnerable groups.”

What is within scope?

The consultation looks at two aspects: content that is illegal, for example copyright infringement, incitement to hatred, or the sale of illegal goods; and content that is unacceptable, covering behaviours such as bullying and abuse.

Then there is the issue of disinformation: how hostile actors can use online channels to undermine democracy by sharing propaganda and false content.

It applies to companies “that allow users to share or discover user-generated content or interact with each other online.”

Why is the UK government looking at this?

The white paper points to a range of regulatory and voluntary initiatives aimed at solving the issues outlined above. However, the government’s view is that these have not gone far enough, and that companies have failed to introduce measures that keep users safe in a consistent or effective way. The feeling is that self-regulation is not working and will not work.

This seems to be the latest example of an ongoing shift, with governments around the world stepping in to regulate social networks. Last week, the Australian government introduced strict new laws in this area in the wake of the Christchurch shooting. Providers that fail to remove abhorrent violent material from their platforms now risk fines of up to 10% of their annual turnover, and their staff could face time in prison.

New regulations to improve online safety will be introduced

The UK government wants to make companies more responsible for keeping their users safe and for removing harmful content from their platforms. An independent regulator will be set up to oversee this, with the power to issue sanctions as and when required.

As well as social media companies, file hosting sites, public discussion forums, messaging services and search engines will be covered by the regulations. They will have to show that they are fulfilling a “duty of care” to users, which is outlined in a code of practice (see below).

Online providers will have to prove they can be trusted

The paper also discusses the importance of transparency, trust and accountability. The plan is that online providers will have to report annually on “the prevalence of harmful content on their platforms and what counter measures they are taking to address these.” This will include details on how algorithms select content and serve it to users.

These reports will be published online by the regulator, allowing users to judge the platforms for themselves.

What’s in the code of practice?

Published on the same day, the code of practice has four principles:

  1. Social media providers should maintain a clear and accessible reporting process to enable individuals to notify social media providers of harmful conduct
  2. Social media providers should maintain efficient processes for dealing with notifications from users about harmful conduct
  3. Social media providers should have clear and accessible information about reporting processes in their terms and conditions
  4. Social media providers should give clear information to the public about action they take against harmful conduct

What happens next?

The white paper follows the publication of the Disinformation and “Fake News”: Final Report in February 2019 and will complement work by the Centre for Data Ethics and Innovation (CDEI).

The consultation is open until 1 July 2019. For more details, including how to respond, see the consultation page on GOV.UK.

It will be interesting to see what feedback is given. Much of the detail is open to interpretation, which may make it difficult to enforce regulations and codes of conduct consistently. For this to be successful, some consensus will need to be reached with online platforms and users so that there is clarity around the types of content being discussed and the potential impacts on audiences.


Photo by Luiz Felipe, via Unsplash
