Keeping consumers safe online: A new report outlining how and why to regulate tech platforms

Thursday 5 July 2018

Sky has commissioned an independent report from Communications Chambers, ‘Keeping Consumers Safe Online’, which explores legislating for platform accountability and online content.

The report is available in full here.

Executive Summary

Platforms regulate online content, but lack oversight

In a world of effectively infinite information, online platforms play a vital role in selecting, organising, ranking, recommending and suppressing content and content providers.

Platforms (or ‘online content intermediaries’) now find themselves on the frontline of online content regulation: an inevitable consequence of users’ ability to post content that ranges from illegal and seriously harmful, to legal but socially unwanted. Their rules operate explicitly, through community standards and terms of use, but also implicitly, through the sometimes unintended effects of personalisation algorithms.

There is currently no systematic means of assessing the impact of platforms’ content policies, algorithms and decisions, nor of holding intermediaries to account. Platforms are not above the law, but European law limits their liability to conditions in which they have ‘actual knowledge’ of illegal content.

A consensus is growing that further intervention is needed to address platforms’ role in governing online content, given its importance to the public interest in a host of areas. However, society’s expectations for this broader role have not yet been codified, either with respect to desired outcomes or good governance procedures.

Arguably, this is the single biggest gap in Internet regulation, although it could be addressed relatively easily. The state plays a role in setting standards in most other information markets, recognising the social harms and benefits of certain kinds of content. It could do the same here. Doing so need not conflict with European law, which anticipates that states may impose a duty of care on intermediaries in relation to illegal activity.

Intermediaries’ policies vary widely. Some variation is appropriate, as users’ expectations also vary, but consumers have a right to know what to expect from platforms. Today’s fragmented standards and processes, which are not always well advertised or effectively enforced, do little to help. Some platforms work closely with governments to address potential content harms; others do not.

A statutory framework for intermediary accountability

The UK Government has committed to bring forward a White Paper on online harms and safety, including a Code of Practice and Transparency Reports. The White Paper could establish a wider statutory framework for platform accountability for online content.

Such a framework would aim to:

  • Clarify what consumers can expect from intermediaries in their handling of harmful and illegal content;
  • Ensure intermediaries’ governance of online content is proportionate and accountable, and takes a fair and responsible approach to balancing rights;
  • In achieving these goals, recognise differences between intermediaries of varying size and different business models, and the need for regulatory certainty and an outcomes-based approach.

Core components of the framework would be:

  • A Code of Practice for content intermediaries, defining broad content standards and procedural expectations;
  • A List of intermediaries in scope for different tiers of obligation;
  • Incentives and sanctions to encourage intermediaries to adhere to the Code;
  • An independent oversight body, tasked with maintaining the Code and the List, requesting certain information from intermediaries, promoting consumers’ and intermediaries’ rights and responsibilities, and reporting publicly on the effectiveness of platforms’ content policies.

Many elements of this framework are contained in existing law and regulation, but they are not consistently brought together.

The envisaged Code would be broad and flexible enough to adapt to new concerns and platforms. Its requirements would be proportionate to evidence of harm, with the priority on illegal and seriously harmful content, and lower expectations for legal-but-harmful material. It would also differentiate by size, with reduced or no requirements for smaller platforms. The baseline requirements of intermediaries above a de minimis size would be to notify the oversight body, contribute to its costs of operation, and provide information or carry out a harm assessment in response to a specific, evidence-based and reasonable request.

The oversight body could be industry-led if intermediaries, with industry and Government support, can form an independent organisation able to make binding decisions, with a backstop regulator completing this co-regulatory model. The Advertising Standards Authority offers a precedent.

Alternatively, oversight could be provided by a statutory body (either an existing institution such as Ofcom, or a new body), in which case it should be funded by industry, as Ofcom is today.

Benefits and risks

In this model, responsibility for actual content regulation – policy development, notification and appeals, use of automated detection tools, human moderation – continues to sit with intermediaries themselves, who are best placed to govern platforms in users’ interests.

The purpose of these proposals is to provide better oversight of that activity, and thereby replace ‘regulation by outrage’ with a more effective and proportionate approach.

All stakeholders could benefit from such a model, including intermediaries, who would have greater clarity about what is expected of them, the legitimacy that comes from external scrutiny and validation, and defence against unreasonable or unevidenced requests.

Oversight needs to be cautious, and limited in statute, to mitigate potential risks to openness, innovation, competition and free speech. We believe this model does not require changes to intermediaries’ liabilities, although in the longer term a review of the E-Commerce Directive may be appropriate.

Recommendations

Government should include an accountability framework for online content intermediaries in the planned White Paper on online harms and safety. This should make provision for a Code of Practice, an oversight body, and incentives and sanctions.

Industry should consider the potential to form a co-regulatory body to provide independent oversight of intermediaries’ content policies, with buy-in from most platforms with significant numbers of UK users.

Government should develop options for a statutory oversight body, in case the industry option does not make sufficient progress within a reasonable time period.