Jeremy Darroch: How to regulate the online platforms

First published in The Times on 5 July 2018

Balancing freedom with responsibility has never been more difficult, nor more important, than in the era of the internet and social media. It now seems, after far too long, that a consensus around this view is forming and that those who have been most opposed are waking up to this urgent need. Leaders such as Mark Zuckerberg now accept that online platforms such as Facebook, Google, YouTube and Twitter need effective rules.

This is to be welcomed. The pressure on these companies to take responsibility can no longer be ignored. The Times has led the charge in exposing the terrible consequences of a complete absence of enforceable standards on these platforms and continues to surface issues that should concern us all – such as the shocking news that overuse of, and addiction to, online platforms is harming the mental health of our children, as evidenced by the 5Rights group in its Disrupted Childhood report.

So how do we take action now and minimise the damage caused in future? And how do we do that while allowing the internet to continue to be the greatest ever hub of innovation and ideas sharing the world has ever seen?

It is perfectly possible for these platforms to continue to entertain and inform millions while still being accountable and responsible for how they deal with the content they host.

Companies operating in the broadcasting, telecoms and advertising sectors, where consumer safety is of fundamental importance, know from their experience that such an objective is achievable. That is why we are confident that a pragmatic and proportionate approach to dealing with online harms can and should be found.

Let’s start with the most obvious point. Purely voluntary approaches have failed. Digital and social platforms cannot be allowed to mark their own homework. The issues are too big and far too serious.

We as a society must find effective ways to stem the flow of hate, abuse, and offensive, illegitimate, and even dangerous content online. And we have to move fast.

The time for codes of conduct from the industry, or slaps on the wrist from politicians, has passed. Only legislation can provide teeth, enforcement, and real consequences for breaches of standards. It is also the only route to ensure clarity and uniformity of regulation.

Many online platforms have developed their own policies for dealing with illegal or harmful content. But those policies are not standardised across platforms and there is no public, political or independent oversight.

Statutes can and must provide an accountable legal framework.

We also need transparency. All online platforms must be required to report on the standards they seek to follow and the efforts they make to comply; their performance, not just their promises, must be independently and rigorously assessed.

All of this makes a strong case for the essential prerequisite for change: a strong, independent regulator.

This independent regulator, established in statute, needs the power to define the online platforms within the scope of the law, identify areas of concern, compel them to provide information and oversee the actions they take.

A regulator must have sharp teeth, starting with strong information-gathering powers, the power to initiate enquiries, and the ability to impose effective sanctions, including fines for non-compliance. This will take resources and funding, and as in other sectors this should be financed by the community regulated – the online platforms themselves.

Importantly, none of this impedes freedom of expression or technological innovation. Rather, it mirrors approaches that have been tried and tested in other UK sectors, such as those we adhere to in broadcasting, telecoms and advertising. We expect even the smallest media companies to invest heavily in editorial and compliance so that every second of coverage meets the standards that society, through Parliament, has laid down. It is simply wrong that some of the largest, most profitable companies on the planet should not be expected to meet, if not exceed, the same level of responsibility.

We are far from alone in advocating changes like these. Charities, the police, and community and victim groups are all clamouring for regulation, not a free-for-all, in our most important new public space. The time has come for action.

That is why we have published today an independent report into platform accountability and online harms. We have also written to the Secretary of State for Digital, Culture, Media and Sport and the Home Secretary outlining how we can move this debate on to practical discussions on what a legislative framework could look like. 

With a Government White Paper promised this year, politicians must now move to passing the laws we need to protect our children, our economy and our collective safety.