Proposals for Online Safety Bill getting worse, not better

17th March 2022 / United Kingdom

By Jim Killock: The government’s proposals for the Online Safety Bill are getting worse, rather than better. They include mandatory age checks for any website with any proportion of adult content – which would catch Twitter, Google search and Reddit – as well as duties to scour posts for “legal but harmful” content and delete it.

We were told during early consultations that neither would be on the cards, as the government wished to protect free expression. In early March, however, a government desperate to reclaim the political agenda wanted to appear active and capable of cracking down. As a result, we have a bill that is much more dangerous than it was six months ago, fixated on particular solutions that are likely to fail in practice.

The Online Safety Bill, however, is just one of many proposed regressive measures, which include proposals that would make data protection meaningless and unenforceable. Data protection, net neutrality and other areas of retained EU law will also come under swift attack and could be rewritten by executive fiat under the terms of the Brexit Freedoms Bill – freedom here meaning freedom for ministers, through less democratic accountability to Parliament.

Yet the current Ukraine crisis shows the dangers of this approach. Whether it is encryption for personal messages or Tor to guarantee access to websites, the ability to use security tools and the free flow of information that the Internet can bring are plainly critical underpinnings for the advance of democratic values. The result is that Nadine Dorries praises efforts to roll out encrypted communications to Ukraine, while elsewhere the government funds campaigns against encryption.

It is clear that the OSB would make it much harder, if not impossible, for social media companies to offer uncensored, unfiltered and privacy-preserving access to people in Russia and Ukraine. How would this bill’s attempts to undermine encryption, or to force identity and age verification, fit with Instagram’s efforts to offer encrypted messaging, or with Twitter and Facebook providing Tor services to people living under intense censorship and surveillance?

 

A history of digital failure

Failures to respect fundamental rights in the Online Safety Bill and efforts to undermine data protection are part of a pattern in the UK. For instance:

  • In the late 1990s, the UK legislated for ‘key escrow’, so that the government would always have a backdoor to encryption. The legislation was never implemented, as imposing these technologies proved impossible and unsafe in practice.
  • The 1998 Data Protection Act was hobbled from the start, with faulty definitions of personal data and consent, and a toothless regulator. These problems were eventually fixed in the 2010s and by the GDPR, which took effect in 2018; now the government wants to move back to something like the 1998 approach.
  • The Digital Economy Act 2010 became the subject of a massive campaign by ORG, 38 Degrees and others because of proposals to cut off whole families from the Internet if any family member was suspected of using sites like The Pirate Bay. Needless to say, the plans were eventually delayed and then dropped.
  • In the 2010s, the government persuaded ISPs to implement adult content filters under threat of legislation, claiming they would make the Internet safe for children. Take-up of these filters has been much lower than anticipated, as they apply to whole households and cause difficulties accessing ordinary websites. While they haven’t reduced teenagers’ access to adult content, they do cause routine problems for business owners, sex and drugs advice services, LGBT community websites and ISP customers, as the filters block the wrong websites.
  • The Digital Economy Act 2017 contained plans to implement age verification for adult websites while making no provision for privacy; the delays that followed from that failure led to the plans being dropped.

 

The current round of legislation tops even these shoddy efforts. The Online Safety Bill proposes to ‘solve’ safety issues through a ‘duty of care’ that would address ‘legal but harmful’ content. The Bill has not managed to find a means of defining what falls under these concepts; instead, Ministers or the regulator will decide. Worse still, the emphasis on content regulation will again punish the very minorities the Bill is meant to protect: whether through language and cultural barriers, or because sexuality is easily mistaken for adult content by algorithms, minority content will bear the brunt of the automated content adjudication that will be this Bill’s primary result.

Perhaps worse, the government has failed to recognise the interaction between data protection, competition policy and online harms, as explained in this letter signed by 25 organisations. The vast majority of harmful content and activity the government wants addressed is exacerbated by data profiling and content prioritisation that optimise for attention: behavioural profiling needs more data, more behaviour and more engagement in order to observe and make inferences about who you are and what you are interested in. In turn, this prioritisation of rage and anger becomes instrumental in maximising engagement for toxic business models and illegal online advertising systems.



It would be better to ensure enforcement of data protection legislation, to reduce data profiling and to remove the dangerous adtech that fuels the attention market. This would start to address online harms by putting a halt to perverse incentives. Yet the government proposes to make it legal for any company to profile, optimise and manipulate you on the basis of your data, so long as it is done to “improve customer experience”. This would open the door to far more online harms, not fewer.

 

Addressing the attention market

If we accept that the problem with online harms is largely down to an attention market that pushes prurient, shocking and provocative content to the fore, then we must also accept that fully addressing the problem means addressing that market.

It is now commonplace for people to remind each other that on Twitter, Facebook or Google, since you are not paying, you are the product. The game is to capture users, their data and their attention in order to monetise them.

What is needed, therefore, is to change the market so that users can switch service provider to wherever they find the most satisfactory experience, without the penalty of losing their social network, followers or friends.

This is a problem of competition policy, and it can be addressed through interoperability. That means, for instance, social media accounts that can be moved between services, and communication between users that works more like email or conventional phone networks, where the platform you use makes no difference to whom you can contact.

Here, the Competition and Markets Authority (CMA) is making good progress. However, a push towards interoperability and other competition measures would be disastrous when combined with weaker data protection. People will not wish to engage with new platforms or tools if it means they lose control of their data. High data protection standards are vital for social media competition.

 

Helping institutions improve

We can see two broad problems for the UK: politics and governmental institutions. Internet issues are commonly assumed not to be a voting matter for UK citizens – in contrast to Germany, for instance, where Internet freedoms are part of the coalition agreement. Similarly, departments frequently seem politically driven in their choice of evidence for policy decisions, while Parliament is under-resourced to tackle Internet policy and other matters of technical legislation. Brexit places these institutions under new and particular strain; hence the suggestion within the “Brexit Freedoms Bill” that the executive should rewrite former EU rules with minimal reference to Parliament.

This clash – between Parliament and the Executive over the right to hold our laws to democratic account – is a fundamental question about the nature of the UK in a post-Brexit world. With a reduced role for Parliament, the UK risks a long-term shift towards a bureaucratic and lobby-led state, vulnerable to populism and corporate raids on government resources, with poor policy and economic results across areas of former EU law.

These issues are bigger than Internet policy alone, covering environment, energy, employment and more. While groups like ORG can campaign in each area to fight off individual policies, or just as frequently watch them fail when they meet implementation and reality, we need our government to do much better than that.

 

Jim Killock is Director of Open Rights Group (ORG) – a UK-based digital campaigning organisation working to protect our rights to privacy and free speech online.

 

 
