Reversing the Ratchet

Busy times for me at the moment, but this is an aide-mémoire / placeholder for some later posts.

It’s now undeniable that the current British Government is damaging our democracy. Several measures, either proposed or enacted, strengthen the power of the executive, reduce accountability and/or threaten free speech.

  • The intent to scrap the Human Rights Act
  • The measures in the Police, Crime, Sentencing and Courts Act 2022 which allow the suppression of protests which cause a ‘nuisance’
  • Eroding the independence of the Electoral Commission
  • Insisting on the regressive ‘First Past The Post’ method for elections that previously used something more proportional.
  • The Online Safety Bill, which would impose impossible moderation standards onto social media companies and hand too much power to the Government to suppress speech it doesn’t like
  • Measures to constrain Judicial Review
  • New plans to curb the rights of workers to strike
  • The undermining of ministerial standards and accountability, as demonstrated by the way the Prime Minister ignored the findings of a report that the Home Secretary bullied civil servants
  • The Covert Human Intelligence Sources (Criminal Conduct) Act 2021, which allows the security services to authorise criminal conduct in new, unaccountable ways

These are the ones I can think of off the top of my head. There are probably more.


Time to Ditch ‘Word Count’ in Favour of Bytes

There’s an amusing detail in the judgment of Mr Justice Peel in WC v HC (Financial Remedies Agreements) [2022] EWFC 22:

The parties’ s25 statements were limited to 20 pages of narrative.  Para 5.2 of PD27A mandates that narrative statements, among other documents, shall be typed in “a font no smaller than 12 point and with 1½ or double spacing”. H complied. W’s statement purported to comply in that it consisted of 20 pages, but because it used smaller font and spacing it was, in fact, about 27 pages compressed within the 20 page limit provided for by me.

— Paragraph 1(i)

This is a classic tactic, used by students the world over since the dawn of the word-processing age. When I did it as a schoolboy, the aim was to increase the margins and font spacing so that one had to write less. Here, the tactic was deployed in order to write more.

Since we routinely use computers for everything, it’s time we abandoned the analogue concept of ‘pages’ as the standard for submissions. Why not simply specify a word count?

Or better still, bytes. There are 1,498 bytes of text in this blog post, for example. A 20-page document typeset at 12 point with 1.5 line spacing amounts to 45 to 50 KB of text. Imposing a rule based on data volume would kill off any typesetting trickery, and would also incentivise plain language, because drafters would not be penalised for using three shorter words in preference to one longer word.
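A byte-based rule would also be trivially easy to check by machine, unlike a page count. As a minimal sketch (the 45 KB limit and the function name are illustrative, drawn from the rough estimate above, not from any court rule):

```python
# Sketch of a byte-based submission limit, as argued for above.
# The 45 KB figure is this post's rough estimate for a 20-page
# statement; it is illustrative, not an official limit.

def within_byte_limit(text: str, limit_bytes: int = 45_000) -> bool:
    """Return True if the UTF-8 encoding of `text` fits the limit."""
    return len(text.encode("utf-8")) <= limit_bytes

statement = "The parties agree that... " * 100
print(len(statement.encode("utf-8")))   # size in bytes
print(within_byte_limit(statement))
```

No amount of fiddling with fonts, margins or spacing changes the result, because the check never looks at the typesetting at all, only at the encoded text itself.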

(Hat-tip to Gordon Exall and the superb Civil Litigation Brief blog)

Online Safety Bill: Sweeping Ministerial Harms

A third post in a trilogy of analyses of the draft Online Safety Bill.

The Joint Parliamentary Committee scrutinising the government’s Draft Online Safety Bill concluded its evidence sessions on 4 November. The group of MPs and Peers are now writing their report, which will include recommendations for amending the Bill to address the issues identified by those who gave evidence.

One area of particular concern to human rights groups, including ORG, is the powers given in the Bill that would allow the Secretary of State to direct and influence the work of the regulator, and therefore interfere with how the social media companies operate their services.

Read the rest of this post on the Open Rights Group blog.

What’s The Harm In The Online Safety Bill?

Another post analysing aspects of the draft Online Safety Bill.

Throughout the development of the government’s Online Harms policy, a central concern of ORG and other human rights organisations has been how any legally mandated content moderation policy could practically be achieved. The algorithmic moderation deployed by most social media companies is notoriously literal, and the human review of content is often performed by people who are unaware of the context in which messages are sent.

These flaws result in false positives (acceptable content being removed) and false negatives (unacceptable content remaining visible).

The draft Online Safety Bill considers two distinct types of content: illegal content, and content that is legal but which has the potential to cause harm. The social media companies will have to abide by OFCOM’s code of practice in relation to both.

How these two types of content are defined is therefore crucial to the coherence of the new regulatory system. Ambiguous definitions will make it harder for social media platforms to moderate their content. If the new system causes more acceptable content to be taken down, while allowing illegal and/or harmful content to remain on the platforms, then the law will be a failure.

Read the rest of this post on the Open Rights Group blog.