By Abe Chauhan
This essay was awarded third place in the UCL CAJ SPBC writing competition, in response to the prompt: “Technology is a useful tool for furthering access to justice”.
Many are heralding a new era of ‘posthuman governance’ in which complex social issues are ‘deconstructed into neatly defined, structured and well-scoped problems that can be solved algorithmically’. However, new technologies are not neutral, egalitarian tools but human creations, and their effects on access to justice depend on how they are designed, deployed and operated by public authorities. Promises of a digital technocracy disguise ruthless cost-cutting measures which will likely perpetuate structural injustice and worsen access to justice. In particular, the digitisation of courts and tribunals could introduce access barriers and offend principles of natural justice (Part II), and automated decision-making (“ADM”) systems look set to outpace the development of traditional judicial review doctrines, with the result that claimants’ rights may become materially unenforceable before the courts (Part III). As long as technological solutions are developed to prioritise efficiency in monetary terms, access to justice will remain under threat.
Accessing Online Courts
In order to address the growing number of cases before the courts as well as consistent budget cuts, the Ministry of Justice envisages the introduction of a digital justice system in which all cases begin online and some no longer require in-person proceedings at all. This transition to online courts is being accelerated by the COVID-19 health crisis and its impact on in-person proceedings. Automated adjudication was trialled with the Parking Appeals Service and delivered on promises of cost-efficiency. However, the proliferation of online courts will create access problems for many claimants. Around 22% of the population do not have the digital skills necessary for day-to-day life, 19% cannot turn on a device or use an app, and 8% are offline entirely; these individuals are disproportionately from lower-income households or of advanced age.
Online adjudication may also violate common law principles of natural justice which require not only that a claimant is able to participate in fair and unbiased hearings in order to effectively make their case, but also that the justice system does not engender a sense of injustice. It has been shown that a claimant’s chances of success fall when they are denied oral hearings, which will likely be the case in many online court settings. Alongside instrumental concerns, face-to-face interactions allow litigants to achieve emotional engagement; this disappears in a dehumanised online system. Digitisation is likely to become especially problematic in contexts like welfare benefits entitlement hearings; it is doubtful that vulnerable claimants can be protected properly through digitised adjudication platforms.
Challenging Automated Decisions
Access to justice necessitates not only access to adjudication but also the ability to assert rights before courts. This can become practically impossible where developments in public authority decision-making systems outpace those in judicial review such that decisions become unreviewable. Public authorities are beginning more widely to use tools such as machine learning algorithms (“MLAs”), which infer correlations between sets of variables with a high degree of accuracy, as part of ADM systems in order to save costs on employing human decision makers. These are already used by local authorities to predict at-risk homes for intervention in order to prevent child maltreatment and by the police to confer individual risk scores and to influence custody decisions.
Efficiency, though, comes at the cost of transparency. Judicial review principles are modelled on assessing how a decision maker reached a decision, for example whether she discriminated against groups protected under the Equality Act 2010. However, the exact correlations which MLAs infer from input data in many cases cannot be determined by operators because of opacity, whether technical (illiteracy in the face of the MLA’s complexity) or legal (intellectual property protection over the formula and data). This means that, with some automated decisions, public law error such as discrimination will be impossible to ascertain. For example, an MLA used in parole decisions to predict the likelihood of reoffending could ascribe a higher risk value to a black individual than to a white individual on the basis of race; this may only be discoverable by carrying out ex post algorithmic auditing once the ADM system has been applied in thousands of cases.
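The ex post audit mentioned above can be illustrated with a minimal sketch. All data, field names and thresholds below are hypothetical; the point is simply that disparity across a protected characteristic only becomes visible once the system’s past outputs are aggregated, for example via the ‘four-fifths’ disparate impact rule of thumb used in discrimination analysis.

```python
# Minimal sketch of an ex post algorithmic audit: given risk scores an
# ADM system has already assigned, compare high-risk rates across a
# protected characteristic. All data and thresholds are hypothetical.

def high_risk_rate(decisions, group, threshold=7):
    """Share of a group's cases scored at or above the risk threshold."""
    cases = [d for d in decisions if d["group"] == group]
    flagged = [d for d in cases if d["score"] >= threshold]
    return len(flagged) / len(cases)

def disparate_impact_ratio(decisions, group_a, group_b, threshold=7):
    """Ratio of the two groups' high-risk rates; values below 0.8 (the
    'four-fifths' rule of thumb) suggest the system warrants scrutiny."""
    rate_a = high_risk_rate(decisions, group_a, threshold)
    rate_b = high_risk_rate(decisions, group_b, threshold)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical audit log of past automated decisions.
log = (
    [{"group": "A", "score": 9}] * 60 + [{"group": "A", "score": 3}] * 40 +
    [{"group": "B", "score": 9}] * 30 + [{"group": "B", "score": 3}] * 70
)

print(round(disparate_impact_ratio(log, "A", "B"), 2))  # 0.5
```

No individual decision in this log reveals the pattern; only the aggregate comparison does, which is why such disparities may surface only after thousands of cases have been decided.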
Systemic review cases initiated by civil society organisations could be essential to challenging ADM systems. However, the Conservative Party has shown hostility towards public interest litigation, emphasising that judicial review ‘is available to protect the rights of the individual…not…to conduct politics by another means’. It remains to be seen whether the Administrative Court will be free to develop principles for the review of automated decisions with the looming threat of retaliatory measures from the Government.
New technologies promise to radically alter the way in which all public services are delivered and, unlike physical infrastructure overhaul, to do so within the space of a few years. This warrants a cautious approach. In the same way that exposing the provision of public services to market forces led in many areas to a fall in service quality and the watering down of individual protections, moving towards a digital technocratic state risks the adoption of an amoral conception of administration and the exacerbation of the ongoing decline of access to justice.
 Hughes, ‘Algorithms and Posthuman Governance’ (2017) 1 JPS 166.
 Janssen and Kuk, ‘The Challenges and Limits of Big Data Algorithms in Technocratic Governance’ (2016) 33 GIQ 371, 371.
 Ministry of Justice, ‘Transforming our Justice System’ (2016) 6.
 Raine, ‘Modernising Courts and Tribunals through ICTs – Lessons from The London Parking Appeals Service’ (2001) 9 IJLIT 115, 126: the cost to run the PAS system is around £213/hour, compared to around £250/hour for the Magistrates’ Court.
 ‘The digitally disadvantaged’ in Lloyds Bank, UK Consumer Digital Index 2019 – Key Findings (2019).
 R (Osborn) v Parole Board [2013] UKSC 61.
 Elliott and Thomas, Public Law (3rd edn, OUP 2017) 409-410.
 Burton, ‘Justice on the Line? A Comparison of Telephone and Face-to-face Advice in Social Welfare Legal Aid’ (2018) 40 JSWFL 195, 210.
 Hackney London Borough Council.
 Avon & Somerset Police.
 Durham Constabulary.
 Burrell, ‘How the Machine “Thinks”: Understanding Opacity in Machine Learning Algorithms’ (2016) 3 BD&S 1.
 Angwin et al, ‘Machine Bias’ ProPublica (New York, 23 May 2016).
 See eg Lord Chancellor v Detention Action [2015] EWCA Civ 840.
 The Conservative and Unionist Party Manifesto 2019, 48.
 On retaliation, see Harlow and Rawlings, ‘“Striking Back” and “Clamping Down”. An Alternative Perspective on Judicial Review’ in Bell et al (eds), Public Law Adjudication in Common Law Systems: Process and Substance (Hart 2016).