11 September 2025
On 4 September 2025, in accordance with its duty under section 3(1) of the Law Commissions Act 1965, the Law Commission (the “Commission”) published its 14th Programme of Law Reform.
The Programme outlines ten new projects on which the Commission will focus over the next few years. These will sit alongside the seventeen existing law reform projects that the Commission is currently taking forward as recommendations for Parliament to consider.
One project of particular interest to public sector authorities is ‘Public sector automated decision-making’ (“ADM”). But what exactly does ADM entail? The Commission states that it “encompasses decisions made by or for public bodies using algorithmic processes and artificial intelligence”.
Across the country, ADM is already being used in areas such as policing, social services, child welfare, fraud detection and healthcare. Clearly, its use will only become more prevalent as technological advances continue, and the Commission raises several concerns with the current legislative approach (or arguable lack thereof) to ADM.
Specifically, it highlights that:
“there is no specific legal framework governing the use of ADM by the state to make decisions affecting the public: public law developed to ensure the accountability of human officials and not automated systems. […] At the same time, judicial review is not well-suited to scrutinising decisions made using ADM.”
Given these concerns, coupled with issues raised in industry commentary and specific case studies, it is no surprise that ADM has been selected as a project for the Commission’s new Programme.
The use of ADM in the public sector has been the subject of extensive commentary for many years. For example, in 2019, Philip Alston, the then UN Special Rapporteur on extreme poverty and human rights, reported on the use of ADM in welfare systems across a number of States.
Expressing concern over the introduction of the more demanding and intrusive forms of conditionality to which ADM may lend itself, he found that there had been “a complete reversal of the traditional notion that the State should be accountable to the individual”[1].
The report raised particular questions as to whether it is appropriate for areas such as welfare to be left to the design choices of technological developers, highlighting the sometimes rigid tension between the drive for optimisation and efficiency on the one hand, and the strict application of the rule of law and principles of “good government” on the other.
Alongside the concerns raised by legal commentators over the use of ADM in the public sector, the lawfulness of specific instances of ADM use has also been challenged in the courts.
R (Bridges) v Chief Constable of South Wales Police[2]: lawfulness of the use of live automated facial recognition technology (“AFR”) by South Wales Police
In this case, the Court of Appeal held in 2020 that South Wales Police’s use of AFR was unlawful and granted declaratory relief: the interference with Article 8 ECHR was not “in accordance with the law”, the force’s data protection impact assessment did not comply with section 64 of the Data Protection Act 2018, and the public sector equality duty under the Equality Act 2010 had been breached.
The judgment emphasised the need for a clear and sufficient legal framework (with robust safeguards and clear criteria) to govern the use of AFR technology, to ensure that any interference with privacy rights is lawful, necessary, and proportionate.
It also underscored the necessity for public authorities to conduct thorough data protection impact assessments that adequately address risks to privacy and ensure compliance with data protection principles.
The Court also stressed the ongoing duty of public authorities to consider the potential for indirect discrimination and to take reasonable steps to mitigate such risks, particularly in the context of novel technologies.
Two key administrative law principles support the argument that the use of ADM could benefit from legal reform. In particular:
The public law case of Anisminic v Foreign Compensation Commission[3] established the principle that decision-makers must take into account all relevant considerations and disregard irrelevant ones when making decisions.
However, in algorithmic systems, irrelevant factors introduced at the commissioning and building stages could pass through to the ultimate decision-making stage.
The use of ‘proxies’ (that is, stand-in variables adopted to avoid the direct use of sensitive information) by automated programmes at these earlier stages may therefore be problematic, and potentially unlawful from an administrative law perspective; a simple illustration of how a proxy can operate is sketched below.
In judicial review, the court’s role is not to step into the shoes of the decision-maker, but to assess the decision-making process that was followed. To do this, courts break the decision-making process down into its constituent parts and take a holistic view of whether the process (including its development, context and outcomes) is in line with public law principles.
In a similar vein, separating an ADM process into its earlier development and later decision-making stages before assessing the rationality of each would be more likely to build public confidence in the reviewability of ADM systems. Further, a more transparent approach to the wider socio-technical processes within which the technology is developed is a key way to increase accountability in this area.
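Purely by way of illustration (this example is not drawn from the Commission’s materials, and every name and figure in it is invented), the short Python sketch below shows how a proxy can operate in practice: a toy eligibility score that never refers to a protected characteristic directly can still produce outcomes that track it, because an input chosen at the build stage correlates with that characteristic.

```python
# Illustrative sketch only: a toy "eligibility score" for a fictional
# automated scheme. The score never uses the protected characteristic
# directly, but it does use postcode band, which (in this fabricated
# data) correlates with that characteristic, so the sensitive factor
# still shapes the final decision.

# Fabricated applicants: (name, postcode_band, protected_group)
applicants = [
    ("A", "band_1", "group_x"),
    ("B", "band_1", "group_x"),
    ("C", "band_2", "group_y"),
    ("D", "band_2", "group_y"),
]

# A commissioning/build-stage design choice: weight postcode band into the score.
POSTCODE_WEIGHT = {"band_1": 1.0, "band_2": 0.4}

def eligibility_score(postcode_band: str) -> float:
    """An apparently neutral automated score that embeds a proxy."""
    return POSTCODE_WEIGHT[postcode_band]

for name, band, group in applicants:
    decision = "grant" if eligibility_score(band) >= 0.5 else "refuse"
    print(name, group, decision)

# In this toy data every member of group_y is refused, even though the
# system never "considered" the protected characteristic explicitly.
```

The point is not that any real public sector system works this way, but that factors excluded on paper can re-enter through correlated inputs chosen earlier in the process, which is why those earlier stages matter when reviewing an automated decision against administrative law principles.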
Though the Commission is in the very early stages of its research on this topic, there are several areas it may investigate against the backdrop set out above, and these will be ones for public authorities to watch.
For now, the Government has encouraged public bodies to make use of AI tools.
In doing so, however, they will need to consider and implement the principles for safe and responsible use of these systems, as set out in the Government Digital Service’s Artificial Intelligence Playbook for the UK Government.
We are on hand to advise and assist public authorities in adapting to this evolving and uncertain area of law. It will be interesting to see how the Commission’s work on this develops over the next few years, and we will keep public authorities updated throughout.
Please sign up to our mailing list for updates.
To read more about the author, Beatrice Wood, please click here.
[1] 2019 Report of Philip Alston, UN Special Rapporteur on extreme poverty and human rights
[2] R (Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin); [2020] EWCA Civ 1058
[3] [1969] AC 147
[4] Of course, competing IPR and confidentiality considerations will be a relevant counter-consideration for lawmakers here.