Use of AI in the Tribunal – Brendan McGurk KC

VP Evans (as executrix of HB Evans, deceased) & Ors v The Commissioners for HMRC [2025] UKFTT 1112 (TC) is, in most respects, an unremarkable case management decision of the First-tier Tax Tribunal. The underlying appeals challenged Closure Notices issued by HMRC concerning Capital Gains Tax liabilities arising from tax planning arrangements involving offshore trusts and the application of double taxation conventions between the UK and New Zealand, and the UK and Mauritius. The application was brought by the Appellants for the disclosure of documents by HMRC. It was dealt with on the papers and granted in part.

More remarkable was the Tribunal’s postscript, entitled ‘The Use of AI’, in which Tribunal Judge McNall indicated that he had used AI in the preparation of his ruling and explained for what purposes the AI was deployed. He said ([48]): “I have used AI to summarise the documents, but I have satisfied myself that the summaries – treated only as a first-draft – are accurate. I have not used the AI for legal research.” The Judge used Microsoft’s ‘Copilot Chat’, available to judicial office holders through the eJudiciary platform. All data entered into Copilot Chat on that platform remains secure and private.

The FTT’s decision follows the publication, in April 2025, of “AI: Guidance for Judicial Office Holders”, which replaces guidance issued in December 2023. The refreshed guidance expands the glossary of common terms and provides additional detail on misinformation, bias, quality of datasets, and other areas of concern. Relevantly in the context of Evans v HMRC, it also advises judges to inform litigants that they are responsible for the AI-generated information they present to the court or tribunal, just as for any other type of evidence. The covering note to the refreshed guidance (signed by the Lady Chief Justice, the Master of the Rolls, the Senior President of Tribunals and the Deputy Head of Civil Justice) states: “The growing accessibility and relevance of AI in the court and tribunal system means it is important that its use by or on behalf of the judiciary is consistent with its overarching obligation to protect the integrity of the administration of justice.”

In Evans, Judge McNall also noted the ‘Practice Direction on Reasons for Decisions’, released on 4 June 2024, in which the Senior President of Tribunals wrote: “Modern ways of working, facilitated by digital processes, will generally enable greater efficiencies in the work of the tribunals, including the logistics of decision-making. Full use should be made of any tools and techniques that are available to assist in the swift production of decisions.” The Practice Direction clearly covers AI, and was endorsed by the Upper Tribunal in Medpro Healthcare v HMRC [2025] UKUT 255 (TCC). The Judge observed that the disclosure application before him was well-suited to the use of AI since it was a discrete case-management matter, dealt with on the papers, and without a hearing. “The parties’ respective positions on the issue which I must decide are contained entirely in their written submissions and the other materials placed before me. I have not heard any evidence; nor am I called upon to make any decision as to the honesty or credibility of any party” ([43]). He concluded by noting that he was mindful that “the critical underlying principle is that it must be clear from a fair reading of the decision that the judge has brought their own independent judgment to bear in determining the issues before them”: see Medpro at [43]. To that end he observed that he was the decision-maker, and was responsible for the material created by the AI. “The judgment applied – in the sense of the evaluative faculty, weighing-up the arguments, and framing the terms of the order – has been entirely mine” ([49]).

It is right and proper for Courts and Tribunals to indicate where they have used AI and to what end. Questions will no doubt arise as to when a Court or Tribunal must give notice of how it has been used, and what has been produced. One can well imagine arguments from litigants that they are entitled to see how, for example, an AI has summarised various matters and how the Judge has satisfied him or herself that the summary accurately reflects the materials. Analogies with the GDPR-conferred right to rectification might be drawn, albeit that any such analogous right would not be conditional on the processing of personal data. We are in the foothills of AI use by the judiciary. Empirical research is being undertaken to consider how litigants respond to various uses of AI by Tribunals. Transparency in the deployment of the least controversial applications of AI by Courts and Tribunals will begin to normalise AI use in these settings, meaning that we can probably expect more expansive uses of these tools in Courts and Tribunals in the not-too-distant future.


Publication of new practitioners’ text on Artificial Intelligence and Public Law

Brendan McGurk KC and Professor Joe Tomlinson of the University of York have published their new practitioners’ text on Artificial Intelligence and Public Law. The Government’s use of algorithm-based decision-making is rapidly expanding across many policy areas, including immigration, social security, regulation, security and policing. This book provides the first comprehensive analysis of how public law applies to the use of artificial intelligence and automation in the public sector in England and Wales.

Starting with an accessible account of the nature of AI and the automated systems being increasingly deployed in the public sector, the book covers the various legal regimes which regulate their use. It considers how the principles of judicial review might be deployed to challenge automated decision-making by public authorities. It also explains how equality law, human rights law, procurement law, data protection law and private law apply to government use of AI and automation. This book is a vital guide for practitioners in both private practice and government, and for anyone navigating this quickly changing, complex and uncertain environment.

Holger Hestermeyer – written evidence for the House of Lords International Agreements Committee

Holger Hestermeyer and Alex Horne have written a submission for the House of Lords International Agreements Committee’s inquiry on the review of treaty scrutiny.

This joint submission to the International Agreements Committee is based upon a research project undertaken by the authors for the Centre for Inclusive Trade Policy, which concluded in 2024. It considered the role of Parliament in scrutinising international agreements and proposed several ideas for reform. Both authors have practical experience of the issue, having recently worked for the UK Parliament undertaking treaty scrutiny.

Nicholas Khan KC on Global Competition Review (GCR)

Nicholas Khan KC has left the European Commission after two decades to join Monckton Chambers.

Khan joined the set today, after reaching the commission’s mandatory retirement age in September. He said Monckton Chambers is a very good fit for a lawyer like himself who has been immersed in EU law for many years.

To read the full article, please click here.

Azeem Suterwalla and Will Perry co-author public law and procurement law chapter of the Second Edition of the ‘Law of Artificial Intelligence’

Azeem Suterwalla and Will Perry have co-authored the public law and procurement law chapter of the Second Edition of the ‘Law of Artificial Intelligence’, which is published by Sweet & Maxwell. The chapter considers the increasing use by public authorities of AI, including commercial reasons to innovate with emerging technologies. The chapter covers the current relevant legal and regulatory framework and guidance for public authorities in England and Wales. It also identifies recent proposals for changes to that framework. It is hoped that this chapter will be relevant and informative to public authorities and those delivering public functions using AI, and also to private organisations seeking to supply to the public sector. The chapter will also be useful to parties who wish to carefully scrutinise the decisions of public authorities, including in the context of judicial review and public procurement proceedings.

Professor Carl Baudenbacher has testified before the Committee for Economic Affairs and Taxes of the National Council (the Grand Chamber) of the Swiss Parliament

On 12 February 2024, Professor Carl Baudenbacher testified before the Committee for Economic Affairs and Taxes of the National Council (the Grand Chamber) of the Swiss Parliament on whether Switzerland should in future accept the dispute settlement model of the EU’s Association Agreements with the former Soviet republics of Georgia, Moldova, Ukraine and Armenia. This model, which is based on a pro forma arbitration tribunal that must request a binding judgment from the ECJ if EU law is “implied”, was also discussed in the UK during the Brexit years. It is part of the Withdrawal Agreement, but was ultimately rejected for the TCA. However, the UK has left the single market, while the Swiss government wants to keep its country in the single market.

Carl Baudenbacher’s written paper: Institutional aspects of the planned Switzerland-EU Framework Agreement 2.0