The Secretary of State for the Home Department has today announced that the Home Office will discontinue its use of an algorithm that categorised visa applicants on racially discriminatory grounds.
The algorithm, known as the Streaming Tool, was used to process a range of applications for visas to enter the UK. The Streaming Tool allocated a Red, Amber or Green risk rating to relevant visa applications. Applications made by people holding ‘suspect’ nationalities received a higher risk score. Their applications received intensive scrutiny by Home Office officials, were approached with more scepticism, took longer to determine, and were much more likely to be refused.
The Joint Council for the Welfare of Immigrants (JCWI), a charity which advocates for the elimination of injustice within the immigration system, sought judicial review of the use of the Streaming Tool. JCWI was supported in its claim by Foxglove, a non-profit organisation which focuses on the fair use of technology. JCWI’s grounds of claim, and the Home Office’s decision to retire the Streaming Tool, can be found on Foxglove’s website.
JCWI argued that, in taking account of a visa applicant’s nationality, the Streaming Tool directly discriminated on the grounds of race, in breach of sections 13 and 29 of the Equality Act 2010. JCWI also argued that the Streaming Tool was irrational, because it created a ‘feedback loop’ in which applicants with ‘suspect’ nationalities were more likely to have their visa application rejected, with a high level of rejections in turn being used to justify keeping that nationality on the ‘suspect’ list. Furthermore, the Streaming Tool promoted confirmation bias in Home Office officials, who were encouraged to rely on the algorithm as a tool to aid their substantive decision-making.
In response to JCWI’s claim, the Home Secretary today confirmed that the Home Office will suspend the algorithm with effect from 7 August 2020, “pending a redesign of the process” which will consider “issues around unconscious bias and the use of nationality” in automatic visa systems. The Home Secretary also undertook to carry out and disclose Equality Impact Assessments and Data Protection Impact Assessments for any new system.
The claim is the first known successful legal challenge to an algorithmic decision-making system. It has received extensive press coverage, including from the Guardian, the BBC, and Sky News.
Nikolaus Grubeck and Ciar McAndrew acted for JCWI, led by Ben Jaffey QC. They were instructed by Rosa Curling and Erin Alcock of Leigh Day.