London24NEWS

AI object, Your Honour! Immigration judges are using chatbots to check and draft rulings to deal with record court backlog

Immigration judges are using artificial intelligence to help draft rulings and have been given permission from officials to check their decisions with chatbots.

Judges in immigration cases have been trained to use a restricted version of Microsoft’s AI Copilot tool to help prepare for hearings and write skeleton judgements.

Britain’s justice system is creaking under a record backlog of immigration appeals, delaying the Government’s attempts to deport illegal migrants.

The number of asylum seekers appealing their rejected claims has nearly doubled in the last year to 104,400.

Those appealing are able to remain in taxpayer-funded accommodation, including hotels.

In February a Government advisor said AI should be used to help understand the risk of letting criminals go free.

Martyn Evans, chair of the Sentencing and Penal Policy Commission, said AI should have a ‘role’ in the criminal justice system and could be used by judges making decisions on whether to jail offenders.

And last year David Lammy, the Justice Secretary, said the system was ‘testing transcription in the courts and tribunals… and in the immigration and asylum chamber, some judges are using it to help formulate notes and write remarks’.

Immigration judges have received training on Microsoft’s AI assistant Copilot

Justice Secretary David Lammy previously said ‘some judges’ were using AI ‘to help formulate notes and write remarks’

Now The Observer has reported that training materials encourage judges to use AI to generate a ‘case outline’ – an overview of the parties’ evidence – and a ‘bundle summary’, which creates a timeline of events and outlines each side’s case.

It can also reportedly draw up a list of the disputed issues and use that to produce a ‘decision template’.

In a training video, Lord Justice Dingemans, senior president of tribunals, said judges could use AI and its ‘decision-making tree’ to make summaries of their findings on issues including anonymity, the case background, witness statements and arguments.

He added: ‘All of that work is pre-done. What that will do is mean that when you get to the hearing, you will be a better judge because you’re completely on top of the issues.’

Judges are expected to deliver decisions within two weeks of a hearing and have been told they must not use AI for analysis and that they alone are responsible for the judgement.

But a chatbot can review the decision against a summary of the evidence and submissions.

It can also be used to ‘comment on how fully the decision addresses matters raised in the evidence and submissions, identifying any omissions’.

HM Courts and Tribunals Service said AI would not contribute to analysis or balancing of evidence or the arguments presented.

It said chatbots could be used to convert judges’ audio recordings into text, which would be checked by the judge before being issued.

A spokesman said HMCTS welcomed the ‘appropriate use of artificial intelligence in supporting an efficient and effective courts and tribunals system’.

‘However, while technology may assist in some legal work and associated administrative tasks, it cannot replace the pivotal judgement and responsibilities required to make decisions on cases.’

In October an immigration barrister was accused of using AI to prepare for an asylum case after he was said to have baffled a judge by citing cases that were ‘entirely fictitious’ or ‘wholly irrelevant’.

Chowdhury Rahman was using software including ChatGPT to prepare his legal research, a tribunal heard.

The experienced immigration barrister was found not only to have used AI to prepare his work, but also to have ‘failed thereafter to undertake any proper checks on the accuracy’.