We've published a new legal opinion raising questions over the legality of using AI tools in the UK asylum system.

Applicants aren't being informed that decision-makers are using these tools, nor are they given the opportunity to correct errors in AI-generated summaries.

Find out more ⬇️

https://www.openrightsgroup.org/press-releases/home-office-use-of-ai-in-asylum-cases-likely-to-be-unlawful-legal-opinion-finds/

#asylum #ai #migrants #homeoffice #ukpolitics #ukpol

Home Office use of AI in asylum cases likely to be unlawful, legal opinion finds

The Home Office’s failure to inform asylum applicants that AI tools are being used in their assessments is likely to be unlawful, according to a legal opinion published today.

Open Rights Group

We need an immediate ban on the use of AI tools in the UK asylum process.

There are many ways to clear the Home Office’s backlog of asylum cases and raise standards – these tools are not the answer.

The UK government must not automate the hostile environment for migrants with AI.

Write to your MP ⬇️

https://action.openrightsgroup.org/ban-ai-tools-asylum-decision-making

#asylum #ai #migrants #homeoffice #ukpolitics #ukpol

Ban AI tools in asylum decision making

Take action!

What’s the problem?

The Home Office is using two AI tools:

- The Asylum Case Summarisation (ACS) tool uses ChatGPT-4 to summarise asylum interview transcripts.
- The Asylum Policy Search (APS) tool summarises Country Policy and Information Notes (CPINs), guidance documents, and Country of Origin Information (COI) reports.

The Home Office’s own evaluation revealed that 9% of the ACS AI-generated summaries were so flawed they had to be removed from the pilot. It also found that 23% of caseworkers lacked full confidence in the tools’ outputs.

Open Rights Group
@openrightsgroup I’m waiting, with much fear and loathing, for some idiot to suggest that AI would be the ideal way to clear our crowded Court system. There are many areas where AI MUST be banned.