#AI is not neutral. It can discriminate and make mistakes. It should not be used to change information that informs life-changing asylum assessments.
Ask your MP to take a stand against the use of AI tools in #asylum assessments.
https://action.openrightsgroup.org/ban-ai-tools-asylum-decision-making

Ban AI tools in asylum decision making
Take action!

What’s the problem?

The Home Office is using two AI tools:

The Asylum Case Summarisation (ACS) tool uses ChatGPT-4 to summarise asylum interview transcripts.

The Asylum Policy Search (APS) tool summarises Country Policy and Information Notes (CPINs), guidance documents, and Country of Origin Information (COI) reports.

The Home Office’s own evaluation revealed that 9% of the ACS AI-generated summaries were so flawed they had to be removed from the pilot. It also found that 23% of caseworkers lacked full confidence in the tools’ outputs.
