We need more transparency about when and how AI is being used in the UK asylum process.

We need to know what safeguards are in place to prevent errors and who is accountable when AI gets it wrong.

Without these, vulnerable people will be harmed, and these tools will be rolled out across government.

#AI #asylum #migrants #ukpolitics #ukpol #digitalhostileenvironment

The digital hostile environment spreads with AI in the UK asylum process.

AI isn't neutral. It can be easily tweaked to produce preferred results.

Alongside the UK Home Office's ever-increasing hostile narratives, we ask: how can migrants trust such automation?

https://www.openrightsgroup.org/blog/saving-time-risking-lives-government-uses-ai-tools-to-inform-asylum-decisions/

#AI #asylum #migrants #ukpolitics #ukpol #digitalhostileenvironment #homeoffice

Saving time, risking lives: Government uses AI tools to inform asylum decisions

The automation of the hostile environment continues with the Home Office rolling out the use of AI in the asylum decision-making process.

Open Rights Group

The use of AI tools to clear the asylum backlog prioritises speed over accuracy.

The UK Home Office’s own evaluation reveals that 9% of AI-generated summaries were so flawed they had to be removed from the pilot.

And 23% of caseworkers lacked full confidence in the tools’ outputs.

#AI #asylum #migrants #ukpolitics #ukpol #digitalhostileenvironment

The UK Home Office has a track record of using controversial tech to target migrants.

They're not even telling people that AI is being used on their asylum applications.

Using these tools in critical situations without safeguards, transparency and accountability could lead to fatal results.

#AI #asylum #migrants #ukpolitics #ukpol #digitalhostileenvironment