"Canada rejected her permanent residence application. Her job duties were made up — by Immigration’s AI reviewer"

Added wiring and assembling control to expected job duties... She's a doctoral researcher in the immunology of aging.

I'm really hoping that this sets a legal precedent for the standard of the duty of care that the government is held to when using technology that does not care for reality.

https://www.thestar.com/gift-redeem?t=2a2f2922-489d-47cb-b5ff-36b188263d9a

#cdnpoli


@mayintoronto

Not only is it a 'blackbox', it is almost certainly a *US* blackbox.

Why is GoC IRCC giving US corporations money it could spend on human reviewers who are in Canada? Not an efficient use of funds. Especially as the lawsuits pile up.

@Amgine They're probably using Cohere's models. Cohere's Canadian on paper.

@mayintoronto

<snort> Their primary HQ is in NY, they worked with the Biden White House on AI risks/responsibility.

Meh. They're basically Google, imo.

@Amgine Oh, I know. But they get the Canadian funding.
@Amgine maplewashing at its finest.

@mayintoronto

And South Korean funding, no doubt (they opened in Seoul last year, too.)

@mayintoronto It is terrifying that a government would be using AI.
@mayintoronto And it's not ok to say, "oh, well, mistakes will happen." :(
@CStamp @mayintoronto hahaha *laughs in the deposition of dogebros explaining that they used chatgpt to cancel arts grants*
@ricci @mayintoronto I would feel so tainted and violated if those asses had been handling my private data.
@CStamp @mayintoronto I think it depends on what kind of AI they are using, and for what.
@OutOfSpace @mayintoronto I can't think of a use that doesn't come with a lot of risk and a lot of frustration for users and people who need a person to talk to.
@CStamp @mayintoronto me neither, but I am not an administration person. There might be good cases, who knows. I think we have to treat the current AI as what it is: an improved search engine, although a very expensive one in energy consumption. All the rest is just marketing.
@OutOfSpace @mayintoronto It's not even that. It gives errors, so you don't know whether you should believe the results.
@CStamp @mayintoronto I agree, you have to be critical. But you have to be critical with Google and other search engines, or any source of information, too. If a moron uses AI, that's a bad thing. But a moron is a bad thing on its own. Given the context you were thinking of and referring to in your first post: AI should not be used in that context.
@OutOfSpace @mayintoronto It should also not replace human judgement or human customer support.
@CStamp @OutOfSpace @mayintoronto They are using AI to speed up a pro forma rejection of anyone from a population they deem as “surplus” or “unimportant.”

@OutOfSpace

This defense sounds exactly like what an AI-driven test account said to me last year, and the avatar was similar.

@CStamp @mayintoronto

@SnowyCA @CStamp @mayintoronto It's also a stupid defense of AI. It is a shittier search engine. Far, far shittier in every regard.

@driusan 💯 in agreement!

@mayintoronto @CStamp

@SnowyCA @mayintoronto @CStamp "It's like a search engine except it doesn't give you any results, doesn't have any way to find the sources, is non-repeatable, makes things up, and uses more energy."

@driusan @mayintoronto @SnowyCA @CStamp

Without being trite: the unintended but still erroneous and deadly consequences of using computers to assist human decision making are highlighted in this link.
As the professor says, it's a long read. Sobering.

https://zirk.us/@ChrisMayLA6/116294189557576927

Emeritus Prof Christopher May (@[email protected])

The Iranian children killed in the airstrike on the Shajareh Tayyebeh primary school early on in the US/Israeli assault on Iran were not as the popular trope increasingly has it 'killed by AI' but rather were killed by years & years of lazy & inaccurate bureaucracy; hiding behind AI allows the real culprit(s) to enjoy impunity from an atrocity & likely war crime. Kevin Baker's detailed exploration of what was behind the airstrike is a long read but worth it! #iran https://www.theguardian.com/news/2026/mar/26/ai-got-the-blame-for-the-iran-school-bombing-the-truth-is-far-more-worrying

@mayintoronto I had someone in the planning department of Auckland Council send me a ChatGPT summary of why I needed heritage approval to put double glazing in my daughter's apartment.
It's a 1920s bra factory. The fourth floor, where she lives, was added when it was converted to apartments in the 1990s. She is on the "garden" side, which means exactly no one can see her window frames unless they stand on a toilet in the building on the other side and crane their neck.
Outcome: my daughter is using more energy than she needs to. I know Iran gets all the press, but this is why we have an energy crisis.
I miss when humans used to think about things.
@rupert @mayintoronto Heritage foundations are the HOAs of century homes.
@mayintoronto Thanks for sharing. I wrote to my MP to object to the use of LLMs in decision making.

@mayintoronto

This is racism as a service. The bias machine can't contextualise a black woman having a PhD from the Sorbonne, so she must actually be a technician.

@mayintoronto

Obviously, also the use of a bias machine in the context of processing immigration applications is extremely concerning.

Surely, because these systems are known to be racist, this must violate existing laws?

@mayintoronto JFC. So much terrible about this story. At a macro level, Canada is in an unusual situation of being able to attract really smart people who might otherwise go to (or stay in) the US. Yet we manage to screw up the whole "attractive place to work" thing by being (at best) inept and bureaucratic. We could start to reverse 50 years of brain drain and seed the next generation of innovation, but we won't because we rely on stochastic parrots to make decisions about immigration. SMH.
@mayintoronto why does the URL look like a scam link? "gift-redeem?t=" is not a typical URL scheme for articles
@brianna It's the Toronto Star for you.
@brianna Most Toronto Star articles are paywalled. This is a gift link.
@mayintoronto “They have to put a process in place to make sure that the decision is made fairly,” said Adé, who blames both the AI and the person who reviewed the decision. “They cannot just trust tools like this to make such big decisions that will change people’s lives like this. I cannot be very confident in the system now.”

What about the person who mandated the AI? The person who manages the reviewer and puts pressure on them to get through as many files as fast as possible to meet their KPIs?