Immigration, Refugees and Citizenship Canada (IRCC) has rejected the permanent residence application of Nigerian scholar Kémy Adé, citing a mismatch between the job duties she submitted and her claimed Canadian work experience.
The refusal followed a review that reportedly involved generative AI, which attributed to Ms Adé duties including wiring and assembling control circuits, building control and robot panels, and programming tasks unrelated to her actual role.
Ms Adé, a post-doctoral research fellow and guest lecturer at McMaster University, expressed shock at the decision.
She holds a PhD from France’s Sorbonne University, specializing in the immunology of aging.
“I saw this description of a job that has nothing to do with me. I was disoriented about how this could happen,” she said.
Critics warned that relying on AI in complex economic immigration cases could undermine confidence in the system.
Toronto-based immigration lawyer Zeynab Ziaie, co-founder of AI Monitor, which tracks the use of AI in immigration in Canada and internationally, described the approach as a “black box.”
She explained, “You give it a prompt, and it can generate a refusal or an acceptance. The problem is you don’t know how it reaches the final decision.”
Ms Adé’s lawyer, Luka Vukelic, questioned whether any human officer had verified the AI-generated content, calling the refusal “incomprehensible.”
He alleged, “Somehow, it hallucinated my client’s job description. Something seriously went wrong here.”
The refusal came shortly after IRCC released its first AI strategy, detailing plans to use artificial intelligence to “boost efficiency, enhance service delivery and strengthen program integrity.”
Officials said AI tools have been in use since 2013 to flag fraud and streamline processing.
In response, IRCC maintained that a human officer made the final decision and that generative AI played no role in the judgment.
Ms Adé’s legal team has requested reconsideration, and the case file has been reopened for review.