AI Facial Recognition Technology in the Canadian Immigration System

This article was authored by Gideon Christian, PhD, a critical race scholar and Assistant Professor of AI and Law at the Faculty of Law, University of Calgary. His research seeks to highlight racial bias in artificial intelligence technologies.

The rapid adoption of artificial intelligence (AI) technology is permeating the Canadian public sector, from criminal to immigration law enforcement. AI is now increasingly being deployed in various aspects of the Canadian immigration system. Recent information emerging from Federal Court litigation about the use of AI facial recognition technology (FRT) in immigration enforcement, especially in refugee status revocation, has shocked many in the immigration law community, resulting in calls for openness and transparency in the government's use of the technology. This article examines recent developments in the use of FRT in Canada's immigration system, especially its impact on racialized members of Canadian society.

Facial recognition technology and racial bias

Facial recognition technology (FRT) is an AI-based biometric technology that uses computer vision to analyze and identify individuals based on their unique facial characteristics. It uses sophisticated algorithms to process images and extract unique features from them. These features are then used to compare an image against other images in a database for the purpose of identifying similar or identical faces. This makes FRT a suitable tool for identity verification. In the immigration context, for example, it can be used to match travellers with the photos on their travel documents. In the refugee context, facts from recent Federal Court litigation suggest that immigration authorities are now using the technology to compare photos of successful refugee claimants with photos in the immigration database to verify individuals' identities.
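
To make that one-to-many comparison concrete, the sketch below shows, in simplified Python, how such a system might score a probe photo against a database of stored faces. It is a minimal illustration under stated assumptions, not any vendor's actual implementation: real systems use a proprietary deep-learning model to turn a face image into a numeric vector (an "embedding"), which the random vectors below merely stand in for, and the 0.6 match threshold is an assumed value.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two embeddings: 1.0 = identical direction.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_against_database(probe: np.ndarray, database: dict,
                               threshold: float = 0.6):
        # Score the probe against every stored record and report, best
        # first, everything that clears the cut-off as a "match".
        scores = [(record_id, cosine_similarity(probe, emb))
                  for record_id, emb in database.items()]
        return sorted([s for s in scores if s[1] >= threshold],
                      key=lambda s: s[1], reverse=True)

    # Toy demonstration: random 128-dimensional vectors stand in for real
    # face embeddings; the probe is a slightly noisy copy of record_42,
    # as a second photo of the same person would be.
    rng = np.random.default_rng(0)
    database = {f"record_{i}": rng.normal(size=128) for i in range(1000)}
    probe = database["record_42"] + rng.normal(scale=0.2, size=128)
    print(match_against_database(probe, database)[:3])

The point of the sketch is the decision structure: a numeric similarity score is compared against a cut-off, and whatever clears the cut-off is reported as a match. If the underlying model separates some faces less distinctly than others, different people can clear that cut-off, which is precisely the kind of false match at issue in the cases discussed below.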

The use of FRT in the public sector has generated serious concern about its propensity to perpetuate racial bias and discrimination. FRT is almost perfectly accurate in recognizing White male faces, with an accuracy rate of about 99.2 per cent. When it comes to recognizing Black faces and other faces of colour, however, the accuracy rate drops dramatically, with the highest error rate occurring in the recognition of darker female faces, particularly those of Black women. The tendency of results from AI-powered tools such as FRT to be biased against people of colour has been referred to as algorithmic racism.

Algorithmic racism stemming from the use of AI FRT may arise from the underrepresentation of people of colour and darker-skinned individuals in the data used to train the FRT algorithm, resulting in misidentification. This problem has been traced to the fact that the AI FRT industry is dominated by White males, and the training data used to develop the algorithms are predominantly images of White males. This explains the high accuracy rate in identifying White individuals compared to those with darker skin. In the United States, this has resulted in wrongful arrests arising from false identification by AI FRT. Not surprisingly, all documented cases of wrongful arrest resulting from false identification in the United States have involved Black people, the most recent involving a Black woman who was eight months pregnant.
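
One way to see how a representation gap in training data can translate into unequal error rates is the toy simulation below. It is purely illustrative and fabricates nothing about any real system: it simply assumes, for the sake of argument, that a model trained mostly on one group produces less distinctive (more crowded-together) embeddings for an underrepresented group, and then counts how often two different people clear the same match threshold.

    import numpy as np

    rng = np.random.default_rng(1)
    DIM, THRESHOLD, TRIALS = 128, 0.6, 20_000

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def false_match_rate(spread: float) -> float:
        # Rate at which two DIFFERENT people exceed the match threshold.
        # 'spread' models how distinctively the model separates faces in
        # a group: a smaller spread means the embeddings crowd together.
        shared = rng.normal(size=DIM)  # component common to the group
        hits = 0
        for _ in range(TRIALS):
            person_a = shared + rng.normal(scale=spread, size=DIM)
            person_b = shared + rng.normal(scale=spread, size=DIM)
            if cosine(person_a, person_b) >= THRESHOLD:
                hits += 1
        return hits / TRIALS

    # Assumed spreads, for illustration only: the well-represented group
    # gets distinctive embeddings; the underrepresented group's crowd together.
    print("well-represented group :", false_match_rate(spread=1.0))
    print("underrepresented group :", false_match_rate(spread=0.5))

Run with these assumed values, the underrepresented group's false-match rate comes out many times higher even though both groups face the same threshold. In the refugee cases discussed below, a false match is exactly what turns two different people into "the same person" in the government's eyes.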

FRT in refugee revocation

One of the most controversial scandals arising from the use of FRT in the Canadian public sector involved the clandestine use of Clearview AI FRT by some police departments, including the RCMP. Aside from that well-documented controversy, one area where the use of the technology is now generating serious concern in Canada is its use by Canadian immigration authorities in the revocation of the refugee status of certain successful refugee claimants. To date, there have been reported cases involving five individuals who have challenged the revocation of their refugee status on the basis of information derived from the use of FRT. All of the cases involve Black individuals; four of them are Black women, the racial and gender group for which the technology has demonstrated its lowest accuracy rate. A review of these Federal Court cases reveals not just a lack of openness and transparency on the part of Immigration, Refugees and Citizenship Canada (IRCC) and the Canada Border Services Agency (CBSA), but also deliberate attempts by these government departments to keep Canadians, the court, and affected individuals in the dark about their use of the technology. The cases below illustrate this fact.

Barre v. Canada (Citizenship and Immigration), 2022 FC 1078 was the case that drew the attention of the immigration law community to the use of FRT by IRCC and CBSA in the revocation of refugee status. That case involved two Black Somali women who had made successful refugee claims in Canada. Following their successful claims, the Minister of Public Safety and Emergency Preparedness successfully brought an application at the Refugee Protection Division (RPD) to vacate their refugee status on the ground that they had misrepresented their identities as Somali citizens when they were in fact Kenyan nationals. It appears that IRCC had matched the photos of these women with those of two other individuals who had entered Canada as international students from Kenya. While the RPD accepted evidence of the photo comparison, it refused the women's request to compel the Minister to disclose the source of the comparison.

At the Federal Court, the women asserted that the Minister had used the controversial Clearview AI FRT. Consequently, a major issue in the judicial review proceeding was whether the Minister had used FRT in the photo comparison. The Minister tactically (albeit unsuccessfully) sought to evade the issue by invoking s. 22(2) of the Privacy Act, arguing that the provision 'allows law enforcement agencies to protect the details of their investigation'. The women produced empirical data and research showing FRT's high error rate in identifying darker-skinned women. Justice Go ruled that the Minister could not rely on s. 22(2) of the Privacy Act to evade disclosure of information relating to the technology used in the photo comparison. Accepting as fact that FRT was used in the photo comparison, Justice Go accepted the Applicants' description of the technology as an unreliable pseudoscience that "has consistently struggled to obtain accurate results, particularly with regard to Black women and other women of colour."

While Barre was the first reported case in which IRCC and CBSA sought to revoke the refugee status of a successful refugee claimant using FRT evidence, it was certainly not going to be the last. Shortly after Barre, another case with similar facts but a different outcome emerged. Abdulle v. Canada (Citizenship and Immigration), 2023 FC 162 also involved a Black Somali woman. The Minister of Public Safety sought to have her refugee status revoked at the Refugee Protection Division because her photo was matched to that of another individual in the database who was of Kenyan nationality. Unlike Barre, where the Claimants made some attempt (although unsuccessful) at the RPD to obtain disclosure of the technology used in the photo comparison, it appears that was not the case in Abdulle. On judicial review of the RPD's revocation decision at the Federal Court, the Applicant argued that the Minister must have used Clearview AI FRT to match her face against millions of others in the database. However, the Applicant had made no request for disclosure at the RPD or during the review at the Refugee Appeal Division (RAD).

In apparent continuation of its tradition of secrecy and lack of transparency in its use of the technology, the Minister denied using Clearview FRT and instead insisted that it had used "traditional investigation techniques". While noting that "the weaknesses of facial recognition software are common knowledge", Justice Mosley pointed to the weakness of the Applicant's judicial review litigation, which, according to him, was built on speculation without an evidentiary foundation as to the use of FRT in the photo comparison. According to Justice Mosley, the Applicant's assertion regarding the Minister's use of FRT was "undermined by the fact that she did not seek a direction for disclosure from the RAD on the methods or processes used but proceeded to make her appeal argument based on the assumption, without an evidentiary foundation, that such software had been employed." He noted that the argument "was merely speculation, particularly in the face of the Respondent's uncontroverted assertion that an exhaustive search had been conducted using "traditional investigation techniques". Whatever those techniques were, no inference can be drawn that they included facial recognition software in the absence of supporting evidence."

While noting Justice Mosley's criticism of the Applicant's judicial review litigation, one cannot help but raise concerns about the heightened level of secrecy and lack of transparency on the part of the Minister. Unlike Barre, where the Minister sought unsuccessfully to rely on s. 22(2) of the Privacy Act, in Abdulle the tradition of secrecy advanced to the coining of an extraordinary code phrase, "traditional investigation techniques", to conceal the Minister's use of FRT.

The case of AB v. Canada (Citizenship and Immigration), 2023 FC 29 was even more troubling in many respects. It highlights not just the problems with the use of FRT, but also the failure to disclose its use to affected individuals, and the transfer of personal information obtained through the technology from one government department to another without the knowledge or consent of the affected individual. In AB, a Black woman from Cameroon made a successful refugee claim in Canada. Many years later, she walked into a licensing office of the Ontario Ministry of Transportation (MTO) to have her photo taken for her Ontario driver's licence. Unknown to her, the MTO used FRT to compare her photo with other photos in its database, and the technology matched her face to that of another woman. The MTO secretly passed this information to IRCC, which successfully brought an application before the RPD to revoke her refugee status. During the RPD proceeding, the Applicant sought an order to compel the MTO agent to testify regarding the evidence used to match her face to another person. Not surprisingly, the Minister strongly opposed the order, and the RPD refused to make it.

The tradition of secrecy in the use of the technology continued in Ali v. Canada (Citizenship and Immigration), 2023 FC 671. There, the Minister's old tactic of relying on the Privacy Act succeeded at the tribunal stage. But when the matter reached the Federal Court on judicial review, apparently sensing that the Privacy Act defence would provide no shield, as was evident from the decision in Barre, the Minister sought to adduce new affidavit evidence. The Applicant opposed the move, and the court ruled the affidavit evidence inadmissible.

Conclusion

If Canada operates a system of government characterized as open and transparent, one wonders why government departments' use of this black-box technology continues to be marked by secrecy and a lack of transparency, even in judicial litigation. IRCC and CBSA's use of AI FRT remains shrouded in secrecy, quite contrary to the principles of openness and transparency that should guide the public sector's use of the technology.

Immigration and refugee lawyers, who often lack expertise in FRT, will increasingly encounter evidence derived from this black-box technology in their litigation. Perhaps a starting point in addressing the government's lack of transparency would be to file a disclosure request at the tribunal of first instance, a request the government will, not surprisingly, oppose. Even where the request is successfully opposed at the tribunal of first instance, it will be on record that the request was made, and that could make a difference on judicial review at the Federal Court. At the very least, it made a difference in Barre.

Postscript: On 22 August 2023, I published an op-ed in the Toronto Star drawing on preliminary research for this article. The op-ed was critical of CBSA and IRCC's use of FRT in refugee revocation proceedings. After the publication, I was contacted by the CBSA. The Agency denied that it uses FRT in immigration enforcement. That assertion, though, appears contrary to the facts in Barre.
