Facial Recognition Technology: The Privacy Commissioner Responds in Canada

From unlocking phones to assisting in missing person investigations, facial recognition technology (“FRT”) has proven useful. However, as FRT is adopted by a growing number of organizations in the public and private sectors, it has engendered controversy and drawn the scrutiny of privacy regulators in Canada.

Broadly speaking, FRT functions in four key sequential steps:

  1. A database of images of faces and associated data is created;
  2. Algorithms are used to produce numerical representations of faces based on the geometry of the faces in the images;
  3. A new image is assessed against those biometric identifiers and matched to images in the database;
  4. The FRT provides a list of likely matches.
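The matching loop in steps 2 through 4 can be sketched in miniature. The snippet below is an illustrative toy, not a real face-encoding algorithm: the “numerical representations” are stubbed as short hypothetical vectors, and matching is done by cosine similarity, a common way of comparing embeddings.

```python
import numpy as np

def match_face(probe, database, top_k=3):
    """Rank database entries by cosine similarity to the probe embedding."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = [(name, cos(probe, emb)) for name, emb in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]

# Step 1: a database of enrolled faces (here, made-up embedding vectors).
db = {
    "person_a": np.array([0.9, 0.1, 0.2]),
    "person_b": np.array([0.1, 0.8, 0.3]),
    "person_c": np.array([0.2, 0.2, 0.9]),
}

# Steps 2-3: a new image is reduced to an embedding and assessed
# against the biometric identifiers already in the database.
probe = np.array([0.85, 0.15, 0.25])

# Step 4: the system returns a ranked list of candidate matches,
# not a single definitive identification.
results = match_face(probe, db)
```

The fact that the output is a ranked candidate list, rather than a single answer, is one reason FRT results typically require human review before being acted on.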

The Office of the Privacy Commissioner of Canada (“OPC”) has responded to the increasing use of FRT by private companies and government organizations by investigating three organizations that create or use FRT. Canada’s private sector privacy legislation, the Personal Information Protection and Electronic Documents Act (“PIPEDA”), defines “personal information” to mean “information about an identifiable individual”. Canada’s public sector privacy legislation, the Privacy Act, uses essentially the same definition, but specifies that the information must be “recorded”.

In his recent investigative reports, the OPC has taken the consistent stance that images of faces, and the biometric facial data created from them, are “personal information”, and that any collection, use or disclosure of that information is subject to Canada’s privacy laws. The OPC takes the position that biometric information, such as numerical representations of faces, is “sensitive in almost all circumstances because it is intrinsically, and in most instances permanently, linked to the individual”.

The Law

Both PIPEDA and the Privacy Act deal with consent, limit collection of personal information, and create requirements around the retention and disposal of personal information. However, the obligations imposed on private organizations under PIPEDA differ from those imposed on government organizations under the Privacy Act.



PIPEDA states that the knowledge and consent of the individual is required for the collection, use, or disclosure of their personal information. It further sets out the requirements for valid consent and requires organizations to take the sensitivity of the information into account when determining the form of consent to seek:

Valid Consent

6.1 For the purposes of clause 4.3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.

The Privacy Act limits the use and disclosure of personal information to the purpose for which it was collected, unless the individual consents to another use or disclosure. It further requires government organizations to inform the individual of the purpose for which their information is being collected:

Individual to be informed of purpose

5(2) A government institution shall inform any individual from whom the institution collects personal information about the individual of the purpose for which the information is being collected.


PIPEDA limits the collection, use, or disclosure of personal information to that which a reasonable person would consider is appropriate in the circumstances:

Appropriate purposes

5(3) An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.

The Privacy Act limits collection to information that is directly related to a (lawful) operating program or activity:

Collection of personal information

4 No personal information shall be collected by a government institution unless it relates directly to an operating program or activity of the institution.


PIPEDA requires personal information to be destroyed or depersonalized when it is no longer needed to fulfill the identified purpose for its collection:

Principle 4.5.3 of Schedule 1

Personal information that is no longer required to fulfil the identified purposes should be destroyed, erased, or made anonymous. Organizations shall develop guidelines and implement procedures to govern the destruction of personal information.

The Privacy Act requires personal information to be disposed of in accordance with regulations and ministerial directives:

Disposal of personal information

5(3) A government institution shall dispose of personal information under the control of the institution in accordance with the regulations and in accordance with any directives or guidelines issued by the designated minister in relation to the disposal of that information.

The Investigative Reports

  1. Numerical representations of faces are personal information

In PIPEDA Report of Findings #2020-004 the OPC, along with his provincial counterparts in Alberta and British Columbia, found that Cadillac Fairview contravened PIPEDA’s retention and consent requirements in its use of ‘Anonymous Video Analytics’ (AVA) technology to assess the age range and gender of visitors to Cadillac Fairview’s malls. AVA functions by (1) taking temporary digital images of the faces of any individual near the camera; (2) using FRT to convert those images into biometric numerical representations of each face; and (3) using that information to assess the age range and gender of the individual in the digital image.

Cadillac Fairview retained the numerical representations of faces created by AVA without an identified purpose and without explanation in contravention of the retention requirements at Principle 4.5.3 of PIPEDA. The age range and gender information was truly anonymous and could be retained without violating this principle.

The OPC found that Cadillac Fairview did not ensure valid consent and notice for its collection and use of images of individuals in the malls and the creation of numerical representations of faces, in contravention of the consent requirements at Principle 4.3 and s.6.1 of PIPEDA. The OPC concluded that mall goers would not reasonably expect their image to be captured and used to create numerical representations of their face, nor would they expect the numerical representations would be used to assess their age and gender. For this reason and because the numerical representations are sensitive personal information, Cadillac Fairview ought to have obtained express opt-in consent from each individual whose image was captured and processed by AVA.

  2. Populating FRT databases from online sources must be done for appropriate purposes and with valid consent

In PIPEDA Report of Findings #2021-001 the OPC, along with his provincial counterparts in Quebec, British Columbia, and Alberta, found that Clearview AI contravened PIPEDA’s consent and appropriate purpose requirements in relation to its FRT tool. The database used in Clearview’s FRT consisted of over three billion images of individuals and numerical representations of those images. The images were scraped from publicly accessible online sources, including social media. Clearview did not seek consent from the individuals whose information it collected, on the basis that the information was publicly available. Clearview markets its technology for use by law enforcement agencies to identify suspects caught on video.

By collecting images and creating numerical representations of those images without consent, Clearview contravened the consent requirements at Principle 4.3 and s.6.1 of PIPEDA. While PIPEDA exempts some publicly available information from the consent requirement, information from public websites, including social media, does not fall within the ‘publicly available’ exception.

Furthermore, Clearview contravened the appropriate purposes requirement at s.5(3) of PIPEDA, which applies even when valid consent is obtained. Clearview’s collection and use of images was inappropriate because the images (and the associated numerical representations of faces) were: collected for purposes unrelated to those for which the images were originally posted; retained indefinitely; used to the detriment of the individual (e.g., for investigation and prosecution); and used in a way that creates a risk of significant harm to the individuals whose images are captured by Clearview.

  3. Using a third-party’s FRT tool constitutes the collection of personal information

Further, in the Privacy Act Report of Findings on the RCMP’s use of FRT tools, the OPC found that the RCMP contravened the Privacy Act’s collection requirement by using Clearview AI’s FRT, because the images in Clearview AI’s database had been collected without consent (i.e., unlawfully). To use Clearview’s FRT, the RCMP would upload an image to Clearview. The image would be transformed into a numerical representation and compared against the numerical representations in Clearview’s database. Clearview would then display likely matches in the form of images hyperlinked to the internet addresses from which they were scraped.

The OPC found that the displaying of likely matches constitutes a collection of personal information and is subject to the Privacy Act’s collection requirement. Because Clearview collected the information in its FRT database without consent and in contravention of the appropriate purpose requirement, the RCMP’s subsequent collection of this information was outside the scope of a lawful program or activity.


These three decisions of the OPC over the past eight months underline the importance of complying with Canada’s privacy legislation when creating or using FRT in Canada. The OPC has clearly taken the position that FRT involves the collection and use of personal information even when that information is used for anonymous purposes, even when it was collected from publicly available sources, and even when an organization is using a third-party’s FRT tool.

It will be important to see the effect that the proposed reforms to the federal privacy legislation in Bill C-11, which would enact the Consumer Privacy Protection Act, will have on the use of FRT in Canada. While the cases outlined show that the Privacy Commissioners in Canada take a restrictive view of the use of FRT, if Bill C-11 becomes law, the enforcement provisions for such breaches will be significant. Under the current federal legislation, the Privacy Commissioner has no power to impose compliance orders or penalties on organizations for breaches of the legislation, and fines of up to $100,000 are available only for certain offences. In the subject FRT cases the Privacy Commissioner effectively ‘named and shamed’ the organizations breaching the legislation. This has been practically effective, as the organizations have complied with the Privacy Commissioner’s recommendations. In fact, Clearview AI has withdrawn its FRT product from the Canadian market entirely.

However, under the proposed reforms of Bill C-11, the Privacy Commissioner would be able to issue a compliance order directly to the organization and recommend that a new administrative tribunal impose penalties of up to $10 million or 3% of the organization’s gross global revenue. Further, Bill C-11 provides a private right of action in damages for individuals affected by a contravention of the privacy legislation. If Bill C-11 passes into law, the risks to organizations using technologies such as FRT will increase significantly, given the potential consequences of a finding of a breach of Canada’s privacy laws.