PUBLICATIONS | 12 Nov 2025

Facial recognition technology and the law: Lessons from the Kmart privacy breach

By Katherine Jones, Morgan Lane and Samyuktha Rajagopalan

After a three-year investigation, the Privacy Commissioner found that Kmart's use of facial recognition technology breached the Privacy Act by collecting biometric data without consent and failing to provide adequate notice. The ruling reinforces the need to balance security with privacy, transparency and proportionality.


In brief

Following a three-year investigation, the Australian Privacy Commissioner, Carly Kind (Commissioner), concluded that Kmart's use of Facial Recognition Technology (FRT) between June 2020 and July 2022 breached the Privacy Act 1988 (Cth) (Privacy Act): Commissioner Initiated Investigation into Kmart Australia Limited (Privacy) [2025] AICmr 155 (26 August 2025).

This determination is the second major ruling on the use of FRT in retail stores and follows the Commissioner's October 2024 decision against Bunnings, in which the use of FRT was also found to breach the Privacy Act and the Australian Privacy Principles (APPs). Determinations are enforceable: section 55A of the Privacy Act empowers the Commissioner to commence proceedings in the Federal Court of Australia to enforce a determination.

In this determination, the Commissioner found that Kmart interfered with the privacy of the individuals whose personal and sensitive information it collected via its FRT system by: 

(a) collecting the sensitive information of those individuals in circumstances where the individuals did not consent and APP 3.4 did not apply in relation to the information, contrary to APP 3.3 (the requirement for consent to collect sensitive information);  

(b) failing to take such steps as were reasonable in the circumstances to notify those individuals of, or otherwise ensure they were aware of, the relevant APP 5.2 matters, contrary to APP 5.1 (the requirement to take reasonable steps to notify); and  

(c) failing to include in its privacy policies information about the kinds of personal information that it collected and held, and how it collected and held that personal information, as required by APP 1.4(a) and APP 1.4(b), contrary to APP 1.3 (the requirement in respect of the content of a privacy policy). 

Kmart's use of FRT  

Kmart implemented FRT at store entrances and returns counters across 28 locations in Australia. As customers entered the store or approached the returns desk, their faces were captured via CCTV. These images were then processed by the FRT system, which compared them against a database of individuals flagged as ‘persons of interest’, typically those suspected of attempting fraudulent returns. 

If a match was detected, the system would alert Kmart staff, who could then review the CCTV footage and decide whether to proceed with the refund transaction.  
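To make the workflow described above concrete, the short Python sketch below illustrates a generic watchlist "match and alert" flow of the kind the determination describes. It is purely illustrative and is not Kmart's actual system: the WatchlistEntry type, the cosine_similarity helper, the check_against_watchlist function and the 0.9 threshold are all hypothetical, and a real FRT deployment would use a proprietary face-embedding model and substantially more infrastructure.

# Purely illustrative sketch of a generic watchlist "match and alert" flow,
# loosely modelled on the process described above. This is NOT Kmart's system:
# the data structures, similarity measure and 0.9 threshold are hypothetical.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    person_id: str
    embedding: list[float]   # face template previously enrolled as a 'person of interest'

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def check_against_watchlist(face_embedding: list[float],
                            watchlist: list[WatchlistEntry],
                            threshold: float = 0.9) -> str | None:
    """Return the matching person_id if the captured face resembles a watchlist
    entry closely enough to trigger a staff alert; otherwise return None, in
    which case the captured image would simply be discarded."""
    best = max(watchlist,
               key=lambda entry: cosine_similarity(face_embedding, entry.embedding),
               default=None)
    if best and cosine_similarity(face_embedding, best.embedding) >= threshold:
        return best.person_id   # a human then reviews the CCTV footage
    return None

The point the sketch highlights is that every captured face is processed and compared against the watchlist, regardless of whether a match results.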

Legal arguments and the Commissioner's response  

Under the Privacy Act and the APPs, the use of FRT constitutes the collection of sensitive information because the resulting facial images are biometric information. The key question was whether Kmart complied with the Privacy Act and the APPs in collecting that sensitive information, particularly the requirement to obtain consent. 

Kmart argued that it had been experiencing escalating incidents of theft in its stores, often accompanied by acts of violence or anti-social behaviour. It relied on the exception under APP 3.4, which permits the collection of sensitive information without consent if a 'permitted general situation' exists under section 16A of the Privacy Act.  

Under section 16A of the Privacy Act, a permitted general situation exists if:  

(a) the entity has reason to suspect that unlawful activity, or misconduct of a serious nature, that relates to the entity's functions or activities has been, is being or may be engaged in; and 

(b) the entity reasonably believes that the collection, use or disclosure is necessary in order for the entity to take appropriate action in relation to the matter. 

Kmart contended that it had reason to suspect fraud and theft were occurring in its stores and that FRT was necessary to appropriately address the issue, meaning consent was not required from individuals whose faces were captured. 

While the Commissioner accepted that Kmart had reason to suspect unlawful activity or serious misconduct, she found that the use of FRT to prevent fraud was disproportionate. She further noted that the effectiveness of FRT in preventing fraud was limited and that less privacy-intrusive alternatives were available to Kmart. 

The Commissioner also found that customers were not adequately notified: in-store notices were inconsistently displayed and lacked sufficient detail about the matters listed in APP 5.2. In addition, Kmart's privacy policy should have contained more comprehensive information about the FRT system, as required by APP 1.4.  

While no financial penalties were imposed, the Commissioner made declarations under section 52(1A) of the Privacy Act that Kmart: 

(a) must not repeat or continue the acts and practices found to breach the Privacy Act;  

(b) must, within 30 days of the publication of the determination, make an apology available on its website; 

(c) must, within 30 days of the publication of the determination, publish a statement confirming the determination, which must be accessible from and prominently featured on the homepage of the Kmart website for at least 30 days; and  

(d) must retain all personal and sensitive information obtained via the FRT system for 12 months after publishing that statement, and destroy that information the day after that 12-month period ends.  

Lessons for legal practitioners and agencies  

Since the Commissioner's decision in the Bunnings case, retailers have called for privacy laws to be changed to allow the use of facial recognition in stores to reduce shoplifting and protect staff. Retailers maintain that customer privacy is not at risk and that the technology can be used responsibly and ethically.  

While surveillance in stores remains a divisive issue in Australia, it will be up to regulators to provide clear guidance and direction to give businesses confidence to act. 

In her official OAIC blog post, the Commissioner made clear that the decision does not amount to a ban on the use of FRT. She states:   

"It may be tempting to suggest that my successive determinations amount to an effective ban on the use of this technology. However, that is incorrect; the Privacy Act is technology-neutral.

This determination underscores the importance of implementing technology like FRT in compliance with the Privacy Act and the APPs. 

In the absence of parliamentary intervention to specifically authorise the use of FRT systems without consent, it is critical that companies rigorously assess their compliance obligations, consider the privacy impacts and follow best practice standards. 

Key takeaways for businesses considering FRT:  

  • Consent is key: sensitive information like biometric data requires explicit, informed consent. 

  • Transparency matters: notices must be clear, accessible and uniformly deployed. 

  • Proportionality is essential: privacy impacts must be weighed against the intended benefits. 

  • Alternatives must be considered: less intrusive methods should be evaluated and documented. 

If you have any questions about integrating FRT into your business operations, or would like support reviewing your privacy policies, please don't hesitate to get in touch with us. Our Corporate & Commercial team is available to provide expert advice tailored to your business.  

This is commentary published by Colin Biggers & Paisley for general information purposes only. It should not be relied on as specific advice. You should seek your own legal and other advice for any question, or for any specific situation or proposal, before making any final decision. The content is also subject to change. A person listed may not be admitted as a lawyer in all States and Territories. Colin Biggers & Paisley, Australia 2025
