OAIC rules against Bunnings over use of facial recognition tech


Popular home hardware and garden retailer Bunnings Group has been found in breach of Australian privacy laws by the Privacy Commissioner for its use of a facial recognition technology system.

The national chain was investigated for its use of the system in 63 retail stores in New South Wales and Victoria between November 2018 and November 2021, which captured the faces of hundreds of thousands of individuals through CCTV.

Privacy Commissioner Carly Kind made several findings against Bunnings, including that the retailer “collected individuals’ sensitive information without consent, failed to take reasonable steps to notify individuals that their personal information was being collected, and did not include required information in its privacy policy”.

“Facial recognition technology, and the surveillance it enables, has emerged as one of the most ethically challenging new technologies in recent years,” she said.

“We acknowledge the potential for facial recognition technology to help protect against serious issues, such as crime and violent behaviour. However, any possible benefits need to be weighed against the impact on privacy rights, as well as our collective values as a society.

“Facial recognition technology may have been an efficient and cost effective option available to Bunnings at the time in its well-intentioned efforts to address unlawful activity, which included incidents of violence and aggression. However, just because a technology may be helpful or convenient, does not mean its use is justifiable.

“In this instance, deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.”

Mike Schneider, Managing Director of Bunnings Group, confirmed that the company had used facial recognition technology (FRT) only to “safeguard our business and protect our team, customers and suppliers from violent, aggressive behaviour, criminal conduct and prevent them from being physically or mentally harmed by these individuals”.

“We believe that customer privacy was not at risk. The electronic data was never used for marketing purposes or to track customer behaviour,” Schneider said in a statement published online after the Privacy Commissioner announced her determination.

“Unless matched against a specific database of people known to, or banned from stores for abusive, violent behaviour or criminal conduct, the electronic data of the vast majority of people was processed and deleted in 0.00417 seconds – less than the blink of an eye.

“Every day we work hard to earn the trust of our team, suppliers, and customers and this includes keeping people safe in and around our stores. It’s our highest priority and a responsibility we take very seriously.”

During her investigation, Commissioner Kind also found that Bunnings had not taken the “reasonable steps” required to comply with the Privacy Act, which included developing and integrating certain practices, processes and systems.

“Individuals who entered the relevant Bunnings stores at the time would not have been aware that facial recognition technology was in use and especially that their sensitive information was being collected, even if briefly,” Commissioner Kind said.

“We can’t change our face. The Privacy Act recognises this, classing our facial image and other biometric information as sensitive information, which has a high level of privacy protection, including that consent is generally required for it to be collected.

“This decision should serve as a reminder to all organisations to proactively consider how the use of technology might impact privacy and to make sure privacy obligations are met.

“Organisations should be aware that ensuring the use of emerging technologies aligns with community expectations and regulatory requirements is high among our priorities.”

Schneider confirmed the company would seek a review of the Privacy Commissioner’s determination.

“FRT was an important tool for helping to keep our team members and customers safe from repeat offenders,” he said.

“Safety of our team, customers and visitors is not an issue justified by numbers. We believe that in the context of the privacy laws, if we protect even one person from injury or trauma in our stores the use of FRT has been justifiable.”