

Bunnings Australia wins legal fight to use AI facial recognition in stores


Australia's Privacy Commissioner Carly Kind determined in 2024 that Bunnings breached privacy laws by scanning hundreds of thousands of customers' faces without their proper consent.

A review of that decision by the Administrative Review Tribunal of Australia has now found the opposite.

The retailer did not break the law by scanning customers' faces, the ruling said, but should improve its privacy policy and notify customers of its use of AI-based facial recognition technology.


Pretty typical stuff by this point. The privacy-invading company wins; the pissweak government makes a few privacy "recommendations" but stops short of enforcing anything.

in reply to freedickpics

Well, looks like it's medical face mask, big sunnies and blank clothing covering my arms and legs, just to go get some nails.
in reply to dumbass

Wear clothing with realistic faces printed on it to confuse the FR. A baseball cap with infrared LEDs in the brim, too.
in reply to freedickpics

Fuck. Cunts. Shit's going to be everywhere tomorrow. (I rarely swear this much, but this.)
in reply to MalReynolds

Exactly. Kmart ran a similar trial a while back. This ruling will just open the floodgates for every company to roll out AI-powered facial recognition cameras everywhere.
in reply to freedickpics

The way Bunnings used facial recognition was as private as it could be ... mostly. It is possible to run all of the facial recognition locally, but instead they run it on a central Bunnings-controlled server in Sydney. They only scan for faces of known offenders based on previous entanglements.

My concern isn't surveillance, but mass surveillance. Because this is exclusive to Bunnings it doesn't quite reach mass surveillance, but because the processing is centralized and Bunnings is so big it edges dangerously close.

This isn't Flock, nor Palantir, nor Google.

in reply to FoundFootFootage78

They only scan for faces of known offenders based on previous entanglements.


How does the system know who’s an offender and who isn’t without scanning their face first?

in reply to ageedizzle

They look at CCTV footage, take a violentcustomer.jpeg, and add it to the database.

EDIT: Oh, I see what you're asking. You misread my quote: what I said was "they only scan for", not "they only scan".
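
To make the distinction concrete, here is a minimal, purely hypothetical sketch of watchlist-style matching as described above: offender stills from CCTV are enrolled as face embeddings, every entrant's face is compared against that list, and non-matches are simply discarded. All names, vectors, and the threshold are illustrative assumptions, not details of Bunnings' actual system.

```python
# Hypothetical watchlist matcher: embeddings are compared by cosine
# similarity; an entrant either matches a known offender or is dropped.
from math import sqrt

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Watchlist: embeddings extracted from CCTV stills of known offenders.
# (Toy 3-dimensional vectors; real embeddings have hundreds of dims.)
watchlist = {
    "offender_001": [0.9, 0.1, 0.2],
}

def check_entrant(embedding, threshold=0.95):
    """Return the matching watchlist ID, or None.

    None means the face is discarded -- the system keeps no record
    of entrants who don't match the watchlist.
    """
    for person_id, reference in watchlist.items():
        if cosine_similarity(embedding, reference) >= threshold:
            return person_id
    return None

# A near-identical vector matches; an unrelated one does not.
print(check_entrant([0.88, 0.12, 0.21]))  # offender_001
print(check_entrant([0.0, 1.0, 0.0]))     # None
```

The point of the sketch: every face still has to be scanned and embedded to be checked, which is the question being answered; "only scan for" refers to what the system searches the result against, not to whose faces pass through it.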
