
New laws needed to address facial recognition technology

The use of facial recognition technology (FRT) is growing rapidly in Australia and around the world, yet our laws do not offer protections against potential human rights violations, says the Human Rights Law Centre (HRLC).

Jess Feyder, 28 September 2022

FRT can be used to assess characteristics such as a person’s age, gender, ethnicity, health conditions, emotional state, and behaviour.

Face data can be captured remotely by a variety of widely available devices.

This new technological capacity poses a risk to privacy; it can enable, and in some countries already has enabled, mass surveillance.


It also leaves people vulnerable to having their rights restricted or violated, and their behaviour manipulated.

The University of Technology Sydney’s (UTS) Human Technology Institute has published a report outlining a model law for FRT, titled Facial Recognition Technology: Towards a Model Law.

The HRLC has welcomed the report, and called on the Attorney-General, Mark Dreyfus KC, to urgently regulate FRT to prevent human rights harms.

The report proposes a novel and comprehensive legal framework for regulating the use of FRT under Australian law, the HRLC said in a statement. 

The legal framework proposed by UTS would require the developers and deployers of FRT to undertake impact assessment processes, provide a range of safeguard and oversight mechanisms, and prohibit the use of FRT in high-risk contexts unless certain conditions are met.

Under the proposal, the Office of the Australian Information Commissioner would be given increased powers to regulate the use of FRT.

“Surveillance using facial recognition technology limits fundamental human rights,” said Kieran Pender, senior lawyer at the HRLC, who served on the expert reference group of the report.

“Our current laws were not drafted to address the challenges posed by facial recognition technology to human rights such as the right to privacy, freedom of assembly and association, freedom of expression and of movement.

“Right now, Australian governments and corporations are using these technologies in an unregulated landscape, with few specific safeguards or oversights. 

“This is a technology that comes with potential benefits, but also significant risks,” said Mr Pender.

“Facial recognition technology has the potential to disproportionately impact women and people of colour, given proven algorithmic biases. 

“There are also concerns around the use of this technology to undertake surveillance on marginalised communities, activists, whistleblowers, and journalists,” he said.

Under the model law, police and intelligence services would be prohibited from using FRT unless certain conditions — including a minimum seriousness threshold — were met. 

To protect press freedom, law enforcement agencies would be prohibited from using FRT to identify whistleblowers or journalistic sources.

“The Attorney-General should heed the call of this report and use its model law as a starting point for a dedicated legal framework that regulates facial recognition technology in Australia,” said Mr Pender. 

“Together with the ongoing overhaul of the Privacy Act, these changes can ensure Australians’ right to privacy is adequately protected in the digital age,” he added.
