
Accountability of AI: Where law goes wrong

Laws designed to enforce accountability of cyber technology and artificial intelligence should be strengthened, as “shonky dealings” reveal where legal protections have been inadequate and how the profitability of information has the potential to undermine them.

Naomi Neilson, 25 September 2019, Big Law

Speaking in Brisbane, senior counsel at the Victorian Bar Fiona McLeod said the consumer protection regulators and anti-corruption bodies “seeking to protect the community from cyber abuse” need more power and resources to invest in creating increased accountability around the way AI asks for and retains information.

“We are clutching at fig leaf regulations and laws to protect our privacy, human rights and right to hold institutions to account, to insist on our side of the social contract,” Ms McLeod said. “Clearly the Genie is out of the bottle. The question is, how do we make sure our three wishes are worthy as we head willingly toward oblivion.”

Ms McLeod noted there were “early vague commitments” to FOI law reform and a prior promise of transformation, but nothing explicit in terms of assurances of accountability. She said that, much like a genie in the lamp, the trick is in the framing of the question: “You asked AI to solve climate change, AI gave you Armageddon.”


In the digital age, users are signing over personal information, which is then used and shared by a range of government and business entities across multiple platforms. It has been generally assumed there is no choice: “We hand over our data and trust it is protected.”

“We need to develop our privacy protections to address complaints and consequences – enforceable causes of action by affected individuals, impact assessments and notifications,” Ms McLeod said.

“We need to capture actors in privacy laws and give citizens a cause of action concerning false information during elections.”

Ms McLeod said laws and practices are usually outdated by the time they are introduced and “we need serious investment in catching up”. From electoral laws through to human and privacy rights, adequate updates are needed.

“How do we protect against inherently unfair, unconscionable personalised intrusions that threaten to overwhelm our neurological processes? We must recognise that the permissions, consents and disclosures are virtually meaningless, and that mandatory standard template and cookie policies are widely misunderstood,” she said.

Ms McLeod said she recently learnt the government’s translating and interpreting service has been asked to record calls from community lawyers to clients using an interpreter, with no consent process and no option to opt out, apparently to aid identification.

In Australia, laws compel telecommunications and internet providers to retain the metadata, or digital access history, of all users for two years, despite concerns from industry legal experts that this collection of data “creates more of a security risk” than not retaining it at all.

“A missing step in the construction of machine learning models is in the explanation of the logic, expressed in a comprehensible, human-readable format that highlights bias learned by the model, allowing AI developers and other stakeholders to understand and validate the decision rationale,” Ms McLeod said on collecting data via AI.

“This information impacts not only information ethics, but also accountability.”
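By way of illustration only, and not drawn from Ms McLeod’s remarks: one simple way a developer can surface a model’s decision rationale in human-readable form is to inspect the weights a basic model has learned, which can also expose proxies for bias. The Python sketch below uses the scikit-learn library; the feature names and data are hypothetical.

# Illustrative sketch only (not from Ms McLeod's remarks): exposing a
# human-readable rationale for a model's decisions by inspecting its
# learned coefficients. Feature names and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row is an applicant, each column a feature.
feature_names = ["income_thousands", "age", "postcode_risk_score"]
X = np.array([
    [52.0, 34, 0.2],
    [38.0, 51, 0.7],
    [61.0, 29, 0.1],
    [27.0, 45, 0.9],
])
y = np.array([1, 0, 1, 0])  # 1 = approved, 0 = declined

model = LogisticRegression().fit(X, y)

# A crude "explanation": report each feature's learned weight so a reviewer
# can see which inputs push the decision, and spot possible proxies for
# protected attributes (e.g. a postcode score standing in for socioeconomic status).
for name, weight in zip(feature_names, model.coef_[0]):
    print(f"{name:>22}: {weight:+.4f}")

Such an output is only a starting point; more rigorous validation would examine the model’s behaviour across affected groups, not just its weights.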

It has been more than a decade since the Australian Law Reform Commission review of the Australian Privacy Laws and five years since the Serious Invasions of Privacy in the Digital Era Inquiry Report. Ms McLeod said “self-regulation has failed” and that “protections are lagging well behind the technology and well behind other states”.

“Consent to digital use is essentially meaningless, drafted by the same contract lawyers who have drafted the conditions on parking tickets for years. In the context of digital use, we need to assume that consent is mostly uninformed. Even worse, the appearance of consent creates a comfort blanket, being legalistic and never read,” she said.


