
Lawyers need to ‘get moving’ with tech reform

The special counsel in ethics at the Queensland Law Society discussed why we need to move faster on regulating technology, how that regulation should be formulated, and the importance of taking a balanced approach so as not to suppress the benefits tech brings.

Jess Feyder 15 May 2023 Big Law

Recently on The Lawyers Weekly Show, host Jess Feyder spoke with Shane Budden, special counsel in ethics at the Queensland Law Society.

An open letter signed by more than a thousand technology leaders and researchers recently called for artificial intelligence (AI) labs to pause the development of advanced AI systems, a sentiment that Mr Budden said he understood.

“Those who signed that letter are saying, ‘We need to take a pause and build ethical standards’,” he noted.


“One of the big issues in this area is that tech tends to run ahead of the regulation. We saw how social media exploded. It was unprecedented at that time, and because it’s largely unregulated, we’ve seen it used for both good things and very bad things.

“The problem with these technologies is how quickly they move. If we don’t get on top of it now, it’ll be far too late — it’s difficult to put the genie back in the bottle, as we’ve seen,” said Mr Budden.

“If you look at the legislation that deals with technology, a lot of that legislation is 20 years old. It’s been amended and updated over time, but it’s probably time we sat down and went through it.”

“Technology is a bit like fire; it can be very useful or very damaging. To some extent, we regulate the use of fire, but that doesn’t mean we can stop someone from starting a fire.

“We need regulation in place, but we also need to accept that a lot of it is going to be very difficult to stop,” highlighted Mr Budden.

This, Mr Budden explained, would require “buy-in”, as regulating the space effectively is “going to involve a lot of consultation with stakeholders”.

“You’re going to need to bring big tech into the tent. You can’t just slam down the law; you need them to be involved,” he highlighted.

“It would be much easier if there’s a level of self-regulation — if companies themselves are happy to do it.”

Mr Budden also compared how different entities are going about legislating.

“The European Union is taking a heavy regulation approach, whilst the United States is more open, with a free-flowing, capitalism-style model,” he stated.

“The two are going to have to reconcile somehow, and the way to do that is to get big tech on board and involved in the discussion rather than just trying to stamp down from the top.”

Mr Budden also highlighted the need to be deliberate in forming regulation so as not to take away the important benefits technology provides.

“Technology such as social media has been incredibly valuable for people suffering oppression,” he explained.

“It’s been a great way of getting their stories to the outside world, and of organising communication between people trying to fight despotic regimes,” he said, “so we’ve got to be very careful”.

“We’ve seen lately that regulation has started to encroach upon what people can and can’t say, raising more of a freedom of speech question.”

“We don’t want to overregulate,” Mr Budden warned. “We don’t want to get too zealous and take away people’s right to protest.”

Additionally, Mr Budden discussed the risks facing Australian society if technology advances in the wrong direction and why technology is so difficult to regulate.

“We’ve seen AI create incredible deep fakes; you can see a photo of someone doing or saying something they never did.

“It can be detected at this time, but they’re getting better at it. That’s going to undermine trust in our media institutions,” he noted.

“Another big problem is the biases being created by technology. The tech is trained on biased data, and those biases become baked into the technology.

“These sorts of things are very concerning.”

“The other thing we need to watch is the ‘black box problem’ — a lot of the time, tech creators don’t even know how the technology is coming to its decisions.

“The machine is learning at an exponential rate, and you need to dig down to find out what it’s doing,” he said.

“We’re at a dangerous point,” he said. “We really need to understand this stuff before we can regulate it and effectively distribute it and use it.”

“It’s very hard from a practical point of view to stay ahead of this technology in the regulatory sense. You usually need at least three months to get solid legislation together, and technology is changing so rapidly, it’s hard for legislation to be drafted,” he said.

“It’s probably going to be a case of giving broad-ranging powers to a regulator and getting them to school themselves up on it.

“That’s where concerns will arise because the broader that power is, the greater the potential for abuse, or even accidental misuse,” highlighted Mr Budden.

“Our politicians need to become aware of it, but whether or not they can move with agility is yet to be seen — it’s well and truly time for us to get moving with this kind of reform.”
