John C Havens has a hopeful vision for the future, but it comes with a word of caution. He believes that humanity’s demise will be caused not by killer robots but by our stubborn fixation on money.
Speaking to Lawyers Weekly ahead of a major talk he was due to give in Sydney yesterday, Mr Havens argued that to avoid this fate, more companies should adopt the concept of the triple bottom line — a notion that values people, the planet and profit equally — and measure quarterly performance against those three standards.
“I am very pro-wellbeing and by that I mean adopting a more holistic sense about what actually improves ‘longitudinal flourishing’,” Mr Havens said.
“If, at the end of every quarter, a company is not prioritising its numbers as to how they relate to sustaining the planet and our people, then it just does not prioritise them.”
“You do not measure what you do not count,” he added.
An American thought-leader and writer who regularly contributes articles to Mashable and The Guardian on technology and wellbeing, Mr Havens hastens to add that his view is not anti-business. Nor does his vision for a better society run contrary to core commercial goals such as earning a profit, he said.
In fact, Mr Havens said he firmly believed the idea that desirable growth should be both exponential and considered exclusively in monetary terms could be reimagined.
“Profit does not have to go away. There is nothing wrong with making money to pay your bills or improve your business,” he said.
“I often say to business: ‘Look, GDP right now is the primary metric of value for the planet. It is not evil but it is very singularly focused on things such as financial metrics. We need to start measuring things like care-giving and the environment.’”
He added that it was unfair to expect modern businesses, including law firms, to keep people in jobs when so many traditional functions could be automated.
Artificial Intelligence (AI) and machine learning would inevitably change how workforces look and operate, Mr Havens suggested, but this was also an opportunity for lawyers in particular to think about how to pair those machine capabilities with values-based design.
“There is no business imperative not to automate every single job on the planet — period. Once a task can be automated, you do so because you can,” Mr Havens said.
“There are great examples today of machine learning that can replicate or do tasks like look through 1,000 law journals in seconds and find five cases in ways that no human ever can.”
He explained that, from a utilitarian standpoint, applied ethics should be used to ask questions about the adoption of new technology in business and how this may affect the end-user. Lawyers were therefore empowered to think more about what their clients valued, Mr Havens said.
“This is a good methodology you can use, especially now that AI is affecting our agency and identity and emotion in brand new ways,” he said.
“You are basically using ethics as a tool. It is like ‘agile for ethics’ and it means that the answers you discover provide more innovation than you might have had before.”
Mr Havens recently authored the book Heartificial Intelligence: Embracing Our Humanity to Maximize Machines. He is also the executive director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems.
“I’ve been involved in technology for almost 15 years and it’s mainly because I am a big geek,” he said.
“I just love technology and I find it magical — in the sense of, I could not do something before and now I can.”
Mr Havens flew in from New York to visit Bond University law school on the Gold Coast, where he delivered a public seminar on human identity in the Algorithmic Age. He later delivered a keynote address for the Financial Times’ first Innovative Lawyers Forum to be hosted in Australia.
Yesterday, Mr Havens told an audience of lawyers in Sydney that there were three things they could do now to future-proof the profession. He said that firstly, human lawyers had the advantage of empathy and, secondly, a unique way of interpreting knowledge.
Thirdly, he said legal professionals would also be called upon to help regulate and define the difference between humans and machines as technological capabilities advanced and further blurred that line.
Mr Havens used the example of the new Google Duplex product, which has been designed to mimic human speech so well that people cannot tell they are listening to a machine.
“While machines with a lot of emotion AI can help in certain empathetic situations, clients still prefer a human face to be looking at them,” Mr Havens said.
“You can also have lots of information, in six or seven piles, of really well-researched, machine-sourced information. But then the human lawyer needs to come to a decision and make a judgment about that information. That, at least for a while, is where humans remain really important,” he said.
Professor Nick James, Bond University’s executive dean of law, said that Mr Havens’ visit was made possible by the Baker McKenzie eminent visitor program, with the law firm a philanthropic supporter of the law school since its inception in 1989.
“Law schools are preparing the lawyers of tomorrow and therefore have to be at the forefront of changes in the legal services sector,” Professor James said.
“It is no longer about simply helping law students become technically competent lawyers. Law schools must teach our students how to deliver technology-mediated legal solutions, work within new regulatory frameworks and become familiar with the use of AI and other emergent technologies,” he said.