The ups (and downs) of using AI in legal work
A global tech leader has highlighted how in-house counsel could use AI in their work, but warned of confidentiality and privacy issues.
Ahead of his keynote address at the Corporate Counsel Summit 2023, Nick Abrahams — global co-leader of the digital transformation practice at Norton Rose Fulbright — said that while artificial intelligence (AI) is not a new concept, it previously focused largely on machine learning.
This had some impact in e-discovery but was not very effective, he noted.
“Then, in November 2022, ChatGPT became publicly available,” he told Lawyers Weekly.
“I personally believe the technology is far more advanced than I thought it would ever be in my lifetime. There’s an old saying that any sufficiently advanced technology is indistinguishable from magic. That’s what we’ve got with ChatGPT because of the responses it is capable of generating.”
His comments preceded his address to the Corporate Counsel Summit later this month, where he will cover the benefits and challenges of using AI in the legal profession, how in-house counsel can choose the right AI tools for their legal tasks, and how to integrate those tools into existing processes while balancing them with human expertise.
Mr Abrahams said that while AI tools will not replace legal professionals, those who use them could replace those who do not.
ChatGPT is built on a “large language model”, an AI algorithm that uses deep learning techniques and large data sets to understand, summarise, generate, and predict new content.
“Therefore, it is capable of much greater utility for corporate counsel,” Mr Abrahams said.
For example, Mr Abrahams — who is also an adjunct professor at Bond University and teaches The Breakthrough Lawyer Program — said in-house counsel could use ChatGPT for document analysis or to produce condensed policy document drafts, government submissions, mergers and acquisitions agreements, or privacy policies.
One of his clients used it to draft around 80 per cent of a government submission, which reduced the preparation time from 20 hours to two hours, he said.
“But, note that getting something out of ChatGPT is not the end of the road,” he warned.
“It needs a lot of critical analysis to make sure that what is generated is accurate.”
Legal professionals could also use ChatGPT to produce concise, “punchy” marketing content with the appropriate tone of voice to attract the targeted audience, Mr Abrahams said.
Beware the ‘Pinocchio’ effect
AI tools like ChatGPT are not without their downsides, however, as they are prone to “hallucinating”, or “lying confidently”, and generating misinformation, he warned.
AI depends on the data it is fed, which means that if it is fed misinformation, it will generate misinformation.
This is dangerous in the practice of law, which demands exactitude and accuracy as companies could make commercial decisions based on the advice provided by in-house lawyers.
“I think we’re a long way off from ChatGPT being 100 per cent error-free,” Mr Abrahams said.
As such, in-house counsel should treat the tools as a starting point rather than relying on them to complete tasks end to end, he suggested.
“AI has what I call the Pinocchio effect,” Mr Abrahams said.
“If you enter a question, you’ll definitely get a response that seems incredibly impressive and convincing, but it could be completely wrong.
“I once asked ChatGPT for quotes from general counsels of large organisations in relation to why purpose is important for lawyers to get job satisfaction. ChatGPT gave me three beautiful quotes from general counsels of the top 50 global organisations, along with URLs, within seconds.
“Unfortunately, the quotes and URLs were completely made up. The general counsels existed in those organisations, but ChatGPT made up everything else.”
Confidentiality can’t be guaranteed
In-house lawyers have also been warned not to feed ChatGPT confidential information as there are no safeguards to ensure it will remain confidential.
“Right now, confidentiality cannot be guaranteed with what’s available in the market, so that risk can’t be mitigated,” Mr Abrahams said.
“But there are pilots being run by companies like Microsoft (a significant investor in ChatGPT’s parent company OpenAI) that could solve this issue. You can still benefit from the large language model without making your data publicly accessible.
“It might be six to nine months before we start to see these sorts of solutions.”
To hear more from Nick Abrahams about how you could benefit from using AI for legal work in your organisation and how to integrate it into your existing workflow, come along to the Corporate Counsel Summit 2023.
It will be held on Thursday, 25 May, at Sofitel Wentworth Sydney.