Goodbye job applications, hello dream career
Seize control of your career and design the future you deserve with LW career

ChatGPT use high in legal profession

Only one sector uses artificial intelligence platform ChatGPT more than the legal profession, new international research has revealed.

user iconLauren Croft 10 October 2023 Big Law
expand image

Since ChatGPT launched last November, the AI platform has made global headlines and attracted hundreds of millions of global users.

ChatGPT has prompted waves of change within a number of industries, including the legal profession. You can read Lawyers Weekly’s full coverage of ChatGPT and other AI platforms and what lawyers need to know here.

Following this, Indusface surveyed 2,000 workers across varying job levels and sectors to find out more about the use of ChatGPT in the workplace. The survey asked respondents whether they trust ChatGPT (or similar AI), how often they use ChatGPT at work, and what they are using ChatGPT for at work.

Advertisement
Advertisement

Although the study was conducted in the UK, it provides interesting insights into how the Australian legal profession might be using AI to drive efficiency.

Thirty-eight per cent of the legal industry was revealed to be using ChatGPT – the second-highest percentage behind the advertising industry at 39 per cent.

Writing up reports was the most common reason for using ChatGPT at work, with more than a quarter (27 per cent) of respondents naming this as their reason for using this form of AI. Twenty-five per cent used the bot for translation, with 17 per cent using it for research purposes.

However, 55 per cent of UK workers stated that they do not trust working with another business that uses ChatGPT.

Indusface founder and president Venky Sundar said that there are a number of risks and benefits to using ChatGPT in workplaces.

“Specific to business documents, the risks are: legal clauses have a lot of subjectivity, and it is always better to get these vetted by an expert. The second risk is when you share proprietary information into ChatGPT, and there’s always a risk that this data is available for the general public, and you may lose your IP. So never ask ChatGPT for documentation on proprietary documents, including product roadmaps, patents and so on,” he explained.

“The benefits are that a V1 draft could be easily obtained, and it is helpful to frame thoughts, especially for generic templates such as email templates and so on. For application security, the risk is [that] you are unsure that the code snippets written by ChatGPT are secure. You will still need to perform in-depth security testing before deploying them.

“The maturity level of addressing the data and ownership of trust is still not well defined, and the businesses are right in not trusting it completely as they are worried about the use or, more appropriately, misuse of their data. Like every technology, there will be early adopters, but these people are tech-savvy and a minority. For everyone to adopt, it will take its own time.”

The least common reason for using ChatGPT was for client emails – with only 11 per cent of respondents using the bot for this.

“ChatGPT or LLMs in general have made the development cycles very short. It is easier now to convert an idea to a working proof of concept in a matter of days when compared to months before,” Mr Sundar continued.

“The risk, though, is that POC should just be used for that purpose. If you go to market with the POC, there could be serious consequences around application security and data privacy. The other risk is with just using LLMs as an input interface for the products and there could be prompt injections and the risk is unknown there.

“One thought process is the knowledge base used to build productivity use cases, and the knowledge base used to build defence use cases on what’s not acceptable have to be separate sources that need to be trained and updated continuously.”

You need to be a member to post comments. Become a member for free today!