
Potential impacts of AI in legal recruitment and redundancies

HR professionals in the legal industry and beyond have been warned against implementing AI into recruitment or redundancy decisions without proper safeguards in place.

Lauren Croft | 11 December 2023 | Big Law

In Australia, the use of artificial intelligence (AI) in recruitment has nearly doubled over the past year, and spending on AI systems is set to grow to $3.6 billion by 2025 – an increase of 24 per cent since 2020.

As such, more industries than ever are using AI in their day-to-day, including decisions around recruitment and redundancies.

Potential bias in AI


Earlier this year, research from Monash University and the University of Gothenburg revealed that women in tech were more likely to be hired by an AI recruiter than by a human one.

The study provided recruiters with applicants for a web designer role. Women scored “substantially lower” than men when ranked by human recruiters, yet scored on par with men when genders were hidden.

This prompted debate across various industries, including the legal profession, as to whether AI was less biased than human recruiters and should – or shouldn’t – be used more. ChatGPT has made headlines over the course of this year – and will likely continue to alter the legal landscape moving forward. You can read Lawyers Weekly’s full coverage of ChatGPT and other AI platforms and what lawyers need to know here.

Speaking to Lawyers Weekly following the research, legal recruiters expressed doubts that AI recruiters could ever fully replace human recruiters, particularly within the legal profession.

“Legal recruitment is a multifaceted process that goes beyond mere technical skills. Human recruiters possess the invaluable ability to evaluate a candidate’s interpersonal skills, judgement, ethics, and cultural fit, all of which are essential in the legal profession; a human recruiter develops a long-term relationship with candidates; they understand what is important to them and what motivates them in a personal way,” Carlyle Kingswood Global in-house legal and governance director Phillip Hunter said at the time.

Despite the potential benefits of AI recruiters, nrol director Jesse Shah agreed that “the human element in the recruitment process will always be crucial”.

“Human recruiters possess contextual understanding, allowing them to interpret factors beyond what is explicitly mentioned in a résumé or application,” he said.

“They can assess soft skills such as communication abilities and cultural fit, which AI systems often struggle to evaluate. While AI can reduce bias, it is not immune to biases itself, whereas human recruiters can be trained to recognise and mitigate their own biases.”

Six months on, however, a three-year project undertaken by Diversity Council Australia (DCA), Hudson RPO and Monash University has revealed that without the right approach, AI has the potential to mirror society’s inequalities, resulting in bias.

In response to this, the DCA has developed The Inclusive AI at Work in Recruitment Employer Guidelines, which features a set of guidelines developed in consultation with an expert panel of stakeholders representing marginalised jobseekers, employers with experience using AI, academics and tech experts.

“We know that unless AI is deployed with a focus on diversity and inclusion, it has the potential to mirror society’s inequalities and bake in systemic biases. Conversely, if it’s used with D&I front of mind, the benefits can be astounding,” DCA chief executive Lisa Annese said.

“These guidelines will help provide employers with the tools they need to take advantage of this incredible technology in a way that reduces bias and helps foster a more inclusive and diverse Australian workforce.”

Hudson RPO CEO Kimberley Hubble added that if AI is being used in HR decisions, it needs to be used with a “necessary understanding”.

“If used appropriately, it can pave the way for superior experiences and job opportunities for diverse candidates, but conversely, if used without the necessary understanding, it can reinforce systemic bias and discrimination,” she said.

“We are proud to support DCA’s research into using AI in recruitment and believe their evidence-based guidelines will give employers practical advice for using AI inclusively during recruitment and selection to maximise diversity outcomes for their business.”

Is AI being used for redundancies in law?

AI can “be a double-edged sword”, according to Ms Hubble – and needs to be used with caution and care.

“While AI offers us immense opportunities in recruitment and many areas of HR, we need to use it purposefully and carefully to ensure we improve and not hinder diversity outcomes in the workplace,” she said.

However, there have also been reports of HR leaders using AI data and tech to make redundancy decisions, as reported by Lawyers Weekly’s sister brand, HR Leader.

This comes as a number of BigLaw firms, including Clyde & Co and MinterEllison, make back-office cuts, with redundancies also reported among fee earners in August this year.

According to reports from Capterra, 98 per cent of HR leaders said they were planning on using AI to determine redundancies. Within the SME market, just 21 per cent of respondents confirmed they had HR-related AI technology. However, employees are worried that this will change, with 71 per cent of SME workers stressed about the use of AI in redundancy decisions.

Content analyst at Capterra Australia Laura Burgess said that if AI was being used in any form of redundancies or recruitment, it needed to be done transparently.

“AI can help automate many HR tasks, but it’s still important for SMEs to strike a balance between machine learning and human judgement; this is especially true if AI data is being used to help make sensitive HR decisions, such as employee dismissals,” she said.

“Some employees may feel concerned about AI, but companies must be transparent on how and why these tools are being used so that the reasons behind important decisions, such as recruitment and redundancies, can be trusted.”

As to whether this research rings true in the legal profession, Beacon Legal director Alex Gotch said that he hasn’t heard of AI being used in the redundancy process for lawyers as of yet, but that it is being increasingly used for routine tasks.

“AI is becoming more commonly used for routine drafting tasks in the legal profession, including production of standard documents. This could lead to cost and human resource efficiencies in law firms, particularly impacting work usually given to juniors and paralegals,” he said.

“As AI continues to improve, it is inevitable that more routine drafting tasks will be produced using technology, and the requirements to hire humans for these tasks may diminish. Having said that, there are plenty of tasks [that] juniors and paralegals undertake [that] AI will not be able to replicate.”

LOD head of insights Mark Dodd echoed a similar sentiment – and added that while AI is being adopted in the profession, the use of emerging tech is still fairly low in legal departments.

“Our 2023 global survey report, Under Pressure, which surveyed over 300 in-house lawyers around the world, revealed that 61 per cent of in-house legal leaders aren’t taking any definitive action on AI,” he said.

“We also know over 20 per cent are refusing to use it until the risk levels are lower. It, therefore, seems unlikely that they would be comfortable using AI to make redundancy decisions. This would be a step too far for most GCs.”
