As AI tools surge into the legal world, general counsel face mounting pressure to manage significant legal risks without stifling innovation. Here, Gartner’s research director provides insight on how legal leaders can navigate this high-stakes balancing act.
The rapid rise of generative and agentic AI is thrusting general counsel into a delicate balancing act – controlling legal and regulatory risks while unlocking AI’s potential to transform legal support.
Stuart Strome, research director at Gartner, offered insights on what these developments mean for general counsel and legal leaders and how they can navigate the opportunities and challenges AI presents.
The role of GCs
While general counsel remain ultimately accountable for AI-related legal risks, Strome explained that this balancing act can’t sit with legal alone, requiring a coordinated, organisation-wide approach supported by strong governance frameworks.
“GCs are ultimately accountable for legal risk events related to AI. However, in most instances, legal is not fully responsible for AI governance,” he said.
“Those responsibilities are shared across multiple functions. Therefore, legal’s role in governance depends on the existing efforts at the organisation and the presence of strong AI governance leadership.”
Strome explained that general counsel play a critical role in establishing effective AI governance by delivering strong oversight without slowing the business, using targeted measures such as risk triggers and clear review thresholds to focus on high-risk use cases while still enabling innovation.
“The trick is to provide robust oversight without creating unnecessary friction for the business,” he said.
“Techniques like embedding triggers into existing risk assessments to flag high-risk AI projects and providing clear go/no-go criteria to guide when a use case requires additional legal review help achieve this.
“By concentrating their efforts on high-risk use cases and setting transparent criteria for the business to know when legal review is required, GC can protect the organisation while supporting innovation.”
However, where no governance framework exists, Strome highlighted that general counsel must take a more proactive stance by establishing structures to define roles, responsibilities and decision rights for managing AI risks.
“On the other hand, if there are no pre-existing efforts, GC must be more proactive – establishing an AI governance framework and defining roles, responsibilities and decision rights for governing and managing AI risks throughout the organisation,” he said.
“This does not mean that legal must ‘own’ AI governance in perpetuity. However, with AI being increasingly integrated into the business, the risks are growing, and GC must not take a wait-and-see approach.”
Getting ahead of AI risk
Another challenge for legal departments is keeping pace with evolving AI regulations.
Strome observed that many general counsel are being pushed onto the back foot, reacting to evolving AI regulations with constant policy updates, even though many new rules largely build on existing compliance frameworks.
“We find that many GCs are on the back foot, continuously updating policies, controls and compliance frameworks in response to new developments in AI regulation. However, new AI regulations often mirror or build upon existing frameworks,” he said.
Instead of reacting to each new regulation in isolation, Strome advised legal teams to build compliance frameworks grounded in core principles across the jurisdictions in which they operate.
“Rather than reacting to every new regulation, legal departments should build their compliance framework around principles common across the jurisdictions they operate in to ensure policies and controls are broadly applicable yet resilient to change,” he said.
“This approach minimises confusion, promotes consistency, and allows for minor policy tweaks rather than major overhauls as new laws and technologies emerge.”
Where legal should use GenAI
While there are several practical applications available for legal teams to integrate into their daily workflows, Strome recommended that general counsel prioritise GenAI use cases that deliver high value and are easy to adopt.
“GC should prioritise GenAI applications that offer high value and easy adoption. The top use cases for legal departments include contract visibility, legal document summarisation, meeting transcription, and legal research,” he said.
“These solutions are proven, reliable, and can be implemented quickly, delivering immediate business impact.”
Strome explained that by using these tools to automate repetitive, low-value tasks, general counsel can unlock greater efficiency and shift their focus to higher-value, strategic work, while choosing practical use cases ensures AI delivers maximum impact with minimal friction.
“By automating highly repetitive, low-value tasks, legal teams can boost efficiency and redirect resources to more strategic work,” he said.
“Focusing on these practical use cases ensures that the department maximises the benefits of AI while minimising implementation challenges.”
How legal can win with AI
Strome emphasised that successful AI integration requires general counsel to look beyond the technology itself and focus equally on the readiness of the people, processes and data that support it.
“To fully capitalise on AI’s potential, GC must invest in the people, processes, and data that empower the technology, rather than focusing solely on the technology itself,” he said.
“Successful AI adoption requires standardised processes, robust data management, and upskilling legal teams.”
Strome stressed that legal departments prioritising digital readiness – by preparing their teams, workflows and data for AI integration – are nearly twice as likely to succeed.
“Departments that prioritise digital readiness – by preparing their teams and workflows for AI integration – are nearly twice as likely to achieve successful implementation,” he said.
“Simply purchasing AI tools without adequate preparation leads to low adoption, resistance, or incorrect use, which can undermine efficiency.
“GC should promote digital readiness by instituting clear processes, ensuring high-quality data, and offering ongoing training to their teams.”