What’s law got to do with it? With proper precautions and training, legal departments expect to make huge strides with AI

AI has the potential to synthesize and analyze data in minutes, but this kind of rapid advancement does not come without risk, experts say


Sam Ip, a partner and member of the Technology Group at Osler Hoskin & Harcourt LLP, said corporate legal departments are reaching out to him “almost on a weekly basis” regarding AI, whether for their own strategic imperatives in unlocking opportunities and improving efficiencies in-house or because they’ve been called upon to advise on business strategies by senior executives.

He believes generative AI offers “tremendous opportunity” for unlocking decades of institutional knowledge that legal teams are eager to explore, but have never had the resources to.

“This is one (technology) where we’re seeing a lot of enthusiasm not just from the top down, but from the bottom up … from individuals who see real value in it,” he said.

“To give you a sense of context, that level of adoption took about a decade to play out with cloud adoption in legal,” Jack Newton, Clio’s chief executive and founder, said. “AI is easily the most pronounced and rapid adoption of any technology among lawyers in the last 20 years.”

For example, he said legal professionals can use his company’s software tools to quickly analyze a set of documents using natural language queries, then “interrogate” the data.

“At the end of the day, this could help lawyers save huge amounts of time … in analyzing documents and in giving recommendations to what edits need to be made,” he said.

With the sheer scope and volume of data housed in the legal departments of multinational corporations and their subsidiaries, AI has the potential to synthesize and analyze data in minutes rather than the months it would take an entire team of in-house lawyers and analysts, said Newton.

But this kind of rapid advancement does not come without risk.

“I think corporations and law firms alike need to be aware of the privacy and security implications of whichever large language model they’re leveraging,” Newton said, adding that it’s also up to lawyers to be educated on what the inherent limitations of AI are.

AI models are often trained on public data sets, so bias is an important consideration, especially for legal departments that rely on past precedent when gathering information, said Munir Nasser, managing director and partner at Boston Consulting Group (BCG) in Toronto.

“This can exacerbate bias and exacerbate inaccurate algorithms,” he said. “We spend a lot of time on responsible AI frameworks with clients that include not just automated testing, but human oversight to ensure that biases and errors are caught.”

But Nasser said that for every task gen AI automates, a portion of time is freed up that individuals can devote to more strategic work.

Ip at Osler said the real threat to jobs is probably not AI, but other lawyers and firms using the technology effectively. He said using AI to its full advantage requires certain skill sets that legal professionals will need to master. This means knowing how to “tease out” the best answers by asking the right questions.

He also advises legal departments to go after the “low-hanging fruit” first, such as using AI to help create a first draft of a contract or as a starting point for research, before tackling more complex duties.

“I see a future where a lawyer amped up with great AI skills is just going to be more effective than one without,” he said.
