LegalTech Trainwrecks #6 (Knowledge Resource Retrieval)

A few months ago, I wrote about my experience market testing a potential AI product that could retrieve relevant examples of past work (‘precedents’) more quickly and easily, to support lawyers with contract drafting and review.

While this idea seemed attractive at first, and lawyers were enthusiastic, I soon discovered it didn’t solve as big a problem as I thought.

It turns out law firm teams don’t generally have trouble knowing which precedents they should use, and can retrieve them quite easily. Indeed, it would be worrying if a client hired a lawyer based on their expertise in a particular niche, and that lawyer didn’t have a good set of relevant precedents ready to go.

Despite this realisation, I still felt there was an opportunity to tackle the broader problem of how to help lawyers better leverage the collective knowledge of their law firm.

It was time to pivot.

The knowledge retention problem

Medium to large law firms spend vast amounts of time and money on internal knowledge and training. That makes a lot of sense. What’s hard, though, is ensuring their lawyers retain that knowledge and utilise those resources, whether they’re legal updates, guides, checklists, templates or other know-how material. A few examples illustrate the problem well:

    1. A lawyer spends 15 hours writing a guide and delivering an internal presentation. Half the room is playing on their phones during the presentation. The guide is uploaded to the intranet, but it’s soon forgotten and never accessed.
    2. New joiners are unaware there’s a template or guide for a particular task, or that information gets lost amongst the firehose of other induction material they’re subjected to. Effort is wasted while the new joiner attempts the task from scratch.
    3. There is a law change, which means updated wording needs to be used in certain documents going forward. A department-wide communication goes out, and an internal training session is organised. Some people miss the email because their inbox is a mess, some miss the training session because they’re on holiday, and others skim-read the update but fail to remember eight weeks later when that issue suddenly becomes relevant to the document they’re drafting.

The immense pressure of client work means lawyers’ attention spans can be extremely limited for anything that isn’t directly applicable to whatever they are currently working on. Several lawyers I spoke to had a habit of dragging knowledge-related emails into a separate folder in Outlook, so they could read them when they got a moment. Except, of course, that moment never came, and they never read those emails.

To address this problem, what if we could deliver knowledge resources to lawyers at the exact point in time when it would be directly applicable to their current work? Not only would they naturally want to engage with that material, but by immediately applying that knowledge, learning would be reinforced, and retention would be improved. It seemed like something worth exploring.

Testing the prototypes

I experimented with a few different designs.

One option was an automated, personalised email that could be sent every few days to each lawyer in a law firm. It would provide a short list of links to knowledge resources selected by AI as being relevant to the matters the lawyer was currently working on. I quickly scrapped this, however, when I realised these emails would soon become spammy and, after a certain point, inevitably regurgitate the same material over and over again.

Another option was to integrate with a law firm’s existing knowledge intranet site or portal. An AI could personalise the site to display links to recommended knowledge resources based on the work the logged-in user was currently doing. This was a better idea but suffered from the significant friction of needing people to manually navigate to that site.

The design I ultimately landed on was a plug-in to Word and Outlook. When the user launches the plug-in, an AI reads the document or email on the screen and displays links to recommended knowledge resources in a side panel. A lawyer could use the plug-in before they started work on a task to make sure they had the right resources upfront, or they could use it in the middle or end of a task to make sure they hadn’t missed anything.
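The recommendations in my prototype were mocked up, but to make the mechanism concrete: a crude, non-AI stand-in for the matching step might rank resources by keyword overlap with the open document. This is a purely illustrative sketch; the resource names, descriptions and scoring approach are all invented, and a real product would presumably use an AI model rather than word counting.

```python
# Illustrative only: rank knowledge resources against the text of an
# open document by counting how many words from each resource's
# description also appear in the document. All resources are invented.

def recommend(document_text: str, resources: dict[str, str], top_k: int = 3) -> list[str]:
    """Return up to top_k resource titles, ranked by word overlap with the document."""
    doc_words = set(document_text.lower().split())
    scores = {}
    for title, description in resources.items():
        overlap = len(doc_words & set(description.lower().split()))
        if overlap:  # only recommend resources with at least one matching word
            scores[title] = overlap
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

resources = {
    "Tax update (.eml)": "updated tax laws require key changes to common documents",
    "Corporate governance 101 (.ppt)": "training session on corporate governance basics",
    "Escrow checklist (.docx)": "checklist for escrow arrangements in share sales",
}

print(recommend("Draft amendments reflecting the updated tax laws in common documents", resources))
```

Even this toy version hints at the product's core trade-off: it will happily surface *something* for almost any document, and it is the lawyer who has to judge whether the matches are actually useful.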

Lawyers loved it. It not only saved them from searching for knowledge resources but also made them aware of resources they didn’t know existed, or had forgotten about, and therefore would never have searched for.

Buoyed by the initial feedback, I further developed the prototype so people could click buttons to simulate the workflow. Everyone was excited by how easy it was to use. You simply clicked a button and waited for the knowledge recommendations to pop up.

In terms of the recommendations themselves, I obviously just made these up for the purposes of the prototype, with items such as ‘Tax update (.eml): Updated tax laws effective 19 May 2022 require key changes to common documents’ and ‘Corporate governance 101 (.ppt): Training session from Petra Partner on 15 Sep 2021’.

Given these were fictitious, I hadn’t given too much thought to them. What I didn’t expect was that the lawyers playing with my prototype would give a lot of thought to them. They wanted to know what the AI was serving up, even though they knew the prototype was a mock-up. But they still wanted to read it. They really wanted to read it.

And it was a good thing they did, because it soon became apparent to me how much time and effort was needed for people to read the recommendations, work out what they were about, and decide whether they were relevant. All without any promise that there would actually be anything useful at the end of that process, over and above the knowledge already in the lawyer’s head.

Perhaps this wouldn’t be so bad at the start, given the novelty factor and people finding resources they didn’t know about. But after a few weeks, with the same lawyer doing the same type of work and therefore being recommended the same knowledge resources, I couldn’t escape the conclusion that the product would inevitably become more burdensome than beneficial.

Solve one problem, create another

It’s a tough challenge. New technology products might solve certain problems, but they’ll probably create some new ones too. It’s important not to lose sight of the net position.

I’d argue this was the case with the various AI-powered contract clause identification and extraction tools around 2016 (the technology has since moved on, of course). These tools required additional work to:

    • train, or at the very least test, the AI;
    • train lawyers on how to use the product;
    • configure the AI and make sure all the documents were properly organised; and
    • review the AI’s output. 

While these tools probably made the discrete act of reviewing documents more efficient, there was always a lingering question mark over the net gain once you added back all the extra things that now needed to be done compared to a traditional manual process.

In a similar vein, my product would have needed quite a bit of effort to deploy and maintain within a law firm, but more worryingly, more effort from lawyers themselves to sift through the recommendations presented to them. I still believe the struggle to stay on top of internal knowledge resources is a real and important problem faced by lawyers in medium to large law firms. But my proposed solution didn’t stack up.

This was part 6 of a series on legal tech trainwrecks. If you liked this article, feel free to follow me on LinkedIn and also check out the previous articles in the series.

Daniel Yim
Daniel Yim writes and teaches on legal technology and transformation. He is the founder of technology start-ups Sideline and Bumpd, and previously worked at Gilbert + Tobin and Axiom.

Subscribe to the Legal Practice Intelligence fortnightly eBulletin. Follow the links to access more articles related to the business of law and legal technology.

Disclaimer: The views and opinions expressed in this article do not necessarily reflect the official policy or position of Novum Learning or Legal Practice Intelligence (LPI). While every attempt has been made to ensure that the information in this article has been obtained from reliable sources, neither Novum Learning, LPI nor the author is responsible for any errors or omissions, or for the results obtained from the use of this information, as the content published here is for information purposes only. The article does not constitute a comprehensive or complete statement of the matters discussed or the law relating thereto and does not constitute professional and/or financial advice.
