Measuring success

I have written before on the difficulty of measuring the return on investment in knowledge activities. Prompted by a couple of recent conversations, I have been pondering the issue a little more. What follows is a rumination on how successful knowledge activities might be identified within a law firm, especially over a period of time.

In the past, some knowledge folk might have responded to a question about the value of the work they do by pointing to volumes of documents in a know-how database. All this demonstrates is the amount of work done; it doesn’t help people understand how the firm benefits. Nick Milton helpfully summarised, in a blog post earlier this year, some ways in which the business might benefit (together with survey results showing which measures were most commonly used).

In addition to the kind of successes that Nick points to, I have also been fond of using qualitative assessment of knowledge activities. Often it is easier to ask people (whether inside or outside the business) about their experience of KM work. Their responses serve a double purpose — as well as indicating how successful past activities might have been, they can also suggest fruitful directions for the future.

However, I have become more doubtful about the merit of highlighting one-off successes or of depending on how people feel about a service that is designed to make them feel good. These may give an impression of how well certain parts of the knowledge function perform, but they don’t help with a wider picture.

After reflection, I think the answer can be found in an analogy I have used before. Back in June 2014, I likened knowledge management to farming.

In order to improve the yield of the organisation (by whatever measure is appropriate), managers need to enhance people’s natural capabilities (fertilising for growth), while reducing the impact of adverse conditions (sheltering crops from bad weather). That isn’t possible without a deep understanding of the environment within which the organisation works, the natural capabilities of the people within the organisation, and the value of whatever the organisation produces.

The key to measuring the value of our knowledge activities is yield. If the things we do to improve productivity are successful (allowing for the fact that some may be more successful than others), the firm’s yield will improve.

The next question is how yield might be measured in a law firm. The answer here, I think, depends on whether you want to consider the firm in isolation or compare it with the market as a whole. Financial data that is available within the firm may not match what is made available for publication.

Generally speaking, productivity is an expression of the ratio of outputs to inputs. At the national level, the UK Office for National Statistics derives labour productivity estimates by dividing measures of output by some measure of labour input. In professional services, productivity is measured by the turnover of companies adjusted for average wage rises in the sector.

Within a firm, inputs and outputs can be measured precisely. Firms know how many people they employ, how much they are paid, and how long they work. They also know which of these people contribute directly to the firm’s turnover. Productivity could therefore be measured as a ratio of turnover per person (full-time equivalent or otherwise) or per hour worked. Such a measure would not be useful for comparison over time, since inflation might increase fees without a real increase in yield. Using the ratio of income to salaries would smooth out such variations, since inflation in fees is likely to run at a similar rate to pay inflation.
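To make the arithmetic concrete, here is a minimal sketch in Python. All of the figures are hypothetical, and treating fee-earner FTEs and recorded hours as the labour input is an assumption for illustration only.

```python
# Firm-level productivity ratios: turnover per FTE, turnover per hour,
# and income to salaries (the last being less sensitive to fee inflation).
# All figures below are hypothetical.

turnover = 42_000_000        # annual fee income
fee_earner_fte = 180         # full-time-equivalent fee-earners
hours_worked = 180 * 1_600   # total recorded hours for those fee-earners
salary_bill = 21_000_000     # salaries paid to the same people

turnover_per_fte = turnover / fee_earner_fte   # ~233,000 per FTE
turnover_per_hour = turnover / hours_worked    # ~146 per hour
income_to_salaries = turnover / salary_bill    # 2.0

print(f"Turnover per FTE:  {turnover_per_fte:,.0f}")
print(f"Turnover per hour: {turnover_per_hour:,.2f}")
print(f"Income / salaries: {income_to_salaries:.2f}")
```

Tracked year on year, the first two ratios will drift upwards with fee inflation, while the income-to-salaries ratio should move only when productivity genuinely changes, which is the point made above.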

Over time, then, a firm can see how productivity changes from year to year. As law firm knowledge management efforts tend to focus on the income-generating side of the business, examining the productivity of fee-earners in isolation over a significant period might help to show whether those efforts have had a real impact.

If comparison beyond the firm is needed (does our productivity match changes in the market?), then firms need to find publicly available datasets. The most easily accessible data is collected annually by The Lawyer (in the UK) and The American Lawyer (in the US). Both the Lawyer UK 200 and the AmLaw 100 calculate revenue per lawyer (RPL), which can be used as a proxy for a more precise measurement of productivity. Because this measure does not take account of inflation, comparison between firms is only possible within a single year. On its own, that comparison is almost worthless. Without allowing for factors such as the firm’s employment profile (does it depend on low-cost associates, or is it partner-heavy in high-cost locations?), its client types, or its work profile, little real insight is possible. At best, firms might pick comparators they know to be broadly similar.
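As a rough illustration of why a single-year RPL comparison says so little, consider two hypothetical firms with the same revenue but different leverage models (the names and numbers are invented):

```python
# Revenue per lawyer (RPL), as published in the league tables, is simply
# revenue divided by lawyer headcount. Figures below are hypothetical.

firms = {
    "Firm A (associate-heavy)": {"revenue": 300_000_000, "lawyers": 600},
    "Firm B (partner-heavy)":   {"revenue": 300_000_000, "lawyers": 400},
}

for name, data in firms.items():
    rpl = data["revenue"] / data["lawyers"]
    print(f"{name}: RPL = {rpl:,.0f}")

# Firm B shows a 50% higher RPL purely because of its staffing model,
# not because it is more productive -- hence the need for broadly
# similar comparators.
```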

There is, however, a useful way to use published revenue per lawyer figures: comparing the trend in a firm’s performance with that of a larger set. For example, the median RPL figure for the top 100 firms can be plotted against time. That line is likely to ascend, with occasional dips when the wider market is under stress. (I would use the median in preference to the mean, to reduce the impact of particularly high- or low-performing firms in any given year.) When the RPL for a single firm is plotted alongside the whole set, one can see whether the profile of its line matches that for the whole set (performance in line with the market), rises more steeply (outperforming the market), or rises more shallowly (underperforming against the market).
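A minimal sketch of that trend comparison, assuming the published RPL figures have already been gathered into a table (one row per firm, one column per year); the pandas/matplotlib code and the placeholder numbers are illustrative only.

```python
import pandas as pd
import matplotlib.pyplot as plt

years = list(range(2010, 2016))

# Placeholder RPL figures (GBP thousands); in practice these would come
# from The Lawyer UK 200 or AmLaw 100 tables, one row per firm.
top_100 = pd.DataFrame(
    {year: [400 + 10 * i + 8 * (year - 2010) for i in range(100)] for year in years}
)
our_firm = pd.Series({year: 430 + 12 * (year - 2010) for year in years}, name="Our firm")

median_rpl = top_100.median()  # median RPL across the set, per year

ax = median_rpl.plot(marker="o", label="Median RPL, top 100")
our_firm.plot(ax=ax, marker="o", label="Our firm")
ax.set_xlabel("Year")
ax.set_ylabel("Revenue per lawyer (GBP '000)")
ax.legend()
plt.show()
```

Plotting the two series on one axis makes the slope comparison described above immediately visible.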

This graphical information, when combined with what is known inside the firm about any special factors, allows the firm to understand better how well it is doing in the market and what might be causing any difference in performance. The special factors could include investment in knowledge activities, as well as significant client wins or losses, so some caution is still needed.

I suspect very few firms do this kind of meaningful analysis. In a later post, I want to explore the implications for law firm support teams of not having this kind of insight.
