When a large law firm rolls out new software, how can it measure success? Do firms even ask this question? Reader feedback is welcome. 

As CIO at a large law firm, I deployed a portal. We bought separate tracking software to measure and analyze portal usage. I was surprised that tracking was not built in – I assumed all firms would want to measure usage. Hits to the firm directory and certain other administrative information were very high (there was no other way to get that information), but otherwise, after one year, usage was pretty low. Recent conversations as both a consultant and participant at professional events suggest that tracking is still not common.

This topic arises because I wrote a case study (to be published soon) of RealPractice at Littler Mendelson as part of my affiliation with Practice Technologies. My analysis of the growth and frequency of lawyer usage of RealPractice at Littler looks good to me, but a friend asks how it compares with other roll-outs.

Great question! I’m not sure anyone has good comparative data; consider some examples:
– Mandatory systems (e.g., document management) say little because lawyers have no choice.
– Highly specialized practice applications say little because usage is inherently limited.
– Anecdotes suggest that CRM uptake is low, though the intent was for widespread usage.
– I suspect Lexis and Westlaw took over a decade or two to achieve current usage rates.

Can anyone share data on lawyer usage after rolling out software available to all lawyers? OK, maybe you don’t have data. So then let me ask: what percentage of use do you think reflects a good result after two years? Would 25% shock you – and if so, in which direction? How about 50%?

Though motivated by this case study, the question is of broader interest. Click here to comment publicly, or click here to reply privately by e-mail. I will not use e-mail answers for any purpose without explicit permission.