Standardizing Deal Documents: Expert Discussion

Initial Blog Post: Measuring the Consistency of Legal Documents

Transactional lawyers spend huge amounts of time drafting and negotiating documents. Yet few have the tools to assess how their documents compare to similar ones drafted by other lawyers. That is changing.

RealDealDocs, from my former consulting client Practice Technologies, makes it easy to find and compare public disclosure documents and similar clauses.

An emerging product applies statistical metrics to compare like documents and helps both lawyers and clients understand how to improve document drafting. More specifically, it facilitates building standard templates and shows where to focus customization efforts. Kingsley Martin, known to many knowledge management professionals, has formed KIIAC LLC. His web site, which includes the documents used for his initial analysis, is at www.kiiac.com.

I have his permission to publish an e-mail message he sent to me. His note describes some interesting initial findings from his metrics-driven approach:

As part of our work to create document templates automatically, quantify differences among like documents, and develop very accurate searches for transactional documents, our research has uncovered an interesting correlation: the more complex the transaction, the more likely it is that the document consists of standard terms and conditions.

The table below shows a range of agreements and their consistency, measured by document structure commonality and clause language consistency. We base our analysis on 250-500 publicly available samples of each document type. We need to increase the sample set, but early consistency trends are already emerging from our research:

Document Type | Consistency
Interest Rate Swap Agreement | 97%
Merger Agreement | 90%
Finance Agreements (e.g., Term Loans, Credit Agreements, etc.) | 85%
Corporate Formation (e.g., Articles of Incorporation, Bylaws) | 85%
Employment and Consulting Agreements | 65%
Purchase or Lease of Real Property | 60%
Supply Agreements | 55%

The statistical methods used to measure commonality are based on three main elements, simplified here for purposes of explanation.

  • First, the presence of articles, clauses, and sub-sections, that is, the building blocks of a deal document. For example, the technology identifies whether each document has survival, amendment, and waiver clauses, irrespective of where they appear in the document. We also identify and count the deal-specific clauses that do not typically appear in that type of document. The ratio of standard to non-standard clauses gives us the clause commonality measure.
  • Second, for clauses that have sub-sections, we measure the commonality of those sub-clauses. For example, in a merger agreement, what are the clauses in the representations and warranties article, and how do they compare to the list of clauses in that article in other documents? The ratio of common to non-standard sub-clauses gives us the sub-clause commonality measure.
  • Third, the analysis measures the commonality of the words in each of the matching building blocks. It identifies the common words for a particular clause and then uses that information to compute the uncommon, deal-specific terms. The ratio of common to uncommon words in each matching clause gives us the word commonality measure.

Using standard statistical techniques, we aggregate the commonality measures for each element to compute the overall document score.
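To make the arithmetic concrete, here is a minimal sketch in Python of the three ratios described above and a simple average to combine them. It is an illustration only, using made-up clause names and word lists; KIIAC's actual parsing and aggregation methods may differ.

```python
# Simplified sketch of the commonality measures described above.
# Assumptions (not from KIIAC): documents are already parsed into clause
# headings and clause text, and a plain average aggregates the measures.
from collections import Counter

def clause_commonality(doc_clauses, standard_clauses):
    """Ratio of standard clauses to all clauses present in the document."""
    standard = [c for c in doc_clauses if c in standard_clauses]
    return len(standard) / len(doc_clauses) if doc_clauses else 0.0

def word_commonality(clause_words, common_words):
    """Ratio of common words to all words in a matching clause."""
    counts = Counter(clause_words)
    common = sum(n for word, n in counts.items() if word in common_words)
    total = sum(counts.values())
    return common / total if total else 0.0

def document_score(measures):
    """Aggregate the per-element commonality measures into one score."""
    return sum(measures) / len(measures) if measures else 0.0

# Hypothetical merger agreement scored against a reference clause set.
standard_clauses = {"survival", "amendment", "waiver", "representations"}
doc_clauses = ["survival", "amendment", "earn-out schedule", "representations"]

clause_score = clause_commonality(doc_clauses, standard_clauses)  # 3/4 = 0.75
word_score = word_commonality(
    "the representations and warranties of the target company".split(),
    {"the", "representations", "and", "warranties", "of"},
)  # 6/8 = 0.75

print(round(document_score([clause_score, word_score]), 2))  # 0.75
```

In this toy example, three of the four clauses and three quarters of the sample clause's words match the reference set, so the overall score is 0.75.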

We have done the statistical analysis; now we are eager to hear from readers, especially practicing deal lawyers, why more sophisticated transactions tend to be more standardized. Is it because sample documents for complex deals are more readily available online, creating a de facto trend toward standards? On a related note, are the deal-specific terms the critical differentiator that marks the value of the document and the negotiating skill of the author?

If anyone has answers to Kingsley’s questions, you can e-mail him (kingsley dot martin at kiiac dot com), leave a comment, or contact me.

Doug

You need to throw out the Interest Rate Swap Agreement. That is a standardized document. Although the financial implications may be complex, the transaction is fairly straightforward. Standardization removes the transaction costs and allows the market to exist. The derivatives market would not exist without this standardization.

You see a similar standardization in residential mortgages. This allowed the RMBS market to exist. Besides the financial failure, a big failure of the CMBS market was not standardizing the document package.

On another note, one of the difficulties of a work product retrieval system for transaction documents is that the words and provisions are very similar. Much of the value of a particular document is the information that is not in the text of the document itself: industry of the transaction, the bargaining strength of the parties, etc.

As a former real estate practitioner, I can tell you that leases and P&S agreements for real estate are very similar. Since you took your collection from EDGAR, you are only seeing the biggest and most highly negotiated of these agreements. That may skew the results.

As for merger agreements, I think the existence of EDGAR has changed that practice. EDGAR holds a big collection of these documents, so everyone can look at them for guidance. The other side is that your collection of merger agreements is from public companies. You might see a bigger spread if more private-to-private merger agreements were available.

I think the results show the benefit of document automation systems. The majority of provisions in a document do not change from transaction to transaction. A lawyer’s time is better spent on the pieces that distinguish that transaction from others.

Standardization will be good for the legal profession. It reduces transaction costs, which is good for the client. It allows the lawyer to focus on the key issues and language in agreements which should make the lawyer’s practice more interesting.

Kingsley

Thanks for the feedback.

I included the ISDA document as a yardstick or control, but I do agree that using a standard form is hardly a satisfactory measure of consistency.

I also agree with Doug that one likely direction for transactional documents is a Master Document, configurable through Definitions and/or a term sheet.

Indeed, this is the way the ISDA document works.

We are beginning to see industry groups, such as the IACCM in the US and the ICC in Europe, develop standard documents that can be rapidly drafted and customized with an addendum. Mortgage lenders, for example, have for a long time used riders.

The challenge is still to identify the standard terms and to secure agreement among interested parties. Individual corporations and law firms are starting to create their own standards. Where the effort involves an industry group, the process can take many years, as it did with ISDA. In that case, the parties to OTC derivative contracts are often the same organizations, sometimes on different sides of the deal. In other words, they had a shared interest in conformity and fairness.

Where there are divergent interests, it is likely that the process of standardization will take longer, unless, as Doug points out, one side can dictate terms. However, one of my goals is to narrow the points of divergence. Whether it is a loan agreement or an asset purchase agreement, there are a few key provisions; the remainder are already fairly consistent, or in some cases inconsistent for no good commercial reason.

Doug

I found it interesting that the ISDA was not higher. I would have expected 99-100% consistency. Again, it could be that the documents you pulled from EDGAR are highly negotiated for that type of document.

There was some effort by a Silicon Valley legal association to standardize early investment documents. It never got very far.

The one standardization I saw over the last few years was the intercreditor agreement between a mortgage lender and mezzanine lender in real estate documents. The form was drafted by Dechert as counsel for S&P. Lenders started requiring that form or a comparison to that form in securitized mortgage loan originations. That financing market has now disappeared so I am not sure if the form will stick.

Too many lawyers think of themselves as artisans for these agreements and insist that they must use their own templates. Having a common starting point would make the legal work easier on the lawyers and the client. Your study goes a long way toward showing the need for at least a common outline.

Side Note on XBRL

Ron asked Doug and Kingsley about the potential impact of the SEC’s adoption of XBRL on document drafting and standardization. Here are their comments:

Doug

One of the interesting approaches to markup is the one Fannie Mae takes in its DUS program for multi-family mortgage loans. All changes to the document go in an addendum rather than being incorporated into the document itself, so anyone can quickly see how the document differs from the standard form.

Of course, for a document with lots of changes, it gets very difficult to read. The Fannie Mae documents are very fair to the borrower so there is generally very little negotiation. (As a result, legal fees are low.)

The Rouse Company used to take the addendum approach for their smaller retail leases as well. That worked not because their lease form was fair, but because they rarely agreed to changes. There was a big imbalance in bargaining power.

Kingsley

I am not fully conversant with the SEC’s strategy, but I can comment wearing my techie hat.

Most efforts at markup languages tend to focus on meta-data extraction, such as names, dates, and amounts. The technology typically uses rules-based extraction techniques, and unfortunately these are language- and document-type specific, which hampers widespread adoption.
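As a toy illustration of that limitation (my own example, not drawn from the SEC’s or any vendor’s code), here is a rules-based extractor whose regular expressions hard-code English month names and US dollar formatting; the same rules find nothing in a German or French contract, so they must be rewritten for every language and drafting convention.

```python
# Hypothetical rules-based meta-data extractor. The patterns are assumptions
# for the example only: they work for English, US-formatted contracts and
# fail silently on anything else, which is the adoption problem noted above.
import re

US_DATE = re.compile(
    r"\b(?:January|February|March|April|May|June|July|August|"
    r"September|October|November|December)\s+\d{1,2},\s+\d{4}\b"
)
USD_AMOUNT = re.compile(r"\$[\d,]+(?:\.\d{2})?")

def extract_metadata(text):
    """Pull dates and dollar amounts using hard-coded English/US rules."""
    return {"dates": US_DATE.findall(text), "amounts": USD_AMOUNT.findall(text)}

print(extract_metadata(
    "This Agreement is dated March 1, 2009 and the purchase price is $2,500,000."
))
# {'dates': ['March 1, 2009'], 'amounts': ['$2,500,000']}
# The same call on "Vertrag vom 1. März 2009 über EUR 2.500.000" returns
# empty lists, so every new language or document type needs new rules.
```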

Another of our goals is to apply statistical analysis methods that work on all document types and languages to identify the substantive provisions of a document: that is, what terms it contains and what terms may be missing.

I am not aware whether the SEC project includes the latter analysis, but I think it would add value. In fact, the first application I developed, a document assembly system, created a summary of each document’s key provisions so I could better recall the contents when a client called.