Thursday, February 17, 2011

CFOs, Treasurers, Bank Risk and the Adequacy of Internal Controls

The only constant in risk analysis is that it is ever changing. It’s 2011 and, make no mistake about it, understanding bank risk continues to evolve. The FDIC just released 2011 Rule 1, which updates the bank assessment rules that protect the Deposit Insurance Fund (DIF). Banks must now begin adapting to these new requirements. For everyone else, it would be prudent to keep paying attention.

As of February 16, 2011, 340 banks have failed in the United States since the beginning of 2008. As banks have failed, we’ve been making forensic summaries available on our website to help improve transparency over what happened. We have seen the focus of worry among the banking industry’s customers shift from operating stresses stemming from unsafe or unsound operating practices -- the primary objective of IRA’s Bank Stress Index (BSI) testing approach -- to tactical warning about when banks are degrading in condition and in danger of failing.

Mathematically, the crux of the demand is to deliver analytics capable of maintaining linear tracking of banks past the investment grade threshold and deeper into the degradation curve of the system. In English, people want to know when to get out of the way of the icebergs, because it’s become all too clear that there is no such thing as a ship that is too big to fail. To respond to this need we have constructed a Counterparty Quality Scoring (CQS) measure that provides tactical warning as to when it may be prudent to consider reducing exposure ahead of a possible FDIC failure. IRA’s CQS is less sensitive than our bomb-detector-dog-nose BSI technique, but it reveals something very interesting about the FDIC: most of the time a bank will fail while there’s still some meat on the bones, so that a pre-packaged resolution makes sense. And while there are occasional oddballs, most of the time the writing on the wall -- the stress and the trouble -- is clear as day, with sufficient time to react.

For CFOs and Treasurers, we are finding that companies are beginning to become aware that their corporate treasury “investments” in banks are in fact “bank liabilities” potentially subject to loss in the event of a bank failure. We did see a few instances in 2010 of companies calling up to subscribe to our system “right now” because they had taken a real loss following a failure and found out the hard way what the term “uninsured deposits” really means. We’ve talked to a number of companies now, and we think it’s important to begin to explain the variety of internal control strategies we’ve been running into.

First of all, corporate America can have some very complex interrelationships with banking. The classic notion that a company only needs one “too big to fail” banker to tend to all its needs turns out not to be how things work in real life. For certain things, these large institutions provide efficient solutions, and integration costs can lead to very stable and sticky long-term relationships indeed. But we also found that companies can and do shop around among service offerings, a finding that validates the assertion that there’s no such thing as “run-off proof” deposits, not even for the coziest of historical banking relationships. The bottom line is that even for those for whom “size matters,” there are many horses to bet on.

We’re also finding that America’s companies carry a lot more exposure to midsize and small banks than most people think. Certain industries such as retail, franchising, and integrated supply chains -- to name a few -- have business models requiring dozens, and in extreme cases hundreds, of separate bank relationships. In a number of cases, maintaining a good “local” geographic presence is what enlarges the number of banks a company interacts with.

Lastly, we are finding that, in our opinion, far too few corporations practice active risk mitigation. The penchant to “find comfort” and nest in it remains very much a characteristic of U.S. business at this time. There’s even a component of the old culture of “plausible denial” that continues to permeate the system -- something that the rules governing the “adequacy of internal controls” under Sarbanes-Oxley should have driven out of corporate America long ago. We also find a number of companies relying on indirect measurement techniques to gain comfort about the banks they do business with. The most common of these is relying on the investment bankers who package their corporate bonds to use market-inference indicators to imply the soundness of their banking relationships. So things like stock prices and CDS indices substitute for looking at the real FDIC CALL reports.

But we've also seen that some companies do avail themselves of approaches from the banking world that mitigate the effects of a bank failure, the most common being some form of deposit placement spreading, either directly or via a brokered deposit program. So not everyone is behind the curve.
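For readers who want to see the mechanics, here is a deliberately simplified sketch of direct deposit placement spreading: allocate a cash balance across several banks so that no single placement exceeds the standard FDIC insured limit of $250,000 per depositor, per insured bank. The allocator and the bank names in the example are hypothetical illustrations; real programs layer in ownership categories, reciprocal or brokered placement agreements, and yield and service considerations that this toy ignores.

```python
# Simplified sketch of deposit placement spreading (illustrative only).
FDIC_LIMIT = 250_000  # standard coverage per depositor, per insured bank


def spread_deposits(total_cash, banks, per_bank_cap=FDIC_LIMIT):
    """Allocate total_cash across banks, capping each placement at per_bank_cap."""
    if total_cash > per_bank_cap * len(banks):
        raise ValueError("Not enough banks to keep every placement insured")
    allocations = {}
    remaining = total_cash
    for bank in banks:
        placed = min(per_bank_cap, remaining)
        allocations[bank] = placed
        remaining -= placed
        if remaining <= 0:
            break
    return allocations


# Example: $1.2 million spread across a candidate list of six banks.
# print(spread_deposits(1_200_000, ["Bank A", "Bank B", "Bank C",
#                                   "Bank D", "Bank E", "Bank F"]))
```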

With all this as a backdrop, we find ourselves reminding companies that it’s critical to know -- not guess -- how healthy their banks are, what their alternatives are, when they need to act and how they will act if they have to.

First, companies need to remember that risk surveillance is only useful when it’s reporting on reality. The consequences of Wall Street’s “Imagineering” have already wreaked havoc on the U.S. economy, and the days of granite-reputation banks are long gone. The characteristics of what constitutes risky behavior in banking are understood well enough that they’re codified in legal statutes. Interpreting them is somewhat of an art form, but ultimately there’s no valid business reason not to know. What is important is to focus surveillance on the action thresholds. If all banks were healthy, the focus would be on the best yield/services bundle. But, given the state of things, asset preservation or loss avoidance is presently equally important.

We believe that when it comes to loss avoidance with one’s bank(s), marking to reality means looking at FDIC data. Secondary-market surrogate measures can carry volatile spin that causes false alarms. Next to realizing a "Black Swan" loss because you weren't looking, the worst thing to do is mistakenly abandon a banker who was acting in your best interest. It’s an opportunity cost best avoided by any depositor, business or consumer.

We further believe that it takes two surveillance data points to gain perspective on a bank’s soundness. The first should be an early indicator that allows one to assess confidence in a bank’s operational leadership before trouble ever happens. The second is a present-state benchmark that delivers a relevant warning to act while there’s still time to do so. One thing we will note for those wishing to do internal analysis is that certain regulatory benchmarks, like capital adequacy, are designed to be lagging indicators of strength and soundness. If you look at those forensic reports on the failed banks, you’ll see such measures tend to falter only at the very end of the life cycle. Acting in a rush is no way to manage one’s larder. So if you choose to read the CALL/TFR reports directly off the FDIC website and gain comfort yourself, you’ve been warned. Also, remember that things like tangible common equity (TCE) and Texas Ratios are investors’ analytical tools. That’s potentially useful stuff if your company’s trading desk holds stock or debentures in a bank, but it’s not quite the same thing as protecting your treasury’s cash and cash equivalents.
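To make that distinction concrete, here is a toy illustration of one common formulation of the Texas Ratio: troubled assets divided by tangible common equity plus loan loss reserves. The function and its parameter names are hypothetical placeholders, not official CALL report item codes; it shows what this kind of investor screening metric measures, not how to protect deposits.

```python
# Illustrative only: one common formulation of the Texas Ratio.
def texas_ratio(nonperforming_assets, loans_90_days_past_due,
                tangible_common_equity, loan_loss_reserves):
    """Texas Ratio = troubled assets / (tangible common equity + reserves).

    Readings approaching 1.0 (100%) have historically flagged banks at
    elevated risk of failure; it is an investor screening tool, not a
    deposit-safety guarantee.
    """
    troubled = nonperforming_assets + loans_90_days_past_due
    cushion = tangible_common_equity + loan_loss_reserves
    return troubled / cushion


# Example: $80M of troubled assets against a $120M capital-plus-reserve cushion.
# print(round(texas_ratio(60e6, 20e6, 100e6, 20e6), 2))  # -> 0.67
```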

If you’re really lucky it will never happen, but real-world odds are that sooner or later everyone will ponder what valid alternatives there are to their current bank relationships. Bear in mind that change events also happen for positive reasons. Ideally, for every bank you are in, you want to know the next three best alternatives to that institution in case you decide to move your money. Your criteria may include minimum/maximum size, geography, service offering set, switching cost and other narrowing factors, but you still always want to know your short list of where to go if you have to. We believe your list should always include apples-to-apples soundness benchmarks, so that if you elect to deviate from the norm you’ll know it, and by how much. Of course, you may have to explain why at the next shareholders’ meeting. And please remember that while IRA is authoring this article, it’s not the only bank ratings company focusing on a next generation of tools for independently risk-testing banks. The point of this note is that having a rational basis for knowing one’s alternatives is good internal control procedure. Whether you use external reference sources or do it yourself is up to you.

And last but not least, of course, have specific, measurable, actionable and achievable plans for each option you intend to execute.

Tuesday, February 8, 2011

XBRL Usability Part 2: Checking the Extension Cord

In our previous installment of IRA’s testing of U.S. XBRL filings, we reported that one only needs the EDGAR Accession file to properly monitor and locate SEC filings, both with and without XML exhibits. We are now confident enough to cease monitoring the SEC’s experimental XBRL feed and begin using a true production version to process filings via the standard EDGAR system.

As we await the arrival of a new wave of filings in June 2011, we thought we’d check out the library of variables that will be available. In addition to US-GAAP data elements, filers are allowed to add “extensions” to their submittals. These extensions are independently defined by each company and do not require external coordination or rationalization at this time. Extensions are added to a filing via the .xsd file, an XBRL definitions file that begins with statements importing the standardized taxonomies, followed by the company’s own extension declarations. Every SEC filing with an XBRL exhibit file set has an .xsd file; not all of them contain extensions.
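To illustrate the mechanics, a company extension schema can be read with any XML parser: the xs:import statements pull in the standard taxonomies, and the top-level xs:element declarations are the filer’s own extension concepts. This is a minimal sketch, not IRA’s production tooling, and the file name in the example is hypothetical.

```python
# Minimal sketch: inspect an SEC filer's XBRL extension schema (.xsd).
import xml.etree.ElementTree as ET

XS = "{http://www.w3.org/2001/XMLSchema}"  # XML Schema namespace


def summarize_extension_schema(xsd_path):
    root = ET.parse(xsd_path).getroot()
    # xs:import children bring in the standard taxonomies (US-GAAP, dei, etc.)
    imported = [imp.get("namespace") for imp in root.findall(XS + "import")]
    # top-level xs:element children declare the company's extension concepts
    extensions = [el.get("name") for el in root.findall(XS + "element")]
    return {
        "target_namespace": root.get("targetNamespace"),
        "imported_taxonomies": imported,
        "extension_count": len(extensions),
        "extension_names": extensions,
    }


# Example (hypothetical file name):
# print(summarize_extension_schema("abc-20101231.xsd")["extension_count"])
```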

We instructed our computer to count up the population of 10-K, 10-Q and 20-F filings for the past year and rummage through the .xsd files. On this particular test run there were 36,136 SEC filings in the Accessions tape meeting our criteria. Of these, 3,880 included XBRL exhibit attachments, representing 1,488 Central Index Key (CIK) SEC registrants -- roughly one fifth of the population expected to start submitting in June 2011.
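The screening step itself is simple. Here is an illustrative sketch, assuming the accession data has already been parsed into simple records; the field names below are hypothetical, not EDGAR’s actual column headings, and this is not IRA’s pipeline.

```python
# Illustrative screening sketch over parsed filing records.
from collections import namedtuple

Filing = namedtuple("Filing", ["cik", "form_type", "has_xbrl_exhibit", "xsd_path"])

FORMS_OF_INTEREST = {"10-K", "10-Q", "20-F"}


def screen_filings(filings):
    """Return counts of (in-scope filings, filings with XBRL exhibits, distinct CIKs)."""
    in_scope = [f for f in filings if f.form_type in FORMS_OF_INTEREST]
    with_xbrl = [f for f in in_scope if f.has_xbrl_exhibit]
    registrants = {f.cik for f in with_xbrl}
    return len(in_scope), len(with_xbrl), len(registrants)


# On the run described above, this kind of screen yielded 36,136 in-scope
# filings, 3,880 with XBRL exhibits, and 1,488 distinct CIK registrants.
```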

Of the 3,880 filings, we found that 3,039 (78%) contained extension elements; the remainder used only the primary taxonomies in their construction. The number of extensions per document ranged from a low of one to a high of 922 in a single filing. The total number of extension elements created by the sampled filings was 183,846.

Extrapolating this to an estimated 8,100 companies submitting beginning in June 2011 (8,100 / 1,488 = 5.44X, and 183,846 x 5.44 ≈ 1,000,000) says that we are looking at an annual extension library of just over one million independently created elements to keep track of, in addition to the US-GAAP set. Wowie!

This actually doesn’t bother us that much. We have expected for a long time now that the extensions workaround -- compensating for the taxonomy architects’ inability to anticipate every possible data element -- would create this type of algal bloom in the data. It merely points out two things.

First, unless one is doing merger and acquisition work, one can probably ignore most of these extensions and do most first- and second-level screening analytics using just the US-GAAP subset. This replicates -- if not surpasses -- the detail coming out of the best of the fundamental feeds. Besides, if you are doing M&A diligence, you are looking at more than just financial reporting filings anyway.
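As a sketch of what subset analytics can look like in practice, the following pulls only the US-GAAP-tagged facts out of an XBRL instance document and skips company extension facts. Matching the namespace on the string "us-gaap" is a simplification, since the exact namespace URI varies by taxonomy release year, and the file name in the example is hypothetical.

```python
# Illustrative sketch: keep only US-GAAP-tagged facts from an XBRL instance.
import xml.etree.ElementTree as ET


def us_gaap_facts(instance_path):
    facts = {}
    for el in ET.parse(instance_path).getroot():
        # Namespaced tags look like "{namespace-uri}LocalName"
        if el.tag.startswith("{") and "us-gaap" in el.tag.split("}")[0]:
            name = el.tag.split("}")[1]
            facts.setdefault(name, []).append((el.get("contextRef"), el.text))
    return facts


# Example (hypothetical file name):
# facts = us_gaap_facts("abc-20101231.xml")
# print(len(facts), "distinct US-GAAP concepts reported")
```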

There’s always been a notion among analysts that the first true use of XBRL filing exhibits would be based on subset analytics, as opposed to extreme-diligence tracing of every nuanced item in a filing. From what we see, the June 2011 filings population should provide the first near-census opportunity to do aggregate and sector analysis on U.S. public companies where the data goes directly from the SEC’s EDGAR system to the research department without needing to pass through an intermediary processor. And it is chain-of-process traceable to the government evidentiary source. We like that!

Second, from our cursory inspection we believe many of these company-created extensions are quite common in nature. They can -- with proper effort -- be aligned into new standardized taxonomy elements over time. As these winnow down, we expect what remains will be the kinds of specifics that are truly company-unique. Still, seeing a million data elements to catalog is once again a good lesson to all that in data management, for information to be usable, less is more.