Wednesday, March 16, 2011

FDIC Bank Assessments Change on April 1st as Dodd-Frank Comes to Banking

The Federal Deposit Insurance Corporation (FDIC) is the buck-stops-here office of U.S. bank regulation. Chairman Sheila Bair and company are the checkbook behind the sign on every bank’s door guaranteeing that your deposit is FDIC insured. America’s Deposit Insurance Fund (DIF) is funded by charging banks an insurance premium, and it’s this cash paid by the banks themselves that is supposed to maintain the DIF’s reserve. Come April 1st, the rules for how banks get charged will change dramatically.

The FDIC approved 2011 Final Rule changes to 12 CFR 327. The rule is effective April 1, 2011, and will be reflected in the June 30, 2011 fund balance and the invoices for assessments due September 30, 2011. The final rule incorporates many policy changes based on the intent of the Dodd-Frank Wall Street Reform and Consumer Protection Act.

The most important change of all is in something called the “base assessment amount”. In the past this number was – and until March 31st remains – the domestic deposits held by a bank. But as Dodd-Frank recognizes, banks have become more complex and leveraged. Most importantly, since the demise of Glass-Steagall, many of the larger institutions have incorporated the complexities of investment banking into their business models. The FDIC assessment methodology essentially gave these aspects of a bank’s operations a free ride. Well, come April Fool’s Day, the free ride is over. The “base assessment amount” changes to a new formula: from now on it’s the bank’s total assets minus its tangible common equity (TCE) that determines the base amount. I do note, though, that it’s actually Tier 1 capital that the FDIC will be using at first because, despite the popularity of the notion of TCE with Wall Street, the Treasury and the Federal Reserve, no one’s quite sure how to properly calculate TCE for an FDIC unit certificate holder just yet. We ran the computations both ways using IRA’s methodology for unit-level TCE estimation just to confirm that the Tier 1 approximation is, as they say, close enough for government work. It is.
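To make the shift concrete, here is a minimal sketch of the old base versus the new base. It is an illustration only – the figures and field names are hypothetical, not actual FDIC report codes – and it uses Tier 1 capital as the stand-in for tangible equity, just as described above.

```python
# Illustrative sketch only: compares the pre-April-2011 assessment base
# (domestic deposits) against the Dodd-Frank style base (total assets
# minus tangible equity, approximated here by Tier 1 capital). Figures
# and field names are hypothetical, not actual FDIC report codes.

def old_assessment_base(domestic_deposits):
    """Pre-2011 rule: the base is simply domestic deposits."""
    return domestic_deposits

def new_assessment_base(total_assets, tier1_capital):
    """2011 Final Rule style: total assets minus tangible equity, with
    Tier 1 capital standing in for tangible common equity (TCE)."""
    return total_assets - tier1_capital

# A hypothetical deposit-funded community bank versus a hypothetical
# investment-banking-heavy institution (figures in $ millions).
banks = {
    "community": {"deposits": 900, "assets": 1_000, "tier1": 100},
    "complex":   {"deposits": 400_000, "assets": 2_000_000, "tier1": 150_000},
}

for name, b in banks.items():
    old = old_assessment_base(b["deposits"])
    new = new_assessment_base(b["assets"], b["tier1"])
    print(f"{name}: old base {old:,}  new base {new:,}  ({new / old:.1f}x)")
```

The deposit-funded bank’s base barely moves while the complex institution’s base balloons, which previews the pattern in the findings listed below.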

The no-more-free-ride-for-investment-banking policy slant of the 2011 rules is a huge shift. And yes, it affects the largest, most complex banks most. We completed an estimation method – available in our Professional IRA Bank Monitor product – that looks at the change in assessment base and also implements the guidance of the new rule to estimate the DIF premiums. As with all IRA analytics, data is available for all 7,500+ FDIC certificate units plus 4,500+ bank-only components of BHCs. Just for grins, we back-computed “what would it have been” data for prior periods, quarterly back to 1995. Looking at this new mass of data we saw the following:

• Smaller banks that have traditional unleveraged business operating models generally see little change from the new rules.
• Mid-size and larger banks, over $10 billion assets, generally see an increase in their assessment base.
• Within the largest banks, the ones that are heavily involved in investment banking are most impacted by the new rules.

There’s also an important second point embedded in these new rules. The math for estimating risk category assignment is done using the FDIC’s CAMELS scoring system. CAMELS stands for Capital Adequacy, Asset Quality, Management, Earnings, Liquidity, and Sensitivity to Market Risk. The scores are built on privileged examination data and are sharable only between a bank and its regulator. For those of you who are old Basel II aficionados, think of it as “everything is IRB”.
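For readers who want a feel for how a composite supervisory score can feed a risk-category bucket, here is a deliberately simplified sketch. It is not the FDIC’s actual assignment matrix – that process also weighs capital groups and, for the largest banks, a scorecard – and the thresholds below are illustrative assumptions only.

```python
# Deliberately simplified illustration: folding a CAMELS-style composite
# score and a capital adequacy flag into a risk-category bucket. The real
# FDIC assignment matrix is more involved; thresholds here are assumptions.

def risk_category(camels_composite, well_capitalized):
    """Map a 1 (best) to 5 (worst) composite plus a capital flag to a
    Roman-numeral category. Illustrative only, not the actual rule."""
    if camels_composite <= 2 and well_capitalized:
        return "I"
    if camels_composite <= 3:
        return "II"
    if camels_composite == 4:
        return "III"
    return "IV"

print(risk_category(2, True))    # -> I
print(risk_category(3, True))    # -> II
print(risk_category(5, False))   # -> IV
```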

So what’s the big deal about this?

The “2011 Final Rule” specifically states that the going forward process eliminates “the use of long-term debt issuer ratings for calculating risk-based assessments for large institutions”. The methodologies of the Nationally Recognized Statistical Rating Organizations (NRSRO’s) are no longer part of the process. This reduction in official dependence on rating agencies that failed to detect the last wave of systemic risks is also consistent with the intent of Dodd-Frank.

For our estimator tool, we used IRA’s public-data “Shadow” CAMELS methodology, developed in 2009 as part of a contract supporting the Securities and Exchange Commission, to set up our Risk Category analysis. That contract requires us to compute these shadows for every bank unit and bank holding company in a timely manner every quarter. It turned out to be fortuitous for our look into the FDIC 2011 Assessment Rule. The SEC specified that we compute “Shadow” CAMELS to tenths, and this allows our tool to pick off a point within the cap/floor limits of the assessment rate range specified in the statute to set a better 2011 FDIC Initial Base Assessment Rate, or IBAR. This refines the remainder of the computations that depend on this number. Yes, good people, that is math talk for the deep end of the pool. Translation to English = blah-blah-blah.
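For the curious, the “pick off a point within the cap/floor limits” step is easier to see in code than in prose. Below is a minimal sketch assuming a hypothetical rate corridor and a fractional shadow CAMELS composite; the actual corridors, weightings and rounding in the 2011 Final Rule differ.

```python
# Minimal sketch: linearly interpolate an initial base assessment rate
# (IBAR) inside a floor/cap corridor using a fractional "shadow" CAMELS
# composite. The corridor and the 1.0-5.0 score span are assumptions for
# illustration, not the rates in the 2011 Final Rule.

def initial_base_rate(shadow_camels, floor_bps, cap_bps,
                      best_score=1.0, worst_score=5.0):
    """Scale a composite score (computed to tenths) into [floor, cap]."""
    frac = (shadow_camels - best_score) / (worst_score - best_score)
    frac = min(max(frac, 0.0), 1.0)          # clamp to the corridor
    return floor_bps + frac * (cap_bps - floor_bps)

# A shadow composite of 2.3 inside a hypothetical 5-35 basis point corridor.
print(f"{initial_base_rate(2.3, 5, 35):.2f} bps")   # -> 14.75 bps
```

The value of computing the shadow composite to tenths shows up here: a 2.3 lands at a different point in the corridor than a flat 2 would, and that is what refines the downstream numbers.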

Thursday, February 17, 2011

CFO’s, Treasurers, Bank Risk and the Adequacy of Internal Controls

The only constant about risk analysis is that it is ever changing. It’s 2011 and, make no mistake about it, keeping up with bank risk continues to be a moving target. The FDIC just released its 2011 Final Rule updating the bank assessment rules that protect the Deposit Insurance Fund (DIF). Banks must now begin to adapt to these new requirements. For everyone else, it would be prudent to keep paying attention.

As of February 16, 2011, three hundred forty banks have failed in the United States since the beginning of 2008. As banks have failed, we’ve been making forensic summaries available on our website to help improve transparency about what happened. We have seen the focus of worry facing the banking industry’s customers shift from operating stresses stemming from unsafe or unsound operating practices – the primary objective of IRA’s Bank Stress Index (BSI) testing approach – to tactical warning about when banks are degrading in condition and in danger of failing.

Mathematically, the crux of the demand is to deliver analytics capable of maintaining linear tracking of banks past something called the investment-grade threshold and deeper into the degradation curve of the system. In English, people want to know when to get out of the way of the icebergs, because it’s become all too clear that there’s no such thing as a ship that is too big to fail. To respond to this need we have constructed a Counterparty Quality Scoring (CQS) measure that provides tactical warning as to when it may be prudent to consider reducing exposure in case there is an FDIC failure. IRA’s CQS is less sensitive than our bomb-detector-dog-nose BSI technique, but it reveals something very interesting about the FDIC: most of the time a bank will fail while there’s still some meat on the bones, so that a pre-packaged resolution makes sense. And while there are the occasional oddballs, most of the time the stress-and-trouble writing on the wall is clear as day, with sufficient time to react.

For CFOs and Treasurers, we are finding that companies are becoming aware that their corporate treasury “investments” in banks are in fact “bank liabilities” potentially subject to loss in the event of a bank failure. We did see a few instances in 2010 of companies calling up to subscribe to our system “right now” because they had taken a real loss following a failure and found out the hard way what the term “uninsured deposits” really means. We’ve talked to a number of companies now and we think it’s important to begin to explain the variety of internal controls strategies we’ve been running into.

First of all, corporate America can have some very complex interrelationships with banking. The classic notion that a company only needs one “too big to fail” banker to tend to all its needs turns out not to be how things work in real life. For certain things, these large institutions provide efficient solutions, and integration costs can lead to very stable and sticky long-term relationships indeed. But we also found that companies can and do shop around among service offerings, a finding that validates the assertion that there’s no such thing as “run-off proof” deposits, not even for the coziest of historical banking relationships. The bottom line is that even for those for whom “size matters,” there are many horses to bet on.

We’re also finding that America’s companies carry a lot more exposure to middle and small sized banks than most people think. Certain industries such as retail, franchising, and integrated supply chains -- to name a few -- have business models requiring dozens and in extreme cases hundreds of separate bank interactions. In a number of cases, maintaining good “local” geographic presence is responsible for enlarging the number of banks a company interacts with.

Lastly, we are finding that, in our opinion, far too few corporations practice active risk mitigation. The penchant to “find comfort” and nest in it remains very much a characteristic of U.S. business at this time. There’s even a component of the old culture of “plausible denial” that continues to permeate the system – something that the rules governing the “adequacy of internal controls” under Sarbanes-Oxley should have long since driven out of corporate America. We also find a number of companies relying on indirect measurement techniques to gain comfort about the banks they do business with. The most common of these is relying on the investment bankers who package their corporate bonds to use market-inference indicators to imply the soundness of their banking relationships. So things like stock price and CDS indices substitute for looking at the real FDIC CALL reports.

But we've also seen that some companies do avail themselves of approaches out of the banking world that help mitigate the impact of a bank failure, the most common being some form of deposit risk spreading, either directly or via a brokered deposit program. So not everyone is behind the curve.

With all this as a backdrop, we find ourselves reminding companies that it’s critical to “know” not guess about how healthy their banks are, what their alternatives are, when they need to act and how they will act if they have to.

First, companies need to remember that risk surveillance is only useful when it’s reporting on reality. The consequences of the “Imagineering” of Wall Street have already wreaked havoc on the U.S. economy and the days of granite reputation banks are long gone. The characteristics of what constitutes risk behavior in banking are understood well enough that they’re codified in legal statutes. Interpreting it is somewhat of an art form but ultimately there’s no valid business reason not to know. What is important to do is focus surveillance on the action thresholds. If all banks were healthy, the focus would be on the best yield/services offering bundle. But, given the state of things, asset preservation or loss avoidance is presently equally important.

We believe that when it comes to loss avoidance with one’s bank(s), marking to reality means looking at FDIC data. Secondary market surrogate measures can have volatile spin in them that can cause false alarms. Next to realizing a "Black Swan" loss because you weren't looking, the next worst thing is mistakenly abandoning a banker who was acting in your best interest. That’s an opportunity cost best avoided by any depositor, business or consumer.

We further believe that it takes two surveillance data points to gain perspective on a bank’s soundness. One should be an early indicator that allows one to assess confidence in a bank’s operational leadership before trouble ever happens. The second is a present-state benchmark that delivers a relevant warning to act while there’s still time to do so. One thing we will note for those wishing to do internal analysis is that certain regulatory benchmarks, like capital adequacy, are designed to be lagging indicators of strength and soundness. If you look at those forensic reports on the failed banks, you’ll see those measures tend to falter only at the very end of the life cycle, and acting in a rush is no way to manage one’s larder. So if you choose to read the CALL/TFR reports directly off the FDIC website and gain comfort yourself, you’ve been warned. Also, remember that things like TCE and Texas Ratios are investors’ analytics tools. That’s potentially useful stuff if your company’s trading desk book holds stock or debentures in a bank, but it’s not quite the same thing as treasury asset protection of your cash and cash equivalents.
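For reference, here is a hedged sketch of the two investor-side metrics just mentioned, using hypothetical figures; exact definitions of the inputs vary from analyst to analyst.

```python
# Quick reference sketch of the two investor-side metrics mentioned above.
# Definitions vary by analyst; the inputs below are hypothetical figures
# (in $ millions), not data for any real bank.

def texas_ratio(nonperforming_assets, tangible_common_equity, loan_loss_reserves):
    """Commonly quoted as NPAs divided by TCE plus reserves; readings near
    or above 1.0 are usually treated as a distress flag."""
    return nonperforming_assets / (tangible_common_equity + loan_loss_reserves)

def tce_ratio(tangible_common_equity, tangible_assets):
    """Tangible common equity as a share of tangible assets."""
    return tangible_common_equity / tangible_assets

print(f"Texas ratio: {texas_ratio(180, 150, 60):.2f}")   # -> 0.86
print(f"TCE ratio:   {tce_ratio(150, 2_400):.1%}")        # -> 6.2%
```

Useful for an equity or debt view of a bank, but, as noted, not a substitute for watching the regulatory data that governs whether your deposits are at risk.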

If you’re really lucky it’ll never happen, but real-world odds are that sooner or later everyone will ponder what valid alternatives there are to one’s current bank relationships. Bear in mind that change events also happen for positive reasons. Ideally, for every bank you are in, you want to know the next three best alternatives to that institution in case you decide to move your money. Your criteria may include minimum/maximum size, geography, service offering set, switching cost and other narrowing factors, but you still always want to know your short list of where to go if you have to. We believe your list should always include apples-to-apples quality benchmark soundness criteria, so if you elect to deviate from the norm you’ll know it and by how much. Of course, you may have to explain why at the next shareholders meeting. And please remember that while IRA is authoring this article, it’s not the only bank ratings company focusing on a next generation of tools for independently risk testing banks. The point of this note is that having a rational basis for knowing one’s alternatives is good internal control procedure. Whether you use external reference sources or do it yourself is up to you.

And last but not least of course, have specific, measurable, actionable and achievable plans to act for each option you intend to execute.

Tuesday, February 8, 2011

XBRL Usability Part 2: Checking the Extension Cord

In our previous installment of IRA’s testing of U.S. XBRL filings we reported that one only needs the EDGAR Accession file to be able to properly monitor and locate SEC filings both with and without xml exhibits. We are now confident enough to cease monitoring of the SEC’s experimental XBRL feed and begin using a true production version to process filings via the standard EDGAR system.

As we await the arrival of a new wave of filings in June 2011 we thought we’d check out the library of variables that will be available. In addition to US-GAAP data elements, filers are allowed to add “extensions” to their submittals. These extensions are independently defined by each company and do not require external coordination or rationalization at this time. The way extensions are added to a filing is via the xsd file, an XBRL definitions file that starts with statements to bring in the standardized taxonomies, followed by the company’s independent extension set. Every SEC filing with an XBRL exhibits file set has an xsd file. Not all of them have extensions.
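As a rough illustration of what rummaging through the xsd’s involves, here is a minimal sketch that counts the element declarations a filer adds under its own targetNamespace. The sample file name is hypothetical, and the heuristic – top-level declarations belong to the filer, standard taxonomies arrive via imports – is our working assumption, not a formal rule.

```python
# Minimal sketch: count company-specific extension elements declared in an
# XBRL extension schema (.xsd). Treats any element declared under the
# filer's own targetNamespace as an extension. The sample file name and
# the standard-vs-extension heuristic are illustrative assumptions.
import xml.etree.ElementTree as ET

XSD_NS = "{http://www.w3.org/2001/XMLSchema}"

def count_extensions(xsd_path):
    root = ET.parse(xsd_path).getroot()
    target_ns = root.get("targetNamespace", "")
    # Standard taxonomies (us-gaap, dei, etc.) are brought in via imports,
    # so top-level element declarations here belong to the filer.
    elements = root.findall(f"{XSD_NS}element")
    return target_ns, len(elements)

ns, n = count_extensions("example-20101231.xsd")   # hypothetical filing
print(f"{n} extension elements declared under {ns}")
```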

We instructed our computer to count up the population of 10-K, 10-Q and 20-F filings for the past year and rummage through the xsd’s. On the particular test run we did there were 36,136 SEC filings meeting our test run criteria in the Accessions tape. Of these, 3,880 included XBRL exhibit attachments, representing 1,488 Central Index Key (CIK) SEC Registrants. This was roughly one fifth of the population expected to start submitting in June 2011.

Of the 3,880 filings, we found that 3,039 (78%) contained extension elements. The remainder only used primary taxonomies in their construction. The number of extensions in these documents ranged from a low of one extension to a high of nine hundred twenty-two extensions in a single filing. The total number of extension elements created by the sampled filings was 183,846.

Extrapolating this to an estimated 8,100 companies submitting beginning in June 2011 (8,100/1,488 = 5.44X) says that we are looking at an annual extension library to keep track of just over one million independently created elements in addition to the US-GAAP set. Wowie!

This actually doesn’t bother us that much. We have expected for a long time now that the extensions workaround – compensating for the taxonomy architects’ inability to anticipate every possible data element – would create this type of algal bloom in the data. It merely points out two things.

First, unless one is doing merger and acquisition work, one can probably ignore most of these extensions and do most first and second level screening analytics just using the US-GAAP subset. This replicates – if not surpasses – the detail coming out of the best of the fundamental feeds. Besides, if you are doing M&A diligence, you are looking at more than just financial reporting filings anyway.

There’s always been the notion among analysts that the first true use of XBRL filing exhibits would be based on subset analytics as opposed to extreme diligence tracing of every nuance item in a filing. From what we see, the June 2011 filings population should provide the first near-census test opportunity to do aggregate and sector analysis on U.S. public companies where the data goes directly from SEC’s EDGAR system to the research department without needing to pass through an intermediary processor. And it is chain-of-process traceable to the government evidentiary source. We like that!

Second, from our cursory inspection we believe many of these company-created extensions are undoubtedly common in nature. They can – with proper effort – be aligned into new standardized taxonomy elements over time. As these winnow down, we expect what remains will be the types of specifics that are truly company unique. Still, seeing a million data elements to catalog is once again a good lesson to all that in data management, for information to be usable, less is more.

Monday, January 24, 2011

SEC Interactive Data: Approaching Usefulness in 2011

By the second quarter of 2011, we will see another wave of machine-to-machine interactive financial data become available directly from the U.S. government. In June 2011, approximately 8,700 companies will begin to file supplementary “xml” files accompanying their quarterly and annual financial statement filings with the Securities and Exchange Commission. XML files can be read directly by computers, allowing instant absorption by the analysis programs – institutional and individual – used to evaluate public companies. At its most grandiose, it means that investors, litigants and policy makers will be able to examine and assess the official legal version of these filings as fast as Regulation FD (Fair Disclosure) will allow.

This latest technological jump in financial information transparency is the result of a number of years of work by the SEC. The process included developing a specific sub-dialect of xml called XBRL, a multi-year exercise to turn the free-form reporting of the U.S. Securities Act into a workable, codified set of data construction rules. Because it blends management statements, the legal requirements of speaking to both “Financial Statements” and “Safe Harbor” discussions, and the numerical precision of form-like data enhanced by company-unique extensions, it is the most complex attempt of this type to date.

Not that the SEC is a stranger to XML. It originally started the process by requiring the relatively simple Forms 3, 4 and 5, which report on company ownership, to be submitted per an XML specification via an online form. Hundreds of thousands of these documents have been filed, and machines capable of reading them can make transparent who holds what at even the most complex companies. The SEC is also working on the XFDL specification that will codify many more form types submitted to the SEC.

Prior to this, the most complex government financial data collection exercise took place in the parallel universe of the U.S. Banking Act. The FFIEC had brought the reporting of FDIC Call Reports into a Central Data Repository (CDR) and pioneered a three-tongued publishing methodology that delivered perfect synchronization among a human-viewable HTML file and two computer-readable data files, one in xml format and one in CSV format, thus covering 99.99% of possible downstream analysis interfacing cases. It set a high standard for all direct government-to-public dissemination to follow.

Getting Ready for Prime Time

One of the true tests of a new idea is whether or not it still works when one shuts down all the “experimental versions” of the process. The U.S. government version of this is beginning this phase now. Being analysts who use filings data to assess companies as opposed to XBRL developers seeking to make a living by filing documents with the SEC, we decided the time has come to do a “production acceptance test” of things as we wait for June 2011 to arrive.

Test number one was to ignore all experimental filings data feeds from the SEC and ask the key question, “Is it possible to find and catalog these documents using only the official EDGAR Accessions file?” And our favorite follow-up government accessibility and transparency question, “Is the hurdle to do that sufficiently low that anyone can do it for free or near free?” This is a critical operational issue because, in the end, if it doesn’t work via a truly publicly accessible librarian pathway, it isn’t soup yet.

We are happy to report that the answer is yes and yes. One needs nothing more than the SEC Accessions Catalog file to generate a complete table of URL pointers to every xml filing. It’s implementable as a lights-out program, and we plan to create a look-up utility with the link pointers to all the xml support files that will be incorporated into our IRACorpFilings.com site. Each filing has a set of xml files that together constitute the XBRL submission supplement to the main filing document.
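To make the lights-out idea concrete, here is a minimal sketch of turning one accession catalog row into the URL of the filing folder where the XBRL xml exhibits live. The row layout and archive URL pattern shown are assumptions to be checked against current EDGAR documentation, not guaranteed specifics.

```python
# Minimal sketch: turn an EDGAR accession catalog row into the URL of the
# filing folder where the XBRL xml exhibits live. The field order in the
# row and the archive URL pattern below are assumptions to be verified
# against current EDGAR documentation, not guaranteed specifics.

EDGAR_ARCHIVE = "https://www.sec.gov/Archives/edgar/data"

def filing_folder_url(cik, accession_number):
    """e.g. cik='1234567', accession_number='0001234567-11-000123'."""
    folder = accession_number.replace("-", "")
    return f"{EDGAR_ARCHIVE}/{int(cik)}/{folder}/"

def parse_catalog_row(row):
    """Assumed pipe-delimited layout: CIK|Company|Form|Date|Accession."""
    cik, _company, form, _date, accession = row.split("|")
    return form.strip(), filing_folder_url(cik, accession.strip())

form, url = parse_catalog_row(
    "1234567|Example Corp|10-K|2011-02-28|0001234567-11-000123")
print(form, url)
```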

The SEC’s implementation of downstream transmission support does not presently include a CSV check-file version of the data alongside the xml, as is done by the FFIEC. For one thing, that makes it slower to process back into an RDBMS, but that’s just an inconvenience and not a show stopper. What bothers us is that we would like to see a CSV check file accompanying the xml file set – or at least the main xml file with the blocked data elements in it – because having two machine-readable versions of the same output file from the evidentiary source would help immensely for downstream users who need to automate testing for internal consistency in the incoming reports. We recommend that SEC OID look into this as a production feature to come online, hopefully by the end of 2011.
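To show why the CSV companion matters, here is a hedged sketch of the internal-consistency test a downstream user could automate if both renderings were published. The file names, tag handling and CSV layout are hypothetical; as noted, the SEC does not currently publish such a check file.

```python
# Hedged sketch of the cross-check a CSV companion file would enable:
# compare each fact pulled from the xml instance against the same fact in
# a CSV rendering and flag mismatches. File names, tag names and the CSV
# layout are hypothetical.
import csv
import xml.etree.ElementTree as ET

def facts_from_xml(path):
    """Collect {tag: text} for leaf elements in the instance document."""
    facts = {}
    for elem in ET.parse(path).getroot().iter():
        if len(elem) == 0 and elem.text and elem.text.strip():
            facts[elem.tag.split("}")[-1]] = elem.text.strip()
    return facts

def facts_from_csv(path):
    """Assumed two-column layout: tag,value."""
    with open(path, newline="") as f:
        return {tag: value for tag, value in csv.reader(f)}

def mismatches(xml_path, csv_path):
    xml_facts, csv_facts = facts_from_xml(xml_path), facts_from_csv(csv_path)
    return {t: (xml_facts[t], csv_facts[t])
            for t in xml_facts.keys() & csv_facts.keys()
            if xml_facts[t] != csv_facts[t]}

# Hypothetical file pair for one filing.
print(mismatches("example-20101231.xml", "example-20101231.csv"))
```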

We did note that the earliest 1,503 companies from the first and second waves of these filings did something odd … to us anyway. They prefixed the filenames of their xml files with their stock ticker symbols, an identifier that is not an internally verifiable construct – the CIK is the real U.S. Securities Act legal identifier and is already in the header of the filing. You have to look up the stock symbol using an external “private” source, and we flagged it as something that will become a “human reader” issue later on. There will come a point when the filers reach beyond just the major exchange-traded public companies to what we like to refer to at IRA as the remainder of affected SEC Registrants.

It’s not an issue for machine-to-machine reading, by the way. Computer programs don’t read; any unique string of text constituting a valid filename is sufficient. The bottom line is that locating usable URL links to XBRL xml file sets in an SEC filing is not a make-or-break issue requiring any sort of global Legal Entity Identifier. The xml files accompanying each filing could be named “Fred” and could still be successfully targeted by any well-programmed computer. The SEC Accession Catalog is dandy, and we look forward to our program – and ones written by others – reading out and databasing the links to xml from these filings as they continue to appear.

Next installment, we’ll talk about what’s in the files themselves and what we think about using them to do surveillance and assessment analytics. Once you know where the files are the next question is, “Can you do anything with them besides print them out?” The real value after all is in the distillation.

Thursday, January 20, 2011

A Deepening Dearth of Lending

I was on Canadian network BNN last week. It is earnings season for banks, and as much as I loathe talking about economic safety and soundness through the distorting lens of equities, I attempted to field questions. It seems that “earnings per share” is looking better at some banks this quarter, and people are asking if the time is “now” to get in on the gamble. The buzz must be hot to get people to pony up, because I’m getting emails asking if I think this is the bottom of the well. Just a reminder: brokerages earn a living by charging commissions on the volume of transactions, not on the gain or loss of the investment.

As I tried to explain on air while picking through the lint in my belly button, I’m not sure that today is the banking sector’s equivalent of the day Ford was at $1.00/share. We’re still seeing a lot of accounting-based earnings coming from numbers in a computer being moved from one ledger to another, creating what – in the time of Sarbanes-Oxley (remember that?) – would be categorized as one-time events. As for me, I still see banking as a supporting-cast service provider to the economy. I’m waiting to see indicators of fundamental change in the direction of Main Street. Everything else is what the Wall Street townies call “optics” when happy hour comes around.

The continuing decline of domestic economic reinvestment

It’s not looking all that great on Main Street. Domestic economic reinvestment continues to slumber like Sleeping Beauty waiting for true love’s kiss. It’s weird really. How else can one explain the juxtaposition of “exceeds analysts expectations” with “six one-hundredths of a percent of real growth” in the same news cycle? These are the times when the solace of perspective is best found by ignoring volatility and focusing on the deeper trend lines.

Just so you know, the overall amount of commercial and industrial lending by banks in the United States eroded by about a third, from roughly $1.4 trillion in Dec-2007 to a bit over $1 trillion at the end of Sep-2010. Not to make light of a $400 billion loss in going-concern domestic economic investment by the banking system, but the really shocking numbers are in the unused line of credit commitments of banks to U.S. business. This is the canary number I like to look at because it is a direct expression of banking and finance confidence in Main Street industry. It’s gone from $92 billion in Dec-2007 to just $24 billion as of Sep-2010. More importantly, the vast majority of this contraction in credit availability to American industry has come from the larger banks: C&I LOC fell from $87B to $18.8B at the institutions with assets over $10B. Poof!
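For readers who want the percentages behind those figures, here is a quick back-of-the-envelope pass over the rounded numbers quoted above.

```python
# Back-of-the-envelope contraction percentages computed from the rounded
# figures quoted above (all in billions of dollars).
pairs = {
    "C&I loans outstanding":          (1_400, 1_000),
    "Unused C&I credit lines":        (92, 24),
    "Unused C&I lines, banks > $10B": (87, 18.8),
}
for name, (dec_2007, sep_2010) in pairs.items():
    drop = 1 - sep_2010 / dec_2007
    print(f"{name}: {dec_2007} -> {sep_2010}  ({drop:.0%} decline)")
```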

We are now entering the fourth year of our saga. The kicking the can down the road approach to preserving banking infrastructure as a vital national resource continues. It’s now been husbanded by both a Republican and a Democratic White House. Both have succeeded in preserving banking. The “can” itself – the US domestic economy -- is still getting smaller. Is that really the best plan we can come up with?

So what can you do?

Next time you interact with your bank, ask them to tell you more about what they are doing to expand loan production. Ask them specifically for details about what they are actively doing to clear away their remaining impediments to new lending. Are they modifying or disposing of whatever non-performing assets they have to get themselves back on track? How else are they using their resources to invigorate the Main Street economy? How are those line-of-credit commitments to small business commercial and industrial borrowers coming along toward recovering to pre-2008 levels?

Some bankers will balk that you'd dare to ask such questions. Others will gladly wax on about all the things they are doing to make things better. You'll certainly learn something about who's being a responsible banker and who isn't. Be prepared to be both disappointed and pleasantly surprised.

Bear in mind that these questions aren't about small versus large. They are about discovering where decency and responsibility still are in America. It's there. The task at hand is to find and reward it. Remember that in America the voices of ordinary people still matter. Don't let anyone tell you otherwise.