Wednesday, February 4, 2009

Why Technology is Integral to Legislation

With the current economic indicators and the overall malaise we've found ourselves in, I thought I would use the opportunity to throw out a novel idea. That idea is centered on the need to understand what our government is capable of before we get around to spending hundreds of billions of dollars to fix a problem. This is not a political blog; I think there are enough of those that seek to place blame, and there is plenty to go around. What I'm talking about is the rest of our government that has to implement these grand ideas and somehow try to show the results. I want to keep an open mind, avoid groupthink, and look at solutions not only to this problem, so that we do not let another 'bubble' catch us by surprise, but also to how things should be done from the top down in regulating our financial markets.



The title of this blog entry has to do with technology and legislation (or policy), but before we get too deep into that I will set the stage by discussing how technology is used in the Federal Government today. I've blogged before about how you can use frameworks for an effective BPM-based SOA solution around Governance, Risk and Compliance, and I believe that applies to this issue. The Federal Government has done a good job of providing a defining schema (an XML-based data model) for its budgeting process, and it works quite well (I have programmed Federal budgeting systems with it, so I can attest). Rarely, though, is this schema used other than on a yearly basis, to make programs and projects appear most valuable per the metrics supported in the system. This becomes mainly a black art of spreadsheet 'magic' to position the way spending will benefit the citizen, war fighter, or whatever missions have the most visibility and therefore the higher spending. That is a framework for how the finances of the government are managed as a portfolio. What we are attempting to address here is the financial and operational data used to regulate our nation's markets by agencies like the Federal Housing Administration, Federal Reserve, Treasury, FDIC, and SEC/CFTC.


We'll begin with some discussion of how these and other parts of the government interact to provide oversight of the activities within the private business community that affect our economy. While these interests do have some combined oversight, and one body, OFHEO (now FHFA), even included Fannie Mae and Freddie Mac, it's obvious that the ties that bind them have been woefully inadequate for predicting the overall effect of the mortgage industry on the health of Wall Street, the banks, and therefore the overall economy. There are programs intended to tie them together, such as the FFIEC and the Shared National Credit (SNC) program. I believe the SNC had the best of intentions, as outlined in a 2007 report from the OCC covering some of the financial issues facing the banking system and the economy as a whole. During the last couple of years there was a resurrection of the modernization of the Shared National Credit program, followed by a proposal from Secretary Paulson for the complete restructuring of many of the players involved.


These items are all positive, even if disruptive, but we are up against complexities in this crisis that our government just isn't designed to handle. This blog entry isn't about policy or placing blame, but rather about what I've seen that works and what we should be looking at to institute the best mechanisms moving forward, so that the government is able to handle these complexities, seen or unseen, in the future. After looking longer at policies and proposals, I'm more prone to believe suggestions such as this and this. As you look at the previous links to the LA Times article and the white paper, one theme is clear: new institutions are needed, not just for oversight and enforcement but potentially for actually operating some of the core functions, just as Ginnie Mae has been forced into doing in light of the Fannie Mae and Freddie Mac implosion.


If you look at the documents I referenced earlier from the OCC and the Risk Management Association, one of the themes that runs through them is the incorporation of Basel II or a similar framework to measure probability of default, exposure at default, etc. as a consistent baseline for understanding the way each institution would handle those calculations. The FDIC was averse to Basel II for a while due to the effect of the capital requirements that would be brought to bear on lending institutions, which it saw as unnecessarily burdensome (shown here on slide 37). As one who has an innate affection for frameworks, I will present one here as pretext for the larger argument I'm trying to make, and that is XBRL (which has some additional explanation here).


The FDIC has not only since come around to Basel II but has gone to some lengths to look at XBRL as a solution for standardizing the way financial data is transmitted. The SEC has done some things with XBRL in regards to EDGAR, and you can see here that this is starting to get enriched as it pertains to more diverse banking paradigms, in the case of the mutual fund taxonomy for example. I've done work with the SEC around options using a framework called FIXML, which serves its purpose well. This shows that a single framework isn't necessarily the answer, just as Basel II isn't necessarily the silver bullet either. Take a look at these two postings from The Institutional Risk Analyst in 2006 to see how XBRL pertains to Basel II within the Federal Government:


Here's an excerpt from the first:


IRA’s core mission is to develop and field benchmarking analytics. As a developer of computer enabled data mining tools, we strongly support the advent of publicly available, well-structured or “interactive” data. In the past we have lauded the FDIC’s modernization effort, which now has all FDIC-insured depository institutions submitting quarterly financial reports using eXtensible Business Reporting Language or XBRL. The transparency, completeness, consistency and quality of the FDIC’s bank information pipeline, which is used in our analysis engines to produce uniform benchmarks for Basel II, enables IRA’s “Basel II by the Numbers” report series to serve as a canvas upon which to demonstrate the power of “distilling” structured data.


And one from the second:


Fact is, a growing number of senior people in government are pondering the use of XML-based technology solutions to address the issues like those raised by the Corrigan Group, in particular the issue of gathering sufficient financial statement data about hedge funds and other lightly regulated entities to understand counterparty risk. And the FDIC's use of XBRL for gathering bank data is only one example.
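The kind of structured, machine-readable gathering these excerpts describe is mechanically simple once the data is well-formed XML. As a minimal sketch (the element names, namespace, and figures below are invented for illustration, not taken from any real FDIC or FFIEC taxonomy), extracting facts from an XBRL-style instance document can look like this:

```python
# Minimal sketch: pulling facts out of an XBRL-style instance document.
# The namespace, element names, and values here are hypothetical.
import xml.etree.ElementTree as ET

INSTANCE = """<xbrl xmlns:ci="http://example.org/call-report">
  <ci:TotalAssets contextRef="Q3-2006" unitRef="USD" decimals="0">1250000000</ci:TotalAssets>
  <ci:TotalDeposits contextRef="Q3-2006" unitRef="USD" decimals="0">900000000</ci:TotalDeposits>
</xbrl>"""

def extract_facts(xml_text, namespace):
    """Return {local-name: float} for every fact element in the given namespace."""
    root = ET.fromstring(xml_text)
    facts = {}
    for el in root:
        if el.tag.startswith("{" + namespace + "}"):
            local_name = el.tag.split("}", 1)[1]
            facts[local_name] = float(el.text)
    return facts

facts = extract_facts(INSTANCE, "http://example.org/call-report")
print(facts["TotalAssets"])  # 1250000000.0
```

The point isn't the parsing itself; it's that once every institution files in the same well-structured form, the same few lines of code work against all of them, which is exactly the "transparency, completeness, consistency" the IRA excerpt praises.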


One of the items that starts to emerge here is not only how to effectively rate complex banking institutions like hedge funds, but also, looking back at the OCC document, concerns about how to regulate traditional depository institutions like a Bank of America when acquisitions such as Countrywide begin to conglomerate (under Horizontal Reviews of Large Banks in the OCC document). Moving into 2007 you start to see the sobering writing on the wall, as seen here, where it is more clearly understood how tied the performance of credit derivatives like credit default swaps (CDSs) and collateralized debt obligations (CDOs) was to the real estate market, specifically sub-prime and speculative mortgages. If you are not up to speed on how this meltdown occurred, here is a crude animation on the 'evolution' of the problem.


When you take this to the macro level, where the government should be managing Shared National Credit risk, you find a lag problem: indicators like those you see from the Bureau of Labor Statistics are simply good indicators of what's already happened, as are the economists' data coming from places like HUD. They are not, however, a good indicator of what is to come when what is coming is unique and, as a pattern, somewhat unidentifiable. To effectively spot a contagion you need the most accurate data, in a format you can consistently retrieve and integrate for predictive analytics. There are great data mining operations going on in all of these institutions, and there are vendors like UBMatrix that provide tools on which XBRL solutions like the FFIEC Call Report can be built.
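To make the "consistently retrieve and integrate" point concrete: once filings arrive in one uniform format, cross-institution aggregation reduces to a trivial fold over records. The sketch below is illustrative only; the field names and figures are invented, not drawn from any real call report.

```python
# Sketch: with filings normalized to one format, a system-wide concentration
# metric is a one-line aggregation. All field names and numbers are hypothetical.
quarterly_filings = [
    {"bank": "A", "mortgage_exposure": 40.0, "total_assets": 100.0},
    {"bank": "B", "mortgage_exposure": 75.0, "total_assets": 120.0},
    {"bank": "C", "mortgage_exposure": 10.0, "total_assets": 80.0},
]

def system_concentration(filings, exposure_field, assets_field):
    """Share of the whole system's assets tied up in one exposure class."""
    total_exposure = sum(f[exposure_field] for f in filings)
    total_assets = sum(f[assets_field] for f in filings)
    return total_exposure / total_assets

ratio = system_concentration(quarterly_filings, "mortgage_exposure", "total_assets")
print(round(ratio, 3))  # 0.417
```

A regulator watching that ratio trend quarter over quarter, rather than reconciling incompatible spreadsheets after the fact, is the difference between a lagging and a leading indicator.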


Going back to the first posting from The Institutional Risk Analyst, I believe that major vendors in this space like IBM, Oracle, Microsoft, Fujitsu, etc., coupled with the advances in storage mechanisms for XML, will render the concern in the following statement:


We rub our worry beads pondering the anthropology of innovation, each component developed piecemeal and each maturing to serve the interactive data space. Not unexpectedly, we see evidence of classic early adoption myopia -- competing solutions ignoring each other’s value, while pushing, at times aimlessly, in the hope of owning as much of the interactive data real estate as possible. We know from experience that the “one wrench does it all” approach hurts rather than helps the adoption of interactive data as a resource to the financial community. We believe there needs to be more context as to what functional purpose a technology has to each step in the value pipeline – collection, validation, storage, distillation & dissemination – over which data travels from source to user.


somewhat ameliorated, through methods to handle schema evolution coupled with the XBRL organization maintaining the technology artifacts that represent the line of business involved.


And from the second posting from The Institutional Risk Analyst related to risk modeling:


To us, the chief obstacles preventing regulators and risk managers from understanding the nature of the next systemic tsunamis are 1) over-reliance on statistical modeling methods and 2) the use of derivatives to shift and multiply risk. Of note, continued reliance on VaR models and Monte Carlo simulations is enshrined in the Basel II proposal, the pending rule revision on CSFTs and the SNC proposal. All share an explicit and common reliance on statistical methods for estimating the probability of a default or P(D), for example. These ratings, in turn, depend heavily upon stability in the assumptions about the likely size and frequency of risk events. None of these proposed rules focus great attention or resources on assessing specific obligor behavior.


An XBRL-based SOA underpinning this new framework adds discrete event simulation capabilities, giving us the ability to use computing models to play 'games' the way the Department of Defense does, something I've blogged about here. In addition, statisticians and economists could use this data in aggregate to measure true national credit and risk factors more accurately.
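As a toy illustration of the kind of aggregate simulation such a platform could feed, consider a Monte Carlo run over a pool of obligors, each carrying a probability of default P(D) and an exposure at default (EAD), as in the Basel II vocabulary the IRA excerpt uses. The obligor data below is invented; in the scheme argued for here, it would come from the standardized filing pipeline rather than a hand-built spreadsheet.

```python
# Toy Monte Carlo sketch: simulate aggregate losses on a pool of obligors.
# Each obligor has a probability of default (pd) and exposure at default (ead).
# All obligor data is hypothetical, for illustration only.
import random

obligors = [
    {"pd": 0.02, "ead": 10.0},
    {"pd": 0.10, "ead": 5.0},
    {"pd": 0.30, "ead": 2.0},
]

def simulate_losses(obligors, trials, seed=42):
    """Return the simulated total loss for each trial."""
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        loss = sum(o["ead"] for o in obligors if rng.random() < o["pd"])
        losses.append(loss)
    return losses

losses = simulate_losses(obligors, 10_000)
expected = sum(losses) / len(losses)  # should approach 0.02*10 + 0.10*5 + 0.30*2 = 1.3
var_99 = sorted(losses)[int(0.99 * len(losses))]  # crude 99th-percentile loss
print(round(expected, 2), var_99)
```

The IRA's warning still applies: the output is only as good as the stability of the P(D) assumptions going in, which is precisely why pairing the statistics with specific obligor behavior data matters.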


Another from the second posting from The Institutional Risk Analyst related to oversight of the risk calculations:


Thus the urgency in some corners of Washington regarding revisions to SNC, including a quarterly reporting schedule and enhanced disclosure of counterparty financial data. Remember that one of the goals of the SNC enhancements is to gather private obligor P(D) ratings by banks and to aggregate same to build a composite rating system for regulators to use to assess counterparty risk. That is, the creation of a privileged data rating matrix which could be used to assess the efficacy of both bank internal ratings and third party agency P(D) ratings alike. More on this and the effect of derivatives on visible bank loan default rates in a future comment.


Even though some say SOA is dead, I know the platform is very much alive, with products like this and this, which I worked on while at Oracle and which are the underpinnings of Basel II solutions such as this. While Basel II isn't the silver bullet here, it is being recommended that it should stick around. Basel III won't necessarily be the answer either, but what we have is a method to surface the data artifacts of XBRL into processes (including business intelligence for items like risk calculations) that are easily mapped and understood at larger and larger scopes. That is really the beauty of these XML-based frameworks, and I've had the pleasure of implementing others like AIXM, HL7 v3 and NIEM, which support native message types and processes for, as examples, airlines reporting to the FAA or doctors to the FDA (and all applicable points in between). The resulting instances of these items become instantly transparent, easing the need to harmonize them for understanding in the process of oversight.


Back to the last paragraph of the second IRA posting which begins to delve into policy:


Bankers, after all, are not very good at understanding future risks, no matter how many ERM consultants they hire, default risk software implementations they direct, or meetings they attend at the Federal Reserve Bank of New York. Even making accurate observations about the present day risk events seems to be a challenge. Witness the fact that commercial bankers as a group managed to direct more than $2 out of every $3 in political contributions this year to Republican members of Congress, even as the GOP looks ready to lose control over the House and perhaps even the Senate. When Barney Frank (D-MA) is Chairman of the House Committee on Financial Services, perhaps the industry will take notice of this operational risk event and adjust accordingly.


Obviously this article is from 2006, and we've since moved to a Democrat-controlled Congress and White House. In fact, the gentleman in charge of the Federal Reserve Bank of New York at that time is now the new Secretary of the Treasury. Tim Geithner had this to say in 2006:


"Credit derivatives have contributed to dramatic changes in the process of credit intermediation, and the benefits of these changes seem compelling. They have made possible substantial improvements in the way credit risk is managed and facilitated a broad distribution of risk outside the banking system. By spreading risk more widely, by making it easier to purchase and sell protection against credit risk and to actively trade credit risk, and by facilitating the participation of a large and very diverse pool of non-bank financial institutions in the business of credit, these changes probably improve the overall efficiency and resiliency of financial markets. With the advent of credit derivatives, concentrations of credit risk are made easier to mitigate, and diversification made easier to achieve. Credit losses, whether from specific, individual defaults or the more widespread distress that accompanies economic recessions, will be diffused more broadly across institutions with different risk appetite and tolerance, and across geographic borders. Our experience since the introduction of these new instruments—a period that includes a major asset price shock and a global recession—seems to justify the essentially positive judgment we have about the likely benefits of ongoing growth in these markets."


While I'm trying not to place blame for the current state of legislation or the operation of government, as 'it is what it is,' to put it bluntly: there is no possibility that you can prescribe legislation, take its goals and objectives (measured semi-annually by OMB), turn them over to an agency or agencies whose top officials may change every four years, let their CIOs and others run competitive bidding among the usual suspects around the Beltway, and then expect different results. In fact, quite the opposite: we've compounded issues we can't fully understand because of a lack of transparency, not just of government and the oversight of industry, but of the overarching process models we have for doing business (risk models, etc.) and how they are audited by the government.


At the end of the day policy makers do things that sound appropriate, and Sarbanes-Oxley is a good example: it was passed to combat the abuses of Enron, WorldCom and others. The unintended consequences, sometimes in the form of a false sense of security, are often the ones that end up biting you the worst. The problem as I see it is that the institutions involved in the current crisis deal in finance specifically, not in other lines of business that yield financial results. Not that these companies weren't subjected to the same policies; only that valuation was the root of this crisis. There is blame to go around here, from the housing policy that said banks should lend to the unqualified, to the minions who became real estate speculators as a second job, to the financial institutions that packaged, re-packaged and sold this debt. Since these complex financial instruments are the backbone of this contagion, it's virtually impossible to 'unwind' them at this point, and most of them are at some point tied to mortgages. Dealing with this part of the problem could allow for stabilization of the situation to a certain extent.


Looking at what's been done on housing policy thus far, I don't see anything wrong with a forced stoppage of foreclosures, although after having worked at FHA for the better part of 2008 I can tell you that hardly anyone even remembers Hope for Homeowners or its revisions for 'flexibility'. That's not to say these things were and are without noble intentions, but if we look back in history we see that HUD has shaped homeownership policy, at times to the detriment of the very banks in trouble today, and FDIC has been in receivership of these banks as well (IndyMac comes to mind as a good example of an institution straddling that duality). If we look at the results of Hope for Homeowners, we see that while the legislation targeted 400,000 homeowners, only 25 have actually leveraged the relief it offered. Of course, one of the unintended consequences was that FHA was able to hire many employees with the $25 million provided for implementation. This is significant because HUD and its largest program, FHA, have no budget for shared IT modernization: the entire pot (~$50 million per year) goes to maintaining the ~50 mainframe applications running the systems there, which take 18 months and many millions more for the simplest of changes to support new operational goals. Looking at the future, and at what's happening with Federal Student Aid (who, like HUD, don't even own their own data... indeed, YOUR own data) and Sallie Mae, there is another wave of this economic tsunami headed our way, not to mention the additional Adjustable Rate Mortgages that are about to reset, hopefully at a reasonable enough rate to keep qualified homeowners in their homes, or with some subsidies to keep potentially unqualified ones there as well.


Given what is happening to the banking industry at large, due mostly to mortgage lending and securities derived from mortgages, it's tough to make an argument against nationalization, against making Bank of America the real 'Bank of America,' in lieu of continuing to feed them money and turning them into 'zombies' as described in this paper. With the regulation and commoditization of strictly depository banking, much like local incumbent telecom companies, serving up a local telephone line or checking account isn't viable as a growth business. It could be time to create some fresh banks, seeing as the Federal Reserve Board, Treasury and FDIC are really the mother of all banks anyway. Let the bad performers die, let the government use these funds to start a shadow banking and mortgage underwriting system, use new technology to do it right this time, and then turn those entities back into commercial ones after the bad ones get valuation and/or simply die. I find it hard to believe that anyone would care whether they banked with Wells Fargo or some government version of a depository institution, but they would certainly care if their bank was insolvent, as most of them are today, yet still getting ongoing support when they should be allowed to fail. The other financial operations that deal in equities, insurance, risk and other financial sub-sectors would be in a position, as many like JP Morgan are now, to perform many levels of financial services, including acquisition of insolvent depository institutions like Washington Mutual.


When you really look at this problem you start to understand that people, and the companies they run, when left to their own devices, will end up with conflicts of interest without consistent, thorough and timely oversight. Who 'polices the police,' as they say? Additional oversight from our government agencies and their respective Offices of Inspector General, along with the Government Accountability Office, will just never be enough. With the new paradigm presented in this blog encoded in their DNA, the government would have the ability to re-organize its enforcement staffs into a cohesive model that fits the institutions they are regulating, while allowing them the flexibility to morph as those institutions likely will in the Brave New World we are facing. This frees up capitalism to move on about its merry way to recovery, even if the depository side of banking and mortgages, in the form of Freddie, Fannie and Ginnie, all need to stay 'governmentized' for a while, until the free market is able to sort out the mess the last debacle leaves behind. Using techniques like this we can make sure these items are spun off for good and, perhaps most importantly, no longer considered GSEs, all while giving them the proper policy oversight.


At some point the right solution will be realized, perhaps when we come up with a price index and allow all homeowners (those who were rightfully financed in the first place) to refinance into a 10-year adjustable or 30-year fixed product at the adjusted home value. Before you dismiss the idea, consider: otherwise, what will stop someone with good credit from moving down the street to a nicer house for less than what they owe on their current mortgage? This would allow the bank and homeowner to share any increase in value over the coming years, up to the original value of the mortgage, at which point the homeowner would receive the additional equity, or perhaps some tapering share of it. Interest rates would remain low for some time to allow for these loans, and the 10- and 30-year products would hopefully put homeowners outside the time horizon of the huge interest rate hikes that will undoubtedly occur to fight inflation. Homeownership would be tough for a few years while interest rates are going up, but the banks would have sound balance sheets, and at least the CDOs could be unwound and the credit default swaps absorbed. At some point all would return to homeostasis.
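The shared-appreciation idea above can be made concrete with some back-of-the-envelope arithmetic. Every figure below is invented for illustration, and the 50/50 split is just one possible choice of sharing rule:

```python
# Back-of-the-envelope sketch of the shared-appreciation refinance idea.
# All figures are hypothetical, and the 50/50 split is an assumed sharing rule.
original_mortgage = 300_000.0  # what the homeowner owes today
adjusted_value    = 220_000.0  # home value per the new price index
bank_share        = 0.50       # bank's share of appreciation up to the old balance

def split_appreciation(sale_price):
    """Split any recovery between bank and homeowner under the sketch above:
    appreciation up to the original balance is shared; anything beyond the
    original balance belongs entirely to the homeowner."""
    shared_gain = max(0.0, min(sale_price, original_mortgage) - adjusted_value)
    bank = shared_gain * bank_share
    homeowner = (shared_gain - bank) + max(0.0, sale_price - original_mortgage)
    return bank, homeowner

print(split_appreciation(260_000))  # home recovers partway toward the old balance
print(split_appreciation(320_000))  # home recovers past the old balance
```

Under these made-up numbers, a sale at $260k splits the $40k recovery evenly, while a sale at $320k gives the bank $40k and the homeowner $60k ($40k shared plus the $20k above the original balance). The knob worth debating is `bank_share` and whether it tapers over time, as suggested above.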


What we need is the ability not only to establish a common 'language' around these goals, objectives and measures, but also levels of process models that ensure how they will be carried out. The main components can be put into a process model that decomposes to another level, and eventually into the implementation of the systems that facilitate the negotiation of complex instruments by presenting counterparty risk in aggregate each time they are bought and sold. More importantly, oversight and measures of efficiency for what the government may be doing, for example to bail these institutions out, would be immediately available. A simple diagram of how these levels of complexity and volume decompose is shown here:


Effectively this would make multiple iterations of the Troubled Asset Relief Program (TARP) not only inherently transparent, but would also let them be conducted on a transactional basis from the funds set aside to perform the duties assigned by the legislative policy. Anyone who believes that TARP, a National ID Card, or an electronic medical record maintained by the government can be devised, funded, implemented, managed and reported on with adequate oversight, so that it accomplishes the goals originally intended without instigating other, possibly worse side effects, is not being realistic, or needs to be educated as to why it's impossible to just let 'the smart people at IBM take care of it'. At some point, while we may stop foreclosures or even subsidize mortgage payments, that will not stop what has devolved into the end of a game of musical chairs where someone has taken all of the chairs. Whatever the solution, we are all in this together: homeowners, banks and government. The solution should allow all three to participate and to have visibility into results on a real-time basis, to rebuild the trust within our capitalist society. Otherwise, government will spend more money and not accomplish the desired results, and banks will foreclose on more homes and commercial properties as their capital levels are fortified by the government while they wait for an uptick in housing to sell off foreclosed inventory. The problem there is that the new homeowners won't exist, because there won't be an economy with jobs to support them. We'd better get the smart people on this and allow them to participate in how we solve it, implementing technology at every step in the process from legislation forward to ensure success. We don't have enough money in the whole world to support feeding this problem as it exists now.
Otherwise we had better be prepared for more Orwellian debacles yet to come and, perhaps most importantly (especially without techniques such as those espoused here in this blog), we won't see the full impact of their aggregate perils until it's too late.


In conclusion, I'm essentially sounding the alarm: while the things coming out of Congress can be debated to great end regarding their intentions or fairness, they cannot be measured ahead of time for their efficiency in addressing the problem(s) at hand, and periodic measurement of aggregated efficiency, which could be construed as 'effectiveness,' just isn't agile enough. There isn't the kind of ammunition left to keep firing $1 trillion birdshot with the double-barreled sawed-off that we call the Treasury and Federal Reserve to clean up this mess. What we need is a fresh start with a few well-placed 7mm sniper rounds to solve some of these systemic issues. I'm not suggesting we throw caution to the wind and adopt some Isaac Asimov state of machine rule, nor am I suggesting that I should be the next ruler of the free world because I understand how these systems work and, more importantly, how they should work to support new initiatives. I'm not sure how the rest of the world feels about a technocracy, but it's obvious our Federal Government is far from one at this point. Keep in mind that IT spending for the entire Federal Government is only around $78 billion, which is only 10% of the new stimulus bill just passed by Congress. What I'm saying is that in a world where we are more and more dependent on technology, we cannot let the inefficiencies of government permeate the implementation of the new programs, especially the IT that is mainly responsible for 'making the trains run on time,' as it were. We need a new era of the President's Management Agenda; a Federal CTO who oversees FOIA and the like is going to fall way short of enabling technology that can support the goals of legislation, let alone mitigating the risks (doing away with them, in an ideal world) of unintended consequences by providing a framework for a 'line of sight' when tweaking policy, with automatic, instant transparency, neither of which would otherwise be provided.
