
Wednesday, February 4, 2009

Why Technology is Integral to Legislation

With the current economic indicators and the overall malaise we've found ourselves in, I thought I would use the opportunity to throw out a novel idea. That idea is centered on the need for an understanding of what our government is capable of before we get around to spending hundreds of billions of dollars to fix a problem. Now this is not a political blog, as I think there are enough of those that seek to place blame, and there is plenty to go around. What I'm talking about is the rest of our government that has to implement these grand ideas and somehow try to show the results. My aim is to keep an open mind, avoid groupthink and look at solutions not only to this problem, so that we do not let another ‘bubble’ catch us by surprise, but also to how things should be done from the top down in regulating our financial markets.



The title of this blog entry has to do with technology and legislation or policy, but before we get too deep into that I will set the stage by discussing the state of technology as used in the Federal Government today. I've blogged before about how you can use frameworks for an effective BPM-based SOA solution around Governance, Risk and Compliance, and I believe that applies to this issue. The Federal government has done a good job in providing a defining schema (an XML-based data model) for its budgeting process, and it works quite well (I have programmed Federal budgeting systems with it, so I can attest to that), but this schema is rarely used other than on a yearly basis to make programs and projects appear most valuable per the metrics supported in the system. That exercise becomes mainly a black art of spreadsheet 'magic' to position spending as benefiting the citizen, war fighter or whichever missions have the most visibility and therefore the higher spending. That is a framework for how the finances of the government are managed as a portfolio. What we are attempting to address here is the financial and operational data used to regulate our nation’s markets by agencies like the Federal Housing Administration, the Federal Reserve, Treasury, FDIC and the SEC/CFTC.


We’ll begin with some discussion of how these and other parts of the government interact to provide oversight of the activities within the private business community that affect our economy. While these interests do have some combined oversight, and in one case, OFHEO (which now hails as FHFA), it even included Fannie Mae and Freddie Mac, it’s obvious that the ties that bind them have been woefully inadequate to predict the overall effect of the mortgage industry on the health of Wall Street, the banks and therefore the overall economy. There are programs intended to tie them together, such as the FFIEC and the Shared National Credit program. I believe the SNC had the best of intentions, as outlined in a 2007 report from OCC covering some of the financial issues facing the banking system and the economy as a whole. During the last couple of years there was a resurrection of the modernization of the Shared National Credit program, followed by a proposal from Secretary Paulson for the complete restructuring of many of the players involved.


These items are all positive, even if disruptive, but we are up against complexities in this crisis that our government just isn't designed to handle. This blog entry isn't going to be about policy or even placing blame, but rather about what I've seen work and what we should be looking at to institute the best mechanisms moving forward, so that the government is able to handle these complexities, seen or unseen, in the future. After looking longer at policies and proposals I'm more prone to believe suggestions such as this and this. As you look at the previous links, to an LA Times article and a white paper, one theme is clear: new institutions are needed not just for oversight and enforcement but potentially for actually operating some of the core functions, just as Ginnie Mae has been forced into doing in light of the Fannie Mae and Freddie Mac implosion.


If you look at the documents I referenced earlier from OCC and the Risk Management Association, one of the themes that runs through them is the incorporation of Basel II or a similar framework to measure probability of default, exposure at default, etc. as a consistent baseline for understanding the way each institution would handle those calculations. FDIC was averse to Basel II for a while due to the effect of the capital requirements that would be brought to bear on lending institutions, which it saw as unnecessarily burdensome (shown here on slide 37). As one who has an innate affection for frameworks, I will present one here as pretext for the larger argument I'm trying to make, and that is XBRL (which has some additional explanation here).


FDIC has not only since come around to Basel II but has gone to some lengths to look at XBRL as a solution for standardizing the way financial data is transmitted. SEC has done some things with XBRL in regard to EDGAR, and you can see here that this is starting to get enriched as it pertains to more diverse banking paradigms, in the case of the mutual fund taxonomy for example. I've done work with the SEC around options using a framework called FIXML, which serves its purpose well. This shows that a single framework isn't necessarily the answer, just as Basel II isn't necessarily the silver bullet either. Take a look at these two postings from The Institutional Risk Analyst in 2006 to see how XBRL pertains to Basel II within the Federal Government:


Here's an excerpt from the first:


IRA’s core mission is to develop and field benchmarking analytics. As a developer of computer enabled data mining tools, we strongly support the advent of publicly available, well-structured or “interactive” data. In the past we have lauded the FDIC’s modernization effort, which now has all FDIC-insured depository institutions submitting quarterly financial reports using eXtensible Business Reporting Language or XBRL. The transparency, completeness, consistency and quality of the FDIC’s bank information pipeline, which is used in our analysis engines to produce uniform benchmarks for Basel II, enables IRA’s “Basel II by the Numbers” report series to serve as a canvas upon which to demonstrate the power of “distilling” structured data.


And one from the second:


Fact is, a growing number of senior people in government are pondering the use of XML-based technology solutions to address the issues like those raised by the Corrigan Group, in particular the issue of gathering sufficient financial statement data about hedge funds and other lightly regulated entities to understand counterparty risk. And the FDIC's use of XBRL for gathering bank data is only one example.


One of the items that starts to emerge here is not only how to effectively rate complex banking institutions like hedge funds, but also, looking back at the OCC document, concerns about how to regulate traditionally depository institutions like a Bank of America when acquisitions such as Countrywide begin to conglomerate (under Horizontal Reviews of Large Banks in the OCC document). Moving into 2007 you start to see the sobering writing on the wall, as seen here, where it is more clearly understood how tied the performance of credit derivatives like credit default swaps (CDSs) and Collateralized Debt Obligations (CDOs) was to the real estate market, specifically sub-prime and speculative mortgages. If you are not up to speed on how this meltdown occurred, here is a crude animation on the 'evolution' of this problem.


When you take this to the macro level, where the government should be managing the Shared National Credit risk, you find a lag problem: indicators like those you see from the Bureau of Labor Statistics are simply good indicators of what has already happened, as are the economists' data coming from places like HUD. They are not, however, good indicators of what is to come when what is coming is unique and, as a pattern, somewhat unidentifiable. To be able to effectively spot a contagion you need the most accurate data in a format you can consistently retrieve and integrate for predictive analytics. There are great data mining operations going on in all of these institutions, and there are vendors like UBMatrix that provide tools that XBRL solutions like the FFIEC Call Report can be built on.
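To make the "consistently retrieve and integrate" point concrete, here is a minimal sketch, using nothing but Python's standard library, of pulling numeric facts out of an XBRL instance document. The file name and tag handling are illustrative assumptions; real Call Report taxonomies define thousands of concepts, and production work would use a full XBRL processor, but it shows why schema-backed filings can flow straight into an analytics pipeline:

```python
# A minimal sketch (file name and data are hypothetical): extracting numeric
# "facts" from an XBRL instance document. XBRL facts carry a contextRef
# attribute, which distinguishes them from context and unit declarations.
import xml.etree.ElementTree as ET

def extract_facts(instance_path):
    """Return {concept_name: value} for the numeric facts in an instance."""
    facts = {}
    for elem in ET.parse(instance_path).getroot():
        if "contextRef" not in elem.attrib:
            continue  # skip <context> and <unit> declarations
        name = elem.tag.split("}")[-1]  # strip the XML namespace prefix
        try:
            facts[name] = float(elem.text)
        except (TypeError, ValueError):
            pass  # ignore non-numeric facts (text blocks, dates)
    return facts

# e.g. facts = extract_facts("call_report_q4.xml")  # hypothetical filing
```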


Going back to the first posting from The Institutional Risk Analyst, I believe that the major vendors in this space like IBM, Oracle, Microsoft, Fujitsu, etc., coupled with the advances in storage mechanisms for XML, will mean that the concerns in the following statement:


We rub our worry beads pondering the anthropology of innovation, each component developed piecemeal and each maturing to serve the interactive data space. Not unexpectedly, we see evidence of classic early adoption myopia -- competing solutions ignoring each other’s value, while pushing, at times aimlessly, in the hope of owning as much of the interactive data real estate as possible. We know from experience that the “one wrench does it all” approach hurts rather than helps the adoption of interactive data as a resource to the financial community. We believe there needs to be more context as to what functional purpose a technology has to each step in the value pipeline – collection, validation, storage, distillation & dissemination – over which data travels from source to user.


can and will be somewhat ameliorated by methods to handle schema evolution, coupled with the XBRL organization maintaining the technology artifacts that represent the lines of business involved.


And from the second posting from The Institutional Risk Analyst related to risk modeling:


To us, the chief obstacles preventing regulators and risk managers from understanding the nature of the next systemic tsunamis are 1) over-reliance on statistical modeling methods and 2) the use of derivatives to shift and multiply risk. Of note, continued reliance on VaR models and Monte Carlo simulations is enshrined in the Basel II proposal, the pending rule revision on CSFTs and the SNC proposal. All share an explicit and common reliance on statistical methods for estimating the probability of a default or P(D), for example. These ratings, in turn, depend heavily upon stability in the assumptions about the likely size and frequency of risk events. None of these proposed rules focus great attention or resources on assessing specific obligor behavior.


With a new XBRL-based SOA underpinning it, this new framework adds discrete event simulation capabilities: the ability to use computing models to play ‘games’ the way the Department of Defense does, which I've blogged about here. In addition, statisticians and economists gain the ability to use this data in aggregate to measure true national credit and risk factors more accurately.
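As a toy illustration of the kind of 'game' this enables (every number below is invented, and a real model would be far richer), here is a sketch of a single-factor Monte Carlo simulation over a small loan book: draw a shared economic shock, let it move every obligor's default probability together, and look at the tail of the loss distribution rather than a single reported average:

```python
# A toy sketch of aggregate credit simulation over consistently formatted
# obligor data. All parameters are invented for illustration.
import random

def simulate_losses(portfolio, systemic_weight=0.3, trials=10_000):
    """portfolio: list of (pd, exposure) pairs. Returns sorted loss samples."""
    losses = []
    for _ in range(trials):
        shock = random.gauss(0, 1)  # one shared "state of the economy" draw
        total = 0.0
        for pd, exposure in portfolio:
            # Crude single-factor model: the systemic shock shifts every
            # obligor's effective default probability in the same direction.
            effective_pd = min(1.0, max(0.0, pd + systemic_weight * 0.05 * shock))
            if random.random() < effective_pd:
                total += exposure
        losses.append(total)
    return sorted(losses)

book = [(0.02, 1_000_000), (0.05, 250_000), (0.10, 400_000)]  # hypothetical
losses = simulate_losses(book)
print("99th percentile loss:", losses[int(0.99 * len(losses))])
```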


Another from the second posting from The Institutional Risk Analyst related to oversight of the risk calculations:


Thus the urgency in some corners of Washington regarding revisions to SNC, including a quarterly reporting schedule and enhanced disclosure of counterparty financial data. Remember that one of the goals of the SNC enhancements is to gather private obligor P(D) ratings by banks and to aggregate same to build a composite rating system for regulators to use to assess counterparty risk. That is, the creation of a privileged data rating matrix which could be used to assess the efficacy of both bank internal ratings and third party agency P(D) ratings alike. More on this and the effect of derivatives on visible bank loan default rates in a future comment.


Even though some say SOA is dead, I know the platform is very much alive, with products like this and this, which I worked on while at Oracle and which are the underpinnings of Basel II solutions such as this. While Basel II isn’t the silver bullet here, it is being recommended that it should stick around. Basel III won’t necessarily be the answer either, but what we have is a method to surface the data artifacts of XBRL into processes (including business intelligence for items like risk calculations) that are easily mapped and understood at larger and larger scopes. That is really the beauty of these XML-based frameworks, and I've had the pleasure of implementing others like AIXM, HL7 v3 and NIEM, which support native message types and processes, for example from airlines to the FAA or doctors to the FDA (and all applicable points in between). The resulting instances of these items become instantly transparent, which eases the work of harmonizing them for understanding in the process of oversight.


Back to the last paragraph of the second IRA posting which begins to delve into policy:


Bankers, after all, are not very good at understanding future risks, no matter how many ERM consultants they hire, default risk software implementations they direct, or meetings they attend at the Federal Reserve Bank of New York. Even making accurate observations about the present day risk events seems to be a challenge. Witness the fact that commercial bankers as a group managed to direct more than $2 out of every $3 in political contributions this year to Republican members of Congress, even as the GOP looks ready to lose control over the House and perhaps even the Senate. When Barney Frank (D-MA) is Chairman of the House Committee on Financial Services, perhaps the industry will take notice of this operational risk event and adjust accordingly.


Obviously this article is from 2006, and we've since moved back to a Democratic-controlled Congress and White House. In fact, the gentleman in charge of the Federal Reserve Bank of New York at that time is now the new Secretary of the Treasury. Tim Geithner had this to say in 2006:


"Credit derivatives have contributed to dramatic changes in the process of credit intermediation, and the benefits of these changes seem compelling. They have made possible substantial improvements in the way credit risk is managed and facilitated a broad distribution of risk outside the banking system. By spreading risk more widely, by making it easier to purchase and sell protection against credit risk and to actively trade credit risk, and by facilitating the participation of a large and very diverse pool of non-bank financial institutions in the business of credit, these changes probably improve the overall efficiency and resiliency of financial markets. With the advent of credit derivatives, concentrations of credit risk are made easier to mitigate, and diversification made easier to achieve. Credit losses, whether from specific, individual defaults or the more widespread distress that accompanies economic recessions, will be diffused more broadly across institutions with different risk appetite and tolerance, and across geographic borders. Our experience since the introduction of these new instruments—a period that includes a major asset price shock and a global recession—seems to justify the essentially positive judgment we have about the likely benefits of ongoing growth in these markets."


I am trying not to place blame on the current state of legislation or the operation of government, as ‘it is what it is’, but to put it bluntly: there is no possibility that you can prescribe legislation, take its goals and objectives (measured semi-annually by OMB), turn them over to an agency or agencies whose top officials may change every 4 years, let their CIOs and others run competitive bidding among the usual suspects around the beltway, and then expect different results. In fact, quite the opposite: we've compounded issues we can't fully understand because of a lack of transparency, not just of government and the oversight of industry, but of the overarching process models we have for doing business (risk models, etc.) and how they are audited by the government.


At the end of the day policy makers do things that sound appropriate, and Sarbanes-Oxley, which was passed to combat the abuses of Enron, WorldCom and others, is a good example. The unintended consequences, sometimes in the form of a false sense of security, are often the ones that end up biting you the worst. The problem as I see it is that the institutions involved in the current crisis deal in finance specifically, not in other lines of business that yield financial results. It's not that these companies weren't subjected to the same policies, only that valuation was the root of this crisis. There is blame to go around here, from the housing policy that said banks should lend to the unqualified, to the minions who became real estate speculators as a second job, to the financial institutions that packaged, re-packaged and sold this debt. Since these complex financial instruments are the backbone of this contagion, it's virtually impossible to 'unwind' them at this point, and most of them are at some point tied to mortgages. Dealing with this part of the problem could allow for stabilization of the situation to a certain extent.


Looking at what’s been done on housing policy thus far, I don't see anything wrong with a forced stoppage of foreclosures, although after having worked at FHA for the better part of 2008 I can tell you that hardly anyone even remembers Hope for Homeowners or its revisions for 'flexibility'. That's not to say these things were and are without noble intentions, but if we look back in history we see that HUD has shaped homeownership policy, at times to the detriment of the very banks in trouble today, and FDIC has been in receivership of these banks as well (IndyMac comes to mind as a good example of an institution straddling that duality). If we look at the results of Hope for Homeowners, we see that while the legislation targeted 400,000 homeowners, only 25 have actually leveraged the relief offered in the legislation. Of course one of the unintended consequences was that FHA was able to hire many employees with the $25 million provided for implementation. This is significant because HUD and its largest program, FHA, have no budget for shared IT modernization: the entire pot (~$50 million per year) goes to maintaining the ~50 mainframe applications running the systems there, which take 18 months and many millions more for the simplest of changes to support new operational goals. Looking at the future and what’s happening with Federal Student Aid, which like HUD doesn’t even own its own data (indeed, YOUR own data), and Sallie Mae, there is another wave of this economic tsunami headed our way, not to mention the additional Adjustable Rate Mortgages that are about to reset, hopefully at a reasonable enough rate to keep qualified homeowners in their homes, or with some subsidies to keep potentially unqualified ones there as well.


Given what is happening to the banking industry at large, due mostly to mortgage lending and securities derived from mortgages, it's tough to make an argument against nationalization, making Bank of America the real ‘Bank of America’, as opposed to continuing to feed these institutions money and turning them into 'zombies' as described in this paper. With regulation having commoditized strictly depository banking, much as with local incumbent telecom companies, serving up a local telephone line or a checking account isn’t viable as a growth business. It could be time to create some fresh banks, seeing as the Federal Reserve Board, Treasury and FDIC are really the mother of all banks anyway. Let the bad performers die, let the government use these funds to start a shadow banking system and mortgage underwriting operation, use new technology to do it right this time, and then turn those entities back into commercial ones after the bad ones get valuation and/or simply die. I find it hard to believe that anyone would care whether they banked with Wells Fargo or some government version of a depository institution, but they would certainly care if their bank were insolvent, as most of them are today, yet insolvent banks seem to get ongoing support when they should be allowed to fail. The other financial operations that deal in equities, insurance, risk and other financial sub-sectors would be in a position, as many like JP Morgan are now, to perform many levels of financial services, including acquisition of insolvent depository institutions like Washington Mutual.


When you really look at this problem you start to understand that people, and the companies they run, when left to their own devices will end up with conflicts of interest absent consistent, thorough and timely oversight. Who ‘polices the police’, as they say? Additional oversight from our government agencies and their respective Offices of Inspector General, along with the Government Accountability Office, will just never be enough. With the new paradigm presented in this blog encoded in their DNA, the government has the ability to re-organize its enforcement staffs into a cohesive model that fits the institutions they are regulating, along with allowing them the flexibility to morph as those institutions are likely to in the Brave New World we are facing. This frees up capitalism to move on about its merry way to recovery, even if the depository side of banking and mortgages in the form of Freddie, Fannie and Ginnie all need to stay ‘governmentized’ for a while until the free market is able to sort out the mess the last debacle leaves behind. Using techniques like this we can make sure these items are spun off for good and, perhaps most importantly, no longer considered to be GSEs, all while giving them the proper policy oversight.


At some point the right solution will be realized, perhaps when we come up with a price index and allow all homeowners (those who were rightfully financed in the first place) to refinance into a 10-year adjustable or 30-year fixed product at this adjusted home value. Before you dismiss the idea, consider: what would otherwise stop someone with good credit from moving down the street to a nicer house for less than what they owe on their current mortgage? This would allow the bank and homeowner to share any increase in value over the coming years, up to the original value of the mortgage, at which point the homeowner would be the recipient of the additional equity, or perhaps there would be some tapered sharing of equity. Interest rates would remain low for some time to allow for these loans, and the 10- and 30-year products would hopefully put homeowners beyond the time horizon of the huge interest rate hikes that will undoubtedly occur to fight inflation. Homeownership would be tough for a few years while interest rates are going up, but the banks would have sound balance sheets, and at least the CDOs could be unwound and credit default swaps absorbed. At some point all would return to homeostasis.
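To make the shared-appreciation mechanics concrete, here is a worked toy example (all numbers invented, and the 50/50 split is just one possible policy choice): any appreciation above the written-down value is shared until the bank recovers its write-down, after which the remaining equity is the homeowner's:

```python
# A toy sketch of the shared-appreciation refinance described above. The
# figures and the 50/50 split are illustrative assumptions, not actual terms.
def split_appreciation(original_balance, written_down_value, sale_price, bank_share=0.5):
    """Return (bank_recovery, homeowner_equity) when the home is sold."""
    gain = max(0.0, sale_price - written_down_value)
    write_down = original_balance - written_down_value
    bank_recovery = min(gain * bank_share, write_down)  # bank capped at its loss
    homeowner_equity = gain - bank_recovery
    return bank_recovery, homeowner_equity

# A $300k loan written down to $200k; the house later sells for $280k:
print(split_appreciation(300_000, 200_000, 280_000))  # (40000.0, 40000.0)
```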


What we need is the ability to establish not only a common 'language' around these goals, objectives and measures, but also levels of process models that ensure how they will be carried out. The main components can be put into a process model that decomposes to another level, and eventually into the implementation of the systems that facilitate the negotiation of complex instruments by presenting counterparty risk in aggregate each time they are bought and sold. More importantly, oversight and measures of efficiency for, say, what the government may be doing to bail these institutions out would be immediately available. A simple diagram of how these levels of complexity and volume decompose is shown here:
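To sketch the idea in code (the levels and names below are purely illustrative assumptions, not the actual model), a policy-level process model decomposes, level by level, down to the transactions systems actually execute, and oversight can walk the same tree back up:

```python
# Illustrative only: a policy-level process model decomposing down to the
# transaction level, so audit queries can roll back up the same structure.
legislation = {
    "goal": "stabilize mortgage credit",        # policy level (hypothetical)
    "processes": [{
        "name": "purchase troubled assets",     # program level
        "subprocesses": [{
            "name": "value asset pool",         # operating level
            "transactions": ["price CDO tranche", "record counterparty exposure"],
        }],
    }],
}

def leaf_transactions(model):
    """Walk the model down to the transactions that systems actually execute."""
    for process in model.get("processes", []):
        for sub in process.get("subprocesses", []):
            yield from sub["transactions"]

print(list(leaf_transactions(legislation)))
```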


Effectively this would make multiple iterations of the Troubled Asset Relief Program (TARP) not only inherently transparent but also allow them to be conducted on a transactional basis from the funds set aside to perform the duties assigned by the legislative policy. Anyone who believes that TARP, a National ID Card or an electronic medical record maintained by the government can be devised, funded, implemented, managed and reported on with adequate oversight, accomplishing the goals originally intended without instigating other, possibly worse side effects, is not being realistic, or needs to be educated as to why it’s impossible to just let ‘the smart people at IBM take care of it’. While we may stop foreclosures or even subsidize mortgage payments, at some point that will not stop what has devolved into the end of a game of musical chairs where someone has taken all of the chairs. Whatever the solution, we are all in this together, homeowners, banks and government, so the solution should allow all three to participate and have visibility into results on a real-time basis to rebuild the trust within our capitalist society. Otherwise government will spend more money and not accomplish the desired results, and banks will foreclose on more homes and commercial properties as their capital levels are fortified by the government, while waiting for an uptick in housing to sell off foreclosed inventory. The problem there is that the new homeowners won't exist, as there won't be an economy with jobs to support any new homeowners. We'd better get the smart people on this and allow them to participate in how we solve it, implementing technology at every step in the process from legislation forward to ensure success. We don’t have the money available in the whole world to support feeding this problem as it exists now. Otherwise we had better be prepared to understand that (especially without such techniques as espoused here in this blog) there will be more Orwellian debacles yet to come and, perhaps most importantly, we won’t see the full impact of their aggregate perils until it’s too late.


In conclusion, I'm essentially sounding the alarm that while the things coming out of Congress can be debated to great end regarding their intentions or fairness, they cannot be measured ahead of time for their efficiency in addressing the problem(s) at hand, and periodic measurement of aggregated efficiency, which could be construed as ‘effectiveness’, just isn’t agile enough. There isn’t the kind of ammunition left to keep firing $1 trillion birdshot with the double-barreled sawed-off that we call the Treasury and Federal Reserve to clean up this mess. What we need is a fresh start with a few well-placed 7mm sniper rounds to solve some of these systemic issues. I'm not suggesting we throw caution to the wind and adopt some Isaac Asimov state of machine rule, nor am I suggesting that I should be the next ruler of the free world because I understand how these systems work and, more importantly, how they should work to support new initiatives. I'm not sure how the rest of the world feels about a technocracy, but it's obvious our Federal Government is far from that at this point. Keep in mind IT spending for the entire Federal Government is only around $78 billion, which is only 10% of the new stimulus bill just passed by Congress. What I'm saying is that in a world where we are more and more dependent on technology, we cannot let the inefficiencies of government permeate the implementation of the new programs, especially the IT that is mainly responsible for 'making the trains run on time', as it were. We need a new era of the President's Management Agenda: a Federal CTO who oversees FOIA and the like is going to fall way short of both enabling technology that can support the goals of legislation and mitigating the risks (doing away with them in an ideal world) of the unintended consequences via a framework that provides a ‘line of sight’ when tweaking policy, with automatic, instant transparency; neither would otherwise be provided.

Wednesday, November 12, 2008

Leveraging BPM, SOA, Identity Management and Enterprise 2.0 for Governance, Risk and Compliance

Running an IT organization for government or business in this day and age has brought about new challenges which place a focus on capabilities, and a tremendous strain on resources, beyond what would ideally have occurred per the natural requirements of the business or mission. This somewhat artificial digression from the politics or competitive landscape that has historically shaped how most IT systems were built, delivered and managed is a new layer of complexity on the horizon, one which can easily engulf scarce IT resources if not handled strategically.
In this white paper we will attempt to address Governance, Risk and Compliance while prescribing the new technology paradigms of BPM, SOA, Identity Management and Enterprise 2.0 as a unified set of patterns and tools that can be brought to bear on these new initiatives. This should be the driving force behind how you modernize your IT environment to service these needs while also providing the value of agility to your enterprise. By the conclusion of this read we hope to have presented a compelling story around how and why this set of technological offerings will be all you need to implement in the foreseeable future for solving these problems while continuing to improve the overall quality of your IT mission.

Compliance

In thinking about this new wave of Governance, Risk and Compliance let’s start in reverse and look at the end result, Compliance. For the scope of this white paper ‘Compliance’ could be anything from Sarbanes-Oxley Section 404, HIPAA, CMMI Level 3-5, ISO 9001, Basel II, even anything that is internal to your organization such as capitalization or Service Level Agreements (SLAs), and the list goes on….
No matter what you are faced with in the way of Compliance, the end result is likely some kind of audit or periodic report to someone or something responsible for verifying that you are in Compliance. Such requirements are usually tied to some sort of Business Intelligence system that aggregates data from all kinds of places and systems to produce reports that verify levels of Compliance. The difficult part of such period-based reporting systems, in addition to the mad scramble to actually make them produce positive results, is showing your work, i.e. decomposing the aggregate numbers for proof of Compliance. While Business Intelligence of the aforementioned variety isn’t mentioned in the title of this white paper, it has become very much a part of BPM at large and will be discussed under that topic later in this white paper.
In the end, the old adage that the things measured and reported on are the things acted upon is the real rule of thumb here. No matter what you are expected to fall into Compliance with, you will first need to figure out how it will be measured. We will take a more in-depth look at how to define these metrics in the next section, Risk.

Risk

As previously discussed, Compliance is something that may come from a myriad of places. It may come in the form of an audit to uphold some certification, or perhaps simply adherence to some plan of capital outlays for value in your IT portfolio. Whatever these items are, they should be measured in terms of the level of Risk you acquire by somehow falling out of Compliance. There are many types of Risk, Operational Risk, Financial Risk, etc., and in some cases the Risk you are trying to measure has prescribed methods for doing so. The Basel II Accord for banking, where your Risk is measured in a monetary fashion, is one such standard. Where there are government institutions enforcing Basel II on the largest banks (about 100 of them in the US, called Tier 1), there is the additional Risk of being found non-compliant, with fines and publicity that may come in tow. The Basel II calculation of Probability of Default (PD) or Exposure at Default (EAD) is likely something that should have been measured a little more closely by all institutions with regard to the recent housing market lending issues, which materialized in poor ratings for those aggregate Collateralized Debt Obligations (CDOs) rife with subprime mortgage write-offs.
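As a hedged sketch of where PD and EAD fit (the figures below are invented, not any bank's actual parameters, and Basel formulas also involve LGD, loss given default), the basic expected-loss arithmetic looks like this:

```python
# A minimal sketch of Basel-style expected loss. Inputs are illustrative.
def expected_loss(pd, lgd, ead):
    """EL = PD x LGD x EAD, per exposure."""
    return pd * lgd * ead

# A $500k exposure with a 4% one-year default probability and 45% loss
# severity carries $9,000 of expected loss:
print(expected_loss(pd=0.04, lgd=0.45, ead=500_000))  # 9000.0
```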

Prior to thinking about Compliance or Governance you must plot the Risks that are important to your organization. One approach is to scatter-plot such items as in the chart below. We’ve stated that Risk could be measured as negative value, but let Value to your organization be the X axis and assign some other weighting, say 1-10, for the items that pose the greatest Risk to your organization. Again, those items may be ranked by their negative value, but they may also realistically fall into the category of Risks you are willing to take. You can size each point on the chart by the severity of the Risk. Obviously you cannot hope to attack all points equally, but it is necessary to make this a living exercise and constantly re-evaluate where you stand in the vast world of Risks that affect your operation. If you are at Risk of losing market share, for instance, then you will certainly become out of Compliance with shareholder expectations!
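Here is a minimal sketch of such a chart using matplotlib (the sample risks, weights and severities are invented): value to the organization on the X axis, the 1-10 weighting on the Y axis, and point size for severity, so the large upper-right markers are the ones demanding attention:

```python
# Illustrative risk scatter plot; the data below is invented sample input.
import matplotlib.pyplot as plt

risks = [  # (name, value_to_org, weight_1_to_10, severity_as_point_size)
    ("SOX 404 finding",   9, 8, 300),
    ("SLA breach",        5, 4, 120),
    ("Market share loss", 8, 9, 400),
    ("ISO cert lapse",    3, 3, 80),
]
names, xs, ys, sizes = zip(*risks)
plt.scatter(xs, ys, s=sizes, alpha=0.5)
for name, x, y in zip(names, xs, ys):
    plt.annotate(name, (x, y))
plt.xlabel("Value to organization")
plt.ylabel("Risk weighting (1-10)")
plt.title("Risk portfolio (illustrative)")
plt.show()
```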
Therein lies the point of having a sound strategy for Governance, Risk and Compliance: you’ve controlled your Risk internally before having to worry about it externally. After all, exposing your customers to that risk can cost the most important capital an organization possesses, credibility. We all experience Risk in everyday life; for instance, when we approach an intersection with a yellow light we make a calculated decision based on the Risk that we may be caught breaking a law if we proceed. If you were to get into an accident or get a ticket in doing so, it would pose great risk to you in the form of bodily harm or financial responsibility. Externally, however, insurance companies would have new ideas about the risk you present to them in continuing to insure your operation of a motor vehicle in the future. Some Risk happens that quickly, but identifying ahead of time all of those things that are possible, and preparing to handle them proactively, is what Governance is all about.

Governance

While Compliance is usually done to appease some authority that has the ultimate say as to whether we have effectively mitigated or managed our Risk, Governance is the practice of managing the Risk of not being in Compliance. We’ve stated earlier that this Compliance may come down to something at the very core of your business, such as whether or not you are generating enough revenue for the marketing campaign that was just funded. Perhaps this Compliance is more of an absolute, such as Sarbanes-Oxley Section 404. Whatever the total sum of the Compliance items that assure you’ve managed the Risk specific to you, there are also likely a number of Governance frameworks established to deal with those same issues. In the case of Sarbanes-Oxley (SOX) there is the CobiT framework, which is meant to put in place the Controls necessary to be able to attest to Compliance with SOX. There are many complementary frameworks that every publicly traded company should implement, or at least investigate for applicable portions to enhance CobiT, such as ITIL and ISO 17799 (now ISO 27002). Many of these involve internal processes that must be implemented, verified and measured as an ongoing, ‘in-situ’ audit rather than the mad dash of period-based reporting most experience these days.
Governance, then, is the sum of the policies and procedures that you put in place, some of which are based on industry-standard frameworks, in order to effectively manage your total Risk. Don’t let the alphabet soup of frameworks, regulations and standards scare you. Once you’ve gained an understanding of your Risks you will be able to map the appropriate frameworks to them, building your own Governance ‘mashup’ (see Enterprise 2.0 at the end of this paper for a definition of ‘mashup’). The point of this white paper is to explain how a modern approach to implementing these controls with state-of-the-art technology patterns can actually provide a vehicle to sustain any combination of these needs while also modernizing your IT infrastructure to be defined and driven by business goals. Rather than consider any of the items addressed here as a ‘siloed’ cost-center investment, one should look at the overall agility these patterns can provide in an ever-changing marketplace that demands more visibility into how you are protecting the interests of your customers, citizens or investors.

BPM

In addition to this plethora of frameworks (see Glossary at the end of this paper) aimed at supporting Governance, there are a number of methodologies that support Quality and other initiatives in general, such as ISO 9001:2000, CMMI and ISO/IEC 15504, which attempts to harmonize many frameworks starting with the two previously mentioned. There are also any number of derivatives of kaizen, or Continuous Process Improvement, methodologies such as Lean (from the Toyota manufacturing process), Six Sigma and even Lean Six Sigma. These all exist to minimize the number of defects per opportunity, thereby increasing quality while allocating resources to the process steps in a ‘just in time’ fashion. The Continuous part involves understanding, measuring, simulating and re-engineering processes for gains in effectiveness and efficiency. The round trip for this Continuous Process Improvement is all about the reporting of the Risk measurements determined by what are seen as Key Performance Indicators, or KPIs. This data about performance is ideally fed back into a business process analysis tool that can use it as a simulation baseline.
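For a concrete feel of the defects-per-opportunity arithmetic these methodologies share (the figures are invented; real counts would come from your BPM engine's KPI instrumentation), here is a minimal sketch:

```python
# A small sketch of the Six Sigma style defects-per-million-opportunities
# metric. Sample figures are invented.
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# 27 defects found across 1,500 processed loan files with 8 checkpoints each:
print(round(dpmo(27, 1_500, 8)))  # 2250 DPMO
```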
Because these Risks have Governance frameworks associated with them, it is also ideal to weave these activities into the normal everyday duties that your lines of business perform. As mentioned in the Governance section of this white paper, it becomes increasingly difficult not only to generate Compliance reporting around your business processes, but more importantly to decompose those reports to provide on-the-spot actual data. By fusing the techniques provided in this white paper your organization can provide a line of sight from any vantage point of your operation to any other(s). Although not a substitute for period-based business intelligence aggregated for the purpose of performance management, this brings the necessary aspect of decision support into your operational systems. Also, because data from these more robust periodic systems can and should be embedded into your business process management applications, you get an accurate picture of ‘who knew what and when did they know it’, which seems to be at the crux of most critical forensic audits occurring today.
The other part of BPM that is critical, especially since BPM is at least somewhat overlapping with if not a superset of BPR (Business Process Re-Engineering), is the ability to understand how your human resources interact within business processes. Even more importantly, strategic human resource management involves understanding how your people can best perform and in what quantity, especially if your workforce has a highly repeatable set of tasks. Understanding the activities of each individual in a discrete manner, but always in relationship to the macro set of processes they participate in, is where BPM intersects with Identity Management; this is sometimes called ‘Role Mining’. This study has far-reaching impact not only in BPM but also in Human Capital Management, where you are literally able to grasp the impact of enabling Human Resources with certain capabilities before investing in those initiatives. After all, it is really the adoption of any initiative by the larger business community that enables the success of things such as those discussed in this white paper. Giving your organization protection from harm and increased value to the people it serves will win over many line-of-business owners and users who too many times have seen change come for the sake of change.
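A toy sketch of the role-mining idea (users and entitlements invented): the simplest flavor infers candidate roles by grouping users whose entitlement sets match exactly, which is the kernel of what commercial tools elaborate on:

```python
# Illustrative role mining: group users sharing identical entitlement sets
# into candidate roles. Data is invented sample input.
from collections import defaultdict

entitlements = {
    "alice": {"loan.read", "loan.approve"},
    "bob":   {"loan.read", "loan.approve"},
    "carol": {"loan.read"},
}

def mine_roles(user_entitlements):
    """Return {entitlement_set: [users]} as candidate role definitions."""
    roles = defaultdict(list)
    for user, perms in user_entitlements.items():
        roles[frozenset(perms)].append(user)
    return dict(roles)

for perms, users in mine_roles(entitlements).items():
    print(sorted(perms), "->", sorted(users))
```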

SOA

Service Oriented Architecture (SOA) isn’t an entirely new concept. It is, however, a new acronym with a lot of hype. In fact it has its own ‘hype cycle’ and now potentially an extended ‘trough of disillusionment’. This last part occurs when most realize that even though the new paradigm or technology is quite attractive, the reality of getting it implemented to derive its promised value seems distant if not impossible. SOA is now on the steep slope that exists on the upside of that trough, known as the ‘slope of enlightenment’. During this enabling phase many have realized that it takes more buy-in from various factions in an organization than they may have assumed initially. An interesting statistic recently put forth by Gartner (which, by the way, coined the ‘hype cycle’, ‘trough of disillusionment’ and ‘slope of enlightenment’ discussed here) states that only about one quarter of larger companies will have the organizational or technical skills to realize an SOA by the year 2010.
SOA is largely an IT exercise, and because IT has been somewhat separated from the business by the cyclical nature of its responses to changes in business models, it is not seen as adequate in many business owners’ opinions. While services are typically portrayed as the interactions between systems in an SOA, the other, perhaps more key, tenet of an SOA is how those systems are presented to and allowed to interact with the users involved in the business processes they support. An SOA is the fundamental center of the holistic concept presented in this white paper, as it embodies all of the enterprise-wide integration aspects that have heretofore been known as EAI, EII, ETL, MDM, B2B, and the list goes on. An SOA requires its own set of rigors for Governance because of its own inherent Risks, whereby measuring it for Compliance against its stated goals is the beginning of a truly shared model where business and IT are joined at the hip.
A perfect example of software vendors addressing this challenge has been the phenomenon of ERP and other COTS business applications that attempted to insulate business owners from dealing with IT in terms of actually creating systems to run parts of their business. Configuring these systems is what BPM looked like for many years, until people realized how changes made to those systems affected upgrade paths, not to mention the stability of the applications themselves. The nice thing about this new philosophy of SOA and BPM is that those investments, as well as investments in other legacy systems, are preserved. Using BPM as the genesis of your SOA gives you an opportunity to attack this problem from a known set of requirements, namely those of the business owners in the organization. They will give you the commitment you need to get started, not only because they are actually driving SOA requirements at the appropriate layer, but also because you give them the ability to modernize their legacy or ERP systems without actually touching them. This will save them a huge budget and also allow those systems to continue providing the functionality they provide today, including remaining the system of record for mission-critical data.
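A minimal sketch of that 'modernize without touching' idea (the legacy interface below is an invented stand-in): a service facade presents a clean operation to the BPM layer while delegating to the untouched system of record:

```python
# Illustrative service facade over a legacy system of record.
class LegacyMainframeClient:
    """Stand-in for an existing back end (MQ bridge, screen scrape, etc.)."""
    def fetch_record(self, acct):
        return {"ACCT-NO": acct, "BAL-AMT": "001250.75"}  # fixed-format output

class AccountService:
    """The facade business processes call; the legacy system stays untouched."""
    def __init__(self, legacy):
        self.legacy = legacy

    def get_balance(self, account_id: str) -> float:
        raw = self.legacy.fetch_record(account_id)
        return float(raw["BAL-AMT"])  # normalize legacy fixed-width formatting

print(AccountService(LegacyMainframeClient()).get_balance("42"))  # 1250.75
```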

Identity Management

As described in the previous two sections, the most important parts of your business are the resources that are not automated but human. They pose most of the Risk once you are in even the most basic maturity stage of an SOA, and they are responsible for carrying out operations using the Governance model that you’ve put in place in order to stay in Compliance. It should now be apparent that Identity Management is the sharp end of the spear known as Human Capital Management, as discussed earlier. It is literally where the rubber meets the road, in that it is how your people gain access to the systems they interoperate with every day to conduct your business. In addition, most organizations have realized that the same digital identity should be used for gaining access to the locations in which physical systems and other resources reside. The cost of on-boarding, off-boarding and otherwise managing credentials for the varying communities of individuals related to your business has historically been a tough cost center to deal with. With a sound Identity Management strategy much of this process can be centralized and provided in a self-service fashion.
Outside of the HR or BPM side of knowing who your folks are and what they do, Identity Management provides one of the most critical items for Compliance, and that is Attestation. Simply stated, Attestation is what an organization’s executives must sign off on periodically to say that you’ve taken appropriate measures (implemented appropriate Governance) to mitigate Risks. These include any or all of the Risks mentioned in this document, plus untold others unique to certain industries, and even those yet to be enacted or enforced. The one Risk that cuts across all others is that of the insider empowered to conduct your business who does so with malicious intent, the so-called ‘insider threat’. This is the one thing that weighs most on a company executive’s mind, as it, aside from the reports he’s looking at, is ensconced completely within a black box until discovered, and by then it is often too late. Identity Management, along with other appropriate Governance measures implemented in BPM and SOA, helps to ensure that your employees act ethically and with your mission as their driving priority.
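One small, hedged example of a check that feeds such an attestation cycle (the rosters are invented): flagging accounts still active after their owners have left, which is exactly the off-boarding gap where insider risk hides:

```python
# Illustrative attestation exception report: accounts with no HR owner.
hr_roster = {"alice", "bob"}  # invented sample data
system_accounts = {"alice": "loan-system", "bob": "loan-system", "dave": "loan-system"}

orphans = [(acct, sys) for acct, sys in system_accounts.items() if acct not in hr_roster]
for acct, system in orphans:
    print(f"ATTESTATION EXCEPTION: {acct} still active on {system}")
```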

Enterprise 2.0

Let’s start by stating that Enterprise 2.0 simply means Web 2.0 as that phenomenon applies to the enterprise. The key element of Web 2.0, and indeed Enterprise 2.0, is the ‘social network’: the idea that in everything you do that involves communication with others there is a set of attributes, or ‘social fabric’, that ties you together with the person or group you are interacting with. This allows you to participate in each task you perform every day with those attributes front and center, in the form of a collective context or ‘presence’. Presence is something you are familiar with if you’ve ever used an internet chat program and categorized your ‘buddies’ into groups for family, work, friends, etc. In the enterprise, however, presence is a richer, more intuitive list of who’s available to you and what their role is in the scope of the tasks you are currently working on. The other aspect of Enterprise 2.0 in this collaborative scope of activities is the communications that flow from this presence interface, such as instant messaging, but also including voice over IP (VOIP), video conferencing and web conferencing, where a user’s desktop or document(s) are shared.
The other services provided in an Enterprise 2.0 fabric are those that were previously thought of as content management applications but are now seamlessly integrated into the ability to search for content, create it on the fly and share it any way imaginable. What you are working with at all times is data from your SOA that can be materialized as a printable document on the fly. Imaged or other legacy captured documents can be passed as part of a ‘worklist’ that may be subscribed to, for personal tasks assigned to you or tasks assigned to those in a certain role necessary to perform the work. In any case, the idea of a ‘document repository’, or really of locations in general, is abstracted from the users in an Enterprise 2.0 environment. And since everything is locatable via a search engine interface or by attribute tags that give documents the same project-based context as presence, producing, accessing or editing documentation becomes a seamless part of a business user’s tasks.
Enterprise 2.0 components known as wikis and blogs allow you to effectively introduce your new BPM-centric SOA to personnel, both old and new. Wikis are essentially online encyclopedias of knowledge about things in your enterprise. Everyone can make entries in a wiki, and those entries are searchable as content. This is really a readable index of what people think is important to your organization, and again its entries are presented along the hierarchy of your business taxonomies. Blogs are similar but more personal in nature, in that they are used to record notes about how certain things were accomplished, or perhaps more importantly are to be accomplished, thereby alleviating the pain for the next individuals who experience the same challenges. Blogs allow for community comments on their content, whereas wiki comments are effectively another entry into the wiki, linked to the previous entry. Provided together, these things are known as a ‘mashup’ in Web 2.0 parlance and are delivered to your users as an AJAX-based Rich Internet Application (RIA). This combination also provides a harness of sorts for business users, accomplishing the very necessary goal of a self-identifying, self-training work environment to be immersed in. Ideally this environment has gleaned all of the knowledge from your body of workers that is liable to disappear if not captured adequately. There is no bigger need for this than the current set of baby boomers who have been performing their work for decades and have their respective and collective knowledge bottled up in a form currently not transferable to the next generation. This paradigm allows for the capture of that knowledge and its embrace by the new systems you put forth to address the challenges of the future, assuring that good practices are not lost and, of equal importance, that bad ones are.

Conclusion

In today’s environment of regulations, competition and changing market conditions, software vendors have thrust an abundance of new offerings into the marketplace in an attempt to enable their customers to cope and either remain in or gain a proactive posture toward IT investments. In this white paper we’ve presented some of these new patterns of software as well as visibility into some of the drivers necessitating an agile approach. You are hopefully now armed with a holistic view of these matters which you can bring to the attention of the appropriate decision makers within your organization to take action. Understand also that this story could start at any section of this white paper and easily transition to the other sections, as they are parts of a contiguous whole. This is a gestalt of the largest order, in that it is increasingly difficult to enact certain combinations of these capabilities in a silo, and more difficult still to actually do some of them without consideration for the others. Yet every day products are purchased and architectures are founded without consideration for a whole picture similar to the one drawn here.
When you start down the path of SOA with BPM as the key driver, it becomes a self-fulfilling prophecy due to the way the onion is peeled with these techniques. Your BPM effort allows you to start with what your business actually does today and use that to drive the façade around your existing systems that becomes your initial SOA. You then take BPM a step further to analyze the appropriate frameworks and methodologies that need to be embraced for the reasons discussed in this document. Next you add the study of your workforce and their enablement within these systems via Identity Management and its BPM components. Finally you modernize the way your business users interact in this new world of BPM by introducing Enterprise 2.0, with Identity Management as the secured wrapper that keeps them in the appropriate role-based contexts for the work they are expected to carry out. This is in stark contrast to those institutions today that, for instance, can't even lend money though lending is their primary line of business, all due to the risk they've accumulated through their poor controls (or the absence thereof).
On the road to this nirvana is the ability to rationalize the appropriate portfolio of services for your SOA, as well as a cookie-cutter understanding of how to procure IT assets. Rolling out a new virtualized line of business, or a change to your mission, should then be as easy as filling your plate at a buffet, and not much more difficult to put into this framework with newly enabled personnel who can multi-task more effectively thanks to the homogenized IT environment presented to them. In the end this really makes IT transparent, if not invisible, to businesses that have struggled in large part due to the antithesis of this picture that exists in many places today.