layman’s view of financial crisis

April 1, 2009

Since the financial crisis is all the rage these days, and pundits pontificate about greed on Wall Street, lack of government oversight, collateralized debt obligations, credit default swaps, etc., taking the discourse out of the reach of the common man such as myself, I have tried to put together a historical perspective, based on my personal journey, to make some sense of the current situation in simple lay terms. There are famous economists at famous universities who have blogs (see http://baselinescenario.com run by MIT profs, the blogs run by the Freakonomics guys, University of Chicago profs, etc.), and I actively recommend reading those as well.

The story begins in my sophomore year of college, when I had to write a term paper for an economics course. I decided to write about the debt trap that many countries, especially developing countries, found themselves in. While researching the paper, I was astonished to find that the country that was the most ‘in debt’ back in the late 1980s was the United States. The reason the US never got caught in the debt trap, I was told, was that the capitalist society in the US almost always guaranteed that the capital was invested in projects whose return was higher than the rate of interest on the debt. Indeed, people were lining up to lend money to the government of the United States, because the perception was that there was essentially no risk of default. Other countries, especially developing countries, ‘mismanaged’ the capital and invested in projects that did not provide sufficient returns, forcing them to borrow money to repay existing debts and consequently fall into a debt trap.

Now why is it that the capitalist system in America is able to find investments that return better results? Well, because it is a ‘huge market’: the GDP is high, there are many possible uses for the capital, we have good institutions, etc. What constitutes GDP? It turns out that about two-thirds of GDP is consumer spending. So, as long as people are buying, it looks like we have a good thing going. Now how are people able to buy? Are their incomes high? It turns out that over the last few years the income distribution has become more polarized, so even though the median income in 2007 was $50K, the bottom 40% earned significantly less than that, i.e., less than $36K.

In most countries, people accumulate assets by saving and investing, which usually assumes that their incomes are higher than their spending. Here is where the logic seemed to start falling apart for me. It turned out that not only did the US have a huge public debt, but also a lot of private debt, meaning the average US citizen spent more than he earned. I was told that there is Social Security, Medicare, etc., so the average American does not need to save as much. But to actually, regularly spend more than what they earned? How could that be sustained? Well, that was because 68.1% of Americans in 2007 (http://www.danter.com/statistics/homeown.htm) owned their homes, and as long as the value of their homes kept going up, they could borrow against their homes and spend.

Coming from a country where half the folks live on less than a dollar a day, this was an amazing phenomenon. Home ownership is a privilege for most people in the world; it seemed to be a right in America. In fact, politicians would routinely talk about the ‘American Dream’. They conveniently forgot to mention that the great American dream is financed by the great American debt! So the whole world is betting on the American citizenry consuming, and actively encouraging that behavior by lending them money.

The argument seemed to be backward. Normally one would think that one accumulates assets, puts a down payment on a house, and pays a mortgage, so one has to reduce consumption to afford a home. Instead, it seemed like the expectation of rising home values drove an increase in consumption. Credit was freely available to folks who wanted to consume because the government kept interest rates low, and the government itself was able to do this because it could borrow easily, mostly because there would be investment opportunities that depended on the US citizen spending even more money. So in some sense the whole world was betting that the real-estate market in this country would keep going up and people would keep buying; indeed, it kept lending money to the US government on essentially this premise. All this while industry after industry disappeared to lower-wage countries, which to my mind would undermine the very consumer demand that seemed to be the primary driver of economic activity and growth.

I remember talk about the US being the ‘ownership’ society, where the house was the single largest asset most people owned, so the argument seemed to be predicated on getting more people to own homes. That would give people the asset base to continue to spend money and grow the economy. And then, from the late 90s and early 2000s, especially after the dot-com crash, I could see the distinct push to enable even more people to own their own homes, even people in the lower income brackets. I remember various powers-that-be rationalizing it: putting people in homes drives our economy. There were multiple research papers from prestigious universities on why housing for low-income folks was critical for their economic well-being (see http://www.jchs.harvard.edu/publications/finance/w05-9.pdf for a fascinating view of the financial returns of home ownership for low-income folks). Of course, the paper did talk about the risk of property values going down, the risk of default, etc., but the general theme clearly was that home ownership was good for low-income folks, and good for the economy as a whole. In fact, home ownership rates went up from 66% in 1998 to 69% in 2004, then dropped to 68% in 2007.

The net effect of this was that interest rates on home loans were amazingly low. In fact, we were able to buy a home because of what we thought were really low interest rates, so I was part of the statistic that contributed to the increase in the rate of home ownership. And of course, once I got into a home, the number of things I spent money on went through the roof. So, I could see the argument that home ownership drives a certain level of economic activity that, in all likelihood, renters don't drive.

At the back of my mind, what did puzzle me was how all this was financed. Who would lend money to people whose incomes are low, so that they can buy a house and then spend even more money to stay in the house? It would be too risky. If I were one of the folks who needed to borrow, I would be thinking about the fact that a significant portion of my assets would be tied up in a fairly immovable asset, and that if its value were to fall, it would devastate me. I could not understand what the economics/budget of a supposedly ‘low-income’ person on a mortgage would look like. In addition, common sense seemed to indicate that the higher risk of lending to lower-income folks would force lenders to charge them higher interest rates, making it even less affordable for them to borrow the large amounts of money required to buy houses. Micro-lending works in really poor countries because the amounts involved are small, not because the interest rates are lower than market rates. After all, the median house price was four times the median household income in 2007. How can a person on a $50K income afford a $200K house, and also spend money on all the other stuff as well? That was the puzzle.
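To put some rough numbers on that puzzle, here is a back-of-the-envelope sketch using the standard fixed-rate amortization formula. The 6% rate, 30-year term, and zero down payment are my own illustrative assumptions, not figures from the period:

```python
# Rough affordability check for a $200K house on a $50K income.
# Standard amortization: payment = L * r / (1 - (1 + r)**-n),
# where r is the monthly rate and n the number of monthly payments.
# The 6% rate and zero down payment are illustrative assumptions.

def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

payment = monthly_payment(200_000, 0.06, 30)   # roughly $1,200/month
annual_cost = payment * 12                      # roughly $14,400/year
share_of_income = annual_cost / 50_000          # close to 30% of gross income
```

And that is before taxes, insurance, maintenance, and all the other spending the economy was counting on, which is exactly why the arithmetic puzzled me.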

I just could not understand how we could put a roof over everyone's head unless it was significantly subsidized by someone. Assuming a population of 300M, with 80M families, a 3% increase in the home ownership rate (from 66% to 69%) translates to about 2.4 million homes. The incremental lending that would need to happen, conservatively assuming an average loan of $100K per home, is a staggering $240 billion. That is some serious financing.
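The arithmetic above is simple enough to check in a few lines (all the inputs are the same rough assumptions as in the paragraph):

```python
# Back-of-the-envelope sizing of the incremental lending.
families = 80_000_000            # ~300M people, ~80M families (assumed)
ownership_increase = 0.03        # 66% -> 69% home ownership rate
avg_loan = 100_000               # conservative assumed loan per home

new_homes = families * ownership_increase        # about 2.4 million homes
incremental_lending = new_homes * avg_loan       # about $240 billion
```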

What was even more puzzling was how traditional financial institutions could make money in the process. It seemed too good to be true: you are doing a ‘social good’ by putting a roof over everyone's head, and you are able to do it in a capitalist system. Who would have thought this was possible? I was told it was the magic of securitization. That is, you bundle a bunch of these loans with other loans, and magically the bundle becomes less risky than the original loans. You then sell pieces of the bundle to folks and give them a return that is lower than the interest the original borrowers will continue to pay, and you make good money on the spread.
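The ‘magic’ rests on tranching: losses in the pool hit a junior slice first, so the senior slice looks safer than any individual loan. Here is a toy two-tranche sketch (all sizes and loss figures are made up for illustration):

```python
# Toy two-tranche loss waterfall: pool losses are absorbed by the
# junior tranche first; the senior tranche loses only once the junior
# is wiped out. All figures are illustrative.

def tranche_losses(pool_loss, junior_size, senior_size):
    junior_hit = min(pool_loss, junior_size)
    senior_hit = min(max(pool_loss - junior_size, 0), senior_size)
    return junior_hit, senior_hit

# A $100M pool split into a $20M junior and an $80M senior tranche.
small = tranche_losses(5, 20, 80)    # (5, 0): senior untouched
large = tranche_losses(30, 20, 80)   # (20, 10): senior breached
```

As long as pool losses stay small, the senior tranche really is safer; the trouble starts when losses are large and correlated, which is the question the later paragraphs turn to.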

What is interesting is that financial companies borrowed money for short periods of time to buy up bundles of these loans until they were able to securitize and sell them off to others, and as long as there was a spread in the interest rates, they were essentially printing money. As long as they were able to keep borrowing for short periods to keep doing this, there would be profits for the foreseeable future. Here is where the leverage of the banks comes in: all the banks were making these bets with borrowed money. The only risks were that enough people would stop paying their mortgages, or refinance, or that someone would stop lending the banks money in the short term to do this securitization. What people did not expect was for all of these things to happen at the same time. Whoever ended up ‘owning’ these securitized assets was left in the lurch, because the assets were not worth as much as they once were. What is worse, no one can price these things, because there is no clear understanding of the risk associated with the promised cashflows; hence they are toxic assets. Since financial regulators also require banks to estimate the value of the assets on their balance sheets and ‘balance’ the balance sheet (the so-called mark-to-market rule), as the guesses about the values of these instruments fell, banks had to set aside real assets to balance their books. Here is where the banks that were over-leveraged, i.e., the ones that had borrowed money to buy these assets, took it on the chin. Among them were some real Wall Street powerhouses such as Bear Stearns, Lehman Brothers, and Merrill Lynch.
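The leverage arithmetic is worth spelling out, because it explains why a modest fall in asset values could sink a powerhouse. The 30:1 ratio below is an illustrative assumption, not a figure for any particular bank:

```python
# Why leverage hurts: equity E levered up to assets A = E * leverage
# means a fractional drop d in asset value costs d * A, i.e. a
# d * leverage fraction of the equity. All numbers are illustrative.

def equity_after_drop(equity, leverage, asset_drop):
    assets = equity * leverage
    loss = assets * asset_drop
    return equity - loss

# $10 of equity levered 30:1 into $300 of assets:
mild = equity_after_drop(10, 30, 0.02)     # a 2% asset drop eats 60% of equity
wipeout = equity_after_drop(10, 30, 0.04)  # a 4% drop leaves negative equity
```

So at 30:1 leverage, an asset decline of barely over 3% is enough to wipe out the equity entirely, which is why everything depended on those short-term lenders not pulling back.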

Now, it is interesting that folks think this mark-to-market rule is the cause of the problem, because it forces banks to estimate the value of the assets on their books and ‘balance’ them every quarter. They even point to the fact that mark-to-market has figured in many financial scandals in the past, including Enron (more on that later). As it is, these financial instruments are opaque, and when you are a public company taking in investments from the general public, I think you should be required to explain the use of those funds in a clear manner. This is more easily said than done, and it may prevent certain kinds of economic activity, but if we believe in transparency for the use of public funds in a public company, we have to have some rule like mark-to-market.

Everything seemed to be going fine with the system, and everyone seemed to be enjoying the party, until someone started thinking: now wait a minute, all this is based on the free flow of credit, on house values, and therefore on personal consumption going on forever. Someone yelled fire, and there was a stampede. What is amazing is that this is the explanation I was given for what happened at Enron in 2001. Enron also ‘securitized’ assets on its balance sheet and sold them to various partnerships, to get the assets off the balance sheet so that the performance of the company would look superlative. I remember as if it were yesterday when Enron declared a record profit in some quarter of 2001, and also said something like they had to reduce shareholder equity by some obnoxious amount. This was because the securitized assets were backed by Enron stock and cash if they fell below some value. These assets were properties such as the infamous Dabhol power plant in India, the Azurix water utility in the UK, etc. So, they were very unique assets that were very risky, and securitization did not help in reducing or managing the risk of the actual endeavor. What is more interesting about Enron is that the law they actually broke was that the parties to which they ‘sold’ the securitized assets were run by their CFO, whose participation in them was underwritten by Enron. So, Enron should not have been able to take the assets off their balance sheet.

So, the net-net is that securitization, and market-based mechanisms in general, while very interesting and fascinating, are not the answer to managing risk in every situation.

Now, this system of securitization was predicated on bundles of loans having lower risk than a single loan. Think of it this way: there is always a small chance that a family loses income, or has some other unfortunate circumstance, and therefore cannot make its payments; but if you pool thousands of mortgages together, the argument went, the chance that they all default together is lower. But is that really true? Now, I know that if you bundle financial instruments that are less than perfectly correlated, they will offer a better risk-return tradeoff than any single instrument. So if you buy shares in a company that makes sunscreen and one that makes raincoats and umbrellas, your portfolio value will be more stable, and you will likely have a higher risk-adjusted return than if you bought shares in only one of those companies.
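The diversification argument can be made precise with a small analytic sketch. For a pool of n loans, each defaulting with probability p and with pairwise default correlation rho, the variance of the pool's default rate follows from the variance of a sum of correlated Bernoulli variables; the numbers below are purely illustrative:

```python
# Variance of the default rate of a pool of n loans, each defaulting
# with probability p, with pairwise default correlation rho:
#   Var = p * (1 - p) * (1 + (n - 1) * rho) / n
# With rho = 0 the risk vanishes as n grows; with rho > 0 it floors
# at rho * p * (1 - p), no matter how many loans you bundle.

def pool_default_variance(n, p, rho):
    return p * (1 - p) * (1 + (n - 1) * rho) / n

independent = pool_default_variance(10_000, 0.05, 0.0)   # tiny
correlated = pool_default_variance(10_000, 0.05, 0.2)    # ~2000x larger
floor = 0.2 * 0.05 * 0.95                                # limit as n grows
```

The punchline is that pooling only works when defaults are close to independent; if the whole pool is exposed to the same housing market, correlation does not diversify away, which is exactly the question the next paragraph raises.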

So, the kazillion-dollar question was whether these bundles of mortgages behaved like any other financial instruments. The only way it would work is if the bundles comprised mortgages of folks whose risks of default were not perfectly correlated. The question was how the risk of the bundle was calculated, and how well that matched reality. Here is where the answers become very fuzzy. It looks like there were mechanisms to estimate risk, and even to provide ‘insurance’ (the so-called credit default swaps), but it all had to be predicated on some assumptions about default by millions of families who had historically not owned homes, even in America. I cannot imagine that there were reliable models to predict the default risk of the folks who got the so-called sub-prime mortgages. The very increase in home ownership broke the historical data, so our models for predicting the risk of default were completely off.

It turns out that all of us are paying for trying to put an additional couple of Americans per hundred into homes. In some sense, my hunch that putting everybody into a house needs to be subsidized by society was right. Either the government has to tax and subsidize, in which case we would be socialists, but at least you get to vote every few years to decide who gets to call the shots; or we have to redistribute resources through ‘market-based mechanisms’, where we have very little to no control over the companies unless we have enough of an ownership stake and are active shareholders.

In economics/finance, one needs to be an empiricist, meaning one needs to gather evidence for the application of mathematics to the phenomena we observe. It seems to me that Wall Street had a Platonist view, thinking that reality was generated by the numbers. There are of course contrarians such as Nassim Nicholas Taleb who are probably laughing and saying ‘I told you so’, but we need more such skeptics who are able to question the emperor's clothes when the herd is rampaging and taking all of us over the precipice.

Cloud Computing

October 25, 2008

I have been following the cloud computing discussion group for a while, and I even see people with important-sounding titles from important companies put up their “vision” of cloud computing, etc., with very little technical or business substance in their presentations. One then wonders why folks like Larry Ellison say that there is nothing new here.

What I do not see is a simple explanation of the change that cloud computing engenders for IT, and of the challenges cloud computing must overcome to supplant the current model of IT that is prevalent in the industry. People keep talking about how the existing business models will be busted, disrupted, etc. without necessarily having the evidence. Just because one asserts that the current model will be broken does not mean that it will.

I have been quoted in the past as saying that technical progress should be evolutionary, not revolutionary, mostly because people who use these technologies want to evolve, not revolve. 🙂 (http://www.usenix.org/publications/library/proceedings/worlds04/tech/full_papers/karp/karp_html/index.html) I think the same is true of cloud computing.

Seriously, there is a real danger of us taking on the wrong problems in cloud computing, and of it ending up not achieving its potential. Some of these thoughts were spurred by my old friend Krishna Sankar's blog posts on cloud computing (http://doubleclix.wordpress.com/2008/10/21/what-is-cloud-computing-and-do-i-need-to-be-scared/), and by the active discussion on the cloud computing group.

I wrote this piece for Krishna Sankar's blog, but decided to make it my first stab at articulating the axes along which cloud computing claims to change the current model of IT, some of the challenges that will have to be overcome, and the ‘limits’ of where cloud computing is useful.

I like to think of things along three dimensions for managing IT environments:
People (can be shared or dedicated),
Technology (can be standard or custom), and
Processes (can be manual or automated)

Now, you may say there is a continuum along all these dimensions, and you are right. But go along with me for the moment.

Cloud computing implies: shared resources, standard technology, and automated processes. Amazon’s EC2 does not have folks dedicated to my account, it gives me a fixed set of standard technologies, and the provisioning/management is highly automated.

Most IT shops at an aggregate (there are clear exceptions, but I am trying to make a point here) are closer to the opposite end of the spectrum, i.e.,
People are likely to be dedicated
Technology is likely to be custom
Processes are likely to be manual

Many IT shops have some notion of shared services, but more often than not there are dedicated teams per organizational unit, even if IT as a whole is centralized. The technology decisions are made wherever they are made, and often each IT shop has a variety of technologies throughout the stack, and the processes are often not even standard, i.e., ITIL-compliant, let alone automated.

Outsourced service providers may be better than vanilla IT departments in some areas, such as adherence to standard processes, but the people and technology dimensions are unlikely to be very different. Outsourced service providers would like to standardize the technology where possible, but often they have to support what the customer currently has, and the cost of moving to the ‘standard’ platform is often difficult to justify for both the customer and the service provider.

In theory, cloud is an enabler for both in-house IT shops and outsourced service providers to dramatically reduce operating costs and move to the opposite end of the spectrum.

In theory, there is nothing ‘fundamentally new’ in cloud computing if you abstract it enough. However, the actual realization can make a dramatic difference in how IT solutions are created, delivered, managed, etc.

One analogy that I find interesting is the airline one, especially Southwest. It is not an accident that Southwest is more profitable than any other airline, in no small part because of standardizing on 737s. However, the flip side is that they don't do long hauls, especially international long hauls.

So, the point I want to make is that cloud computing may work well for some parts of IT, and not so well for others: parts that are likely to be standard across customers, that don't have sensitive data elements over which one has to demonstrate control for compliance purposes, etc.

Here are some things to think about, though.

1. If you are a company that already has a lot of investment in IT, you have many applications supporting many business processes, and you are looking to get the same ‘functionality’ at a lower cost point. That is, you have a ‘defensive’ strategy as it relates to IT; to put it another way, the business and functional requirements don't change (or at least don't change a whole lot), but you want the technical and implementation components to transform to provide a new cost point. While cloud computing seems to fit that bill very well, the challenge is in transforming the existing IT hairball that the customer likely has into the standards-based cloud platform. Of course, there are specific areas in your IT environment where this transformation is easier, e.g., messaging and collaboration, and maybe that is a good foothold for cloud computing to take hold. But I don't think one should generalize to all of IT before we can at least get a clear picture of the problems we will need to solve.
(If you have only 737s, you can do only short hauls. If you want to do long hauls, maybe you need A380s.)

2. If you are an outsourced service provider who has taken over someone's ‘mess for less’, and you think you can do the same, you will have the same issues. The issue is transformation, and more importantly, who pays for it. So, unless cloud computing providers offer ‘on-ramps’ for migrating onto the cloud platform, they may severely limit the impact of cloud computing on existing IT solutions.

3. If you are an IT shop that is willing to invest in new capabilities, either because there are processes you can ‘automate’ or because you can open up new revenue possibilities through technology, then the cloud platform is definitely worth considering from the beginning. The problem in this case is only integrating the specific ‘cloud app’ with other things in your enterprise, and that is a relatively easier problem to solve. So, cloud computing folks should focus on creating the next generation of these apps: anticipate the next generation of IT spending and enable the alternative economics there.

4. If you are an outsourced service provider, one way you can ‘innovate’ is by enabling these new apps on the new platform; otherwise, all you will be doing is ‘your mess for less’, squeezing squeezed lemons, trying to make lemonade, etc.
Most clients expect outsourced service providers to expose them to the new platforms and to shield them from technical disruptions. And it is indeed the responsibility of the service provider to do that.