Jeremy Cutcher is a political science senior and Mustang Daily liberal columnist.
Last week I wrote about how a confluence of perverted incentives and compromised values precipitated the financial crisis. Now that we’re a year and a half removed from the onset of the crisis in September 2008, we can look back and see how a number of factors combined to create the Great Recession.
It is important to note, however, that markets are so complex and interconnected that it is often difficult to understand how they interact and affect one another — economists still do not completely agree on what caused the Great Depression.
For simplicity’s sake, I’ll try to explain the financial crisis from the ground up: from the housing market, where the real value resided, to the financial markets, where complex and innovative products created additional value that imploded when the asset bubble popped.
The housing bubble itself was a complex phenomenon. Homes are unique economic assets because of the expense, transaction costs, and the manner in which they influence so many other markets, from real estate to hardware stores to construction companies.
Many believe homes to be safe and solid investments, yet between 1890 and 2004, home prices rose a total of 66 percent when adjusted for inflation, a measly 0.4 percent a year. In the late 1990s, however, prices began to skyrocket, rising 52 percent between 1994 and 2004, or roughly 5 percent a year. Because supply costs have actually been decreasing since the 1980s due to technological advances, the rise in house prices must be attributed to increased demand for homeownership.
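To see where those per-year figures come from, here is a quick back-of-the-envelope check (my own sketch, not the column’s source data) that converts a total gain over a period into a compound annual rate; compounding puts the 1994-2004 figure closer to 4 percent than 5, but the contrast with the long-run trend is the same:

```python
# Convert a total price gain over a period into a compound annual growth rate.
def annual_rate(total_gain_pct, years):
    return ((1 + total_gain_pct / 100) ** (1 / years) - 1) * 100

print(round(annual_rate(66, 114), 2))  # 1890-2004: about 0.45% a year
print(round(annual_rate(52, 10), 2))   # 1994-2004: about 4.3% a year
```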
With the advent of new lending practices, such as adjustable-rate mortgages and “teaser rates,” mortgage lenders began lending to riskier borrowers, creating the pool of subprime mortgages that initially drove up home prices and ultimately collapsed the housing market. As house prices increased rapidly, wages remained stagnant, leading average Americans to borrow more.
While the number of subprime mortgages was relatively small compared to regular mortgages (rising from 8 percent of all mortgage originations in 2001 to nearly a quarter in 2005), regular Americans had gone into increased debt to finance their standard of living, taking out home equity loans to make large purchases, including college tuition. But as subprime borrowers defaulted, home prices began to deflate and the equity homeowners believed they had stored in their homes evaporated, causing more foreclosures. The Fed also helped expand the housing bubble by keeping interest rates at historic lows between 2001 and 2005, spurred initially by the 2001 recession and the aftermath of 9/11, allowing for easy credit without the higher rates that riskier lending would normally command.
Because residential property was so profitable, many banks invested heavily in mortgage-backed securities, seeing an enormous opportunity for profit with supposedly little risk. However, rather than shielding financial institutions from the risk of subprime mortgages, securitization (which pools mortgages and turns them into products that can be traded) spread that risk throughout the financial sector, culminating in the collapse of Bear Stearns and Merrill Lynch.
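To make that concrete, here is a toy sketch (entirely made-up numbers, not how real mortgage-backed securities are structured) of what pooling does: each bank ends up holding a small slice of every loan, so when defaults cluster in a housing bust, every holder takes losses at once:

```python
import random

random.seed(1)

NUM_LOANS = 1000   # mortgages in the pool, each owed $1
NUM_BANKS = 20     # banks that each buy an equal share of the pool

def pool_payout(default_rate):
    """Total amount the pool pays out when a given fraction of loans default."""
    return sum(0 if random.random() < default_rate else 1 for _ in range(NUM_LOANS))

for scenario, default_rate in [("normal times", 0.02), ("housing bust", 0.25)]:
    payout = pool_payout(default_rate)
    print(f"{scenario}: pool pays ${payout} of ${NUM_LOANS}; "
          f"each bank gets ${payout / NUM_BANKS:.0f} of the ${NUM_LOANS // NUM_BANKS} it expected")
```

The particular numbers don’t matter; the point is that securitization distributes losses across every holder rather than making them disappear.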
In essence, what developed was an elaborate, worldwide Ponzi scheme in which new debt was issued to repay older debt. Subprime mortgages were bundled with regular mortgages yet still received AAA ratings (the rating reserved for the least risky financial products), so ever-increasing risk was spread ever more widely, resulting in the near-complete collapse of the financial sector and shaking financial markets around the world.
The financial crisis was exacerbated by the extensive use of leverage, which means funding investments with debt rather than equity, something both home buyers and investment banks did. In other words, every investment bank has its own wealth (known as equity), and when its assets exceed its equity base, the bank is leveraged. The borrowed money can then be invested to make more money than if the bank used only its own equity.
During the 1990s and early 2000s, banks became highly leveraged, with ratios around 30:1, meaning that a roughly 3 percent decrease in the value of their assets would leave them insolvent (for instance, if I play poker with $3 of my own money and $97 borrowed from you, and I lose a paltry $4, I cannot pay you back in full). This perverts incentives: the investment bank risks only a small fraction of its own wealth, while the borrowed money lets it make riskier investments with bigger potential payoffs. Being highly leveraged was not a problem as long as home prices continued to soar and mortgage-backed securities remained safe investments. But the financial sector seemed unable to believe that house prices might ever stop rising, and as prices began to decline, many of the large investment banks found themselves insolvent, resulting in the Troubled Asset Relief Program bailout.
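To put rough numbers on the 30:1 figure (my own toy arithmetic, in the spirit of the poker example):

```python
# Equity left after a bank's assets fall, at a given assets-to-equity leverage ratio.
def equity_after_loss(assets, leverage_ratio, loss_pct):
    equity = assets / leverage_ratio
    return equity - assets * loss_pct / 100

print(equity_after_loss(100, 30, 3))   # ~0.33: a 3% drop nearly wipes out the bank's $3.33 stake
print(equity_after_loss(100, 30, 4))   # ~-0.67: a 4% drop leaves it insolvent
print(equity_after_loss(100, 10, 4))   # 6.0: at a conservative 10:1, the same drop still leaves equity
```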
Many economists believe the culture of bailouts, beginning with the bailout of Continental Illinois in 1984, has relaxed risk management standards. They believe these banks have become careless in their risk assessment because the banks know that if any truly disastrous event occurs, the government will bail them out. As Milton Friedman pointed out, capitalism is a profit and loss system; take away the potential for losses and incentives become distorted.
It is important to note that bailouts are not for the investment bank itself; rather, the bailout is for the creditors of the investment bank. There is talk that Lehman Brothers was allowed to go bankrupt because its creditors were largely foreign banks in China and Japan rather than American investors. But it is also worth noting that bailing out creditors does not necessarily require 100 cents on the dollar — the government could allow creditors to recover only 50 percent of their investments. This would help prevent an economic catastrophe but it would also force creditors to be more vigilant with their investments.
I find it sickening that taxpayer money was used to bail out financial institutions that make billions of dollars in profits, but I can’t imagine the state of the economy if we had not done so. What I don’t like, though, is that taxpayers, through government policy, have shouldered the losses while banks retain the profits. If we created a policy of repaying creditors only a portion of their losses, we would help ameliorate the effects of a recession while not distorting incentives too much.
The discussion among many economists following the crisis seems to center on the role of government in private markets. Assuredly, government bears some of the blame for the way incentives were perverted, but it must be remembered that financial institutions often lobbied for these changes because they promised greater profits. Talking about the role of government, I think, misses the point. The Great Depression stemmed from the laissez-faire economics of the late 19th and early 20th centuries; it was the calamitous signal that government needed to undergird the economy to prevent such widespread destitution during downturns.
Government does not belong in every market, but it does need to intervene in markets that are vital to Americans’ well-being, which for me includes public safety, education, health and housing. The question should not be whether the government should intervene, but how, so that the discussion is about smart intervention rather than about whether to intervene at all. Our political system seems to have learned the lessons of the Great Depression while everyone else has forgotten them.