Because I was there when the economics department of my university got an IBM 360, I was very much caught up in the excitement of combining powerful computers with economic research. Unfortunately, I lost interest in econometrics almost as soon as I understood how it was done. My thinking went through four stages:
- Holy shit! Do you see what you can do with a computer's help?
- Learning computer modeling puts you in a small caste where only other members can truly understand you. This opens up huge avenues for fraud.
- The main reason to learn stats is to prevent someone else from committing fraud against you.
- More and more people will gain access to the power of statistical analysis. When that happens, standing within the profession should become a matter of who asks the best questions. Very soon, it will not be about who has the biggest computers; it will be about knowing enough to ask the important ones.
- Precision manufacture is an obvious application for computing. And for many applications, this worked magnificently. Any design that combined straight lines and circles could be easily described for computerized manufacture. Unfortunately, the really interesting design problems can NOT be reduced to formulas. A car's fender, for example, cannot be described using formulas—it can only be described by specifying an assemblage of multiple points. If math formulas cannot describe something as common and uncomplicated as a car fender, how can they hope to describe human behavior?
- When people started using computers for animation, it soon became apparent that human motion was almost impossible to model correctly. After a great deal of effort, the animators eventually resorted to motion capture: placing tracking markers on real humans and recording their movement before transferring it to the animated character. Formulas failed to describe even simple human behavior—like a toddler trying to walk.
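The fender example above can be sketched in code. A minimal illustration of point-based shape description, using a Catmull-Rom spline (a common interpolating spline; the profile points here are hypothetical, not real fender data): the shape exists only as an assemblage of sample points, with no single closed formula for the whole curve.

```python
# Describing a curve by points instead of a formula.
# Catmull-Rom spline: each segment is built locally from four
# neighboring control points; the overall shape has no global equation.

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment between p1 and p2 at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * (2 * b
               + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def sample_curve(points, steps=10):
    """Trace a smooth curve through the interior control points."""
    out = []
    for i in range(1, len(points) - 2):
        for s in range(steps):
            out.append(catmull_rom(points[i - 1], points[i],
                                   points[i + 1], points[i + 2], s / steps))
    out.append(points[-2])  # close the last segment at its endpoint
    return out

# Hypothetical cross-section points; a real fender would use hundreds.
profile = [(0, 0), (1, 2), (3, 3), (5, 2.5), (7, 0), (8, -1)]
curve = sample_curve(profile, steps=4)
```

The curve passes exactly through each interior control point, and refining the shape means moving or adding points rather than rederiving any equation, which is why point-based description won out for free-form surfaces.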
And here we see what happens when economics substitutes the appearance of scientific rigor for the real thing. Unfortunately, real lives are destroyed in the process.
How Laissez-Faire Economics Led to Inequality and Recession
Jeff Madrick, Huffington Post, 10/14/2014
Remember in 2009 when everyone was dodging blame for the financial crisis? Depending on who you asked, it was the bankers, the federal regulators, Fannie Mae, fraudster mortgage companies, the ratings agencies and the sub-prime borrowers themselves. The favorite claim of excuse makers was that no single group was to blame -- it was a cluster-f*** as one journalist friend put it.
If everyone did it, no one could be held accountable. But it wasn't true. Bankers and regulators were the major creators of the crisis, for their neglect and single-minded self-aggrandizement that often involved bending the rules.
But let me single out one group that avoided blame and deserved plenty of it: mainstream economists. The deeply held ideas of the nation's most elite economists from the Right and the Left were direct causes of the crisis, justifying perverse behavior on Wall Street and in Washington, and careless and ignorant behavior at the Federal Open Market Committee of the nation's central bank, the Federal Reserve.
These ideas did a lot of harm along the way -- in particular, they were responsible for slower than necessary economic growth that resulted in higher unemployment and inequality.
Consider just one enormous area of theoretical neglect and you get an idea of the inadequacies of the prevailing body of economic ideas. The Federal Reserve just named a new committee headed by vice chairman Stanley Fischer to research how unstable financial markets may affect the real economy of jobs, production, business investment and profits. If you read the 2008 minutes of the Federal Open Market Committee (released earlier this year), which meets roughly every six weeks to set interest rate and other policies, you'll see that the policymakers and their staffs had little idea how to account for financial risk. Finance simply wasn't in their economic models.
In short, the policymakers had no firm concept that the roiled financial markets, which had been in turmoil since 2007, could undo the nation's Gross Domestic Product. The Fed economists, as able a bunch as there are, did once try to put a guesstimate to the effect of the troubled mortgage markets, and they were way off the mark. The FOMC didn't anticipate a serious recession until the December meeting, after the economy fully crashed and credit dried up two months earlier -- and even then they underestimated by a long shot how far the nation's total income would fall. At that point, they thought the unemployment rate would go to, at worst, about eight percent, but it rose to 10 percent.
I think the casual reader may find this hard to believe -- not that economists missed the forecast (they generally have an abysmal record at predicting recessions) but that they didn't even really take financial excesses into account in their models.
This was not just the oversight or prejudice of stuffy FOMC members. It directly reflected the ideas of mainstream macroeconomists at elite universities -- the economists typically quoted in the media -- from the so-called fresh-water conservatives at schools like the University of Chicago to the salt-water semi-liberals of Yale, Princeton and MIT. (I leave out Harvard, which on balance now has a politically conservative economics department, including Gregory Mankiw, Alberto Alesina, Robert Barro and Martin Feldstein, for example.) As the highly regarded Olivier Blanchard, a left-of-center MIT economist who is now chief economist at the International Monetary Fund, admitted after the collapse, there had simply been no place for financial regulation in macroeconomics up to that point.
This is dismaying but it is important to understand that a fundamental mainstream idea was behind it. Generally, the reaction of the economic mainstream to the inflationary turmoil of the 1970s was to retreat to an ideological interpretation of their fundamental ideas -- a doctrinaire reinforcement of laissez-faire economics. As Americans turned away from government, so did the economics profession. In regard to the financial markets, it boiled down to this. Free markets without government interference work too well to become dangerously unstable; therefore, no need to account for how a credit crisis might affect the real economy. It would correct itself too quickly to do damage.
Since the 1980s, this had been a central economic idea, one of several major ones that did great damage. Financial markets were "rational." If a stock price or mortgage security was overvalued, a smart professional would sell it. Milton Friedman said as much in the early 1950s about letting currencies trade in free financial markets. Speculation would usually lead to stability, not instability.
As with many fundamental ideas, they were often useful initially. Eugene Fama and several other economists at the normally conservative University of Chicago and the usually liberal MIT made persuasive cases that individuals could not "beat the market," which was composed of countless smart investors incorporating information accurately when assessing how much a stock was worth. That is, even if they invested in a professionally run mutual fund, odds were high that simply buying an index fund that mimicked the Standard & Poor's 500 would do better. Fama won a Nobel Prize for his early work.
But the economics profession became more extreme in its support of the power of free markets, and what had been known as the efficient markets theory went off the rails. Economists like Fama began to claim there were no speculative bubbles. Regulations to limit them, like credit restraints, would only interfere with the efficient workings of the markets. Others like Michael Jensen, a Harvard Business School disciple of Fama's, argued as Fama did that stock prices rationally reflected the future value of a company. To get CEOs to manage companies better, just give them stock options. They would get rich as the company's stock price rose on the strength of their abilities.
It turned out, however, that stock prices weren't rational at all. They were subject to fashions, as Robert Shiller, the Nobelist from Yale, showed. It also turned out that, as Lucian Bebchuk at Harvard Law School has shown, there is little relationship between CEO compensation and a company's performance.
Some purely bad ideas were resurrected, such as Say's Law, which argued in part that any savings in a nation would be productively invested. But as we know, that just isn't true if there is no buying power for goods and services. Austerity economics was one damaging result, not merely at the University of Chicago but among researchers at Harvard, led by Alberto Alesina and, to a large extent, Kenneth Rogoff, and among some economists at the Brookings Institution. Deficits became the bogeyman.
The wide-ranging turn to laissez-faire doctrine reached across the economy, but here is how it contributed directly to poor and unequal incomes. After the inflationary 1970s, economists held that the nearly sole objective of government policy should be to keep inflation low. Again, there was a fundamental idea here. Inflation upset the rational workings of free markets by introducing uncertainty. With low inflation, economies would be efficient and prosperous.
But inflation targeting, led by the Federal Reserve, resulted in higher unemployment rates than necessary and slower growth in wages for most workers. There was a deliberate effort to keep wages from rising rapidly, to avoid a squeeze on profit margins that would force business to raise prices. For most of the time since the 1970s, the unemployment rate stayed above what government economists thought the natural rate should be.
The founding idea of modern economics is Adam Smith's invisible hand, and this great idea, badly over-simplified, was the foundation of many bad ideas of the last generation. The invisible hand tells us how an economy free of government regulations may work, not how it does work. Competitors will push prices down to maximize consumer buying, pure and simple. Government need not regulate these competitors. Increasingly, the profession took a dogmatic view. Financial deregulation, a low minimum wage, reduced government investment -- these were all results of a purist interpretation of the invisible hand.
The great nineteenth-century economist John Stuart Mill, writing well after Adam Smith, was skeptical that competition alone was the great regulator Smith made it out to be with his invisible hand. He said that economics was by nature "hypothetical." Its laws were not engraved in stone. If you looked around, wrote Mill, "custom," which he used to describe many non-economic aspects of culture and behavior, was equally important. "It would be a great misconception of the actual course of human affairs to suppose that competition exercises in fact this unlimited sway."
The ideas that governed the mainstream economics profession since the 1980s were turned into rules when they were at best only hypotheses. Rules are easier to deal with; ambiguity and uncertainty are shunted aside. But the world is not so simple, and good policy is scarce when a profession once dedicated to brilliant thinking and extemporaneous judgment to fit changing times turns to formula and ultimately cliché.