Wednesday, January 25, 2012

Heterodox economics, heterodox thinking

We are often told that the big distinction in learning strategies is between the specialists and the generalists.  Academia in the USA is so heavily skewed toward the production of specialists that it is almost impossible to become a well-trained generalist.  And this state of affairs certainly has its advantages.  People who devote the majority of their energy to learning one subject are highly useful members of society.  We certainly want our heart surgeons and tower crane operators to be more than gifted amateurs.

Unfortunately, the same strategy that produces our gifted specialists is terrible at producing the sort of generalists needed for the kind of thinking that community-wide decisions require.  So while our computers become more intricate (the output of specialists), our politics (a profession requiring generalists) becomes more primitive.  Thousands of other examples make the same point.

The problem is that there aren't many agreed-on paths to becoming a high-level generalist.  So even natural generalists blessed with an omnivorous curiosity tend to get funneled into some specialism.  Schools have experimented with cross-disciplinary degree programs, and the whole idea of liberal arts distribution requirements is to encourage a basic generalism.  But in the end, about the only way to produce a high-level generalist is to train curious kids to a specialist level in three or four skill-sets and hope the intellectual sparks created by the process will encourage them to fill in the gaps on their own.

Back when I learned economics, the great practitioners were all generalists.  Some of this was the legacy of Thorstein Veblen, who was called "the last man to know everything."  There really was once a time when, with great effort, someone could essentially learn everything humans knew.  Of course, now it is almost impossible to know even 1% of those bits of knowledge that are beyond rational debate.  So in the 1970s the economics profession essentially threw in the towel and abandoned any sort of generalist approach.  The result is a profession that has devoted 30 years of big computers and big math to an effort to prove the narrow and utterly bogus proposition that markets are infallible.  The specialist economists who are dazzling at fast algebra are unfortunately so socially backward and narrowly focused that they regularly prescribe disasters.

So it is comforting in a small way to discover that an over-reliance on specialist thinking doesn't work very well in medicine either.  The following excerpt from Wired magazine about Pfizer's $21 billion disaster drug torcetrapib describes a real-world manifestation of the same intellectual problem that has reduced economics to its present level of irrelevance.

Trials and Errors: Why Science Is Failing Us
By Jonah Lehrer
December 16, 2011

On November 30, 2006, executives at Pfizer—the largest pharmaceutical company in the world—held a meeting with investors at the firm’s research center in Groton, Connecticut. Jeff Kindler, then CEO of Pfizer, began the presentation with an upbeat assessment of the company’s efforts to bring new drugs to market. He cited “exciting approaches” to the treatment of Alzheimer’s disease, fibromyalgia, and arthritis. But that news was just a warm-up. Kindler was most excited about a new drug called torcetrapib, which had recently entered Phase III clinical trials, the last step before filing for FDA approval. He confidently declared that torcetrapib would be “one of the most important compounds of our generation.” 
Kindler’s enthusiasm was understandable: The potential market for the drug was enormous. Like Pfizer’s blockbuster medication, Lipitor—the most widely prescribed branded pharmaceutical in America—torcetrapib was designed to tweak the cholesterol pathway. Although cholesterol is an essential component of cellular membranes, high levels of the compound have been consistently associated with heart disease. The accumulation of the pale yellow substance in arterial walls leads to inflammation. Clusters of white blood cells then gather around these “plaques,” which leads to even more inflammation. The end result is a blood vessel clogged with clumps of fat. 
Lipitor works by inhibiting an enzyme that plays a key role in the production of cholesterol in the liver. In particular, the drug lowers the level of low-density lipoprotein (LDL), or so-called bad cholesterol. In recent years, however, scientists have begun to focus on a separate part of the cholesterol pathway, the one that produces high-density lipoproteins. One function of HDL is to transport excess LDL back to the liver, where it is broken down. In essence, HDL is a janitor of fat, cleaning up the greasy mess of the modern diet, which is why it’s often referred to as “good cholesterol.” 
And this returns us to torcetrapib. It was designed to block a protein that converts HDL cholesterol into its more sinister sibling, LDL. In theory, this would cure our cholesterol problems, creating a surplus of the good stuff and a shortage of the bad. In his presentation, Kindler noted that torcetrapib had the potential to “redefine cardiovascular treatment.” 
There was a vast amount of research behind Kindler’s bold proclamations. The cholesterol pathway is one of the best-understood biological feedback systems in the human body. Since 1913, when Russian pathologist Nikolai Anichkov first experimentally linked cholesterol to the buildup of plaque in arteries, scientists have mapped out the metabolism and transport of these compounds in exquisite detail. They’ve documented the interactions of nearly every molecule, the way hydroxymethylglutaryl-coenzyme A reductase catalyzes the production of mevalonate, which gets phosphorylated and condensed before undergoing a sequence of electron shifts until it becomes lanosterol and then, after another 19 chemical reactions, finally morphs into cholesterol. Furthermore, torcetrapib had already undergone a small clinical trial, which showed that the drug could increase HDL and decrease LDL. Kindler told his investors that, by the second half of 2007, Pfizer would begin applying for approval from the FDA. The success of the drug seemed like a sure thing. 
And then, just two days later, on December 2, 2006, Pfizer issued a stunning announcement: The torcetrapib Phase III clinical trial was being terminated. Although the compound was supposed to prevent heart disease, it was actually triggering higher rates of chest pain and heart failure and a 60 percent increase in overall mortality. The drug appeared to be killing people. 
That week, Pfizer’s value plummeted by $21 billion. 
The story of torcetrapib is a tale of mistaken causation. Pfizer was operating on the assumption that raising levels of HDL cholesterol and lowering LDL would lead to a predictable outcome: Improved cardiovascular health. Less arterial plaque. Cleaner pipes. But that didn’t happen. 
Such failures occur all the time in the drug industry. (According to one recent analysis, more than 40 percent of drugs fail Phase III clinical trials.) And yet there is something particularly disturbing about the failure of torcetrapib. After all, a bet on this compound wasn’t supposed to be risky. For Pfizer, torcetrapib was the payoff for decades of research. Little wonder that the company was so confident about its clinical trials, which involved a total of 25,000 volunteers. Pfizer invested more than $1 billion in the development of the drug and $90 million to expand the factory that would manufacture the compound. Because scientists understood the individual steps of the cholesterol pathway at such a precise level, they assumed they also understood how it worked as a whole. 
This assumption—that understanding a system’s constituent parts means we also understand the causes within the system—is not limited to the pharmaceutical industry or even to biology. It defines modern science. In general, we believe that the so-called problem of causation can be cured by more information, by our ceaseless accumulation of facts. Scientists refer to this process as reductionism. By breaking down a process, we can see how everything fits together; the complex mystery is distilled into a list of ingredients. And so the question of cholesterol—what is its relationship to heart disease?—becomes a predictable loop of proteins tweaking proteins, acronyms altering one another. Modern medicine is particularly reliant on this approach. Every year, nearly $100 billion is invested in biomedical research in the US, all of it aimed at teasing apart the invisible bits of the body. We assume that these new details will finally reveal the causes of illness, pinning our maladies on small molecules and errant snippets of DNA. Once we find the cause, of course, we can begin working on a cure. more

3 comments:

  1. Part of the problem with training or developing generalists, I believe, is that we lack a "General Theory of Reality" that is existential in its approach. From there, devices may be identified that help us to understand the reality experience. Many of us have done such an exercise independently and modeled a general template that allows one to move quickly among areas of concern, delineate structure, digest predominating aspects and finally integrate them with reasoned and logical flow. Such models allow one to determine the obvious parallels among fields of concern. For instance, in nature there are predominating similarities between algebraic equations and sentences; both connect ideas to form a collective meaning.

    Good generalists need thinking skills and related models. As far as I know, there are few degrees offered in general thinking skills at university. Perhaps such an endeavour serves up other dangers to the status quo.

    firstfinancialinsights.blogspot.com

  2. I believe part of the issue with education moving toward specialization starts with students entering college with a weak concept or philosophy of life overall. If this is lacking in a young person, then the structures, relationships and models learned for their specialization (even at the associate/trade level) become the foundation for interpreting life. The higher, or should we say deeper, one goes into the subject of choice, the more limited their ability to interpret the full scope of life becomes.

    This also relates to the slow transition from intelligence being interpreted as the ability to think in multiple systems to being able to command all factual data within one system.

    Certainly it is an issue that young people grow up with many options for passively entertaining themselves, whereas people such as Henry Ford, or the better example of generalists, the Wright Brothers, had only reading, and thus learning, to fall back on.

    Lastly, I do not like the word generalist. A much more accurate term for what we need to foster in people, so that we have the Jeffersons, Lincolns, Fords and Wrights, is: Multidimensional.

  3. Great first paragraph!

    I believe you are on to something. For me, I grew up surrounded by some of the greatest generalists ever to walk the earth: the farmers of the corn belt. They knew subjects from plant genetics to animal medicine. They could weld steel and wire their barns for electricity. They got that way because when you live out on the prairies, if you couldn't figure something out for yourself, you were screwed. They also had plenty of time to worry about their place in the universe, since agriculture is full of routine jobs. And no matter how rich you became, you were still judged by how many tools you had mastered.

    So yeah! It helps a lot to have all this in your head before you get to college.

    Thanks!
