Thursday, September 24, 2009

Questions of Knowledge


Part 11 of Prisoners of the Real


The intellectual achievements of Athenian society before and during the life of Plato have almost no parallel in the recorded history of humanity. The small state of Athens, about the size of Connecticut, made unprecedented contributions to literature, thought, and the fine arts. After expelling the tyrant Hippias in 510 B.C., Athenians discovered a new confidence in themselves and sought forms for the expression of creative instincts and a harmonious order for ideas. For the next century the state experimented with an aristocratic approach to democracy, developed the arts of architecture, sculpture and poetry, expressed a unity of social goals, and pursued a path of hard and rigorous thought.


Plato's early life coincided with the completion of the Parthenon and the Peloponnesian War, which later shattered Athens' empire. He knew Socrates from boyhood and developed an early interest in politics. After Socrates' execution, however, he cynically concluded that "public affairs at Athens were not carried on in accordance with the manners and practice of our fathers." Thus, he began 18 years of travel in Greece, Egypt and Italy. Upon returning, he founded The Academy and presided over it for the rest of his life. This school became the intellectual center of Greek life.


Using the ideas of Socrates as his foundation, Plato lectured regularly without manuscripts or notes, setting out his "problems" for solution through the collective effort of his students. Science, law, philosophy – all came under scrutiny. Eventually, other city-states and colonies sought out The Academy to furnish advisers on legislative and scientific matters.


Plato's ideal was a world order in which the logical process of dialectic was used to obtain and classify ideas. In The Meno, for example, he examined the features of knowledge – the need for definition, the difficulty of knowing when a true answer has been reached, the "innate" ability to recall ideas, and the existence of degrees of knowledge. It began with Socrates' suggestion, originally proposed in The Protagoras, that if virtue is based on something objective, such as knowledge, then it can be taught. A person should inquire "about that which he does not know," knowledge that is possessed by the soul and released through questioning:


“Some things I have said of which I am not altogether confident. But that we shall be better and braver and less helpless if we think that we ought to inquire, than we should have been if we indulged in the idle fancy that there was no knowing and no use in seeking to know what we do not know; that is a theme upon which I am ready to fight, in word and deed, to the utmost of my power.”


Truth, said Socrates, could be reached through both knowledge and right opinion, with both serving as guides to right action. Meno followed up with a question about the preference for knowledge, and, according to Plato, Socrates replied by explaining that both knowledge and opinions must be "fastened":


“I mean to say that they are not very valuable possessions if they are at liberty, for they will walk off like runaway slaves; but when fastened they are of great value, for they are really beautiful works of art. Now this is an illustration of the nature of true opinions: while they abide with us they are beautiful and fruitful, but they run away out of the human soul, and do not remain long, and therefore they are not of much value until they are fastened by the tie of the cause; and this fastening of them, Friend Meno, is recollection, as you and I have agreed to call it. But when they are bound in the first place, they have the nature of knowledge; and in the second place, they are abiding. And this is why knowledge is more honorable and excellent than true opinion, because fastened by a chain.”


The distinction between opinion and knowledge, both of which can be acquired through teaching, was pursued further in Euthydemus, which critiqued the exploitation of ambiguities by the Sophists while demonstrating the need for definite criteria of valid thinking:


“If a person had wealth and all the goods of which we were just now speaking, and did not use them, would he be happy because he possessed them?”


“No indeed, Socrates.”


“Then I said, a man who would be happy must not only have the good things, but he must also use them; there is no advantage in merely having them?”


“True.”


“Well, Cleinias, but if you have the use as well as the possession of good things, is that sufficient to confer happiness?”


“Yes, in my opinion.”


“And may a person use them either rightly or wrongly?”


“He must use them rightly.”


“That is quite true, I said. And the wrong use of a thing is far worse than the non-use; for the one is an evil, and the other is neither a good nor an evil.”


Through their discussion Cleinias, Euthydemus and Socrates deduced that happiness is gained only through the right use of things, that this use is made possible by knowledge, and that "everybody ought by all means to try and make himself as wise as he can."


In another dialogue, Theaetetus, a member of The Academy and the founder of solid geometry, explored the nature of truth. His discussion with Socrates revealed Plato's synthesis of intuition and analysis in the knowing process. Truth can't be reached only through sense impressions, he concluded, nor identified by easy intuition. The two aspects of knowing – perception and definition – must be used together, substance and structure irreducibly linked:


“Socrates: The simple sensations which reach the soul through the body are given at birth to men and animals by nature, but their reflections on the being and use of them are slowly and hardly gained, if they are ever gained, by education and long experience.”


“Theaetetus: Assuredly.”


“And can a man attain truth who fails of attaining being?”


“Impossible.”


“And can he who misses the truth of anything, have a knowledge of that being?”


“He cannot.”


“Then knowledge does not consist in impressions of sense, but in reasoning about them; in that only, and not in mere impression, truth and being can be attained?”


“Clearly.”


Despite this pairing of perception and definition, however, Socrates and Plato accorded knowledge discovered through logical effort the primary role in the search for truth. Intellectual rigor was the standard in this knowledge society, and "wise men" – those most capable of analysis – were seen as the appropriate leaders of the ideal social order.


The long and arduous educative process of life, as described by Plato in The Republic, is molded by each individual's experience as a citizen of the state. Recognizing differences in and degrees of ability, the ideal state secures appropriate parentage and equal opportunity based on ability, and maintains an ethical aristocracy. Throughout the development of this model state, the values of logic, self-control and order are strongly emphasized. Plato's ideal society is ruled by the wise – that is, those who keep instincts under the control of their intellects, and populated by citizens who embody the cardinal virtues of wisdom, courage, temperance and justice.


Speaking to Glaucon, Socrates explained that equal distribution of authority ends in degeneration; what is required is state-enforced unity of political and cultural goals. The life of the citizen must be "coordinated" in his and society's best interests. Opinions, right or not, and discussion not in accord with the goals of the ethical aristocrats must be controlled or prohibited. The alternatives are bleak: timocracy – governance based on ambition; oligarchy – rule by a privileged group, which remains the status quo in many modern societies; democracy – defined by Plato as government by the incompetent average; or tyranny – the exercise of arbitrary power. Although Plato's ideal society calls for strict obedience to the authority of the state, it attempts to eliminate favoritism by overruling the family. All children are held in common, and those who demonstrate the highest mental ability are trained for leadership.


Clearly, Plato's description of utopia made the state essential to the full development of humanity. Unfortunately, this established a lasting precedent: that in an ideal world the organization itself determines the ends and the methods for achieving them. Socrates had argued that knowledge was the only criterion of virtue acceptable in a rational world. His student, Plato, detailed the nature of this knowledge, and provided a seminal description of a society governed by the process of thought. Their concepts led eventually to the development of philosophical systems hinged on a desire to find satisfaction in life through self-control, moderation, and the suspension of judgment.


Assuming that the "base instincts" and "selfish passions" of human nature posed a fundamental threat to an orderly world, these philosophers and their successors concluded that humanity could only be held in check through unwavering allegiance to logic, a bondage to rational explanation.


Next: The Path of Certainty


To read other chapters, go to Prisoners of the Real: An Odyssey

Sunday, September 20, 2009

Scared Socialist

(or, How Deregulation Trashed the US Economy and Government Intervention Became the Only Way Out)

Remember the panic-inducing headlines of 2008? Government Takes Over Troubled Mortgage Giants, Lehman Brothers Files for Bankruptcy, Bank of America Buys Merrill Lynch, and Stock Prices Plummet – that last one just as the government announced an $85 billion emergency loan to rescue insurance giant AIG. And that was just the beginning.

Next came the $700 billion bailout of “distressed” banks, a plan that gave Treasury Secretary Henry Paulson and the lame-duck Bush administration unprecedented power. We need to "remove the distressed assets from the financial system," argued Paulson, who had resigned as CEO of Goldman Sachs to become Treasury “czar” in 2006 – after amassing a personal net worth of $700 million during his time at the bank.

Even President Bush, in remarks to the country on Sept. 24, 2008, admitted that “democratic capitalism” – though he still considered it “the best system ever devised” – needed serious help. With the economy’s “fundamentals” clearly in jeopardy and the disaster wrought by deregulation and corporate excess exposed, government intervention had become the only way out.

Just one year later, however, the lessons of that economic meltdown have been largely forgotten. Instead, Republicans, media zealots and the paranoid Right have returned to their default position – crying "socialism" while blaming the government for what greed and unrestrained capitalism have wrought. The L-word – liberal – has been replaced by the S-word, and almost half of the population seems to believe that Obama is an alien radical intent on destroying the country from within.

Many people are currently living in a frightening fantasy world. According to an NBC/Wall Street Journal poll, 45 percent of all Americans believe that health care reform will create “death panels” that withhold medical care for the elderly, 55 percent think it will provide health insurance to illegal immigrants, and half say it will pay for women to have abortions – none of which are true. As sociology professor Peter Phillips explains, too many people, especially consumers of corporate television news, “are embedded in a state of excited delirium of knowinglessness.”

Given the current level of rampant misinformation and distortion of public discourse, it's vital to remember how the economic crisis happened and what people really thought about it at the time. To begin with, it's clear that the root of the problem traces back to the deregulation era launched during the Reagan administration. What Bush I once called "voodoo economics" became the biggest redistribution of wealth since the New Deal.

The central article of faith in the "Reagan Revolution" was that money redistributed from the poor to the rich would produce a burst of productivity and economic growth. Give to the corporations and the wealthy, said "supply side" economists, and they will invest the money in new factories and technology, and the country will be restored to greatness. Did it work? Hardly.

Rather than putting their money into jobs, research or equipment, the country’s biggest businesses went on the largest merger binge in history, buying up companies in a trend that spelled less competition, less productivity, and more control of the economy in fewer hands. Multi-billion dollar war chests were assembled to finance takeovers of large oil and coal companies, communications giants, and prestigious financial institutions.

After a stock market meltdown in 1987, Wall Street advised the US Treasury not to meddle in financial markets. This paved the way for consolidation around large merchant banks, institutional investors, stock brokerage firms, and insurance companies. Complex speculative instruments – derivatives, options, futures, and hedge funds – were largely unregulated, becoming vulnerable to manipulation.

In 1999, the Financial Services Modernization Act – also known as the Gramm-Leach-Bliley Act – removed the remaining regulatory restraints on banking conglomerates. It repealed the Glass-Steagall Act of 1933, a New Deal reform put in place in response to corruption that had resulted in more than 5,000 bank failures following the 1929 Wall Street crash, and permitted commercial banks, brokerage firms, institutional investors and insurance companies to invest in each other's enterprises and integrate their financial operations.

In short, the 2008 financial crisis had been building for a long time. But alarm bells didn't start ringing until June 2007, when two hedge funds of the New York investment bank Bear Stearns lurched toward collapse because of their extensive investments in mortgage-backed securities. The funds were forced to dump assets as the trouble spread to major Wall Street firms such as Merrill Lynch, JPMorgan Chase, Citigroup, and Goldman Sachs, which had loaned them money.

Over that summer, German banks with bad investments in the US real-estate market got caught up in the crisis. But the most obvious sign of trouble was the Federal Reserve’s decision on August 9 to pump $24 billion into the US banking system through large purchases of securities; meanwhile, the European Central Bank made a record cash injection of $130 billion into its markets to shake off credit fears. On the same day, Wall Street suffered its second-worst decline of the year as the Dow Jones dropped by nearly 400 points.

The next day, the Fed pumped another $38 billion in temporary reserves into the financial system. But the government rejected a request for Fannie Mae and Freddie Mac to take on more debt. At the end of the month, President Bush announced a plan to use the Federal Housing Administration, which insures loans for low-income borrowers, to offer government-guaranteed loans to around 80,000 homeowners already in default. Big-time federal intervention was underway.

On Sept. 18, 2007, the Federal Reserve started cutting interest rates, citing the credit crunch on Wall Street and in the broad economy. The nation's central bank made cuts at seven straight meetings. It also agreed to start loaning money directly to Wall Street firms, rather than only to commercial banks, and to accept troubled mortgage-backed securities as collateral. In October, profits at Citigroup dropped sharply. One large financial institution after another reported heavy losses.

In January 2008, the Bank of America acquired Countrywide Financial in a deal that rescued the country's biggest mortgage lender. In February, Congress approved a $150-billion spending package to stimulate the sluggish economy. In March, on the verge of collapse and under pressure from the Federal Reserve, Bear Stearns was forced to accept a buyout by investment bank JPMorgan Chase at a fire-sale price. The deal was backed by Fed loans – up to $29 billion in financing to cover potential losses. In July, the California mortgage lender IndyMac collapsed and troubles deepened for Fannie Mae and Freddie Mac.

Which brings us to September 7, 2008, when Paulson announced the takeover of Fannie and Freddie. That put the government in charge of firms that owned or backed more than $5 trillion in mortgages. The Treasury Department agreed to provide up to $200 billion in loans to the cash-starved firms, crucial sources of mortgage funding for banks and other lenders. It was a bid to reverse the prolonged housing and credit crisis. That same month, Presidential candidate John McCain was still insisting that the "fundamentals of the economy are strong."

A week later, on Sept. 15, 2008, Lehman Brothers, burdened by $60 billion in soured real-estate holdings, filed for bankruptcy after attempts to rescue the 158-year-old firm failed. Merrill Lynch also agreed to be acquired by the Bank of America, and AIG asked for a bridge loan of billions of dollars from the Federal Reserve. The $50 billion Bank of America deal created a bank that would rival Citigroup, the biggest US bank in terms of assets. Meanwhile, stocks fell, the Dow Jones sliding 504 points – the worst drop since the 9/11 attacks. Stocks also posted big losses in markets across much of the globe. The day became known as "Black Monday."

The next day, the government agreed to an $85 billion emergency loan to rescue AIG, saying failure of the company could hurt the already delicate financial markets and the economy. That was Tuesday. On Wednesday, the Dow lost about 450 points more. Markets around the world were also having a confidence crisis, and Russia shut down its market for a third day following its worst plunge since 1998.

On Thursday, the Federal Reserve, working with banks in Europe, Canada and Asia, pumped as much as $180 billion into money markets to combat a seizing up of lending. Republicans blasted the Treasury Department and Fed for orchestrating the AIG bailout, and the White House for not informing them of the plan. John McCain said he would fire SEC Chair Chris Cox (which the President can’t actually do), and Barack Obama called it evidence of the failure of deregulation and Bush-McCain policies.

A joke in Washington was that the crisis seemed to be turning deregulators into socialists – at least as far as business risks were concerned. That was a bit exaggerated. But for a while the crisis did force the whole country to take a look at Reagan’s old claim that “government is the problem.”

In the end, Congress settled on a 451-page bill called The Emergency Economic Stabilization Act. The week after it passed, the Dow plunged by 500 points in one day, bringing the overall loss to 5,000 points for the year. Recession? Obviously. Depression? It was now a clear possibility.

Next, European central banks and the Federal Reserve cut interest rates at the same time in hopes of freeing up credit. Bloomberg News called this "an unprecedented coordinated effort to ease the economic effects of the worst financial crisis since the Great Depression." The same day the Federal Reserve gave AIG a new $37 billion loan. That brought the total to about $120 billion (of which executives and "high earners" quickly spent more than $400,000 on resort lodging and spas). The market plunged again the very next day, this time more than 650 points, to its lowest level since 2003.

By then, almost everyone agreed that government intervention was needed. Even some conservatives saw the light. On October 12, for example, James Baker III, Republican stalwart and the man who helped install George W. as president during the Florida election fiasco in 2000, said, "This is bigger than the private sector can fix by itself."

Clearly, this wasn’t the dawn of socialism in America. But neither was it unprecedented. In 1917, the government had seized the railroads to make sure goods, weapons and troops moved smoothly during World War I. Bondholders and stockholders were compensated, and railroads were returned to private ownership after the war ended. During World War II, Washington had seized dozens of companies including railroads, coal mines and, briefly, the Montgomery Ward department store chain. In 1952, President Harry Truman had seized 88 steel mills across the country, saying that their owners were provoking an industry-wide strike that would cripple the Korean War effort. That time, the Supreme Court ruled the seizures an unconstitutional abuse of presidential power.

In 1984, the government took an 80 percent stake in the Continental Illinois National Bank and Trust. The bank had failed, in part, because of bad oil loans in Oklahoma and Texas. One of the country's top 10 banks at the time, Continental Illinois was considered "too big to fail" by regulators, who feared wider turmoil in the financial markets. Sound familiar? It was sold to Bank of America in 1994.

The nearest precedent for the 2008 "rescue" plan was the investments made by the Reconstruction Finance Corporation in the 1930s. It not only made loans to distressed banks; it also bought stock in 6,000 banks, at a total cost of about $3 billion. When the economy eventually stabilized, the government sold the stock to private investors or the banks themselves.

Like most western democracies, the US government has clearly been operating with some socialist programs – within an undeniably capitalist system – for decades. Still, what are we actually talking about? We clearly don’t have the state running the economy. Rather, we have adopted programs designed to increase economic equality – and sometimes programs that have done the opposite. In other words, we’ve had redistribution of wealth. During the last few decades, it’s largely been redistribution toward the top.

So, given all the yelling and sign-waving, what do socialists actually believe? That capitalism unfairly concentrates power and wealth, creating an unequal society. Basically, a no-brainer at this point. Yet they disagree about how much government intervention is needed. Some certainly do advocate complete nationalization of production. But others prefer state control of capital within a market economy, while social democrats talk about selective nationalization of key elements in a mixed economy, along with tax-funded welfare programs. On the other hand, libertarian socialists don't like state control and prefer direct collective ownership – workers' coops, workers' councils, basically workplace democracy.

Most libertarian socialists, like libertarians in general, weren't happy about the 2008-2009 financial bailouts. Social democrats, in contrast, felt they didn't go far enough. And most capitalists? Well, they decried the situation but went along. Some even chirped that "we are all socialists now" – at least as far as losses are concerned.

The truth is, Americans have been using socialist ideas – although not living in a socialist society – for many years, and the sky hasn't fallen. But this doesn't matter to the politicians and talking heads hawking "out of control" government and a hostile takeover of the country. The attempt to stir up fears about socialism, and link it to terrorism and un-American activity, is a cheap but tried-and-true political ploy. It's also the latest incarnation of an ongoing culture war based on resentment, ignorance, and selfishness. The subtext is that we are not all equal, that being truly American means embracing a specific, very narrow set of values, and that the government shouldn't be a force for equality.

But let's give a conservative the last word. During the 2008 presidential campaign, George Will put it this way: "Ninety-five percent of what the government does is redistribute wealth. It operates on the principle of concentrated benefits and dispersed costs. Case in point: we have sugar subsidies. Costs the American people billions of dollars, but they don't notice it; it's in such small increments. But the few sugar growers get very rich out of this. Now we have socialism for the strong – that is, the well-represented and organized in Washington, like the sugar growers. But it's socialism nonetheless and it's not new."

Wednesday, September 16, 2009

Defining the "Real World"

Part 10 of Prisoners of the Real


"The great tradition of the nineteenth century was that science pursues the true nature of the real, pushing back the decimal places of its measurement."

-- C. West Churchman


"Man himself may be controlled by his environment, but it is an environment which is almost wholly of his own making."

-- B.F. Skinner


Consistent with the scientific approach, most leaders see their primary task as making values “operational,” using their analytical powers to help maintain a reasonable world. Loyal to the god of concrete knowledge, they avoid idealism and utopian visions, fleeing from pure thought in a quest for refined practice. Science holds out a seductive promise – objectivity, the ability to see the unvarnished facts. This affection for facts, along with the desire for a rational world order, is rooted linguistically in the 17th century definition of man as a rational being with a rational soul.


Once the term “rational” became part of common English usage, it also began to serve as the basis for a dangerous distinction between humans and other animals. St. Thomas Aquinas had opened the door for this dichotomy four centuries earlier, arguing that while most animals possessed only a "sensitive soul" – otherwise known as instinct – a "rational soul" was divinely implanted in the human fetus. Therefore, he concluded, the behavior of humans – at least human males – depended on reason. Descartes eventually took the idea even further, celebrating man the “reasoner” while simultaneously insisting that all other living creatures were no more than flesh and blood machines.


By 1750 humanity's unique status – rational rather than "merely animal" – had been combined with the notion of a "rational sovereignty of soul" by groups such as the Rational Christians Society. Before long faith was defined as a rational asset. Aristotle was re-examined and found to suggest that "our rational faculty is the gift of God." According to the emerging wisdom, humans had the capacity to be naturally rational beings, sound in judgment, sane and sensible, and surely not foolish, absurd or extravagant. However, this essentially theological definition implied that possession of a soul wasn't enough. The right of salvation had to be earned through reason – that is, by differentiating between good and evil. Difficult as this might be, the person of “regular life and rational mind” would not despair.


In the 18th and 19th centuries considerable stress was placed on forming opinions through pure or a priori reasoning. Even religion and revelation became matters of sensible evidence or intellectual demonstration. Rational theologians regarded reason as the chief or only guide in matters religious, and explained the apparently supernatural in a manner agreeable to the new standard.


Metaphysicians opposed empiricism and sensationalism, claiming that reason rather than sense was the foundation of certainty in knowledge. This assumption allowed physician-lexicographer Peter Mark Roget to make reason a synonym for consciousness, and to brand intuition as a hollow form of evasion.


In the 1852 edition of his Thesaurus of English Words and Phrases, Roget equated reason with understanding, reflection, meditation and wisdom. Its supposed opposite, intuition, was given synonyms such as chicanery, evasion, perversion, mystification, speciousness, nonsense, hairsplitting and quibbling; in short, it was defined as the absence of reason. Roget's linguistic encyclopedia etched this conceptual error in the language grooves of the Western world.


More recent psychological research has been used to overthrow the dichotomy of rational vs. instinctive behavior, and the rational nature of human beings has become a social standard that threatens to completely eclipse the idea of a spontaneous or inner nature. Many geneticists predict that the concept of instinct as unlearned behavior is destined to disappear, once scientists fully define the relationships between genes and behavior and determine the specific factors that control the form of various responses. The objective of their systematic analysis of response patterns is to replace instinct with scientifically valid explanations. The theory is that instinct consists of a motivating impulse, the signal or stimulus that releases the behavior, and the sequence of activities that carries it out. It is frequently argued that all three aspects may be learned.


B.F. Skinner welcomed the intention to eliminate the concept of instinct, which he called the notion of autonomous man with his inner agent. Yet he felt that research that was directed inward, seeking physiological correlates of mental activities, simply diverted attention from the external environment. "Evolutionary and environmental histories change an organism, but they are not stored within it," he argued. "Anatomists and physiologists will not find an ape, or a bull, or for that matter instincts (concealed in man's inner self). They will find anatomical and physiological features which are the product of an evolutionary history." All instincts are merely habits, he argued, and the self is a repertoire of behaviors within our grasp.


In A Guide to Rational Living, an early pop psychology self-help book, psychoanalysts Albert Ellis and Robert Harper described a method of self-control that fit well with the Skinnerian view. Avoiding completely the idea of instinct, they focused instead on a thought vs. emotion dichotomy. But emotions – defined as physiological responses to stimulus situations and the behaviors that express them – were actually considered forms of thought, the type that could only exist briefly, in extreme situations, without being bolstered by some type of thinking. The task of each human being, they suggested, is to select the appropriate mode of thought and practice its use.


In this way, they resolved the dichotomy. What is usually labeled thinking is "calm and dispassionate appraisal." Emotion, on the other hand, is thinking that is slanted or biased by some past perception or experience. The process of emoting is "semi-logical, fixated, prejudiced or bigoted thought," they wrote. Arguing that it is simply "illogical, stupid and ignorant" not to live rationally, they defined the term as "not foolish or silly; sensible; leading to efficient results, producing desired effects with a minimum of expense."


Humans are happiest, they argued, when they discipline their thoughts. The way to do this is by using observation and analysis for the self-regulation of emotions, since biological tendencies lead human beings to act in "ridiculous, prejudiced, amazingly asinine ways." Such views are entirely consistent with the 17th century interpretation of madness as the loss of rational truth.


For Ellis and Harper, as for Renaissance men, the norm was reason and any divergence was irrational. Though they stopped short of calling for disciplinary action in cases where someone failed to live rationally, they strongly urged that it was the only mature choice. And if one was "too emotionally blocked" to benefit from their approach, they simply suggested "working, working and (yes, everlastingly) working at it." Thus, rational living was linked directly to discipline and work. For these psychoanalysts and many of their self-help successors, work means dedication to scientific self-analysis, and idleness is submission to biased thought.


Although this simplistic approach to rationalism distorts the Platonic concept of truth as a synthesis of intuition and analysis, it nonetheless owes much to Plato's philosophy. Any study of rationalism will eventually lead to the chronicler of Socrates, since Plato's assumptions concerning the individual and society served as a main springboard for European thought over the following 2400 years. Rationalists such as Descartes, Hobbes, Diderot and Hegel enlarged upon and further refined many of the principles articulated in Plato's Meno, The Republic, Phaedrus, Theaetetus, and other dialogues.


Pythagoras, who coined the term philosophy, felt it had a mystical as well as an intellectual basis. And like Pythagoras, Plato often transcended the clear boundaries of reason, constructing a philosophy of organism that paired the temporal and the eternal, the actual and the potential, the operational and the ideal. Nevertheless, Plato's legacy has been an extended philosophical excursion within the arbitrary boundaries of rational thought, a vain search for the "true nature of the real."


Next: Questions of Knowledge


Previous:

The Creative Also Destroys

Deconstructing Leadership

Anatomy of Insecurity

Managers and Their Tools

The Corporate Way of Life

The Dictatorship of Time

Rules for Rationals

The Age of Adaptability

Living with Rational Management

Friday, September 11, 2009

The Oily Road to 9/11

As troops and planes headed toward Afghanistan in October 2001, few people questioned the reasons for military engagement. But the causes of war are rarely simple and, as time has passed, other powerful motives have come into focus.

As it turns out, the US war plan was in the works months before the 9/11 attacks. And, like the two Gulf Wars, the rationale was also, if not mainly, rooted in a struggle over access to oil and gas, in this case huge finds in the Caspian Sea Basin. What looked at the time like justified retaliation was, in essence, the first resource war of the 21st century.

For the major energy companies, the Caspian is a new “El Dorado.” North of the Persian Gulf, and including Russia, Iran, and former republics of the Soviet Union, it is estimated to contain the world’s second or third largest reserves of petroleum, along with a vast supply of natural gas. The region is landlocked, however, so resources found there must move to market by rail or pipeline through adjacent, often unstable states.

Despite complex geopolitics and considerable risks, major oil companies have been acquiring development rights and preparing for production since the early 1990s. By 2001, offshore drilling operations were underway in Azerbaijan and Kazakhstan, and were set to commence elsewhere. The majors have also invested significantly in the future construction of oil and gas pipelines to distant ports and refineries. By 2010, they expect to invest at least $50 billion in production and transportation.

The first big move was a joint venture between Chevron and Kazakhstan, signed in 1993 to develop the huge Tengiz oil field on the Caspian coast. Three years later, Mobil purchased a 25 percent share. Another consortium focused on Azerbaijan’s offshore fields, with estimated reserves of 32 billion barrels of oil and 35 trillion cubic feet of natural gas, making it the third largest potential regional source.

In 1994, BP, Amoco, Lukoil, Unocal, Pennzoil, Statoil, and others joined with Azerbaijan’s state oil company to form the Azerbaijan International Operating Company. Bush family adviser James A. Baker III, who spearheaded George W. Bush’s victory in the Florida election dispute, headed the law firm representing this consortium and sat on the U.S.-Azerbaijan Chamber of Commerce advisory council, as did Vice Pres. Dick Cheney before him. But before their investments could produce profits, roadblocks would have to be removed. The biggest was how to get the fuel to markets.

Prior to 9/11, the U.S. government’s preferred future route, known as the Baku-Tbilisi-Ceyhan (BTC) project, went from Azerbaijan through Georgia and then south to the Turkish coast. The goal was to reduce reliance on Russia and bring the southern Caucasus into the US fold. National Security Adviser Condoleezza Rice is a former director of Chevron, a lynchpin of the BTC consortium with extensive operations in Azerbaijan. Until 2000, Cheney was chief executive at Halliburton Co., named a finalist in 2001 to bid on engineering work in the Turkish sector.

Some companies showed more interest in a less expensive route to the Persian Gulf through Iran. But this clashed with official US policy, including a 1995 executive order prohibiting US business with Iran and the 1996 Iran and Libya Sanctions Act, which limited oil investments. A third option was a pipeline from the Dauletabad gas fields in eastern Turkmenistan south through Pakistan to the Arabian Sea, a route across western Afghanistan. After 1995, however, that meant dealing with the Taliban.

This wasn’t easy. Although a delegation from Afghanistan visited Washington in February 1997 to secure recognition and meet with Unocal, only two months later the new regime unexpectedly announced that it would award a pipeline contract to the company that started work first. Unocal Pres. John Imle was baffled but refused to give up.

During the summer, a new association chaired by Unocal was formed to promote Turkmenistan-U.S. cooperation. But the Taliban threw another curveball, announcing that it was leaning toward Bridas. To press its advantage, the Argentina-based company joined forces with another major, Amoco. Still in the game, however, Unocal made some headway with Pakistan, signing a 30-year pricing agreement. Despite complaints, US pressure was paying off. By October, the pieces appeared to be in place. Led by Unocal, Delta, Turkmenistan, Japan’s Itochu Oil, Indonesia Petroleum, Crescent Group, and Hyundai became partners in the new Central Asia Gas Pipeline Ltd. (CentGas). Gazprom signed soon after.

Still hoping to win over the Taliban, Unocal invited a delegation to visit corporate headquarters in Sugar Land, Texas. The Afghan visitors also met with State Department officials. But the negotiations failed, allegedly because the Taliban wanted too much money. Sensing trouble, Gazprom pulled out of the consortium, leaving Unocal at risk with a 54 percent interest. Shortly thereafter, Unocal Vice Pres. John J. Maresca, later to become a special ambassador to Afghanistan, testified before the US House. Until a single, unified, and friendly government was in place in Afghanistan, he told lawmakers on Feb. 12, 1998, a trans-Afghani pipeline wouldn’t be built. The need for a regime change had been put on the table.

By this time, it was quite clear that Afghanistan was one of bin Laden’s main operational bases. But the CIA apparently ignored the warnings until the US embassies in Kenya and Tanzania were bombed. Thirteen days later, the United States retaliated, sending cruise missiles into al Qaeda camps near Khost and Jalalabad. Finally getting the message, Unocal officially suspended its Afghan pipeline plan and pulled out staff throughout the region. Before the end of 1998, it also withdrew from the $2.9 billion Turkmenistan-to-Turkey natural gas project, as well as the Afghan consortium. Unocal’s quest for “El Dorado” had been indefinitely postponed.

Taking advantage of an opening, Bridas resumed negotiations with Russia, Turkmenistan, and Kazakhstan. Shortly, Turkmenistan’s foreign minister met with the Taliban’s Mullah Omar to discuss the proposed pipeline. Enron also expressed an interest. With $3 billion invested in a plan to build an electrical generating plant at Dabhol, India, it had recently lost access to liquid natural gas supplies from Qatar to fuel the plant. A trans-Afghani gas pipeline from Turkmenistan, terminating near the Pakistan-India border, looked like a promising alternative. Before the end of April 1999, Pakistan, Turkmenistan, and the Taliban had sealed an agreement to revive that project.

The Bush family was well acquainted with the bin Ladens long before the Saudi renegade declared war on the United States and its allies in Saudi Arabia’s royal family. One of Bush II’s former business partners claims to have made his first million in the early 1980s with the aid of a company financed by Osama bin Laden’s elder brother, Salem. Both Bush I and II had investments with the Saudi family in the Carlyle Group, a relatively small company that went on to become a large US defense contractor.

Even after the 1998 embassy attacks, the relationship remained cordial. In 1998 and 2000, the first Pres. Bush traveled to Saudi Arabia on behalf of Carlyle, meeting privately with both the Saudi royals and several of Osama’s relatives.

Shortly after it moved into the White House, the Bush II administration reportedly told the FBI and intelligence agencies to back off investigations involving the family. The bureau was apparently interested in two bin Laden relatives, Abdullah and Omar, who were living near CIA headquarters in Virginia. A blind spot for Saudi Arabia, as well as Bush family contacts with the bin Ladens, also helps explain why no action was taken when the FBI told the new administration there was clear evidence tying al Qaeda to the October 2000 bombing of the USS Cole.

Then-National Security Advisor Rice certainly knew something was up. Her predecessor, Sandy Berger, had briefed her in detail, advising that she would “be spending more time on this issue than on any other.” Yet, according to a May 2002 Newsweek cover story, “What Bush Knew,” a strategic review “was marginalized and scarcely mentioned in the ensuing months as the administration committed itself to other priorities, like national missile defense (NMD) and Iraq.”

The administration didn’t ignore the Taliban, however. On the contrary, it offered aid. In May 2001, Secretary of State Colin Powell announced a $43 million package for the regime, purportedly to assist hungry farmers who were starving since the destruction of their opium crop on orders from the Taliban’s leaders.

By June 2001, the warning signs were obvious for anyone willing to listen. German intelligence had informed both the CIA and Israel that Middle Eastern terrorists were “planning to hijack commercial aircraft to use as weapons to attack important symbols of American and Israeli culture.” On June 28, CIA Director George Tenet informed Rice that it was “highly likely” that a “significant Qaeda attack” would take place “in the near future.” Before he reached Genoa in July for the G-8 summit, Bush obviously understood the danger. Among others, Egyptian Pres. Hosni Mubarak had issued a blunt warning: Someone wanted to crash a plane filled with explosives into the conference site.

But word of imminent US military action was also leaking out. During a meeting with Pakistani and Russian intelligence officers in Berlin, three former US officials revealed that Washington was planning military strikes against Afghanistan. They even speculated on the launch date — October 2001.

Unfortunately, Taliban members may also have been in the room, or at least privy to what was said. In any event, the British press later reported that Pakistan’s secret service had relayed the news to the Taliban leadership. So much for the element of surprise.

When revelations surfaced that the United States had received credible warnings of an impending attack, officials protested that the information was too vague and that, in any case, Bush II didn’t know about the possibility that airplanes might be hijacked until an Aug. 6, 2001, briefing. A key element of this defense was that intelligence available to the CIA never reached the president’s desk. True or not, it was the most convenient explanation. However, given the available warnings, not to mention US plans to mount an attack on Afghanistan, the failure to take effective preventive measures looks, at the very least, like a case of willful disregard.

In the weeks after 9/11, national mourning, frustration, and anger, adroitly stoked by the major media, provided a more than adequate justification for the military battle plan hatched months before. A worldwide campaign against terrorism and an alleged “axis of evil” that included Iraq, Iran, and North Korea would have sounded needlessly militant or overly ambitious before 9/11. Afterward, it was hard, even risky, to speak out against the call to war. The order of the day was unity, and whatever the administration needed, Congress (and the public) seemed willing to supply.

This essay is partially excerpted from Uneasy Empire: Repression, Globalization and What We Can Do, and was originally published by The Vermont Guardian and Toward Freedom in September, 2006.

Sunday, September 6, 2009

Mission Improbable: Revisiting 9/11

Eight years after the terrible events of September 11, 2001, questioning the official story can still get you into trouble, as Obama administration advisor Van Jones just found out. On September 5, Jones, a California leader in both the civil rights and environmental movements, was forced to resign from his green jobs role with the Council on Environmental Quality, in large part because he signed a petition in 2004 that questioned whether Bush administration officials "may indeed have deliberately allowed 9/11 to happen, perhaps as a pretext for war." Issuing an apology for such an opinion, even though millions of people agree and credible evidence points in that direction, just didn’t cut it.


In 2005, I went farther than Jones, publishing an article that examined some of the theories surrounding those attacks in New York and Washington, DC. Subsequently, that story was cited by some journalists as evidence that its author was some kind of “conspiracy nut.” I disagreed then, and I still do. One of the top stories on the 2010 Project Censored list supports my view. Covering some of the same ground, it notes that 700 scientific professionals in the fields of architecture, engineering, and physics have now concluded that the official explanation for the collapse of the World Trade Center buildings is implausible according to laws of physics, and that the collapse of WTC 7, the 47-story building that wasn’t hit by planes, is clear evidence of controlled demolition.


We’ll probably be debating what really happened for many years, and perhaps will never know the whole story. Nevertheless, as we approach another 9/11 anniversary, here again is my 2005 article:

The public has repeatedly been urged to ignore "outrageous" conspiracy theories about the Sept. 11, 2001, attacks that set in motion the so-called "war on terrorism." However, the official explanation that has been provided — and widely embraced — also requires the acceptance of a theory, one involving a massive intelligence failure, 19 Muslim hijackers under the sway of Osama bin Laden, and the inability of the world’s most advanced Air Force to intercept four commercial airplanes.

"A good theory explains most of the relevant facts and is not contradicted," notes David Ray Griffin, who has been examining the available evidence and has so far published two books on the subject. Griffin summarized his findings for more than 1,000 people in four well-attended Vermont talks. The bottom line, he informed a packed house in Burlington on Oct. 12, (2005) is that "every aspect of the official story is problematic," contradicting the available evidence and defying even the laws of physics.

You may well ask, how can this be true? And, if so, why haven’t we heard more about it? The answer to the second question is easy: Mainstream media outlets have consistently declined to examine the highly technical and exhaustively documented case Griffin has developed. That may also sound like a conspiracy theory, but the almost total news blackout of Griffin’s Vermont talks suggests that it’s an unfortunate fact.

Explaining why the official story doesn’t hold water is a bit more difficult, involving a many-layered analysis not easily summarized in sound bites. Nevertheless, in person Griffin did manage to provide a provocative and persuasive summary of some information in his books, The New Pearl Harbor: Disturbing Questions about the Bush Administration and 9/11 and The 9/11 Commission Report: Omissions and Distortions.

For example, he explained that the Federal Emergency Management Agency (FEMA) concluded that structural damage and extremely hot fires caused the collapse of the Twin Towers, despite the fact that "fire has never caused steel frame high-rise buildings to collapse" and, contrary to rumor, the towers were designed to withstand the impact of airliners as large as those seen crashing into them.

FEMA also said that the steel beams of the building buckled. For that to happen, however, the fires had to reach 2,800 degrees Fahrenheit and last a long time. But jet fuel only reaches 1,700 degrees, and the black smoke seen billowing from the towers established that most of the jet fuel had burned up within the first 10 minutes. In fact, the fires were clearly waning; their heat didn’t even break most of the windows, and one survivor who reached the 80th floor of the south tower saw only smoke and very little fire, Griffin notes.

So, if the "fire theory" doesn’t stand up to scrutiny, how did it happen? Although Griffin doesn’t completely commit himself to a specific theory, his evidence demonstrates that the collapses have at least 10 features in common with "controlled demolition"— that is, a series of carefully planned and timed explosions. When confronted with this unsettling information, some people simply go into denial.

In any case, as everyone who watched the tragedy unfold on TV knows, the collapses were quite sudden. But steel doesn’t suddenly buckle, so the process should have been gradual. The buildings also came straight down at almost free fall speed in about 10 seconds. That indicates there was no resistance. As Griffin explains in The New Pearl Harbor, "If each floor produced just a little resistance, so that breaking through each one took a half second, the collapse of all those floors [in the south tower] would have taken 40 to 47 seconds."
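
To put rough numbers on those timing claims, here is a back-of-the-envelope sketch (my own arithmetic, not a calculation from Griffin's books; the tower height of roughly 417 meters and g = 9.8 m/s² are assumed values, and the 80-to-94-floor range is simply inferred from his 40-to-47-second figure):

\documentclass{article}
\begin{document}
% Free-fall time from the towers' approximate height (assumed, about 417 m):
\[
t_{\text{free fall}} = \sqrt{\frac{2h}{g}}
  = \sqrt{\frac{2 \times 417\,\mathrm{m}}{9.8\,\mathrm{m/s^2}}}
  \approx 9.2\,\mathrm{s},
\]
close to the observed collapse time of about 10 seconds. By contrast, at the
quoted half second of resistance per floor,
\[
t_{\text{floor by floor}} \approx 0.5\,\mathrm{s/floor} \times (80 \text{ to } 94 \text{ floors})
  = 40 \text{ to } 47\,\mathrm{s}.
\]
\end{document}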

The collapses were also total, creating a neat pile of rubble with no core columns left standing, an outcome consistent with the use of explosives to slice beams in controlled demolitions. In this case, the beams recovered at Ground Zero were already neatly cut up, then removed without forensic examination and quickly sold to scrap dealers who exported them to places like China and Korea.

In addition, a lot of dust was produced, a pulverization effect that also strongly suggests the use of powerful explosives, and a considerable amount of material was horizontally ejected for long distances, a reaction that defies physics unless explosives were used. If that isn’t enough to raise some questions, many witnesses reported seeing and hearing a series of explosions, some of them coming from beneath the building.

The New York Fire Department recorded more than 500 oral histories concerning the events, but New York Mayor Michael Bloomberg refused to release them until a New York court of appeals required it. As a result, we now know that many witnesses mentioned explosions — "like bombs going off," one said — as well as low level flashes and the buildings blowing out on all sides.

There’s a lot more. For example, Building 7, which collapsed later in the day, wasn’t hit by a plane, and suffered only small fires on a few floors, ruling out fire as the cause. More troubling still, 25 firefighters and medical workers were told that the building was coming down hours before it happened and were ordered to move at least five blocks away. Then-Mayor Rudolph Giuliani was also informed about the impending collapse of Building 7 beforehand. The 9/11 Commission dealt with the whole problem by omitting any mention of Building 7 from its report.

Another startling revelation, also backed up with solid evidence and corroborative testimony, is that Flight 77, a Boeing 757, could not have caused the damage to the West Wing of the Pentagon. Among other things, its burned-out wreckage was not found at the site, and the 20-foot wide hole in the building was far too small for such a plane. Within minutes of the crash, the FBI showed up at a gas station across the street and seized the footage from a security camera that may have recorded the moment of impact. Most evidence points to a smaller plane or some sort of missile.

In addition to the physical evidence, reasons cited by Griffin and others who question what actually hit the Pentagon and why include: the extreme difficulty of maneuvering a huge plane during a 7,000 foot descent in less than three minutes; the inexperience of the supposed pilot; the flight’s approach to Washington for 29 minutes without being detected by radar; and the failure of the National Military Command Center to protect the most well-defended building on the planet.

As for the fourth plane, Flight 93, which supposedly crashed in Pennsylvania after passengers seized control, the evidence points instead to a shoot down. Fighter jets tailed that plane, according to a flight controller, CBS news, and witnesses on the ground. One passenger who placed a cell phone call just before contact was lost reported hearing "some sort of explosion" and seeing "white smoke." Witnesses on the ground heard loud bangs consistent with a missile strike and saw the plane suddenly drop "like a stone," Griffin reports.

Hard to believe? Certainly. But one of the reasons we don’t know for sure is that the tape of the cockpit recording reviewed by relatives of the victims ends at 10:02, while a seismic study establishes that the crash occurred at 10:06. Adding to the mystery, no flight control transcript of that flight has been released.

Such claims raise numerous questions, many of which are addressed in Griffin’s books and subsequent lectures. One of the biggest is why, if 9/11 was some sort of "inside" job, anyone would want such destruction to take place. Another is who knew and, even more important, who made it possible?

Aware of such questions, Griffin sticks with what is known. For example, both the Bush administration and Larry Silverstein, who owned Building 7 and had leased the WTC earlier in 2001, clearly benefited. How? Silverstein collected $7 billion in insurance on property that was losing money and faced major problems caused by asbestos, while the administration needed "an archetypal event" in order to implement the plans to invade Afghanistan and Iraq that several key administration figures had been developing for more than a decade.

If there was some "official complicity," there are many possible forms it could have taken, Griffin points out. The least serious — but still enough to provide grounds for impeachment — is that U.S. officials played no role, but covered up embarrassing facts to exploit the tragedy. Another alternative is that intelligence agencies knew something in advance, but didn’t prevent the attacks and persuaded the administration to help with a cover-up.

More difficult to accept is the suggestion that those agencies or the Pentagon may have had specific information, or even direct involvement. And, of course, the most troubling idea — the one most apt to spark angry charges of Bush bashing or anti-American extremism — is that someone in the executive branch may have known something, or even provided some help.

While most of these options may sound unlikely (at least until one hears Griffin speak or reads his books), it certainly wouldn’t be the first time elements within the government orchestrated horrific events — and lied afterward — to achieve a long-term aim. As Richard Falk points out in his introduction to The New Pearl Harbor, events were manipulated to justify the Spanish-American War, the U.S. entry into World War II, the expansion of the Vietnam War, and the current Iraq war. Scholars also have challenged the official accounts of the atomic bombing of Hiroshima and Nagasaki.

Griffin is often asked why, if his account is to be seriously considered, someone involved hasn’t spoken up. In other words, how could it be kept secret? "We don’t know the secrets they have kept," he replied in Burlington. "The Manhattan Project [to create the atomic bomb] was kept secret for a long time, as well as a war in Indonesia during the 1950s. Things are compartmentalized, with information available on a need-to-know basis.

"Most people are afraid for their jobs," he continued, adding that "if they talk and disobey, they can be imprisoned and worse … And when people have spoken the press hasn’t reported it. But members of the New York police and fire departments know it was an inside job." Another reason, he said, is that "those who know probably think it would be so disturbing that it’s better to let people believe the official version."

As more information emerges, however, and is catalogued on websites like 911truth.org and 911citizens.org, the official "conspiracy theory" begins to look more "outrageous" than the admittedly controversial contention that the attacks were somehow orchestrated from within the United States.

Hammering home the point that most of what we think we know may be mistaken, Griffin also points out that even the identities of the hijackers remain in doubt. In the months following 9/11, the London Times, Associated Press, and Saudi embassy in Washington reported that at least five of the 19 men whose photos and names circulated worldwide were still alive.

So, was bin Laden really the mastermind? If he was a player, did he have some help? These are two of the many troubling questions that arise from Griffin’s analysis. At this point, we simply don’t know, and not much can be said with complete certainty, except that without 9/11, George Bush would not have been able to declare himself a "war president" and there would have been no convincing reason to expand the federal government’s power through legislation like the USA PATRIOT Act.

Given the administration’s now discredited claims about Iraqi leader Saddam Hussein’s connection to the attacks and weapons of mass destruction, it doesn’t stretch credulity to conclude that, based on the considerable conflicting evidence (rather than more comforting assumptions), the public has yet to hear the whole story. For that to change, however, the media’s self-imposed myopia will have to end, at last granting Griffin’s research a thorough review, and perhaps even prompting a more credible and comprehensive official examination than has so far been conducted.