NEWS2U Articles & Comments
Critical Reporting

Monday, March 31, 2008

What Every American Should Know About the Middle East


By Daniel Miessler
March 30th, 2008


Most people in the United States don't know much about the Middle East or the people who live there.

This lack of knowledge hurts our ability to understand world events and, consequently, our ability to hold intelligent opinions about those events.

For example, frighteningly few know the difference between Sunni and Shia Muslims, and most think the words “Arab” and “Muslim” are pretty much interchangeable.

They aren't. So here's a very brief primer aimed at raising knowledge about the region to at least a bare minimum.

Basics

Arabs are part of an ethnic group, not a religion.

Arabs were around long before Islam, and there have been (and still are) Arab Christians and Arab Jews.

In general, you're an Arab if you 1) are of Arab descent (blood), or 2) speak the main Arab language (Arabic).

Islam is a religion.

A Muslim (roughly pronounced MOOSE-lihm) is someone who follows the religion. So you wouldn’t say someone follows Muslim or is an Islam, just as you wouldn’t say someone follows Christian or is a Christianity.

Shia Muslims are similar to Roman Catholics in Christianity.

They have a strong clerical presence via Imams and promote the idea of going through them to practice the religion correctly. Sunni Muslims are more like Protestant Christians. They don’t really focus on Imams and believe in maintaining a more direct line to God than the Shia.

People from Iran are also known as Persians, and they are not Arabs.

Arabs are Semites. We’ve all heard the term antisemitism being used — often to describe Arabs.

This doesn't make sense, given that the word "Semite" comes from the Bible and refers to anyone who speaks one of the Semitic languages. That includes both Jews and Arabs.

According to the Bible, Jews and Arabs are related [Genesis 25]. Jews descended from Abraham’s son Isaac, and Arabs descended from Abraham’s son Ishmael. So not only are both groups Semitic, but they’re also family.

Sunni Muslims make up most of the Muslim world (roughly 90%).[1]

The country with the world's largest Muslim population is Indonesia.[2]

The rift between the Shia and Sunni started right after Muhammad's death, and it originally amounted to a power struggle over who would become the authoritative group for continuing the faith.

The Shia believed Muhammad's cousin and son-in-law Ali should have taken over (the family/cleric model).

The Sunni believed that the best person for the job should be chosen by the followers (the merit model), and that's how the first Caliph, Abu Bakr, was appointed.

Although the conflict began as a political struggle, it is now mostly considered a religious and class conflict, with political conflict emanating from those rifts.

Sunni vs. Shia, Arab vs. Non-Arab

Here’s how the various Middle Eastern countries break down in terms of Sunni vs. Shia and whether or not they are predominantly Arab. Keep in mind that these are generalizations; significant diversity exists in many of the countries listed.

Iraq Mostly Shia (roughly 60%). Arab. Under Saddam the Shia were oppressed, and the Sunni held power despite being only 20% of the population.

Iran Shia. NOT Arab.
Palestine Sunni. Arab.
Egypt Sunni. Arab.
Saudi Arabia Sunni. Arab.
Syria Sunni. Arab.
Jordan Sunni. Arab.
Gulf States Sunni. Arab.

Conclusion

What's depressing is that this took me only 20 minutes to write and you 2 minutes to read. Yet most people in the United States, including those in the media, the House of Representatives, and probably even the Pentagon, lack even this cursory level of knowledge about the region.

References:

[1] The CIA World Factbook, Field Listing: Religions
[2] The CIA World Factbook: Indonesia
Wikipedia: Sunni Muslims
Wikipedia: Shia Muslims
Wikipedia: Arabs

Source:
http://dmiessler.com/blog/what-every-american-should-know-about-the-middle-east
______________

Sunday, March 30, 2008

Bats are Dying and No One Knows Why


March 24th, 2008


Al Hicks was standing outside an old mine in the Adirondacks, the largest bat hibernaculum, or winter resting place, in New York State.

It was broad daylight in the middle of winter, and bats flew out of the mine about one a minute.

Some had fallen to the ground where they flailed around on the snow like tiny wind-broken umbrellas, using the thumbs at the top joint of their wings to gain their balance.

All would be dead by nightfall. Mr. Hicks, a mammal specialist with the state's Environmental Conservation Department, said: "Bats don't fly in the daytime, and bats don't fly in the winter. Every bat you see out here is a 'dead bat flying,' so to speak."

They have plenty of company. In what is one of the worst calamities to hit bat populations in the United States, on average 90 percent of the hibernating bats in four caves and mines in New York have died since last winter.

Wildlife biologists fear a significant die-off in about 15 caves and mines in New York, as well as at sites in Massachusetts and Vermont. Whatever is killing the bats leaves them unusually thin and, in some cases, dotted with a white fungus. Bat experts fear that what they call White Nose Syndrome may spell doom for several species that keep insect pests under control.

Researchers have yet to determine whether the bats are being killed by a virus, bacteria, toxin, environmental hazard, metabolic disorder or fungus. Some have been found with pneumonia, but that and the fungus are believed to be secondary symptoms.

"This is probably one of the strangest and most puzzling problems we have had with bats," said Paul Cryan, a bat ecologist with the United States Geological Survey. "It's really startling that we've not come up with a smoking gun yet."

Merlin Tuttle, the president of Bat Conservation International, an education and research group in Austin, Tex., said: "So far as we can tell at this point, this may be the most serious threat to North American bats we've experienced in recorded history. It definitely warrants immediate and careful attention."

This month, Mr. Hicks took a team from the Environmental Conservation Department into the hibernaculum that has sheltered 200,000 bats in past years, mostly little brown bats (Myotis lucifugus) and federally endangered Indiana bats (Myotis sodalis), with the world's second largest concentration of small-footed bats (Myotis leibii).

He asked that the mine location not be published, for fear that visitors could spread the syndrome or harm the bats or themselves.

Other visitors do not need directions. The day before, Mr. Hicks saw eight hawks circling the parking lot of another mine, waiting to kill and eat the bats that flew out.

In a dank galley of the mine, Mr. Hicks asked everyone to count how many out of 100 bats had white noses. About half the bats in one galley did. They would be dead by April, he said.

Mr. Hicks, who was the first person to begin studying the deaths, said more than 10 laboratories were trying to solve the mystery.

In January 2007, a cave explorer reported an unusual number of bats flying near the entrance of a cavern near Albany. In March and April, thousands of dead bats were found in three other mines and caves. In one case, half the dead or living bats had the fungus.

One cave had 15,584 bats in 2005, 6,735 in 2007 and an estimated 1,500 this winter. Another went from 1,329 bats in 2006 to 38 this winter. Some biologists fear that 250,000 bats could die this year.

Since September, when hibernation began, dead or dying bats have been found at 15 sites in New York. Most of them had been visited by people who had been at the original four sites last winter, leading researchers to suspect that humans could transmit the problem.

Details on the problem in neighboring states are sketchier. "In the Berkshires in Massachusetts, we are getting reports of dying/dead bats in areas where we do not have known bat hibernacula, so we may have more sites than we will ever be able to identify," said Susi von Oettingen, an endangered species biologist with the United States Fish and Wildlife Service.

In Vermont, Scott Darling, a wildlife biologist with the Fish and Wildlife Department, said: "The last tally that I have is approximately 20 sites in New York, 4 in Vermont and 2 in Massachusetts. We only have estimates of the numbers of bats in the affected sites: more or less 500,000. It is impossible for us to count the dead bats, as many have flown away from the caves and died (we have over 90 reports from citizens across Vermont), as well as many are still dying."

People are not believed to be susceptible to the affliction. But New Jersey, New York and Vermont have advised everyone to stay out of all caverns that might have bats. Visitors to affected caves and mines are asked to decontaminate all clothing, boots, ropes and other gear, as well as the car trunks that transport them.

One affected mine is the winter home to a third of the Indiana bats between Virginia and Maine. These pink-nosed bats, two inches long and weighing a quarter-ounce, are particularly social and cluster together as tightly as 300 a square foot.

"It's ironic, until last year most of my time was spent trying to delist it," or take it off the endangered species list, Mr. Hicks said, after the state's Indiana bat population grew to 52,000 from 1,500 in the 1960s.

"It's very scary and a little overwhelming from a biologist's perspective," Ms. von Oettingen said. "If we can't contain it, we're going to see extinctions of listed species, and some species that are not even listed."

Neighbors of mines and caves in the region have notified state wildlife officials of many affected sites when they have noticed bats dead in the snow, latched onto houses or even flying in a recent snowstorm.

Biologists are concerned that if the bats are being killed by something contagious either in the caves or elsewhere, it could spread rapidly, because bats can migrate hundreds of miles in any direction to their summer homes, known as maternity roosts. At those sites, females usually give birth to one pup a year, an added challenge for dropping populations.

Nursing females can eat up to half their weight in insects a day, Mr. Hicks said.

Researchers from institutions like the Centers for Disease Control and Prevention, the United States Geological Survey's National Wildlife Health Center, Boston University, the New York State Health Department and even Disney's Animal World are addressing the problem.

Some are considering trying to feed underweight wild bats to help them survive the remaining weeks before spring. Some are putting temperature sensors on bats to monitor how often they wake up, and others are making thermal images of hibernating bats.

Other researchers want to know whether recently introduced pesticides, including those released to stop West Nile virus, may be contributing to the problem, either through a toxin or by greatly reducing the bats' food source.

Dr. Thomas H. Kunz, a biology professor at Boston University, said the body composition of the bats would also be studied, partly to determine the ratio of white to brown fat. Of particular interest is the brown fat between the shoulder blades, known to assist the bats in warming up when they begin to leave deep hibernation in April.

"It appears the white nose bats do not have enough fat, either brown or white, to arouse," Dr. Kunz said. "They're dying in situ and do not have the ability to arouse from their deep torpor."

His researchers' cameras have shown that bats in the caves that do wake up when disturbed take hours longer to do so, as was the case in the Adirondack mine. He also notes that if females become too emaciated, they will not have the hormonal reactions necessary to ovulate and reproduce.

In searching for a cause of the syndrome, researchers are hampered by the lack of baseline knowledge about habits like how much bats should weigh in the fall, where they hibernate and even how many bats live in the region.

"We're going to learn an awful lot about bats in a comprehensive way that very few animal species have been looked at," said Dr. Elizabeth Buckles, an assistant professor at Cornell who coordinates bat research efforts. "That's good. But it's unfortunate it has to be under these circumstances."

The die-offs are big enough that they may have economic effects. A study of Brazilian free-tailed bats in southwestern Texas found that their presence saved cotton farmers a sixth to an eighth of the cash value of their crops by consuming insect pests.

"Logic dictates when you are potentially losing as many as a half a million bats in this region, there are going to be ramifications for insect abundance in the coming summer," Mr. Darling, the Vermont wildlife biologist, said.

As Mr. Hicks traveled deeper in the cave, the concentrations of bats hanging from the ceiling increased. They hung like fruit, generally so still that they appeared dead. In some tightly packed groups, just individual noses or elbows peeked through. A few bats had a wing around their nearest cavemates. Their white bellies mostly faced downhill. When they awoke, they made high squeaks, like someone sucking a tooth.

The mine floors were not covered with carcasses, Mr. Hicks said, because raccoons come in and feed on them. Raccoon scat dotted the rocks along the trail left by their footprints.

During the six hours spent in the cave taking samples, nose counts and photographs, Mr. Hicks said that, for him, trying for the perfect picture was a form of therapy. "It's just that I know I'm never going to see these guys again," he said. "We're the last to see this concentration of bats in our lifetime."

Source:
http://www.impactlab.com/2008/03/24/bats-are-dying-and-no-one-knows-why/#more-15993
______________

Thursday, March 27, 2008

Oil for War

After invading one of the most petroleum-rich countries on earth, the U.S. military is running on empty.


by Robert Bryce
March 10, 2008
The American Conservative


Napoleon famously said that an army marches on its stomach. That may have been true for his 19th-century force. But the modern American military runs on jet fuel—and lots of it.

Today the average American G.I. in Iraq uses about 20.5 gallons of fuel every day, more than double the daily volume consumed by U.S. soldiers in Iraq in 2004. Thus, in order to secure the country with the planet's third-largest oil reserves, the U.S. military is burning enormous quantities of petroleum. And nearly every drop of that fuel is imported into Iraq.

These massive fuel requirements—just over 3 million gallons per day for Operation Iraqi Freedom, according to the Pentagon’s Defense Energy Support Center—are a key reason for the soaring cost of the war effort.

Controlling Iraq’s oil has historically been a vital factor in America’s involvement in Iraq and was always a crucial element of the Bush administration’s plans for the post-Saddam era. Of course, that’s not how the war was sold to the American people.

A few months before the invasion, Secretary of Defense Donald Rumsfeld declared that the looming war had “nothing to do with oil, literally nothing to do with oil.” The war was necessary, its planners claimed, because Saddam Hussein supported terrorism and, left unchecked, he would unleash weapons of mass destruction on the West.

Nevertheless, oil was the foremost strategic focus for the U.S. military in Iraq.

The first objectives of the invading forces included the capture of key Iraqi oil terminals and oilfields. On March 20, 2003, Navy SEALs engaged in the first combat of the war when they launched a surprise invasion of the Mina al-Bakr and Khor al-Amaya oil loading terminals in the Persian Gulf. A few hours later, Marine Lt. Therral Childers became the first U.S. soldier to die in combat in the invasion when he was killed fighting for control of the Rumaylah oil field in southern Iraq.

Oil was also the first objective when U.S. forces reached Baghdad on April 8.

Although the National Library of Iraq, the National Archives, and the National Museum of Antiquities were all looted and in some cases burned, the oil ministry building was barely damaged. That’s because a detachment of American soldiers and a half-dozen assault vehicles were assigned to guard the ministry and its records.

After all, the war’s architects had promised that oil money was going to rebuild Iraq after the U.S. military took control.

In March 2003, Paul Wolfowitz told a Congressional panel, “The oil revenues of that country could bring between $50 and $100 billion over the course of the next two or three years. Now, there are a lot of claims on that money, but … we are dealing with a country that can really finance its own reconstruction and relatively soon.”

As Michael Gordon and Bernard Trainor explained in their 2006 book, Cobra II, “The Pentagon had promised that the reconstruction of Iraq would be ‘self-financing,’ and the preservation of Iraq’s oil wealth was the best-prepared and -resourced component of Washington’s postwar plan.”

After the invasion, when inspectors failed to find any weapons of mass destruction, Bush and his supporters changed their story, claiming that the U.S. had invaded Iraq to spread democracy in the Middle East. When democracy failed to materialize, the justification for the invasion turned to oil. During an October 2006 press conference, Bush declared that the U.S. could not “tolerate a new terrorist state in the heart of the Middle East with large oil reserves that could be used to fund its radical ambitions or used to inflict economic damage on the West.”

The U.S. military and the new Baghdad government have failed, however, to secure Iraq’s tattered oil sector.

As A.F. Alhajji, energy economist and professor at Ohio Northern University, has said, “whoever controls Iraq’s oil, controls Iraq.” For the last five years, it’s never been exactly clear who controls Iraq’s oil. That said, the country’s leading industry is slowly increasing output. In January, daily production hit 2.4 million barrels per day, the highest level since the U.S. invasion.

But America’s presence in Iraq isn’t making use of the local riches. Indeed, little, if any, Iraqi oil is being used by the American military. Instead, the bulk of the fuel needed by the U.S. military is being trucked in from the sprawling Mina Abdulla refinery complex, which lies a few dozen kilometers south of Kuwait City.

In 2006 alone, the Defense Energy Support Center purchased $909.3 million in motor fuel from the state-owned Kuwait Petroleum Corporation. In addition to the Kuwaiti fuel, the U.S. military is trucking in fuel from Turkey. But some of that Turkish fuel actually originates in refineries as far away as Greece.

In 2007 alone, the U.S. military in Iraq burned more than 1.1 billion gallons of fuel. (American Armed Forces generally use a blend of jet fuel known as JP-8 to propel both aircraft and automobiles.) About 5,500 tanker trucks are involved in the Iraqi fuel-hauling effort. That fleet of trucks is enormously costly.

In November 2006, a study produced by the U.S. Military Academy estimated that delivering one gallon of fuel to U.S. soldiers in Iraq cost American taxpayers $42—and that didn’t include the cost of the fuel itself. At that rate, each U.S. soldier in Iraq is costing $840 per day in fuel delivery costs, and the U.S. is spending $923 million per week on fuel-related logistics in order to keep 157,000 G.I.s in Iraq. Given that the Iraq War is now costing about $2.5 billion per week, petroleum costs alone currently account for about one-third of all U.S. military expenditure in Iraq.
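Those per-soldier and per-week figures are easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python, using only numbers quoted in the article (the article's own $840 and $923 million totals imply it rounded consumption down to roughly 20 gallons per soldier per day):

```python
# Back-of-the-envelope check of the article's fuel-logistics figures.
# All inputs come from the article; only the arithmetic is added here.

gallons_per_soldier_per_day = 20.5  # average G.I. fuel use in Iraq
delivery_cost_per_gallon = 42.0     # U.S. Military Academy estimate, Nov. 2006
troops_in_iraq = 157_000
war_cost_per_week = 2.5e9           # total weekly cost of the war, in dollars

daily_delivery_cost = gallons_per_soldier_per_day * delivery_cost_per_gallon
weekly_logistics_bill = daily_delivery_cost * troops_in_iraq * 7
fuel_share = weekly_logistics_bill / war_cost_per_week

print(f"Delivery cost per soldier per day: ${daily_delivery_cost:,.0f}")           # ~$861 (article: $840)
print(f"Weekly fuel-logistics bill: ${weekly_logistics_bill / 1e6:,.0f} million")  # ~$946M (article: $923M)
print(f"Share of total weekly war cost: {fuel_share:.0%}")                         # ~38%, i.e. about one-third
```

Either way the rounding goes, the conclusion stands: fuel logistics alone account for roughly a third of the weekly cost of the war.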

Soaring fuel costs are largely a product of the need to protect U.S. forces against improvised explosive devices (IEDs). The majority of American casualties in Iraq have been caused by IED attacks, primarily on motor vehicles. The U.S. military has spent billions of dollars on electronic countermeasures to combat the deadly devices, but those countermeasures have largely failed. Instead, the troops have had to rely on old-fashioned hardened steel. Since the beginning of the war, the Pentagon has introduced numerous programs to add armor skins to its fleet of Humvees.

But even the newest armored Humvees, which weigh about six tons, haven't been enough to protect soldiers against the deadly explosives. Last year, Congress, the White House, and the Pentagon agreed on a four-year plan to spend about $20 billion on a fleet of 23,000 mine-resistant, ambush-protected vehicles, or MRAPs. Last August, the Pentagon ordered 1,520 of the vehicles at a cost of $3.5 million each.

The MRAPs mean even greater demand for fuel from U.S. troops in Iraq. An armored Humvee covers perhaps 8 miles per gallon of fuel. One version of the MRAP, the Maxxpro, weighs about 40,000 pounds, and according to a source within the military, gets just 3 miles per gallon. The increased demand for fuel for the MRAPs will come alongside the need for an entirely new set of tires, fan belts, windshields, alternators, and other gear.
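The mileage gap is easier to feel as a ratio. A quick sketch using only the article's two mileage estimates (8 mpg for an armored Humvee, 3 mpg for a Maxxpro):

```python
# Rough per-mile fuel comparison, using the article's mileage estimates.
humvee_mpg = 8.0    # armored Humvee
maxxpro_mpg = 3.0   # Maxxpro MRAP, per the article's military source

extra_fuel_factor = humvee_mpg / maxxpro_mpg
print(f"Mile for mile, a Maxxpro burns about {extra_fuel_factor:.1f}x "
      f"as much fuel as an armored Humvee.")  # ~2.7x
```

In other words, every Humvee replaced by a Maxxpro nearly triples that vehicle's fuel demand before a single extra supply truck is counted.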

This swelling of the logistics train creates yet another problem for the military: an increase in supply trucks on the road, which demands yet more fuel and provides insurgents with a greater range of targets to attack.

While the U.S. military chases its own fuel tail in Iraq, a country that sits atop 115 billion barrels of oil—about 9.5 percent of the world’s total—the global energy industry is racing forward with new alliances and deals, many of which would have been unthinkable before the invasion. Those alliances have far-reaching significance for America’s foreign and energy policy.

The world’s oil market is no longer shaped by U.S. military power.

Markets are trumping militarism. As one analyst put it recently, dollars are replacing “bullets as shapers of the geopolitical picture.”

The importance of this point is obvious: as the effectiveness of militarism in controlling global energy trends is declining, the U.S. is spending billions of dollars a week in Mesopotamia on a war effort that—if John McCain is right—could drain the American treasury for decades to come.

Meanwhile, America’s key rivals, China and Russia in particular, are using their influence to forge economic alliances that are realigning the global balance of power. They are creating a multi-polar world in which America’s influence will be substantially diminished.

This realignment is particularly advantageous for major energy exporting countries such as Russia, Abu Dhabi, Saudi Arabia, Qatar, and of course, Iran. These states are taking advantage of higher energy prices caused by ever-increasing global energy demand and tightening supplies.

And while the Bush administration has tried to diminish the influence of countries like Iran and Russia, there’s little, if anything, the U.S. can do to slow the trend. The myriad of energy exploration and production contracts that the Iranians have signed in recent months proves the point.

Meanwhile, Russia’s state-controlled behemoth, Gazprom, has consolidated its hold on the European natural gas market.

Add the massive financial power of the sovereign wealth funds of just three countries—Abu Dhabi, Saudi Arabia and Kuwait, which hold a combined $1.4 trillion in assets—and the shift in power becomes even more apparent.

Higher energy prices are the main difference between the first Iraq War and the second, says Jeff Dietert, a managing director at Simmons & Company International, a Houston-based investment banking firm that focuses on the energy sector. "It's a completely different result from the first Iraq War, which was really a demonstration of military prowess. It was quick and decisive versus the current situation in Iraq, which is slow, expensive and drawn out."

The Kurds have been quick to exploit new opportunities in the fast-changing oil market.

In direct defiance of the weak central government in Baghdad, the Kurdistan Regional Government has signed 15 oil exploration deals with 20 companies from 12 countries. Increasing oil production benefits the Kurds. It also helps Turkey, which stands to reap more revenue from the Kirkuk-Ceyhan pipeline, which will carry much of the new production. A Norwegian company, DNO ASA, has already built a pipeline from its Tawke oil field north of Mosul to an interconnection point immediately next to the Kirkuk-Ceyhan pipeline.

Geneva-based Addax Petroleum is another big player in Kurdistan.

During a presentation at an oil and gas conference in Connecticut in September, the company's chief financial officer, Michael Ebsary, said that Addax's potential reserves in Kurdistan may be as large as 2.7 billion barrels of oil. (Addax's partner in the project is Genel Enerji, a subsidiary of the Cukorova Group, one of Turkey's biggest conglomerates.) "Everyone sees the Kurdish region as an area that has to be developed. There's tons of oil there," Ebsary told me. "It has to get out."

The same can be said for Iranian oil and gas.

One of the unintended consequences of the Iraq War has been the strengthening of Iran's influence in the region. In 2007 alone, the Iranians cut deals—worth perhaps $50 billion over the next few decades—with companies from Britain, Spain, Brazil, China, Austria, Turkey, and Malaysia. In addition to those projects, the Iranian government is still negotiating the pricing formulas for the long-discussed, much-delayed Peace Pipeline, the $7 billion, 1,600-mile conduit to carry Iranian gas to Pakistan and India.

In 2005, Susil Chandra Tripathi, the secretary of India’s ministry of petroleum and natural gas, promised that the deal would eventually go through. He told me that the U.S. may “want to isolate Iran, but that doesn’t mean Iran will quit producing crude oil and gas, or that we will stop buying it.”

Another indication of the shift in power can be seen by looking at the new Dubai Mercantile Exchange, which last June began trading the Oman Crude Oil Futures Contract. By getting into the energy futures business, Dubai is assuring that the crude oil coming out of the Persian Gulf has its own benchmark price—one that is not reliant on Western crude oil standards such as West Texas Intermediate and North Sea Brent. It also puts Dubai in competition with the traditional trading hubs in New York and London.

In July 2006, Gary King, the CEO of the Dubai exchange, told me that the emergence of the exchange and the new futures contract indicates that the Persian Gulf is “the center of the world’s biggest hydrocarbon province. Most of the growth in oil consumption is in Asia-Pacific. So it’s a natural shift in gravity. Our timing is very opportune to be in that center of gravity.”

This change cannot be stopped or ignored.

In today’s multi-polar world, economic interests, not military force, predominate. “It used to be that the side with the most guns would win,” says G.I. Wilson, a recently retired Marine Corps colonel, who has written extensively on terrorism and asymmetric warfare and spent 15 months fighting in Iraq. Today, says Wilson, the side “with the most guns goes bankrupt.”

Since World War II, America has held fast to the idea that control of the oil flow out of the Persian Gulf must be assured at the point of an M-16 rifle.

But the cost of that approach has been crippling. As the U.S. military pursues its occupation of Iraq—with the fuel costs approaching $1 billion per week—it’s obvious that the U.S. needs to rethink the assumption that secure energy sources depend on militarism. The emerging theme of the 21st-century energy business is the increasing power of markets.

The U.S. can either adapt or continue hurtling down the road to bankruptcy.

Robert Bryce is the managing editor of Energy Tribune magazine. His third book, Gusher of Lies: The Dangerous Delusions of Energy Independence, will be published on March 10.

Source:
http://www.amconmag.com/2008/2008_03_10/cover.html
_____________

Sunday, March 23, 2008

Calling Bullshit on the Idea of 'Marijuana Addiction'


By Paul Armentano
AlterNet
March 22, 2008

[What is bullshit is that the govt. and media expect us to believe anything they tell us is truth, they are lying this country into ruin. News2U]

The U.S. government believes that America is going to pot -- literally.

Earlier this month, the U.S. National Institute on Drug Abuse announced plans to spend $4 million to establish the nation's first-ever "Center on Cannabis Addiction," which will be based in La Jolla, Calif. The goal of the center, according to NIDA's press release, is to "develop novel approaches to the prevention, diagnosis and treatment of marijuana addiction."

Not familiar with the notion of "marijuana addiction"?

You're not alone. In fact, aside from the handful of researchers who have discovered that there are gobs of federal grant money to be had hunting for the government's latest pot boogeyman, there's little consensus that such a syndrome is clinically relevant -- if it even exists at all.

But don't try telling that to the mainstream press -- which recently published headlines worldwide alleging, "Marijuana withdrawal rivals that of nicotine."

The alleged "study" behind the headlines involved all of 12 participants, each of whom was a longtime user of pot and tobacco, and assessed the self-reported moods of folks after they were randomly chosen to abstain from both substances.

Big surprise: they weren't happy.

And don't try telling Big Pharma -- which hopes to cash in on the much-hyped "pot and addiction" craze by touting psychoactive prescription drugs like Lithium to help hardcore smokers kick the marijuana habit.

And certainly don't try telling the drug "treatment" industry, whose spokespeople are quick to warn that marijuana "treatment" admissions have risen dramatically in recent years, but neglect to explain that this increase is due entirely to the advent of drug courts sentencing minor pot offenders to rehab in lieu of jail. According to state and national statistics, up to 70 percent of all individuals in drug treatment for marijuana are placed there by the criminal justice system.

Of those in treatment, some 36 percent had not even used marijuana in the 30 days prior to their admission. These are the "addicts"?

Indeed, the concept of pot addiction is big business -- even if the evidence in support of the pseudosyndrome is flimsy at best.

And what does the science say?

Well, according to the nonpartisan National Academy of Sciences Institute of Medicine -- which published a multiyear, million-dollar federal study assessing marijuana and health in 1999 -- "millions of Americans have tried marijuana, but most are not regular users [and] few marijuana users become dependent on it." The investigator added, "[A]though [some] marijuana users develop dependence, they appear to be less likely to do so than users of other drugs (including alcohol and nicotine), and marijuana dependence appears to be less severe than dependence on other drugs."

Just how less likely?

According to the Institute of Medicine's 267-page report, fewer than 10 percent of those who try cannabis ever meet the clinical criteria for a diagnosis of "drug dependence" (based on DSM-III-R criteria). By contrast, the IOM reported that 32 percent of tobacco users, 23 percent of heroin users, 17 percent of cocaine users and 15 percent of alcohol users meet the criteria for "drug dependence."

In short, it's the legal drugs that have Americans hooked -- not pot.

But what about the claims that ceasing marijuana smoking can trigger withdrawal symptoms similar to those associated with quitting tobacco?

Once again, it's a matter of degree. According to the Institute of Medicine, pot's withdrawal symptoms, when identified, are "mild and subtle" compared with the profound physical syndromes associated with ceasing chronic alcohol use -- which can be fatal -- or those abstinence symptoms associated with daily tobacco use, which are typically severe enough to persuade individuals to reinitiate their drug-taking behavior.

The IOM report further explained, "[U]nder normal cannabis use, the long half-life and slow elimination from the body of THC prevent[s] substantial abstinence symptoms" from occurring. As a result, cannabis' withdrawal symptoms are typically limited to feelings of mild anxiety, irritability, agitation and insomnia.

Most importantly, unlike the withdrawal symptoms associated with the cessation of most other intoxicants, pot's mild after-effects do not appear to be either severe or long-lasting enough to perpetuate marijuana use in individuals who have decided to quit. This is why most marijuana smokers report voluntarily ceasing their cannabis use by age 30 with little physical or psychological difficulty. By comparison, many cigarette smokers who pick up the habit early in life continue to smoke for the rest of their lives, despite making numerous efforts to quit.

So let's review.

Marijuana is widely accepted by the National Academy of Sciences, the Canadian Senate Special Committee on Illegal Drugs, the British Advisory Council on the Misuse of Drugs and others to lack the severe physical and psychological dependence liability associated with most other intoxicants, including alcohol and tobacco. Further, pot lacks the profound abstinence symptoms associated with most legal intoxicants, including caffeine.

That's not to say that some marijuana smokers don't find quitting difficult. Naturally, a handful of folks do, though this subpopulation is hardly large enough to warrant pot's legal classification (along with heroin) as an illicit substance with a "high potential for abuse." Nor does this fact justify the continued arrest of more than 800,000 Americans annually for pot violations any more than such concerns would warrant the criminalization of booze or nicotine.

Now if I can only get NIDA to fork me over that $4 million check.

Paul Armentano is deputy director of NORML and the NORML Foundation.

Source:
http://www.alternet.org/story/80408/
______________________

Friday, March 21, 2008

McCain Spiritual Guide Accused Govt. of Enabling 'Black Genocide'



Video courtesy of RightWingWatch.org


By Sam Stein
Huffington Post
March 21, 2008


This past week, Sen. Barack Obama's pastor, Reverend Jeremiah Wright, has taken an exceptional amount of heat in part for comments that suggested the U.S. government had introduced AIDS into black communities.

But it turns out he's not the only religious confidant to a presidential candidate who thinks the state has targeted black populations with death and disease.

Reverend Rod Parsley of the World Harvest Church of Columbus, Ohio -- whom Sen. John McCain hailed as a spiritual adviser -- has suggested on several occasions that the U.S. government was complicit in facilitating black genocide.

In speeches that have gone largely unnoticed, Parsley (who is white) compares Planned Parenthood, the reproductive care and family planning group, to the Ku Klux Klan and Nazis, and describes the American government as an enabler of murder for supporting the organization.

"If I were to call for the sterilization or the elimination of an entire segment of society, I'd be labeled a racist or a murderer, or at very best a Nazi," says Parsley. "That every single year, millions of our tax dollars are funding a national organization built upon that very goal -- their target: African Americans. That's right, the death toll: nearly fifteen hundred African Americans a day. The shocking truth of black genocide."

He goes on.

"Right now our own government is allowing organizations like Planned Parenthood to legally take the innocent lives of precious baby girls and baby boys and even footing the bill for it all with our tax dollars, turning every single one of us into accessories to murder," he says. "You know who their biggest fans must be, that must be the Ku Klux Klan, because the woman who founded this organization detested black people.... African Americans were number one on Margaret Sanger's list. So this 'Lady MacDeath,' as I like to call her, studied the works of Englishman Thomas Robert Malthus, and embraced his plan of eugenics."

Unlike Wright's statements, Parsley's are more accepted in conservative circles, in which a strict anti-abortion sentiment is not only tolerated, but applauded. Moreover, as a white pastor expressing anger on behalf of black populations, Parsley's testimony may come off as more sympathetic and less conspiratorial than Wright's.

However, there are issues with Parsley's stats.

While black populations in America do have higher abortion rates than white populations, there are far more abortions among white mothers than among blacks. Meanwhile, Sanger, who founded the American Birth Control League (which eventually became Planned Parenthood), was an advocate of both birth control and eugenics. And while she did not publicly denounce Nazi Germany's eugenics program, privately she expressed deep concern.

This is the second time that controversial remarks by Parsley have surfaced on the campaign trail. Last week, David Corn of Mother Jones reported that the televangelist "called upon Christians to wage a 'war' against the 'false religion' of Islam with the aim of destroying it."

The relationship between Parsley and McCain is, to be sure, far less personal -- and more political -- than that of Obama and Wright. In late February, McCain attended a rally in Cincinnati at which the Arizona Republican was praised as a "strong, true, consistent conservative."

The endorsement, Corn writes:

... was important for McCain, who at the time was trying to put an end to the lingering challenge from former Arkansas governor Mike Huckabee, a favorite among Christian evangelicals. A politically influential figure in Ohio, Parsley could also play a key role in McCain's effort to win this bellwether state in the general election. McCain, with Parsley by his side at the Cincinnati rally, called the evangelical minister a "spiritual guide."

Sam Stein is a Political Reporter at the Huffington Post, based in Washington, D.C.
© 2008 Huffington Post. All rights reserved.


Source:
http://www.alternet.org/bloggers/http://www.huffingtonpost.com//80436/
__________________

Five Years of Iraq Lies


How President Bush and his advisors have spent each year of the war peddling mendacious tales about a mission accomplished.


by Juan Cole


Each year of George W. Bush’s war in Iraq has been represented by a thematic falsehood. That Iraq is now calm or more stable is only the latest in a series of such whoppers, which the mainstream press eagerly repeats. The fifth anniversary of Bush’s invasion of Iraq will be the last he presides over.
Sen. John McCain, in turn, has now taken to dangling the bait of total victory before the American public, and some opinion polls suggest that Americans are swallowing it, hook, line and sinker. The most famous falsehoods connected to the war were those deployed by the president and his close advisors to justify the invasion. But each of the subsequent years since U.S. troops barreled toward Baghdad in March 2003 has been marked by propaganda campaigns just as mendacious. Here are five big lies from the Bush administration that have shaped perceptions of the Iraq war.

Year 1's big lie was that the rising violence in Iraq was nothing out of the ordinary. The social turmoil kicked off by the invasion was repeatedly denied by Bush officials. When Iraqis massively looted government ministries and even private shops, then-Secretary of Defense Donald Rumsfeld joked that U.S. media had videotape of one man carrying off a vase and that they kept looping it over and over again. The first year of the war saw the rise of a Sunni Arab guerrilla movement that repeatedly struck at U.S. troops and at members and leaders of the Shiite-dominated Interim Governing Council appointed by the American government.

After dozens of U.S. and British military deaths, Rumsfeld actually came out before the cameras and denied, in July of 2003, that there was a building guerrilla war. When CNN's Jamie McIntyre quoted to him the Department of Defense definition of a guerrilla war — "military and paramilitary operations conducted in enemy-held or hostile territory by irregular, predominantly indigenous forces" — and said it appeared to fit Iraq, Rumsfeld replied, "It really doesn't." Bush was so little concerned by the challenge of an insurgency that he cavalierly taunted the Sunni Arab guerrillas, "Bring 'em on!" regardless of whether it might recklessly endanger U.S. soldiers. The guerrillas brought it on.

In Year 2 the falsehood was that Iraq was becoming a shining model of democracy under America’s caring ministrations. In actuality, Bush had planned to impose on Iraq what he called “caucus-based” elections, in which the electorate would be restricted to the provincial and some municipal council members backed by Bush-related institutions. That plan was thwarted by Grand Ayatollah Ali al-Sistani, who demanded one-person, one-vote open elections, and brought tens of thousands of protesters out onto the streets of Baghdad and Basra.

The elections were deeply flawed, both with regard to execution and outcome. The U.S. campaign against Fallujah in November 2004, marked by more petulant rhetoric from Bush, had angered Sunni Arabs — who feared U.S. strategy favored Shiite ascendancy — and led to their boycotting the elections. The electoral system chosen by the United Nations and the U.S. would guarantee that if they boycotted, they would be without representation in parliament.
Candidates could not campaign, and voters did not know for which individuals they were voting.

Much of the American public, egged on by White House propaganda, was sanguine when elections were held at the end of January 2005, mistaking process for substance. Why the disenfranchisement of the Sunni Arabs, who were becoming more and more violent, was a good thing, or why the victory of Shiite fundamentalists tied to Iran was a triumph for the U.S., remains difficult to discern. Nobody in the Middle East thought such flawed elections, held under foreign military occupation, were any sort of model for the region.

In Year 3, the Bush administration blamed almost everything that was going wrong on one shadowy figure: Abu Musab al-Zarqawi. Bush set the tone for Year 3 with a speech at Fort Bragg on June 28, 2005, in which he said, "The only way our enemies can succeed is if we forget the lessons of September 11 … if we abandon the Iraqi people to men like Zarqawi … and if we yield the future of the Middle East to men like bin Laden." The previous week, Bush had said that the U.S. was in Iraq "because we were attacked." Zarqawi was the perfect plot device for an administration that wanted to perpetuate the falsehood that the Iraq war was directly connected with Sept. 11 and al-Qaida.

In spring of 2006, Maj. Gen. Rick Lynch came out and attributed 90 percent of suicide bombings in Iraq to Zarqawi and his organization, which he had rebranded as “al-Qaida in Iraq,” but which had begun in Afghanistan as an alternative to Osama bin Laden’s terrorist organization.
Meanwhile, security analysts discerned 50 distinct Sunni Arab guerrilla cells in Iraq. Some were Baathists and some were Arab nationalists; some were Salafi Sunni fundamentalists while others were tribally based. To attribute so many attacks all over central, western and northern Iraq to a single entity suggested an enormous, centrally directed organization in Iraq called “al-Qaida.” But there was never any evidence for such a conclusion, and when Zarqawi was killed by a U.S. airstrike in May of 2006, the insurgent violence continued without any change in pattern.

In Year 4, as major sectors of Iraq descended into hell, Bush’s big lie consisted of denying that the country had fallen into civil war. In late February 2006, Sunni guerrillas blew up the golden-domed Askariya shrine of the Shiites in Samarra. In the aftermath, the Shiites, who had shown some restraint until that point, targeted the Sunni Arabs in Baghdad and its hinterlands for ethnic cleansing. After May 2006, the death toll of victims of sectarian violence rose at times to an official figure of 2,500 or more per month, and it fluctuated around that level for the subsequent year. The Baghdad police had to form a new unit, the Corpse Patrol, to collect dozens of bodies every morning in the streets of the capital.

On Sept. 1, 2006, Sunni guerrillas slaughtered 34 Asian and Iraqi Shiite pilgrims passing near Ramadi on their way to the Shiite holy city of Karbala south of Baghdad. In his weekly radio address the next day Bush said, “Our commanders and diplomats on the ground believe that Iraq has not descended into a civil war.” Many lesser conflicts have been dubbed civil wars by journalists, academics and policy thinkers alike. But Bush continued with the fantastic spin: “The people of Baghdad are seeing their security forces in the streets, dealing a blow to criminals and terrorists.”

Year 5, the past year, has been one of troop escalation, or the “surge.” (Calling the policy a “surge” rather than an “escalation” is emblematic of the administration’s propaganda.) The big lie is that Iraq is now calm, that the surge has worked, and that victory is within reach.
In early 2007, the U.S. made several risky bargains. It pledged to the Shiite government of Prime Minister Nouri al-Maliki that it would disarm the Sunni Arab guerrillas in Baghdad first, before demanding that the Shiites lay down their arms. It thus induced Muqtada al-Sadr to declare a freeze on the paramilitary activities of his Mahdi army militia.
The Americans would go on to destroy some of his Sunni Arab enemies for him. U.S. military leaders in Iraq began paying Sunni Arab Iraqi guerrillas and others in provinces such as al-Anbar to side with the United States and to turn on the foreign jihadis, most of them from Saudi Arabia and North Africa. U.S. troops also began a new counterinsurgency strategy, focused on taking control of Sunni Arab neighborhoods, clearing them of armed guerrillas, and then staying in them on patrol to ensure that the guerrillas did not reestablish themselves.

The strategy of disarming the Sunni Arabs of Baghdad — who in 2003 constituted nearly half the capital’s inhabitants — had enormous consequences. Shiite militias took advantage of the Sunnis’ helplessness to invade their neighborhoods at night, kill some as an object lesson, and chase the Sunnis out. Hundreds of thousands of Baghdad residents were ethnically cleansed in the course of 2007, during the surge, and some two-thirds of the more than 1.2 million Iraqi refugees who ended up in Syria were Sunni Arabs. Baghdad, a symbol of past Arab glory and of the Iraqi nation, became at least 75 percent Shiite, perhaps more.

That outcome has set the stage for further Sunni-Shiite conflict to come. Much of the reduction in the civilian death toll is explained by this simple equation: A formerly mixed neighborhood like Shaab, east of the capital, now has no Sunnis to speak of, and so therefore there are no longer Sunni bodies in the street each morning.

But the troop escalation has failed to stop bombings in Baghdad, and the frequency and deadliness of attacks increased in February and March, after falling in January. In the first 10 days of March, official figures showed 39 deaths a day from political violence, up from 29 a day in February, and 20 in January. Assassinations, attacks on police, and bombings continue in Sunni Arab cities such as Baquba, Samarra and Mosul, as well as in Kirkuk and its hinterlands in the north. On Monday, a horrific bombing in the Shiite shrine city of Karbala killed 52 and wounded 75, ruining the timing of Vice President Cheney’s and Sen. McCain’s visit to Iraq to further declare victory.

Moreover, Turkey made a major incursion into Iraq to punish the guerrillas of the Kurdish Workers Party from eastern Anatolia, who have in the past seven months killed dozens of Turkish troops. The U.S. media was speaking of “calm” and “a lull” in Iraq violence even while destructive bombs were going off in Baghdad, and Turkey’s incursion was resulting in over a hundred deaths. The surge was “succeeding,” according to the administration, and therefore no mere attacks by a third country, or bombings by insurgents, could challenge the White House story line.

Bush’s five big lies about Iraq powerfully shaped press coverage of the war and have kept the mess there going at least long enough to turn it over to the next president. As he campaigns for the White House, John McCain, Bush’s heir apparent in the Iraq propaganda department, has been signaling that “complete victory” in Iraq will be his talking point of choice for Year 6. If the mainstream media and the American public don’t wake up to the truth about how the war has gone, they’ll find themselves buying into an even longer and deeper tragedy.

Juan Cole teaches Middle Eastern and South Asian history at the University of Michigan. His most recent book, Napoleon's Egypt: Invading the Middle East (New York: Palgrave Macmillan, 2007), has just been published. He has appeared widely on television, radio and on op-ed pages as a commentator on Middle East affairs, and has a regular column at Salon.com. He has written, edited, or translated 14 books and has authored 60 journal articles. His weblog on the contemporary Middle East is Informed Comment.
Source:
___________________

Saturday, March 15, 2008

Billions at risk from wheat super-blight


By Debora Mackenzie
New Scientist
April 3, 2007


"This thing has immense potential for social and human destruction." Startling words - but spoken by the father of the Green Revolution, Nobel laureate Norman Borlaug, they are not easily dismissed.

An infection is coming, and almost no one has heard about it. This infection isn't going to give you flu or TB. In fact, it isn't interested in you at all. It is after the wheat plants that feed more people than any other single food source on the planet. And because of cutbacks in international research, we aren't prepared. The famines that were banished by the advent of disease-resistant crops in the Green Revolution of the 1960s could return, Borlaug told New Scientist.

The disease is Ug99, a virulent strain of black stem rust fungus (Puccinia graminis), discovered in Uganda in 1999. Since the Green Revolution, farmers everywhere have grown wheat varieties that resist stem rust, but Ug99 has evolved to take advantage of those varieties, and almost no wheat crops anywhere are resistant to it.

The strain has spread slowly across east Africa, but in January this year spores blew across to Yemen, and north into Sudan (see Map). Scientists who have tracked similar airborne spores in this part of the world say it will now blow into Egypt, Turkey and the Middle East, and on to India, lands where a billion people depend on wheat.

There is hope: this week scientists are assessing the first Ug99-resistant varieties of wheat that might be used for crops. However, it will take another five to eight years to breed up enough seed to plant all our wheat fields.

The threat couldn't have come at a worse time. Consumption has outstripped production in six of the last seven years, and stocks are at their lowest since 1972. Wheat prices jumped 14 per cent last year.

Black stem rust itself is nothing new. It has been a major blight on wheat production since the rise of agriculture, and the Romans even prayed to a stem rust god, Robigus. It can reduce a field of ripening grain to a dead, tangled mass, and vast outbreaks regularly ripped through wheat regions. The last to hit the North American breadbasket, in 1954, wiped out 40 per cent of the crop. During the cold war, both the US and the Soviet Union stockpiled stem rust spores as a biological weapon.

After the 1954 epidemic, Borlaug began work in Mexico on developing wheat that resisted stem rust. The project grew into the International Maize and Wheat Improvement Center, known by its Spanish acronym CIMMYT. The rust-resistant, high-yielding wheat it developed banished chronic hunger in much of the world, ended stem rust outbreaks, and won Borlaug the Nobel peace prize in 1970.

Yet once again Borlaug - now 93 and fighting cancer - is leading the charge against his old enemy. When Ug99 turned up in Kenya in 2002, he sounded the alarm. "Too many years had gone by and no one was taking Ug99 seriously," he says. He blames complacency, and the dismantling of training and wheat testing programmes, after 40 years without outbreaks.

Now a Global Rust Initiative (GRI) is under way at CIMMYT. Its head, Rick Ward, blames the delay on cuts, starting in the 1980s, in CIMMYT's funding for routine monitoring and maintenance of crops and pests.

"CIMMYT was slow to detect the extent of susceptibility to Ug99 [because] it didn't have the scientific eyes and ears on the ground any more," says Chris Dowswell of CIMMYT. "Once it did, it had to start a laborious fund-raising campaign to respond."

Ward is now being promised adequate support as fears grow in rich wheat-growing countries, but meanwhile Ug99 has got worse. It was first noticed because it started appearing on wheat previously protected by a gene complex called Sr31, the backbone of stem rust resistance in most wheat farmed worldwide. Then last year it acquired the ability to defeat another widely used complex, Sr24. "Of the 50 genes we know for resistance to stem rust, only 10 work even partially against Ug99," says Ward. Those are present in less than 1 per cent of the crop.

The first line of defence is fungicide, but the poor farmers who stand to lose most from the blight generally cannot afford it, or don't have the equipment or know-how to apply it. CIMMYT is considering "fire brigade" spray teams armed with cheap, generic fungicides in poor areas.

However, they will be competing with the rich for fungicide, and depending on where Ug99 strikes, stocks could be limited.

Even rich countries face problems. The US has been fighting soybean rust with fungicide ever since spores blew in on hurricane Ivan in 2004. If Ug99 arrives as well, the US could be in trouble because it doesn't make enough fungicide for both crops. Kitty Cardwell of the US Department of Agriculture says there might be enough if the US fights Ug99 the same way as it is tackling soya rust: spotting outbreaks with a fast DNA-based field test and posting the results on an interactive website (http://www.sbrusa.net/), so farmers spray only when danger looms.

Ultimately, says Ward, the only real answer "is to get new, resistant varieties out there".

CIMMYT has been working on this by taking countries' top-yielding varieties and crossing them with wheat from its seed collections that does resist Ug99. For two years now the crosses have been tested for resistance at field stations in Njoro, Kenya, and in Ethiopia, where it is safe to release Ug99 as it is already there. Resistant strains are sent back to CIMMYT in Mexico and assessed for yield and other qualities, then sent out again for further tests. Resistant lines are now being grown on 27 plots in Nepal, India, Afghanistan and Pakistan.

So far so good, but the real challenge is multiplying up enough resistant seed so that if Ug99 hits, there will be enough to plant the next crop. This takes time - and it will only happen if the new resistant varieties match or exceed existing yields. Nor is it an exact science. No one knows why wheat that looks good in Mexico might grow as well in Egypt, say, but fail in China unless it is crossed with a local variety.

There is nothing for it but to do the tests, says Ravi Singh, the GRI's chief wheat pathologist. The resistant lines must be just as good as the ones people are growing now, he says, or farmers won't use them, and government-owned seed companies that dominate the wheat industry in developing countries won't sell them, no matter what new disease the scientists say is coming.

Singh calculates that if he can get countries to devote 3 to 5 per cent of their wheat-growing area to resistant varieties, the seed harvest will be enough to plant the whole country with resistant wheat if Ug99 hits.
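Singh's 3-to-5-percent target quietly assumes a seed multiplication ratio, which the article never states. The sketch below just makes that hidden assumption explicit (the ratio is inferred, not sourced):

```python
# The article: planting 3-5% of a country's wheat area with resistant
# varieties yields enough seed to sow the entire area the next season.
# The implied seed multiplication ratio is inferred here, not stated
# in the article.

for share_of_area in (0.03, 0.05):
    implied_ratio = 1.0 / share_of_area
    print(f"Planting {share_of_area:.0%} of the area implies each kg of seed "
          f"sown must return about {implied_ratio:.0f} kg of plantable seed.")
```

So the plan only works if a season's harvest returns roughly 20 to 33 times the seed sown, which is the multiplication rate those seed plots would have to deliver.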

So it's a race, and who wins depends on what Ug99 does now. Stem rust can arrive in a new area and lurk for years before it gets the right conditions for an outbreak. "It won't suddenly explode everywhere. It will be like a moving storm," says Dowswell.

However, Ug99 has another ace up its sleeve. The spores blowing in the wind now are from the asexual stage that grows on wheat. If any blow onto the leaves of its other host, the barberry bush (Berberis vulgaris), they will change into the sexual form and swap genes with whatever other stem rusts they find. Barberry is native to west Asia. "As if it wasn't challenging enough breeding varieties that resist this thing," laments Ward. "All I know is that what blows into Iran will not be the same as what blows out."

What's more, Ug99 will find agriculture has changed to its liking in the decades stem rust has been away. "Forty years ago most wheat wasn't irrigated and heavily fertilised," says Borlaug. Now, thanks to the Green Revolution he helped bring about, it is. That means modern wheat fields are a damper, denser thicket of stems, where dew can linger till noon - just right for fungus.

Another worry is that travel has exploded in the past 40 years. There have now been several documented cases of travellers carrying rust spores on their clothing. Some fear Ug99 will hitchhike as much as it flies - and its spread need not be innocent. New Scientist has learned that the US Department of Homeland Security met in March to discuss the possibility that someone could transport Ug99 deliberately.

Even at 93, Borlaug is looking to the long term. Eventually, scientists will have to create wheat with a wide spectrum of resistances. The genes may be hiding in other grains and grasses. "Why has rice had no rusts for millions of years?" he asks.

For now, Borlaug says, we have to rely on fungicides, wheat breeding and luck. "We're moving as fast as we can now, but we started three years too late," he says. "We'd better have some good luck. Governments think this is still small and local, but these things build up."

From issue 2598 of New Scientist magazine, 03 April 2007, pages 6-7

Source:
http://environment.newscientist.com/channel/earth/mg19425983.700-billions-at-risk-from-wheat-superblight.html
_________________

Sunday, March 09, 2008

Why the US has really gone broke

The economic disaster that is military Keynesianism

Global confidence in the US economy has reached zero, as was proved by last month’s stock market meltdown. But there is an enormous anomaly in the US economy above and beyond the subprime mortgage crisis, the housing bubble and the prospect of recession: 60 years of misallocation of resources, and borrowings, to the establishment and maintenance of a military-industrial complex as the basis of the nation’s economic life.


By Chalmers Johnson


The military adventurers in the Bush administration have much in common with the corporate leaders of the defunct energy company Enron. Both groups thought that they were the “smartest guys in the room” — the title of Alex Gibney’s prize-winning film on what went wrong at Enron. The neoconservatives in the White House and the Pentagon outsmarted themselves. They failed even to address the problem of how to finance their schemes of imperialist wars and global domination.

As a result, going into 2008, the United States finds itself in the anomalous position of being unable to pay for its own elevated living standards or its wasteful, overly large military establishment. Its government no longer even attempts to reduce the ruinous expenses of maintaining huge standing armies, replacing the equipment that seven years of wars have destroyed or worn out, or preparing for a war in outer space against unknown adversaries. Instead, the Bush administration puts off these costs for future generations to pay or repudiate.

This fiscal irresponsibility has been disguised through many manipulative financial schemes (causing poorer countries to lend us unprecedented sums of money), but the time of reckoning is fast approaching.

There are three broad aspects to the US debt crisis. First, in the current fiscal year (2008) we are spending insane amounts of money on “defence” projects that bear no relation to the national security of the US. We are also keeping the income tax burdens on the richest segment of the population at strikingly low levels.

Second, we continue to believe that we can compensate for the accelerating erosion of our manufacturing base and our loss of jobs to foreign countries through massive military expenditures — "military Keynesianism" (which I discuss in detail in my book Nemesis: The Last Days of the American Republic). By that, I mean the mistaken belief that public policies focused on frequent wars, huge expenditures on weapons and munitions, and large standing armies can indefinitely sustain a wealthy capitalist economy. The opposite is actually true.

Third, in our devotion to militarism (despite our limited resources), we are failing to invest in our social infrastructure and other requirements for the long-term health of the US. These are what economists call opportunity costs, things not done because we spent our money on something else. Our public education system has deteriorated alarmingly. We have failed to provide health care to all our citizens and neglected our responsibilities as the world’s number one polluter.

Most important, we have lost our competitiveness as a manufacturer for civilian needs, an infinitely more efficient use of scarce resources than arms manufacturing.

Fiscal disaster

It is virtually impossible to overstate the profligacy of what our government spends on the military. The Department of Defense’s planned expenditures for the fiscal year 2008 are larger than all other nations’ military budgets combined. The supplementary budget to pay for the current wars in Iraq and Afghanistan, not part of the official defence budget, is itself larger than the combined military budgets of Russia and China. Defence-related spending for fiscal 2008 will exceed $1 trillion for the first time in history. The US has become the largest single seller of arms and munitions to other nations on Earth. Leaving out President Bush’s two on-going wars, defence spending has doubled since the mid-1990s. The defence budget for fiscal 2008 is the largest since the second world war.

Before we try to break down and analyse this gargantuan sum, there is one important caveat.

Figures on defence spending are notoriously unreliable.

The numbers released by the Congressional Research Service and the Congressional Budget Office do not agree with each other. Robert Higgs, senior fellow for political economy at the Independent Institute, says: "A well-founded rule of thumb is to take the Pentagon's (always well publicised) basic budget total and double it" (1). Even a cursory reading of newspaper articles about the Department of Defense will turn up major differences in statistics about its expenses. Some 30-40% of the defence budget is "black", meaning that these sections contain hidden expenditures for classified projects. There is no possible way to know what they include or whether their total amounts are accurate.

There are many reasons for this budgetary sleight-of-hand — including a desire for secrecy on the part of the president, the secretary of defence, and the military-industrial complex — but the chief one is that members of Congress, who profit enormously from defence jobs and pork-barrel projects in their districts, have a political interest in supporting the Department of Defense.

In 1996, in an attempt to bring accounting standards within the executive branch closer to those of the civilian economy, Congress passed the Federal Financial Management Improvement Act. It required all federal agencies to hire outside auditors to review their books and release the results to the public. Neither the Department of Defense nor the Department of Homeland Security has ever complied. Congress has complained, but has not penalised either department for ignoring the law. All numbers released by the Pentagon should be regarded as suspect.

In discussing the fiscal 2008 defence budget, as released on 7 February 2007, I have been guided by two experienced and reliable analysts: William D Hartung of the New America Foundation's Arms and Security Initiative (2) and Fred Kaplan, defence correspondent for Slate.com (3). They agree that the Department of Defense requested $481.4bn for salaries, operations (except in Iraq and Afghanistan), and equipment. They also agree on a figure of $141.7bn for the "supplemental" budget to fight the global war on terrorism — that is, the two on-going wars that the general public may think are actually covered by the basic Pentagon budget.

The Department of Defense also asked for an extra $93.4bn to pay for hitherto unmentioned war costs in the remainder of 2007 and, most creatively, an additional “allowance” (a new term in defence budget documents) of $50bn to be charged to fiscal year 2009. This makes a total spending request by the Department of Defense of $766.5bn.

But there is much more. In an attempt to disguise the true size of the US military empire, the government has long hidden major military-related expenditures in departments other than Defense. For example, $23.4bn for the Department of Energy goes towards developing and maintaining nuclear warheads; and $25.3bn in the Department of State budget is spent on foreign military assistance (primarily for Israel, Saudi Arabia, Bahrain, Kuwait, Oman, Qatar, the United Arab Emirates, Egypt and Pakistan).

Another $1.03bn outside the official Department of Defense budget is now needed for recruitment and re-enlistment incentives for the overstretched US military, up from a mere $174m in 2003, when the war in Iraq began. The Department of Veterans Affairs currently gets at least $75.7bn, 50% of it for the long-term care of the most seriously injured among the 28,870 soldiers so far wounded in Iraq and 1,708 in Afghanistan. The amount is universally derided as inadequate. Another $46.4bn goes to the Department of Homeland Security.

Missing from this compilation is $1.9bn to the Department of Justice for the paramilitary activities of the FBI; $38.5bn to the Department of the Treasury for the Military Retirement Fund; $7.6bn for the military-related activities of the National Aeronautics and Space Administration; and well over $200bn in interest for past debt-financed defence outlays. This brings US spending for its military establishment during the current fiscal year, conservatively calculated, to at least $1.1 trillion.
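
Since every line item is quoted above, the tally is easy to reproduce. A minimal sketch, using only the figures given in this article (in billions of dollars; the $200bn interest entry is treated as a floor, since the text says "well over"):

    # Tally of US military-related spending, FY2008, using only the
    # figures quoted in this article (billions of dollars).
    items = [
        ("DoD base request", 481.4),
        ("War-on-terror supplemental", 141.7),
        ("Extra 2007 war costs", 93.4),
        ("FY2009 'allowance'", 50.0),
        ("Energy: nuclear warheads", 23.4),
        ("State: foreign military assistance", 25.3),
        ("Recruitment/re-enlistment incentives", 1.03),
        ("Veterans Affairs", 75.7),
        ("Homeland Security", 46.4),
        ("Justice: FBI paramilitary", 1.9),
        ("Treasury: Military Retirement Fund", 38.5),
        ("NASA military-related", 7.6),
        ("Interest on past defence debt (floor)", 200.0),
    ]

    dod_request = sum(v for _, v in items[:4])  # the DoD spending request
    total = sum(v for _, v in items)
    print(f"DoD request: ${dod_request:,.1f}bn")           # $766.5bn
    print(f"Total: at least ${total/1000:,.2f} trillion")  # ~$1.19tn

This reproduces the $766.5bn Department of Defense request and a conservative total just under $1.2 trillion, consistent with the "at least $1.1 trillion" figure above.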

Military Keynesianism

Such expenditures are not only morally obscene, they are fiscally unsustainable. Many neo-conservatives and poorly informed patriotic Americans believe that, even though our defence budget is huge, we can afford it because we are the richest country on Earth. That statement is no longer true. The world’s richest political entity, according to the CIA’s World Factbook, is the European Union. The EU’s 2006 GDP was estimated to be slightly larger than that of the US. Moreover, China’s 2006 GDP was only slightly smaller than that of the US, and Japan was the world’s fourth richest nation.

A more telling comparison that reveals just how much worse we’re doing can be found among the current accounts of various nations. The current account measures the net trade surplus or deficit of a country plus cross-border payments of interest, royalties, dividends, capital gains, foreign aid, and other income. In order for Japan to manufacture anything, it must import all required raw materials. Even after this incredible expense is met, it still has an $88bn per year trade surplus with the US and enjoys the world’s second highest current account balance (China is number one). The US is number 163 — last on the list, worse than countries such as Australia and the UK that also have large trade deficits. Its 2006 current account deficit was $811.5bn; second worst was Spain at $106.4bn. This is unsustainable.

It’s not just that our tastes for foreign goods, including imported oil, vastly exceed our ability to pay for them. We are financing them through massive borrowing. On 7 November 2007, the US Treasury announced that the national debt had breached $9 trillion for the first time. This was just five weeks after Congress raised the “debt ceiling” to $9.815 trillion. If you begin in 1789, at the moment the constitution became the supreme law of the land, the debt accumulated by the federal government did not top $1 trillion until 1981. When George Bush became president in January 2001, it stood at approximately $5.7 trillion. Since then, it has increased by almost 60%. This huge debt can be largely explained by our defence expenditures.

The top spenders

The world’s top 10 military spenders and the approximate amounts each currently budgets for its military establishment are:

1. United States (FY 2008 budget) - $623bn
2. China (2004) - $65bn
3. Russia - $50bn
4. France (2005) - $45bn
5. United Kingdom - $42.8bn
6. Japan (2007) - $41.75bn
7. Germany (2003) - $35.1bn
8. Italy (2003) - $28.2bn
9. South Korea (2003) - $21.1bn
10. India (2005 est.) - $19bn


World total military expenditures (2004 est) - $1,100bn
World total (minus the US) - $500bn

Our excessive military expenditures did not occur over just a few short years or simply because of the Bush administration’s policies. They have been going on for a very long time in accordance with a superficially plausible ideology, and have now become so entrenched in our democratic political system that they are starting to wreak havoc. This is military Keynesianism — the determination to maintain a permanent war economy and to treat military output as an ordinary economic product, even though it makes no contribution to either production or consumption.

This ideology goes back to the first years of the cold war. During the late 1940s, the US was haunted by economic anxieties. The great depression of the 1930s had been overcome only by the war production boom of the second world war. With peace and demobilisation, there was a pervasive fear that the depression would return.

During 1949, alarmed by the Soviet Union’s detonation of an atomic bomb, the looming Communist victory in the Chinese civil war, a domestic recession, and the lowering of the Iron Curtain around the USSR’s European satellites, the US sought to draft basic strategy for the emerging cold war. The result was the militaristic National Security Council Report 68 (NSC-68) drafted under the supervision of Paul Nitze, then head of the Policy Planning Staff in the State Department. Dated 14 April 1950 and signed by President Harry S Truman on 30 September 1950, it laid out the basic public economic policies that the US pursues to the present day.

In its conclusions, NSC-68 asserted: “One of the most significant lessons of our World War II experience was that the American economy, when it operates at a level approaching full efficiency, can provide enormous resources for purposes other than civilian consumption while simultaneously providing a high standard of living” (4).

With this understanding, US strategists began to build up a massive munitions industry, both to counter the military might of the Soviet Union (which they consistently overstated) and also to maintain full employment, as well as ward off a possible return of the depression. The result was that, under Pentagon leadership, entire new industries were created to manufacture large aircraft, nuclear-powered submarines, nuclear warheads, intercontinental ballistic missiles, and surveillance and communications satellites.

This led to what President Eisenhower warned against in his farewell address of 17 January 1961: “The conjunction of an immense military establishment and a large arms industry is new in the American experience” — the military-industrial complex.

By 1990 the value of the weapons, equipment and factories devoted to the Department of Defense was 83% of the value of all plants and equipment in US manufacturing. From 1947 to 1990, the combined US military budgets amounted to $8.7 trillion. Even though the Soviet Union no longer exists, US reliance on military Keynesianism has, if anything, ratcheted up, thanks to the massive vested interests that have become entrenched around the military establishment.

Over time, a commitment to both guns and butter has proven an unstable configuration.

Military industries crowd out the civilian economy and lead to severe economic weaknesses.

Devotion to military Keynesianism is a form of slow economic suicide.

Higher spending, fewer jobs

On 1 May 2007, the Center for Economic and Policy Research of Washington, DC, released a study prepared by the economic and political forecasting company Global Insight on the long-term economic impact of increased military spending. The research, guided by economist Dean Baker, showed that, after an initial demand stimulus, the effect of increased military spending turns negative by about the sixth year. The US economy has had to cope with growing defence spending for more than 60 years. Baker found that, after 10 years of higher defence spending, there would be 464,000 fewer jobs than in a scenario that involved lower defence spending.

Baker concluded: “It is often believed that wars and military spending increases are good for the economy. In fact, most economic models show that military spending diverts resources from productive uses, such as consumption and investment, and ultimately slows economic growth and reduces employment” (5).

These are only some of the many deleterious effects of military Keynesianism.

It was believed that the US could afford both a massive military establishment and a high standard of living, and that it needed both to maintain full employment. But it did not work out that way. By the 1960s it was becoming apparent that turning over the nation’s largest manufacturing enterprises to the Department of Defense and producing goods without any investment or consumption value was starting to crowd out civilian economic activities.

The historian Thomas E Woods Jr observes that, during the 1950s and 1960s, between one-third and two-thirds of all US research talent was siphoned off into the military sector (6). It is, of course, impossible to know what innovations never appeared as a result of this diversion of resources and brainpower into the service of the military, but it was during the 1960s that we first began to notice Japan was outpacing us in the design and quality of a range of consumer goods, including household electronics and automobiles.

Can we reverse the trend?

Nuclear weapons furnish a striking illustration of these anomalies. Between the 1940s and 1996, the US spent at least $5.8 trillion on the development, testing and construction of nuclear bombs. By 1967, the peak year of its nuclear stockpile, the US possessed some 32,500 deliverable atomic and hydrogen bombs, none of which, thankfully, was ever used. They perfectly illustrate the Keynesian principle that the government can provide make-work jobs to keep people employed.

Nuclear weapons were not just America’s secret weapon, but also its secret economic weapon. As of 2006, we still had 9,960 of them. There is today no sane use for them, while the trillions spent on them could have been used to solve the problems of social security and health care, quality education and access to higher education for all, not to speak of the retention of highly-skilled jobs within the economy.

The pioneer in analysing what has been lost as a result of military Keynesianism was the late Seymour Melman (1917-2004), a professor of industrial engineering and operations research at Columbia University. His 1970 book, Pentagon Capitalism: The Political Economy of War, was a prescient analysis of the unintended consequences of the US preoccupation with its armed forces and their weaponry since the onset of the cold war. Melman wrote: “From 1946 to 1969, the United States government spent over $1,000bn on the military, more than half of this under the Kennedy and Johnson administrations — the period during which the [Pentagon-dominated] state management was established as a formal institution. This sum of staggering size (try to visualize a billion of something) does not express the cost of the military establishment to the nation as a whole. The true cost is measured by what has been foregone, by the accumulated deterioration in many facets of life, by the inability to alleviate human wretchedness of long duration.”

In an important exegesis on Melman’s relevance to the current American economic situation, Thomas Woods writes: “According to the US Department of Defense, during the four decades from 1947 through 1987 it used (in 1982 dollars) $7.62 trillion in capital resources. In 1985, the Department of Commerce estimated the value of the nation’s plant and equipment, and infrastructure, at just over $7.29 trillion… The amount spent over that period could have doubled the American capital stock or modernized and replaced its existing stock” (7).

The fact that we did not modernise or replace our capital assets is one of the main reasons why, by the turn of the 21st century, our manufacturing base had all but evaporated. Machine tools, an industry on which Melman was an authority, are a particularly important symptom. In November 1968, a five-year inventory disclosed “that 64% of the metalworking machine tools used in US industry were 10 years old or older. The age of this industrial equipment (drills, lathes, etc.) marks the United States’ machine tool stock as the oldest among all major industrial nations, and it marks the continuation of a deterioration process that began with the end of the second world war. This deterioration at the base of the industrial system certifies to the continuous debilitating and depleting effect that the military use of capital and research and development talent has had on American industry.”

Nothing has been done since 1968 to reverse these trends and it shows today in our massive imports of equipment — from medical machines like proton accelerators for radiological therapy (made primarily in Belgium, Germany, and Japan) to cars and trucks.

Our short tenure as the world’s lone superpower has come to an end. As Harvard economics professor Benjamin Friedman has written: “Again and again it has always been the world’s leading lending country that has been the premier country in terms of political influence, diplomatic influence and cultural influence. It’s no accident that we took over the role from the British at the same time that we took over the job of being the world’s leading lending country. Today we are no longer the world’s leading lending country. In fact we are now the world’s biggest debtor country, and we are continuing to wield influence on the basis of military prowess alone” (8).

Some of the damage can never be rectified. There are, however, some steps that the US urgently needs to take. These include reversing Bush’s 2001 and 2003 tax cuts for the wealthy, beginning to liquidate our global empire of over 800 military bases, cutting from the defence budget all projects that bear no relationship to national security and ceasing to use the defence budget as a Keynesian jobs programme.

If we do these things we have a chance of squeaking by. If we don’t, we face probable national insolvency and a long depression.

Source:
http://mondediplo.com/2008/02/05military
____________________

Saturday, March 08, 2008

They knew, but did Nothing


by Philip Shenon
Book Excerpt
March 8, 2008


In this exclusive extract from his new book, Philip Shenon uncovers how the White House tried to hide the truth of its ineptitude leading up to the September 11 terrorist attacks.

In the American summer of 2001, the nation's news organisations, especially the television networks, were riveted by the story of one man. It wasn't George Bush. And it certainly wasn't Osama bin Laden.

It was the sordid tale of an otherwise obscure Democratic congressman from California, Gary Condit, who was implicated - falsely, it later appeared - in the disappearance of a 24-year-old government intern later found murdered. That summer, the names of the blow-dried congressman and the doe-eyed intern, Chandra Levy, were much better known to the American public than bin Laden's.

Even reporters in Washington who covered intelligence issues acknowledged they were largely ignorant that summer that the CIA and other parts of the Government were warning of an almost certain terrorist attack - probably, but not necessarily, overseas.

The warnings were going straight to President Bush each morning in his briefings by the CIA director, George Tenet, and in the presidential daily briefings. It would later be revealed by the 9/11 commission that more than 40 presidential briefings presented to Bush from January 2001 through to September 10, 2001, included references to bin Laden.

And nearly identical intelligence landed each morning on the desks of about 300 other senior national security officials and members of Congress in the form of the senior executive intelligence brief, a newsletter on intelligence issues also prepared by the CIA.

The senior executive briefings contained much of the same information that was in the presidential briefings but were edited to remove material considered too sensitive for all but the President and his top aides to see. Often the differences between the two documents were minor, with only a sentence or two changed between them.

Apart from the commission's executive director, Philip Zelikow, the commission's staff was never granted access to Bush's briefings, except for the notorious August 2001 briefing that warned of the possibility of domestic al-Qaeda strikes involving hijackings. But they could read through the next best thing: the senior executive briefings.

During the 2003 investigations it was startling for Mike Hurley, the staff member in charge of investigating intelligence, and the other investigators on his team to discover just what had gone on in the spring and summer of 2001 - just how often and how aggressively the White House had been warned that something terrible was about to happen.

Since nobody outside the Oval Office could know exactly what Tenet had told Bush during his morning intelligence briefings, the presidential and senior briefings were Tenet's best defence to any claim that the CIA had not kept Bush and the rest of the Government well-informed about the threats. They offered a strong defence.

The team's investigators began to match up the information in the senior briefings and pulled together a timeline of their headlines from the northern spring and summer:

"Bin Ladin Planning Multiple Operations" (April 20)

"Bin Ladin Threats Are Real" (June 30)

It was especially troubling for Hurley's team to realise how many of the warnings were directed to the desk of one person: Condoleezza Rice, the National Security Adviser.

Emails from the National Security Council's counter-terrorism director, Richard Clarke, showed that he had bombarded Rice with messages about terrorist threats.

He was trying to get her to focus on the intelligence she should have been reading each morning in the presidential and senior briefings:

"Bin Ladin Public Profile May Presage Attack" (May 3)

"Terrorist Groups Said Co-operating on US Hostage Plot" (May 23)

"Bin Ladin's Networks' Plans Advancing" (May 26)

"Bin Ladin Attacks May Be Imminent" (June 23)

"Bin Ladin and Associates Making Near-Term Threats" (June 25)

"Bin Ladin Planning High-Profile Attacks" (June 30),

"Planning for Bin Ladin Attacks Continues, Despite Delays" (July 2)

Other parts of the Government did respond aggressively and appropriately to the threats, including the Pentagon and the State Department. On June 21, the US Central Command, which controls American military forces in the Persian Gulf, went to "delta" alert - its highest level - for American troops in six countries in the region. The American embassy in Yemen was closed for part of the summer; other embassies in the Middle East closed for shorter periods.

But what had Rice done at the NSC?

If the NSC files were complete, the commission's historian Warren Bass and the others could see, she had asked Clarke to conduct inter-agency meetings at the White House with domestic agencies, including the Federal Aviation Administration and the FBI, to keep them alert to the possibility of a domestic terrorist strike.

She had not attended the meetings herself.

She had asked that the then attorney-general, John Ashcroft, receive a special briefing at the Justice Department about al-Qaeda threats.

But she did not talk with Ashcroft herself in any sort of detail about the intelligence.

Nor did she have any conversations of significance on the issue with the FBI director, Louis Freeh, nor with his temporary successor that summer, the acting director Tom Pickard.

There is no record to show that Rice made any special effort to discuss terrorist threats with Bush.

The record suggested, instead, that it was not a matter of special interest to either of them that summer.

Bush seemed to acknowledge as much in an interview with Bob Woodward of The Washington Post that Bush almost certainly regretted later. In the interview in December 2001, only three months after the attacks, Bush said that "there was a significant difference in my attitude after September 11" about al-Qaeda and the threat it posed to the United States.

Before the attacks, Bush said: "I was not on point, but I knew he was a menace, and I knew he was a problem. I knew he was responsible, or we felt he was responsible, for the previous bombings that killed Americans. I was prepared to look at a plan that would be a thoughtful plan that would bring him to justice, and would have given the order to do that. I have no hesitancy about going after him. But I didn't feel that sense of urgency, and my blood was not nearly as boiling."

If anyone on the White House staff had responsibility for making Bush's blood "boil" that summer about Osama bin Laden, it was Rice.

The members of Mike Hurley's team were also alarmed by the revelations, week by week, month by month, of how close the commission's executive director, Philip Zelikow, was to Rice and others at the White House.

They learned early on about Zelikow's work on the Bush transition team in 2000 and early 2001 and about how much antipathy there was between him and Richard Clarke.

They heard the stories about Zelikow's role in developing the "pre-emptive war" strategy at the White House in 2002.

Zelikow's friendships with Rice and others were a particular problem for Warren Bass, since Rice and Clarke were at the heart of his part of the investigation. It was clear to some members of the team that they could not have an open discussion in front of Zelikow about Rice and her performance as National Security Adviser.

They could not say openly, certainly not to Zelikow's face, what many on the staff came to believe: that Rice's performance in the spring and summer of 2001 amounted to incompetence, or something not far from it.

David Kay, the veteran American weapons inspector sent to Iraq by the Bush Administration in 2003 to search for weapons of mass destruction, passed word to the commission that he believed Rice was the "worst national security adviser" in the history of the job.

For Hurley's team, there was a reverse problem with Clarke. It was easy to talk about Clarke in Zelikow's presence, as long as the conversation centred on Clarke's failings at the NSC and his purported dishonesty.

Long before Bass had seen Clarke's files, Zelikow made it clear to the team's investigators that Clarke should not be believed, that his testimony would be suspect.

He argued that Clarke was a braggart who would try to rewrite history to justify his errors and slander his enemies, Rice in particular. The commission had decided that in its private interviews with current and former government officials, witnesses would be placed under oath when there was a substantial reason to doubt their truthfulness. Zelikow argued that Clarke easily fell into that category; Clarke, he decreed, would need to be sworn in.

When he finally got his security clearance and was allowed into the reading room, Bass discovered he could make quick work of Rice's emails and internal memos on the al-Qaeda threat in the spring and summer of 2001. That was because there was almost nothing to read, at least nothing that Rice had written herself.

Either she committed nothing to paper or email on the subject, which was possible since so much of her work was conducted face-to-face with Bush, or terrorist threats were simply not an issue that had interested her before September 11. Her speeches and public appearances in the months before the attacks suggested the latter.

Tipped off by an article in The Washington Post, the commission discovered the text of a speech that she had been scheduled to make on September 11, 2001 - the speech was cancelled in the chaos following the attacks - in which Rice planned to address "the threats of today and the day after, not the world of yesterday".

The speech, which was intended to outline her broad vision on national security and to promote the Bush Administration's plans for a missile defence system, included only a passing reference to terrorism and the threat of radical Islam. On the day that Osama bin Laden launched the most devastating attack on the United States since Pearl Harbour, bin Laden's terrorist network was seen by Rice as only a secondary threat, barely worth mentioning.

But if Rice had left almost no paper trail on terrorism in 2001, Clarke's files were everything that Bass could have hoped for. Clarke wrote down much of what he saw and heard at the White House, almost to the point of obsession when it came to al-Qaeda. Bass and his colleagues could see that Clarke had left a rich narrative of what had gone so wrong at the NSC in the months before September 11, albeit filtered through the writings of the very opinionated Clarke.

Repeatedly in 2001, Clarke had gone to Rice and others in the White House and pressed them to move, urgently, to respond to a flood of warnings about an upcoming and catastrophic terrorist attack by Osama bin Laden. The threat, Clarke was arguing, was as dire as anything that he or the CIA had ever seen.

He pushed for an early meeting in 2001 with Bush to brief him about bin Laden's network and the "nearly existential" threat it represented to the United States.

But Rice rebuffed Clarke.

She allowed him to give a briefing to Bush on the issue of cyber terrorism, but not on bin Laden; she told Clarke the al-Qaeda briefing could wait until after the White House had put the finishing touches that summer on a broader campaign against bin Laden. She moved Clarke and his issues off centre stage - in part at the urging of Zelikow and the transition team.

Bass told colleagues that he gasped when he found a memo written by Clarke to Rice on September 4, 2001, exactly a week before the attacks, in which Clarke seemed to predict what was just about to happen.

It was a memo that seemed to spill out all of Clarke's frustration about how slowly the Bush White House had responded to the cascade of terrorist threats that summer. The note was terrifying in its prescience.

"Are we serious about dealing with the al-Qaeda threat?" he asked Rice. "Decision makers should imagine themselves on a future day when the CSG [Counterterrorism Security Group] has not succeeded in stopping al-Qaeda attacks and hundreds of Americans lay dead in several countries, including the US."

Bass's colleagues said he knew instantly that the September 4 email was so sensitive - and potentially damaging, especially to Rice - that the White House would never voluntarily release a copy to the commission or allow him to take notes from the room if they came close to reproducing its language.

Under a written agreement between the commission and the White House, notes could not "significantly reproduce" the wording of a classified document.

Bass decided he would have to try to memorise it in pieces, several sentences at a time, and then rush back to the commission to bat them out on a computer keyboard.

The day he discovered the document, Bass all but burst into the commission's offices and rushed over to Hurley.

"Holy shit, chief," Bass said excitedly. "You won't believe what I found."

He told Hurley that Clarke's September 4 memo was a "document that grabs you by the throat, a document that you write when you're at the end of your tether - or well past it", as Clarke clearly was in the weeks before September 11. Hurley instantly understood the significance of what he was being told by Bass.

The question for both men was whether Zelikow would allow them to share any of it with the public.

Months later, Bass could not take it any longer. He was going to quit, or at least threaten to quit, and he was going to make it clear that Zelikow's attempts at interference - his efforts to defend Rice and demean Clarke - were part of the reason why. He marched into the office of Dan Marcus, the general counsel, to announce his threat to leave the investigation.

"I cannot do this," he declared to Marcus, who was already well aware of Bass's unhappiness. "Zelikow is making me crazy."

He was outraged by Zelikow and the White House; Bass felt the White House was trying to sabotage his work by its efforts to limit his ability to see certain documents from the NSC files and take useful notes from them.

Marcus urged him to calm down: "Let's talk this through." But Bass made it clear to colleagues that he believed Zelikow was interfering in his work for reasons that were overtly political - intended to shield the White House, and Rice in particular, from the commission's criticism.

For every bit of evidence gathered by Bass and Hurley's team to bolster Clarke's allegation that the White House had ignored terrorist threats in 2001, Zelikow would find some reason to disparage it.

Marcus and Hurley managed to talk Bass out of resigning, although the threat lingered until the final weeks of the investigation.

On May 15, 2002, the CBS network reported that a daily briefing presented to Bush a few weeks before the September 11 attacks warned him specifically about the threat of a domestic hijacking by al-Qaeda.

Instead of releasing the briefing or at least offering a detailed explanation of what was in the document, the White House chose to have Rice hold a news conference at the White House in which she raised as many questions about the briefing as she answered.

It would later become clear to many of the commission's members and its staff that she had tried to mislead the White House press corps about the contents of the briefing.

She acknowledged that Bush had received a briefing about possible al-Qaeda hijackings, but she claimed that the brief offered "historical information" and "was not a warning - there was no specific time, place, or method".

She failed to mention, as would later be clear, that the briefing focused entirely on the possibility that al-Qaeda intended to strike within the United States; it cited relatively recent FBI reports of possible terrorist surveillance of government buildings in New York.

Tom Kean, the commission's chairman, could not deny the thrill of this. A former governor of New Jersey who had left politics to become president of Drew University in his home state, Kean took a seat in the reading room in the New Executive Office building where the commission was reviewing the White House's most secret files.

Kean was handed a sheaf of presidential briefings from the Clinton and Bush administrations.

Here in his hands were the documents that the White House had been so determined for so long to keep from him. Lee Hamilton liked to refer to the briefings as the "holy of holies" - the ultimate secret documents in the government - and Kean assumed that must be the case.

"I thought this would be the definitive secrets about al-Qaeda, about terrorist networks and all the other things that the President should act on," he said. "I was going to find out the most important things that a president had learned." He assumed they would contain "incredibly secretive, precise, and accurate information about anything under the sun."

Each brief was only several pages long, so Kean could read through months of them in a stretch of a few hours.

And he found himself terrified by what he was reading, really terrified. Here were the digests of the most important secrets that were gathered by the CIA and the nation's other spy agencies at a cost of tens of billions of dollars a year.

And there was almost nothing in them.

"They were garbage," Kean said. "There really was nothing there - nothing, nothing."

If students back at Drew turned in term papers this badly researched, "I would have given them an F," he said.

Kean pointed that out to one of his White House minders who accompanied him to the reading room. "I've read all this," he told the minder in astonishment. A lot of the information in the briefings and other supposedly top secret intelligence reports had already been revealed by the nation's big news organisations. "I already knew this."

"Oh, but you're missing the point," the minder replied. "Now you know it's true."

It occurred to Kean that this might be the commission's most frightening discovery of all: The emperors of espionage had no clothes. Perhaps the reason the White House had fought so hard to block the commission's access to the briefings was that they revealed how ignorant the Government was of the threats it faced before September 11.

Kean could understand their fear.

Imagine the consequences if al-Qaeda and its terrorist allies knew how little the US really knew about them.

Commission member Jamie Gorelick, who, along with Zelikow, was given access to the larger universe of briefings, was more impressed by the documents than Kean had been. Or at least she was less unimpressed. She knew the Bush Administration was right to complain that much of the intelligence in the briefs in the months before September 11 was maddeningly non-specific about a possible date or place of an attack. Some of the intelligence in the briefs was "paltry"; sometimes the information contradicted itself from one day to the next, Gorelick said.

But she was astonished by the sheer volume of the warnings.

Flood, cascade, tsunami, take your pick of metaphors.

She could see that in the spring and summer of 2001, there was a consistent drum beat of warnings, day after day, that al-Qaeda was about to attack the United States or its allies.

It was clear to Gorelick that the CIA had gone to Bush virtually every morning for months in 2001 to give him the message that the United States needed to be ready for a catastrophic terrorist strike, and from what she was reading, no one ruled out the possibility of a domestic attack.

"Something is being planned, something spectacular," she said, summarising what the President had been told by George Tenet and what Bush should have read in the briefings. "We don't know what it is, we don't know where it is, but something is happening."

She said CIA analysts were trying to tell Bush, as bluntly as they could, that the threat in those months was "the worst thing they've ever seen - an unprecedented threat," worse than the threats before the millennium.

It seemed to Gorelick that Rice had "assumed away the hardest part of her job" as national security adviser - gathering the best intelligence available to the White House and helping the President decide how to respond to it.

Whatever her job title, Rice seemed uninterested in actually advising him. Instead, she wanted to be his closest confidant - specifically on foreign policy - and to simply translate his words into action. Rice had wanted to be "the consigliere to the President", Gorelick thought.

Domestic issues seemed to bore her.

Her deputy, Stephen Hadley, had told the commission something remarkable in his private interview the month before: He and Rice had not seen themselves as responsible for co-ordinating the FBI and other domestic agencies about terrorism.

But if they weren't responsible, who was?

There was no separate domestic security adviser in the White House.

They had just demoted Clarke.

At the time of her May 2002 news conference, no reporter had a copy of the presidential briefing. CBS had broken the story of its existence but had few details of what was actually in the document. So the White House press corps would have to trust Rice's description of what was in it.

She described it as "not a warning briefing but an analytic report" about al-Qaeda threats and said that it contained "the most generalised kind of information - there was no time, there was no place, there was no method of attack" mentioned apart from a "very vague" concern about hijacking. "I want to reiterate," she said. "It was not a warning."

Asked if September 11 didn't represent an intelligence failure by the Administration, she replied almost testily: "I don't think anybody could have predicted that these people would take an airplane and slam it into the World Trade Centre, take another one and slam it into the Pentagon - that they would try to use an airplane as a missile."

Rice's news conference came eight months after the attacks. Yet she was suggesting that in all that time, no one had bothered to tell her that there were indeed several reports prepared within the CIA, the aviation administration, and elsewhere in the Government about the threat of planes as missiles.

Had no one told her in all those months that the Department of Defence had conducted drills for the possibility of a plane-as-missile attack on the Pentagon? Had she forgotten that when she and Bush attended the G8 summit in Italy in July 2001, the airspace was closed because of the threat of an aerial suicide attack by al-Qaeda?

Commission member Tim Roemer made it his goal to get the August 6 briefing made public and to prove once and for all that Rice and her White House colleagues had a concept of the truth about September 11 that was, at best, "flexible". To Roemer, Rice had long ago passed the "threshold" between spin and dishonesty.

"She'd lost credibility with me," he said. The question among the Democratic commissioners was whether anybody would be brave enough to go public to question Rice's competence and her honesty.

Much as the staff felt beaten down by Zelikow, so did the other Democratic commissioners.

By the end, they had given up the fight to document the more serious failures of Bush, Rice, and others in the Administration in the months before September. Zelikow would never have permitted it. Nor, they realised, would Kean and Hamilton.

The Democrats hoped the public would read through the report and understand that September 11 did not have to happen - that if the Bush Administration had been more aggressive in dealing with the threats flooding into the White House from January 2001 through to September 10, 2001, the plot could have been foiled.

The Clinton administration could not duck blame for having failed to stop bin Laden before 2001.

But what had happened in the White House in the first eight months of George Bush's presidency had all but guaranteed that 19 young Arab men with little more than pocket knives, a few cans of mace, and a misunderstanding of the tenets of Islam could bring the US to its knees.

The Commission - The Uncensored History Of The 9/11 Investigation by Philip Shenon (Little, Brown, $35) is published on Monday.

Source:
http://www.smh.com.au/news/world/they-knew-but-did-nothing/2008/03/07/1204780065676.html?page=fullpage#contentSwap1
____________________