Nuking Iran: Radiation


Secrets and lies have driven the history of the Bomb. We see this pattern repeated today, in an effort to make nuclear weapons seem no different from other explosives.  But these lies become even more dangerous now, with continuing signs that the Bush administration may be heading for war on Iran, with reports of U.S. officials considering using nuclear weapons in Iran, and with a large explosives test scheduled in Nevada in early June that many believe is intended to provide data for designing a new nuclear “bunker-buster” weapon, or for calibrating an existing nuclear device.

The most important secrets and lies concerned radiation, the distinguishing effect of the Bomb, beyond its sheer power.  The effects of radiation were denied, dismissed and minimized for decades.  Today they are not even mentioned.  It is especially important to revisit this history because, according to the Physicians for Social Responsibility, a nuclear earth penetrating weapon “would actually create more fallout than a ground-burst or airburst weapon, due to the increased distribution of radioactive debris from detonation at a shallow depth in soil or rock.”

Radiation and the history of denying it and confronting it is the subject of this essay.

From its very beginnings, the atomic bomb has been mysterious.  Even the physicists who lived together at Los Alamos to develop it did not know what they had invented.  They didn’t know how powerful the first Bomb would be–they had a betting pool on the yield, and their guesses ranged from serious underestimates to wild overestimates.   About half the scientists didn’t think the device would explode at all. Enrico Fermi took bets on whether it would burn off the Earth’s atmosphere.

But most of the mystery was deliberate.  The Bomb was developed in complete secrecy so as not to tip off the Nazis, who were believed to be working on their own Bomb project.  Even after Germany’s defeat, the Bomb was kept secret not only from the remaining enemy, Japan, but also from America’s wartime allies.  Then after the war, as the truth of the Bomb’s effects became clear to scientists, the American military and Washington policymakers tried to keep some of those effects secret from U.S. citizens, even to the point of outright lies.

The Bomb produces three effects: blast, heat and radioactivity (commonly called radiation).  The blast is immensely more powerful, and the heat is immensely more intense, than any other manmade device can produce.  Together they resulted, in the Bomb’s first test, in killing every living thing within a mile, including insects.  A single Bomb virtually leveled each of the cities of Hiroshima and Nagasaki.  Human beings were vaporized.  Nothing was left of some but their shadows burnt into concrete.  Others were seared to a small pile of ashes.  The remains of some were fused with metal doors and other objects.

Those effects were immediately apparent.  But it took some time for the effects of radiation to be understood, and even longer to be acknowledged.

What We Know Now

The first humans exposed to an atomic bomb blast were those living in Hiroshima and Nagasaki in the summer of 1945.  These included some American children whose parents, Japanese immigrants or Americans of Japanese ancestry living in the U.S., had been sent to internment camps.  These children had been sent to “safe” areas in Japan, such as Hiroshima.

Some of those who survived the initial blast and heat but were heavily exposed to radiation began to develop radiation poisoning symptoms after about twenty-four hours: severe nausea, fever and vomiting. In his 2005 book, “The Bomb: A Life”, scholar Gerard DeGroot writes: “The damage to cells was so widespread that recovery was impossible.  Death occurred after about a week, before doctors had any inkling of what was wrong.”

For others, symptoms didn’t begin for ten to fifteen days.  They suffered from bloody diarrhea, “a loss of appetite, general malaise, persistent fever and hair loss.”  The symptoms were delayed because gamma rays attack bone marrow where new blood cells are formed, and begin to produce defective cells.  

“In the worst cases of radiation poisoning, the gamma rays virtually destroy the entire bone marrow. The cessation of red cell formation leads to progressive anemia.  Deficiency of platelet formation causes thin blood to hemorrhage into the skin and the retina of the eye, and sometimes into the intestines and kidneys.  The fall in the number of white cells lowers the victim’s resistance to infections. When infections occurred among Hiroshima and Nagasaki victims, it usually spread from the mouth and was accompanied by gangrene of the lips, tongue and throat.  Patients often emitted a terrible smell—they had effectively started to decay from the inside.”[pages 107-08]

Those who escape this severe radiation poisoning, perhaps because they were farther from the blast, will not know for years, and perhaps will never know, the extent of the damage caused by radiation.  “Ionizing radiation released in a nuclear explosion passes through the skin without causing external damage. It interacts immediately with tissues within the body, causing an irregular pattern of cell damage.”   Those who survive the attack on high turnover tissues, such as those involved in blood formation, may suffer effects on tissues with slower turnover, in the brain, liver or thyroid gland.  In these, “…the effects of radiation damage may not become apparent for months or years, and can eventually manifest themselves as cancers.”

Then there is danger to the unborn.  Damaged or destroyed cells in a fetus may impair the development of organs and parts of the body.  “Radiation can also damage DNA in the reproductive system, causing mutation in future generations. While scientists once thought that a `safe’ level of exposure existed, current medical opinion holds that there is no threshold dose below which an effect is not produced.”[emphasis added]      

These effects were caused by the Bombs dropped on Hiroshima (approximately 15 kilotons) and Nagasaki (about 20 kilotons).  The nuclear bunker-busters that could be used in Iran may have a yield of up to 10 kilotons by some estimates, but most believe the yield goes up to 340 kilotons, more than 22 times the power of the Bomb that destroyed Hiroshima.

More than half of those who died from the effects of the Hiroshima Bomb within the first five years after it was dropped, and more than two-thirds of those who died within five years after Nagasaki, had survived the initial blast and fire.

A Generation of Lies

Although radioactive fallout from the first Bomb test in 1945 contaminated cattle, the effects were kept secret, along with everything else about the test.  But even after the war, the secrecy continued.  In “Bombs in the Backyard”, a self-described balanced account of nuclear testing, A. Costandina Titus writes that “Even Congress has been denied access to information.”  

General Leslie Groves had ridden herd on the Manhattan Project that developed the Bomb, and he continued the policy of secrecy, which soon became a policy of denial.  When the first reports of radiation sickness in Hiroshima surfaced, he dismissed them as “Japanese propaganda.”  William Laurence, the only reporter permitted to follow the Bomb’s development, echoed the charge.

Later, when radioactive fallout entered the news, American officials  insisted that radiation exposure was painless to humans and test animals.  General Groves testified to Congress that radiation poisoning was “a very pleasant way to die.”

Few precautions were taken for service personnel involved in the first postwar Bomb tests in the South Pacific in 1946, or in many subsequent tests there and in Nevada.  When military personnel and others exposed to test fallout, whether deliberately or accidentally, later became ill, the government refused to consider that the nuclear explosions were related or responsible, and it maintained this heartless lie for decades.

But one of the doctors involved in monitoring radiation and physical effects from those 1946 tests would be among the first to sound a public alarm.  David Bradley’s book, “No Place to Hide”, was published in 1948 and became an immediate best seller.  He later revised it to include further information as well as medical studies from later atomic and hydrogen bomb tests in the Pacific. He reported, for example, that after 406 Pacific islanders were exposed to H-bomb fallout in 1954, nine children were born retarded, ten more with other abnormalities, and three were stillborn, including one reported to be “not recognizable as human.”

But the first writing to bring some of the reality of radiation to Americans was John Hersey’s “Hiroshima”, published in the New Yorker in August 1946, and soon after as a best-selling book.  The stories of six Hiroshima survivors ended with riveting accounts of the ongoing effects of radiation.  This was the occasion for more stories (and more denials) about the effects of radioactivity.

Still, Bomb testing went on, as necessary to the national defense, particularly after the Soviet Union unexpectedly exploded its first atomic Bomb in 1949. The U.S. returned to exploding atomic Bombs within its own borders in 1951, and radiation from a Nevada test was detected in the snow that fell on Rochester, New York.  By early 1953, there had been 20 tests in Nevada.  A seven-year-old boy 70 miles from Ground Zero in Nevada who died of leukemia “became possibly the first baby boom casualty of the atomic age.” (“Great Expectations” by Landon Y. Jones, p. 59.)

Them!

The undercurrent of news about radiation’s effects continued throughout the 1950s, as the U.S. and Soviet Union exploded hundreds of atomic bombs, including hydrogen bombs (which some say are to atom bombs what atom bombs are to conventional explosives).  Testing and its effects became a campaign issue in the 1956 presidential election. Strontium-90, a radioactive isotope that lodges in bones and causes cancer, was discovered in cow’s milk across America.  Still, the official word was that there was nothing to worry about.

The likelihood (since proven) that U.S. nuclear secrets were passed to the Russians added fuel to what became McCarthyism in the 1950s.  Now dissent concerning the Bomb could be criminal treason as well as unpatriotic.  So much of the fear Americans had about nuclear radiation and the Bomb itself was driven underground, into the collective unconscious, and to the popular expression of that unconscious: the movies.

Monsters created or unleashed by nuclear explosions became the decade’s B-movie cliché.  But one of the first remained one of the best: “Them!” released in 1954.  The film is fascinating today partly because several relatively unknown actors became stars, mostly in the new medium of television: James Arness in “Gunsmoke,” James Whitmore in “The Law and Mr. Jones,” Leonard Nimoy (with a very small part) in “Star Trek,” and Fess Parker, a young actor Walt Disney saw in this movie and cast as Davy Crockett, the first TV hero to be a national phenomenon.

But the fact that these actors were unknowns in 1954 lent credibility to the story, which was mostly a step-by-step investigation into a horrific phenomenon—radiation from atomic testing mutated a colony of ordinary ants into a race of giant ants, killing, breeding and preparing to swarm on Los Angeles and other cities, where they could begin their conquest of humanity.

The movie dealt with a number of themes related to the Cold War and the Bomb, but it was remarkably forthright about the source of the fears it symbolized. “If these monsters got started as a result of the first atomic bomb tests in 1945, what about all the others that have been exploded since then?” asks James Arness, the FBI man of action.  “I don’t know,” says the beautiful woman scientist.  “Nobody knows,” says her father, the elder scientist. “When man entered the atomic age, he opened the door into a new world.  What we eventually find in that new world nobody can predict.”

There would be many more Bomb-themed films (including the original Japanese version of Godzilla, which dealt more forthrightly with Bomb themes than the version Americans saw.  The original will be available on DVD for the first time in September.)  In his book, “Apocalypse Movies”, Kim Newman makes the valuable point that the B movie divisions of major studios tended to glorify the military in their Bomb-theme movies, while independent films were more questioning, and revealed more of the real horror.  They also tended to extend mutations to human beings, as in “The Incredible Shrinking Man.”  

But as these eruptions from the unconscious became formulaic “bug-eyed monster” movies, a few filmmakers began to deal openly with the effects of nuclear war.  The most influential of the 1950s, and the one that dealt most directly with mass death by radioactive fallout as the ultimate outcome of nuclear war, was Stanley Kramer’s “On the Beach,” starring Gregory Peck, Ava Gardner and Fred Astaire.  Set in Australia after the U.S. and the Soviet Union have destroyed each other, the characters learn they are doomed from the fallout heading their way.   There are no explosions, no monsters, no gruesome deaths.  Yet Nobel Laureate and anti-Bomb activist Linus Pauling said, “It may be that some years from now we can look back and say that `On the Beach’ is the movie that saved the world.”

Throughout the Cold War and the nuclear arms race, there were films and TV movies that tried to bring the horror into public consciousness.  “The War Game” by Peter Watkins, a docudrama about the effects of a nuclear war in one English village, was made for BBC-TV in 1965 but the BBC refused to show it until 1985.  It was seen in art houses in the U.S. and elsewhere as a feature in the 60s and 70s.  Also on British TV in the 1980s was “Threads,” which carried the effects of radiation past one generation.  While a survivor society struggles in a burnt-out and irradiated world, a 12-year-old giving birth screams at the sight of her deformed stillborn baby.  It was a harrowing ending to a truly horrifying film.

There were two prominent TV films in the U.S. in the 80s, which also showed survivors of nuclear war struggling valiantly and hopelessly.  The better known was “The Day After” directed by Nicholas Meyer, starring Jason Robards and JoBeth Williams.  Set in Lawrence, Kansas, it centers on a doctor (Robards) who deals with an impossible emergency over days and weeks as he and everyone else gradually succumb to radiation poisoning.  The TV movie ends with the warning that as fatalistic as the story seemed, a full-scale nuclear war would have far worse effects.

“The Day After” had an effect on American consciousness in the 1980s similar to “On the Beach” in the late 1950s.  But another TV film brought the effects home. “Testament” by Lynne Littman, starring Jane Alexander, followed events in an isolated northern California town.  Without graphic images, it simply shows a family and a town living to the end of the world, as radiation poisons everyone and everything.

Radiation from hundreds of thermonuclear bombs is enough to destroy civilization.  But radiation from a single Bomb of relatively low yield killed hundreds of thousands in Japan.  It could happen in Iran and perhaps the surrounding region, with some dying in days, some in weeks, and some in years or even decades.  Yet no one is talking about this.  It is time to start.

Coming Attractions

All aspects of the Bomb have been a challenge to the human psyche as well as to human institutions.  Exploring the psychological aspects, expressed again in films like “Dr. Strangelove”, may help illuminate our current head-in-the-sand attitude about the prospect of unleashing nukes in Iran.  Exploring the institutional and political structures that have allowed humanity to live with the existence of the Bomb, may be instructive in exploring the possible geopolitical consequences of nuking Iran.  Those are subjects of forthcoming essays.