Showing posts with label radiation risk to Japanese. Show all posts

Wednesday, April 13, 2011

RADIATION RISKS: SCIENCE MAGAZINE ARTICLE PUBLISHED 25 MARCH 2011 BY THE AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE


Science 25 March 2011:
Vol. 331 no. 6024 p. 1504
DOI: 10.1126/science.331.6024.1504

NEWS & ANALYSIS

Devastation in Japan
Radiation Risks Outlined by Bombs, Weapons Work, and Accidents

Jocelyn Kaiser

The ongoing leaks from Japan's crippled Fukushima Daiichi nuclear power plant have raised concern that some workers and even the public could be exposed to dangerous levels of radiation. So far, officials have said that levels outside the plant are low. But how do they know how much radiation is harmful? Risk calculations are based heavily on a 63-year study of 94,000 people who survived the two atomic bombs dropped on Japan in August 1945. It is one of the largest, longest population studies ever done; for radiation safety, it is the gold standard. Its breadth and precision are “magnificent,” says John Boice, scientific director of the International Epidemiology Institute in Rockville, Maryland, and former chief of radiation epidemiology at the U.S. National Cancer Institute (NCI) in Bethesda, Maryland.

Up to 200,000 people in Nagasaki and Hiroshima died soon after the bomb blasts—some from radiation sickness—but more survived. To understand the delayed effects of radiation, the U.S. National Academy of Sciences launched a joint study with Japan of bomb survivors, using a 1950 census to track them down. Initial case reports of cataracts, leukemia, and birth defects eventually became a long-term study to follow cancer and other illnesses in about 94,000 survivors and 26,000 unexposed residents. It was run by a U.S.- and Japanese-funded agency eventually named the Radiation Effects Research Foundation (RERF).


Researchers “spent a huge amount of time reconstructing where people were ATB, at time of blast,” says epidemiologist Richard Monson of Harvard University. They asked subjects whether they were inside or outdoors, near a window, upstairs or downstairs, and which direction they faced. They even constructed a mock Japanese village in the Nevada desert, hoisted a uranium reactor up in a tower, and measured the neutrons it spewed to study the movement of radiation through the air and into buildings. To fine-tune gamma-radiation estimates, researchers tested ceramic roof tiles on Japanese houses for a high-energy electron signature, says Tore Straume of NASA Ames Research Center in Mountain View, California. The dosimetry underwent several revisions over the decades.

Other RERF researchers monitored the health of survivors. Using death records and cancer registries, they soon documented leukemias, particularly in the young, tallying 219 deaths by 2002 in people receiving a significant exposure—a 45% rise above the number expected. Leukemia incidence appeared to peak in 1950. By the 1970s, researchers were tracking an elevated rate of solid cancers; “it looks like it persists for a lifetime,” says NCI epidemiologist Kiyohiko Mabuchi.

Still, “people's perceptions of cancer caused are probably different from reality,” says biostatistician Dale Preston of Hirosoft International, who worked at RERF for 23 years. Leukemias were relatively rare, and although by 1998 about 7851 survivors in the study who received significant exposures developed solid cancers, only 850 cases, or 11%, have been attributed to radiation. (The cancer risk was about 50% higher for those who received at least 1 sievert of radiation; the risk drops with dose to 2% below 0.1 sievert. Lifetime risks are lower; for an exposure of 0.250 sieverts, the allowed limit for workers at the Daiichi plant, the increased risk of ever developing cancer is about 2.5%, Boice says.)

Data from hundreds of medical studies have been used to bolster the A-bomb–survivor results. In the early 20th century, before the risks were recognized, radiation was used to diagnose or treat everything from mastitis to tonsillitis—and some patients developed cancer, Boice notes. Studies of workers—such as women who applied radium paint to clock dials and later developed bone cancer—also proved useful. Studies of about 21,000 workers exposed to radiation starting in 1948 at the former Soviet Union's Mayak nuclear weapons plant, and of 30,000 villagers nearby along the Techa River, are proving “important,” Preston says: The cohorts received a wide range of radiation doses, and many workers inhaled plutonium, a long-lived radioisotope absent in the Japanese A-bomb survivors.

These studies and others of nuclear workers “in general have supported estimates from the A-bomb survivors,” says NCI's Ethel Gilbert. Controversy remains, however, about whether the bomb survivors' brief, one-time exposure would be as harmful if spread over many years. “It's the one major unanswered question,” Boice says.

Studies of nuclear accidents have been less useful for estimating dose responses, although they confirm that it's hard to see health effects from low-level exposures. The 1979 accident at Three Mile Island in Pennsylvania exposed the nearby population to a “trivial” amount of radiation, Boice says; health effects were not detected. The 1986 Chernobyl accident, on the other hand, spewed iodine-131 and cesium-137 for 10 days in a plume that reached 5 million people. Researchers expect that 4000 excess cancer deaths will eventually result. But precise dose information is lacking even for the “liquidators,” the 600,000 workers who helped clean up, says Mabuchi, making it difficult to link exposure to disease.

The only clear health effect among the public from Chernobyl so far has been more than 6000 cases of thyroid cancer (15 of them fatal), mainly in people who as children and adolescents drank milk from cows that fed on grass tainted with iodine-131. This should “not be a problem in Japan” because contaminated milk and vegetables are being removed from the food supply, says radiologist Fred Mettler of the University of New Mexico, Albuquerque, a consultant to the United Nations on the Chernobyl disaster.

Studies of the A-bomb survivors (40% of the original group are still living) continue at RERF. A few of its 45 researchers are helping Japanese officials monitor the population near the Daiichi plant, says RERF vice chair and research chief Roy Shore. They're also discussing a possible study of the hundreds of workers involved in keeping the plant under control. If the Fukushima crisis becomes another Chernobyl, Shore says, “we'd certainly want to compile all the data we could.”

Blogger's Note: This is a lot less scary than I had expected. The main risk seems to be to children drinking milk, in which radioiodine has been reported at 300% above EPA maximums in the U.S. However, I suspect that EPA maximums may be too conservative for adults, for reasons I have reported here. Personally, I will continue drinking my glass of milk with breakfast unless and until the radioisotope concentrations are measured to be substantially greater.

Sunday, April 10, 2011

WHAT DO EXPERTS SAY ARE THE CONSEQUENCES OF EXPOSURE TO LOW-LEVEL NUCLEAR RADIATIONS? WELL, IT ALL DEPENDS ON WHETHER OR NOT YOU ASK EXPERTS WHO ACTUALLY DID THE STUDIES.


Certain Levels of Nuclear Radiations Are Not Only Harmless, but Actually Helpful to Human Health

Cave men (and our African “Eve”) surely breathed a lot more Radon-222 than we above-ground dwellers ...but instead of dying of radiation poisoning, they survived two brutal ice ages and passed their genes along to us!

By David L. Griscom


From the time I joined the Naval Research Laboratory (NRL) in Washington DC in December of 1967 until my retirement in January of 2001, I continually used high-energy x-ray and Cobalt-60 gamma-ray sources in the course of my research. These sources were very dangerous if mishandled, but the precautions were neither difficult to carry out nor hard to remember. Still, the U.S. government required us to take refresher courses in safety procedures once a year from health physics experts.

These courses usually began with a recitation of the litany of radiations to which practically everyone is subject:
5 mrem per round-trip flight NY-LA
10 mrem per chest x-ray
40 mrem/year due to radioactive Potassium-40 in your body
60 mrem/year background radiation (including the residuum of all past A-bomb tests)
60 mrem/year from other medical x rays, and finally
200 mrem/year due to Radon-222 leaking out of the ground into our homes
All of the preceding numbers are whole-body doses. At some point I learned from a reputable science magazine that the worst of all is Polonium-210 at 8,000 mrem per year for smokers of 1 ½ packs of cigarettes a day – and this dose is delivered directly to the lungs! I wasn’t sure if that was true or not. But one warm spring morning at the halfway point of our 2-hour refresher training, the instructor proposed that we go outside for a short break. So when he came out and lit up a cigarette, I sauntered over to him and said simply “Polonium-210?” ...to which he sheepishly replied “Well, I only smoke 2 or 3 cigarettes a day.” Confirmed: Po-210 in your lungs from cigarette smoking can be really bad news.
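The annual figures in the litany above can be totted up in a few lines. This is just arithmetic on the numbers as quoted (using the standard conversion 100 mrem = 1 mSv), not an authoritative dose assessment; the per-event entries (the NY-LA flight and the chest x-ray) are excluded because they are not yearly figures:

```python
# Back-of-the-envelope sum of the routine annual whole-body doses listed
# above, in millirem. Figures are as quoted in the text, not authoritative.
annual_mrem = {
    "potassium-40 in the body": 40,
    "background (incl. A-bomb test residuum)": 60,
    "other medical x rays": 60,
    "radon-222 in homes": 200,
}

total_mrem = sum(annual_mrem.values())
total_msv = total_mrem / 100  # standard conversion: 100 mrem = 1 mSv

print(f"{total_mrem} mrem/year = {total_msv} mSv/year")
```

The routine total, about 360 mrem (3.6 mSv) per year, is dominated by radon, which is why Rn-222 merits its own discussion.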

That left only Radon-222 for me to worry about (provided I avoided serious doses of the x and gamma rays I worked with). So my wife and I bought some detector cartridges and placed them around our home, and after a period of time sent them out to a lab for analysis. Rn-222 is a gaseous decay product of Uranium-238, which is present in various amounts in virtually all rocks and soils (see the primer I posted here). Therefore it was no surprise that only the detector we placed in our deep basement had a significant Rn-222 reading. I didn’t worry much about it at the time, but should I have?

When I first arrived at NRL, one of the older scientists there, Herb Rosenstock, was working on the question of how much ionizing radiation is too much for humans. There were – and still are – two theories on that. One of them, the Linear No-Threshold (LNT) model, states that no amount of radiation is too small to cause harm to human beings.  And it is exclusively this theory that seems to be promulgated in recent news accounts (for example in this recent article from Al Jazeera).

The other theory holds that doses of radiation below some threshold, yet to be determined, may be relatively harmless. I believe that Herb was using data for the survivors of the Hiroshima and Nagasaki atomic bomb detonations according to their distance from ground zero (those at greater distances would have received less radiation, falling off roughly as the inverse square of the distance). I can’t remember what Herb concluded, if I ever knew. However, just three weeks ago my attention was called to a huge body of scrupulously designed and performed research that seems to show that levels of irradiation substantially exceeding the Rn-222 “threat” are not merely not dangerous ...but actually appear to be beneficial to people’s health!
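The contrast between the two models can be made concrete with a toy sketch. The slope and threshold values below are invented purely for illustration; they are not fitted to any data:

```python
# Toy dose-response sketch: linear no-threshold (LNT) versus a threshold
# model. All parameter values here are made up for illustration only.

def lnt_excess_risk(dose_sv: float, slope: float = 0.5) -> float:
    """LNT: excess risk proportional to dose, however small the dose."""
    return slope * dose_sv

def threshold_excess_risk(dose_sv: float, threshold_sv: float = 0.1,
                          slope: float = 0.5) -> float:
    """Threshold model: no excess risk below the threshold, linear above."""
    return max(0.0, slope * (dose_sv - threshold_sv))

for d in (0.05, 0.1, 0.5, 1.0):
    print(f"{d} Sv: LNT={lnt_excess_risk(d):.3f}, "
          f"threshold={threshold_excess_risk(d):.3f}")
```

At high doses the two curves run parallel; the disagreement that matters is entirely at low doses, which is exactly the region this post is about.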

The article I refer to is pretty long but can be accessed here if you’d like to read it in its entirety. Here below I will pop out a few cameos (with all footnotes deleted; please see original if you wish to trace the assertions back to their origins):



It’s Time to Tell the Truth About the Health Benefits of Low-Dose Radiation

by James Muckerheide muckerheide@comcast.net

(Excerpted from article from Summer 2000 21st Century)


Low-dose radiation has been shown to enhance biological responses for immune systems, enzymatic repair, physiological functions, and the removal of cellular damage, including prevention and removal of cancers and other diseases. Research on low-level radiation has also shown it to have no adverse effects. Yet, current radiation protection policy and practice fail to consider these valid data, instead relying on data that are poor, ambiguous, misrepresented, and manipulated.

With no regard for the cost to scientific truth, and to taxpayers, radiation policy is based on the linear no-threshold (LNT) concept, that holds that radiation at any levels above zero is deleterious. In the LNT view, the known damaging effects of high-dose radiation are linearly extrapolated down the dose scale. LNT contradicts the scientific evidence, which shows that there is a radiation threshold, below which there is no harm and, in fact, there is benefit for human health, a process known as hormesis. In defiance of this evidence, radiation-protection policy relies on falsification of the actual science research and reporting. Such malfeasance warrants scientific misconduct investigations for the results promulgated by some radiation protection-funded scientists.

(snip)

Scientific Data Biased by Early Health Physics Goals
The bias against recognition of the benefits of low-dose radiation is not new. In a March 1996 meeting at the U.S. Nuclear Regulatory Commission (NRC), Charles Willis of the NRC stated, as reported in the transcript:

. . . [I]t’s clear to many of us that we are not seeing the predicted ill effects at low doses, as has been pointed out to you. I personally came to this hormesis observation fairly late in the game. It wasn’t until 1958 that I was working with the laboratory [Oak Ridge National Laboratory] situation where we were doing experiments with below background levels of radiation, taking the [radioactive] potassium-40 out and seeing what the effects would be on the cellular level when we saw that the cells looked good but they didn’t function. So we couldn’t publish the results, another ill effect of the paradigm about the linear hypothesis.
Marshall Brucer, M.D., states with respect to the Manhattan Project:
Their first experiment, raising mice in an atmosphere of uranium dust, showed exposed mice lived longer than controls. They set up an arbitrary Maximum Permissible Dose (MPD) after proving that mice in radiation fields 10 times the MPD lived longer than controls.
After World War II, Brucer writes, about 20 articles per year mentioned hormetic effects but:
Health Physicists soon learned that their livelihood depended on scaring the pants off Congress. Every Genetics budget meeting opened its request for funds with an anti-nuclear litany. During the 1960s and 1970s about 40 articles/year described hormesis. In 1963, the AEC [Atomic Energy Commission] repeatedly confirmed lower mortality in guinea pigs, rats, and mice irradiated at low dose. In 1964, the cows exposed to about 150 rads after the Trinity A-bomb in 1946 were quietly euthanized because of extreme old age.
(snip)

In 1971, after the Federal Appeals Court “Calvert Cliffs decision,” that found the Atomic Energy Commission (AEC) Environmental Impact Statement to be inadequate, the AEC contracted for the “Argonne Radiological Impact Program,” to improve the basis for assessing low-level radiation health effects. Dr. Norman Frigerio analyzed U.S. cancer rates by average background radiation doses for each state, applying the linear no-threshold models. His results were found to contradict the LNT: There were consistently lower cancer rates in high-background-radiation states. This finding has since been consistently confirmed.

(snip)

Radon: Misrepresenting the Data
In the 1980s, Dr. Bernard Cohen, at the University of Pittsburgh, personally undertook natural background radiation studies similar to those terminated by the Atomic Energy Commission in 1973 (and by AEC’s successors, ERDA and later DOE, and the NRC). He tested the LNT using the significant lung cancer data compared with variations in residential radon. Initially, he found that lung cancer incidence in the high-radon area of Cumberland County, Pennsylvania, was lower than the Pennsylvania average. Many other studies found similar results.

Because radon data did not exist at the county level, Dr. Cohen obtained at least 100 radon measurements in the 16 large counties with the lowest lung-cancer rates, and the 25 counties with the highest rates. He also found identical results in the various random counties in which 450 university physics professors at 101 universities supported his effort to obtain residential radon measurements.

Dr. Cohen then succeeded in a private effort to do, for radon and lung cancer, what the U.S. government had terminated with the Frigerio study—measuring radon in 272,000 homes in the most populated U.S. counties. These data also consistently found inverse results, in dozens of independent studies of, for example, “all-rural” counties, “all urban” counties, and so on. Dr. Graham Colditz of Harvard University, a world renowned epidemiologist, contributed to an interim analysis of the data by counties. He confirmed the validity of the epidemiological analysis of these data.

Dr. Cohen also acquired all Environmental Protection Agency and state radon data. These data showed an inverse relationship: the higher the radon levels, the lower the incidence of lung cancer. In the full data set, the inverse correlation exceeds 20 standard deviations, compared with the predictions of [Biological Effects of Ionizing Radiations Committee] BEIR IV. The chance of error is equivalent to one in all the electrons in the universe! [Blogger’s comment: Anyone who knows the meaning of “20 standard deviations” knows that this is no exaggeration!] Any confounding factor must be: (1) much greater than smoking, (2) inversely correlated with radon, and (3) unrecognized. This is inconceivable—except for one postulate: Radon doses at the range of normal background levels stimulate lung tissue functions to protect against lung cancer [emphasis added by blogger].
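[Blogger's aside: the "20 standard deviations" arithmetic is easy to check. The one-sided Gaussian tail probability at z = 20 comes out near 10^-89, well below the commonly quoted rough estimate of about 10^80 electrons in the observable universe; that electron count is the usual ballpark figure, not a number from the article.]

```python
import math

# One-sided Gaussian tail probability for a 20-standard-deviation result:
# p = (1/2) * erfc(z / sqrt(2))
z = 20.0
p = 0.5 * math.erfc(z / math.sqrt(2.0))

print(f"p = {p:.1e}")  # on the order of 1e-89, far below 1e-80
```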

(snip)

Prof. Dr. Werner Schuttmann, of the former East Germany, and Prof. Dr. Klaus Becker of Berlin, Germany, both documented research results that show that women in the very high radon uranium mining areas of Saxony, Germany, who have negligible smoking, have significantly lower lung cancer rates than women in lower radon areas. The Health Physics Journal denied publication of the Schuttmann and Becker article, however, as a result of comments by reviewers that contained such non-scientific statements as, “this is just another ecological study,” and “everyone knows that Dr. Cohen’s studies are erroneous.”

(snip)

The Case of the Radium Dial Painters
In 1974, the pre-eminent radium health effects researcher, Dr. Robley Evans of the Massachusetts Institute of Technology, rigorously demonstrated in an article in the Health Physics Journal, that BEIR [the Biological Effects of Ionizing Radiations Committee] in 1972 had misrepresented the data on the health effects of radium in order to produce a linear no-threshold result from extremely non-linear data. On Evans’s retirement in 1970, the Center for Human Radiobiology (CHR) was established at the Argonne National Laboratory.

In 1981, Dr. Evans gave the “Invited Summary” at an international conference in which it was reported that in thousands of cases of radium dial painters worldwide, there were still no occurrences of bone cancer or nasal carcinoma in individuals who had ingested less than 250 microcuries of radium-226, which produced an estimated dose of 1,000 rad to the bone. A report on these data was published in 1983.

Dr. Evans told the conference:

The studies of the radium cases during the past dozen years . . . have continued to show no radiogenic tumors, or other effects, in hundreds of persons whose effective initial body burden was less than about 50 microcuries of Ra-226, and whose cumulative skeletal average dose is less than about 1,000 rad.
(snip)

Dr. Luckey summarized the major nuclear worker vs. non-nuclear-worker studies. He shows that the nuclear workers have 52 percent of the cancer rate in comparable non-exposed workers, in 7 million person-years of exposure!

(snip)

Stimulating Health Benefits on the Cellular Level
The biological justification claimed for the LNT model is that a single ionizing photon or particle can damage DNA in a cell, and that this damage can lead to cancer. But an adult body is impacted by about 15,000 nuclear rays or particles every second—there are more than a billion such events every day—from natural sources [blogger’s emphasis]. And each day, the DNA in each cell loses approximately 5,000 purine bases, because the body’s normal heat breaks their linkages to deoxyribose. More damage is caused by normal cell division and DNA replication. But the most damage—a million DNA nucleotides in each cell damaged each day—is caused by free radicals created in the normal process of metabolism, resulting from routine eating and breathing and the stress of heat and exercise.
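[Blogger's aside: the per-day figure follows directly from the quoted per-second one; a one-line check:]

```python
# Arithmetic check of the natural-radiation figure quoted above:
# 15,000 ionizing events per second striking an adult body.
events_per_second = 15_000
events_per_day = events_per_second * 86_400  # seconds in a day

print(f"{events_per_day:,} events/day")  # indeed more than a billion
```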
 

Radiation causes more double breaks per event in the DNA than normal metabolism does, and these are harder to repair than single breaks. But even given this difference, the mutations (unrepaired or misrepaired damage) from metabolism outnumber those caused by natural radiation by 10-million-fold [emphasis added, again]. There are a large variety of anti-oxidants that prevent damage, enzymes that continually repair damaged nucleotides in DNA, and removal processes to eliminate those it cannot repair. Even high-level radiation adds only a few more mutations to the millions that are occurring from metabolism.

(snip)

Time for Extreme Corrective Action
Hundreds of credible scientific studies, reported in the peer-reviewed literature, during the 50 years since the Manhattan Project studies, demonstrate beneficial responses to low-level radiation. With more than 2,000 studies going back more than 100 years, research has consistently demonstrated beneficial health effects and biological responses. The LNT has been substantially contradicted. However, these data are shown to be systematically ignored and actively suppressed, and their research terminated, by the radiation-protection interests that control radiation science policy and scientific reviews [emphasis again added by blogger]. To the contrary, no evidence of adverse effects for human beings exists in hundreds of studies in low-, moderate, and even high-radiation-dose populations that in any way confirm the LNT premise.


(snip to end)

O.K. You’ve read this far, so you must wonder... “How could such a thing happen in science?” Well, as stated in the preceding paragraph, large institutions (like U.S. government agencies) may systematically ignore or actively suppress the valid results of scientific research that stand in the way of institutional policies. The paramount present-day example of this is the National Institute of Standards and Technology’s (NIST) bogus research on what caused the World Trade Center towers to disintegrate (see here, for example).

But what about the scientists? Aren’t we seekers of the truth? Why haven’t scientists weighed in on such issues? Well, from the excerpts from James Muckerheide’s article that I’ve posted above, it is clear that a whole lot of scientists actually have jumped in and disproved the LNT postulate (LNT certainly doesn’t merit being called a theory, since theories must explain all available facts) that no radiation exposure is too small to injure human beings.

However, to tell the truth, scientists are human too, and each one has his or her own biases about his or her own field of work. So if some scientist somewhere comes out with a new theory that busts a widely held paradigm, there is knee-jerk resistance to accepting such a discombobulating change of world view.

For example, when Alfred Wegener first published his theory of continental drift in 1915, he was not merely disbelieved: whole scientific conferences were held just to “prove” him wrong. Of course, nowadays plate tectonics is well accepted as a proven Earth process; current science focuses on determining the detailed mechanisms by which it operates.

Another example is the famous physical chemist and philosopher Michael Polanyi, whose potential theory of chemical adsorption was not only refused publication, but he was also forbidden to teach it even as a full professor. Later on, others reached the same conclusions and managed to get their work published, eventually leading to the recognition that Polanyi had been right all along ...but not before this new paper was declared to be "Polanyi’s old error."

Last, and certainly least by comparison to the two stellar scientists above, is my own story. I have proved that about 10,000 km2 of the U.S. east coastal plain from Washington DC to North Carolina is covered by ejecta from the well-accepted 35.5-million-year-old, 90-km-diameter Chesapeake Bay crater submerged beneath the eponymous bay. I've reasoned this from my own petrology and materials science performed on rocks from this layer, together with arguments based on both classical (uniformitarian) geology and the highly complex field of the geophysics of major impacts on the earth, moon, and terrestrial planets. I’ve presented my story before many of the world’s top impact geologists without eliciting a single word of serious criticism. Still, my manuscript for the proceedings of the last conference I attended was rejected. I suspect that, like what happened to Polanyi, this work of mine has been blacklisted by some powerful institutional forces. So for the moment I’ve web-published it.


Update: In February of 2012 I successfully published in a geological journal the story alluded to above under the title
"In plain sight: the Chesapeake Bay crater ejecta blanket"
(For this reason, I've removed the self-published web site mentioned above.)

[end]

Please feel free to distribute widely.