(Reprinted with permission from Physics Today, 52(9), September 1999, pp. 24-29, American Institute of Physics.)
The established worldwide practice of protecting people from radiation costs hundreds of billions of dollars a year to implement and may well determine the world's future energy system. But is it right?
The
psychosomatic disorders observed in the 15 million people in Belarus, Ukraine,
and Russia [1] who were affected by the April 1986 Chernobyl accident are probably the accident’s most important effect on public health [2].
These disorders could not be attributed to the ionizing radiation, but were
assumed to be linked to the popular belief that any amount of man-made
radiation—even minuscule, close to zero doses—can cause harm, an
assumption that gained wide currency when it was accepted in the 1950s,
arbitrarily, as the basis for regulations on radiation and nuclear safety.
It was under the same assumption that
an ad hoc Soviet government
commission decided to evacuate and relocate more than 270 000 people from many
areas of the former Soviet Union where the 1986–95 average radiation doses
from the Chernobyl fallout ranged between 6 and 60 millisieverts. (The sievert, Sv, is the SI unit of effective radiation dose; 1 mSv = 0.001 Sv.) By comparison, the world’s
average individual lifetime dose due to natural background radiation is about
150 mSv. In the Chernobyl-contaminated regions of the former Soviet Union, the
lifetime dose is 210 mSv—and in many regions of the world it is about 1000
mSv [3]. The forced evacuation of so many people from
their—presumably—poisoned homes calls for ethical scrutiny. Examining the
physical and moral basis of that evacuation action and other radiation
policies is the subject of this article.
As they have developed over the last
three decades, the principles and concepts of radiation protection seem to
have gone astray and to have led to exceedingly prohibitive standards and
impractical recommendations. Revision of these principles and concepts is now
being proposed by an increasing number of scientists and several
organizations. They include Roger Clarke, who chairs the International
Commission on Radiological Protection, the Health Physics Society, and the
French Academy of Sciences. In addition, in April this year, the United
Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR)
decided to study a possible revision of the basic dosimetric and biological
concepts and quantities generally being applied in radiation protection. In
the years to come, such reevaluations may trigger what I believe will be
welcome changes in the basic worldwide approach to radiological protection.
Natural and man-made radiation
We are all immersed in naturally
occurring ionizing radiation. Radiation reaches us from outer space and it
comes from radionuclides present in rocks, buildings, air, and even our own
bodies. Each flake of snow, each grain of soil, every drop of rain—and even
every person on this planet—emits radiation. And every day, at least a
billion particles of natural radiation enter our bodies.
The individual dose rate of natural radiation that the average inhabitant of Earth receives is about 2.2 mSv per year.
In some regions—for example, parts of India, Iran, and Brazil—the natural
dose rate is up to a hundred times higher. And no adverse genetic,
carcinogenic, or other malign effects of those higher doses have ever been observed
among the people, animals, and plants that have lived in those parts since
time immemorial [4,5].
In the case of man-made radiation, the
global average dose has increased by about 20% since the beginning of the 20th
century—mainly as a result of the broader application of x-ray diagnostics
in medicine. Other major sources of man-made radiation, such as nuclear power,
nuclear weapons tests (figure 1), and the Chernobyl accident, have contributed
only a tiny proportion—less than 0.1%—to that increase.
In the regions of the former Soviet
Union that were highly contaminated by the fallout from the Chernobyl
accident, the increased radiation dose rate for local inhabitants is far less
than the dose rate in areas of high natural radiation (see figure
2). In those
places, the entire man-made contribution to radiation dose amounts to a mere
0.2% of the natural component.
Three and a half billion years ago, when life on Earth began, the natural level of ionizing radiation at the planet’s surface was about three to five times higher than it is now [6]. Quite possibly, that radiation was needed to initiate life on Earth. And it may be essential to sustain extant life-forms, as suggested by experiments with protozoa and bacteria [7].
At the early stages of evolution,
increasingly complex organisms developed powerful defense mechanisms against
such adverse radiation effects as mutation and malignant change. Those effects
originate in the cell nucleus, where the DNA is their primary target. That
evolution has apparently proceeded for so long is proof, in part, of the
effectiveness of living things’ defenses against radiation.
Other adverse effects—which lead to
acute radiation sickness and premature death in humans—also originate in the
cell, but outside its nucleus. For them to take place requires radiation doses
thousands of times higher than those from natural sources. A nuclear explosion
or cyclotron beam could deliver such a dose; so could a defective medical or
industrial radiation source. (The malfunctioning Chernobyl reactor, whose
radiation claimed 28 lives, is one example.)
The concern about large doses is
obviously justified. However, the fear of small doses, such as those absorbed
from the Chernobyl fallout by the inhabitants of central and western Europe,
is about as justified as the fear that an atmospheric temperature of 20°C may
be hazardous because, at 200°C, one can easily get third-degree burns—or
the fear that sipping a glass of claret is harmful because gulping down a
gallon of grain alcohol is fatal.
According to recent studies, by far the
most DNA damage in humans is spontaneous and is caused by thermodynamic decay
processes and by reactive free radicals formed by the oxygen metabolism. Each
mammalian cell suffers about 70 million spontaneous DNA-damaging events per
year [8]. Only if armed with a powerful defense system could a living
organism survive such a high rate of DNA damage.
An effective defense system consists of
mechanisms that repair DNA, and other homeostatic mechanisms that maintain the
integrity of organisms, both during the life of the individual and for
thousands of generations. Among those homeostatic mechanisms are enzymatic
reactions, apoptosis (that is, suicidal elimination of changed cells), cell
cycle regulation, and intercellular interactions.
Ionizing radiation damages DNA also,
but at a much lower rate. At the present average individual dose rate of 2.2
mSv per year, natural radiation could be responsible for no more than about 5
DNA-damaging events in one cell per year.
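A quick way to see how small the radiation contribution is: divide the two per-cell rates quoted above. A minimal sketch in Python, using only the figures given in the text:

```python
# Rough comparison of spontaneous vs. radiation-induced DNA damage,
# using the per-cell rates quoted in the text (illustrative only).
spontaneous_events_per_cell_per_year = 70e6  # thermodynamic decay and free radicals
radiation_events_per_cell_per_year = 5       # at the natural dose rate of 2.2 mSv/yr

ratio = spontaneous_events_per_cell_per_year / radiation_events_per_cell_per_year
print(f"Spontaneous damage outnumbers natural-radiation damage about {ratio:.0e} to 1")
# -> roughly 1e+07, i.e. on the order of ten million to one
```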
Perhaps we humans lack a specific organ
for sensing ionizing radiation simply because we do not need one. Our
bodies’ defense mechanism provides ample protection over the whole range of
natural radiation levels—that is, from below 1 mSv to above 280 mSv per
year [3,4]. That range is much greater than the range of temperatures—about 50 K—that humans are normally exposed to. Increasing the water temperature in your bathtub by only 80 K, from a pleasant level of 293
K to boiling point at 373 K (that is, by a factor of only 1.3), or decreasing
it below freezing point (that is, by a factor of 1.07), would eventually kill
you.
Because such lethal high or low
temperatures are often found in the biosphere, the evolutionary development of
an organ that can sense heat and cold has been essential for survival. Organs
of smell and taste have been even more vital as defenses against dangerously
toxic or infected food. But a lethal dose of ionizing radiation delivered in
one hour—which for an individual human is 3000 to 5000 mSv—is a factor of
10 million higher than the average natural radiation dose that one would
receive over the same time period (0.00027 mSv). Compared with other noxious
agents, ionizing radiation is rather feeble. Nature seems to have provided
living organisms with an enormous safety margin for natural levels of ionizing
radiation—and also, adventitiously, for man-made radiation from controlled,
peacetime sources.
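The factor of 10 million quoted above is simply the ratio of the two hourly doses; a minimal sketch using the figures given in the text:

```python
# One-hour lethal dose vs. the natural background dose accumulated in the
# same hour, using the figures quoted in the text (illustrative only).
lethal_dose_1h_mSv = 3000.0     # lower end of the 3000-5000 mSv lethal range
natural_dose_1h_mSv = 0.00027   # natural background received in one hour

print(f"Lethal dose / hourly natural dose: {lethal_dose_1h_mSv / natural_dose_1h_mSv:.1e}")
# -> about 1.1e+07, i.e. a factor of roughly 10 million
```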
In short, conditions in which levels of
ionizing radiation could be noxious do not normally occur in the biosphere,
so no radiation-sensing organ has been needed in humans and none has evolved.
Why radiophobia?
If radiation and radioactivity, though
ubiquitous, are so innocuous at normal levels, why do they cause such
universal apprehension? What is the cause of radiophobia—the irrational fear
that any level of ionizing radiation is dangerous? Why have radiation
protection authorities introduced a dose limit for the public of 1 mSv per
year, which is less than half the average dose rate from natural radiation and
less than 1% of the natural dose rates in many areas of the world? Why do the
nations of the world spend hundreds of billions of dollars a year to maintain
this standard [9]?
Here I propose some likely reasons:
· The psychological reaction to the devastation and loss of life caused by the atomic bombs dropped on Hiroshima and Nagasaki at the end of World War II.
· Psychological warfare during the cold war that played on the public’s fear of nuclear weapons.
· Lobbying by fossil fuel industries.
· The interests of radiation researchers striving for recognition and budget.
· The interests of politicians for whom radiophobia has been a handy weapon in their power games (in the 1970s in the US, and in the 1980s and 1990s in eastern and western Europe and in the former Soviet Union).
· The interests of news media that profit by inducing public fear.
· The assumption of a linear, no-threshold relationship between radiation and biological effects.
Since nuclear weapons are regarded as a
deterrent, naturally the countries that possess them wish to make radiation
and its effects seem as dreadful as possible. Not surprisingly, national
security agencies seldom qualify or correct even the most obviously false
statements, such as “Radiation from a nuclear war can annihilate all
mankind, or even all life,” or “200 grams of plutonium could kill every
human being on Earth” [10].
The facts say otherwise. Between 1945
and 1980, the 541 atmospheric nuclear tests that were performed together
yielded an explosive energy equivalent to 440 megatons of TNT (1.8 × 10^18
joules). After all those explosions, despite the injection into the global
atmosphere of about 3 tons of plutonium (that is, almost 15 000 supposedly
deadly 200-gram doses), somehow we are still alive! The average individual
dose of radiation from all these nuclear explosions, accumulated between 1945
and 1998, is about 1 mSv, which is less than 1% of the natural dose for that
period.
In the heyday of atmospheric testing,
1961 and 1962, there were 176 atmospheric explosions, with a total yield of 84
megatons. The maximum deposition on Earth’s surface of radionuclides from
those explosions took place in 1964. The average individual dose accumulated
from the fallout between 1961 and 1964 was about 0.35 mSv.
At its cold war peak of 50 000 weapons,
the global nuclear arsenal had a combined potential explosive power of about
13 000 megatons, which was only 30 times larger than the megatonnage already
released in the atmosphere by all previous nuclear tests. If that whole global
nuclear arsenal had been deployed in the same places as the previous nuclear
tests, the average individual would have received a lifetime radiation dose of
about 30 mSv from the ensuing worldwide fallout. If we use the years 1961 and
1962 as a yardstick instead, the dose would have risen to about 55 mSv. And
even exploding all the nuclear weapons in just a few days rather than over a
two-year period would not change that estimate by very much. Clearly, 55 mSv
is a far cry from the short-term dose of 3000 mSv that would kill a human.
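Those arsenal-wide estimates follow from simple proportional scaling of the measured test-fallout doses; here is a minimal sketch of that arithmetic, using only the yields and doses quoted above:

```python
# Scale measured fallout doses up to the full cold-war arsenal, assuming
# (as the text does) that average dose is proportional to total yield.
dose_all_tests_mSv = 1.0       # average individual dose from all tests, 1945-1998
yield_all_tests_Mt = 440.0     # total yield of the 541 atmospheric tests
dose_1961_64_mSv = 0.35        # dose accumulated from the 1961-62 tests by 1964
yield_1961_62_Mt = 84.0        # total yield of the 176 tests in 1961-62
arsenal_Mt = 13000.0           # peak global arsenal (~50 000 weapons)

print(f"Scaled from all tests:     {arsenal_Mt / yield_all_tests_Mt * dose_all_tests_mSv:.0f} mSv")
print(f"Scaled from 1961-62 tests: {arsenal_Mt / yield_1961_62_Mt * dose_1961_64_mSv:.0f} mSv")
# -> about 30 mSv and about 54 mSv, matching the 30 and 55 mSv quoted above
```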
Of course, the approach taken above,
based as it is on averages, fails to account for the immense loss of life and
human suffering caused by the mechanical blast, fires, and local fallout that
follow nuclear explosions in highly populated areas. However, no matter what
the losses to those areas might be, it is certain that human and other life on
Earth would survive even an all-out global nuclear war.
A-bomb survivors and linear no-threshold
The survivors of the atomic bombing of
Hiroshima and Nagasaki who received instantaneous radiation doses of less than
200 mSv have not suffered significant induction of cancers [11]. And
so far, after 50 years of study, the progeny of survivors who were exposed to
much higher, near-lethal doses have not developed adverse genetic effects [12].
Until recently, such findings from the
study of A-bomb survivors had been consistently ignored. In place of the
actual findings—and driving the public’s radiophobia—has been the theory
of linear no-threshold (LNT), which presumes that the detrimental effects of
radiation are proportional to the dose, and that there is no dose at which the
effects of radiation are not detrimental.
It was LNT theory that the
International Commission on Radiological Protection chose, in 1959, as the
basis for its rules of radiation protection. At that time, applying LNT theory
was regarded as an administrative decision, based on practical (not to mention
political [13]) considerations. Adopting a linear relationship between
dose and effect, along with no threshold, enabled doses in individual
exposures to be added and enabled population-averaged quantities to be
evaluated, and made the administration of radiation protection generally
easier. Furthermore, the policy undertone—that even the smallest, near-zero
amounts of radiation could cause harm—was politically useful at the time: It
played an important part in effecting first a moratorium and then a ban on
atmospheric nuclear tests. LNT theory was and still is the pillar of the
international theory and practice of radiation protection.
Over the years, however, what started as just a
working assumption for the leadership of ICRP came to be regarded—in public
opinion and by the mass media, regulatory bodies, and many scientists, and
even by some members of the ICRP—as a scientifically documented fact.
The absurdity of the LNT was brought to
light after the Chernobyl accident in 1986, when minute doses of Chernobyl
radiation were used by Marvin Goldman, Robert Catlin, and Lynn Anspaugh to
calculate that 53 400 people would die of Chernobyl-induced cancer over the
next 50 years [14]. The frightening death toll was derived simply by
multiplying the trifling Chernobyl doses in the US (0.0046 mSv per person) by
the vast number of people living in the Northern Hemisphere and by a cancer
risk factor based on epidemiological studies of 75 000 atomic bomb survivors in
Japan. But the A-bomb survivor data are irrelevant to such estimates, because
of the difference in the individual doses and dose rates. A-bomb survivors
were flashed within about one second by radiation doses at least 50 000 times
higher than those which US inhabitants will ever receive, over a period of 50
years, from the Chernobyl fallout.
We have reliable epidemiological data
for a dose rate of, say, 6000 mSv per second in Japanese A-bomb survivors. But
there are no such data for human exposure at a dose of 0.0046 mSv spread over 50 years (nor will there ever be any). The dose rate in Japan was larger by a factor of 2 × 10^15 than the Chernobyl dose rate in the US. Extrapolating over
such a vast span is neither scientifically justified nor epistemologically
acceptable. Indeed, Lauriston Taylor, the former president of the US National
Council on Radiological Protection and Measurements, deemed such
extrapolations to be a “deeply immoral use of our scientific heritage.”
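The span of that extrapolation is easy to verify; a minimal sketch using the figures just quoted:

```python
# A-bomb flash dose rate vs. the Chernobyl dose in the US spread over
# 50 years (figures quoted in the text; illustrative only).
SECONDS_PER_YEAR = 365.25 * 24 * 3600

abomb_dose_rate = 6000.0                                # mSv per second, delivered in ~1 s
chernobyl_dose_rate = 0.0046 / (50 * SECONDS_PER_YEAR)  # mSv per second, over 50 years

print(f"Ratio of dose rates: {abomb_dose_rate / chernobyl_dose_rate:.0e}")
# -> about 2e+15, the extrapolation span criticized in the text
```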
Radiation dose and eternity
An offspring of the LNT assumption is
the concept of dose commitment, which was introduced in the early 1960s. At
that time, the concept reflected the concern that harmful hereditary effects
could be induced by fallout from nuclear tests. After almost four decades, the
concept of dose commitment is still widely used, although both the concept and
the concern ought to have faded into oblivion by now.
UNSCEAR, which first used “dose
commitment” in 1962, defined it as “the integral over infinite time of the
average dose rate in a given tissue for the world population, as a result of a
given practice—for example, a given series of nuclear explosions.” Such
integration requires making some daring assumptions and having a superhuman
omniscience about population dynamics and environmental changes for all the
eons of time to come. Later, in a humbler frame of mind, UNSCEAR introduced
the so-called truncated dose commitment, limited arbitrarily to 50, 500, 10
000 or many millions of years. However, the original “infinite” definition
is still retained in recent UNSCEAR documents.
To accept the definitions of dose commitment and
of collective dose, we must also accept the following premises:
· An LNT relationship between absorbed dose and risk to an individual.
· The additivity of risk (by means of the additivity of dose) during the lifetime of an individual.
· The additivity of risk (dose) across individuals of the same generation.
· The additivity of risk (dose) across the lifetimes of individuals over any number of generations.
· The expectation that late harm due to a dose accumulated over many years or generations (dose commitment) be the same as the harm done by an instantaneous dose of the same magnitude.
· The expectation that late harm due to a given value of collective dose or dose commitment calculated for a large number of people exposed to trifling doses be the same as that calculated for a small number of people exposed to large doses. (This expectation is contrary to the common practice of diluting or dispersing noxious agents below dangerous levels.)
In 1969, UNSCEAR advised making the
level of natural radiation a convenient reference for comparing dose
commitments from man-made sources. However, during the three decades since the
introduction of the dose commitment concept, UNSCEAR has not followed its own
advice. The collective dose commitment for the world population from natural
sources, truncated to 50 years (650 000 000 man Sv), was published for the
first time in UNSCEAR’s 1993 report. But why stop at 50 years—when, for
man-made radiation, UNSCEAR estimates the dose commitments over infinite time?
It is easy to calculate the individual dose commitment from past exposures to
natural radiation for periods comparable to those used for calculating
man-made sources of radiation. In making the calculation, one may assume that
during the past several million years the natural radiation dose rate has been the same as it is now—that is, 2.2 mSv per year.
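Under that assumption the truncated dose commitments follow by straightforward multiplication. A minimal sketch; the world-population figure of about 5.9 × 10^9 is my assumption for the early 1990s, not a number given in the text:

```python
# Truncated natural dose commitments, assuming a constant natural dose
# rate of 2.2 mSv/yr (as in the text).
NATURAL_DOSE_RATE_SV_PER_YEAR = 2.2e-3
WORLD_POPULATION = 5.9e9  # assumed early-1990s world population

def individual_commitment_sv(years):
    """Individual natural dose commitment truncated to the given period."""
    return NATURAL_DOSE_RATE_SV_PER_YEAR * years

per_person_50y = individual_commitment_sv(50)        # ~0.11 Sv per person
collective_50y = per_person_50y * WORLD_POPULATION   # ~6.5e8 man Sv
print(f"50-year individual commitment: {per_person_50y:.2f} Sv")
print(f"50-year collective commitment: {collective_50y:.1e} man Sv")

# The same arithmetic over an illustrative longer period, e.g. 100 000 years:
print(f"100 000-year individual commitment: {individual_commitment_sv(1e5):.0f} Sv")
```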
In the table on this page are presented
the values of truncated natural dose commitment for various periods since the
putative appearance of some of our ancestors. One may compose a similar table
for the collective truncated dose commitments for the global populations
integrated over the past generations, information that is also given in the
table. One may also calculate the future natural dose commitments of our
descendants for tens or thousands of generations.
Each of us is burdened with these
values of dose commitment. Do these values represent anything real, or are
they just an academic abstraction? What are the medical effects of these
enormously high doses?
In an international study, the
collective dose for the world population from nuclear dumping operations in
the Kara Sea (part of the Arctic Ocean), truncated to the year 3000 AD, has
been estimated to be about 10 man Sv [15]. Let us explore the
implications of that value, which may be equivalent to:
· 10 Sv in 1 person in 1 day (lethal acute effect), or
· 10 Sv in 1 person in 1 year (chronic effect—for example, cancer), or
· 0.5 Sv in 20 people in 1 day (chronic effect), or
· 10^-5 Sv in each of 10^6 people over 1000 years (no biological or medical concern), or
· 2 × 10^-12 Sv per year in each of 5 × 10^9 people now living and their descendants over 33 generations during the next 1000 years (no concern).
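Each of the distributions above multiplies back to the same 10 man Sv; a minimal sketch of that check:

```python
# Each entry spreads the same 10 man Sv collective dose differently:
# (description, dose per person in Sv, number of people exposed).
distributions = [
    ("10 Sv to 1 person", 10.0, 1),
    ("0.5 Sv to each of 20 people", 0.5, 20),
    ("1e-5 Sv to each of a million people", 1e-5, 1_000_000),
    ("2e-12 Sv/yr to 5e9 people for 1000 years", 2e-12 * 1000, 5_000_000_000),
]
for label, dose_sv, people in distributions:
    print(f"{label}: {dose_sv * people:.0f} man Sv")
# -> every case totals 10 man Sv, yet the biological consequences range
#    from certain death to no detectable effect at all
```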
Obviously, the use of collective dose
obliterates information on the patterns of dose deposition in space and time,
which are of major importance for estimating their biological effects, in
terms of risk to humans. Individual doses cannot be additive over generations,
simply because humans are mortal, and the dose dies when an individual does.
Similarly, individual doses cannot be added for individuals of the same
generation because we do not contaminate one another with a dose that we have
absorbed. Because of biological repair processes and the multistage nature of cancer induction, it is highly unlikely that small contributions of individual dose can simply be added together to estimate the associated risk of cancer. Collective dose and dose commitment cannot have any biological meaning.
The large values of collective doses and
collective dose commitments that have often been published were derived from
minuscule individual doses. For example, UNSCEAR’s calculations include the
following: 100 000 man Sv from nuclear explosions during the past 54 years,
205 000 man Sv for the global population in the next 10 000 years from power
reactors and reprocessing plants, 600 000 man Sv from Chernobyl fallout in the
Northern Hemisphere for eternity, and 650 000 000 man Sv for the world’s
population from natural radiation in the past 50 years. These large values,
terrifying as they are to the general public, do not imply that individuals or
populations are harmfully burdened by nuclear explosions, nuclear power
plants, Chernobyl fallout, or nature. In fact, they provide society with no
relevant biological or medical information. Rather, they create a false image
of the imminent danger of radiation, with all its actual negative social and
psychosomatic consequences. If harm to the individual is trivial, then the
total harm to members of his or her society over all past or future time must
also be trivial—regardless of how many people are or will have been exposed
to natural or man-made radiation. The intellectually invalid concepts of
collective dose and dose commitment deserve to be hacked off with William of
Occam’s razor.
Enter hormesis
The LNT theory is contradicted by the
phenomenon of hormesis—that is, the stimulating and protective effect of
small doses of radiation, which is also termed adaptive response. The first
report on hormetic effects in algae appeared more than 100 years ago [16]. More recently published hormetic effects include A-bomb survivors’ apparent lower-than-normal incidence of leukemia and their greater longevity [17].
Although more than 2000 scientific papers had been published on radiation
hormesis, the phenomenon was forgotten after World War II and was ignored by
the radiation-protection establishment. It was only in 1994 that UNSCEAR
recognized and endorsed the very existence of radiation hormesis. It caused a
revolutionary upheaval of radiology’s ethical and technical foundations.
Many radiologists have come to realize
that their overreaction to theoretical (actually imaginary) health-harming
effects of radiation is unethical in that it leads to the consumption of funds
that are desperately needed to deal with real health problems. Applying the
no-threshold principle for the alleged protection of the public has led to the
imposition of restrictive regulations on the nuclear utilities, restrictions
that have virtually strangled the development of environmentally benign
nuclear energy in the US and in other countries. My own country, Poland, spent
billions of dollars on the construction of its first nuclear power
reactor—only to abandon the project after what I regard as the politically
motivated manipulation of public opinion by means of the LNT theory.
Each human life hypothetically saved in
a Western industrial society by implementation of the present radiation
protection regulations is estimated to cost about $2.5 billion. Such costs are
absurd and immoral—especially when compared to the relatively low costs of
saving lives by immunization against measles, diphtheria, and pertussis, which
in developing countries entails costs of $50 to $99 per human life saved [18].
Billions of dollars for the imaginary protection of humans from radiation are
actually spent year after year, while much smaller resources for the real
saving of lives in poor countries are scandalously lacking.
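The disparity is easy to put in numbers; a minimal sketch using the two costs quoted above:

```python
# Cost per human life hypothetically saved, from the figures quoted above.
radiation_protection_cost = 2.5e9   # dollars per life under Western regulations
immunization_cost_low = 50          # dollars per life, developing countries
immunization_cost_high = 99

print(f"Ratio: {radiation_protection_cost / immunization_cost_high:.1e} "
      f"to {radiation_protection_cost / immunization_cost_low:.1e}")
# -> roughly 2.5e+07 to 5e+07: the cost of one life hypothetically saved by
#    radiation regulations could cover tens of millions of immunizations
```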
A practical alternative
There is an emerging awareness that
radiation protection should be based on the principle of a practical
threshold—one below which induction of detectable radiogenic cancers or
genetic effects is not expected. Below such a threshold, radiation doses
should not require regulation. Nor is any regulation required for extreme
levels, such as those experienced at Hiroshima and Nagasaki, where dose rates
were extremely high.
The practical threshold to be proposed could be
based on epidemiological data from exposures in medicine, the nuclear
industry, and regions with high natural radiation. The current population dose
limit of 1 mSv per year could then be changed to 10 mSv per year or more.
Individual doses could be evaluated at any level below the practical
threshold, but radiation-protection authorities would be required to intervene
only if individual doses above the threshold were involved. Adopting a
practical threshold would be an important step toward dealing with
radiation rationally and toward regaining the public’s acceptance of
radioactivity and radiation as blessings for mankind.
*********
*Zbigniew Jaworowski is a professor at the Central Laboratory for Radiological Protection in Warsaw, Poland, and has served on the United Nations Scientific Committee on the Effects of Atomic Radiation. His e-mail address is: jaworo@clor.waw.pl.
*********
References
1. L. A. Ilyin, Chernobyl: Myth and Reality, Megapolis, Moscow (1995).
2. Chernobyl—Ten Years On: Radiological and Health Impact, Nuclear Energy Agency, Organisation for Economic Co-operation and Development, Paris (1996).
3. Sources and Effects of Ionizing Radiation, UNSCEAR, New York (1993).
4. M. Sohrabi, in High Levels of Natural Radiation, M. Sohrabi, J. U. Ahmed, S. A. Durrani, eds., International Atomic Energy Agency, Vienna (1990), p. 39.
5. P. C. Kesavan, in High Levels of Natural Radiation, L. Wei, T. Sugahara, Z. Tao, eds., Elsevier, Amsterdam (1996), p. 111.
6. P. A. Karam, S. A. Leslie, in Proc. 9th Congress of the International Radiation Protection Association, International Atomic Energy Agency, Vienna (1996), p. 12.
7. H. Planel et al., Health Physics 52(5), 571 (1987).
8. D. Billen, BELLE Newsletter 3(1), 8 (1994).
9. J. S. Hezir, statement at the US Environmental Protection Agency’s public hearing on the proposed recommendations for federal radiation protection guidance for exposure of the general public, Washington, DC, 22–23 February 1995.
10. H. Koning, International Herald Tribune, 27 November 1996, p. 9.
11. B. L. Cohen, Radiation Research 149, 525 (1998).
12. K. Sankaranarayanan, lecture presented at the 46th session of the United Nations Scientific Committee on the Effects of Atomic Radiation, 18 June 1997.
13. L. S. Taylor, in Proc. International Congress of the International Radiation Protection Association, Israel Health Physics Society, Jerusalem (1980), p. 307.
14. M. Goldman, R. J. Catlin, L. Anspaugh, US Department of Energy research report DOE/RR-0232 (1987).
15. K. L. Sjöblom, G. Linsley, International Atomic Energy Agency Bulletin 40(4), 18 (1999).
16. G. F. Atkinson, Science 7, 7 (1898).
17. S. Kondo, Health Effects of Low-Level Radiation, Kinki University Press, Osaka, Japan (1993).
18. B. L. Cohen, in Rational Readings on Environmental Concerns, J. H. Lehr, ed., Van Nostrand Reinhold, New York (1992), p. 461.