Monday, 27 October 2014
Earth used to be much more radioactive in the past
In the regions of the former Soviet Union that were highly contaminated by the fallout from the Chernobyl accident, the increased radiation dose rate for local inhabitants is far less than the dose rate in areas of high natural radiation (see figure 2). In those places, the entire man-made contribution to radiation dose amounts to a mere 0.2% of the natural component.
Three and a half billion years ago, when life on Earth began, the natural level of ionizing radiation at the planet’s surface was about three to five times higher than it is now [2]. Quite possibly, that radiation was needed to initiate life on Earth. And it may be essential to sustain extant life-forms, as suggested by experiments with protozoa and bacteria [3].
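That factor of three to five follows largely from simple decay arithmetic. The primordial nuclides that dominate terrestrial radioactivity today, potassium-40, uranium-235, uranium-238 and thorium-232, were all more abundant 3.5 billion years ago by a factor of 2^(t/T½). The short Python sketch below works out those factors from standard half-lives; it is only an illustration, since the actual surface dose also depended on crustal composition and the cosmic-ray flux, which the sketch ignores.

# Illustrative only: relative abundance (and hence activity) of the main
# primordial radionuclides 3.5 billion years ago,
# from N(past) = N(now) * 2**(t / T_half).
half_lives_gyr = {
    "K-40":   1.25,   # dominant contributor to internal dose today
    "U-235":  0.70,
    "U-238":  4.47,
    "Th-232": 14.0,
}
t_ago_gyr = 3.5  # roughly when life on Earth began

for nuclide, t_half in half_lives_gyr.items():
    factor = 2 ** (t_ago_gyr / t_half)
    print(f"{nuclide:>6}: about {factor:.1f}x today's inventory")

Potassium-40, the main source of internal dose, was about seven times more plentiful, and uranium-235 roughly thirty times, so a surface radiation level several times today's is entirely plausible.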
In the early stages of evolution, increasingly complex organisms developed powerful defense mechanisms against such adverse radiation effects as mutation and malignant change. Those effects originate in the cell nucleus, where DNA is their primary target. That evolution has proceeded for so long is itself proof, in part, of the effectiveness of living things’ defenses against radiation.
Other adverse effects—which lead to acute radiation sickness and premature death in humans—also originate in the cell, but outside its nucleus. For them to take place requires radiation doses thousands of times higher than those from natural sources. A nuclear explosion or cyclotron beam could deliver such a dose; so could a defective medical or industrial radiation source. (The malfunctioning Chernobyl reactor, whose radiation claimed 28 lives, is one example.)
[2] P. A. Karam, S. A. Leslie, in Proc. 9th Congress of the International Radiation Protection Association, International Atomic Energy Agency, Vienna, Austria (1996), p. 12.
[3] H. Planel et al., Health Physics 52(5), 571 (1987).