Monday 27 October 2014

Earth was much more radioactive in the past

In the regions of the former Soviet Union that were highly contaminated by the fallout from the Chernobyl accident, the increased radiation dose rate for local inhabitants is far less than the dose rate in areas of high natural radiation (see figure 2). In those places, the entire man-made contribution to radiation dose amounts to a mere 0.2% of the natural component.
Three and a half billion years ago, when life on Earth began, the natural level of ionizing radiation at the planet’s surface was about three to five times higher than it is now [2]. Quite possibly, that radiation was needed to initiate life on Earth. And it may be essential to sustain extant life-forms, as suggested by experiments with protozoa and bacteria [3].
In the early stages of evolution, increasingly complex organisms developed powerful defense mechanisms against such adverse radiation effects as mutation and malignant change. Those effects originate in the cell nucleus, where DNA is their primary target. That evolution has proceeded for so long is, in part, evidence of the effectiveness of living things’ defenses against radiation.
Other adverse effects—which lead to acute radiation sickness and premature death in humans—also originate in the cell, but outside its nucleus. For them to take place requires radiation doses thousands of times higher than those from natural sources. A nuclear explosion or cyclotron beam could deliver such a dose; so could a defective medical or industrial radiation source. (The malfunctioning Chernobyl reactor, whose radiation claimed 28 lives, is one example.)
Figure 2. Average individual global radiation dose in the 1990s from nuclear explosions, the Chernobyl accident, and commercial nuclear power plants combined was about 0.4% of the average natural dose of 2.2 mSv per year. In areas of Belarus, Ukraine, and Russia that were highly contaminated by Chernobyl fallout, the average individual dose was actually much lower than that in the regions with high natural radiation. The greatest man-made contribution to radiation dose has been irradiation from x-ray diagnostics in medicine, which accounts for about 20% of the average natural radiation dose. Natural exposure is assumed to be stable. The temporal trends in medical and local Chernobyl exposures are not presented. (Based on data from UNSCEAR.)
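The percentages in the figure 2 caption can be turned into absolute annual doses with a quick back-of-the-envelope calculation, taking the 2.2 mSv per year world-average natural dose quoted above as the baseline (a sketch; the variable names are illustrative, not from the source):

```python
# Convert the percentages quoted in the figure 2 caption into
# absolute annual doses, relative to the average natural background.

NATURAL_DOSE = 2.2  # mSv/yr, world-average natural dose (UNSCEAR)

# Nuclear explosions + Chernobyl + commercial nuclear power combined:
# about 0.4% of the natural dose.
man_made_nuclear = 0.004 * NATURAL_DOSE

# Medical x-ray diagnostics, the largest man-made contribution:
# about 20% of the natural dose.
medical_xray = 0.20 * NATURAL_DOSE

print(f"nuclear sources combined: {man_made_nuclear:.4f} mSv/yr")
print(f"medical x-ray diagnostics: {medical_xray:.2f} mSv/yr")
```

So all nuclear sources together add on the order of 0.01 mSv per year, while diagnostic x rays add roughly 0.44 mSv per year, which is why the caption singles out medicine as the dominant man-made contribution.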
  1. Z. Jaworowski, “Radiation risk and ethics,” Physics Today 52(9), 24 (1999).
  2. P. A. Karam, S. A. Leslie, in Proc. 9th Congress of the International Radiation Protection Association, International Atomic Energy Agency, Vienna, Austria (1996), p. 12.
  3. H. Planel et al., Health Physics 52(5), 571 (1987).
