Introduction
Ionizing radiation plays an important role in the modern world. The use of X-rays has revolutionized medical diagnostics, and it is hard to imagine modern medical care without X-ray imaging, computed tomography (CT) and nuclear medicine. Every medium-sized hospital in developed countries has a radiotherapy unit that cares for cancer patients, and more than 10% of the world’s electricity is supplied by nuclear power plants. At the same time, a high dose of ionizing radiation can kill a person or any other living organism, and even survivors of acute radiation syndrome (ARS) can suffer the carcinogenic effects of radiation exposure [1]. Therefore, ionizing radiation, like any other potentially dangerous agent, should be used with caution. The fundamental question is: what are the harmful health effects of exposure to low doses of radiation, such as those used in medical diagnostics or experienced by radiation workers and the public? Radiation with a cumulative dose of up to 100 mSv is typically referred to as low-dose radiation.
Differences in the biological effects of low- and high-dose radiation
It has been repeatedly shown that the immune response of organisms is stimulated by low-dose exposures [2] but suppressed by high doses [3]. DNA repair has also been found to be stimulated by low-dose exposures and inhibited by high-dose exposures [3]. In general, DNA damage induced by low doses has been shown to be significantly smaller than the damage caused by the oxidative processes of normal metabolism [4].
DNA repair mechanisms are effective in the case of low-dose irradiation and, as expected, become less effective with increasing doses [5]. At single doses below 100 mGy, the beneficial effects outweigh the adverse effects [6].
The damage and repair processes are interrelated and are accompanied by highly coordinated adaptive epigenetic modulation [6].
Most well-designed and methodologically rigorous studies of the health effects of ionizing radiation have involved particular types of participants: exposed workers and populations living in areas with above-average levels of natural background radiation [7]. An overview of these studies is provided in this article. Since readers may not be familiar with the radiobiological terminology used, the main terms and units are explained in Table 1.
Basic epidemiological indicators used in radiation epidemiology are given in Table 2.
Table 1. Radiobiological terms and units

Bq (becquerel): unit of radioactivity, i.e. the strength of a radioactive source; 1 Bq = 1 nuclear decay per second
Gy (gray): unit of absorbed dose, i.e. energy deposited per unit mass (1 J/kg); 1 Gy = 1,000 mGy
Absorbed dose: the amount of radiation energy absorbed by an organ or tissue
Cumulative dose: the total dose resulting from repeated or continuous exposure to ionizing radiation
Sv (sievert): unit of equivalent or effective dose; for X-ray or gamma radiation, 1 Sv corresponds to an absorbed dose of 1 Gy
Dose equivalent: a measure of the biological effect of radiation that depends on the type of radiation, the dose absorbed, and the organs or tissues irradiated

Table 2. Basic epidemiological indicators used in radiation epidemiology

RR (relative risk): in cohort studies, the ratio of disease incidence in the exposed cohort to that in the unexposed cohort
SMR (standardized mortality ratio): the analogue of RR for mortality, i.e. the ratio of deaths observed in the study cohort to the deaths expected from the rates in a reference population
HR (hazard ratio): the ratio of the instantaneous risk of an event at a particular point in time in the two groups being compared
ERR (excess relative risk): ERR = RR - 1
OR (odds ratio): the analogue of RR in case-control studies
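To make these definitions concrete, consider a worked example with illustrative, hypothetical numbers (the incidences and death counts below are assumptions chosen for arithmetic clarity, not data from any cited study). For X-rays and gamma rays the radiation weighting factor equals 1, which is why an absorbed dose of 1 Gy corresponds to an equivalent dose of 1 Sv. For the epidemiological indicators:

\[
\mathrm{RR} = \frac{I_{\text{exposed}}}{I_{\text{unexposed}}} = \frac{15/10{,}000\ \text{person-years}}{12/10{,}000\ \text{person-years}} = 1.25, \qquad \mathrm{ERR} = \mathrm{RR} - 1 = 0.25,
\]

\[
\mathrm{SMR} = \frac{\text{observed deaths}}{\text{expected deaths}} = \frac{82}{100} = 0.82,
\]

i.e. an RR of 1.25 corresponds to a 25% excess relative risk, and an SMR of 0.82 means 18% fewer deaths than expected from the reference population rates.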
Occupational exposure to radiation
The assessment of health outcomes related to ionizing radiation exposure in various occupational groups has been the subject of extensive research since the second half of the 20th century. The main research activities have focused on cancer incidence and mortality among people occupationally exposed to ionizing radiation, such as radiologists, radiation therapists, nuclear industry workers and military personnel involved in nuclear weapons testing. These studies are of particular importance for radiation protection, as most people in these occupational groups are subjected to long-term exposure to low-dose radiation.
Medical occupational exposure
Currently, the most thorough studies of the health effects of exposure to low doses of radiation are conducted in cohorts of radiologists and radiology technicians. Several serious health problems, such as an increased risk of skin cancer and leukaemia, as well as increased cancer incidence and all-cause mortality, were reported among radiology technicians and radiologists in the first half of the last century [8]. For example, increased leukaemia mortality rates were shown in 8 historical cohorts comprising more than 270,000 radiologists and radiology technicians employed before 1950, when radiation exposure levels in these occupational groups were high (e.g., 30,000 mSv/year in 1902). After the introduction of the first radiation protection recommendations in the early 1920s (dose limit 500 mSv/year), the increased mortality was no longer reported [9].
In a study by Berrington et al. [10], all-cause mortality among British radiologists who first registered with the radiological society in or after 1920 was significantly lower than in the general population. In this cohort, the number of deaths from cancer was similar to the number reported for all physicians combined. Mortality rates for British radiologists registered after 1954 were significantly lower compared with other medical groups; this included both cancer mortality and all causes of death combined. Based on these findings, Cameron [11] concluded that “British radiological data show that moderate doses of radiation are beneficial and not a health risk”.
The results for British radiologists were largely in line with those of Mohan et al. [12], whose analysis of a cohort of U.S. radiology technicians (total n = 146,022) found SMRs to be 24% lower for all-cause mortality and 18% lower for cancer mortality compared with the general U.S. population. Relative risk (RR) was, however, higher for both breast cancer (RR = 2.92) and all cancers combined (RR = 1.28) [12].
In a study by Kitahara et al. [13], mortality was compared between cohorts of physicians performing fluoroscopy-guided interventional procedures (n = 45,634) and psychiatrists (n = 64,401). Physicians exposed to radiation, both male and female, had about 20% lower total mortality, and their cancer mortality was also lower (males: RR = 0.92; females: RR = 0.83) than that of psychiatrists. Moreover, mortality from specific cancer types and from cardiovascular disease was not increased in these physicians compared with psychiatrists.
Summarizing the results of research in this field, Tubiana [14] stated that the lowest potentially carcinogenic cumulative radiation dose is about 500 mSv. Considering all the available evidence, it can be suggested that lower doses have no harmful effect and may even be beneficial.
Cohorts of workers in the nuclear industry
Extensive observational studies have been conducted in cohorts of personnel employed in the nuclear industry. A comprehensive study of nuclear shipyard workers was conducted in the United States in the 1980s, in which radiation workers were exposed to external gamma radiation from cobalt-60. Three cohorts were compared: a high-dose cohort (n = 7,872; cumulative doses > 5 mGy), a low-dose cohort (n = 10,348; cumulative doses < 5 mGy), and an unexposed group (n = 32,510) of age-matched shipyard workers [15]. Workers exposed to high doses showed clear health benefits, including 24% lower all-cause mortality and significantly lower respiratory, cardiovascular and cancer mortality than unexposed workers. Unfortunately, the report has never been published in its entirety, and only a summary [15] is readily available. Similar data were obtained from a combined cohort of nuclear and non-nuclear workers at 4 U.S. nuclear weapons facilities (n = 119,195; mean cumulative dose = 20 mSv) [16]: mortality in the exposed cohort was generally lower than in the general U.S. population, although the rates of pleural cancer and mesothelioma were significantly elevated. No statistically significant evidence of an association between radiation exposure and mortality from all cancers combined or from leukaemia was found among workers continuously exposed to low-dose radiation (average cumulative dose < 50 mSv) at the Hanford site, the Rocky Flats Nuclear Weapons Plant and the Oak Ridge National Laboratory (United States) [17, 18]; multiple myeloma was the only cancer type with a significantly increased risk in the exposed cohort. A low all-cause standardized mortality ratio (SMR = 0.82) was demonstrated in 46,970 employees of Rocketdyne/Atomics International in California employed between 1948 and 1999 [19]. A reduction in cancer mortality compared with the general population was also observed in a large cohort (n = 45,468) of Canadian nuclear power plant workers [20]. In that study, a significant reduction in the risk of all cancers combined (RR = 0.70) was found in the 1–49 mSv dose range compared with the lowest dose category (< 1 mSv), whereas above 100 mSv the risk increased.
Most large-scale studies in this field have been conducted in multinational cohorts. In a large international cohort (n = 410,000) of nuclear industry workers in 15 countries (Australia, Belgium, Canada, Finland, France, Hungary, Japan, Korea, Lithuania, Slovakia, Spain, Sweden, Switzerland, the United Kingdom and the United States), no excess cancer risk was found for cumulative doses below 150 mSv [21, 22]. In a study of chronic lymphocytic leukaemia mortality conducted in 7 countries of this cohort (n = 295,963), the RR at a dose of 100 mSv was 0.84 compared with the unexposed control group [23].
A positive relationship between exposure to ionizing radiation and the risk of haematological malignancies was revealed in a pooled group of workers (n = 19,536); the risk was increased in those who had received doses of 80 mGy. No such association was found in the analyses of mortality [24].
Several studies on miners, including uranium miners, have been conducted. In a cohort of former German miners (n = 58,972) exposed to low linear energy transfer (LET) radiation (mainly external ionizing radiation) and high-LET radiation (mainly radon and its decay products), with mean absorbed doses to the red bone marrow of 48 mGy and 9 mGy, respectively, there was an increased risk of death from chronic myeloid leukaemia associated with the low-LET component. No such relationship was demonstrated for chronic lymphocytic leukaemia [25]. In a cohort of 28,546 uranium miners in Ontario, the average cumulative radon exposure was 21.0 working level months (WLM). An increased risk of lung cancer was observed in miners exposed to > 100 WLM [26]: these miners had a significantly increased lung cancer risk (RR = 1.89) compared with the non-miner group, with similar mortality trends. No association was observed with cancer sites other than the lung or with non-cancer deaths.
A positive relationship between low doses of radiation and the risk of lung cancer was also found in case-control studies nested in cohorts of Belgian, French and British workers exposed to uranium and plutonium [27].
For uranium miners, a causal relationship between radon exposure and lung cancer risk has been repeatedly demonstrated. For example, a 34% higher risk of dying of lung cancer (SMR = 1.34) was found in a French cohort of uranium miners employed between 1946 and 2007; this risk increased significantly with cumulative exposure to radon [28]. A similar correlation between the level of lung cancer morbidity/mortality and the level of radon exposure has been observed in cohorts of uranium miners in other countries such as Germany [29], Canada [28] and the United States [30].
Summarizing the results of studies from around the world, it is now widely accepted that a radiation dose below 100 mSv is too low to produce a statistically detectable increase in cancer incidence against the background of naturally occurring malignancies [31, 32]. Doses received by workers in the nuclear industry clearly fall into this category, as the dose is usually accumulated over many years, with average annual doses roughly 2 orders of magnitude below 100 mSv (i.e., on the order of 1 mSv/year). Indeed, annual monitoring of over 100,000 radiation workers in the United States since 1983 has shown that no worker in the U.S. nuclear industry has been exposed to more than 50 mSv in a year [31].
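To put these figures in perspective, a simple back-of-the-envelope calculation, assuming a representative annual occupational dose on the order of 1 mSv (an illustrative value consistent with the 2-orders-of-magnitude estimate above, not a measured cohort average), shows that even a long working career remains well below the 100 mSv level:

\[
1\ \text{mSv/year} \times 40\ \text{years} = 40\ \text{mSv} < 100\ \text{mSv}.
\]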
Environmental radiation
Natural environmental radiation comes from a variety of sources. About three-quarters of the background radiation comes from natural gamma rays emitted by rocks and soil and from terrestrial radon, and about a quarter comes from cosmic rays and radionuclides incorporated into the human body [33]. In recent decades, environmental radiation levels have also been affected by artificial sources of radiation, such as nuclear power plant accidents, which are discussed in detail in the following sections.
Background radiation in the environment
The level of natural background radiation varies greatly, sometimes by as much as 2 orders of magnitude, across geographic regions around the world. In most areas, average effective doses are between 2 and 4 mSv/year. Regions with an effective dose above 10 mSv/year are generally referred to as areas of high natural background radiation. In some regions, such as Guarapari (Brazil), Kerala (India), Ramsar (Iran) and Yangjiang (China), the natural background radiation can reach several hundred mSv/year; in Ramsar, Iran, for example, the total annual effective dose reaches 260 mSv [34].
Some studies have been conducted to determine the potential link between high levels of background radiation and health effects in exposed populations, mainly cancer incidence and mortality. The advantage of this type of research is that it is relatively easy to carry out, as it usually uses already existing data. The disadvantage of this type of study is that the analyses do not include data from individual cases, that is, such studies are usually descriptive and ecological in nature [35].
Most epidemiological studies evaluating health outcomes in areas with high natural background radiation have examined the risks of cancer and non-cancer diseases based on incidence or mortality data. Although most of these studies initially expected to find a positive association between background radiation levels and disease risk when comparing populations living in high-background areas with those in low-background areas, no such health risk was found. Indeed, neither cancer rates nor early childhood deaths were positively correlated with radiation dose in areas of high background radiation [36]. Moreover, several studies have provided evidence that natural background radiation levels are inversely associated with cancer mortality.
In a U.S. study, there was no increase in either cancer mortality or mortality from congenital malformations; on the contrary, a steady and continuous decrease in both was observed despite increasing background radiation levels [37]. A more recent study found that cancer mortality rates were inversely related to natural background radiation in the United States (r = 0.656, p < 0.0001) [38].
Since background radiation levels tend to increase with altitude, cancer mortality rates in 6 jurisdictions were compared between populations living at low and high elevations above sea level [39]. Statistically significant reductions in mortality at high altitudes were found for 3 of the 4 health outcomes examined, including cancer. Because mortality rates vary by race, Hart [40] subsequently analysed data for Caucasians only; in this study, U.S. counties at higher elevations had significantly lower cancer mortality rates than lower-elevation counties (53.90 vs. 73.47, p < 0.0001). Higher-elevation counties also had significantly lower rates of death from heart disease (p < 0.0001 for both black and white ethnicities) [41]. Based on these analyses, the authors suggested radiation hormesis as one possible explanation for the reduced mortality in high-altitude regions, although other explanations, such as adaptive physiological responses to reduced oxygen levels (at least for heart disease mortality), cannot be ruled out.

In Ireland, no relationship was observed between cancer mortality and the level of natural background radiation [42]. In China, similar cancer mortality rates were found in regions with high (average 2.31 mSv/year) and low (average 0.96 mSv/year) background radiation levels [43]. Similarly, no increase in cancer incidence or mortality associated with high levels of background radiation was observed in Yangjiang, China [44], or Kerala, India [45].
Based on an analysis of the available literature, Cameron et al. [46] concluded that the linear no-threshold hypothesis cannot explain these results, which are better explained by a threshold or hormesis model. Generalizing these findings, Cameron provocatively stated that “we need increased background radiation to improve our health”.
One study conducted in Bavaria, Germany, provided evidence that an increase in the dose from natural background radiation, and thus in the cumulative dose, can have adverse effects on human health [47]. Overall, however, the epidemiological studies conducted so far have not shown consistent adverse health effects in populations living in areas with high levels of background radiation.
Accidents in nuclear power plants
Since the beginning of the atomic age, the expansion of nuclear technology has raised widespread concern about the health and environmental risks posed by nuclear power plant failures. Several major nuclear accidents have occurred around the world during this time. The Three Mile Island accident in 1979 was perhaps the first to be widely reported in the media; subsequent major accidents occurred at the Chernobyl nuclear power plant in the Soviet Union in 1986 and at the Fukushima Daiichi nuclear power plant in Japan in 2011. The long-term health risks associated with these accidents have been the subject of comprehensive investigations.
Although the Three Mile Island accident was serious and led to the loss of the plant, the average radiation dose received by the exposed individuals (up to 2,000,000 in total) was rather low (approximately 1.7 mrem, i.e. about 0.017 mSv). Not surprisingly, there was no increased cancer risk in either men or women (RR = 1.00 and 0.99, respectively) [48]. In the long-term follow-up of the inhabitants of the area (n = 32,135), total cancer mortality was also similar to local mortality (SMR = 103.7 for men; 99.8 for women) [49].
The long-term effects of the Chernobyl disaster have been studied most thoroughly to date. As a result of this nuclear accident, many regions of Ukraine, Belarus and western Russia were significantly contaminated with iodine-131 (131I) and cesium-137 (137Cs) radionuclides.
A total of 116,000 people were relocated from the area surrounding Chernobyl to uncontaminated regions in the spring and summer of 1986; another 220,000 people were relocated in subsequent years [50]. Iodine-131 has a very short half-life (8 days), but it can quickly enter the human body via inhalation and through the consumption of contaminated vegetables and milk. Most 131I localizes to the thyroid gland. Because of the small size of the thyroid gland in children and the characteristics of their physiology, thyroid radiation doses are usually much higher in children than in adults. Radiation doses to the thyroid were high in the affected areas owing to the high levels of contamination (no sheltering, no food restrictions and delayed evacuation of the population) and the high uptake of radioiodine by the thyroid gland (due to both dietary iodine deficiency and the absence of iodine prophylaxis).
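As a rough illustration of why the exposure window for 131I is short, its activity decays exponentially with the 8-day half-life noted above (a standard radioactive decay calculation, not a result from the cited studies):

\[
A(t) = A_0 e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{T_{1/2}} = \frac{0.693}{8\ \text{days}} \approx 0.087\ \text{day}^{-1},
\]

so that after 10 half-lives (about 80 days) the remaining activity is \(2^{-10} \approx 0.1\%\) of its initial value; hence the critical period for thyroid exposure is limited to the first weeks after a release.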
A unique feature of the Chernobyl accident was that radiation doses to the thyroid were 3–4 orders of magnitude higher than doses to other organs [51].
After the accident, the incidence of thyroid cancer increased dramatically in infants and children, especially those aged 0–5 years [52]. By 2005, more than 6,000 cases of thyroid cancer (15 of them fatal) had been diagnosed among the approximately 2 million highly exposed people who had been children and teenagers at the time of the accident. It has been assumed that a large proportion of these thyroid tumours can be attributed to 131I exposure, and this has been confirmed in several analytical epidemiological studies [53].
Apart from the significant increase in the incidence of thyroid cancer in those exposed as children and adolescents, there has been no increase in other cancers, radiation-related leukaemia or non-malignant disorders in the exposed populations [54].
In their discussion of the long-term consequences of the Chernobyl disaster, Takamura and Yamashita [55] noted that the accident led to psychoemotional trauma and social instability, which resulted in many more negative health effects than those caused by radiation exposure. In fact, the post-accident relocation resulted in a “deep trauma” for some 350,000 people displaced from their homes in the affected regions. The radiation doses themselves were moderate: on average, about 100 mSv for liquidators (n = 240,000) and 33 mSv for those evacuated in 1986 (n = 160,000) [54]. Overall, no carcinogenic effects were observed in persons exposed to radiation doses below 100 mSv after the Chernobyl accident.
The second worst nuclear accident in history after Chernobyl occurred at Japan’s Fukushima Daiichi (Fukushima I) power plant in 2011, following an earthquake and the subsequent tsunami. Although both the Chernobyl and the Fukushima Daiichi accidents were classified as level 7 (the highest level on the International Nuclear Event Scale of the International Atomic Energy Agency), the actual conditions and scales of damage differed significantly [56]. As at Chernobyl, large amounts of radioisotopes, including 131I, were released in Fukushima and the surrounding prefectures. Radiation doses to the thyroid gland, however, were much lower at Fukushima, mainly because the Japanese authorities implemented timely food restrictions. As a result, the mean individual thyroid dose was below 1 mSv, with a maximum of 33 mSv. It is therefore not surprising that no increased incidence of clinical thyroid carcinoma was observed in the 5 years following the accident [57].
Most likely, the main public health problem after the Fukushima accident is chronic mental stress, as well as stress-related lifestyle disorders such as obesity, hypertension and type 2 diabetes in displaced persons, all of which may result in an increased risk of cardiovascular disease in the future [58].
Discussion
Summary of epidemiological studies: limitations and opportunities
Most epidemiological studies of the long-term effects of low doses of radiation suffer from several methodological issues and limitations. When the results of occupational research are generalized, it should be noted that statistically significant harmful health effects of occupational exposure to low doses of radiation were infrequent. A common trend observed in many occupationally exposed cohorts worldwide is that their mortality rate is generally lower than that of the general population. Several authors believe that radiation hormesis induced by low doses of radiation may be responsible for this observation [59, 60].
When it comes to research on the effects of environmental radiation exposure, one of the most important methodological problems is the “ecological bias” that arises when conclusions about individuals are based solely on the analysis of group data. Indeed, ecological studies usually do not include estimates of individual radiation exposure; instead, aggregated population estimates or proxies such as geographic location are commonly used to determine the population dose for a group of individuals [35]. For example, it is assumed that people living near a nuclear power plant receive higher doses of radiation than those who live far from the facility, and everyone within the exposed area is equally exposed.
In general, no causal conclusions can be drawn from the results of such studies. The limitations of the ecological approach can be overcome by a cohort study, which prospectively follows several groups of individuals simultaneously and compares their outcomes, or by a case-control study, which compares people suffering from the disease (“cases”) with otherwise similar people who do not have the disease (“controls”). Both approaches are preferable but not always feasible, as they require the reconstruction of individual doses.
Conclusions
Currently, radiation safety regulations assume that the carcinogenic risk is proportional to radiation exposure at all doses (the linear no-threshold model). However, the latest epidemiological and radiobiological evidence contradicts this assumption. There is growing evidence that low-dose radiation, such as that used in X-ray imaging, including CT, has beneficial health effects rather than posing a risk.
Although much information about the biological effects of low-dose radiation has been obtained, many important issues require further scientific research. However, given the social, economic and ethical aspects of the current regulations and their extremely high costs (both economic and human) to society, caution should certainly be exercised when changing the current practices.