2
Yasser A. Noureldin1,2 & Sero Andonian2
1 Department of Urology, Benha Faculty of Medicine, Benha University, Benha, Egypt
2 Division of Urology, McGill University Health Centre, McGill University, Montreal, QC, Canada

According to a 2009 National Council on Radiation Protection and Measurements (NCRP) report, the total ionizing radiation exposure to United States citizens had almost doubled over the previous two decades [1]. The report attributed this to increased exposure from computed tomography (CT) scans, image-guided fluoroscopic procedures, and nuclear medicine studies, which were estimated at 67 million, 17 million, and 18 million, respectively. During 2006, these imaging modalities constituted 89% of the total annual radiation exposure [1, 2]. On the other hand, recent studies have shown a significant worldwide increase in the prevalence of stone disease [3, 4]. In the United States, the prevalence increased from 5.2% in 1994 to 8.8% in 2010 [5]. Similarly, in the United Kingdom, renal colic episodes increased by 63% from 2000 to 2010 [5]. This was associated with a marked 83% decline in open surgery and a marked increase in minimally invasive endourologic procedures such as shock-wave lithotripsy (SWL), which increased by 55%, and ureteroscopy, which increased by 127% [5]. The increase in the incidence of stone disease and its management with endourologic procedures is not without risks. Since ionizing radiation is not only an integral part of modern endourologic interventions but also constitutes the basis for diagnosis, preoperative planning, and post-operative follow-up, it is important for urologists to have an intimate knowledge of radiation safety measures and to minimize ionizing radiation exposure as much as possible to themselves, to operating room personnel, and most importantly to their patients. This chapter discusses the state of the art in radiation safety measures during diagnosis, treatment, and follow-up.
To follow radiation safety measures, one needs to understand how ionizing radiation is generated in the first place. Therefore, this chapter will start by explaining the anatomy of the X-ray tube and the generation of X-rays. It will then go through the potential hazards of ionizing radiation. By the end of this chapter, readers will have developed strategies for lowering radiation exposure during the diagnosis, treatment, and follow-up of their patients requiring endourologic procedures. According to the International System of Units (SI), the absorbed dose, measured in grays (Gy) or joules/kg, is the amount of energy absorbed per mass of tissue [6, 7]. To measure the biological effect of radiation on human tissue, the mean absorbed dose in an organ or tissue is multiplied by a radiation weighting factor to calculate the "equivalent dose." The radiation weighting factor differs according to the type of radiation; for X-ray imaging (photons), it equals 1. Therefore, the mean absorbed dose and the equivalent dose are numerically equal. The SI unit for equivalent dose is the sievert (Sv); thus, 1 Gy equals 1 Sv. As the doses used for radiographic imaging are generally very low, millisieverts (mSv) are the standard nomenclature for describing administered doses [8]. The probability and severity of harmful effects from the same equivalent dose of radiation differ among body organs and tissues. The International Commission on Radiological Protection (ICRP) refers to the combination of probability and severity of harm as "detriment." To determine the combined detriment from stochastic effects due to the equivalent doses in all body organs and tissues, the "effective dose" is defined. It is calculated by multiplying the equivalent dose in each organ by a tissue weighting factor (WT), and the results are summed over the whole body [8] (Figure 2.1). The SI unit for effective dose is also the sievert (Sv).
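The effective dose calculation described above can be sketched numerically. This is a minimal illustration, not a dosimetry tool: the organ doses below are invented example values, and the weighting factors are the ICRP examples quoted in this chapter.

```python
# Effective dose E = sum over organs of (equivalent dose H_T * tissue weighting factor W_T).
# For X-ray photons the radiation weighting factor is 1, so H_T equals the absorbed dose,
# and E carries the same numerical units (here mSv). Organ doses are illustrative only.

ICRP_WEIGHTS = {            # subset of ICRP tissue weighting factors W_T
    "red_bone_marrow": 0.12,
    "colon": 0.12,
    "lung": 0.12,
    "stomach": 0.12,
    "gonads": 0.08,
    "skin": 0.04,
}

def effective_dose_mSv(equivalent_doses_mSv):
    """Sum the W_T-weighted equivalent doses (mSv) over the exposed organs."""
    return sum(ICRP_WEIGHTS[organ] * dose
               for organ, dose in equivalent_doses_mSv.items())

# Hypothetical organ equivalent doses (mSv) from an abdominal exposure:
doses = {"colon": 10.0, "stomach": 8.0, "red_bone_marrow": 5.0, "skin": 2.0}
print(effective_dose_mSv(doses))  # weighted whole-body sum: 2.84 mSv
```

Note that organs receiving no dose simply contribute zero to the sum, so only the irradiated organs need to be listed.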
The weighting factors vary according to the radiosensitivity of each organ and are determined by the ICRP. For example, a WT of 0.12 is assigned to red bone marrow, colon, lung, stomach, breast, kidney, pancreas, and prostate, a WT of 0.08 to the gonads, and a WT of 0.04 to the skin and brain [8]. Another factor used for estimation of radiation exposure is the dose area product (DAP), which is calculated as the radiation dose to air multiplied by the area of the X-ray field. It is expressed in Gy·cm² and can be used reliably to estimate the effective dose by combining the DAP with the appropriate coefficient (which varies with the irradiated portion of the body and the protocol used) derived from Monte Carlo simulations with anthropomorphic digital phantoms [9–11]. The size-specific dose estimate is a novel method of reporting a patient's radiation dose. It is calculated by multiplying the CT dose index volume (CTDIvol) by a size-dependent conversion factor; for the same CTDIvol, a larger patient size yields a smaller conversion factor and hence a lower size-specific dose estimate [12, 13]. The average medical imaging exposure ranges from 0.7 mSv for a single plain kidney-ureter-bladder (KUB) film to 18 mSv for a CT scan, depending upon device settings and body habitus. Even with relatively low radiation doses, concern over excessive radiation exposure has grown recently due to the exponential rate at which medical imaging is used [14]. Before considering the hazards of radiation and methods of protection, it is wise to briefly review the principles of radiation production.
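The two derived metrics just described, DAP-based effective dose and the size-specific dose estimate, are both simple products, which a short sketch can make concrete. The coefficient and conversion-factor values below are hypothetical placeholders; in practice they come from published Monte Carlo tables and from size-lookup tables keyed to patient diameter.

```python
# Two dose metrics from the text, with illustrative (not clinically calibrated) numbers.

def effective_dose_from_dap(dap_gy_cm2, coeff_mSv_per_gy_cm2):
    """Effective dose (mSv) ~= DAP (Gy*cm^2) x a region/protocol-specific coefficient."""
    return dap_gy_cm2 * coeff_mSv_per_gy_cm2

def ssde_mGy(ctdi_vol_mGy, size_conversion_factor):
    """Size-specific dose estimate: CTDIvol scaled by a patient-size-dependent factor.
    Larger patients have smaller conversion factors, so SSDE falls as size rises."""
    return ctdi_vol_mGy * size_conversion_factor

# Hypothetical abdominal fluoroscopy run: 10 Gy*cm^2 with a 0.2 mSv/(Gy*cm^2) coefficient.
print(effective_dose_from_dap(10.0, 0.2))   # 2.0 mSv

# Same scanner output (CTDIvol 12 mGy), small vs large patient (factors 1.3 vs 0.8):
print(ssde_mGy(12.0, 1.3), ssde_mGy(12.0, 0.8))
```

The second print line shows why SSDE matters: the same displayed CTDIvol corresponds to a substantially higher tissue dose in a small patient than in a large one.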
The basic components required for X-ray generation are (i) a rotating anode (usually made of tungsten, which is capable of handling a high heat load without warping or vaporizing), (ii) a cathode (the source of electrons), (iii) a high-voltage source, (iv) an X-ray vacuum glass tube, (v) the housing (a steel casing that provides shielding to prevent leakage of stray X-rays), and (vi) a collimator (which defines the X-ray field). Within the shielded housing, X-rays are generated when a high voltage is applied between the cathode and the rotating anode. The number of electrons liberated from the cathode is directly proportional to the current applied in milliamps (mA). A large voltage potential in the range of 50–120 kVp is usually required. Higher-energy X-rays penetrate deeper into tissues, and the penetrability of X-ray photons is directly proportional to the average energy of the photons generated. Hence, obese patients are expected to receive higher doses of radiation to obtain appropriate image quality compared with thinner ones. Therefore, the tube potential is adjusted to compensate for differences in patient body mass index (BMI). X-rays exit the housing through a narrow beryllium window, which permits the passage of only focused X-rays. Lead collimators further restrict the area of exposure, limiting the radiation area and, in turn, the amount of radiation scattered from the patient to operating room personnel (Figure 2.2) [6, 7]. The amount of X-ray photons, or radiation, generated by an individual electron varies with its proximity to the target material's nucleus: a direct impact transfers a large quantity of kinetic energy and generates maximum-energy X-ray photons, while a more distant interaction generates weaker X-ray photons.
The X-ray radiation produced is termed bremsstrahlung, which translates to "braking radiation." The variability of distance from the nucleus therefore creates a spectrum of energy emitted from the X-ray tube, the bremsstrahlung spectrum. The mean energy of the photons across the spectrum is typically equal to one-third of the peak energy set by the tube voltage. The energy of the photons is an important consideration because the higher the energy of a photon, the more shielding is required and the more readily the photon penetrates tissue. Shielding restricts leakage from the X-ray housing to less than 100 mrad/hour at 1 m while the tube is operating at maximum potential difference and current. Photons leave the tube through a small opening in the shielding and are directed towards the area of interest (Figure 2.2). Filters are then generally applied to these diagnostic X-ray beams. These filters attenuate the lower-energy photons, which do not have sufficient energy to reach the detector and are therefore of no clinical benefit. The photons with energy high enough to pass through the filters are focused by the collimators onto the tissue area of interest. Typically, the collimators further restrict exposure to the patient and the operator by eliminating radiation exposure outside the area of interest (Figure 2.2). The quality of the image obtained is significantly reliant on the energy level and quantity of the photons reaching the detector. X-ray photons need to be of high enough energy to penetrate the tissue but of low enough energy to limit "over-exposure" of the detector.
Therefore, the most controllable aspects of diagnostic imaging equipment are (i) the potential difference (kVp), which determines the amount of energy in the X-ray photons, (ii) the current (milliamps), which determines the number of electrons that create photons over a given time period, and (iii) the exposure time, which combines with the current to determine the total quantity of X-rays produced. Because radiation exposure is proportional to the square of the kVp and linearly related to the milliamps, it is better to increase kVp before increasing milliamps to enhance image quality. Conversely, if an initial image is "over-exposed," reducing the current will greatly reduce the radiation dose. Emitted X-rays have three potential fates as they interact with living tissue: (i) some are absorbed by dense tissues such as bone, (ii) others penetrate soft tissues to reach the image intensifier, and (iii) 0.1% are scattered 90° to the incident radiation, exposing bystanders. By placing the image intensifier above the patient and the source below the patient, the amount of scattered radiation reaching the surgeon can be minimized. Furthermore, leakage from the tube itself is shielded and is further away from the operating surgeon's head and body (Figure 2.2). Understanding the basic principles and mechanisms of X-ray generation provides a foundation for understanding radiation protection. Proper equipment maintenance and inspection of shielding and filters are prerequisites for operators of imaging equipment. Fluoroscopy units in hospitals are usually inspected and maintained annually by biomedical engineers. Nonetheless, it is important for the urologist to understand the basic electrical parameters (kVp and current) and their effects on X-ray generation, to optimize image quality while minimizing dose to patients and operators.
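The square-law relationship for kVp and the linear relationship for current can be illustrated with a toy calculation. The numbers are arbitrary relative units, not calibrated dosimetry; the point is only the scaling behavior described above.

```python
# Relative exposure scales roughly with kVp^2 * current * time, per the text's
# square law in kVp and linear law in milliamps. Arbitrary relative units.

def relative_exposure(kvp, ma, seconds):
    return (kvp ** 2) * ma * seconds

base       = relative_exposure(80, 4, 1.0)
higher_kvp = relative_exposure(96, 4, 1.0)   # kVp raised by 20%
higher_ma  = relative_exposure(80, 4.8, 1.0) # current raised by 20%

print(higher_kvp / base)  # 1.44: a 20% kVp increase raises exposure quadratically
print(higher_ma / base)   # 1.20: a 20% current increase raises exposure linearly
```

The same arithmetic explains the dose-saving side: halving exposure time halves the dose, while dropping from 96 to 80 kVp cuts it by a factor of 1.44.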
Finally, the use of collimation to limit the exposure field to precisely the tissue area of interest can greatly restrict unnecessary exposure. There are two broad categories of risk from radiation exposure, namely deterministic and stochastic. Deterministic effects refer to direct cell death resulting from exposure of biologic tissue to high doses of radiation. This type of tissue reaction requires a threshold dose to occur; below this threshold nothing will happen, and above it the degree of tissue response correlates with the radiation dose. Examples are opacification of the lens of the eye (cataract formation) at a threshold of 200 rad, skin erythema and depilation at a threshold of 300–600 rad, and even skin ulceration and necrosis at a threshold of 1500–2000 rad [1, 8]. Stochastic effects refer to events that occur by chance irrespective of threshold levels of radiation; the likelihood of occurrence increases with dose, but there is no relationship between the severity of the event and the dose. The stochastic risks of greatest concern are hematologic and solid organ malignancies. The proposed explanation of these carcinogenic hazards is that X-rays, as a type of ionizing radiation, possess enough energy to overcome the binding energy of the electrons orbiting atoms. Thus this energy can knock electrons out of their orbits and create ions. In biologic material, X-rays can either ionize DNA directly or produce hydroxyl (OH) radicals from the interaction with water molecules. These hydroxyl radicals in turn interact with nearby DNA and cause epigenetic responses in the form of radiation-induced genomic instability or bystander signaling, base damage, or strand breaks.
While most X-ray-induced DNA damage undergoes rapid repair by different intracellular mechanisms, occasional errors in DNA repair may occur, resulting in chromosomal translocations or point mutations which may lead to the induction of carcinogenesis, especially in growing tissues such as those of infants and children [14]. Whereas a correlation between excessive radiation exposure and deterministic dermal and ocular effects clearly exists, the models used to calculate the stochastic effects associated with radiation exposure are still debated [15]. Two recent studies assessed the risks of cancer following repeated or protracted low-dose radiation exposure among 308 297 radiation-monitored workers from the United States, France, and the United Kingdom. Workers were selected for monitoring on the condition of having been employed for at least 1 year by the Departments of Energy and Defense in the United States, the Atomic Energy Commission or the National Electricity Company in France, and nuclear industry employers included in the National Registry for Radiation Workers in the United Kingdom. The first study showed a linear increase in the rate of leukemia with increasing radiation exposure: after exclusion of chronic lymphocytic leukemia, there was an excess relative risk of mortality from leukemia of 2.96 per Gy, with a marked association between radiation dose and mortality from chronic myeloid leukemia (excess relative risk per Gy 10.45), even though doses were accumulated at very low rates (mean of 1.1 mGy/year) [16]. The second study provided a direct estimate of the association between solid cancer mortality and protracted low-dose exposure to ionizing radiation [17]. Another international study, with 407 391 nuclear industry workers from 15 countries, estimated the oncologic risks following prolonged low doses of ionizing radiation exposure.
The authors found a significant association between radiation dose and all-cancer mortality, with an excess relative risk per sievert (ERR/Sv) of 0.97 (90% confidence interval [CI] 0.28–1.77; 5233 deaths). Furthermore, duration of employment had a large effect on the ERR/Sv, and lung cancer was the only one of the 31 malignancies studied with a significant individual association (ERR/Sv = 1.86 [90% CI 0.49–3.63; 1457 deaths]) [18]. In another study, the estimated risk of cancer attributable to diagnostic X-rays in 14 developed countries was 0.6–1.8%, compared with more than 3% in Japan, which has the highest estimated annual exposure worldwide [19]. Eisenberg and co-investigators studied the risk of cancer in 82 861 patients without a previous history of cancer who underwent diagnostic or therapeutic imaging following acute myocardial infarction from 1996 to 2006 [20]. The cumulative radiation exposure was found to be 5.3 mSv/patient/year. Over a median follow-up of 5 years, a total of 12 020 incident cancers were diagnosed. Every 10 mSv of low-dose ionizing radiation was associated with a 3% increase in the risk of age- and sex-adjusted cancer, with a hazard ratio of 1.003 per mSv (95% CI 1.002–1.004) [20]. Additionally, Berrington de González and coworkers estimated future cancers attributable to ionizing radiation from CT scans at 29 000 cases in the United States in 2007; of these, 66% were in females, and the largest contribution (14 000) was from abdominopelvic CTs [21]. Furthermore, Smith-Bindman and co-investigators reported that 1 in 600 men and 1 in 270 women who underwent CT coronary angiography at the age of 40 are estimated to develop cancer as a result; this risk is halved for those in their sixties and doubled for those in their twenties [22]. There is no doubt that most medical diagnoses rely on ionizing radiation. However, diagnostic modalities such as ultrasound and magnetic resonance imaging (MRI) are free of radiation.
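The per-mSv hazard ratio and the per-10-mSv risk increase quoted above are consistent with each other, which a one-line compounding check makes explicit:

```python
# A hazard ratio of 1.003 per mSv, compounded over 10 mSv, gives roughly a 3%
# increase in risk, matching the reported "3% per 10 mSv" figure.
hr_per_mSv = 1.003
hr_per_10_mSv = hr_per_mSv ** 10
print(round(hr_per_10_mSv, 4))  # ~1.0304, i.e. about +3% per 10 mSv
```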
Plain radiography, fluoroscopy, CT, and nuclear medicine studies are associated with various doses of ionizing radiation. Traditionally, plain radiography such as KUB films was the initial imaging modality for patients with suspected nephrolithiasis, with a sensitivity of 59% and a specificity of 71% [23]. This low accuracy is due to the failure of plain radiographs to detect uric acid stones. Furthermore, identification of stones may be masked by overlying bowel gas. Advantages of KUB include its low cost, which makes it available worldwide, and its lower radiation exposure compared with other modalities (effective radiation dose of 0.2–0.7 mSv) [23, 24]. Intravenous urography (IVU) has additional advantages over KUB, such as demonstration of the anatomical and functional status of the urinary system. It is associated with a higher effective radiation dose, ranging from 0.7 to 3.7 mSv depending on the number of images [24, 25]. However, IVU is time-consuming and necessitates the use of intravenous contrast materials, with the risks of allergic reactions and contrast-induced nephropathy. Since its invention in the 1970s, CT has been used extensively in medical diagnosis and intervention. During a CT scan, multiple cross-sectional two-dimensional images of a specific area are produced by a rotating source passing X-rays through the patient's body. These two-dimensional images can later be digitally combined to yield three-dimensional images [26]. In urologic practice, CT scans play a vital role in the preoperative planning of percutaneous nephrolithotomy (PCNL) and SWL. For PCNL, CT provides an orientation of the pyelocalyceal system and the relationship of the kidney to surrounding organs such as the colon, liver, and spleen. In addition, preoperative non-contrast CT (NCCT) scanning forms the basis for calculating different scoring systems assessing PCNL complexity, such as the S.T.O.N.E. and Seoul scoring systems [27, 28].
Regarding SWL, CT is very important for calculating the skin‐to‐stone distance to determine whether shock waves can reach the stone. For both SWL and PCNL, it detects postprocedural complications and presence of residual stones [29, 30]. Furthermore, it is ideal for diagnosis, staging, and follow‐up of most urologic malignancies [31, 32]. In addition, NCCT is the imaging modality of choice for initial evaluation of patients with suspected urolithiasis with reported diagnostic sensitivity up to 98% and specificity up to 100% [33]. It is worth mentioning that the stone protocol CT scans performed for diagnosis and follow‐up of urolithiasis patients have scan acquisition parameters that differ from traditional abdominopelvic CT scans. Therefore, patients presenting with suspected stone disease to the emergency department undergo NCCT scans with acquisition parameters different from those used in the normal stone protocol CT scans to allow emergency physicians to diagnose pathologies other than stones [34, 35]. With the exception of stones encountered in patients undergoing indinavir therapy and pure matrix stones, NCCT scan can diagnose all types of urolithiasis such as calcium, uric acid, xanthine, and cystine stones [36, 37]. Furthermore, it detects other renal and abdominal pathologies which present with acute abdominal pain resembling renal colic such as diverticulitis, appendicitis, or ovarian torsion. Therefore, it is not surprising that 10–14% of NCCTs performed in the emergency departments for renal colic were associated with alternative diagnoses [38–43]. In addition to determination of stone burden, location, multiplicity, density (Hounsfield unit), and associated hydronephrosis, dual‐energy CT (DECT) technology, which has been introduced in the past decade, seems to possess the potential ability for accurate determination of stone composition [44–47]. 
It involves simultaneous scanning using two different energies, which permits tissue characterization. There are two currently available DECT systems: single-source DECT (ssDECT) and dual-source DECT (dsDECT). While ssDECT uses one X-ray tube with rapid kV switching between high (140 kVp) and low (80 kVp) energies, dsDECT uses two X-ray tubes (140 and 80 kVp) and two detectors mounted perpendicular to each other on a single gantry [48]. Several in vitro and in vivo studies have validated DECT technology for the detection of stone composition using both dsDECT and ssDECT. It allows detection of stone composition based on the variation in the attenuation characteristics of stones at different X-ray energies, with up to 100% sensitivity and accuracy in distinguishing non-uric acid from uric acid stones, regardless of stone size [44, 46, 49, 50]. Furthermore, with evolving DECT algorithms, there is an application for accurate subcategorization of renal stones [49, 51]. To reduce radiation exposure, the current DECT protocol for renal stones includes the use of a low-dose single-energy mode covering the abdomen and pelvis to recognize possible calculi in the urinary tract. Once a urinary stone is detected, a dual-energy acquisition targeted to the anatomical area of the stone is performed [37]. It should be noted that obese patients receive higher radiation doses from CT [52]. In a recent study, Wang and coworkers found that obese patients received a more than threefold higher radiation dose compared with nonobese patients during stone-protocol CT employing automatic tube current modulation (10.22 vs. 3.04 mSv; P < 0.0001) [53]. Disadvantages of CT include its high cost, which limits worldwide availability, and its high radiation exposure, with effective doses of 4.5–18 mSv [25]. In a study by John and colleagues, patients who underwent a CT were exposed to 14.46 mSv [54].
Furthermore, the median effective radiation dose (ERD) associated with an acute stone episode and 1 year of follow-up at two American academic centers was 29.7 mSv [55]. Moreover, 20% of patients received over 50 mSv, with an average of 3.5 CTs [55]. In a similar study assessing the ERD to which urolithiasis patients are exposed during evaluation and follow-up over the first and second years following the acute stone episode, Fahmy et al. found that the average ERD per CT scan was 23.16 mSv (range 4.94–72.77 mSv). Furthermore, they found that 17.3% of patients exceeded 50 mSv during the first follow-up year, with a mean ERD of 29.29 mSv (1.7–77.27 mSv) [56]. Another study reported that a patient with an acute kidney stone episode requiring radiological imaging in the form of one or two KUBs, one or two abdominopelvic CTs, and one IVU during the first year of follow-up may be exposed to a total effective dose of 20–40 mSv [24]. In addition, it has been calculated that the effective dose from a CT scan of the abdomen in an adult patient is roughly equivalent to the effective dose from 400 chest X-rays [25]. Even some surveillance protocols using CTs have been shown to be associated with significant radiation exposure and risk of developing cancers. Tarin and colleagues assessed the estimated cancer risks associated with the 5-year surveillance protocol for stage I non-seminomatous germ cell tumors of the testis, as recommended by the National Comprehensive Cancer Network. They reported a lifetime cancer risk of 1.5% across different age groups, with lung and colon cancer accounting for most of the risk [57]. Nuclear medicine imaging procedures such as positron emission tomography CT (PET/CT) scans and renal scintigraphy (renal scanning) entail the use of a radioactive material, called a radiotracer or radiopharmaceutical, which is introduced into the human body by injection, swallowing, or inhalation.
This radiotracer accumulates in the area of the body being examined, where it gives off a small amount of energy that can be detected by a gamma camera, providing details on both the function and the structure of the tissues and organs examined. These radiotracers stay within the tissues for a period of time and liberate radiation; thus, unlike imaging modalities in which the radiation source is external, nuclear studies represent a source of internal radiation. These studies are of special importance for assessing organ function, such as the differential renal function, which is especially important in children with congenital renal anomalies. The ICRP reported a total of 18 million nuclear studies in the United States in 2006. Furthermore, the doubling of ionizing radiation exposure in the United States over the last two decades was largely attributable to increased imaging from CT, interventional fluoroscopy, and nuclear medicine [26]. According to the US NCRP, CT scans, fluoroscopy-guided procedures, and nuclear medicine studies represent approximately 26% of annual imaging procedures; nonetheless, they cause 89% of the total annual radiation exposure [1, 26]. The main interventional endourologic procedures, such as SWL, ureteroscopy, and PCNL, use fluoroscopic guidance. The increase in the number of these procedures over the past few decades has raised concerns regarding the amount of radiation exposure associated with these interventions. For example, in the United States, per-capita radiation exposure from medical sources increased about 600% (from 0.54 to 3.0 mSv) between 1982 and 2006 [25]. In terms of the radiation exposure associated with each procedure, a study by Safak et al. found a mean ERD of 9.2 mSv (range 0.82–26.0 mSv) for PCNL [58]. These results were similar to the mean ERD reported by the Mancini group (9.09 mSv) [59].
Higher ERD was associated with higher BMI, larger stone burden, non-branched stone configuration, and a higher number of percutaneous access (PCA) tracts [59]. Furthermore, increased stone burden, prolonged operative time, multiple access tracts, and blood loss >250 cm3 have been associated with significantly prolonged fluoroscopy time (FT) [60, 61]. Concerning ureteroscopy, the mean ERD has recently been reported as 1.13 mSv, which is equivalent to that of an abdominopelvic X-ray [62]. This is lower than in older studies, which reported a mean ERD of 2.5 mSv [63]; the use of newer fluoroscopy devices might explain the lower ERD in the more recent study. Predictors of prolonged FT during ureteroscopy include urology trainees, male gender, ureteral balloon dilation, residual stones, use of an access sheath, surgeon/trainee behavior, longer procedure duration (14 seconds of FT per 10 minutes), and higher BMI [64–66]. Furthermore, the use of flexible ureteroscopes, or of both flexible and semi-rigid ureteroscopes in the same setting, was associated with higher radiation exposure compared with semi-rigid ureteroscopy alone [63]. Regarding SWL, precise stone localization necessitates the use of fluoroscopy. SWL is associated with a mean radiation exposure of 1.63 mSv [67]. Higher radiation exposures are associated with larger stone burden, higher BMI, ureteral stone location, physician inexperience, and a higher number of shocks delivered [67–69]. Moreover, radiation is used to follow up patients after SWL. Recently, Kaynar and coworkers evaluated the ERD of 129 patients during the first follow-up year after SWL. They reported a mean ERD of 15.91 mSv following SWL of kidney stones, 13.32 mSv following SWL of ureteral stones, and 27.02 mSv following SWL for multiple stone locations.
Furthermore, SWL for multiple stone locations was associated with significantly higher radiation exposure during the first follow-up year compared with SWL of solitary renal or ureteral stones [70]. Until recently, there was a paucity of literature on the amount of radiation to which urologists are exposed and its impact on their health. The 2007 ICRP guidelines recommend 50 mSv (5000 milliroentgen equivalent man, or mrem) as an occupational dose limit per year, or 100 mSv (10 000 mrem) averaged over 5 years [6]. Furthermore, United States regulations (Title 10, Part 20 of the Code of Federal Regulations) mandate yearly accepted limits of 5000 mrem deep-dose equivalent, 15 000 mrem lens dose equivalent, and 50 000 mrem shallow-dose equivalent. To ensure practitioners remain within these guidelines, there is a need for controlled measurement of the radiation exposure experienced by practicing urologists over time. Therefore, urologists are advised to use a dosimeter to monitor their radiation exposure. A single-center prospective German study measured the radiation exposure of 12 urologists by placing two thermoluminescent dosimeters (one on the forehead and the other on the ring finger). A total of 188 patients underwent 235 endourologic procedures, including 51 ureteral stent changes (USCs), 67 ureteral stent placements (USPs), 67 percutaneous stent changes (PSCs), 39 ureteroscopies, and 11 PCNLs. The authors recorded average values of 0.04 mSv during USP and USC, 0.03 mSv during PSC, 0.18 mSv during PCNL, and 0.1 mSv during ureteroscopy using the forehead dosimeter, and average values of 0.13, 0.21, 0.20, 4.36, and 0.15 mSv during USP, USC, PSC, PCNL, and ureteroscopy, respectively, using the ring-finger dosimeter [71].
Furthermore, a study from North America measured the fluoroscopic radiation exposure of an experienced urologist over 9 months using a thermoluminescent dosimeter worn outside the thyroid shield. The total radiation exposure over 9 months was 87 mrem deep-dose equivalent, 293 mrem lens dose equivalent, and 282 mrem shallow-dose equivalent [72]. Therefore, these studies showed that radiation exposure for urologists was below the annual accepted limits. Nevertheless, radiation safety protocols and recommendations should be followed to guard against the potential hazards of ionizing radiation. Over the past two decades, reports have highlighted the increasing use of radiation in medicine. In 2000, a report from the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) found that radiation used for medical purposes accounts for more than 95% of human-made radiation exposure, and that radiological diagnostic procedures equaled or exceeded one per individual of the population per year. It has been reported that a person living in the United States is exposed yearly to an average of 6.2 mSv of radiation from ambient sources such as cosmic rays, radon, and medical procedures [73]. While the recommended occupational radiation exposure of medical personnel should be limited to 50 mSv/year, there is no dose threshold for developing stochastic hazards [73]. According to the linear no-threshold (LNT) model, which is used to quantify radiation exposure and set the regulatory limits for radiation protection, there is no safe dose of radiation, and cancer can occur in 1/1000 individuals following exposure to an effective dose as low as 10 mSv [74]. Because some organs, such as the gonads and the eyes, are more sensitive than others, such as the extremities, the recommended exposure limits vary according to the body part.
In addition, there are variations in radiation exposure among healthcare centers in terms of the ERD for the same diagnostic procedure [75]. According to a study by Smith-Bindman and colleagues, there were wide variations among the radiation doses reported for different types of medical imaging studies; for example, a mean 13-fold variation was reported between the lowest and highest doses of CT scans for adult patients across and within institutions in the San Francisco Bay Area [22]. Therefore, different groups, including the American Association of Physicists in Medicine, the American College of Radiology, and the NCRP, together with the US Food and Drug Administration (FDA), have worked to establish nationally recognized diagnostic reference levels (DRLs) for different imaging procedures [76]. The ICRP recommends three basic principles for reducing radiation exposure. The first is the principle of justification, defined as "Any decision that alters the radiation exposure situation should do more good than harm." It refers to the avoidance of unnecessary studies and the replacement of procedures associated with higher radiation with others that may play the same role, whenever appropriate. The second is the principle of optimization, defined as "The likelihood of incurring exposure, the number of people exposed, and the magnitude of their individual doses should all be kept as low as reasonably achievable (ALARA), taking into account economic and societal factors." It entails performing the diagnostic or interventional procedure with acceptable quality and the lowest radiation exposure. The third is the application of dose limits, defined as "The total dose to any individual from regulated sources in planned exposure situations other than medical exposure of patients should not exceed the appropriate limits specified by the Commission" [6]. Justification is a key principle for radiation safety during diagnosis and treatment using ionizing radiation.
It entails several requirements. First, it is mandatory that any ionizing radiation examination prescribed by the referring clinician be required for the individual patient; that the examination have a specific objective, be risk-effective and reliable, and be anticipated to influence decision-making, patient treatment, and final outcome; and that the necessary information cannot be obtained by other modalities with lower risk. Furthermore, a single person should be responsible for the examination, normally a radiologist trained in radiological techniques and radiological protection, as certified by a competent authority. In addition, a documented request including the patient's clinical information, signed or endorsed by the referring clinician, should be available before the examination is performed. For a female patient of childbearing age, the possibility of pregnancy should always be kept in mind: the date of the last menstrual period should be documented, and a pregnancy test should be ordered if pregnancy is in doubt [8, 77]. Furthermore, all biomedical research projects involving the use of ionizing radiation should be institutionally approved by ethics review boards and radiation protection committees.

Following justification of the examination or image-guided intervention, optimization is the next step towards radiation protection. Optimization of protection is a forward-looking, iterative process directed at performing the procedure with acceptable quality while keeping radiation exposure as low as reasonably achievable (ALARA), taking into account both technical and socioeconomic factors. It is therefore a "frame of mind" that always questions whether everything reasonable has been done to reduce doses. The concept of ALARA, or "as low as reasonably achievable," was described as a fundamental part of the optimization process.
The basis of the ALARA principles is that these measures have been compiled to minimize radiation exposure, are cost-effective so as to increase compliance, and neither cause unnecessary delay to the procedure nor hinder its performance or affect its outcome. The three basic principles of ALARA are time, distance, and shielding.

One of the most effective methods of reducing radiation exposure to patients and healthcare personnel is to limit the time of radiation exposure. This is particularly important during fluoroscopy, where shortening the FT leads to a substantial decrease in radiation exposure. This can be achieved in several ways: (i) substitute fluoroscopy with other imaging modalities, such as ultrasound-guided PCA during PCNL or totally ultrasound-guided PCNL; (ii) use digital fluoroscopy; (iii) use "last-image-hold" technology, which has been shown to reduce radiation dose from 3000 to 400 mGy; (iv) use pulsed fluoroscopy at fewer frames per second (such as 1 or 4 frames per second) rather than standard fluoroscopy at 30 frames per second; (v) have the surgeon activate fluoroscopy only at key points, in short taps rather than continuous stretches; (vi) keep tracking the FT; and (vii) document the FT after each procedure [78–82] (Box 2.1).

Increasing distance is the cheapest and most effective way to reduce radiation exposure to operating room personnel. Because radiation exposure has an inverse relationship with the square of the distance from the source, doubling the distance reduces radiation to one-quarter, and at a distance of 3 m the radiation dose approaches background levels. Healthcare providers can increase their distance by avoiding being in the room during the procedure whenever possible, such as when performing kidney–ureter–bladder (KUB) plain radiographs, intravenous pyelography, or computed tomography (CT).
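The time and distance arithmetic above can be illustrated numerically. The sketch below rests on idealized assumptions (dose taken as directly proportional to frame rate, and falling with the inverse square of distance from the source); the function names are hypothetical:

```python
# Idealized dose scaling for two of the ALARA levers discussed above.

def pulsed_dose_fraction(pulsed_fps: float, continuous_fps: float = 30.0) -> float:
    """Fraction of continuous-fluoroscopy dose retained when pulsing at a
    lower frame rate, assuming dose is proportional to frames per second."""
    return pulsed_fps / continuous_fps

def dose_rate_at(distance_m: float, ref_dose_rate: float,
                 ref_distance_m: float = 1.0) -> float:
    """Scale a dose rate measured at ref_distance_m by the inverse-square law."""
    return ref_dose_rate * (ref_distance_m / distance_m) ** 2

# Pulsing at 4 frames/s instead of 30 keeps only ~13% of the dose.
print(round(pulsed_dose_fraction(4.0), 2))   # -> 0.13

# Doubling the distance from the source cuts the dose rate to one-quarter.
print(dose_rate_at(2.0, ref_dose_rate=4.0))  # -> 1.0
```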
In addition, the use of lens-mounted video cameras allows the surgeon to stand further from the radiation source [7, 82] (Box 2.1). Shielding is the third line of defense, especially for personnel who must remain in the radiation field. Shields are made of heavy metals, most commonly lead, which attenuates radiation; examples include lead-impregnated eyeglasses, gloves, thyroid shields, chest and pelvic aprons, and ceiling-mounted shields. Lead aprons vary in thickness; most are 0.5 mm lead equivalent, which attenuates radiation by 96.5–99.5%. Shielding therefore does not provide 100% protection from radiation and should not be considered a substitute for the other ALARA principles [7, 83]. Lead aprons should be inspected annually for cracks. In one study, thyroid shields were found to decrease radiation exposure 23-fold (from 0.46 to 0.02 mSv), thereby reducing it to background levels [84].

Despite the major benefits of shielding in reducing radiation exposure, the weight of chest and pelvic aprons is associated with orthopedic problems. In 2011, Elkoushy and Andonian surveyed compliance with radiation safety measures and the prevalence of orthopedic complaints among members of the Endourological Society. Almost 64% of respondents reported orthopedic complaints: back problems in 38.1%, neck problems in 27.6%, hand problems in 17.2%, and hip and knee problems in 14.2% [85]. Although compliance with wearing chest and pelvic aprons was good (97%), compliance with thyroid shields was only 68%. In addition, only 34.3% of respondents used dosimeters, 17.2% used lead-impregnated glasses, and 9.7% used lead-impregnated gloves [85]. Similarly, other national and international surveys have shown a lack of radiation exposure monitoring in the United States, Europe, India, Brazil, and Turkey [86–89]. Therefore, all of these studies recommended regular radiation safety courses for trainees and urologists.
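The attenuation figures quoted above for lead aprons translate directly into transmitted dose. A minimal sketch, assuming a shield characterized only by its quoted attenuation fraction (the function name and the 10 mSv incident dose are illustrative assumptions):

```python
# Transmitted dose behind a shield, using the 96.5-99.5% attenuation range
# quoted above for a 0.5 mm lead-equivalent apron. Illustrative only.

def transmitted_dose(incident_dose_msv: float, attenuation_fraction: float) -> float:
    """Dose passing through a shield that blocks the given fraction."""
    if not 0.0 <= attenuation_fraction <= 1.0:
        raise ValueError("attenuation_fraction must be between 0 and 1")
    return incident_dose_msv * (1.0 - attenuation_fraction)

# Even at 99.5% attenuation some dose gets through, which is why shielding
# complements, rather than replaces, the time and distance principles.
print(round(transmitted_dose(10.0, 0.965), 2))  # -> 0.35
print(round(transmitted_dose(10.0, 0.995), 2))  # -> 0.05
```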
Radiation safety courses have succeeded in reducing radiation dose: implementation of a "radiation awareness program" that focused on taking fewer "snapshots" during SWL lowered doses [90], and in another study, a radiation safety education initiative achieved a significant decrease in dose–area product and FT during pediatric procedures [82, 91] (Box 2.1).

The principle of application of dose limits entails that the prescribed effective dose or equivalent dose values for individuals should not be exceeded during planned exposure situations. This applies only to occupational exposure; for example, the dose to a surgeon or technician should not exceed 50 mSv per year. Limiting the dose to individual patients, however, is not recommended, because it may compromise the effectiveness of the patient's diagnosis or treatment. For example, dose limits cannot be applied to patients undergoing radiotherapy, where a target dose must be delivered irrespective of radiation exposure. Therefore, for patients, only the principles of justification and optimization apply [6].

In 2010, the FDA launched a collaborative initiative aimed at reducing unnecessary radiation exposure from medical imaging. This initiative focused on CT, fluoroscopy, and nuclear medicine as the three medical imaging modalities associated with the highest radiation doses [1].
The measures included promoting the safe use of medical imaging devices by (i) establishing requirements for manufacturers of CT and fluoroscopic devices to include additional safeguards in equipment design and developing nationally recognized DRLs; (ii) supporting informed clinical decision-making by requiring manufacturers of fluoroscopy and CT devices to record radiation dose information for use in patient medical records, which may allow the physician, together with the patient and the radiologist, to further justify procedures that necessitate ionizing radiation; and (iii) increasing patient awareness and providing patients with tools to track their personal medical imaging history, such as a patient medical imaging record card [1]. In 2014, the government of Germany hosted an international conference, sponsored by the International Atomic Energy Agency and the World Health Organization, calling for action on radiation protection in medicine over the next decade. This conference was coined the "Bonn Call for Action" [92].
Radiation Safety During Diagnosis and Treatment
Introduction
Basic terminology and International System of Units in radiology
How are X‐rays generated?
Potential hazards of excessive radiation exposure
Deterministic effects
Stochastic effects
Sources of ionizing radiation encountered in urology
Diagnostic imaging
Interventional imaging
Occupational radiation exposure for urologists
Dose‐reduction strategies
Principle of justification
Principle of optimization
ALARA principles
Time (minimize time)
Distance (maximize distance)
Shielding (always use shields)
Principle of application of dose limits