Radiation Safety During Ureteroscopy

D. Duane Baldwin

(1) Department of Urology, Loma Linda University School of Medicine, Loma Linda, CA, USA

(2) Department of Urology, Loma Linda University Medical Center, Loma Linda University School of Medicine, 11234 Anderson Street, Room A560, Loma Linda, CA 92354, USA




Abstract

Medical imaging is essential for state-of-the-art diagnosis, surgical treatment and follow-up in individuals undergoing ureteroscopy. The imaging utilized has evolved from the use of plain films and intravenous pyelograms (IVP) with long acquisition times to spiral CT imaging with rapid image acquisition, high sensitivity and specificity. Although these advances in imaging have improved patient care, they have produced a moderate to significant increase in radiation exposure. Since the effects of radiation are not immediately perceived by the patient or the physician, their inherent risks may be easily overlooked. It is important that the urologic surgeon consider the potential risks and benefits of all imaging modalities prior to employing them.

Recently, concerns regarding increasing patient radiation exposure from medical imaging have led the US Food and Drug Administration (FDA) to call for a reduction in exposure during diagnostic and therapeutic medical procedures. In order to ensure high-quality healthcare while optimizing patient safety, it becomes essential for the treating physician to develop a clear understanding of the units of radiation exposure, the amount of radiation provided by different diagnostic and therapeutic interventions, the potential risks associated with this radiation exposure, and the reduced radiation alternatives currently available. By adhering to the principles outlined in this chapter for the appropriate utilization of ionizing radiation, the urologic surgeon can achieve optimal outcomes with a significant reduction in risk for both the patient and staff.



Introduction


Since the first description by Hugh Hampton Young in 1912 [1], there have been major advances in ureteroscopy including miniaturization of ureteroscopes and instruments, the development of the holmium laser for stone fragmentation, and the creation of actively deflectable flexible ureteroscopes [2–4]. Imaging has also advanced from plain films and intravenous pyelograms (IVP), with their long acquisition times and the risks of dye administration, to noncontrast spiral CT imaging with rapid image acquisition, high sensitivity and specificity, and no requirement for intravenous contrast. Although these advances in imaging have improved patient care, they have produced a moderate to significant increase in radiation exposure.

Since radiation cannot be seen, smelled, or felt, it is not surprising that its risks can be underestimated. In a survey of 45 emergency physicians and 38 radiologists published in 2004, 91% and 53%, respectively, believed that radiation did not increase the risk of malignancy. Furthermore, 51% of emergency physicians and 24% of radiologists underestimated the radiation exposure by a factor of more than ten [5].

In 2009, a tragic set of circumstances at Cedars-Sinai Hospital in Southern California forever altered both patient and physician perceptions of the risks associated with medical imaging. More than 200 patients were exposed to radiation levels up to 8–10 times normal over an 18-month period while having CT brain scans during stroke evaluation. Hair loss and skin erythema developed in 40% of patients. Perhaps most concerning was the fact that this was not an isolated practice, as several other hospitals in California and across the nation were employing similar protocols [6]. The images of these patients portrayed in the media and the ensuing lawsuits forever changed both physicians’ and patients’ perspectives on radiation safety [7, 8]. Furthermore, the use of CT has grown exponentially, from 3 million scans in 1980 to 62 million in 2007 [9], producing public health concerns, as it is estimated that 1.5–2.0% of all cancers diagnosed in the USA may be due to radiation from CT scans [10]. It is estimated that from the scans performed in 2007 alone, 29,000 cancers would develop, 14,000 of them due to CT of the abdomen and pelvis [10]. These public health concerns prompted the US Food and Drug Administration (FDA) to issue a white paper in February of 2010 describing the impact of the >60 million CT scans, 18 million nuclear medicine procedures, and 17 million interventional fluoroscopy procedures performed in the USA in 2006. This FDA campaign was designed to reduce the exponential growth in radiation exposure [11].

The primary dictum of the Hippocratic Oath is to “do no harm.” Yet, as evidence mounts suggesting that the radiation associated with medical imaging is potentially harmful to the patient, it becomes essential for the physician to develop a clear understanding of the units of radiation exposure, the amount of radiation provided by different diagnostic and therapeutic interventions, the potential risks associated with this radiation exposure, and the reduced radiation alternatives currently available.


Historical Implications of Medical Radiation


X-rays were first discovered in 1895 by Wilhelm Conrad Roentgen [12]. Later, Henri Becquerel demonstrated that uranium salts caused fogging of an unexposed photographic plate, and Marie Curie, credited with discovering that thorium and uranium gave off rays of energy, coined the term “radioactivity” [13, 14]. Scientists eager to recreate Roentgen’s discoveries exposed themselves to large amounts of radiation and, because of the latency of radiation injury, initially noted no harmful effects. With time, however, the devastating effects of their large radiation exposures became evident. Clarence M. Dally, who assisted Thomas Edison in early X-ray experiments over 8 years, was one of the early victims of radiation toxicity, developing facial burns, degenerative skin changes in his hands, and extensive hair loss. He later required hand amputation for a non-healing wound and ultimately died of metastatic cancer at the age of 39 [15].

Despite the accepted risk of high radiation doses, there is still debate regarding the harmful potential of low and moderate doses of radiation exposure. One theory, called radiation hormesis, holds that low doses of ionizing radiation stimulate the activation of repair mechanisms that not only cancel the detrimental effects of ionizing radiation but also inhibit disease unrelated to radiation exposure. This view is not widely accepted [16–19]. A second view of radiation biology is the threshold model, which contends that small doses of radiation are not damaging until a certain threshold is reached [20]. The predominantly held view is the linear no-threshold model. This theory suggests that cancer risk increases in direct proportion to radiation exposure and that the risk of sequential radiation doses is equivalent to their sum. It follows that there is no safe radiation exposure and that even small amounts of radiation have the potential to result in future malignancy. This is the view supported by the US National Research Council, the National Council on Radiation Protection and Measurements, and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) [19].


Radiation Physics


The physical term radiation includes ionizing and nonionizing forms and describes a process in which energetic particles or waves propagate through space. The lowest frequency of electromagnetic radiation is radio waves, followed in increasing frequency by microwaves, terahertz, infrared, visible light, ultraviolet, X-rays, and gamma rays [21]. Nonionizing forms of radiation include radio waves, heat, and visible light and are usually considered harmless to organisms at low levels that do not produce a temperature rise [22]. In contrast, ionizing radiation contacts atoms and removes an electron, leaving the atom with a net positive charge. This ionization of individual atoms may result in DNA damage and subsequently the development of malignancy [21].

Just as the effects of low to moderate dose radiation exposure are poorly understood, the units used to express radiation exposure are also not well understood by many medical professionals. The absorbed dose is the energy absorbed per unit of mass and is measured in Gray (Gy); one Gray is equal to 1 J of radiation energy absorbed per kilogram of tissue (1 J/kg). The organ dose is the absorbed dose averaged over an organ, that is, the amount of energy absorbed by the organ divided by the mass of the organ, and is also expressed in Gray. The organ dose is the preferred entity when assessing the risk to an organ from radiation [9, 23]. The effective dose is expressed in Sieverts (Sv) and is used for dose distributions that are not homogeneous, like those which occur in CT imaging; it is designed to be a proportional estimate of the overall harm to a population caused by a defined radiation exposure [24]. When discussing X-rays, gamma radiation, and beta radiation, Grays and Sieverts are equivalent [25]. In SI units, dose is expressed in Gray for matter or Sieverts for biological tissue; in either case, 1 Gy or 1 Sv is equal to 1 J/kg.
Non-SI units are still often employed, in which dose is expressed in rads (1 Gy = 100 rad) and dose equivalent in rems (1 Sv = 100 rem). Table 20.1 provides radiation exposure conversions. Individuals living in the United States are routinely exposed to approximately 3.0 mSv annually from environmental background radiation, although significant regional differences exist [26]. A lethal dose of radiation for a human is between 3 and 5 Sv delivered over an hour [27]. The Nuclear Regulatory Commission has mandated that 50 mSv is the maximal permissible annual radiation dose for occupational exposure in adults [28], while exposure should not exceed 100 mSv over any 5-year period [29].


Table 20.1
Radiation equivalents

1 Gy = 1,000 mGy
1 mGy = 100 mrad
1 Sv = 1,000 mSv
1 Sv = 100 rem
1 mSv = 100 mrem
10 mSv = 1 rem
1 rad = 1 rem
1 rem = 1,000 mrem
1 Gy^a = 1 Sv^a
1 Gy = 100 Roentgen
10 mGy = 1 Roentgen
1 mGy^a = 1 mSv^a


^a This only holds true when discussing X-rays, gamma rays, and beta radiation

Data from [25, 123–126].
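The unit equivalences in Table 20.1 reduce to simple multiplications. The following is an illustrative sketch, not from the chapter; the function names are ours.

```python
# Illustrative sketch (not from the chapter): converting between SI and
# legacy radiation units using the factors in Table 20.1.
# For X-rays, gamma rays, and beta radiation, 1 Gy is treated as 1 Sv.

def gray_to_rad(gy: float) -> float:
    """1 Gy = 100 rad."""
    return gy * 100.0

def sievert_to_rem(sv: float) -> float:
    """1 Sv = 100 rem."""
    return sv * 100.0

def msv_to_mrem(msv: float) -> float:
    """1 mSv = 100 mrem."""
    return msv * 100.0

# The ~3.0 mSv annual US background exposure expressed in mrem:
print(msv_to_mrem(3.0))      # 300.0
# A 0.5 Sv exposure expressed in rem:
print(sievert_to_rem(0.5))   # 50.0
```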

There are two types of side effects seen following radiation exposure: immediate deterministic effects and delayed stochastic effects. Deterministic effects (like those seen in the Cedars-Sinai patients) have a short latency and are rarely seen in urologic applications but occur more commonly after cardiology, interventional, and neurointerventional procedures. They will not occur unless a threshold of 2–3 Gy is exceeded in a single setting [30–35].

The second type of radiation side effect, known as stochastic effects, includes the development of secondary malignancies. Much of our understanding of the stochastic effects of radiation exposure is inferred from the side effects seen in atomic bomb survivors, although this may not be an ideal comparison since atomic bomb survivors were also exposed to neutrons, protons, and other radioactive materials for which the biological significance is less well characterized [36, 37]. Brenner and colleagues reviewed a group of victims exposed to a mean of 200 mSv, with 50% of that cohort exposed to less than 50 mSv. The stochastic effects were usually not seen for many years following radiation exposure and increased as the relative radiation exposure and the period from exposure increased. Although variable, the average effective dose of a single noncontrast CT of the abdomen or pelvis has been estimated at 10 mSv [38]. Therefore, more than three single-phase CT scans of the abdomen and pelvis, or one 3-phase study, in 1 year (60 mSv) would exceed the maximum permissible “occupational” exposure [28] and be similar to the exposure of a Japanese atomic bomb survivor within 3 km of detonation [39].
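The cumulative-dose arithmetic in this paragraph can be made explicit. A minimal sketch, assuming the chapter's quoted average of roughly 10 mSv per body region (abdomen or pelvis) per phase; the names and structure are ours, and real doses vary widely between scanners and protocols.

```python
# Illustrative sketch of the paragraph's arithmetic (names are ours).
# Assumes ~10 mSv average effective dose per body region (abdomen OR
# pelvis) per phase, as quoted in the text; real doses vary widely.

DOSE_PER_REGION_MSV = 10.0            # average single-region, single-phase dose
ANNUAL_OCCUPATIONAL_LIMIT_MSV = 50.0  # NRC annual occupational cap

def abdomen_pelvis_ct_dose_msv(n_studies: int, phases: int = 1) -> float:
    """Cumulative effective dose: each study covers two regions."""
    return n_studies * phases * 2 * DOSE_PER_REGION_MSV

# Three single-phase studies, or one 3-phase study, reach 60 mSv,
# exceeding the 50 mSv annual occupational limit:
print(abdomen_pelvis_ct_dose_msv(3))            # 60.0
print(abdomen_pelvis_ct_dose_msv(1, phases=3))  # 60.0
```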

According to Preston and colleagues, after exposure at age 30 it is estimated that solid cancer rates at age 70 will increase by 35% per Gy for men and 58% per Gy for women [40]. Individuals who have been exposed to 50–100 mSv over a protracted period, or 10–50 mSv during an acute exposure, demonstrate a dose-response relationship with regard to solid cancer mortality [41]. The likelihood that ionizing radiation will result in cancer depends upon the absorbed dose of radiation, adjusted for the damaging tendency of the type of radiation (the equivalent dose) and the sensitivity of the exposed organism and tissue (the effective dose).
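Under the linear no-threshold model described earlier, the Preston et al. figures scale directly with dose. A purely illustrative sketch; the function and dictionary are ours, not from the source.

```python
# Purely illustrative sketch of linear no-threshold scaling using the
# Preston et al. figures quoted above (35%/Gy men, 58%/Gy women for
# solid cancer rates at age 70 after exposure at age 30).

EXCESS_RATE_PER_GY = {"male": 0.35, "female": 0.58}

def relative_rate_increase(dose_gy: float, sex: str) -> float:
    """Fractional increase in solid cancer rate under a linear model."""
    return EXCESS_RATE_PER_GY[sex] * dose_gy

# A cumulative 0.1 Gy (100 mGy) exposure:
print(round(relative_rate_increase(0.1, "male"), 4))    # 0.035 -> ~3.5%
print(round(relative_rate_increase(0.1, "female"), 4))  # 0.058 -> ~5.8%
```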


Preoperative Imaging for Patients Undergoing Ureteroscopy


The most common indication for ureteroscopy is the management of urinary stone disease and flank pain. Imaging options in these patients include plain KUB, renal and bladder ultrasound, IVP, MRI, and CT. The modality selected may have significant effects upon imaging time, study cost, and patient radiation exposure.

One of the oldest, simplest, and fastest imaging modalities for the evaluation of stone patients is the flat plate of the abdomen (KUB). The KUB is low cost (US national average $240) [42], rapid to acquire, and easy to interpret. It provides a sensitivity of 44–77% and a specificity of 77–80% [43]. The radiation exposure from a KUB is modest at 0.7 mSv [44]. The disadvantages of a KUB are that it will not detect radiolucent stones (uric acid and cystine) or stones <4 mm, and that it may be difficult to distinguish small stones from phleboliths. Finally, a KUB cannot determine whether hydronephrosis is present or the system is obstructed, nor can it evaluate renal function [45].

Ultrasonography of the kidney and bladder represents another method to image patients with flank pain or potential stones, and it results in no radiation exposure. Ultrasound has other advantages including a rapid performance time (20–30 min) and a high sensitivity for detection of large renal stones (96–100%) [46]. Ultrasound can also determine ureteral patency indirectly by identifying ureteral jets and the absence of hydronephrosis. The disadvantages of ultrasound are that it is user dependent, it is difficult to image non-obstructing ureteral calculi, it may image poorly in obese individuals, and it is moderately costly ($400) [47].

Historically, IVP was considered the diagnostic test of choice for evaluating the upper urinary tract, with a mean exposure of 3.3 mSv [48]. Today this study is rarely performed in the USA, as it has largely been replaced by CT imaging [49]. IVP diagnoses both radio-opaque and larger radiolucent stones (by their negative filling defect) and was excellent at providing information regarding function and obstruction. Its sensitivity (87–90%) and specificity (94–100%) were good [50]. Of concern, the contrast causes adverse allergic reactions in up to 13% of people who receive ionic and 3% who receive nonionic contrast, and severe, life-threatening reactions occur between 0.04 and 0.22% of the time [51]. Secondly, the use of contrast media may result in renal deterioration due to acute tubular necrosis in 3–26% of patients, a risk that is increased in patients with chronic renal insufficiency, diabetes mellitus, dehydration, multiple myeloma, and congestive heart failure [52]. Furthermore, IVP is potentially labor intensive, sometimes requiring hours to perform [53]. The total cost of an IVP is typically between $150 and $200 [54].

Another imaging modality with no radiation exposure is MRI. Although good at evaluating soft tissue lesions, it is not sensitive at diagnosing urinary calculi, with a sensitivity and specificity of 78% and 96.1%, respectively [55]. Other limitations of MRI include slow acquisition time, limited availability during off hours, and high cost; the national average cost of an abdominal-pelvic MRI is approximately $5,100 [56]. MRI is predominantly used to determine the presence or absence of hydronephrosis and can also detect a filling defect if contrast is used [57, 58]. Until imaging times and costs are reduced, it is not likely that MRI will replace CT in the evaluation of flank pain and urinary stone patients.

The noncontrast CT scan of the abdomen and pelvis is currently the standard of care for the evaluation of patients with acute flank pain and potential urolithiasis due to its high sensitivity and specificity, rapid acquisition time, and universal availability. The sensitivity and specificity of CT have been noted to be 94–100% and 92–100%, respectively [46]. In addition, noncontrast CT carries no risk of dye allergy or acute tubular necrosis and requires very little technician time. Another advantage of CT imaging is its ability to establish alternative diagnoses in the 14% of patients presenting with flank pain who will require immediate or deferred treatment for conditions other than urinary stones [59]. The only significant disadvantages of computed tomography are the radiation exposure and the relatively high cost. The radiation exposure from a single abdomen and pelvic CT is approximately 20 mSv, but there can be a tremendous range between protocols employed at different centers [60]. The cost of a CT in one study was approximately $2,200 [47].

CT urogram, although not often employed, is a useful tool to assess the upper and lower urinary tract in patients that present with hematuria, strictures, or complicated anatomy [61]. Traditionally, a CT urogram has multiple phases including a noncontrast phase and delayed phase following opacification of the collecting system and has a relatively high radiation exposure at 25–35 mSv [62, 63]. It has a sensitivity for detecting upper tract neoplasms of 64% and a specificity of 98% in patients with asymptomatic hematuria. Furthermore, the sensitivity and specificity of a CT urogram to detect lower tract neoplasms was 79 and 94%, respectively. CT urography should be cautiously applied in high-risk patients with hematuria and complicated anatomy, but should probably not be routinely employed for evaluation of routine urinary stone patients due to its high radiation exposure.

Another imaging modality sometimes employed in the evaluation of patients prior to ureteroscopy is nuclear medicine renography, which is associated with approximately 3.7 mSv per study [64]. This test is typically conducted when evaluating a patient for obstruction or attempting to determine the function of a renal unit. Technetium-99m-mercaptoacetyltriglycine (MAG3) is commonly used because 40–50% of it is extracted by the proximal tubules and then secreted into the tubular lumen. This radiotracer is a good diagnostic agent in neonates, patients with impaired renal function, and patients with suspected obstruction, and it can be used as an independent measure of renal function [65]. Technetium-99m-diethylenetriaminepentaacetic acid (DTPA) is also commonly used in nuclear renography and has the advantage of being less expensive than MAG3; its extraction rate is approximately 20%. Table 20.2 provides a summary of the characteristics of the imaging modalities employed prior to ureteroscopy along with other commonly employed studies.


Table 20.2
Characteristics of imaging modalities employed prior to ureteroscopy

Modality | Sensitivity | Specificity | Time to perform exam | Radiation exposure
IVP | 87–90% [50] | 94–100% [50] | Intermediate to long | 3.3 mSv
KUB | 44–77% [43] | 77–80% [43] | Rapid | 0.7 mSv
Ultrasound | 74–96% [46] | 100% [46] | Intermediate | 0 mSv
Unenhanced CT abdomen/pelvis | 94–100% [46] | 92–100% [46] | Rapid | 20 mSv
MRI | 69–78% [55] | 89–96% [55] | Long | 0 mSv


Imaging Employed During Ureteroscopy


Fluoroscopy is almost universally employed during ureteroscopy and provides important information that improves the safety and efficacy of the procedure. It identifies stones, assists in the placement of guidewires, ureteral access sheaths, and stents, and assists during renal mapping. Extensive use of fluoroscopy during ureteroscopy has the potential to produce a large radiation exposure.

The amount of radiation the patient receives during fluoroscopy depends upon fluoroscopy time, the distance between the patient and the X-ray source, and the parameters set on the fluoroscopy machine (kVp, mA). When the kVp is increased, penetration and exposure increase and contrast decreases. Increasing the milliamperage (mA) increases the exposure, which darkens the film [66]. During a typical fluoroscopic examination the X-ray tube is operated below 100 kVp and 3 mA. Despite this, the radiation exposure from fluoroscopy is quite variable and is estimated to produce a radiation dose between 10 and 500 mGy per minute, with spot films increasing this dose by a factor of 10–60 times [67].
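The dose estimate in this paragraph is simple rate-times-time arithmetic. A hedged sketch, assuming (one reading of the text) that each spot film adds roughly 10–60 times the one-minute fluoroscopy dose; all names and parameters are ours.

```python
# Hedged sketch of the rate-times-time arithmetic above. Assumes (our
# reading of the text) that each spot film adds 10-60 times the
# one-minute fluoroscopy dose. All names and parameters are ours.

FLUORO_RATE_MGY_PER_MIN = (10.0, 500.0)  # estimated range from the text
SPOT_FILM_FACTOR = (10.0, 60.0)          # multiple of the per-minute dose

def fluoro_dose_range_mgy(minutes: float, n_spot_films: int = 0):
    """Return (low, high) estimated dose in mGy."""
    low = minutes * FLUORO_RATE_MGY_PER_MIN[0] \
        + n_spot_films * SPOT_FILM_FACTOR[0] * FLUORO_RATE_MGY_PER_MIN[0]
    high = minutes * FLUORO_RATE_MGY_PER_MIN[1] \
        + n_spot_films * SPOT_FILM_FACTOR[1] * FLUORO_RATE_MGY_PER_MIN[1]
    return low, high

# A 2-minute rigid ureteroscopy with no spot films:
print(fluoro_dose_range_mgy(2.0))  # (20.0, 1000.0)
```

The wide output range mirrors the text: even a short case can deliver anywhere from tens to hundreds of milligray depending on machine settings.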

In a study by Bagley and Cubler-Goodman published in 1990, patients undergoing ureteroscopy received 100 mGy to the posterior skin at the level of the costovertebral angle. In this study fluoroscopy time ranged from 2 min for a simple diagnostic rigid ureteroscopy up to 4.7 min for a combined rigid and flexible ureteroscopy [68]. In a more recent study performed by Krupp and colleagues in 2010, all clinical ureteroscopy cases from 2006 to 2008 were reviewed, and male and female cadavers were then exposed to 145 s of fluoroscopy to simulate ureteroscopy. Doses in this study were 10.5 mGy to the posterior skin, 3.5 mGy to the left kidney, and 2.7 mGy to the left ureter. Additionally, the left ovary received a higher radiation dose (3.4 mGy) than the left testicle (0.36 mGy), which should be considered when performing ureteroscopy in younger female patients [69].

In addition to the patient, urologists, radiologic technicians, and nurses are also exposed to radiation, primarily from patient scatter. Bagley and Cubler-Goodman performed 13 ureteroscopies during 1 month and found the urologist had a neck dose of 0.3 mSv and a hand dose of 12.7 mSv [68]. Giblin and colleagues measured the radiation exposure to the urologist positioned 6 in. from the perineum during direct vision endoscopy using a fluoroscopy table; the urologist’s head and neck received 11 mSv per hour from radiation scatter [70]. Certainly, the use of video endoscopy has moved the surgeon further from the source and decreased exposure, but the dose received by patients and staff from fluoroscopy may still be considerable.


Imaging During Follow-Up of Ureteroscopy Patients


Ureteroscopy patients receive ionizing radiation during the preoperative evaluation, intraoperatively during fluoroscopy, and again during follow-up imaging, with all exposures being cumulative. In the postoperative evaluation of stone patients, the combination of a KUB for residual stones and a renal ultrasound to detect possible silent hydronephrosis is adequate for most uncomplicated patients and carries minimal radiation exposure. In a study conducted by Ferrandino and colleagues, 108 patients received 4 radiographic examinations with a mean effective dose of 29.7 mSv (excluding fluoroscopy) during the year following stone presentation, and 20% of patients received >50 mSv [71]. The frequently young age at presentation and high recurrence rates in stone patients should encourage urologists to strongly consider alternative imaging strategies to decrease radiation exposure whenever possible. Table 20.3 provides a summary of the radiation exposure from various imaging modalities and environmental sources.


Table 20.3
Radiation dose comparison^a

Diagnostic procedure | Average effective dose (mSv) | PA chest X-rays (equivalent effective dose) | Ranges reported in the literature (mSv)
Chest X-ray (PA film) | 0.02 | 1 | 0.007–0.050
Chest X-ray (PA and lat) | 0.1 | 5 | 0.05–0.24
Mammography | 0.4 | 20 | 0.10–0.60
Lumbar spine X-ray | 1.5 | 75 | 0.5–1.8
Abdominal X-ray (KUB) | 0.7 | 35 | 0.04–1.1
Intravenous urogram | 3.0 | 150 | 0.7–3.7
Renal scan DTPA | 1.8 | 90 | n/a
Renal scan MAG3 | 2.6 | 130 | n/a
Renal scan DMSA | 3.3 | 165 | n/a
Bone scan | 6.3 | 315 | n/a
Upper G.I. (with fluoroscopy) | 6.0 | 300 | 1.5–12.0
Barium enema (with fluoroscopy) | 8.0 | 400 | 2.0–18.0
CT head | 2.0 | 100 | 0.9–4.0
CT chest | 7.0 | 350 | 4.0–18.0
CT chest (PE protocol) | 15.0 | 750 | 13.0–40.0
Ventilation/perfusion scan | 0.5 | 25 | n/a
CT abdomen | 10.0 | 500 | 3.5–25.0
CT pelvis | 10.0 | 500 | 3.3–10.0
Thallium cardiac stress test | 40.7 | 2,035 | n/a
PET scan | 14.1 | 705 | n/a
Chernobyl containment workers (mean) | 165 | 8,250 | n/a
Organ-specific radiation dose (kidneys) during 1 min of fluoroscopy | 1.8 mSv/min | n/a | n/a
Fluoroscopy exposure, 1 min | 10 mGy/min | n/a | 10–500 mGy


^a Actual exposures received may vary widely based upon patient and imaging parameters

Data from [26, 43, 44, 46, 123, 127–130]


Alternative Imaging Strategies for Ureteroscopy Patients


As discussed previously, there is no ideal imaging modality for the evaluation of ureteroscopy patients. The sensitivities of ultrasound, MRI, and KUB are too low, the acquisition time for MRI is too long, and the risks associated with contrast are too high to justify use of IVP. As a result, noncontrast CT has become the diagnostic test of choice for patients with flank pain and renal colic despite its high radiation exposure. There are several potential ways to reduce radiation exposure in ureteroscopy patients: avoiding medical imaging whenever possible, lengthening the interval between studies that use ionizing radiation, and reducing the radiation associated with conventional imaging studies.

In today’s litigious medical environment, physicians have become accustomed to routine CT imaging in a high proportion of patients presenting to the emergency room. Between 1996 and 2007, CT utilization increased by 330% despite only a 30% increase in ER visits. The utilization of CT for abdominal pain increased tenfold, and the use of CT for flank pain increased from 3.5% to >40% over the same period [72]. Physicians fearing a missed diagnosis must now also contend with the threat of lawsuits for malignancy resulting from unnecessary conventional CT imaging. It is imperative that physicians use sound medical judgment and consider the risks and benefits of each imaging modality prior to its use.

Ureteral stones 2–4 mm in size have a 95% spontaneous passage rate [73]. Therefore, in a known stone patient with signs and symptoms compatible with ureteral colic, adequate pain control and no infection, imaging is not mandatory. Furthermore, in patients with recent imaging it is not mandatory to repeat imaging with each subsequent presentation.

In patients presenting with flank pain, another option to reduce radiation exposure is to use imaging that does not employ ionizing radiation. Renal ultrasound has excellent sensitivity for the detection of renal stones and hydronephrosis and may be a reasonable alterative to CT in uninfected, minimally symptomatic patients. The combination of renal ultrasound and a KUB has been shown to increase the sensitivity and specificity as high as 97 and 67%, respectively, with a negative predictive value of 95% while providing minimal ionizing radiation [50].

Perhaps one of the most promising means to decrease radiation exposure in ureteroscopy patients is the use of reduced radiation CT imaging. A variety of studies have consistently shown that low-dose CT provides a sensitivity and specificity that is greater than 90% and similar to conventional CT [74]. Zilberman and colleagues added noise to a 160 mA CT simulating a 70, 100 and 130 mA CT. There was no difference in interobserver and intra-observer variability for stone detection or radiographic signs of obstruction between settings [75]. Similarly, Poletti and colleagues performed low-dose CT (30 mAs) and conventional CT (180 mAs) for 125 stone patients and found that when BMI was <30, the low-dose CT had a sensitivity of 95%, a specificity of 97%, and detected all ureteral calculi >3 mm [76]. Hamm and colleagues performed a study utilizing a low-dose 70 mA (1.5 mSv) CT scan in 109 patients and discovered that low-dose CT scan had a sensitivity and specificity of 96 and 97%, respectively [74]. In a cadaveric study, Jellison and colleagues compared mAs settings ranging from 7.5 to 140 and found no difference in sensitivity or specificity at any setting despite a 95% reduction in radiation (the radiation dose similar to a KUB). All stones >3 mm were detected [38]. Similarly, Jin and colleagues used a cadaver model to compare renal stone detection at 100, 60, and 30 mAs equivalent to a CT dose index volume of 6.7, 4.0, and 2.0 mGy, respectively. The blinded radiologists determined that there was similar detection for stones >3 mm despite a 70% reduction in radiation exposure [77].
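A common rule of thumb behind these studies is that, at fixed kVp, CT dose scales approximately linearly with the tube current-time product (mAs). The following sketch of that proportionality is an assumption on our part, not a statement from the chapter, but it reproduces the reductions the studies report.

```python
# Sketch of the proportionality implied by these studies: at fixed kVp,
# CT dose scales roughly linearly with tube current-time product (mAs).
# This linear assumption is ours; it is not stated in the chapter.

def dose_reduction_fraction(mas_low: float, mas_ref: float) -> float:
    """Fractional dose reduction when dropping from mas_ref to mas_low."""
    return 1.0 - mas_low / mas_ref

# Jellison et al.: 140 -> 7.5 mAs, roughly a 95% reduction:
print(round(dose_reduction_fraction(7.5, 140.0), 2))  # 0.95
# Jin et al.: 100 -> 30 mAs, a 70% reduction:
print(dose_reduction_fraction(30.0, 100.0))           # 0.7
```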
