Radiation dose optimization refers to the balance between image quality (and hence diagnostic efficacy) and radiation dose. Dose reduction cannot be treated as the sole endpoint of optimization: many of the technical parameters adjusted to lower dose also degrade image quality. Predicting these effects so that dose can be optimized before imaging is the focus of a number of current research studies.

Physicists, radiologists and technologists need to work together to optimize dose, ensuring that image quality is maintained and each image remains adequate for its diagnostic task. When optimizing radiation dose, many sites track quality by obtaining real-time feedback from their radiologists and technologists, because having to repeat an image on account of low quality only adds to the radiation burden.
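As a rough sketch of how such feedback tracking could quantify the cost of repeats, the snippet below computes a repeat rate and the extra dose attributable to repeated acquisitions. The `Exam` record, the `repeat_burden` function, and the dose values are illustrative assumptions, not any site's actual schema or workflow.

```python
from dataclasses import dataclass

@dataclass
class Exam:
    """One acquisition; field names are illustrative, not a real schema."""
    dose_mgy: float   # e.g. CTDIvol for a CT acquisition
    quality_ok: bool  # radiologist/technologist feedback
    is_repeat: bool   # re-acquisition after a low-quality image

def repeat_burden(exams):
    """Repeat rate and the extra dose attributable to repeated imaging."""
    repeats = [e for e in exams if e.is_repeat]
    rate = len(repeats) / len(exams)
    extra_dose = sum(e.dose_mgy for e in repeats)
    return rate, extra_dose

# Hypothetical feedback log: one low-quality exam forced a repeat.
log = [Exam(50, True, False), Exam(48, False, False), Exam(49, True, True)]
rate, extra = repeat_burden(log)
print(f"repeat rate {rate:.0%}, extra dose {extra} mGy")
# -> repeat rate 33%, extra dose 49 mGy
```

Every repeated acquisition shows up here as pure added dose with no new diagnostic information, which is why quality feedback belongs in the optimization loop rather than after it.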

The dose benchmarks we strive for, set per piece of equipment and per protocol, are described by a Gaussian (normal) distribution. Images at the low end of the dose distribution (the lower tail) may raise image quality concerns, and these outliers need to be investigated to confirm that they still served their diagnostic purpose. Conversely, images at the high end of the distribution may be of very high quality and contain little noise, but some of that quality is likely unnecessary: previous research has shown that noisier images can be just as diagnostic as less noisy ones.
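As a minimal sketch of such a tail audit, the snippet below flags studies in either tail of one protocol's dose distribution using a mean ± n·SD rule. The function name `flag_dose_outliers`, the 2 SD threshold, and the CTDIvol values are illustrative assumptions, not a published standard.

```python
from statistics import mean, stdev

def flag_dose_outliers(doses_mgy, n_sd=2.0):
    """Flag doses in the low and high tails of a protocol's dose distribution.

    doses_mgy: per-study dose values (e.g. CTDIvol in mGy) for one protocol
    on one scanner; n_sd: how many standard deviations from the mean count
    as an outlier (2.0 is an illustrative choice).
    """
    mu = mean(doses_mgy)
    sigma = stdev(doses_mgy)
    low, high = mu - n_sd * sigma, mu + n_sd * sigma
    # Low-tail studies: check that image quality was still diagnostic.
    # High-tail studies: check whether the extra dose bought quality
    # that the diagnostic task did not actually require.
    return {
        "low_tail": [d for d in doses_mgy if d < low],
        "high_tail": [d for d in doses_mgy if d > high],
        "limits_mgy": (low, high),
    }

# Hypothetical CTDIvol values (mGy) for one routine protocol.
audit = flag_dose_outliers([49, 51, 50, 50, 48, 52, 50, 50, 30, 70])
print(audit["low_tail"], audit["high_tail"])  # -> [30] [70]
```

In practice a site might prefer percentile-based limits or published diagnostic reference levels to a mean ± SD rule, since extreme outliers inflate the standard deviation that defines the limits; either way, the flagged studies in both tails are what get reviewed by the physicist-radiologist-technologist team.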