Development of a method for calculating detection efficiency maps for quantitative image reconstruction of a Compton camera

Nagao, Yuto; Yamaguchi, Mitsutaka; Kawachi, Naoki; Fujimaki, Shu; Kamiya, Tomihiro; Takeda, Shinichiro*; Watanabe, Shin*; Takahashi, Tadayuki*; Torikai, Kota*; Arakawa, Kazuo*; Nakano, Takashi*

We have been studying the application of Compton cameras to medicine and biology. A quantitative image reconstruction method for Compton cameras has been investigated to quantitatively analyze the physiological functions of target subjects, since it is essential to estimate the quantitative distribution of radioactive tracers within the camera's field of view (FOV). Detection efficiency maps play an important role in quantitative image reconstruction with a statistical reconstruction algorithm. In particular, the efficiency varies significantly across the near-field region of the camera, which is the main FOV in medical and biological applications. A method to calculate the efficiency maps was developed, taking geometrical and physical conditions into account. A Monte Carlo simulation was carried out to test the validity of the method: point sources of 511 keV photons were placed in a plane at a distance of 150 mm from the scattering detector of the CdTe Compton camera. The calculated efficiency map agreed well with the Monte Carlo result. An imaging experiment with a line-shaped $$^{22}$$Na source, placed in the same plane as in the simulation, was then performed using the efficiency map. The line shape was well reconstructed by list-mode ML-EM, although the reconstructed activity distribution was not sufficiently uniform.
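The role the efficiency map plays in the reconstruction can be illustrated with a minimal sketch of the list-mode ML-EM update. This is a generic textbook form of the algorithm, not the authors' specific implementation: the system-matrix rows `t`, the efficiency (sensitivity) map `s`, and the voxelization are all placeholders, and the per-voxel division by `s[j]` is where a spatially varying detection efficiency, such as the near-field map discussed above, enters the estimate.

```python
import numpy as np

def listmode_mlem(t, s, n_iter=20):
    """Generic list-mode ML-EM with a detection-efficiency (sensitivity) map.

    t : (n_events, n_voxels) array; t[i, j] is the probability that an
        emission from voxel j produces the recorded event i.
    s : (n_voxels,) detection efficiency map; s[j] is the probability that
        an emission from voxel j is detected at all. Dividing by s corrects
        the spatially varying efficiency across the FOV.
    """
    lam = np.ones(t.shape[1])               # uniform initial activity estimate
    for _ in range(n_iter):
        proj = t @ lam                      # expected intensity for each event
        back = t.T @ (1.0 / proj)           # backproject each event's weight
        lam *= back / s                     # efficiency-corrected ML-EM update
    return lam

# Hypothetical toy case: one voxel, two recorded events, efficiency 1.0.
# ML-EM should converge to (number of events) / (efficiency) = 2.0.
t_toy = np.array([[0.5], [0.5]])
s_toy = np.array([1.0])
estimate = listmode_mlem(t_toy, s_toy)
```

In the toy case the estimate converges to 2.0, i.e. the event count divided by the detection efficiency, which is exactly the normalization the efficiency map provides in the full reconstruction.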
