A theoretical model for fast evaluation of position linearity and spatial resolution in gamma cameras based on monolithic scintillators / Galasso, Matteo; Fabbri, Andrea; Borrazzo, Cristian; Cencelli, Valentino Orsolini; Pani, Roberto. - In: IEEE TRANSACTIONS ON NUCLEAR SCIENCE. - ISSN 0018-9499. - STAMPA. - 63:3(2016), pp. 1386-1398. [10.1109/TNS.2016.2558101]
A theoretical model for fast evaluation of position linearity and spatial resolution in gamma cameras based on monolithic scintillators
Fabbri, Andrea; Borrazzo, Cristian; Pani, Roberto
2016
Abstract
In this work, we developed a model that predicts, in a few seconds, the response of a gamma camera based on a continuous scintillator in terms of linearity and spatial resolution over the whole field of view (FoV). This model will be useful during the design phase of a SPECT or PET detector to predict and optimize gamma camera performance by varying the parameter values of its components (scintillator, light guides, and photodetector). Starting from a model of the scintillation light distribution on the photodetector's sensitive surface, we carry out a theoretical analysis based on estimation theory to derive analytical expressions of bias and FWHM for four interaction-position estimation methods: the classical Center of Gravity method (Anger logic), an enhanced Center of Gravity method, a Mean Square Error fitting method, and the Maximum Likelihood Estimation method. Spatial resolution and depth-of-interaction (DOI) distribution effects are then evaluated by processing the biases and FWHMs at different DOIs. We compared the model against GEANT4 Monte Carlo simulations of four different detection systems. The model's spatial-resolution prediction errors, expressed as percentage RMSDs with respect to the simulated spatial resolution, are lower than 13.2% over the whole FoV for three of the estimation methods. Computing spatial resolutions over the whole FoV with the model is five orders of magnitude faster than an equivalent standard Monte Carlo simulation. © 2016 IEEE.
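The classical Center of Gravity method (Anger logic) mentioned in the abstract can be illustrated with a minimal sketch: the interaction position is estimated as the signal-weighted mean of the photodetector element positions. This is a generic illustration of the technique, not the authors' implementation; the array sizes and values are hypothetical.

```python
import numpy as np

def center_of_gravity(signals, x_pos, y_pos):
    """Anger-logic position estimate: the mean of the photodetector
    element coordinates, weighted by the light signal each element
    collects. Returns the (x, y) estimate in the same units as the
    coordinate arrays."""
    total = signals.sum()
    x = (signals * x_pos).sum() / total
    y = (signals * y_pos).sum() / total
    return x, y

# Toy 3x3 photodetector with a symmetric light spot peaked on the
# central element; coordinates are element indices for simplicity.
s = np.array([[1., 2., 1.],
              [2., 8., 2.],
              [1., 2., 1.]])
xs, ys = np.meshgrid(np.arange(3), np.arange(3), indexing="xy")
print(center_of_gravity(s, xs, ys))  # symmetric spot -> (1.0, 1.0)
```

Because the weighted mean pulls toward the detector center when the light spot is truncated at the edges, this estimator is biased near the FoV borders, which is why the paper also analyzes enhanced and fitting-based alternatives.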