

TOTAL VARIATION POISSON NOISE REMOVAL IN DIGITAL RADIOGRAPHY

M. Lucchese
2012

Abstract

Since digital X-ray images are the result of a measurement process, they are affected by noise. Linear filters remove the fluctuations of pixel values caused by noise by attenuating or suppressing the high-frequency content of the image, with the drawback that they also smooth edges and therefore corrupt the structural information they carry. A more principled approach, widely adopted for image denoising since the paper of Rudin et al., is based on regularization. In this framework the filtered image is obtained through the maximization of a properly defined cost function, which accounts both for the distance between the filtered and the measured image and for user-defined a-priori characteristics of the image data. Many authors have shown that a Total Variation (TV) a-priori term leads to an effective, edge-preserving denoising algorithm, generally termed TV regularization. From a Bayesian point of view, the filtered image maximizes the posterior probability, which is composed of a likelihood term, expressing the distance between the filtered and the measured image, and a prior probability, describing the desired characteristics of the solution in statistical form.

The TV filter, in the considered case, depends on the regularization parameter and on a parameter introduced to make the regularization term differentiable. The latter parameter reduces the sensitivity of the regularization term to image gradients of small magnitude, so it should be set to a small value. Optimality of the regularization parameter has been expressed in terms of the similarity between the filtered and the noise-free image. Through the similarity indices presented in this work, it has been observed that the optimal value of the regularization parameter is related to the features of the image: low-frequency and low-photon-count images (images with lower SNR) require a stronger regularization effect, so the optimal value of the regularization parameter is higher. However, for real images the noise-free image is not available, and the value of the regularization parameter has to be derived from the measured image and the filtered one. Several attempts in this direction have been proposed, especially for the case of Gaussian noise with a Tikhonov regularizer. These are based on the discrepancy principle, which states that the residual should have the same distribution as the noise on the image. We have adapted this principle to the case of Poisson noise with a TV regularizer, taking into account that for Poisson noise the noise variance depends on the signal level, and we have defined a criterion to automatically set the most adequate value of the regularization parameter. Filtering performance has been quantified as the similarity between the filtered and the noise-free image. This similarity is evaluated in terms of the distance between the images, using classical indices derived from signal processing, such as the sum of squared differences or the Peak Signal-to-Noise Ratio (PSNR), as well as more recently introduced indices based on the properties of the Human Visual System, such as Structural Similarity (SSIM) and Feature Similarity (FSIM).
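As a concrete illustration of the framework described above (a sketch in our own notation, not the exact formulation adopted in the thesis), the TV-regularized objective for Poisson noise can be written as

    \max_{u} \; L(u) = \sum_{i} \big( f_i \ln u_i - u_i \big) \;-\; \lambda \sum_{i} \sqrt{\lvert \nabla u_i \rvert^{2} + \beta^{2}}

where f is the measured image, u the filtered one, \lambda > 0 the regularization parameter and \beta > 0 the small parameter that makes the TV term differentiable. A natural Poisson counterpart of the discrepancy principle (one possible instantiation; the criterion actually derived in the thesis may differ) chooses \lambda so that the signal-normalized residual of the N pixels matches its expected value, exploiting the fact that for Poisson noise the variance equals the mean:

    \frac{1}{N} \sum_{i} \frac{\big( f_i - u_{\lambda,i} \big)^{2}}{u_{\lambda,i}} \;\approx\; 1 .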
For high-photon-count radiographs (high SNR), NMAE (Normalized Mean Absolute Error), PSNR, RMSE, and FSIM generally agree in identifying the optimal value of the regularization parameter, while for low photon counts (low SNR) FSIM prefers smoother images than NMAE, RMSE, and PSNR; in this context, it is not clear which of the indices provides the best image-quality evaluation. Finally, the SSIM index always selects over-smoothed images as optimal, and it does not appear to be a reliable quality index.

We also noticed that, during the filtering of images with low spatial-frequency content, the similarity between the filtered image and the noise-free one increases in the first iterations, reaches a maximum, and then starts to decrease, while the cost function keeps improving monotonically throughout. We have analysed this behaviour further and show that, in the later iterations, the optimization algorithm cuts the valleys and ridges of the image in high-photon-count regions, introducing spurious plateaus; as a result, the image is locally over-smoothed. This is explained by the fact that the likelihood term, derived from the Poisson distribution, allows large corrections in high-photon-count areas and only small corrections in low-photon-count areas. The weight of the likelihood term in the overall cost function is therefore signal dependent, and in high-photon-count areas the regularization term predominates.

Lastly, we have compared TV regularization with another family of denoising algorithms, based on wavelet decomposition and widely used in image processing: the Bayesian Least Squares - Gaussian Scale Mixture (BLS-GSM) proposed by Portilla et al. for the denoising of images corrupted by Gaussian noise. This algorithm has also been extended to the Poisson case by using an improved version of the inverse Anscombe transform. Results on real and simulated images show the superiority of BLS-GSM over Total Variation on low-frequency images and on the real ones. As higher frequencies are incorporated, the two methods show similar behaviour, and TV regularization performs better for very-high-frequency images, whose frequency content is closer to the assumptions underlying Total Variation.
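The Anscombe-based extension of a Gaussian denoiser such as BLS-GSM to Poisson data can be sketched as follows. This is an illustrative Python outline only: the function names and the gaussian_denoiser interface are ours, and the asymptotically unbiased algebraic inverse below merely approximates the improved (exact unbiased) inverse referred to above.

    import numpy as np

    def anscombe(x):
        # Forward Anscombe variance-stabilizing transform: maps Poisson-
        # distributed data to approximately unit-variance Gaussian noise.
        return 2.0 * np.sqrt(x + 3.0 / 8.0)

    def inverse_anscombe(y):
        # Asymptotically unbiased algebraic inverse; an exact unbiased
        # inverse further reduces the bias at low photon counts.
        return (y / 2.0) ** 2 - 1.0 / 8.0

    def denoise_poisson(counts, gaussian_denoiser):
        # Apply any Gaussian denoiser (e.g. BLS-GSM) to Poisson data:
        # stabilize the variance, denoise at unit noise std, then invert.
        stabilized = anscombe(counts.astype(np.float64))
        denoised = gaussian_denoiser(stabilized, sigma=1.0)
        return np.clip(inverse_anscombe(denoised), 0.0, None)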
6 Jul 2012
Sector INF/01 - Computer Science
Sector MAT/06 - Probability and Mathematical Statistics
X-Ray images ; Poisson noise ; Bayesian Filtering ; Total Variation ; Wavelet
BORGHESE, NUNZIO ALBERTO
FROSIO, IURI
NALDI, GIOVANNI
Doctoral Thesis
TOTAL VARIATION POISSON NOISE REMOVAL IN DIGITAL RADIOGRAPHY / M. Lucchese ; supervisors: N. A. Borghese, I. Frosio ; coordinator: G. Naldi. UNIVERSITA' DEGLI STUDI DI MILANO, 2012 Jul 06. 24th cycle, Academic Year 2011.
Files in this record:
phd_unimi_R08104.pdf - Complete doctoral thesis (Adobe PDF, 17.68 MB) - restricted access

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/203243