
Boltzmann and Non-Boltzmann Sampling for Image Processing


Affiliations
1 Presidency College, Kamarajar Salai, Triplicane, Chennai-600 005, Tamil Nadu, India
2 Chennai Mathematical Institute (CMI), Siruseri, Kelambakkam, Chennai-603 103, Tamil Nadu, India
 

Objectives: We present two algorithms for image processing; the first is based on Boltzmann sampling and the second on entropic sampling.

Methods: These algorithms fall within the Bayesian framework, which has three components: 1. the Likelihood, a conditional density giving the probability of a noisy image given a clean image; 2. a Prior; and 3. the Posterior, a conditional density giving the probability of a clean image given a noisy image. The Likelihood models the degradation process; the Prior models what we consider a clean image and provides a means of incorporating whatever data we have about it; the Posterior combines the Prior and the Likelihood and yields an estimate of the clean counterpart of the given noisy image. The algorithm sets up a competition between: 1. the Likelihood, which tries to anchor the image to the given noisy image so that the features present are retained, perhaps including the noisy ones; and 2. the Prior, which tries to make the image smooth, even at the risk of eliminating genuine features along with the noise.
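For the simplest case of binary (±1) images, the competition described above can be sketched as Metropolis (Boltzmann) sampling of a posterior of Ising form, with one coupling for the Prior (smoothness between neighbouring pixels) and one for the Likelihood (anchoring to the observed noisy pixel). This is an illustrative sketch, not the authors' implementation; the function name, the periodic-boundary model, and the parameters `beta` (prior strength) and `h` (likelihood strength) are assumptions for the example.

```python
import numpy as np

def metropolis_restore(y, beta, h, sweeps, seed=0):
    """Boltzmann sampling of the posterior
        P(x | y)  ~  exp( beta * sum_<ij> x_i x_j  +  h * sum_i x_i y_i )
    for a binary (+/-1) image x given the noisy image y.
    beta weights the smoothing Prior; h anchors x to y (Likelihood).
    Illustrative sketch: periodic boundaries, single-pixel flips."""
    rng = np.random.default_rng(seed)
    n, m = y.shape
    x = y.copy()                       # start the chain from the noisy image
    for _ in range(sweeps):
        for _ in range(n * m):
            i, j = rng.integers(n), rng.integers(m)
            # sum over the four nearest neighbours (Prior term)
            nbr = (x[(i - 1) % n, j] + x[(i + 1) % n, j]
                   + x[i, (j - 1) % m] + x[i, (j + 1) % m])
            # energy change if pixel (i, j) is flipped
            dE = 2.0 * x[i, j] * (beta * nbr + h * y[i, j])
            # Metropolis acceptance rule
            if dE <= 0 or rng.random() < np.exp(-dE):
                x[i, j] = -x[i, j]
    return x
```

With a sufficiently strong `beta`, isolated noisy pixels are flipped toward their neighbours' value, while `h` keeps large-scale features tied to the observed image.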

Findings: A proper choice of the prior and the likelihood functions leads to good image processing. We also need good estimators of the clean image.

Application: The choice of estimators is fairly straightforward for image processing employing the Boltzmann algorithm. For the non-Boltzmann algorithm we need efficient estimators that make full use of the entropic ensemble generated.
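One way such an estimator can work: an entropic run visits energies roughly uniformly rather than with Boltzmann weight, so posterior (Boltzmann) averages must be recovered by reweighting each sampled state by exp(S(E) - E), where S(E) is the accumulated microcanonical entropy estimate. A minimal sketch, assuming the energies, observable values, and entropy estimates from the run are already collected into arrays (the function name and interface are hypothetical):

```python
import numpy as np

def reweighted_average(A, E, S_of_E, beta):
    """Estimate the Boltzmann average <A> at inverse temperature beta
    from samples drawn in an entropic (flat-histogram) ensemble.
    A, E, S_of_E: per-sample observable, energy, and entropy estimate S(E).
    Each sample was generated with weight ~ exp(-S(E)), so the
    reweighting factor back to the Boltzmann ensemble is exp(S(E) - beta*E)."""
    logw = S_of_E - beta * E
    logw -= logw.max()                 # subtract the max for numerical stability
    w = np.exp(logw)
    return np.sum(w * A) / np.sum(w)
```

A single entropic ensemble thus yields averages at any `beta` (any strength of anchoring to the noisy image), which is the sense in which the estimator must "make full use" of the ensemble.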


Keywords

Image Processing, Prior, Posterior, Boltzmann Sampling, Entropic Sampling, Bayesian.


Abstract Views: 240

PDF Views: 118





Authors

T. Pramananda Perumal
Presidency College, Kamarajar Salai, Triplicane, Chennai-600 005, Tamil Nadu, India
K. R. Srivaishnavi
Presidency College, Kamarajar Salai, Triplicane, Chennai-600 005, Tamil Nadu, India
D. L. Asha Rani
Presidency College, Kamarajar Salai, Triplicane, Chennai-600 005, Tamil Nadu, India
K. P. N. Murthy
Chennai Mathematical Institute (CMI), Siruseri, Kelambakkam, Chennai-603 103, Tamil Nadu, India
