Patch similarity under non-Gaussian noise


The 18th IEEE International Conference on Image Processing (ICIP 2011), Brussels, Belgium, September 11-14, 2011

Patch similarity under non-Gaussian noise

Charles Deledalle (1), Florence Tupin (1), Loïc Denis (2)
(1) Institut Telecom, Telecom ParisTech, CNRS LTCI, Paris, France
(2) Observatoire de Lyon, CNRS CRAL, UCBL, ENS de Lyon, Université de Lyon, Lyon, France

Presented September 13, 2011 by C. Deledalle, F. Tupin, L. Denis (Telecom ParisTech)

Motivation

Increasing use of patches to model images: texture synthesis, inpainting, image editing, denoising, super-resolution, image registration, stereo vision, object tracking. The underlying image model is based on the natural redundancy of patches.

[Example images with non-Gaussian noise: (a) Microscopy (b) Astronomy (c) SAR polarimetry]

How to compare noisy patches? How to take the noise model into account?

Table of contents

1. Limits of the Euclidean distance
2. Variance stabilization approach
3. How to adapt properly to the noise distribution?
4. Evaluation of similarity criteria

Noise models

Gaussian noise assumption: a pair (x_1, x_2) of noisy patches can be decomposed as

x_1 = θ_1 + n_1 and x_2 = θ_2 + n_2,

where the noise patches n_1, n_2 do not depend on the noise-free patches θ_1, θ_2.

Beyond Gaussian noise: noise can be non-Gaussian, e.g., Poisson or gamma distributed. For Poisson noise there is no such additive, signal-independent decomposition: in any decomposition x_k = θ_k + n_k, the noise n_k depends on θ_k. The noise level is signal-dependent.
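A minimal sketch (Python with NumPy assumed; not part of the original slides) contrasting the two models: the Gaussian noise level is constant, while the Poisson standard deviation grows like sqrt(θ):

```python
# Under Gaussian noise the noise level is the same whatever the signal;
# under Poisson noise the standard deviation grows like sqrt(theta).
import numpy as np

rng = np.random.default_rng(0)
theta = np.array([2.0, 10.0, 50.0])        # noise-free intensities

sigma = 3.0                                # Gaussian: additive, signal-independent
x_gauss = theta + rng.normal(0.0, sigma, size=(100_000, 3))
print(x_gauss.std(axis=0))                 # ~[3.0, 3.0, 3.0]

x_pois = rng.poisson(theta, size=(100_000, 3))   # Poisson: signal-dependent
print(x_pois.std(axis=0))                  # ~[1.41, 3.16, 7.07] = sqrt(theta)
```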

Euclidean distance under noisy conditions

Why the Euclidean distance under Gaussian noise?
1. The Euclidean distance estimates the dissimilarity between noise-free patches:
   E[ ||x_1 - x_2||² ] = ||θ_1 - θ_2||² + 2 · PatchSize · σ².
2. When θ_1 = θ_2 = θ_12, the residue is statistically small and independent of θ_12: ||x_1 - x_2||² < τ.
3. When θ_1 ≠ θ_2, the residue is statistically higher: ||x_1 - x_2||² > τ.

Limits of the Euclidean distance

Limit of the Euclidean distance with Poisson noise:
1. The Euclidean distance does not estimate the dissimilarity between noise-free patches:
   E[ ||x_1 - x_2||² ] = ||θ_1 - θ_2||² + ||θ_1||_1 + ||θ_2||_1.
2. When θ_1 = θ_2 = θ_12, the residue cannot be controlled and depends on θ_12: ||x_1 - x_2||² ≶ τ?
3. Hence, when θ_1 ≠ θ_2, there is no guarantee that the residue is statistically higher: ||x_1 - x_2||² ≶ τ?
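A minimal Monte Carlo sketch (NumPy assumed; not from the slides) of the two expectations above: under Gaussian noise the bias of the squared distance is the fixed constant 2 · PatchSize · σ², while under Poisson noise with θ_1 = θ_2 = θ it equals ||θ_1||_1 + ||θ_2||_1 = 2 · PatchSize · θ and thus moves with the unknown signal, so no single threshold τ works:

```python
import numpy as np

rng = np.random.default_rng(0)
P, sigma, n = 64, 3.0, 50_000              # patch size, Gaussian std, trials

def mean_sqdist(draw):
    x1, x2 = draw(), draw()                # two independent noisy copies
    return ((x1 - x2) ** 2).sum(axis=1).mean()

for level in (5.0, 50.0):
    theta = np.full(P, level)              # same noise-free patch twice (H0)
    g = mean_sqdist(lambda: theta + rng.normal(0.0, sigma, (n, P)))
    p = mean_sqdist(lambda: rng.poisson(theta, (n, P)).astype(float))
    print(f"theta={level:5.1f}  Gaussian {g:7.1f} (2*P*sigma^2 = {2*P*sigma**2:.0f})"
          f"  Poisson {p:7.1f} (2*P*theta = {2*P*level:.0f})")
```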


Variance stabilization approach

Use a mapping s which stabilizes the variance for a specific noise model, then evaluate the Euclidean distance between the transformed patches:

   || s(x_1) - s(x_2) ||².    (1)

Example
Gamma noise (multiplicative) and the homomorphic approach:
   s(X) = log X,   Var[s(X)] = Var[log X] = Ψ(1, L),    (2)
where L is the shape parameter of the gamma distribution and Ψ(1, ·) is the trigamma function.
Poisson noise and the Anscombe transform:
   s(X) = 2 sqrt(X + 3/8),   Var[s(X)] ≈ 1 (as θ grows).    (3)
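A minimal numerical check (NumPy and SciPy assumed; not from the slides) of the two stabilizers in Eqs. (2)-(3): after transformation, the variance is nearly constant whatever the underlying intensity θ:

```python
import numpy as np
from scipy.special import polygamma        # polygamma(1, L) = trigamma Psi(1, L)

rng = np.random.default_rng(0)
L = 3.0                                    # gamma shape parameter (e.g. number of looks)
for theta in (5.0, 20.0, 100.0):
    x_p = rng.poisson(theta, 1_000_000)
    x_g = theta * rng.gamma(L, 1.0 / L, 1_000_000)   # gamma samples, mean theta
    print(f"theta={theta:6.1f}"
          f"  Var[2*sqrt(x+3/8)] = {np.var(2.0 * np.sqrt(x_p + 3.0 / 8.0)):.3f}"
          f"  Var[log x] = {np.var(np.log(x_g)):.3f}"
          f"  (Psi(1, L) = {float(polygamma(1, L)):.3f})")
```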

Variance stabilization approach

Why does it seem to work?
1. The Euclidean distance estimates the dissimilarity between transformed noise-free patches:
   E[ ||s(x_1) - s(x_2)||² ] = ||s(θ_1) - s(θ_2)||² + Constant.
2. When θ_1 = θ_2 = θ_12, the residue is statistically small: ||s(x_1) - s(x_2)||² < τ.
3. Hence, when θ_1 ≠ θ_2, the residue is statistically higher: ||s(x_1) - s(x_2)||² > τ.

Variance stabilization approach

Limits:
- Only heuristic, no optimality results,
- Does not take into account the statistics of the transformed data,
- Does not exist for all noise distribution models.

[Examples without a known stabilizing transform: (a) Image with impulse noise (b) SAR cross-correlation]


What are we looking for?

Definitions and properties
A similarity criterion can be based on a hypothesis test (i.e., a parameter test):
   H_0 : θ_1 = θ_2 = θ_12 (null hypothesis),
   H_1 : θ_1 ≠ θ_2 (alternative hypothesis).
Its performance can be measured by:
   P_FA = P(answer "dissimilar"; θ_12, H_0) (false-alarm rate),
   P_D = P(answer "dissimilar"; θ_1, θ_2, H_1) (detection rate).
The likelihood ratio (LR) test maximizes P_D at any given P_FA (Neyman-Pearson optimality):
   L(x_1, x_2) = p(x_1, x_2; θ_12, H_0) / p(x_1, x_2; θ_1, θ_2, H_1),
where the likelihoods are given by the noise distribution model.
Problem: θ_12, θ_1 and θ_2 are unknown.

Patch similarity criteria: generalized likelihood ratio

Generalized likelihood ratio (GLR)
Estimate θ_12, θ_1 and θ_2 by their maximum likelihood estimates (MLE), and define the (negative log) generalized likelihood ratio test:

   -log GLR(x_1, x_2) = -log [ sup_t p(x_1, x_2; θ_12 = t, H_0) / sup_{t_1, t_2} p(x_1, x_2; θ_1 = t_1, θ_2 = t_2, H_1) ]
                      = -log [ p(x_1; θ_1 = t̂_12) p(x_2; θ_2 = t̂_12) / ( p(x_1; θ_1 = t̂_1) p(x_2; θ_2 = t̂_2) ) ].

Maximal self-similarity: if x_1 ≠ x_2, then -log GLR(x_1, x_2) > 0, so no patch is more similar to x_1 than x_1 itself.
Equal self-similarity: if x_1 = x_2, then -log GLR(x_1, x_2) = 0, whatever the patch.
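The closed forms in the table below drop out of this definition. As a worked example (not on the original slide), plugging the Poisson MLEs t̂_1 = x_1, t̂_2 = x_2 and t̂_12 = (x_1 + x_2)/2 gives:

```latex
% Poisson pdf: p(x; \theta) = \theta^x e^{-\theta} / x!
% The x! terms cancel between numerator and denominator, and
% e^{-2 \hat t_{12}} = e^{-(x_1 + x_2)} cancels as well:
\[
  -\log \mathrm{GLR}(x_1, x_2)
    = -\log \frac{\hat t_{12}^{\,x_1 + x_2}\, e^{-2\hat t_{12}}}
                 {x_1^{x_1} e^{-x_1}\; x_2^{x_2} e^{-x_2}}
    = x_1 \log x_1 + x_2 \log x_2 - (x_1 + x_2)\log\frac{x_1 + x_2}{2},
\]
% per pixel, with the convention 0 \log 0 = 0; the patch dissimilarity
% sums this quantity over all pixels.
```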

Patch similarity criteria: generalized likelihood ratio

The three criteria for the three noise models (per pixel, additive constants omitted):

Gaussian: pdf p(x; θ) = exp(-(x - θ)²/(2σ²)) / sqrt(2πσ²);  -log GLR ∝ (x_1 - x_2)²;  stabilization: (x_1 - x_2)²;  Euclidean: (x_1 - x_2)².
Poisson:  pdf p(x; θ) = θ^x e^{-θ} / x!;  -log GLR = x_1 log x_1 + x_2 log x_2 - (x_1 + x_2) log((x_1 + x_2)/2);  stabilization (Anscombe): (sqrt(x_1 + 3/8) - sqrt(x_2 + 3/8))²;  Euclidean: (x_1 - x_2)².
Gamma:    pdf p(x; θ) = L^L x^{L-1} e^{-Lx/θ} / (Γ(L) θ^L);  -log GLR ∝ log((x_1 + x_2)² / (x_1 x_2));  stabilization (log): (log x_1 - log x_2)²;  Euclidean: (x_1 - x_2)².

Does it work? Illustration with gamma noise:
1. When θ_1 = θ_2 = θ_12, the residue is statistically small: -log GLR(x_1, x_2) < τ.
2. Hence, when θ_1 ≠ θ_2, the residue is statistically higher: -log GLR(x_1, x_2) > τ.
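A minimal sketch (NumPy assumed; the function names are ours, not the paper's) of the per-patch -log GLR criteria in the table, keeping the constants so that equal patches score exactly 0, as equal self-similarity requires:

```python
import numpy as np

def neglog_glr_gaussian(x1, x2, sigma):
    return ((x1 - x2) ** 2).sum() / (4.0 * sigma ** 2)

def neglog_glr_poisson(x1, x2):
    # integer-valued Poisson data assumed; convention 0*log(0) = 0
    xlogx = lambda x: np.where(x > 0, x * np.log(np.maximum(x, 1)), 0.0)
    s = x1 + x2
    return (xlogx(x1) + xlogx(x2) - xlogx(s) + s * np.log(2.0)).sum()

def neglog_glr_gamma(x1, x2, L):
    # (x1 + x2)^2 / (4 x1 x2) = 1 when x1 = x2, hence a zero residue
    return L * np.log((x1 + x2) ** 2 / (4.0 * x1 * x2)).sum()

x = np.array([3.0, 7.0, 1.0, 4.0])
print(neglog_glr_poisson(x, x))            # 0.0: equal self-similarity
print(neglog_glr_poisson(x, x[::-1]))      # > 0 for a different patch
```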


Detection: gamma noise

[ROC curves: probability of detection vs. probability of false alarm, comparing the generalized likelihood ratio, variance stabilization, Euclidean distance, and maximum joint likelihood [Alter et al., 2006] criteria.]

Detection: Poisson noise

[ROC curves: probability of detection vs. probability of false alarm, comparing the generalized likelihood ratio, variance stabilization, Euclidean distance, maximum joint likelihood [Alter et al., 2006], mutual information kernel [Seeger, 2002], Bayesian likelihood ratio [Minka, 1998; Minka, 2000], and Bayesian joint likelihood [Yianilos, 1995; Matsushita and Lin, 2007] criteria.]
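A minimal sketch (NumPy assumed; all names hypothetical) of how such ROC curves can be built: draw patch pairs under H_0 (same θ) and H_1 (different θ's), score them with a criterion, and sweep the decision threshold τ:

```python
import numpy as np

rng = np.random.default_rng(0)

def roc(criterion, n=20_000, P=16):
    th0 = rng.uniform(5.0, 50.0, (n, 1))             # H0: theta_1 = theta_2
    th1 = th0 * rng.uniform(1.5, 3.0, (n, 1))        # H1: theta_1 != theta_2
    s0 = criterion(rng.poisson(th0, (n, P)), rng.poisson(th0, (n, P)))
    s1 = criterion(rng.poisson(th0, (n, P)), rng.poisson(th1, (n, P)))
    taus = np.quantile(s0, np.linspace(0.0, 1.0, 101))
    p_fa = (s0[None, :] > taus[:, None]).mean(axis=1)  # answer "dissimilar"
    p_d = (s1[None, :] > taus[:, None]).mean(axis=1)
    return p_fa, p_d                                   # plot p_d against p_fa

euclid = lambda x1, x2: ((x1 - x2) ** 2).sum(axis=1)   # swap in a -log GLR score
p_fa, p_d = roc(euclid)
```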

Evaluation on denoising

Non-local filtering with GLR, on Poisson and gamma noise:
[(a) Noisy image (b) Euclidean distance (c) GLR]
NB: variance stabilization provides visual quality very close to GLR.
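A minimal sketch (NumPy assumed; not the authors' implementation) of non-local filtering where the usual Gaussian-noise kernel exp(-||·||²/h) is replaced by exp(-(-log GLR)/h), e.g. with the neglog_glr_poisson function sketched earlier; pure-Python loops, illustrative rather than fast:

```python
import numpy as np

def nl_filter_glr(img, neglog_glr, half_search=7, half_patch=3, h=1.0):
    H, W = img.shape
    pad = half_search + half_patch
    pim = np.pad(img, pad, mode="reflect")
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            ic, jc = i + pad, j + pad
            ref = pim[ic - half_patch:ic + half_patch + 1,
                      jc - half_patch:jc + half_patch + 1]
            acc = wsum = 0.0
            for di in range(-half_search, half_search + 1):
                for dj in range(-half_search, half_search + 1):
                    cand = pim[ic + di - half_patch:ic + di + half_patch + 1,
                               jc + dj - half_patch:jc + dj + half_patch + 1]
                    w = np.exp(-neglog_glr(ref, cand) / h)   # GLR similarity kernel
                    acc += w * pim[ic + di, jc + dj]
                    wsum += w
            out[i, j] = acc / wsum                           # weighted average
    return out
```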

Evaluation on denoising

Iterative non-local filtering with GLR:
[(a) SAR cross-correlation (b) Non-local filtering with GLR]
GLR can be used when the variance stabilization approach cannot be applied.

Stereo vision

[Gamma noise: (a) Noisy image (b) Euclidean distance (c) GLR. Poisson noise: (d) Ground truth (e) Euclidean distance (f) GLR.]
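A minimal sketch (NumPy assumed; a naive block-matching baseline, not the authors' stereo pipeline) of how a -log GLR cost slots into patch-based disparity estimation on a rectified pair, in place of the Euclidean cost:

```python
import numpy as np

def disparity_map(left, right, cost, max_disp=16, half_patch=3):
    H, W = left.shape
    p = half_patch
    Lp = np.pad(left, p, mode="reflect")
    Rp = np.pad(right, p, mode="reflect")
    disp = np.zeros((H, W), dtype=int)
    for i in range(H):
        for j in range(W):
            ic, jc = i + p, j + p
            ref = Lp[ic - p:ic + p + 1, jc - p:jc + p + 1]
            # keep the disparity whose right-image patch minimizes the cost
            costs = [cost(ref, Rp[ic - p:ic + p + 1, jc - d - p:jc - d + p + 1])
                     if j - d >= 0 else np.inf            # stay inside the image
                     for d in range(max_disp)]
            disp[i, j] = int(np.argmin(costs))
    return disp
```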

Motion tracking

Glacier monitoring with a stereo pair of SAR images:
[(g) Noisy image (h) Euclidean distance (i) GLR]
Glacier of Argentière. With GLR, the estimated speeds match the ground truth: an average of 1.7 cm/day over the surface and a maximum of 41.1 cm/day in the areas with crevasses.

Conclusion

GLR behaves well for comparing patches under non-Gaussian noise conditions,
It can be used when variance stabilization cannot be applied,
Under high levels of gamma and Poisson noise, it outperforms six other criteria: best probability of detection for any probability of false alarm.
We have shown the interest of GLR in patch-based denoising, patch-based stereo vision, and patch-based motion tracking.

Future work
Under high SNR the variance stabilization approach beats GLR. Why such a change of relative behavior? The Euclidean distance estimates the dissimilarity between noise-free patches, while GLR evaluates the equality of noise-free patches.
Extend this approach to derive criteria with contrast invariance (e.g., for stereo vision or flickering).

Questions?

deledalle@telecom-paristech.fr
http://perso.telecom-paristech.fr/~deledall/

References

[Alter et al., 2006] Alter, F., Matsushita, Y., and Tang, X. (2006). An intensity similarity measure in low-light conditions. Lecture Notes in Computer Science, 3954:267.
[Matsushita and Lin, 2007] Matsushita, Y. and Lin, S. (2007). A probabilistic intensity similarity measure based on noise distributions. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2007), pages 1-8.
[Minka, 1998] Minka, T. (1998). Bayesian inference, entropy, and the multinomial distribution. Technical report, CMU.
[Minka, 2000] Minka, T. (2000). Distance measures as prior probabilities. Technical report, CMU.
[Seeger, 2002] Seeger, M. (2002). Covariance kernels from Bayesian generative models. In Advances in Neural Information Processing Systems 14: Proceedings of the 2001 Conference, page 905. MIT Press.
[Yianilos, 1995] Yianilos, P. (1995). Metric learning via normal mixtures. Technical report, NEC Research Institute, Princeton, NJ.

Evaluation on denoising: numerical performance (medium noise levels, gamma noise)

[Table: PSNR values (higher is better) for 12 standard test images (barbara, boat, bridge, cameraman, couple, fingerprint, hill, house, lena, man, mandril, peppers), comparing the noisy input with the maximum joint likelihood (Q_G) [Alter et al., 2006], generalized likelihood ratio (L_G), variance stabilization (S), and Euclidean distance criteria.]

Evaluation on denoising: numerical performance (strong noise levels, gamma noise)

[Table: PSNR values (higher is better) for the same 12 test images and the same four criteria as above, under strong gamma noise.]

Evaluation on denoising: numerical performance (medium noise levels, Poisson noise)

[Table: PSNR values (higher is better) for the same 12 test images, comparing the noisy input with the Bayesian joint likelihood (Q_B) [Yianilos, 1995; Matsushita and Lin, 2007], maximum joint likelihood (Q_G) [Alter et al., 2006], Bayesian likelihood ratio (L_B) [Minka, 1998; Minka, 2000], generalized likelihood ratio (L_G), mutual information kernel (K_B) [Seeger, 2002], variance stabilization (S), and Euclidean distance criteria.]

Evaluation on denoising: numerical performance (strong noise levels, Poisson noise)

[Table: PSNR values (higher is better) for the same 12 test images and the same seven criteria as above, under strong Poisson noise.]

Properties of the different studied criteria

[Table: whether each criterion (Q_B, Q_G, L_B, L_G, K_B, S, Euclidean) satisfies maximal self-similarity, equal self-similarity, identity of indiscernibles, invariance, asymptotic CFAR, and asymptotic UMPI.]
Some properties hold only if the observations are statistically identifiable through their MLE or through their likelihood (such assumptions are frequently true); the properties of S hold only for an exact variance stabilizing transform s(·) (such an assumption is usually wrong). The proofs of all these properties are available in the Online Resource 1.

Instances of the seven criteria for Gaussian, gamma and Poisson noise (parameters σ and L are fixed and known), with pdfs:

Gaussian: p(x; θ) = exp(-(x - θ)²/(2σ²)) / sqrt(2πσ²),
Gamma: p(x; θ) = L^L x^{L-1} e^{-Lx/θ} / (Γ(L) θ^L),
Poisson: p(x; θ) = θ^x e^{-θ} / x!.

[Table: closed-form expressions of Q_B, Q_G, L_B, L_G, K_B and S for each model; for Poisson, for instance, the Bayesian criteria involve ratios of the form Γ*(x_1 + x_2) / (Γ*(x_1) Γ*(x_2)).]
All Bayesian criteria are obtained with Jeffreys priors (resp. 1/σ, L/θ, 1/θ). All constant terms which do not affect the detection performance are omitted. For clarity, we define Γ*(x) = Γ(x + 0.5) and the Anscombe constant a = 3/8.