Single Exposure Enhancement and Reconstruction. Some slides are from: J. Kosecka, Y. Chuang, A. Efros, C. B. Owen, W. Freeman

Transcription:

Single Exposure Enhancement and Reconstruction Some slides are from: J. Kosecka, Y. Chuang, A. Efros, C. B. Owen, W. Freeman 1

Reconstruction as an Inverse Problem (figure): original image f → distortion & sampling h, plus noise n → measurements g = h(f) + n → reconstruction algorithm → estimate f̂. 2

g = h(f) + n, f̂ ≈ h⁻¹(g − n). Typically: the distortion h is non-invertible or ill-posed; the noise n is unknown, and only its statistical properties can be learned. 3

Typical Degradation Sources: Low illumination; Optical distortions (geometric, blurring); Sensor distortion (quantization; sampling: spatial + temporal + spectral; sensor noise); Atmospheric attenuation (haze, turbulence, …) 4

Today's Topics: Spatial and spectral sampling → De-mosaicing; Contaminated noise → De-noising; Geometrical distortion → Geometrical rectification; Illumination → White balancing 5

Key point: Prior knowledge of Natural Images 6

The Image Prior (figure: the density P_f(f) plotted over image space). 7

Problem: P(image) is complicated to model. It is defined over a huge-dimensional space, sparsely sampled, and known to be non-Gaussian. From Mumford & Huang, 2000. 8

Spatial and Spectral Sampling Demosaicing CCD CMOS 9

Spatial/Spectral Sampling Possible Configurations light beam splitter 3 CCD 1 CCD 10

Color Filter Arrays (CFAs) Bayer pattern 11

Image Mosaicing 12

Inverse Problem: Image Demosaicing The CCD sensor in a digital camera acquires a single color component for each pixel. Problem: How to interpolate the missing components? 13

Demosaicing - Types of Solutions. Exploiting spatial coherence - non-adaptive: nearest-neighbor interpolation, bilinear interpolation, bicubic interpolation, etc.; adaptive: edge-sensing interpolation. Exploiting color coherence - non-adaptive; adaptive. 14

Nearest neighbor interpolation: Assumption: neighboring pixels are similar. Conclusion: take the value of a missing pixel from its nearest neighbor's value, assuming a piecewise constant function. Problematic in piecewise linear areas and near edges. 15

Linear interpolation 1D: G(k) = (G(k−1) + G(k+1))/2, assuming a piecewise linear function. Artifacts near edges (figure: original, input, linear interpolation). 16

Bilinear interpolation 2D: 17
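
Bilinear demosaicing of the green channel can be sketched as averaging each missing pixel's four sampled neighbors. The snippet below is a minimal NumPy illustration, assuming an RGGB Bayer layout in which green samples sit where (row + col) is odd; the wrap-around that `np.roll` introduces at the borders is ignored for brevity.

```python
import numpy as np

def bilinear_demosaic_green(bayer):
    """Bilinearly interpolate the green channel of an assumed RGGB mosaic.

    `bayer` is a 2-D array with one sampled value per pixel; green samples
    are assumed to lie where (row + col) is odd.
    """
    h, w = bayer.shape
    rows, cols = np.indices((h, w))
    green_mask = ((rows + cols) % 2 == 1).astype(float)  # 1 where G was sampled
    g = bayer * green_mask

    # Sum the 4-neighbours and count how many of them carry a green sample
    # (np.roll wraps at the borders; a real implementation would pad instead).
    total = np.zeros_like(g)
    count = np.zeros_like(g)
    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        total += np.roll(np.roll(g, dy, axis=0), dx, axis=1)
        count += np.roll(np.roll(green_mask, dy, axis=0), dx, axis=1)
    interp = np.divide(total, count, out=np.zeros_like(g), where=count > 0)

    # Keep measured greens, fill in the missing ones.
    return np.where(green_mask > 0, g, interp)
```

On a horizontal ramp the interpolated greens reproduce the ramp exactly in the interior, which is the piecewise-linear behavior the slide describes.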

Mosaic Image 18

Bilinear Interpolation 19

Bilinear Interpolation 20

Bilinear Interpolation 21

Adaptive Interpolation 2D (example 1): Assumption: neighboring pixels are similar if there is no edge. Δh = |G45 − G43|, Δv = |G34 − G54|. Then G44 = (G34 + G54)/2 if Δh >> Δv; G44 = (G43 + G45)/2 if Δv >> Δh; G44 = (G34 + G43 + G54 + G45)/4 otherwise. Problem: G44 does not exploit all available information. 22

Adaptive Interpolation 2D (example 2) (figure: the weighting profile φ(x)): w45 = φ(B46 − B44), w34 = φ(B24 − B44), w43 = φ(B42 − B44), w54 = φ(B64 − B44); G44 = (w45 G45 + w34 G34 + w43 G43 + w54 G54) / (w45 + w34 + w43 + w54). 23

Color Coherence: Diffuse model: R = K_R·L, G = K_G·L, B = K_B·L. In the log domain: r = log K_R + log L = k_R + l, g = k_G + l, b = k_B + l, i.e. (r, g, b)(x, y) = (k_R, k_G, k_B)(x, y) + l(x, y). Most edges live in the luminance l (common); edges in the chrominance K are uncommon. Assumption: color bands are highly correlated in high frequencies. 24

Color Coherence: Example. R = R_L + R_H (low- and high-frequency parts). Across a luminance edge (common), R and G change together, so R − G ≈ R_L − G_L stays smooth; across a chrominance edge (uncommon), R − G itself jumps (figure). 25

Outcomes: R_x(x) = G_x(x) = B_x(x), i.e. the channel derivatives agree (figure: R(x), G(x), and R(x) − G(x) ≈ Δrg). Spatial coherence: G(k) = (G(k−1) + G(k+1))/2. Color coherence: Δrg = average{R(x) − G(x)}; R(k) = G(k) + Δrg. Problem: assumes piecewise constant; fails near luminance and chrominance edges. Solution: assume piecewise linear + adaptive interpolation. 26

Exploiting color coherence for green interpolation: since R_x(x) = G_x(x), G̃_k = (G_{k−1} + G_{k+1})/2 + (2R_k − R_{k−1} − R_{k+1})/2. These estimates for G_k can be improved using an adaptive scheme. 27

Edge Sensing Interpolation using Color Coherence: G̃44 = (w45 G̃45 + w43 G̃43 + w34 G̃34 + w54 G̃54) / (w45 + w43 + w34 + w54), with edge-sensing weights computed from the local green and blue differences, e.g. w45 = φ(G̃45 − G̃43, B46 − B44). The red value is then recovered through the color differences: R̃44 = G̃44 + Σ_ij w_ij (R_ij − G̃_ij) / Σ_ij w_ij, with ij ∈ {33, 35, 53, 55}. 28

Bilinear 29

Adaptive + color coherence 30

Bilinear 31

Adaptive + color coherence 32

Bilinear vs. Adaptive 33

Image Denoising 34

Noise Models. Additive noise: g = f + n. Multiplicative noise: g = f + f·n = f(1 + n). General case: P(g | f). 35

Example Noise additive noise multiplicative noise 36

Examples of independent and identically distributed (i.i.d.) noise. Gaussian white noise: P(g | f) = (1/(√(2π) σ)) · exp(−(g − f)²/(2σ²)) (figure: the density for σ = 20). 37

Additive uniform noise on [a, b]: P(g | f) = 1/(b − a) for a ≤ g − f ≤ b, and 0 otherwise (figure: the density for b − a = 20). 38

Impulse noise (salt & pepper): P(g | f) = P_a for g = a; P_b for g = b; 1 − P_a − P_b for g = f. Note: this noise is not additive! 39
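
The three noise models above can be simulated in a few lines of NumPy. This is an illustrative sketch; the image, σ, [a, b], and the impulse probabilities P_a, P_b are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.full((64, 64), 128.0)                    # clean (constant) image

# Additive Gaussian noise: g = f + n, n ~ N(0, sigma^2)
g_gauss = f + rng.normal(0.0, 20.0, f.shape)

# Additive uniform noise on [a, b] = [-10, 10]
g_unif = f + rng.uniform(-10.0, 10.0, f.shape)

def salt_and_pepper(f, a=0.0, b=255.0, pa=0.05, pb=0.05, rng=rng):
    """Impulse noise: with prob. pa a pixel becomes a, with pb it becomes b,
    otherwise it keeps its value f - note this is not additive."""
    g = f.copy()
    u = rng.random(f.shape)
    g[u < pa] = a
    g[(u >= pa) & (u < pa + pb)] = b
    return g

g_sp = salt_and_pepper(f)
```

The salt & pepper output contains only the three values {a, f, b}, which is exactly what the slide's distribution P(g | f) says.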

Bayesian denoising. Assume the noise component is i.i.d. Gaussian: g = f + n where n ~ N(0, σ). A MAP estimate for the original f: f̂ = argmax_f P(f | g) = argmax_f P(g | f) P(f) / P(g). Using Bayes' rule this leads to: f̂ = argmin_f {(g − f)² + λ ψ(f)}, where ψ(f) is a penalty for improbable f. 40

First-try prior: similarity of neighboring pixels. ψ(f) = Σ_p Σ_{q∈N_p} W_s(p − q) (f_p − f_q)². W_s is a Gaussian profile giving less weight to distant pixels (figure: W_s and the quadratic penalty (f_p − f_q)²). 41

This leads to Gaussian smoothing filters: f̂_p = (1 − α) g_p + α · Σ_{q∈N_p} W_s(p − q) g_q / Σ_{q∈N_p} W_s(p − q). Reduces noise but blurs out edges. The parameter α depends on the noise variance. 42
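
The Gaussian smoothing rule can be written out directly. A minimal sketch, assuming clamped borders and plain Python loops for clarity rather than speed; σ_s, α, and the window radius are illustrative values.

```python
import numpy as np

def gaussian_smooth(g, sigma_s=1.0, alpha=0.8, radius=2):
    """f_p = (1 - alpha) * g_p + alpha * (W_s-weighted mean of neighbours)."""
    h, w = g.shape
    out = np.empty_like(g, dtype=float)
    offsets = [(dy, dx) for dy in range(-radius, radius + 1)
                        for dx in range(-radius, radius + 1)
                        if (dy, dx) != (0, 0)]
    for py in range(h):
        for px in range(w):
            num = den = 0.0
            for dy, dx in offsets:
                qy = min(max(py + dy, 0), h - 1)   # clamp at the border
                qx = min(max(px + dx, 0), w - 1)
                ws = np.exp(-(dy * dy + dx * dx) / (2 * sigma_s ** 2))
                num += ws * g[qy, qx]
                den += ws
            out[py, px] = (1 - alpha) * g[py, px] + alpha * num / den
    return out
```

A constant image passes through unchanged, and on a noisy image the weighted averaging reduces the variance, at the cost of blurred edges as the slide notes.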

Noisy Image 43

Filtered Image 44

The role of the prior term: Gaussian smoothing. f̂ = argmin_f Σ_p {(g_p − f_p)² + λ Σ_{q∈N_p} W_s(p − q) (f_p − f_q)²} (figure: data term, prior term, and their sum as a function of f_p for a step profile between p and q). 45

Example 2: Prior term with edge-sensitive similarity: ψ(f) = Σ_p Σ_{q∈N_p} W_s(p − q) log(1 + (f_p − f_q)²) (figure: the log(1 + (f_p − f_q)²) penalty saturates for large differences). 46

This leads to the bilateral filter (edge-preserving smoothing): f̂_p = (1 − α) g_p + α · Σ_{q∈N_p} W_S(p − q) W_R(g_p − g_q) g_q / Σ_{q∈N_p} W_S(p − q) W_R(g_p − g_q). W_S is a monotonically descending spatial weight; W_R is a monotonically descending radiometric weight. 47
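
The bilateral formula translates almost literally into code. A minimal sketch with Gaussian choices for W_S and W_R; σ_s, σ_r, α, and the window radius are illustrative parameters, and α = 1 gives the usual bilateral filter.

```python
import numpy as np

def bilateral(g, sigma_s=2.0, sigma_r=25.0, alpha=1.0, radius=3):
    """Bilateral filter: spatial weight W_S(p - q) times radiometric
    weight W_R(g_p - g_q), both Gaussian here."""
    h, w = g.shape
    out = np.empty_like(g, dtype=float)
    for py in range(h):
        for px in range(w):
            num = den = 0.0
            for qy in range(max(py - radius, 0), min(py + radius + 1, h)):
                for qx in range(max(px - radius, 0), min(px + radius + 1, w)):
                    if (qy, qx) == (py, px):
                        continue
                    ws = np.exp(-((py - qy) ** 2 + (px - qx) ** 2)
                                / (2 * sigma_s ** 2))
                    wr = np.exp(-((g[py, px] - g[qy, qx]) ** 2)
                                / (2 * sigma_r ** 2))
                    num += ws * wr * g[qy, qx]
                    den += ws * wr
            out[py, px] = (1 - alpha) * g[py, px] + alpha * num / den
    return out
```

On a step edge the radiometric weight kills contributions from the far side, so the edge survives while each flat region is still smoothed.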

Bilateral filter (figure). From P. Milanfar. 48

Gaussian smoothing: W_S = exp(−(p − q)²/(2σ_s²)). Smooths edges (figure: a step profile is blurred). Slide from Darya Frolova and Denis Simakov. 49

Bilateral filtering: W_S W_R = exp(−[(p − q)²/(2σ_s²) + (g_p − g_q)²/(2σ_R²)]). Preserves discontinuities (figure: the step profile survives). Slide from Darya Frolova and Denis Simakov. 50

Noisy Image 51

Filtered Image 52

Edge preserving smoothing Gaussian smoothing 53

Bilateral Filter - Example 54

55

56

The role of the prior term: robust. f̂ = argmin_f Σ_p {(g_p − f_p)² + Σ_{q∈N_p} ψ_robust(f_p − f_q)} (figure: a robust penalty ψ_robust(v) that saturates for large v). 57

The role of the prior term: robust. f̂ = argmin_f Σ_p {(g_p − f_p)² + Σ_{q∈N_p} ψ_robust(f_p − f_q)} (figure: data term, prior term, and their sum for a step profile; the minimum stays near the data value across an edge). 58

Example 3: Prior term. Images are homogeneous; images are self-similar (figure: similar patches recur within the same image). 59

Example 3: Prior term. ψ(f) = Σ_p Σ_{q∈N_p} W_N(N_p − N_q) (f_p − f_q)², where N_p (and N_q) denotes the local neighborhood of p (and q). 60

Neighborhood weights: examples 61

This leads to the non-local means filter: f̂_p = (1 − α) g_p + α · Σ_q W(N_p − N_q) g_q / Σ_q W(N_p − N_q), where W is a monotonically descending function, e.g. W(N_p − N_q) = exp(−‖N_p − N_q‖²/σ_N²). 62
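
The non-local means rule can be sketched as follows, assuming small patch and search-window radii and an illustrative σ_N; a practical implementation would restrict and vectorize the search, but the weighting is exactly the exponential of the patch distance from the slide.

```python
import numpy as np

def nl_means(g, patch=1, search=3, sigma_n=10.0):
    """Non-local means sketch: each pixel becomes an average of pixels
    whose surrounding patches look alike, W = exp(-||Np - Nq||^2 / sigma^2)."""
    h, w = g.shape
    pad = patch
    gp = np.pad(g, pad, mode='reflect')     # so every pixel has a full patch
    out = np.empty_like(g, dtype=float)
    for py in range(h):
        for px in range(w):
            Np = gp[py:py + 2 * pad + 1, px:px + 2 * pad + 1]
            num = den = 0.0
            for qy in range(max(py - search, 0), min(py + search + 1, h)):
                for qx in range(max(px - search, 0), min(px + search + 1, w)):
                    Nq = gp[qy:qy + 2 * pad + 1, qx:qx + 2 * pad + 1]
                    wgt = np.exp(-np.sum((Np - Nq) ** 2) / sigma_n ** 2)
                    num += wgt * g[qy, qx]
                    den += wgt
            out[py, px] = num / den
    return out
```

Because similar patches get weight near 1 and dissimilar ones near 0, repeated structure is averaged together while distinct structure is left alone.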

Patch-based Denoising (NL-means) (figure: the denoised value is a linear combination, with weights w_1, w_2, …, w_N, of noisy patches v(N_i), v(N_j)). 63

Non-Local means - Example 64

Non-Local means - Example 65

White Balancing Also known as Illumination Estimation Color constancy Color correction 66

Experiment 1: Yellow illumination From David Brainard 67

Experiment 2: Blue illumination From David Brainard 68

Experiment 3: Blue illumination From David Brainard 69

Yellow illumination Blue illumination Blue illumination 70

Conclusions: The eye cares more about objects' intrinsic color than about the color of the light leaving the objects. The HVS applies color adaptation. 71

Color under different illuminations from S. Alpert & D. Simakov 72

Experiment 2: which one looks more natural? 73

White Balance Problem. The eye cares more about objects' intrinsic color than about the color of the light leaving the objects; the HVS applies color adaptation. When watching a picture on screen or in print, we adapt to the illuminant of the room, not to that of the scene in the picture. We need to discount the color of the light source: white balancing. 74

White Balance & Film. Different types of film for fluorescent, tungsten, and daylight - one needs to change film! Electronic & digital imaging are more flexible. 75

Image formation. Image color: E_k = ∫ e(λ) s_k(λ) ρ(λ) dλ, k = R, G, B, where e(λ) is the illumination, ρ(λ) the object reflectance, and s_k(λ) the sensor response. From S. Alpert & D. Simakov. 76
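
The integral E_k = ∫ e(λ) s_k(λ) ρ(λ) dλ can be evaluated numerically once the spectra are sampled. The curves below are made-up illustrative spectra, not measured data: a flat illuminant, a reflectance rising toward red, and crude Gaussian sensor responses.

```python
import numpy as np

lam = np.linspace(400, 700, 31)                # wavelengths in nm
dlam = lam[1] - lam[0]
e = np.ones_like(lam)                          # flat ("white") illuminant
rho = np.clip((lam - 400) / 300, 0, 1)         # reflectance rising toward red
s = {                                          # hypothetical sensor responses
    'R': np.exp(-((lam - 610) / 40) ** 2),
    'G': np.exp(-((lam - 540) / 40) ** 2),
    'B': np.exp(-((lam - 460) / 40) ** 2),
}
# Riemann-sum approximation of E_k = ∫ e(λ) s_k(λ) ρ(λ) dλ
E = {k: float(np.sum(e * s[k] * rho) * dlam) for k in 'RGB'}
```

For this reddish reflectance the red response dominates, E_R > E_G > E_B, which is the qualitative behavior the formula predicts.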

White Balancing. If the spectrum of the light source changes, the spectrum of the reflected light changes. The goal: evaluate the surface color as if it were illuminated with white (canonical) light. From S. Alpert & D. Simakov. 77

Von Kries adaptation: multiply each channel by a gain factor: (R, G, B) → (R/E_r^W, G/E_g^W, B/E_b^W), where (E_r^W, E_g^W, E_b^W) is the recorded color of white under the current illuminant. Note that the light source could have a more complex effect: an arbitrary 3×3 matrix, or a more complex spectrum transformation. The diagonal model can be justified if the sensor spectral sensitivity functions are narrow. 78

Manual white balancing - white spot. Take a picture of a neutral object (white or gray) and deduce the weight of each channel: if the object is recorded as (r_w, g_w, b_w), use weights (1/r_w, 1/g_w, 1/b_w). 79

Manual white balancing light selection Select type of light from a finite set. Correct accordingly. 80

Automatic white balancing - gray world assumption: the average color in the image is gray. Use weights (W_R, W_G, W_B) = (1/R̄_image, 1/Ḡ_image, 1/B̄_image). Note that this also sets the exposure/brightness; usually an 18% gray average is assumed. 81
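
The gray-world weights are one line of NumPy. A minimal sketch, taking the slide's literal gains 1/channel-mean (which, as noted, also rescales the overall brightness); `img` is an H × W × 3 float array with no all-zero channel.

```python
import numpy as np

def gray_world(img):
    """Gray-world white balance: divide each channel by its image mean,
    i.e. gains (1/R_mean, 1/G_mean, 1/B_mean) as on the slide."""
    means = img.reshape(-1, 3).mean(axis=0)   # (R̄, Ḡ, B̄)
    return img / means                        # broadcasts per channel
```

After balancing, the per-channel means are all equal (to 1 here), i.e. the average color is gray.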

Automatic white balancing Brightest pixel assumption Highlights usually have the color of the light source Apply white balancing by using the highlight pixels Problem: Highlight pixels are contaminated with diffuse components 82

How can we detect highlight pixels? Following G. J. Klinker, S. A. Shafer and T. Kanade. A Physical Approach to Color Image Understanding. International Journal of Computer Vision, 1990. Two reflectance components total = diffuse + specular = + from S. Alpert & D. Simakov 83

Reminder - Image Formation (figure: geometry with normal N, light direction L, reflection direction R, viewing direction V, angle θ). Diffuse reflection: I_diff = K(λ) e_p(λ) (N·L). Specular reflection: I_spec = K_s(λ) e_p(λ) (R·V)^n. Here e_p is the point-light intensity; K, K_s ∈ [0,1] are the surface diffuse/specular reflectivities; N is the surface normal, L the light direction, V the viewing direction. 84

Diffuse object in RGB space Linear cluster in color space from S. Alpert & D. Simakov 85

Specular object in RGB space Linear cluster in the direction of the illuminant color from S. Alpert & D. Simakov 86

Combined reflectance in RGB space Skewed T from S. Alpert & D. Simakov 87

Combined reflectance of several objects Several T-clusters Specular lines are parallel (why?) from S. Alpert & D. Simakov 88

Step I Group objects by region grouping Group together diffuse and specular image parts of the same object Grow regions in image domain so that to form clusters in color domain from S. Alpert & D. Simakov 89

Step II: Decompose into diffuse + specular. Coordinate transform in color space: (matte, specular, noise)ᵀ = [C_matte, C_spec, C_matte × C_spec]⁻¹ (R, G, B)ᵀ, i.e. project each color onto the matte direction C_matte, the specular direction C_spec, and their cross product (the noise direction). From S. Alpert & D. Simakov. 90

Decompose into matte + specular (2): applying [C_matte, C_spec, C_matte × C_spec]⁻¹ separates the image into its matte and specular parts in RGB space (figure). From S. Alpert & D. Simakov. 91

Decompose into matte + specular (3) (figure: the image equals its C_matte component plus its C_spec component). From S. Alpert & D. Simakov. 92

Final results: Reflectance Decomposition input image matte component specular component = + G. J. Klinker, S. A. Shafer and T. Kanade. A Physical Approach to Color Image Understanding. International Journal of Computer Vision, 1990. from S. Alpert & D. Simakov 93

Geometric Distortion 94

Geometric Distortion Correction (figure: no distortion, pincushion, barrel). Radial distortion of the image is caused by imperfect lenses; deviations are most noticeable for rays that pass through the edge of the lens. 95

General Geometric Transformation. Operations depend on the pixel's coordinates and are independent of pixel values: x' = f_x(x, y), y' = f_y(x, y), so I(x, y) = I'(f_x(x, y), f_y(x, y)), mapping (x, y) → (x', y') and I(x, y) → I'(x', y'). 96

Radial Distortions (figure: no distortion, barrel distortion, pincushion distortion). Distortion model (inverse mapping): r_u = r_d + a·r_d² + b·r_d³, where r_u is the undistorted radius and r_d the distorted radius. 97
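
Given the coefficients a, b, the model maps distorted points to undistorted ones by rescaling each point's radius about the distortion center. A minimal sketch for point coordinates (resampling a full image would additionally need interpolation); the coefficient values in the test are hypothetical.

```python
import numpy as np

def undistort_points(pts, center, a, b):
    """Map distorted points to undistorted positions with the slide's
    model r_u = r_d + a*r_d**2 + b*r_d**3.

    pts: (N, 2) array of distorted coordinates; center: (2,) array.
    """
    v = pts - center
    r_d = np.linalg.norm(v, axis=1, keepdims=True)    # distorted radii
    r_u = r_d + a * r_d ** 2 + b * r_d ** 3           # undistorted radii
    # Rescale each point radially; leave points at the center untouched.
    scale = np.divide(r_u, r_d, out=np.ones_like(r_d), where=r_d > 0)
    return center + v * scale
```

With a = b = 0 this is the identity, and a positive a pushes points outward, i.e. it undoes a barrel-type compression of the periphery.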

Geometric Rectification using Calibration target 98

Parameter Estimation. Each calibration point gives r_u − r_d = a·r_d² + b·r_d³; stacking the points yields M p = b with rows (r_d,i², r_d,i³), p = (a, b)ᵀ, and b_i = r_u,i − r_d,i. The minimum least-squared-error solution is p̂ = (MᵀM)⁻¹ Mᵀ b. 99
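
The least-squares estimate of (a, b) is a two-column linear system. A sketch on synthetic calibration data with made-up true coefficients; `np.linalg.lstsq` computes the same p̂ = (MᵀM)⁻¹ Mᵀ b solution numerically.

```python
import numpy as np

# Synthetic calibration radii with hypothetical true coefficients
a_true, b_true = 0.05, -0.002
r_d = np.linspace(0.1, 2.0, 20)                   # distorted radii
r_u = r_d + a_true * r_d ** 2 + b_true * r_d ** 3  # matching undistorted radii

# Stack r_u - r_d = a*r_d^2 + b*r_d^3 into M p = b and solve for p = (a, b)
M = np.column_stack([r_d ** 2, r_d ** 3])
rhs = r_u - r_d
p_hat, *_ = np.linalg.lstsq(M, rhs, rcond=None)   # = (M^T M)^-1 M^T b
```

On noise-free data the true coefficients are recovered exactly; with real calibration-target measurements the same solve gives the least-squares fit.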

Correcting radial distortion from Helmut Dersch 100

Correcting Radial Distortions Before After http://www.grasshopperonline.com/barrel_distortion_correction_software.html 101

Single Image Enhancement - Summary. Demosaicing: sensor correction. Denoising: sensor/acquisition correction. White balance: color correction. Barrel/pincushion distortion: geometric (lens) correction. 102

THE END 103