Screen-space processing Further Graphics


Screen-space processing Rafał Mantiuk Computer Laboratory, University of Cambridge

Cornell Box and tone-mapping [figures: rendering vs. photograph]

Real-world scenes are more challenging
- The match could not be achieved if the light source at the top of the box was visible
- The display could not reproduce the right level of brightness

Digression: Linear filtering
- The output pixel value is a weighted sum of neighbouring pixels:
  g(i, j) = Σ_k Σ_l f(i+k, j+l) h(k, l)
  where f is the input pixel value, h the kernel (filter) and g the resulting pixel value
- The sum runs over the neighbouring pixels, e.g. k = -1, 0, 1 and l = -1, 0, 1 for a 3x3 neighbourhood
- Compact notation: g = f * h (the convolution operation)
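The weighted-sum definition above can be sketched directly in Python. This is a naive "valid" implementation for illustration only (function and variable names are mine), not the fast path a real renderer would use:

```python
import numpy as np

def convolve2d(f, h):
    """Naive 'valid' convolution: each output pixel is a weighted sum
    of the neighbouring input pixels under the kernel h.

    f: input image (H, W); h: kernel (kh, kw) with odd dimensions.
    The output is smaller than f because border pixels lack a full
    neighbourhood (no padding is applied here).
    """
    kh, kw = h.shape
    H, W = f.shape
    hf = h[::-1, ::-1]                      # convolution flips the kernel
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(f[i:i+kh, j:j+kw] * hf)
    return out

# Example: 3x3 box blur of a 4x4 image gives a 2x2 result
f = np.arange(16, dtype=float).reshape(4, 4)
h = np.ones((3, 3)) / 9
g = convolve2d(f, h)
```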

Linear filter: example. Why is the matrix g smaller than f? Without padding, border pixels lack a full neighbourhood, so the output shrinks by the kernel radius on each side.

Padding an image [figures: padded image; padded and blurred image]

What is the computational cost of the convolution?
- How many multiplications do we need to convolve a 10x10 image with a 9x9 kernel?
- The image is padded, but we do not compute the values for the padded pixels

Separable kernels
- The convolution operation can be made much faster if split into two separate steps:
  - 1) convolve all rows of the image with a 1D filter
  - 2) convolve the columns of the result of 1) with another 1D filter
- But to do this, the kernel must be separable:
  h = [h_11 h_12 h_13; h_21 h_22 h_23; h_31 h_32 h_33] = u v^T
  where u = [u_1; u_2; u_3] is a column vector and v = [v_1; v_2; v_3], i.e. h_ij = u_i v_j

Examples of separable filters
- Box filter: (1/9)[1 1 1; 1 1 1; 1 1 1] = (1/3)[1; 1; 1] x (1/3)[1 1 1]
- Gaussian filter: G(x, y) = u(x) v(y)
- What are the corresponding 1D components of this separable filter (u(x) and v(y))?
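The two-pass scheme can be checked numerically. For the earlier 10x10 image with a 9x9 kernel, the direct cost is 100 x 81 = 8100 multiplications, while a separable kernel needs only 100 x (9 + 9) = 1800. A small sketch (illustrative names; symmetric kernels, so kernel flipping does not matter) verifying that two 1D passes reproduce the 2D convolution:

```python
import numpy as np

def conv2d_valid(f, h):
    """Direct 'valid' 2D convolution, for reference."""
    kh, kw = h.shape
    hf = h[::-1, ::-1]
    out = np.zeros((f.shape[0] - kh + 1, f.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(f[i:i+kh, j:j+kw] * hf)
    return out

rng = np.random.default_rng(0)
f = rng.random((32, 32))

u = np.array([1.0, 2.0, 1.0]) / 4    # vertical 1D component
v = np.array([1.0, 2.0, 1.0]) / 4    # horizontal 1D component
h = np.outer(u, v)                   # separable 2D kernel: h = u v^T

# Pass 1: convolve every row with v; Pass 2: convolve the columns with u
rows = np.array([np.convolve(r, v, mode='valid') for r in f])
g_sep = np.array([np.convolve(c, u, mode='valid') for c in rows.T]).T

g_direct = conv2d_valid(f, h)
assert np.allclose(g_sep, g_direct)  # both orders give the same image
```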

Unsharp masking
- How can blurring be used to sharpen an image?
[figures: original image, blurry image, high-pass image, result]
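The slide's recipe, blur the image, subtract to obtain a high-pass image, then add that back, can be sketched as follows (a minimal illustration with made-up function names; real implementations would use a Gaussian rather than a box blur):

```python
import numpy as np

def box_blur(img, k=5):
    """Simple box blur, 'same' output size via edge padding."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i+k, j:j+k].mean()
    return out

def unsharp_mask(img, amount=1.0, k=5):
    """Sharpen by adding the high-pass (original - blurred) back in."""
    blurry = box_blur(img, k)
    high_pass = img - blurry
    return img + amount * high_pass
```

On a step edge this amplifies local contrast (and over/undershoots past the original range, which is the characteristic halo of unsharp masking); a constant image is left unchanged, since its high-pass component is zero.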

Why linear filters?
- Linear functions have two properties (where f is a linear function):
  - Additivity: f(x) + f(y) = f(x + y)
  - Homogeneity: f(ax) = a f(x)
- Why is this important?
  - Linear operations can be performed in an arbitrary order, e.g. blur(aF + b) = a blur(F) + b
  - Performance can be improved by changing the order of operations
- This is also how separable filters work:
  h * f = (u v) * f = u * (v * f)
  where * denotes convolution, u v is the matrix product of the components of a separable kernel, and f is an image

Dynamic range = L_max / L_min, where L_max and L_min are the maximum and minimum luminance (for SNR > 3)

Measures of dynamic range (contrast)
- As a ratio: C = L_max / L_min, usually written C:1, for example 1000:1
- As orders of magnitude or log10 units: C_10 = log10(L_max / L_min)
- As stops: C_2 = log2(L_max / L_min); one stop is a doubling or halving of the amount of light
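The three measures are just different encodings of the same ratio. A small helper (illustrative) converts between them:

```python
import math

def dynamic_range(L_max, L_min):
    """Express a luminance range as a ratio, log10 units and stops."""
    C = L_max / L_min
    return {
        "ratio": C,              # written C:1
        "log10": math.log10(C),  # orders of magnitude
        "stops": math.log2(C),   # each stop doubles/halves the light
    }

# e.g. a 1000:1 display spans 3 orders of magnitude, or about 10 stops
dr = dynamic_range(1000.0, 1.0)
```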

High dynamic range (HDR) [figure: luminance scale from 10^-6 to 10^8 cd/m^2, with dynamic ranges of 30:1, 1000:1 and 1500:1 marked]

Digital vs. physical image [figures over three slides]

Basic display model: gamma correction
- Gamma correction is used to encode luminance or tristimulus colour values (RGB) in imaging systems (displays, printers, cameras, etc.):
  V_out = a * V_in^γ
  where V_out is the (relative) luminance (physical signal), V_in the digital signal (0-1, luma for grey-scale), a the gain, and γ the gamma (usually 2.2)
- For colour images: R = a (R')^γ, and the same for green and blue
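The display model above and its inverse can be written as a pair of tiny helpers (a sketch; the defaults a = 1 and γ = 2.2 follow the slide, and function names are mine):

```python
import numpy as np

def display_model(V, a=1.0, gamma=2.2):
    """Digital signal V in [0, 1] -> (relative) luminance: L = a * V^gamma."""
    return a * np.power(V, gamma)

def gamma_encode(L, a=1.0, gamma=2.2):
    """Inverse: luminance -> gamma-corrected signal: V = (L / a)^(1/gamma)."""
    return np.power(L / a, 1.0 / gamma)
```

Encoding and decoding are exact inverses, so a round trip returns the original value.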

Why is gamma needed? [figure: pixel value (luma) vs. luminance]
- Gamma-corrected pixel values give a scale of brightness levels that is more perceptually uniform
- At least 12 bits (instead of 8) would be needed to encode each colour channel without gamma correction
- Incidentally, it was also the response of the CRT gun

Luminance: a physical unit
- Luminance: perceived brightness of light, adjusted for the sensitivity of the visual system to wavelengths:
  L_V = ∫ L(λ) V(λ) dλ  (integrated from 0 to ∞)
  where L(λ) is the light spectrum (radiance) and V(λ) the luminous efficiency function (weighting)
- Unit: cd/m^2

Luma: gray-scale pixel value
- Luma: pixel brightness in gamma-corrected units
  L' = 0.2126 R' + 0.7152 G' + 0.0722 B'
- R', G' and B' are gamma-corrected colour values; the prime symbol denotes gamma correction
- Used in image/video coding
- Note that relative luminance is often approximated with
  L = 0.2126 R + 0.7152 G + 0.0722 B = 0.2126 (R')^γ + 0.7152 (G')^γ + 0.0722 (B')^γ
  where R, G and B are linear colour values
- Luma and luminance are different concepts despite the similar formulas
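Assuming the display gamma of 2.2 used earlier, the difference between the two quantities can be checked numerically (a sketch; function names are mine). For a neutral grey they agree after gamma expansion, but for a saturated colour they do not:

```python
GAMMA = 2.2  # display gamma assumed throughout these slides

def luma(rgb_prime):
    """Luma: Rec. 709 weights applied to gamma-corrected R'G'B'."""
    rp, gp, bp = rgb_prime
    return 0.2126 * rp + 0.7152 * gp + 0.0722 * bp

def relative_luminance(rgb_prime):
    """Luminance: the same weights applied to the *linear* values."""
    rp, gp, bp = rgb_prime
    return 0.2126 * rp**GAMMA + 0.7152 * gp**GAMMA + 0.0722 * bp**GAMMA

grey = (0.5, 0.5, 0.5)   # neutral grey: luma(grey)**GAMMA == luminance
red = (1.0, 0.0, 0.0)    # saturated red: luma(red)**GAMMA != luminance
```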

sRGB colour space
- "RGB colour space" alone is not a standard: colours may differ depending on the choice of the primaries
- sRGB is a standard colour space, which most displays try to mimic (standard for HDTV)
- The chromaticities shown above are also known as Rec. 709

sRGB colour space
- Two-step XYZ to sRGB transformation:
  - Step 1: linear colour transform (a 3x3 matrix)
  - Step 2: non-linearity
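The two steps can be made concrete with the matrix and transfer function from the sRGB standard (IEC 61966-2-1). This is a sketch: out-of-gamut values are handled by simple clipping only.

```python
import numpy as np

# Step 1: linear transform from CIE XYZ to linear sRGB (Rec. 709 primaries, D65)
M_XYZ_TO_RGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def srgb_nonlinearity(c):
    """Step 2: the sRGB transfer function (gamma-like curve with a linear toe)."""
    c = np.clip(c, 0.0, 1.0)
    return np.where(c <= 0.0031308,
                    12.92 * c,
                    1.055 * np.power(c, 1 / 2.4) - 0.055)

def xyz_to_srgb(xyz):
    rgb_linear = M_XYZ_TO_RGB @ np.asarray(xyz, dtype=float)
    return srgb_nonlinearity(rgb_linear)
```

As a sanity check, the D65 white point (X, Y, Z) ≈ (0.9505, 1.0, 1.089) maps to (1, 1, 1), and black maps to (0, 0, 0).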

Why do we need tone mapping?
- To reduce excessive dynamic range
- To customize the look (colour grading)
- To simulate human vision (for example, night vision)
- To adapt to a display and viewing conditions
- To make rendered images look more realistic
- Different tone mapping operators achieve different goals

Two ways to do tone-mapping
- A: HDR image (luminance, linear RGB) -> tone mapping A -> LDR image (luma, gamma-corrected RGB, sRGB)
- B: HDR image (luminance, linear RGB) -> tone mapping B -> inverse display model -> LDR image
- The display model can account for: display peak luminance, display dynamic range (contrast), ambient light
- Sometimes known as "gamma"

Inverse display model as tone mapping
- The inverse display model is the simplest tone mapping: it transforms linear pixel values to gamma-corrected values:
  R' = (R / L_white)^(1/γ), typically γ = 2.2
  (the prime denotes a gamma-corrected value)
- L_white is the intensity that should be mapped to the maximum brightness of a display (display white)
- The same operation is applied to the red, green and blue channels
- OpenGL offers sRGB textures to automate RGB <-> sRGB conversion:
  - sRGB textures store data in gamma-corrected space
  - sRGB is converted to (linear) RGB on texture look-up (and filtering)
  - RGB is converted to sRGB when writing to an sRGB texture with glEnable(GL_FRAMEBUFFER_SRGB)
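A sketch of the operator defined above (illustrative; values above L_white clip to display white, and the clipping behaviour is my addition to make the function total):

```python
import numpy as np

def inverse_display_model(rgb_linear, L_white=1.0, gamma=2.2):
    """Simplest tone mapping: R' = (R / L_white)^(1/gamma).

    L_white is the linear intensity mapped to the display's maximum
    brightness; anything above it clips to white.
    """
    v = np.clip(np.asarray(rgb_linear, dtype=float) / L_white, 0.0, 1.0)
    return np.power(v, 1.0 / gamma)
```

The same function is applied per channel to red, green and blue.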

Tone-curve [figure: tone-curve with image histogram]
The best tone-mapping is the one that does not do anything, i.e. the slope of the tone-mapping curve is equal to 1.

Tone-curve: but in practice, contrast (slope) must be limited due to display limitations.

Tone-curve: global tone-mapping is a compromise between clipping and contrast compression.

Sigmoidal tone-curves
- Very common in digital cameras
- Mimic the response of analog film
- Analog film has been engineered over many years to produce good tone reproduction
- Fast to compute

Sigmoidal tone mapping
- A simple formula for a sigmoidal tone-curve:
  R'(x, y) = R(x, y)^b / ((a L_m)^b + R(x, y)^b)
  where L_m is the geometric mean (or mean of logarithms):
  L_m = exp( (1/N) Σ_(x,y) ln L(x, y) )
  and L(x, y) is the luminance of the pixel (x, y).
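The formula above can be sketched in a few lines (illustrative; I assume a grey-scale image so that L(x, y) is the pixel value itself, and add a small epsilon to keep the logarithm defined at zero):

```python
import numpy as np

def sigmoidal_tonemap(R, a=1.0, b=1.0, eps=1e-6):
    """R'(x,y) = R^b / ((a*Lm)^b + R^b), Lm = geometric mean of luminance."""
    R = np.asarray(R, dtype=float)
    Lm = np.exp(np.mean(np.log(R + eps)))   # geometric mean (mean of logs)
    return R**b / ((a * Lm)**b + R**b)
```

The output always lies in (0, 1), and with a = b = 1 a pixel at the geometric-mean luminance maps to the middle of the range, which is why a acts as a brightness (key) control and b as a contrast control.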

Sigmoidal tone mapping: examples [figures for a = 0.25, 1, 4 and b = 0.5, 1, 2]

Glare illusion [screenshot: Alan Wake, Remedy Entertainment]

Glare illusion [examples: photography, painting, computer graphics, HDR rendering in games]

Scattering of the light in the eye (from: Sekuler, R., and Blake, R. Perception, second ed. McGraw-Hill, New York, 1990)

Point Spread Function of the eye [figure: green curve is daytime (photopic), red is night-time (scotopic)]
- What portion of the light is scattered towards a certain visual angle
- To simulate:
  - construct a digital filter
  - convolve the image with that filter
(From: Spencer, G. et al. 1995. Proc. of SIGGRAPH.)

Selective application of glare
- A) Glare applied to the entire image:
  I' = I * G
  where G is the glare kernel (PSF); this reduces image contrast and sharpness
- B) Glare applied only to the clipped pixels:
  I' = I + I_clipped * G - I_clipped
  where I_clipped = I for I > 1, and 0 otherwise; this gives better image quality
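Variant B can be sketched as follows (names are mine; a naive same-size convolution is used for clarity, where a real implementation would use separable or FFT-based filtering, and a Gaussian stands in for a measured eye PSF):

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Normalized 2D Gaussian, standing in for the glare PSF."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def convolve_same(img, k):
    """'Same'-size convolution with edge padding (symmetric kernel)."""
    pad = k.shape[0] // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(p[i:i+k.shape[0], j:j+k.shape[1]] * k)
    return out

def selective_glare(I, psf):
    """Variant B: I' = I + I_clipped * G - I_clipped, I_clipped = I where I > 1."""
    I_clipped = np.where(I > 1.0, I, 0.0)
    return I + convolve_same(I_clipped, psf) - I_clipped
```

An image with no clipped pixels passes through unchanged, so the glare only blurs energy spreading out of the over-bright regions, keeping the rest of the image sharp.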

Selective application of glare [figures: original image; A) glare applied to the entire image; B) glare applied to clipped pixels only]

Glare (or bloom) in games
- Convolution with large, non-separable filters is too slow
- The effect is approximated by a combination of Gaussian filters, each with a different sigma
- The effect is meant to look good, not to be an accurate model of light scattering
- Some games simulate a camera rather than the eye

References
- Tone-mapping:
  - Reinhard, E., Heidrich, W., Debevec, P., Pattanaik, S., Ward, G., and Myszkowski, K. 2010. High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting. Morgan Kaufmann.
  - Mantiuk, R.K., Myszkowski, K., and Seidel, H. 2015. High Dynamic Range Imaging. In: Wiley Encyclopedia of Electrical and Electronics Engineering. John Wiley & Sons, Inc., Hoboken, NJ, USA, 1-42. http://www.cl.cam.ac.uk/~rkm38/pdfs/mantiuk15hdri.pdf (Chapter 5)
- Display models:
  - Mantiuk, R.K. 2016. Perceptual display calibration. In: R.R. Hainich and O. Bimber, eds., Displays: Fundamentals and Applications. CRC Press. http://www.cl.cam.ac.uk/~rkm38/pdfs/mantiuk2016perceptual_display.pdf
- Digital image processing:
  - Gonzalez, R.C. and Woods, R.E. 2009. Digital Image Processing. Pearson. (Ch. 3 and 7)