Singular Value Decomposition: Compression of Color Images


Bethany Adams and Nina Magnoni

Introduction

The SVD has many useful applications: it is used in least squares approximation, search engines, and image compression. In this presentation we show how it is applied, specifically, to the compression of color images. We begin by decomposing a 2x2 matrix by hand, then we show how MATLAB is useful for computing the SVD of larger matrices, and lastly we apply those MATLAB techniques to images, both grayscale and truecolor.

Definition

Singular value decomposition factorizes a rectangular matrix A into the matrices U, S, and V^T. The factorization does not require a square matrix, so m, the number of rows, need not equal n, the number of columns:

\[
A_{m \times n} = U_{m \times m}\, S_{m \times n}\, V^T_{n \times n}
\]

Here U and V are orthogonal matrices; the columns of U are orthonormal, as are the columns of V. The matrix S is diagonal, with its entries in descending order, \(\sigma_1 \geq \sigma_2 \geq \dots \geq \sigma_n \geq 0\), along the main diagonal:

\[
S = \begin{pmatrix} \sigma_1 & 0 & \cdots & 0 \\ 0 & \sigma_2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_n \end{pmatrix} \tag{1}
\]

Computing the matrices U, S, and V by hand requires several steps, including finding some special eigenvalue/eigenvector pairs.
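These shapes can be checked numerically. The slides use MATLAB, but as an illustrative stand-in here is a short NumPy sketch (the 4x3 matrix is an arbitrary example, not from the slides):

```python
import numpy as np

# An arbitrary rectangular example: m = 4 rows, n = 3 columns.
A = np.arange(12, dtype=float).reshape(4, 3)

# full_matrices=True mirrors the definition: U is m x m, V^T is n x n.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
assert U.shape == (4, 4) and Vt.shape == (3, 3)

# The singular values come back in descending order.
assert np.all(s[:-1] >= s[1:])

# Rebuild the m x n diagonal S and confirm A = U S V^T.
S = np.zeros((4, 3))
np.fill_diagonal(S, s)
assert np.allclose(U @ S @ Vt, A)
```

Note that NumPy returns the singular values as a vector and hands back V^T directly, whereas MATLAB's `svd` returns S as a full m x n matrix and V untransposed.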

The eigenvalues come from the relationship \(Ax = \lambda x\), or \((A - \lambda I)x = 0\). This says the eigenvectors lie in the nullspace of \(A - \lambda I\), so that matrix must be singular: \(\det(A - \lambda I) = 0\). Solving this equation gives the eigenvalues; the eigenvectors surface when each found eigenvalue is substituted back into \((A - \lambda I)x = 0\) and the nullspace is computed.

An Example for a 2x2 Matrix

Let \(A = \begin{pmatrix} 2 & 2 \\ -1 & 1 \end{pmatrix}\). To complete the decomposition, we must first compute the eigenvalue/eigenvector pairs for each of \(AA^T\) and \(A^TA\).

\[
AA^T = \begin{pmatrix} 2 & 2 \\ -1 & 1 \end{pmatrix}\begin{pmatrix} 2 & -1 \\ 2 & 1 \end{pmatrix} = \begin{pmatrix} 8 & 0 \\ 0 & 2 \end{pmatrix} \tag{2}
\]
\[
AA^T - \lambda I = \begin{pmatrix} 8-\lambda & 0 \\ 0 & 2-\lambda \end{pmatrix} \tag{3}
\]
\[
\det(AA^T - \lambda I) = \begin{vmatrix} 8-\lambda & 0 \\ 0 & 2-\lambda \end{vmatrix} = 0 \tag{4}
\]
To find \(\lambda\):
\[
(8-\lambda)(2-\lambda) - 0 = 0 \quad\Rightarrow\quad \lambda_1 = 8,\ \lambda_2 = 2
\]

For the eigenvalues of \(A^TA\) we use the analogous computation:
\[
A^TA = \begin{pmatrix} 2 & -1 \\ 2 & 1 \end{pmatrix}\begin{pmatrix} 2 & 2 \\ -1 & 1 \end{pmatrix} = \begin{pmatrix} 5 & 3 \\ 3 & 5 \end{pmatrix} \tag{5}
\]
\[
A^TA - \lambda I = \begin{pmatrix} 5-\lambda & 3 \\ 3 & 5-\lambda \end{pmatrix} \tag{6}
\]
\[
\det(A^TA - \lambda I) = \begin{vmatrix} 5-\lambda & 3 \\ 3 & 5-\lambda \end{vmatrix} = 0 \tag{7}
\]
To find \(\lambda\):
\[
(5-\lambda)(5-\lambda) - (3)(3) = 0 \quad\Rightarrow\quad \lambda^2 - 10\lambda + 16 = 0 \quad\Rightarrow\quad \lambda_1 = 8,\ \lambda_2 = 2
\]

Finding the Eigenvectors

The eigenvectors corresponding to these eigenvalues lie in the nullspace of \(A - \lambda I\), or in our case the nullspace of \(AA^T - \lambda I\) or \(A^TA - \lambda I\). Let us begin with the eigenvectors of \(AA^T\), which will eventually make up the columns of the matrix U. For \(\lambda_1 = 8\):
\[
(AA^T - 8I)x = 0 \quad\Rightarrow\quad \begin{pmatrix} 0 & 0 \\ 0 & -6 \end{pmatrix} x_1 = 0 \quad\Rightarrow\quad x_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix} \tag{8}
\]
For \(\lambda_2 = 2\):
\[
(AA^T - 2I)x = 0 \quad\Rightarrow\quad \begin{pmatrix} 6 & 0 \\ 0 & 0 \end{pmatrix} x_2 = 0 \quad\Rightarrow\quad x_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix} \tag{9}
\]
A similar computation determines the eigenvectors of the matrix \(A^TA\), which will make up the columns of V:
\[
x_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad x_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}
\]

Finding the Full Decomposition

The matrices in the singular value decomposition require that the columns have length one, so we must divide each eigenvector by its magnitude. Also, the singular values are not the eigenvalues themselves: we must take the square root of each eigenvalue to obtain a singular value. Choosing the signs of the unit eigenvectors so that \(Av_i = \sigma_i u_i\), the decomposition becomes
\[
A = USV^T = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} \begin{pmatrix} \sqrt{8} & 0 \\ 0 & \sqrt{2} \end{pmatrix} \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}
\]
If you would like, it may be helpful to multiply these matrices together to see that they truly do equal our original matrix \(A = \begin{pmatrix} 2 & 2 \\ -1 & 1 \end{pmatrix}\).
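As a sanity check on the hand computation, the three factors can be multiplied back together numerically. Here is a small NumPy sketch (NumPy stands in for the MATLAB used in the next section):

```python
import numpy as np

A = np.array([[2.0, 2.0], [-1.0, 1.0]])

# The hand-computed factors from the example above.
U = np.array([[1.0, 0.0], [0.0, -1.0]])
S = np.diag([np.sqrt(8.0), np.sqrt(2.0)])
Vt = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

# The product U S V^T recovers A exactly.
assert np.allclose(U @ S @ Vt, A)

# NumPy agrees on the singular values (its singular vectors may
# differ from ours by a sign, which is allowed).
assert np.allclose(np.linalg.svd(A, compute_uv=False),
                   [np.sqrt(8.0), np.sqrt(2.0)])
```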

Decomposition of Larger Matrices in MATLAB

As the matrices get larger and larger, it becomes too challenging to do all of these computations by hand, so MATLAB comes in handy for decomposing the larger, more heinous, matrices. Doing singular value decomposition in MATLAB is quite simple: we need not compute eigenvalues, eigenvectors, unit eigenvectors, or singular values ourselves, because MATLAB does this for us. Notice the SVD of our sample matrix A:

    A = [2 2; -1 1]

    A =
         2     2
        -1     1

    [U,S,V] = svd(A)

    U =
      -1.00000000000000   0.00000000000000
       0.00000000000000   1.00000000000000

    S =
       2.82842712474619                  0
                      0   1.41421356237310

    V =
      -0.70710678118655  -0.70710678118655
      -0.70710678118655   0.70710678118655

The singular values, the square roots of our eigenvalues, are arranged in descending order down the diagonal of S, and the corresponding unit eigenvectors occupy the matching columns of U and V. The syntax is equally simple for matrices thousands of times this size, and it is this that makes the SVD of image matrices possible.

Quick Description of Truecolor Versus Grayscale

In a grayscale image, each pixel has a value between 0 and 1 representing the intensity of light emitted from that pixel. A color image instead has three layers of intensities, one each for red, green, and blue in the RGB spectrum. To SVD a color image, we must strip away each color layer, SVD it separately, and then recombine the decomposed layers to produce the whole decomposed truecolor image.
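The peel-and-restack idea can be seen on a tiny made-up example. This NumPy sketch (a stand-in for the MATLAB code below, using a hypothetical 2x2 image) shows that each layer is an ordinary 2-D matrix and that restacking loses nothing:

```python
import numpy as np

# A hypothetical 2x2 truecolor image: shape (rows, cols, 3), values in [0, 1].
img = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
                [[0.0, 0.0, 1.0], [0.5, 0.5, 0.5]]])

# Peel off the red, green, and blue layers; each is a plain 2-D matrix,
# so each can be run through the SVD on its own.
red, green, blue = img[:, :, 0], img[:, :, 1], img[:, :, 2]
assert red.shape == (2, 2)

# Stacking the layers back together reproduces the original image exactly.
restacked = np.stack([red, green, blue], axis=2)
assert np.array_equal(restacked, img)
```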

Decomposition of a Grayscale Image

Singular value decomposition of grayscale images in MATLAB is a simple task. Because a grayscale image is represented by a single matrix containing only numbers between zero and one, the SVD is a very popular and easy method for its compression. (In the transcribed code the quote marks and the reset of the partial sum had been lost; the inner loop below adds only the next five rank-one terms each pass, so C is the rank-k partial sum at each k.)

    close all
    imfinfo('puppy.jpg')
    [A,map] = imread('puppy.jpg');
    B = im2double(A);
    imshow(B)
    [u,s,v] = svd(B);
    C = zeros(size(B));
    for k = 5:5:35
        % add the next five rank-one terms s(j,j)*u(:,j)*v(:,j)'
        for j = k-4:k
            C = C + s(j,j)*u(:,j)*v(:,j).';
        end
        figure
        imshow(C,[])
        drawnow
        set(gcf,'Unit','inches','Paperposition',[0,0,2,2.7])
        filename = ['puppy',num2str(k),'.jpg'];
        print('-djpeg90',filename)
    end
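The heart of the loop above is the rank-k partial sum of rank-one terms \(\sigma_j u_j v_j^T\). A NumPy equivalent is sketched below, with a random matrix standing in for the image since puppy.jpg is not available here; it also checks the Eckart-Young fact that the 2-norm error of the rank-k sum equals the next singular value:

```python
import numpy as np

def rank_k_approx(B, k):
    """Sum of the first k rank-one terms sigma_j * u_j * v_j^T."""
    u, s, vt = np.linalg.svd(B, full_matrices=False)
    C = np.zeros_like(B)
    for j in range(k):
        C += s[j] * np.outer(u[:, j], vt[j, :])
    return C

rng = np.random.default_rng(0)
B = rng.random((40, 30))          # stand-in for the grayscale image matrix

# Summing all min(m, n) = 30 terms reproduces B, up to rounding.
assert np.allclose(rank_k_approx(B, 30), B)

# Truncating at k leaves a 2-norm error equal to the (k+1)-th singular value.
s = np.linalg.svd(B, compute_uv=False)
err = np.linalg.norm(B - rank_k_approx(B, 5), 2)
assert np.isclose(err, s[5])
```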

Decomposition of a Truecolor Image

Decomposing a grayscale image requires only one pass: because the image is only one layer thick, it is much easier to manage. Color images, however, are much more common, and are also much more work to decompose. Because a truecolor image has a red, a green, and a blue layer for each pixel, we cannot simply compute the SVD of the original array. To get around this, we have written MATLAB code that peels off the layers of the image and stores them separately in different matrices. In this fashion the matrices can be decomposed individually and then later stacked back on top of one another to create a new, SVD-compressed truecolor image.

MATLAB Code

    close all
    iptsetpref('ImshowBorder','tight')
    clear Q
    L = imread('buildings.jpg');
    imshow(L)
    L1 = L(:,:,1);
    L2 = L(:,:,2);
    L3 = L(:,:,3);
    I1 = im2double(L1);
    I2 = im2double(L2);
    I3 = im2double(L3);
    figure, imshow(I1)
    figure, imshow(I2)
    figure, imshow(I3)
    [u1,s1,v1] = svd(I1);
    [u2,s2,v2] = svd(I2);
    [u3,s3,v3] = svd(I3);
    C1 = zeros(size(I1));
    C2 = zeros(size(I2));
    C3 = zeros(size(I3));
    k = 15;
    for j = 1:k
        C1 = C1 + s1(j,j)*u1(:,j)*v1(:,j).';
    end
    for j = 1:k
        C2 = C2 + s2(j,j)*u2(:,j)*v2(:,j).';
    end
    for j = 1:k
        C3 = C3 + s3(j,j)*u3(:,j)*v3(:,j).';
    end
    figure, imshow(C1)
    figure, imshow(C2)
    figure, imshow(C3)
    C1(k) = 1;
    C2(k) = 1;
    C3(k) = 1;
    R1 = im2uint8(C1);
    R2 = im2uint8(C2);
    R3 = im2uint8(C3);
    Q(:,:,1) = R1;
    Q(:,:,2) = R2;
    Q(:,:,3) = R3;
    figure
    imshow(Q,[])
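The same per-layer pipeline can be written compactly. Here is a NumPy rendering with random data standing in for buildings.jpg (an illustrative sketch of the technique, not the authors' code):

```python
import numpy as np

def compress_channel(layer, k):
    """Rank-k approximation of one color layer via the truncated SVD."""
    u, s, vt = np.linalg.svd(layer, full_matrices=False)
    # Keep the first k columns of U, k singular values, k rows of V^T.
    return (u[:, :k] * s[:k]) @ vt[:k, :]

def compress_rgb(img, k):
    """Peel off, compress, and restack the three layers."""
    return np.stack([compress_channel(img[:, :, c], k) for c in range(3)],
                    axis=2)

rng = np.random.default_rng(1)
img = rng.random((32, 24, 3))     # stand-in for a truecolor image

out = compress_rgb(img, 15)       # k = 15, as in the MATLAB code above
assert out.shape == img.shape

# Using the full rank (min(m, n) = 24 here) reproduces the image exactly.
assert np.allclose(compress_rgb(img, 24), img)
```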

The Resulting Images

[Figures: Original Image; Red Layer; Green Layer; Blue Layer]

[Figures: SVD Red Layer; SVD Green Layer; SVD Blue Layer; Recompiled Image]

The previous images illustrate the individual steps that the MATLAB routine executes during the SVD code. The first four images have not been decomposed and retain their original crispness, while the final four are the decomposed red, green, and blue layers of the original image and the recompiled layers after the final iteration. These decomposed images have rank 15.

[Figures: rank-k reconstructions for k = 1, 2, 5, 7]

[Figures: rank-k reconstructions for k = 10, 12, 15, 20]

[Figures: rank-k reconstructions for k = 30, 50, 75, 100]

Notice that the clarity of the image improves as the value of k in the previous MATLAB code is increased. Eventually the image becomes so clear that it is almost indistinguishable from the original.

[Figures: Original Image; Rank 100 Reconstruction]

See here that at rank 100 the produced image is almost identical to the original. It is notable that the original image has rank 300, meaning that to reproduce it exactly we would need all 300 rank-one terms. By keeping only the first 100 singular values, we save a substantial amount of storage space.
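The storage saving can be made concrete. A rank-k approximation needs only the first k columns of U, the first k singular values, and the first k rows of V^T, so it stores k(m + n + 1) numbers instead of mn. The dimensions below are illustrative, since the slides state only that the image has rank 300:

```python
# Illustrative sizes: the slides give only the rank (300), not the pixel
# dimensions, so a 300 x 400 layer is assumed here.
m, n, k = 300, 400, 100

full = m * n                 # numbers stored for the raw layer
truncated = k * (m + n + 1)  # k columns of U, k sigmas, k rows of V^T

assert full == 120000
assert truncated == 70100
assert truncated < full      # the rank-100 version needs far less storage
```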

In Conclusion

The singular value decomposition is a very useful tool for image compression. We can take an image that is originally rank 300 and store a reasonably good representation of it at only rank 100; sometimes we can cut away even more of the original image without losing clarity. In a time when computer imaging and web browsing demand the use of many different images, it is very important to be able to store all the image types in reasonably sized matrices. The singular value decomposition makes this possible.