1 EECS 442 Discussion. Arash Ushani. September 20, 2016.

2 Projective Geometry. For more detail, see HZ (Hartley and Zisserman, Multiple View Geometry in Computer Vision), Chapter 2.

3 Projective Geometry: Lines. Representing Lines. How do we represent a line?

4 Projective Geometry: Lines. Representing Lines. How do we represent a line? ax + by + c = 0, so l = (a, b, c).

5 Projective Geometry: Lines. Representing Lines. How do we represent a line? ax + by + c = 0, so l = (a, b, c). Is this a unique representation?

6 Projective Geometry: Lines. Representing Lines. How do we represent a line? ax + by + c = 0, so l = (a, b, c). Is this a unique representation? No: (ka)x + (kb)y + (kc) = 0 for any k ≠ 0, so l = (ka, kb, kc) represents the same line. Lines are equivalence classes of such vectors.

7 Projective Geometry: Lines. Points on Lines. How can we tell if a point (x, y) is on the line l = (a, b, c)?

8 Projective Geometry: Lines. Points on Lines. How can we tell if a point (x, y) is on the line l = (a, b, c)? ax + by + c = 0.

9 Projective Geometry: Lines. Points on Lines. How can we tell if a point (x, y) is on the line l = (a, b, c)? ax + by + c = 0. In homogeneous coordinates the point is x = (x, y, 1), so the test is x · l = 0 (equivalently, l · x = 0).
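A minimal NumPy sketch of this incidence test (Python here, rather than the MATLAB used in the discussion exercise below; the particular point and line are arbitrary examples):
```python
import numpy as np

# The point (2, 1) in homogeneous coordinates, and the line x - 2y = 0,
# i.e. l = (a, b, c) = (1, -2, 0).
x = np.array([2.0, 1.0, 1.0])
l = np.array([1.0, -2.0, 0.0])

# The point lies on the line exactly when the dot product vanishes.
print(np.isclose(x @ l, 0.0))   # True: 1*2 + (-2)*1 + 0*1 = 0
```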

10 Projective Geometry: Lines. Intersection of Lines. How do we find the intersection x of lines l_1 and l_2?

11 Projective Geometry: Lines. Intersection of Lines. How do we find the intersection x of lines l_1 and l_2? The intersection satisfies x · l_1 = 0 and x · l_2 = 0.

12 Projective Geometry: Lines. Intersection of Lines. How do we find the intersection x of lines l_1 and l_2? It satisfies x · l_1 = 0 and x · l_2 = 0, so x = l_1 × l_2.
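A small NumPy sketch of the cross-product construction (the two example lines are arbitrary); the result is a homogeneous 3-vector, so divide by the last coordinate to read off (x, y):
```python
import numpy as np

l1 = np.array([1.0, 0.0, -1.0])   # the line x = 1  (1*x + 0*y - 1 = 0)
l2 = np.array([0.0, 1.0, -2.0])   # the line y = 2

# Intersection point as the cross product of the two lines.
x = np.cross(l1, l2)
print(x / x[2])                   # [1. 2. 1.] -> the point (1, 2)
```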

13 Projective Geometry: Lines. Line through two points. How do we find the line l that goes through x_1 and x_2?

14 Projective Geometry: Lines. Line through two points. How do we find the line l that goes through x_1 and x_2? The line satisfies x_1 · l = 0 and x_2 · l = 0.

15 Projective Geometry: Lines. Line through two points. How do we find the line l that goes through x_1 and x_2? It satisfies x_1 · l = 0 and x_2 · l = 0, so l = x_1 × x_2.
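The dual construction in the same NumPy style, again with arbitrary example points:
```python
import numpy as np

x1 = np.array([0.0, 0.0, 1.0])    # the origin (0, 0) in homogeneous coordinates
x2 = np.array([1.0, 1.0, 1.0])    # the point (1, 1)

# Joining line as the cross product of the two points.
l = np.cross(x1, x2)
print(l)                          # [-1.  1.  0.], i.e. -x + y = 0: the line y = x
```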

16 Projective Geometry: Lines. HW Hints: Lines and transformations. Helpful identity: (Hx) × (Hy) = det(H) H^{-T} (x × y).

17 Projective Geometry: Lines. HW Hints: Lines and transformations. Helpful identity: (Hx) × (Hy) = det(H) H^{-T} (x × y). Alternatively, if l · x = 0, what can you say about l^T H^{-1} H x when H is a projective transformation?
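A quick numerical sanity check of the stated identity (not part of the homework solution), assuming H is invertible; a random Gaussian 3x3 matrix is invertible with probability one:
```python
import numpy as np

rng = np.random.default_rng(442)
H = rng.normal(size=(3, 3))       # a generic (invertible) projective transformation
x = rng.normal(size=3)
y = rng.normal(size=3)

lhs = np.cross(H @ x, H @ y)
rhs = np.linalg.det(H) * np.linalg.inv(H).T @ np.cross(x, y)
print(np.allclose(lhs, rhs))      # True
```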

18 Projective Geometry: Lines. MATLAB Exercise. Go to CTools → Resources → Discussion → matlab.zip.

19 DLT (Direct Linear Transformation). For more detail, see HZ, Chapter 4, Section 1.

20 DLT. We have a set of point correspondences x_i ↔ x'_i. We want to find the transformation H such that x'_i = H x_i.

21 DLT. We have a set of point correspondences x_i ↔ x'_i. We want to find the transformation H such that x'_i = H x_i. How many degrees of freedom are there in H?

22 DLT. We have a set of point correspondences x_i ↔ x'_i. We want to find the transformation H such that x'_i = H x_i. How many degrees of freedom are there in H? 8 (H is 3 × 3 but only defined up to scale).

23 DLT. We have a set of point correspondences x_i ↔ x'_i. We want to find the transformation H such that x'_i = H x_i. How many degrees of freedom are there in H? 8. How many correspondences do we need?

24 DLT. We have a set of point correspondences x_i ↔ x'_i. We want to find the transformation H such that x'_i = H x_i. How many degrees of freedom are there in H? 8. How many correspondences do we need? 4 (each correspondence gives two independent equations, as derived next).

25 DLT. If x'_i = H x_i, then equivalently x'_i × (H x_i) = 0.

26 DLT. If x'_i = H x_i, then equivalently x'_i × (H x_i) = 0. Writing h_j^T for the j-th row of H, we have H x_i = (h_1^T x_i, h_2^T x_i, h_3^T x_i)^T.

27 DLT. If x'_i = H x_i, then x'_i × (H x_i) = 0, where H x_i = (h_1^T x_i, h_2^T x_i, h_3^T x_i)^T and h_j^T is the j-th row of H. If x'_i = (x'_i, y'_i, w'_i), then:

x'_i × (H x_i) = ( y'_i h_3^T x_i - w'_i h_2^T x_i,
                   w'_i h_1^T x_i - x'_i h_3^T x_i,
                   x'_i h_2^T x_i - y'_i h_1^T x_i )^T = 0

28 DLT. Each component of this cross product is linear in the entries of H, so the three equations can be collected into matrix form:

[    0^T        -w'_i x_i^T     y'_i x_i^T ]
[  w'_i x_i^T      0^T         -x'_i x_i^T ]   (h_1; h_2; h_3) = 0
[ -y'_i x_i^T    x'_i x_i^T       0^T      ]

where (h_1; h_2; h_3) denotes the 9-vector obtained by stacking the rows of H.

29 DLT. Only two of these three equations are linearly independent (the third row is a linear combination of the first two), so each correspondence contributes:

[    0^T        -w'_i x_i^T     y'_i x_i^T ]
[  w'_i x_i^T      0^T         -x'_i x_i^T ]   (h_1; h_2; h_3) = 0

30 DLT. Call the 2 × 9 matrix above A_i, so each correspondence gives A_i (h_1; h_2; h_3) = 0, i.e. A_i h = 0 with h the 9-vector of entries of H.

31 DLT. Each correspondence gives A_i h = 0. With 4 correspondences (no three points collinear), stacking the A_i gives A h = 0, where A is 8 × 9 and has rank 8, so h is determined up to scale.
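To make the stacking concrete, here is a NumPy sketch that builds the two rows of A_i per correspondence and checks the rank-8 claim; H_true and the four points are made-up values for illustration only:
```python
import numpy as np

def dlt_rows(x, xp):
    # Two rows of A_i for one correspondence x -> x' (both homogeneous 3-vectors),
    # matching the 2 x 9 matrix on the slides above.
    xs, ys, ws = xp
    return [np.concatenate([np.zeros(3), -ws * x,  ys * x]),
            np.concatenate([ws * x, np.zeros(3), -xs * x])]

H_true = np.array([[1.0, 0.2, 3.0],
                   [0.1, 1.1, -2.0],
                   [0.001, 0.002, 1.0]])
pts = [np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 1.0]),
       np.array([0.0, 1.0, 1.0]), np.array([1.0, 1.0, 1.0])]

# Stack the two rows from each of the 4 correspondences x_i -> H_true x_i.
A = np.vstack([row for x in pts for row in dlt_rows(x, H_true @ x)])
print(A.shape, np.linalg.matrix_rank(A))   # (8, 9) 8
```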

32 DLT. What if we have more than 4 correspondences? (Why would we have more than 4?)

33 DLT. What if we have more than 4 correspondences? (Why would we have more than 4?) The system is overdetermined: find the h that minimizes the error ||Ah|| subject to ||h|| = 1. It can be shown that the right singular vector corresponding to the smallest singular value of A is the h that minimizes ||Ah||.

34 DLT. What if we have more than 4 correspondences? (Why would we have more than 4?) The system is overdetermined: find the h that minimizes ||Ah|| subject to ||h|| = 1. It can be shown that the right singular vector corresponding to the smallest singular value of A is the h that minimizes ||Ah||. SVD: A = U D V^T, where D is a diagonal matrix of the singular values of A and the columns of V are the singular vectors of A.
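Putting the pieces together, a minimal DLT sketch in NumPy; the function name and test values are mine, and it skips the point-normalization step that HZ recommend for numerical stability in practice:
```python
import numpy as np

def estimate_homography(pts, pts_p):
    # DLT from N >= 4 correspondences; pts and pts_p are N x 2 arrays of (x, y).
    A = []
    for (x, y), (xp, yp) in zip(pts, pts_p):
        X = np.array([x, y, 1.0])
        A.append(np.concatenate([np.zeros(3), -X, yp * X]))
        A.append(np.concatenate([X, np.zeros(3), -xp * X]))
    # The h minimizing ||Ah|| subject to ||h|| = 1 is the right singular vector
    # of A with the smallest singular value, i.e. the last row of V^T in A = U D V^T.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)

# Usage: recover a known homography from noiseless correspondences (made-up values).
H_true = np.array([[1.0, 0.2, 3.0], [0.1, 1.1, -2.0], [0.001, 0.002, 1.0]])
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 3.0]])
proj = np.column_stack([pts, np.ones(len(pts))]) @ H_true.T
pts_p = proj[:, :2] / proj[:, 2:]
H = estimate_homography(pts, pts_p)
print(np.allclose(H / H[2, 2], H_true))   # True (H_true already has its (3,3) entry = 1)
```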

35 Next Week. Guest Speaker: Steven Parkison, Calibration Expert. No Office Hours.
