

CSE 151: Introduction to Machine Learning, Winter 2017
Problem Set 1
Instructor: Kamalika Chaudhuri
Due on: Jan 28

Instructions

This is a 40-point homework. Homeworks will be graded based on content and clarity. Please show your work clearly for full credit. To get full credit when providing a counter-example, it is not sufficient to simply state the example; you should also justify why it is a counter-example.

Problem 1 (10 points)

Let $u_1$ and $u_2$ be vectors such that $\|u_1\| = \|u_2\| = 1$ and $\langle u_1, u_2 \rangle = 0$. For any vector $x$, we define $P(x)$ as the vector

    P(x) = \langle x, u_1 \rangle u_1 + \langle x, u_2 \rangle u_2.

1. How would you geometrically interpret $P(x)$? (Hint: think about projections.)
2. Show that $\|P(x)\|^2 = \langle x, u_1 \rangle^2 + \langle x, u_2 \rangle^2$.
3. Using parts (1) and (2), show that $\|P(x)\| \le \|x\|$. When is $\|P(x)\| = \|x\|$?

Solutions

1. $P(x)$ is the projection of $x$ onto the subspace spanned by $u_1$ and $u_2$. Let $V$ be the subspace spanned by $u_1$ and $u_2$; $P(x)$ is the projection of $x$ onto $V$ if $x - P(x)$ is orthogonal to $V$. We first show that $x - P(x) \perp u_1$ and $x - P(x) \perp u_2$:

    \langle x - P(x), u_1 \rangle = \langle x - \langle x, u_1 \rangle u_1 - \langle x, u_2 \rangle u_2, \; u_1 \rangle
                                  = \langle x, u_1 \rangle - \langle x, u_1 \rangle \langle u_1, u_1 \rangle - \langle x, u_2 \rangle \langle u_2, u_1 \rangle
                                  = \langle x, u_1 \rangle - \langle x, u_1 \rangle \cdot 1 - \langle x, u_2 \rangle \cdot 0 = 0,

    \langle x - P(x), u_2 \rangle = \langle x - \langle x, u_1 \rangle u_1 - \langle x, u_2 \rangle u_2, \; u_2 \rangle
                                  = \langle x, u_2 \rangle - \langle x, u_1 \rangle \langle u_1, u_2 \rangle - \langle x, u_2 \rangle \langle u_2, u_2 \rangle
                                  = \langle x, u_2 \rangle - \langle x, u_1 \rangle \cdot 0 - \langle x, u_2 \rangle \cdot 1 = 0.

Since $x - P(x) \perp u_1$ and $x - P(x) \perp u_2$, and every vector in $V$ is a linear combination of $u_1$ and $u_2$, it follows that $x - P(x)$ is orthogonal to every vector in $V$, i.e. $x - P(x)$ is orthogonal to $V$. Therefore, $P(x)$ is the projection of $x$ onto the subspace spanned by $u_1$ and $u_2$.
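As an optional numerical sanity check of part (1), not part of the original solution, the following NumPy sketch builds an orthonormal pair $u_1, u_2$ in $\mathbb{R}^3$, forms $P(x)$, and confirms that $x - P(x)$ is orthogonal to both vectors. The specific vectors are arbitrary illustrative choices.

    import numpy as np

    # Arbitrary orthonormal pair u1, u2 in R^3 (illustrative choice only).
    u1 = np.array([1.0, 0.0, 0.0])
    u2 = np.array([0.0, 1.0, 0.0])
    x = np.array([3.0, -2.0, 5.0])

    # P(x) = <x, u1> u1 + <x, u2> u2
    P_x = np.dot(x, u1) * u1 + np.dot(x, u2) * u2

    # The residual x - P(x) should be orthogonal to u1 and u2.
    residual = x - P_x
    print(np.isclose(np.dot(residual, u1), 0.0))  # True
    print(np.isclose(np.dot(residual, u2), 0.0))  # True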

[Figure 1: Visualization of P(x), when x, u_1, u_2 are in R^3.]

2. We show $\|P(x)\|^2 = \langle x, u_1 \rangle^2 + \langle x, u_2 \rangle^2$ by expanding $\|P(x)\|^2$:

    \|P(x)\|^2 = \langle P(x), P(x) \rangle
               = \langle \langle x, u_1 \rangle u_1 + \langle x, u_2 \rangle u_2, \; \langle x, u_1 \rangle u_1 + \langle x, u_2 \rangle u_2 \rangle
               = \langle x, u_1 \rangle^2 \langle u_1, u_1 \rangle + \langle x, u_1 \rangle \langle x, u_2 \rangle \langle u_1, u_2 \rangle + \langle x, u_2 \rangle \langle x, u_1 \rangle \langle u_2, u_1 \rangle + \langle x, u_2 \rangle^2 \langle u_2, u_2 \rangle
               = \langle x, u_1 \rangle^2 \cdot 1 + \langle x, u_1 \rangle \langle x, u_2 \rangle \cdot 0 + \langle x, u_2 \rangle \langle x, u_1 \rangle \cdot 0 + \langle x, u_2 \rangle^2 \cdot 1
               = \langle x, u_1 \rangle^2 + \langle x, u_2 \rangle^2.

3. Since $P(x) \perp x - P(x)$, we have $\|x\|^2 = \|P(x)\|^2 + \|x - P(x)\|^2$. Equivalently, from part (1) we have $\langle u_1, x - P(x) \rangle = 0$ and $\langle u_2, x - P(x) \rangle = 0$, thus

    \|x\|^2 = \langle x, x \rangle
            = \langle P(x) + (x - P(x)), \; P(x) + (x - P(x)) \rangle
            = \|P(x)\|^2 + 2 \langle P(x), x - P(x) \rangle + \|x - P(x)\|^2
            = \|P(x)\|^2 + 2 \big( \langle x, u_1 \rangle \langle u_1, x - P(x) \rangle + \langle x, u_2 \rangle \langle u_2, x - P(x) \rangle \big) + \|x - P(x)\|^2
            = \|P(x)\|^2 + 2 \big( \langle x, u_1 \rangle \cdot 0 + \langle x, u_2 \rangle \cdot 0 \big) + \|x - P(x)\|^2
            = \|P(x)\|^2 + \|x - P(x)\|^2.

Therefore $\|P(x)\|^2 \le \|x\|^2$. Since $\|P(x)\| \ge 0$ and $\|x\| \ge 0$, we have $\|P(x)\| \le \|x\|$. When $\|x - P(x)\|^2 = 0$, i.e. $x = P(x)$, so that $x$ itself lies in the subspace spanned by $u_1$ and $u_2$, we have $\|P(x)\| = \|x\|$.
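As a quick check of parts (2) and (3), not required for the homework, the following sketch draws a random orthonormal pair and a random vector and verifies the norm identity and the inequality numerically. The QR-based construction of $u_1, u_2$ is an assumption made for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Random orthonormal pair u1, u2 in R^5 via reduced QR (illustrative choice).
    Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
    u1, u2 = Q[:, 0], Q[:, 1]
    x = rng.standard_normal(5)

    P_x = np.dot(x, u1) * u1 + np.dot(x, u2) * u2

    # Part (2): ||P(x)||^2 == <x,u1>^2 + <x,u2>^2
    print(np.isclose(np.dot(P_x, P_x), np.dot(x, u1) ** 2 + np.dot(x, u2) ** 2))  # True

    # Part (3): ||P(x)|| <= ||x||, with equality iff x lies in span{u1, u2}.
    print(np.linalg.norm(P_x) <= np.linalg.norm(x) + 1e-12)  # True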

Problem 2 (10 points)

Given two column vectors $x$ and $y$ in $d$-dimensional space, the outer product of $x$ and $y$ is defined to be the $d \times d$ matrix $x \otimes y = x y^\top$.

1. Show that for any $x$ and $y$, $x^\top (x \otimes y) y = \|x\|^2 \|y\|^2$. When is this equal to $\langle x \otimes x, y \otimes y \rangle$?
2. Show that for any non-zero $x$ and $y$, the outer product $x \otimes y$ always has rank 1.
3. Let $x_1, \ldots, x_n$ be $n$ $d \times 1$ data vectors, and let $X$ be the $n \times d$ data matrix whose $i$-th row is the row vector $x_i^\top$. Show that:

    X^\top X = \sum_{i=1}^{n} x_i x_i^\top.

Solutions

1. We know that for any vector $x$, $x^\top x = \|x\|^2$. Thus,

    x^\top (x \otimes y) y = x^\top (x y^\top) y = (x^\top x)(y^\top y) = \|x\|^2 \|y\|^2.

Also, using the entrywise (Frobenius) inner product for matrices,

    \langle x \otimes x, y \otimes y \rangle = \langle x, y \rangle (x^\top y) = \langle x, y \rangle \langle x, y \rangle = \langle x, y \rangle^2 = (\|x\| \|y\| \cos\theta)^2 = \|x\|^2 \|y\|^2 \cos^2\theta.

This quantity is equal to $\|x\|^2 \|y\|^2$ when $\theta = 0^\circ$ or $180^\circ$; that is, the two quantities are equal when the vectors $x$ and $y$ are collinear.

2. Let $x_i$ be the $i$-th element of vector $x$ and $y_i$ be the $i$-th element of vector $y$. Thus,

    x \otimes y = x y^\top = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_d \end{bmatrix} [y_1, y_2, \ldots, y_d]
                = \begin{bmatrix} x_1 y_1 & x_1 y_2 & \cdots & x_1 y_d \\ x_2 y_1 & x_2 y_2 & \cdots & x_2 y_d \\ \vdots & \vdots & & \vdots \\ x_d y_1 & x_d y_2 & \cdots & x_d y_d \end{bmatrix}.

Every row of this matrix is a scalar multiple of the row vector $y^\top$, so when the matrix is reduced to row echelon form it contains at most one non-zero row. Since $x$ and $y$ are non-zero, some entry $x_i y_j$ is non-zero, so there is exactly one non-zero row. Therefore, the outer product $x \otimes y$ always has rank 1.

3. Let $Y = X^\top X$; then $Y$ is a $d \times d$ matrix. Let $x_{ij}$ be the $j$-th element of vector $x_i$. Then $X$ can be written as

    X = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & \vdots & & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{nd} \end{bmatrix}.

Therefore,

    Y = X^\top X = \begin{bmatrix} x_{11} & x_{21} & \cdots & x_{n1} \\ x_{12} & x_{22} & \cdots & x_{n2} \\ \vdots & \vdots & & \vdots \\ x_{1d} & x_{2d} & \cdots & x_{nd} \end{bmatrix} \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & \vdots & & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{nd} \end{bmatrix},

which works out to give $Y_{ij} = \sum_{k=1}^{n} x_{ki} x_{kj}$ for $i, j = 1, 2, \ldots, d$. Now we work out the right side of the equation:

    x_k x_k^\top = \begin{bmatrix} x_{k1} \\ x_{k2} \\ \vdots \\ x_{kd} \end{bmatrix} [x_{k1}, x_{k2}, \ldots, x_{kd}]
                 = \begin{bmatrix} x_{k1}^2 & x_{k1} x_{k2} & \cdots & x_{k1} x_{kd} \\ x_{k2} x_{k1} & x_{k2}^2 & \cdots & x_{k2} x_{kd} \\ \vdots & \vdots & & \vdots \\ x_{kd} x_{k1} & x_{kd} x_{k2} & \cdots & x_{kd}^2 \end{bmatrix},

so the $(i, j)$ entry of $\sum_{k=1}^{n} x_k x_k^\top$ is $\sum_{k=1}^{n} x_{ki} x_{kj}$. Thus, the right side of the equation equals $Y$.
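Both facts in parts (2) and (3) are easy to confirm numerically; here is a small optional NumPy sketch, with random data used purely for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    d, n = 4, 6

    # Part (2): the outer product x y^T of non-zero vectors has rank 1.
    x, y = rng.standard_normal(d), rng.standard_normal(d)
    print(np.linalg.matrix_rank(np.outer(x, y)))  # 1

    # Part (3): X^T X equals the sum of outer products x_i x_i^T over the rows of X.
    X = rng.standard_normal((n, d))
    sum_of_outer = sum(np.outer(X[i], X[i]) for i in range(n))
    print(np.allclose(X.T @ X, sum_of_outer))  # True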

Problem 3 (10 points)

Suppose $A$ and $B$ are $d \times d$ matrices which are symmetric (in the sense that $A_{ij} = A_{ji}$ and $B_{ij} = B_{ji}$ for all $i$ and $j$) and positive semi-definite. Also suppose that $u$ is a $d \times 1$ vector such that $\|u\| = 1$. Which of the following matrices are always positive semi-definite, no matter what $A$, $B$ and $u$ are? Justify your answer.

1. $10A$
2. $A + B$
3. $u u^\top$
4. $A - B$
5. $I - u u^\top$ (Hint: write down $x^\top (I - u u^\top) x$ in terms of some dot products, and try using Cauchy-Schwarz.)

Solutions

A general strategy for solving this problem is to first try to prove that the matrix $M$ is positive semi-definite; if you fail, then try to find a counter-example to disprove the claim. For the latter, you need to find a specific vector $x$ for which $x^\top M x < 0$.

By the definition of positive semi-definite matrices, for all $d \times 1$ vectors $x$, $x^\top A x \ge 0$ and $x^\top B x \ge 0$.

1. For the matrix $10A$: for all $d \times 1$ vectors $x$, $x^\top (10A) x = 10 (x^\top A x) \ge 0$; thus it is positive semi-definite.

2. For the matrix $A + B$: for all $d \times 1$ vectors $x$, $x^\top (A + B) x = (x^\top A x) + (x^\top B x) \ge 0$, as both $x^\top A x$ and $x^\top B x$ are $\ge 0$. Thus it is positive semi-definite.

3. For the matrix $u u^\top$: for all $d \times 1$ vectors $x$, $x^\top (u u^\top) x = (x^\top u)(u^\top x) = \langle x, u \rangle \langle u, x \rangle = \langle x, u \rangle^2 \ge 0$; thus it is positive semi-definite.

4. The matrix $A - B$ is not always positive semi-definite. As a concrete counter-example, take $d = 2$,

    A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad B = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}, \quad \text{so that} \quad A - B = \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}.

There exists a $2 \times 1$ vector $x = [1, 0]^\top$ such that $x^\top (A - B) x = -1$, which proves that $A - B$ is in fact not positive semi-definite.

5. For the matrix $I - u u^\top$: for all $d \times 1$ vectors $x$,

    x^\top (I - u u^\top) x = x^\top x - \langle x, u \rangle^2.

Now applying Cauchy-Schwarz to $\langle x, u \rangle$ and using the fact that $\|u\| = 1$, we find that

    \langle x, u \rangle^2 \le \|x\|^2 \|u\|^2 = \|x\|^2 = x^\top x.

Thus, we conclude $x^\top (I - u u^\top) x \ge 0$. This establishes the fact that $I - u u^\top$ is positive semi-definite.
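As an optional numerical companion to this problem, the sketch below checks positive semi-definiteness via eigenvalues of symmetric matrices and reproduces the counter-example from part (4). The random constructions of $A$, $B$ and $u$ are illustrative assumptions, not part of the problem.

    import numpy as np

    def is_psd(M, tol=1e-10):
        # A symmetric matrix is PSD iff all of its eigenvalues are >= 0.
        return np.all(np.linalg.eigvalsh(M) >= -tol)

    rng = np.random.default_rng(2)
    d = 3

    # Random symmetric PSD matrices A = C C^T, B = D D^T and a unit vector u.
    C, D = rng.standard_normal((d, d)), rng.standard_normal((d, d))
    A, B = C @ C.T, D @ D.T
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    I = np.eye(d)

    print(is_psd(10 * A))              # True  (part 1)
    print(is_psd(A + B))               # True  (part 2)
    print(is_psd(np.outer(u, u)))      # True  (part 3)
    print(is_psd(I - np.outer(u, u)))  # True  (part 5)

    # Part 4 counter-example from the solution: A = I, B = 2I gives A - B = -I, not PSD.
    print(is_psd(np.eye(2) - 2 * np.eye(2)))  # False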

Problem 4 (10 points)

In class, we discussed how to define a norm or a length for a vector. It turns out that one can also define a norm or a length for a matrix. Two popular matrix norms are the Frobenius norm and the spectral norm. The Frobenius norm of an $m \times n$ matrix $A$, denoted by $\|A\|_F$, is defined as:

    \|A\|_F = \sqrt{ \sum_{i=1}^{m} \sum_{j=1}^{n} A_{ij}^2 }.

The spectral norm of an $m \times n$ matrix $A$, denoted by $\|A\|$, is defined as:

    \|A\| = \max_{x \ne 0} \frac{\|Ax\|}{\|x\|},

where $x$ is an $n \times 1$ vector.

1. Let $I$ be the $n \times n$ identity matrix. What is its Frobenius norm? What is its spectral norm? Justify your answer.
2. Suppose $A = u v^\top$, where $u$ is an $m \times 1$ vector and $v$ is an $n \times 1$ vector. Write down the Frobenius norm of $A$ as a function of $\|u\|$ and $\|v\|$. Justify your answer.
3. Write down the spectral norm of $A$ in terms of $\|u\|$ and $\|v\|$. Justify your answer.

Solutions

1. Since $I$ is an $n \times n$ identity matrix, it has $n$ elements along the diagonal which are 1, and all the remaining elements are 0. Therefore, the Frobenius norm of $I$ is given by

    \|I\|_F = \sqrt{n}.

The spectral norm of $I$ is given by

    \|I\| = \max_{x \ne 0} \frac{\|Ix\|}{\|x\|} = \max_{x \ne 0} \frac{\|x\|}{\|x\|} = 1.

2. Let $u = [u_1, u_2, \ldots, u_m]^\top$ and $v = [v_1, v_2, \ldots, v_n]^\top$. Since $A = u v^\top$,

    A = \begin{bmatrix} u_1 v_1 & u_1 v_2 & \cdots & u_1 v_n \\ u_2 v_1 & u_2 v_2 & \cdots & u_2 v_n \\ \vdots & \vdots & & \vdots \\ u_m v_1 & u_m v_2 & \cdots & u_m v_n \end{bmatrix}.

The Frobenius norm of $A$ is given by

    \|A\|_F = \sqrt{ \sum_{i=1}^{m} \sum_{j=1}^{n} u_i^2 v_j^2 } = \sqrt{ \Big( \sum_{i=1}^{m} u_i^2 \Big) \Big( \sum_{j=1}^{n} v_j^2 \Big) } = \sqrt{ \|u\|^2 \|v\|^2 } = \|u\| \, \|v\|.

3. In order to find the spectral norm of $A$, observe that for any $n \times 1$ vector $x$,

    \|Ax\| = \|u (v^\top x)\| = \|u\| \, |\langle v, x \rangle| = \|u\| \, \|v\| \, \|x\| \, |\cos\theta|,

where $\theta$ is the angle between $v$ and $x$. Here $|\cos\theta|$ attains its maximum value of 1 at $\theta = 0^\circ$ or $180^\circ$. Therefore,

    \|A\| = \|u\| \, \|v\|.
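The claims in all three parts can be confirmed with NumPy's built-in matrix norms; the short optional sketch below does so for random $u$ and $v$, which are illustrative choices only.

    import numpy as np

    rng = np.random.default_rng(3)
    m, n = 5, 4

    # Part 1: Frobenius and spectral norms of the identity.
    I = np.eye(n)
    print(np.isclose(np.linalg.norm(I, 'fro'), np.sqrt(n)))  # True
    print(np.isclose(np.linalg.norm(I, 2), 1.0))             # True

    # Parts 2 and 3: for A = u v^T, both norms equal ||u|| * ||v||.
    u, v = rng.standard_normal(m), rng.standard_normal(n)
    A = np.outer(u, v)
    expected = np.linalg.norm(u) * np.linalg.norm(v)
    print(np.isclose(np.linalg.norm(A, 'fro'), expected))  # True
    print(np.isclose(np.linalg.norm(A, 2), expected))      # True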