Linear algebra review


EE263 Autumn 2015, S. Boyd and S. Lall

Linear algebra review

- vector space, subspaces
- independence, basis, dimension
- nullspace and range
- left and right invertibility

Vector spaces

A vector space or linear space (over the reals) consists of

- a set V
- a vector sum +: V × V → V
- a scalar multiplication: R × V → V
- a distinguished element 0 ∈ V

which satisfy a list of properties.

Vector space axioms

- x + y = y + x for all x, y ∈ V (+ is commutative)
- (x + y) + z = x + (y + z) for all x, y, z ∈ V (+ is associative)
- 0 + x = x for all x ∈ V (0 is the additive identity)
- for every x ∈ V there exists (−x) ∈ V such that x + (−x) = 0 (existence of additive inverse)
- (αβ)x = α(βx) for all α, β ∈ R and x ∈ V (scalar multiplication is associative)
- α(x + y) = αx + αy for all α ∈ R and x, y ∈ V (right distributive rule)
- (α + β)x = αx + βx for all α, β ∈ R and x ∈ V (left distributive rule)
- 1x = x for all x ∈ V (1 is the multiplicative identity)

Examples

- V1 = R^n, with standard (componentwise) vector addition and scalar multiplication
- V2 = {0} (where 0 ∈ R^n)
- V3 = span(v_1, v_2, ..., v_k), where v_1, ..., v_k ∈ R^n and

    span(v_1, v_2, ..., v_k) = { α_1 v_1 + ... + α_k v_k | α_i ∈ R }

Subspaces

A subspace of a vector space is a subset of a vector space which is itself a vector space. Roughly speaking, a subspace is a subset closed under vector addition and scalar multiplication.

Examples: V1, V2, V3 above are subspaces of R^n.

Vector spaces of functions

V4 = { x : R+ → R^n | x is differentiable }, where vector sum is the sum of functions, (x + z)(t) = x(t) + z(t), and scalar multiplication is defined by (αx)(t) = αx(t). (A point in V4 is a trajectory in R^n.)

V5 = { x ∈ V4 | ẋ = Ax } (points in V5 are trajectories of the linear system ẋ = Ax)

V5 is a subspace of V4.
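As a concrete check (not in the original slides), the sketch below verifies that trajectories of ẋ = Ax are closed under addition, using the matrix exponential x(t) = e^{At} x(0); the matrix A and the initial conditions are illustrative assumptions:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0., 1.],
              [-1., 0.]])                 # illustrative system matrix
x0, z0 = np.array([1., 0.]), np.array([0., 2.])

t = 0.7
x_t = expm(A * t) @ x0                    # trajectory of xdot = Ax starting at x0
z_t = expm(A * t) @ z0                    # trajectory starting at z0
sum_t = expm(A * t) @ (x0 + z0)           # trajectory starting at x0 + z0
assert np.allclose(x_t + z_t, sum_t)      # the sum of trajectories is a trajectory
```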

(Euclidean) norm

For x ∈ R^n we define the (Euclidean) norm as

    ||x|| = sqrt(x_1^2 + x_2^2 + ... + x_n^2) = sqrt(x^T x)

||x|| measures the length of the vector (from the origin). Important properties:

- ||αx|| = |α| ||x|| (homogeneity)
- ||x + y|| <= ||x|| + ||y|| (triangle inequality)
- ||x|| >= 0 (nonnegativity)
- ||x|| = 0 ⟺ x = 0 (definiteness)
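A minimal numpy sketch (added here, with arbitrary example vectors) checking the definition and two of the properties:

```python
import numpy as np

x = np.array([3.0, 4.0])     # example vector (illustrative)
y = np.array([1.0, -2.0])

norm_x = np.linalg.norm(x)   # sqrt(3^2 + 4^2) = 5.0
assert np.isclose(norm_x, np.sqrt(x @ x))                    # ||x|| = sqrt(x^T x)
assert np.isclose(np.linalg.norm(2.5 * x), 2.5 * norm_x)     # homogeneity
assert np.linalg.norm(x + y) <= norm_x + np.linalg.norm(y)   # triangle inequality
```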

RMS value and (Euclidean) distance

Root-mean-square (RMS) value of a vector x ∈ R^n:

    rms(x) = ( (1/n) Σ_{i=1}^n x_i^2 )^{1/2} = ||x|| / sqrt(n)

The norm defines the distance between vectors:

    dist(x, y) = ||x − y||

(figure: vectors x and y, with dist(x, y) the length of x − y)
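A short numerical illustration (example vectors are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 2.0, 2.0, 4.0])

rms = np.linalg.norm(x) / np.sqrt(len(x))    # rms(x) = ||x|| / sqrt(n)
assert np.isclose(rms, np.sqrt(np.mean(x**2)))
dist = np.linalg.norm(x - y)                 # dist(x, y) = ||x - y||
```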

Independent set of vectors

A set of vectors {v_1, v_2, ..., v_k} is independent if

    α_1 v_1 + α_2 v_2 + ... + α_k v_k = 0  ⟹  α_1 = α_2 = ... = α_k = 0

Some equivalent conditions:

- the coefficients of α_1 v_1 + ... + α_k v_k are uniquely determined, i.e.,
  α_1 v_1 + ... + α_k v_k = β_1 v_1 + ... + β_k v_k
  implies α_1 = β_1, α_2 = β_2, ..., α_k = β_k
- no vector v_i can be expressed as a linear combination of the other vectors v_1, ..., v_{i−1}, v_{i+1}, ..., v_k
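Numerically, {v_1, ..., v_k} is independent exactly when the matrix with columns v_i has rank k; a quick sketch (example vectors are illustrative):

```python
import numpy as np

def is_independent(vectors):
    """True if the given vectors are linearly independent."""
    V = np.column_stack(vectors)              # stack the vectors as columns
    return np.linalg.matrix_rank(V) == len(vectors)

v1, v2, v3 = np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([1., 1., 0.])
print(is_independent([v1, v2]))       # True
print(is_independent([v1, v2, v3]))   # False: v3 = v1 + v2
```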

Basis and dimension

A set of vectors {v_1, v_2, ..., v_k} is called a basis for a vector space V if

- V = span(v_1, v_2, ..., v_k), and
- {v_1, v_2, ..., v_k} is independent

Equivalently, every v ∈ V can be uniquely expressed as v = α_1 v_1 + ... + α_k v_k.

For a given vector space V, the number of vectors in any basis is the same; this number is called the dimension of V, denoted dim V.
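For subspaces of R^n, the dimension of span(v_1, ..., v_k) equals the rank of the matrix with columns v_i; a one-line check (vectors are illustrative):

```python
import numpy as np

v1, v2, v3 = np.array([1., 0., 1.]), np.array([0., 1., 1.]), np.array([1., 1., 2.])
V = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(V))   # 2: v3 = v1 + v2, so {v1, v2} is a basis of the span
```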

Nullspace of a matrix

The nullspace of A ∈ R^{m×n} is defined as

    null(A) = { x ∈ R^n | Ax = 0 }

- null(A) is the set of vectors mapped to zero by y = Ax
- null(A) is the set of vectors orthogonal to all rows of A

null(A) gives the ambiguity in x given y = Ax:

- if y = Ax and z ∈ null(A), then y = A(x + z)
- conversely, if y = Ax and y = Ax̃, then x̃ = x + z for some z ∈ null(A)

null(A) is also written N(A).
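A common way to compute an orthonormal basis of null(A) is from the SVD: the right singular vectors beyond the rank span the nullspace. A sketch using only numpy (the example matrix and tolerance are illustrative assumptions):

```python
import numpy as np

def nullspace_basis(A, tol=1e-12):
    """Orthonormal basis for null(A), from the right singular vectors of A."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T                # columns span null(A)

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])         # rank 1, so nullity is 2
Z = nullspace_basis(A)
assert np.allclose(A @ Z, 0)         # every basis vector is mapped to zero
```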

Zero nullspace

A is called one-to-one if 0 is the only element of its nullspace: null(A) = {0}. Equivalent conditions:

- x can always be uniquely determined from y = Ax (i.e., the linear transformation y = Ax doesn't lose information)
- the mapping from x to Ax is one-to-one: different x's map to different y's
- the columns of A are independent (hence, a basis for their span)
- A has a left inverse, i.e., there is a matrix B ∈ R^{n×m} such that BA = I
- A^T A is invertible
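When the columns of A are independent, one standard choice of left inverse is B = (A^T A)^{-1} A^T. A sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])             # 3x2 with independent columns
assert np.linalg.matrix_rank(A) == A.shape[1]   # null(A) = {0}

B = np.linalg.inv(A.T @ A) @ A.T    # one particular left inverse
assert np.allclose(B @ A, np.eye(2))            # BA = I

x = np.array([2., -1.])
y = A @ x
assert np.allclose(B @ y, x)        # x recovered exactly from y = Ax
```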

Two interpretations of nullspace

Suppose z ∈ null(A).

If y = Ax represents a measurement of x:

- z is undetectable from the sensors: it gives zero sensor readings
- x and x + z are indistinguishable from the sensors: Ax = A(x + z)
- null(A) characterizes the ambiguity in x from the measurement y = Ax

Alternatively, if y = Ax represents the output resulting from input x:

- z is an input with no result
- x and x + z have the same result
- null(A) characterizes the freedom of input choice for a given result

Left invertibility and estimation

    x → A → y → B → x̂

Apply the left inverse B at the output of A; then the estimate is x̂ = BAx = x, as desired.

Left inverses are not unique: both B and C are left inverses of A, where

    A = [ 1  0 ]      B = [ 1  0  0 ]      C = [ 0.5  0  0.5 ]
        [ 0  1 ]          [ 0  1  0 ]          [ 0    1  0   ]
        [ 1  0 ]
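The matrices on this slide were scrambled in the transcription; the reconstruction above is one consistent reading of the fragments, and the check below confirms that both B and C are indeed left inverses of it:

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 0.]])
B = np.array([[1., 0., 0.],
              [0., 1., 0.]])
C = np.array([[0.5, 0., 0.5],
              [0.,  1., 0. ]])

assert np.allclose(B @ A, np.eye(2))   # BA = I
assert np.allclose(C @ A, np.eye(2))   # CA = I: left inverses are not unique
```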

Range of a matrix

The range of A ∈ R^{m×n} is defined as

    range(A) = { Ax | x ∈ R^n } ⊆ R^m

range(A) can be interpreted as

- the set of vectors that can be "hit" by the linear mapping y = Ax
- the span of the columns of A
- the set of vectors y for which Ax = y has a solution

range(A) is also written R(A).
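Numerically, y ∈ range(A) exactly when Ax = y has a solution, which can be tested by checking the least-squares residual; the tolerance and example matrix below are illustrative assumptions:

```python
import numpy as np

def in_range(A, y, tol=1e-10):
    """True if Ax = y has a solution, i.e., y is in range(A)."""
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(A @ x - y) < tol

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])                      # range(A) is a plane in R^3
print(in_range(A, np.array([1., 2., 3.])))    # True:  3 = 1 + 2
print(in_range(A, np.array([1., 2., 0.])))    # False: not on the plane
```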

Onto matrices

A is called onto if range(A) = R^m. Equivalent conditions:

- Ax = y can be solved in x for any y
- the columns of A span R^m
- A has a right inverse, i.e., there is a matrix B ∈ R^{n×m} such that AB = I
- the rows of A are independent
- null(A^T) = {0}
- AA^T is invertible
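When the rows of A are independent, one standard choice of right inverse is B = A^T (AA^T)^{-1}. A sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.]])          # 2x3 with independent rows
assert np.linalg.matrix_rank(A) == A.shape[0]   # A is onto: range(A) = R^2

B = A.T @ np.linalg.inv(A @ A.T)      # one particular right inverse
assert np.allclose(A @ B, np.eye(2))  # AB = I

y = np.array([3., -1.])
x = B @ y
assert np.allclose(A @ x, y)          # Ax = y is solvable for any y
```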

Interpretations of range

Suppose v ∈ range(A) and w ∉ range(A).

If y = Ax represents a measurement of x:

- y = v is a possible or consistent sensor signal
- y = w is impossible or inconsistent; the sensors have failed or the model is wrong

If y = Ax represents the output resulting from input x:

- v is a possible result or output
- w cannot be a result or output
- range(A) characterizes the possible results or achievable outputs

Right invertibility and control

    y_des → C → x → A → y

Apply the right inverse C at the input of A; then the output is y = ACy_des = y_des, as desired.

Inverse

A ∈ R^{n×n} is invertible or nonsingular if it has both a left and a right inverse. Equivalent conditions:

- the columns of A are a basis for R^n
- the rows of A are a basis for R^n
- y = Ax has a unique solution x for every y ∈ R^n
- null(A) = {0}
- range(A) = R^n
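These conditions can be checked numerically at once, since for square A they all reduce to rank(A) = n; the example matrix is illustrative:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])                       # illustrative nonsingular matrix
n = A.shape[0]
assert np.linalg.matrix_rank(A) == n           # columns (and rows) form a basis

A_inv = np.linalg.inv(A)
y = np.array([1., 3.])
x = np.linalg.solve(A, y)                      # unique solution of Ax = y
assert np.allclose(x, A_inv @ y)
```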

Inverse

If a matrix A has both a left inverse and a right inverse, then they are equal:

    BA = I and AC = I  ⟹  B = B(AC) = (BA)C = C

Hence if A is invertible then the inverse is unique, and AA^{-1} = A^{-1}A = I.

Interpretations of inverse

Suppose A ∈ R^{n×n} has inverse B = A^{-1}.

- the mapping associated with B undoes the mapping associated with A (applied either before or after!)
- x = By is a perfect (pre- or post-) equalizer for the channel y = Ax
- x = By is the unique solution of Ax = y

Dual basis interpretation

Let a_i be the columns of A, and b_i^T the rows of B = A^{-1}. From y = x_1 a_1 + ... + x_n a_n and x_i = b_i^T y, we get

    y = Σ_{i=1}^n (b_i^T y) a_i

Thus, inner products with the rows of the inverse matrix give the coefficients in the expansion of a vector in the columns of the matrix. {b_1, ..., b_n} and {a_1, ..., a_n} are called dual bases.
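A quick numerical confirmation of the expansion (the matrix and vector are illustrative):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])               # columns a_i form a basis
B = np.linalg.inv(A)                   # rows b_i^T are the dual basis

y = np.array([3., 5.])
coeffs = B @ y                         # x_i = b_i^T y
y_rebuilt = A @ coeffs                 # y = sum_i (b_i^T y) a_i
assert np.allclose(y_rebuilt, y)
```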

Change of coordinates

The standard basis vectors in R^n are (e_1, e_2, ..., e_n), where e_i has a 1 in the ith component and zeros elsewhere.

Obviously we have

    x = x_1 e_1 + x_2 e_2 + ... + x_n e_n

The x_i are called the coordinates of x (in the standard basis).

If (t_1, t_2, ..., t_n) is another basis for R^n, we have

    x = x̃_1 t_1 + x̃_2 t_2 + ... + x̃_n t_n

where the x̃_i are the coordinates of x in the basis (t_1, t_2, ..., t_n). Defining T = [t_1 t_2 ... t_n], we then have x = T x̃ and x̃ = T^{-1} x.
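Finding the coordinates x̃ amounts to solving T x̃ = x; a minimal sketch with an illustrative basis:

```python
import numpy as np

T = np.array([[1., 1.],
              [0., 1.]])               # columns t_i: another basis for R^2
x = np.array([3., 2.])                 # coordinates in the standard basis

x_tilde = np.linalg.solve(T, x)        # coordinates of x in the basis (t_1, t_2)
assert np.allclose(T @ x_tilde, x)     # x = T x~
```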

Similarity transformation

Consider the linear transformation y = Ax, with A ∈ R^{n×n}. Express y and x in terms of t_1, t_2, ..., t_n, so x = T x̃ and y = T ỹ. Then

    ỹ = (T^{-1} A T) x̃

A ↦ T^{-1} A T is called a similarity transformation. The similarity transformation by T expresses the linear transformation y = Ax in the coordinates t_1, t_2, ..., t_n.
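A numerical check of this identity (A and T below are illustrative assumptions):

```python
import numpy as np

A = np.array([[0., 1.],
              [-2., -3.]])             # linear map in standard coordinates
T = np.array([[1., 1.],
              [-1., -2.]])             # change-of-basis matrix, columns t_i

A_tilde = np.linalg.inv(T) @ A @ T     # same map expressed in t-coordinates

x = np.array([1., 1.])
x_tilde = np.linalg.solve(T, x)        # x~ = T^-1 x
y_tilde = np.linalg.solve(T, A @ x)    # coordinates of y = Ax in the new basis
assert np.allclose(A_tilde @ x_tilde, y_tilde)   # y~ = (T^-1 A T) x~
```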