Definitions for Quizzes

Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does it need to be a part of your definition.]

1 Linear Equations

Definition 1. A matrix is in reduced row-echelon form (RREF) if it satisfies all of the following conditions:
a) If a row has nonzero entries, then the first nonzero entry is a 1, [called a leading 1].
b) If a column contains a leading 1, then all the other entries in that column are 0.
c) If a row contains a leading 1, then each row above it contains a leading 1 further to the left.

Definition 2 (1.3.2). The rank of a matrix A, written rank(A), is the number of leading 1's in rref(A).

Definition 3 (1.3.9). A vector b ∈ R^n is called a linear combination of the vectors v_1, ..., v_m in R^n if there exist scalars x_1, ..., x_m such that b = x_1 v_1 + ... + x_m v_m. [Note that Ax is a linear combination of the columns of A. By convention, 0 is considered to be the unique linear combination of the empty set of vectors.]

2 Linear Transformations

Definition 4 (2.1.1). A function T : R^m → R^n is called a linear transformation if there exists an n × m matrix A such that T(x) = Ax for all vectors x in R^m.

Definition 5. For a function T : X → Y, X is called the domain and Y is called the target.
A function T : X → Y is called one-to-one if for any y ∈ Y there is at most one input x ∈ X such that T(x) = y [(different inputs give different outputs)].
A function T : X → Y is called onto if for any y ∈ Y there is at least one input x ∈ X such that T(x) = y [(every target element is an output)].
A function T : X → Y is called invertible if for any y ∈ Y there is exactly one x ∈ X such that T(x) = y. [Note that a function is invertible if and only if it is both one-to-one and onto.]
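[Optional computational sketch, not part of the definitions: assuming Python with sympy is available, the following uses a made-up matrix A and vector b to illustrate Definitions 1-3 by computing rref(A) and rank(A) and testing whether b is a linear combination of the columns of A, i.e., whether Ax = b is consistent.]

```python
# Hypothetical example (not from the text): illustrate Definitions 1-3 with sympy.
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [2, 4, 3],
               [0, 0, 1]])

R, pivot_cols = A.rref()            # R = rref(A); pivot_cols locate the leading 1's
print(R)                            # reduced row-echelon form (Definition 1)
print(len(pivot_cols), A.rank())    # rank(A) = number of leading 1's (Definition 2)

# b is a linear combination of the columns of A iff Ax = b has a solution (Definition 3).
b = sp.Matrix([1, 3, 1])
x1, x2, x3 = sp.symbols('x1 x2 x3')
print(sp.linsolve((A, b), x1, x2, x3))   # nonempty solution set, so b lies in the span of the columns
```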

3 Subspaces of R^n and Their Dimensions

Definition 6 (3.1.1). The image of a function T : X → Y is its set of outputs: im(T) = {T(x) : x ∈ X}, [a subset of the target Y. Note that T is onto if and only if im(T) = Y.] [For a linear transformation T : R^m → R^n, the image im(T) = {T(x) : x ∈ R^m} is a subset of the target R^n.]

Definition 7 (3.1.2). The set of all linear combinations of the vectors v_1, ..., v_m in R^n is called their span: span(v_1, v_2, ..., v_m) = {c_1 v_1 + ... + c_m v_m : c_1, ..., c_m ∈ R}. [If span(v_1, v_2, ..., v_m) = W for some subset W of R^n, we say that the vectors v_1, ..., v_m span W. Thus span can be used as a noun or as a verb.]

Definition 8 (3.1.1). The kernel of a linear transformation T : R^m → R^n is its set of zeros: ker(T) = {x ∈ R^m : T(x) = 0}, [a subset of the domain R^m].

Definition 9 (3.2.6). A linear relation among the vectors v_1, ..., v_m ∈ R^n is an equation of the form c_1 v_1 + ... + c_m v_m = 0 for scalars c_1, ..., c_m ∈ R. [If c_1 = ... = c_m = 0, the relation is called trivial, while if at least one of the c_i is nonzero, the relation is nontrivial.]

Definition 10 (3.2.1). A subset W of a vector space R^n is called a subspace of R^n if it has the following three properties:
a) contains zero vector: 0 ∈ W.
b) closed under addition: If w_1, w_2 ∈ W, then w_1 + w_2 ∈ W.
c) closed under scalar multiplication: If w ∈ W and k ∈ R, then kw ∈ W.
[Property a is needed only to assure that W is nonempty. If W contains any vector w, then it also contains 0w = 0, by property c. Properties b and c are together equivalent to W being closed under linear combinations.]

Definition 11 (3.2.3). Let v_1, ..., v_m ∈ R^n.
a) A vector v_i in the list v_1, ..., v_m is redundant if it is a linear combination of the preceding vectors v_1, ..., v_{i-1}. [Note that v_1 is redundant if and only if it equals 0, the unique linear combination of the empty set of vectors.]
b) The vectors v_1, ..., v_m are called linearly independent (LI) if none of them are redundant. [Otherwise, they are linearly dependent (LD).]

c) The vectors v_1, ..., v_m form a basis of a subspace V of R^n if they span V and are linearly independent.

Definition 12 (3.3.3). The number of vectors in a basis of a subspace V of R^n is called the dimension of V, denoted dim(V).

Definition 13. The nullity of a matrix A, written nullity(A), is the dimension of the kernel of A.

Definition 14 (3.4.5). Given two n × n matrices A and B, we say that A is similar to B, abbreviated A ∼ B, if there exists an invertible matrix S such that AS = SB or, equivalently, B = S^{-1} A S.

4 Linear Spaces

Definition 15 (4.2.1). Let V and W be linear spaces. A function T : V → W is called a linear transformation if, for all f, g ∈ V and k ∈ R, T(f + g) = T(f) + T(g) and T(kf) = kT(f). [For a linear transformation T : V → W, we let im(T) = {T(f) : f ∈ V} and ker(T) = {f ∈ V : T(f) = 0}. Then im(T) is a subspace of the target W and ker(T) is a subspace of the domain V, so im(T) and ker(T) are each linear spaces.] [If the image of T is finite dimensional, then dim(im T) is called the rank of T, and if the kernel of T is finite dimensional, then dim(ker T) is called the nullity of T.]

Definition 16 (4.2.2). An invertible linear transformation T is called an isomorphism [(from the Greek for "same structure"). The linear space V is said to be isomorphic to the linear space W, written V ≅ W, if there exists an isomorphism T : V → W.]

5 Orthogonality and Least Squares

Definition 17 (5.1.1). Two vectors v, w ∈ R^n are called perpendicular or orthogonal if v · w = 0. A vector x ∈ R^n is orthogonal to a subspace V ⊆ R^n if x is orthogonal to all vectors v ∈ V.

Definition 18 (5.1.1). The length (or magnitude or norm) of a vector v ∈ R^n is ‖v‖ = √(v · v). A vector u ∈ R^n is called a unit vector if its length is 1 [(i.e., ‖u‖ = 1 or u · u = 1)].
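[Optional computational sketch for Definitions 6-14, again with a made-up matrix and assuming sympy: columnspace and nullspace give bases of the image and kernel, rank detects a redundant column, nullity(A) agrees with (number of columns) - rank(A) by the rank-nullity theorem, and a matrix B = S^{-1} A S is similar to A, equivalently AS = SB.]

```python
# Hypothetical example (not from the text): illustrate Definitions 6-14 with sympy.
import sympy as sp

A = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [1, 1, 2]])

print(A.columnspace())                          # basis of im(T) for T(x) = Ax (Definition 6)
print(A.nullspace())                            # basis of ker(T) (Definition 8); a nonzero vector here gives a nontrivial relation
print(A.rank() == A.cols)                       # False: the third column is redundant, so the columns are LD (Definition 11)
print(len(A.nullspace()), A.cols - A.rank())    # nullity(A) two ways (Definition 13): both equal 1

# Similarity (Definition 14): B = S^{-1} A S for an invertible S, equivalently AS = SB.
S = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 1]])
B = S.inv() * A * S
print(A * S == S * B)                           # True
```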

Definition 19 (5.1.2). The vectors u_1, ..., u_m ∈ R^n are called orthonormal if they are all unit vectors and all orthogonal to each other: u_i · u_j = 1 if i = j, and u_i · u_j = 0 if i ≠ j.

Definition 20 (5.1.12). [By the Cauchy-Schwarz inequality, |x · y| / (‖x‖ ‖y‖) ≤ 1], so we may define the angle between two nonzero vectors x, y ∈ R^n to be θ = arccos( (x · y) / (‖x‖ ‖y‖) ). [With this definition, we have the formula x · y = ‖x‖ ‖y‖ cos θ for the dot product in terms of the lengths of two vectors and the angle between them.]

Definition 21 (5.3.1). A linear transformation T : R^n → R^n is called orthogonal if it preserves the length of vectors: ‖T(x)‖ = ‖x‖ for all x ∈ R^n. [If T(x) = Ax is an orthogonal transformation, we say that A is an orthogonal matrix.]

Definition 22 (5.3.5). For an m × n matrix A, the transpose A^T of A is the n × m matrix whose ijth entry is the jith entry of A: [A^T]_ij = A_ji. [The rows of A become the columns of A^T, and the columns of A become the rows of A^T.] A square matrix A is symmetric if A^T = A and skew-symmetric if A^T = -A.

Definition 23 (5.4.4). Let A be an n × m matrix. Then a vector x* ∈ R^m is called a least-squares solution of the system Ax = b if the distance between Ax* and b is as small as possible: ‖b - Ax*‖ ≤ ‖b - Ax‖ for all x ∈ R^m.

Definition 24 (5.5.2). The norm (or magnitude) of an element f of an inner product space is ‖f‖ = √⟨f, f⟩. Two elements f, g of an inner product space are called orthogonal (or perpendicular) if ⟨f, g⟩ = 0. The distance between two elements of an inner product space is defined to be [the norm of their difference]: dist(f, g) = ‖f - g‖. The angle θ between two elements f, g of an inner product space is defined by the formula θ = cos⁻¹( ⟨f, g⟩ / (‖f‖ ‖g‖) ).
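[Optional numerical sketch for Definitions 19-23, with made-up data and assuming numpy: a rotation matrix has orthonormal columns and preserves length, and a least-squares solution x* of an inconsistent system Ax = b can be computed numerically; it satisfies the normal equations A^T A x* = A^T b (a standard fact, not stated in the definitions above).]

```python
# Hypothetical example (not from the text): illustrate Definitions 19-23 with numpy.
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])               # rotation matrix: orthogonal, det Q = 1
print(np.allclose(Q.T @ Q, np.eye(2)))                        # columns of Q are orthonormal (Definition 19)

x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # ||Qx|| = ||x|| (Definition 21)

# Least-squares solution (Definition 23) for an inconsistent system Ax = b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
x_star, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(x_star)                                                 # minimizes ||b - Ax|| over all x
print(np.allclose(A.T @ A @ x_star, A.T @ b))                 # x_star satisfies the normal equations
```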

6 Determinants

Definition 25 (6.3.2). An orthogonal matrix A with det A = 1 is called a rotation matrix, [and the linear transformation T(x) = Ax is called a rotation].

Definition 26. The m-parallelepiped defined by the vectors v_1, ..., v_m ∈ R^n is the set of all vectors in R^n of the form c_1 v_1 + ... + c_m v_m, where 0 ≤ c_i ≤ 1. [A 2-parallelepiped is also called a parallelogram.] The m-volume V(v_1, ..., v_m) of this m-parallelepiped is defined to be V(v_1, ..., v_m) = ‖v_1‖ ‖v_2^⊥‖ ⋯ ‖v_m^⊥‖, [where v_i^⊥ is the component of v_i perpendicular to span(v_1, ..., v_{i-1})]. [In the case m = n, this is just |det A|, where A is the square matrix with columns v_1, ..., v_n ∈ R^n.]

7 Eigenvalues and Eigenvectors

Definition 27 (7.1.1). Let A be an n × n matrix. A nonzero vector v ∈ R^n is called an eigenvector of A if [Av is a scalar multiple of v, i.e.,] Av = λv for some scalar λ. [The scalar λ is called the eigenvalue of A associated with the eigenvector v. We sometimes call v a λ-eigenvector.]

Definition 28 (7.2.6). An eigenvalue λ_0 of a square matrix A has algebraic multiplicity k if [λ_0 is a root of multiplicity k of the characteristic polynomial f_A(λ), meaning that we can write] f_A(λ) = (λ_0 - λ)^k g(λ) for some polynomial g(λ) with g(λ_0) ≠ 0. [We write AM(λ_0) = k.]

Definition 29 (7.3.1). Let λ be an eigenvalue of an n × n matrix A. The λ-eigenspace of A, denoted E_λ, is defined to be E_λ = ker(A - λI_n) [or] = {v ∈ R^n : Av = λv} [= {λ-eigenvectors of A} ∪ {0}].

Definition 30 (7.3.2). The dimension of the λ-eigenspace E_λ = ker(A - λI_n) is called the geometric multiplicity of λ, [written GM(λ). We have GM(λ) = dim(E_λ) = dim(ker(A - λI_n)) = nullity(A - λI_n) = n - rank(A - λI_n)].
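[Optional computational sketch for Definitions 27-30, with a made-up shear matrix and assuming sympy: the characteristic polynomial factors as (λ - 1)^2, so the eigenvalue 1 has algebraic multiplicity 2, while the 1-eigenspace ker(A - 1·I_2) is only one-dimensional, so GM(1) = 1 < AM(1) = 2.]

```python
# Hypothetical example (not from the text): illustrate Definitions 27-30 with sympy.
import sympy as sp

A = sp.Matrix([[1, 1],
               [0, 1]])                         # a shear matrix

lam = sp.symbols('lambda')
print(sp.factor(A.charpoly(lam).as_expr()))     # (lambda - 1)**2, so AM(1) = 2 (Definition 28)

E1 = (A - 1 * sp.eye(2)).nullspace()            # basis of the 1-eigenspace E_1 (Definition 29)
print(E1)                                       # a single basis vector, [1, 0]^T
print(len(E1))                                  # GM(1) = dim(E_1) = 1 (Definition 30)
```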

Definition 31 (7.3.3). Let A be an n × n matrix. A basis of R^n consisting of eigenvectors of A is called an eigenbasis for A.

Definition 32 (7.4.2). Consider a linear transformation T : R^n → R^n given by T(x) = Ax. T is called diagonalizable if there exists a basis D of R^n such that the D-matrix of T is diagonal. A is called diagonalizable if A is similar to some diagonal matrix D, i.e., if there exists an invertible matrix S such that S^{-1} A S is diagonal.
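[Optional computational sketch for Definitions 31-32, with a made-up matrix and assuming sympy: diagonalize returns an invertible S whose columns form an eigenbasis for A and a diagonal matrix D with S^{-1} A S = D.]

```python
# Hypothetical example (not from the text): illustrate Definitions 31-32 with sympy.
import sympy as sp

A = sp.Matrix([[4, 1],
               [2, 3]])

S, D = A.diagonalize()              # raises an error if A is not diagonalizable
print(S)                            # columns of S form an eigenbasis for A (Definition 31)
print(D)                            # diagonal matrix of the corresponding eigenvalues
print(S.inv() * A * S == D)         # True: A is similar to the diagonal matrix D (Definition 32)
```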