The Four Fundamental Subspaces
Philippa Julia Bridges
Introduction

Each m × n matrix has, associated with it, four subspaces: two in R^m and two in R^n. To understand their relationships is one of the most basic questions in linear algebra. What we want to do, first, is to define the subspaces in terms of the row vectors and the column vectors of the given matrix, and then to find descriptions of them. This means that we want to find a basis for each, i.e., a linearly independent set of vectors in terms of which each vector in the subspace can be written uniquely as a linear combination of the basis elements. Of course, since the dimension of a vector space is the number of elements in a basis, we will have, in each case, the dimension of the subspace.

The key tool used to answer all these questions is the reduction of a matrix to row-echelon form. All the basic information on the subspaces is encoded in the reduced matrix, sometimes directly, sometimes indirectly, as we shall see. In the following, we will use a particular matrix A, with row-echelon form U, to illustrate the discussion. Before proceeding, you should verify the reduction.

The Four Subspaces

1 The Row Space of A

This subspace is just the subspace spanned by the rows of the matrix A. Since each elementary row operation that reduces A to its row-echelon form U either simply interchanges rows or forms linear combinations of the existing rows, the subspace spanned by the rows of A is exactly the same as the subspace spanned by the rows of U. Now each non-zero row of the row-echelon form contains a leading entry. Suppose, for example, that the leading entry of the i-th row occurs in the j-th column. Then all entries below that leading entry are zero (u_{lj} = 0 for l > i). It follows that the non-zero rows of the row-echelon form are linearly independent.
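Since the entries of the notes' running example are not reproduced here, the following minimal sketch uses an illustrative 3 × 4 matrix of our own choosing; the helper `row_echelon` is likewise only a sketch of the reduction process described above, using exact rational arithmetic so that no rounding obscures the zero rows.

```python
from fractions import Fraction

def row_echelon(rows):
    """Reduce a matrix (list of lists) to row-echelon form by
    elementary row operations, using exact Fraction arithmetic."""
    M = [[Fraction(x) for x in r] for r in rows]
    m, n = len(M), len(M[0])
    piv_row = 0
    for col in range(n):
        # find a row at or below piv_row with a nonzero entry in this column
        pivot = next((r for r in range(piv_row, m) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[piv_row], M[pivot] = M[pivot], M[piv_row]   # interchange rows
        for r in range(piv_row + 1, m):               # eliminate below the pivot
            factor = M[r][col] / M[piv_row][col]
            M[r] = [a - factor * b for a, b in zip(M[r], M[piv_row])]
        piv_row += 1
    return M

# Illustrative 3 x 4 matrix (a stand-in, not the notes' own example)
A = [[1, 3, 3, 2],
     [2, 6, 9, 7],
     [-1, -3, 3, 4]]
U = row_echelon(A)
nonzero = [row for row in U if any(row)]
print(len(nonzero))  # 2 nonzero rows: the rank r, and a basis for the row space
```

The nonzero rows of `U` span the same row space as the rows of `A`, and counting them gives the rank.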
In our concrete example, the first and second rows of U are linearly independent: if c_1 u_1 + c_2 u_2 = 0, then the column containing the first pivot forces c_1 = 0, and the column containing the second pivot then forces c_2 = 0. The same argument works for a general row-echelon form: each non-zero row has a pivot π in a column in which every later row has a zero entry, so the non-zero rows are independent, just as in the concrete example. Obviously, there can be at most m non-zero rows in the row-echelon form. The number of these non-zero rows is called the rank of the matrix A; it is often denoted r. Since these non-zero rows are linearly independent and span the row space, the number of non-zero rows is the dimension of the subspace. In other words, the dimension of the row space is just r, and r ≤ m.

2 The Null Space of A

The null space of the matrix A, usually denoted by N(A), is defined as

    N(A) = { x ∈ R^n : Ax = 0 }.

Otherwise said, it is just the set of all solutions of the linear homogeneous equation Ax = 0. On the other hand, since the operations of putting a matrix in row-echelon form simply replace the original equations with an equivalent system, the set of solutions does not change; i.e., the set of solutions to Ax = 0 is exactly the
same as the set of solutions of Ux = 0. Each of the r non-zero rows of the row-echelon form U has a leading entry, and the columns containing those leading entries correspond to the basic variables; the other n − r columns correspond to the free variables, and these free variables can be chosen arbitrarily. So we may successively assign a value of 1 to a particular free variable, zero to the others, and solve Ux = 0 by back substitution. The n − r vectors so obtained then form a basis for N(A), and dim N(A) = n − r. The number n − r is sometimes called the nullity of A; N(A) is also called the kernel of A and is written ker(A). Notice the crucial fact that

    rank(A) + nullity(A) = n.

Example. Consider A with its row-echelon form U, as above. The columns of U which contain a leading entry correspond to x_1 and x_3, so that the free variables are x_2 and x_4. Set x_2 = 1, x_4 = 0 and back-substitute in Ux = 0 to obtain one solution; setting x_2 = 0, x_4 = 1 and back-substituting gives the other. These two vectors are linearly independent and form a basis for N(A). Here we see that dim N(A) = 2, rank(A) = 2, and 2 + 2 = 4!

3 The Column Space of A

The column space of A is also called the range of A (which it is if we consider the map x ↦ Ax!) and is usually denoted R(A). Our object is to find a basis for R(A) as well as its dimension. To do this, we must realize at the outset that the space spanned by the columns of A and that spanned by the columns of U, its row-echelon form, are not the same. While the elementary row operations leave the row space and null space unchanged, the columns are significantly altered. In our running example, a suitable combination of the first and third columns of A produces a vector with a non-zero third component. (It is unfortunate that "row space" starts with the letter r, as does the term "rank"; one must be careful in using this letter!)
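To make the counting concrete, here is a sketch using an illustrative 3 × 4 matrix of rank 2 as a stand-in for the notes' example; its entries and the two special solutions below were worked out by hand for this particular stand-in, one special solution per free variable.

```python
# Illustrative 3 x 4 matrix of rank 2 (a stand-in for the notes' example):
# the free variables are x2 and x4, so dim N(A) = 4 - 2 = 2.
A = [[1, 3, 3, 2],
     [2, 6, 9, 7],
     [-1, -3, 3, 4]]

def matvec(M, x):
    """Multiply matrix M by vector x."""
    return [sum(a * b for a, b in zip(row, x)) for row in M]

# Special solutions: set one free variable to 1, the other to 0,
# and back-substitute (done by hand for this matrix).
s1 = [-3, 1, 0, 0]   # x2 = 1, x4 = 0
s2 = [1, 0, -1, 1]   # x2 = 0, x4 = 1

print(matvec(A, s1), matvec(A, s2))  # [0, 0, 0] [0, 0, 0]
```

Both products are zero, so s1 and s2 lie in N(A); being independent and n − r = 2 in number, they form a basis, confirming rank(A) + nullity(A) = 4.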
No linear combination of the columns of U, however, can produce a vector with a non-zero third component, since the third row of U is zero! The key observation we need to make is that the homogeneous equation Ax = 0 expresses a linear dependence relation between the columns of A whose coefficients are just the components of the vector x. But since Ax = 0 if and only if Ux = 0, each dependence relation on the columns of A is matched by a linear dependence relation on the columns of U with exactly the same coefficients! Again, going back to the example, the last column of A is dependent on the first and third columns of A, while the last column of U is dependent on the first and third columns of U with the same coefficients.

Now, to find a basis for R(A) we first find a basis for R(U), which is easy: the basis for R(U) is formed just by the columns that contain a pivot (or, what is the same thing, the columns corresponding to the basic variables); there will be r of them. We then choose the columns of the original matrix A corresponding to the columns of U that contain the pivots. It is important to notice that the dimension of R(A) is just the rank of A, and that is exactly the dimension of the row space of A. Otherwise said, the number of independent columns is exactly the number of independent rows of A. This leads to the statement, which is not at all obvious for, say, a large rectangular matrix, that the column rank (the number of linearly independent columns of A) is exactly the row rank of A! So we have another crucial relation:
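The "same coefficients" claim can be checked directly. The sketch below again uses our illustrative stand-in matrix and its hand-computed echelon form; for this pair, the fourth column happens to equal −1 times the first column plus the third column, in A and in U alike.

```python
# Same dependence relation among the columns of A and of its echelon form U
# (illustrative stand-in matrix; U computed from A by row reduction).
A = [[1, 3, 3, 2], [2, 6, 9, 7], [-1, -3, 3, 4]]
U = [[1, 3, 3, 2], [0, 0, 3, 3], [0, 0, 0, 0]]

def col(M, j):
    """Extract column j of matrix M as a list."""
    return [row[j] for row in M]

for M in (A, U):
    c1, c3, c4 = col(M, 0), col(M, 2), col(M, 3)
    # column 4 = (-1) * column 1 + (1) * column 3, with the SAME coefficients
    assert all(x == -a + b for x, a, b in zip(c4, c1, c3))

print("column 4 depends on columns 1 and 3 with identical coefficients in A and U")
```

The pivot columns (here the first and third) of U identify which columns of A to keep as a basis for R(A).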
    row rank(A) = column rank(A).

In particular, for a square n × n matrix, if the rows are linearly independent, then so are the columns. To see how this fits with our running example: it is the first and third columns that contain the pivots of U, and hence a basis for R(A) is formed by the first and third columns of A.

4 The Null Space of A^T

The columns of A^T are the rows of A, and we can take the transpose of A^T y = 0 to get y^T A = 0^T (a row equation, with y^T a row vector). The vector y is sometimes called a left null vector of A, and the null space of A^T is then called the left null space of A. We use this terminology below. Clearly the product y^T A is a linear combination of the rows of A and produces a row vector.

Now the dimension of N(A^T) is easy to find. Indeed, for any matrix, the number of basic variables plus the number of free variables is just the number of columns of the matrix. In light of what we have just done, this means that, for an m × n matrix, we see once again that

    rank(A) + nullity(A) = dim R(A) + dim N(A) = n.

Now apply this formula to the matrix A^T, which has m columns. Here the row rank is the same as the column rank, and that is r; hence r + dim N(A^T) = m, and so dim N(A^T) = m − r.

So much for the question of the dimension of the null space of A^T. What about finding a basis? Suppose that we have an LU-decomposition in which some row interchanges must be done; then we have the factorization PA = LU, which can be written as L^{-1}PA = U. Let the last m − r rows of U be the zero rows. Then the last m − r rows of L^{-1}P are a basis for the left null space.
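As a quick check of the dimension count, here m = 3 and r = 2, so the left null space of our illustrative stand-in matrix is one-dimensional. The left null vector below was found by hand by tracking which combination of rows produces the zero row of U; it is specific to this stand-in.

```python
# Left null space of the illustrative 3 x 4 stand-in matrix:
# m - r = 3 - 2 = 1, so a single vector y spans N(A^T).
A = [[1, 3, 3, 2],
     [2, 6, 9, 7],
     [-1, -3, 3, 4]]
y = [5, -2, 1]  # found by hand: 5*(row 1) - 2*(row 2) + 1*(row 3) = 0

# y^T A combines the rows of A; the result should be the zero row vector.
yTA = [sum(y[i] * A[i][j] for i in range(3)) for j in range(4)]
print(yTA)  # [0, 0, 0, 0]
```

This is exactly the statement that y is a left null vector: the row vector y^T annihilates A from the left.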
5 Summary

The foregoing is sometimes put together in the form of a theorem.

Theorem (Fundamental Theorem of Linear Algebra, Part I).
(a) R(A^T) = row space of A has dimension r.
(b) N(A) = null space of A has dimension n − r.
(c) R(A) = column space of A has dimension r.
(d) N(A^T) = null space of A^T has dimension m − r.

Orthogonality

Recall that orthogonality of vectors is defined in terms of an inner product. Here, we take the usual dot product of two vectors as the inner product and say that two vectors x, y ∈ R^n are orthogonal provided

    ⟨x, y⟩ = Σ_{i=1}^{n} x_i y_i = 0.

We note that the zero vector is orthogonal to all vectors in R^n. The associated norm (or length) is then given by ‖x‖ = √⟨x, x⟩. If we look at the norm of x − y and expand in terms of the inner product, we find

    ‖x − y‖² = ⟨x − y, x − y⟩ = ⟨x, x⟩ − ⟨x, y⟩ − ⟨y, x⟩ + ⟨y, y⟩ = ‖x‖² − 2⟨x, y⟩ + ‖y‖².

This means that the Pythagorean Theorem, ‖x − y‖² = ‖x‖² + ‖y‖², holds if and only if ⟨x, y⟩ = 0. It also means that the inner product can be used to give a definition of the angle between two vectors. To see this, we start with the Law of Cosines, which says that if the vectors x and y form two sides of a triangle with included angle θ, then the third side is x − y and

    ‖x‖² + ‖y‖² − 2‖x‖‖y‖ cos θ = ‖x − y‖².

But, from the expression for ‖x − y‖² computed above, we find that ‖x‖‖y‖ cos θ = ⟨x, y⟩, or

    ⟨x, y⟩ = ‖x‖‖y‖ cos θ.
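The Pythagorean identity and the angle formula are easy to verify numerically. The pair of vectors below is our own illustrative choice; they are orthogonal by construction, so the cross term 2⟨x, y⟩ vanishes and the angle comes out as a right angle.

```python
import math

def inner(x, y):
    """Usual dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    """Euclidean norm ||x|| = sqrt(<x, x>)."""
    return math.sqrt(inner(x, x))

x = [3.0, 4.0, 0.0]
y = [4.0, -3.0, 5.0]   # chosen so that <x, y> = 12 - 12 + 0 = 0

d = [a - b for a, b in zip(x, y)]
lhs = inner(d, d)                  # ||x - y||^2
rhs = inner(x, x) + inner(y, y)    # ||x||^2 + ||y||^2
print(lhs, rhs)                    # equal, since <x, y> = 0

theta = math.acos(inner(x, y) / (norm(x) * norm(y)))
print(theta)  # pi/2: orthogonal vectors meet at a right angle
```

For non-orthogonal vectors the same `theta` computation recovers the included angle from the Law of Cosines.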
Now we can introduce the idea of orthogonal subspaces. In R³, for example, the z-axis is a one-dimensional subspace, and any vector lying in that subspace is clearly orthogonal to any vector lying in the (x, y)-plane. This simple idea generalizes to the notion of orthogonal subspaces in higher-dimensional spaces.

Definition. Two subspaces V and W of the space R^n are called orthogonal subspaces provided every vector v ∈ V is orthogonal to every vector w ∈ W.

If we look at the four fundamental subspaces of a matrix, we notice, first, that N(A) and R(A^T) are both subspaces of R^n, while the other two subspaces, N(A^T) and R(A), are subspaces of R^m. The most important fact about these two pairs of subspaces, other than their dimensions, is that they are orthogonal pairs of subspaces. Let us see why this is so.

I: N(A) is orthogonal to R(A^T). To see this, suppose x ∈ N(A). Written out row by row, the equation Ax = 0 is equivalent to the m equations formed by taking the inner product of each row of A with the vector x and setting it to zero. Hence the vector x is orthogonal to all the rows of the matrix A. But the rows of A are the columns of A^T. So if v is in the column space of A^T, i.e. v ∈ R(A^T), it is a linear combination of the columns of A^T, and we have ⟨x, v⟩ = 0. Hence N(A) ⊥ R(A^T).

II: The left null space N(A^T) is orthogonal to the column space R(A). The simple way to prove this statement is to apply statement (I) to the matrix A^T. Or, we can write out y^T A = 0^T column by column, from which we see that y is orthogonal to every column of A. Therefore it is orthogonal to every linear combination of columns; that is, every y ∈ N(A^T) is orthogonal to every x ∈ R(A).
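Statement (I) can be checked exhaustively on a small example. The sketch below again uses our illustrative stand-in matrix and its hand-computed null-space basis: every row of A should have zero inner product with every basis vector of N(A).

```python
# Checking N(A) ⊥ R(A^T) on the illustrative 3 x 4 stand-in matrix:
# each row of A is orthogonal to each null-space basis vector.
A = [[1, 3, 3, 2],
     [2, 6, 9, 7],
     [-1, -3, 3, 4]]
null_basis = [[-3, 1, 0, 0],
              [1, 0, -1, 1]]   # basis of N(A), found by back substitution

dots = [sum(r * s for r, s in zip(row, x)) for row in A for x in null_basis]
print(dots)  # six inner products, all zero
```

Since the rows of A span R(A^T) and the two vectors span N(A), these six zeros verify the orthogonality of the whole pair of subspaces.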
Let us look at an example.

Example. Let A be a 2 × 4 matrix whose row-echelon form U has a single non-zero row. The second column is basic and the other three variables are free. Therefore, if we set each free variable equal to one, in turn, and solve Ux = 0, we can compute a basis for the null space N(A) consisting of three vectors. It is easy to check that these vectors are all orthogonal to the rows of A, as they should be according to our result. Since the row rank is equal to the column rank, the column space of A is one-dimensional and is spanned by the one basic column. On the other hand, the left null space is found by combining the rows of A to produce the zero row in the matrix U; the coefficients of that combination give a vector y with y^T A = 0^T, and the inner product of each column of A with y is clearly 0.

Finally, we have the following important definition.

Definition. If V is a subspace of R^n, the set of all vectors orthogonal to all the vectors of V is called the orthogonal complement of V. We denote the orthogonal complement of V by V^⊥.

Note that if some vector v were orthogonal to N(A) but was not in the row space of A, then adding v as an extra row of A would enlarge the row space without changing the null space. But we have the formula

    dim R(A^T) + dim N(A) = number of columns,

so it is impossible for the row space to change dimension by adding v. Hence we have R(A^T) = N(A)^⊥. Similar reasoning with A^T leads to the following final result.
Theorem 4 (Fundamental Theorem of Linear Algebra, Part II).

    N(A) = (R(A^T))^⊥,   R(A^T) = (N(A))^⊥,
    N(A^T) = (R(A))^⊥,   R(A) = (N(A^T))^⊥.

One final remark: the last equality means that Ax = b has a solution if and only if b is orthogonal to N(A^T). Or, said another way, b is in the column space if and only if it is orthogonal to every solution y of the transposed homogeneous equation A^T y = 0. From this fact, it follows that Ax = b is solvable for every right-hand vector b if and only if the transposed homogeneous equation has only the zero solution. This statement is the finite-dimensional analog of what is known as the Fredholm Alternative Theorem: either Ax = b has a solution for every b, or A^T y = 0 has non-trivial solutions, but not both.
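The solvability criterion can be illustrated on our stand-in matrix: its left null space is spanned by a single hand-computed vector y, so Ax = b is solvable exactly when ⟨y, b⟩ = 0. Both test vectors below are our own illustrative choices.

```python
# Fredholm-alternative check on the illustrative 3 x 4 stand-in matrix:
# Ax = b is solvable exactly when b is orthogonal to every left null vector.
A = [[1, 3, 3, 2],
     [2, 6, 9, 7],
     [-1, -3, 3, 4]]
y = [5, -2, 1]   # spans the left null space N(A^T) for this matrix

def dot(u, v):
    """Usual dot product."""
    return sum(a * b for a, b in zip(u, v))

b_in_range = [1, 2, -1]     # the first column of A, so certainly in R(A)
b_not_in_range = [1, 0, 0]  # not orthogonal to y, so Ax = b has no solution

print(dot(y, b_in_range), dot(y, b_not_in_range))  # 0 5
```

A zero inner product certifies solvability; a non-zero one certifies that b lies outside the column space.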
More informationDot Products, Transposes, and Orthogonal Projections
Dot Products, Transposes, and Orthogonal Projections David Jekel November 13, 2015 Properties of Dot Products Recall that the dot product or standard inner product on R n is given by x y = x 1 y 1 + +
More informationWe see that this is a linear system with 3 equations in 3 unknowns. equation is A x = b, where
Practice Problems Math 35 Spring 7: Solutions. Write the system of equations as a matrix equation and find all solutions using Gauss elimination: x + y + 4z =, x + 3y + z = 5, x + y + 5z = 3. We see that
More informationMath 2331 Linear Algebra
4.5 The Dimension of a Vector Space Math 233 Linear Algebra 4.5 The Dimension of a Vector Space Shang-Huan Chiu Department of Mathematics, University of Houston schiu@math.uh.edu math.uh.edu/ schiu/ Shang-Huan
More informationMATH 1120 (LINEAR ALGEBRA 1), FINAL EXAM FALL 2011 SOLUTIONS TO PRACTICE VERSION
MATH (LINEAR ALGEBRA ) FINAL EXAM FALL SOLUTIONS TO PRACTICE VERSION Problem (a) For each matrix below (i) find a basis for its column space (ii) find a basis for its row space (iii) determine whether
More informationMTH 464: Computational Linear Algebra
MTH 464: Computational Linear Algebra Lecture Outlines Exam 2 Material Prof. M. Beauregard Department of Mathematics & Statistics Stephen F. Austin State University March 2, 2018 Linear Algebra (MTH 464)
More informationLinear Algebra Highlights
Linear Algebra Highlights Chapter 1 A linear equation in n variables is of the form a 1 x 1 + a 2 x 2 + + a n x n. We can have m equations in n variables, a system of linear equations, which we want to
More informationis Use at most six elementary row operations. (Partial
MATH 235 SPRING 2 EXAM SOLUTIONS () (6 points) a) Show that the reduced row echelon form of the augmented matrix of the system x + + 2x 4 + x 5 = 3 x x 3 + x 4 + x 5 = 2 2x + 2x 3 2x 4 x 5 = 3 is. Use
More informationMAT2342 : Introduction to Applied Linear Algebra Mike Newman, fall Projections. introduction
MAT4 : Introduction to Applied Linear Algebra Mike Newman fall 7 9. Projections introduction One reason to consider projections is to understand approximate solutions to linear systems. A common example
More informationDefinitions for Quizzes
Definitions for Quizzes Italicized text (or something close to it) will be given to you. Plain text is (an example of) what you should write as a definition. [Bracketed text will not be given, nor does
More informationChapter 6: Orthogonality
Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products
More informationLinear Algebra March 16, 2019
Linear Algebra March 16, 2019 2 Contents 0.1 Notation................................ 4 1 Systems of linear equations, and matrices 5 1.1 Systems of linear equations..................... 5 1.2 Augmented
More informationThe value of a problem is not so much coming up with the answer as in the ideas and attempted ideas it forces on the would be solver I.N.
Math 410 Homework Problems In the following pages you will find all of the homework problems for the semester. Homework should be written out neatly and stapled and turned in at the beginning of class
More informationMath Camp II. Basic Linear Algebra. Yiqing Xu. Aug 26, 2014 MIT
Math Camp II Basic Linear Algebra Yiqing Xu MIT Aug 26, 2014 1 Solving Systems of Linear Equations 2 Vectors and Vector Spaces 3 Matrices 4 Least Squares Systems of Linear Equations Definition A linear
More information1 9/5 Matrices, vectors, and their applications
1 9/5 Matrices, vectors, and their applications Algebra: study of objects and operations on them. Linear algebra: object: matrices and vectors. operations: addition, multiplication etc. Algorithms/Geometric
More informationHOSTOS COMMUNITY COLLEGE DEPARTMENT OF MATHEMATICS
HOSTOS COMMUNITY COLLEGE DEPARTMENT OF MATHEMATICS MAT 217 Linear Algebra CREDIT HOURS: 4.0 EQUATED HOURS: 4.0 CLASS HOURS: 4.0 PREREQUISITE: PRE/COREQUISITE: MAT 210 Calculus I MAT 220 Calculus II RECOMMENDED
More informationExam 1 - Definitions and Basic Theorems
Exam 1 - Definitions and Basic Theorems One of the difficuliies in preparing for an exam where there will be a lot of proof problems is knowing what you re allowed to cite and what you actually have to
More informationQuizzes for Math 304
Quizzes for Math 304 QUIZ. A system of linear equations has augmented matrix 2 4 4 A = 2 0 2 4 3 5 2 a) Write down this system of equations; b) Find the reduced row-echelon form of A; c) What are the pivot
More information6.1. Inner Product, Length and Orthogonality
These are brief notes for the lecture on Friday November 13, and Monday November 1, 2009: they are not complete, but they are a guide to what I want to say on those days. They are guaranteed to be incorrect..1.
More informationMath 2174: Practice Midterm 1
Math 74: Practice Midterm Show your work and explain your reasoning as appropriate. No calculators. One page of handwritten notes is allowed for the exam, as well as one blank page of scratch paper.. Consider
More informationOverview. Motivation for the inner product. Question. Definition
Overview Last time we studied the evolution of a discrete linear dynamical system, and today we begin the final topic of the course (loosely speaking) Today we ll recall the definition and properties of
More informationThird Midterm Exam Name: Practice Problems November 11, Find a basis for the subspace spanned by the following vectors.
Math 7 Treibergs Third Midterm Exam Name: Practice Problems November, Find a basis for the subspace spanned by the following vectors,,, We put the vectors in as columns Then row reduce and choose the pivot
More information7. Dimension and Structure.
7. Dimension and Structure 7.1. Basis and Dimension Bases for Subspaces Example 2 The standard unit vectors e 1, e 2,, e n are linearly independent, for if we write (2) in component form, then we obtain
More informationMATH SOLUTIONS TO PRACTICE PROBLEMS - MIDTERM I. 1. We carry out row reduction. We begin with the row operations
MATH 2 - SOLUTIONS TO PRACTICE PROBLEMS - MIDTERM I. We carry out row reduction. We begin with the row operations yielding the matrix This is already upper triangular hence The lower triangular matrix
More information5 Linear Transformations
Lecture 13 5 Linear Transformations 5.1 Basic Definitions and Examples We have already come across with the notion of linear transformations on euclidean spaces. We shall now see that this notion readily
More informationSYMBOL EXPLANATION EXAMPLE
MATH 4310 PRELIM I REVIEW Notation These are the symbols we have used in class, leading up to Prelim I, and which I will use on the exam SYMBOL EXPLANATION EXAMPLE {a, b, c, } The is the way to write the
More informationLecture: Linear algebra. 4. Solutions of linear equation systems The fundamental theorem of linear algebra
Lecture: Linear algebra. 1. Subspaces. 2. Orthogonal complement. 3. The four fundamental subspaces 4. Solutions of linear equation systems The fundamental theorem of linear algebra 5. Determining the fundamental
More information