An overview of key ideas


This is an overview of linear algebra given at the start of a course on the mathematics of engineering. Linear algebra progresses from vectors to matrices to subspaces.

Vectors

What do you do with vectors? Take combinations. We can multiply vectors by scalars, add, and subtract. Given vectors u, v and w we can form the linear combination x1 u + x2 v + x3 w = b. An example in R^3 would be the column vectors:

    u = (1, -1, 0),  v = (0, 1, -1),  w = (0, 0, 1).

The collection of all multiples of u forms a line through the origin. The collection of all multiples of v forms another line. The collection of all combinations of u and v forms a plane. Taking all combinations of some vectors creates a subspace. We could continue like this, or we can use a matrix to add in all multiples of w.

Matrices

Create a matrix A with vectors u, v and w in its columns:

        [  1   0   0 ]
    A = [ -1   1   0 ] .
        [  0  -1   1 ]

The product

         [  1   0   0 ] [ x1 ]   [ x1      ]
    Ax = [ -1   1   0 ] [ x2 ] = [ x2 - x1 ]
         [  0  -1   1 ] [ x3 ]   [ x3 - x2 ]

equals the sum x1 u + x2 v + x3 w = b. The product of a matrix and a vector is a combination of the columns of the matrix. (This particular matrix A is a difference matrix, because the components of Ax are differences of the components of x.)

When we say x1 u + x2 v + x3 w = b we're thinking about multiplying numbers by vectors; when we say Ax = b we're thinking about multiplying a matrix (whose columns are u, v and w) by the numbers. The calculations are the same, but our perspective has changed.
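The claim that a matrix times a vector is a combination of the columns can be checked directly. Below is a minimal sketch in plain Python (no libraries; the input vector x = (7, 4, 9) is just an arbitrary example), computing Ax both as the combination x1 u + x2 v + x3 w and by the usual row-times-column rule:

```python
# Columns of the difference matrix A from the text.
u, v, w = [1, -1, 0], [0, 1, -1], [0, 0, 1]
x = [7, 4, 9]  # arbitrary example input

# Combination of columns: x1*u + x2*v + x3*w.
combo = [x[0]*u[i] + x[1]*v[i] + x[2]*w[i] for i in range(3)]

# Row-times-column multiplication of A = [u v w] by x.
A = [[u[i], v[i], w[i]] for i in range(3)]          # rows of A
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]

print(combo)  # [7, -3, 5] -- the differences x1, x2-x1, x3-x2
print(Ax)     # same vector, computed row by row
```

The two computations agree entry by entry, which is the change of perspective described above.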

For any input vector x, the output of the operation "multiplication by A" is some vector b:

      [ 1 ]   [ 1 ]
    A [ 4 ] = [ 3 ] .
      [ 9 ]   [ 5 ]

A deeper question is to start with a vector b and ask: for what vectors x does Ax = b? In our example, this means solving three equations in three unknowns. Solving

         [  1   0   0 ] [ x1 ]   [ x1      ]   [ b1 ]
    Ax = [ -1   1   0 ] [ x2 ] = [ x2 - x1 ] = [ b2 ]
         [  0  -1   1 ] [ x3 ]   [ x3 - x2 ]   [ b3 ]

is equivalent to solving:

    x1      = b1
    x2 - x1 = b2
    x3 - x2 = b3.

We see that x1 = b1, and so x2 must equal b1 + b2. In vector form, the solution is:

    [ x1 ]   [ b1           ]
    [ x2 ] = [ b1 + b2      ] .
    [ x3 ]   [ b1 + b2 + b3 ]

But this just says:

        [ 1  0  0 ] [ b1 ]
    x = [ 1  1  0 ] [ b2 ] ,
        [ 1  1  1 ] [ b3 ]

or x = A^-1 b. If the matrix A is invertible, we can multiply on both sides by A^-1 to find the unique solution x to Ax = b. We might say that A represents a transform x -> b that has an inverse transform b -> x. In particular, if b = 0 then x = 0.

The second example has the same columns u and v but replaces the column vector w:

        [  1   0  -1 ]
    C = [ -1   1   0 ] .
        [  0  -1   1 ]

Then:

         [  1   0  -1 ] [ x1 ]   [ x1 - x3 ]
    Cx = [ -1   1   0 ] [ x2 ] = [ x2 - x1 ]
         [  0  -1   1 ] [ x3 ]   [ x3 - x2 ]

and our system of three equations in three unknowns becomes circular.
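Since x = A^-1 b is the vector of running sums of b, solving this particular system needs no matrix machinery at all. A small sketch (the right-hand side b = (1, 2, 3) is an arbitrary example) confirming that cumulative sums of b agree with multiplying b by the inverse matrix of ones:

```python
from itertools import accumulate

b = [1, 2, 3]            # arbitrary example right-hand side
x = list(accumulate(b))  # x = (b1, b1+b2, b1+b2+b3), from back-substitution

# Multiply by A^-1 = lower triangular matrix of ones to confirm.
A_inv = [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
x_check = [sum(A_inv[i][j] * b[j] for j in range(3)) for i in range(3)]

print(x)        # [1, 3, 6]
print(x_check)  # [1, 3, 6] -- same answer via x = A^-1 b
```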

Where before Ax = 0 implied x = 0, there are non-zero vectors x for which Cx = 0. For any vector x with x1 = x2 = x3, Cx = 0. This is a significant difference: we can't multiply both sides of Cx = 0 by an inverse to find a nonzero solution x.

The system of equations encoded in Cx = b is:

    x1 - x3 = b1
    x2 - x1 = b2
    x3 - x2 = b3.

If we add these three equations together, we get:

    0 = b1 + b2 + b3.

This tells us that Cx = b has a solution x only when the components of b sum to 0. In a physical system, this might tell us that the system is stable as long as the forces on it are balanced.

Subspaces

Geometrically, the columns of C lie in the same plane (they are dependent; the columns of A are independent). There are many vectors in R^3 which do not lie in that plane. Those vectors cannot be written as a linear combination of the columns of C, and so they correspond to values of b for which Cx = b has no solution x. The linear combinations of the columns of C form a two dimensional subspace of R^3.

This plane of combinations of the columns of C can be described as "all vectors Cx". But we know that the vectors b for which Cx = b has a solution satisfy b1 + b2 + b3 = 0. So the plane of all combinations of the columns of C consists of all vectors whose components sum to 0.

If instead we take all combinations of

    u = (1, -1, 0),  v = (0, 1, -1),  and  w = (0, 0, 1)

we get the entire space R^3; the equation Ax = b has a solution for every b in R^3. We say that u, v and w form a basis for R^3.

A basis for R^n is a collection of n independent vectors in R^n. Equivalently, a basis is a collection of n vectors whose combinations cover the whole space. Or: a collection of vectors forms a basis whenever a matrix which has those vectors as its columns is invertible.

A vector space is a collection of vectors that is closed under linear combinations. A subspace is a vector space inside another vector space; a plane through the origin in R^3 is an example of a subspace.
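Both facts about C can be verified numerically: C times (1, 1, 1) is the zero vector, and for any input x the components of b = Cx sum to 0. A short sketch (the test vector (5, 2, 8) is arbitrary):

```python
# The circular difference matrix C from the text.
C = [[1, 0, -1], [-1, 1, 0], [0, -1, 1]]

def mat_vec(M, x):
    """Multiply a 3x3 matrix M (list of rows) by a vector x."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

z = mat_vec(C, [1, 1, 1])  # a nonzero x with equal components
print(z)                   # [0, 0, 0] -- so C is not invertible

b = mat_vec(C, [5, 2, 8])  # any x at all; b = Cx must sum to 0
print(b, sum(b))           # [-3, -3, 6] 0 -- the components cancel
```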
A subspace could be equal to the space it's contained in; the smallest subspace contains only the zero vector. The subspaces of R^3 are:

the origin, a line through the origin, a plane through the origin, and all of R^3.

Conclusion

When you look at a matrix, try to see what it is doing. Matrices can be rectangular; we can have seven equations in three unknowns. Rectangular matrices are not invertible, but the symmetric, square matrix A^T A that often appears when studying rectangular matrices may be invertible.
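As a small illustration of that last remark: for a tall rectangular A, the product A^T A is square and symmetric, and when the columns of A are independent it is invertible. The sketch below uses invented example data (four equations in two unknowns), not a matrix from the text:

```python
# A tall rectangular matrix: 4 equations, 2 unknowns (invented example data).
A = [[1, 0], [1, 1], [1, 2], [1, 3]]

m, n = len(A), len(A[0])

# A^T A: entry (i, j) is the dot product of columns i and j of A.
AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
       for i in range(n)]

print(AtA)  # [[4, 6], [6, 14]] -- 2x2 and symmetric
# Its determinant 4*14 - 6*6 = 20 is nonzero, so this A^T A is invertible
# even though the rectangular A itself has no inverse.
```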

MIT OpenCourseWare
http://ocw.mit.edu

18.06SC Linear Algebra
Fall 2011

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.