Lectures on Linear Algebra for IT


Lectures on Linear Algebra for IT by Mgr. Tereza Kovářová, Ph.D., following the content of lectures by Ing. Petr Beremlijski, Ph.D. Department of Applied Mathematics, VSB - TU Ostrava, Czech Republic

5. Linear Independence and Basis
5.1 Dependent and Independent Sets of Vectors
5.2 Dependency and Linear Combinations
5.3 Sufficient Condition for Independence of Functions
5.4 Basis of a Vector Space
5.5 Vector Coordinates
5.6 Use of Coordinates

5.1 Dependent and Independent Sets of Vectors

Definition 1 A nonempty finite set of vectors S = {v_1, ..., v_k} that are elements of a vector space V is called (linearly) independent if the equation

α_1 v_1 + ... + α_k v_k = o   (*)

has only the trivial solution α_1 = ... = α_k = 0. Whenever the set S = {v_1, ..., v_k} is (linearly) independent, we also say that the vectors v_1, ..., v_k are (linearly) independent. If equation (*) has another solution, we call the set S (linearly) dependent, and the vectors v_1, ..., v_k are then (linearly) dependent.

5.1 Dependent and Independent Sets of Vectors

Geometric illustration of dependency for two-dimensional vectors of R^2. [Figure: two sketches of dependent vectors in the plane; in one, three vectors u, v, w satisfy α_1 u + α_2 v + α_3 w = o (with α_3 w = -w); in the other, two collinear vectors u, v satisfy α_1 u + α_2 v = o (with α_2 v = -u).]

5.1 Dependent and Independent Sets of Vectors

Example 1 Let us consider the vectors v_1 = [2, 1, 0], v_2 = [1, -2, 5] and v_3 = [7, 1, 5]. Because 3v_1 + v_2 - v_3 = o, the vector set S = {v_1, v_2, v_3} is linearly dependent.

Example 2 The polynomials p_1(x) = 1 - x, p_2(x) = 5 + 3x - 2x^2 and p_3(x) = 1 + 3x - x^2 satisfy the equation 3p_1(x) - p_2(x) + 2p_3(x) = 0 for each x in R. Therefore the given polynomials p_1, p_2, p_3 form a linearly dependent set in P_3.

Example 3 For the vectors e_1 = [1, 0, 0], e_2 = [0, 1, 0] and e_3 = [0, 0, 1], the equation α_1 e_1 + α_2 e_2 + α_3 e_3 = o has only the trivial solution α_1 = 0, α_2 = 0, α_3 = 0. Therefore e_1, e_2, e_3 form a linearly independent set in R^3.
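The dependency relations in Examples 1 and 2 can be checked mechanically. The following plain-Python sketch (the helper `lin_comb` is ours, not from the lecture) encodes arithmetic vectors and polynomials alike as coefficient lists, with the signs v_2 = [1, -2, 5], p_1 = 1 - x, p_2 = 5 + 3x - 2x^2, p_3 = 1 + 3x - x^2:

```python
def lin_comb(coeffs, vectors):
    """Return the linear combination sum(c * v) of equal-length coordinate lists."""
    n = len(vectors[0])
    return [sum(c * v[i] for c, v in zip(coeffs, vectors)) for i in range(n)]

# Example 1: 3*v1 + 1*v2 - 1*v3 should be the zero vector.
v1, v2, v3 = [2, 1, 0], [1, -2, 5], [7, 1, 5]
print(lin_comb([3, 1, -1], [v1, v2, v3]))   # [0, 0, 0]

# Example 2: polynomials as coefficient lists [a0, a1, a2];
# 3*p1 - p2 + 2*p3 should be the zero polynomial.
p1, p2, p3 = [1, -1, 0], [5, 3, -2], [1, 3, -1]
print(lin_comb([3, -1, 2], [p1, p2, p3]))   # [0, 0, 0]
```

Both combinations come out as the zero list, confirming the two dependency relations.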

5.2 Dependency and Linear Combinations

Theorem 1 A finite set of nonzero vectors S = {v_1, ..., v_m} is linearly dependent if and only if there exists k >= 2 such that v_k is a linear combination of v_1, ..., v_{k-1}.

Proof (only if part): Suppose S is a dependent vector set. Consider the sets S_1 = {v_1}, S_2 = {v_1, v_2}, ..., S_m = {v_1, ..., v_m}, and let S_k be the smallest dependent set among them, so that

α_1 v_1 + ... + α_k v_k = o   (*)

with some of the coefficients α_1, ..., α_k different from zero. Because S_1 is obviously independent (v_1 is nonzero), it must be that k >= 2. Also α_k is nonzero, since otherwise S_{k-1} would already be dependent. Therefore, using the vector space axioms, we can rewrite equation (*) as

v_k = (-α_1/α_k) v_1 + ... + (-α_{k-1}/α_k) v_{k-1}.

5.2 Dependency and Linear Combinations

Proof (if part): Suppose that for some k with 2 <= k <= m we have v_k = α_1 v_1 + ... + α_{k-1} v_{k-1}. Then

(-α_1) v_1 + ... + (-α_{k-1}) v_{k-1} + 1 v_k + 0 v_{k+1} + ... + 0 v_m = o.

This means that at least one coefficient, namely the coefficient 1 of v_k, is different from zero, and so the set S_m is linearly dependent.

5.3 The Sufficient Condition for Independence of Functions

Let S = {f_1, ..., f_k} be a finite set of real functions from a vector space F. The set S is independent if and only if the equation α_1 f_1(x) + ... + α_k f_k(x) = 0 for all x in R has only the zero solution α_1 = ... = α_k = 0.

Sufficient condition: By substituting k different real numbers x_1, ..., x_k for x, we obtain a system of k linear equations in the k unknowns α_1, ..., α_k:

α_1 f_1(x_1) + ... + α_k f_k(x_1) = 0
...
α_1 f_1(x_k) + ... + α_k f_k(x_k) = 0

If the coefficient matrix of this system is regular (invertible), then the only solution is α_1 = ... = α_k = 0, and so the set S is independent.

5.3 The Sufficient Condition for Independence of Functions

Example 4 Are the power functions x, x^2 and x^3 linearly independent?

Solution: We choose three values of x arbitrarily, for instance x_1 = 1, x_2 = 2 and x_3 = 3. By substituting these values into the functions f_1(x) = x, f_2(x) = x^2, f_3(x) = x^3, we obtain the linear system:

α_1 + α_2 + α_3 = 0
2α_1 + 4α_2 + 8α_3 = 0
3α_1 + 9α_2 + 27α_3 = 0

To determine its solutions, we transform the coefficient matrix into echelon form:

[1 1 1 ]            [1 1 1]           [1 1 1]
[2 4 8 ]  -2r_1  ~  [0 2 6]        ~  [0 2 6]
[3 9 27]  -3r_1     [0 6 24] -3r_2    [0 0 6]

Since the system has only the zero solution α_1 = α_2 = α_3 = 0 (the coefficient matrix is invertible), the functions x, x^2 and x^3 are linearly independent.
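The sufficient condition amounts to checking that the matrix with entries f_j(x_i) is invertible, i.e. has a nonzero determinant. Here is a small sketch (the helper `det` is ours, not from the lecture) that evaluates the determinant exactly with rational arithmetic for the sample points of Example 4:

```python
from fractions import Fraction

def det(M):
    """Determinant via exact Gaussian elimination with fractions."""
    M = [[Fraction(x) for x in row] for row in M]
    n, sign, d = len(M), 1, Fraction(1)
    for c in range(n):
        # Find a pivot row for column c among the remaining rows.
        piv = next((r for r in range(c, n) if M[r][c] != 0), None)
        if piv is None:
            return Fraction(0)      # a zero column: singular matrix
        if piv != c:
            M[c], M[piv] = M[piv], M[c]
            sign = -sign            # a row swap flips the sign
        d *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return sign * d

# Sample points x = 1, 2, 3 as in Example 4; row i holds f1(x_i), f2(x_i), f3(x_i).
A = [[x, x**2, x**3] for x in (1, 2, 3)]
print(det(A))  # 12 (nonzero), so x, x^2, x^3 are linearly independent
```

The determinant is 12, which is nonzero, matching the conclusion of the row reduction above.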

5.4 Basis of a Vector Space

Definition 2 A finite set B of vectors in a vector space V is called a basis for V if B is a linearly independent set and each vector v in V is a linear combination of vectors in B.

Note: The second condition of the above definition can also be rephrased as: the vector space V is the span of B, V = ⟨B⟩. The definition of a basis applies also to the case when V is a vector subspace of a vector space W, because every vector subspace is a vector space in itself. Not every vector space has a basis in the sense of our definition. For instance, no finite set of real functions spans all of F.

5.4 Basis of a Vector Space

Example 5 The vectors e_1 = [1, 0, 0], e_2 = [0, 1, 0], e_3 = [0, 0, 1] form a basis for V = R^3. The vectors e_1, e_2, e_3 are linearly independent, and any vector v = [v_1, v_2, v_3] in V can be expressed as the linear combination v = v_1 e_1 + v_2 e_2 + v_3 e_3. The basis E = (e_1, e_2, e_3) is called the standard basis; in general the standard basis vectors of R^n form the columns (or rows) of the identity matrix I_n.

Example 6 The polynomials p_1(x) = 1 and p_2(x) = x form a basis for P_2. Certainly each polynomial p(x) = a_0 + a_1 x can be written in the form p(x) = a_0 p_1(x) + a_1 p_2(x). To show that p_1, p_2 are linearly independent, suppose that a_0, a_1 satisfy a_0 p_1(x) + a_1 p_2(x) = o(x) for all x in R. From here a_0 + a_1 x = 0. For x = 0 we get a_0 + a_1 · 0 = 0, and so a_0 = 0. For x = 1 we then get a_1 · 1 = 0, and so a_1 = 0. This proves that p_1 and p_2 are linearly independent.

5.5 Vector Coordinates

Definition 3 Suppose B = (b_1, ..., b_n) is an ordered basis for V and v is in V. The coordinates of v relative to the basis B (or the B-coordinates of v) are the weights (scalars) c_1, ..., c_n such that v = c_1 b_1 + ... + c_n b_n.

Theorem 2 Let B = (b_1, ..., b_n) be an ordered basis for a vector space V. Suppose x_1, ..., x_n and y_1, ..., y_n are both B-coordinates of v in V. Then x_1 = y_1, ..., x_n = y_n.

Proof: Since B is a basis for V, we have v = x_1 b_1 + ... + x_n b_n and v = y_1 b_1 + ... + y_n b_n. Then o = v + (-1)v = x_1 b_1 + ... + x_n b_n + (-1)(y_1 b_1 + ... + y_n b_n) = (x_1 - y_1) b_1 + ... + (x_n - y_n) b_n. Because the basis vectors are independent, it must be that x_1 = y_1, ..., x_n = y_n.

5.5 Vector Coordinates

For each vector v in V the representation by coordinates relative to a given basis B is unique. If c_1, ..., c_n are the B-coordinates of v, then the vector in R^n denoted [v]_B = [c_1, ..., c_n] is called the coordinate vector of v (relative to B).

Example 7 For any arithmetic vector v = [v_1, v_2, v_3], the coordinates relative to the standard basis E = (e_1, e_2, e_3) from Example 5 are v_1, v_2, v_3, because [v_1, v_2, v_3] = v_1 [1, 0, 0] + v_2 [0, 1, 0] + v_3 [0, 0, 1]. The coordinate vector of v relative to E is [v]_E = [v_1, v_2, v_3].

Example 8 The coordinates of the polynomial p(x) = x + 2 relative to the basis P = (p_1, p_2) from Example 6, where p_1(x) = 1 and p_2(x) = x, are 2, 1, because p(x) = x + 2 = 2p_1(x) + 1p_2(x). The P-coordinate vector of p(x) is [p]_P = [2, 1].

5.6 Use of Coordinates

Coordinates are used to transform problems involving vectors from V into problems involving only arithmetic vectors from R^n. Such a transformation corresponds to a one-to-one mapping of the elements of V onto R^n. The mapping V ∋ v ↦ [v]_B ∈ R^n is the coordinate mapping (determined by B).

Lemma 1 For any two vectors u, v in V and any scalar α:
1. [u + v]_B = [u]_B + [v]_B
2. [αu]_B = α[u]_B

Proof: Suppose [u]_B = [u_1, ..., u_n] and [v]_B = [v_1, ..., v_n], where B = (b_1, ..., b_n) is a basis for V. This means u = u_1 b_1 + ... + u_n b_n and v = v_1 b_1 + ... + v_n b_n. Then u + v = u_1 b_1 + ... + u_n b_n + v_1 b_1 + ... + v_n b_n = (u_1 + v_1) b_1 + ... + (u_n + v_n) b_n and αu = αu_1 b_1 + ... + αu_n b_n, from which [u + v]_B = [u]_B + [v]_B and [αu]_B = α[u]_B.
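As a small sanity check of Lemma 1, take V = P_3 with the standard basis E = (1, x, x^2): the E-coordinate vector of a_0 + a_1 x + a_2 x^2 is simply [a_0, a_1, a_2], so the coordinate mapping must turn polynomial addition and scaling into entrywise operations on these lists. The helper names below are ours, chosen for illustration:

```python
def coord_add(u, v):
    """[u + v]_E = [u]_E + [v]_E: entrywise sum of coordinate vectors."""
    return [a + b for a, b in zip(u, v)]

def coord_scale(alpha, u):
    """[alpha u]_E = alpha [u]_E: entrywise scaling of a coordinate vector."""
    return [alpha * a for a in u]

u = [2, 1, 0]              # the polynomial 2 + x
v = [-1, 0, 1]             # the polynomial x^2 - 1
print(coord_add(u, v))     # [1, 1, 1]  ->  1 + x + x^2
print(coord_scale(3, u))   # [6, 3, 0]  ->  6 + 3x
```

Adding (2 + x) + (x^2 - 1) = 1 + x + x^2 by hand gives exactly the polynomial whose coordinate vector the code produces.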

5.6 Use of Coordinates

When solving problems involving linear combinations of vectors, for instance finding out whether a vector is a linear combination of some other vectors, or deciding whether a given set of vectors is linearly independent, we proceed as follows:
- We choose a basis B for the given vector space such that the representations of all the vectors relative to B are easy to find.
- We find the B-coordinate vectors of all the vectors occurring in the problem description.
- We solve the problem obtained from the original assignment by replacing all the vectors with their B-coordinate vectors.

Note: The procedure described above assumes that we can find a suitable basis for the given vector space. This is not always possible.

5.6 Use of Coordinates

Example 9 For the polynomial p(x) = x^2 - 1, find the coordinates relative to the basis P = (p_1, p_2, p_3), where p_1(x) = 1, p_2(x) = x + 1, p_3(x) = x^2 + x + 1.

Solution: We choose the basis E = (e_1, e_2, e_3), where e_1(x) = 1, e_2(x) = x, e_3(x) = x^2 (E is the standard basis for P_3). Then we find the E-coordinate vectors of the polynomials p, p_1, p_2, p_3. They are [p]_E = [-1, 0, 1], [p_1]_E = [1, 0, 0], [p_2]_E = [1, 1, 0], [p_3]_E = [1, 1, 1]. Now we solve the linear system [p]_E = α_1 [p_1]_E + α_2 [p_2]_E + α_3 [p_3]_E. Writing out the equations for the corresponding entries, we obtain:

-1 = α_1 + α_2 + α_3
 0 = α_2 + α_3
 1 = α_3

The system has the unique solution α_1 = -1, α_2 = -1, α_3 = 1. It is easy to verify that p = -p_1 - p_2 + p_3.
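The coordinate procedure reduces Example 9 to solving a 3x3 linear system. A minimal sketch of such a solver, using exact fractions so that the answer matches the hand computation (the helper `solve` is ours, not part of the lecture, and assumes a square invertible matrix):

```python
from fractions import Fraction

def solve(A, b):
    """Solve A x = b exactly by Gauss-Jordan elimination (A square and invertible)."""
    n = len(A)
    # Build the augmented matrix [A | b] over the rationals.
    M = [[Fraction(x) for x in row] + [Fraction(rhs)] for row, rhs in zip(A, b)]
    for c in range(n):
        piv = next(r for r in range(c, n) if M[r][c] != 0)
        M[c], M[piv] = M[piv], M[c]          # move a pivot row into place
        M[c] = [x / M[c][c] for x in M[c]]   # normalize the pivot to 1
        for r in range(n):
            if r != c and M[r][c] != 0:      # clear column c in every other row
                M[r] = [a - M[r][c] * p for a, p in zip(M[r], M[c])]
    return [row[-1] for row in M]

# Columns are [p1]_E, [p2]_E, [p3]_E; the right-hand side is [p]_E = [-1, 0, 1].
A = [[1, 1, 1],
     [0, 1, 1],
     [0, 0, 1]]
b = [-1, 0, 1]
print(solve(A, b))  # [-1, -1, 1]  ->  p = -p1 - p2 + p3
```

The solver returns α_1 = -1, α_2 = -1, α_3 = 1, the same coordinates found by back substitution above.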

5.6 Use of Coordinates

Example 10 Are the polynomials p_1(x) = x^2 + x + 1, p_2(x) = x^2 + 2x + 1, p_3(x) = x^2 + x + 2 linearly dependent or independent?

Solution: We choose the standard basis E = (e_1, e_2, e_3), where e_1(x) = 1, e_2(x) = x, e_3(x) = x^2, and find the E-coordinate vectors of p_1, p_2, p_3. They are [p_1]_E = [1, 1, 1], [p_2]_E = [1, 2, 1], [p_3]_E = [2, 1, 1]. We solve the linear system α_1 [p_1]_E + α_2 [p_2]_E + α_3 [p_3]_E = o. Written entry by entry:

α_1 + α_2 + 2α_3 = 0
α_1 + 2α_2 + α_3 = 0
α_1 + α_2 + α_3 = 0

We continue by row reduction of the augmented matrix:

[1 1 2 | 0]          [1 1  2 | 0]
[1 2 1 | 0] -r_1  ~  [0 1 -1 | 0]
[1 1 1 | 0] -r_1     [0 0 -1 | 0]

The system has only the zero solution α_1 = 0, α_2 = 0, α_3 = 0. Therefore the polynomials p_1, p_2, p_3 are linearly independent.
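Once a problem has been translated into coordinate vectors, independence is a matter of row reduction, exactly as in the solution above. The following sketch (the helper `is_independent` is ours, not from the lecture) performs the reduction with exact fractions and reports whether the rank equals the number of vectors:

```python
from fractions import Fraction

def is_independent(vectors):
    """Row-reduce the coordinate vectors; they are independent iff rank == their number."""
    M = [[Fraction(x) for x in v] for v in vectors]
    rank = 0
    for c in range(len(M[0])):
        # Look for a pivot in column c among the not-yet-used rows.
        piv = next((r for r in range(rank, len(M)) if M[r][c] != 0), None)
        if piv is None:
            continue
        M[rank], M[piv] = M[piv], M[rank]
        for r in range(len(M)):
            if r != rank and M[r][c] != 0:
                f = M[r][c] / M[rank][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank == len(M)

# The E-coordinate vectors [p1]_E, [p2]_E, [p3]_E from Example 10 ...
print(is_independent([[1, 1, 1], [1, 2, 1], [2, 1, 1]]))  # True
# ... and an obviously dependent pair for contrast.
print(is_independent([[1, 2, 3], [2, 4, 6]]))             # False
```

The first call confirms the conclusion of Example 10; the second shows the same test detecting dependence.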