(II.B) Basis and dimension

How would you explain that a plane has two dimensions? Well, you can go in two independent directions, and no more. To make this idea precise, we formulate the

DEFINITION 1. A (finite) subset $\{\vec v_1, \ldots, \vec v_m\}$ of a vector space $V$ (over a field $F$) is a basis of $V$ if (a) they are linearly independent (over $F$), and (b) they span $V$ (over $F$).¹ The "over $F$" means that the linear combinations are understood to be of the form $\sum \alpha_i \vec v_i$, with $\alpha_i \in F$. As usual, our default will be $F = \mathbb{R}$.

EXAMPLE 2. A basis of $\mathbb{R}^m$ is given by the standard basis vectors $\{\hat e_i\}_{i=1}^m$. Of course, one could say the same for any field: the standard basis vectors are also a basis for $\mathbb{C}^m$, viewed as a vector space over $\mathbb{C}$. But what if we view $\mathbb{C}^m$ as a vector space over $\mathbb{R}$? Then you need the $2m$ vectors $\hat e_1, \ldots, \hat e_m$ and $i\hat e_1, \ldots, i\hat e_m$! (Why?)

EXAMPLE 3. Let $P_2(x, y)$ denote the vector space of quadratic polynomials in two variables $x$ and $y$, with real coefficients. This has the monomial basis $\{\rho^2_i(x, y)\}_{i=1}^6 = \{1, x, y, xy, x^2, y^2\}$. Similarly, $\{\rho^3_i(x, y)\}_{i=1}^{10} = \{1, x, y, xy, x^2, y^2, x^2 y, y^2 x, x^3, y^3\}$ gives a basis for the space $P_3(x, y)$ of cubic polynomials. We'll give a geometric application of this example below.

¹ I am not saying that you can't have infinite bases; indeed, non-finitely-generated vector spaces (like $P(x) :=$ all polynomials) may have such a basis. I'm just not defining or discussing them at this stage.
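To make the "(Why?)" in Example 2 concrete, here is a minimal Python/sympy sketch (my own check, not part of the notes) for the case $m = 2$: identifying $\mathbb{C}^2$ with $\mathbb{R}^4$ via real and imaginary parts, the four vectors $\hat e_1, \hat e_2, i\hat e_1, i\hat e_2$ give a rank-4 matrix, so they are linearly independent over $\mathbb{R}$ and, since $\dim_{\mathbb{R}}(\mathbb{C}^2) = 4$, form a basis.

```python
import sympy as sp

# View a vector (z1, z2) in C^2 as the real 4-vector (Re z1, Im z1, Re z2, Im z2).
def as_real(z1, z2):
    return [sp.re(z1), sp.im(z1), sp.re(z2), sp.im(z2)]

I = sp.I
# Candidate basis of C^2 over R: e1, e2, i*e1, i*e2.
vectors = [(1, 0), (0, 1), (I, 0), (0, I)]
M = sp.Matrix([as_real(z1, z2) for (z1, z2) in vectors])

# Rank 4: the four vectors are independent over R, hence a basis of C^2 over R.
print(M.rank())  # 4
```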

EXAMPLE 4. Let $A$ be an $m \times n$ matrix, and $V$ be the vector subspace of $\mathbb{R}^n$ consisting of solutions to $A\vec x = \vec 0$. How do you compute a basis of $V$? By making use of the row-reduction algorithm, of course! For instance, we might have
$$A = \begin{pmatrix} 0 & 0 & 1 & 1 & 1 \\ 2 & 4 & -2 & 4 & 2 \\ 2 & 4 & -3 & 3 & 3 \\ 3 & 6 & -6 & 3 & 6 \end{pmatrix}, \qquad \mathrm{rref}(A) = \begin{pmatrix} 1 & 2 & 0 & 3 & 0 \\ 0 & 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix}.$$
The free variables are $x_2$ and $x_4$, and an arbitrary solution is given by plugging arbitrary values in for these and solving for the pivot variables $x_1$, $x_3$, $x_5$. To get a basis, you plug in $(x_2 = 1, x_4 = 0)$ and $(x_2 = 0, x_4 = 1)$ to obtain
$$\begin{pmatrix} -2 \\ 1 \\ 0 \\ 0 \\ 0 \end{pmatrix}, \qquad \begin{pmatrix} -3 \\ 0 \\ -1 \\ 1 \\ 0 \end{pmatrix}.$$
More generally, the $i$th basis element is given by setting the $i$th free variable to 1 and the rest to zero.

EXAMPLE 5. Suppose $A$ is an invertible $m \times m$ matrix. I claim that its columns $\{\vec c_i\}_{i=1}^m$ are a basis for $\mathbb{R}^m$. To establish this, check that:
- $\{\vec c_i\}$ are linearly independent: $\gamma_1 \vec c_1 + \ldots + \gamma_m \vec c_m = \vec 0 \implies A\vec\gamma = \vec 0 \implies \vec\gamma = \vec 0$ (since $A$ is invertible).
- $\{\vec c_i\}$ span $\mathbb{R}^m$: show every $\vec y \in \mathbb{R}^m$ can be written $\vec y = \gamma_1 \vec c_1 + \ldots + \gamma_m \vec c_m \ (= A\vec\gamma)$. This is easy: just put $\vec\gamma = A^{-1}\vec y$.

We say that $V$ is finitely generated if some finite subset $\{\vec v_1, \ldots, \vec v_m\}$ spans $V$. Here is the precise reason why dimension makes sense:

PROPOSITION 6. If $V$ is finitely generated then (i) it has a (finite) basis, and (ii) any 2 bases of $V$ have the same number of elements.
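As a quick cross-check of Example 4 (my own sketch, not part of the notes), sympy recomputes $\mathrm{rref}(A)$, the pivot columns, and a basis of the solution space $V$:

```python
import sympy as sp

A = sp.Matrix([
    [0, 0,  1, 1, 1],
    [2, 4, -2, 4, 2],
    [2, 4, -3, 3, 3],
    [3, 6, -6, 3, 6],
])

R, pivots = A.rref()
print(R)       # the reduced row echelon form displayed above
print(pivots)  # (0, 2, 4): pivot variables x1, x3, x5, so x2 and x4 are free

# One null space basis vector per free variable (that variable set to 1, the other to 0).
for v in A.nullspace():
    print(v.T)  # [-2, 1, 0, 0, 0] and [-3, 0, -1, 1, 0]
```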

DEFINITION 7. We define the dimension $\dim_F(V)$ of $V$ over $F$ to be the number of elements in (ii).²

We get (i) by going through the generating subset $\{\vec v_1, \ldots, \vec v_m\}$ and throwing out any $\vec v_i$ that is a linear combination of the earlier vectors. What remains is a basis. In order to show (ii) we first prove the following

LEMMA 8. If $\vec v_1, \ldots, \vec v_m$ span $V$ then any independent set in $V$ has no more than $m$ vectors.

PROOF. Consider any set $\vec u_1, \ldots, \vec u_n$, $n > m$, of vectors in $V$. Since $\vec v_1, \ldots, \vec v_m$ span $V$, each $\vec u_j$ is a linear combination of the $\vec v_i$: $\vec u_j = \sum_{i=1}^m A_{ij} \vec v_i$, for some $m \times n$ matrix $A$. Since $m < n$, there is a nontrivial solution to $A\vec x = \vec 0$, some nonzero $\vec x \in \mathbb{R}^n$. That is, $\sum_{j=1}^n A_{ij} x_j = 0$ for each $i$, and so
$$\sum_{j=1}^n x_j \vec u_j = \sum_{j=1}^n x_j \sum_{i=1}^m A_{ij} \vec v_i = \sum_{i=1}^m \Big( \sum_{j=1}^n x_j A_{ij} \Big) \vec v_i = \sum_{i=1}^m 0 \cdot \vec v_i = \vec 0.$$
We conclude that $\{\vec u_j\}_{j=1}^n$ is not an independent set!

Now let $\{\vec v_1, \ldots, \vec v_m\}$ and $\{\vec w_1, \ldots, \vec w_n\}$ be two bases for $V$. The $\{\vec v_i\}$ span $V$ and the $\{\vec w_j\}$ are independent, so the Lemma $\implies n \le m$. The same argument with $\{\vec v_i\}$ and $\{\vec w_j\}$ swapped $\implies n \ge m$, and so we've proved the Proposition.

We can rephrase the Lemma as follows: if $\dim(V) = m$ then (a) fewer than $m$ vectors cannot span $V$, while (b) more than $m$ vectors are linearly dependent.

² The $F$ is typically omitted when there is no ambiguity. That said, in the first example we had $\dim_{\mathbb{C}}(\mathbb{C}^n) = n$ while $\dim_{\mathbb{R}}(\mathbb{C}^n) = 2n$.
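To see the mechanism in the proof of Lemma 8 on a concrete (made-up) example, here is a small sympy sketch, not part of the original notes: $\vec v_1, \vec v_2$ span a plane $V$ in $\mathbb{R}^3$, the three vectors $\vec u_j = \sum_i A_{ij}\vec v_i$ lie in $V$, and any nonzero solution of $A\vec x = \vec 0$ (which exists since $2 = m < n = 3$) is automatically a dependency among the $\vec u_j$.

```python
import sympy as sp

# Spanning vectors v1, v2 of a plane V in R^3 (hypothetical example data).
Vcols = sp.Matrix([[1, 0],
                   [0, 1],
                   [1, 1]])      # columns are v1, v2

# Three vectors u1, u2, u3 in V, with u_j = sum_i A[i, j] * v_i, i.e. U = Vcols * A.
A = sp.Matrix([[1, 2, 0],
               [1, 0, 3]])       # A is m x n with m = 2 < n = 3
U = Vcols * A                    # columns are u1, u2, u3

# Since m < n, A x = 0 has a nontrivial solution x ...
x = A.nullspace()[0]
print(x.T)                       # a nonzero x with A x = 0

# ... and the same x is a dependency among the u_j, exactly as in the proof.
print((U * x).T)                 # [0, 0, 0], i.e. x1*u1 + x2*u2 + x3*u3 = 0
```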

A Geometric Application. Given 5 points $A, B, C, D, E$ in general position in the plane, there is a unique conic passing through them. Here's why: the equation of a conic $Q$ has the form
$$0 = f_Q(x, y) = \sum_{i=1}^{6} a_i \rho^2_i(x, y) = a_1 + a_2 x + a_3 y + a_4 xy + a_5 x^2 + a_6 y^2,$$
and the statement that it passes through $A$ is $0 = f_Q(x_A, y_A) = \sum_{i=1}^{6} a_i \rho^2_i(x_A, y_A)$. Continuing this for $B$ thru $E$ we get 5 equations in 6 unknowns $a_i$, a system which generically³ has a line of solutions: if $(a_1, \ldots, a_6)$ solves the system then so does $(c\,a_1, \ldots, c\,a_6)$. But this just gives multiples of the same $f_Q$, which won't change the shape of the curve, and so there is really only one nontrivial solution: through five general points in the plane there exists a unique conic $Q$.

Similarly, given 8 points $A, B, C, D, E, P_1, P_2, P_3$, what can we say about the vector space of cubics passing through them (i.e. of cubic polynomials in $(x, y)$ vanishing at all 8 points)? First of all, the vector space of all cubics
$$\sum_{i=1}^{10} a_i \rho^3_i(x, y) = a_1 + a_2 x + a_3 y + \ldots + a_9 x^3 + a_{10} y^3$$
should be thought of in terms of the coefficient vector $\vec a$. The 8 constraints give a linear system
$$\left. \begin{array}{c} 0 = \sum_{i=1}^{10} a_i \rho^3_i(x_A, y_A) \\ \vdots \\ 0 = \sum_{i=1}^{10} a_i \rho^3_i(x_{P_3}, y_{P_3}) \end{array} \right\}\ \text{8 equations}$$
whose space of solutions in general (i.e., under the assumption of maximal rank $[= 8]$) is two-dimensional: there are 8 leading 1's in $\mathrm{rref}$ of the $8 \times 10$ matrix $[\rho]$, and therefore 2 parameters to choose freely.

³ more precisely, "generically" means here that the matrix $[\rho]$ (with rows indexed by $A$ thru $E$, columns by $i = 1$ thru $6$) is of maximal rank ($= 5$).
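Here is a small sympy version of the 5-point argument (my own sketch, with arbitrarily chosen points, not part of the notes): build the $5 \times 6$ matrix $[\rho]$, check that it has maximal rank 5, and read the unique (up to scale) conic off its one-dimensional null space.

```python
import sympy as sp

x, y = sp.symbols('x y')
# Five points assumed to be in "general position" (hypothetical example data).
pts = [(0, 0), (1, 0), (0, 1), (2, 3), (-1, 2)]

# The monomial basis {rho^2_i} of P_2(x, y) from Example 3.
monomials = [sp.Integer(1), x, y, x*y, x**2, y**2]

# The 5 x 6 matrix [rho]: one row per point, one column per monomial.
M = sp.Matrix([[m.subs({x: px, y: py}) for m in monomials] for px, py in pts])
print(M.rank())  # 5 (maximal), so the solution space is 6 - 5 = 1 dimensional

a = M.nullspace()[0]  # coefficient vector of the conic, unique up to scale
f_Q = sp.expand(sum(ai * m for ai, m in zip(a, monomials)))
print(f_Q)                                           # equation of the conic
print([f_Q.subs({x: px, y: py}) for px, py in pts])  # all zeros: it passes through the points
```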

So start with $A, B, C, D, E$ in "general position"; we'd like to construct points on the (unique) conic through them, using a straightedge. Begin by drawing the lines $AB$, $DE$, $CD$, and $BC$; label $AB \cap DE$ by $P_1$. ($Q$ is drawn in a background color because we don't know what it looks like yet.)

[Figure: the points $A, B, C, D, E$ on the (not yet known) conic $Q$, with the lines $AB$, $BC$, $CD$, $DE$ drawn and $P_1 = AB \cap DE$ marked.]

Now draw (almost) any line $\ell$ through $A$: the choice is free except that $\ell$ shouldn't go through $B$, $C$, $D$, or $E$. All the rest of the construction depends on this choice; in particular the final point $q$ depends on $\ell$, and so by varying $\ell$ we can vary $q$ and in this way (we hope) construct enough points on $Q$ to get an idea of what it looks like. Label $\ell \cap CD =: P_2$; then draw $P_1 P_2$ and label $P_1 P_2 \cap BC =: P_3$; finally set $q := EP_3 \cap \ell$. We must prove $q \in Q$. Adding dotted lines to our diagram for the lines depending on $\ell$:

[Figure: the same picture with the lines $\ell$, $P_1 P_2$, and $EP_3$ added (dotted), and the points $P_2$, $P_3$, and $q$ marked.]
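Before the proof, here is a quick sympy check (my own, not part of the notes) that the construction really does land on $Q$ for one concrete choice of data: five rational points on the conic $x^2 + y^2 = 1$ and one admissible line $\ell$ through $A$.

```python
from sympy import Rational
from sympy.geometry import Point, Line

# Five rational points on the conic Q: x^2 + y^2 = 1 (hypothetical example data).
A, B, C = Point(1, 0), Point(0, 1), Point(Rational(-3, 5), Rational(4, 5))
D, E = Point(Rational(-4, 5), Rational(3, 5)), Point(0, -1)

P1 = Line(A, B).intersection(Line(D, E))[0]

# A line l through A that avoids B, C, D, E (a free choice; here through (2, 3)).
l = Line(A, Point(2, 3))

P2 = l.intersection(Line(C, D))[0]
P3 = Line(P1, P2).intersection(Line(B, C))[0]
q = Line(E, P3).intersection(l)[0]

print(q.x**2 + q.y**2 - 1)  # 0, i.e. the constructed point q lies on Q
```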

Now consider the following three cubic equations (where e.g. $f_{AB}(x, y) = 0$ is the linear equation of the line $AB$, and so on):
$$f_1(x, y) := f_Q(x, y)\, f_{P_1 P_2}(x, y) = 0,$$
$$f_2(x, y) := f_{AB}(x, y)\, f_{CD}(x, y)\, f_{EP_3}(x, y) = 0,$$
$$f_3(x, y) := f_{\ell}(x, y)\, f_{BC}(x, y)\, f_{DE}(x, y) = 0.$$
The graphical solutions of these equations are (respectively) $Q \cup P_1 P_2$, $AB \cup CD \cup EP_3$, and $\ell \cup BC \cup DE$. All 3 of these unions contain the 8 points $A, B, C, D, E, P_1, P_2, P_3$; that is, the cubic polynomials $f_1$, $f_2$, and $f_3$ each vanish at all 8 points. Above we explained that the (vector) space of such cubic polynomials was $10 - 8 = 2$ dimensional; it follows that the 3 vectors $f_1, f_2, f_3$ cannot be linearly independent, and there is a nontrivial relation
$$\alpha f_1(x, y) + \beta f_2(x, y) + \gamma f_3(x, y) = 0.$$
Now we extract the information we want ($q \in Q$) from this relation. First of all, $\alpha$ cannot be zero. Otherwise $\beta f_2 = -\gamma f_3$, so $f_2$ and $f_3$ are multiples of each other, which means that $AB \cup CD \cup EP_3 = \ell \cup BC \cup DE$. That would imply $\ell$ was either $AB$, $CD$, or $EP_3$; and we said $\ell$ cannot pass through $B$, $C$, $D$, or $E$, so this gives a contradiction. So we may divide by $\alpha$:
$$f_1(x, y) = -\frac{\beta}{\alpha} f_2(x, y) - \frac{\gamma}{\alpha} f_3(x, y).$$
Now since $q$ is contained in both $EP_3$ and $\ell$, $f_2(x, y)$ and $f_3(x, y)$ both vanish at $q = (x_q, y_q)$. (This is because $f_2$ contains a factor of $f_{EP_3}$ and $f_3$ contains a factor of $f_\ell$.) According to the relation, therefore, also $f_1 = 0$ at $q$; this means that either $f_Q$ or $f_{P_1 P_2}$ vanishes at $q$, and so $q \in Q$ or $q \in P_1 P_2$. Suppose the latter: if $q \in P_1 P_2$ then the three lines $P_1 P_2$, $EP_3$, $\ell$ are the same, and so $E \in \ell$. But once again, we disallowed such a choice of $\ell$ at the outset. So $q \in Q$, and we're done.

Another way of thinking of this result is that the intersections of opposite edges of a hexagon inscribed in a conic lie on a line. This is a theorem proved by Pascal in 1639, known as his mystic hexagon!
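And a numerical illustration of the mystic hexagon itself (again my own sketch, not from the notes): for six rational points on the conic $x^2 + y^2 = 1$, the three intersection points of opposite sides of the inscribed hexagon come out collinear.

```python
from sympy import Rational, Matrix
from sympy.geometry import Point, Line

# A hexagon ABCDEF inscribed in the conic x^2 + y^2 = 1 (hypothetical rational points).
A, B, C = Point(1, 0), Point(0, 1), Point(Rational(-3, 5), Rational(4, 5))
D, E, F = Point(Rational(-4, 5), Rational(3, 5)), Point(0, -1), Point(Rational(-3, 5), Rational(-4, 5))

# Intersection points of the three pairs of opposite sides.
X = Line(A, B).intersection(Line(D, E))[0]
Y = Line(B, C).intersection(Line(E, F))[0]
Z = Line(C, D).intersection(Line(F, A))[0]

# Pascal: X, Y, Z lie on one line, so this collinearity determinant vanishes.
print(Matrix([[X.x, X.y, 1], [Y.x, Y.y, 1], [Z.x, Z.y, 1]]).det())  # 0
```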

Exercises

(1) Let $V$ be a vector space over a subfield $F$ of the complex numbers. Suppose $\vec u$, $\vec v$, $\vec w$ are linearly independent vectors in $V$. Prove that $(\vec u + \vec v)$, $(\vec v + \vec w)$, and $(\vec w + \vec u)$ are linearly independent.

(2) Let $V$ be the set of all $2 \times 2$ matrices $A$ with complex entries which satisfy $A_{11} + A_{22} = 0$.
(a) Show that $V$ is a vector space over the field of real numbers, with the usual operations of matrix addition and multiplication of a matrix by a scalar.
(b) Find a basis for this vector space.
(c) Let $W$ be the set of all matrices $A$ in $V$ such that $A_{21} = \overline{A_{12}}$ (the bar denotes complex conjugation). Prove that $W$ is a subspace of $V$ and find a basis for $W$.

(3) Prove that the space of all $m \times n$ matrices over the field $F$ has dimension $mn$, by exhibiting a basis for this space.