On-Line Geometric Modeling Notes

VECTOR SPACES

Kenneth I. Joy
Visualization and Graphics Research Group
Department of Computer Science
University of California, Davis

These notes give the definition of a vector space and several of the concepts related to these spaces. Examples are drawn from the vector space of vectors in R^2.

1 Definition of a Vector Space

A nonempty set V of elements v, w, ... is called a vector space if two algebraic operations (called addition and scalar multiplication) are defined on V so that the following properties hold.

Addition associates with every pair of vectors v1 and v2 a unique vector v in V, called the sum of v1 and v2 and written v1 + v2. In the case of the space of 2-dimensional vectors, addition is component-wise: if v1 = <x1, y1> and v2 = <x2, y2>, then

    v1 + v2 = <x1 + x2, y1 + y2>

This is best illustrated by the familiar parallelogram rule.

Addition satisfies the following properties:

Commutativity - for any two vectors v1 and v2 in V,

    v1 + v2 = v2 + v1
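As a concrete illustration (a minimal sketch, not part of the original notes), component-wise addition of 2-dimensional vectors can be written directly, with the commutativity property checked numerically:

```python
# A sketch: 2-D vectors represented as Python tuples, with the
# component-wise addition defined above.

def vec_add(v1, v2):
    """Sum of two 2-D vectors, computed component-wise."""
    return (v1[0] + v2[0], v1[1] + v2[1])

v1 = (1.0, 2.0)
v2 = (3.0, -1.0)

# Commutativity: v1 + v2 = v2 + v1
assert vec_add(v1, v2) == vec_add(v2, v1) == (4.0, 1.0)
```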

Associativity - for any three vectors v1, v2 and v3 in V,

    (v1 + v2) + v3 = v1 + (v2 + v3)

Even though the sum v1 + v2 + v3 can be grouped in different ways, the result is the same. [Figure: associativity of vector addition.]

Zero Vector - there is a unique vector in V, called the zero vector and denoted 0, such that for every vector v in V

    v + 0 = 0 + v = v

Additive Inverse - for each element v in V there is a unique element in V, usually denoted -v, so that

    v + (-v) = 0

In the case of 2-dimensional vectors, -v is simply the vector of equal magnitude to v but in the opposite direction. The additive inverse allows us to define a subtraction operation on vectors:

    v2 - v1 = v2 + (-v1)

[Figure: vector subtraction in the space of 2-dimensional vectors.]
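The zero vector, additive inverse, and the subtraction operation built from them can be sketched for 2-dimensional vectors as follows (an illustration, not from the notes):

```python
def vec_add(v1, v2):
    return (v1[0] + v2[0], v1[1] + v2[1])

def vec_neg(v):
    """Additive inverse: equal magnitude, opposite direction."""
    return (-v[0], -v[1])

def vec_sub(v2, v1):
    """Subtraction defined via the additive inverse: v2 - v1 = v2 + (-v1)."""
    return vec_add(v2, vec_neg(v1))

ZERO = (0.0, 0.0)
v = (3.0, 4.0)

assert vec_add(v, ZERO) == v                     # v + 0 = v
assert vec_add(v, vec_neg(v)) == ZERO            # v + (-v) = 0
assert vec_sub((5.0, 1.0), (2.0, 3.0)) == (3.0, -2.0)
```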

Frequently the difference of two 2-dimensional vectors is portrayed as the vector joining the tips of the two original vectors. Since vectors are determined by direction and length, and not by position, the two portrayals are equivalent.

Scalar Multiplication associates with every vector v in V and every scalar c another unique vector, usually written cv. Scalar multiplication satisfies the following properties:

Distributivity - for every scalar c and vectors v1 and v2 in V,

    c(v1 + v2) = cv1 + cv2

In the case of 2-dimensional vectors, this can easily be seen by extending the parallelogram illustration above: the sum of the vectors 2v1 and 2v2 is just twice the vector v1 + v2.
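The distributivity property with c = 2 can be checked numerically (a sketch, not part of the original notes):

```python
def vec_add(v1, v2):
    return (v1[0] + v2[0], v1[1] + v2[1])

def vec_scale(c, v):
    """Scalar multiplication, component-wise."""
    return (c * v[0], c * v[1])

v1 = (1.0, 2.0)
v2 = (3.0, -1.0)

# c(v1 + v2) = c v1 + c v2, here with c = 2: the sum of the vectors
# 2 v1 and 2 v2 is just twice the vector v1 + v2.
assert vec_add(vec_scale(2, v1), vec_scale(2, v2)) == vec_scale(2, vec_add(v1, v2))
```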

Distributivity of Scalars - for every two scalars c1 and c2 and every vector v in V,

    (c1 + c2)v = c1 v + c2 v

Associativity - for every two scalars c1 and c2 and every vector v in V,

    c1(c2 v) = (c1 c2)v

Identity - for every vector v in V,

    1v = v

2 Examples of Vector Spaces

Examples of vector spaces abound in mathematics. The most obvious example is the space of the usual vectors in R^2, from which we have drawn our illustrations in the sections above. But we frequently utilize several other vector spaces: the space of 3-dimensional vectors, the vector space of all polynomials of a fixed degree, and vector spaces of n x n matrices. We briefly discuss these below.

The Vector Space of 3-Dimensional Vectors

The vectors in R^3 also form a vector space, where in this case the vector operations of addition and scalar multiplication are again done component-wise. That is, if v1 = <x1, y1, z1> and v2 = <x2, y2, z2> are vectors, then addition is

    v1 + v2 = <x1 + x2, y1 + y2, z1 + z2>

and, if c is a scalar, scalar multiplication is given by

    c v1 = <c x1, c y1, c z1>

The axioms are easily verified. For example, the additive inverse of v1 = <x1, y1, z1> is just -v1 = <-x1, -y1, -z1>, and the zero vector is just 0 = <0, 0, 0>. Here the axioms simply restate what we have always been taught about these sets of vectors.

Vector Spaces of Polynomials

The set of quadratic polynomials of the form P(x) = a x^2 + b x + c also forms a vector space. We add two such polynomials by adding their respective coefficients. That is, if p1(x) = a1 x^2 + b1 x + c1 and p2(x) = a2 x^2 + b2 x + c2, then

    (p1 + p2)(x) = (a1 + a2)x^2 + (b1 + b2)x + (c1 + c2)

Scalar multiplication is done by multiplying the scalar by each coefficient. That is, if s is a scalar, then

    s p(x) = (s a)x^2 + (s b)x + (s c)

The axioms are again easily verified by performing the operations individually on like terms. A simple extension of the above is to consider the set of polynomials of degree less than or equal to n. It is easily seen that these also form a vector space.

Vector Spaces of Matrices

The set of n x n matrices forms a vector space. Two matrices can be added component-wise, and a matrix can be multiplied by a scalar. All axioms are easily verified.
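The polynomial vector space can be sketched by representing a quadratic a x^2 + b x + c as its coefficient triple (a, b, c); addition and scalar multiplication then act on the coefficients exactly as described above (an illustration, not from the notes):

```python
# A quadratic a x^2 + b x + c is represented by its coefficient
# triple (a, b, c); the vector-space operations act coefficient-wise.

def poly_add(p1, p2):
    """Add two quadratics coefficient by coefficient."""
    return tuple(x + y for x, y in zip(p1, p2))

def poly_scale(s, p):
    """Multiply each coefficient by the scalar s."""
    return tuple(s * x for x in p)

p1 = (1.0, 2.0, 3.0)    # x^2 + 2x + 3
p2 = (0.0, -1.0, 4.0)   # -x + 4

assert poly_add(p1, p2) == (1.0, 1.0, 7.0)      # x^2 + x + 7
assert poly_scale(2.0, p1) == (2.0, 4.0, 6.0)   # 2x^2 + 4x + 6
```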

3 Linear Independence and Bases

Given a vector space V, the concept of a basis for the vector space is fundamental to much of the work that we will do in computer graphics. This section discusses several topics relating to linear combinations of vectors, linear independence, and bases.

3.1 Linear Combinations

Let v1, v2, ..., vn be any vectors in a vector space V and let c1, c2, ..., cn be any set of scalars. Then an expression of the form

    c1 v1 + c2 v2 + ... + cn vn

is called a linear combination of the vectors. This element is clearly a member of the vector space V (just repeatedly apply the addition and scalar multiplication axioms). The set S that contains all possible linear combinations of v1, v2, ..., vn is called the span of v1, v2, ..., vn. We frequently say that S is spanned (or generated) by these n vectors. It is straightforward to show that the span of any set of vectors is again a vector space.

3.2 Linear Independence

A set of vectors v1, v2, ..., vn from a vector space V is called linearly independent in V if the equation

    c1 v1 + c2 v2 + ... + cn vn = 0

implies that ci = 0 for all i = 1, 2, ..., n. If a set of vectors is not linearly independent, it is called linearly dependent. In this case the equation above has a nonzero solution: there exist c1, c2, ..., cn, not all zero, such that

    c1 v1 + c2 v2 + ... + cn vn = 0

This implies that at least one of the vectors vi can be written in terms of the other n - 1 vectors in the set. Assuming that c1 is not zero, we can see that

    v1 = -(c2/c1) v2 - ... - (cn/c1) vn
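A concrete numerical sketch of linear dependence (not from the notes): in R^2, the set {v1, v2, v3} with v3 = v1 + v2 is linearly dependent, since the combination with coefficients 1, 1, -1 gives the zero vector, and v1 can then be recovered from the others exactly as in the formula above.

```python
# Dependent set in R^2: v3 = v1 + v2, so 1*v1 + 1*v2 + (-1)*v3 = 0.
v1, v2, v3 = (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)
c1, c2, c3 = 1.0, 1.0, -1.0

combo = tuple(c1 * a + c2 * b + c3 * c for a, b, c in zip(v1, v2, v3))
assert combo == (0.0, 0.0)

# Since c1 != 0, v1 can be written in terms of the other vectors:
# v1 = -(c2/c1) v2 - (c3/c1) v3
recovered = tuple(-(c2 / c1) * b - (c3 / c1) * c for b, c in zip(v2, v3))
assert recovered == v1
```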

Any set of vectors containing the zero vector 0 is linearly dependent.

3.2.1 Example

To give an example of a linearly independent set that everyone has seen, consider the three vectors

    i = <1, 0, 0>,  j = <0, 1, 0>,  k = <0, 0, 1>

in the vector space of vectors in R^3, and consider the equation

    c1 i + c2 j + c3 k = 0

If we simplify the left-hand side by performing the operations component-wise, and write the right-hand side component-wise, we have

    <c1, c2, c3> = <0, 0, 0>

which can only be solved if c1 = c2 = c3 = 0.

3.3 A Basis for a Vector Space

Let v1, v2, ..., vn be a set of vectors in a vector space V, and let S be the span of these vectors. If v1, v2, ..., vn is linearly independent, then we say that these vectors form a basis for S, and that S has dimension n. Since these vectors span S, any vector v in S can be written uniquely as

    v = c1 v1 + c2 v2 + ... + cn vn

The uniqueness follows from the argument that if there were two such representations,

    v = c1 v1 + c2 v2 + ... + cn vn    and    v = c1' v1 + c2' v2 + ... + cn' vn

then by subtracting the two equations we obtain

    0 = (c1 - c1') v1 + (c2 - c2') v2 + ... + (cn - cn') vn
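The i, j, k example can be sketched numerically (an illustration, not from the notes): forming the combination c1 i + c2 j + c3 k yields exactly <c1, c2, c3>, so the coefficients representing a vector in this basis are the vector's own components, and the zero vector is represented only by all-zero coefficients.

```python
i, j, k = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

def combo(c1, c2, c3):
    """c1*i + c2*j + c3*k, computed component-wise."""
    return tuple(c1 * a + c2 * b + c3 * c for a, b, c in zip(i, j, k))

v = (2.0, -5.0, 7.0)
assert combo(*v) == v                      # coordinates of v are its components
assert combo(0.0, 0.0, 0.0) == (0.0, 0.0, 0.0)
```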

which can only happen if all the differences ci - ci' are zero, since the vectors v1, v2, ..., vn are assumed to be linearly independent. Thus we necessarily have ci = ci' for all i = 1, 2, ..., n.

If S is the entire vector space V, we say that v1, v2, ..., vn forms a basis for V, and that V has dimension n.

All contents copyright (c) 2002 Computer Science Department, University of California, Davis. All rights reserved.