Chapter 5

1. Introduction

In this chapter you will learn how to use MATLAB to work with lengths, angles and projections in subspaces of R^n, and later in certain linear spaces.

2. Lengths and Angles

Recall from class that the dot product is your key for understanding lengths and angles. The best way to find the dot product of two vectors u and v in MATLAB is to do the following:

>> u = [3; 4; 0];   % example entries
>> v = [0; 5; 9];
>> dot(u,v)

Recall that the length of u is denoted by ||u|| and is calculated by computing sqrt(u . u), and if theta is the angle between u and v, then u . v = ||u|| ||v|| cos(theta).

Here are some other MATLAB commands that will help you get through this section.

>> sqrt(9)     % finds the square root of 9
>> acos(3/5)   % finds the inverse cosine of 3/5

(Do not use the MATLAB function norm.)

a. 5.1 #
b. 5.1 #5

3. Orthogonal Projections

In this section you will use MATLAB to project a vector onto a subspace. For now the subspace will be defined by an orthonormal basis. In Section 7 you will see another way to do this, even if you do not have an orthonormal basis.
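Putting these commands together: since u . v = ||u|| ||v|| cos(theta), you can get both lengths and the angle with dot, sqrt and acos alone. Here is a short sketch (the vectors are example values, not the ones from the exercises):

```matlab
% Example vectors (illustrative values)
u = [3; 4; 0];
v = [0; 5; 9];

len_u = sqrt(dot(u, u))                 % length of u; here sqrt(25) = 5
len_v = sqrt(dot(v, v))                 % length of v
theta = acos(dot(u, v)/(len_u*len_v))   % angle between u and v, in radians
```

The last line is just the formula u . v = ||u|| ||v|| cos(theta) solved for theta.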
Here is the key idea: Assume V is a subspace of R^n, (u1, u2, ..., um) is an orthonormal basis for V, and x is any vector in R^n. Since the orthogonal projection of x onto V is in V, and we have a basis for V, we only need to find the correct coefficients to use to build the projection. More specifically, we need to find the correct c1, c2, ..., cm such that proj_V(x) = c1 u1 + c2 u2 + ... + cm um. Here is the secret:

c1 = x . u1,   c2 = x . u2,   ...,   cm = x . um

Here is how you would do 5.1 #8: Find the orthogonal projection of the vector x below onto the subspace of R^4 spanned by v1, v2 and v3.

>> v1 = [1; 1; 1; 1];
>> v2 = [1; 1; -1; -1];
>> v3 = [1; -1; -1; 1];
>> u1 = (1/norm(v1))*v1;   % Normalize the vectors
>> u2 = (1/norm(v2))*v2;
>> u3 = (1/norm(v3))*v3;
>> x = [1; 0; 0; 0];
>> c1 = dot(x,u1);   % Calculate the coefficients
>> c2 = dot(x,u2);
>> c3 = dot(x,u3);
>> proj = c1*u1 + c2*u2 + c3*u3;   % Calculate the projection
>> rats(proj)   % Display projection with rational numbers
3/4
1/4
-1/4
1/4
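A good way to check any projection computed this way: the residual x - proj_V(x) must be orthogonal to every basis vector of V. Here is a small self-contained sketch with an example orthonormal basis of a plane in R^3 (these vectors are illustrative, not from the exercise):

```matlab
% Illustrative orthonormal basis (u1, u2) for a plane in R^3
u1 = (1/sqrt(2))*[1; 1; 0];
u2 = [0; 0; 1];
x  = [1; 2; 3];

proj  = dot(x, u1)*u1 + dot(x, u2)*u2;   % projection of x onto span(u1, u2)
resid = x - proj;                        % should be orthogonal to the subspace
dot(resid, u1)   % 0 (up to round-off)
dot(resid, u2)   % 0 (up to round-off)
```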
So the orthogonal projection of x onto the subspace spanned by v1, v2 and v3 is [3/4; 1/4; -1/4; 1/4].

a. 5.1 #
b. 5.1 #9 (Hint: Look at Fact 5.1.9)

4. Roll Your Own Orthonormal Basis

The Gram-Schmidt process is a way to turn any basis of R^n into an orthonormal basis. We are going to start with a basis (v1, v2, ..., vn) and turn it into an orthonormal basis (u1, u2, ..., un) that spans the same space. Here is the basic idea:

1) If v1 is not a unit vector, divide it by its magnitude to turn it into a vector with length 1. Call this new vector u1.
2) v2 has a component parallel to u1 and a component perpendicular to u1. To get u2, just normalize the perpendicular component.
3) v3 has a component in the subspace spanned by (u1, u2) and a component orthogonal to that subspace. u3 is the normalized perpendicular component.

If you keep doing step 3) on all of your vectors, you will end up with an orthonormal basis that spans the same space. Here is how you would do this in MATLAB, starting with a basis (v1, v2) of R^2.

MATLAB trick: The subplot command lets you put several graphs in the same window. The following is somewhat long, but worth typing in.
>> v1 = [1; -1];   % example basis vectors
>> v2 = [2; 3];
>> origin = [0; 0];
>> picture_1 = [v1 origin v2];
>> subplot(2,2,1);
>> plot(picture_1(1,:), picture_1(2,:), 'linewidth', 2);
>> axis([-3 3 -3 3]);
>> axis('square');
>> u1 = (1/norm(v1))*v1;
>> picture_2 = [u1 origin v2];
>> subplot(2,2,2);
>> plot(picture_2(1,:), picture_2(2,:), 'linewidth', 2);
>> axis([-3 3 -3 3]);
>> axis('square');
>> v2_parallel = dot(v2,u1)*u1;
>> v2_perp = v2 - v2_parallel;
>> picture_3 = [u1 origin v2_perp];
>> subplot(2,2,3);
>> plot(picture_3(1,:), picture_3(2,:), 'linewidth', 2);
>> axis([-3 3 -3 3]);
>> axis('square');
>> u2 = (1/norm(v2_perp))*v2_perp;
>> picture_4 = [u1 origin u2];
>> subplot(2,2,4);
>> plot(picture_4(1,:), picture_4(2,:), 'linewidth', 2);
>> axis([-3 3 -3 3]);
>> axis('square');
>> u1   % see the value of u1
>> u2   % see the value of u2

The new basis is (u1, u2).

a. 5.2 # (you do not need to plot your vectors for this one)
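The three steps above extend to any number of vectors, and once you see the pattern you can let MATLAB do the bookkeeping. Here is a sketch of the whole process as a loop; this is an illustration, not part of the lab, and it assumes the columns of V are linearly independent:

```matlab
% Gram-Schmidt sketch: columns of V form a basis; columns of U come out orthonormal.
V = [1 2; -1 3];      % example basis vectors as columns
U = zeros(size(V));
for k = 1:size(V, 2)
    w = V(:, k);
    for j = 1:k-1
        w = w - dot(V(:, k), U(:, j))*U(:, j);   % subtract components along earlier u's
    end
    U(:, k) = w/norm(w);                         % normalize what is left
end
U'*U   % should be (close to) the identity matrix
```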
b. 5.2 #9
c. 5.2 #

5. Transposes and Symmetry

Oftentimes, given a matrix, it is useful to make a new matrix whose columns are the rows of the original matrix. This new matrix is called the transpose of the original matrix. More formally, the entry in the i-th row and j-th column of the transpose of A is the entry in the j-th row and i-th column of A. The ' command makes this easy to do in MATLAB.

>> A = [1 2 3; 9 7 5]   % example matrix
A =
     1     2     3
     9     7     5
>> A'
ans =
     1     9
     2     7
     3     5

Recall:
1) A is a symmetric matrix iff A^T = A.
2) A is a skew-symmetric matrix iff A^T = -A.

Use symbolic 2 x 2 matrices in MATLAB to answer the following problems. Then generalize (without MATLAB) to the n x n case.

Hints: (A^T)^T = A and (AB)^T = B^T A^T.

Here is how to prove that A^T A is symmetric: (A^T A)^T = A^T (A^T)^T = A^T A.

5a. 5.3 #
5b. 5.3 #
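You can also test these transpose facts numerically before proving them. For instance, A'*A is always symmetric, B + B' is symmetric, and B - B' is skew-symmetric for any square B. A quick sketch with example matrices:

```matlab
A = [1 2 3; 9 7 5];        % example matrix
S = A'*A;                  % symmetric: (A'*A)' = A'*(A')' = A'*A
S - S'                     % all zeros

B = [1 2; 3 4];            % square example
(B + B') - (B + B')'       % all zeros, so B + B' is symmetric
(B - B') + (B - B')'       % all zeros, so B - B' is skew-symmetric
```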
6. Orthogonal Matrices

There are several things you know about orthogonal transformations, and you can use all of them to help decide whether a given matrix is orthogonal. What you know:

1) An orthogonal transformation preserves vector lengths (that is, T is orthogonal iff ||T(x)|| = ||x|| for all x).
2) An orthogonal transformation preserves angles between vectors.
3) The columns of the matrix form an orthonormal basis, which means that A^T A = I. (Do you see why? Look at Fact 5.3.7.)

Here is how to use MATLAB to check each of these properties. Note that 1) and 2) can only be used to show that a transformation is not orthogonal, while 3) can be used to show that a matrix is orthogonal and can also be used to show that a matrix is not orthogonal.

1) Use property 1:

>> A = [1 2; 3 4];   % example matrix
>> x = [1; 1];
>> norm(x)
ans =
    1.4142
>> norm(A*x)
ans =
    7.6158

So ||x|| is not equal to ||A*x||, and hence A is not orthogonal.

2) Use property 2: (This is 5.3 #)

>> A = (1/7)*[2 3 -6; -6 2 3; 3 6 2];
>> v1 = [1; 0; 0];
>> v2 = [1; 1; 0];
>> acos(dot(v1,v2)/(norm(v1)*norm(v2)))
ans =
    0.7854
>> acos(dot(A*v1,A*v2)/(norm(A*v1)*norm(A*v2)))

So the angle between v1 and v2 is not the same as the angle between A*v1 and A*v2. Hence A is not orthogonal, since it does not preserve angles.

3) Use property 3: (This is 5.3 #) Check whether A^T A = I.

>> A = (1/3)*[2 -2 1; 1 2 2; 2 1 -2];
>> A'*A

MATLAB returns the identity matrix (up to round-off error), so A is orthogonal.

a. 5.3 #
b. 5.3 #
c. 5.3 #5

7. Least Squares Solutions and Another Way to Calculate Orthogonal Projections

In a perfect world, every set of equations would be consistent. Unfortunately, our world is not perfect. Luckily, a theory has been developed to come up with the best solution to an inconsistent set of equations. More specifically, if Ax = b is inconsistent, we instead solve Ax* = proj_im(A)(b), the projection of b onto im(A). A way of finding x* is given below. Once you know x*, computing Ax* will give you the projection.
We will work an example like the one in 5.4 in MATLAB. The set of equations we want to solve is

x + y = 3
2x + y = 2
3x + y = 2

Note that this system is inconsistent. We will also use the best solution to orthogonally project b onto im(A).

>> A = [1 1; 2 1; 3 1];   % example system (illustrative values)
>> b = [3; 2; 2];
>> x = inv(A'*A)*A'*b
x =
   -0.5000
    3.3333
>> A*x   % the orthogonal projection of b onto im(A)

So the least squares solution is x* = [-0.5; 3.3333], and A*x* is the orthogonal projection of b onto the image of A.

7a. 5.4 # (You can use MATLAB instead of paper and pencil.)
7b. 5.4 #
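A side note on the computation: x = inv(A'*A)*A'*b solves the normal equations (A'A)x = A'b, but MATLAB's backslash operator computes the same least squares solution directly and is generally more numerically stable than forming the inverse. A self-contained sketch with an example system:

```matlab
% Example inconsistent system (illustrative values)
A = [1 1; 2 1; 3 1];
b = [3; 2; 2];

x_normal = inv(A'*A)*A'*b;   % least squares via the normal equations
x_slash  = A\b;              % least squares via backslash
x_normal - x_slash           % the two agree up to round-off

proj = A*x_slash;            % orthogonal projection of b onto im(A)
```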
8. Data Fitting

In this section you will use Section 7 to find the best equation to fit some given data. Here is an example in the style of the 5.4 exercises: we are trying to fit a linear function of the form f(t) = c0 + c1*t to the data points (0,0), (1,2) and (2,1) (example values). In other words, we want to find a solution to the following equations:

c0 + 0*c1 = 0   (since we want the point (0,0) on the line)
c0 + 1*c1 = 2   (since we want the point (1,2) on the line)
c0 + 2*c1 = 1   (since we want the point (2,1) on the line)

In matrix form, A*[c0; c1] = b:

>> A = [1 0; 1 1; 1 2];
>> b = [0; 2; 1];
>> x = inv(A'*A)*A'*b
x =
    0.5000
    0.5000

So our linear function is f(t) = 0.5 + 0.5t. We will use MATLAB to see how good it is. Remember that our three points are (0,0), (1,2) and (2,1).

MATLAB trick: The scatter command will plot points without connecting them.

>> points = [0 1 2; 0 2 1]
points =
     0     1     2
     0     2     1
>> scatter(points(1,:), points(2,:), 'filled');
>> axis([-1 3 -1 3])   % Set the axis
>> hold on   % Stop MATLAB from erasing our graph
>> t = linspace(-1, 3);   % Set up domain for linear function
>> y = 0.5 + 0.5*t;   % Define linear function
>> plot(t,y);   % Be sure to look at the graph

More problems:

8a. 5.4 #
8b. 5.4 # Hint: Here is how to plot a quadratic function such as y = -2t^2 + 3t + 5 (example coefficients) on an interval:

>> t = linspace(-1, 3);
>> y = -2*t.^2 + 3*t + 5;   % Note the . in front of the ^
>> plot(t,y);

8c. 5.4 #8

9. Inner Product Spaces

This is a good place to look back and see what you've accomplished in the last couple of chapters. Here's a recap:

Chapter 3: Useful concepts for R^n, including linear independence, dimension and subspace, among others.
Chapter 4: The Chapter 3 concepts generalized to arbitrary linear spaces.
Sections 5.1-5.4: dot products, lengths, angles and orthogonal projections for R^n.

Now we are going to look at dot products, lengths, angles and orthogonal projections for some spaces other than R^n. The basic idea is that, given a linear space, we make up a rule that has the same properties as the dot product on R^n. This rule is called an inner product.

Here is 5.5 Example 5: The space is C[0, 2*pi] and we define <f, g> to be the integral of f(t)g(t) dt from 0 to 2*pi.

Some notes about MATLAB before we do this:
1) trapz is the MATLAB command to integrate with the trapezoid rule.
2) MATLAB is subject to round-off errors. If MATLAB reports that something is close to 0, it is probably actually equal to 0 for the applications we are doing.
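The same trapz idea gives you lengths of functions: with this inner product, ||f|| = sqrt(<f, f>). For example, for f(t) = sin(t), the integral of sin^2(t) over [0, 2*pi] is pi, so ||f|| should come out close to sqrt(pi):

```matlab
t = linspace(0, 2*pi);          % grid on [0, 2*pi]
f = sin(t);
norm_f = sqrt(trapz(t, f.*f))   % approximately sqrt(pi), about 1.7725
```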
>> t = linspace(0, 2*pi);
>> trapz(t, sin(t).*cos(t))

The last line returns a tiny number (on the order of 10^-17), which says that the integral of f(t)g(t) = sin(t)cos(t) from 0 to 2*pi is close enough to 0 for us to call it equal to 0. So sin(t) and cos(t) are orthogonal with respect to this inner product.

9a. 5.5 #
9b. 5.5 #
9c. 5.5 #

10. Orthogonal Projections

Once you have an inner product, you can do orthogonal projections like you did in Section 3 above. We will go through 5.5 Example 8. We are using the formula from Fact 5.5.5:

f_n(t) = a0/sqrt(2) + b1 sin(t) + c1 cos(t) + ... + bn sin(nt) + cn cos(nt)

where

a0 = (1/pi) * the integral from -pi to pi of f(t)/sqrt(2) dt
bk = (1/pi) * the integral from -pi to pi of f(t) sin(kt) dt
ck = (1/pi) * the integral from -pi to pi of f(t) cos(kt) dt

>> t = linspace(-pi, pi);   % set up the domain
>> f = t;   % define the function
>> % calculate the coefficients
>> a0 = (1/pi)*trapz(t, 1/sqrt(2)*f);
>> b1 = (1/pi)*trapz(t, sin(t).*f);
>> c1 = (1/pi)*trapz(t, cos(t).*f);
>> b2 = (1/pi)*trapz(t, sin(2*t).*f);
>> c2 = (1/pi)*trapz(t, cos(2*t).*f);
>> b3 = (1/pi)*trapz(t, sin(3*t).*f);
>> c3 = (1/pi)*trapz(t, cos(3*t).*f);
>> b4 = (1/pi)*trapz(t, sin(4*t).*f);
>> c4 = (1/pi)*trapz(t, cos(4*t).*f);
>> % compute the approximations
>> f1 = a0*(1/sqrt(2)) + b1*sin(t) + c1*cos(t);
>> f2 = a0*(1/sqrt(2)) + b1*sin(t) + c1*cos(t) + b2*sin(2*t) + c2*cos(2*t);
>> f3 = a0*(1/sqrt(2)) + b1*sin(t) + c1*cos(t) + b2*sin(2*t) + c2*cos(2*t) + b3*sin(3*t) + c3*cos(3*t);
>> f4 = a0*(1/sqrt(2)) + b1*sin(t) + c1*cos(t) + b2*sin(2*t) + c2*cos(2*t) + b3*sin(3*t) + c3*cos(3*t) + b4*sin(4*t) + c4*cos(4*t);
>> % graph the approximations with the original function
>> subplot(2,2,1); plot(t,f); hold on; plot(t,f1);
>> subplot(2,2,2); plot(t,f); hold on; plot(t,f2);
>> subplot(2,2,3); plot(t,f); hold on; plot(t,f3);
>> subplot(2,2,4); plot(t,f); hold on; plot(t,f4);

10a. 5.5 #
10b. 5.5 #
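Typing each coefficient by hand gets tedious. The same computation can be written as a loop; this sketch just restates the commands above (f(t) = t on [-pi, pi], with the coefficient formulas from Fact 5.5.5):

```matlab
t = linspace(-pi, pi);
f = t;                                    % the function being approximated
n = 4;                                    % number of harmonics to keep

a0 = (1/pi)*trapz(t, f/sqrt(2));
fn = (a0/sqrt(2))*ones(size(t));          % start with the constant term
for k = 1:n
    bk = (1/pi)*trapz(t, sin(k*t).*f);
    ck = (1/pi)*trapz(t, cos(k*t).*f);
    fn = fn + bk*sin(k*t) + ck*cos(k*t);  % add the k-th harmonic
end
plot(t, f); hold on; plot(t, fn);         % compare f with its approximation
```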
Orthonormal Bases, Orthogonal Matrices The Major Ideas from Last Lecture Vector Span Subspace Basis Vectors Coordinates in different bases Matrix Factorization (Basics) The Major Ideas from Last Lecture
More informationReview of Linear Algebra
Review of Linear Algebra Definitions An m n (read "m by n") matrix, is a rectangular array of entries, where m is the number of rows and n the number of columns. 2 Definitions (Con t) A is square if m=
More informationTranspose & Dot Product
Transpose & Dot Product Def: The transpose of an m n matrix A is the n m matrix A T whose columns are the rows of A. So: The columns of A T are the rows of A. The rows of A T are the columns of A. Example:
More informationMATH 22A: LINEAR ALGEBRA Chapter 4
MATH 22A: LINEAR ALGEBRA Chapter 4 Jesús De Loera, UC Davis November 30, 2012 Orthogonality and Least Squares Approximation QUESTION: Suppose Ax = b has no solution!! Then what to do? Can we find an Approximate
More informationEXAM. Exam 1. Math 5316, Fall December 2, 2012
EXAM Exam Math 536, Fall 22 December 2, 22 Write all of your answers on separate sheets of paper. You can keep the exam questions. This is a takehome exam, to be worked individually. You can use your notes.
More informationExperiment # 3 The Gram-Schmidt Procedure and Orthogonal Projections. Student Number: Lab Section: PRA. Student Number: Grade: /20
ECE 417 CommLab Experiment # 3 The Gram-Schmidt Procedure and Orthogonal Projections Name: Date: Student Number: Lab Section: PRA Name: Student Number: Grade: /20 1 Purpose The purpose of this lab is to
More informationx 1 x 2. x 1, x 2,..., x n R. x n
WEEK In general terms, our aim in this first part of the course is to use vector space theory to study the geometry of Euclidean space A good knowledge of the subject matter of the Matrix Applications
More information1.6 and 5.3. Curve Fitting One of the broadest applications of linear algebra is to curve fitting, especially in determining unknown coefficients in
16 and 53 Curve Fitting One of the broadest applications of linear algebra is to curve fitting, especially in determining unknown coefficients in functions You should know that, given two points in the
More informationMAT2342 : Introduction to Applied Linear Algebra Mike Newman, fall Projections. introduction
MAT4 : Introduction to Applied Linear Algebra Mike Newman fall 7 9. Projections introduction One reason to consider projections is to understand approximate solutions to linear systems. A common example
More informationA TOUR OF LINEAR ALGEBRA FOR JDEP 384H
A TOUR OF LINEAR ALGEBRA FOR JDEP 384H Contents Solving Systems 1 Matrix Arithmetic 3 The Basic Rules of Matrix Arithmetic 4 Norms and Dot Products 5 Norms 5 Dot Products 6 Linear Programming 7 Eigenvectors
More informationMATH 3330 INFORMATION SHEET FOR TEST 3 SPRING Test 3 will be in PKH 113 in class time, Tues April 21
MATH INFORMATION SHEET FOR TEST SPRING Test will be in PKH in class time, Tues April See above for date, time and location of Test It will last 7 minutes and is worth % of your course grade The material
More informationTranspose & Dot Product
Transpose & Dot Product Def: The transpose of an m n matrix A is the n m matrix A T whose columns are the rows of A. So: The columns of A T are the rows of A. The rows of A T are the columns of A. Example:
More informationLinear Algebra Review. Vectors
Linear Algebra Review 9/4/7 Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa (UCSD) Cogsci 8F Linear Algebra review Vectors
More informationAnnouncements Monday, September 25
Announcements Monday, September 25 The midterm will be returned in recitation on Friday. You can pick it up from me in office hours before then. Keep tabs on your grades on Canvas. WeBWorK 1.7 is due Friday
More informationTypical Problem: Compute.
Math 2040 Chapter 6 Orhtogonality and Least Squares 6.1 and some of 6.7: Inner Product, Length and Orthogonality. Definition: If x, y R n, then x y = x 1 y 1 +... + x n y n is the dot product of x and
More information7. Dimension and Structure.
7. Dimension and Structure 7.1. Basis and Dimension Bases for Subspaces Example 2 The standard unit vectors e 1, e 2,, e n are linearly independent, for if we write (2) in component form, then we obtain
More informationL3: Review of linear algebra and MATLAB
L3: Review of linear algebra and MATLAB Vector and matrix notation Vectors Matrices Vector spaces Linear transformations Eigenvalues and eigenvectors MATLAB primer CSCE 666 Pattern Analysis Ricardo Gutierrez-Osuna
More information