APPM/MATH 4660 Homework 1 Solutions

This assignment is due on Google Drive by 11:59pm on Friday, January 30th. Submit your solution writeup and any (well-commented) code that you produce according to the submission guidelines found on the course webpage. Minimal credit will be given for incomplete solutions or solutions that do not provide details on how the solution is found. You may discuss the problems with your classmates, but all work (analysis and code) must be your own.

1. In class we talked extensively about the solution of the least-squares problem $\min_x \|b - Ax\|$ for the case when $A$ has full rank. In this exercise you'll investigate what happens when $A$ does not have full rank.

(a) Let $A$ be a real $m \times n$ matrix. Prove that for any system of equations $Ax = b$, the normal equations $A^T A x = A^T b$ (and, by extension, the least-squares problem) have a solution. Make no assumptions about the sizes of $m$ and $n$. Argue that when $A$ does not have full rank the least-squares problem has an infinite number of solutions.

Solution Method 1: First we need to remind ourselves of the Fundamental Theorem of Linear Algebra. The FTLA lays out relationships between the four fundamental subspaces of a matrix. Let $A$ be a real $m \times n$ matrix; then

$\mathbb{R}^m = \operatorname{range}(A) \oplus \operatorname{coker}(A)$ with $\operatorname{range}(A) \perp \operatorname{coker}(A)$,
$\mathbb{R}^n = \operatorname{corange}(A) \oplus \ker(A)$ with $\operatorname{corange}(A) \perp \ker(A)$.

We'll show that $A^T A x = A^T b$ always has a solution by showing that $A^T b \in \operatorname{range}(A^T A)$. First we need to convince ourselves that $\operatorname{range}(A^T A) = \operatorname{range}(A^T)$. We have

$\operatorname{range}(A^T) = \{\,\text{all } y : A^T x = y \text{ for some } x \in \mathbb{R}^m\,\},$

but, by the FTLA, $x \in \mathbb{R}^m$ can be uniquely decomposed into $x_r \in \operatorname{range}(A)$ and $x_c \in \operatorname{coker}(A)$. Then

$y = A^T x = A^T (x_r + x_c) = A^T x_r.$

Since $x_r \in \operatorname{range}(A)$ we have $y = A^T x_r = A^T A z$ for some $z \in \mathbb{R}^n$. Thus $\operatorname{range}(A^T) = \operatorname{range}(A^T A)$. The remainder of the proof is straightforward: $A^T b \in \operatorname{range}(A^T) = \operatorname{range}(A^T A)$, and so the normal equations have a solution.

Solution Method 2: We reformulate the normal equations as $A^T (Ax - b) = 0$ and argue that this system always has a solution. First notice that $b \in \mathbb{R}^m$, which means, by the FTLA, it can be decomposed uniquely into a vector $b_r \in \operatorname{range}(A)$ and a vector $b_c \in \operatorname{coker}(A)$. Since $b_r \in \operatorname{range}(A)$ there exists a particular solution $\hat{x}$ such that $A\hat{x} = b_r$. Plugging this solution into the factored linear system we have

$A^T (A\hat{x} - b) = A^T (A\hat{x} - b_r - b_c) = -A^T b_c = 0,$

where the last step follows from the fact that $b_c \in \operatorname{coker}(A)$.
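The claim in part (a) is easy to sanity-check numerically. The following is a minimal sketch (assuming NumPy) that builds a deliberately rank-deficient $A$, confirms that the normal equations are still consistent, and exhibits a second solution obtained by adding a null-space vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a 6x4 matrix of rank 2: the last two columns are linear
# combinations of the first two, so A is rank-deficient.
B = rng.standard_normal((6, 2))
A = np.hstack([B, B @ rng.standard_normal((2, 2))])
b = rng.standard_normal(6)

# A^T b lies in range(A^T A), so the normal equations are consistent;
# lstsq returns one of the infinitely many solutions.
x, *_ = np.linalg.lstsq(A.T @ A, A.T @ b, rcond=None)
residual = np.linalg.norm(A.T @ A @ x - A.T @ b)
print(residual)  # essentially zero: the normal equations have a solution

# Adding any vector z in ker(A) = ker(A^T A) gives another solution,
# so the solution set is infinite. The last right singular vector
# (singular value ~ 0) spans part of the null space.
_, _, Vt = np.linalg.svd(A)
z = Vt[-1]
print(np.linalg.norm(A.T @ A @ (x + z) - A.T @ b))  # also essentially zero
```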
Note that the $n \times n$ matrix $A^T A$ is a Gram matrix, so $\operatorname{rank}(A^T A) = \operatorname{rank}(A)$. If $A$ has full rank (i.e. $\operatorname{rank}(A) = n$) then $\operatorname{rank}(A^T A) = n$ and thus $A^T A$ is nonsingular. Since $A^T A$ is nonsingular, the normal equations have a unique solution. If $A$ is not full rank (i.e. $\operatorname{rank}(A) < n$ or $m < n$) then the normal equations have infinitely many solutions.

(b) The method of regularized least squares is often used in practice to solve the least-squares problem when $A$ does not have full rank. The regularized least-squares problem has the following form:

$\min_x \|b - Ax\|^2 + \lambda \|x\|^2,$

where $\lambda > 0$ is a (usually small) parameter. Show that the regularized least-squares problem is equivalent to the following:

$\min_x \left\| \begin{bmatrix} b \\ 0 \end{bmatrix} - \begin{bmatrix} A \\ \mu I \end{bmatrix} x \right\|^2, \quad \text{where } \mu = \sqrt{\lambda}.$

Solution: We start by writing out the sum of squared norms in the first form of the regularized least-squares problem term by term. We have

$\|b - Ax\|^2 = (b_1 - a_{11}x_1 - a_{12}x_2 - \cdots - a_{1n}x_n)^2 + (b_2 - a_{21}x_1 - a_{22}x_2 - \cdots - a_{2n}x_n)^2 + \cdots + (b_m - a_{m1}x_1 - a_{m2}x_2 - \cdots - a_{mn}x_n)^2$

$\lambda \|x\|^2 = \lambda x_1^2 + \lambda x_2^2 + \cdots + \lambda x_n^2 = (\mu x_1)^2 + (\mu x_2)^2 + \cdots + (\mu x_n)^2.$

Stacking these on top of each other we have

$\|b - Ax\|^2 + \lambda \|x\|^2 = (b_1 - a_{11}x_1 - \cdots - a_{1n}x_n)^2 + \cdots + (b_m - a_{m1}x_1 - \cdots - a_{mn}x_n)^2 + (0 - \mu x_1)^2 + (0 - \mu x_2)^2 + \cdots + (0 - \mu x_n)^2 = \left\| \begin{bmatrix} b \\ 0 \end{bmatrix} - \begin{bmatrix} A \\ \mu I \end{bmatrix} x \right\|^2.$
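The equivalence in part (b) can be verified numerically: the regularized objective and the stacked objective agree at any point $x$. A minimal sketch (assuming NumPy; the matrix, right-hand side, and $\lambda$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, lam = 8, 5, 1e-3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
mu = np.sqrt(lam)

# Stacked formulation: minimize || [b; 0] - [A; mu*I] x ||
A_hat = np.vstack([A, mu * np.eye(n)])
b_hat = np.concatenate([b, np.zeros(n)])
x, *_ = np.linalg.lstsq(A_hat, b_hat, rcond=None)

# The two objectives agree term by term:
# ||b - Ax||^2 + lam*||x||^2  ==  ||b_hat - A_hat x||^2
obj_reg = np.linalg.norm(b - A @ x)**2 + lam * np.linalg.norm(x)**2
obj_stacked = np.linalg.norm(b_hat - A_hat @ x)**2
print(abs(obj_reg - obj_stacked))  # agrees to rounding error
```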
(c) Derive the normal equations for the regularized least-squares problem given in part (b).

Solution: The regularized least-squares problem has the form

$\min_x \| \hat{b} - \hat{A} x \|, \quad \text{where } \hat{b} = \begin{bmatrix} b \\ 0 \end{bmatrix} \text{ and } \hat{A} = \begin{bmatrix} A \\ \mu I \end{bmatrix}.$

Then the equivalent normal equations are given by

$\hat{A}^T \hat{A} x = \hat{A}^T \hat{b}$

$\begin{bmatrix} A^T & \mu I \end{bmatrix} \begin{bmatrix} A \\ \mu I \end{bmatrix} x = \begin{bmatrix} A^T & \mu I \end{bmatrix} \begin{bmatrix} b \\ 0 \end{bmatrix}$

$(A^T A + \lambda I) x = A^T b.$

(d) Prove that the normal equations of the regularized problem have a unique solution for every positive value of $\lambda$.

Solution Method 1: The matrix $A^T A$ is a Gram matrix and is thus symmetric positive semidefinite. If $\lambda > 0$ then $A^T A + \lambda I$ is symmetric positive definite and thus nonsingular. Since $A^T A + \lambda I$ is nonsingular, the associated normal equations have a unique solution.

Solution Method 2: Notice that the matrix $\begin{bmatrix} A \\ \mu I \end{bmatrix}$ necessarily has full column rank, and repeat the argument from part (a).
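Method 1 of part (d) hinges on the eigenvalue shift: adding $\lambda I$ moves every eigenvalue of $A^T A$ up by $\lambda$, so the smallest one goes from $\geq 0$ to $\geq \lambda > 0$. A quick numerical illustration (assuming NumPy), using a rank-deficient $A$ so that $A^T A$ is genuinely singular before regularization:

```python
import numpy as np

rng = np.random.default_rng(2)

# Rank-deficient A: duplicated columns make A^T A singular.
B = rng.standard_normal((6, 2))
A = np.hstack([B, B])              # 6x4, rank 2
G = A.T @ A
lam = 1e-6

eigs = np.linalg.eigvalsh(G)                       # ascending order
eigs_reg = np.linalg.eigvalsh(G + lam * np.eye(4))
print(eigs[0])      # ~0: A^T A is only positive semidefinite
print(eigs_reg[0])  # ~lam > 0: A^T A + lam*I is positive definite
```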
2. In this problem you will explore the stability of various methods for computing the QR factorization of a matrix.

(a) Write a computer code that takes an $m \times n$ matrix $A$ and uses Gram-Schmidt to compute its condensed QR factorization.

(b) Write a computer code that takes an $m \times n$ matrix $A$ and uses Modified Gram-Schmidt to compute its condensed QR factorization.

(c) Compute the QR factorization of random $n \times n$ matrices for $n = 2, 4, 8, \ldots, 128$ using (i) Gram-Schmidt, (ii) Modified Gram-Schmidt, and (iii) a canned QR function from your favorite scripting language (which almost certainly uses Householder's method). For each matrix and factorization, measure the orthogonality of the columns of $Q$ using $\eta = \|\hat{Q}^T \hat{Q} - I\|$. On a single set of axes, produce a log-log plot of $n$ vs. $\eta$ for each factorization method. What can you conclude from your results?

Solution: The following figure shows $n$ vs. $\eta$ for Gram-Schmidt, Modified Gram-Schmidt, and Householder QR. From the figure we see that for all three methods the orthogonality of the columns of $Q$ degrades only slightly as $n$ increases. We also notice that naive Gram-Schmidt and Modified Gram-Schmidt perform only slightly worse than Householder QR. This is because random matrices are typically well conditioned, and thus even naive Gram-Schmidt is an effective orthogonalization technique. Remember that a matrix $A$ is ill-conditioned when its columns are nearly linearly dependent. The odds of generating a random matrix with nearly linearly dependent columns is infinitesimal.
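For parts (a) and (b), a minimal sketch of both condensed factorizations (assuming NumPy; function names are my own) is below. The only difference is where the projection coefficients come from: classical Gram-Schmidt projects the *original* column against all previous $q_i$, while Modified Gram-Schmidt immediately deflates the *remaining* columns as each $q_j$ is produced.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Condensed QR via classical (naive) Gram-Schmidt."""
    m, n = A.shape
    Q, R = np.zeros((m, n)), np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # coefficient from ORIGINAL column
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

def modified_gram_schmidt_qr(A):
    """Condensed QR via Modified Gram-Schmidt."""
    m, n = A.shape
    Q, R = A.astype(float).copy(), np.zeros((n, n))
    for j in range(n):
        R[j, j] = np.linalg.norm(Q[:, j])
        Q[:, j] /= R[j, j]
        for k in range(j + 1, n):         # deflate the REMAINING columns now
            R[j, k] = Q[:, j] @ Q[:, k]
            Q[:, k] -= R[j, k] * Q[:, j]
    return Q, R

def eta(Q):
    """Orthogonality measure eta = ||Q^T Q - I||."""
    return np.linalg.norm(Q.T @ Q - np.eye(Q.shape[1]))

rng = np.random.default_rng(3)
A = rng.standard_normal((64, 64))
for qr in (gram_schmidt_qr, modified_gram_schmidt_qr):
    Q, R = qr(A)
    print(qr.__name__, eta(Q), np.linalg.norm(A - Q @ R))
```

On a well-conditioned random matrix like this one, both variants reproduce $A = QR$ to rounding error and keep $\eta$ small, consistent with the plot described in the solution.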
(d) Consider the matrix

$A = \begin{bmatrix} 1 & 1 & 1 \\ \epsilon & 0 & 0 \\ 0 & \epsilon & 0 \\ 0 & 0 & \epsilon \end{bmatrix}.$

Repeat the general idea of part (c), but this time keep the size of the matrix fixed as shown, and let $\epsilon$ decrease like $\epsilon = 10^{-l}$ for $l = 2, 3, \ldots, 10$. Produce a log-log plot of $\epsilon$ vs. $\eta$ for each factorization. What can you conclude about your results this time?

Solution: The following figure shows $\epsilon$ vs. $\eta$ for Gram-Schmidt, Modified Gram-Schmidt, and Householder QR. From the figure we can see that Gram-Schmidt does a terrible job of producing an orthogonal basis for any value of $\epsilon < 10^{-5}$; for much smaller $\epsilon$ the columns of $Q$ produced by naive Gram-Schmidt are not remotely orthogonal. Modified Gram-Schmidt does considerably better, but still experiences a minor loss of orthogonality for small $\epsilon$. Finally, Householder QR is entirely stable, producing a basis that is orthogonal to machine precision.
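The experiment in part (d) is easy to reproduce. The sketch below (assuming NumPy) builds the $\epsilon$-matrix, runs classical Gram-Schmidt against NumPy's Householder-based `np.linalg.qr`, and prints $\eta$ for a few values of $l$; the catastrophic loss of orthogonality in classical Gram-Schmidt shows up clearly by $\epsilon = 10^{-8}$.

```python
import numpy as np

def eps_matrix(eps):
    """The fixed 4x3 test matrix with nearly dependent columns as eps -> 0."""
    return np.array([[1.0, 1.0, 1.0],
                     [eps, 0.0, 0.0],
                     [0.0, eps, 0.0],
                     [0.0, 0.0, eps]])

def eta(Q):
    return np.linalg.norm(Q.T @ Q - np.eye(Q.shape[1]))

def cgs_q(A):
    """Q factor from classical Gram-Schmidt (compact column form)."""
    Q = np.zeros_like(A)
    for j in range(A.shape[1]):
        v = A[:, j] - Q[:, :j] @ (Q[:, :j].T @ A[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

for l in (2, 5, 8):
    A = eps_matrix(10.0 ** -l)
    Q_house, _ = np.linalg.qr(A, mode='reduced')   # Householder QR
    print(l, eta(cgs_q(A)), eta(Q_house))
```

For small $\epsilon$ the inner products $q_2^T q_3$ computed by classical Gram-Schmidt are $O(1)$ rather than $O(u)$, while Householder QR keeps $\eta$ at machine precision for every $\epsilon$, matching the conclusions above.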
3. Consider the data $(t_i, y_i)$ in the following table.

(a) Use the QR factorization code you developed in Problem 2 to find the best least-squares fit of the given data using polynomials of degree 1, 2, and 3. For each model, report the least-squares error $\|b - Ax\|$ and plot the fitted curve along with the data.

Solution: The fitted models along with the data can be seen in the figure at the bottom of the page. The least-squares errors for the polynomial fits are as follows:

poly degree | least-squares error
1           | 39e+
2           | 144e
3           | e-4

From the table (and the figures) we see that the quadratic and cubic models fit the data significantly better than the linear model. As expected, the cubic model fits the data better than the quadratic model, but only slightly in terms of least-squares error.

(b) It is sometimes desirable to fit data using models other than polynomials. An extremely popular choice is to fit the data to an exponential model of the form $y = A e^{Bt}$, where here $A$ and $B$ are the model parameters you're trying to determine. It turns out that you can do this using nearly the same method as you used in part (a). Describe in detail how you can accomplish this. (This method is called exponential regression in statistics and is covered in nearly every textbook on numerical analysis. See if you can work it out on your own, but don't be afraid to use Google. Hint: Think logarithmically.)

Solution: Taking the natural log of both sides of the exponential model we have

$\ln y = \ln A + Bt.$

We see that the logarithmic equation is linear in $t$ and thus fits the form of the typical linear least-squares problem. We can then write the minimization problem as

$\min_x \|b - Mx\|, \quad \text{where } M = \begin{bmatrix} 1 & t_1 \\ 1 & t_2 \\ \vdots & \vdots \\ 1 & t_n \end{bmatrix}, \quad x = \begin{bmatrix} \ln A \\ B \end{bmatrix}, \quad b = \begin{bmatrix} \ln y_1 \\ \ln y_2 \\ \vdots \\ \ln y_n \end{bmatrix}.$

Forming $M$ and $b$ we can solve the linear least-squares problem for $x$ using the usual method by QR decomposition of $M$. Then the fitted model is $y = A e^{Bt}$, where $A = e^{x_1}$ and $B = x_2$.
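The log-linearization in part (b) can be sketched as follows (assuming NumPy). The data here are hypothetical stand-ins, since the assignment's table values did not survive transcription; they are generated from a known exponential so the recovered parameters can be checked.

```python
import numpy as np

# Hypothetical data for illustration only (NOT the assignment's table):
# generated from y = 2*exp(0.5 t) with small multiplicative noise.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 2.0, 10)
y = 2.0 * np.exp(0.5 * t) * np.exp(0.01 * rng.standard_normal(10))

# Linearize: ln y = ln A + B t, then solve the 2-parameter LS problem
# via QR, as in part (a).
M = np.column_stack([np.ones_like(t), t])
b = np.log(y)
Q, R = np.linalg.qr(M, mode='reduced')
x = np.linalg.solve(R, Q.T @ b)          # x = [ln A, B]
A_fit, B_fit = np.exp(x[0]), x[1]
print(A_fit, B_fit)                      # close to 2 and 0.5
```

Note that this minimizes the residual in log space, which is not identical to minimizing $\sum_i (y_i - A e^{B t_i})^2$ directly, but it is the standard linear-algebra route to the exponential fit.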
(c) Using the method you described in part (b), determine the best least-squares fit of the form $y = A e^{Bt}$ to the given data. Plot the fitted curve along with the data and report the least-squares error, which in this case is given by $\sum_{i=1}^{n} (y_i - A e^{B t_i})^2$.

Solution: Proceeding by the method described in (b) we find that $A \approx 46$ and $B \approx 0374$. The exponential least-squares error is found to be

$\sum_{i=1}^{n} (y_i - A e^{B t_i})^2 = 418e+.$

Notice that this is even worse than the linear polynomial fit found in part (a). Indeed, we can see from the figure that the exponential model does not fit the data particularly well. This should not be particularly surprising: the data clearly did not come from an exponential process (evidenced by the fact that a simple quadratic model explained the data extremely well), so there is no reason to expect that an exponential model would explain the data well.
More informationA Quick Tour of Linear Algebra and Optimization for Machine Learning
A Quick Tour of Linear Algebra and Optimization for Machine Learning Masoud Farivar January 8, 2015 1 / 28 Outline of Part I: Review of Basic Linear Algebra Matrices and Vectors Matrix Multiplication Operators
More informationlecture 2 and 3: algorithms for linear algebra
lecture 2 and 3: algorithms for linear algebra STAT 545: Introduction to computational statistics Vinayak Rao Department of Statistics, Purdue University August 27, 2018 Solving a system of linear equations
More informationApplied Mathematics 205. Unit II: Numerical Linear Algebra. Lecturer: Dr. David Knezevic
Applied Mathematics 205 Unit II: Numerical Linear Algebra Lecturer: Dr. David Knezevic Unit II: Numerical Linear Algebra Chapter II.3: QR Factorization, SVD 2 / 66 QR Factorization 3 / 66 QR Factorization
More informationLecture 6, Sci. Comp. for DPhil Students
Lecture 6, Sci. Comp. for DPhil Students Nick Trefethen, Thursday 1.11.18 Today II.3 QR factorization II.4 Computation of the QR factorization II.5 Linear least-squares Handouts Quiz 4 Householder s 4-page
More informationLecture 6. Numerical methods. Approximation of functions
Lecture 6 Numerical methods Approximation of functions Lecture 6 OUTLINE 1. Approximation and interpolation 2. Least-square method basis functions design matrix residual weighted least squares normal equation
More informationEigenvalues, Eigenvectors, and Diagonalization
Week12 Eigenvalues, Eigenvectors, and Diagonalization 12.1 Opening Remarks 12.1.1 Predicting the Weather, Again View at edx Let us revisit the example from Week 4, in which we had a simple model for predicting
More informationComputational Methods. Least Squares Approximation/Optimization
Computational Methods Least Squares Approximation/Optimization Manfred Huber 2011 1 Least Squares Least squares methods are aimed at finding approximate solutions when no precise solution exists Find the
More informationMATH 310, REVIEW SHEET
MATH 310, REVIEW SHEET These notes are a summary of the key topics in the book (and follow the book pretty closely). You should be familiar with everything on here, but it s not comprehensive, so please
More informationSTAT 309: MATHEMATICAL COMPUTATIONS I FALL 2018 LECTURE 9
STAT 309: MATHEMATICAL COMPUTATIONS I FALL 2018 LECTURE 9 1. qr and complete orthogonal factorization poor man s svd can solve many problems on the svd list using either of these factorizations but they
More informationVector Spaces, Orthogonality, and Linear Least Squares
Week Vector Spaces, Orthogonality, and Linear Least Squares. Opening Remarks.. Visualizing Planes, Lines, and Solutions Consider the following system of linear equations from the opener for Week 9: χ χ
More information4.5 Rational functions.
4.5 Rational functions. We have studied graphs of polynomials and we understand the graphical significance of the zeros of the polynomial and their multiplicities. Now we are ready to etend these eplorations
More informationNotes on Eigenvalues, Singular Values and QR
Notes on Eigenvalues, Singular Values and QR Michael Overton, Numerical Computing, Spring 2017 March 30, 2017 1 Eigenvalues Everyone who has studied linear algebra knows the definition: given a square
More informationBasic methods to solve equations
Roberto s Notes on Prerequisites for Calculus Chapter 1: Algebra Section 1 Basic methods to solve equations What you need to know already: How to factor an algebraic epression. What you can learn here:
More information