SECTION 7: CURVE FITTING. MAE 4020/5020 Numerical Methods with MATLAB
- Neal Powell
2 Introduction
3 Curve Fitting
Often we have data, y, that is a function of some independent variable, x, but the underlying relationship is unknown. We know the x's and y's (perhaps only approximately), but we don't know f(x): measured data, tabulated data. Determine a function (i.e., a curve) that best describes the relationship between x and y, an approximation to the unknown f(x). This is curve fitting.
4 Regression vs. Interpolation
We'll look at two categories of curve fitting:
Least squares regression: noisy data, so there is uncertainty in the y value for a given x value. We want good agreement between f(x) and the data points; the curve (i.e., f(x)) may not pass through any data points.
Polynomial interpolation: data points are known exactly (noiseless data); the resulting curve passes through all data points.
5 Review of Basic Statistics
Before moving on to discuss least squares regression, we'll first review a few basic concepts from statistics.
6 Basic Statistical Quantities
Arithmetic mean, the average or expected value: ybar = (1/n) Σ y_i
Standard deviation (unbiased), a measure of the spread of the data about the mean: s_y = sqrt( S_t / (n - 1) ), where S_t = Σ (y_i - ybar)^2 is the total sum of the squares of the residuals about the mean.
7 Basic Statistical Quantities
Variance, another measure of spread, is the square of the standard deviation: s_y^2 = S_t / (n - 1). A useful measure due to its relationship with the power and power spectral density of a signal or data set.
8 Normal (Gaussian) Distribution
Many naturally occurring random processes are normally distributed, e.g., measurement noise. Very often we assume the noise in our data is Gaussian. Probability density function (pdf): p(y) = ( 1 / (σ*sqrt(2π)) ) * exp( -(y - μ)^2 / (2σ^2) ), where σ^2 is the variance and μ is the mean of the random variable, y.
9 Statistics in MATLAB
Generation of random values is very useful for simulations including noise.
Uniformly distributed: rand(m,n). E.g., generate an m×n matrix of uniformly distributed points between xmin and xmax: x = xmin + (xmax - xmin)*rand(m,n)
Normally distributed: randn(m,n). E.g., generate an m×n matrix of normally distributed points with mean mu and standard deviation sigma: x = mu + sigma*randn(m,n)
10 Statistics in MATLAB
Histogram plots: hist(x,nbins). A graphical depiction of the variation of random quantities: plots the frequency of occurrence of ranges (bins) of values, providing insight into the nature of the distribution. E.g., generate a histogram of the values in the vector x with 20 bins: hist(x,20). An optional input argument, binx, instead specifies the x values at the centers of the bins: hist(x,binx).
Built-in statistical functions: min.m, max.m, mean.m, var.m, std.m, median.m, mode.m
11 Statistics in MATLAB (example plots)
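A minimal sketch tying these functions together; the mean, standard deviation, and sample count below are arbitrary choices for illustration:

```matlab
% Generate 10,000 normally distributed samples with mean 5, std 2
mu = 5; sigma = 2;
x = mu + sigma*randn(1, 1e4);

% Sample statistics should be close to the specified values
fprintf('mean = %.2f, std = %.2f\n', mean(x), std(x));

% Histogram with 20 bins to visualize the distribution
hist(x, 20);
```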
12 Linear Least Squares Regression
13 Linear Regression
Noisy data: y_i values at known x_i values. We suspect the relationship between x and y is linear, i.e., we assume y = a0 + a1*x. Determine the a0 and a1 that define the best-fit line for the data. How do we define the best fit?
14 Measured Data
We assumed a linear relationship between x and y: y = a0 + a1*x. Due to noise, we can't measure y exactly; we can only approximate its value at each x. Measured values are the true value of y plus some random error, or residual: y_i = a0 + a1*x_i + e_i.
15 Best Fit Criteria
Noisy data do not all lie on a single line; there is a discrepancy between each point and the line fit to the data. The error, or residual: e_i = y_i - (a0 + a1*x_i). Minimize some measure of this residual:
Minimize the sum of the residuals? Positive and negative errors can cancel: a non-unique fit.
Minimize the sum of the absolute values of the residuals? The effect of the sign of the error is eliminated, but still not a unique fit.
Minimize the maximum error (minimax criterion)? Excessive influence is given to single outlying points.
16 Least Squares Criterion
A better fitting criterion is to minimize the sum of the squares of the residuals: S_r = Σ e_i^2 = Σ (y_i - a0 - a1*x_i)^2. This yields a unique best-fit line for a given set of data. The sum of the squares of the residuals is a function of the two fitting parameters, a0 and a1. Minimize S_r by setting its partial derivatives to zero and solving for a0 and a1.
17 Least Squares Criterion
At its minimum point, the partial derivatives of S_r with respect to a0 and a1 will be zero:
∂S_r/∂a0 = -2 Σ (y_i - a0 - a1*x_i) = 0
∂S_r/∂a1 = -2 Σ x_i*(y_i - a0 - a1*x_i) = 0
Breaking up the summations:
Σ y_i - Σ a0 - Σ a1*x_i = 0
Σ x_i*y_i - Σ a0*x_i - Σ a1*x_i^2 = 0
18 Normal Equations
These form a system of two equations with two unknowns, a0 and a1:
n*a0 + (Σ x_i)*a1 = Σ y_i    (1)
(Σ x_i)*a0 + (Σ x_i^2)*a1 = Σ x_i*y_i    (2)
In matrix form:
[ n       Σ x_i   ] [a0]   [ Σ y_i     ]
[ Σ x_i   Σ x_i^2 ] [a1] = [ Σ x_i*y_i ]    (3)
These are the normal equations.
19 Normal Equations
The normal equations can be solved for a0 and a1:
a1 = ( n*Σ x_i*y_i - Σ x_i * Σ y_i ) / ( n*Σ x_i^2 - (Σ x_i)^2 )
a0 = ybar - a1*xbar
Or solve the matrix form of the normal equations, (3), in MATLAB using mldivide.m (\): a = A\b
20 Linear Least Squares Example
Noisy data with a suspected linear relationship. Calculate the summation terms in the normal equations: n, Σ x_i, Σ y_i, Σ x_i^2, Σ x_i*y_i.
21 Linear Least Squares Example
Assemble the normal equation matrices and solve the normal equations for the vector of coefficients, a = [a0; a1], using mldivide.m.
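The steps above can be sketched in MATLAB; the data here are synthetic (a line with intercept 1 and slope 2 plus Gaussian noise), chosen only for illustration:

```matlab
% Synthetic noisy linear data: y = 1 + 2*x + noise
x = (0:0.5:10)';
y = 1 + 2*x + 0.5*randn(size(x));

% Assemble and solve the normal equations
n = length(x);
A = [n      sum(x);
     sum(x) sum(x.^2)];
b = [sum(y); sum(x.*y)];
a = A\b;        % a(1) = a0 (intercept), a(2) = a1 (slope)

% Plot the data and the best-fit line
plot(x, y, 'o', x, a(1) + a(2)*x, '-')
```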
22 Goodness of Fit
How well does a function fit the data? Is a linear fit best, or would a quadratic, higher-order polynomial, or other nonlinear function be better? We want a way to quantify goodness of fit. Quantify the spread of the data about the mean prior to regression: S_t = Σ (y_i - ybar)^2. Following regression, quantify the spread of the data about the regression line (or curve): S_r = Σ (y_i - a0 - a1*x_i)^2.
23 Goodness of Fit
S_t quantifies the spread of the data about the mean. S_r quantifies the spread about the best-fit line (curve): the spread that remains after the trend is explained, the unexplained sum of the squares. The difference, S_t - S_r, represents the reduction in data spread after regression explains the underlying trend. Normalizing to S_t gives the coefficient of determination: r^2 = (S_t - S_r)/S_t.
24 Coefficient of Determination
For a perfect fit there is no variation in the data about the regression line: S_r = 0, so r^2 = 1. If the fit provides no improvement over simply characterizing the data by their mean value: S_r = S_t, so r^2 = 0. If the fit is worse at explaining the data than their mean value: S_r > S_t, so r^2 < 0.
25 Coefficient of Determination
Calculate r^2 for the previous example.
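A short sketch of computing r^2 for a line fit; the data are synthetic, chosen only for illustration:

```matlab
% Synthetic noisy linear data
x = (0:0.5:10)';
y = 1 + 2*x + 0.5*randn(size(x));

% Line fit via the normal equations
n = length(x);
a = [n sum(x); sum(x) sum(x.^2)] \ [sum(y); sum(x.*y)];

% Coefficient of determination
St = sum((y - mean(y)).^2);           % spread about the mean
Sr = sum((y - a(1) - a(2)*x).^2);     % spread about the fit
r2 = (St - Sr)/St                     % near 1 for strongly linear data
```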
26 Coefficient of Determination
Don't rely too heavily on the value of r^2. Anscombe's famous data sets (Chapra): the same line fits all four data sets, with the same r^2 in each case.
27 Linearization of Nonlinear Relationships
28 Nonlinear Functions
Not all data can be explained by a linear relationship to an independent variable, e.g.:
Exponential model: y = α*exp(β*x)
Power equation: y = α*x^β
Saturation-growth-rate equation: y = α*x/(β + x)
29 Nonlinear Functions
Methods for nonlinear curve fitting:
Linearization of the nonlinear relationship: transform the dependent and/or independent data values, apply linear least squares regression, then inverse-transform the determined coefficients back to those that define the nonlinear functional relationship.
Nonlinear regression: treat as an optimization problem (more later).
30 Linearizing an Exponential Relationship
Have noisy data that are believed to be best described by an exponential relationship: y = α*exp(β*x). Linearize the fitting equation by taking the natural log of both sides: ln(y) = ln(α) + β*x, or Y = a0 + a1*x, where Y = ln(y), a0 = ln(α), and a1 = β.
31 Linearizing an Exponential Relationship
Fit a line to the transformed data, (x_i, ln(y_i)), using linear least squares regression to determine a0 and a1. Can calculate r^2 for the line fit to the transformed data. Note that the original y data must be positive.
32 Linearizing an Exponential Relationship
Transform the linear fitting parameters, a0 and a1, back to the parameters defining the exponential relationship. Exponential fit: y = α*exp(β*x), where α = exp(a0) and β = a1. Note that r^2 for this fit is different than that for the line fit to the transformed data.
33 Linearizing an Exponential Relationship (example plots)
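A sketch of the exponential linearization procedure; the parameters and noise model are arbitrary choices for illustration:

```matlab
% Synthetic noisy exponential data (alpha = 2, beta = 0.3, for illustration)
x = (0.5:0.5:8)';
y = 2*exp(0.3*x) .* exp(0.05*randn(size(x)));  % multiplicative noise keeps y > 0

% Transform: Y = ln(y), then fit a line Y = a0 + a1*x
p = polyfit(x, log(y), 1);    % p = [a1, a0]

% Inverse-transform back to the exponential parameters
alpha = exp(p(2));
beta  = p(1);

plot(x, y, 'o', x, alpha*exp(beta*x), '-')
```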
34 Linearizing a Power Equation
Have noisy data that are believed to be best described by a power equation: y = α*x^β. Linearize the fitting equation by taking the log of both sides: log(y) = log(α) + β*log(x), or Y = a0 + a1*X, where Y = log(y), X = log(x), a0 = log(α), and a1 = β.
35 Linearizing a Power Equation
Fit a line to the transformed data, (log(x_i), log(y_i)), using linear least squares regression to determine a0 and a1. Can calculate r^2 for the line fit to the transformed data. Note that the original data, both x and y, must be positive.
36 Linearizing a Power Equation
Transform the linear fitting parameters, a0 and a1, back to the parameters defining the power equation. Power equation: y = α*x^β, where α = 10^a0 and β = a1. Note that r^2 for this fit is different than that for the line fit to the transformed data.
37 Linearizing a Power Equation (example plots)
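The same workflow applies to a power equation, now with both variables transformed; the parameters below are arbitrary choices for illustration:

```matlab
% Synthetic noisy power-law data (alpha = 3, beta = 1.5, for illustration)
x = (1:0.5:10)';
y = 3*x.^1.5 .* 10.^(0.02*randn(size(x)));   % noise keeps y > 0

% Transform both variables: fit log10(y) = a0 + a1*log10(x)
p = polyfit(log10(x), log10(y), 1);

% Inverse-transform back to the power-equation parameters
alpha = 10^p(2);
beta  = p(1);

loglog(x, y, 'o', x, alpha*x.^beta, '-')
```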
38 Linearizing a Saturation Growth Rate Equation
Have noisy data that are believed to be best described by a saturation growth rate equation: y = α*x/(β + x). Linearize the fitting equation by inverting both sides: 1/y = 1/α + (β/α)*(1/x), or Y = a0 + a1*X, where Y = 1/y, X = 1/x, a0 = 1/α, and a1 = β/α.
39 Linearizing a Saturation Growth Rate Equation
Fit a line to the transformed data, (1/x_i, 1/y_i), using linear least squares regression to determine a0 and a1. Can calculate r^2 for the line fit to the transformed data.
40 Linearizing a Saturation Growth Rate Equation
Transform the linear fitting parameters, a0 and a1, back to the parameters defining the saturation growth rate equation. Saturation growth rate equation: y = α*x/(β + x), where α = 1/a0 and β = a1/a0. Note that r^2 for this fit is different than that for the line fit to the transformed data.
41 Linearizing a Saturation Growth Rate Equation (example plots)
42 Polynomial Regression
43 Polynomial Regression
So far we've looked at fitting straight lines to linear and linearized data sets. Can also fit mth-order polynomials directly to data using polynomial regression. Same fitting criterion as linear regression: minimize the sum of the squares of the residuals. There are m+1 fitting parameters for an mth-order polynomial, so m+1 normal equations.
44 Polynomial Regression
Assume, for example, that we have data we believe to be quadratic in nature: 2nd-order polynomial regression. Fitting equation: y = a0 + a1*x + a2*x^2. The best fit will minimize the sum of the squares of the residuals: S_r = Σ (y_i - a0 - a1*x_i - a2*x_i^2)^2.
45 Polynomial Regression Normal Equations
The best-fit polynomial coefficients will minimize S_r. Differentiate S_r w.r.t. each coefficient and set to zero:
∂S_r/∂a0 = -2 Σ (y_i - a0 - a1*x_i - a2*x_i^2) = 0
∂S_r/∂a1 = -2 Σ x_i*(y_i - a0 - a1*x_i - a2*x_i^2) = 0
∂S_r/∂a2 = -2 Σ x_i^2*(y_i - a0 - a1*x_i - a2*x_i^2) = 0
46 Polynomial Regression Normal Equations
Rearranging the normal equations yields:
n*a0 + (Σ x_i)*a1 + (Σ x_i^2)*a2 = Σ y_i
(Σ x_i)*a0 + (Σ x_i^2)*a1 + (Σ x_i^3)*a2 = Σ x_i*y_i
(Σ x_i^2)*a0 + (Σ x_i^3)*a1 + (Σ x_i^4)*a2 = Σ x_i^2*y_i
which can be put into matrix form:
[ n        Σ x_i    Σ x_i^2 ] [a0]   [ Σ y_i       ]
[ Σ x_i    Σ x_i^2  Σ x_i^3 ] [a1] = [ Σ x_i*y_i   ]
[ Σ x_i^2  Σ x_i^3  Σ x_i^4 ] [a2]   [ Σ x_i^2*y_i ]
This system of equations can be solved for the vector of unknown coefficients using MATLAB's mldivide.m (\).
47 Polynomial Regression Normal Equations
For mth-order polynomial regression the normal equations are:
[ n         Σ x_i        ...  Σ x_i^m      ] [a0]   [ Σ y_i       ]
[ Σ x_i     Σ x_i^2      ...  Σ x_i^(m+1)  ] [a1] = [ Σ x_i*y_i   ]
[ ...       ...          ...  ...          ] [..]   [ ...         ]
[ Σ x_i^m   Σ x_i^(m+1)  ...  Σ x_i^(2m)   ] [am]   [ Σ x_i^m*y_i ]
Again, this system of m+1 equations can be solved for the vector of m+1 unknown polynomial coefficients using MATLAB's mldivide.m (\).
48 Polynomial Regression Example
49 Polynomial Regression polyfit.m
p = polyfit(x,y,m)
x: vector of independent variable data values
y: vector of dependent variable data values
m: order of the polynomial to be fit to the data
p: 1×(m+1) vector of best-fit polynomial coefficients
Least squares polynomial regression if n > m+1, i.e., for over-determined systems. Polynomial interpolation if n = m+1: the resulting fit passes through all (x,y) points (more later).
50 Polynomial Regression polyfit.m
Note that the result matches that obtained by solving the normal equations.
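A sketch comparing the two routes to the same quadratic fit; the data are synthetic, with coefficients chosen for illustration:

```matlab
% Synthetic noisy quadratic data: y = 1 - 2*x + 0.5*x^2 + noise
x = (-5:0.5:5)';
y = 1 - 2*x + 0.5*x.^2 + 0.3*randn(size(x));

% 2nd-order fit via the normal equations
A = [length(x)  sum(x)     sum(x.^2);
     sum(x)     sum(x.^2)  sum(x.^3);
     sum(x.^2)  sum(x.^3)  sum(x.^4)];
b = [sum(y); sum(x.*y); sum(x.^2.*y)];
a = A\b;                 % [a0; a1; a2]

% Same fit via polyfit (note: polyfit returns the highest power first)
p = polyfit(x, y, 2);    % [a2, a1, a0], matching a to within round-off
```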
51 Polynomial Regression Using polyfit.m
Exercise:
Determine the 4th-order polynomial with the given roots.
Generate noiseless data points by evaluating this polynomial at integer values of x over the given range.
Add Gaussian white noise with the given standard deviation to your data points.
Use polyfit.m to fit a 4th-order polynomial to the noisy data.
Calculate the coefficient of determination, r^2.
Plot the noisy data points, along with the best-fit polynomial.
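A sketch of the exercise workflow; the roots, x range, and noise level below are placeholder assumptions, not the values given in lecture:

```matlab
% Placeholder values: roots at -3, -1, 2, 4; x from -5 to 5; noise std 5
r = [-3 -1 2 4];
c = poly(r);                       % polynomial coefficients from roots

x = (-5:5)';                       % integer x values
y = polyval(c, x) + 5*randn(size(x));   % noiseless data plus Gaussian noise

p = polyfit(x, y, 4);              % 4th-order least squares fit

% Coefficient of determination
St = sum((y - mean(y)).^2);
Sr = sum((y - polyval(p, x)).^2);
r2 = (St - Sr)/St;

% Plot noisy data with the best-fit polynomial
xf = linspace(min(x), max(x), 200);
plot(x, y, 'o', xf, polyval(p, xf), '-')
```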
52 Multiple Linear Regression
53 Multiple Linear Regression
We have so far fit lines or curves to data described by functions of a single variable. For functions of multiple variables, fit planes or surfaces to data. A linear function of two independent variables, y = a0 + a1*x1 + a2*x2: multiple linear regression. The sum of the squares of the residuals is now S_r = Σ (y_i - a0 - a1*x1_i - a2*x2_i)^2.
54 Multiple Linear Regression Normal Equations
Differentiate S_r w.r.t. the fitting coefficients and equate to zero. The normal equations:
[ n        Σ x1_i        Σ x2_i      ] [a0]   [ Σ y_i      ]
[ Σ x1_i   Σ x1_i^2      Σ x1_i*x2_i ] [a1] = [ Σ x1_i*y_i ]
[ Σ x2_i   Σ x1_i*x2_i   Σ x2_i^2    ] [a2]   [ Σ x2_i*y_i ]
Solve as before; now the fitting coefficients, a0, a1, a2, define a plane.
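A sketch of fitting a plane via the normal equations above; the plane coefficients and noise level are arbitrary choices for illustration:

```matlab
% Synthetic data on a noisy plane: y = 2 + 3*x1 - x2 + noise
[X1, X2] = meshgrid(0:5, 0:5);
x1 = X1(:); x2 = X2(:);
y = 2 + 3*x1 - x2 + 0.2*randn(size(x1));
n = length(y);

% Assemble and solve the normal equations for the plane coefficients
A = [n        sum(x1)       sum(x2);
     sum(x1)  sum(x1.^2)    sum(x1.*x2);
     sum(x2)  sum(x1.*x2)   sum(x2.^2)];
b = [sum(y); sum(x1.*y); sum(x2.*y)];
a = A\b;      % approximately [2; 3; -1]
```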
55 General Linear Least Squares Regression
56 General Linear Least Squares
We've seen three types of least squares regression: linear regression, polynomial regression, and multiple linear regression. All are special cases of general linear least squares regression: y = a0*z0 + a1*z1 + ... + am*zm. The z's are basis functions, and the basis functions may be nonlinear. This is still linear regression, because the dependence on the fitting coefficients, a0 ... am, is linear.
57 General Linear Least Squares
For linear regression, simple or multiple: z0 = 1, z1 = x1, z2 = x2, ...
For polynomial regression: z0 = 1, z1 = x, z2 = x^2, ..., zm = x^m
In all cases, the model is a linear combination of basis functions, which may, themselves, be nonlinear.
58 General Linear Least Squares
The general linear least squares model, y = a0*z0 + a1*z1 + ... + am*zm, can be expressed in matrix form: y = Z*a, where Z is an n×(m+1) matrix, the design matrix, whose entries are the m+1 basis functions evaluated at the independent variable values corresponding to the n measurements: z_ji, where z_j is the jth basis function evaluated at the ith independent variable value. (Note: j is not the row index and i is not the column index, here.)
59 General Linear Least Squares
The least squares model is y = Z*a. With more measurements than coefficients, Z is not square: it is tall and narrow. Over-determined system: Z^-1 does not exist.
60 General Linear Least Squares Design Matrix Example
For example, consider fitting a quadratic to five measured values, y_i, at x_1, ..., x_5. The model is y = a0 + a1*x + a2*x^2, so the basis functions are 1, x, and x^2. The least squares equation is
[ 1  x_1  x_1^2 ]          [ y_1 ]
[ 1  x_2  x_2^2 ] [a0]     [ y_2 ]
[ 1  x_3  x_3^2 ] [a1]  =  [ y_3 ]
[ 1  x_4  x_4^2 ] [a2]     [ y_4 ]
[ 1  x_5  x_5^2 ]          [ y_5 ]
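A sketch of building this design matrix and solving the over-determined system three equivalent ways; the measurements are hypothetical values chosen for illustration:

```matlab
% Quadratic fit via the design matrix
x = (1:5)';
y = [2.1; 3.9; 8.2; 13.8; 22.3];   % hypothetical measurements

% Design matrix: basis functions 1, x, x^2 evaluated at each x_i
Z = [ones(size(x)) x x.^2];

% Three equivalent solutions of the over-determined system
a_ne = (Z'*Z)\(Z'*y);   % normal equations
a_pi = pinv(Z)*y;       % Moore-Penrose pseudo-inverse
a_md = Z\y;             % mldivide (QR factorization)
% a_ne, a_pi, and a_md agree to within round-off
```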
61 General Linear Least Squares Residuals
The linear least squares model is:
y = Z*a + e    (1)
Residual:
e = y - Z*a    (2)
Sum of the squares of the residuals:
S_r = e'*e = (y - Z*a)'*(y - Z*a)    (3)
Expanding,
S_r = y'*y - 2*a'*Z'*y + a'*Z'*Z*a    (4)
62 Deriving the Normal Equations
The best fit will minimize the sum of the squares of the residuals. Differentiate S_r with respect to the coefficient vector, a, and set to zero. We'll need to use some matrix calculus identities:
d(a'*b)/da = b    (5)
d(a'*A*a)/da = 2*A*a, for symmetric A    (6)
63 Deriving the Normal Equations
Using the matrix derivative relationships, (5) and (6),
dS_r/da = -2*Z'*y + 2*Z'*Z*a = 0    (7)
Equation (7) is the matrix form of the normal equations:
Z'*Z*a = Z'*y    (8)
The solution to (8) is the vector of least squares fitting coefficients:
a = (Z'*Z)^-1 * Z'*y    (9)
64 Solving the Normal Equations 64 (9) Remember, our starting point was the linear leastsquares model: Couldn t we have solved (10) for fitting coefficients as No, must solve using (9), because: Don t have, only noisy approximations, We have an over determined system is not square does not exist (10) (11)
65 Solving the Normal Equations
The solution to the linear least squares problem is:
a = (Z'*Z)^-1 * Z'*y = Z^+ * y    (12)
where
Z^+ = (Z'*Z)^-1 * Z'    (13)
is the Moore-Penrose pseudo-inverse of Z. Use the pseudo-inverse to find the least squares solution to an over-determined system.
66 Coefficient of Determination
Goodness of fit is characterized by the coefficient of determination:
r^2 = (S_t - S_r)/S_t    (14)
where S_r is given by (3) and
S_t = Σ (y_i - ybar)^2    (15)
67 General Least Squares in MATLAB
Have n measurements, y_i, at known independent variable values and a model defined by m+1 basis functions. Generate the design matrix, Z, by evaluating the m+1 basis functions at all n values of x.
68 General Least Squares in MATLAB
Solve for the vector of fitting coefficients, a, as the solution to the normal equations, a = (Z'*Z)\(Z'*y), or by using mldivide.m (\): a = Z\y. The result is the same, though the methods are different: mldivide.m uses QR factorization to find the solution to over-determined systems.
69 Nonlinear Regression
70 Nonlinear Regression fminsearch.m
Nonlinear models have nonlinear dependence on the fitting parameters, e.g., y = α*exp(β*x). Two options for fitting nonlinear models to data:
Linearize the model first, then use linear regression.
Fit a nonlinear model directly by treating it as an optimization problem. Want to minimize a cost function; the cost function is the sum of the squares of the residuals. Finding the minimum of it is a multi-dimensional optimization: use MATLAB's fminsearch.m.
71 Nonlinear Regression fminsearch.m
Have noisy data that are believed to be best described by an exponential relationship: y = α*exp(β*x). Cost function: S_r(α, β) = Σ (y_i - α*exp(β*x_i))^2. Find the α and β that minimize S_r using fminsearch.m.
72 Nonlinear Regression fminsearch.m (example)
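A sketch of the fminsearch approach; the data parameters and initial guess are arbitrary choices for illustration:

```matlab
% Synthetic noisy exponential data (alpha = 2, beta = 0.3, for illustration)
x = (0:0.5:8)';
y = 2*exp(0.3*x) + 0.5*randn(size(x));

% Cost function: sum of squared residuals; a = [alpha; beta]
cost = @(a) sum((y - a(1)*exp(a(2)*x)).^2);

% Minimize, starting from an initial guess
a_init = [1; 0.1];
a = fminsearch(cost, a_init);   % a(1) estimates alpha, a(2) estimates beta
```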
73 Nonlinear Regression lsqcurvefit.m
An alternative to minimizing a cost function using fminsearch.m, if the Optimization Toolbox is available:
a = lsqcurvefit(f,a0,x,y)
f: handle to the fitting function, e.g., f = @(a,x) a(1)*exp(a(2)*x);
a0: initial guess for the fitting parameters
x: independent variable data
y: dependent variable data
a: best-fit parameters
74 Nonlinear Regression lsqcurvefit.m (example)
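A usage sketch of lsqcurvefit (requires the Optimization Toolbox); the data and initial guess are arbitrary choices for illustration:

```matlab
% Synthetic noisy exponential data
x = (0:0.5:8)';
y = 2*exp(0.3*x) + 0.5*randn(size(x));

f  = @(a,xdata) a(1)*exp(a(2)*xdata);   % model: y = a(1)*exp(a(2)*x)
a0 = [1; 0.1];                          % initial guess
a  = lsqcurvefit(f, a0, x, y);
```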
75 Polynomial Interpolation
76 Polynomial Interpolation
Sometimes we know both the x and y values exactly. Want a function that describes y = f(x): allows for interpolation between known data points. Fit an nth-order polynomial to n+1 data points; the polynomial will pass through all of the points. We'll look at polynomial interpolation using:
Newton's interpolating polynomial
The Lagrange interpolating polynomial
77 Polynomial Interpolation
Can approach similarly to linear least squares regression: y = a0 + a1*x + a2*x^2 + ... + an*x^n. For an nth-order polynomial, we have n+1 equations with n+1 unknowns. In matrix form:
[ 1  x_1      x_1^2     ...  x_1^n     ] [a0]   [ y_1     ]
[ 1  x_2      x_2^2     ...  x_2^n     ] [a1] = [ y_2     ]
[ ...                                  ] [..]   [ ...     ]
[ 1  x_(n+1)  x_(n+1)^2 ...  x_(n+1)^n ] [an]   [ y_(n+1) ]
78 Polynomial Interpolation
Now, unlike for linear regression:
All n+1 values in y are known exactly.
There are n+1 equations with n+1 unknown coefficients: the coefficient matrix is square.
Could solve in MATLAB using mldivide.m (\). However, the coefficient matrix is a Vandermonde matrix, and these tend to be ill-conditioned. The techniques that follow are more numerically robust.
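A sketch of the direct Vandermonde solve; the data points are arbitrary choices for illustration:

```matlab
% Interpolate 4 exactly-known points with a cubic
x = [1; 2; 3; 5];
y = [2; 1; 4; 0];

% Vandermonde system: columns x.^3, x.^2, x, 1 (highest power first)
V = vander(x);
a = V\y;          % same coefficient ordering as polyfit/polyval

% The interpolant passes through every data point
polyval(a, x)     % reproduces y to within round-off
```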
79 Newton Interpolating Polynomial
80 Linear Interpolation
Fit a line (1st-order polynomial) to two data points using a truncated Taylor series (or simple trigonometry):
f_1(x) = f(x_1) + ( (f(x_2) - f(x_1)) / (x_2 - x_1) ) * (x - x_1)
where f_1(x) is the function for the line fit to the data, and f(x_1) and f(x_2) are the known data values. This is the Newton linear interpolation formula.
81 Quadratic Interpolation
To fit a 2nd-order polynomial to three data points, consider the following form:
f_2(x) = b_1 + b_2*(x - x_1) + b_3*(x - x_1)*(x - x_2)
Evaluation at x = x_1 yields b_1 = f(x_1). Back-substitution and evaluation at x = x_2 and x = x_3 will yield the other coefficients:
b_2 = ( f(x_2) - f(x_1) ) / (x_2 - x_1)
b_3 = ( ( f(x_3) - f(x_2) ) / (x_3 - x_2) - ( f(x_2) - f(x_1) ) / (x_2 - x_1) ) / (x_3 - x_1)
82 Quadratic Interpolation
Can still view this as a Taylor series approximation: b_1 represents an offset, b_2 is the slope, and b_3 is the curvature. The choice of the initial quadratic form (the Newton interpolating polynomial) was made to facilitate the development; the resulting polynomial would be the same for any initial form of a 2nd-order polynomial. The solution is unique.
83 nth-Order Newton Interpolating Polynomial
Extending the quadratic example to nth order:
f_n(x) = b_1 + b_2*(x - x_1) + ... + b_(n+1)*(x - x_1)*(x - x_2)*...*(x - x_n)
Solve for the coefficients as before, with back-substitution and evaluation at the data points. The bracket notation f[x_i, ..., x_1] denotes a finite divided difference.
84 Finite Divided Differences
First finite divided difference: f[x_i, x_j] = ( f(x_i) - f(x_j) ) / (x_i - x_j)
Second finite divided difference: f[x_i, x_j, x_k] = ( f[x_i, x_j] - f[x_j, x_k] ) / (x_i - x_k)
nth finite divided difference: f[x_n, x_(n-1), ..., x_1] = ( f[x_n, ..., x_2] - f[x_(n-1), ..., x_1] ) / (x_n - x_1)
Calculate recursively.
85 nth-Order Newton Interpolating Polynomial
The nth-order Newton interpolating polynomial in terms of divided differences:
f_n(x) = f(x_1) + f[x_2, x_1]*(x - x_1) + f[x_3, x_2, x_1]*(x - x_1)*(x - x_2) + ... + f[x_(n+1), ..., x_1]*(x - x_1)*...*(x - x_n)
Divided difference table for calculation of coefficients: Chapra
86 Newton Interpolating Polynomial Example
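A sketch of the divided-difference table and Newton-form evaluation; the data points and query point are arbitrary choices for illustration:

```matlab
% Newton interpolating polynomial via a divided-difference table
x = [1 2 4 5];
y = [2 3 1 4];
n = length(x);

% Build the table; first column is f(x_i), each later column divides
D = zeros(n, n);
D(:,1) = y(:);
for j = 2:n
    for i = 1:n-j+1
        D(i,j) = (D(i+1,j-1) - D(i,j-1)) / (x(i+j-1) - x(i));
    end
end
b = D(1,:);            % Newton coefficients b_1 ... b_n (top row)

% Evaluate the interpolant at a query point xq
xq = 3;
fq = b(1);
prod_term = 1;
for k = 2:n
    prod_term = prod_term * (xq - x(k-1));
    fq = fq + b(k)*prod_term;
end
```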
87 Lagrange Interpolating Polynomial
88 Linear Lagrange Interpolation
Fit a first-order polynomial (a line) to two known data points, (x_1, y_1) and (x_2, y_2):
f_1(x) = L_1(x)*y_1 + L_2(x)*y_2
L_1 and L_2 are weighting functions, where
L_1(x_1) = 1, L_1(x_2) = 0, L_2(x_2) = 1, L_2(x_1) = 0
The interpolating polynomial is a weighted sum of the individual data point values.
89 Linear Lagrange Interpolation
For linear (1st-order) interpolation, the weighting functions are:
L_1(x) = (x - x_2)/(x_1 - x_2), L_2(x) = (x - x_1)/(x_2 - x_1)
The linear Lagrange interpolating polynomial is:
f_1(x) = ( (x - x_2)/(x_1 - x_2) )*y_1 + ( (x - x_1)/(x_2 - x_1) )*y_2
90 nth-Order Lagrange Interpolation
The Lagrange interpolation technique can be extended to nth-order polynomials:
f_n(x) = Σ_(i=1)^(n+1) L_i(x)*y_i
where
L_i(x) = Π_(j=1, j≠i)^(n+1) (x - x_j)/(x_i - x_j)
91 Lagrange Interpolating Polynomial Example
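A sketch of evaluating the Lagrange form directly; the data points and query point are arbitrary choices for illustration:

```matlab
% Evaluate the Lagrange interpolating polynomial at a query point
x = [1 2 4 5];
y = [2 3 1 4];
n = length(x);
xq = 3;

fq = 0;
for i = 1:n
    % Weighting function L_i(xq): product over all j ~= i
    L = 1;
    for j = [1:i-1, i+1:n]
        L = L * (xq - x(j)) / (x(i) - x(j));
    end
    fq = fq + L*y(i);
end
```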
92 MATLAB's Curve Fitting Tool
93 MATLAB's Curve Fitting Tool GUI
Type cftool at the command line: this launches the curve fitting tool GUI in a new window. Import data from the workspace and quickly try fitting different types of functions to the data: polynomial, exponential, power, Fourier, custom equations, and more. Interpolation for exactly-known data; least squares regression for noisy data.
94 MATLAB's Curve Fitting Tool GUI (screenshot)
95 Curve Fitting with the cftool GUI
Exercise: Go to the course website and download the following data file: cftool_ex_data.mat
Load the data file into the workspace; at the command line, type: load('cftool_ex_data')
Launch the Curve Fitting Tool by typing cftool at the command line.
Try fitting different functions to each of the three data sets; you can open a new tab in the GUI for each fit.
More informationChapter 3 Higher Order Linear ODEs
Chapter 3 Higher Order Linear ODEs Advanced Engineering Mathematics Wei-Ta Chu National Chung Cheng University wtchu@cs.ccu.edu.tw 1 2 3.1 Homogeneous Linear ODEs 3 Homogeneous Linear ODEs An ODE is of
More information(Mathematical Operations with Arrays) Applied Linear Algebra in Geoscience Using MATLAB
Applied Linear Algebra in Geoscience Using MATLAB (Mathematical Operations with Arrays) Contents Getting Started Matrices Creating Arrays Linear equations Mathematical Operations with Arrays Using Script
More informationLinear Models for Regression. Sargur Srihari
Linear Models for Regression Sargur srihari@cedar.buffalo.edu 1 Topics in Linear Regression What is regression? Polynomial Curve Fitting with Scalar input Linear Basis Function Models Maximum Likelihood
More informationDublin City Schools Mathematics Graded Course of Study Algebra I Philosophy
Philosophy The Dublin City Schools Mathematics Program is designed to set clear and consistent expectations in order to help support children with the development of mathematical understanding. We believe
More informationNumerical Methods for Engineers. and Scientists. Applications using MATLAB. An Introduction with. Vish- Subramaniam. Third Edition. Amos Gilat.
Numerical Methods for Engineers An Introduction with and Scientists Applications using MATLAB Third Edition Amos Gilat Vish- Subramaniam Department of Mechanical Engineering The Ohio State University Wiley
More informationMath 365 Homework 5 Due April 6, 2018 Grady Wright
Math 365 Homework 5 Due April 6, 018 Grady Wright 1. (Polynomial interpolation, 10pts) Consider the following three samples of some function f(x): x f(x) -1/ -1 1 1 1/ (a) Determine the unique second degree
More informationSchool District of Marshfield Course Syllabus
School District of Marshfield Course Syllabus Course Name: Algebra I Length of Course: 1 Year Credit: 1 Program Goal(s): The School District of Marshfield Mathematics Program will prepare students for
More information1 Introduction to Minitab
1 Introduction to Minitab Minitab is a statistical analysis software package. The software is freely available to all students and is downloadable through the Technology Tab at my.calpoly.edu. When you
More informationNon-polynomial Least-squares fitting
Applied Math 205 Last time: piecewise polynomial interpolation, least-squares fitting Today: underdetermined least squares, nonlinear least squares Homework 1 (and subsequent homeworks) have several parts
More informationInverses. Stephen Boyd. EE103 Stanford University. October 28, 2017
Inverses Stephen Boyd EE103 Stanford University October 28, 2017 Outline Left and right inverses Inverse Solving linear equations Examples Pseudo-inverse Left and right inverses 2 Left inverses a number
More informationLeast Squares. Ken Kreutz-Delgado (Nuno Vasconcelos) ECE 175A Winter UCSD
Least Squares Ken Kreutz-Delgado (Nuno Vasconcelos) ECE 75A Winter 0 - UCSD (Unweighted) Least Squares Assume linearity in the unnown, deterministic model parameters Scalar, additive noise model: y f (
More informationCorrelation and Regression
Correlation and Regression 8 9 Copyright Cengage Learning. All rights reserved. Section 9.2 Linear Regression and the Coefficient of Determination Copyright Cengage Learning. All rights reserved. Focus
More informationHoward Mark and Jerome Workman Jr.
Linearity in Calibration: How to Test for Non-linearity Previous methods for linearity testing discussed in this series contain certain shortcomings. In this installment, the authors describe a method
More informationSubject Algebra 1 Unit 1 Relationships between Quantities and Reasoning with Equations
Subject Algebra 1 Unit 1 Relationships between Quantities and Reasoning with Equations Time Frame: Description: Work with expressions and equations through understanding quantities and the relationships
More informationTABLE OF CONTENTS INTRODUCTION, APPROXIMATION & ERRORS 1. Chapter Introduction to numerical methods 1 Multiple-choice test 7 Problem set 9
TABLE OF CONTENTS INTRODUCTION, APPROXIMATION & ERRORS 1 Chapter 01.01 Introduction to numerical methods 1 Multiple-choice test 7 Problem set 9 Chapter 01.02 Measuring errors 11 True error 11 Relative
More informationAdaptive Filtering. Squares. Alexander D. Poularikas. Fundamentals of. Least Mean. with MATLABR. University of Alabama, Huntsville, AL.
Adaptive Filtering Fundamentals of Least Mean Squares with MATLABR Alexander D. Poularikas University of Alabama, Huntsville, AL CRC Press Taylor & Francis Croup Boca Raton London New York CRC Press is
More informationCompanion. Jeffrey E. Jones
MATLAB7 Companion 1O11OO1O1O1OOOO1O1OO1111O1O1OO 1O1O1OO1OO1O11OOO1O111O1O1O1O1 O11O1O1O11O1O1O1O1OO1O11O1O1O1 O1O1O1111O11O1O1OO1O1O1O1OOOOO O1111O1O1O1O1O1O1OO1OO1OO1OOO1 O1O11111O1O1O1O1O Jeffrey E.
More informationIn this course, we will be examining large data sets coming from different sources. There are two basic types of data:
Chapter 4 Statistics In this course, we will be examining large data sets coming from different sources. There are two basic types of data: Deterministic Data: Data coming from a specific function- each
More informationMATH 350: Introduction to Computational Mathematics
MATH 350: Introduction to Computational Mathematics Chapter IV: Locating Roots of Equations Greg Fasshauer Department of Applied Mathematics Illinois Institute of Technology Spring 2011 fasshauer@iit.edu
More informationFLORIDA STANDARDS TO BOOK CORRELATION
FLORIDA STANDARDS TO BOOK CORRELATION Florida Standards (MAFS.912) Conceptual Category: Number and Quantity Domain: The Real Number System After a standard is introduced, it is revisited many times in
More informationMyMathLab for School Precalculus Graphical, Numerical, Algebraic Common Core Edition 2016
A Correlation of MyMathLab for School Precalculus Common Core Edition 2016 to the Tennessee Mathematics Standards Approved July 30, 2010 Bid Category 13-090-10 , Standard 1 Mathematical Processes Course
More informationAlgebra 1 Standards Curriculum Map Bourbon County Schools. Days Unit/Topic Standards Activities Learning Targets ( I Can Statements) 1-19 Unit 1
Algebra 1 Standards Curriculum Map Bourbon County Schools Level: Grade and/or Course: Updated: e.g. = Example only Days Unit/Topic Standards Activities Learning Targets ( I 1-19 Unit 1 A.SSE.1 Interpret
More informationQuadratic Formula: - another method for solving quadratic equations (ax 2 + bx + c = 0)
In the previous lesson we showed how to solve quadratic equations that were not factorable and were not perfect squares by making perfect square trinomials using a process called completing the square.
More informationProbability & Statistics: Introduction. Robert Leishman Mark Colton ME 363 Spring 2011
Probability & Statistics: Introduction Robert Leishman Mark Colton ME 363 Spring 2011 Why do we care? Why do we care about probability and statistics in an instrumentation class? Example Measure the strength
More information8 The SVD Applied to Signal and Image Deblurring
8 The SVD Applied to Signal and Image Deblurring We will discuss the restoration of one-dimensional signals and two-dimensional gray-scale images that have been contaminated by blur and noise. After an
More informationIndex I-1. in one variable, solution set of, 474 solving by factoring, 473 cubic function definition, 394 graphs of, 394 x-intercepts on, 474
Index A Absolute value explanation of, 40, 81 82 of slope of lines, 453 addition applications involving, 43 associative law for, 506 508, 570 commutative law for, 238, 505 509, 570 English phrases for,
More informationMATH 350: Introduction to Computational Mathematics
MATH 350: Introduction to Computational Mathematics Chapter V: Least Squares Problems Greg Fasshauer Department of Applied Mathematics Illinois Institute of Technology Spring 2011 fasshauer@iit.edu MATH
More information(Linear equations) Applied Linear Algebra in Geoscience Using MATLAB
Applied Linear Algebra in Geoscience Using MATLAB (Linear equations) Contents Getting Started Creating Arrays Mathematical Operations with Arrays Using Script Files and Managing Data Two-Dimensional Plots
More informationFeb 21 and 25: Local weighted least squares: Quadratic loess smoother
Feb 1 and 5: Local weighted least squares: Quadratic loess smoother An example of weighted least squares fitting of data to a simple model for the purposes of simultaneous smoothing and interpolation is
More informationFrom Practical Data Analysis with JMP, Second Edition. Full book available for purchase here. About This Book... xiii About The Author...
From Practical Data Analysis with JMP, Second Edition. Full book available for purchase here. Contents About This Book... xiii About The Author... xxiii Chapter 1 Getting Started: Data Analysis with JMP...
More informationJustify all your answers and write down all important steps. Unsupported answers will be disregarded.
Numerical Analysis FMN011 10058 The exam lasts 4 hours and has 13 questions. A minimum of 35 points out of the total 70 are required to get a passing grade. These points will be added to those you obtained
More informationTake-home Final. The questions you are expected to answer for this project are numbered and italicized. There is one bonus question. Good luck!
Take-home Final The data for our final project come from a study initiated by the Tasmanian Aquaculture and Fisheries Institute to investigate the growth patterns of abalone living along the Tasmanian
More information446 CHAP. 8 NUMERICAL OPTIMIZATION. Newton's Search for a Minimum of f(x,y) Newton s Method
446 CHAP. 8 NUMERICAL OPTIMIZATION Newton's Search for a Minimum of f(xy) Newton s Method The quadratic approximation method of Section 8.1 generated a sequence of seconddegree Lagrange polynomials. It
More informationIntroduction Linear system Nonlinear equation Interpolation
Interpolation Interpolation is the process of estimating an intermediate value from a set of discrete or tabulated values. Suppose we have the following tabulated values: y y 0 y 1 y 2?? y 3 y 4 y 5 x
More information6.867 Machine learning
6.867 Machine learning Mid-term eam October 8, 6 ( points) Your name and MIT ID: .5.5 y.5 y.5 a).5.5 b).5.5.5.5 y.5 y.5 c).5.5 d).5.5 Figure : Plots of linear regression results with different types of
More informationProblem 1: Toolbox (25 pts) For all of the parts of this problem, you are limited to the following sets of tools:
CS 322 Final Exam Friday 18 May 2007 150 minutes Problem 1: Toolbox (25 pts) For all of the parts of this problem, you are limited to the following sets of tools: (A) Runge-Kutta 4/5 Method (B) Condition
More informationQ1. Discuss, compare and contrast various curve fitting and interpolation methods
Q1. Discuss, compare and contrast various curve fitting and interpolation methods McMaster University 1 Curve Fitting Problem statement: Given a set of (n + 1) point-pairs {x i,y i }, i = 0,1,... n, find
More informationNUMERICAL METHODS. lor CHEMICAL ENGINEERS. Using Excel', VBA, and MATLAB* VICTOR J. LAW. CRC Press. Taylor & Francis Group
NUMERICAL METHODS lor CHEMICAL ENGINEERS Using Excel', VBA, and MATLAB* VICTOR J. LAW CRC Press Taylor & Francis Group Boca Raton London New York CRC Press is an imprint of the Taylor & Francis Croup,
More informationPowerPoints organized by Dr. Michael R. Gustafson II, Duke University
Part 5 Chapter 17 Numerical Integration Formulas PowerPoints organized by Dr. Michael R. Gustafson II, Duke University All images copyright The McGraw-Hill Companies, Inc. Permission required for reproduction
More informationAlgebra 1 Khan Academy Video Correlations By SpringBoard Activity and Learning Target
Algebra 1 Khan Academy Video Correlations By SpringBoard Activity and Learning Target SB Activity Activity 1 Investigating Patterns 1-1 Learning Targets: Identify patterns in data. Use tables, graphs,
More informationChapter 1 Linear Equations. 1.1 Systems of Linear Equations
Chapter Linear Equations. Systems of Linear Equations A linear equation in the n variables x, x 2,..., x n is one that can be expressed in the form a x + a 2 x 2 + + a n x n = b where a, a 2,..., a n and
More informationMATH 350: Introduction to Computational Mathematics
MATH 350: Introduction to Computational Mathematics Chapter IV: Locating Roots of Equations Greg Fasshauer Department of Applied Mathematics Illinois Institute of Technology Spring 2011 fasshauer@iit.edu
More informationModel-building and parameter estimation
Luleå University of Technology Johan Carlson Last revision: July 27, 2009 Measurement Technology and Uncertainty Analysis - E7021E MATLAB homework assignment Model-building and parameter estimation Introduction
More informationAlgebra I Number and Quantity The Real Number System (N-RN)
Number and Quantity The Real Number System (N-RN) Use properties of rational and irrational numbers N-RN.3 Explain why the sum or product of two rational numbers is rational; that the sum of a rational
More informationRelations and Functions
Algebra 1, Quarter 2, Unit 2.1 Relations and Functions Overview Number of instructional days: 10 (2 assessments) (1 day = 45 60 minutes) Content to be learned Demonstrate conceptual understanding of linear
More information(0, 0), (1, ), (2, ), (3, ), (4, ), (5, ), (6, ).
1 Interpolation: The method of constructing new data points within the range of a finite set of known data points That is if (x i, y i ), i = 1, N are known, with y i the dependent variable and x i [x
More information