Python Analysis. PHYS 224 September 25/26, 2014
1 Python Analysis PHYS 224 September 25/26, 2014
2 Goals
Two things to teach in this lecture:
1. How to use python to fit data
2. How to interpret what python gives you
Some references: ScipyScriptRepo/CurveFitting.ipynb
3 Fitting Experimental Data
The goal of the lab experiments is to determine a physical quantity y (dependent variable) as a function of x (independent variable).
How? Measure the pair (xi, yi) a number (N) of times.
Find a fit function y = y(x) that describes the relationship between these two quantities.
4 The Linear Case
The simplest function relating the two variables is the linear function f(x) = y = a*x + b.
This is valid for any (xi, yi) combination.
If a and b are known, the true value of yi can be calculated for any xi: yi,true = a*xi + b
5 Linear Regression
Linear regression calculates the most probable values of a and b such that the linear equation yi,true = a*xi + b is valid.
When taking measurements of yi, these usually obey a Gaussian distribution about the true value.
6 An Example
Ideal Gas Law: P*V = n*R*T
(Pressure * Volume = n * R * Temperature)
Rearranged as a linear function of temperature: P = [(n*R)/V]*T
7 Fitting in Python
We're going to use the curve_fit function, which is part of the scipy.optimize package.
The usage is as follows:
fit_parameters, fit_covariance = scipy.optimize.curve_fit(fit_function, x_data, y_data, p0=guess, sigma=uncertainty)
fit_parameters - an array of the output fit parameters
fit_covariance - an array of the covariance of the output fit parameters
fit_function - the function used to do the fit
p0=guess - the initial guess input to the fit
sigma=uncertainty - the uncertainty associated with the data
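As a minimal, self-contained sketch of this call (the data here is synthetic, invented for illustration; the lecture's real data file appears on the next slide):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic linear data: y = 2 + 8*x plus a little noise (made up for illustration)
rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 10.0, 20)
y_data = 2.0 + 8.0 * x_data + rng.normal(0.0, 0.5, size=x_data.size)
sigma = np.full(x_data.size, 0.5)  # uncertainty on each data point

def fit_function(x, a, b):
    # straight line: a is the intercept, b is the slope
    return a + b * x

fit_parameters, fit_covariance = curve_fit(
    fit_function, x_data, y_data, p0=(1.0, 1.0), sigma=sigma)

print(fit_parameters)   # best-fit [a, b], close to [2, 8]
print(fit_covariance)   # 2x2 covariance matrix of [a, b]
```

With synthetic data like this you can check that curve_fit recovers the parameters you put in before trusting it on real measurements.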
8 Fitting with curve_fit

import numpy
import scipy.optimize
from matplotlib import pyplot

#define the function to be used in the fitting
def linearfit(x,*p):
    return p[0]+p[1]*x

#read in the data (currently only located on my hard drive...)
temp_data, vol_data = numpy.loadtxt('ideal_gas_law.txt',unpack=True)

#add an uncertainty to each measurement point
uncertainty = numpy.empty(len(vol_data))
uncertainty.fill(20.)

#do the fit
fit_parameters,fit_covariance = scipy.optimize.curve_fit(linearfit,
    temp_data, vol_data, p0=(1.0,8.0), sigma=uncertainty)
9 Fitting with curve_fit (annotated)
The same code, with the slide's annotations marking each piece of the curve_fit call: the function (linearfit), the x data (temp_data), the y data (vol_data), the uncertainty on the data (sigma), and the initial guess for the parameters (p0).
10 Results
fit_parameters = [ ... ]
fit_covariance = [[ ... ] [ ... ]]
So what does this mean? We set up the function for the fit to be:
y = p[0] + p[1]*x
So with the fit parameters, the function is y = p[0] + p[1]*x with the best-fit values substituted in (the slope p[1] comes out to 8.33).
11 Full Probability
For a set of N measurements of the dependent variable y: y1, y2, y3, ... yN.
The probability of obtaining these values is the product of the individual probabilities:
P_a,b(y1, y2, y3, ..., yN) = P_a,b(y1) P_a,b(y2) P_a,b(y3) ... P_a,b(yN) = [1/(σ_y √(2π))]^N exp( −Σ_{i=1}^{N} (y_i − a − b·x_i)² / (2σ_y²) )
The sum in the exponent, Σ (y_i − a − b·x_i)²/σ_y², is called the chi-squared (χ²).
13 Chi-Squared
χ² = Σ_{i=1}^{N} (y_i − a − b·x_i)² / σ_y²
The numerator (y_i − a − b·x_i) is the definition of the residuals, i.e. the data (y_i) minus the fit value (a + b·x_i).
Dividing this by the standard deviation (σ_y) tells us how many standard deviations the data is away from the fit at that x.
The square ensures each term is always positive.
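Written out in code, the formula above is a one-liner; the data and fit parameters below are invented purely to illustrate it:

```python
import numpy as np

# Made-up data and fit parameters, purely to illustrate the chi-squared formula
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 8.8])
a, b = 1.0, 2.0          # intercept and slope of the fit
sigma_y = 0.2            # common standard deviation on each y_i

residuals = y - (a + b * x)                # data minus fit
chisq = np.sum((residuals / sigma_y)**2)   # chi-squared
print(chisq)
```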
14 Plotting the Residuals

#read in the data (currently only located on my hard drive...)
temp_data,vol_data = numpy.loadtxt('/users/kclark/desktop/teaching/phys224/weather_data/ideal_gas_law.txt',unpack=True)

#add an uncertainty to each measurement point
uncertainty = numpy.empty(len(vol_data))
uncertainty.fill(20.)

#do the fit
fit_parameters,fit_covariance = scipy.optimize.curve_fit(linearfit,temp_data,vol_data,p0=(1.0,8.0),sigma=uncertainty)

#now generate the line of the best fit
#set up the temperature points for the full array
fit_temp = numpy.arange(270,355,5)
#make the data for the best fit values
fit_answer = linearfit(fit_temp,*fit_parameters)

#calculate the residuals
fit_resid = vol_data-linearfit(temp_data,*fit_parameters)

#make a line at zero
zero_line = numpy.zeros(len(vol_data))
15 How do the Residuals Look?
The residuals are obviously a large component of the χ² value used by the minimizer.
They can be plotted to look for trends and to see if the fit function is appropriate.
16 Interpreting the Covariance
The elements in the covariance matrix represent the relationships between the fit variables.
The diagonals are the squares of the standard deviations; we will use this in our interpretation of the answer.
17 Covariance Matrix Elements
fit_parameters = [ ... ]
fit_covariance = [[ ... ] [ ... ]]
Diagonal elements are the square of the standard deviation for that parameter.
The off-diagonal elements show the relationship between the parameters:
cov(x, y) = (1/N) Σ_{i=1}^{N} (x_i − x̄)(y_i − ȳ)
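The (invented) numbers below just illustrate that formula; note that numpy's np.cov uses an N−1 denominator by default, so bias=True is needed to match the 1/N definition on the slide:

```python
import numpy as np

# Invented data purely to illustrate the covariance formula
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])

# cov(x, y) = (1/N) * sum_i (x_i - xbar)(y_i - ybar)
N = len(x)
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / N
print(cov_xy)

# numpy agrees when told to use the 1/N (biased) normalization
assert np.isclose(cov_xy, np.cov(x, y, bias=True)[0, 1])
```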
18 Fit Results

import numpy
import scipy.optimize
from matplotlib import pyplot

#define the function to be used in the fitting, which is linear in this case
def linearfit(x,*p):
    return p[0]+p[1]*x

#read in the data (currently only located on my hard drive...)
temp_data,vol_data = numpy.loadtxt('/users/kclark/desktop/teaching/phys224/weather_data/ideal_gas_law.txt',unpack=True)

#add an uncertainty to each measurement point
uncertainty = numpy.empty(len(vol_data))
uncertainty.fill(20.)

#do the fit
fit_parameters,fit_covariance = scipy.optimize.curve_fit(linearfit,temp_data,vol_data,p0=(1.0,8.0),sigma=uncertainty)

#determine the standard deviations for each parameter
sigma0 = numpy.sqrt(fit_covariance[0,0])
sigma1 = numpy.sqrt(fit_covariance[1,1])
19 Fit Results
fit_parameters = [ ... ]
fit_covariance = [[ ... ] [ ... ]]
Calculate the standard deviation on the slope (p[1]).
This is the square root of the [1,1] entry of the covariance matrix.
20 Fit Results
fit_parameters = [ ... ]
fit_covariance = [[ ... ] [ ... ]]
Show the p[1] parameter with its standard deviation: p[1] = 8.33 ± 0.47
21 Comparison to Accepted Values
We obtained the result p[1] = 8.33 ± 0.47.
We assume that there is 1 mole in a 1 m³ volume, so that n = V = 1.
The currently accepted value of R is 8.314 J mol⁻¹ K⁻¹.
The accepted value IS contained within our uncertainty (our one-sigma range is from 7.86 to 8.80).
These values agree within their error.
22 Application to Non-linear Examples
This method can also be applied to other examples.
Powers: y = b·√x can be linearized as y² = b²·x.
Polynomials: y = a + b·x + c·x² + d·x³ — this is just a case of using multiple regression, since the equation is linear in the coefficients.
Exponentials: y = a·e^(b·x) can be linearized as ln(y) = ln(a) + b·x.
There are many other examples.
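As a sketch of the exponential case: taking ln(y) turns the model into a straight line in x, so an ordinary linear fit (here np.polyfit, with invented values for a and b) recovers the parameters:

```python
import numpy as np

# Invented true parameters for illustration: y = a * exp(b*x)
a_true, b_true = 2.0, 0.5
x = np.linspace(0.0, 4.0, 30)
y = a_true * np.exp(b_true * x)

# Linearize: ln(y) = ln(a) + b*x, then do a straight-line fit
slope, intercept = np.polyfit(x, np.log(y), 1)
a_fit = np.exp(intercept)   # undo the log to recover a
b_fit = slope
print(a_fit, b_fit)
```

With real noisy data, the log transform also transforms the uncertainties, so weighting the linearized fit (or fitting the exponential directly with curve_fit) is usually preferable.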
23 Return to Chi-Squared
χ² = Σ_{i=1}^{N} (y_i − y(x_i))² / σ_y²
Here the definition of the residual has changed: instead of y_i − a − b·x_i, a more general term has been used.
y_i is still the data; y(x_i) is the fit function evaluated at x_i.
24 Gauss Distribution
The probability is described by
P(x) = 1/(σ√(2π)) · e^(−(x − x̄)²/(2σ²))
where the average (mean) value is x̄ and the spread in values is σ.
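A quick numerical sketch of this formula (grid and values invented): evaluating it on a fine grid shows it integrates to 1, and that the familiar ~68% of the probability lies within 1σ of the mean:

```python
import numpy as np

def gauss(x, mean, sigma):
    # P(x) = 1/(sigma*sqrt(2*pi)) * exp(-(x - mean)**2 / (2*sigma**2))
    return np.exp(-(x - mean)**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

# Crude Riemann-sum checks on a fine grid (illustrative only)
dx = 0.001
x = np.arange(-10.0, 10.0, dx)
total = np.sum(gauss(x, 0.0, 1.0)) * dx                             # ~1.0
within_1sigma = np.sum(gauss(x[np.abs(x) <= 1.0], 0.0, 1.0)) * dx   # ~0.683
print(total, within_1sigma)
```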
25 Gauss Distribution We use the probabilities shown above to determine how probable a value is in this distribution When we take a measurement, we expect that 68.2% of the time it will be within 1σ from the mean value Another way of phrasing this is that we expect a value to be more than 3σ above the mean value only 0.1% of the time 25
26 Another example
27 Fitting the Gaussian

import numpy
import scipy.optimize
import matplotlib.pyplot as pyplot
import pylab as py

#define the function to be used in the fitting, which is a Gaussian plus a constant offset in this case
def gaussfit(x,*p):
    return p[0]+p[1]*numpy.exp(-1*(x-p[2])**2/(2*p[3]**2))

#read in the data (currently only located on my hard drive...)
day_num,rain_data = numpy.loadtxt('/users/kclark/desktop/teaching/phys224/weather_data/precip_2013.txt', unpack=True)

#get some (pretty good) guesses for the fitting parameters
data_mean = rain_data.mean()
data_std = rain_data.std()

#set up the histogram so that it can be fit
data_plot = py.hist(rain_data,range=(0.1,90),bins=100)
histx = [0.5 * (data_plot[1][i] + data_plot[1][i + 1]) for i in range(100)]
histy = data_plot[0]

#actually do the fitting
fit_parameters,fit_covariance = scipy.optimize.curve_fit(gaussfit,histx,histy,p0=(5.0,10.0,data_mean,data_std))
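Since the precip_2013.txt file isn't available, here is a self-contained sketch of the same procedure on synthetic data (all numbers invented): histogram a sample, take the bin centers as x and the counts as y, then fit the Gaussian-plus-offset model with curve_fit:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussfit(x, c, amp, mean, sigma):
    # constant offset plus a Gaussian peak
    return c + amp * np.exp(-(x - mean)**2 / (2.0 * sigma**2))

# Synthetic sample (invented): 5000 draws from a Gaussian with mean 7, sigma 10
rng = np.random.default_rng(1)
sample = rng.normal(7.0, 10.0, size=5000)

# Histogram it: counts and bin edges, then bin centers
counts, edges = np.histogram(sample, bins=100, range=(-40.0, 50.0))
centers = 0.5 * (edges[:-1] + edges[1:])

# Fit the binned data, seeding the guess from the raw sample
p0 = (0.0, counts.max(), sample.mean(), sample.std())
fit_parameters, fit_covariance = curve_fit(gaussfit, centers, counts, p0=p0)
print(fit_parameters)  # offset, amplitude, mean, standard deviation
```

Computing the histogram with np.histogram (rather than pulling it back out of a pyplot.hist call) keeps the fitting step independent of the plotting library.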
28 Another example
(Plot: the rainfall histogram with the fitted Gaussian, the mean and one standard deviation marked.)
Fit mean: 7.06 mm
Fit standard deviation: 10.13 mm
31 Another example
Fit mean: 7.06 mm
Fit standard deviation: 10.13 mm
Rainfall of 85.5 mm is 7.74 standard deviations above the mean (from this data), which is extremely unlikely.
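The 7.74σ figure follows directly from the fitted numbers quoted on this slide:

```python
# How many fitted standard deviations is the 85.5 mm rainfall above the fitted mean?
fit_mean = 7.06    # mm, from the fit
fit_std = 10.13    # mm, from the fit
z = (85.5 - fit_mean) / fit_std
print(round(z, 2))  # 7.74
```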
32 Chi-Squared and Goodness of Fit
χ² = Σ_{i=1}^{N} (y_i − y(x_i))² / σ_y²
This can then be used as a goodness-of-fit test.
If the function is a good approximation, then each residual will be within about one standard deviation, so the sum will be approximately N.
33 Chi-Squared
χ² = Σ_{i=1}^{N} (y_i − y(x_i))² / σ_y²
We normally use the number of degrees of freedom of the experiment to determine the fit quality.
The number of DOF is the number of data points in the sample minus the number of parameters in the fit.
For a sample with 20 data points and a linear fit (2 parameters), DOF = 18.
This is used as the goodness of fit, since χ²/DOF ≈ 1 for a good fit.
34 Revisit the First Example

import numpy
import scipy.optimize
from matplotlib import pyplot

#define the function to be used in the fitting, which is linear in this case
def linearfit(x,*p):
    return p[0]+p[1]*x

#read in the data (currently only located on my hard drive...)
temp_data,vol_data = numpy.loadtxt('/users/kclark/desktop/teaching/phys224/weather_data/ideal_gas_law.txt',unpack=True)

#add an uncertainty to each measurement point
uncertainty = numpy.empty(len(vol_data))
uncertainty.fill(20.)

#do the fit
fit_parameters,fit_covariance = scipy.optimize.curve_fit(linearfit,temp_data,vol_data,p0=(1.0,8.0),sigma=uncertainty)

#calculate the chi-squared value
chisq = sum(((vol_data-linearfit(temp_data,*fit_parameters))/uncertainty)**2)
print(chisq)

#calculate the number of degrees of freedom
dof = len(temp_data)-len(fit_parameters)
print(dof)
35 Revisit the First Example
Is this a good fit?
χ² = Σ_{i=1}^{16} [ (data_i − fit_i) / uncertainty ]² = 65.6
Divide this by the DOF. We have 16 data points and 2 parameters, so DOF = 16 − 2 = 14.
χ²/DOF = 65.6/14 = 4.68
This may not be a great fit...
36 Goodness of Fit
The previous statements are only mostly true. More accurately:
χ²/DOF ≫ 1 is a very poor fit, maybe even a fit model which doesn't match the data
χ²/DOF > 1 is not a good fit, or the uncertainty is underestimated
χ²/DOF ≪ 1 means the uncertainty could be overestimated
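These rules of thumb can be made quantitative with the chi-squared distribution (a step beyond the slides): scipy's chi2.sf gives the probability of obtaining a χ² at least this large by chance when the model and uncertainties are correct. For the fit above, χ² = 65.6 with 14 DOF:

```python
from scipy.stats import chi2

chisq, dof = 65.6, 14
p_value = chi2.sf(chisq, dof)   # survival function = 1 - CDF
print(p_value)  # tiny probability -> poor fit, or underestimated uncertainties

# For comparison, a "good" fit with chi-squared ~ DOF is quite probable
print(chi2.sf(14.0, 14))
```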
37 Summary You should now be well prepared to use python to fit the data Your practice with this starts with the next pendulum exercise, which you can begin now! 37
More informationLectures about Python, useful both for beginners and experts, can be found at (http://scipy-lectures.github.io).
Random Matrix Theory (Sethna, "Entropy, Order Parameters, and Complexity", ex. 1.6, developed with Piet Brouwer) 2016, James Sethna, all rights reserved. This is an ipython notebook. This hints file is
More informationThe minimal polynomial
The minimal polynomial Michael H Mertens October 22, 2015 Introduction In these short notes we explain some of the important features of the minimal polynomial of a square matrix A and recall some basic
More informationVectors and Matrices Statistics with Vectors and Matrices
Vectors and Matrices Statistics with Vectors and Matrices Lecture 3 September 7, 005 Analysis Lecture #3-9/7/005 Slide 1 of 55 Today s Lecture Vectors and Matrices (Supplement A - augmented with SAS proc
More informationInterpolation APPLIED PROBLEMS. Reading Between the Lines FLY ROCKET FLY, FLY ROCKET FLY WHAT IS INTERPOLATION? Figure Interpolation of discrete data.
WHAT IS INTERPOLATION? Given (x 0,y 0 ), (x,y ), (x n,y n ), find the value of y at a value of x that is not given. Interpolation Reading Between the Lines Figure Interpolation of discrete data. FLY ROCKET
More informationMathematics 136 Calculus 2 Everything You Need Or Want To Know About Partial Fractions (and maybe more!) October 19 and 21, 2016
Mathematics 36 Calculus 2 Everything You Need Or Want To Know About Partial Fractions (and maybe more!) October 9 and 2, 206 Every rational function (quotient of polynomials) can be written as a polynomial
More informationStatistical Methods for Astronomy
Statistical Methods for Astronomy Probability (Lecture 1) Statistics (Lecture 2) Why do we need statistics? Useful Statistics Definitions Error Analysis Probability distributions Error Propagation Binomial
More informationComplex Numbers. Visualize complex functions to estimate their zeros and poles.
Lab 1 Complex Numbers Lab Objective: Visualize complex functions to estimate their zeros and poles. Polar Representation of Complex Numbers Any complex number z = x + iy can be written in polar coordinates
More informationBivariate distributions
Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient
More informationVoyager Plasma Science at Jupiter Error Analysis (1) errors in the measurement (2) uncertainty in the fit parameters Fred Fred ( ):
Voyager Plasma Science at Jupiter Error Analysis Logan Dougherty, Kaleb Bodisch, Rob Wilson, Fran Bagenal, Frank Crary LASP University of Colorado Boulder The purpose of this document is to address two
More informationWeighted Least Squares
Weighted Least Squares The standard linear model assumes that Var(ε i ) = σ 2 for i = 1,..., n. As we have seen, however, there are instances where Var(Y X = x i ) = Var(ε i ) = σ2 w i. Here w 1,..., w
More informationECON The Simple Regression Model
ECON 351 - The Simple Regression Model Maggie Jones 1 / 41 The Simple Regression Model Our starting point will be the simple regression model where we look at the relationship between two variables In
More informationECON 4160, Autumn term Lecture 1
ECON 4160, Autumn term 2017. Lecture 1 a) Maximum Likelihood based inference. b) The bivariate normal model Ragnar Nymoen University of Oslo 24 August 2017 1 / 54 Principles of inference I Ordinary least
More informationIntroduction to Statistics and Error Analysis II
Introduction to Statistics and Error Analysis II Physics116C, 4/14/06 D. Pellett References: Data Reduction and Error Analysis for the Physical Sciences by Bevington and Robinson Particle Data Group notes
More informationAnalysis of Functions
Lecture for Week 11 (Secs. 5.1 3) Analysis of Functions (We used to call this topic curve sketching, before students could sketch curves by typing formulas into their calculators. It is still important
More information( 3) ( ) ( ) ( ) ( ) ( )
81 Instruction: Determining the Possible Rational Roots using the Rational Root Theorem Consider the theorem stated below. Rational Root Theorem: If the rational number b / c, in lowest terms, is a root
More informationUncertainty in Physical Measurements: Module 5 Data with Two Variables
: Often data have two variables, such as the magnitude of the force F exerted on an object and the object s acceleration a. In this Module we will examine some ways to determine how one of the variables,
More informationElementary ODE Review
Elementary ODE Review First Order ODEs First Order Equations Ordinary differential equations of the fm y F(x, y) () are called first der dinary differential equations. There are a variety of techniques
More information