Regression and Covariance


James K. Peterson
Department of Biological Sciences and Department of Mathematical Sciences
Clemson University
April 16, 2014

Outline: A Review of Regression; Regression and Covariance

Abstract

This lecture redoes regression in terms of covariances. We begin with a collection of data pairs {(x_i, y_i) : 1 <= i <= N}. The line we want to pick has the form y = m x + b for some choice of slope m and intercept b. The vertical distance from a given data point (x_i, y_i) to our line is d_i = |m x_i + b - y_i|. Rather than minimizing the sum of these individual errors directly, it is easier to minimize the sum of the squared errors, which gives essentially the same fit. Define an error function E by

E = \sum_{i=1}^{N} d_i^2 = \sum_{i=1}^{N} (m x_i + b - y_i)^2.

We see the error function E is really a function of the two independent variables m and b.
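Here is a small MatLab illustration (my own, not part of the lecture) of E as a function of m and b: it evaluates E at two trial lines using the data set from the worked example later in the lecture. The anonymous function handle E and the trial values of m and b are illustrative choices.

% evaluate the error function E(m, b) at two trial lines
X = [1.2; 2.4; 3.0; 3.7; 4.1; 5.0];
Y = [2.3; 1.9; 4.5; 5.2; 3.2; 7.2];
E = @(m, b) sum((m*X + b - Y).^2);   % sum of squared vertical errors
E(1.0, 1.0)   % error for the trial line y = x + 1
E(1.2, 0.2)   % error for a second trial line; the smaller value is the better fit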

The optimal slope is

m = (E(XY) - E(X) E(Y)) / (E(X^2) - E(X) E(X)).

The optimal intercept is

b = (E(Y) E(X^2) - E(X) E(XY)) / (E(X^2) - E(X) E(X)).

An equivalent form that is easier to use is

b = E(Y) - m E(X).

Now the terms E(X^2) - E(X) E(X) and E(XY) - E(X) E(Y) occur a lot in this kind of work. We call this kind of calculation a covariance and use the symbol Cov for it. The formal definitions are

Cov(X, X) = E(X^2) - E(X) E(X)
Cov(X, Y) = E(XY) - E(X) E(Y).

We thus know

m = (E(XY) - E(X) E(Y)) / (E(X^2) - E(X) E(X)) = Cov(X, Y) / Cov(X, X).

Thus

Cov(X, Y) = Cov(X, X) m,

which tells us the covariance of X and Y is proportional to the slope of the regression line of Y on X, with proportionality constant given by Cov(X, X).
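To make the equivalence of the two intercept formulas concrete, here is a short MatLab check (my own sketch, not part of the lecture). The variable names EX, EY, EXY and EXX mirror the ones used in the worked example later in the lecture, and the data is that same example data.

% check that the two intercept formulas agree numerically
X = [1.2; 2.4; 3.0; 3.7; 4.1; 5.0];
Y = [2.3; 1.9; 4.5; 5.2; 3.2; 7.2];
N   = length(X);
EX  = sum(X)/N;    EY  = sum(Y)/N;
EXY = sum(X.*Y)/N; EXX = sum(X.*X)/N;
m  = (EXY - EX*EY)/(EXX - EX*EX);        % optimal slope
b1 = (EY*EXX - EX*EXY)/(EXX - EX*EX);    % intercept, first formula
b2 = EY - m*EX;                          % intercept, equivalent easier formula
abs(b1 - b2)                             % essentially zero (round-off only)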

Hence, the optimal slope and intercept are given by

b = (E(Y) E(X^2) - E(X) E(XY)) / Cov(X, X),    m = Cov(X, Y) / Cov(X, X).

Finally, there is one other idea that is useful here: the idea of how our data varies from the expected value E(X). We can calculate the expected squared total difference as follows:

E((X - E(X))^2) = (1/N) \sum_{i=1}^{N} (x_i - E(X))^2
                = (1/N) \sum_{i=1}^{N} ( x_i^2 - 2 x_i E(X) + (E(X))^2 )
                = (1/N) \sum_{i=1}^{N} x_i^2 - 2 E(X) (1/N) \sum_{i=1}^{N} x_i + (1/N) \sum_{i=1}^{N} (E(X))^2
                = E(X^2) - 2 (E(X))^2 + (E(X))^2
                = E(X^2) - (E(X))^2
                = Cov(X, X).

This calculation gives us what is called the variance of our data, and you should learn more about this tool in other courses as it is extremely useful. Alas, our needs in this course are quite limited, so we just mention it. So we have another definition. The variance is denoted by the symbol Var and defined by

Var(X) = E((X - E(X))^2) = E(X^2) - (E(X))^2 = Cov(X, X).

Note that the variance Var(X) is exactly the same as the covariance Cov(X, X)! We can now see that there is an interesting connection between the covariance of X and Y and the variance of X.
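The algebra above is easy to check numerically. The following MatLab fragment (my own sketch, not from the lecture) computes the variance both directly from its definition and from the shortcut E(X^2) - (E(X))^2, on the example X data; the two values agree up to round-off.

% verify the two variance formulas on the example data
X = [1.2; 2.4; 3.0; 3.7; 4.1; 5.0];
N = length(X);
EX = sum(X)/N;
varDirect   = sum((X - EX).^2)/N;    % E((X - E(X))^2)
varShortcut = sum(X.*X)/N - EX*EX;   % E(X^2) - (E(X))^2, i.e. Cov(X, X)
abs(varDirect - varShortcut)         % essentially zero (round-off only)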

Let's denote the slope m obtained by our optimization strategy by m(X, Y) so we always remember it is a function of our data pairs. A summary of our work is in order. We have found that the optimal slope and y-intercept for our data are given by m and b, respectively, where

m(X, Y) = Cov(X, Y) / Var(X)
b(X, Y) = (E(Y) E(X^2) - E(X) E(XY)) / Var(X) = E(Y) - m(X, Y) E(X).

We call the optimal slope m(X, Y) the slope of the regression of Y on X. This is something we can measure, so it is an estimate of how the variable y changes with respect to x. We now know a fair bit of calculus, so we can think of m(X, Y) as an estimate of either dy/dx or Δy/Δx, which is a really useful idea. Then, we notice that

Cov(X, Y) = Var(X) (Cov(X, Y) / Var(X)) = Var(X) m(X, Y).

Thus, the covariance of x and y is proportional to the slope of the regression line of Y on X, with proportionality constant given by the variance Var(X), which, of course, is the same as the covariance of X with itself, Cov(X, X).

Example: Let's find Cov(X, X) = Var(X) and Cov(X, Y) for the data

D = {(1.2, 2.3), (2.4, 1.9), (3.0, 4.5), (3.7, 5.2), (4.1, 3.2), (5.0, 7.2)}.

Solution: We do this in MatLab.

% set up the data as X and Y vectors
X = [1.2; 2.4; 3.0; 3.7; 4.1; 5.0];
Y = [2.3; 1.9; 4.5; 5.2; 3.2; 7.2];
% get length of data
N = length(X);
% find E(X), called EX here
EX = sum(X)/N;

Solution (continued)

% find E(Y), called EY here
EY = sum(Y)/N;
% find E(XY), called EXY here
EXY = sum(X.*Y)/N;
% find E(X^2), called EXX here
EXX = sum(X.*X)/N;
% find Cov(X, X) = Var(X), here COVX
COVX = EXX - EX*EX;
% here COVX = 1.4956
% find Cov(X, Y), here COVXY
COVXY = EXY - EX*EY;
% here COVXY = 1.7683

(See the sketch after the first homework problem below for how these covariances give the regression line itself and a plot.)

Homework 75

Again, these problems are taken from ones you can find in R. Sokal and F. J. Rohlf, Introduction to Biostatistics, published by Dover, in the chapter on regression. Your results need to be placed in a Word doc in the usual way, nicely commented with embedded plots. For these problems, calculate Cov(X, X) = Var(X) and Cov(X, Y) in MatLab.

The data here has the form (Time, Temperature), where the time is the amount of time that has elapsed since a rabbit was inoculated with a virus and the temperature is the rabbit's temperature at that time. Find the covariances for this data.

D = {(24, 102.8), (32, 104.5), (48, 106.5), (56.0, 107.0), (72.0, 103.9), (80.0, 103.2)}.
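Since the homework asks for embedded plots, it may help to see how the Solution above extends to the regression line itself. The following continuation is my own sketch, not part of the lecture, and it reuses the variables X, Y, EX, EY, COVX and COVXY computed in the Solution.

% continuation of the Solution above: regression line and a plot
m = COVXY/COVX;          % slope of the regression of Y on X
b = EY - m*EX;           % intercept, b = E(Y) - m E(X)
plot(X, Y, 'o', X, m*X + b, '-');
xlabel('x'); ylabel('y');
title('Data and regression line');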

Homework (continued)

The data here has the form (Larval Density, Weight), where the larval density is the number of fly larvae per unit area and the weight is the adult fly weight. Find the covariances for this data.

D = {(1, 1.356), (3, 1.356), (5, 1.284), (6, 1.252), (10, 0.989), (20, 0.664)}.

The data here has the form (Temperature, Calorie Expenditure), where temperature is the environmental temperature a sparrow is living in and calorie expenditure is the amount of energy the sparrow used at that temperature. Find the regression line for this data.

D = {(0, 24.9), (4, 23.4), (10, 24.2), (18, 18.7), (26, 15.2), (34, 13.7)}.

The data here has the form (Temperature, Developmental Time), where temperature is the environmental temperature the leaf hopper is living in and the developmental time is a measurement of the time it takes for the leaf hopper to develop at this temperature. Find the covariances for this data.

D = {(59.8, 58.1), (67.6, 27.3), (70.0, 26.8), (74.0, 19.1), (78.0, 16.5), (91.4, 14.6)}.

The data here has the form (Depth, Temperature), where depth is the depth in meters at which the water temperature in a lake is measured. Find the covariances for this data.

D = {(0, 24.8), (1, 23.2), (2, 22.2), (3, 21.2), (5, 13.8), (7, 8.2)}.
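Since the same two covariances are needed for every data set above, it may be convenient to package the computation as a small function. This is my own sketch, not something the assignment requires; the function name covpair and its output names are illustrative.

% covpair.m -- returns Var(X) = Cov(X, X) and Cov(X, Y) for data vectors X, Y
function [vx, cxy] = covpair(X, Y)
  N   = length(X);
  EX  = sum(X)/N;
  EY  = sum(Y)/N;
  vx  = sum(X.*X)/N - EX*EX;    % Cov(X, X) = Var(X)
  cxy = sum(X.*Y)/N - EX*EY;    % Cov(X, Y)
end

Each data set then takes only a few lines; for example, for the larval density data,

X = [1; 3; 5; 6; 10; 20];
Y = [1.356; 1.356; 1.284; 1.252; 0.989; 0.664];
[vx, cxy] = covpair(X, Y)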
