ACE 562 Fall 2005 Lecture 2: Probability, Random Variables and Distributions, by Professor Scott H. Irwin


Required Readings:

- Griffiths, Hill and Judge. "Some Basic Ideas: Statistical Concepts for Economists," Ch. 2 in Learning and Practicing Econometrics
- Mirer. "Random Variables and Probability Distributions," Ch. 9, and "The Normal and t Distributions," Ch. 10, in Economic Statistics and Econometrics (readings packet)
- Gilovich et al. "The Hot Hand in Basketball: On the Misperception of Random Sequences." Cognitive Psychology 17 (1985) (readings packet)

Optional Readings:

- Mirer. "Descriptive Statistics," Ch. 3, and "Probability Theory," Ch. 8

"Most sophomores can easily learn to put numbers into a computer and get back statistical results. To do this wisely---to make the statistical analysis valid---requires a real understanding of the techniques involved." ---T. Mirer

Overview

Intro stats courses typically cover the following topics:
- Types of data
- Descriptive statistics
- Frequency distributions and histograms
- Probability
- Random variables
- Probability distributions
- Confidence intervals
- Hypothesis tests

A sample of data on wheat acreage planted in the US: planted acreage by marketing year, 1975/76 through 1990/91, in million acres (table of annual values). Source: USDA.

From a statistical viewpoint, how was this data generated?

Viewing the world through a statistical lens:

Random variable → Data sample → Statistics (mean, variance, etc.) → Repeated sampling → Confidence intervals and hypothesis tests

This arrangement highlights the foundational role played by random variables in classical statistical analysis. Our study of linear regression fits in this general model.

Note: I assume you understand probability concepts at the level found in Chapter 8 of Mirer.

Random Variables

A random variable is a process that generates data. More formally, a random variable is a variable whose value is unknown until it is observed.

"Random" implies the existence of some probability distribution defined over the set of all possible values. An arbitrary variable does not have a probability distribution associated with its values.

Think of a random variable as a machine that produces numbers one after another:
- Each number produced is a value, or realization, of the random variable
- There is uncertainty about the next value at any point in time
- A complete production run produces a set of numbers called a sample
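
As a minimal illustration of the "machine" metaphor (a sketch assuming Python with numpy; the fair six-sided die is just a convenient hypothetical random variable):

```python
import numpy as np

# Hypothetical "machine": a fair six-sided die.
# Each call to the machine yields one realization of the random variable X.
rng = np.random.default_rng(562)

T = 15                                         # sample size (one "production run")
sample = rng.integers(low=1, high=7, size=T)   # T realizations of X

print("One sample of T =", T, "realizations:", sample)
# Before each draw the next value is unknown; afterward it is just a number.
```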

Discrete Random Variables

A discrete random variable can take only a finite number of values, which can be counted using the positive integers.

Examples:

Prize money from the following lottery is a discrete random variable:
- first prize: $1,000
- second prize: $50
- third prize: $25
since it has only four (a finite number; count: 1, 2, 3, 4) possible outcomes: $0.00, $25, $50, $1,000

The outcome from rolling a six-sided die is a discrete random variable, since there are only six (finite) possible outcomes: 1, 2, 3, 4, 5, or 6

A list of all of the possible values taken by a discrete random variable, along with the chances of each outcome occurring, is called a probability function or probability density function (pdf).

die         x    f(x)
one dot     1    1/6
two dots    2    1/6
three dots  3    1/6
four dots   4    1/6
five dots   5    1/6
six dots    6    1/6

In the discrete case only, f(x) is the probability that X takes on the value x, or equivalently,

f(x_t) = Pr(X = x_t)

Notation Note: capital X represents the random variable, while lower case x_t represents a particular value, or realization, of X.

Probability density functions for discrete random variables have two basic properties:

0 ≤ f(x_t) ≤ 1 for all t

f(x_1) + f(x_2) + ... + f(x_T) = 1

In other words, the probability of each outcome must be between zero and one, and the sum of the probabilities over all outcomes is one.

Probability density functions of discrete random variables may be represented in three equivalent ways:
1. Table (always)
2. Graph (always)
3. Equation (sometimes)

Consider this example: a hat contains 10 balls, with one labeled 5, two labeled 6, three labeled 7 and four labeled 8. The random variable is the outcome of selecting one ball out of the hat.
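
For the hat example, one convenient equation form of the pdf is f(x) = (x − 4)/10 for x = 5, 6, 7, 8; this particular formula is my own reconstruction from the ball counts, not taken from the slides. A quick check of the two basic properties, assuming Python:

```python
# Hat example: 10 balls labeled 5 (one), 6 (two), 7 (three), 8 (four).
# Table form of the pdf, plus an equation form f(x) = (x - 4)/10 that
# reproduces the same probabilities (a reconstruction, not from the slides).
pdf = {5: 1/10, 6: 2/10, 7: 3/10, 8: 4/10}

def f(x):
    return (x - 4) / 10 if x in (5, 6, 7, 8) else 0.0

# Property 1: each probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in pdf.values())
# Property 2: the probabilities sum to one.
assert abs(sum(pdf.values()) - 1.0) < 1e-12
# The equation and table forms agree.
assert all(abs(f(x) - p) < 1e-12 for x, p in pdf.items())
print("pdf:", pdf)
```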


Properties of Discrete Random Variables

Descriptive statistics mainly attempt to measure the middle and dispersion of a data sample. We are interested in the same properties of random variables.

Before developing these measures, it is helpful to review some rules of summation, which will be used throughout the course:

Rule 1. If X takes on T values x_1, x_2, ..., x_T, then

Σ_{t=1}^{T} x_t = x_1 + x_2 + ... + x_T

Note that summation is a linear operator, which means it operates term by term.

Rule 2. If a is a constant, then

Σ_{t=1}^{T} a = Ta

Rule 3. If a is a constant, then

Σ_{t=1}^{T} a x_t = a Σ_{t=1}^{T} x_t = a x_1 + a x_2 + ... + a x_T

The arithmetic mean (average) of T values of X is simply an application of this rule:

x̄ = (1/T) Σ_{t=1}^{T} x_t = (x_1 + x_2 + ... + x_T)/T

Also,

Σ_{t=1}^{T} (x_t − x̄) = 0

Rule 4. If f(x) is a function of X, then

Σ_{t=1}^{T} f(x_t) = f(x_1) + f(x_2) + ... + f(x_T)

We often use an abbreviated form of the summation notation. For example, if f(x) is a function of the values of X,

Σ_{t=1}^{T} f(x_t) = f(x_1) + f(x_2) + ... + f(x_T)
  = Σ_t f(x_t)   ("Sum over all values of the index t")
  = Σ_x f(x)     ("Sum over all possible values of X")

Rule 5. If X and Y are two variables, then

Σ_{t=1}^{T} (x_t + y_t) = Σ_{t=1}^{T} x_t + Σ_{t=1}^{T} y_t

Rule 6. If X and Y are two variables, then

Σ_{t=1}^{T} (a x_t + b y_t) = a Σ_{t=1}^{T} x_t + b Σ_{t=1}^{T} y_t
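
A quick numerical check of Rules 3, 5 and 6, and of the fact that deviations from the mean sum to zero (a sketch assuming Python with numpy; the particular numbers are arbitrary):

```python
import numpy as np

x = np.array([2.0, 5.0, 7.0, 10.0])
y = np.array([1.0, 4.0, 6.0, 9.0])
a, b = 3.0, -2.0

# Rule 3: the sum of a*x_t equals a times the sum of x_t.
assert np.isclose(np.sum(a * x), a * np.sum(x))
# Deviations from the arithmetic mean sum to zero.
assert np.isclose(np.sum(x - x.mean()), 0.0)
# Rules 5 and 6: summation operates term by term (linearity).
assert np.isclose(np.sum(x + y), np.sum(x) + np.sum(y))
assert np.isclose(np.sum(a * x + b * y), a * np.sum(x) + b * np.sum(y))
print("Summation rules verified on this example.")
```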

Note: Several summation signs can be used in one expression. Suppose the variable X takes T values and Y takes S values, and let f(x, y) = x + y. Then the double summation of this function is

Σ_{t=1}^{T} Σ_{s=1}^{S} f(x_t, y_s) = Σ_{t=1}^{T} Σ_{s=1}^{S} (x_t + y_s)

To evaluate such expressions, work from the innermost sum outward: first set t = 1 and sum over all values of s, and so on. To illustrate, let T = 2 and S = 3. Then

Σ_{t=1}^{2} Σ_{s=1}^{3} f(x_t, y_s) = Σ_{t=1}^{2} [f(x_t, y_1) + f(x_t, y_2) + f(x_t, y_3)]
  = f(x_1, y_1) + f(x_1, y_2) + f(x_1, y_3) + f(x_2, y_1) + f(x_2, y_2) + f(x_2, y_3)

The order of summation does not matter, so

Σ_{t=1}^{T} Σ_{s=1}^{S} f(x_t, y_s) = Σ_{s=1}^{S} Σ_{t=1}^{T} f(x_t, y_s)
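
A small sketch of the double summation (assuming Python; the values of x and y are arbitrary), checking both evaluation orders:

```python
# Double summation with T = 2 values of x and S = 3 values of y, f(x, y) = x + y.
x = [1.0, 2.0]          # x_1, x_2
y = [10.0, 20.0, 30.0]  # y_1, y_2, y_3

def f(xt, ys):
    return xt + ys

# Work from the innermost sum outward: for each t, sum over all s.
total_ts = sum(f(xt, ys) for xt in x for ys in y)
# Reverse the order of summation: for each s, sum over all t.
total_st = sum(f(xt, ys) for ys in y for xt in x)

print(total_ts, total_st)   # the order of summation does not matter
assert total_ts == total_st
```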

Mean of a Discrete Random Variable

The "middle," or mean, of a discrete random variable is its expected value. Often, a special notation is used for the mean:

μ = E(X)   or   β = E(X)

There are two entirely different, but mathematically equivalent, ways of defining the expected value.

Analytical Definition

If X is a discrete random variable which can take the values x_1, x_2, ..., x_T with probability density values f(x_1), f(x_2), ..., f(x_T), the expected value of X is computed using the following mathematical expectation formula:

E(X) = Σ_{t=1}^{T} x_t f(x_t) = x_1 f(x_1) + x_2 f(x_2) + ... + x_T f(x_T)
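
A minimal sketch of the analytical formula, assuming Python and re-using the hat example pdf from above:

```python
# Analytical expected value: weight each possible value by its probability and sum.
pdf = {5: 1/10, 6: 2/10, 7: 3/10, 8: 4/10}   # hat example from earlier

mean = sum(x * p for x, p in pdf.items())
print("E(X) =", mean)   # 0.5 + 1.2 + 2.1 + 3.2 = 7.0
```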

Note that the expected value (mean) is determined by weighting all the possible values of X by the corresponding probabilities and summing. Hence, the mean is a weighted average of the possible values of the discrete random variable.

Empirical Definition

The expected value of a discrete random variable X is the average value from an experiment of producing an infinite number of samples.

We can use a thought experiment to illustrate the empirical definition:
- First, use the discrete random variable "machine" to generate a single sample of T observations on X: (x_1, x_2, ..., x_T)
- Next, use the discrete random variable "machine" to generate a very large (infinite) number of samples of size T
- Now, take the simple arithmetic average of all the x_t's generated in the previous two steps

The computed average is the expected value of the discrete random variable and the center of its pdf.

Note: The above thought experiment is identical to one where we consider taking one sample of infinite size and compute the arithmetic average for this single infinitely-sized sample.

Analytical vs. Empirical

The analytical and empirical definitions produce exactly the same expected value for a discrete random variable. The equivalence depends on the number of samples in the empirical case going to infinity. When the number of samples goes to infinity, the observations on X occur with frequencies across all samples exactly equal to the corresponding probabilities f(x_t) in the analytical case.
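
A sketch of the thought experiment (assuming Python with numpy): draw many realizations from the hat example pdf and watch the overall average approach the analytical value E(X) = 7.

```python
import numpy as np

# Hat example pdf: values 5, 6, 7, 8 with probabilities 0.1, 0.2, 0.3, 0.4.
values = np.array([5, 6, 7, 8])
probs = np.array([0.1, 0.2, 0.3, 0.4])
analytical_mean = np.sum(values * probs)          # 7.0

rng = np.random.default_rng(0)
for n_draws in (100, 10_000, 1_000_000):
    draws = rng.choice(values, size=n_draws, p=probs)
    print(f"{n_draws:>9} draws: average = {draws.mean():.4f} "
          f"(analytical E(X) = {analytical_mean})")
# As the number of draws grows, the empirical average converges to E(X).
```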

Variance of a Discrete Random Variable

It is essential to have a measure of the dispersion, or variability, of a discrete random variable. Again, we can define the variance of a discrete random variable both analytically and empirically.

Before developing the analytical definition, it is useful to introduce the expectation of a function of a discrete random variable. If g(X) is a function of the discrete random variable X, then

E[g(X)] = Σ_{t=1}^{T} g(x_t) f(x_t) = g(x_1) f(x_1) + g(x_2) f(x_2) + ... + g(x_T) f(x_T)

Important applications of this result:

1. If g(X) = a, where a is a constant, then E(a) = a

2. If g(X) = cX, where c is a constant and X is a discrete random variable, then E(cX) = c E(X)

3. If g(X) = a + cX, where a and c are constants and X is a random variable, then E(a + cX) = a + c E(X)

Analytical Definition

To begin, set g(X) = [X − E(X)]². Then

E[g(X)] = E[X − E(X)]²
  = [x_1 − E(X)]² f(x_1) + [x_2 − E(X)]² f(x_2) + ... + [x_T − E(X)]² f(x_T)
  = var(X) = σ²

We can think of the variance as the expected value of the squared deviations around the mean of X. Or, the variance is a weighted average of the squared distances between the values of X and the mean of the random variable.
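
Continuing the hat example (a sketch assuming Python), the analytical variance applies the E[g(X)] formula with g(X) = [X − E(X)]²:

```python
# Analytical variance of the hat example: E[X - E(X)]^2.
pdf = {5: 1/10, 6: 2/10, 7: 3/10, 8: 4/10}

mean = sum(x * p for x, p in pdf.items())                 # E(X) = 7.0
var = sum((x - mean) ** 2 * p for x, p in pdf.items())    # weighted squared deviations
print("E(X) =", mean, " var(X) =", var)                   # var(X) = 1.0

# Check application 3: E(a + cX) = a + c*E(X).
a, c = 2.0, 3.0
assert abs(sum((a + c * x) * p for x, p in pdf.items()) - (a + c * mean)) < 1e-12
```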

Empirical Definition

The variance of a discrete random variable X is the average squared deviation from the arithmetic mean, based on an experiment of producing an infinite number of samples.

We can again use a thought experiment to illustrate the empirical definition:
- First, use the discrete random variable "machine" to generate a single sample of T observations on X: (x_1, x_2, ..., x_T)
- Next, use the discrete random variable "machine" to generate a very large (infinite) number of samples of size T
- Now, take the simple arithmetic average of all the x_t's generated in the previous two steps
- Then, compute the squared deviation of each x_t from the simple arithmetic average computed above

Finally, the average of the squared deviations is the variance of the discrete random variable and the dispersion of its pdf.

Analytical vs. Empirical

The analytical and empirical definitions produce exactly the same variance for a discrete random variable. The equivalence depends on the number of samples in the empirical case going to infinity. When the number of samples goes to infinity, the observations on X occur with frequencies across all samples exactly equal to the corresponding probabilities f(x_t) in the analytical case.


Discrete Random Variables and Samples of Data

Review: If we generate a fixed number of observations from a discrete random variable, we have a sample. The data for a sample can be:
- summarized by a relative frequency distribution
- analyzed with descriptive statistics

At this point, it is helpful to re-emphasize that a random variable is a theoretical construct, used to represent some kind of physical, economic or sociological process. Examples are simply there to illustrate the concept. In reality, we never see a random variable, only the resulting data sample.


Continuous Random Variables

A continuous random variable can take any real value (not just whole numbers) in at least one interval on the real line. In other words, an infinite number of possible values may occur for the next realization of the variable.

Examples:
- gross national product (GNP)
- money supply
- price of eggs
- household income
- expenditure on clothing

The probability distribution of a continuous random variable has two components:
1. A statement of the possibly occurring values
2. A function that gives information about probabilities

These two components have to be altered relative to the case of a discrete random variable.

Consider the case of a continuous random variable where the probability for each outcome is the same. The probability of any single outcome occurring is 1 divided by an infinite number of possible outcomes. The implication is that the probability of any individual value is zero!

It is no longer useful to focus on f(x_t) = Pr(X = x_t), since it equals zero. For continuous random variables, we can only relate probabilities to an interval of X. If the interval is [a, b], we want to compute Pr(a ≤ X ≤ b).

Hence, a continuous random variable uses the area under a curve, rather than the height f(x), to represent probability.


Formally, the area under a curve is the integral of the equation that generates the curve:

Pr(a ≤ X ≤ b) = ∫_{x=a}^{b} f(x) dx

In other words, for continuous random variables the integral of f(x), and not f(x) itself, defines the area and, therefore, the probability.

So, what does f(x) represent in the continuous case? It is still the height of the pdf, but it does not equal probability. Instead, it represents the relative likelihood of a value of X occurring. Note that f(x) may take on a value greater than one.
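
A sketch of "probability as area" (assuming Python with numpy): approximate the integral of a pdf over an interval with the trapezoid rule, using a uniform distribution on [0, 0.5] as a hypothetical example. Its density f(x) = 2 exceeds one, even though every probability it produces lies between 0 and 1.

```python
import numpy as np

# Hypothetical continuous pdf: X uniform on [0, 0.5], so f(x) = 2 on that interval.
def f(x):
    return np.where((x >= 0.0) & (x <= 0.5), 2.0, 0.0)

a, b = 0.1, 0.3                       # interval of interest
grid = np.linspace(a, b, 10_001)      # fine grid over [a, b]
prob = np.trapz(f(grid), grid)        # area under the pdf = Pr(a <= X <= b)

print("Pr(0.1 <= X <= 0.3) ~", round(prob, 4))   # exact answer is 0.4
```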

Mean and Variance of a Continuous Random Variable

Once again, we want to develop measures of the middle and dispersion of a random variable. In the continuous case, to derive the analytical definitions, we must resort to integral calculus. Specifically, for a continuous random variable defined over the range x_min to x_max,

E(X) = β = ∫_{x=x_min}^{x_max} x f(x) dx

var(X) = σ² = ∫_{x=x_min}^{x_max} [x − E(X)]² f(x) dx

The empirical definitions are the same in both the continuous and discrete random variable cases; the only twist is the type of variable generating the infinite number of samples.
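
Continuing the hypothetical uniform-on-[0, 0.5] example from above (a sketch assuming Python with numpy), the two integrals can be approximated numerically:

```python
import numpy as np

# Hypothetical pdf: X uniform on [0, 0.5], f(x) = 2 on that range.
x = np.linspace(0.0, 0.5, 100_001)
fx = np.full_like(x, 2.0)

mean = np.trapz(x * fx, x)                  # E(X) = integral of x f(x) dx
var = np.trapz((x - mean) ** 2 * fx, x)     # var(X) = integral of [x - E(X)]^2 f(x) dx

print("E(X) ~", round(mean, 4))             # exact: 0.25
print("var(X) ~", round(var, 6))            # exact: 0.5^2 / 12 ~ 0.020833
```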

Normal Probability Density Function for Continuous Random Variables

Many continuous random variables have pdfs that share a common mathematical form. One of the most important families of distributions in econometrics is the famous normal distribution.

If X is a continuous random variable that can take on values from −∞ to ∞, it has a normal pdf of the following form:

f(x) = (1 / √(2πσ²)) exp( −(x − β)² / (2σ²) ),   −∞ < x < ∞

We say that X is distributed normally with mean β and variance σ²: X ~ N(β, σ²).

Each member of the normal distribution family has a different β (mean) and/or σ² (variance).
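
A quick evaluation of the normal pdf formula (a sketch assuming Python with numpy and scipy; the mean 10 and variance 4 are arbitrary illustration values), checked against scipy.stats.norm:

```python
import numpy as np
from scipy.stats import norm

beta, sigma2 = 10.0, 4.0               # arbitrary mean and variance
sigma = np.sqrt(sigma2)

def normal_pdf(x, beta, sigma2):
    # f(x) = 1/sqrt(2*pi*sigma^2) * exp(-(x - beta)^2 / (2*sigma^2))
    return np.exp(-(x - beta) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

x = np.array([6.0, 8.0, 10.0, 12.0, 14.0])
print(normal_pdf(x, beta, sigma2))
# Agrees with the library implementation:
assert np.allclose(normal_pdf(x, beta, sigma2), norm.pdf(x, loc=beta, scale=sigma))
```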


Standard Normal Distribution

It is possible to determine probabilities for areas under normal curves using integral calculus. This is a cumbersome task, which can be avoided by taking advantage of the fact that any one normal distribution can be obtained from another by:
- compressing or expanding it
- shifting it to the left or right

We can formalize this idea by considering the following transformation of the continuous random variable X:

Z = (X − β) / σ

The new random variable, Z, has the following pdf:

f(z) = (1 / √(2π)) exp( −z² / 2 ),   −∞ < z < ∞

We say that Z is distributed standard normally, with mean 0 and variance 1: Z ~ N(0, 1).


The equivalence of probability statements between a normal and a standard normal distribution can also be shown mathematically.

Let x_l and x_u represent two values of the random variable X, and suppose we would like to determine

Pr(x_l ≤ X ≤ x_u)

We can subtract the mean β from each term without changing the meaning of the probability statement (since β is a constant):

Pr(x_l − β ≤ X − β ≤ x_u − β)

Based on the same logic, we can divide each term by σ:

Pr( (x_l − β)/σ ≤ (X − β)/σ ≤ (x_u − β)/σ )

which can be re-stated as

Pr(z_l ≤ Z ≤ z_u)

Hence, the equivalence of the probability statements.
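
A numerical check of this equivalence (a sketch assuming Python with scipy; β = 10 and σ² = 4 are the same illustration values used above):

```python
from math import sqrt
from scipy.stats import norm

beta, sigma = 10.0, sqrt(4.0)          # illustration values: X ~ N(10, 4)
x_l, x_u = 8.0, 13.0                   # arbitrary interval

# Probability computed directly from the N(beta, sigma^2) distribution.
p_x = norm.cdf(x_u, loc=beta, scale=sigma) - norm.cdf(x_l, loc=beta, scale=sigma)

# Same probability after standardizing: Z = (X - beta) / sigma ~ N(0, 1).
z_l, z_u = (x_l - beta) / sigma, (x_u - beta) / sigma
p_z = norm.cdf(z_u) - norm.cdf(z_l)

print(round(p_x, 6), round(p_z, 6))    # identical
assert abs(p_x - p_z) < 1e-12
```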

Transformations of Random Variables

The results from the previous section can be generalized. We will assume the following linear transformation:

Y_t = a + b X_t

For discrete random variables, the transformation re-labels the possible values of X, but does not affect the probabilities of their occurring.

For continuous random variables, the transformation also re-labels the possible values of X, but does not affect the probability of Y and X being in corresponding intervals.


Mean and Variance With the Linear Transformation

When the following transformation is applied to either a discrete or continuous random variable,

Y_t = a + b X_t

the mean and variance of Y can be computed from the following relations:

β_Y = a + b β_X

σ²_Y = b² σ²_X

σ_Y = |b| σ_X

These relations can be proven using the original equations used to define expected value and variance.
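
A quick check of these relations on the hat example (a sketch assuming Python; a = 2 and b = −3 are arbitrary):

```python
# Linear transformation Y = a + b*X applied to the hat example pdf.
pdf = {5: 1/10, 6: 2/10, 7: 3/10, 8: 4/10}
a, b = 2.0, -3.0

mean_x = sum(x * p for x, p in pdf.items())
var_x = sum((x - mean_x) ** 2 * p for x, p in pdf.items())

# Transformed variable: same probabilities, re-labeled values y = a + b*x.
mean_y = sum((a + b * x) * p for x, p in pdf.items())
var_y = sum((a + b * x - mean_y) ** 2 * p for x, p in pdf.items())

assert abs(mean_y - (a + b * mean_x)) < 1e-12      # beta_Y = a + b*beta_X
assert abs(var_y - b ** 2 * var_x) < 1e-12         # sigma^2_Y = b^2 * sigma^2_X
print("mean_y =", mean_y, " var_y =", var_y)
```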

Joint Probability Density Functions

Previously, we have worked with a single random variable that generates numbers; the generation process is governed by the probability distribution of the random variable.

Now, we want to envision a more complex process where two numbers are generated, for example:
- throwing a pair of dice simultaneously
- determination of corn and soybean futures prices at the Chicago Board of Trade

The key is that two numbers are produced simultaneously, but are separately identifiable.

To generalize, we can think of a process where the output is the combination of numbers for two random variables Y and X. The probability of observing the different combinations of Y and X is given by the joint probability density function, f(x, y).


Joint Probability Density Function for Discrete Random Variables (figure: f(x, y) plotted over the values of X and Y)


Formal Definitions

The marginal probability density functions, f(x) and f(y), for discrete random variables can be obtained by:
- summing over f(x, y) with respect to the values of Y to obtain f(x)
- summing over f(x, y) with respect to the values of X to obtain f(y)

The marginal distributions are the same thing as the regular univariate pdfs for Y and X. The term "marginal" is used because the univariate pdfs are displayed in the margin of joint pdf tables.

The conditional probability density function of X given Y = y is

f(x|y) = Pr(X = x | Y = y) = f(x, y) / f(y)

The conditional probability density function of Y given X = x is

f(y|x) = Pr(Y = y | X = x) = f(x, y) / f(x)

In each of the above cases, think of fixing X or Y at some value and then determining the pdf.
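
A small sketch of marginal and conditional pdfs from a joint pdf table (assuming Python; the joint probabilities are made-up illustration values, not taken from the slides):

```python
# Hypothetical joint pdf for discrete X (values 0, 1) and Y (values 0, 1, 2).
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

x_vals = sorted({x for x, _ in joint})
y_vals = sorted({y for _, y in joint})

# Marginal pdfs: sum the joint pdf over the other variable.
f_x = {x: sum(joint[(x, y)] for y in y_vals) for x in x_vals}
f_y = {y: sum(joint[(x, y)] for x in x_vals) for y in y_vals}

# Conditional pdf of X given Y = 1: f(x|y) = f(x, y) / f(y).
f_x_given_y1 = {x: joint[(x, 1)] / f_y[1] for x in x_vals}

print("f(x):", f_x)
print("f(y):", f_y)
print("f(x | Y=1):", f_x_given_y1)
```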

Independence

Again, consider two discrete random variables X and Y that have a joint pdf f(x, y). Assume that all of the conditional distributions, f(x|y_t), are the same. This implies that the probability of obtaining different values of X is not affected by the simultaneously determined value of Y. We can then say that X and Y are independent random variables.

Mathematically, independence can be stated as

f(x|y_t) = f(x) for all t

From this result we can derive a highly useful implication of independence called the multiplication rule.

Start by re-stating the definition of conditional probability:

f(x|y) = f(x, y) / f(y)

Independence allows us to substitute f(x) for f(x|y), as follows:

f(x) = f(x, y) / f(y)

Re-arranging,

f(x, y) = f(x) f(y)

Note that this condition holds for each and every pair of values x and y. Also, the multiplication rule generalizes to more than two random variables and to continuous random variables.
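
A sketch of the multiplication rule as a check for independence (assuming Python; the first joint pdf is built from independent marginals, while the second is the made-up table from the previous sketch, which is not independent):

```python
from itertools import product

def is_independent(joint, tol=1e-12):
    """Check the multiplication rule f(x, y) = f(x) f(y) for every pair."""
    x_vals = sorted({x for x, _ in joint})
    y_vals = sorted({y for _, y in joint})
    f_x = {x: sum(joint[(x, y)] for y in y_vals) for x in x_vals}
    f_y = {y: sum(joint[(x, y)] for x in x_vals) for y in y_vals}
    return all(abs(joint[(x, y)] - f_x[x] * f_y[y]) < tol
               for x, y in product(x_vals, y_vals))

# Joint pdf constructed from independent marginals (so the rule must hold).
marg_x = {0: 0.4, 1: 0.6}
marg_y = {0: 0.3, 1: 0.7}
independent_joint = {(x, y): marg_x[x] * marg_y[y] for x, y in product(marg_x, marg_y)}

# The made-up joint table from the previous sketch (not independent).
dependent_joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

print(is_independent(independent_joint))   # True
print(is_independent(dependent_joint))     # False
```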

Covariance and Correlation

In econometrics, a key issue is the relationship between variables. The covariance between two random variables, X and Y, measures the linear association between them:

cov(X, Y) = E{[X − E(X)][Y − E(Y)]}

To more explicitly define covariance for the case of discrete random variables, we need the following rules of summation:

Σ_{t=1}^{T} Σ_{s=1}^{S} f(x_t, y_s) = Σ_{t=1}^{T} [f(x_t, y_1) + f(x_t, y_2) + ... + f(x_t, y_S)]

Σ_{t=1}^{T} Σ_{s=1}^{S} f(x_t, y_s) = Σ_{s=1}^{S} Σ_{t=1}^{T} f(x_t, y_s)

Then

cov(X, Y) = E{[X − E(X)][Y − E(Y)]} = Σ_{t=1}^{T} Σ_{s=1}^{S} [x_t − E(X)][y_s − E(Y)] f(x_t, y_s)


Covariance is difficult to interpret because it depends on the units of measurement of X and Y. The correlation between two random variables X and Y overcomes this problem by creating a pure number falling between −1 and +1:

ρ = cov(X, Y) / √( var(X) var(Y) )

Independent random variables have zero covariance and, therefore, zero correlation. The converse is not true, because X and Y may be related in a non-linear manner.
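
A sketch computing covariance and correlation from the made-up joint pdf used in the earlier sketches (assuming Python):

```python
from math import sqrt

# Made-up joint pdf from the earlier sketches (illustration values only).
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

mean_x = sum(x * p for (x, y), p in joint.items())
mean_y = sum(y * p for (x, y), p in joint.items())
var_x = sum((x - mean_x) ** 2 * p for (x, y), p in joint.items())
var_y = sum((y - mean_y) ** 2 * p for (x, y), p in joint.items())

# cov(X, Y) = sum over all (x, y) of [x - E(X)][y - E(Y)] f(x, y)
cov_xy = sum((x - mean_x) * (y - mean_y) * p for (x, y), p in joint.items())
rho = cov_xy / sqrt(var_x * var_y)

print("cov(X, Y) =", round(cov_xy, 4), " rho =", round(rho, 4))
```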

Mean and Variance of a Weighted Sum of Random Variables

There are many situations in econometrics where we want to create a new random variable as the weighted sum of other random variables. For two discrete random variables, this can be expressed as

W = a_1 X + a_2 Y

To develop the mean and variance of W, we need some more results from the rules of summation:

Σ_{t=1}^{T} (x_t + y_t) = Σ_{t=1}^{T} x_t + Σ_{t=1}^{T} y_t = x_1 + x_2 + ... + x_T + y_1 + y_2 + ... + y_T

Σ_{t=1}^{T} (a x_t + b y_t) = a Σ_{t=1}^{T} x_t + b Σ_{t=1}^{T} y_t = a x_1 + a x_2 + ... + a x_T + b y_1 + b y_2 + ... + b y_T

The expected value of a weighted sum of random variables is the sum of the expectations of the individual terms. Since expectation is a linear operator, it can be applied term by term:

E(W) = E(a_1 X) + E(a_2 Y) = a_1 E(X) + a_2 E(Y)

So, the expectation of a weighted sum of random variables is the weighted sum of the expectations of the individual random variables.

The variance of W is found as follows:

var(W) = E[W − E(W)]²
  = E[a_1 X + a_2 Y − E(a_1 X + a_2 Y)]²
  = a_1² var(X) + a_2² var(Y) + 2 a_1 a_2 cov(X, Y)

Notice what happens when we assume X and Y are independent:

var(W) = a_1² var(X) + a_2² var(Y)
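
A final sketch (assuming Python with numpy): verify the weighted-sum formulas by simulation, using two independent hypothetical random variables so that the covariance term drops out.

```python
import numpy as np

rng = np.random.default_rng(1)
a1, a2 = 2.0, -0.5                            # arbitrary weights

# Two independent hypothetical random variables.
n = 1_000_000
x = rng.normal(loc=3.0, scale=2.0, size=n)    # X ~ N(3, 4)
y = rng.uniform(low=0.0, high=6.0, size=n)    # Y ~ Uniform(0, 6)

w = a1 * x + a2 * y                           # weighted sum W = a1*X + a2*Y

# Theoretical values: E(W) = a1*E(X) + a2*E(Y); with independence,
# var(W) = a1^2*var(X) + a2^2*var(Y).
mean_w_theory = a1 * 3.0 + a2 * 3.0
var_w_theory = a1 ** 2 * 4.0 + a2 ** 2 * (6.0 ** 2 / 12.0)

print("simulated:  ", round(w.mean(), 3), round(w.var(), 3))
print("theoretical:", mean_w_theory, var_w_theory)
```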
