MAS113 Introduction to Probability and Statistics. Proofs of theorems
Theorem 1 (De Morgan's Laws). See MAS110.

Theorem 2.

M1. By definition, $B$ and $A \setminus B$ are disjoint, and their union is $A$. So, because $m$ is a measure, $m(A) = m(B) + m(A \setminus B)$; rearranging gives the result. (Note that more generally, i.e. not assuming $B \subseteq A$, $m(A \setminus B) = m(A) - m(A \cap B)$, by the same argument.)

M2. As $m(A \setminus B) \geq 0$ by definition of a measure, this follows immediately from M1.

M3. Apply M1 with $B = A$; then $A \setminus A = \emptyset$, so the LHS is $m(\emptyset)$, and the RHS is $m(A) - m(A) = 0$.

M4. We can write $A \cup B = (A \cap B) \cup (A \setminus B) \cup (B \setminus A)$, and the three sets here are disjoint. So, using the definition of measure,
$$m(A \cup B) = m(A \cap B) + m(A \setminus B) + m(B \setminus A).$$
Applying M1, we get
$$m(A \cup B) = m(A \cap B) + m(A) - m(A \cap B) + m(B) - m(A \cap B),$$
which, simplifying, gives the result. (Note $A \cap B$ and $B \cap A$ are the same.)

M5. See Exercise 4.

M6. See Exercise 4.

Theorem 3 (Law of Total Probability). Because the $E_i$ form a partition, they are disjoint. Hence their intersections with $F$, namely $F \cap E_i$, are also disjoint. Again because the $E_i$ are a partition, any element of $F$ must be in one of them, so the union of the $F \cap E_i$ for $i = 1, \ldots, n$ must be the whole of $F$, and the previous sentence says it is a disjoint union. Hence $P(F) = \sum_{i=1}^n P(F \cap E_i)$. The second form of the statement (which is the more useful one in practice) follows immediately by writing $P(F \cap E_i) = P(E_i)P(F \mid E_i)$ (from the definition of conditional probability).
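The identity in Theorem 3 is easy to check on a small finite example. Below is a minimal sketch, not part of the original notes; the die-roll partition and event are invented purely for illustration. It computes $P(F)$ directly and via $\sum_i P(E_i)P(F \mid E_i)$.

```python
# A minimal sketch (not from the notes): check the law of total probability
# on a made-up example.  A fair die roll is partitioned into E1 = {1,2},
# E2 = {3,4}, E3 = {5,6}, and F is the event "the roll is even".
outcomes = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]
F = {2, 4, 6}

def prob(event):
    # uniform probability measure on the six outcomes
    return len(event) / len(outcomes)

# Direct computation of P(F).
direct = prob(F)

# Law of total probability: P(F) = sum_i P(E_i) * P(F | E_i).
total = sum(prob(E) * (prob(F & E) / prob(E)) for E in partition)

print(direct, total)  # both 0.5
```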
Theorem 4 (Bayes' Theorem). By the definition of conditional probability, $P(E_i \mid F) = P(E_i \cap F)/P(F)$. However, we also know from the definition of conditional probability that $P(E_i \cap F) = P(E_i)P(F \mid E_i)$. Hence
$$P(E_i \mid F) = \frac{P(E_i)P(F \mid E_i)}{P(F)}.$$

Theorem 5. Before the full proof, consider Example 35 again. Here we have a random variable $X$ with range $R_X = \{-1, 0, 1\}$, and we let $Y = X^2$. Thus $R_Y = \{0, 1\}$. By definition, we have $E(Y) = \sum_{y \in R_Y} y P(Y = y) = P(Y = 1)$ (after a bit of simplification). So we need to consider the event $\{Y = 1\}$. For $Y$ to be $1$ means that either $X = -1$ or $X = 1$, and by the (obvious) disjointness of the two possibilities $P(Y = 1) = P(X = -1) + P(X = 1)$, so we can say that $E(Y) = P(X = -1) + P(X = 1)$.

Now consider the general case, and let $Y = g(X)$. Then, by definition, $E(Y) = \sum_{y \in R_Y} y\, p_Y(y) = \sum_{y \in R_Y} y P(Y = y)$. In the example above, we split the event $\{Y = 1\}$ up into events in terms of $X$ which give $Y = 1$. More generally, the event $\{Y = y\}$ is the disjoint union of the events $\{X = x\}$ for each $x \in R_X$ such that $g(x) = y$. (If $g$ is injective, there will be only one event in the union.) So
$$E(Y) = \sum_{y \in R_Y} y P(Y = y) = \sum_{y \in R_Y} y \sum_{x \in R_X,\, g(x) = y} P(X = x) = \sum_{y \in R_Y} \sum_{x \in R_X,\, g(x) = y} g(x) P(X = x),$$
and the double sum here is equivalent to a single sum over $x \in R_X$, giving the result.

Theorem 6. This is a special case of Theorem 8: in the notation of that theorem let $a = 1$ and $b = -E(X)$.
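Theorem 5 says that $E(g(X))$ can be computed directly from the pmf of $X$, without first finding the pmf of $Y = g(X)$. The sketch below is not from the notes; for illustration it assumes $X$ is uniform on $\{-1, 0, 1\}$ (the actual pmf of Example 35 is not reproduced here) and compares the two computations.

```python
from fractions import Fraction

# A minimal sketch (not from the notes): check Theorem 5 on the Example 35
# setup, assuming for illustration that X is uniform on {-1, 0, 1}, Y = X**2.
p_X = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}
g = lambda x: x ** 2

# Left-hand side: build the pmf of Y = g(X), then use the definition of E(Y).
p_Y = {}
for x, p in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), Fraction(0)) + p
lhs = sum(y * p for y, p in p_Y.items())

# Right-hand side: sum g(x) p_X(x) directly, never finding p_Y.
rhs = sum(g(x) * p for x, p in p_X.items())

print(lhs, rhs)  # both 2/3
```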
Theorem 7. By definition and Theorem 5,
$$\operatorname{Var}(X) = E((X - E(X))^2) = \sum_x (x - E(X))^2 p_X(x).$$
Expanding the brackets, we have
$$\operatorname{Var}(X) = \sum_x x^2 p_X(x) - 2E(X) \sum_x x\, p_X(x) + E(X)^2 \sum_x p_X(x).$$
Note that here we have used the fact that $2E(X)$ and $E(X)^2$ are constants which do not depend on $x$, so can be taken outside the sum. Then $\sum_x x\, p_X(x) = E(X)$, by definition, and $\sum_x p_X(x) = 1$ as $p_X$ is a probability mass function, so we get
$$\operatorname{Var}(X) = E(X^2) - 2E(X)E(X) + E(X)^2 = E(X^2) - E(X)^2,$$
as required.

Theorem 8, mean part. By Theorem 5,
$$E(aX + b) = \sum_x (ax + b) p_X(x) = a \sum_x x\, p_X(x) + b \sum_x p_X(x) = aE(X) + b,$$
again using $\sum_x x\, p_X(x) = E(X)$ and $\sum_x p_X(x) = 1$. Hence we have the result.

Theorem 8, variance part. By definition,
$$\operatorname{Var}(aX + b) = E((aX + b - E(aX + b))^2).$$
By the mean part, we get
$$\operatorname{Var}(aX + b) = E((aX + b - aE(X) - b)^2) = E((a(X - E(X)))^2) = E(a^2 (X - E(X))^2) = a^2 \operatorname{Var}(X).$$
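Both identities can be confirmed with exact arithmetic on any finite pmf. This sketch is not part of the notes; the pmf and the constants $a$, $b$ are chosen arbitrarily for illustration.

```python
from fractions import Fraction

# A minimal sketch (not from the notes): verify Theorems 7 and 8 on a small,
# made-up pmf using exact rational arithmetic.
p_X = {1: Fraction(1, 2), 2: Fraction(1, 3), 5: Fraction(1, 6)}

def E(pmf, f=lambda x: x):
    # expectation of f(X), computed via Theorem 5
    return sum(f(x) * p for x, p in pmf.items())

mean = E(p_X)
var_definition = E(p_X, lambda x: (x - mean) ** 2)    # E((X - E(X))^2)
var_shortcut = E(p_X, lambda x: x ** 2) - mean ** 2   # E(X^2) - E(X)^2 (Theorem 7)

a, b = Fraction(3), Fraction(-4)
p_aXb = {a * x + b: p for x, p in p_X.items()}        # pmf of aX + b
var_transformed = E(p_aXb, lambda y: y ** 2) - E(p_aXb) ** 2

print(var_definition == var_shortcut)                 # True
print(var_transformed == a ** 2 * var_definition)     # True (Theorem 8, variance part)
```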
Theorem 9. The definition of expectation gives
$$E(X + Y) = \sum_{z \in R_{X+Y}} z\, P(X + Y = z).$$
Now, if $z \in R_{X+Y}$ we can write $z = x + y$ where $x \in R_X$ and $y \in R_Y$. Hence we can replace the sum over $z$ by a sum over $x$ and $y$:
$$E(X + Y) = \sum_{x \in R_X} \sum_{y \in R_Y} (x + y) P(X = x, Y = y).$$
Split the sum up:
$$E(X + Y) = \sum_{x \in R_X} \sum_{y \in R_Y} x\, P(X = x, Y = y) + \sum_{x \in R_X} \sum_{y \in R_Y} y\, P(X = x, Y = y)$$
$$= \sum_{x \in R_X} x \sum_{y \in R_Y} P(X = x, Y = y) + \sum_{y \in R_Y} y \sum_{x \in R_X} P(X = x, Y = y).$$
(If $R_X$ or $R_Y$ is infinite, you'll need to take on trust that the reversal of the order of summation is OK here.) Now $\sum_{y \in R_Y} P(X = x, Y = y) = P(X = x)$, and similarly $\sum_{x \in R_X} P(X = x, Y = y) = P(Y = y)$. So we get
$$E(X + Y) = \sum_{x \in R_X} x\, P(X = x) + \sum_{y \in R_Y} y\, P(Y = y) = E(X) + E(Y).$$
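Note that Theorem 9 makes no independence assumption, so it can be checked on a deliberately dependent joint pmf. The following sketch is not from the notes and the joint pmf is invented for illustration.

```python
from fractions import Fraction

# A minimal sketch (not from the notes): check E(X + Y) = E(X) + E(Y) on a
# dependent joint pmf, showing that no independence is needed.
joint = {  # p(x, y) for the pair (X, Y)
    (0, 0): Fraction(1, 2),
    (1, 1): Fraction(1, 4),
    (1, 2): Fraction(1, 4),
}

# E(X + Y) from the joint pmf directly.
lhs = sum((x + y) * p for (x, y), p in joint.items())

# E(X) + E(Y) from the marginal pmfs.
EX = sum(x * p for (x, _), p in joint.items())
EY = sum(y * p for (_, y), p in joint.items())

print(lhs, EX + EY)  # both 5/4
```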
Theorem 10. This is similar to Theorem 9. Start with
$$E(XY) = \sum_{z \in R_{XY}} z\, P(XY = z) = \sum_{x \in R_X} \sum_{y \in R_Y} (xy) P(X = x, Y = y).$$
By independence, $P(X = x, Y = y) = P(X = x)P(Y = y)$, so we get
$$E(XY) = \sum_{x \in R_X} \sum_{y \in R_Y} (xy) P(X = x) P(Y = y).$$
Now, with respect to $y$, we can regard $x$ and $P(X = x)$ as constants, so we take them out of the sum with respect to $y$, and get
$$E(XY) = \sum_{x \in R_X} x\, P(X = x) \sum_{y \in R_Y} y\, P(Y = y),$$
which immediately gives $E(XY) = E(X)E(Y)$.

Corollary 11. Start with the variance identity (Theorem 7):
$$\operatorname{Var}(X + Y) = E((X + Y)^2) - (E(X + Y))^2.$$
Use Theorems 8, 9 and 10 to get
$$\operatorname{Var}(X + Y) = E(X^2 + 2XY + Y^2) - (E(X))^2 - (E(Y))^2 - 2E(X)E(Y)$$
$$= E(X^2) + E(Y^2) + 2E(XY) - (E(X))^2 - (E(Y))^2 - 2E(X)E(Y)$$
$$= \operatorname{Var}(X) + \operatorname{Var}(Y) + 2(E(XY) - E(X)E(Y)),$$
and by Theorem 10, $E(XY) - E(X)E(Y) = 0$, giving the result.

Theorem 12. This is an easy exercise with the definitions of mean and variance: $E(X) = 0 \cdot (1 - p) + 1 \cdot p = p$, and $E(X^2)$ is also $p$ (since $X$ only takes values $0$ and $1$, $X$ and $X^2$ are actually the same). Hence $\operatorname{Var}(X) = E(X^2) - (E(X))^2 = p - p^2 = p(1 - p)$.

Theorem 13. Use the fact that $X = \sum_{i=1}^n Z_i$, where $Z_i = 1$ if trial $i$ is a success and $Z_i = 0$ if it is a failure. By assumption, the $Z_i$ are independent. Thus Theorems 9 and 12 tell us
$$E(X) = E\left(\sum_{i=1}^n Z_i\right) = \sum_{i=1}^n E(Z_i) = np,$$
and Corollary 11 and Theorem 12 give
$$\operatorname{Var}(X) = \operatorname{Var}\left(\sum_{i=1}^n Z_i\right) = \sum_{i=1}^n \operatorname{Var}(Z_i) = np(1 - p).$$
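Theorem 13 can be verified directly by summing the binomial pmf. The sketch below is not from the notes; the values of $n$ and $p$ are arbitrary, and exact fractions are used so that the equalities hold exactly.

```python
from fractions import Fraction
from math import comb

# A minimal sketch (not from the notes): verify Theorem 13 for one choice of
# n and p by summing the Binomial(n, p) pmf exactly.
n, p = 10, Fraction(3, 10)

pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

mean = sum(x * q for x, q in pmf.items())
var = sum(x**2 * q for x, q in pmf.items()) - mean**2

print(mean == n * p)             # True: E(X) = np
print(var == n * p * (1 - p))    # True: Var(X) = np(1 - p)
```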
Theorem 14. We are looking for
$$\lim_{n \to \infty} \binom{n}{x} \left(\frac{\lambda}{n}\right)^x \left(1 - \frac{\lambda}{n}\right)^{n - x},$$
which, writing $\binom{n}{x} = \frac{n!}{x!(n-x)!}$ and factoring out terms which do not depend on $n$, becomes
$$\frac{\lambda^x}{x!} \lim_{n \to \infty} \frac{n(n-1)(n-2)\cdots(n-x+1)}{n^x} \left(1 - \frac{\lambda}{n}\right)^n \left(1 - \frac{\lambda}{n}\right)^{-x}.$$
Now,
$$\lim_{n \to \infty} \frac{n(n-1)(n-2)\cdots(n-x+1)}{n^x} = \lim_{n \to \infty} \frac{n}{n} \cdot \frac{n-1}{n} \cdots \frac{n-x+1}{n} = 1,$$
and $\lim_{n \to \infty} \left(1 - \frac{\lambda}{n}\right)^{-x}$ is also $1$. So we are left with
$$\frac{\lambda^x}{x!} \lim_{n \to \infty} \left(1 - \frac{\lambda}{n}\right)^n.$$
By Note 638 in MAS110, $\lim_{n \to \infty} \left(1 - \frac{\lambda}{n}\right)^n = e^{-\lambda}$, so we are left with
$$\frac{e^{-\lambda} \lambda^x}{x!},$$
as required.

Theorem 15: valid pmf. As $p_X(x) \geq 0$, we just need to check $\sum_{x=0}^\infty p_X(x) = 1$. Checking, we have
$$\sum_{x=0}^\infty p_X(x) = \sum_{x=0}^\infty \frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^\infty \frac{\lambda^x}{x!} = e^{-\lambda} e^{\lambda} = 1,$$
recognising the sum as the series expansion of the exponential function.

Theorem 15: mean and variance. We have
$$E(X) = \sum_{x=0}^\infty x\, \frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=1}^\infty \frac{\lambda^x}{(x-1)!}$$
(using $x! = x(x-1)!$). Changing variables to $y = x - 1$, we get
$$E(X) = e^{-\lambda} \sum_{y=0}^\infty \frac{\lambda^{y+1}}{y!} = \lambda e^{-\lambda} \sum_{y=0}^\infty \frac{\lambda^y}{y!} = \lambda e^{-\lambda} e^{\lambda} = \lambda.$$
For the variance, we have $E(X^2) = \lambda^2 + \lambda$ (see Exercise 36) and thus $\operatorname{Var}(X) = (\lambda^2 + \lambda) - (\lambda)^2 = \lambda$.
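The convergence in Theorem 14 can be seen numerically by comparing $\mathrm{Binomial}(n, \lambda/n)$ probabilities with the $\mathrm{Poisson}(\lambda)$ pmf as $n$ grows. The sketch below is not from the notes; the values of $\lambda$ and $x$ are chosen arbitrarily.

```python
from math import comb, exp, factorial

# A minimal sketch (not from the notes): illustrate Theorem 14 by comparing
# Binomial(n, lam/n) probabilities with the Poisson(lam) pmf for growing n.
lam, x = 2.5, 3
poisson = exp(-lam) * lam**x / factorial(x)

for n in (10, 100, 10_000):
    binom = comb(n, x) * (lam / n) ** x * (1 - lam / n) ** (n - x)
    print(n, binom, poisson)  # the binomial probability approaches the Poisson one
```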
Theorem 16. For $x \in \mathbb{N}$,
$$F_X(x) = P(X \leq x) = \sum_{a=1}^x P(X = a) = \sum_{a=1}^x (1 - p)^{a-1} p = \frac{p - (1-p)^x p}{1 - (1 - p)}$$
(geometric series with $x$ terms, first term $p$ and common ratio $1 - p$), and that simplifies to $1 - (1 - p)^x$ as required.

Theorem 17. By the definition of mean,
$$E(X) = \sum_{x=1}^\infty x (1 - p)^{x-1} p.$$
The Binomial Theorem (negative integer case) tells us that for $|\theta| < 1$,
$$(1 - \theta)^{-2} = \sum_{n=0}^\infty (n + 1)\theta^n = \sum_{m=1}^\infty m \theta^{m-1},$$
which you can also obtain by differentiating term by term the formula for the sum of an infinite geometric series. Using this with $\theta = 1 - p$ gives
$$\sum_{x=1}^\infty x (1 - p)^{x-1} p = p \sum_{x=1}^\infty x (1 - p)^{x-1} = p (1 - (1 - p))^{-2} = \frac{1}{p}.$$
For the variance, we start by finding
$$E(X(X-1)) = \sum_{x=1}^\infty x(x-1)(1 - p)^{x-1} p = \sum_{x=2}^\infty x(x-1)(1 - p)^{x-1} p,$$
as the $x = 1$ term is zero. Again the Binomial Theorem (or term by term differentiation) says
$$(1 - \theta)^{-3} = \sum_{m=2}^\infty \frac{m(m-1)}{2}\, \theta^{m-2},$$
and thus
$$\sum_{x=2}^\infty x(x-1)(1 - p)^{x-1} p = p(1-p) \sum_{x=2}^\infty x(x-1)(1 - p)^{x-2} = 2p(1-p)(1 - (1 - p))^{-3} = \frac{2(1-p)}{p^2}.$$
Now, $E(X^2) = E(X(X-1) + X) = \frac{2(1-p)}{p^2} + \frac{1}{p}$, and
$$\operatorname{Var}(X) = E(X^2) - (E(X))^2 = \frac{2(1-p)}{p^2} + \frac{1}{p} - \frac{1}{p^2} = \frac{1-p}{p^2}.$$
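The geometric cdf, mean and variance can be confirmed by truncating the series at a large cutoff. This sketch is not from the notes; $p$, $x$ and the cutoff are arbitrary choices for illustration.

```python
# A minimal sketch (not from the notes): check Theorems 16 and 17 for the
# Geometric(p) distribution by truncating the infinite sums.
p = 0.3
cutoff = 10_000  # large enough that the truncation error is negligible here

pmf = lambda a: (1 - p) ** (a - 1) * p  # P(X = a), a = 1, 2, 3, ...

# Theorem 16: F_X(x) = 1 - (1 - p)**x.
x = 7
cdf_sum = sum(pmf(a) for a in range(1, x + 1))
print(cdf_sum, 1 - (1 - p) ** x)

# Theorem 17: E(X) = 1/p and Var(X) = (1 - p)/p**2.
mean = sum(a * pmf(a) for a in range(1, cutoff))
var = sum(a**2 * pmf(a) for a in range(1, cutoff)) - mean**2
print(mean, 1 / p)
print(var, (1 - p) / p**2)
```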
Theorem 18. This is really just a repeat of Corollary 11, but without the last line which uses the independence assumption. Start with the variance identity (Theorem 7):
$$\operatorname{Var}(X + Y) = E((X + Y)^2) - (E(X + Y))^2.$$
Use Theorem 8 to get
$$\operatorname{Var}(X + Y) = E(X^2 + 2XY + Y^2) - (E(X))^2 - (E(Y))^2 - 2E(X)E(Y)$$
$$= E(X^2) + E(Y^2) + 2E(XY) - (E(X))^2 - (E(Y))^2 - 2E(X)E(Y)$$
$$= \operatorname{Var}(X) + \operatorname{Var}(Y) + 2(E(XY) - E(X)E(Y))$$
$$= \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X, Y).$$

Theorem 19.

1. By definition, $\operatorname{Cov}(X, X) = E((X - E(X))(X - E(X))) = E((X - E(X))^2)$, which is the definition of the variance.

2. We have
$$\operatorname{Cov}(aX + b, cY + d) = E((aX + b - (aE(X) + b))(cY + d - (cE(Y) + d))) = E(ac(X - E(X))(Y - E(Y))) = ac \operatorname{Cov}(X, Y),$$
using the definition of covariance and Theorem 8.

3. That $\operatorname{Cov}(X, Y) = 0$ if $X$ and $Y$ are independent follows from Theorem 10. That the converse does not necessarily hold is shown by Example 46.

4. We have $\operatorname{Cor}(X, X) = \frac{\operatorname{Cov}(X, X)}{\sqrt{\operatorname{Var}(X)\operatorname{Var}(X)}} = \frac{\operatorname{Var}(X)}{\operatorname{Var}(X)} = 1$. For $\operatorname{Cor}(X, -X)$, note that 2 above shows that $\operatorname{Cov}(X, -X) = -\operatorname{Var}(X)$, from which $\operatorname{Cor}(X, -X) = -1$ follows immediately.

5. Let $\sigma_X^2 = \operatorname{Var}(X)$, $\sigma_Y^2 = \operatorname{Var}(Y)$ and $c = \operatorname{Cov}(X, Y)$. Consider $\operatorname{Var}\!\left(X - \frac{c}{\sigma_Y^2} Y\right)$, and note that because this is a variance it must be non-negative. By Theorem 18, and also using Theorem 8 and
2 above, we get
$$\operatorname{Var}\!\left(X - \frac{c}{\sigma_Y^2} Y\right) = \operatorname{Var}(X) + \frac{c^2}{\sigma_Y^4}\operatorname{Var}(Y) - \frac{2c}{\sigma_Y^2}\operatorname{Cov}(X, Y) = \sigma_X^2 + \frac{c^2}{\sigma_Y^2} - \frac{2c^2}{\sigma_Y^2} = \sigma_X^2 - \frac{c^2}{\sigma_Y^2}.$$
Thus
$$\sigma_X^2 - \frac{c^2}{\sigma_Y^2} \geq 0,$$
and dividing through by $\sigma_X^2$ we get
$$\frac{\operatorname{Cov}(X, Y)^2}{\sigma_X^2 \sigma_Y^2} = \frac{c^2}{\sigma_X^2 \sigma_Y^2} \leq 1,$$
from which the result follows.

6. By 2 above, $\operatorname{Cov}(X, a + bX) = b\operatorname{Var}(X)$, and by Theorem 8, $\operatorname{Var}(a + bX) = b^2 \operatorname{Var}(X)$. So we get
$$\operatorname{Cor}(X, a + bX) = \frac{b\operatorname{Var}(X)}{\sqrt{b^2 (\operatorname{Var}(X))^2}},$$
which gives the result, remembering that $\sqrt{b^2}$ is $b$ if $b > 0$ and $-b$ if $b < 0$.
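Parts 1, 2 and 6 of Theorem 19 are straightforward to confirm on any finite joint pmf. The following sketch is not from the notes; the joint pmf and the constants are made up for illustration, and exact fractions are used so the equalities are exact.

```python
from fractions import Fraction

# A minimal sketch (not from the notes): check parts 1, 2 and 6 of Theorem 19
# on a small, made-up joint pmf.
joint = {(0, 1): Fraction(1, 4), (1, 3): Fraction(1, 2), (2, 2): Fraction(1, 4)}

def E(f):
    return sum(f(x, y) * p for (x, y), p in joint.items())

def cov(f, g):
    return E(lambda x, y: f(x, y) * g(x, y)) - E(f) * E(g)

X = lambda x, y: x
Y = lambda x, y: y

var_X = cov(X, X)
print(var_X == E(lambda x, y: (x - E(X)) ** 2))       # part 1: Cov(X, X) = Var(X)

a, b, c, d = 2, 5, -3, 7
lhs = cov(lambda x, y: a * x + b, lambda x, y: c * y + d)
print(lhs == a * c * cov(X, Y))                        # part 2: Cov(aX+b, cY+d) = ac Cov(X, Y)

print(cov(X, lambda x, y: a + b * x) == b * var_X)     # part 6 ingredient: Cov(X, a+bX) = b Var(X)
```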
Theorem 20. We want $P(X_2 = x_2, X_3 = x_3, \ldots, X_k = x_k \mid X_1 = x_1)$, which by the definition of conditional probability is
$$\frac{P(X_2 = x_2, X_3 = x_3, \ldots, X_k = x_k, X_1 = x_1)}{P(X_1 = x_1)}.$$
By the formulae for the multinomial and binomial distributions, this becomes
$$\frac{\dfrac{n!}{x_1! x_2! \cdots x_k!}\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}}{\dfrac{n!}{x_1!(n - x_1)!}\, p_1^{x_1} (1 - p_1)^{n - x_1}}$$
and various terms cancel, giving
$$\frac{(n - x_1)!\, p_2^{x_2} p_3^{x_3} \cdots p_k^{x_k}}{x_2! \cdots x_k!\, (1 - p_1)^{n - x_1}},$$
which is the same as
$$\frac{(n - x_1)!}{x_2! \cdots x_k!} \left(\frac{p_2}{1 - p_1}\right)^{x_2} \left(\frac{p_3}{1 - p_1}\right)^{x_3} \cdots \left(\frac{p_k}{1 - p_1}\right)^{x_k},$$
giving the result.

Theorem 21. For $x \geq 0$,
$$F_X(x) = P(X \leq x) = \int_0^x \lambda e^{-\lambda t}\, dt = 1 - e^{-\lambda x}.$$
Note that if $x < 0$, $P(X \leq x) = 0$ as $X$ cannot be negative, so in full
$$F_X(x) = \begin{cases} 1 - e^{-\lambda x} & x \geq 0 \\ 0 & x < 0. \end{cases}$$

Theorem 22. We have
$$E(X) = \int_0^\infty \lambda x e^{-\lambda x}\, dx.$$
Integration by parts gives
$$\lambda\left(\left[-\frac{1}{\lambda}\, x e^{-\lambda x}\right]_0^\infty + \frac{1}{\lambda}\int_0^\infty e^{-\lambda x}\, dx\right).$$
As $x e^{-\lambda x} \to 0$ as $x \to \infty$, we get $\int_0^\infty e^{-\lambda x}\, dx$, which gives $1/\lambda$. For the variance see Exercise 51.

Theorem 23. By the definition of conditional probability, the left hand side is
$$\frac{P(\{X > x + a\} \cap \{X > a\})}{P(X > a)}.$$
However $\{X > x + a\} \cap \{X > a\} = \{X > x + a\}$, so we get
$$\frac{P(X > x + a)}{P(X > a)} = \frac{e^{-\lambda(x + a)}}{e^{-\lambda a}} = e^{-\lambda x} = P(X > x),$$
where we have used Theorem 21 and the fact that it implies $P(X > x) = e^{-\lambda x}$ for all $x > 0$.

Theorem 24. For $x \in [a, b]$,
$$F_X(x) = \int_a^x \frac{1}{b - a}\, dt = \frac{x - a}{b - a}.$$
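The exponential results of Theorems 21 to 23, including the memoryless property, show up clearly in simulation. This sketch is not part of the notes; the rate, thresholds and sample size are arbitrary.

```python
import random
from math import exp

# A minimal sketch (not from the notes): illustrate Theorems 21-23 for the
# Exponential(lam) distribution by simulation.
random.seed(0)
lam = 1.5
samples = [random.expovariate(lam) for _ in range(200_000)]

# Theorem 21: F_X(x) = 1 - exp(-lam * x).
x = 0.8
empirical_cdf = sum(s <= x for s in samples) / len(samples)
print(empirical_cdf, 1 - exp(-lam * x))

# Theorem 22: E(X) = 1/lam.
print(sum(samples) / len(samples), 1 / lam)

# Theorem 23 (memorylessness): P(X > x + a | X > a) = P(X > x).
a = 0.5
survivors = [s for s in samples if s > a]
print(sum(s > x + a for s in survivors) / len(survivors), exp(-lam * x))
```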
Theorem 25. We have
$$E(X) = \int_a^b x \cdot \frac{1}{b - a}\, dx = \left[\frac{x^2}{2(b - a)}\right]_a^b = \frac{b^2 - a^2}{2(b - a)} = \frac{b + a}{2}.$$
For the variance, first find
$$E(X^2) = \int_a^b x^2 \cdot \frac{1}{b - a}\, dx = \left[\frac{x^3}{3(b - a)}\right]_a^b = \frac{b^3 - a^3}{3(b - a)} = \frac{b^2 + ab + a^2}{3}.$$
Then
$$\operatorname{Var}(X) = \frac{b^2 + ab + a^2}{3} - \frac{(a + b)^2}{4} = \frac{b^2 + a^2 - 2ab}{12} = \frac{(b - a)^2}{12}.$$

Theorem 26. By definition,
$$\Phi(-z) = \int_{-\infty}^{-z} \varphi(t)\, dt.$$
Change variables to $s = -t$ and use the symmetry of $\varphi$ to get
$$\Phi(-z) = \int_z^\infty \varphi(-s)\, ds = \int_z^\infty \varphi(s)\, ds,$$
which is $1 - \Phi(z)$ as required.
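Theorem 25 can be checked by simulation and Theorem 26 against the standard normal cdf available in the Python standard library. This sketch is not from the notes; the interval endpoints and $z$ are arbitrary.

```python
import random
from statistics import NormalDist

# A minimal sketch (not from the notes): check Theorem 25 by simulation and
# Theorem 26 with the standard normal cdf from the standard library.
random.seed(0)
a, b = 2.0, 7.0
u = [random.uniform(a, b) for _ in range(200_000)]
mean = sum(u) / len(u)
var = sum((x - mean) ** 2 for x in u) / len(u)
print(mean, (a + b) / 2)            # Theorem 25: E(X) = (a + b)/2
print(var, (b - a) ** 2 / 12)       # Theorem 25: Var(X) = (b - a)^2/12

Phi = NormalDist().cdf
z = 1.3
print(Phi(-z), 1 - Phi(z))          # Theorem 26: Phi(-z) = 1 - Phi(z)
```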
Theorem 27. For the expectation,
$$E(Z) = \int_{-\infty}^\infty z\, \varphi(z)\, dz = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty z e^{-z^2/2}\, dz.$$
Considering the improper integral as a limit, this is
$$\frac{1}{\sqrt{2\pi}} \lim_{s \to -\infty,\, t \to \infty} \int_s^t z e^{-z^2/2}\, dz,$$
which becomes
$$\frac{1}{\sqrt{2\pi}} \lim_{s \to -\infty,\, t \to \infty} \left[-e^{-z^2/2}\right]_s^t = \frac{1}{\sqrt{2\pi}} \lim_{s \to -\infty,\, t \to \infty} \left(e^{-s^2/2} - e^{-t^2/2}\right) = 0.$$
For the variance, we need to calculate
$$E(Z^2) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty z^2 e^{-z^2/2}\, dz.$$
Writing $z^2 e^{-z^2/2}$ as $z \cdot z e^{-z^2/2}$ and integrating by parts, we get
$$E(Z^2) = \frac{1}{\sqrt{2\pi}} \left(\left[-z e^{-z^2/2}\right]_{-\infty}^\infty + \int_{-\infty}^\infty e^{-z^2/2}\, dz\right).$$
The integral is just the integral of the Normal pdf again, so is $1$, and $z e^{-z^2/2} \to 0$ both as $z \to \infty$ and $z \to -\infty$. Hence we get $E(Z^2) = 1$, and so $\operatorname{Var}(Z) = 1 - E(Z)^2 = 1$, as $E(Z) = 0$.

Theorem 28. If $X = \mu + \sigma Z$, consider the (cumulative) distribution function of $X$:
$$F_X(x) = P(X \leq x) = P(\mu + \sigma Z \leq x) = P\!\left(Z \leq \frac{x - \mu}{\sigma}\right) = \Phi\!\left(\frac{x - \mu}{\sigma}\right).$$
To get the probability density function of $X$, differentiate, using the chain rule:
$$f_X(x) = F_X'(x) = \frac{1}{\sigma}\, \varphi\!\left(\frac{x - \mu}{\sigma}\right).$$
That $E(X) = \mu$ and $\operatorname{Var}(X) = \sigma^2$ follows from Theorem 8.
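The moments in Theorem 27 and the density formula in Theorem 28 can be approximated numerically. The sketch below is not from the notes; it uses a crude midpoint Riemann sum on a truncated range, with arbitrary choices of grid, $\mu$, $\sigma$ and $x$, purely as an illustration.

```python
from math import exp, pi, sqrt

# A minimal sketch (not from the notes): approximate E(Z) and E(Z^2) for the
# standard normal by a midpoint Riemann sum over [-10, 10] (Theorem 27), and
# evaluate the density formula of Theorem 28 at one point.
phi = lambda z: exp(-z * z / 2) / sqrt(2 * pi)

n, lo, hi = 200_000, -10.0, 10.0
dz = (hi - lo) / n
grid = [lo + (i + 0.5) * dz for i in range(n)]

EZ = sum(z * phi(z) for z in grid) * dz
EZ2 = sum(z * z * phi(z) for z in grid) * dz
print(EZ, EZ2)  # approximately 0 and 1, so Var(Z) is approximately 1

# Theorem 28: the N(mu, sigma^2) density is (1/sigma) * phi((x - mu)/sigma).
mu, sigma, x = 3.0, 2.0, 4.5
f_X = phi((x - mu) / sigma) / sigma
print(f_X)  # density of N(3, 4) at x = 4.5
```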