Introduction to Probability. Ariel Yadin. Lecture 10.
1. Expectation - Discrete Case

*** Jan. 3 ***

Proposition 1.1. Let $X$ be a discrete random variable, with range $R$ and density $f_X$. Then,
\[ E[X] = \sum_{r \in R} r f_X(r). \]

Proof. For all $N$, let
\[ X_N^+ := \sum_{r \in R \cap [0,N]} 1_{\{X=r\}} r \qquad \text{and} \qquad X_N^- := -\sum_{r \in R \cap [-N,0]} 1_{\{X=r\}} r. \]
Note that $X_N^+ \nearrow X^+$ and $X_N^- \nearrow X^-$. Moreover, by linearity,
\[ E[X_N^+] = \sum_{r \in R \cap [0,N]} P[X=r] r \qquad \text{and} \qquad E[X_N^-] = -\sum_{r \in R \cap [-N,0]} P[X=r] r. \]
Using monotone convergence we get that
\[ E[X] = E[X^+] - E[X^-] = \lim_{N \to \infty} \big( E[X_N^+] - E[X_N^-] \big) = \sum_{r \in R} f_X(r) r = \sum_{r \in R} P[X=r] r. \]

Example 1.2. Let us calculate the expectations of different discrete random variables (Ber, Bin, Poi, Geo):

If $X \sim \mathrm{Ber}(p)$ then $E[X] = 1 \cdot P[X=1] + 0 \cdot P[X=0] = p$.

If $X \sim \mathrm{Bin}(n,p)$ then, since $\binom{n}{k} k = n \binom{n-1}{k-1}$ for $1 \le k \le n$,
\[ \sum_k f_X(k) k = \sum_{k=0}^n \binom{n}{k} p^k (1-p)^{n-k} k = np \sum_{k=1}^n \binom{n-1}{k-1} p^{k-1} (1-p)^{n-k} = np. \]
An easier way would be to note that $X = \sum_{k=1}^n X_k$ where $X_k \sim \mathrm{Ber}(p)$ (and in fact $X_1, \ldots, X_n$ are independent). Thus, by linearity, $E[X] = \sum_{k=1}^n E[X_k] = np$.
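The identity $E[\mathrm{Bin}(n,p)] = np$ is easy to sanity-check numerically. The following sketch (not part of the original notes; the parameters $n = 10$, $p = 0.3$ are arbitrary) builds a binomial sample as a sum of Bernoulli trials, exactly as in the linearity argument above, and compares the empirical mean against $np$, using only the Python standard library.

```python
import random

def binomial_sample(n, p):
    # A Bin(n, p) variable is a sum of n independent Ber(p) variables.
    return sum(1 for _ in range(n) if random.random() < p)

def mc_mean(sampler, trials=200_000):
    # Monte Carlo estimate of an expectation: average of i.i.d. samples.
    return sum(sampler() for _ in range(trials)) / trials

random.seed(0)
n, p = 10, 0.3
est = mc_mean(lambda: binomial_sample(n, p))
# est should be close to n * p = 3.0
```

With 200,000 trials the empirical mean typically lands within a few hundredths of $np$.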
For $X \sim \mathrm{Poi}(\lambda)$:
\[ E[X] = \sum_{k=0}^\infty e^{-\lambda} \frac{\lambda^k}{k!} k = \lambda e^{-\lambda} \sum_{k=1}^\infty \frac{\lambda^{k-1}}{(k-1)!} = \lambda. \]

For $X \sim \mathrm{Geo}(p)$: note that for $g(x) = \sum_{k \ge 1} (1-x)^k$, we have $\frac{\partial}{\partial x} g(x) = -\sum_{k \ge 1} k (1-x)^{k-1}$. So
\[ \sum_k f_X(k) k = \sum_{k=1}^\infty (1-p)^{k-1} p k = -p \frac{\partial}{\partial p} \sum_{k=1}^\infty (1-p)^k = -p \frac{\partial}{\partial p} \left( \frac{1}{p} - 1 \right) = p \cdot \frac{1}{p^2} = \frac{1}{p}. \]
Another way: let $E = E[X]$. Then,
\[ E = p + \sum_{k=2}^\infty (1-p)^{k-1} p k = p + \sum_{k=2}^\infty (1-p)^{k-1} p (k-1) + \sum_{k=2}^\infty (1-p)^{k-1} p = p + (1-p)E + (1-p). \]
So $E = 1 + (1-p)E$, or $E = 1/p$.

Example 1.3. A pair of independent fair dice are tossed. What is the expected number of tosses needed to see Shesh-Besh (a six and a five)? Note that each toss of the dice is an independent trial, such that the probability of seeing Shesh-Besh is $2/36 = 1/18$. So if $X$ is the number of tosses until Shesh-Besh, then $X \sim \mathrm{Geo}(1/18)$. Thus, $E[X] = 18$.

Function of a random variable. Let $g : \mathbb{R}^d \to \mathbb{R}$ be a measurable function. Let $(X_1, \ldots, X_d)$ be a joint distribution of $d$ discrete random variables with range $R$ each. Then $Y = g(X_1, \ldots, X_d)$ is a random variable. What is its expectation? Well, first we need the density of $Y$: for any $y \in \mathbb{R}$ we have that
\[ P[Y = y] = P[(X_1, \ldots, X_d) \in g^{-1}(\{y\})] = \sum_{(r_1, \ldots, r_d) \in R^d \cap g^{-1}(\{y\})} P[(X_1, \ldots, X_d) = (r_1, \ldots, r_d)]. \]
So, if $R_Y := g(R^d)$, then since
\[ R^d = \bigcup_{y \in R_Y} R^d \cap g^{-1}(\{y\}), \]
and since $Y$ is discrete, we get that
\[ E[g(X_1, \ldots, X_d)] = \sum_{y \in R_Y} y P[Y = y] = \sum_{y \in R_Y} \sum_{\bar r \in R^d \cap g^{-1}(\{y\})} f_{(X_1, \ldots, X_d)}(\bar r) g(\bar r) = \sum_{\bar r \in R^d} f_{(X_1, \ldots, X_d)}(\bar r) g(\bar r). \]
Specifically:

Proposition 1.4. Let $g : \mathbb{R}^d \to \mathbb{R}$ be a measurable function, and $X = (X_1, \ldots, X_d)$ a discrete joint random variable with range $R^d$. Then,
\[ E[g(X)] = \sum_{\bar r \in R^d} f_X(\bar r) g(\bar r). \]

Example 1.5. Manchester United plans to earn some money selling Wayne Rooney jerseys. Each jersey costs the club $x$ pounds, and is sold for $y > x$ pounds. Suppose that the number of people who want to buy jerseys is a discrete random variable $X$ with range $\mathbb{N}$. What is the expected profit if the club orders $N$ jerseys? How many jerseys should be ordered in order to maximize this profit?

Solution. Let $p_k = f_X(k) = P[X = k]$. Let $g_N : \mathbb{N} \to \mathbb{R}$ be the function that takes the number of people who want to buy jerseys and gives the profit, if the club ordered $N$ jerseys. That is,
\[ g_N(k) = \begin{cases} ky - Nx & k \le N, \\ N(y-x) & k > N. \end{cases} \]
The expected profit is then, writing $ky - Nx = N(y-x) - (N-k)y$ for $k \le N$,
\[ E[g_N(X)] = \sum_k p_k g_N(k) = \sum_k p_k N(y-x) - \sum_{k=0}^N p_k (N-k) y = N(y-x) - \sum_{k=0}^N p_k (N-k) y. \]
Call this $G(N) := N(y-x) - \sum_{k=0}^N p_k (N-k) y$. We want to maximize this as a function of $N$. Note that
\[ G(N+1) - G(N) = y - x - y \sum_{k=0}^N p_k \big( (N+1-k) - (N-k) \big) = y - x - y P[X \le N]. \]
So $G(N+1) > G(N)$ as long as $P[X \le N] < \frac{y-x}{y}$, so the club should order $N+1$ jerseys for the largest $N$ such that $P[X \le N] < \frac{y-x}{y}$.
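The stopping rule for the jersey example can be checked numerically. A minimal sketch, not from the notes: assume, purely for illustration, a Poisson(30) demand with cost $x = 20$ and price $y = 50$, compute $E[g_N(X)]$ directly from Proposition 1.4, and compare the maximizing $N$ with the rule "order the smallest $N$ with $P[X \le N] \ge (y-x)/y$" (which is the same as ordering $N+1$ for the largest $N$ with $P[X \le N] < (y-x)/y$).

```python
import math

def poisson_pmf(k, lam):
    # P[X = k] for X ~ Poi(lam).
    return math.exp(-lam) * lam**k / math.factorial(k)

def expected_profit(N, x, y, pmf, kmax=150):
    # E[g_N(X)] with g_N(k) = k*y - N*x for k <= N, and N*(y - x) for k > N.
    total = 0.0
    for k in range(kmax + 1):
        profit = k * y - N * x if k <= N else N * (y - x)
        total += pmf(k) * profit
    return total

# Hypothetical numbers for illustration: cost 20, price 50, Poisson(30) demand.
x, y, lam = 20.0, 50.0, 30.0
pmf = lambda k: poisson_pmf(k, lam)
profits = {N: expected_profit(N, x, y, pmf) for N in range(100)}
best_N = max(profits, key=profits.get)

# The rule from the notes: order the smallest N with P[X <= N] >= (y - x)/y.
cdf, rule_N = 0.0, None
for N in range(100):
    cdf += pmf(N)
    if cdf >= (y - x) / y:
        rule_N = N
        break
```

Brute-force maximization of the expected profit and the threshold rule should pick the same order size.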
2. Expectation - Continuous Case

Our goal is now to prove the following theorem:

Theorem 1.6. Let $X = (X_1, \ldots, X_d)$ be an absolutely continuous random variable, and let $g : \mathbb{R}^d \to \mathbb{R}$ be a measurable function. Then,
\[ E[g(X)] = \int_{\mathbb{R}^d} g(\bar x) f_X(\bar x) \, d\bar x. \]

The main lemma here is:

Lemma 1.7. Let $X = (X_1, \ldots, X_d)$ be an absolutely continuous random variable. Then, for any Borel set $B \in \mathcal{B}_d$,
\[ P[X \in B] = \int_B f_X(\bar x) \, d\bar x. \]

Proof. Let $Q(B) = \int_B f_X(\bar x) \, d\bar x$. Then $Q$ is a probability measure on $(\mathbb{R}^d, \mathcal{B}_d)$ that coincides with $P_X$ on the $\pi$-system of rectangles $(-\infty, b_1] \times \cdots \times (-\infty, b_d]$. Thus, $P[X \in B] = P_X(B) = Q(B)$ for all $B \in \mathcal{B}_d$.

Remark 1.8. We have not really defined the integral $\int_B f_X(\bar x) \, d\bar x$. However, for our purposes, we can define it as $P[X \in B]$, and note that this coincides with the Riemann integral on intervals.

Proof of Theorem 1.6. First assume that $g \ge 0$, so $\mathbb{R}^d = g^{-1}[0, \infty)$. For all $n$ define
\[ Y_n = 2^{-n} \lfloor 2^n g(X) \rfloor, \]
which are discrete non-negative random variables. First, we show that
\[ E[Y_n] = \int_{\mathbb{R}^d} 2^{-n} \lfloor 2^n g(\bar x) \rfloor f_X(\bar x) \, d\bar x. \]
Indeed, for $n, k$ let $B_{n,k} = g^{-1}[2^{-n} k, 2^{-n}(k+1)) \in \mathcal{B}_d$. Note that $Y_n = \sum_k 2^{-n} k \, 1_{\{X \in B_{n,k}\}}$ and $E[Y_n] = \sum_k 2^{-n} k \, P[X \in B_{n,k}]$. Now, since
\[ \mathbb{R}^d = g^{-1}[0, \infty) = \bigcup_k g^{-1}[2^{-n} k, 2^{-n}(k+1)) = \bigcup_k B_{n,k}, \]
and since
\[ 1_{B_{n,k}} 2^{-n} \lfloor 2^n g \rfloor f_X = 1_{B_{n,k}} 2^{-n} k f_X, \]
we have by the lemma above that
\[ \int_{\mathbb{R}^d} 2^{-n} \lfloor 2^n g(\bar x) \rfloor f_X(\bar x) \, d\bar x = \sum_k \int_{\mathbb{R}^d} 1_{B_{n,k}} 2^{-n} \lfloor 2^n g(\bar x) \rfloor f_X(\bar x) \, d\bar x = \sum_k \int_{B_{n,k}} 2^{-n} k \, f_X(\bar x) \, d\bar x = \sum_k 2^{-n} k \, P[X \in B_{n,k}] = E[Y_n]. \]
Here we have used the fact that $1_{B_{n,k}} 2^{-n} \lfloor 2^n g \rfloor = 1_{B_{n,k}} 2^{-n} k$.

Since $2^{-n} \lfloor 2^n g \rfloor \ge g - 2^{-n}$, we get that $Y_n \ge g(X) - 2^{-n}$. Thus,
\[ \left| E[g(X)] - \int_{\mathbb{R}^d} g(\bar x) f_X(\bar x) \, d\bar x \right| \le \left| E[g(X)] - E[Y_n] \right| + \left| \int_{\mathbb{R}^d} 2^{-n} \lfloor 2^n g(\bar x) \rfloor f_X(\bar x) \, d\bar x - \int_{\mathbb{R}^d} g(\bar x) f_X(\bar x) \, d\bar x \right| \le 2 \cdot 2^{-n}. \]
This proves the theorem for non-negative functions $g$. Now, if $g$ is a general measurable function, consider $g = g^+ - g^-$. Since $g^+, g^-$ are non-negative, we have that
\[ E[g(X)] = E[g^+(X) - g^-(X)] = E[g^+(X)] - E[g^-(X)] = \int_{\mathbb{R}^d} (g^+(\bar x) - g^-(\bar x)) f_X(\bar x) \, d\bar x = \int_{\mathbb{R}^d} g(\bar x) f_X(\bar x) \, d\bar x. \]

Corollary 1.9. Let $X$ be an absolutely continuous random variable with density $f_X$. Then,
\[ E[X] = \int_{-\infty}^{\infty} t f_X(t) \, dt. \]
Compare $\int t f_X(t) \, dt$ to $\sum_r r \, P[X = r]$ in the discrete case. This is another place where $f_X$ is like $P[X = \cdot]$ (although the latter is identically $0$ in the continuous case, as we have seen).

Example 1.10. Expectations of some absolutely continuous random variables (Unif., Exp., Normal):

$X \sim U[0,1]$: $E[X] = \int_0^1 t \, dt = 1/2$. More generally, for $X \sim U[a,b]$:
\[ E[X] = \int_a^b t \frac{1}{b-a} \, dt = \frac{1}{2(b-a)} (b^2 - a^2) = \frac{b+a}{2}. \]

$X \sim \mathrm{Exp}(\lambda)$: we use integration by parts, since $\lambda e^{-\lambda t} = -\frac{d}{dt} e^{-\lambda t}$:
\[ \int_0^\infty t \lambda e^{-\lambda t} \, dt = \left[ -t e^{-\lambda t} \right]_0^\infty + \int_0^\infty e^{-\lambda t} \, dt = \frac{1}{\lambda}. \]
$X \sim N(\mu, \sigma)$: change $u = t - \mu$, so $du = dt$:
\[ E[X] = \int_{-\infty}^{\infty} t \frac{1}{\sqrt{2\pi}\sigma} \exp\left( -\frac{(t-\mu)^2}{2\sigma^2} \right) dt = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\sigma} u \exp\left( -\frac{u^2}{2\sigma^2} \right) du + \mu \int_{-\infty}^{\infty} f_X(t) \, dt. \]
Since the function $u \mapsto u \exp\left( -\frac{u^2}{2\sigma^2} \right)$ is an odd function, its integral is $0$, so $E[X] = \mu \int f_X(t) \, dt = \mu$.

*** Jan. 5 ***

Proposition 1.11. Let $X$ be an absolutely continuous random variable, such that $E[X]$ exists. Then,
\[ E[X] = \int_0^\infty \big( P[X > t] - P[X \le -t] \big) \, dt = \int_0^\infty \big( 1 - F_X(t) - F_X(-t) \big) \, dt. \]

Proof. Note that
\[ \int_0^\infty P[X > t] \, dt = \int_0^\infty \int_t^\infty f_X(s) \, ds \, dt = \int_0^\infty \int_0^s dt \, f_X(s) \, ds = \int_0^\infty s f_X(s) \, ds. \]
Similarly,
\[ \int_0^\infty P[X \le -t] \, dt = \int_0^\infty \int_{-\infty}^{-t} f_X(s) \, ds \, dt = \int_{-\infty}^0 \int_0^{-s} dt \, f_X(s) \, ds = -\int_{-\infty}^0 s f_X(s) \, ds. \]
Subtracting both we have the result.

** in exercises **

In a similar way we can prove:

Exercise 1.12. Let $X$ be a discrete random variable with range $\mathbb{Z}$, such that $E[X]$ exists. Then,
\[ E[X] = \sum_{k=0}^\infty \big( P[X > k] - P[X < -k] \big). \]

Example 1.13. Let $X \sim N(0, 1)$. Compute $E[X^2]$. By the above,
\[ E[X^2] = \frac{1}{\sqrt{2\pi}} \int_{\mathbb{R}} x^2 e^{-x^2/2} \, dx = \frac{1}{\sqrt{2\pi}} \left[ -x e^{-x^2/2} \right]_{-\infty}^{\infty} + \frac{1}{\sqrt{2\pi}} \int_{\mathbb{R}} e^{-x^2/2} \, dx = 1, \]
where we have used integration by parts, with $\frac{d}{dx} e^{-x^2/2} = -x e^{-x^2/2}$.
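Proposition 1.11 can be checked numerically. The sketch below (not in the notes; the mean $\mu = 1.7$ is arbitrary) integrates $P[X > t] - P[X \le -t]$ over $[0, \infty)$ for a $N(\mu, 1)$ variable by a midpoint rule, truncating the integral where both tails are negligible, and recovers $E[X] = \mu$.

```python
import math

def norm_cdf(t, mu=0.0, sigma=1.0):
    # CDF of N(mu, sigma) via the error function.
    return 0.5 * (1.0 + math.erf((t - mu) / (sigma * math.sqrt(2.0))))

def tail_integral_mean(cdf, upper=40.0, n=200_000):
    # E[X] = integral over [0, inf) of (P[X > t] - P[X <= -t]) dt,
    # truncated at `upper` and approximated by the midpoint rule.
    h = upper / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        total += ((1.0 - cdf(t)) - cdf(-t)) * h
    return total

mu = 1.7
est = tail_integral_mean(lambda t: norm_cdf(t, mu=mu))
# est should recover the mean mu = 1.7
```

The same routine works for any distribution whose CDF is available, which is the point of the proposition: the mean can be read off the distribution function alone.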
3. Examples Using Linearity

Example 1.14 (Coupon Collector). Gilad collects super-goal cards. There are $N$ cards to collect altogether. Each time he buys a card, he gets one of the $N$ uniformly at random, independently. What is the expected number of cards Gilad needs to buy in order to collect all cards?

For $k = 0, 1, \ldots, N-1$, let $T_k$ be the number of cards bought after getting the $k$-th new card, until getting the $(k+1)$-th new card. That is, when Gilad has $k$ different cards, he buys $T_k$ more cards until he has $k+1$ different cards. If Gilad has $k$ different cards, then with probability $\frac{N-k}{N}$ he buys a card he does not already have. So $T_k \sim \mathrm{Geo}(\frac{N-k}{N})$.

Since the total number of cards Gilad buys until getting all $N$ cards is $T = T_0 + T_1 + T_2 + \cdots + T_{N-1}$, using linearity of expectation,
\[ E[T] = E[T_0] + E[T_1] + \cdots + E[T_{N-1}] = 1 + \frac{N}{N-1} + \cdots + \frac{N}{1} = N \sum_{k=1}^N \frac{1}{k}. \]

Example 1.15. We toss a die 100 times. What is the expected sum of all tosses? Here it really begs to use linearity. If $X_k$ is the outcome of the $k$-th toss, and $X = \sum_{k=1}^{100} X_k$, then
\[ E[X] = \sum_{k=1}^{100} E[X_k] = 100 \cdot \frac{7}{2} = 350. \]

Example 1.16. 200 random numbers are output by a computer, each distributed uniformly on $[0,1]$. What is their expected sum?
\[ \sum_{k=1}^{200} E[U[0,1]] = 200 \cdot \frac{1}{2} = 100. \]

Example 1.17. Let $X_n \sim U[0, 2^{-n}]$, for $n \ge 0$, and let $S_N = \sum_{k=0}^N X_k$. What is the expectation of $S_N$? Linearity of expectation gives
\[ E[S_N] = \sum_{k=0}^N E[X_k] = \sum_{k=0}^N 2^{-(k+1)} = 1 - 2^{-(N+1)}. \]
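A quick simulation of the coupon collector (not part of the notes; $N = 20$ is chosen arbitrarily) confirms $E[T] = N \sum_{k=1}^N 1/k$:

```python
import random

def coupon_collector_draws(N):
    # Buy uniformly random cards until all N distinct cards have been seen.
    seen, draws = set(), 0
    while len(seen) < N:
        seen.add(random.randrange(N))
        draws += 1
    return draws

random.seed(2)
N, trials = 20, 20_000
est = sum(coupon_collector_draws(N) for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, N + 1))
# E[T] = N * H_N; for N = 20 this is about 71.95, and est should be close to it.
```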
Note that if $S = \sum_{k=0}^\infty X_k$, then $S_N \nearrow S$, so monotone convergence gives that $E[S] = 1$.
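A simulation of $S_N$ (again not part of the notes; $N = 10$ is arbitrary) matches $E[S_N] = 1 - 2^{-(N+1)}$, which for moderate $N$ is already close to the limit $E[S] = 1$:

```python
import random

def sample_S(N):
    # S_N = X_0 + ... + X_N with X_n ~ U[0, 2^{-n}] independent.
    return sum(random.uniform(0.0, 2.0 ** (-n)) for n in range(N + 1))

random.seed(3)
N, trials = 10, 100_000
est = sum(sample_S(N) for _ in range(trials)) / trials
expected = 1.0 - 2.0 ** (-(N + 1))
# est should be close to expected = 1 - 2^{-11}, itself close to 1.
```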
More informationIntroduction to Statistical Data Analysis Lecture 3: Probability Distributions
Introduction to Statistical Data Analysis Lecture 3: Probability Distributions James V. Lambers Department of Mathematics The University of Southern Mississippi James V. Lambers Statistical Data Analysis
More information6 The normal distribution, the central limit theorem and random samples
6 The normal distribution, the central limit theorem and random samples 6.1 The normal distribution We mentioned the normal (or Gaussian) distribution in Chapter 4. It has density f X (x) = 1 σ 1 2π e
More informationEcon 508B: Lecture 5
Econ 508B: Lecture 5 Expectation, MGF and CGF Hongyi Liu Washington University in St. Louis July 31, 2017 Hongyi Liu (Washington University in St. Louis) Math Camp 2017 Stats July 31, 2017 1 / 23 Outline
More informationMath 151. Rumbos Spring Solutions to Review Problems for Exam 3
Math 151. Rumbos Spring 2014 1 Solutions to Review Problems for Exam 3 1. Suppose that a book with n pages contains on average λ misprints per page. What is the probability that there will be at least
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationPart (A): Review of Probability [Statistics I revision]
Part (A): Review of Probability [Statistics I revision] 1 Definition of Probability 1.1 Experiment An experiment is any procedure whose outcome is uncertain ffl toss a coin ffl throw a die ffl buy a lottery
More informationStatistika pro informatiku
Statistika pro informatiku prof. RNDr. Roman Kotecký DrSc., Dr. Rudolf Blažek, PhD Katedra teoretické informatiky FIT České vysoké učení technické v Praze MI-SPI, ZS 2011/12, Přednáška 5 Evropský sociální
More information