MA 519 Probability: Review
MA 519: Review

Yingwei Wang
Department of Mathematics, Purdue University, West Lafayette, IN, USA

Contents

1 How to compute the expectation?
  1.1 Tail
  1.2 Index
2 Permutations and combinations
  2.1 Stars and bars
  2.2 Divided into parts
3 About the Exp(λ)
  3.1 Basic facts
  3.2 Scaling
  3.3 Relation to Geo(p)
  3.4 Memoryless property
  3.5 Hazard rate
  3.6 Comparison between independent variables
    3.6.1 Two variables with different λ
    3.6.2 Two variables with the same λ
    3.6.3 More than two variables with the same λ
4 Order statistics
  4.1 Two variables with joint density
  4.2 n iid variables
  4.3 Conditional case

These notes are based on the lecture notes of Prof. Sellke.
5 Poisson process
  5.1 Gamma distribution
  5.2 Poisson distribution and binomial distribution
  5.3 Poisson process
    5.3.1 Definition
    5.3.2 Comparison
6 Normal statistics
  6.1 Independent case
  6.2 Dependent case
  6.3 Prediction
7 Limiting distribution
  7.1 The max of Exp(λ)
  7.2 The min of U[0,1]
8 Useful things
  8.1 Jensen's theorem
  8.2 Stirling's formula
  8.3 Slutsky's theorem
  8.4 Delta method
  8.5 Central limit theorem
9 Mongolian coins problem
  9.1 Versions
  9.2 Questions

1 How to compute the expectation?

1.1 Tail

If X takes values in {0, 1, 2, ...}, then

    E(X) = Σ_{k≥1} P(X ≥ k).

1.2 Index

Write X = Σ_k I_k(ω), where each I_k(ω) is 0 or 1 (an indicator). Then

    E(X) = Σ_k E(I_k) = Σ_k P(I_k = 1).
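Both formulas are easy to sanity-check numerically. The sketch below (not part of the original notes) checks the tail formula on a fair six-sided die and the indicator formula on the expected number of fixed points of a random permutation; both examples are illustrative choices.

```python
import itertools
import math

# Tail formula: E(X) = sum_{k>=1} P(X >= k) for integer X >= 0.
# Illustrative example: X = value shown by a fair six-sided die.
pmf = {k: 1 / 6 for k in range(1, 7)}
mean_direct = sum(k * p for k, p in pmf.items())
mean_tail = sum(sum(p for j, p in pmf.items() if j >= k) for k in range(1, 7))

# Indicator formula: the number of fixed points of a uniform permutation of
# {0,...,3} is sum_k I{perm[k] = k}, so its mean is sum_k P(perm[k] = k)
# = 4 * (1/4) = 1.  Check by full enumeration over all 4! permutations.
n = 4
fixed_point_mean = sum(sum(1 for k in range(n) if p[k] == k)
                       for p in itertools.permutations(range(n))) / math.factorial(n)
```

Both checks agree with the closed forms: mean_direct and mean_tail are 3.5, and fixed_point_mean is 1.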
2 Permutations and combinations

2.1 Stars and bars

Theorem 2.1. The number of distinguishable ways that n indistinguishable balls can be distributed among r distinguishable boxes is

    C_{n+r−1}^{r−1}.

Corollary 2.1. If box i is required to hold at least m_i balls (with Σ_{i=1}^r m_i ≤ n), then the answer is

    C_{n − Σ_{i=1}^r m_i + r − 1}^{r−1}.

Corollary 2.2. There are C_{n−1}^{r−1} ways to distribute n identical balls to r boxes so that no box is empty.

2.2 Divided into parts

Theorem 2.2. Suppose n people are to be divided into r groups with n_i people in group i, where n_1 + ... + n_r = n. Then the number of ways to do this is the multinomial coefficient

    (n choose n_1, n_2, ..., n_r) = n! / (n_1! n_2! ... n_r!).

3 About the Exp(λ)

3.1 Basic facts

If X ~ Exp(λ), then

    f_X(t) = { 0,          if t ≤ 0,
             { λ e^{−λt},  if t > 0.        (3.1)

    F_X(t) = { 0,            if t ≤ 0,
             { 1 − e^{−λt},  if t > 0.      (3.2)

    E(X) = 1/λ,  E(X^k) = k!/λ^k,           (3.3)

    Var(X) = 1/λ².                          (3.4)
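The moment formula in (3.3)-(3.4) can be sanity-checked by direct numerical integration. The sketch below uses a plain midpoint rule; λ and the integration parameters are illustrative choices, not part of the notes.

```python
import math

# Midpoint-rule check of E(X^k) = k!/lam^k for X ~ Exp(lam).
# lam, the truncation point, and the step count are illustrative choices.
lam = 2.0

def exp_moment(k, upper=40.0, steps=200000):
    """Approximate the integral of t^k * lam * exp(-lam*t) over t > 0."""
    h = upper / steps
    return sum(((i + 0.5) * h) ** k * lam * math.exp(-lam * (i + 0.5) * h) * h
               for i in range(steps))

m1, m2 = exp_moment(1), exp_moment(2)   # targets: 1/lam and 2/lam^2
variance = m2 - m1 ** 2                 # target: 1/lam^2
```

With λ = 2 the targets are E(X) = 0.5, E(X²) = 0.5, and Var(X) = 0.25, and the midpoint rule reproduces them to several decimal places.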
3.2 Scaling

If X ~ Exp(λ), then cX ~ Exp(λ/c) for any constant c > 0.

3.3 Relation to Geo(p)

If X ~ Exp(λ), then

    P(0 < X ≤ 1) = 1 − e^{−λ},
    P(1 < X ≤ 2) = e^{−λ} − e^{−2λ} = e^{−λ}(1 − e^{−λ}),
    P(k < X ≤ k+1) = e^{−kλ}(1 − e^{−λ}).

So if Y is defined by P(Y = k) = P(k < X ≤ k+1), then Y ~ Geo(1 − e^{−λ}).

3.4 Memoryless property

If X ~ Exp(λ), then

    P(X > t+m | X > t) = e^{−λ(t+m)} / e^{−λt} = e^{−λm} = P(X > m).   (3.5)

Note that the right-hand side does not depend on t.

3.5 Hazard rate

Definition 3.1. If X ~ Exp(λ), then for small δ,

    P(X ∈ [t, t+δ) | X ≥ t) = (e^{−λt} − e^{−λ(t+δ)}) / e^{−λt} = 1 − e^{−λδ} ≈ δλ.

We call λ the hazard rate.

Definition 3.2. More generally, if a random variable T has density f(t) and cdf F(t), then its hazard rate is

    λ_T(t) = f(t) / (1 − F(t)).
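For Exp(λ) the hazard rate f(t)/(1 − F(t)) is the constant λ, which is the memoryless property (3.5) in disguise. A small check; the values of λ, t, and m below are illustrative:

```python
import math

# Hazard rate of Exp(lam): f(t)/(1 - F(t)) should equal lam at every t,
# and the memoryless property should hold exactly for the survival function.
lam = 1.5

def hazard(t):
    f = lam * math.exp(-lam * t)          # density
    F = 1 - math.exp(-lam * t)            # cdf
    return f / (1 - F)

rates = [hazard(t) for t in (0.1, 1.0, 5.0)]   # all equal lam

def surv(t):
    """P(X > t) for X ~ Exp(lam)."""
    return math.exp(-lam * t)

lhs = surv(2.0 + 0.7) / surv(2.0)   # P(X > t+m | X > t) with t=2, m=0.7
rhs = surv(0.7)                     # P(X > m)
```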
3.6 Comparison between independent variables

All random variables in this subsection are independent of each other.

3.6.1 Two variables with different λ

If X ~ Exp(λ₁) and Y ~ Exp(λ₂), then

    P(X < Y) = λ₁ / (λ₁ + λ₂).   (3.6)

Furthermore, let U = min(X, Y); then

    P(U > t) = P(X > t) P(Y > t) = e^{−(λ₁+λ₂)t},  so  U ~ Exp(λ₁ + λ₂).   (3.7)

3.6.2 Two variables with the same λ

Suppose X, Y ~ Exp(λ) are iid, and let D = X − Y and W = 2X − Y. Then

    f_D(t) = { (1/2) λ e^{−λt},  if t > 0,
             { (1/2) λ e^{λt},   if t < 0.     (3.8)

    f_W(t) = { (1/3) λ e^{λt},     if t < 0,
             { (1/3) λ e^{−λt/2},  if t > 0.   (3.9)

Furthermore, let U = min(X, Y), V = max(X, Y), and T = V − U; then T ~ Exp(λ). Note that T = |D|. Besides,

    X₁ / (X₁ + X₂) ~ U[0,1].

3.6.3 More than two variables with the same λ

Suppose X₁, ..., Xₙ, Y ~ Exp(λ) are iid. Then

    min{X₁, ..., Xₙ} ~ Exp(nλ),                 (3.10)
    P(max{X₁, ..., Xₙ} < Y) = 1/(n+1),          (3.11)
    P(X₁ + X₂ + ... + Xₙ < Y) = (1/2)^n.        (3.12)
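Equations (3.11) and (3.12) are easy to check by simulation. The sketch below uses arbitrary choices of λ, n, seed, and sample size:

```python
import random

# Monte Carlo check, for X_1..X_n, Y iid Exp(lam), of
#   P(max(X_1..X_n) < Y) = 1/(n+1)   and   P(X_1+...+X_n < Y) = (1/2)^n.
# lam, n, the seed, and the number of trials are illustrative choices.
random.seed(12345)
lam, n, trials = 1.0, 3, 200000

hits_max, hits_sum = 0, 0
for _ in range(trials):
    xs = [random.expovariate(lam) for _ in range(n)]
    y = random.expovariate(lam)
    hits_max += max(xs) < y
    hits_sum += sum(xs) < y

p_max = hits_max / trials   # target 1/(n+1) = 0.25
p_sum = hits_sum / trials   # target (1/2)^n = 0.125
```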
Furthermore, if X₁, ..., Xₙ ~ Exp(1) are iid, let Vₙ = max{X₁, X₂, ..., Xₙ} and consider the order statistics. The spacings are independent exponentials:

    Vₙ = X_(n) = X_(1) + (X_(2) − X_(1)) + ... + (X_(n) − X_(n−1))
       ~ Exp(n) + Exp(n−1) + ... + Exp(1),

so

    E(Vₙ) = 1/n + 1/(n−1) + ... + 1 ≈ ln(n),
    Var(Vₙ) = 1/n² + 1/(n−1)² + ... + 1 → π²/6  as n → ∞.

Remark 3.1. Eq. (3.11) holds whenever {X_i} and Y are iid with a continuous distribution, no matter which distribution.

Remark 3.2. In Eq. (3.12), X₁ + X₂ + ... + Xₙ ~ Gamma(n, λ).

4 Order statistics

4.1 Two variables with joint density

Suppose X, Y have joint density f_{XY}(x, y), and let U = min(X, Y), V = max(X, Y). Then

    f_{UV}(u, v) = { f_{XY}(u,v) + f_{XY}(v,u),  if u < v,
                   { 0,                          else.        (4.1)

Furthermore, if we are only interested in expectations, it is more convenient to use

    max(X, Y) = (X + Y)/2 + |X − Y|/2,   (4.2)
    min(X, Y) = (X + Y)/2 − |X − Y|/2.   (4.3)

4.2 n iid variables

Suppose X₁, X₂, ..., Xₙ are iid with density f(t) and cdf F(t). Then the density of each X_(k) is

    f_{X_(k)} = n C_{n−1}^{k−1} F^{k−1} f (1 − F)^{n−k},   (4.4)

and the cdf is

    F_{X_(k)} = Σ_{j=k}^{n} C_n^j F^j (1 − F)^{n−j}.   (4.5)

The joint density is

    f_{X_(1),X_(2),...,X_(n)}(x₁, x₂, ..., xₙ) = n! f(x₁) f(x₂) ... f(xₙ),  x₁ < x₂ < ... < xₙ.
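For U[0,1] samples, the density (4.4) is Beta(k, n−k+1), so E(X_(k)) = k/(n+1). A seeded Monte Carlo check; n and the sample size are illustrative choices:

```python
import random

# Check E(X_(k)) = k/(n+1) for the order statistics of n iid U[0,1] draws.
# The seed, n, and number of trials are arbitrary choices for this sketch.
random.seed(7)
n, trials = 5, 100000

sums = [0.0] * n
for _ in range(trials):
    xs = sorted(random.random() for _ in range(n))   # order statistics
    for i in range(n):
        sums[i] += xs[i]

means = [s / trials for s in sums]
targets = [k / (n + 1) for k in range(1, n + 1)]   # 1/6, 2/6, ..., 5/6
```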
4.3 Conditional case

Suppose X₁, X₂, ..., Xₙ ~ U[0,1] are iid. Then the joint density of X_(1), X_(2), ..., X_(n) is

    f_{X_(1),X_(2),...,X_(n)}(x₁, x₂, ..., xₙ) = { n!,  if 0 < x₁ < x₂ < ... < xₙ < 1,
                                                 { 0,   else.        (4.6)

The conditional density of X_(1), ..., X_(k−1), X_(k+1), ..., X_(n) given X_(k) = a, a ∈ [0,1], is

    f(x₁, ..., x_{k−1}, x_{k+1}, ..., xₙ | X_(k) = a)
      = f(x₁, ..., x_{k−1}, a, x_{k+1}, ..., xₙ) / ∫ f(x₁, ..., x_{k−1}, a, x_{k+1}, ..., xₙ) dx₁ ... dx_{k−1} dx_{k+1} ... dxₙ

on the region 0 < x₁ < ... < x_{k−1} < a < x_{k+1} < ... < xₙ < 1, and 0 elsewhere.

5 Poisson process

5.1 Gamma distribution

Definition 5.1 (Gamma function). Define

    Γ(x) = ∫₀^∞ e^{−y} y^{x−1} dy,  x > 0.

Remark 5.1. Special cases: Γ(n) = (n−1)! for n ∈ N; Γ(1/2) = √π, Γ(1) = 1.

Definition 5.2 (Gamma distribution). Say X ~ Gamma(α, λ) if its density is

    f_X(t) = { 0,                            if t ≤ 0,
             { λ e^{−λt} (λt)^{α−1} / Γ(α),  if t > 0.

Remark 5.2. E(X) = α/λ, E(X^k) = Γ(α+k) / (λ^k Γ(α)), Var(X) = α/λ².

5.2 Poisson distribution and binomial distribution

Definition 5.3 (Poisson distribution). Say X ~ Poisson(λ) if

    P(X = k) = e^{−λ} λ^k / k!,  k = 0, 1, 2, ....

Remark 5.3. E(X) = λ, Var(X) = λ.

Definition 5.4 (Binomial distribution). Say X ~ Binomial(n, p) if

    P(X = k) = C_n^k p^k (1 − p)^{n−k},  k = 0, 1, ..., n.

Theorem 5.1. Suppose X ~ Binomial(n, p). If n is very large while p is very small, then approximately X ~ Poisson(λ = np).
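Theorem 5.1 can be seen numerically by comparing the two pmfs; the particular n and p below are illustrative choices:

```python
import math

# Compare Binomial(n, p) with Poisson(np) for large n, small p.
# n and p are illustrative; here lam = np = 2.
n, p = 500, 0.004
lam = n * p

def binom_pmf(k):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Largest pointwise gap over the region carrying almost all of the mass.
max_gap = max(abs(binom_pmf(k) - poisson_pmf(k)) for k in range(20))
```

The gap is bounded by the classical estimate np² (here 0.008), and in practice is much smaller.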
5.3 Poisson process

5.3.1 Definition

Definition 5.5 (Poisson process). Suppose W_i ~ Exp(λ) are iid and T_n = Σ_{i=1}^n W_i. Then the T_n are the arrival times of a Poisson process with rate λ, and T_n ~ Gamma(n, λ).

Theorem 5.2. Suppose T_n is a Poisson process, and let N = the number of arrivals in the interval [a, b]. Then N ~ Poisson(λ(b−a)).

5.3.2 Comparison

Consider two independent Poisson processes: A with rate λ_A and waiting times X₁, X₂, ..., Xₙ, and B with rate λ_B and waiting times Y₁, Y₂, ..., Y_m. Then:

(I) P(X₁ + X₂ + ... + Xₙ < Y₁)
      = P(first n hits in the combined process are all A hits)
      = (λ_A / (λ_A + λ_B))^n.

(II) P(X₁ + X₂ + ... + Xₙ + 1 < Y₁)
      = P(first n hits in the combined process are all A hits, and no B hit in the next 1 time unit)
      = (λ_A / (λ_A + λ_B))^n e^{−λ_B}.

(III) P(X₁ + X₂ + X₃ < Y₁ + Y₂)
      = P(3rd A hit comes before 2nd B hit)
      = P(at least 3 of the first 4 hits are A hits)
      = C_4^3 (λ_A / (λ_A + λ_B))³ (λ_B / (λ_A + λ_B)) + C_4^4 (λ_A / (λ_A + λ_B))⁴.

Remark 5.4. Compare these results with Section 3.6.3.
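Formula (III) can be checked by simulating the two waiting-time sums directly; the rates, seed, and sample size below are arbitrary choices:

```python
import random

# Monte Carlo check of (III): P(X1+X2+X3 < Y1+Y2) = C(4,3) p^3 q + p^4,
# where p = lam_a/(lam_a+lam_b), q = 1-p.  Rates, seed, trials are illustrative.
random.seed(2024)
lam_a, lam_b, trials = 2.0, 1.0, 200000

p = lam_a / (lam_a + lam_b)
target = 4 * p ** 3 * (1 - p) + p ** 4   # = 16/27 for these rates

hits = 0
for _ in range(trials):
    a = sum(random.expovariate(lam_a) for _ in range(3))   # time of 3rd A hit
    b = sum(random.expovariate(lam_b) for _ in range(2))   # time of 2nd B hit
    hits += a < b

estimate = hits / trials
```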
6 Normal statistics

6.1 Independent case

Definition 6.1 (χ² distribution). Say X ~ χ²(n) if

    X = x₁² + x₂² + ... + xₙ²,

where x₁, x₂, ..., xₙ are iid N(0,1).

Remark 6.1. χ²(n) = Gamma(n/2, 1/2), and χ²(2) = Exp(1/2).

Theorem 6.1. Let X₁, X₂, ..., Xₙ be iid N(µ, σ²), and define

    X̄ = (1/n) Σ_{i=1}^n X_i,                  (6.1)
    S² = (1/(n−1)) Σ_{i=1}^n (X_i − X̄)².      (6.2)

Then

    X̄ ~ N(µ, σ²/n),                           (6.3)
    (n−1)S²/σ² ~ χ²(n−1).                      (6.4)

6.2 Dependent case

Suppose (X, Y) has the standard bivariate normal distribution with correlation ρ. Then

    P(X > 0, Y > 0) = P(X > 0, ρX + √(1−ρ²) Z > 0)
                    = P(X > 0, Z > −ρX/√(1−ρ²))
                    = (π/2 + arctan(ρ/√(1−ρ²))) / (2π),

where Z ~ N(0,1) is independent of X.
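Since arctan(ρ/√(1−ρ²)) = arcsin(ρ), the orthant probability equals 1/4 + arcsin(ρ)/(2π). A seeded simulation using the same representation Y = ρX + √(1−ρ²)Z; ρ, the seed, and the sample size are illustrative choices:

```python
import math
import random

# Check P(X>0, Y>0) = (pi/2 + arctan(rho/sqrt(1-rho^2))) / (2*pi) by simulating
# Y = rho*X + sqrt(1-rho^2)*Z with X, Z iid N(0,1).  rho = 0.5 gives exactly 1/3.
random.seed(99)
rho, trials = 0.5, 200000

closed_form = (math.pi / 2 + math.atan(rho / math.sqrt(1 - rho ** 2))) / (2 * math.pi)

hits = 0
for _ in range(trials):
    x, z = random.gauss(0, 1), random.gauss(0, 1)
    y = rho * x + math.sqrt(1 - rho ** 2) * z
    hits += x > 0 and y > 0

estimate = hits / trials
```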
6.3 Prediction

Suppose X, Y have correlation ρ. Then the (best linear) prediction of Y based on X is

    Ŷ* = ρ X*,  i.e.  Ŷ = µ_Y + ρ X* σ_Y,

where X* = (X − µ_X)/σ_X and Y* = (Y − µ_Y)/σ_Y.

7 Limiting distribution

Idea: use (1 − x/n)^n → e^{−x}.

7.1 The max of Exp(λ)

Suppose X₁, ..., Xₙ ~ Exp(1) are iid, and let Vₙ = max{X₁, X₂, ..., Xₙ}. Then

    F_{Vₙ}(t) = P(Vₙ ≤ t) = (1 − e^{−t})^n,                                   (7.1)
    P(Vₙ − ln(n) ≤ t) = (1 − e^{−(t+ln n)})^n = (1 − e^{−t}/n)^n → e^{−e^{−t}}.  (7.2)

Hence, for n sufficiently large,

    E(Vₙ) ≈ ln(n),                           (7.3)
    Median(Vₙ) ≈ ln(n) − ln(ln(2)).          (7.4)

Remark 7.1. Compare these results with Section 3.6.3.

7.2 The min of U[0,1]

Suppose X₁, ..., Xₙ ~ U[0,1] are iid, and let Uₙ = min{X₁, X₂, ..., Xₙ}. Then

    F_{Uₙ}(t) = 1 − P(Uₙ > t) = 1 − (1 − t)^n,                       (7.5)
    F_{nUₙ}(t) = 1 − P(Uₙ > t/n) = 1 − (1 − t/n)^n → 1 − e^{−t}.     (7.6)

This shows that nUₙ is approximately Exp(1) for n sufficiently large.
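The convergence of nUₙ to Exp(1) can be seen by comparing the empirical cdf of simulated values with 1 − e^{−t}; n, the seed, and the sample size below are illustrative choices:

```python
import math
import random

# Empirical check that n * min(U_1..U_n) is approximately Exp(1):
# compare the empirical cdf at a few points with 1 - e^{-t}.
# n, seed, and the number of trials are illustrative choices.
random.seed(5)
n, trials = 200, 20000

samples = [n * min(random.random() for _ in range(n)) for _ in range(trials)]

def ecdf(t):
    return sum(s <= t for s in samples) / trials

gaps = [abs(ecdf(t) - (1 - math.exp(-t))) for t in (0.5, 1.0, 2.0)]
```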
8 Useful things

8.1 Jensen's theorem

Theorem 8.1 (Jensen). Suppose Q : R → R is convex. Then

    Q(E(X)) ≤ E(Q(X)).

Remark 8.1. Taking Q(x) = x² gives (EX)² ≤ E(X²), i.e. Var(X) ≥ 0.

8.2 Stirling's formula

    n! = √(2πn) (n/e)^n e^{θ/(12n)},  θ ∈ (0,1).   (8.1)

8.3 Slutsky's theorem

Theorem 8.2 (Slutsky). If Xₙ → X (D), Yₙ → a (P), and Wₙ → b (P), then

    Yₙ Xₙ + Wₙ → aX + b (D).

8.4 Delta method

Suppose aₙ → ∞ and aₙ(Wₙ − b) → X (D). Let g : R → R be differentiable at b. Then

    aₙ(g(Wₙ) − g(b)) → g′(b) X (D).

8.5 Central limit theorem

Theorem 8.3 (Central limit theorem). Suppose X₁, X₂, ..., Xₙ are iid with mean µ and variance σ². Then

    (X₁ + X₂ + ... + Xₙ − nµ) / (σ√n) → N(0,1) (D).

9 Mongolian coins problem

9.1 Versions

Suppose that on each toss, P(Head) = θ, where θ ~ U[0,1]. Let N = the number of Heads in n tosses.

Version I:

    P(N = k) = ∫₀¹ C_n^k θ^k (1 − θ)^{n−k} dθ.
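The integral above can also be evaluated exactly with rational arithmetic, expanding (1−θ)^{n−k} binomially and integrating term by term; the value 1/(n+1) then comes out without any order-statistics argument. A small sketch (illustrative n):

```python
from fractions import Fraction
from math import comb

# Exact evaluation of integral_0^1 C(n,k) t^k (1-t)^(n-k) dt via the binomial
# expansion of (1-t)^(n-k): each term t^(k+j) integrates to 1/(k+j+1).
def coin_mass(n, k):
    total = Fraction(0)
    for j in range(n - k + 1):
        total += Fraction(comb(n - k, j) * (-1) ** j, k + j + 1)
    return comb(n, k) * total

# For n = 6 every k = 0..6 should give exactly 1/7.
values = [coin_mass(6, k) for k in range(7)]
```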
Recall the order statistics: the density of U_(k+1) from U₁, ..., U_{n+1} iid U[0,1] is

    f_(k+1)(t) = C_{n+1}^1 C_n^k t^k (1 − t)^{n−k} = (n+1) C_n^k t^k (1 − t)^{n−k},

which integrates to 1. Therefore

    P(N = k) = ∫₀¹ C_n^k θ^k (1 − θ)^{n−k} dθ = 1/(n+1),  k = 0, 1, ..., n.

Version II: Let U₀, U₁, ..., Uₙ be iid U[0,1]. Set θ = U₀ and X_k = I{U_k < U₀}. Then

    N = Σ_{k=1}^n X_k = the number of U₁, ..., Uₙ that are < U₀,
    P(N = k) = P(U₀ is the U_(k+1) of U₀, U₁, ..., Uₙ) = 1/(n+1).

9.2 Questions

Given θ, let P(X_i = 1) = P(the ith toss is a Head) = θ, so that X₁, X₂, ..., Xₙ are iid Bernoulli(θ).

(i) Compute P(X₃ = 1 | X₁ = X₂ = 1).

Method one:

    P(X₃ = 1 | X₁ = X₂ = 1) = P(X₁ = X₂ = X₃ = 1) / P(X₁ = X₂ = 1)
                            = E(P(X₁ = X₂ = X₃ = 1 | θ)) / E(P(X₁ = X₂ = 1 | θ))
                            = ∫₀¹ θ³ dθ / ∫₀¹ θ² dθ = (1/4) / (1/3) = 3/4.

Method two: consider the relative order of U₀, U₁, U₂, U₃; this gives the same answer.

(ii) Compute P(θ < 1/2 | X₁ = X₂ = 1).

    P(θ < 1/2 | X₁ = X₂ = 1) = ∫₀^{1/2} θ² dθ / ∫₀¹ θ² dθ = (1/24) / (1/3) = 1/8.
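Both answers can be checked by simulating θ and the tosses directly; the seed and sample size are arbitrary choices for the sketch:

```python
import random

# Monte Carlo check of P(X3=1 | X1=X2=1) = 3/4 and P(theta<1/2 | X1=X2=1) = 1/8,
# drawing theta ~ U[0,1] and then Bernoulli(theta) tosses.
random.seed(42)
trials = 400000

cond_count = 0    # trials with X1 = X2 = 1
third_head = 0    # ...where additionally X3 = 1
small_theta = 0   # ...where additionally theta < 1/2
for _ in range(trials):
    theta = random.random()
    if random.random() < theta and random.random() < theta:   # X1 = X2 = 1
        cond_count += 1
        third_head += random.random() < theta                 # X3 = 1
        small_theta += theta < 0.5

p_third = third_head / cond_count   # target 3/4
p_small = small_theta / cond_count  # target 1/8
```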
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationChapter 2 Continuous Distributions
Chapter Continuous Distributions Continuous random variables For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following
More information1 Solution to Problem 2.1
Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto
More informationPhysics 403 Probability Distributions II: More Properties of PDFs and PMFs
Physics 403 Probability Distributions II: More Properties of PDFs and PMFs Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Last Time: Common Probability Distributions
More informationBivariate Normal Distribution
.0. TWO-DIMENSIONAL RANDOM VARIABLES 47.0.7 Bivariate Normal Distribution Figure.: Bivariate Normal pdf Here we use matrix notation. A bivariate rv is treated as a random vector X X =. The expectation
More informationStatistics (1): Estimation
Statistics (1): Estimation Marco Banterlé, Christian Robert and Judith Rousseau Practicals 2014-2015 L3, MIDO, Université Paris Dauphine 1 Table des matières 1 Random variables, probability, expectation
More information18 Bivariate normal distribution I
8 Bivariate normal distribution I 8 Example Imagine firing arrows at a target Hopefully they will fall close to the target centre As we fire more arrows we find a high density near the centre and fewer
More informationStat 5101 Notes: Brand Name Distributions
Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform
More informationE X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.
E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More information