Math 180B Homework 9 Solutions


Problem 1 (Pinsky & Karlin, Exercise 5.1.3). Let $X$ and $Y$ be independent Poisson distributed random variables with parameters $\alpha$ and $\beta$, respectively. Determine the conditional distribution of $X$, given that $N = X + Y = n$.

Solution. Note that $N = X + Y$ has the Poisson distribution with parameter $\alpha + \beta$, and that $N \geq X$, so if $k > n$, then $P(X = k \mid N = n) = 0$. If $0 \leq k \leq n$, then by the independence of $X$ and $Y$ we have
$$P(X = k,\, N = n) = P(X = k,\, Y = n - k) = P(X = k)\,P(Y = n - k) = \frac{e^{-\alpha} \alpha^k}{k!} \cdot \frac{e^{-\beta} \beta^{n-k}}{(n-k)!},$$
so
$$P(X = k \mid N = n) = \frac{P(X = k,\, X + Y = n)}{P(N = n)} = \frac{e^{-\alpha} \alpha^k / k! \,\cdot\, e^{-\beta} \beta^{n-k} / (n-k)!}{e^{-(\alpha+\beta)} (\alpha + \beta)^n / n!} = \binom{n}{k} \frac{\alpha^k \beta^{n-k}}{(\alpha + \beta)^n} = \binom{n}{k} \left( \frac{\alpha}{\alpha + \beta} \right)^k \left( 1 - \frac{\alpha}{\alpha + \beta} \right)^{n-k}.$$
That is, given $N = n$, the random variable $X$ has the binomial distribution with parameters $n$ and $\alpha/(\alpha + \beta)$.

Problem 2 (Pinsky & Karlin, Exercise 5.1.5). Suppose that a random variable $X$ is distributed according to a Poisson distribution with parameter $\lambda$. The parameter $\lambda$ is itself a random variable, exponentially distributed with density $f(x) = \theta e^{-\theta x}$ for $x \geq 0$. Find the probability mass function for $X$.

Solution. Stated precisely, the assumption of the problem is that if $x \geq 0$ and $k = 0, 1, 2, \dots$, then
$$P(X = k \mid \lambda = x) = \frac{e^{-x} x^k}{k!}.$$

By the law of total probability, we have
$$P(X = k) = \int_0^\infty P(X = k \mid \lambda = x) f(x)\, dx = \int_0^\infty \frac{e^{-x} x^k}{k!}\, \theta e^{-\theta x}\, dx = \frac{\theta}{k!} \int_0^\infty x^k e^{-(\theta + 1)x}\, dx.$$
Making the change of variables $u = (\theta + 1)x$, simplifying, and using the fact that $\int_0^\infty u^k e^{-u}\, du = \Gamma(k + 1) = k!$, we get
$$P(X = k) = \frac{\theta}{(\theta + 1)^{k+1}} \cdot \frac{1}{k!} \int_0^\infty u^k e^{-u}\, du = \frac{\theta}{(\theta + 1)^{k+1}} = \frac{\theta}{\theta + 1} \left( 1 - \frac{\theta}{\theta + 1} \right)^k.$$
Thus, $X$ has the geometric distribution with parameter $\theta/(\theta + 1)$.

Problem 3 (Pinsky & Karlin, Exercise 5.1.9). Let $\{X(t) : t \geq 0\}$ be a Poisson process having rate parameter $\lambda = 2$. Determine the following expectations:

(a) $E[X(2)]$.
(b) $E[(X(1))^2]$.
(c) $E[X(1)X(2)]$.

Solution. (a) Since $X(2) \sim \mathrm{Poisson}(4)$, we have $E[X(2)] = 4$.

(b) Since $X(1) \sim \mathrm{Poisson}(2)$, we have $E[X(1)] = 2$ and $\mathrm{Var}(X(1)) = 2$, so
$$E[(X(1))^2] = \mathrm{Var}(X(1)) + (E[X(1)])^2 = 2 + 4 = 6.$$

(c) Since a Poisson process has independent increments, the random variables $X(1)$ and $X(2) - X(1)$ are independent. Moreover, $X(2) - X(1) \sim \mathrm{Poisson}(2)$. Therefore, we have
$$E[X(1)X(2)] = E\big[(X(1))^2 + X(1)\big(X(2) - X(1)\big)\big] = E[(X(1))^2] + E[X(1)]\,E[X(2) - X(1)] = 6 + 2 \cdot 2 = 10.$$
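Remark (not part of the assigned solutions). The closed forms derived in Problems 1 and 2 are easy to sanity-check by simulation. The sketch below, in Python with numpy, compares the empirical conditional distribution of $X$ given $X + Y = n$ with the binomial pmf, and the exponential-mixed Poisson count with the geometric pmf; the parameter values $\alpha = 2$, $\beta = 3$, $\theta = 1.5$, $n = 4$ and the sample size are arbitrary choices made only for illustration.

```python
# Monte Carlo sanity check of Problems 1 and 2 (illustrative only).
import math
import numpy as np

rng = np.random.default_rng(0)
trials = 200_000

# Problem 1: X | X + Y = n should be Binomial(n, alpha / (alpha + beta)).
alpha, beta, n = 2.0, 3.0, 4
X = rng.poisson(alpha, trials)
Y = rng.poisson(beta, trials)
cond = X[X + Y == n]                      # samples of X on the event N = n
p = alpha / (alpha + beta)
for k in range(n + 1):
    empirical = np.mean(cond == k)
    exact = math.comb(n, k) * p**k * (1 - p) ** (n - k)
    print(f"P(X={k} | N={n}): simulated {empirical:.4f}, binomial {exact:.4f}")

# Problem 2: Poisson(lambda) with lambda ~ Exp(theta) should be geometric
# with success probability theta / (theta + 1), supported on 0, 1, 2, ...
theta = 1.5
lam = rng.exponential(1 / theta, trials)  # numpy parametrizes Exp by its mean 1/theta
Xmix = rng.poisson(lam)
q = theta / (theta + 1)
for k in range(5):
    empirical = np.mean(Xmix == k)
    exact = q * (1 - q) ** k
    print(f"P(X={k}): simulated {empirical:.4f}, geometric {exact:.4f}")
```

With these sample sizes the simulated and exact probabilities typically agree to two or three decimal places.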

Problem 4 (Pinsky & Karlin, Problem 5.1.2). Suppose that minor defects are distributed over the length of a cable as a Poisson process with rate $\alpha$, and that, independently, major defects are distributed over the cable according to a Poisson process of rate $\beta$. Let $X(t)$ be the number of defects, either major or minor, in the cable up to length $t$. Argue that $X(t)$ must be a Poisson process of rate $\alpha + \beta$.

Solution. Let $Y(t)$ be the number of minor defects in the cable up to length $t$, and let $Z(t)$ be the number of major defects in the cable up to length $t$. Then $\{Y(t) : t \geq 0\}$ and $\{Z(t) : t \geq 0\}$ are independent Poisson processes with rates $\alpha$ and $\beta$, respectively, and $X(t) = Y(t) + Z(t)$ for all $t \geq 0$. We now verify that $\{X(t) : t \geq 0\}$ is a Poisson process of rate $\alpha + \beta$.

First, we have $X(0) = Y(0) + Z(0) = 0$.

Next, if $s \geq 0$ and $t > 0$, then $Y(s + t) - Y(s) \sim \mathrm{Poisson}(\alpha t)$ and $Z(s + t) - Z(s) \sim \mathrm{Poisson}(\beta t)$. Since $Y(s + t) - Y(s)$ and $Z(s + t) - Z(s)$ are independent, and a sum of independent Poisson random variables is Poisson with the summed parameter, it follows that
$$X(s + t) - X(s) = \big(Y(s + t) - Y(s)\big) + \big(Z(s + t) - Z(s)\big) \sim \mathrm{Poisson}\big((\alpha + \beta)t\big).$$

Finally, suppose we have time points $t_0 < t_1 < \dots < t_n$. Then the random variables $Y(t_1) - Y(t_0), \dots, Y(t_n) - Y(t_{n-1}), Z(t_1) - Z(t_0), \dots, Z(t_n) - Z(t_{n-1})$ are independent, so since $X(t_i) - X(t_{i-1}) = \big(Y(t_i) - Y(t_{i-1})\big) + \big(Z(t_i) - Z(t_{i-1})\big)$ for all $i = 1, \dots, n$, it follows that the random variables $X(t_1) - X(t_0), \dots, X(t_n) - X(t_{n-1})$ are independent.

We conclude that $\{X(t) : t \geq 0\}$ is a Poisson process of rate $\alpha + \beta$.
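Remark (not part of the assigned solution). The superposition property is easy to see empirically: after merging simulated minor-defect and major-defect processes, the counts in disjoint unit intervals should have mean and variance both close to $\alpha + \beta$. The rates and the cable length in this sketch are arbitrary choices.

```python
# Illustrative check that merging two independent Poisson processes gives a
# Poisson process of rate alpha + beta: increments over unit intervals of the
# merged process should have mean and variance near alpha + beta.
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, length = 1.3, 0.7, 100_000

# Simulate event locations of each process on [0, length) via exponential gaps.
minor = np.cumsum(rng.exponential(1 / alpha, int(2 * alpha * length)))
major = np.cumsum(rng.exponential(1 / beta, int(2 * beta * length)))
merged = np.sort(np.concatenate([minor[minor < length], major[major < length]]))

# Counts of merged events in the unit intervals [i, i+1).
counts = np.bincount(merged.astype(int), minlength=length)
print("mean increment:    ", counts.mean(), " target:", alpha + beta)
print("variance increment:", counts.var(), " target:", alpha + beta)
```

The equality of the empirical mean and variance is consistent with the Poisson law of the increments; it is a check, not a proof.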

Problem 5 (Pinsky & Karlin, Problem 5.1.3). The generating function of a probability mass function $p_k = P(X = k)$, for $k = 0, 1, \dots$, is defined by
$$g_X(s) = E[s^X] = \sum_{k=0}^\infty p_k s^k \quad \text{for } |s| < 1.$$
Show that the generating function for a Poisson random variable $X$ with mean $\mu$ is given by $g_X(s) = e^{-\mu(1 - s)}$.

Solution. If $X \sim \mathrm{Poisson}(\mu)$, then we just compute its generating function:
$$g_X(s) = \sum_{k=0}^\infty P(X = k) s^k = \sum_{k=0}^\infty \frac{e^{-\mu} \mu^k}{k!} s^k = e^{-\mu} \sum_{k=0}^\infty \frac{(\mu s)^k}{k!} = e^{-\mu} e^{\mu s} = e^{-\mu(1 - s)}.$$

Problem 6 (Pinsky & Karlin, Problem 5.1.9). Arrivals of passengers at a bus stop form a Poisson process $X(t)$ with rate $\lambda = 2$ per unit time. Assume that a bus departed at time $t = 0$ leaving no customers behind. Let $T$ denote the arrival time of the next bus. Then, the number of passengers present when it arrives is $X(T)$. Suppose that the bus arrival time $T$ is independent of the Poisson process and that $T$ has the uniform probability density function
$$f_T(t) = \begin{cases} 1 & \text{for } 0 \leq t \leq 1, \\ 0 & \text{otherwise.} \end{cases}$$

(a) Determine $E[X(T) \mid T = t]$ and $E[(X(T))^2 \mid T = t]$.
(b) Determine the mean $E[X(T)]$ and variance $\mathrm{Var}[X(T)]$.

Solution. (a) By the independence of $T$ and $\{X(t) : t \geq 0\}$, we have
$$E[X(T) \mid T = t] = E[X(t) \mid T = t] = E[X(t)] = 2t$$

and
$$E[(X(T))^2 \mid T = t] = E[(X(t))^2 \mid T = t] = E[(X(t))^2] = \mathrm{Var}(X(t)) + (E[X(t)])^2 = 2t + 4t^2.$$

(b) By the law of total probability, we have
$$E[X(T)] = \int_0^1 E[X(T) \mid T = t] f_T(t)\, dt = \int_0^1 2t\, dt = 1.$$
Similarly,
$$E[(X(T))^2] = \int_0^1 \big(4t^2 + 2t\big)\, dt = \frac{7}{3},$$
so
$$\mathrm{Var}(X(T)) = E[(X(T))^2] - (E[X(T)])^2 = \frac{7}{3} - 1 = \frac{4}{3}.$$

Problem 7 (Pinsky & Karlin, Exercise 5.2.4). Suppose that a book of 600 pages contains a total of 240 typographical errors. Develop a Poisson approximation for the probability that three particular successive pages are error-free.

Solution. The typographical errors in the book can be modeled as the arrivals of a Poisson process with rate $240/600 = 0.4$ errors per page. If $N$ is the number of typographical errors in three particular successive pages, then $N$ approximately has the Poisson distribution with rate $3 \cdot 0.4 = 1.2$, so $P(N = 0) \approx e^{-1.2} \approx 0.301$.

Problem 8 (Pinsky & Karlin, Problem 5.2.4). Suppose that $N$ points are uniformly distributed over the interval $[0, N)$. Determine the probability distribution for the number of points in the interval $[0, 1)$ as $N \to \infty$.

Solution. Let $N$ be a positive integer, let $U_1, \dots, U_N \sim \mathrm{Uniform}([0, N))$ be independent random variables, and for $i = 1, \dots, N$, let $X_i$ be the indicator random variable of the event $\{U_i \in [0, 1)\}$. That is,
$$X_i = \mathbf{1}_{\{U_i \in [0,1)\}} = \begin{cases} 1 & \text{if } U_i \in [0, 1), \\ 0 & \text{otherwise.} \end{cases}$$

Since $P(U_i \in [0, 1)) = 1/N$, we have $X_i \sim \mathrm{Bernoulli}(1/N)$ for each $i$, and $X_1, \dots, X_N$ are independent. If $X$ denotes the number of the $U_i$'s in the interval $[0, 1)$, then
$$X = X_1 + \dots + X_N \sim \mathrm{Binomial}(N, 1/N).$$
By the Poisson approximation to the binomial distribution, it follows that as $N \to \infty$, $X$ approximately has the Poisson distribution with rate $N \cdot 1/N = 1$.

Problem 9 (Pinsky & Karlin, Problem 5.2.5). Suppose that $N$ points are uniformly distributed over the surface of a circular disk of radius $r$. Determine the probability distribution for the number of points within a distance of one of the origin as $N \to \infty$ and $r \to \infty$ with $N/(\pi r^2) \to \lambda$.

Solution. To simplify notation, let $D_R$ denote the disk of radius $R$ centered at the origin in $\mathbb{R}^2$. Let $N$ be a positive integer, and let $r > 0$. Let $U_1, \dots, U_N$ be independent random variables distributed uniformly over $D_r$. That is, each $U_i$ has density
$$f(x, y) = \begin{cases} \dfrac{1}{\pi r^2} & \text{if } x^2 + y^2 \leq r^2, \\ 0 & \text{otherwise.} \end{cases}$$
For $i = 1, \dots, N$, let $X_i$ be the indicator random variable of the event $\{U_i \in D_1\}$. That is,
$$X_i = \mathbf{1}_{\{U_i \in D_1\}} = \begin{cases} 1 & \text{if } U_i \in D_1, \\ 0 & \text{otherwise.} \end{cases}$$
Since
$$P(U_i \in D_1) = \iint_{D_1} \frac{1}{\pi r^2}\, dx\, dy = \frac{1}{r^2},$$
we have $X_i \sim \mathrm{Bernoulli}(1/r^2)$ for each $i$, and $X_1, \dots, X_N$ are independent. If $X$ denotes the number of the $U_i$'s within distance $1$ of the origin, then
$$X = X_1 + \dots + X_N \sim \mathrm{Binomial}(N, 1/r^2).$$
By the Poisson approximation to the binomial distribution, it follows that as $N \to \infty$ and $r \to \infty$ with $N/(\pi r^2) \to \lambda$, $X$ approximately has the Poisson distribution with rate $N \cdot 1/r^2 \to \pi \lambda$.
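Remark (not part of the assigned solutions). Both limits can be illustrated numerically: for a large but finite $N$ (and $r$), the empirical distribution of the count should already be close to the limiting Poisson law. The finite values of $N$, $r$, and $\lambda$ below are arbitrary choices for illustration.

```python
# Illustrative simulation of the Poisson limits in Problems 8 and 9.
import math
import numpy as np

rng = np.random.default_rng(2)
trials = 20_000

# Problem 8: N uniform points on [0, N); count those in [0, 1).  Target: Poisson(1).
N = 200
hits = (rng.uniform(0, N, (trials, N)) < 1).sum(axis=1)
for k in range(4):
    print(f"P({k} points in [0,1)): simulated {np.mean(hits == k):.4f}, "
          f"Poisson(1) {math.exp(-1) / math.factorial(k):.4f}")

# Problem 9: N points uniform on the disk of radius r, with N about lambda*pi*r^2;
# count those within distance 1 of the origin.  Target: Poisson(pi*lambda).
lam, r = 0.5, 15.0
N = int(lam * math.pi * r**2)
# Only the distance from the origin matters: for a uniform point on the disk,
# the radius can be sampled as r * sqrt(U) with U uniform on [0, 1].
radius = r * np.sqrt(rng.uniform(0, 1, (trials, N)))
hits2 = (radius < 1).sum(axis=1)
mu = math.pi * lam
for k in range(4):
    print(f"P({k} points in unit disk): simulated {np.mean(hits2 == k):.4f}, "
          f"Poisson(pi*lambda) {math.exp(-mu) * mu**k / math.factorial(k):.4f}")
```

The agreement improves as $N$ and $r$ grow, exactly as the Poisson approximation to the binomial predicts.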

Problem 10 (Pinsky & Karlin, Problem ). Let $X$ and $Y$ be jointly distributed random variables and $B$ an arbitrary set. Fill in the details that justify the inequality
$$|P(X \in B) - P(Y \in B)| \leq P(X \neq Y).$$

Solution. We have
$$\{X \in B\} \subseteq \{X \in B, Y \in B\} \cup \{X \in B, Y \notin B\} \subseteq \{Y \in B\} \cup \{X \neq Y\},$$
and hence
$$P(X \in B) \leq P\big(\{Y \in B\} \cup \{X \neq Y\}\big) \leq P(Y \in B) + P(X \neq Y).$$
Therefore,
$$P(X \in B) - P(Y \in B) \leq P(X \neq Y).$$
Switching the roles of $X$ and $Y$ in the argument above, we also have
$$P(Y \in B) - P(X \in B) \leq P(X \neq Y),$$
and hence
$$|P(X \in B) - P(Y \in B)| \leq P(X \neq Y).$$
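Remark (not part of the assigned solution). The coupling inequality can also be checked numerically for any concrete dependent pair. In the sketch below, $Y$ is Poisson and $X$ agrees with $Y$ except on a small resampling event; the test set $B$, the resampling probability, and the Poisson means are arbitrary choices made only for illustration.

```python
# Illustrative numerical check of |P(X in B) - P(Y in B)| <= P(X != Y).
import numpy as np

rng = np.random.default_rng(3)
trials, eps, mu = 200_000, 0.1, 3.0

Y = rng.poisson(mu, trials)
resample = rng.uniform(size=trials) < eps
# X equals Y except where resampled, so X differs from Y with probability at most eps.
X = np.where(resample, rng.poisson(2 * mu, trials), Y)

B = [0, 1, 2]                              # an arbitrary test set
lhs = abs(np.mean(np.isin(X, B)) - np.mean(np.isin(Y, B)))
rhs = np.mean(X != Y)
print(f"|P(X in B) - P(Y in B)| = {lhs:.4f} <= P(X != Y) = {rhs:.4f}: {lhs <= rhs}")
```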
