Eleventh Problem Assignment


EECS, April 2007

PROBLEM 1 (20 points)

The outcomes of successive flips of a particular coin are dependent and are found to be described fully by the conditional probabilities

    P(H_{n+1} | H_n) = 3/4,    P(T_{n+1} | T_n) = 2/3,

where we have used the notation: Event H_k: heads on the kth toss; Event T_k: tails on the kth toss. We know that the first toss came up heads.

(a) Determine the probability that the first tail will occur on the kth toss (k = 2, 3, 4, ...).

The first tail occurs on toss k exactly when tosses 2 through k - 1 come up heads and toss k comes up tails. Hence

    p_K(k) = (3/4)^{k-2} (1/4),    k = 2, 3, 4, ...

(b) What is the probability that flip 150 will come up heads?

We can assume that by the 150th toss the system is in steady state, so the probability of heads is the steady-state probability p_H. Solving

    p_H = (3/4) p_H + (1/3) p_T,    p_H + p_T = 1,

we get p_H = 4/7 and p_T = 3/7. Thus the probability of getting a head on the 150th flip is 4/7.
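As a quick numerical check of part (b), one can simply iterate the heads probability of the two-state chain; this is a sketch, taking the transition probabilities P(H_{n+1}|H_n) = 3/4 and P(T_{n+1}|T_n) = 2/3 as reconstructed above:

```python
# Iterate the heads probability of the chain P(H->H) = 3/4, P(T->H) = 1/3,
# starting from a certain head, and compare with the steady-state value 4/7.
from fractions import Fraction

def step(p_h):
    """One update of the heads probability under the chain."""
    p_t = 1 - p_h
    return Fraction(3, 4) * p_h + Fraction(1, 3) * p_t

p_h = Fraction(1)  # the first toss came up heads
for _ in range(50):  # the chain is essentially in steady state long before toss 150
    p_h = step(p_h)

print(float(p_h))  # ~0.5714, i.e. 4/7
```

The convergence is geometric (the second eigenvalue of this chain is 5/12), so 50 steps is already far more than enough.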

(c) What is the probability that flip 150 will come up heads and flip 152 will also come up heads?

    Pr(H_150 ∩ H_152) = Pr(H_150) Pr(H_152 | H_150)
                      = (4/7) [(3/4)^2 + (1/4)(1/3)]
                      = (4/7)(31/48) = 31/84 ≈ 0.369,

where (3/4)^2 accounts for the path heads-heads and (1/4)(1/3) for the path tails-heads over the two intervening transitions.

(d) Given that flips 150, 151, ..., 150 + m all have the same result, what is the probability that all of these outcomes are heads? Simplify your answer as much as possible, and interpret your result for large values of m.

Let A (resp. B) denote the event that flips 150, 151, ..., 150 + m are all heads (resp. all tails). We need to find

    Pr(A | A ∪ B) = Pr(A) / [Pr(A) + Pr(B)]
                  = (4/7)(3/4)^m / [(4/7)(3/4)^m + (3/7)(2/3)^m]
                  = 1 / [1 + (3/4)(8/9)^m].

For large values of m, consider the limit as m goes to infinity. Since (8/9)^m → 0,

    lim_{m→∞} Pr(A | A ∪ B) = 1.

This means that for large m, having all heads is much more likely than having all tails.

(e) We are told that the 75th head just occurred on the 150th toss. Determine the expected value of the number of additional flips required until we observe the 79th head.

We know that the 150th toss is a head. Thus we need the expected number of tosses required for four more heads, starting from a head. First consider the expected number of tosses required for the next head, given that the previous toss was a head. Let K be the number of tosses required for the next head. We have:

    p_K(k) = 3/4                      for k = 1,
    p_K(k) = (1/4)(1/3)(2/3)^{k-2}    for k = 2, 3, 4, ...

Thus,

    E[K] = (3/4)(1) + Σ_{k=2}^∞ k (1/4)(1/3)(2/3)^{k-2} = 3/4 + 1 = 7/4.

(Equivalently: with probability 3/4 the next toss is an immediate head, taking one toss; with probability 1/4 we get a tail, after which the number of further tosses until a head is geometric with parameter 1/3, so the conditional expectation is 1 + 3 = 4 tosses. Hence E[K] = (3/4)(1) + (1/4)(4) = 7/4.)

The expected number of tosses until the fourth head is then 4 E[K] = 7.

PROBLEM 2 (20 points)

Consider a circular random walk in which six points 1, 2, 3, 4, 5, 6 are placed, in clockwise order, on a circle. Suppose that one-step transitions are possible only from a point to its two adjacent points, with equal probabilities 1/2. Starting from point 1,

(a) find the probability that in 4 transitions the Markov chain returns to 1.

The state transition matrix is given by

        [  0  1/2   0    0    0   1/2 ]
        [ 1/2  0   1/2   0    0    0  ]
    P = [  0  1/2   0   1/2   0    0  ]
        [  0   0   1/2   0   1/2   0  ]
        [  0   0    0   1/2   0   1/2 ]
        [ 1/2  0    0    0   1/2   0  ]

The initial state is given to be 1; thus,

    π(0) = (1, 0, 0, 0, 0, 0)^T.

We know that

    π(4) = P^4 π(0) = (0.375, 0, 0.3125, 0, 0.3125, 0)^T.

Thus, the probability that the Markov chain returns to state 1 in 4 transitions is π_1(4) = 0.375.

(b) find the probability that in 5 transitions the Markov chain enters an adjacent point of 1, namely, point 2 or 6.

    π(5) = P^5 π(0) = (0, 0.34375, 0, 0.3125, 0, 0.34375)^T.

Thus the probability that the Markov chain is in state 2 or 6 after 5 transitions is π_2(5) + π_6(5) = 0.6875.

PROBLEM 3 (15 points)

A telephone company establishes a direct connection between two cities, expecting Poisson traffic with rate c calls/minute. The durations of calls are independent and exponentially distributed with mean 1/d minutes. Interarrival times are independent of call durations. The system can handle up to n calls, meaning that any call attempted when the system already has n calls in progress is blocked. Assume that we start observing the system long after the process has started.
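A small numerical sketch of this blocking model (the values c = 2, d = 1, n = 4 and the step size δ = 10^-3 are illustrative assumptions, not from the problem): discretize time, build the resulting birth-death chain, and power-iterate to its long-run distribution, which agrees with the truncated-Poisson PMF found in part (b):

```python
# Discretized birth-death chain for the blocking system: from state k,
# an arrival occurs with probability c*delta (if k < n) and a departure
# with probability k*d*delta. Compare its stationary distribution with
# the truncated Poisson PMF p_k = (rho^k / k!) / sum_j (rho^j / j!).
from math import factorial

c, d, n, delta = 2.0, 1.0, 4, 1e-3
rho = c / d

# transition matrix P[i][j] = Pr(next state j | current state i)
P = [[0.0] * (n + 1) for _ in range(n + 1)]
for k in range(n + 1):
    up = c * delta if k < n else 0.0
    down = k * d * delta
    P[k][k] = 1.0 - up - down
    if k < n:
        P[k][k + 1] = up
    if k > 0:
        P[k][k - 1] = down

pi = [1.0] + [0.0] * n      # start empty
for _ in range(50000):      # plenty of steps for this small chain to mix
    pi = [sum(pi[i] * P[i][j] for i in range(n + 1)) for j in range(n + 1)]

Z = sum(rho**k / factorial(k) for k in range(n + 1))
erlang = [rho**k / factorial(k) / Z for k in range(n + 1)]
print([round(x, 4) for x in pi])
print([round(x, 4) for x in erlang])
```

The agreement is exact up to iteration error, because the discretized chain satisfies the same local balance equations p_k c = p_{k+1}(k+1)d for any δ.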

(a) Formulate a discrete-time Markov chain to model the above process, assuming that the discrete-time intervals are very short.

Let the interval length be a small δ > 0, and take the number of active calls in the system as the state. We obtain a birth-death chain on the states 0, 1, 2, ..., n: in one interval, a new call arrives with probability cδ (from any state k < n), and one of the k active calls ends with probability kdδ (from state k, for k = 1, 2, ..., n).

(b) Determine the PMF for the number of calls in the system.

From the local balance equations we have

    p_k c = p_{k+1} (k + 1) d.

Solving in terms of p_0 we get

    p_k = (ρ^k / k!) p_0,    k = 1, 2, ..., n,

where ρ = c/d. The PMF must sum to one. Thus

    p_0 = ( Σ_{k=0}^{n} ρ^k / k! )^{-1}.

(c) Given that the number of calls on the system just changed, find the PMF for the number of calls on the system immediately after the change.

Let C denote the event that the number of calls in the system has just changed, and let K be the number of calls immediately after the change. We have

    Pr(K = k | C) = Pr(K = k, C) / Pr(C).    (1)

Dropping the common factor δ, which cancels in the ratio (1), we have

    Pr(K = 0, C) = p_1 d = d ρ p_0,

    Pr(K = k, C) = p_{k-1} c + p_{k+1} (k + 1) d = (ρ^k / k!) d (k + ρ) p_0,    k = 1, 2, ..., n - 1,

    Pr(K = n, C) = p_{n-1} c = d (ρ^n / (n - 1)!) p_0.

Further,

    Pr(C) = Σ_{k=0}^{n} Pr(K = k, C)
          = d p_0 Σ_{k=1}^{n} ρ^k / (k - 1)!  +  c p_0 Σ_{k=0}^{n-1} ρ^k / k!
          = 2 d p_0 Σ_{k=0}^{n-1} ρ^{k+1} / k!.

Substituting in (1) we get the PMF of the number of calls immediately after a change.

PROBLEM 4 (20 points)

The Markov chain with transition probabilities listed below is in state 3 immediately before the first trial (except in part (c)):

    p_{1,1} = 0.75,  p_{1,2} = 0.25,  p_{2,1} = 1,
    p_{3,2} = 0.2,   p_{3,3} = 0.5,   p_{3,4} = 0.3,  p_{4,4} = 1.

(a) Find the variance of J, the number of transitions up to and including the transition on which the process leaves state 3 for the last time.

Once the process leaves state 3 it can never return, so J is simply the first departure time from state 3. The process leaves state 3 on the jth step with probability

    p_J(j) = (0.5)^{j-1} (0.5) = (0.5)^j,    j = 1, 2, 3, ...,

so J is a geometric random variable with parameter p = 1/2. Thus Var(J) = (1 - p)/p^2 = 2.

(b) Find the expectation of K, the number of transitions up to and including the transition on which the process enters state 1 for the first time.

The process enters state 1 for the first time at time k if it stays in state 3 for k - 2 steps, then moves to state 2 (probability 0.2), and then moves to state 1 (probability 1). Hence

    p_K(k) = (0.5)^{k-2} (0.2),    k = 2, 3, 4, ...

Notice that the process may never enter state 1: with probability 0.3/(0.2 + 0.3) = 3/5 it leaves state 3 to state 4 and is absorbed there, in which case K = ∞. Thus E[K] = ∞.
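These two answers can be cross-checked with exact arithmetic; a sketch, using the transition probabilities p_{3,3} = 0.5, p_{3,2} = 0.2, p_{3,4} = 0.3 as reconstructed above:

```python
# (a) J ~ geometric(1/2): check Var(J) = 2 by summing the PMF directly.
# (b) p_K(k) = (1/2)^(k-2) * (1/5): check that the total finite mass is 2/5,
#     so K = infinity with probability 3/5 and E[K] diverges.
from fractions import Fraction

half, fifth = Fraction(1, 2), Fraction(1, 5)

EJ = sum(j * half**j for j in range(1, 200))          # E[J], truncated sum
EJ2 = sum(j * j * half**j for j in range(1, 200))     # E[J^2], truncated sum
var_J = EJ2 - EJ**2
print(float(var_J))  # ~2

mass = sum(half**(k - 2) * fifth for k in range(2, 200))
print(float(mass))   # ~0.4
```

Truncating the sums at 200 terms leaves an error of order 2^-200, far below the printed precision.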

(c) Find, for all values of i and j:

    lim_{t→∞} Pr(state at time t is i | initial state is j).

When we start in state 1 or 2, we always remain in states {1, 2}. The limiting probabilities are given by the balance equation and normalization

    0.25 p_1 = p_2,    p_1 + p_2 = 1,

so that

    p_1 = 4/5,  p_2 = 1/5,  p_3 = p_4 = 0.

If we start in state 3, we make a transition to either state 2 or state 4 at some finite time. When this transition is made, the probability that we go to state 2 is 0.2/(0.2 + 0.3) = 2/5, and the probability that we go to state 4 is 0.3/(0.2 + 0.3) = 3/5. Hence the limiting probabilities are

    p_1 = (2/5)(4/5) = 8/25,  p_2 = (2/5)(1/5) = 2/25,  p_3 = 0,  p_4 = 3/5.

When we start in state 4, we always remain in state 4; thus

    p_1 = p_2 = p_3 = 0,  p_4 = 1.
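These limits can be verified by brute force; a sketch, raising the 4-state transition matrix (with the values as reconstructed above) to a large power:

```python
# Row i of P^n approximates the limiting distribution starting from state i+1.
P = [
    [0.75, 0.25, 0.0, 0.0],
    [1.0,  0.0,  0.0, 0.0],
    [0.0,  0.2,  0.5, 0.3],
    [0.0,  0.0,  0.0, 1.0],
]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

Pn = P
for _ in range(200):  # P^201; the subdominant eigenvalues decay geometrically
    Pn = matmul(Pn, P)

print([round(x, 4) for x in Pn[0]])  # from state 1: [0.8, 0.2, 0.0, 0.0]
print([round(x, 4) for x in Pn[2]])  # from state 3: [0.32, 0.08, 0.0, 0.6]
```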

(d) Find, for i = 1, 2, 3, 4, the steady-state probability P_i that the process is in state i, or explain why these probabilities cannot be found.

Writing the balance equations for the steady-state distribution we get

    p_1 = 0.75 p_1 + p_2          (1)
    p_2 = 0.25 p_1 + 0.2 p_3      (2)
    p_3 = 0.5 p_3                 (3)
    p_4 = 0.3 p_3 + p_4           (4)
    p_1 + p_2 + p_3 + p_4 = 1     (5)

Equation (3) gives p_3 = 0, and this makes (4) redundant. With p_3 = 0, (1) and (2) become the same equation. Thus we have only two independent equations, (1) and (5), for the three unknowns p_1, p_2, and p_4, so the system does not have a unique solution. Any set of values (p_1, p_2, 0, p_4) satisfying these equations is a valid steady-state distribution. (Observe that this is a Markov chain with two recurrent classes, so a unique stationary distribution does not exist.)

(e) Given that the process never enters state 4, find the steady-state probabilities P_i that the process is in state i, i = 1, 2, 3, 4, or explain why these probabilities cannot be found.

The process starts in state 3 and must leave state 3 at some finite time. We are given that the process never enters state 4; this means that when the process leaves state 3, it enters state 2. Thus p_4 = 0, which gives a third equation along with the two equations of the previous part. Solving the three equations in three unknowns, we get

    p_1 = 4/5,  p_2 = 1/5,  p_3 = 0,  p_4 = 0.

PROBLEM 5 (18 points)

(a) Identify the transient, recurrent, and periodic states of the discrete-state, discrete-transition Markov process described by

    p_{1,1} = p_{1,5} = p_{3,3} = p_{3,7} = 0.5,
    p_{2,1} = p_{5,1} = p_{5,5} = 0.3,
    p_{2,2} = p_{5,2} = p_{6,4} = p_{7,7} = 0.4,
    p_{2,5} = p_{4,4} = p_{4,6} = 0.2,
    p_{2,6} = 0.1,
    p_{4,3} = p_{6,3} = p_{7,3} = 0.6.
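The classification asked for here can be cross-checked mechanically; this sketch uses the support of the transition probabilities as reconstructed above (a state is recurrent exactly when every state reachable from it can reach it back):

```python
# Directed reachability graph of the 7-state chain (edges = positive
# transition probabilities as reconstructed above).
edges = {
    1: [1, 5], 2: [1, 2, 5, 6], 3: [3, 7], 4: [3, 4, 6],
    5: [1, 2, 5], 6: [3, 4], 7: [3, 7],
}

def reachable(s):
    """All states reachable from s (including s itself)."""
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        for v in edges[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

recurrent = sorted(s for s in edges if all(s in reachable(t) for t in reachable(s)))
transient = sorted(s for s in edges if s not in recurrent)
print(recurrent)  # [3, 7]
print(transient)  # [1, 2, 4, 5, 6]
```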

s.... 2 6.6 7...2..6.2 5.2 Recurrent States:,7 Transient States:,2,,5,6 Periodic States: None.6 (b) How many classes are formed by the recurrent states of this process? There is one recurrent class formed: {,7} (c) Evaluate lim n p,(n) and lim n p 6,6(n). lim p,(n) = n as state can not communicate to state. as state 6 is a transient state. lim p 6,6(n) = n PROBLEM 6 (5 points) Let X, X 2,... be independent, identically distributed random variables with (unknown but finite) mean µ and positive variance. For i =, 2,..., let Y i = X i + 2 X i+ (a) Are the random variables Y i independent? No, the random variables Y i are dependent. An easy way to see this to consider the case when X i are Bernoulli random variables. Consider Pr(Y 2 Y = ). Y = means that X = X 2 =, so Y 2 can take only two values / and. Unconditioned, Y can take values, /, 2/ and. Thus, Y and Y 2 are dependent. 9 April, 27

(b) Are they identically distributed?

Yes, they are identically distributed. This can be seen by looking at the density of Y_i: since Y_i = (1/3) X_i + (2/3) X_{i+1} and the X's are i.i.d.,

    f_{Y_i} = f_{(1/3) X_i} * f_{(2/3) X_{i+1}} = f_{(1/3) X} * f_{(2/3) X}

(where * denotes convolution), which is the same for all values of i.

(c) Let

    M_n = (1/n) Σ_{i=1}^{n} Y_i.

Show that M_n converges to µ in probability.

Expanding the sum,

    M_n = (1/n) [ (1/3) X_1 + Σ_{i=2}^{n} X_i + (2/3) X_{n+1} ],

so

    E[M_n] = µ,

and, since the X_i are independent (σ_X^2 denotes their common variance),

    Var(M_n) = (1/n^2) [ (1/9) σ_X^2 + (n - 1) σ_X^2 + (4/9) σ_X^2 ]
             = ((n - 1)/n^2) σ_X^2 + (5/(9 n^2)) σ_X^2.

By Chebyshev's inequality,

    Pr(|M_n - µ| > ε) ≤ Var(M_n)/ε^2 = [ ((n - 1)/n^2) σ_X^2 + (5/(9 n^2)) σ_X^2 ] / ε^2 → 0    as n → ∞.

Thus M_n converges in probability to µ.
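The variance formula above can be checked exactly for a small case; a sketch with n = 3 and X_i ~ Bernoulli(1/2) (so σ_X^2 = 1/4), by enumerating all values of (X_1, X_2, X_3, X_4):

```python
# Exact check of Var(M_n) = ((n-1)/n^2 + 5/(9 n^2)) * sigma_X^2 for n = 3.
from fractions import Fraction
from itertools import product

n = 3
sigma2 = Fraction(1, 4)  # variance of Bernoulli(1/2)

def M(xs):
    """M_n = (1/n) * sum of Y_i, with Y_i = (X_i + 2 X_{i+1}) / 3."""
    return sum(Fraction(xs[i] + 2 * xs[i + 1], 3) for i in range(n)) / n

vals = [M(xs) for xs in product([0, 1], repeat=n + 1)]  # 16 equally likely outcomes
mean = sum(vals) / len(vals)
var = sum((v - mean) ** 2 for v in vals) / len(vals)

formula = Fraction(n - 1, n**2) * sigma2 + Fraction(5, 9 * n**2) * sigma2
print(mean, var, formula)  # mean 1/2; both variances 23/324
```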