Statistics 253/317 Introduction to Probability Models. Winter Midterm Exam Friday, Feb 8, 2013


Statistics 253/317 Introduction to Probability Models
Winter 2013 Midterm Exam, Friday, Feb 8, 2013

Student Name (print): ____________________

(a) Do not sit directly next to another student.
(b) This is a closed-book, closed-note examination. You may use your hand calculator and the one page of formula sheet you brought.
(c) You need to show your work to receive full credit. In particular, if you are basing your calculations on a formula or an expression (e.g., E(Y | X = k)), write down that formula before you substitute numbers into it.
(d) If a later part of a question depends on an earlier part, the later part will be graded conditionally on how you answered the earlier part, so that a mistake on the earlier part will not cost you all points on the later part. If you can't work out the actual answer to an earlier part, put down your best guess and proceed.
(e) Do not pull the pages apart. If a page falls off, sign the page. If you do not have enough room for your work in the place provided, ask for extra paper, and label and sign the pages.

Question   Points Available   Points Earned
   1             10
   2             20
   3             35
   4             35
 TOTAL          100

Problem 1. [10 points] A Markov chain with state space {0, 1, 2, 3, 4} has transition matrix

          0    1    2    3    4
    0  [  0   1/2   0    0   1/2 ]
    1  [  0    0    1    0    0  ]
P = 2  [  0   1/2   0   1/2   0  ]
    3  [  0    0    1    0    0  ]
    4  [ 1/2   0    0   1/2   0  ]

Find all communicating classes. For each state, determine its period, and whether it is recurrent or transient. Explain your reasoning.
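(Not part of the exam.) The classification asked for above can be checked mechanically: communicating classes come from mutual reachability, the period of a state is the gcd of its possible return times, and in a finite chain a class is recurrent exactly when it is closed. A Python sketch:

```python
from math import gcd

# Transition matrix from Problem 1 (states 0..4).
P = [
    [0,   1/2, 0,   0,   1/2],
    [0,   0,   1,   0,   0  ],
    [0,   1/2, 0,   1/2, 0  ],
    [0,   0,   1,   0,   0  ],
    [1/2, 0,   0,   1/2, 0  ],
]
n = len(P)

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# reach[i][j]: can j be reached from i in one or more steps?
reach = [[P[i][j] > 0 for j in range(n)] for i in range(n)]
for k in range(n):  # Warshall's transitive closure
    for i in range(n):
        for j in range(n):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

# Communicating classes: i ~ j iff each reaches the other (or i == j).
classes = []
for i in range(n):
    cls = frozenset(j for j in range(n)
                    if j == i or (reach[i][j] and reach[j][i]))
    if cls not in classes:
        classes.append(cls)

# Period of i: gcd of all step counts m with P^m(i,i) > 0.
returns = {i: [] for i in range(n)}
Pm = [row[:] for row in P]
for step in range(1, 30):
    for i in range(n):
        if Pm[i][i] > 1e-12:
            returns[i].append(step)
    Pm = mat_mul(Pm, P)
periods = {i: gcd(*returns[i]) if returns[i] else 0 for i in range(n)}

# In a finite chain, a class is recurrent iff it is closed.
for cls in classes:
    closed = all(P[i][j] == 0 for i in cls for j in range(n) if j not in cls)
    print(sorted(cls), "recurrent" if closed else "transient",
          "period", periods[min(cls)])
```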

Problem 2. [20 points] Let i and j be two states of a Markov chain. Let f_ij be the probability that, starting in i, the chain ever reaches j, and let f_ji be the probability that, starting in j, the chain ever reaches i. (a) [10 pts] If i and j are both transient and i communicates with j, argue that f_ij and f_ji cannot both be equal to one. (b) [10 pts] If i and j are both recurrent, is it possible for i to be accessible from j but j not accessible from i? Justify your answer.

Problem 3. [35 points] [Nearly Symmetric Random Walk] Let {X_n : n ≥ 0} be a (non-simple) random walk on {0, 1, 2, ...} with transition probabilities

    P_{i,i+1} = a_{i+1}/(a_i + a_{i+1})  for i ≥ 0,
    P_{i,i-1} = a_i/(a_i + a_{i+1})      for i ≥ 1,
    P_{0,0}   = a_0/(a_0 + a_1).

Here {a_i > 0 : i = 0, 1, 2, ...} is a positive sequence such that lim_{i→∞} a_{i+1}/a_i = 1. Observe that P_{i,i+1} ≈ 1/2 when i is large. Thus the process behaves like a symmetric random walk when it is far from 0. Since a_i > 0 for all i = 0, 1, 2, ... and P_{0,0} > 0, {X_n} is irreducible and aperiodic. We know that a symmetric random walk is null recurrent; will a nearly symmetric random walk be null recurrent or positive recurrent?

(a) [10 pts] Show that {X_n} has a limiting distribution (and hence is positive recurrent) if sum_{i=0}^∞ a_i < ∞. Express the limiting distribution of {X_n} in terms of {a_0, a_1, a_2, ...}. (Hint: Solve the detailed balance equations for the limiting distribution.)
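(Exam-external check.) One candidate that the detailed balance hint leads to is the weight pi_i proportional to a_i + a_{i+1}; for that candidate both sides of the balance equation pi_i P_{i,i+1} = pi_{i+1} P_{i+1,i} reduce to a_{i+1}. This can be confirmed with exact rational arithmetic for a hypothetical positive sequence a_i (any choice works):

```python
from fractions import Fraction as F

# Hypothetical positive sequence a_i, chosen arbitrarily for illustration.
a = [F(1), F(2), F(3), F(5), F(8), F(13), F(21), F(34)]

def p_up(i):    # P_{i,i+1} = a_{i+1} / (a_i + a_{i+1})
    return a[i + 1] / (a[i] + a[i + 1])

def p_down(i):  # P_{i,i-1} = a_i / (a_i + a_{i+1})
    return a[i] / (a[i] + a[i + 1])

# Candidate stationary weights from the detailed balance hint:
# pi_i proportional to a_i + a_{i+1} (normalizable when sum a_i < infinity).
pi = [a[i] + a[i + 1] for i in range(len(a) - 1)]

# Detailed balance: pi_i P_{i,i+1} = pi_{i+1} P_{i+1,i}; both sides equal a_{i+1}.
for i in range(len(pi) - 1):
    print(i, pi[i] * p_up(i), pi[i + 1] * p_down(i + 1))
```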

(b) [5 pts] For a_i = 1/(i+1)^2, i = 0, 1, 2, ..., find the limiting distribution of {X_n}. You may use the identity sum_{n=1}^∞ 1/n^2 = π^2/6.
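(Exam-external check.) With weights pi_i proportional to a_i + a_{i+1} as in part (a), the normalizing constant for a_i = 1/(i+1)^2 telescopes to 2 · (π^2/6) − a_0 = π^2/3 − 1 via the given identity. A quick numerical sketch comparing a long partial sum to that closed form:

```python
import math

# a_i = 1/(i+1)^2; candidate stationary weights pi_i = a_i + a_{i+1} from part (a).
def a(i):
    return 1.0 / (i + 1) ** 2

N = 200_000
total = sum(a(i) + a(i + 1) for i in range(N))  # partial normalizer

# Closed form using sum 1/n^2 = pi^2/6:
# sum_i (a_i + a_{i+1}) = 2 * pi^2/6 - a_0 = pi^2/3 - 1
closed = math.pi ** 2 / 3 - 1
print(total, closed)
```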

(c) [10 pts] Let M_{i,j} be the mean time to reach state j starting in state i. First show that for a general random walk on {0, 1, 2, ...},

    M_{i,i+1} = 1 + P_{i,i-1} (M_{i-1,i} + M_{i,i+1})  for i ≥ 1,  and  M_{0,1} = 1/P_{0,1},

and then show that for the nearly symmetric random walk above,

    a_{i+1} M_{i,i+1} = a_i + a_{i+1} + a_i M_{i-1,i}.
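(Exam-external check.) The two displayed forms of the recursion should agree: solving the first-step equation for M_{i,i+1} and multiplying through by a_i + a_{i+1} gives the second. A sketch with exact rational arithmetic and a hypothetical sequence a_i:

```python
from fractions import Fraction as F

# Hypothetical positive sequence a_i for illustration.
a = [F(3), F(1), F(4), F(1), F(5), F(9)]

# First-step form: M_{i,i+1} = 1 + P_{i,i-1} (M_{i-1,i} + M_{i,i+1}),
# solved for M_{i,i+1}; boundary M_{0,1} = 1/P_{0,1}.
M = [(a[0] + a[1]) / a[1]]                 # M_{0,1}
for i in range(1, len(a) - 1):
    p_down = a[i] / (a[i] + a[i + 1])      # P_{i,i-1}
    p_up = a[i + 1] / (a[i] + a[i + 1])    # P_{i,i+1}
    M.append((1 + p_down * M[i - 1]) / p_up)

# Rearranged form: a_{i+1} M_{i,i+1} = a_i + a_{i+1} + a_i M_{i-1,i}
for i in range(1, len(a) - 1):
    print(i, a[i + 1] * M[i], a[i] + a[i + 1] + a[i] * M[i - 1])
```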

(d) [10 pts] Use the equation in part (c) to find an expression for M_{i,i+1} when

    a_k = 1/((k+1)(k+2))  for k ≥ 0.

The expression for M_{i,i+1} must not involve an unevaluated summation. Then argue that

    M_{0,n} = M_{0,1} + M_{1,2} + M_{2,3} + ... + M_{n-1,n}

and use this to find M_{0,n}. (Hint: 1/((k+1)(k+2)) = 1/(k+1) − 1/(k+2).)
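(Exam-external check.) For this specific a_k, the part (c) recursion can be evaluated exactly, which is handy for checking a closed-form answer against the first few values of M_{i,i+1} and M_{0,n}:

```python
from fractions import Fraction as F

# a_k = 1/((k+1)(k+2)), kept exact with Fractions.
def a(k):
    return F(1, (k + 1) * (k + 2))

# M_{i,i+1} via the part (c) recursion, then M_{0,n} by summing.
n = 6
M = [(a(0) + a(1)) / a(1)]                 # M_{0,1} = 1/P_{0,1}
for i in range(1, n):
    M.append((a(i) + a(i + 1) + a(i) * M[i - 1]) / a(i + 1))
M0n = sum(M)                               # M_{0,n} = M_{0,1} + ... + M_{n-1,n}
print([str(m) for m in M], str(M0n))
```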

Problem 4. [35 points] Suppose that people arrive at a bus stop according to a Poisson process with rate λ. The bus departs at time t.

(a) [5 pts] Suppose everyone who arrives waits until the bus comes, i.e., everyone arriving during [0, t] gets on the bus. What is the probability that the bus departs with n people aboard?

(b) [10 pts] Let X be the total waiting time of all those who get on the bus at time t. Find E[X]. (Hint: First find E[X | N(t)], where N(t) is the number of people on the bus when it departs at time t.)
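(Exam-external check.) A Monte Carlo sketch for parts (a) and (b); the values λ = 2 and t = 3 are arbitrary assumptions. It compares the simulated P(N(t) = 3) to the Poisson pmf, and the simulated mean total wait to the candidate value λt²/2 that the hint leads to (each rider arriving at time u waits t − u):

```python
import math
import random

random.seed(0)
lam, t, trials = 2.0, 3.0, 200_000

total_wait = 0.0
counts = {}
for _ in range(trials):
    # Poisson process on [0, t] via exponential inter-arrival times.
    arrivals = []
    s = random.expovariate(lam)
    while s <= t:
        arrivals.append(s)
        s += random.expovariate(lam)
    counts[len(arrivals)] = counts.get(len(arrivals), 0) + 1
    total_wait += sum(t - u for u in arrivals)   # each rider waits t - arrival time

mean_wait = total_wait / trials
print(mean_wait, lam * t * t / 2)                # candidate E[X] = lam * t^2 / 2

# Part (a) check: P(N(t) = n) = e^{-lam t} (lam t)^n / n!, here n = 3.
p3 = math.exp(-lam * t) * (lam * t) ** 3 / math.factorial(3)
print(counts.get(3, 0) / trials, p3)
```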

Now suppose each person who arrives at the bus stop will independently wait a period of time that has an exponential distribution with rate µ; if the bus has not arrived by the end of that time, he/she leaves the bus stop.

(c) [10 pts] What is the probability that the bus departs with n people aboard?

(d) [10 pts] If at time s (s < t) there are k people waiting at the bus stop, what is the expected number of people who will get on the bus at time t? (Note: some people may leave the bus stop and others may arrive.)
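(Exam-external check.) For part (c): a person arriving at time u boards only if their exponential patience exceeds the remaining time t − u, so the boarders form a thinned Poisson process with mean λ(1 − e^{−µt})/µ. A simulation sketch with assumed parameters λ = 2, µ = 1, t = 3:

```python
import math
import random

random.seed(1)
lam, mu, t, trials = 2.0, 1.0, 3.0, 200_000

board_total = 0
for _ in range(trials):
    s = random.expovariate(lam)
    boarders = 0
    while s <= t:
        # person arriving at time s boards iff patience Exp(mu) >= t - s
        if random.expovariate(mu) >= t - s:
            boarders += 1
        s += random.expovariate(lam)
    board_total += boarders

mean_board = board_total / trials
# Thinning: number aboard is Poisson with mean lam * (1 - e^{-mu t}) / mu.
print(mean_board, lam * (1 - math.exp(-mu * t)) / mu)
```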