Midterm Exam 2 (Solutions)


EECS 126 Probability and Random Processes
University of California, Berkeley: Spring 2017
Kannan Ramchandran
March 21, 2017

Last name                First name                SID

Name of student on your left:
Name of student on your right:

DO NOT open the exam until instructed to do so. The total number of points is 110, but a score of 100 is considered perfect. You have 10 minutes to read this exam without writing anything and 105 minutes to work on the problems. Box your final answers. Partial credit will not be given to answers that have no proper reasoning. Remember to write your name and SID on the top left corner of every sheet of paper. Do not write on the reverse sides of the pages. All electronic devices must be turned off. Textbooks, computers, calculators, etc. are prohibited. No form of collaboration between students is allowed. If you are caught cheating, you may fail the course and face disciplinary consequences. You must include explanations to receive credit.

Problem 1: (a) 20, (b) 12, (c) 8, (d) 10
Problem 2: 15
Problem 3: 20
Problem 4: 25
Total: 110

Problem. (a) (0 points, 4 points each part) Evaluate the statements with True or False. Give brief explanations in the provided boxes. Anything written outside the boxes will not be graded. () Given a random variable M Geom( 0 ) and i.i.d. exponential random variables X i Exp(), the distribution of the sum X + X + + X M is Erlang of order 0 with rate. True or False: False Explanation: It is exponential () Recall fountain codes as introduced in Lab 4. You would like to recover the packets x, x, x 3, x 4 and receive the following equations: x x 4 = x = x x x 3 x 4 = x 3 x 4 = The peeling decoder is able to recover all the packets that can possibly be recovered. True or False: False Explanation: An optimal decoder could recover all the packets, but peeling can recover only one

(3) Give an example of a discrete-time Markov chain which is transient and has period 3.

One valid example: the random walk on the integers that jumps +2 with probability 3/4 and -1 with probability 1/4. Any return to the starting state takes a number of steps that is a multiple of 3, so the period is 3, and the positive drift makes every state transient.

(4) Consider a Poisson process with rate λ. Suppose that the 7th arrival comes at time T. The joint distribution of the first 6 arrival times is the same as the joint distribution of U_1, U_2, ..., U_6, where the U_i are i.i.d. U[0, T] random variables.

True or False: False
Explanation: It is the joint distribution of the order statistics of 6 i.i.d. U[0, T] random variables: the arrival times are increasing, while the U_i are unordered.

(5) For a random variable X with a well-defined M_X(s), we have:

Var(X) = (d²/ds²) ln M_X(s) |_{s=0}

True or False: True
Explanation: The first two derivatives of the cumulant generating function ln M_X(s) at s = 0 give E[X] and Var(X), respectively.

(b) (12 points) After attending an EE126 lecture, you went back home and started playing Twitch Plays Pokemon. Suddenly, you realized that you may be able to analyze Twitch Plays Pokemon.

[Figure 1: A snapshot of Twitch Plays Pokemon: a 3 x 3 grid of squares, with the stairs in the bottom-right square and the player in the top-left corner.]

(i) (6 points) The player in the top-left corner performs a random walk on the 8 checkered squares and the square containing the stairs. At every step the player is equally likely to move to any of the squares in the four cardinal directions (North, West, East, South), if there is a square in that direction. Find the expected number of moves until the player reaches the stairs in Figure 1.

Using symmetry, the 9 states can be grouped as follows:

a b c
b d e
c e f

Now, you can also observe that state d is equivalent to state c: from either, the walk moves to a b-state or an e-state with probability 1/2 each.

a b c
b c e
c e f

With the above states, one can write down the following first-step equations:

T_a = 1 + T_b
T_b = 1 + (1/3) T_a + (2/3) T_c
T_c = 1 + (1/2) T_b + (1/2) T_e
T_e = 1 + (2/3) T_c + (1/3) T_f
T_f = 0

Solving the above equations gives:

T_a = 18, T_b = 17, T_c = 15, T_e = 11

Thus, the player has to make 18 moves on average to go downstairs.
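As a cross-check, the first-step equations can be solved mechanically. A small sketch (our own encoding of the system above, with the unknowns ordered T_a, T_b, T_c, T_e):

```python
import numpy as np

# Rows encode: T_a - T_b = 1;  T_b - (1/3)T_a - (2/3)T_c = 1;
# T_c - (1/2)T_b - (1/2)T_e = 1;  T_e - (2/3)T_c = 1  (using T_f = 0).
A = np.array([[1.0, -1.0, 0.0, 0.0],
              [-1/3, 1.0, -2/3, 0.0],
              [0.0, -1/2, 1.0, -1/2],
              [0.0, 0.0, -2/3, 1.0]])
b = np.ones(4)
print(np.linalg.solve(A, b))      # [18. 17. 15. 11.]
```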

[Figure 2: A snapshot of Twitch Plays Pokemon: the same 3 x 3 grid, now with two staircases, one in the top-right corner and the good stairs in the bottom-right corner.]

(ii) (6 points) The player randomly walks in the same way as in the previous part. Find the probability that the player reaches the stairs in the bottom-right corner in Figure 2.

Consider the 9 initial states and the corresponding probabilities of reaching the good stairs. Reflecting the grid across its middle row swaps the two staircases, so every square in the middle row reaches the good stairs with probability 1/2, and each bottom-row probability is the complement of the probability directly above it. Using this symmetry, one can obtain the following table:

p       q       0
1/2     1/2     1/2
1 - p   1 - q   1

With the above probabilities, one can write down the following first-step equations:

p = (1/2) q + (1/2)(1/2)
q = (1/3) p + (1/3)(1/2) + (1/3)(0)

Solving the above equations gives:

p = 0.4, q = 0.3

Thus, Red is going to reach the good stairs with probability 0.4.
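Since Figure 2 itself is not reproduced in this transcription, the following simulation encodes the geometry assumed above (absorbing staircases in the top-right and bottom-right corners, start in the top-left) and reproduces the answer:

```python
import random

def win_probability(trials=200_000):
    wins = 0
    for _ in range(trials):
        r, c = 0, 0                                   # start in the top-left
        while (r, c) not in ((0, 2), (2, 2)):         # stairs are absorbing
            nbrs = [(r + dr, c + dc)
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= r + dr < 3 and 0 <= c + dc < 3]
            r, c = random.choice(nbrs)
        wins += (r, c) == (2, 2)                      # good stairs: bottom-right
    return wins / trials

print(win_probability())                              # ~ 0.4
```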

(c) (8 points) Suppose that X_1, X_2, ..., X_n are i.i.d. random variables and X_i ~ U[1, 2]. Does the sequence Y_n = max{X_1, X_2, ..., X_n} converge in probability? If so, what does it converge to?

Yes: Y_n converges in probability to 2. Consider ε ∈ (0, 1]. We see that:

Pr(|Y_n - 2| ≥ ε) = Pr(max{X_1, ..., X_n} ≤ 2 - ε)
                  = Pr(X_1 ≤ 2 - ε, ..., X_n ≤ 2 - ε)
                  = Pr(X_1 ≤ 2 - ε)^n
                  = (1 - ε)^n.

Thus, Pr(|Y_n - 2| ≥ ε) → 0 and we are done.
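A quick numerical illustration of the rate (1 - ε)^n (a sketch; the choices of ε and of the sample sizes are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05
for n in (10, 100, 1000):
    y = rng.uniform(1, 2, size=(50_000, n)).max(axis=1)      # samples of Y_n
    print(n, np.mean(np.abs(y - 2) >= eps), (1 - eps) ** n)  # empirical vs exact
```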

(d) (10 points) Nodes in a wireless sensor network are dropped randomly according to a two-dimensional Poisson process. A two-dimensional Poisson process is defined as follows: (i) for any region of area A, the number of nodes in that region has a Poisson distribution with mean λA; (ii) the numbers of nodes in non-overlapping regions are independent. Suppose that you are at an arbitrary location in the plane. Let X be the distance to the nearest node in the sensor network. Find E[X]. (Here the distance between the points (x_1, y_1) and (x_2, y_2) is defined as √((x_1 - x_2)² + (y_1 - y_2)²).)

Given an arbitrary location, X > t if and only if there are no nodes in the circle of radius t around the given point. The number of nodes in that circle is Poisson with expected value λπt², so the probability that this number is 0 is e^{-λπt²}. Thus Pr(X > t) = e^{-λπt²}. Since X ≥ 0, we have

E[X] = ∫_0^∞ Pr(X > t) dt = ∫_0^∞ e^{-λπt²} dt.

We can look this up in a table of integrals, or recognize its resemblance to the Gaussian PDF. If we define σ² = 1/(2πλ), the above integral is

E[X] = σ√(2π) ∫_0^∞ (1/(σ√(2π))) e^{-t²/(2σ²)} dt = σ√(2π)/2 = 1/(2√λ).
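The answer E[X] = 1/(2√λ) can be checked by dropping Poisson points in a large square around the origin (a sketch; λ = 2 and the window size are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, R, trials = 2.0, 5.0, 20_000         # rate, window half-width, samples
dists = []
for _ in range(trials):
    n = rng.poisson(lam * (2 * R) ** 2)   # number of nodes in [-R, R]^2
    if n == 0:
        continue                          # vanishingly rare for these values
    pts = rng.uniform(-R, R, size=(n, 2))
    dists.append(np.sqrt((pts ** 2).sum(axis=1)).min())

print(np.mean(dists), 1 / (2 * np.sqrt(lam)))     # both ~ 0.354
```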

Problem. (5 points) Consider the continuous time Markov chain given in the figure. A 8 C 8 B (a.) (7 points) Draw a DTMC (discrete time Markov Chain) which has the same stationary distribution as the given CTMC and find the stationary distribution. 6 A 3 C 6 3 B 6 6 The flow equations are given by: 0π A = 8π B + π C π C = π A + π B 0π B = 8π A π A + π B + π C = Thus, π A = 0, π B = 8, π C = 7 9

(b.) (8 points) Let τ denote the first time the CTMC is in state C after starting from state A at time 0. Find M_τ(s), the MGF of τ.

Let T_C, T_B be the events that the first transition is to state C and to state B, respectively. The holding time in state A is Exp(10), and the first jump goes to C with probability 2/10 = 1/5 and to B with probability 8/10 = 4/5. Thus:

E[e^{sτ}] = E[e^{sτ} | T_C] P(T_C) + E[e^{sτ} | T_B] P(T_B)
          = (10/(10 - s)) (1/5) + (10/(10 - s)) E[e^{sτ}] (4/5),

where in the last equality we note that the hitting time of state C from state A has the same distribution as the hitting time of state C from state B, since the rates out of B mirror those out of A. Solving,

M_τ(s) = 2/(2 - s),

i.e., τ is Exp(2).
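Since M_τ(s) = 2/(2 - s) is the MGF of an Exp(2) random variable, a short simulation of the jump chain makes a good sanity check (our own sketch; both the mean and the standard deviation of Exp(2) are 1/2):

```python
import numpy as np

rng = np.random.default_rng(2)

def hitting_time():
    t = 0.0
    while True:
        t += rng.exponential(1 / 10)     # holding time in A or B: Exp(10)
        if rng.random() < 1 / 5:         # jump to C with probability 1/5;
            return t                     # otherwise bounce between A and B,
                                         # whose outgoing rates are identical

samples = np.array([hitting_time() for _ in range(50_000)])
print(samples.mean(), samples.std())     # both ~ 0.5
```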

Problem 3. (20 points)

(a.) (10 points) Batteries from Company X last an amount of time exponentially distributed with rate λ. The company guarantees that λ ≥ 1/2. Suppose that you have used n batteries from Company X and observed their lifetimes to be X_1, X_2, ..., X_n. Using Chebyshev's inequality, find the minimum n required to construct a confidence interval for the mean battery lifetime 1/λ such that it has tolerance at most ε and confidence at least 1 - δ. In other words, we want to find n such that

P( |(1/n) Σ_{i=1}^n X_i - 1/λ| ≥ ε ) ≤ δ.

By Chebyshev's inequality, the above inequality will be satisfied so long as

Var((1/n) Σ_{i=1}^n X_i) / ε² = Var(X_1) / (n ε²) = 1/(n λ² ε²) ≤ δ.

From our trustworthy source we have the lower bound λ ≥ 1/2, so 1/λ² ≤ 4, and we are happy if we choose n ≥ 4/(δ ε²).

(b.) (10 points) Suppose now that you have taken 10000 samples of the batteries. With the same tolerance level ε, use the CLT to find your new confidence (you may again use the company guarantee that λ ≥ 1/2). You may leave your answer in terms of Φ, the CDF of the standard normal distribution.

We now approximate (1/n) Σ_{i=1}^n X_i ~ N(1/λ, 1/(n λ²)). We seek to bound the probability of an ε-deviation from the mean:

P( |(1/n) Σ_{i=1}^n X_i - 1/λ| ≥ ε ) ≈ P( |N(1/λ, 1/(nλ²)) - 1/λ| ≥ ε )
  = P( |N(0, 1/(nλ²))| ≥ ε )
  = P( |N(0, 1)| ≥ ελ√n )
  = 2(1 - Φ(100 ελ)) ≤ 2(1 - Φ(50 ε)),

using √n = 100 and λ ≥ 1/2. The new confidence is therefore at least 1 - 2(1 - Φ(50ε)) = 2Φ(50ε) - 1.
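For concreteness, plugging in sample numbers (the choices ε = 0.1 and δ = 0.05 are ours):

```python
from statistics import NormalDist

eps, delta = 0.1, 0.05                    # example tolerance and delta
print("Chebyshev n:", 4 / (delta * eps ** 2))        # 8000 samples suffice
confidence = 2 * NormalDist().cdf(50 * eps) - 1      # part (b), n = 10000
print("CLT confidence:", confidence)                 # ~ 0.9999994
```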

Problem 4. (25 points, 5 points each) There are a set of M bins and a set of balls labeled f_1, f_2, .... Let f^(i) = {f_j : j ≡ i mod M} be the set of balls thrown to bin i. The interarrival time between balls thrown to bin i is exponential with rate λ balls per second, and all interarrival times are independent. The situation is illustrated in Figure 3.

[Figure 3: Balls thrown to bins: streams f^(1), f^(2), f^(3), ..., f^(M) over bins B_1, B_2, B_3, ..., B_M, with f^(i) thrown to bin B_i.]

(a.) Alice and Bob decide to play a betting game. Bob bets that there will be at least one ball in bin B_1 after 1/λ seconds, and Alice bets that there will be at least two balls in bin B_1 after 2/λ seconds. What is the probability that exactly one of Bob and Alice wins money?

Let E_1 be the event that Bob wins money and E_2 the event that Alice wins money. We are interested in P(E_1, E_2^c) + P(E_1^c, E_2). Note that P(E_1, E_2^c) is the probability that there is exactly one arrival in the first 1/λ seconds and no arrivals in the next 1/λ seconds; each of these intervals contains a Poisson(1) number of arrivals, so P(E_1, E_2^c) = e^{-1} · e^{-1} = e^{-2}. Additionally, P(E_1^c, E_2) is the probability that there are no arrivals in the first 1/λ seconds and at least 2 arrivals in the next 1/λ seconds, which is given by e^{-1} (1 - 2e^{-1}). Thus, the probability that exactly one of them wins money is

e^{-2} + e^{-1} (1 - 2e^{-1}) = e^{-1} - e^{-2}.

(b.) Suppose that λ = 1. You peek into bin B_3 at a random time t. Let X be the difference between the time of the most recent arrival to bin B_3 and the time of the next arrival to any of the M bins. Find E[X²].

Note that by the same argument as random incidence, the time back to the most recent arrival to the bin is exponential with rate 1, and the time to the next arrival to any bin is exponential with rate M (the merge of all M processes). We are interested in X, the sum of these two random variables. Since they are independent, we find M_X(s) as the product of the MGFs of these random variables to see that:

M_X(s) = (1/(1 - s)) (M/(M - s)) = M / (M - (M + 1)s + s²).

We note that E[X²] = (d²/ds²) M_X(s) |_{s=0}. Plugging in, one sees:

E[X²] = 2(M + 1)²/M² - 2/M = (2M² + 2M + 2)/M².
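Part (a)'s answer e^{-1} - e^{-2} ≈ 0.2325 is easy to confirm by simulation, using the fact that the two intervals contain independent Poisson(1) counts (a minimal sketch):

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 400_000
first = rng.poisson(1.0, trials)      # arrivals to B_1 during [0, 1/lambda]
second = rng.poisson(1.0, trials)     # arrivals during (1/lambda, 2/lambda]
bob = first >= 1                      # at least one ball at time 1/lambda
alice = first + second >= 2           # at least two balls at time 2/lambda
print(np.mean(bob ^ alice), np.exp(-1) - np.exp(-2))    # both ~ 0.2325
```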

Your friend finds a switch which can send balls to two consecutive bins, as in Figure 4.

[Figure 4: Balls thrown to bins in the new configuration: with the switch flipped, each ball of f^(1) lands in both B_2 and B_3, while the remaining streams are thrown to their own bins as before.]

(c.) Call a bin a singleton if it has exactly one ball contained in it. What is the expected number of singletons at time T?

Each bin receives a Poisson(λT) number of balls and is therefore a singleton with probability λT e^{-λT}, so by linearity of expectation the expected number of singletons is MλT e^{-λT}.

(d.) Suppose that your friend randomly chooses the setting on the switch so that the situation is equally likely to be that of Figure 3 and Figure 4. What is the variance of the total number of distinct balls in B_2 and B_3 after 1 second?

Let N be the number of distinct balls in B_2 and B_3, and let S_1 be the setting of Figure 3 and S_2 the setting of Figure 4. Under S_1, bins B_2 and B_3 together receive the balls of f^(2) and f^(3), so N | S_1 ~ Poisson(2λ); under S_2 they additionally receive the balls of f^(1) (each counted once, even though it lands in both bins), so N | S_2 ~ Poisson(3λ). We need E[N] and E[N²]. First, we see that

E[N²] = E[N² | S_1] P(S_1) + E[N² | S_2] P(S_2).

Noting that E[N² | S_1] = Var(N | S_1) + E[N | S_1]², and using the analogous identity for S_2, we find that

E[N²] = (1/2)(2λ + 4λ²) + (1/2)(3λ + 9λ²) = (5/2)λ + (13/2)λ².

We likewise see that E[N] = (5/2)λ, so

Var(N) = (5/2)λ + (13/2)λ² - (25/4)λ² = (5/2)λ + (1/4)λ².
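The mixture computation in (d) can be cross-checked by sampling the switch setting and then the Poisson count (a sketch; λ = 2 is an arbitrary choice of ours, for which the formula gives Var(N) = 5 + 1 = 6):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, trials = 2.0, 500_000
streams = np.where(rng.random(trials) < 0.5, 2, 3)   # Fig. 3 vs Fig. 4 setting
N = rng.poisson(streams * lam)                       # distinct balls in B_2, B_3
print(N.var(), 2.5 * lam + 0.25 * lam ** 2)          # both ~ 6.0
```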

(e.) Alice and Bob decide to play a new game. They flip the switch so that the setup is as in Figure 4, but with a twist. Now, the balls in f^(2) arrive according to a Poisson process with rate 2λ, but all other sets of balls (f^(1), f^(3), ...) arrive according to Poisson processes with rate λ. The game is as follows: Alice wins if, at any point in time, B_3 contains 10 balls more than B_2, and Bob wins if, at any point in time, B_2 contains 10 balls more than B_3. What is the probability that Alice wins?

Note that the balls which go to both B_2 and B_3 do not factor into the difference. Now, consider the merged Poisson process containing the balls that go to B_2 but not B_3 (rate 2λ) and the balls that go to B_3 but not B_2 (rate λ). Note that each arrival of this merged process goes to bin B_2 with probability 2λ/(2λ + λ) = 2/3 and to bin B_3 with probability 1/3. Now, set up a Markov chain that tracks the difference between the number of balls in B_3 and B_2. Let state 0 be the starting state, state 10 the state at which Alice wins, and state -10 the state at which Bob wins. The transitions are governed by P_{i,i+1} = 1/3 and P_{i,i-1} = 2/3 for i ∈ {-9, ..., 9}, and P_{10,10} = 1, P_{-10,-10} = 1. Let p_i denote the probability of hitting state 10 when starting at state i, and note that

p_i = (1/3) p_{i+1} + (2/3) p_{i-1},

with p_{10} = 1 and p_{-10} = 0. The increments satisfy p_{i+1} - p_i = 2 (p_i - p_{i-1}), so one can see that p_i = (2^{i+10} - 1) p_{-9}. Thus, setting p_{10} = 1, one finds that p_{-9} = 1/(2^{20} - 1). Plugging in, one sees that

p_0 = (2^{10} - 1)/(2^{20} - 1) = 1/(2^{10} + 1) ≈ 0.00098.
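The gambler's-ruin answer can be confirmed by solving the hitting-probability system directly (a sketch; states -10, ..., 10 are mapped to indices 0..20):

```python
import numpy as np

n = 21                                   # states -10..10 -> indices 0..20
A = np.eye(n)
b = np.zeros(n)
for i in range(1, n - 1):
    A[i, i + 1] -= 1 / 3                 # up-step: a B_3-only ball arrives
    A[i, i - 1] -= 2 / 3                 # down-step: a B_2-only ball arrives
b[n - 1] = 1.0                           # p = 1 at state 10, p = 0 at state -10
p = np.linalg.solve(A, b)
print(p[10], 1 / (2 ** 10 + 1))          # start at state 0: both ~ 0.000976
```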

END OF THE EXAM. Please check whether you have written your name and SID on every page.