UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probability and Random Processes. Problem Set 3: Solutions. Fall 2007


Issued: Thursday, September 13, 2007
Due: Friday, September 21, 2007
Reading: Bertsekas & Tsitsiklis, Sections 1.4, 1.5, and 1.6

Problem 3.1
A mouse is wandering on a rectangular grid with grid points (a, b), a, b = 0, 1, 2, 3, .... From grid position (a, b), the mouse must move either to position (a + 1, b) or to position (a, b + 1).

(a) Find the total number of different paths from (0, 0) to (n, n).

(b) Find the total number of different paths from (1, 0) to (n + 1, n) that pass through at least one of the points (r, r), with 1 ≤ r ≤ n.

(a) Each path from (0, 0) to (n, n) can be specified by an ordered list of moves. There are always n moves in each of the two directions, for a total of 2n moves, so any such list has length 2n. The possible paths can be counted by counting the number of ways to label the 2n spots in the list with R and U moves, corresponding to incrementing the first and second coordinate, respectively. The answer is therefore \binom{2n}{n}.

The argument can be generalized to see that to get from any (a_0, b_0) to (a_1, b_1) (with a_0 < a_1 and b_0 < b_1) there are \binom{(a_1 - a_0) + (b_1 - b_0)}{a_1 - a_0} paths.

(b) We instead consider the unrestricted paths from (0, 1) to (n + 1, n). We can construct a bijective map from this new set of paths to the set of paths in the original problem statement, meaning that for each path in the new set there is exactly one path in the original set, and no path in the original set is unaccounted for. To construct the mapping, notice that any path from (0, 1) to (n + 1, n) must cross the diagonal, i.e., touch some (r, r) with 1 ≤ r ≤ n, at least once (though it may touch more than once). For any such path, consider the path segment from (0, 1) to the first point where the path touches the diagonal. Reflecting this segment across the diagonal (so that the point (0, 1) becomes (1, 0), and so on) generates a path from (1, 0) to (n + 1, n) that touches the diagonal at least at that point (r, r), and so it is one of the paths in the original set. This mapping is bijective, so the original set and the new set have the same cardinality (number of elements). Counting the paths in the new set with the generalization from part (a), there are \binom{2n}{n + 1} such paths.
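Both counts can be sanity-checked by brute-force enumeration for small n. The sketch below is my own illustration, not part of the original solutions (the name count_paths is hypothetical); it recursively walks every monotone path:

```python
from math import comb

def count_paths(a0, b0, a1, b1, must_touch=False):
    """Count monotone lattice paths from (a0, b0) to (a1, b1) that move
    only right or up; if must_touch is set, keep only paths that visit
    some diagonal point (r, r) with r >= 1."""
    def walk(a, b, touched):
        touched = touched or (a == b and a >= 1)  # diagonal visit at the current point
        if (a, b) == (a1, b1):
            return 1 if (touched or not must_touch) else 0
        total = 0
        if a < a1:
            total += walk(a + 1, b, touched)  # R move
        if b < b1:
            total += walk(a, b + 1, touched)  # U move
        return total
    return walk(a0, b0, False)

n = 4
assert count_paths(0, 0, n, n) == comb(2 * n, n)                           # part (a)
assert count_paths(1, 0, n + 1, n, must_touch=True) == comb(2 * n, n + 1)  # part (b)
```

The exhaustive recursion is exponential in n, so this is only a check for small cases, but it confirms both the direct count and the reflection argument.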

Problem 3.2
A lazy GSI returns exams to students randomly. If the class has n students, what is the probability that at least one student receives his/her own exam? Compute the limiting value of this probability as n → +∞.

We define the collection of events {A_i}, i = 1, 2, ..., n, with A_i = {student i gets his own test}. The probability in question is P(∪_{i=1}^n A_i), and since the probability law is uniform, we can equivalently count |∪_{i=1}^n A_i| and normalize by n! to get the desired probability. To count a union like the one above, we will use the inclusion-exclusion formula from p. 54 of Bertsekas and Tsitsiklis:

|∪_{i=1}^n A_i| = Σ_i |A_i| − Σ_{1 ≤ i < j ≤ n} |A_i ∩ A_j| + Σ_{1 ≤ i < j < k ≤ n} |A_i ∩ A_j ∩ A_k| − ⋯

Note that for any i we have |A_i| = (n − 1)!. The equality can be seen by fixing the i-th student to get his paper back and counting the permutations of the remaining n − 1 papers with respect to their corresponding students. There are n such i's in the first inclusion-exclusion summation. By the same reasoning, we have for any i < j: |A_i ∩ A_j| = (n − 2)!. In the inclusion-exclusion formula we only sum over i < j because otherwise we would be counting events such as A_1 ∩ A_1 and double-counting A_1 ∩ A_2 as both A_1 ∩ A_2 and A_2 ∩ A_1. There are \binom{n}{2} such events A_i ∩ A_j with i < j, one for each subset of {A_i} of size two. Therefore, in general we have, for intersections of k of our A_i events:

|A_{i_1} ∩ A_{i_2} ∩ ⋯ ∩ A_{i_k}| = (n − k)!,

and we sum over \binom{n}{k} such sets in the corresponding inclusion-exclusion term. Applying this counting to the terms in the inclusion-exclusion formula:

|∪_{i=1}^n A_i| = Σ_i |A_i| − Σ_{1 ≤ i < j ≤ n} |A_i ∩ A_j| + Σ_{1 ≤ i < j < k ≤ n} |A_i ∩ A_j ∩ A_k| − ⋯
  = n! − \binom{n}{2}(n − 2)! + \binom{n}{3}(n − 3)! − ⋯
  = n! (1 − 1/2! + 1/3! − ⋯)
  = n! Σ_{k=1}^n (−1)^{k−1}/k!
  = n! (1 − Σ_{k=0}^n (−1)^k/k!),

where the last equality holds by rewriting the summation to begin at k = 0. To calculate the desired probability, we normalize by n! to get 1 − Σ_{k=0}^n (−1)^k/k!. We recognize the summation in the final expression as the power series for e^{−1}, and so in the limiting case we have:

lim_{n→∞} [1 − Σ_{k=0}^n (−1)^k/k!] = 1 − 1/e.

Problem 3.3 (Fishing trip)
Koichi's backyard pond contains g goldfish and c catfish. All fish are equally likely to be caught.

(a) Suppose that Koichi catches a total of k fish (no fish are thrown back). What is the probability of catching x goldfish?

(b) Now suppose that all k fish are returned to the pond, he starts fishing again, and catches a total of m fish. What is the probability that among the caught set of m fish, exactly 2 goldfish are included that were also caught in the first catch? (Assume that fish do not learn from experience.)

(a) Since the experiment is under a uniform probability law, we can solve this problem by counting. We want to count the number of ways to catch x goldfish in a total of k fish caught. We decompose the enumeration of the set into two decisions using the counting principle: we first select x goldfish and then select k − x catfish, completely counting the sets of size k containing exactly x goldfish. The number of ways to select x goldfish in the first step is \binom{g}{x}, and the number of ways to select k − x catfish is \binom{c}{k − x},

and so there are \binom{g}{x} \binom{c}{k − x} ways of selecting a catch of k fish with exactly x goldfish. We normalize by the total number of ways to catch k fish, yielding:

\binom{g}{x} \binom{c}{k − x} / \binom{g + c}{k}.

(b) Again we use counting to solve for the probability. We decompose the counting problem into two steps: we first select the two goldfish to be caught again from the x goldfish in our original catch, and then we select the m − 2 other fish from the g + c − x fish that were not goldfish caught the first time. Therefore there are \binom{x}{2} \binom{g + c − x}{m − 2} such catches, yielding a probability of:

\binom{x}{2} \binom{g + c − x}{m − 2} / \binom{g + c}{m}.

Problem 3.4
In each running of a lottery, a sequence of r numbers is drawn independently (with replacement) from a total of n numbers. You win the lottery if the r-number sequence on your ticket matches the drawn sequence.

(a) Suppose I buy one (randomly chosen) lottery ticket. What is the probability of winning the lottery?

(b) Suppose that I hack into the lottery computer and program it to draw only sequences with the r numbers in non-decreasing order. I then go and buy a ticket with its numbers in non-decreasing order. Now what is my probability of winning the lottery?

(a) The probability of choosing any given number correctly is 1/n, and since the drawing is with replacement, getting any number in the sequence correct is independent of getting the other numbers correct. Multiplying out the probabilities by independence, the probability is 1/n^r.

(b) We can rephrase the problem in terms of placing r indistinguishable balls in n bins, with each ball corresponding to picking a number once (we can place zero or more balls in each bin) and each bin corresponding to one of the possible numbers to select. Since we know the sequence is in non-decreasing order, there exists a bijective map between the r-balls-in-n-bins description and the non-decreasing sequence description.
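This bijection can be checked directly for small n and r: Python's itertools.combinations_with_replacement yields each multiset of numbers exactly once, as a sorted tuple, i.e., exactly one non-decreasing sequence per multiset. (This check is my own sketch, not part of the original solutions.)

```python
from itertools import combinations_with_replacement
from math import comb

n, r = 5, 3  # small example: numbers 1..n, sequences of length r

# Every non-decreasing length-r sequence over {1, ..., n} appears exactly once.
nondecreasing = list(combinations_with_replacement(range(1, n + 1), r))

# Stars-and-bars: the count matches C(n + r - 1, r) = C(n + r - 1, n - 1).
assert len(nondecreasing) == comb(n + r - 1, r) == comb(n + r - 1, n - 1)
```

Under a uniform draw over these sequences, each ticket therefore wins with the reciprocal of this count, as computed next.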
(Note that if the sequence had to be in strictly increasing order, we could not pick the same number twice or, correspondingly, place more than one ball in a bin. Also, if the sequence were not sorted, we could not draw a bijection to the balls-in-bins scenario, since we would need some way to distinguish the order of the selected numbers.)

Thus we have a collection of integers x_i, i ∈ {1, 2, ..., n}, with each x_i ≥ 0 indicating the number of balls placed in bin i. (Though the problem only tells us there are n distinct numbers to choose from, we make this labeling without loss of generality, because we can just choose some way to match the n numbers in the lottery to 1, 2, ..., n.) We also know that since we place r balls in total, Σ_{i=1}^n x_i = r. Thus we can use the provided formula to find that the number of ways to choose such non-decreasing sequences of length r from n numbers with replacement is \binom{n + r − 1}{n − 1} = \binom{n + r − 1}{r}. To win the lottery, we must choose the correct one of the \binom{n + r − 1}{n − 1} equally likely sequences, and so the probability is:

1 / \binom{n + r − 1}{n − 1}.

Problem 3.5
An urn contains a aquamarine balls and b blue balls. The balls are removed successively without replacement until the urn is empty. If b > a, show that the probability that at all stages until the urn is empty there are more blue than aquamarine balls in the urn is equal to (b − a)/(b + a).

Proof by induction on a + b.

Base case: let a + b = 1. From b > a ≥ 0, we know b = 1 and a = 0: there is one blue ball and no aquamarine ball, so the result holds trivially.

For the inductive hypothesis, we assume the result is correct whenever a + b = k. The inductive step is to show that it still holds for a + b = k + 1. Let P(i, j) denote the probability that an urn with i aquamarine balls and j blue balls has more blue than aquamarine balls at every stage of a sequence of removals until the urn is empty.

Case 1: b > a + 1. After one ball, either aquamarine or blue, is removed, the urn still has more blue than aquamarine balls, so

P(a, b) = P(one aquamarine ball is removed) P(a − 1, b) + P(one blue ball is removed) P(a, b − 1)
  = [a/(a + b)] · [(b − a + 1)/(b + a − 1)] + [b/(a + b)] · [(b − 1 − a)/(b + a − 1)]
  = (b − a)/(b + a).

Case 2: b = a + 1. If a blue ball is removed, the urn will have the same number of blue and aquamarine balls; this is not the event we are looking for. So

P(a, b) = P(one aquamarine ball is removed) P(a − 1, b)
  = [a/(a + b)] · [(b − a + 1)/(b + a − 1)]
  = [a/(2a + 1)] · [2/(2a)]
  = 1/(2a + 1)
  = (b − a)/(b + a).
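Since removing the balls successively without replacement makes all \binom{a+b}{a} removal orders equally likely, the closed form (b − a)/(b + a) can also be verified exhaustively for small urns. This check is my own sketch (the helper name blue_leads_probability is hypothetical):

```python
from fractions import Fraction
from itertools import combinations
from math import comb

def blue_leads_probability(a, b):
    """Exact probability that, after every removal while the urn is still
    nonempty, strictly more blue than aquamarine balls remain.
    Enumerates all C(a + b, a) equally likely removal orders."""
    total = comb(a + b, a)
    good = 0
    for aqua_positions in combinations(range(a + b), a):
        aqua_left, blue_left = a, b
        ok = True
        for step in range(a + b - 1):  # check after each removal except the last
            if step in aqua_positions:
                aqua_left -= 1
            else:
                blue_left -= 1
            if blue_left <= aqua_left:
                ok = False
                break
        good += ok
    return Fraction(good, total)

assert blue_leads_probability(1, 2) == Fraction(1, 3)   # (b - a)/(b + a) = 1/3
assert blue_leads_probability(2, 5) == Fraction(3, 7)   # (b - a)/(b + a) = 3/7
```

The enumeration grows combinatorially, so this is only a small-case check, but it exercises both the b > a + 1 and b = a + 1 branches of the induction.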

Cases 1 and 2 together complete the induction, concluding the proof.

Problem 3.6
A communication system transmits one of three signals, s_1, s_2, and s_3, with equal probabilities. The transmission is corrupted by noise, causing the received signal to be changed according to the following table of conditional probabilities P(s_j received | s_i sent):

                         Receive, j
                     s_1      s_2      s_3
  Send, i   s_1      0.3      0.4      0.3
            s_2      0.002    0.99     0.008
            s_3      0.8      0.15     0.05

For example, if s_1 is sent, the probability of receiving s_3 is 0.3. The entries of the table list the probability that s_j is received given that s_i is sent, i.e., P(s_j received | s_i sent).

(a) Compute the (unconditional) probability that s_j is received, for j = 1, 2, 3.

(b) Compute the probabilities P(s_i sent | s_j received) for i, j = 1, 2, 3.

(c) As seen from the numbers above, the transmitted signal can be very different from the received signal. Thus, when symbol s_j is received, it may not be a good idea to conclude that s_j was sent. We need a decision scheme that decides which signal was sent based on the signal we receive, with the lowest possible error probability, that is, the probability of making a wrong decision. Find the scheme that minimizes the overall error probability.

The problem statement provides P(s_i sent) = 1/3 as well as the table listing the conditional probabilities P(s_j received | s_i sent).

(a) Via the Total Probability Theorem, we can calculate the unconditional probability that s_j is received, for each j = 1, 2, 3:

P(s_j received) = Σ_i P(s_j received | s_i sent) P(s_i sent) = 0.3673 for j = 1, 0.5133 for j = 2, 0.1193 for j = 3.

(b) For each pair (i, j), Bayes' Rule yields

P(s_i sent | s_j received) = P(s_j received | s_i sent) P(s_i sent) / P(s_j received),

where the denominator for each j has already been computed in part (a). Performing this calculation for each pair (i, j), the desired inverse conditional probabilities can be arranged in a table similar to the one given in the problem statement:

                    P(s_i sent | s_j received)
                         Sent, i
                      s_1      s_2      s_3
  Received, j  s_1    0.2722   0.0018   0.7260
               s_2    0.2597   0.6429   0.0974
               s_3    0.8380   0.0223   0.1397

Note that we have rounded all calculations to the fourth decimal place, also ensuring that the conditional distribution P(s_i sent | s_j received) sums to unity for each j = 1, 2, 3.

(c) Let a_j ∈ {s_1, s_2, s_3} be our decision for the signal sent when we receive s_j. Then

P(error) = Σ_{j=1}^3 P(wrong decision | s_j received) P(s_j received)
  = Σ_j [1 − P(a_j sent | s_j received)] P(s_j received)
  = 1 − Σ_j P(a_j sent | s_j received) P(s_j received).

To minimize P(error), we need to choose each a_j to maximize the sum, in other words, to maximize P(a_j sent | s_j received) for every j. From part (b): a_1 = s_3, a_2 = s_2, a_3 = s_1.

Problem 3.7
Evelyn has n coins in her left pocket and n coins in her right pocket. Every minute, she takes a coin out of a pocket chosen at random (and independently of previous draws), until the first time one of her pockets is empty. For an integer t ∈ [1, n], compute the probability of having t coins in the other pocket when she stops.

Let (i, j) denote the status in which Evelyn has i coins in her left pocket and j coins in the right. Initially her status is (n, n), equivalently n coins in the left pocket and n coins in the right. After the first draw, her status is (n − 1, n) if a coin is taken from her left pocket, or (n, n − 1) otherwise. Let N(i, j) be the number of all possible draw sequences that reach (i, j) from (n, n). From Problem 3.1(a), we know

N(i, j) = N(j, i) = \binom{2n − i − j}{n − i}.

Case 1: 2 ≤ t ≤ n. One step before a pocket becomes empty for the first time, her status must be (1, t) or (t, 1), and she must take one coin from the left or right pocket, respectively. The set of sequences reaching (1, t) and the set of sequences reaching (t, 1) are disjoint when t ≥ 2. Once the status is at (0, t) or (t, 0), there is only one way to get to (0, 0). Thus, the probability that one pocket becomes empty for the first time while the other holds t

coins is equal to

[N(1, t) + N(t, 1)] / N(0, 0) = 2 \binom{2n − 1 − t}{n − 1} / \binom{2n}{n} = [n(n − 1) ⋯ (n − t + 1)] / [(2n − 1)(2n − 2) ⋯ (2n − t)].

Case 2: t = 1. One step before one of the pockets becomes empty for the first time, her status must be (1, 1). There are two paths from (1, 1) to (0, 0). Thus, the probability is:

2 N(1, 1) / N(0, 0) = n / (2n − 1).

Note: although Case 2 gives the same value as Case 1 evaluated at t = 1, it must be treated separately to avoid duplicated counting: for t = 1 the states (1, t) and (t, 1) coincide, and the factor of 2 instead comes from the two possible final draws from (1, 1).
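The counting above treats all \binom{2n}{n} complete draw sequences from (n, n) to (0, 0) as equally likely (that is the model behind normalizing by N(0, 0)). Under that model, the closed form can be checked exhaustively for small n. This is my own sketch; the helper names stop_distribution and closed_form are hypothetical:

```python
from fractions import Fraction
from itertools import combinations
from math import comb

def stop_distribution(n):
    """Distribution of the number of coins left in the other pocket when one
    pocket first empties, with all C(2n, n) complete draw orders treated as
    equally likely (the uniform-orderings model used in the counting above)."""
    total = comb(2 * n, n)
    dist = {}
    for left_positions in combinations(range(2 * n), n):
        left = right = n
        for step in range(2 * n):
            if step in left_positions:
                left -= 1
            else:
                right -= 1
            if left == 0 or right == 0:
                t = left + right  # coins remaining in the non-empty pocket
                dist[t] = dist.get(t, 0) + 1
                break
    return {t: Fraction(c, total) for t, c in dist.items()}

def closed_form(n, t):
    """n(n-1)...(n-t+1) / ((2n-1)(2n-2)...(2n-t)), as derived above."""
    p = Fraction(1)
    for i in range(t):
        p *= Fraction(n - i, 2 * n - 1 - i)
    return p

dist = stop_distribution(3)
assert all(dist[t] == closed_form(3, t) for t in range(1, 4))
```

For n = 3 this gives probabilities 3/5, 3/10, and 1/10 for t = 1, 2, 3, which sum to one as expected.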