Expectation, inequalities and laws of large numbers


Chapter 3

Expectation, inequalities and laws of large numbers

3.1 Expectation and Variance

Indicator random variable

Let A be an event in the sample space S; together with its complement it partitions S, i.e. A ∪ A^c = S. The indicator of the event A is the (indicator) random variable I_A defined by

I_A(s) = 1 if s ∈ A,    I_A(s) = 0 if s ∉ A.

The event A occurs if and only if I_A = 1. The probability distribution is

p_{I_A}(0) = 1 − P(A),    p_{I_A}(1) = P(A).

The corresponding distribution function reads

F_{I_A}(x) = 0 for x < 0,    F_{I_A}(x) = 1 − P(A) for 0 ≤ x < 1,    F_{I_A}(x) = 1 for x ≥ 1.

Exercise 3.1 (Indicator random variable) Consider a probability space over a set S. Show that for every event A ⊆ S and its indicator I_A it holds that E(I_A) = P(A). (The indicator is defined as I_A(w) = 1 for all w ∈ A and I_A(w) = 0 for all w ∉ A.)

Solution of Exercise 3.1:

E(I_A) = Σ_{w ∈ S} P(w) I_A(w) = Σ_{w ∈ A} P(w)·1 + Σ_{w ∈ S∖A} P(w)·0 = Σ_{w ∈ A} P({w}) + 0 = P(A).
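The identity E(I_A) = P(A) is easy to verify numerically. A minimal sketch (the sample space, event, and helper names below are my own choices for illustration, not from the text):

```python
import random

random.seed(0)

# Small sample space S = {0,...,5} with uniform probability, event A = {0, 1},
# so P(A) = 2/6 = 1/3.
S = range(6)
A = {0, 1}

def indicator(w):
    # I_A(w) = 1 if w in A, 0 otherwise
    return 1 if w in A else 0

# Exact expectation: sum over the sample space of P(w) * I_A(w)
exact = sum((1 / len(S)) * indicator(w) for w in S)

# Empirical mean of I_A over many draws should approach P(A)
n = 100_000
empirical = sum(indicator(random.choice(S)) for _ in range(n)) / n

print(exact)      # ~0.3333
print(empirical)  # close to 1/3
```

Both the exact sum and the empirical average of the indicator recover P(A), which is exactly the content of Exercise 3.1.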

Exercise 3.2 Consider two discrete random variables X, Y such that X(w) ≤ Y(w) for every w ∈ S. Prove that E(X) ≤ E(Y).

Solution of Exercise 3.2:

E(Y) − E(X) = Σ_{w ∈ S} P(w)Y(w) − Σ_{w ∈ S} P(w)X(w) = Σ_{w ∈ S} P(w)[Y(w) − X(w)] ≥ Σ_{w ∈ S} P(w)·0 = 0.

Exercise 3.3 Suppose that after a long night n drunken sailors return to the ship and they sequentially, independently at random enter one of r cabins and fall asleep. Assuming they select each cabin with uniform probability (and ignore any other sailors already present inside), what is the expected number of empty cabins?

Solution of Exercise 3.3: Let X_i = 1 if cabin i is empty and let X_i = 0 otherwise. The number of empty cabins is X = Σ_{i=1}^r X_i and we want to calculate E(X). Using the fact that the expectation of a sum is the sum of expectations (even for dependent random variables), we have

E(X) = E(Σ_{i=1}^r X_i) = Σ_{i=1}^r E(X_i).

We have to compute E(X_i). The probability that the i-th cabin is empty is (1 − 1/r)^n (this is the binomial probability of 0 successes in n trials with success probability 1/r). We have E(X_i) = (1 − 1/r)^n and

E(X) = r (1 − 1/r)^n.

Exercise 3.4 Let X be uniformly distributed on {0, 1, ..., n}. Find the mean and variance of X.

Solution of Exercise 3.4:

1. Mean:
E(X) = Σ_{i=0}^n i · 1/(n+1) = 1/(n+1) · n(n+1)/2 = n/2.

2. Variance:
Var(X) = E(X²) − (E(X))² = Σ_{i=0}^n i² · 1/(n+1) − (n/2)²
= 1/(n+1) · n(n+1)(2n+1)/6 − n²/4
= n(2n+1)/6 − n²/4
= (4n² + 2n − 3n²)/12
= n(n+2)/12.

Exercise 3.5 Having two dice, let the random variable X be the outcome of the first die and Y be the maximum of their outcomes. Compute E(X), E(Y), Var(X), Cov(X, Y) and the joint distribution of X and Y.

Solution of Exercise 3.5:

E(X) = 1/6 Σ_{i=1}^6 i = 7/2.
E(Y) = 1/36 Σ_{i=1}^6 i (i + (i−1)) = 1/36 Σ_{i=1}^6 i(2i−1) = 161/36.
E(X²) = 1/6 Σ_{i=1}^6 i² = 91/6.
Var(X) = E(X²) − E(X)² = 91/6 − (7/2)² = 35/12.

We can use the equation

P(X = i ∧ Y = j) = P(Y = j | X = i) P(X = i).

For the probability P(X = i), we have P(X = i) = 1/6 for each i = 1, ..., 6. The conditional probability is given by

P(Y = j | X = i) = 0 for j < i,    i/6 for j = i,    1/6 for j > i.

Thus we get the joint distribution:

Y \ X :   1      2      3      4      5      6
  1   :  1/36    0      0      0      0      0
  2   :  1/36   2/36    0      0      0      0
  3   :  1/36   1/36   3/36    0      0      0
  4   :  1/36   1/36   1/36   4/36    0      0
  5   :  1/36   1/36   1/36   1/36   5/36    0
  6   :  1/36   1/36   1/36   1/36   1/36   6/36

E(XY) = Σ_{i=1}^6 Σ_{j=1}^6 i j P(X = i, Y = j) = 1 Σ_j j P(X = 1, Y = j) + ... + 6 Σ_j j P(X = 6, Y = j) = 616/36 = 154/9.

Cov(X, Y) = E(XY) − E(X)E(Y) = 154/9 − (7/2)(161/36) = 1232/72 − 1127/72 = 35/24.

Exercise 3.6 Suppose a box contains 3 balls labeled 1, 2, 3. Two balls are selected without replacement from the box. Let X be the number on the first ball and let Y be the number on the second ball. Compute Cov(X, Y) and ϱ(X, Y).

Solution of Exercise 3.6: We have

Cov(X, Y) = E((X − E(X))(Y − E(Y))) = Σ_{i,j} p_{x_i, y_j} (x_i − E(X))(y_j − E(Y)).

Note that X and Y have identical marginal distributions. Since E(X) = E(Y) = 2, and p_{x_i, y_j} = 1/6 for all ordered pairs i ≠ j, we obtain

Σ_{i,j} p_{x_i, y_j} (x_i − E(X))(y_j − E(Y))
= 1/6 ( (1−2)(2−2) + (1−2)(3−2) + (2−2)(1−2) + (2−2)(3−2) + (3−2)(1−2) + (3−2)(2−2) )
= 1/6 ( (−1) + (−1) ) = −1/3.

Also, we have

Var(Y) = Var(X) = E(X²) − (E(X))² = 1/3 (1 + 4 + 9) − (2)²

= 14/3 − 4 = 2/3, and thus

ϱ(X, Y) = Cov(X, Y) / (σ_X σ_Y) = (−1/3) / (2/3) = −1/2.

Exercise 3.7 Suppose X and Y are two independent random variables such that E(X⁴) = 2, E(Y²) = 1, E(X²) = 1 and E(Y) = 0. Compute Var(X²Y).

Solution of Exercise 3.7:

Var(X²Y) = E((X²Y − E(X²Y))²) = E((X²Y − E(X²)E(Y))²) = E((X²Y − 0)²) = E(X⁴Y²) = E(X⁴)E(Y²) = 2.

Exercise 3.8 A p-random graph on v vertices is an unoriented graph where between every two distinct vertices i < j there is an edge with probability p. Compute the expected value and the variance of the number of all edges in the graph.

Solution of Exercise 3.8: Denote by X the number of all edges and by X_{i,j} the indicator of the event that there is an edge between i and j. Observe that X = Σ_{i<j} X_{i,j}. Then

E(X) = Σ_{i<j} E(X_{i,j}) = C(v,2) p.

Since all X_{i,j} are mutually independent, we know that

Var(X) = Σ_{i<j} Var(X_{i,j}) = C(v,2) (p − p²).

In fact, we have just calculated the expectation and variance of the binomial distribution, if you substitute n = C(v,2).

Exercise 3.9 Let X have the binomial distribution with parameters n and p. Find E(X).

Solution of Exercise 3.9: Here we provide a direct calculation of the expectation, in contrast to Exercise 3.8, where we used the linearity of expectation. We have p_X(k) = P(X = k) = C(n,k) p^k (1−p)^{n−k} and E(X) = Σ_i x_i f(x_i). The binomial distribution assigns positive probabilities to 0, 1, ..., n. Thus,

E(X) = Σ_{i=0}^n i C(n,i) p^i (1−p)^{n−i}.

To calculate this quantity we observe that

i C(n,i) = i · n! / (i!(n−i)!) = n · (n−1)! / ((i−1)! ((n−1)−(i−1))!) = n C(n−1, i−1).

Thus

E(X) = n Σ_{i=1}^n C(n−1, i−1) p^i (1−p)^{n−i}.

Making the substitution i = l + 1 we see that

E(X) = np Σ_{l=0}^{n−1} C(n−1, l) p^l (1−p)^{n−1−l}.

By the binomial theorem

Σ_{l=0}^{n−1} C(n−1, l) p^l (1−p)^{n−1−l} = (p + (1−p))^{n−1} = 1,

so we see that E(X) = np.

Exercise 3.10 Find Var(X) for X from the previous example.

Solution of Exercise 3.10: Here we provide a direct calculation of the variance, in contrast to Exercise 3.8, where we used the linearity of expectation. We have

Var(X) = E(X²) − (E(X))².

We use again the identity k C(n,k) = n C(n−1, k−1) and get

E(X²) = Σ_{k=0}^n k² C(n,k) p^k (1−p)^{n−k} = Σ_{k=1}^n k · n C(n−1, k−1) p^k (1−p)^{n−k}.

We put m = n − 1 and s = k − 1 and obtain

E(X²) = np Σ_{s=0}^m (s+1) C(m,s) p^s (1−p)^{m−s} = np ( Σ_{s=0}^m s C(m,s) p^s (1−p)^{m−s} + Σ_{s=0}^m C(m,s) p^s (1−p)^{m−s} ).

The first sum is equal to mp (see Exercise 3.9), the second is equal to 1 by the binomial theorem. We obtain

E(X²) = np(mp + 1) = np((n−1)p + 1) = np(np − p + 1),
Var(X) = E(X²) − (E(X))² = np(np − p + 1) − (np)² = np(1 − p).

Exercise 3.11 Consider a group of n people. A special day is a day such that exactly k people in the group have a birthday. What is the expected number of special days in a year? (Assume all years are non-leap.)

Solution of Exercise 3.11: We will define a family of random variables X_1, X_2, ..., X_365 as follows:

X_i = 1 if exactly k people have their birthday on the i-th day of the year, and X_i = 0 otherwise.

In other words, X_i = 1 if and only if i is a special day. Let us compute the probability that X_i = 1:

P(X_i = 1) = C(n,k) (1/365)^k (364/365)^{n−k}.

Note that although all 365 random variables have the same probability distribution, they still denote different random variables. In addition, these random variables are not independent! Let us now define another variable X = Σ_{i=1}^{365} X_i. It can be seen that X = m if there are m special days in a year. Since we are interested in the expected number of special days, we will calculate E(X):

E(X) = E(Σ_{i=1}^{365} X_i) = Σ_{i=1}^{365} E(X_i) = Σ_{i=1}^{365} P(X_i = 1) = 365 C(n,k) (1/365)^k (364/365)^{n−k}.

Exercise 3.12 Consider the same group of n people. What is the expected number of days such that at least two people have a birthday? How large should n be to make this expectation exceed 1?

Solution of Exercise 3.12: Note that this exercise is almost the same as the previous one, only the definition of a special day has changed. The solution can

therefore be found in the same manner as above. We will once again define a family of random variables X_1, X_2, ..., X_365 as follows:

X_i = 1 if at least two people have their birthday on the i-th day of the year, and X_i = 0 otherwise.

Now to calculate the probability distribution of X_i and the expectation of Σ_{i=1}^{365} X_i. The i-th day sees zero or exactly one birthday with probability

P(X_i = 0) = (364/365)^n + n (1/365) (364/365)^{n−1},
P(X_i = 1) = 1 − P(X_i = 0) = 1 − (364/365)^n − (n/365)(364/365)^{n−1},

E(Σ_{i=1}^{365} X_i) = Σ_{i=1}^{365} E(X_i) = Σ_{i=1}^{365} P(X_i = 1) = 365 ( 1 − (364/365)^n − (n/365)(364/365)^{n−1} ).

Exercise 3.13 Let X have a geometric distribution with parameter p. Find E(X).

Solution of Exercise 3.13: The expectation of the geometric distribution is

E(X) = Σ_{j=1}^∞ j p (1−p)^{j−1} = p Σ_{j=0}^∞ j (1−p)^{j−1} = −p Σ_{j=0}^∞ d/dp (1−p)^j.

Since a power series can be differentiated term by term, it follows that

E(X) = −p d/dp Σ_{j=0}^∞ (1−p)^j.

Using the formula for the sum of a geometric progression, we can see that

E(X) = −p d/dp (1/p) = −p (−1/p²) = 1/p.

Exercise 3.14

Let the random variable X be representable as a sum of random variables X = Σ_{i=1}^n X_i. Show that, if E[X_i X_j] = E[X_i]E[X_j] for every pair i, j with 1 ≤ i < j ≤ n, then Var[X] = Σ_{i=1}^n Var[X_i].

Solution of Exercise 3.14: First we have

Var(X) = E(X²) − (E(X))² = E((Σ_i X_i)²) − (E(Σ_i X_i))² = E(Σ_i Σ_j X_i X_j) − (Σ_i E(X_i))².

By the linearity of expectation we have

Var(Σ_i X_i) = Σ_i Σ_j E(X_i X_j) − (Σ_i E(X_i))².    (3.1)

Using the assumption, the cross terms E(X_i X_j) = E(X_i)E(X_j) for i ≠ j cancel against those of (Σ_i E(X_i))², and we get

Var(Σ_i X_i) = Σ_i ( E(X_i²) − (E(X_i))² )    (3.2)
= Σ_i Var(X_i).    (3.3)

3.2 Markov and Chebyshev inequality; Chernoff bounds

Exercise 3.15 Suppose we flip a fair coin n times to obtain n random bits. Consider all m = C(n,2) pairs of these bits in some order. Let Y_i be the exclusive or of the i-th pair of bits, and let Y = Σ_{i=1}^m Y_i be the number of Y_i that equal 1.

1. Show that each Y_i is 0 with probability 1/2 and 1 with probability 1/2.
2. Show that the Y_i are not mutually independent.
3. Show that the Y_i satisfy the property E[Y_i Y_j] = E[Y_i]E[Y_j].
4. Using the previous exercise, find Var[Y].
5. Using Chebyshev's inequality, prove a bound on Pr[|Y − E(Y)| ≥ n].

Solution of Exercise 3.15:

1. Straightforward.

2. Consider n = 3 and the probability Pr(Y_1 = 1, Y_2 = 1, Y_3 = 1). If the Y_i were mutually independent, Pr(Y_1 = 1, Y_2 = 1, Y_3 = 1) would be equal to 1/8. In fact it is equal to zero (consider all 3-bit strings to see this).

3. First we will show that the Y_i are pairwise independent. Consider two possibilities:
(a) Y_i and Y_j do not share a bit position. They are obviously independent.
(b) Y_i and Y_j share a bit position. Consider all 3-bit strings to confirm the independence.

4. By pairwise independence,

Var[Y] = Σ_{i=1}^m Var(Y_i) = m ( E(Y_i²) − (E(Y_i))² ) = m ( 1/2 (0 + 1) − (1/2 (0 + 1))² ) = m (1/2 − 1/4) = m/4.

5. Chebyshev has the form

Pr[|Y − E(Y)| ≥ n] ≤ Var(Y)/n² = (m/4)/n² = n(n−1)/(8n²) ≤ 1/8.

Exercise 3.16 Suppose that Y has the geometric distribution with parameter p = 3/4. Compute the exact value and the Chebyshev bound for the probability that Y is at least 2 standard deviations away from the mean. Note: if Y has the geometric distribution, E(Y) = 1/p and Var(Y) = (1−p)/p².

Solution of Exercise 3.16: E(Y) = 4/3 and Var(Y) = 4/9, so σ(Y) = 2/3. Since E(Y) − 2σ = 0 and E(Y) + 2σ = 8/3, we need to compute

Pr[Y ≤ 0] + Pr[Y ≥ 3] = 0 + Pr[Y ≥ 3] = 1 − F_Y(2) = 1 − (1 − (1/4)²) = 1/16.

By the Chebyshev inequality,

Pr[|Y − E(Y)| ≥ 4/3] ≤ (4/9)/(4/3)² = (4/9)/(16/9) = 1/4.
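The two numbers in Exercise 3.16 — the exact tail probability 1/16 and the Chebyshev bound 1/4 — can be reproduced in a few lines. A small sketch (the variable names are mine, not from the text):

```python
# Exercise 3.16: Y geometric with p = 3/4 on {1, 2, 3, ...},
# E(Y) = 1/p = 4/3, Var(Y) = (1 - p)/p^2 = 4/9, sigma = 2/3.
p = 3 / 4
mean = 1 / p
var = (1 - p) / p**2
sigma = var ** 0.5

# Exact: P(|Y - 4/3| >= 2*sigma) = P(Y <= 0) + P(Y >= 3) = 0 + (1 - p)^2,
# since P(Y >= k) = (1 - p)^(k - 1) for the geometric distribution.
exact = (1 - p) ** 2

# Chebyshev: P(|Y - E(Y)| >= 2*sigma) <= Var(Y) / (2*sigma)^2
chebyshev = var / (2 * sigma) ** 2

print(exact)      # 0.0625, i.e. 1/16
print(chebyshev)  # 0.25 (up to float rounding), i.e. 1/4
```

The factor-of-four gap illustrates how loose Chebyshev can be when the exact distribution is known.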

Exercise 3.17 Alice and Bob play checkers often. Alice is a better player, so the probability that she wins any given game is 0.6, independently of all other games. They decide to play a tournament of n games. Bound the probability that Alice loses the tournament using a Chernoff bound.

Solution of Exercise 3.17: Alice loses the tournament if she wins fewer than half of the games, i.e. we want the probability P(X ≤ (n−1)/2), where X is the number of games Alice wins. We will use the Chernoff bound

Pr(X ≤ (1−δ)μ) ≤ e^{−μδ²/2}.

To bound our probability from above, we have to solve (for μ = 3n/5) the inequality

(1−δ)μ ≥ (n−1)/2,    i.e.    (1−δ)(3n/5) ≥ (n−1)/2,    i.e.    δ ≤ 1/6 + 5/(6n),

which gives e.g. δ = 1/3 for n = 5. With this choice we get

Pr(X ≤ (n−1)/2) ≤ Pr(X ≤ (1−δ)μ) ≤ e^{−μδ²/2} = e^{−(3n/5)(1/9)/2} = e^{−n/30}.

In fact, δ = 1/6 satisfies the constraint for every n, which yields the bound e^{−(3n/5)(1/36)/2} = e^{−n/120} valid for all n.

Exercise 3.18 We have a standard six-sided die. Let X be the number of times that a 6 occurs over n throws of the die. Let p be the probability of the event X ≥ n/4. Compare the best upper bounds on p that you can obtain using Markov's inequality, Chebyshev's inequality and the Chernoff bounds.

Solution of Exercise 3.18: The Chernoff bound states that

p = Pr(X ≥ n/4) = Pr(X ≥ (1+δ)μ) ≤ e^{−μδ²/3}.

Observing that E(X) = μ = n/6, we solve (1+δ)μ = n/4 to get δ = 1/2. Consequently, we can bound p as

p ≤ e^{−(n/6)(1/4)/3} = e^{−n/72}.

Markov's inequality states that

P(X ≥ t) ≤ E(X)/t,

where E(X) = E(Σ X_i) = n/6. We can deduce that

p = P(X ≥ n/4) ≤ (n/6)/(n/4) = 2/3.

Chebyshev's inequality states that

P(|X − E(X)| ≥ t) ≤ Var(X)/t².

Noting that Var(X) = nq(1−q) = 5n/36 for q = 1/6, and that X ≥ n/4 implies |X − E(X)| ≥ n/4 − n/6 = n/12, we have

P(X ≥ n/4) ≤ P(|X − E(X)| ≥ n/12) ≤ (5n/36)/(n/12)² = 20/n.

Exercise 3.19 We plan to conduct an opinion poll to find out the percentage of people in a community who want its president impeached. Assume that every person answers either yes or no. If the actual fraction of people who want the president impeached is p, we want to find an estimate X of p such that

Pr(|X − p| ≤ εp) > 1 − δ

for given ε and δ, with ε > 0 and δ < 1.

Solution of Exercise 3.19: Let X be the fraction of yes answers among n sampled people. We want Pr(|X − p| ≤ εp) > 1 − δ. This is equivalent to Pr(|nX − np| ≤ εpn) > 1 − δ. This holds if (but is not necessarily equivalent to) Pr(|nX − np| < εpn) > 1 − δ. Finally, we can replace this by Pr(|nX − np| ≥ εpn) ≤ δ. It remains to apply a suitable Chernoff bound,

Pr(|Y − μ| ≥ εμ) ≤ 2 e^{−με²/3}.

For μ = np we solve 2 e^{−npε²/3} ≤ δ to get

n ≥ (3/(pε²)) ln(2/δ).

Exercise 3.20 Consider a collection X_1, ..., X_n of n independent integers chosen uniformly from the set {0, 1, 2}. Let X = Σ_{i=1}^n X_i and 0 < δ < 1. Derive a Chernoff bound for Pr(X ≥ (1+δ)n) and Pr(X ≤ (1−δ)n).

Solution of Exercise 3.20: We work with the centered variables X_i − 1, which are uniform on {−1, 0, 1}. Using 1 + x ≤ e^x,

M_{X_i − 1}(t) = E(e^{t(X_i − 1)}) = 1/3 (e^t + e^{−t} + 1) ≤ e^{(e^t + e^{−t} − 2)/3}.

Then

M_{X − n}(t) = Π_{i=1}^n M_{X_i − 1}(t) ≤ e^{n(e^t + e^{−t} − 2)/3}.

Setting t = ln(1 + δ) we get

Pr(X ≥ (1 + δ)n) < ...
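The solution above breaks off, but the bound it is heading toward can be checked numerically. As a purely illustrative sketch (my own, not part of the original text): evaluate the Markov-on-e^{tX} bound exp(n((e^t + e^{−t} − 2)/3 − δt)) at t = ln(1+δ) and compare it with the empirical tail of X.

```python
import random
from math import exp, log

random.seed(1)

# Exercise 3.20 empirically: X = X_1 + ... + X_n, X_i uniform on {0, 1, 2},
# so E(X) = n.  From M_{X-n}(t) <= exp(n (e^t + e^-t - 2)/3), Markov's
# inequality applied to e^{t(X-n)} gives, at t = ln(1 + delta):
#   Pr(X >= (1 + delta) n) <= exp(n ((e^t + e^-t - 2)/3 - delta * t))
def chernoff_bound(n, delta):
    t = log(1 + delta)
    return exp(n * ((exp(t) + exp(-t) - 2) / 3 - delta * t))

n, delta, trials = 100, 0.2, 10_000
threshold = (1 + delta) * n

hits = sum(
    sum(random.choice((0, 1, 2)) for _ in range(n)) >= threshold
    for _ in range(trials)
)
empirical = hits / trials

print(empirical <= chernoff_bound(n, delta))  # the bound holds: True
```

For n = 100 and δ = 0.2 the bound evaluates to roughly 0.08, comfortably above the simulated tail frequency, as a valid Chernoff bound must be.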


More information

Math 493 Final Exam December 01

Math 493 Final Exam December 01 Math 493 Final Exam December 01 NAME: ID NUMBER: Return your blue book to my office or the Math Department office by Noon on Tuesday 11 th. On all parts after the first show enough work in your exam booklet

More information

Twelfth Problem Assignment

Twelfth Problem Assignment EECS 401 Not Graded PROBLEM 1 Let X 1, X 2,... be a sequence of independent random variables that are uniformly distributed between 0 and 1. Consider a sequence defined by (a) Y n = max(x 1, X 2,..., X

More information

Discrete Mathematics and Probability Theory Fall 2013 Vazirani Note 12. Random Variables: Distribution and Expectation

Discrete Mathematics and Probability Theory Fall 2013 Vazirani Note 12. Random Variables: Distribution and Expectation CS 70 Discrete Mathematics and Probability Theory Fall 203 Vazirani Note 2 Random Variables: Distribution and Expectation We will now return once again to the question of how many heads in a typical sequence

More information

Conditional Probability and Bayes

Conditional Probability and Bayes Conditional Probability and Bayes Chapter 2 Lecture 5 Yiren Ding Shanghai Qibao Dwight High School March 9, 2016 Yiren Ding Conditional Probability and Bayes 1 / 13 Outline 1 Independent Events Definition

More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information

Notes on Discrete Probability

Notes on Discrete Probability Columbia University Handout 3 W4231: Analysis of Algorithms September 21, 1999 Professor Luca Trevisan Notes on Discrete Probability The following notes cover, mostly without proofs, the basic notions

More information

With high probability

With high probability With high probability So far we have been mainly concerned with expected behaviour: expected running times, expected competitive ratio s. But it would often be much more interesting if we would be able

More information

Lectures on Elementary Probability. William G. Faris

Lectures on Elementary Probability. William G. Faris Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................

More information

Math , Fall 2012: HW 5 Solutions

Math , Fall 2012: HW 5 Solutions Math 230.0, Fall 202: HW 5 Solutions Due Thursday, October 4th, 202. Problem (p.58 #2). Let X and Y be the numbers obtained in two draws at random from a box containing four tickets labeled, 2, 3, 4. Display

More information

EE514A Information Theory I Fall 2013

EE514A Information Theory I Fall 2013 EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/

More information

Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22

Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22 CS 70 Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22 Random Variables and Expectation Question: The homeworks of 20 students are collected in, randomly shuffled and returned to the students.

More information

CSE 312 Final Review: Section AA

CSE 312 Final Review: Section AA CSE 312 TAs December 8, 2011 General Information General Information Comprehensive Midterm General Information Comprehensive Midterm Heavily weighted toward material after the midterm Pre-Midterm Material

More information

Part (A): Review of Probability [Statistics I revision]

Part (A): Review of Probability [Statistics I revision] Part (A): Review of Probability [Statistics I revision] 1 Definition of Probability 1.1 Experiment An experiment is any procedure whose outcome is uncertain ffl toss a coin ffl throw a die ffl buy a lottery

More information

Week 2. Review of Probability, Random Variables and Univariate Distributions

Week 2. Review of Probability, Random Variables and Univariate Distributions Week 2 Review of Probability, Random Variables and Univariate Distributions Probability Probability Probability Motivation What use is Probability Theory? Probability models Basis for statistical inference

More information

Quick Tour of Basic Probability Theory and Linear Algebra

Quick Tour of Basic Probability Theory and Linear Algebra Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 13: Expectation and Variance and joint distributions Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Refresher on Discrete Probability

Refresher on Discrete Probability Refresher on Discrete Probability STAT 27725/CMSC 25400: Machine Learning Shubhendu Trivedi University of Chicago October 2015 Background Things you should have seen before Events, Event Spaces Probability

More information

CS37300 Class Notes. Jennifer Neville, Sebastian Moreno, Bruno Ribeiro

CS37300 Class Notes. Jennifer Neville, Sebastian Moreno, Bruno Ribeiro CS37300 Class Notes Jennifer Neville, Sebastian Moreno, Bruno Ribeiro 2 Background on Probability and Statistics These are basic definitions, concepts, and equations that should have been covered in your

More information

CS 246 Review of Proof Techniques and Probability 01/14/19

CS 246 Review of Proof Techniques and Probability 01/14/19 Note: This document has been adapted from a similar review session for CS224W (Autumn 2018). It was originally compiled by Jessica Su, with minor edits by Jayadev Bhaskaran. 1 Proof techniques Here we

More information

Expectation of Random Variables

Expectation of Random Variables 1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete

More information

Discrete Structures for Computer Science

Discrete Structures for Computer Science Discrete Structures for Computer Science William Garrison bill@cs.pitt.edu 6311 Sennott Square Lecture #24: Probability Theory Based on materials developed by Dr. Adam Lee Not all events are equally likely

More information

CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions)

CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions) CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions) 1. (Confidence Intervals, CLT) Let X 1,..., X n be iid with unknown mean θ and known variance σ 2. Assume

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

Conditional Probability and Independence

Conditional Probability and Independence Conditional Probability and Independence September 3, 2009 1 Restricting the Sample Space - Conditional Probability How do we modify the probability of an event in light of the fact that something is known?

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

MATH 19B FINAL EXAM PROBABILITY REVIEW PROBLEMS SPRING, 2010

MATH 19B FINAL EXAM PROBABILITY REVIEW PROBLEMS SPRING, 2010 MATH 9B FINAL EXAM PROBABILITY REVIEW PROBLEMS SPRING, 00 This handout is meant to provide a collection of exercises that use the material from the probability and statistics portion of the course The

More information

Chapter 2. Discrete Distributions

Chapter 2. Discrete Distributions Chapter. Discrete Distributions Objectives ˆ Basic Concepts & Epectations ˆ Binomial, Poisson, Geometric, Negative Binomial, and Hypergeometric Distributions ˆ Introduction to the Maimum Likelihood Estimation

More information

Basic Probability. Introduction

Basic Probability. Introduction Basic Probability Introduction The world is an uncertain place. Making predictions about something as seemingly mundane as tomorrow s weather, for example, is actually quite a difficult task. Even with

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Problems and results for the ninth week Mathematics A3 for Civil Engineering students

Problems and results for the ninth week Mathematics A3 for Civil Engineering students Problems and results for the ninth week Mathematics A3 for Civil Engineering students. Production line I of a factor works 0% of time, while production line II works 70% of time, independentl of each other.

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 2015 MODULE 1 : Probability distributions Time allowed: Three hours Candidates should answer FIVE questions. All questions carry equal marks.

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

Discrete Random Variable

Discrete Random Variable Discrete Random Variable Outcome of a random experiment need not to be a number. We are generally interested in some measurement or numerical attribute of the outcome, rather than the outcome itself. n

More information

HW5 Solutions. (a) (8 pts.) Show that if two random variables X and Y are independent, then E[XY ] = E[X]E[Y ] xy p X,Y (x, y)

HW5 Solutions. (a) (8 pts.) Show that if two random variables X and Y are independent, then E[XY ] = E[X]E[Y ] xy p X,Y (x, y) HW5 Solutions 1. (50 pts.) Random homeworks again (a) (8 pts.) Show that if two random variables X and Y are independent, then E[XY ] = E[X]E[Y ] Answer: Applying the definition of expectation we have

More information

LIST OF FORMULAS FOR STK1100 AND STK1110

LIST OF FORMULAS FOR STK1100 AND STK1110 LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Discrete Mathematics and Probability Theory Fall 2011 Rao Midterm 2 Solutions

Discrete Mathematics and Probability Theory Fall 2011 Rao Midterm 2 Solutions CS 70 Discrete Mathematics and Probability Theory Fall 20 Rao Midterm 2 Solutions True/False. [24 pts] Circle one of the provided answers please! No negative points will be assigned for incorrect answers.

More information

Discrete Mathematics and Probability Theory Fall 2012 Vazirani Note 14. Random Variables: Distribution and Expectation

Discrete Mathematics and Probability Theory Fall 2012 Vazirani Note 14. Random Variables: Distribution and Expectation CS 70 Discrete Mathematics and Probability Theory Fall 202 Vazirani Note 4 Random Variables: Distribution and Expectation Random Variables Question: The homeworks of 20 students are collected in, randomly

More information

Handout 1: Mathematical Background

Handout 1: Mathematical Background Handout 1: Mathematical Background Boaz Barak February 2, 2010 This is a brief review of some mathematical tools, especially probability theory that we will use. This material is mostly from discrete math

More information

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2017 Kannan Ramchandran March 21, 2017.

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2017 Kannan Ramchandran March 21, 2017. EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2017 Kannan Ramchandran March 21, 2017 Midterm Exam 2 Last name First name SID Name of student on your left: Name of

More information

Discrete Probability Refresher

Discrete Probability Refresher ECE 1502 Information Theory Discrete Probability Refresher F. R. Kschischang Dept. of Electrical and Computer Engineering University of Toronto January 13, 1999 revised January 11, 2006 Probability theory

More information

Review of Basic Probability Theory

Review of Basic Probability Theory Review of Basic Probability Theory James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) 1 / 35 Review of Basic Probability Theory

More information