UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probability and Random Processes. Problem Set 9 Fall 2007


UC Berkeley Department of Electrical Engineering and Computer Science
EE 126: Probability and Random Processes
Problem Set 9, Fall 2007
Issued: Thursday, November 1, 2007. Due: Friday, November 9, 2007.
Reading: Bertsekas & Tsitsiklis, Chapters 4, 5.

Chapter 4 problems

Problem 9.1
For this problem, we consider the form of the linear least squares estimator of $X$ when two measurements are available. More specifically, find $a_1$, $a_2$, and $b$ such that the linear predictor $g(Y_1, Y_2) = a_1 Y_1 + a_2 Y_2 + b$ minimizes

$$E[(X - g(Y_1, Y_2))^2] = E[(X - a_1 Y_1 - a_2 Y_2 - b)^2].$$

To make the problem a little more tractable, assume that $Y_1$ and $Y_2$ are uncorrelated (i.e., $E[Y_1 Y_2] = E[Y_1]E[Y_2]$).

Solution: In this problem, we want to find the linear estimator $g(Y_1, Y_2)$ that minimizes $E[(X - g(Y_1, Y_2))^2]$; this extends the single-measurement case. Our estimator is of the form $g(Y_1, Y_2) = a_1 Y_1 + a_2 Y_2 + b$, and therefore our goal is to find $a_1$, $a_2$, and $b$ that solve

$$\text{minimize}\quad E[(X - a_1 Y_1 - a_2 Y_2 - b)^2]. \qquad (1)$$

To achieve (1), we must satisfy

$$\frac{\partial}{\partial b}\,E[(X - a_1 Y_1 - a_2 Y_2 - b)^2] = E[2(X - a_1 Y_1 - a_2 Y_2 - b)(-1)] = 0,$$
$$\frac{\partial}{\partial a_1}\,E[(X - a_1 Y_1 - a_2 Y_2 - b)^2] = E[2(X - a_1 Y_1 - a_2 Y_2 - b)(-Y_1)] = 0,$$
$$\frac{\partial}{\partial a_2}\,E[(X - a_1 Y_1 - a_2 Y_2 - b)^2] = E[2(X - a_1 Y_1 - a_2 Y_2 - b)(-Y_2)] = 0,$$

and the second derivatives $\partial^2 E[\cdot]/\partial b^2$, $\partial^2 E[\cdot]/\partial a_1^2$, and $\partial^2 E[\cdot]/\partial a_2^2$ must be $> 0$. By the linearity of expectation,

$$b = E[X] - a_1 E[Y_1] - a_2 E[Y_2], \qquad (2)$$
$$a_1 E[Y_1^2] = E[XY_1] - a_2 E[Y_1 Y_2] - b\,E[Y_1], \qquad (3)$$
$$a_2 E[Y_2^2] = E[XY_2] - a_1 E[Y_1 Y_2] - b\,E[Y_2]. \qquad (4)$$

We now have three equations to solve for three unknowns. Substituting (2) for $b$ in (3), we obtain

$$a_1 E[Y_1^2] = E[XY_1] - a_2 E[Y_1 Y_2] - E[X]E[Y_1] + a_1 (E[Y_1])^2 + a_2 E[Y_1]E[Y_2].$$

Rearranging, and using the fact that $E[Y_1 Y_2] = E[Y_1]E[Y_2]$,

$$a_1 \left( E[Y_1^2] - (E[Y_1])^2 \right) = E[XY_1] - E[X]E[Y_1].$$

Similarly,

$$a_2 \left( E[Y_2^2] - (E[Y_2])^2 \right) = E[XY_2] - E[X]E[Y_2].$$

Therefore,

$$a_1 = \frac{E[XY_1] - E[X]E[Y_1]}{\sigma_{Y_1}^2}, \qquad a_2 = \frac{E[XY_2] - E[X]E[Y_2]}{\sigma_{Y_2}^2}, \qquad b = E[X] - a_1 E[Y_1] - a_2 E[Y_2].$$

Writing this expression in the same form as the single-measurement case:

$$g(Y_1, Y_2) = E[X] + \frac{\mathrm{Cov}(X, Y_1)}{\sigma_{Y_1}^2}\,(Y_1 - E[Y_1]) + \frac{\mathrm{Cov}(X, Y_2)}{\sigma_{Y_2}^2}\,(Y_2 - E[Y_2]),$$

where we use the fact that $\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]$.

Note: Convince yourself that the second-order condition is satisfied.
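As a quick numerical sanity check (not part of the original handout), the sketch below draws a large sample from an arbitrarily assumed linear model with independent (hence uncorrelated) measurements, then compares the closed-form coefficients above against a direct least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Arbitrary assumed model: X depends linearly on two independent
# (hence uncorrelated) measurements plus noise.
Y1 = rng.normal(2.0, 1.5, n)
Y2 = rng.uniform(-1.0, 3.0, n)
X = 0.7 * Y1 - 1.2 * Y2 + 0.5 + rng.normal(0.0, 1.0, n)

# Closed-form LLSE coefficients from the solution above.
a1 = np.cov(X, Y1)[0, 1] / Y1.var()
a2 = np.cov(X, Y2)[0, 1] / Y2.var()
b = X.mean() - a1 * Y1.mean() - a2 * Y2.mean()

# Direct least-squares fit of X on [Y1, Y2, 1] for comparison.
coef, *_ = np.linalg.lstsq(np.column_stack([Y1, Y2, np.ones(n)]), X, rcond=None)

print("closed form:", a1, a2, b)   # both rows ~ (0.7, -1.2, 0.5)
print("lstsq:      ", *coef)
```

Because $Y_1$ and $Y_2$ are uncorrelated here, the two sets of coefficients should agree up to sampling noise.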

Problem 9.2
Bob has gone hiking, and is lost in the forest. In order to try to find a road, he decides on the following distance/coin-flip strategy. At time instants $t = 1, 2, \ldots$, he chooses a distance uniformly at random between $t$ and $t + 1$. Independently of the chosen random distance, he then flips a fair coin; if it comes up heads, he moves the chosen random distance to the right (positive on the real line), and otherwise, for a tails toss, he moves the chosen random distance to the left (negative on the real line). Both the random distance and the coin flip are independent random variables for different time instants. Assume that he starts at the origin at time instant $t = 0$.

(a) Let $Y_s$ be Bob's position after repeating his distance/coin-flip strategy for a fixed number of $s$ time instants. Compute its expected value and variance as a function of $s$.

Now suppose that Bob repeats his distance/coin-flip strategy for a random number $S$ of time rounds, after which he stops. Assume that $S \sim \mathrm{Geom}(p)$ has a geometric distribution with parameter $p$, and let $X \in \mathbb{R}$ be his final position. For any question below, you may feel free to express your answer (if appropriate) in terms of the moments $\mu_i = E[S^i]$, $i = 1, 2, \ldots$.

(b) Suppose that you observe that $S = s$. What is the minimum mean squared error (MMSE) estimator of $X$ given this information?

(c) What is the expected value and variance of his position $X$?

(d) Now suppose that you observe that Bob finishes at position $X = x$. Given this information, what is the linear least squares estimator (LLSE) of the number of time rounds $S$ that he repeated his distance/coin-flip strategy?

(e) What is the linear least-squares estimate of $S$ based on $X^2$?
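Before turning to the solution, note that the strategy is easy to simulate: the step at time $t$ is a Uniform$(t, t+1)$ distance multiplied by an independent fair random sign. A minimal sketch, useful for spot-checking the moments derived below:

```python
import numpy as np

rng = np.random.default_rng(1)

def walk_position(s, rng):
    """Bob's position Y_s after s distance/coin-flip rounds."""
    t = np.arange(1, s + 1)
    dist = rng.uniform(t, t + 1)             # distance chosen in [t, t+1]
    sign = rng.choice([-1.0, 1.0], size=s)   # fair coin: right or left
    return float(np.sum(sign * dist))

samples = [walk_position(5, rng) for _ in range(100_000)]
print(np.mean(samples), np.var(samples))
# mean ~ 0; variance ~ ((5 + 1)**3 - 1) / 3 = 71.67 (derived in part (a))
```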

Solution:

(a) Let $X_t$ denote the (signed) distance that Bob travels at time instant $t$. $X_t$ is uniformly distributed on the two line segments $[-(t+1), -t]$ and $[t, t+1]$. Obviously $E(X_t) = 0$, and by symmetry

$$\mathrm{Var}(X_t) = E(X_t^2) = \int_t^{t+1} x^2\,dx = \frac{(t+1)^3 - t^3}{3} = t^2 + t + \frac{1}{3}.$$

Since $Y_s = \sum_{t=1}^s X_t$ and $\{X_t\}_{t=1}^s$ are independent random variables,

$$E(Y_s) = \sum_{t=1}^s E(X_t) = 0, \qquad \mathrm{Var}(Y_s) = \sum_{t=1}^s \mathrm{Var}(X_t) = \frac{(s+1)^3 - 1}{3}.$$

(b) The minimum mean squared error estimator of $X$ given $S = s$ is the conditional expectation of $X$ given $S = s$. This value is the same as $E(Y_s)$, which is 0.

(c) $E(X) = E(E(X \mid S)) = E(0) = 0$. By the law of total variance,

$$\mathrm{Var}(X) = E\big(\mathrm{Var}(X \mid S)\big) + \mathrm{Var}\big(E(X \mid S)\big) = E\big(\mathrm{Var}(Y_S)\big) + \mathrm{Var}(0) = E\left(\frac{(S+1)^3 - 1}{3}\right) = \mu_1 + \mu_2 + \frac{\mu_3}{3}.$$

(d) The covariance of $X$ and $S$ is

$$\mathrm{Cov}(X, S) = E(XS) - E(X)E(S) = E(XS) = E\big(E(XS \mid S)\big) = E\big(S\,E(X \mid S)\big) = E(S \cdot 0) = 0.$$

Thus, since $X$ and $S$ are uncorrelated, the linear least squares estimator of $S$ given $X$ is simply $E(S) = \mu_1$.
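Parts (c) and (d) can be confirmed empirically by simulating the walk for a geometric number of rounds; the value of $p$ below is an arbitrary assumption:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 0.4, 100_000          # p is an assumed example value

S = rng.geometric(p, size=n)
X = np.empty(n)
for i, s in enumerate(S):
    t = np.arange(1, s + 1)
    X[i] = np.sum(rng.choice([-1.0, 1.0], size=s) * rng.uniform(t, t + 1))

mu1, mu2, mu3 = (np.mean(S.astype(float) ** k) for k in (1, 2, 3))
print(X.mean())                          # ~ 0                (part (c))
print(X.var(), mu1 + mu2 + mu3 / 3)      # these should agree (part (c))
print(np.cov(X, S)[0, 1])                # ~ 0                (part (d))
```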

(e) The linear least squares estimator of $S$ given $X^2$ can be written as

$$E(S) + \frac{\mathrm{Cov}(X^2, S)}{\mathrm{Var}(X^2)}\,\big(X^2 - E(X^2)\big),$$

where $E(S) = \mu_1$ and $E(X^2) = \mathrm{Var}(X) = \mu_1 + \mu_2 + \mu_3/3$. By definition, the covariance between $X^2$ and $S$ is

$$\mathrm{Cov}(X^2, S) = E(X^2 S) - E(X^2)E(S) = E\big(S\,E(X^2 \mid S)\big) - E(X^2)E(S) = E\left(S \cdot \frac{(S+1)^3 - 1}{3}\right) - E(X^2)E(S) = \left(\mu_2 + \mu_3 + \frac{\mu_4}{3}\right) - \left(\mu_1 + \mu_2 + \frac{\mu_3}{3}\right)\mu_1.$$

To compute the variance of $X^2$, we use the law of total variance:

$$\mathrm{Var}(X^2) = E\big(\mathrm{Var}(X^2 \mid S)\big) + \mathrm{Var}\big(E(X^2 \mid S)\big).$$

Since $E(X^2 \mid S) = \big((S+1)^3 - 1\big)/3 = S^3/3 + S^2 + S$,

$$\mathrm{Var}\big(E(X^2 \mid S)\big) = E\left(\left(\frac{S^3}{3} + S^2 + S\right)^2\right) - \left(E\left(\frac{S^3}{3} + S^2 + S\right)\right)^2 = \frac{\mu_6}{9} + \frac{2\mu_5}{3} + \frac{5\mu_4}{3} + 2\mu_3 + \mu_2 - \left(\frac{\mu_3}{3} + \mu_2 + \mu_1\right)^2.$$

To compute $\mathrm{Var}(X^2 \mid S = s)$, we rewrite $X^2 = \big(\sum_{t=1}^s X_t\big)^2$ as a sum over products $X_i X_j$:

$$\mathrm{Var}(X^2 \mid S = s) = \sum_t \mathrm{Var}(X_t^2) + \sum_k \sum_{i \neq j} \mathrm{Cov}(X_i X_j, X_k^2) + \sum_{i \neq j} \mathrm{Var}(X_i X_j) + \sum_{i \neq j} \mathrm{Cov}(X_i^2, X_j^2) + \sum_{\substack{(i,j) \neq (k,l) \\ i \neq j,\ k \neq l}} \mathrm{Cov}(X_i X_j, X_k X_l).$$

We will see that most of the covariances are zero. Since $X_i$ and $X_j$ are independent, $\mathrm{Cov}(X_i^2, X_j^2) = 0$. If $k \neq i$ and $k \neq j$,

$$\mathrm{Cov}(X_i X_j, X_k^2) = E(X_i X_j X_k^2) - E(X_i X_j)E(X_k^2) = E(X_i)E(X_j)E(X_k^2) - E(X_i)E(X_j)E(X_k^2) = 0 - 0 = 0.$$

If $k$ equals one of $i$ and $j$, say $k = i$,

$$\mathrm{Cov}(X_i X_j, X_i^2) = E(X_i^3 X_j) - E(X_i X_j)E(X_i^2) = E(X_i^3)E(X_j) - E(X_i)E(X_j)E(X_i^2) = 0 - 0 = 0.$$

Given that $(i,j) \neq (k,l)$, $\mathrm{Cov}(X_i X_j, X_k X_l)$ is nonzero ONLY when $k = j$ and $l = i$, in which case it equals $\mathrm{Var}(X_i X_j) = E(X_i^2)E(X_j^2)$ (since $E(X_i X_j) = 0$). Therefore

$$\mathrm{Var}(X^2 \mid S = s) = \sum_t \mathrm{Var}(X_t^2) + 2\sum_{i \neq j} \mathrm{Var}(X_i X_j) = \sum_{t=1}^s \Big(E(X_t^4) - \big(E(X_t^2)\big)^2\Big) + 2\sum_{i \neq j} E(X_i^2)E(X_j^2).$$

With $E(X_t^4) = \int_t^{t+1} x^4\,dx = \big((t+1)^5 - t^5\big)/5$ and $E(X_t^2) = t^2 + t + 1/3$, this equals

$$\sum_{t=1}^s E(X_t^4) + 2\left(\sum_{t=1}^s E(X_t^2)\right)^2 - 3\sum_{t=1}^s \big(E(X_t^2)\big)^2 = \frac{10 s^6 + 42 s^5 + 60 s^4 + 15 s^3 - 45 s^2 - 48 s}{45}.$$

Note that we actually used Mathematica, a symbolic tool for math, to get the summation $\sum_{t=1}^s \big(E(X_t^2)\big)^2 = \sum_{t=1}^s \big(t^2 + t + 1/3\big)^2$. Therefore, the variance of $X^2$ is

$$\mathrm{Var}(X^2) = \frac{15\mu_6 + 72\mu_5 + 135\mu_4 + 105\mu_3 - 48\mu_1}{45} - \left(\frac{\mu_3}{3} + \mu_2 + \mu_1\right)^2.$$

Putting all the results into one expression, the linear least squares estimator of $S$ given $X^2$ is

$$\mu_1 + \frac{\left(\mu_2 + \mu_3 + \mu_4/3\right) - \left(\mu_1 + \mu_2 + \mu_3/3\right)\mu_1}{\mathrm{Var}(X^2)}\left(X^2 - \mu_1 - \mu_2 - \frac{\mu_3}{3}\right).$$

Students received full credit for reasoning up to the boxed line above.
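Because the closed-form polynomial for $\mathrm{Var}(X^2 \mid S = s)$ above was recomputed symbolically for this write-up, a Monte Carlo cross-check for small fixed $s$ is worthwhile; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

def var_x2_given_s(s):
    """Closed form derived above: (10s^6+42s^5+60s^4+15s^3-45s^2-48s)/45."""
    return (10*s**6 + 42*s**5 + 60*s**4 + 15*s**3 - 45*s**2 - 48*s) / 45

for s in (1, 2, 3):
    n = 400_000
    t = np.arange(1, s + 1)
    # n walks of length s: signed Uniform(t, t+1) steps, summed, squared.
    steps = rng.choice([-1.0, 1.0], size=(n, s)) * rng.uniform(t, t + 1, size=(n, s))
    x2 = steps.sum(axis=1) ** 2
    print(s, x2.var(), var_x2_given_s(s))
```

For $s = 1$ the closed form gives $34/45 \approx 0.756$, which the simulation reproduces.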

Chapter 5 problems

Problem 9.3
Each of $n$ packages is loaded independently onto either a red truck (with probability $p$) or onto a green truck (with probability $1 - p$). Let $R$ be the total number of items selected for the red truck and let $G$ be the total number of items selected for the green truck.

(a) Determine the PMF, expected value, and variance of the random variable $R$.

(b) Evaluate the probability that the first item to be loaded ends up being the only one on its truck.

(c) Evaluate the probability that at least one truck ends up with a total of exactly one package.

(d) Evaluate the expected value and the variance of the difference $R - G$.

(e) Assume that $n \geq 2$. Given that both of the first two packages are loaded onto the red truck, find the conditional expectation, variance, and PMF of the random variable $R$.

Solution:

(a) $R$ is a binomial random variable with parameters $p$ and $n$; hence

$$p_R(r) = \binom{n}{r}(1-p)^{n-r} p^r, \qquad r = 0, 1, 2, \ldots, n.$$

Note that $R$ can be written as a sum of i.i.d. Bernoulli random variables, $R = X_1 + X_2 + \cdots + X_n$, where for each $i$

$$p_{X_i}(x) = \begin{cases} p, & \text{if } x = 1; \\ 1 - p, & \text{if } x = 0. \end{cases}$$

It is easy to verify that $E(X_i) = p$ and $\mathrm{var}(X_i) = p(1-p)$. Then

$$E(R) = E(X_1 + X_2 + \cdots + X_n) = E(X_1) + E(X_2) + \cdots + E(X_n) = np,$$

and, because of the independence of the $X_i$'s,

$$\mathrm{var}(R) = \mathrm{var}(X_1) + \mathrm{var}(X_2) + \cdots + \mathrm{var}(X_n) = n\,\mathrm{var}(X_i) = np(1-p).$$
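A short numerical check of part (a), with assumed example values of $n$ and $p$:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(4)
n, p, trials = 10, 0.3, 200_000      # assumed example values

# Load each of n packages onto the red truck independently w.p. p.
R = (rng.random((trials, n)) < p).sum(axis=1)

print(R.mean(), n * p)               # E(R) = np
print(R.var(), n * p * (1 - p))      # var(R) = np(1 - p)

r = 4                                # spot-check the PMF at one value
print((R == r).mean(), comb(n, r) * (1 - p) ** (n - r) * p ** r)
```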

(b) Denote the event of interest by $A$. $A$ is the union of the following two mutually exclusive events: the first item is placed in the red truck and the remaining $n - 1$ are placed in the green truck; the first item is placed in the green truck and the remaining $n - 1$ are placed in the red truck. Thus the probability of interest is the sum of the probabilities of these two events:

$$P(A) = p(1-p)^{n-1} + (1-p)p^{n-1}.$$

(c) Denote the event of interest by $B$. It occurs if exactly one OR both of the trucks end up with exactly 1 package. Thus,

$$P(B) = \begin{cases} 1, & \text{if } n = 1; \\ 2p(1-p), & \text{if } n = 2; \\ n\big((1-p)^{n-1}p + p^{n-1}(1-p)\big), & \text{if } n = 3, 4, 5, \ldots \end{cases}$$

(For $n \geq 3$ the events "$R = 1$" and "$G = 1$" are disjoint, so their probabilities simply add.)

(d) $E(D) = E(R - G) = E(R - (n - R)) = E(2R - n) = 2E(R) - n = 2np - n$. Since $D = 2R - n$, where $n$ is a constant,

$$\mathrm{var}(D) = 4\,\mathrm{var}(R) = 4np(1-p).$$

(e) Let $C$ be the event that both of the first two packages to be loaded go onto the red truck. Given event $C$, the random variable $R$ becomes

$$R = 2 + X_1 + X_2 + \cdots + X_{n-2}.$$

Hence,

$$E(R \mid C) = E(2 + X_1 + X_2 + \cdots + X_{n-2}) = 2 + (n-2)E(X_i) = 2 + (n-2)p.$$

Similarly, the conditional variance of $R$ is

$$\mathrm{var}(R \mid C) = \mathrm{var}(2 + X_1 + X_2 + \cdots + X_{n-2}) = (n-2)\,\mathrm{var}(X_i) = (n-2)p(1-p).$$

Finally, the conditional PMF of $R$ given $C$ is simply the unconditional PMF for $n - 2$ packages shifted to the right by 2:

$$p_{R \mid C}(r \mid c) = \binom{n-2}{r-2}(1-p)^{n-r} p^{r-2}, \qquad r = 2, 3, \ldots, n.$$
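Parts (b) and (c) are easy to verify by simulation as well; again the values of $n$ and $p$ are assumed examples:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, trials = 6, 0.35, 300_000      # assumed example values (n >= 3)

red = rng.random((trials, n)) < p    # True = package goes on the red truck
R = red.sum(axis=1)

# (b) the first item ends up alone on its truck
alone = (red[:, 0] & (R == 1)) | (~red[:, 0] & (R == n - 1))
print(alone.mean(), p * (1 - p) ** (n - 1) + (1 - p) * p ** (n - 1))

# (c) at least one truck carries exactly one package
B = (R == 1) | (R == n - 1)
print(B.mean(), n * ((1 - p) ** (n - 1) * p + p ** (n - 1) * (1 - p)))
```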

Problem 9.4
Fred is giving out samples of dog food. He makes calls door to door, but he leaves a sample (one can) only on those calls for which the door is answered and a dog is in residence. On any call the probability of the door being answered is 1/4, and the probability that any household has a dog is 2/3. Assume that the events "door answered" and "a dog lives here" are independent, and also that the outcomes of all calls are independent.

(a) Determine the probability that Fred gives away his first sample on his third call.

(b) Given that he has given away exactly four samples on his first eight calls, determine the conditional probability that Fred will give away his fifth sample on his eleventh call.

(c) Determine the probability that he gives away his second sample on his fifth call.

(d) Given that he did not give away his second sample on his second call, determine the conditional probability that he will leave his second sample on his fifth call.

(e) We will say that Fred needs a new supply immediately after the call on which he gives away his last can. If he starts out with two cans, determine the probability that he completes at least five calls before he needs a new supply.

(f) If he starts out with exactly $m$ cans, determine the expected value and variance of $D_m$, the number of homes with dogs which he passes up (because of no answer) before he needs a new supply.

Solution: A successful call occurs with probability $p = \frac{1}{4} \cdot \frac{2}{3} = \frac{1}{6}$.

(a) Fred will give away his first sample on the third call if the first two calls are failures and the third is a success. Since the trials are independent, the probability of this sequence of events is simply

$$(1-p)(1-p)p = \left(\frac{5}{6}\right)^2 \frac{1}{6} = \frac{25}{216}.$$

(b) The event of interest requires failures on the ninth and tenth trials and a success on the eleventh trial. For a Bernoulli process, the outcomes of these three trials are independent of the results of any other trials, and again our answer is

$$(1-p)(1-p)p = \frac{25}{216}.$$

(c) We desire the probability that $L_2$, the second-order interarrival time, is equal to five trials. We know that $p_{L_2}(l)$ is a Pascal PMF of order 2,

$$p_{L_2}(l) = \binom{l-1}{1} p^2 (1-p)^{l-2}, \qquad l = 2, 3, \ldots,$$

and we have

$$p_{L_2}(5) = \binom{4}{1}\left(\frac{1}{6}\right)^2\left(\frac{5}{6}\right)^3 = \frac{500}{7776} \approx 0.064.$$

(d) Here we require the conditional probability that $L_2$ is equal to 5, given that it is greater than 2:

$$P(L_2 = 5 \mid L_2 > 2) = \frac{p_{L_2}(5)}{P(L_2 > 2)} = \frac{p_{L_2}(5)}{1 - p_{L_2}(2)} = \frac{\binom{4}{1} p^2 (1-p)^3}{1 - p^2} = \frac{500/7776}{35/36} = \frac{25}{378} \approx 0.066.$$
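The Pascal PMF claims in parts (c) and (d) can be spot-checked by simulating the call process directly; the 60-call horizon below is an assumption chosen so that the second success essentially always occurs:

```python
import numpy as np

rng = np.random.default_rng(6)
p, trials = 1 / 6, 500_000

# L2 = index of the second successful call in a Bernoulli(p) process.
calls = rng.random((trials, 60)) < p
second = np.cumsum(calls, axis=1) == 2
L2 = np.argmax(second, axis=1) + 1
L2 = L2[calls.sum(axis=1) >= 2]        # drop the rare truncated rows

print((L2 == 5).mean(), 4 * p**2 * (1 - p)**3)                       # part (c)
print((L2[L2 > 2] == 5).mean(), 4 * p**2 * (1 - p)**3 / (1 - p**2))  # part (d)
```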

(e) The probability that Fred will complete at least five calls before he needs a new supply is equal to the probability that $L_2$ is greater than or equal to 5:

$$P(L_2 \geq 5) = 1 - P(L_2 \leq 4) = 1 - \sum_{l=2}^{4}(l-1)p^2(1-p)^{l-2} = 1 - \left(\frac{1}{6}\right)^2\left(1 + 2\cdot\frac{5}{6} + 3\cdot\frac{25}{36}\right) = 1 - \frac{19}{144} = \frac{125}{144} \approx 0.868.$$

(f) Let the discrete random variable $F$ represent the number of failures before Fred runs out of samples on his $m$th successful call. Since $L_m$ is the number of trials up to and including the $m$th success, we have $F = L_m - m$. Given that Fred makes $L_m$ calls before he needs a new supply, we can regard each of the $F$ unsuccessful calls as a trial in another Bernoulli process with parameter $r$, where $r$ is the probability of a "success" (a disappointed dog) on an unsuccessful call, obtained by

$$r = P(\text{dog lives there} \mid \text{Fred did not leave a sample}) = \frac{P(\text{dog lives there AND door not answered})}{1 - P(\text{giving away a sample})} = \frac{\frac{2}{3}\cdot\frac{3}{4}}{1 - \frac{1}{6}} = \frac{3}{5}.$$
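The conditioning argument for $r$ is easy to check empirically, using only the 1/4 and 2/3 from the problem statement:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

answered = rng.random(n) < 1 / 4
dog = rng.random(n) < 2 / 3

no_sample = ~(answered & dog)          # calls on which Fred leaves nothing
print(dog[no_sample].mean(), 3 / 5)    # P(dog | no sample left) = r
```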

We define $X$ to be a Bernoulli random variable with parameter $r$. Then the number of dogs passed up before Fred runs out, $D_m$, is equal to the sum of $F$ Bernoulli random variables, each with parameter $r$, where $F$ is itself a random variable:

$$D_m = X_1 + X_2 + X_3 + \cdots + X_F.$$

Note that $D_m$ is a sum of a random number of independent random variables. Further, $F$ is independent of the $X_i$'s, since the $X_i$'s are defined in the conditional universe where the door is not answered, in which case whether there is a dog or not does not affect whether that trial is a failed trial. From our results in class, we can calculate the expectation and variance of such a random sum by

$$E[D_m] = E[F]\,E[X], \qquad \mathrm{var}(D_m) = E[F]\,\mathrm{var}(X) + (E[X])^2\,\mathrm{var}(F),$$

where we make the following substitutions:

$$E[F] = E[L_m - m] = \frac{m}{p} - m = 5m, \qquad \mathrm{var}(F) = \mathrm{var}(L_m - m) = \mathrm{var}(L_m) = \frac{m(1-p)}{p^2} = 30m,$$

$$E[X] = r = \frac{3}{5}, \qquad \mathrm{var}(X) = r(1-r) = \frac{6}{25}.$$

Finally, substituting these values, we have

$$E[D_m] = 5m \cdot \frac{3}{5} = 3m, \qquad \mathrm{var}(D_m) = 5m \cdot \frac{6}{25} + \left(\frac{3}{5}\right)^2 \cdot 30m = \frac{6m}{5} + \frac{54m}{5} = 12m.$$
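Finally, a direct simulation of Fred's calls (a sketch, with the can count $m = 2$ as an assumed example) confirms the random-sum formulas:

```python
import numpy as np

rng = np.random.default_rng(8)
m, trials = 2, 100_000
p_answer, p_dog = 1 / 4, 2 / 3

D = np.empty(trials)
for i in range(trials):
    cans, passed = m, 0
    while cans > 0:                     # keep calling until the mth can is gone
        answered = rng.random() < p_answer
        dog = rng.random() < p_dog
        if answered and dog:
            cans -= 1                   # sample given away
        elif dog:
            passed += 1                 # a dog passed up (door unanswered)
    D[i] = passed

print(D.mean(), 3 * m)    # E[D_m] = 3m
print(D.var(), 12 * m)    # var(D_m) = 12m
```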
