QUESTION BOOKLET
EE 126 Spring 2006 Final Exam
Wednesday, May 17, 8am-11am

DO NOT OPEN THIS QUESTION BOOKLET UNTIL YOU ARE TOLD TO DO SO

You have 180 minutes to complete the final. The final consists of five (FIVE) problems, provided in the question booklet (THIS BOOKLET), that are in no particular order of difficulty. Write your solution to each problem in the space provided in the solution booklet (THE OTHER BOOKLET). Try to be neat! If we can't read it, we can't grade it. You may give an answer in the form of an arithmetic expression (sums, products, ratios, factorials) that could be evaluated using a calculator. Expressions like (8 choose 3) or Σ_{k=0}^{5} (1/2)^k are also fine. A correct answer does not guarantee full credit and a wrong answer does not guarantee loss of credit. You should concisely explain your reasoning and show all relevant work. The grade on each problem is based on our judgment of your understanding as reflected by what you have written. This is a closed-book exam, except for three sheets of handwritten notes, plus a calculator.

Problem 1: (22 points) Suppose that a pair of random variables X and Y has a joint PDF that is uniform over the shaded region shown in the figure below.

Figure 1: Joint PDF of random variables X and Y (uniform over the region bounded by the y-axis, the curve x = exp(y), and the lines y = -1 and y = +1).

(a) (7 pt) Compute the Bayes least squares estimator (BLSE) of X based on Y. (Note: You should evaluate the required integrals; however, your answer can be left in terms of quantities like 1/e or 2.)

(b) (7 pt) Compute the linear least squares estimator (LLSE) of X based on Y, as well as the associated error variance of this estimator. Is the LLSE the same as the BLSE in this case? Why or why not? (Note: You should evaluate the required integrals; however, your answer can be left in terms of quantities like 1/e or 2.)

(c) (8 pt) Now suppose that in addition to observing some value Y = y, we also know that X ≤ 1/e. Compute the BLSE and LLSE estimators of X based on both pieces of information. Are the estimators the same or different? Explain why in either case.

Solution:

(a) The joint PDF has the form

f_{X,Y}(x,y) = 1/C if -1 ≤ y ≤ 1 and 0 ≤ x ≤ exp(y), and 0 otherwise,

where the normalization constant of the joint PDF is given by

C = ∫_{-1}^{+1} ∫_{0}^{exp(y)} dx dy = exp(y) |_{-1}^{+1} = e - 1/e.

We have

f_Y(y) = (1/C) ∫_{0}^{exp(y)} dx = (1/C) exp(y) for -1 ≤ y ≤ 1.

Hence, for -1 ≤ y ≤ 1, the conditional distribution takes the form

f_{X|Y}(x | y) = f_{X,Y}(x,y) / f_Y(y) = exp(-y) for 0 ≤ x ≤ exp(y),

that is, uniform on [0, exp(y)]. Consequently, we have

X_BLSE = E[X | Y] = exp(Y)/2.

(b) To compute the LLSE, we first compute

E[Y] = (1/C) ∫_{-1}^{+1} y exp(y) dy = 2/(Ce)

and

E[Y^2] = (1/C) ∫_{-1}^{+1} y^2 exp(y) dy = (1/C) [ y^2 exp(y) |_{-1}^{+1} - 2 ∫_{-1}^{+1} y exp(y) dy ] = (1/C)(e - 1/e) - 2 E[Y].

Putting these pieces together yields the variance var(Y) = E[Y^2] - (E[Y])^2. Using iterated expectation, we compute

E[X] = E[E[X | Y]] = (1/(2C)) ∫_{-1}^{+1} exp(2y) dy = (1/(4C)) (e^2 - 1/e^2).

Lastly, we compute

E[XY] = (1/C) ∫_{-1}^{+1} ( ∫_{0}^{exp(y)} xy dx ) dy = (1/C) ∫_{-1}^{+1} ( x^2/2 |_{0}^{exp(y)} ) y dy = (1/(2C)) ∫_{-1}^{+1} y exp(2y) dy = (1/(8C)) (e^2 + 3 e^{-2}),

which yields the covariance cov(X,Y) = E[XY] - E[X]E[Y]. With these pieces, the LLSE is given by

X_LLSE = E[X] + (cov(X,Y)/var(Y)) [Y - E[Y]].

In this case, the LLSE and BLSE are different: the BLSE is the optimal predictor over all functions of Y, and it is non-linear in this case. In contrast, the LLSE is restricted to linear functions of the data Y.

(c) Let Z = (X | {X ≤ 1/e}). We compute

P[Z ≤ z, Y ≤ y] = P[X ≤ z, X ≤ 1/e, Y ≤ y] / P[X ≤ 1/e] = P[X ≤ z, Y ≤ y] / P[X ≤ 1/e] for z ≤ 1/e,

so that (Y,Z) is simply uniform over the box {-1 ≤ y ≤ 1, 0 ≤ z ≤ 1/e}. Hence, we have

E[X | Y, (X ≤ 1/e)] = E[Z | Y] = 1/(2e) for all y ∈ [-1,+1].

This BLSE is already linear (in fact constant in y), so it must be equivalent to the LLSE.
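The relationship between the two estimators in parts (a) and (b) is easy to sanity-check numerically. Below is a minimal Monte Carlo sketch, not part of the original exam (the sampler and variable names are mine): it rejection-samples the uniform density over the region and compares the empirical mean squared error of the BLSE exp(Y)/2 with that of a linear predictor fitted from sample moments, which should do no better.

```python
# Minimal Monte Carlo sketch (mine, not from the exam) for Problem 1 (a)-(b).
import numpy as np

rng = np.random.default_rng(0)

# Rejection-sample (X, Y) uniformly over {0 <= x <= exp(y), -1 <= y <= 1}.
n = 1_000_000
x = rng.uniform(0.0, np.e, size=n)
y = rng.uniform(-1.0, 1.0, size=n)
keep = x <= np.exp(y)
x, y = x[keep], y[keep]

blse = np.exp(y) / 2.0                      # claimed E[X | Y] = exp(Y)/2

# Linear predictor a + b*(Y - E[Y]) with coefficients from sample moments.
cov_xy = np.mean(x * y) - x.mean() * y.mean()
b = cov_xy / np.var(y)
llse = x.mean() + b * (y - y.mean())

print("MSE of BLSE exp(Y)/2  :", np.mean((x - blse) ** 2))
print("MSE of linear predictor:", np.mean((x - llse) ** 2))
```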

Problem 2 (18 points): Bob the Gambler: Bob is addicted to gambling, and does so frequently. The time between any two consecutive visits that Bob makes to the local casino can be modeled as an exponentially distributed random variable X with a mean of 1 day. The times between different pairs of consecutive visits are independent random variables. Every time i = 1, 2, 3, ... that Bob gambles, he wins/loses a random amount of money modeled as a Gaussian random variable Y_i with mean 0 and variance σ^2. The amounts of money that he wins/loses on different occasions are independent random variables. Suppose that Bob starts off at time t = 0 with 0 dollars.

(a) (3 pt) What is the distribution of N(t), the number of times Bob goes to the casino before some fixed time t ≥ 0?

(b) (2 pt) What is the PDF of M(t), the amount of money Bob has won/lost by time t?

(c) (5 pt) Using the Chernoff bound, bound the probability that M(t) is greater than a dollars. (Note: You do NOT need to solve the minimization problem in the Chernoff bound.)

(d) (6 pt) For a given t > 0, define a random variable Z_t = M(t)/t, corresponding to the average amount of money that Bob has won/lost at time t. As t → ∞, does Z_t converge in probability? If so, prove it. If not, explain intuitively why not.

Solution:

(a) (2pt) N(t) has a Poisson distribution with parameter t.

(b) (2pt) Since M(t) = Σ_{i=1}^{N(t)} Y_i, conditioning over N(t) we have

f_{M(t)}(m) = Σ_{j=0}^{∞} (t^j / j!) e^{-t} (1/√(2πjσ^2)) e^{-m^2/(2jσ^2)}

(with the j = 0 term understood as a point mass at m = 0).

(c) (5pt) Since M_{M(t)}(s) = M_{N(t)}(s') |_{e^{s'} = M_Y(s)} = e^{t(e^{σ^2 s^2/2} - 1)}, we have

P(M(t) > a) ≤ e^{- max_{s ≥ 0} [sa - t(e^{σ^2 s^2/2} - 1)]}.

(d) (5pt) One would expect that Z_t → 0 in probability. To show this, we use Chebyshev's inequality to bound

P(|Z_t - E[M(t)]/t| > ε) = P(|(1/t) Σ_{i=1}^{N(t)} Y_i| > ε),

where we have used E[M(t)] = E_N[E[M(t) | N = n]] = 0, since E[M(t) | N = n] = 0 by symmetry. First we compute

var(Σ_{i=1}^{N(t)} Y_i) = E[ var(Σ_{i=1}^{N(t)} Y_i | N(t)) ] + var( E[Σ_{i=1}^{N(t)} Y_i | N(t)] ) = E[N(t) σ^2] = t σ^2,

since E[Σ_{i=1}^{N(t)} Y_i | N(t)] = 0 makes the second term vanish. So

P(|(1/t) Σ_{i=1}^{N(t)} Y_i| > ε) ≤ var((1/t) Σ_{i=1}^{N(t)} Y_i) / ε^2 = t σ^2 / (t^2 ε^2) → 0 as t → ∞.
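A quick simulation makes the concentration claim in part (d) concrete. The sketch below is mine, not from the exam; σ and ε are example values (the exam keeps them symbolic). It draws N(t) ~ Poisson(t), draws M(t) given N(t) as a zero-mean Gaussian with variance N(t)σ^2, and compares the empirical tail of Z_t = M(t)/t against the Chebyshev bound σ^2/(tε^2).

```python
# Simulation sketch (mine, not from the exam) for Problem 2 part (d).
import numpy as np

rng = np.random.default_rng(1)
sigma, eps = 2.0, 0.5          # example values; the exam leaves these symbolic

def sample_Z(t, trials=20_000):
    n = rng.poisson(t, size=trials)             # N(t) ~ Poisson(t), rate 1/day
    m = rng.normal(0.0, sigma * np.sqrt(n))     # M(t) | N(t)=n ~ N(0, n*sigma^2)
    return m / t

for t in (10, 100, 1000):
    z = sample_Z(t)
    print(f"t={t:5d}  P(|Z_t| > {eps}) ~ {np.mean(np.abs(z) > eps):.4f}"
          f"   Chebyshev bound {sigma**2 / (t * eps**2):.4f}")
```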

Problem 3: (20 points) True or false: For each of the following statements, either give a counterexample to show that it is false, or provide an argument to justify that it is true. (Note: You will receive no points for just guessing the correct answer; full points will be awarded only when an answer is justified with an example or argument.)

(a) (4 pt) If the Bayes least squares estimator of X given Y is equal to E[X], then X and Y are independent.

(b) (4 pt) If random variables X and Y are independent, then they are conditionally independent given any other random variable Z.

(c) (4 pt) Given a sequence of random variables {X_n} such that E[X_n] = n and E[X_n^2] = (n+1)^2, the sequence Y_n := X_n/n converges in probability to some real number.

(d) (4 pt) If the linear least-squares estimator X_LLSE and Bayes least-squares estimator X_BLSE (of X based on Y) are equal, then the random variables X and Y must be jointly Gaussian.

(e) (4 pt) There do not exist any pairs of random variables X and Y with E[X^4] = 4, E[Y^4] = 1 and E[X^2 Y^2] = 3.

Solutions:

(a) FALSE. Let X ~ N(0,1), and let Y = ZX, where Z is equal to -1 and +1 each with probability 1/2, independently of X. Then conditioned on Y = y, X is equal to -y and +y each with probability 1/2, so that E[X | Y = y] = 0 = E[X]. However, X and Y are certainly not independent.

(b) FALSE. Let X and Y be independent Bernoulli random variables taking the values -1 and +1 with equal probability, and let Z = X + Y. Given Z = 0, we have X = -Y, so X and Y are not conditionally independent given Z.

(c) TRUE. In fact, Y_n converges in probability to 1. We compute E[Y_n] = 1 and var(Y_n) = (2n+1)/n^2. Applying Chebyshev's inequality yields

P[|Y_n - 1| ≥ ε] ≤ var(Y_n)/ε^2 → 0

as n → +∞.

(d) FALSE. Consider part (c) of Problem 1 (this exam).

(e) TRUE. We know that E[(X^2 - Y^2)^2] must be non-negative. We can expand it to E[X^4] - 2E[X^2 Y^2] + E[Y^4]. If we plug in the numbers provided, we get 4 - 2(3) + 1 = -1, but since our original expression had to be non-negative, we see that the provided values are not possible.
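The counterexample in part (a) is also easy to check numerically. The short sketch below (mine, not part of the exam) shows that X and Y = ZX are uncorrelated, so the best constant/linear predictor of X from Y is E[X] = 0, while X^2 and Y^2 are perfectly correlated, so X and Y are certainly not independent.

```python
# Numerical illustration (mine, not from the exam) of the part (a) counterexample.
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
x = rng.standard_normal(n)
z = rng.choice([-1.0, 1.0], size=n)   # random sign, independent of X
y = z * x

print("corr(X, Y)     :", np.corrcoef(x, y)[0, 1])         # ~ 0
print("corr(X^2, Y^2) :", np.corrcoef(x**2, y**2)[0, 1])   # exactly 1: dependent
```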

Problem 4: (20 points) The position of a moving particle can be modeled by a Markov chain on the states {0, 1, 2} with the state transition diagram shown below.

Figure: State transition diagram. From state 0, the particle moves to state 1 with probability 1/2 and to state 2 with probability 1/2; from state 1, it moves to state 0 with probability 1/2 and stays at state 1 with probability 1/2; from state 2, it moves to state 0 with probability 2/3 and stays at state 2 with probability 1/3.

(a) (3 pt) Classify the states in this Markov chain. Is the chain periodic or aperiodic?

(b) (3 pt) In the long run, what fraction of time does the particle spend in state 1?

(c) (4 pt) Suppose that the starting position X_0 is chosen according to the steady-state distribution. Conditioned on X_2 = 2, what is the probability that X_0 = 0?

(d) (5 pt) Suppose that X_0 = 0, and let T denote the first time by which the particle has visited all the states. Compute E[T]. [Hint: This problem is made easier by a suitable choice of conditioning.]

(e) (5 pt) Suppose now we have two particles, both independently moving according to the Markov chain. One particle starts in state 2, and the other particle starts in state 1. What is the average time before at least one of the particles is in state 0?

Solution:

(a) (2pt) The Markov chain has a single recurrent class, and it is aperiodic.

(b) (2pt) Solving the balance equations

π_1 = π_0, (1/2) π_0 = (2/3) π_2, π_0 + π_1 + π_2 = 1,

we get π_0 = 4/11, π_1 = 4/11, π_2 = 3/11.

(c) (2pt) From Bayes' rule,

P(X_0 = 0 | X_2 = 2) = P(X_2 = 2 | X_0 = 0) P(X_0 = 0) / P(X_2 = 2),

where by assumption P(X_0 = 0) = π_0 and, from the total probability theorem,

P(X_2 = 2 | X_0 = 0) = Σ_i P(X_2 = 2 | X_0 = 0, X_1 = i) P(X_1 = i | X_0 = 0) = Σ_i P(X_2 = 2 | X_1 = i) P(X_1 = i | X_0 = 0) = Σ_i p_{i2} p_{0i} = 1/6.

Similarly,

P(X_2 = 2) = Σ_i P(X_2 = 2 | X_0 = i) π_i = Σ_i Σ_j P(X_2 = 2 | X_1 = j, X_0 = i) P(X_1 = j | X_0 = i) π_i = Σ_i Σ_j p_{j2} p_{ij} π_i = (1/6) π_0 + (1/4) π_1 + ((2/3)(1/2) + (1/3)(1/3)) π_2 = 3/11.

So,

P(X_0 = 0 | X_2 = 2) = (1/6) π_0 / (3/11) = 2/9.

(d) (5pt) Conditioning on the first move of the particle,

E[T] = (1/2) E[T | X_1 = 1] + (1/2) E[T | X_1 = 2],

where E[T | X_1 = 1] is the mean first passage time to state 2 starting from state 1, plus one (to account for the first step to 1), and E[T | X_1 = 2] is the mean first passage time to state 1 starting from state 2, plus one (to account for the first step to 2). From first-step analysis, one can write two systems of equations, the first of which is given below:

t_1 = 1 + (1/2) t_0 + (1/2) t_1,
t_0 = 1 + (1/2) t_1 + (1/2) t_2.

Here we are using the book's notation, with t_i denoting the mean first passage time from state i to state 2 (the 2 is not made explicit). Therefore, t_2 = 0. We can solve the system to get t_1 = 6, or E[T | X_1 = 1] = t_1 + 1 = 7. Now for the second system:

t_2 = 1 + (2/3) t_0 + (1/3) t_2,
t_0 = 1 + (1/2) t_1 + (1/2) t_2.

Here the t_i denote mean first passage times to state 1, so t_1 = 0, and we can solve to get t_2 = 5, or E[T | X_1 = 2] = t_2 + 1 = 6. Therefore our final answer is

E[T] = (1/2)(6) + (1/2)(7) = 13/2.

(e) (3pt) The problem asks us to compute E[Z], where Z = min(X_1, X_2) and X_1 and X_2 are geometric random variables with parameters 1/2 and 2/3, respectively (on each step, the particle at state 1 moves to state 0 with probability 1/2, and the particle at state 2 moves to state 0 with probability 2/3). First, let us compute the tail of a geometric random variable X with parameter p:

P(X > z) = Σ_{k=z+1}^{∞} p(1-p)^{k-1} = p(1-p)^z Σ_{k=0}^{∞} (1-p)^k = p(1-p)^z / (1 - (1-p)) = (1-p)^z.

Hence,

P(Z > z) = P(min(X_1, X_2) > z) = P(X_1 > z) P(X_2 > z) = (1 - 1/2)^z (1 - 2/3)^z = (1/6)^z,

and

E[Z] = Σ_{z=0}^{∞} P(Z > z) = Σ_{z=0}^{∞} (1/6)^z = 1/(1 - 1/6) = 6/5.
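All of the Problem 4 answers can be checked by simulating the chain directly. The sketch below is mine, not part of the exam; the transition matrix is the one implied by the first-step equations above, and the printed estimates should land near π = (4/11, 4/11, 3/11), E[T] = 13/2, and E[Z] = 6/5.

```python
# Simulation sketch (mine, not from the exam) for the Problem 4 answers.
import numpy as np

rng = np.random.default_rng(3)
# Transition matrix implied by the first-step equations in the solution.
P = np.array([[0.0, 1/2, 1/2],
              [1/2, 1/2, 0.0],
              [2/3, 0.0, 1/3]])

def step(s):
    return rng.choice(3, p=P[s])

# (b) long-run fractions of time, from one long sample path
s, counts = 0, np.zeros(3)
for _ in range(200_000):
    s = step(s)
    counts[s] += 1
print("empirical pi :", counts / counts.sum())    # ~ (4/11, 4/11, 3/11)

# (d) expected time, starting from state 0, to have visited all states
def cover_time():
    s, visited, t = 0, {0}, 0
    while len(visited) < 3:
        s = step(s)
        visited.add(s)
        t += 1
    return t
print("E[T] ~", np.mean([cover_time() for _ in range(20_000)]))   # ~ 13/2

# (e) first time at least one of two independent particles (started at 1 and 2) is in state 0
def first_visit_time():
    a, b, t = 1, 2, 0
    while a != 0 and b != 0:
        a, b, t = step(a), step(b), t + 1
    return t
print("E[Z] ~", np.mean([first_visit_time() for _ in range(20_000)]))  # ~ 6/5
```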

Problem 5: (20 points) Action figure collector: Acme Brand Chocolate Eggs are hollow on the inside, and each egg contains an action figure, chosen uniformly from a set of four different action figures. The price of any given egg (in dollars) is an exponentially distributed random variable with parameter λ, and the prices of different eggs are independent. For i = 1, ..., 4, let T_i be a random variable corresponding to the number of eggs that you purchase in order to have i different action figures. To be precise, after purchasing T_2 eggs (and not before), you have at least one copy of exactly 2 different action figures; and after purchasing T_4 eggs (and not before), you have at least one copy of all four action figures.

(a) (2 pt) What is the PMF, expected value, and variance of T_1?

(b) (4 pt) What is the PMF and expected value of T_2?

(c) (5 pt) Compute E[T_4] and var(T_4). (Hint: Find a representation of T_4 as a sum of independent RVs.)

(d) (5 pt) Compute the moment generating function of T_4.

(e) (4 pt) You keep buying eggs until you have collected all four action figures. Letting Z be a random variable representing the total amount of money (in dollars) that you spend, compute E[Z] and var[Z].

Solution:

(a) P(T_1 = 1) = 1 and P(T_1 = i) = 0 for i ≠ 1. Thus E[T_1] = 1 and var(T_1) = 0.

(b) T_2 = T_1 + X_2, where X_2 is a Geometric(3/4) r.v. So

P(T_2 = k) = (3/4)(1/4)^{k-2} for k = 2, 3, ...,

and E[T_2] = E[T_1] + E[X_2] = 1 + 4/3 = 7/3.

(c) T_4 = 1 + X_2 + X_3 + X_4, where X_3 is a Geometric(1/2) r.v. and X_4 is a Geometric(1/4) r.v. From this, one gets

E[T_4] = 1 + E[X_2] + E[X_3] + E[X_4] = 25/3

and

var[T_4] = var[X_2] + var[X_3] + var[X_4] = 130/9.

(d) From M_{T_4}(s) = e^s M_{X_2}(s) M_{X_3}(s) M_{X_4}(s), we get

M_{T_4}(s) = e^s ((3/4)e^s / (1 - (1/4)e^s)) ((1/2)e^s / (1 - (1/2)e^s)) ((1/4)e^s / (1 - (3/4)e^s)).

(e) Since Z = Σ_{i=1}^{T_4} P_i, where P_i is the price of egg i, we have

E[Z] = E[E[Z | T_4]] = (1/λ) E[T_4] = 25/(3λ)

and

var[Z] = var[E[Z | T_4]] + E[var[Z | T_4]] = (1/λ^2)(130/9) + (25/3)(1/λ^2) = 205/(9λ^2).
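The coupon-collector structure in parts (c) and (e) can be verified with a short simulation. The sketch below is my own illustration, not part of the exam; λ = 2 is an arbitrary example value, and the estimates should land near E[T_4] = 25/3, var(T_4) = 130/9, E[Z] = 25/(3λ) and var(Z) = 205/(9λ^2).

```python
# Simulation sketch (mine, not from the exam) for Problem 5 parts (c) and (e).
import numpy as np

rng = np.random.default_rng(4)
lam = 2.0               # example price rate; the exam keeps lambda symbolic
trials = 50_000

def eggs_to_collect_all():
    seen, t = set(), 0
    while len(seen) < 4:
        seen.add(int(rng.integers(4)))   # each egg holds one of 4 figures, uniformly
        t += 1
    return t

T4 = np.array([eggs_to_collect_all() for _ in range(trials)])
Z = np.array([rng.exponential(1.0 / lam, size=t).sum() for t in T4])

print("E[T4]  ~", T4.mean(), "  exact 25/3        =", 25 / 3)
print("var T4 ~", T4.var(),  "  exact 130/9       =", 130 / 9)
print("E[Z]   ~", Z.mean(),  "  exact 25/(3 lam)  =", 25 / (3 * lam))
print("var Z  ~", Z.var(),   "  exact 205/(9 lam^2) =", 205 / (9 * lam**2))
```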