University of Illinois ECE 313: Conflict Final Exam, Spring 2014
Tuesday, May 13, 2014, 7:00-10:00 p.m., Room 241 Everitt Lab


1. [18 points] Consider an experiment in which a fair coin is tossed repeatedly, once every ten seconds, with the first toss happening ten seconds after time zero. Assume the tosses are independent.

(a) (4 points) Associated to the $k$th toss, define a random variable $X_k$ that takes the value 1 if the outcome is heads and 0 otherwise. What is the pmf of $X_k$?

The distribution of $X_k$ is Bernoulli with $p = 0.5$, so $p_{X_k}(0) = p_{X_k}(1) = 0.5$.

(b) (2 points) What is the name of the random process defined by the infinite sequence of random variables $X_1, X_2, \ldots$?

The sequence $X_1, X_2, \ldots$ is a Bernoulli process.

(c) (6 points) Find the pmf for the total number of heads that appear in the first 10 minutes.

In the first ten minutes there are $(6 \text{ tosses/min})(10 \text{ min}) = 60$ tosses. Let $C_{60}$ denote the number of heads that appear in 60 tosses; then $C_{60}$ is binomially distributed with pmf
$$p_{C_{60}}(k) = \binom{60}{k}(0.5)^k (0.5)^{60-k} = \binom{60}{k}(0.5)^{60}, \qquad 0 \le k \le 60.$$

(d) (6 points) Find the pmf for the time it takes until a heads shows for the first time.

Since the tosses occur every 10 s, the time elapsed until a heads shows up for the first time is $T_1 = 10 L_1$ [s], where $L_1$ is a geometric random variable with parameter $p = 0.5$. Thus, $p_{T_1}(10k) = (0.5)^k$, $k = 1, 2, \ldots$.

2. [24 points] Let $T_1$ and $T_2$ denote two independent exponential random variables with the same parameter, $\lambda$.

(a) (6 points) Find the pdf of $T_t = T_1 + T_2$.

Since $T_1$ and $T_2$ are independent, we can use the convolution formula; thus,
$$f_{T_t}(t) = \int_0^t \lambda^2 e^{-\lambda u} e^{-\lambda(t-u)}\, du = \int_0^t \lambda^2 e^{-\lambda t}\, du = \lambda^2 t e^{-\lambda t}, \qquad t \ge 0.$$
ALTERNATIVELY, the sum of $r$ independent exponentially distributed random variables with parameter $\lambda$ has the Erlang distribution with parameters $r$ and $\lambda$. The Erlang distribution for $r = 2$ has pdf $\lambda^2 t e^{-\lambda t}$ for $t \ge 0$.

(b) (6 points) Find the pdf of $T_s = \min\{T_1, T_2\}$.

First we find the CDF of $T_s$ and then differentiate to obtain the pdf. From the definition of the CDF,
$$F_{T_s}(t) = P\{T_s \le t\} = 1 - P\{\min\{T_1, T_2\} > t\} = 1 - P(\{T_1 > t\} \cap \{T_2 > t\}) = 1 - P\{T_1 > t\}P\{T_2 > t\} = 1 - e^{-\lambda t} e^{-\lambda t} = 1 - e^{-2\lambda t},$$
where the second-to-last equality uses the independence of $T_1$ and $T_2$. Differentiating yields $f_{T_s}(t) = 2\lambda e^{-2\lambda t}$ for $t \ge 0$. That is, $T_s$ has the exponential distribution with parameter $2\lambda$.

(c) (6 points) Find the pdf of $T_p = \max\{T_1, T_2\}$.

Similarly to part (b), first we find the CDF of $T_p$ and then differentiate. From the definition of the CDF and independence,
$$F_{T_p}(t) = P(\max\{T_1, T_2\} \le t) = P(\{T_1 \le t\} \cap \{T_2 \le t\}) = P(T_1 \le t)P(T_2 \le t) = (1 - e^{-\lambda t})^2 = 1 - 2e^{-\lambda t} + e^{-2\lambda t}.$$
Finally, by differentiating, we obtain $f_{T_p}(t) = 2\lambda(e^{-\lambda t} - e^{-2\lambda t})$ for $t \ge 0$.

(d) (6 points) Without doing any calculation, determine which is larger, $E[T_s]$ or $E[T_p]$. Explain your reasoning.

Let $D = T_p - T_s$, so that $T_p = T_s + D$. Then $P\{D > 0\} = 1$: since $T_1$ and $T_2$ are continuous, $T_1 \ne T_2$ with probability one, so the minimum of the two values is strictly less than the maximum. Hence $E[D] > 0$, and therefore $E[T_p] = E[T_s] + E[D] > E[T_s]$.
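As a numerical cross-check of Problem 2 (an addition, not part of the original exam), the minimal sketch below simulates two independent exponentials and compares sample means of the sum, minimum, and maximum against the values implied by the pdfs derived above: $E[T_t] = 2/\lambda$, $E[T_s] = 1/(2\lambda)$, and $E[T_p] = 3/(2\lambda)$. The choice $\lambda = 1.5$ and the use of numpy are arbitrary assumptions.

```python
# Monte Carlo sanity check for Problem 2 (not from the exam itself).
# E[max] = 3/(2*lambda) follows either by integrating the pdf in (c) or
# by memorylessness: wait Exp(2*lambda) for the min, then Exp(lambda) more.
import numpy as np

rng = np.random.default_rng(0)
lam, n = 1.5, 1_000_000                      # lambda chosen arbitrarily
t1 = rng.exponential(scale=1 / lam, size=n)  # numpy uses the mean 1/lambda
t2 = rng.exponential(scale=1 / lam, size=n)

print(np.mean(t1 + t2), 2 / lam)                   # sum:  ~1.333
print(np.mean(np.minimum(t1, t2)), 1 / (2 * lam))  # min:  ~0.333
print(np.mean(np.maximum(t1, t2)), 3 / (2 * lam))  # max:  ~1.0
```

Note that the three sample means also confirm part (d) directly: the max has a strictly larger mean than the min.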

3. [24 points] At a tennis tournament, Serena is up against an opponent she has never met before. The match is best of five sets: whoever first wins three sets is the winner. There are three equally likely scenarios for Serena:

- She is the stronger player and wins each set with probability 3/4.
- She is the weaker player and wins each set with probability 1/4.
- She is equally good as her opponent and wins each set with probability 1/2.

Given which scenario holds, the outcomes of the sets are mutually independent.

(a) (6 points) What is the probability the match ends in three straight sets?

$P\{WWW\} = P\{LLL\} = \frac{1}{3}\left((3/4)^3 + (1/4)^3 + (1/2)^3\right) = 3/16$. So the probability the match ends in three sets is $P\{WWW\} + P\{LLL\} = 3/8$.

(b) (6 points) What is the probability Serena loses the match?

1/2, by symmetry: the set of scenarios is symmetric between Serena and her opponent.

(c) (6 points) It turns out that Serena lost the match 1-3. What was the probability for that to happen?

Losing 1-3 means one of the sequences $WLLL$, $LWLL$, $LLWL$ occurred, and within each scenario these three sequences are equally likely. So
$$P(\text{lost 1-3}) = 3 \cdot \frac{1}{3}\left((3/4)(1/4)^3 + (1/4)(3/4)^3 + (1/2)(1/2)^3\right) = \frac{3}{256} + \frac{27}{256} + \frac{16}{256} = \frac{46}{256} = \frac{23}{128}.$$

(d) (6 points) After the match Serena decides to reevaluate herself. Given that she lost 1-3, what is the conditional probability Serena is a stronger player than her opponent?

By Bayes' rule, using the three scenario likelihoods computed in part (c), the answer is $\frac{3}{3 + 27 + 16} = \frac{3}{46}$.

4. [24 points] Suppose $(X, Y)$ is uniformly distributed within the triangular region $\{(u, v) : 0 \le v \le u \le 1\}$, i.e.,
$$f_{X,Y}(u, v) = \begin{cases} 2 & 0 \le v \le u \le 1 \\ 0 & \text{else.} \end{cases}$$

(a) (6 points) Are $X$ and $Y$ independent? Explain your answer.

No: the support of $f_{X,Y}$ is not a product set.

(b) (6 points) Suppose we want to estimate $Y$ from $X$ in the way that minimizes the mean-square error (MSE). Find the best unconstrained estimator $g^*(X)$ and the corresponding MSE.

Conditioned on $X = u$, $Y$ is uniformly distributed on $[0, u]$. Therefore $g^*(u) = E[Y \mid X = u] = u/2$. The MMSE is $E[(Y - g^*(X))^2] = E[Y^2] - E[g^*(X)^2]$, where
$$E[Y^2] = \int_0^1 2(1 - v)\,v^2\, dv = \frac{1}{6}, \qquad E[X^2] = \int_0^1 2u \cdot u^2\, du = \frac{1}{2}.$$
So $\text{MMSE} = E[Y^2] - E[X^2]/4 = 1/6 - 1/8 = 1/24$.

(c) (6 points) Find the best linear estimator $L^*(X)$ and the corresponding MSE.

Since $g^*$ is linear, it is also the best linear estimator. Therefore $L^*$ coincides with $g^*$ and the MSE is also 1/24. Alternatively, we could use the formulas for $L^*$ and for the associated MSE.

(d) (6 points) Find the correlation coefficient $\rho_{X,Y}$.

By symmetry, $\sigma_X = \sigma_Y$. By the general formula, $L^*(X) = \mu_Y + \rho_{X,Y}\frac{\sigma_Y}{\sigma_X}(X - \mu_X)$, whereas $L^*(X) = X/2$. Matching the slopes of these two expressions for $L^*$ yields $\rho_{X,Y} = 1/2$. ALTERNATIVELY, we can calculate $\text{Cov}(X, Y)$, $\text{Var}(X)$, and $\text{Var}(Y)$ (which is equal to $\text{Var}(X)$) and use the definition of $\rho_{X,Y}$.
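A short simulation (again an added check, not from the exam) can confirm Problem 4(b) and 4(d): sorting two independent Uniform(0,1) draws gives a point uniformly distributed on the triangle $\{0 \le v \le u \le 1\}$, after which the empirical MSE of the estimator $X/2$ should approach 1/24 and the sample correlation should approach 1/2.

```python
# Numerical sanity check for Problem 4 (not part of the original exam).
# (max, min) of two iid Uniform(0,1) draws is uniform on the triangle
# {0 <= v <= u <= 1}, matching the joint density f = 2 there.
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(size=(1_000_000, 2))
x, y = a.max(axis=1), a.min(axis=1)

mmse = np.mean((y - x / 2) ** 2)   # expect ~1/24 = 0.0417
rho = np.corrcoef(x, y)[0, 1]      # expect ~0.5
print(mmse, 1 / 24)
print(rho, 0.5)
```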

5. [18 points] Consider a vehicle traveling along a straight road in the direction of increasing mile markers. Suppose that at time zero the location of the vehicle, $U$, is uniformly distributed between zero and one (i.e., between mile marker zero and mile marker one), and suppose the speed of the vehicle, $V$, is constant over time, exponentially distributed with parameter $\lambda > 0$, and independent of $U$. Thus, the location of the vehicle as a function of time $t$ (measured in hours) is given by $X_t = U + Vt$.

(a) (6 points) Identify the joint pdf of $(U, V)$. Be sure to specify the support (i.e., where the joint pdf is not zero).

$$f_{U,V}(u, v) = f_U(u) f_V(v) = \begin{cases} \lambda e^{-\lambda v} & 0 \le u \le 1,\ v \ge 0 \\ 0 & \text{else.} \end{cases}$$

(b) (6 points) Find the mean and variance of $X_t$ for a fixed $t > 0$. Make your answer as explicit as possible.

$E[X_t] = E[U] + tE[V] = \frac{1}{2} + \frac{t}{\lambda}$, and, since $U$ and $V$ are independent, $\text{Var}(X_t) = \text{Var}(U) + t^2\,\text{Var}(V) = \frac{1}{12} + \frac{t^2}{\lambda^2}$.

(c) (6 points) Let $T$ be the time the vehicle reaches the one mile marker. Find the CDF of $T$.

Using $U + VT = 1$ yields $T = \frac{1-U}{V}$. Clearly $F_T(t) = 0$ for $t \le 0$. For $t > 0$,
$$F_T(t) = P\left\{\frac{1-U}{V} \le t\right\} = P\left\{V \ge \frac{1-U}{t}\right\} = \int_0^1 \int_{(1-u)/t}^{\infty} \lambda e^{-\lambda v}\, dv\, du = \int_0^1 e^{-\lambda(1-u)/t}\, du = \left[\frac{t}{\lambda}\, e^{-\lambda(1-u)/t}\right]_{u=0}^{1} = \frac{t}{\lambda}\left(1 - e^{-\lambda/t}\right).$$
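The closed form in 5(c) is easy to cross-check by simulation (added here; $\lambda = 2$ is an arbitrary choice): sample $(U, V)$, form $T = (1-U)/V$, and compare the empirical CDF with $\frac{t}{\lambda}(1 - e^{-\lambda/t})$ at a few points.

```python
# Simulation cross-check for Problem 5(c) (not from the original exam).
import numpy as np

rng = np.random.default_rng(0)
lam, n = 2.0, 1_000_000                      # arbitrary lambda
u = rng.uniform(size=n)                      # initial position
v = rng.exponential(scale=1 / lam, size=n)   # speed, mean 1/lambda
t_reach = (1 - u) / v                        # time to reach mile marker one

for t in (0.2, 0.5, 1.0):
    empirical = np.mean(t_reach <= t)
    closed_form = (t / lam) * (1 - np.exp(-lam / t))
    print(t, empirical, closed_form)         # the two columns should agree
```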

6. [14 points] Suppose 192 tickets are sold for an airplane flight with 150 seats. Suppose each passenger arrives for the flight with probability $p = 0.75$.

(a) (7 points) Assuming each passenger arrives independently with probability $p$, use the central limit theorem (CLT) to express the approximate probability the flight is oversold (i.e., strictly more than 150 passengers arrive for the flight) in terms of the $Q$ function. You do not need to use the continuity correction.

Let $X$ denote the number of passengers that arrive for the flight. By the assumptions, $X$ has the binomial distribution with parameters $n = 192$ and $p = 0.75$. Thus $E[X] = (192)(0.75) = 144$ and $\text{Var}(X) = 192(0.75)(0.25) = \frac{3 \cdot 192}{16} = 36$. So, using the Gaussian approximation without a continuity correction,
$$P\{X > 150\} = P\left\{\frac{X - 144}{6} > \frac{150 - 144}{6}\right\} \approx Q\left(\frac{150 - 144}{6}\right) = Q(1).$$
For the continuity correction we would start with $P\{X \ge 150.5\}$, yielding $Q(6.5/6)$. We could also start with $P\{X \ge 151\}$, giving $Q(7/6)$.

(b) (7 points) Suppose instead that the 192 passengers consist of 96 pairs of passengers, such that for each pair, both passengers in the pair arrive for the flight with probability $p = 0.75$; otherwise neither passenger in the pair arrives for the flight. Assume pairs arrive independently. The number of seats on the plane is still 150. Find the Gaussian approximation for the probability the flight is oversold.

If we let $Y_i$ denote the number of passengers that arrive from the $i$th pair, then the total number of passengers that arrive can be written as $S = Y_1 + \cdots + Y_{96}$, where the $Y$'s are independent and $P\{Y_i = 2\} = p$ and $P\{Y_i = 0\} = 1 - p$. Thus $S$ has mean $96(2)p = 144$ and variance $96(4)p(1-p) = 72$. (So the effect of pairing is a larger variance than in part (a).) Thus,
$$P\{S > 150\} = P\left\{\frac{S - 144}{\sqrt{72}} > \frac{150 - 144}{\sqrt{72}}\right\} \approx Q\left(\frac{150 - 144}{\sqrt{72}}\right) = Q(1/\sqrt{2}).$$
(Slightly different correct answers are possible, as in part (a). Another way to solve this part is to focus on the number of pairs of passengers that arrive, with the flight being oversold if more than 75 pairs of passengers arrive.)

7. [24 points] Suppose under hypothesis $H_0$ the observation $X$ has density $f_0$, and under hypothesis $H_1$ the observation $X$ has density $f_1$, where the densities are given by
$$f_0(u) = \begin{cases} 1/2 & -1 \le u \le 1 \\ 0 & \text{else} \end{cases} \qquad f_1(u) = \begin{cases} 1 - |u| & -1 \le u \le 1 \\ 0 & \text{else.} \end{cases}$$

(a) (6 points) Describe the ML decision rule for deciding which hypothesis is true for observation $X$.

ML decision rule: declare $H_1$ if $f_1(X) > f_0(X)$, i.e., if $|X| < 1/2$, and declare $H_0$ otherwise.

(b) (6 points) Find $p_{\text{false alarm}}$ and $p_{\text{miss}}$ for the ML rule.

$$p_{\text{false alarm}} = P(\text{declare } H_1 \mid H_0 \text{ is true}) = P\left(-\tfrac{1}{2} \le X \le \tfrac{1}{2} \mid H_0\right) = \frac{1}{2},$$
$$p_{\text{miss}} = P(\text{declare } H_0 \mid H_1 \text{ is true}) = P\left(|X| \ge \tfrac{1}{2} \mid H_1\right) = \frac{1}{4}.$$

(c) (6 points) Suppose it is assumed a priori that $H_0$ is true with probability $\pi_0$ and $H_1$ is true with probability $\pi_1 = 1 - \pi_0$. For what values of $\pi_0$ does the MAP decision rule declare $H_0$ with probability one, no matter which hypothesis is really true?

In order for $H_0$ to be declared with probability one, we need $\frac{f_1(u)}{f_0(u)} \le \frac{\pi_0}{\pi_1}$ for all $u$. The maximum of $\frac{f_1(u)}{f_0(u)}$ is 2 (it occurs at $u = 0$; draw a sketch), so we need $\pi_0 \ge 2\pi_1$, or $\pi_0 \ge \frac{2}{3}$.

(d) (6 points) For the prior distribution $\pi_0 = 0.2$ and $\pi_1 = 0.8$, find $P(H_0 \mid |X| < 0.5)$.

$$P(H_0 \mid |X| < 0.5) = \frac{P(H_0,\ |X| < 0.5)}{P(|X| < 0.5)} = \frac{0.2 \times 0.5}{0.2 \times 0.5 + 0.8 \times 0.75} = \frac{1}{7}.$$
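The error probabilities in 7(b) can be checked empirically (an added sketch, not exam material): under $H_0$, $X$ is Uniform(-1, 1); under $H_1$, the triangular density $1 - |u|$ on $[-1, 1]$ is the law of $U_1 + U_2 - 1$ for iid Uniform(0,1) draws, which gives a simple sampler.

```python
# Simulation cross-check for Problem 7(b) (not from the original exam).
# The ML rule derived in (a) declares H1 exactly when |X| < 1/2.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x_h0 = rng.uniform(-1, 1, size=n)                      # X under H0
x_h1 = rng.uniform(size=n) + rng.uniform(size=n) - 1   # X under H1 (triangular)

p_false_alarm = np.mean(np.abs(x_h0) < 0.5)   # expect 1/2
p_miss = np.mean(np.abs(x_h1) >= 0.5)         # expect 1/4
print(p_false_alarm, p_miss)
```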

8. [24 points] Suppose $X$ has the pdf
$$f_X(u) = \begin{cases} 0.5u & 0 \le u \le 2 \\ 0 & \text{else.} \end{cases}$$

(a) (6 points) Find $E[X^2]$.

$E[X^2] = \int_0^2 0.5u^3\, du = \frac{u^4}{8}\Big|_0^2 = 2.$

(b) (6 points) Find $P(\lfloor X \rfloor = 1)$, where $\lfloor u \rfloor$ is the greatest integer less than or equal to $u$.

$P(\lfloor X \rfloor = 1) = P(1 \le X < 2) = \int_1^2 0.5u\, du = \frac{u^2}{4}\Big|_1^2 = \frac{3}{4}.$

(c) (6 points) Find the cumulative distribution function (CDF) of $Y = \ln X$.

The support of $Y$ is $(-\infty, \ln 2]$. For $c \le \ln 2$,
$$F_Y(c) = P\{Y \le c\} = P\{\ln X \le c\} = P\{X \le e^c\} = \int_0^{e^c} 0.5u\, du = \frac{u^2}{4}\Big|_0^{e^c} = \frac{e^{2c}}{4},$$
and $F_Y(c) = 1$ for $c > \ln 2$.

(d) (6 points) If $U$ is uniformly distributed over the interval $[0, 1]$, find a function $g$ such that $g(U)$ has the same distribution as $X$.

For $0 \le c \le 2$, $F_X(c) = P\{X \le c\} = \int_0^c 0.5u\, du = \frac{c^2}{4}$. Setting $\frac{c^2}{4} = u$ gives $c = 2\sqrt{u}$, hence $g(u) = F_X^{-1}(u) = 2\sqrt{u}$ for $0 \le u \le 1$.
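Part (d) is inverse-CDF sampling, and it ties the whole problem together: if $g(U) = 2\sqrt{U}$ really has density $0.5u$ on $[0, 2]$, then simulated samples should reproduce the answers to parts (a) and (b). A minimal added check:

```python
# Check of Problem 8(d) by simulation (an added illustration, not exam
# material): g(U) = 2*sqrt(U) for U ~ Uniform(0,1) should have the
# density 0.5u on [0, 2].
import numpy as np

rng = np.random.default_rng(0)
x = 2 * np.sqrt(rng.uniform(size=1_000_000))

print(np.mean(x ** 2))              # expect E[X^2] = 2, as in part (a)
print(np.mean(np.floor(x) == 1))    # expect P(floor(X) = 1) = 3/4, as in (b)
```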

9. [30 points] (3 points per answer) In order to discourage guessing, 3 points will be deducted for each incorrect answer (no penalty or gain for blank answers). A net negative score will reduce your total exam score.

(a) This part concerns different choices for the joint pdf of random variables $X$ and $Y$. The constant $c$ stands for a normalizing constant in each part. TRUE or FALSE:

- $X$ and $Y$ are independent if $f_{X,Y}(u, v) = \begin{cases} ce^{-uv} & 0 \le u \le 1,\ 0 \le v \le 1 \\ 0 & \text{else.} \end{cases}$
- $X$ and $Y$ are independent if $f_{X,Y}(u, v) = \begin{cases} ce^{-uv} & u^2 + v^2 \le 1 \\ 0 & \text{else.} \end{cases}$

False, False. (The factor $e^{-uv}$ does not factor into a function of $u$ times a function of $v$, and in the second case the support is not a product set.)

(b) Suppose $X$ is a continuous-type random variable with CDF $F_X(u)$. TRUE or FALSE:

- If $b > a$, then $F_X(b) > F_X(a)$.
- If $F_X(b) > F_X(a)$, then $b > a$.
- $F_X(\alpha) = 1/2$ for some $\alpha$ with $-\infty < \alpha < \infty$.

False, True, True. (A CDF can be flat over an interval; a CDF is nondecreasing; the CDF of a continuous-type random variable is continuous and ranges from 0 to 1, so it takes the value 1/2.)

(c) Let $(X, Y)$ be uniformly distributed over $[0, 1] \times [0, 1]$. TRUE or FALSE:

- The support of $Z = X - Y$ is $[-1, 1]$.
- $Z = X + Y$ is uniformly distributed over $[0, 2]$.

True, False. ($X + Y$ has a triangular density on $[0, 2]$, not a uniform one.)

(d) Let $X_1, X_2, \ldots$ be an infinite sequence of independent Bernoulli random variables, all with parameter $p \in (0, 1)$. TRUE or FALSE:

- $C_{10} = \sum_{k=1}^{10} X_k$ has a geometric distribution with parameter $p$.
- Let $C_m = \sum_{k=1}^{m} X_k$; then the mean of $C_m - C_{m-1}$ is $p$.
- $X = C_{10} - C_8$ and $Y = C_9 - C_7$ are independent random variables.

False, True, False. ($C_{10}$ is binomial, not geometric; $C_m - C_{m-1} = X_m$, which has mean $p$; $X$ and $Y$ both include the term $X_9$, so they are dependent.)
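For the last statement in (d), dependence is easy to see numerically as well (an added sketch; $p = 0.3$ is an arbitrary choice): $X = X_9 + X_{10}$ and $Y = X_8 + X_9$ share the term $X_9$, so $\text{Cov}(X, Y) = \text{Var}(X_9) = p(1-p)$ and the correlation coefficient is $\frac{p(1-p)}{2p(1-p)} = 1/2$, not zero.

```python
# Quick dependence check for Problem 9(d) (not part of the original exam).
import numpy as np

rng = np.random.default_rng(0)
p, n = 0.3, 1_000_000
bern = (rng.uniform(size=(n, 3)) < p).astype(float)  # columns: X8, X9, X10

x = bern[:, 1] + bern[:, 2]     # X9 + X10 = C10 - C8
y = bern[:, 0] + bern[:, 1]     # X8 + X9  = C9 - C7
print(np.corrcoef(x, y)[0, 1])  # ~0.5, nonzero, so X and Y are dependent
```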