Assignment 9. Due: July 14, 2017. Instructor: Dr. Mustafa El-Halabi.

CCE 40: Communication Systems, Summer 2016-2017. Assignment 9. Due: July 14, 2017. Instructor: Dr. Mustafa El-Halabi.

Instructions: You are strongly encouraged to type out your solutions using the mathematical typing mode of some office suite. If you must handwrite your homework, please write clearly and legibly. I will not grade homework that is unreadable.

Note: You must justify all your answers. In particular, you will get no credit if you simply write the final answer without any explanation. Only a few selected problems will be graded.

Problem 1. Union Bound. Using the union bound, show that

P(∩_{i=1}^n A_i) ≥ 1 − Σ_{i=1}^n P(A_i^c).

Problem 2. Convex Combination. Let Ω be a sample space equipped with two probability measures, P_1 and P_2. Given 0 ≤ λ ≤ 1, we define the convex combination of P_1 and P_2 as P(A) = λ P_1(A) + (1 − λ) P_2(A). Show that P satisfies the Kolmogorov axioms of probability.

Problem 3. Probability Bounds. Let A and B be events with probabilities P(A) = 3/4 and P(B) = 1/3. Show that

1/12 ≤ P(A ∩ B) ≤ 1/3.

Problem 4. Internet Error. Consider three cities A, B, and C connected with fiber-optic cables to provide internet service. An internet configuration error takes place, and as a result packets transmitted from A to B are routed through C with probability 3/4. If a packet is routed through C, it has a probability of 1/3 of being dropped. If a packet is not routed through C, it has a probability of 1/4 of being dropped.
1. What is the probability that a packet is dropped between A and B?
2. What is the probability that a packet is routed through C given that it is not dropped?
3. Assume that a packet is transmitted from A to C, from C to B, and from B to your laptop. If a drop on each of the three paths occurs independently with probability 1/3, find the probability that a packet gets from A to your laptop successfully.

Problem 5. Snow. There are two roads from A to B and two roads from B to C. Each of the roads is blocked by snow with probability p, independently of the others.
1. Find the probability that there is an open road from A to B.
2. Find the probability that there is an open road from A to B given that there is a closed road from A to C.
3. If, in addition, there is a direct road from A to C, this road being blocked with probability p independently of the others, find the required conditional probability.

Problem 6. Independence. Assume that A and B are independent events.
1. Show that A^c and B^c are independent.
2. Show that A^c and B are independent.

Problem 7. Conditional Probability Law. Prove that, given a fixed event B, the conditional probability P(A | B) satisfies the Kolmogorov axioms and hence is a valid probability law.

Problem 8. A Convergence. Assume that P(A_i) = 1 for every i. Show that

P(∩_{i=1}^∞ A_i) = 1.

Problem 9. Joint Source-Channel Communication. A binary source generates a sequence of 7 bits according to Ber(1/2).
1. What is the probability that at least two 1s occur?
2. The source feeds a binary noisy channel with per-bit error probability equal to 0.1. What is the probability of having more than one error in the received bits?
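Both parts of Problem 9 are binomial tail probabilities. As a sketch for checking a hand computation (not part of the original assignment), they can be evaluated exactly as finite sums:

```python
from math import comb

def binom_at_least(n, p, k):
    """P(X >= k) for X ~ Binomial(n, p), evaluated as an exact finite sum."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Problem 9.1: at least two 1s among 7 Ber(1/2) bits
print(binom_at_least(7, 0.5, 2))  # 1 - (1 + 7)/2**7 = 120/128 = 0.9375

# Problem 9.2: more than one error among the 7 received bits, per-bit error 0.1
print(binom_at_least(7, 0.1, 2))  # 1 - 0.9**7 - 7*0.1*0.9**6 = 0.1496944
```

The complement form in the comments (subtracting the k = 0 and k = 1 terms) is usually the faster route by hand.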

CCE 40: Communication Systems, Summer 2016-2017. Assignment 10. Due: July 18, 2017. Instructor: Dr. Mustafa El-Halabi.

Problem 1. 5 before 7. Consider the random experiment that consists of rolling a pair of fair dice, where the outcome of a roll is the sum of the dice. Assume that every trial is independent of the others.
1. Without using conditional probabilities, what is the probability that an outcome of 5 appears before an outcome of 7?
2. Using the law of total probability, what is the probability that an outcome of 5 appears before an outcome of 7?

Problem 2. Repetition Coding and Majority Selector. Assume that a binary source produces a sequence of binary digits (zeros and ones) at a rate of 1 digit per second. Suppose that the digits 0 and 1 are equally likely to occur and that they are produced independently. The digits are transmitted through a channel whose probability of error is assumed to be p = 1/3 and which acts on successive inputs independently. We also assume that digits can be transmitted through the channel at a rate not exceeding 1 digit per second. Now, a probability of error of 1/3 may be far too high in a given application, and we would naturally look for ways of improving reliability. One way that might come to mind involves sending each source digit through the channel more than once. For example, if the source produces a 0 at a given time, we might send a sequence of 3 zeros through the channel; if the source produces a 1, we would send 3 ones.

At the receiving end of the channel, we will have a sequence of 3 digits for each digit produced by the source. We will then have the problem of decoding each sequence, that is, making a decision, for each sequence received, as to the identity of the source digit. A reasonable way to decide is by means of a majority selector, that is, a rule which specifies that if more ones than zeros are received, we decode the received sequence as a 1; if more zeros than ones appear, we decode it as a 0.
i. (3 points) Calculate the probability that a given source digit is received in error.
ii. (3 points) Verify that repetition coding improves reliability.

Problem 3. MAP Detection. Consider the following discrete noisy channel between the transmitter X and the receiver Y, with the assigned transition (conditional) probabilities (for instance, P(Y = 2 | X = a) = 0.3). Assume we have the following a priori probabilities: P(X = a) = 0.3, P(X = b) = 0.5, and P(X = c) = 0.2. A generalization of the ML detection rule discussed in class is the Maximum A Posteriori (MAP) detection rule, which is given by

x̂_MAP = arg max_{i=1,2,3} ℓ_i, where ℓ_i = P(y | x_i) P(x_i).

For a given observation y, the detector computes ℓ_i for i = 1, 2, 3 and decides on the x_i that corresponds to the maximum ℓ_i.

1. Derive the MAP detection rule for the given problem (i.e., find the assignment of 1, 2, 3 to a, b, c).
2. Find the probability of error corresponding to the MAP detector you derived.
3. What is the minimum probability of error if you do not have the channel statistics?

Problem 4. Bernoulli Distribution. Let X be a Bernoulli random variable with P(X = 1) = p = 1 − P(X = 0).
1. Find Var(X).
2. Let Y = (a − b)X + b. Find the distribution of Y and the mean and variance of Y.

Problem 5. Cauchy Distribution. The following PDF is known as the standard Cauchy distribution:

f_X(x) = 1 / (π (1 + x²)).

Find …

Problem 6. Valid CDFs. Which of the following functions could be a valid CDF?
1. F_X(x) = 1/2 + (1/π) tan⁻¹(x)
2. F_X(x) = [1 − e^(−x)] u(x), where u(x) is the unit step function
3. F_X(x) = e^(−x²)
4. F_X(x) = x² u(x)
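For Problem 6, a valid CDF must be non-decreasing with limits 0 at −∞ and 1 at +∞. A numeric grid check can screen candidates before proving anything; the sketch below assumes the four reconstructed formulas above, and its grid range and tolerances are arbitrary choices (it gives evidence, not a proof):

```python
import math

def looks_like_cdf(F, lo=-1e4, hi=1e4, n=10001, tol=1e-9):
    """Grid check of the CDF axioms: non-decreasing, F(-inf)=0, F(+inf)=1.
    A numeric sanity check on a finite grid only, not a proof."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [F(x) for x in xs]
    nondecreasing = all(y2 >= y1 - tol for y1, y2 in zip(ys, ys[1:]))
    return nondecreasing and abs(ys[0]) < 1e-3 and abs(ys[-1] - 1.0) < 1e-3

u = lambda x: 1.0 if x >= 0 else 0.0              # unit step
F1 = lambda x: 0.5 + math.atan(x) / math.pi
F2 = lambda x: (1 - math.exp(-x)) if x >= 0 else 0.0   # [1 - e^-x] u(x)
F3 = lambda x: math.exp(-x**2)
F4 = lambda x: x**2 * u(x)

for name, F in [("F1", F1), ("F2", F2), ("F3", F3), ("F4", F4)]:
    print(name, looks_like_cdf(F))  # F1 True, F2 True, F3 False, F4 False
```

F3 fails the check because it is not monotone and tends to 0 at +∞; F4 fails because it is unbounded.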
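Looking back at Problem 2, the majority selector is wrong exactly when 2 or more of the 3 transmitted copies are flipped, which is again a binomial tail. A short sketch (a checking aid, not part of the assignment) makes the reliability comparison of part ii concrete:

```python
from math import comb

def majority_error(p, n=3):
    """Probability that a majority vote over n independent channel uses
    decodes the wrong digit, for per-use error probability p (n odd)."""
    k = n // 2 + 1  # number of flipped copies needed to fool the majority
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 1 / 3
print(majority_error(p))      # 3*p^2*(1-p) + p^3 = 7/27 ≈ 0.2593
print(majority_error(p) < p)  # True: the repetition code beats the raw channel
```

The price of the improvement from 1/3 to 7/27 is rate: three channel uses per source digit, against the stated limit of 1 digit per second.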

CCE 40: Communication Systems, Summer 2016-2017. Assignment 11. Due: July 20, 2017. Instructor: Dr. Mustafa El-Halabi.

Problem 1. Laplace Random Variable. Let X be a Laplace random variable with the following pdf:

f_X(x) = A exp(−|x| / b).

1. Find A.
2. Find the CDF F_X(x).
3. Find E[X] and Var(X).
4. Find E[|X|].
5. Let Y = X² + 1. Find the pdf of Y.

Problem 2. Mixture Distribution. Consider a Gaussian random variable A ~ N(1, 1), a uniform random variable B ~ U[1, 4], and an exponential random variable C with parameter 1, i.e., f_C(c) = e^(−c) u(c).
1. Find P(A < 1), P(B < 1), and P(C < 1).
2. Find P(2 ≤ B² | B > 1).
3. A random variable X assumes the distribution of A with probability 1/3 and the distribution of B with probability 2/3.
(a) Find the pdf of X.
(b) Find P(X < 3).

Problem 3. On Gaussian Distribution. Let X be a Gaussian random variable with a PDF of the form

f_X(x) = B e^(−2x² − 3x).

1. Find B.
2. Find E[X²].
3. Find P(|X + 1| > 3).
4. Let Y = −4X − 3.

(a) Find E[Y ]andvar(y ). (b) Find E[Y 7 ], E[sin(Y )] Problem 4. Variance of a sum.. Prove that: Var(aX + by + c) =a 2 Var(X)+b 2 Var(Y )+2abCov(X, Y ) 2. Prove that: Var(X + Y + Z) =Var(X)+Var(Y )+Var(Z)+2Cov(X, Y )+2Cov(X, Z)+2Cov(Y,Z) Problem 5. On Gaussian Distribution. Let X and Y be jointly Gaussian random variables with E[X] =, E[Y ]= XY =/3. Find the PDF of Z =2X 3Y 5. Problem 6. Correlation of Gaussian Random Variables Let X N(2, 3) and Y N(, ) be two normal distributions. 2, Var(X) = 4, Var(Y ) = 9, and. Find P [X 2Y +> ]suchthatx and Y are independent. 2. Find P [X 2Y +> ]suchthat XY = /3. Problem 7. Joint PMF The joint probability mass function (pmf) of the random variable X and Y is: p XY (a, ) = 4, p XY (2, ) = 2, p XY (a, 2) = 8, p XY (2, 2) = 8. Find the marginals of the joint PMF. 2. Find E[XY ], E[X], and E[Y ]. 3. For what value of a are X and Y uncorrelated? 2