E X A M

Course code: MA430-G
Course name: Probability Theory and Stochastic Processes
Date: December 13, 2016
Duration: 4 hours
Number of pages incl. front page: 6
Resources allowed: Pocket calculator, lecture notes (as in Fronter).

Problem 1 (10 points)

Let A and B be two events with probabilities P(A) = a and P(B) = b, respectively, where a and b are real-valued constants with 0 < a, b ≤ 1. Determine the conditional probability P(A | B) if

1.1) A and B are mutually exclusive.
1.2) A and B are independent.
1.3) A ⊆ B.
1.4) B ⊆ A.
1.5) P(A ∩ B) = c, where c is a real-valued constant and 0 < c ≤ a.

Problem 2 (10 points)

An experiment is defined by flipping an unfair coin three times, where the result of each flip is independent of any preceding result. The probabilities of coming up heads (h) and tails (t) are P(h) = 2/3 and P(t) = 1/3, respectively.

2.1) Determine all possible outcomes and their corresponding probabilities.
2.2) Find the probability P(A) of the event A that the first flip results in a head.
2.3) Find the probability P(A ∩ B) of the event B that the number of heads is even.
2.4) Find the probability P(A ∩ B) that both A and B occur.
2.5) Find the conditional probability P(A | B) of A given B.
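The event probabilities asked for in Problem 2 can be cross-checked numerically by simulating the three biased flips. The following is a minimal Monte Carlo sketch, assuming Python with NumPy is available; it is only a sanity check of the analytical answers.

import numpy as np

rng = np.random.default_rng(0)
n_trials = 1_000_000

# Three independent flips per trial; True = heads with probability 2/3.
flips = rng.random((n_trials, 3)) < 2 / 3

A = flips[:, 0]                      # event A: first flip is a head
B = (flips.sum(axis=1) % 2) == 0     # event B: number of heads is even

p_A = A.mean()
p_B = B.mean()
p_AB = (A & B).mean()

print(f"P(A)     ~ {p_A:.4f}")
print(f"P(B)     ~ {p_B:.4f}")
print(f"P(A n B) ~ {p_AB:.4f}")
print(f"P(A | B) ~ {p_AB / p_B:.4f}")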

Problem 3 (15 points)

The probability density function (PDF) f_X(k) of a discrete random variable X is given by

f_X(k) = P{X = k} = a(k² + 4) for k = 0, 1, 2, 3, 4, and 0 otherwise,   (3.1)

where a is a real-valued constant.

3.1) Determine the constant a and sketch the PDF f_X(k) of X.
3.2) Find and sketch the corresponding cumulative distribution function (CDF) F_X(k) of X.
3.3) Find the probability P{X > 1}.
3.4) Find the conditional probability P{X = 3 | X ≥ 2}.
3.5) Find the mean E{X} of X.
3.6) Find the variance Var{X} of X.

Problem 4 (10 points)

Let X be a uniformly distributed random variable over the interval [1, 3]. Another random variable Y is defined as Y = 6/(X + 2).

4.1) Sketch the PDF f_X(x) of X.
4.2) Find the PDF f_Y(y) of Y.
4.3) Sketch the PDF f_Y(y) of Y.
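The PDF f_Y(y) derived in Problem 4 can be checked against a histogram of simulated values. A minimal sketch follows, assuming Python with NumPy and Matplotlib, and assuming the transformation Y = 6/(X + 2) as stated above; it only visualizes the empirical density for comparison with the analytical result.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Problem 4: X ~ Uniform[1, 3], Y = 6 / (X + 2)
x = rng.uniform(1.0, 3.0, size=1_000_000)
y = 6.0 / (x + 2.0)

# Histogram estimate of f_Y(y); compare its support and shape with the derived PDF.
plt.hist(y, bins=200, density=True)
plt.xlabel("y")
plt.ylabel("estimated f_Y(y)")
plt.show()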

Problem 5 (20 points)

Let X and Y be two random variables, which are characterized by the joint PDF

f_XY(x, y) = xy, if 0 ≤ x ≤ 1 and 0 ≤ y ≤ 2,
             0,  otherwise.   (5.1)

5.1) Find the marginal PDF f_X(x) of X.
5.2) Find the marginal PDF f_Y(y) of Y.
5.3) Are X and Y dependent or independent? Explain your answer.
5.4) Are X and Y correlated or uncorrelated? Explain your answer.
5.5) Find the covariance C_XY of the two random variables X and Y.
5.6) Find the probability P{X² + Y² ≤ 1}.

Hints:
1. The Cartesian coordinates (x, y) can be transformed into polar coordinates (r, θ) by means of x = r cos(θ), y = r sin(θ), and dx dy = r dr dθ.
2. sin(α) cos(β) = (1/2)[sin(α + β) + sin(α − β)]
3. ∫ sin(ax) dx = −(1/a) cos(ax)
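The probability in 5.6 can be estimated by Monte Carlo as a cross-check of the polar-coordinate calculation. The sketch below, assuming Python with NumPy, draws samples from the joint PDF (5.1) by rejection sampling (the density is bounded by 2 on its support) and counts how often X² + Y² ≤ 1.

import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# Rejection sampling from f_XY(x, y) = x*y on [0, 1] x [0, 2]; the density is at most 2 there.
x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, 2.0, size=n)
keep = rng.uniform(0.0, 1.0, size=n) < (x * y) / 2.0
xs, ys = x[keep], y[keep]

# Monte Carlo estimate of P{X^2 + Y^2 <= 1} under the joint PDF.
p_est = np.mean(xs**2 + ys**2 <= 1.0)
print(f"P(X^2 + Y^2 <= 1) ~ {p_est:.4f}")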

Problem 6 (15 points)

Let X_i (i = 1, 2, ..., n) be the lifetime of a light bulb, which is used until it fails and is then replaced by a new light bulb. The lifetimes X_i of the light bulbs are independent and identically distributed (i.i.d.) random variables with mean E{X_i} = µ_X = 100 h (in hours) and variance Var{X_i} = σ_X² = 25 h². Another random variable Y is defined as the total lifetime of n light bulbs, which are used one by one, i.e., Y = X_1 + X_2 + ... + X_n.

6.1) Find the mean E{Y} and the variance Var{Y} of Y.
6.2) Give reasons why the cumulative distribution function (CDF) F_Y(y) of Y can be expressed as

F_Y(y) = G((y − nµ_X) / √(nσ_X²))   if n → ∞,

where G(·) is the CDF of the standard normal random variable.
6.3) For n = 36, find an approximate expression for the probability that Y is between 3500 h and 3700 h, i.e., P{3500 ≤ Y ≤ 3700}.

Hint: Some useful values of the CDF G(x) of the standard normal random variable: G(1) ≈ 0.8413, G(3) ≈ 0.9987, G(10/3) ≈ 0.9996.
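The Gaussian approximation in 6.2 and 6.3 can also be evaluated numerically instead of reading G(x) from a table. The sketch below assumes Python and uses the identity G(x) = (1 + erf(x/√2))/2 from the standard library's math module.

from math import erf, sqrt

def G(x: float) -> float:
    # CDF of the standard normal random variable.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

n, mu_X, var_X = 36, 100.0, 25.0   # number of bulbs, E{X_i} in h, Var{X_i} in h^2
mean_Y = n * mu_X                  # E{Y}
std_Y = sqrt(n * var_X)            # standard deviation of Y

# Central-limit-theorem approximation of P{3500 <= Y <= 3700}
p = G((3700.0 - mean_Y) / std_Y) - G((3500.0 - mean_Y) / std_Y)
print(f"P(3500 <= Y <= 3700) ~ {p:.4f}")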

Problem 7 (20 points)

Let X(t) = e^(At) be a stochastic process, where A is a real-valued random variable that is uniformly distributed between 1 and 3.

7.1) Compute the mean µ_X(t) of X(t) by using µ_X(t) = E{X(t)}.
7.2) Compute the autocorrelation function R_XX(t_1, t_2) of X(t) by using R_XX(t_1, t_2) = E{X(t_1) X*(t_2)}.
7.3) Is the stochastic process X(t) wide-sense stationary (WSS)? Give reasons for your answer.
7.4) Compute the time average µ_X of a single sample function X(t; a_i) using µ_X = <X(t; a_i)>, where a_i is a constant (an outcome of A) and < · > denotes the time-averaging operator.
7.5) Is the stochastic process X(t) mean-ergodic? Give reasons for your answer.
7.6) Compute the autocorrelation function R_XX(τ) of a single sample function X(t; a_i) using R_XX(τ) = <X(t + τ; a_i) X*(t; a_i)>.
7.7) Is the stochastic process X(t) autocorrelation-ergodic? Give reasons for your answer.

Hint: ∫ e^(ax) dx = e^(ax)/a, where a ≠ 0 is a constant.   (7.1)
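For 7.4 and 7.5 it can be instructive to compare the ensemble mean of X(t) = e^(At) at a fixed time instant with the finite-window time average of a single sample function. The sketch below assumes Python with NumPy; the time instant t and the averaging window [0, T] are arbitrary illustrative choices, not values given in the problem.

import numpy as np

rng = np.random.default_rng(0)

# Ensemble mean E{X(t)} at a fixed t, estimated over many outcomes of A ~ Uniform(1, 3).
t = 0.5
a_samples = rng.uniform(1.0, 3.0, size=1_000_000)
ensemble_mean = np.mean(np.exp(a_samples * t))
print(f"ensemble mean E[X({t})] ~ {ensemble_mean:.4f}")

# Time average of one sample function X(t; a_i) = exp(a_i * t) over the finite window [0, T].
a_i = rng.uniform(1.0, 3.0)
T = 10.0
tt = np.linspace(0.0, T, 100_001)
time_avg = np.trapz(np.exp(a_i * tt), tt) / T
print(f"time average over [0, {T}] for a_i = {a_i:.3f} ~ {time_avg:.4f}")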