UCSD ECE250 Handout #24 Prof. Young-Han Kim Wednesday, June 6, 2018 Solutions to Exercise Set #7
1. Polya's urn. An urn initially has one red ball and one white ball. Let $X_1$ denote the name of the first ball drawn from the urn. Replace that ball and one like it. Let $X_2$ denote the name of the next ball drawn. Replace it and one like it. Continue, drawing and replacing.

(a) Argue that the probability of drawing $k$ reds followed by $n-k$ whites is
$$\frac{1}{2}\cdot\frac{2}{3}\cdots\frac{k}{k+1}\cdot\frac{1}{k+2}\cdot\frac{2}{k+3}\cdots\frac{n-k}{n+1} = \frac{k!\,(n-k)!}{(n+1)!} = \frac{1}{(n+1)\binom{n}{k}}.$$

(b) Let $P_n$ be the proportion of red balls in the urn after the $n$th drawing. Argue that
$$\Pr\Big\{P_n = \frac{k}{n+2}\Big\} = \frac{1}{n+1}, \quad k = 1,2,\ldots,n+1.$$
Thus all proportions are equally probable. This shows that $P_n$ tends to a uniformly distributed random variable in distribution, i.e.,
$$\lim_{n\to\infty} \Pr\{P_n \le t\} = t, \quad 0 \le t \le 1.$$

(c) What can you say about the behavior of the proportion $P_n$ if you started initially with one red ball and two white balls in the urn? Specifically, what is the limiting distribution of $P_n$? Can you show
$$\Pr\Big\{P_n = \frac{k+1}{n+3}\Big\} = \frac{2(n-k+1)}{(n+1)(n+2)}, \quad k = 0,1,\ldots,n\,?$$

Solution:

(a) Let $r$ be the number of red balls, $w$ the number of white balls, and $t = r + w$ the total number of balls in the urn. The state of the urn at each drawing is tabulated below; the ratio $r/t$ or $w/t$ gives the probability for the next drawing.
1st drawing: $r/t = 1/2$
2nd drawing: $r/t = 2/3$
3rd drawing: $r/t = 3/4$
...
$k$th drawing: $r/t = k/(k+1)$
$(k+1)$th drawing: $w/t = 1/(k+2)$
$(k+2)$th drawing: $w/t = 2/(k+3)$
$(k+3)$th drawing: $w/t = 3/(k+4)$
...
$n$th drawing: $w/t = (n-k)/(n+1)$

The probability of drawing $k$ reds followed by $n-k$ whites is
$$\frac{1}{2}\cdot\frac{2}{3}\cdots\frac{k}{k+1}\cdot\frac{1}{k+2}\cdot\frac{2}{k+3}\cdots\frac{n-k}{n+1} = \frac{k!\,(n-k)!}{(n+1)!} = \frac{1}{(n+1)\binom{n}{k}}.$$
In fact, regardless of the ordering, the probability of drawing $k$ reds in $n$ draws is given by this expression as well. Note that after the $n$th drawing, there are $k+1$ red balls and $n-k+1$ white balls, for a total of $n+2$ balls in the urn.

(b) After the $n$th drawing in which $k$ red balls have been drawn, the proportion $P_n$ of red balls is
$$P_n = \frac{\text{number of red balls in the urn}}{\text{total number of balls in the urn}} = \frac{k+1}{n+2}.$$
There are $\binom{n}{k}$ orderings of outcomes with $k$ red balls and $n-k$ white balls, and each ordering has the same probability: in the expression for the probability of drawing $k$ reds followed by $n-k$ whites, a different sequence of drawings merely permutes the numerators. Therefore
$$\Pr\Big\{P_n = \frac{k+1}{n+2}\Big\} = \binom{n}{k}\cdot\frac{1}{(n+1)\binom{n}{k}} = \frac{1}{n+1}, \quad k = 0,1,\ldots,n.$$
All the proportions are equally probable (the probability does not depend on $k$), and $P_n$ tends to a uniformly distributed random variable $\mathrm{Unif}[0,1]$ in distribution.
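As a sanity check on part (b), the exact distribution of the number of reds drawn can be computed by recursing on the urn state. The sketch below is our own illustration (the function name and parameters are not part of the handout); exact rational arithmetic avoids any floating-point doubt.

```python
from fractions import Fraction

def proportion_pmf(n, red=1, white=1):
    """Exact pmf of the number of red draws after n Polya-urn drawings,
    starting from `red` red and `white` white balls (draw, replace, add a duplicate)."""
    pmf = {0: Fraction(1)}              # k reds drawn so far -> probability
    for step in range(n):
        new = {}
        for k, p in pmf.items():
            r = red + k                 # red balls currently in the urn
            t = red + white + step      # total balls currently in the urn
            new[k + 1] = new.get(k + 1, 0) + p * Fraction(r, t)      # draw red
            new[k] = new.get(k, 0) + p * Fraction(t - r, t)          # draw white
        pmf = new
    return pmf

# Part (b): with 1 red and 1 white, every count k = 0, ..., n has probability 1/(n+1).
pmf = proportion_pmf(10)
assert all(pmf[k] == Fraction(1, 11) for k in range(11))
```

The same recursion with `red=1, white=2` reproduces the linearly decreasing pmf of part (c).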
(c) If there are one red ball and two white balls to start with:

1st drawing: $r/t = 1/3$
2nd drawing: $r/t = 2/4$
3rd drawing: $r/t = 3/5$
...
$k$th drawing: $r/t = k/(k+2)$
$(k+1)$th drawing: $w/t = 2/(k+3)$
$(k+2)$th drawing: $w/t = 3/(k+4)$
$(k+3)$th drawing: $w/t = 4/(k+5)$
...
$n$th drawing: $w/t = (n-k+1)/(n+2)$

The probability of drawing $k$ reds followed by $n-k$ whites is
$$\frac{1}{3}\cdot\frac{2}{4}\cdots\frac{k}{k+2}\cdot\frac{2}{k+3}\cdot\frac{3}{k+4}\cdots\frac{n-k+1}{n+2} = \frac{2\,k!\,(n-k+1)!}{(n+2)!}.$$
After the $n$th drawing,
$$P_n = \frac{\text{number of red balls in the urn}}{\text{total number of balls in the urn}} = \frac{k+1}{n+3}.$$
Similarly as in part (b),
$$\Pr\Big\{P_n = \frac{k+1}{n+3}\Big\} = \binom{n}{k}\cdot\frac{2\,k!\,(n-k+1)!}{(n+2)!} = \frac{n!}{k!\,(n-k)!}\cdot\frac{2\,k!\,(n-k+1)!}{(n+2)!} = \frac{2(n-k+1)}{(n+1)(n+2)}.$$
The distribution is linearly decreasing in $p = \frac{k+1}{n+3}$:
$$\Pr\{P_n = p\} = \frac{2(n+3)}{(n+1)(n+2)}\Big(\frac{n+2}{n+3} - p\Big).$$
Therefore the limiting distribution of $P_n$ as $n$ goes to infinity is a triangular distribution with density $f_{P_n}(p) = 2(1-p)$, $0 \le p \le 1$.
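The triangular limit in part (c) is also easy to see by simulation. The sketch below (our own illustration, not part of the handout) runs many independent urns and compares the fraction of runs ending below $p = 1/2$ with the limiting cdf value $F(1/2) = 1 - (1 - 1/2)^2 = 3/4$.

```python
import random

def polya_proportion(n, red=1, white=2, rng=random):
    """Simulate n Polya-urn drawings and return the final red proportion."""
    r, t = red, red + white
    for _ in range(n):
        if rng.random() < r / t:
            r += 1          # drew red: one more red ball goes in
        t += 1              # urn grows by one ball either way
    return r / t

random.seed(1)
samples = [polya_proportion(200) for _ in range(20000)]
# Under the limiting density f(p) = 2(1 - p), the cdf is F(p) = 1 - (1 - p)^2,
# so about 75% of the mass lies below p = 1/2.
frac_below_half = sum(s < 0.5 for s in samples) / len(samples)
```

With $n = 200$ drawings per run the empirical fraction lands close to $0.75$, consistent with the triangular density.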
Note: the Polya urn for this problem generates a process that is identical to the following mixture of Bernoullis. Let $\Theta \sim f_\Theta(\theta) = 2(1-\theta)$, $0 \le \theta < 1$, and given $\Theta$ let $X_1, X_2, \ldots$ be i.i.d. Bernoulli($\Theta$); then $P_n$ converges to $\Theta$ as $n$ tends to infinity. Considering the general case of $\lambda_1$ red balls and $\lambda_2$ white balls to start with, the limiting distribution is
$$f_\Theta(\theta) = C(\lambda_1,\lambda_2)\,\theta^{\lambda_1-1}(1-\theta)^{\lambda_2-1},$$
where $C(\lambda_1,\lambda_2)$ is a normalizing constant depending on $\lambda_1$ and $\lambda_2$ (the Beta$(\lambda_1,\lambda_2)$ density).

2. Symmetric random walk. Let $X_n$ be a random walk defined by $X_0 = 0$ and $X_n = \sum_{i=1}^n Z_i$, where $Z_1, Z_2, \ldots$ are i.i.d. with $\Pr\{Z_1 = 1\} = \Pr\{Z_1 = -1\} = 1/2$.

(a) Find $\Pr\{X_{10} = 10\}$.
(b) Approximate $\Pr\{-10 \le X_{100} \le 10\}$ using the central limit theorem.
(c) Find $\Pr\{X_n = k\}$.

Solution:

(a) Since the event $\{X_{10} = 10\}$ is equivalent to $\{Z_1 = Z_2 = \cdots = Z_{10} = 1\}$, we have $\Pr\{X_{10} = 10\} = 2^{-10}$.

(b) Since $E(Z_j) = 0$ and $E(Z_j^2) = 1$, by the central limit theorem,
$$\Pr\{-10 \le X_{100} \le 10\} = \Pr\Big\{-1 \le \frac{1}{10}\sum_{i=1}^{100} Z_i \le 1\Big\} \approx 1 - 2Q(1) = 2\Phi(1) - 1 \approx 0.68.$$

(c)
$$\Pr\{X_n = k\} = \Pr\{(n+k)/2 \text{ heads in } n \text{ independent coin tosses}\} = \binom{n}{(n+k)/2} 2^{-n}$$
for $-n \le k \le n$ with $n+k$ even.
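The CLT approximation in part (b) can be cross-checked numerically: the walk position is $X_{100} = 2H - 100$ with $H \sim \mathrm{Binomial}(100, 1/2)$, so the exact probability is a short binomial sum. A sketch (variable names ours):

```python
from math import comb, erf, sqrt

# Exact: |X_100| <= 10  <=>  45 <= H <= 55 for H ~ Binomial(100, 1/2).
exact = sum(comb(100, h) for h in range(45, 56)) / 2**100

# CLT: X_100 / 10 is approximately N(0, 1), so the probability is about 2*Phi(1) - 1.
Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
approx = 2 * Phi(1.0) - 1
```

The exact value is about $0.729$ while $2\Phi(1)-1 \approx 0.683$; the gap comes from the lattice nature of the walk and largely disappears with a continuity correction ($2\Phi(1.1) - 1 \approx 0.729$).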
3. Absolute-value random walk. Consider the symmetric random walk $X_n$ in the previous problem. Define the absolute-value random process $Y_n = |X_n|$.

(a) Find $\Pr\{Y_n = k\}$.
(b) Find $\Pr\{\max_{0 \le i < 20} Y_i = 10 \mid Y_{20} = 0\}$.

Solution:

(a) If $k > 0$ then $\Pr\{Y_n = k\} = \Pr\{X_n = +k \text{ or } X_n = -k\} = 2\Pr\{X_n = k\}$, while $\Pr\{Y_n = 0\} = \Pr\{X_n = 0\}$. Thus
$$\Pr\{Y_n = k\} = \begin{cases} 2\binom{n}{(n+k)/2}\big(\tfrac12\big)^n, & k > 0,\ n-k \text{ even},\ n \ge k, \\[4pt] \binom{n}{n/2}\big(\tfrac12\big)^n, & k = 0,\ n \text{ even}, \\[4pt] 0, & \text{otherwise}. \end{cases}$$

(b) If $Y_{20} = X_{20} = 0$, then there are only two sample paths with $\max_{0 \le i<20} |X_i| = 10$, namely
$$Z_1 = \cdots = Z_{10} = +1,\ Z_{11} = \cdots = Z_{20} = -1 \quad\text{or}\quad Z_1 = \cdots = Z_{10} = -1,\ Z_{11} = \cdots = Z_{20} = +1.$$
Since the total number of sample paths with $X_{20} = 0$ is $\binom{20}{10}$ and all paths are equally likely,
$$\Pr\Big\{\max_{0 \le i<20} Y_i = 10 \,\Big|\, Y_{20} = 0\Big\} = \frac{2}{\binom{20}{10}}.$$

4. Discrete-time Wiener process. Let $Z_n$, $n \ge 1$, be a discrete-time white Gaussian noise (WGN) process, i.e., $Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$. Define the process $X_n$, $n \ge 0$, as $X_0 = 0$ and $X_n = X_{n-1} + Z_n$ for $n \ge 1$.

(a) Is $X_n$ an independent-increment process? Justify your answer.
(b) Is $X_n$ a Gaussian process? Justify your answer.
(c) Find the mean and autocorrelation functions of $X_n$.
(d) Specify the first-order pdf of $X_n$.
(e) Specify the joint pdf of $X_3$, $X_5$, and $X_8$.
(f) Find $E(X_{20} \mid X_1, X_2, \ldots, X_{10})$.
(g) Given $X_1 = 4$, $X_2 = 2$, and $0 \le X_3 \le 4$, find the minimum MSE estimate of $X_4$.

Solution:
(a) Yes. The increments $X_{n_1}, X_{n_2} - X_{n_1}, \ldots, X_{n_k} - X_{n_{k-1}}$ are sums of disjoint sets of the $Z_i$ random variables, and the $Z_i$ are i.i.d.

(b) Yes. Any set of samples of $X_n$, $n \ge 1$, is obtained by a linear transformation of i.i.d. $N(0,1)$ random variables, and therefore all $n$th-order distributions of $X_n$ are jointly Gaussian. (It is not sufficient to show that the random variable $X_n$ is Gaussian for each $n$.)

(c) We have
$$E[X_n] = E\Big[\sum_{i=1}^n Z_i\Big] = \sum_{i=1}^n E[Z_i] = 0,$$
$$R_X(n_1,n_2) = E[X_{n_1} X_{n_2}] = E\Big[\sum_{i=1}^{n_1} Z_i \sum_{j=1}^{n_2} Z_j\Big] = \sum_{i=1}^{n_1}\sum_{j=1}^{n_2} E[Z_i Z_j] = \sum_{i=1}^{\min(n_1,n_2)} E[Z_i^2] = \min(n_1,n_2),$$
since the $Z_i$ are i.i.d., so $E[Z_i Z_j] = 0$ for $i \ne j$ and $E[Z_i^2] = 1$.

(d) As shown above, $X_n$ is Gaussian with mean zero and variance
$$\mathrm{Var}(X_n) = E[X_n^2] - E[X_n]^2 = R_X(n,n) - 0 = n.$$
Thus $X_n \sim N(0,n)$. Moreover, $\mathrm{Cov}(X_{n_1}, X_{n_2}) = E(X_{n_1}X_{n_2}) - E(X_{n_1})E(X_{n_2}) = \min(n_1,n_2)$, so $X_{n_1}$ and $X_{n_2}$ are jointly Gaussian random variables with mean $\mu = [0 \ \ 0]^T$ and covariance matrix
$$\Sigma = \begin{pmatrix} n_1 & \min(n_1,n_2) \\ \min(n_1,n_2) & n_2 \end{pmatrix}.$$

(e) $X_n$, $n \ge 1$, is a zero-mean Gaussian random process. Thus
$$\begin{pmatrix} X_3 \\ X_5 \\ X_8 \end{pmatrix} \sim N\left( \begin{pmatrix} E[X_3] \\ E[X_5] \\ E[X_8] \end{pmatrix}, \begin{pmatrix} R_X(3,3) & R_X(3,5) & R_X(3,8) \\ R_X(5,3) & R_X(5,5) & R_X(5,8) \\ R_X(8,3) & R_X(8,5) & R_X(8,8) \end{pmatrix} \right).$$
Substituting, we get
$$\begin{pmatrix} X_3 \\ X_5 \\ X_8 \end{pmatrix} \sim N\left( \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 3 & 3 & 3 \\ 3 & 5 & 5 \\ 3 & 5 & 8 \end{pmatrix} \right).$$

(f) Since $X_n$ is an independent-increment process,
$$E(X_{20} \mid X_1,\ldots,X_{10}) = E(X_{20} - X_{10} \mid X_1,\ldots,X_{10}) + E(X_{10} \mid X_1,\ldots,X_{10}) = E(X_{20} - X_{10}) + X_{10} = 0 + X_{10} = X_{10}.$$

(g) The MMSE estimate of $X_4$ given $X_1 = x_1$, $X_2 = x_2$, and $a \le X_3 \le b$ equals $E[X_4 \mid X_1 = x_1, X_2 = x_2, a \le X_3 \le b]$. Thus the MMSE estimate is given by
$$E[X_4 \mid X_1 = x_1, X_2 = x_2, a \le X_3 \le b] = E[X_2 + Z_3 + Z_4 \mid X_1 = x_1, X_2 = x_2, a \le X_2 + Z_3 \le b]$$
$$= x_2 + E[Z_3 \mid a - x_2 \le Z_3 \le b - x_2] + E[Z_4] = x_2 + E[Z_3 \mid a - x_2 \le Z_3 \le b - x_2],$$
since $Z_3$ is independent of $(X_1, X_2)$, and $Z_4$ is independent of $(X_1, X_2, Z_3)$. Plugging in the values $x_2 = 2$, $a = 0$, $b = 4$, the required MMSE estimate is
$$\hat{X}_4 = 2 + E\big[Z_3 \mid Z_3 \in [-2,2]\big].$$
We have
$$f_{Z_3 \mid Z_3 \in [-2,2]}(z_3) = \begin{cases} \dfrac{f_{Z_3}(z_3)}{\Pr\{Z_3 \in [-2,2]\}}, & z_3 \in [-2,2], \\[6pt] 0, & \text{otherwise}, \end{cases}$$
which is symmetric about $z_3 = 0$. Thus $E[Z_3 \mid Z_3 \in [-2,2]] = 0$ and we have $\hat{X}_4 = 2$.

5. A random process. Let $X_n = Z_n + Z_{n-1}$ for $n \ge 1$, where $Z_0, Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$.

(a) Find the mean and autocorrelation function of $\{X_n\}$.
(b) Is $\{X_n\}$ Gaussian? Justify your answer.
(c) Find $E(X_3 \mid X_1, X_2)$.
(d) Find $E(X_3 \mid X_2)$.
(e) Is $\{X_n\}$ Markov? Justify your answer.
(f) Is $\{X_n\}$ independent increment? Justify your answer.

Solution:

(a) $E(X_n) = E(Z_n) + E(Z_{n-1}) = 0$, and
$$R_X(m,n) = E(X_m X_n) = E[(Z_m + Z_{m-1})(Z_n + Z_{n-1})] = \begin{cases} E[Z_n^2] + E[Z_{n-1}^2] = 2, & n = m, \\ E[Z_{\min(m,n)}^2] = 1, & |n - m| = 1, \\ 0, & \text{otherwise}. \end{cases}$$

(b) Since $(X_1, \ldots, X_n)$ is a linear transform of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.

(c) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence
$$E(X_3 \mid X_1, X_2) = \begin{bmatrix} 0 & 1 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}^{-1} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \frac{1}{3}(2X_2 - X_1).$$

(d) Similarly, $E(X_3 \mid X_2) = (1/2)X_2$.

(e) Since $E(X_3 \mid X_1, X_2) \ne E(X_3 \mid X_2)$, the process is not Markov.

(f) Since the process is not Markov, it is not independent increment.

6. Autoregressive process. Let $X_0 = 0$ and $X_n = \frac{1}{2}X_{n-1} + Z_n$ for $n \ge 1$, where $Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$. Find the mean and autocorrelation function of $X_n$.
Solution:
$$E[X_n] = \frac{1}{2}E[X_{n-1}] + E[Z_n] = \frac{1}{4}E[X_{n-2}] + \frac{1}{2}E[Z_{n-1}] + E[Z_n] = \cdots = \frac{1}{2^{n-1}}E[X_1] = \frac{1}{2^{n-1}}E[Z_1] = 0.$$
For $n > m$ we can write
$$X_n = \frac{1}{2^{n-m}} X_m + \frac{1}{2^{n-m-1}} Z_{m+1} + \cdots + \frac{1}{2} Z_{n-1} + Z_n = \frac{1}{2^{n-m}} X_m + \sum_{i=0}^{n-m-1} \frac{1}{2^i} Z_{n-i}.$$
Therefore
$$R_X(n,m) = E(X_n X_m) = \frac{1}{2^{n-m}} E[X_m^2],$$
since $X_m$ and $Z_{n-i}$, $i = 0, \ldots, n-m-1$, are independent and $E[X_m] = E[Z_{n-i}] = 0$. To find $E[X_m^2]$ consider
$$E[X_1^2] = 1, \qquad E[X_2^2] = \frac{1}{4}E[X_1^2] + E[Z_2^2] = \frac{1}{4} + 1,$$
and in general $E[X_n^2] = \sum_{i=0}^{n-1} \frac{1}{4^i} = \frac{4}{3}\Big(1 - \frac{1}{4^n}\Big)$. Thus
$$R_X(n,m) = E(X_n X_m) = \frac{1}{2^{|n-m|}} \cdot \frac{4}{3}\Big[1 - \Big(\frac{1}{4}\Big)^{\min\{n,m\}}\Big].$$

7. Moving average process. Let $Z_0, Z_1, Z_2, \ldots$ be i.i.d. $N(0,1)$.

(a) Let $X_n = Z_n + \frac{1}{2}Z_{n-1}$ for $n \ge 1$. Find the mean and autocorrelation function of $X_n$.
(b) Is $\{X_n\}$ wide-sense stationary? Justify your answer.
(c) Is $\{X_n\}$ Gaussian? Justify your answer.
(d) Is $\{X_n\}$ strict-sense stationary? Justify your answer.
(e) Find $E(X_3 \mid X_1, X_2)$.
(f) Find $E(X_3 \mid X_2)$.
(g) Is $\{X_n\}$ Markov? Justify your answer.
(h) Is $\{X_n\}$ independent increment? Justify your answer.
(i) Let $Y_n = Z_n + Z_{n-1}$ for $n \ge 1$. Find the mean and autocorrelation function of $Y_n$.
(j) Is $\{Y_n\}$ wide-sense stationary? Justify your answer.
(k) Is $\{Y_n\}$ Gaussian? Justify your answer.
(l) Is $\{Y_n\}$ strict-sense stationary? Justify your answer.
(m) Find $E(Y_3 \mid Y_1, Y_2)$.
(n) Find $E(Y_3 \mid Y_2)$.
(o) Is $\{Y_n\}$ Markov? Justify your answer.
(p) Is $\{Y_n\}$ independent increment? Justify your answer.

Solution:

(a) $E(X_n) = E(Z_n) + \frac{1}{2}E(Z_{n-1}) = 0$, and
$$R_X(m,n) = E(X_m X_n) = E\Big[\Big(Z_n + \tfrac{1}{2}Z_{n-1}\Big)\Big(Z_m + \tfrac{1}{2}Z_{m-1}\Big)\Big] = \begin{cases} E[Z_n^2] + \frac{1}{4}E[Z_{n-1}^2] = \frac{5}{4}, & n = m, \\[4pt] \frac{1}{2}E[Z_{\min(m,n)}^2] = \frac{1}{2}, & |n-m| = 1, \\[4pt] 0, & \text{otherwise}. \end{cases}$$

(b) Since the mean and autocorrelation functions are time-invariant, the process is WSS.

(c) Since $(X_1, \ldots, X_n)$ is a linear transform of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.

(d) Since the process is WSS and Gaussian, it is SSS.

(e) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence
$$E(X_3 \mid X_1, X_2) = \begin{bmatrix} 0 & \frac{1}{2} \end{bmatrix} \begin{bmatrix} \frac{5}{4} & \frac{1}{2} \\[2pt] \frac{1}{2} & \frac{5}{4} \end{bmatrix}^{-1} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \frac{1}{21}(10X_2 - 4X_1).$$

(f) Similarly, $E(X_3 \mid X_2) = (2/5)X_2$.
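The weights in part (e) come from solving the $2 \times 2$ normal equations $\Sigma a = c$; the exact-arithmetic sketch below (variable names ours) reproduces $(-4/21, 10/21)$.

```python
from fractions import Fraction as F

# Linear MMSE estimate a1*X1 + a2*X2 of X3 for X_n = Z_n + Z_{n-1}/2:
# Sigma = Cov((X1, X2)) = [[5/4, 1/2], [1/2, 5/4]], c = (Cov(X3, X1), Cov(X3, X2)).
s11, s12 = F(5, 4), F(1, 2)
c1, c2 = F(0), F(1, 2)

# Sigma is symmetric Toeplitz, so its inverse is [[s11, -s12], [-s12, s11]] / det.
det = s11 * s11 - s12 * s12
a1 = (s11 * c1 - s12 * c2) / det
a2 = (s11 * c2 - s12 * c1) / det
# a1 = -4/21 and a2 = 10/21, i.e. E(X3 | X1, X2) = (10*X2 - 4*X1)/21.
```

The same two-line solve with $\Sigma = [[2,1],[1,2]]$ and $c = (0,1)$ gives the weights $(-1/3, 2/3)$ of part (m).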
(g) Since $E(X_3 \mid X_1, X_2) \ne E(X_3 \mid X_2)$, the process is not Markov.

(h) Since the process is not Markov, it is not independent increment.

(i) The mean function is $\mu_n = E(Y_n) = 0$. The autocorrelation function is
$$R_Y(n,m) = \begin{cases} 2, & n = m, \\ 1, & n = m - 1 \text{ or } n = m + 1, \\ 0, & \text{otherwise}. \end{cases}$$

(j) Since the mean and autocorrelation functions are time-invariant, the process is WSS.

(k) Since $(Y_1, \ldots, Y_n)$ is a linear transform of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.

(l) Since the process is WSS and Gaussian, it is SSS.

(m) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear. Hence
$$E(Y_3 \mid Y_1, Y_2) = \begin{bmatrix} 0 & 1 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}^{-1} \begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix} = \frac{1}{3}(2Y_2 - Y_1).$$

(n) Similarly, $E(Y_3 \mid Y_2) = (1/2)Y_2$.

(o) Since $E(Y_3 \mid Y_1, Y_2) \ne E(Y_3 \mid Y_2)$, the process is not Markov.

(p) Since the process is not Markov, it is not independent increment.

8. Gauss-Markov process. Let $X_0 = 0$ and $X_n = \frac{1}{2}X_{n-1} + Z_n$ for $n \ge 1$, where $Z_1, Z_2, \ldots$ are i.i.d. $N(0,1)$. Find the mean and autocorrelation function of $X_n$.

Solution: Using the method of Lecture Notes 8, we can easily verify that $E(X_n) = 0$ for every $n$ and that
$$R_X(n,m) = E(X_n X_m) = \frac{1}{2^{|n-m|}} \cdot \frac{4}{3}\Big[1 - \Big(\frac{1}{4}\Big)^{\min\{n,m\}}\Big].$$

9. QAM random process. Consider the random process $X(t) = Z_1 \cos \omega t + Z_2 \sin \omega t$, $-\infty < t < \infty$, where $Z_1$ and $Z_2$ are i.i.d. discrete random variables such that $p_{Z_i}(+1) = p_{Z_i}(-1) = 1/2$.

(a) Is $X(t)$ wide-sense stationary? Justify your answer.
(b) Is $X(t)$ strict-sense stationary? Justify your answer.
Solution:

(a) We first check the mean:
$$E(X(t)) = E(Z_1)\cos \omega t + E(Z_2)\sin \omega t = 0 \cdot \cos(\omega t) + 0 \cdot \sin(\omega t) = 0.$$
The mean is independent of $t$. Next we consider the autocorrelation function:
$$E(X(t+\tau)X(t)) = E\big[(Z_1 \cos(\omega(t+\tau)) + Z_2 \sin(\omega(t+\tau)))(Z_1 \cos \omega t + Z_2 \sin \omega t)\big]$$
$$= E(Z_1^2)\cos(\omega(t+\tau))\cos(\omega t) + E(Z_2^2)\sin(\omega(t+\tau))\sin(\omega t)$$
$$= \cos(\omega(t+\tau))\cos(\omega t) + \sin(\omega(t+\tau))\sin(\omega t) = \cos(\omega(t+\tau) - \omega t) = \cos \omega\tau,$$
where the cross terms vanish since $E(Z_1 Z_2) = E(Z_1)E(Z_2) = 0$. The autocorrelation function is also time-invariant. Therefore $X(t)$ is WSS.

(b) Note that $X(0) = Z_1 \cos 0 + Z_2 \sin 0 = Z_1$, so $X(0)$ has the same pmf as $Z_1$. On the other hand,
$$X\Big(\frac{\pi}{4\omega}\Big) = Z_1 \cos(\pi/4) + Z_2 \sin(\pi/4) = \frac{1}{\sqrt{2}}(Z_1 + Z_2) = \begin{cases} \sqrt{2}, & \text{w.p. } 1/4, \\ 0, & \text{w.p. } 1/2, \\ -\sqrt{2}, & \text{w.p. } 1/4. \end{cases}$$
This shows that $X(\pi/4\omega)$ does not have the same pmf (or even the same range) as $X(0)$. Therefore $X(t)$ is not first-order stationary and as a result is not SSS.

10. Mixture of two WSS processes. Let $X(t)$ and $Y(t)$ be two zero-mean WSS processes with autocorrelation functions $R_X(\tau)$ and $R_Y(\tau)$, respectively. Define the process
$$Z(t) = \begin{cases} X(t), & \text{with probability } 1/2, \\ Y(t), & \text{with probability } 1/2. \end{cases}$$
Find the mean and autocorrelation functions for $Z(t)$. Is $Z(t)$ a WSS process? Justify your answer.

Solution: To show that $Z(t)$ is WSS, we show that its mean and autocorrelation
functions are time-invariant. Consider
$$\mu_Z(t) = E[Z(t)] = E(Z(t) \mid Z = X)\Pr\{Z = X\} + E(Z(t) \mid Z = Y)\Pr\{Z = Y\} = \frac{1}{2}(\mu_X + \mu_Y) = 0,$$
and similarly
$$R_Z(t+\tau, t) = E[Z(t+\tau)Z(t)] = \frac{1}{2}\big(R_X(\tau) + R_Y(\tau)\big).$$
Since $\mu_Z(t)$ is independent of time and $R_Z(t+\tau,t)$ depends only on $\tau$, $Z(t)$ is WSS.
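Both conclusions of problem 9 can be checked by brute force over the four equally likely values of $(Z_1, Z_2)$. A sketch (with $\omega = 1$ chosen arbitrarily; names ours):

```python
from itertools import product
from math import cos, sin, pi, isclose

omega = 1.0  # arbitrary choice for the check

def X(t, z1, z2):
    return z1 * cos(omega * t) + z2 * sin(omega * t)

def R(t, tau):
    """Autocorrelation E[X(t+tau)X(t)], averaging the four (Z1, Z2) pairs."""
    return sum(X(t + tau, z1, z2) * X(t, z1, z2)
               for z1, z2 in product((1, -1), repeat=2)) / 4

# WSS: R(t+tau, t) = cos(omega*tau) regardless of t.
assert isclose(R(0.3, 0.7), cos(omega * 0.7))
assert isclose(R(1.9, 0.7), cos(omega * 0.7))

# Not SSS: X(0) = Z1 is never 0, but X(pi/(4*omega)) = (Z1 + Z2)/sqrt(2)
# equals 0 for two of the four outcomes (probability 1/2).
t0 = pi / (4 * omega)
zeros = sum(abs(X(t0, z1, z2)) < 1e-12 for z1, z2 in product((1, -1), repeat=2))
```

Since the process takes only four sample paths, the expectation is an exact four-term average, so no Monte Carlo is needed.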
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationRandom Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.
Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example
More informationTest Code: STA/STB (Short Answer Type) 2013 Junior Research Fellowship for Research Course in Statistics
Test Code: STA/STB (Short Answer Type) 2013 Junior Research Fellowship for Research Course in Statistics The candidates for the research course in Statistics will have to take two shortanswer type tests
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More informationECE 541 Stochastic Signals and Systems Problem Set 11 Solution
ECE 54 Stochastic Signals and Systems Problem Set Solution Problem Solutions : Yates and Goodman,..4..7.3.3.4.3.8.3 and.8.0 Problem..4 Solution Since E[Y (t] R Y (0, we use Theorem.(a to evaluate R Y (τ
More informationECE 673-Random signal analysis I Final
ECE 673-Random signal analysis I Final Q ( point) Comment on the following statement "If cov(x ; X 2 ) 0; then the best predictor (without limitation on the type of predictor) of the random variable X
More informationMATH 151, FINAL EXAM Winter Quarter, 21 March, 2014
Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More information4 Derivations of the Discrete-Time Kalman Filter
Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof N Shimkin 4 Derivations of the Discrete-Time
More informationLecture 2: Repetition of probability theory and statistics
Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:
More informationQuestion Paper Code : AEC11T03
Hall Ticket No Question Paper Code : AEC11T03 VARDHAMAN COLLEGE OF ENGINEERING (AUTONOMOUS) Affiliated to JNTUH, Hyderabad Four Year B Tech III Semester Tutorial Question Bank 2013-14 (Regulations: VCE-R11)
More informationRecursive Estimation
Recursive Estimation Raffaello D Andrea Spring 08 Problem Set : Bayes Theorem and Bayesian Tracking Last updated: March, 08 Notes: Notation: Unless otherwise noted, x, y, and z denote random variables,
More informationThings to remember when learning probability distributions:
SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions
More informationMidterm Exam 1 Solution
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:
More informationECE 650 Lecture #10 (was Part 1 & 2) D. van Alphen. D. van Alphen 1
ECE 650 Lecture #10 (was Part 1 & 2) D. van Alphen D. van Alphen 1 Lecture 10 Overview Part 1 Review of Lecture 9 Continuing: Systems with Random Inputs More about Poisson RV s Intro. to Poisson Processes
More informationEconometría 2: Análisis de series de Tiempo
Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 II. Basic definitions A time series is a set of observations X t, each
More informationMathematical Methods for Computer Science
Mathematical Methods for Computer Science Computer Science Tripos, Part IB Michaelmas Term 2016/17 R.J. Gibbens Problem sheets for Probability methods William Gates Building 15 JJ Thomson Avenue Cambridge
More informationLecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes
Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationRandom Process. Random Process. Random Process. Introduction to Random Processes
Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,
More informationEC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix)
1 EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) Taisuke Otsu London School of Economics Summer 2018 A.1. Summation operator (Wooldridge, App. A.1) 2 3 Summation operator For
More information1 Random Variable: Topics
Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?
More information[POLS 8500] Review of Linear Algebra, Probability and Information Theory
[POLS 8500] Review of Linear Algebra, Probability and Information Theory Professor Jason Anastasopoulos ljanastas@uga.edu January 12, 2017 For today... Basic linear algebra. Basic probability. Programming
More information