Signals & Noises: Time-Domain & Frequency-Domain Representations (YCJ)


1 Signals
- Deterministic signals & stochastic (random) signals
- Continuous-time signals & discrete-time signals
- Analog signals & digital signals
- Energy signals & power signals
- Periodic signals & non-periodic signals
- Time-domain & frequency-domain representations

2 The Delta Function: δ(t)
Definition: δ(t) = 0 for t ≠ 0, and ∫_{-∞}^{∞} δ(t) dt = 1
Other definitions:
- δ(t) = lim_{n→∞} f_n(t), where f_n(t) is a rectangular pulse of height n/2 on (-1/n, 1/n)
- δ(t) = lim_{a→∞} sin(at)/(πt)

3 The Delta Function: δ(t)
- δ(at) = (1/|a|) δ(t)
- f(t) δ(t) = f(0) δ(t)
- ∫_{-∞}^{∞} δ(t) f(t) dt = f(0)
- ∫_{-∞}^{∞} δ(t-t₀) f(t) dt = f(t₀)

4 The Sinc Function: Sin(x)/x
Sinc(x) = sin(x)/x, with zeros at x = ±π, ±2π, …
∫_{-∞}^{∞} sin(t)/t dt = π
lim_{a→∞} sin(at)/(πt) = δ(t)
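As a quick numerical sanity check (Python, not part of the original slides), the integral ∫ sin(t)/t dt over the whole line can be approximated on a large truncated interval; the oscillatory tail contributes only O(1/A) error:

```python
import numpy as np

# Trapezoid-rule approximation of ∫ sin(t)/t dt over [-A, A]; the exact
# value over (-∞, ∞) is π, and the truncation error is of order 1/A.
A = 2_000.0
t = np.linspace(-A, A, 400_001)
dt = t[1] - t[0]
y = np.sinc(t / np.pi)                      # np.sinc(x) = sin(πx)/(πx), so this is sin(t)/t
integral = np.sum((y[:-1] + y[1:]) / 2) * dt
assert abs(integral - np.pi) < 5e-3          # close to π
```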

5 The Integral of e^{jωt}
∫_{-∞}^{∞} e^{jωt} dt = 2π δ(ω), or equivalently ∫_{-∞}^{∞} e^{jωt} dω = 2π δ(t)
Derivation:
∫_{-∞}^{∞} e^{jωt} dt = lim_{a→∞} ∫_{-a}^{a} e^{jωt} dt = lim_{a→∞} 2a sin(ωa)/(ωa) = lim_{a→∞} 2π sin(ωa)/(πω) = 2π δ(ω)

6 Fourier Fundamental Theorem
F(ω) = ∫_{-∞}^{∞} f(t) e^{-jωt} dt
f(t) = (1/2π) ∫_{-∞}^{∞} F(ω) e^{jωt} dω

7 The Fourier Integral
F(ω) = ∫_{-∞}^{∞} f(t) e^{-jωt} dt = R(ω) + j I(ω) = A(ω) e^{jφ(ω)}
An example: f(t) = e^{-αt} u(t), α > 0
F(ω) = 1/(α+jω) = α/(α²+ω²) - jω/(α²+ω²) = (α²+ω²)^{-1/2} e^{-j arctan(ω/α)}
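The example transform can be verified numerically (a Python sketch, not from the slides): the Fourier integral of e^{-αt}u(t) is computed by quadrature and compared with 1/(α+jω):

```python
import numpy as np

# Numerical Fourier integral of f(t) = e^{-αt}u(t); the exact transform
# is 1/(α + jω). The integrand is negligible beyond t = 40 for α = 2.
alpha = 2.0
t = np.linspace(0.0, 40.0, 400_001)
dt = t[1] - t[0]
for omega in (0.0, 1.0, 5.0):
    integrand = np.exp(-alpha * t) * np.exp(-1j * omega * t)
    F = np.sum((integrand[:-1] + integrand[1:]) / 2) * dt   # trapezoid rule
    exact = 1.0 / (alpha + 1j * omega)
    assert abs(F - exact) < 1e-6
```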

8 Fourier Integral of a Real Function
F(ω) = ∫_{-∞}^{∞} f(t) e^{-jωt} dt, so F*(ω) = ∫_{-∞}^{∞} f(t) e^{jωt} dt = F(-ω)
Writing F(ω) = A(ω) e^{jφ(ω)}: F*(ω) = A(ω) e^{-jφ(ω)} and F(-ω) = A(-ω) e^{jφ(-ω)}
Hence A(ω) = A(-ω) (even) and φ(ω) = -φ(-ω) (odd)

9 Basic Properties of the Fourier Integral
- Linearity: a₁f₁(t) + a₂f₂(t) ↔ a₁F₁(ω) + a₂F₂(ω)
- Symmetry: F(-t) ↔ 2π f(ω)
- Time scaling: f(at) ↔ (1/|a|) F(ω/a)
- Time shifting: f(t-t₀) ↔ F(ω) e^{-jωt₀}
- Frequency shifting: f(t) e^{jω₀t} ↔ F(ω-ω₀)
- Modulation: f(t) cos(ω₀t) ↔ [F(ω-ω₀) + F(ω+ω₀)]/2

10 Convolution & the Convolution Theorem
g(t) = f(t) * h(t) = ∫_{-∞}^{∞} f(τ) h(t-τ) dτ ↔ G(ω) = F(ω) H(ω)
Dually, h(t) f(t) ↔ (1/2π) H(ω) * F(ω)
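The discrete counterpart of the convolution theorem is easy to check (an illustrative Python sketch, not from the slides): circular convolution in the time domain equals pointwise multiplication of DFTs:

```python
import numpy as np

# Circular convolution g[k] = Σ_m f[m] h[(k-m) mod N], computed directly
# and via the DFT; the convolution theorem says both must agree.
rng = np.random.default_rng(0)
N = 64
f = rng.standard_normal(N)
h = rng.standard_normal(N)
# Direct circular convolution: np.roll(h[::-1], k+1)[m] = h[(k-m) mod N]
g_direct = np.array([np.sum(f * np.roll(h[::-1], k + 1)) for k in range(N)])
# Same result through the frequency domain
g_fft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)).real
assert np.allclose(g_direct, g_fft)
```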

11 Parseval Theorem & Energy
∫_{-∞}^{∞} f(t) g*(t) dt = (1/2π) ∫_{-∞}^{∞} F(ω) G*(ω) dω
∫_{-∞}^{∞} f(t) f*(t) dt = (1/2π) ∫_{-∞}^{∞} F(ω) F*(ω) dω
∫_{-∞}^{∞} |f(t)|² dt = (1/2π) ∫_{-∞}^{∞} |F(ω)|² dω
Energy spectral density: |F(ω)|²
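Parseval's theorem has a direct discrete analogue worth checking numerically (a Python sketch, not part of the slides): for the DFT, Σ|x[n]|² = (1/N) Σ|X[k]|²:

```python
import numpy as np

# Discrete Parseval check: time-domain energy equals frequency-domain
# energy up to the DFT's 1/N normalization factor.
rng = np.random.default_rng(1)
x = rng.standard_normal(128) + 1j * rng.standard_normal(128)
X = np.fft.fft(x)
energy_time = np.sum(np.abs(x) ** 2)
energy_freq = np.sum(np.abs(X) ** 2) / len(x)
assert np.isclose(energy_time, energy_freq)
```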

12 Fourier Transforms of Basic Functions
- Rectangular pulse p_T(t) (equal to 1 on (-T, T)): p_T(t) ↔ 2T sin(Tω)/(Tω)
- sin(at)/(πt) ↔ p_a(ω), a rectangular spectrum on (-a, a)
- p_T(t) cos(ω₀t) ↔ T{sin[T(ω-ω₀)]/[T(ω-ω₀)] + sin[T(ω+ω₀)]/[T(ω+ω₀)]}
- Triangular pulse q_T(t) on (-T, T): q_T(t) ↔ 4 sin²(Tω/2)/(Tω²)

13 Fourier Transforms of Basic Functions
- δ(t) ↔ 1; 1 ↔ 2π δ(ω)
- δ(t-t₀) ↔ e^{-jωt₀}; e^{jω₀t} ↔ 2π δ(ω-ω₀)
- cos(ω₀t) ↔ π[δ(ω-ω₀) + δ(ω+ω₀)]
- sin(ω₀t) ↔ -jπ[δ(ω-ω₀) - δ(ω+ω₀)]
- sgn(t) ↔ 2/(jω); u(t) ↔ π δ(ω) + 1/(jω)

14 Bandwidth and Duration
Bandwidth: B_ω = ∫_{-∞}^{∞} |H(ω)| dω / |H(0)|
Duration: D_τ = ∫_{-∞}^{∞} |h(t)| dt / |h(t)|_max
Uncertainty relation: B_ω D_τ ≥ 2π

15 The Summation of e^{jkωT}
Σ_{k=-∞}^{∞} e^{jkωT} = (2π/T) Σ_{k=-∞}^{∞} δ(ω - k 2π/T)
Σ_{k=-∞}^{∞} e^{jkω} = 2π Σ_{k=-∞}^{∞} δ(ω - k 2π)
Σ_{k=-∞}^{∞} e^{jk(ω-ω₀)MT} = (2π/MT) Σ_{k=-∞}^{∞} δ(ω - ω₀ - k 2π/(MT))

16 The Fourier Series
For a periodic function f(t) with period T, i.e. f(t) = f(t-T):
f(t) = Σ_k α_k e^{jkω₀t}, where ω₀ = 2π/T
α_k = (1/T) ∫_{-T/2}^{T/2} f(t) e^{-jkω₀t} dt = (1/T) F₀(kω₀)
where F₀(ω) is the Fourier transform of the single-period truncation
f₀(t) = f(t) for t ∈ (-T/2, T/2), and 0 otherwise

17 The Fourier Series
[Figure: a periodic f(t), its one-period truncation f₀(t) on (-T/2, T/2), and the spectrum F₀(ω) sampled at the harmonics 0, ±ω₀, ±2ω₀, ±3ω₀, …]

18 Parseval Theorem & Average Power
Average power = (1/T) ∫_{-T/2}^{T/2} |f(t)|² dt = Σ_{k=-∞}^{∞} |α_k|² = Σ_{k=-∞}^{∞} |(1/T) F₀(k 2π/T)|²
If G(ω) is the Fourier transform of a pulse of duration T:
∫_{-∞}^{∞} |G(ω)|² dω = (2π/T) Σ_{k=-∞}^{∞} |G(k 2π/T)|²

19 Autocorrelation Function & Power Spectral Density
R_v(τ) = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} v(t) v*(t-τ) dt
Average power: P_v = R_v(0)
Power spectral density: S_v(ω) = ∫_{-∞}^{∞} R_v(τ) e^{-jωτ} dτ
P_v = R_v(0) = (1/2π) ∫_{-∞}^{∞} S_v(ω) dω
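The relation "average power = R_v(0) = integral of the PSD" has a simple discrete counterpart (a Python sketch, not from the slides), using the periodogram |V[k]|²/N as the PSD estimate:

```python
import numpy as np

# Discrete analogue of P_v = R_v(0) = (1/2π)∫S_v(ω)dω: the mean of the
# periodogram equals the mean-square value of the signal (by Parseval).
rng = np.random.default_rng(2)
v = rng.standard_normal(1024)
power_time = np.mean(v ** 2)                 # R_v(0), the average power
S = np.abs(np.fft.fft(v)) ** 2 / len(v)      # periodogram PSD estimate
power_freq = np.mean(S)                      # average of the PSD over frequency
assert np.isclose(power_time, power_freq)
```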

20 Power Spectral Density of Periodic Signals
R_v(τ) = (1/T) ∫_{-T/2}^{T/2} v(t) v*(t-τ) dt = Σ_{k=-∞}^{∞} |α_k|² e^{jkω₀τ}
R_v(0) = Σ_{k=-∞}^{∞} |α_k|²
S_v(ω) = 2π Σ_{k=-∞}^{∞} |α_k|² δ(ω - kω₀)

21 Linear Time-Invariant System
x(t) → [h(t), H(ω)] → y(t)
- Linearity: a x₁(t) + b x₂(t) → a y₁(t) + b y₂(t)
- Time-invariance: x(t-τ) → y(t-τ)
- Convolution: y(t) = ∫_{-∞}^{∞} x(t-τ) h(τ) dτ, so Y(ω) = X(ω) H(ω)

22 Frequency Response of an LTI System
e^{jωt} → H(ω) e^{jωt}, where H(ω) = |H(ω)| e^{j arg{H(ω)}}
A₁e^{jω₁t} + A₂e^{jω₂t} → H(ω₁) A₁e^{jω₁t} + H(ω₂) A₂e^{jω₂t}
x(t) = Σ_k α_k e^{jkω₀t} → y(t) = Σ_k α_k H(kω₀) e^{jkω₀t}

23 Frequency Response of an LTI System
[Figure: example input spectrum X(ω) and output spectrum Y(ω), plotted against ω = 2πf.]

24 Power Spectral Density & LTI System
x(t) → [h(t), H(ω)] → y(t): S_y(ω) = S_x(ω) |H(ω)|²
For periodic signals:
S_y(ω) = 2π Σ_{k=-∞}^{∞} |α_k|² |H(ω)|² δ(ω - kω₀) = 2π Σ_{k=-∞}^{∞} |α_k|² |H(kω₀)|² δ(ω - kω₀)

25 Probability Space
- ξ: experimental outcome
- {ξ₁, ξ₂, ξ₃}: an event
- {ξ}: elementary event
- S: the certain event
- φ: the impossible event

26 Two-Face Coin
Example: a two-face coin, {H, T}
ξ₁ = H, ξ₂ = T
Elementary events: {ξ₁} = {H}, {ξ₂} = {T}
S = {H, T}, φ = {}

27 Six-Face Die
Example: a six-face die, {f₁, f₂, f₃, f₄, f₅, f₆}
ξ_k = f_k, k = 1, 2, …, 6; elementary events {ξ_k} = {f_k}
"even" event: {f₂, f₄, f₆}; "smallest two" event: {f₁, f₂}
S = {f₁, f₂, f₃, f₄, f₅, f₆}, φ = {}

28 Three Axioms
Q: How to specify (characterize) a probability space?
A: Find the probabilities of all events.
For events A and B, write AB = A ∩ B and A+B = A ∪ B.
- Axiom 1: P(A) ≥ 0
- Axiom 2: P(S) = 1
- Axiom 3: if AB = φ, then P(A+B) = P(A) + P(B)

29 Basic Results
- P(φ) = 0 (use Axiom 3 with B = φ)
- P(A) ≤ 1 (use Axioms 1, 2 and 3 with B = A^c)
- P(A+B) = P(A) + P(B) - P(AB):
  A+B = A + A^cB, so P(A+B) = P(A) + P(A^cB)
  B = (A+A^c)B, so P(B) = P(AB) + P(A^cB)
  Subtracting gives P(A+B) = P(A) + P(B) - P(AB)
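The inclusion-exclusion result can be checked exhaustively on the six-face die (an illustrative Python sketch, not from the slides):

```python
from fractions import Fraction

# Exhaustive check of P(A∪B) = P(A) + P(B) - P(A∩B) on a fair die:
# A = even faces, B = faces no larger than 3.
S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))       # equally likely outcomes
A, B = {2, 4, 6}, {1, 2, 3}
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
assert lhs == rhs == Fraction(5, 6)
```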

30 The Real Line as a Probability Space
If f(x) ≥ 0 for all x, and ∫_{-∞}^{∞} f(x) dx = 1,
we define P{x ≤ x₀} = ∫_{-∞}^{x₀} f(x) dx.
Then the probability space is completely characterized.
Events: {x₁ < x ≤ x₂}, {x₁ ≤ x}, {x ≤ x₂}; P{single point} = 0

31 Conditional Probability
The conditional probability of an event A given M (assuming P(M) ≠ 0):
P(A|M) = P(AM) / P(M)
1. If M ⊂ A, then P(A|M) = 1
2. If A ⊂ M, then P(A|M) ≥ P(A)
3. If A and M are independent, then P(A|M) = P(A)
4. Conditional probability is itself a probability:
   P(A|M) ≥ 0; P(S|M) = 1; if AB = φ, then P(A+B|M) = P(A|M) + P(B|M)

32 Conditional Probability
Example 1: in a fair six-face die experiment, A = {f₂} and M = {even}
P(A) = 1/6, P(M) = 1/2, P(AM) = 1/6, so P(A|M) = (1/6)/(1/2) = 1/3
Example 2: 3 white balls and 2 red balls
Example 3: t, the age at which a person dies, with P{t ≤ t₀} = ∫₀^{t₀} f(t) dt
Prior probability vs. posterior probability

33 Total Probability Theorem
If {A₁, A₂, …, A_n} is a partition of S, then
P(B) = P(B|A₁)P(A₁) + P(B|A₂)P(A₂) + … + P(B|A_n)P(A_n)
Bayes theorem:
P(A_k|B) = P(B|A_k)P(A_k)/P(B) = P(B|A_k)P(A_k) / {P(B|A₁)P(A₁) + P(B|A₂)P(A₂) + … + P(B|A_n)P(A_n)}
Example: four boxes of components
Independence: A and B are independent if P(AB) = P(A)P(B)
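A small worked instance of the two theorems (Python, not from the slides; the two boxes and their defect rates below are illustrative numbers, not the slides' boxes-of-components example):

```python
from fractions import Fraction

# Hypothetical setup: two boxes picked with equal probability; box 1 has
# 2% defective parts, box 2 has 10%. One part is drawn at random.
priors = {1: Fraction(1, 2), 2: Fraction(1, 2)}      # P(box k)
p_def = {1: Fraction(2, 100), 2: Fraction(10, 100)}  # P(defective | box k)
# Total probability theorem: P(defective) = Σ_k P(defective|box k) P(box k)
p_b = sum(p_def[k] * priors[k] for k in priors)
# Bayes theorem: posterior probability the part came from box 2
post2 = p_def[2] * priors[2] / p_b
assert p_b == Fraction(6, 100)
assert post2 == Fraction(5, 6)
```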

34 Random Variables
X(ξ): the assignment of a number to every experimental outcome ξ in the entire probability space S.
Examples (six-face die): X(f_k) = k; Y(f_k) = k²
Events generated by random variables:
{X ≤ x}; e.g. {X ≤ 3.5} = {f₁, f₂, f₃}
{Y ≤ y}; e.g. {Y ≤ 7} = {f₁, f₂}

35 The Distribution Function
F_X(x) = P{X ≤ x}, where x is a real variable
Also called the probability distribution function or cumulative distribution function (CDF)
Examples: coin tossing, die rolling, the real line

36 Properties of the Distribution Function
Notation: F(x⁺) = lim_{ε→0} F(x+ε); F(x⁻) = lim_{ε→0} F(x-ε), with ε > 0
1. F(+∞) = 1; F(-∞) = 0
2. F(x) is non-decreasing
3. P{X > x} = 1 - F(x)
4. F(x⁺) = F(x), i.e., F is continuous from the right
5. P{x₁ < X ≤ x₂} = F(x₂) - F(x₁)
6. P{X = x} = F(x) - F(x⁻)
7. P{x₁ ≤ X ≤ x₂} = F(x₂) - F(x₁⁻)

37 Probability Density Function (PDF)
f(x) = (d/dx) F(x)
1. f(x) ≥ 0
2. ∫_{-∞}^{x} f(u) du = F(x)
3. ∫_{-∞}^{∞} f(u) du = 1
4. ∫_a^b f(u) du = F(b) - F(a)

38 Discrete Random Variables and the PMF
Six-face die experiment
Bernoulli trials & the binomial distribution:
A: the event of a successful trial; P(A) = p, P(A^c) = q, p + q = 1
p_n(k) = P{A occurs exactly k times in n trials} = C(n,k) p^k q^{n-k}
Σ_{k=0}^{n} C(n,k) p^k q^{n-k} = 1
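The binomial PMF is easy to evaluate and check directly (a Python sketch, not part of the slides):

```python
from math import comb

# Binomial PMF p_n(k) = C(n,k) p^k q^(n-k); the probabilities sum to 1
# over k = 0..n, and the mean works out to np.
n, p = 10, 0.3
q = 1 - p
pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]
assert abs(sum(pmf) - 1.0) < 1e-12
mean = sum(k * pk for k, pk in enumerate(pmf))
assert abs(mean - n * p) < 1e-12
```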

39 Poisson Distribution
Poisson theorem: for n → ∞ and p → 0 with np → μ,
C(n,k) p^k q^{n-k} → e^{-μ} μ^k/k!, k = 0, 1, …
Random Poisson points:
P{k points in τ out of n points in T} = C(n,k) p^k q^{n-k}, where p = τ/T
Now let T → ∞ and n → ∞ with n/T → λ. Then
P{k points in τ} ≈ e^{-nτ/T} (nτ/T)^k/k! → e^{-λτ} (λτ)^k/k!, k = 0, 1, …
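The Poisson theorem can be illustrated numerically (a Python sketch, not from the slides): for large n and small p with np fixed, the binomial probabilities are close to the Poisson ones:

```python
from math import comb, exp, factorial

# Poisson approximation check: with n = 10000 and np = μ = 2 fixed,
# C(n,k) p^k q^(n-k) should be within O(μ²/n) of e^{-μ} μ^k / k!.
mu, n = 2.0, 10_000
p = mu / n
for k in range(6):
    binom = comb(n, k) * p**k * (1 - p) ** (n - k)
    poisson = exp(-mu) * mu**k / factorial(k)
    assert abs(binom - poisson) < 1e-3
```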

40 Continuous Random Variables & the PDF
Gaussian (normal) distribution: f_X(x) = 1/[(2π)^{1/2} σ_x] exp{-(x-m_x)²/(2σ_x²)}
Uniform distribution
Rayleigh distribution: f_R(r) = (r/σ²) exp{-r²/(2σ²)} for r > 0; f_R(r) = 0 for r < 0

41 Mean, Variance & Moments
m_x = ∫_{-∞}^{∞} x f(x) dx; σ_x² = ∫_{-∞}^{∞} (x-m_x)² f(x) dx
Discrete case: m_x = Σ_k k P{X=k}; σ_x² = Σ_k (k-m_x)² P{X=k}
n-th moment: ∫_{-∞}^{∞} x^n f(x) dx

42 Characteristic Function
Φ(ω) = ∫_{-∞}^{∞} f(x) e^{jωx} dx
|Φ(ω)| ≤ Φ(0) = 1
Moment generation using the characteristic function: E{X^n} = (1/j^n) (d^n/dω^n) Φ(ω) evaluated at ω = 0

43 Functions of a Random Variable
Y = g(X): ξ → X(ξ) (a real number) → g → Y (a real number); Y is a random variable
F_Y(y) = P{Y ≤ y} = P{g(X) ≤ y}
[Figure: a non-monotonic curve y = g(x), showing the x-values lying below the levels y₁ and y₂.]

44 Functions of a Random Variable
F_Y(y) = P{Y ≤ y} = P{g(X) ≤ y}
On a monotonically increasing segment, F_Y(y₁) = F_X(x₁), where x₁ solves g(x₁) = y₁.
When {x: g(x) ≤ y₂} is a union of intervals, F_Y(y₂) is the sum of the F_X increments over those intervals.

45 Examples
1. Y = aX + b
F_Y(y) = P{Y ≤ y} = P{aX+b ≤ y} = P{aX ≤ y-b}
= F_X((y-b)/a) if a > 0; = 1 - F_X((y-b)/a) if a < 0
2. Y = X²
For y < 0, F_Y(y) = 0
For y > 0, F_Y(y) = P{X² ≤ y} = P{-y^{1/2} < X ≤ y^{1/2}} = F_X(y^{1/2}) - F_X(-y^{1/2})

46 Fundamental Theorem
[Figure: y = g(x) cut by the strip (y, y+dy), which maps back to the intervals (x₁, x₁+dx₁), (x₂, x₂+dx₂), (x₃, x₃+dx₃).]
f_Y(y) = f_X(x₁)/|g′(x₁)| + f_X(x₂)/|g′(x₂)| + … + f_X(x_n)/|g′(x_n)|
where x₁, x₂, …, x_n are the solutions of y = g(x)

47 Fundamental Theorem
f_Y(y) dy = P{y < Y ≤ y+dy} = P{y < g(X) ≤ y+dy}
= P{x₁ < X ≤ x₁+dx₁} + P{x₂ < X ≤ x₂+dx₂} + P{x₃ < X ≤ x₃+dx₃}
= f_X(x₁) dx₁ + f_X(x₂) dx₂ + f_X(x₃) dx₃
= f_X(x₁) dy/|g′(x₁)| + f_X(x₂) dy/|g′(x₂)| + f_X(x₃) dy/|g′(x₃)|
Hence f_Y(y) = f_X(x₁)/|g′(x₁)| + f_X(x₂)/|g′(x₂)| + f_X(x₃)/|g′(x₃)|

48 Examples
1. Y = aX + b: f_Y(y) = (1/|a|) f_X((y-b)/a)
2. Y = aX², a > 0: f_Y(y) = [1/(2(ay)^{1/2})] {f_X((y/a)^{1/2}) + f_X(-(y/a)^{1/2})}, for y > 0
3. Y = a sin(X), a > 0, X uniform: f_Y(y) = 1/[π(a²-y²)^{1/2}] for |y| < a; F_Y(y) = 1/2 + (1/π) sin⁻¹(y/a)
4. Y = tan(X): f_Y(y) = 1/[π(1+y²)]; F_Y(y) = 1/2 + (1/π) tan⁻¹(y)
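Example 4 can be checked by simulation (a Python sketch, not from the slides; it assumes X uniform on (-π/2, π/2), the interval on which tan is one-to-one):

```python
import numpy as np

# Monte Carlo check: if X is uniform on (-π/2, π/2), Y = tan(X) has the
# Cauchy CDF F_Y(y) = 1/2 + (1/π) arctan(y).
rng = np.random.default_rng(3)
x = rng.uniform(-np.pi / 2, np.pi / 2, 1_000_000)
y = np.tan(x)
for y0 in (-2.0, 0.0, 1.0):
    empirical = np.mean(y <= y0)                 # empirical CDF at y0
    exact = 0.5 + np.arctan(y0) / np.pi
    assert abs(empirical - exact) < 5e-3
```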

49 Conditional Distribution & Density
The conditional distribution of a random variable X given the event M (assuming P(M) ≠ 0):
F(x|M) = P{X ≤ x | M} = P{X ≤ x, M}/P(M)
A conditional distribution function is a distribution function.
The conditional density function f(x|M) is given by
f(x|M) = (d/dx) F(x|M) = lim_{Δx→0} [F(x+Δx|M) - F(x|M)]/Δx = lim_{Δx→0} P{x ≤ X < x+Δx | M}/Δx

50 Conditional Distribution Examples
[Figure: F(x|M) compared with F(x) for M = {X ≤ a} and for M = {b < X ≤ a}.]

51 Total Probability & Bayes Theorem
F(x) = F(x|A₁)P(A₁) + F(x|A₂)P(A₂) + … + F(x|A_n)P(A_n)
f(x) = f(x|A₁)P(A₁) + f(x|A₂)P(A₂) + … + f(x|A_n)P(A_n)
P(A|X ≤ x) = [F(x|A)/F(x)] P(A)
P(A|x₁ < X ≤ x₂) = {[F(x₂|A) - F(x₁|A)]/[F(x₂) - F(x₁)]} P(A)
P(A|X = x) = lim_{Δx→0} P(A|x < X ≤ x+Δx) = [f(x|A)/f(x)] P(A)
P(A) = ∫ P(A|x) f(x) dx
f(x|A) = P(A|X = x) f(x) / ∫ P(A|x) f(x) dx

52 Two Random Variables
X(ξ), Y(ξ): the assignment of a pair of numbers (x, y) to every experimental outcome ξ in the entire probability space S.
[Figure: the mapping from outcomes ξ to points (x, y) in the plane.]

53 Joint Distribution
{X ≤ x} ∩ {Y ≤ y} = {X ≤ x, Y ≤ y}
F_XY(x,y) = P{X ≤ x, Y ≤ y}
F_XY(-∞, y) = 0; F_XY(x, -∞) = 0; F_XY(∞, ∞) = 1
F_XY(∞, y) = F_Y(y) and F_XY(x, ∞) = F_X(x): the marginal distributions

54 Joint Distribution
P{x₁ < X ≤ x₂, Y ≤ y} = F_XY(x₂, y) - F_XY(x₁, y)
P{X ≤ x, y₁ < Y ≤ y₂} = F_XY(x, y₂) - F_XY(x, y₁)

55 Joint Distribution
P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = F_XY(x₂, y₂) - F_XY(x₂, y₁) - F_XY(x₁, y₂) + F_XY(x₁, y₁)

56 Joint Density
f_XY(x,y) = (∂²/∂x∂y) F_XY(x,y)
F_XY(x,y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f_XY(u,v) du dv
Marginal densities: f_X(x) = ∫_{-∞}^{∞} f_XY(x,y) dy; f_Y(y) = ∫_{-∞}^{∞} f_XY(x,y) dx

57 Independence
If X and Y are independent:
F_XY(x,y) = P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y} = F_X(x) F_Y(y), and f_XY(x,y) = f_X(x) f_Y(y)
X and Y are independent if and only if f_XY(x,y) = f_X(x) f_Y(y)

58 Discrete Distributions
Joint distribution: P{X = x_k, Y = y_m} = p_km, with Σ_k Σ_m p_km = 1
Marginal distributions: p_k = Σ_m p_km; p_m = Σ_k p_km

59 Bivariate Gaussian Distribution
Random variables X and Y are jointly normal (Gaussian) if
f_XY(x,y) = 1/[2πσ_xσ_y(1-ρ²)^{1/2}] exp{-A(x,y)/[2(1-ρ²)]}
where A(x,y) = (x-m_x)²/σ_x² + (y-m_y)²/σ_y² - 2ρ(x-m_x)(y-m_y)/(σ_xσ_y)
The marginal densities f_X(x) and f_Y(y) are normal:
f_X(x) = 1/[(2π)^{1/2}σ_x] exp{-(x-m_x)²/(2σ_x²)}
f_Y(y) = 1/[(2π)^{1/2}σ_y] exp{-(y-m_y)²/(2σ_y²)}

60 Mean, Variance & Covariance
m_x = E(X) = ∫∫ x f_XY(x,y) dx dy = ∫ x f_X(x) dx
m_y = E(Y) = ∫∫ y f_XY(x,y) dx dy = ∫ y f_Y(y) dy
Covariance: C(X,Y) = E{(X-m_x)(Y-m_y)} = ∫∫ (x-m_x)(y-m_y) f_XY(x,y) dx dy
Correlation coefficient: ρ(X,Y) = C(X,Y)/(σ_xσ_y), with |ρ(X,Y)| ≤ 1
For a jointly Gaussian density, uncorrelatedness implies independence.
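The covariance and correlation-coefficient definitions can be exercised by simulation (a Python sketch, not from the slides), constructing a pair with a known ρ:

```python
import numpy as np

# For Y = ρX + sqrt(1-ρ²)Z with X, Z independent standard normal, the
# correlation coefficient ρ(X,Y) equals ρ by construction.
rng = np.random.default_rng(4)
rho = 0.6
x = rng.standard_normal(500_000)
z = rng.standard_normal(500_000)
y = rho * x + np.sqrt(1 - rho**2) * z
c = np.mean((x - x.mean()) * (y - y.mean()))    # sample covariance C(X,Y)
rho_hat = c / (x.std() * y.std())               # sample ρ = C/(σ_x σ_y)
assert abs(rho_hat - rho) < 0.01
```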

61 Conditional Density
F_Y(y | x₁ < X ≤ x₂) = P{x₁ < X ≤ x₂, Y ≤ y}/P{x₁ < X ≤ x₂} = [F_XY(x₂,y) - F_XY(x₁,y)]/[F_X(x₂) - F_X(x₁)]
f_Y(y | x₁ < X ≤ x₂) = ∫_{x₁}^{x₂} f_XY(x,y) dx / [F_X(x₂) - F_X(x₁)]
f_{Y|X}(y | X = x) = lim_{Δx→0} ∫_x^{x+Δx} f_XY(u,y) du / [F_X(x+Δx) - F_X(x)] = f_XY(x,y)/f_X(x)
Hence f_{Y|X}(y|x) = f_XY(x,y)/f_X(x) and f_{X|Y}(x|y) = f_XY(x,y)/f_Y(y)

62 Conditional Gaussian Density
If X and Y are jointly normal, then
f_XY(x,y) = 1/[2πσ_xσ_y(1-ρ²)^{1/2}] exp{-A(x,y)/[2(1-ρ²)]}
where A(x,y) = (x-m_x)²/σ_x² + (y-m_y)²/σ_y² - 2ρ(x-m_x)(y-m_y)/(σ_xσ_y), and
f_{Y|X}(y|x) = 1/{σ_y[2π(1-ρ²)]^{1/2}} exp{-B(x,y)/[2σ_y²(1-ρ²)]}
where B(x,y) = [(y-m_y) - (ρσ_y/σ_x)(x-m_x)]²

63 Conditional Mean & Variance
Conditional mean: m_{y|x} = E(Y|X=x) = ∫ y f_{Y|X}(y|x) dy
Conditional variance: σ²_{y|x} = E{(Y-m_{y|x})² | X=x} = ∫ (y-m_{y|x})² f_{Y|X}(y|x) dy
For the normal density: m_{y|x} = m_y + (ρσ_y/σ_x)(x-m_x); σ²_{y|x} = σ_y²(1-ρ²)
Conditional mean as a random variable: E(Y|X=x) = φ(x), so E(Y|X) = φ(X)

64 Functions of Two Random Variables
Z = g(X,Y): P{Z ≤ z} = P{g(X,Y) ≤ z}
F_Z(z) = ∫∫_{D_z} f_XY(x,y) dx dy, where D_z = {(x,y): g(x,y) ≤ z}
Example: Z = X + Y
F_Z(z) = P{Z ≤ z} = P{X+Y ≤ z} = ∫_{-∞}^{∞} [∫_{-∞}^{z-y} f_XY(x,y) dx] dy
f_Z(z) = (d/dz) F_Z(z) = ∫_{-∞}^{∞} f_XY(z-y, y) dy
If X and Y are independent: f_Z(z) = ∫_{-∞}^{∞} f_X(z-y) f_Y(y) dy

65 Functions of Two Random Variables
Example: Z = (X²+Y²)^{1/2}
F_Z(z) = P{Z ≤ z} = P{(X²+Y²)^{1/2} ≤ z} = ∫∫_{D_z} f_XY(x,y) dx dy
In polar coordinates, with R = (X²+Y²)^{1/2} and θ = tan⁻¹(Y/X):
P{(X²+Y²)^{1/2} ≤ z} = P{R ≤ z} = ∫_0^{2π} ∫_0^z r g(r,θ) dr dθ
If f_XY(x,y) = (1/2πσ²) exp[-(x²+y²)/(2σ²)], then
f_Z(z) = (z/σ²) exp[-z²/(2σ²)] for z > 0, i.e., Z has a Rayleigh distribution

66 Polar Coordinates
f(t) = X cos(ωt) + Y sin(ωt) = R cos(ωt - θ)
where R = (X²+Y²)^{1/2} and θ = tan⁻¹(Y/X)
If f_XY(x,y) = (1/2πσ²) exp[-(x²+y²)/(2σ²)], then
f_{Rθ}(r,θ) = (1/2π)(r/σ²) exp[-r²/(2σ²)] for r > 0 and |θ| < π

67 Joint Moments & Characteristic Function
Expected value of Z = g(X,Y):
E(Z) = ∫ z f_Z(z) dz = ∫∫ g(x,y) f_XY(x,y) dx dy
E(X+Y) = E(X) + E(Y)
E(XY) = ∫∫ xy f_XY(x,y) dx dy, which in general differs from E(X)E(Y) unless X and Y are independent
Φ(ω₁,ω₂) = E(e^{jω₁X} e^{jω₂Y}) = ∫∫ e^{jω₁x} e^{jω₂y} f_XY(x,y) dx dy

68 Mean Square Estimation
Y is a random variable, C is a constant. Find C such that e = E{(Y-C)²} is minimized.
e = E{(Y-C)²} = E{Y² - 2YC + C²} = E{Y²} - 2C E{Y} + C² = E{Y²} - 2C m_y + C²
∂e/∂C = -2m_y + 2C = 0 ⟹ C = m_y
e_min = E{(Y-m_y)²} = σ_y²

69 Mean Square Estimation
Y is a random variable, X is another random variable. Find a function c(X) such that e = E{[Y-c(X)]²} is minimized.
e = E{[Y-c(X)]²} = ∫∫ [y-c(x)]² f_XY(x,y) dx dy = ∫ f_X(x) {∫ [y-c(x)]² f_{Y|X}(y|x) dy} dx
Since f_X(x) is non-negative, for every fixed x the best choice is
c(x) = E(Y|X=x), i.e., c(X) = E(Y|X) = φ(X), the conditional mean
e_min = E{[Y-φ(X)]²} = E{Y²} - 2E{Yφ(X)} + E{φ²(X)} = E{Y²} - E{φ²(X)}
If X and Y are independent, then c(x) = m_y

70 Mean Square Estimation
Y is a random variable, X is another random variable. Find a linear function c(X) = aX + b such that e = E{[Y-c(X)]²} is minimized.
e = E{[Y-(aX+b)]²} = E{[(Y-aX)-b]²}
If X and Y are jointly normal: a = ρσ_y/σ_x, b = m_y - a m_x, and e_min = σ_y²(1-ρ²)
φ(x) = (ρσ_y/σ_x)x + m_y - (ρσ_y/σ_x)m_x
i.e., the conditional mean φ(x) is a linear function of x
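The optimal linear estimator can be verified by simulation (a Python sketch, not from the slides): with a = ρσ_y/σ_x and b = m_y - a m_x, the residual error is σ_y²(1-ρ²), and perturbing either coefficient only increases the mean-square error:

```python
import numpy as np

# Jointly normal pair with m_x = m_y = 0, σ_x = 1, σ_y = 2, ρ = 0.8.
rng = np.random.default_rng(5)
rho, sy = 0.8, 2.0
x = rng.standard_normal(500_000)
y = sy * (rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(500_000))
a, b = rho * sy, 0.0                       # a = ρσ_y/σ_x, b = m_y - a m_x
mse = lambda a_, b_: np.mean((y - a_ * x - b_) ** 2)
e_min = mse(a, b)
assert abs(e_min - sy**2 * (1 - rho**2)) < 0.05   # e_min = σ_y²(1-ρ²)
assert mse(a + 0.1, b) > e_min and mse(a, b + 0.1) > e_min
```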

71 Sample Mean & Sample Variance
X_k, k = 1, 2, …, n, are i.i.d. with E(X_k) = m and E{(X_k-m)²} = σ²
Sample mean: X̄ = (1/n) Σ_{k=1}^{n} X_k
Sample variance: V = [1/(n-1)] Σ_{k=1}^{n} (X_k - X̄)²
E(X̄) = m, σ²_X̄ = σ²/n, and E(V) = σ²

72 Parameter Estimation
X_k = m + Y_k, k = 1, 2, …, n, where the Y_k are independent and zero-mean with variances σ_k².
To find a minimum-variance unbiased estimator of m of the form M = Σ_{k=1}^{n} a_k X_k:
unbiasedness E(M) = m requires Σ_{k=1}^{n} a_k = 1, and the variance E{(M-m)²} = Σ_{k=1}^{n} a_k²σ_k² is to be minimized.
Lagrange multiplier: V = Σ_{k=1}^{n} a_k²σ_k² - λ(Σ_{k=1}^{n} a_k - 1)
∂V/∂a_k = 0 ⟹ a_k = λ/(2σ_k²), with λ = 2/(Σ_{k=1}^{n} 1/σ_k²)

73 Stochastic (Random) Processes
X(t, ξ): assigns a time function to every experimental outcome ξ
- Continuous-time process X(t, ξ); discrete-time process X[n, ξ]
- Continuous-state process; discrete-state process
For a fixed outcome ξ₀ we have a time function x(t, ξ₀); a single realization x(t, ξ₀) is a sample function.
For a fixed time t₀ we have a random variable X(t₀, ξ).

74 Random Processes
Examples: X(t, ξ) = R(ξ) cos[ωt + θ(ξ)]; random walk: X[n+1, ξ] = X[n, ξ] + X_n
Equality of processes:
1. Almost everywhere: P{X(t) ≠ Y(t)} = 0
2. Mean-square (M.S.) sense: E{|X(t)-Y(t)|²} = 0

75 Distribution & Density Functions
First-order statistics:
F(x; t) = P{X(t) ≤ x}; f(x; t) = (∂/∂x) F(x; t); m_X(t) = E{X(t)} = ∫ x f(x; t) dx
Second-order statistics:
F(x₁,x₂; t₁,t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂}; f(x₁,x₂; t₁,t₂) = (∂²/∂x₁∂x₂) F(x₁,x₂; t₁,t₂)

76 Autocorrelation Function
R_X(t₁,t₂) = E{X(t₁)X(t₂)} = ∫∫ x₁x₂ f(x₁,x₂; t₁,t₂) dx₁ dx₂
R_X(t,t) = E{X(t)X(t)}: the second moment
C_X(t₁,t₂) = R_X(t₁,t₂) - m_X(t₁)m_X(t₂): the covariance
Example: X(t) = A cos(ωt+θ), with θ uniform on [0, 2π):
R_X(t₁,t₂) = (A²/2) cos[ω(t₁-t₂)]
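The random-phase cosine example can be confirmed numerically (a Python sketch, not from the slides): averaging over a fine uniform grid of phase values reproduces (A²/2)cos[ω(t₁-t₂)]:

```python
import numpy as np

# E over uniform θ of A cos(ωt1+θ) · A cos(ωt2+θ) should equal
# (A²/2) cos[ω(t1-t2)]; the grid average over one full period is exact
# up to floating-point error.
A, w = 3.0, 2.0
theta = np.linspace(0.0, 2 * np.pi, 100_000, endpoint=False)
t1, t2 = 0.7, 0.2
R = np.mean(A * np.cos(w * t1 + theta) * A * np.cos(w * t2 + theta))
assert abs(R - (A**2 / 2) * np.cos(w * (t1 - t2))) < 1e-9
```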

77 Stationary Processes
Strictly stationary: for all n, all (t₁, t₂, …, t_n) and all shifts c,
f(x₁, …, x_n; t₁, …, t_n) = f(x₁, …, x_n; t₁+c, …, t_n+c)
Wide-sense stationary (WSS): m_X(t) = E{X(t)} is independent of t, and R_X(t₁,t₂) depends only on τ = t₁-t₂
Cyclostationary: both m_X(t) and R_X(t+τ, t) are periodic functions of t
Example: Y(t) = X(t) cos(ωt), where X(t) is WSS

78 Autocorrelation Function of WSS Processes
R_X(τ) = R_X(-τ); |R_X(τ)| ≤ R_X(0)
If for some T we have R_X(T) = R_X(0), then R_X(kT) = R_X(0) for all k
A stationary process is ergodic if, for every function g(x) and every ξ ∈ S,
lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} g[X(t,ξ)] dt = E{g(X(t))}

79 Power Spectral Density
S(ω) = ∫_{-∞}^{∞} R(τ) e^{-jωτ} dτ
R(τ) = (1/2π) ∫_{-∞}^{∞} S(ω) e^{jωτ} dω
R(0) = (1/2π) ∫_{-∞}^{∞} S(ω) dω

80 LTI Systems
X(t), with mean m_X(t) and autocorrelation R_X(τ), through [h(t), H(ω)]:
Y(t) = ∫_{-∞}^{∞} X(τ) h(t-τ) dτ = X(t)*h(t)
m_Y(t) = m_X(t)*h(t)
R_Y(τ) = R_X(τ)*h(τ)*h(-τ)
S_Y(ω) = S_X(ω) |H(ω)|²

81 White Process (White Noise)
A process X(t) is called a white process if it has a flat power spectral density, i.e., S_X(ω) is a constant (N₀/2) for all ω.
Noise equivalent bandwidth of a filter:
BW_N = [1/|H(ω)|²_max] ∫_0^{∞} |H(ω)|² dω
Total noise power: N = N₀ · BW_N · |H(ω)|²_max

82 Counting Process
[Figure: a staircase counting process N(t) jumping at the arrival times; X₁, X₂, …, X₆ are the inter-arrival times.]

83 Poisson Process
P_{N(t)}[n] = (λt)^n e^{-λt}/n!
Independent increments: [N(t₂)-N(t₁)] and [N(t₃)-N(t₂)] are independent
E{N(t)} = λt, and R(t₁,t₂) = λt₁ + λ²t₁t₂ for t₁ < t₂
Inter-arrival time distribution: F_X(x) = 1 - e^{-λx} for x > 0

84 Brownian Motion
X(0) = 0, and [X(t+τ)-X(t)] is a zero-mean Gaussian random variable with variance ατ
A process of independent increments

85 Gaussian Process (Gaussian Noise)
A process X(t) is called a Gaussian process if for all n and all (t₁, t₂, …, t_n), the random variables {X(t₁), X(t₂), …, X(t_n)} have a jointly Gaussian density.
For Gaussian processes, the mean m_X(t) and the autocorrelation function R_X(t₁,t₂) completely characterize the process.
If a Gaussian process X(t) is passed through an LTI system, the output process Y(t) is also Gaussian.
For Gaussian processes, WSS and strict stationarity are equivalent.

86 Power Spectrum of a Digital PAM System
X(t) = Σ_{n=-∞}^{∞} B_n g(t-nT)
E{X(t)} = Σ_{n=-∞}^{∞} E(B_n) g(t-nT) = m_B Σ_{n=-∞}^{∞} g(t-nT)
R_X(t+τ, t) = E{X(t)X(t+τ)} = Σ_{k=-∞}^{∞} R_B[k] Σ_{n=-∞}^{∞} g(t-nT) g(t+τ-nT-kT)
X(t) is cyclostationary with period T!
Time-averaged autocorrelation: R̄_X(τ) = (1/T) ∫_{-T/2}^{T/2} R_X(t+τ, t) dt
S_X(ω) = (1/T) {Σ_{k=-∞}^{∞} R_B[k] e^{-jkωT}} |G(ω)|² = (1/T) S_B(ω) |G(ω)|²


More information

EE 438 Essential Definitions and Relations

EE 438 Essential Definitions and Relations May 2004 EE 438 Essential Definitions and Relations CT Metrics. Energy E x = x(t) 2 dt 2. Power P x = lim T 2T T / 2 T / 2 x(t) 2 dt 3. root mean squared value x rms = P x 4. Area A x = x(t) dt 5. Average

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Final. Fall 2016 (Dec 16, 2016) Please copy and write the following statement:

Final. Fall 2016 (Dec 16, 2016) Please copy and write the following statement: ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 06 Instructor: Prof. Stanley H. Chan Final Fall 06 (Dec 6, 06) Name: PUID: Please copy and write the following statement: I certify

More information

3. Review of Probability and Statistics

3. Review of Probability and Statistics 3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

EE 224 Signals and Systems I Review 1/10

EE 224 Signals and Systems I Review 1/10 EE 224 Signals and Systems I Review 1/10 Class Contents Signals and Systems Continuous-Time and Discrete-Time Time-Domain and Frequency Domain (all these dimensions are tightly coupled) SIGNALS SYSTEMS

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example

More information

Signal and systems. Linear Systems. Luigi Palopoli. Signal and systems p. 1/5

Signal and systems. Linear Systems. Luigi Palopoli. Signal and systems p. 1/5 Signal and systems p. 1/5 Signal and systems Linear Systems Luigi Palopoli palopoli@dit.unitn.it Wrap-Up Signal and systems p. 2/5 Signal and systems p. 3/5 Fourier Series We have see that is a signal

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan

More information

Mathematical Foundations of Signal Processing

Mathematical Foundations of Signal Processing Mathematical Foundations of Signal Processing Module 4: Continuous-Time Systems and Signals Benjamín Béjar Haro Mihailo Kolundžija Reza Parhizkar Adam Scholefield October 24, 2016 Continuous Time Signals

More information

EAS 305 Random Processes Viewgraph 1 of 10. Random Processes

EAS 305 Random Processes Viewgraph 1 of 10. Random Processes EAS 305 Random Processes Viewgraph 1 of 10 Definitions: Random Processes A random process is a family of random variables indexed by a parameter t T, where T is called the index set λ i Experiment outcome

More information

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline. Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

13.42 READING 6: SPECTRUM OF A RANDOM PROCESS 1. STATIONARY AND ERGODIC RANDOM PROCESSES

13.42 READING 6: SPECTRUM OF A RANDOM PROCESS 1. STATIONARY AND ERGODIC RANDOM PROCESSES 13.42 READING 6: SPECTRUM OF A RANDOM PROCESS SPRING 24 c A. H. TECHET & M.S. TRIANTAFYLLOU 1. STATIONARY AND ERGODIC RANDOM PROCESSES Given the random process y(ζ, t) we assume that the expected value

More information

2. (a) What is gaussian random variable? Develop an equation for guassian distribution

2. (a) What is gaussian random variable? Develop an equation for guassian distribution Code No: R059210401 Set No. 1 II B.Tech I Semester Supplementary Examinations, February 2007 PROBABILITY THEORY AND STOCHASTIC PROCESS ( Common to Electronics & Communication Engineering, Electronics &

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

Homework 4. May An LTI system has an input, x(t) and output y(t) related through the equation y(t) = t e (t t ) x(t 2)dt

Homework 4. May An LTI system has an input, x(t) and output y(t) related through the equation y(t) = t e (t t ) x(t 2)dt Homework 4 May 2017 1. An LTI system has an input, x(t) and output y(t) related through the equation y(t) = t e (t t ) x(t 2)dt Determine the impulse response of the system. Rewriting as y(t) = t e (t

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

P 1.5 X 4.5 / X 2 and (iii) The smallest value of n for

P 1.5 X 4.5 / X 2 and (iii) The smallest value of n for DHANALAKSHMI COLLEGE OF ENEINEERING, CHENNAI DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING MA645 PROBABILITY AND RANDOM PROCESS UNIT I : RANDOM VARIABLES PART B (6 MARKS). A random variable X

More information

Stochastic Processes

Stochastic Processes Elements of Lecture II Hamid R. Rabiee with thanks to Ali Jalali Overview Reading Assignment Chapter 9 of textbook Further Resources MIT Open Course Ware S. Karlin and H. M. Taylor, A First Course in Stochastic

More information

5 Operations on Multiple Random Variables

5 Operations on Multiple Random Variables EE360 Random Signal analysis Chapter 5: Operations on Multiple Random Variables 5 Operations on Multiple Random Variables Expected value of a function of r.v. s Two r.v. s: ḡ = E[g(X, Y )] = g(x, y)f X,Y

More information

Signals & Systems. Lecture 5 Continuous-Time Fourier Transform. Alp Ertürk

Signals & Systems. Lecture 5 Continuous-Time Fourier Transform. Alp Ertürk Signals & Systems Lecture 5 Continuous-Time Fourier Transform Alp Ertürk alp.erturk@kocaeli.edu.tr Fourier Series Representation of Continuous-Time Periodic Signals Synthesis equation: x t = a k e jkω

More information

ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes

ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes ECE353: Probability and Random Processes Lecture 18 - Stochastic Processes Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu From RV

More information

UCSD ECE153 Handout #40 Prof. Young-Han Kim Thursday, May 29, Homework Set #8 Due: Thursday, June 5, 2011

UCSD ECE153 Handout #40 Prof. Young-Han Kim Thursday, May 29, Homework Set #8 Due: Thursday, June 5, 2011 UCSD ECE53 Handout #40 Prof. Young-Han Kim Thursday, May 9, 04 Homework Set #8 Due: Thursday, June 5, 0. Discrete-time Wiener process. Let Z n, n 0 be a discrete time white Gaussian noise (WGN) process,

More information

Problem 1. Problem 2. Problem 3. Problem 4

Problem 1. Problem 2. Problem 3. Problem 4 Problem Let A be the event that the fungus is present, and B the event that the staph-bacteria is present. We have P A = 4, P B = 9, P B A =. We wish to find P AB, to do this we use the multiplication

More information

Fourier transforms. Definition F(ω) = - should know these! f(t).e -jωt.dt. ω = 2πf. other definitions exist. f(t) = - F(ω).e jωt.

Fourier transforms. Definition F(ω) = - should know these! f(t).e -jωt.dt. ω = 2πf. other definitions exist. f(t) = - F(ω).e jωt. Fourier transforms This is intended to be a practical exposition, not fully mathematically rigorous ref The Fourier Transform and its Applications R. Bracewell (McGraw Hill) Definition F(ω) = - f(t).e

More information

Practice Examination # 3

Practice Examination # 3 Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single

More information

Spectral Analysis of Random Processes

Spectral Analysis of Random Processes Spectral Analysis of Random Processes Spectral Analysis of Random Processes Generally, all properties of a random process should be defined by averaging over the ensemble of realizations. Generally, all

More information

Lecture Notes 3 Multiple Random Variables. Joint, Marginal, and Conditional pmfs. Bayes Rule and Independence for pmfs

Lecture Notes 3 Multiple Random Variables. Joint, Marginal, and Conditional pmfs. Bayes Rule and Independence for pmfs Lecture Notes 3 Multiple Random Variables Joint, Marginal, and Conditional pmfs Bayes Rule and Independence for pmfs Joint, Marginal, and Conditional pdfs Bayes Rule and Independence for pdfs Functions

More information

ECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc.

ECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc. ECE32 Exam 2 Version A April 21, 214 1 Name: Solution Score: /1 This exam is closed-book. You must show ALL of your work for full credit. Please read the questions carefully. Please check your answers

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept

More information

ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process

ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random

More information

1 Review of Probability and Distributions

1 Review of Probability and Distributions Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote

More information

Quantitative Methods in Economics Conditional Expectations

Quantitative Methods in Economics Conditional Expectations Quantitative Methods in Economics Conditional Expectations Maximilian Kasy Harvard University, fall 2016 1 / 19 Roadmap, Part I 1. Linear predictors and least squares regression 2. Conditional expectations

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Problems on Discrete & Continuous R.Vs

Problems on Discrete & Continuous R.Vs 013 SUBJECT NAME SUBJECT CODE MATERIAL NAME MATERIAL CODE : Probability & Random Process : MA 61 : University Questions : SKMA1004 Name of the Student: Branch: Unit I (Random Variables) Problems on Discrete

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

Fundamentals of Noise

Fundamentals of Noise Fundamentals of Noise V.Vasudevan, Department of Electrical Engineering, Indian Institute of Technology Madras Noise in resistors Random voltage fluctuations across a resistor Mean square value in a frequency

More information

Slides 8: Statistical Models in Simulation

Slides 8: Statistical Models in Simulation Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An

More information

FINAL EXAM: 3:30-5:30pm

FINAL EXAM: 3:30-5:30pm ECE 30: Probabilistic Methods in Electrical and Computer Engineering Spring 016 Instructor: Prof. A. R. Reibman FINAL EXAM: 3:30-5:30pm Spring 016, MWF 1:30-1:0pm (May 6, 016) This is a closed book exam.

More information

UCSD ECE250 Handout #24 Prof. Young-Han Kim Wednesday, June 6, Solutions to Exercise Set #7

UCSD ECE250 Handout #24 Prof. Young-Han Kim Wednesday, June 6, Solutions to Exercise Set #7 UCSD ECE50 Handout #4 Prof Young-Han Kim Wednesday, June 6, 08 Solutions to Exercise Set #7 Polya s urn An urn initially has one red ball and one white ball Let X denote the name of the first ball drawn

More information

Communication Theory Summary of Important Definitions and Results

Communication Theory Summary of Important Definitions and Results Signal and system theory Convolution of signals x(t) h(t) = y(t): Fourier Transform: Communication Theory Summary of Important Definitions and Results X(ω) = X(ω) = y(t) = X(ω) = j x(t) e jωt dt, 0 Properties

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the

More information

Fourier transform representation of CT aperiodic signals Section 4.1

Fourier transform representation of CT aperiodic signals Section 4.1 Fourier transform representation of CT aperiodic signals Section 4. A large class of aperiodic CT signals can be represented by the CT Fourier transform (CTFT). The (CT) Fourier transform (or spectrum)

More information

Chap 2.1 : Random Variables

Chap 2.1 : Random Variables Chap 2.1 : Random Variables Let Ω be sample space of a probability model, and X a function that maps every ξ Ω, toa unique point x R, the set of real numbers. Since the outcome ξ is not certain, so is

More information

Random Process. Random Process. Random Process. Introduction to Random Processes

Random Process. Random Process. Random Process. Introduction to Random Processes Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

Random Variables. P(x) = P[X(e)] = P(e). (1)

Random Variables. P(x) = P[X(e)] = P(e). (1) Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment

More information

Introduction to Probability and Stochastic Processes I

Introduction to Probability and Stochastic Processes I Introduction to Probability and Stochastic Processes I Lecture 3 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark Slides

More information

Definition of a Stochastic Process

Definition of a Stochastic Process Definition of a Stochastic Process Balu Santhanam Dept. of E.C.E., University of New Mexico Fax: 505 277 8298 bsanthan@unm.edu August 26, 2018 Balu Santhanam (UNM) August 26, 2018 1 / 20 Overview 1 Stochastic

More information

EE 3054: Signals, Systems, and Transforms Summer It is observed of some continuous-time LTI system that the input signal.

EE 3054: Signals, Systems, and Transforms Summer It is observed of some continuous-time LTI system that the input signal. EE 34: Signals, Systems, and Transforms Summer 7 Test No notes, closed book. Show your work. Simplify your answers. 3. It is observed of some continuous-time LTI system that the input signal = 3 u(t) produces

More information

Chapter 6 THE SAMPLING PROCESS 6.1 Introduction 6.2 Fourier Transform Revisited

Chapter 6 THE SAMPLING PROCESS 6.1 Introduction 6.2 Fourier Transform Revisited Chapter 6 THE SAMPLING PROCESS 6.1 Introduction 6.2 Fourier Transform Revisited Copyright c 2005 Andreas Antoniou Victoria, BC, Canada Email: aantoniou@ieee.org July 14, 2018 Frame # 1 Slide # 1 A. Antoniou

More information

Introduction to Probability Theory

Introduction to Probability Theory Introduction to Probability Theory Ping Yu Department of Economics University of Hong Kong Ping Yu (HKU) Probability 1 / 39 Foundations 1 Foundations 2 Random Variables 3 Expectation 4 Multivariate Random

More information

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section

More information

Signal Processing Signal and System Classifications. Chapter 13

Signal Processing Signal and System Classifications. Chapter 13 Chapter 3 Signal Processing 3.. Signal and System Classifications In general, electrical signals can represent either current or voltage, and may be classified into two main categories: energy signals

More information

ENGG2430A-Homework 2

ENGG2430A-Homework 2 ENGG3A-Homework Due on Feb 9th,. Independence vs correlation a For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent,

More information

Appendix A : Introduction to Probability and stochastic processes

Appendix A : Introduction to Probability and stochastic processes A-1 Mathematical methods in communication July 5th, 2009 Appendix A : Introduction to Probability and stochastic processes Lecturer: Haim Permuter Scribe: Shai Shapira and Uri Livnat The probability of

More information

Random Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University.

Random Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University. Random Signals and Systems Chapter 3 Jitendra K Tugnait Professor Department of Electrical & Computer Engineering Auburn University Two Random Variables Previously, we only dealt with one random variable

More information

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results

More information