1 Baseband Transmission of Binary Signals

Let $g_i(t)$, $i = 1, 2$, be a signal transmitted over an AWGN channel. Consider the following receiver:

[Block diagram: $g_i(t)$ plus the noise $W(t)$ forms $x(t)$, which passes through an LTI filter $h(t)$; the output $y(t)$ is sampled at $t = nT$ and $y(nT)$ is applied to a threshold comparator, producing the decision $\hat{b}_n$.]

Assuming that the transmission delay is negligible, we have
$$x(t) = g_i(t) + W(t), \quad (n-1)T \le t \le nT, \quad n = 1, 2, \ldots$$
where $W(t)$ is a zero-mean white Gaussian noise process with psd $N_0/2$ for all $f$.

2 At the output of the LTI filter,
$$y(t) = \int x(\tau)h(t-\tau)\,d\tau = \int \left[g_i(\tau) + W(\tau)\right]h(t-\tau)\,d\tau = \int g_i(\tau)h(t-\tau)\,d\tau + \int W(\tau)h(t-\tau)\,d\tau.$$
At the sampling instant $t = nT$,
$$y(nT) = \int g_i(\tau)h(nT-\tau)\,d\tau + \int W(\tau)h(nT-\tau)\,d\tau.$$
Let $g_1(t)$ be the transmitted pulse when a logical 1 is sent and let $g_2(t)$ be the transmitted pulse when a logical 0 is sent. Then a possible decision strategy is:

If $y(nT) > A$, then $g_1(t)$ was transmitted (1 sent).
If $y(nT) \le A$, then $g_2(t)$ was transmitted (0 sent).

3 Since $h(t)$ is LTI, the observation $y(nT)$, given the knowledge of $g_i(t)$, is a Gaussian random variable. Hence, to establish the decision criterion we only need to compute the conditional mean and variance of $y(nT)$ given $g_i(t)$.

The conditional expected value (mean) of the filter output given that $g_i(t)$ was sent is given by
$$E\{y(nT) \mid g_i(t)\} = E\left\{\int g_i(\tau)h(nT-\tau)\,d\tau \,\Big|\, g_i(t)\right\} + E\left\{\int W(\tau)h(nT-\tau)\,d\tau \,\Big|\, g_i(t)\right\}$$
$$= \int g_i(\tau)h(nT-\tau)\,d\tau + \int E\{W(\tau)\}\,h(nT-\tau)\,d\tau = \int g_i(\tau)h(nT-\tau)\,d\tau \triangleq G_i, \quad i = 1, 2.$$

4 Moreover, the conditional variance is
$$\operatorname{Var}\{y(nT) \mid g_i(t)\} = E\left\{\left(y(nT) - E\{y(nT) \mid g_i(t)\}\right)^2 \,\Big|\, g_i(t)\right\} = E\left\{\left|\int W(\tau)h(nT-\tau)\,d\tau\right|^2\right\}$$
$$= \iint E\{W(\tau)W^*(\lambda)\}\,h(nT-\tau)h^*(nT-\lambda)\,d\tau\,d\lambda = \frac{N_0}{2}\iint \delta(\tau-\lambda)\,h(nT-\tau)h^*(nT-\lambda)\,d\tau\,d\lambda$$
$$= \frac{N_0}{2}\int \left|h(nT-\lambda)\right|^2 d\lambda = \frac{N_0}{2}\int \left|h(\lambda)\right|^2 d\lambda = \frac{N_0}{2}\int \left|H(f)\right|^2 df \triangleq \sigma^2.$$

5 Therefore,
$$f(y(nT) \mid \text{"1" sent}) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(y-G_1)^2/2\sigma^2}, \qquad f(y(nT) \mid \text{"0" sent}) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(y-G_2)^2/2\sigma^2}.$$
Therefore, an error will occur if either a) we choose "0" when "1" was sent, or b) we choose "1" when "0" was sent. Mathematically,
$$P\{\text{choose "0"} \mid \text{"1" was sent}\} = P\{y(nT) \le A \mid \text{"1" was sent}\} = \int_{-\infty}^{A} f(y(nT) \mid \text{"1" was sent})\,dy$$
$$P\{\text{choose "1"} \mid \text{"0" was sent}\} = P\{y(nT) > A \mid \text{"0" was sent}\} = \int_{A}^{\infty} f(y(nT) \mid \text{"0" was sent})\,dy.$$

6 So, the average probability of a bit error (BER) is given by
$$P\{\text{bit error}\} = P\{\text{bit error and "1" was sent}\} + P\{\text{bit error and "0" was sent}\}$$
$$= P\{\text{choose "0"} \mid \text{"1" was sent}\}\,P\{\text{"1" was sent}\} + P\{\text{choose "1"} \mid \text{"0" was sent}\}\,P\{\text{"0" was sent}\}$$
$$= p\int_{-\infty}^{A} f(y(nT) \mid \text{"1" was sent})\,dy + (1-p)\int_{A}^{\infty} f(y(nT) \mid \text{"0" was sent})\,dy,$$
where $p = P\{\text{"1" was sent}\}$ and $1-p = P\{\text{"0" was sent}\}$. The decision regions are separated by the threshold $A$. See the next figure, where the threshold $A$ is assumed to be the midpoint of $G_1$ and $G_2$ because the bits occur with equal probability, i.e. $p = 0.5$ (we will discuss this issue next).
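As a quick numerical illustration (a sketch, not part of the original notes), the BER expression above can be evaluated directly with the Gaussian CDF; the values of $G_1$, $G_2$, $\sigma$, $p$, and $A$ below are made-up examples.

```python
# Sketch: evaluate P{bit error} = p*P{y <= A | "1"} + (1-p)*P{y > A | "0"}
# for made-up example values of G1, G2, sigma, p, and A.
from math import erf, sqrt

def gauss_cdf(x, mean, sigma):
    """CDF of a Gaussian with the given mean and standard deviation."""
    return 0.5 * (1.0 + erf((x - mean) / (sigma * sqrt(2.0))))

G1, G2 = 1.0, -1.0   # conditional means of y(nT) (example values)
sigma = 0.5          # conditional standard deviation (example value)
p = 0.5              # a priori probability of a "1"
A = 0.0              # decision threshold, here (G1 + G2)/2

P_error = p * gauss_cdf(A, G1, sigma) + (1 - p) * (1 - gauss_cdf(A, G2, sigma))
print(P_error)       # ~0.0228, i.e. Q(2) for these numbers
```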

7 [Figure: the conditional pdfs $f(y \mid \text{"1" was sent})$ and $f(y \mid \text{"0" was sent})$, with the decision regions separated by the threshold $A$.]

8 The question now is: How do we choose $A$ optimally? To find the optimal value of $A$, we minimize the probability of error with respect to it, i.e.,
$$\frac{d}{dA}P\{\text{bit error}\} = \frac{d}{dA}\left[p\int_{-\infty}^{A} f(y(nT) \mid \text{"1" was sent})\,dy + (1-p)\int_{A}^{\infty} f(y(nT) \mid \text{"0" was sent})\,dy\right]$$
$$= p\,f(A \mid \text{"1" sent}) - (1-p)\,f(A \mid \text{"0" sent}) = \frac{1}{\sqrt{2\pi\sigma^2}}\left[p\,e^{-(A-G_1)^2/2\sigma^2} - (1-p)\,e^{-(A-G_2)^2/2\sigma^2}\right] = 0$$
or
$$p\,e^{-(A-G_1)^2/2\sigma^2} = (1-p)\,e^{-(A-G_2)^2/2\sigma^2} \quad\Longrightarrow\quad e^{\left[(A-G_2)^2 - (A-G_1)^2\right]/2\sigma^2} = \frac{1-p}{p}$$
or
$$(A-G_2)^2 - (A-G_1)^2 = 2\sigma^2\ln\frac{1-p}{p} \quad\Longrightarrow\quad 2A(G_1-G_2) - \left(G_1^2 - G_2^2\right) = 2\sigma^2\ln\frac{1-p}{p}.$$

9 The optimal value of $A$ can now be found by solving the following equation for $A$:
$$A(G_1-G_2) - \frac{(G_1-G_2)(G_1+G_2)}{2} = \sigma^2\ln\frac{1-p}{p}.$$
Namely,
$$A_{\text{opt}} = \frac{G_1+G_2}{2} + \frac{\sigma^2}{G_1-G_2}\ln\frac{1-p}{p}.$$
If both symbols are transmitted with equal probability, i.e. $p = 1/2$, then $A_{\text{opt}} = \frac{G_1+G_2}{2}$, namely, the arithmetic average of the means at the output of $h(t)$ at $t = nT$. When the two symbols are not transmitted with equal probability, the optimal threshold $A_{\text{opt}}$ will shift to the right or to the left, depending on which symbol occurs with higher probability.
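A small sanity check (a sketch under the same made-up example values used earlier): computing $A_{\text{opt}}$ from the closed form above and confirming by grid search that it indeed minimizes the error probability.

```python
# Sketch: A_opt = (G1+G2)/2 + sigma^2/(G1-G2)*ln((1-p)/p); a grid search
# over candidate thresholds confirms that it minimizes the BER.
from math import erf, log, sqrt

def gauss_cdf(x, mean, sigma):
    return 0.5 * (1.0 + erf((x - mean) / (sigma * sqrt(2.0))))

G1, G2, sigma, p = 1.0, -1.0, 0.5, 0.7   # "1" more likely than "0"

def ber(A):
    return p * gauss_cdf(A, G1, sigma) + (1 - p) * (1 - gauss_cdf(A, G2, sigma))

A_opt = (G1 + G2) / 2 + sigma**2 / (G1 - G2) * log((1 - p) / p)
grid = [-2.0 + 0.001 * k for k in range(4001)]
assert abs(min(grid, key=ber) - A_opt) < 1e-3
print(A_opt)   # negative: the threshold shifts toward the less likely symbol
```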

10 In the case of equal probability of occurrence,
$$P\{\text{bit error}\} = \frac{1}{2}\int_{-\infty}^{A_{\text{opt}}} f(y(nT) \mid \text{"1" sent})\,dy + \frac{1}{2}\int_{A_{\text{opt}}}^{\infty} f(y(nT) \mid \text{"0" sent})\,dy$$
$$= \frac{1}{2}\frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{(G_1+G_2)/2} e^{-(y-G_1)^2/2\sigma^2}\,dy + \frac{1}{2}\frac{1}{\sqrt{2\pi\sigma^2}}\int_{(G_1+G_2)/2}^{\infty} e^{-(y-G_2)^2/2\sigma^2}\,dy.$$
Substituting $u = \dfrac{y - G_1}{\sigma}$ and $v = \dfrac{y - G_2}{\sigma}$,
$$P\{\text{bit error}\} = \frac{1}{2}\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{-\frac{G_1-G_2}{2\sigma}} e^{-u^2/2}\,du + \frac{1}{2}\frac{1}{\sqrt{2\pi}}\int_{\frac{G_1-G_2}{2\sigma}}^{\infty} e^{-v^2/2}\,dv = Q\!\left(\frac{G_1-G_2}{2\sigma}\right),$$
where
$$Q(x) \triangleq \frac{1}{\sqrt{2\pi}}\int_{x}^{\infty} e^{-u^2/2}\,du$$
is the area under the tail of the Gaussian pdf with zero mean and unit variance, i.e., the area under the tail of
$$f_X(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}, \quad -\infty < x < \infty.$$
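The $Q$ function itself is available through the complementary error function via the identity $Q(x) = \frac{1}{2}\,\text{erfc}(x/\sqrt{2})$ (the same identity is used again later in the notes); a minimal sketch:

```python
# Sketch: the Q function expressed through erfc, Q(x) = 0.5*erfc(x/sqrt(2)).
from math import erfc, sqrt

def Q(x):
    """Area under the tail of the zero-mean, unit-variance Gaussian pdf."""
    return 0.5 * erfc(x / sqrt(2.0))

# With the earlier example values G1 = 1, G2 = -1, sigma = 0.5:
print(Q((1.0 - (-1.0)) / (2 * 0.5)))   # Q(2) ~ 0.0228
```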

11 It should be clear from the previous derivation that both $G_1$ and $G_2$ depend on $h(t)$, the impulse response of the receiver filter. Also, $Q\!\left(\frac{G_1-G_2}{2\sigma}\right)$ decreases as $\frac{G_1-G_2}{\sigma}$ increases, i.e., the average probability of error, $P\{\text{bit error}\}$, decreases as the separation between $G_1$ and $G_2$ increases.

Let's now find the $h(t)$ that will result in the minimum probability of bit error. To do this, consider the following optimization problem: Maximize over all possible $h(t)$ the square of the argument of the $Q$ function, i.e.
$$\max_{h(t)} \frac{(G_1-G_2)^2}{\sigma^2}.$$

12 $$\max_{h(t)} \frac{(G_1-G_2)^2}{\sigma^2} = \max_{h(t)} \frac{\left[\int h(\tau)g_1(nT-\tau)\,d\tau - \int h(\tau)g_2(nT-\tau)\,d\tau\right]^2}{\frac{N_0}{2}\int |H(f)|^2\,df}$$
$$= \max_{h(t)} \frac{\left[\int h(\tau)\left(g_1(nT-\tau) - g_2(nT-\tau)\right)d\tau\right]^2}{\frac{N_0}{2}\int |H(f)|^2\,df} = \max_{h(t)} \frac{\left(h(t) * \left[g_1(t) - g_2(t)\right]\right)^2\big|_{t=nT}}{\frac{N_0}{2}\int |H(f)|^2\,df}.$$

13 But,
$$\left(h(t) * \left[g_1(t) - g_2(t)\right]\right)\big|_{t=nT} = \int H(f)\left[G_1(f) - G_2(f)\right]e^{j2\pi f nT}\,df.$$
Using the Schwarz inequality, we get
$$\left(h(t) * \left[g_1(t) - g_2(t)\right]\right)^2\big|_{t=nT} = \left|\int H(f)\left[G_1(f) - G_2(f)\right]e^{j2\pi f nT}\,df\right|^2 \le \int \left|G_1(f) - G_2(f)\right|^2 df \int \left|H(f)\right|^2 df.$$
Equality occurs whenever
$$H(f) = K\left(\left[G_1(f) - G_2(f)\right]e^{j2\pi f nT}\right)^* = K\left[G_1^*(f) - G_2^*(f)\right]e^{-j2\pi f nT}.$$
For arbitrary $K$, we get
$$\max_{h(t)} \frac{(G_1-G_2)^2}{\sigma^2} = \frac{2}{N_0}\int \left|G_1(f) - G_2(f)\right|^2 df.$$

14 For $K = 1$,
$$h_{\text{opt}}(t) = \int \left[G_1^*(f) - G_2^*(f)\right]e^{-j2\pi f nT}e^{j2\pi f t}\,df = \int \left(\left[G_1(f) - G_2(f)\right]e^{j2\pi f(nT-t)}\right)^* df$$
$$= \left(g_1(nT-t) - g_2(nT-t)\right)^* = g_1(nT-t) - g_2(nT-t), \quad \text{for real } g_1(t),\ g_2(t).$$
Assuming $p = 1/2$, $P\{\text{error}\}$ is minimum when $\frac{(G_1-G_2)^2}{\sigma^2}$ is maximum, or when $h(t) = g_1(nT-t) - g_2(nT-t)$, i.e. $h(t)$ matches the input pulses $g_1(t)$ and $g_2(t)$.

15 By Parseval's theorem,
$$\max_{h(t)} \frac{(G_1-G_2)^2}{\sigma^2} = \frac{2}{N_0}\int \left|G_1(f) - G_2(f)\right|^2 df = \frac{2}{N_0}\int \left[g_1(t) - g_2(t)\right]^2 dt,$$
which implies that
$$\left(\frac{G_1-G_2}{\sigma}\right)_{\max} = \left(\frac{2}{N_0}\int \left[g_1(t) - g_2(t)\right]^2 dt\right)^{1/2} = \left(\frac{2}{N_0}\int \left|G_1(f) - G_2(f)\right|^2 df\right)^{1/2}$$
and
$$P\{\text{error}\}_{\min} = Q\!\left(\sqrt{\frac{1}{2N_0}\int \left[g_1(t) - g_2(t)\right]^2 dt}\right) = Q\!\left(\sqrt{\frac{1}{2N_0}\int \left|G_1(f) - G_2(f)\right|^2 df}\right).$$

16 Example: Compute the minimum error probability (BER) for the following on-off keying (transmit a pulse when a logical 1 occurs and transmit nothing when a logical 0 occurs):

[Figure: a transmitted waveform consisting of triangular pulses of amplitude $A$ and duration $T$, peaking at the middle of the bit slot, shown in the bit slots where a logical 1 occurs over $(0, 4T)$.]

Here,
$$g_1(t) = A\,\text{tri}\!\left(\frac{t - T/2}{T/2}\right), \quad 0 \le t \le T, \qquad \text{and} \qquad g_2(t) = 0, \quad 0 \le t \le T.$$

17 Now,
$$\int \left[g_1(t) - g_2(t)\right]^2 dt = \int_0^T A^2\,\text{tri}^2\!\left(\frac{t - T/2}{T/2}\right) dt = 2\int_0^{T/2} \left(\frac{2At}{T}\right)^2 dt = \frac{8A^2}{T^2}\cdot\frac{1}{3}\left(\frac{T}{2}\right)^3 = \frac{A^2 T}{3}$$
and the minimum bit error rate (BER) is equal to
$$P\{\text{error occurs}\}_{\min} = Q\!\left(\sqrt{\frac{A^2 T}{6N_0}}\right).$$
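A quick numerical check of the energy computation (a sketch; the values of $A$ and $T$ are arbitrary examples): a Riemann sum over the triangular pulse reproduces $A^2T/3$.

```python
# Sketch: verify numerically that the triangular OOK pulse has energy A^2*T/3.
A, T, n = 2.0, 1e-3, 100_000
dt = T / n
# Triangular pulse: rises linearly to A at t = T/2, falls back to 0 at t = T.
g1 = [A * (1 - abs(2 * (k + 0.5) * dt / T - 1)) for k in range(n)]
energy = sum(x * x for x in g1) * dt
print(energy, A**2 * T / 3)   # the two values agree closely
```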

18 Assuming the bit sequence is random, then the average bit energy is given by
$$E_{b,\text{av}} = \frac{1}{2}E_{\text{"1"}} + \frac{1}{2}E_{\text{"0"}} = \frac{1}{2}\cdot\frac{A^2 T}{3} + 0 = \frac{A^2 T}{6}$$
and the BER can be rewritten as
$$\text{BER}_{\min} = Q\!\left(\sqrt{\frac{E_{b,\text{av}}}{N_0}}\right).$$
The following plot shows the performance of the previous communication system in AWGN (this is the same as on-off keying or OOK).

[Figure: BER versus SNR in dB, showing the system performance of OOK in AWGN.]
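The curve in that plot can be reproduced by tabulating $\text{BER}_{\min} = Q(\sqrt{E_{b,\text{av}}/N_0})$ against $E_b/N_0$ in dB; a minimal sketch:

```python
# Sketch: tabulate BER_min = Q(sqrt(Eb_av/N0)) for OOK versus Eb/N0 in dB.
from math import erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2.0))

for snr_db in range(0, 14, 2):
    ebn0 = 10.0 ** (snr_db / 10.0)        # Eb_av/N0 as a linear ratio
    print(snr_db, "dB :", Q(sqrt(ebn0)))
```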

19 Baseband Signal-Space Analysis

Our goal now is to formulate the different detection strategies in a more intuitive fashion. We do this by giving the modulated signals a geometric interpretation.

Let $S$ be an $N$-dimensional signal space and let $\{\phi_i(t)\}_{i=1}^{N}$ be a basis for this space. Suppose further that the basis functions are orthonormal, i.e.,
$$\int_{t_0}^{t_0+T_s} \phi_i(t)\phi_j(t)\,dt = \begin{cases} 1, & i = j \\ 0, & i \ne j \end{cases}$$
where $T_s$ is a time interval yet to be determined; then $\{\phi_i(t)\}_{i=1}^{N}$ is an orthonormal set. Let $s(t) = \sum_{j=1}^{N} s_j\phi_j(t)$, i.e., $s(t) \in S$.

20 Now, for $i = 1, \ldots, N$,
$$\int_{t_0}^{t_0+T_s} s(t)\phi_i(t)\,dt = \int_{t_0}^{t_0+T_s} \sum_{j=1}^{N} s_j\phi_j(t)\phi_i(t)\,dt = \sum_{j=1}^{N} s_j\int_{t_0}^{t_0+T_s} \phi_j(t)\phi_i(t)\,dt = s_i, \quad i = 1, \ldots, N.$$
Furthermore, the energy of $s(t)$ is given by
$$E = \int_{t_0}^{t_0+T_s} \left|s(t)\right|^2 dt = \int_{t_0}^{t_0+T_s} s(t)s^*(t)\,dt = \int_{t_0}^{t_0+T_s} \sum_{i=1}^{N} s_i\phi_i(t)\sum_{j=1}^{N} s_j^*\phi_j(t)\,dt$$
$$= \sum_{i=1}^{N}\sum_{j=1}^{N} s_i s_j^*\int_{t_0}^{t_0+T_s} \phi_i(t)\phi_j(t)\,dt = \sum_{i=1}^{N} s_i s_i^* = \sum_{i=1}^{N} \left|s_i\right|^2 = \sum_{i=1}^{N} s_i^2.$$

21 Let the coefficients $s_i$, $i = 1, \ldots, N$, be expressed as a vector $\mathbf{s} = [s_1 \;\cdots\; s_N]^T$; then
$$E = \mathbf{s}^T\mathbf{s} = [s_1 \;\cdots\; s_N]\begin{bmatrix} s_1 \\ \vdots \\ s_N \end{bmatrix} = \sum_{i=1}^{N} s_i^2 = \left\|\mathbf{s}\right\|^2,$$
i.e., $E$ is the inner (dot) product of $\mathbf{s}$ with itself.

Let $\{s_1(t), \ldots, s_M(t)\}$ be a set of signals we want to use in a communication system. If this set is defined on the interval $(t_0, t_0 + T_s)$, where $T_s$ is the maximum signal duration, then an orthonormal basis can be constructed as follows:

1. Let $g_1(t) = s_1(t)$ and $\phi_1(t) = \dfrac{g_1(t)}{\|g_1(t)\|} = \dfrac{g_1(t)}{\sqrt{E_1}} \;\Rightarrow\; s_1(t) = \sqrt{E_1}\,\phi_1(t)$, where
$$\left\|g_1(t)\right\|^2 = \int_{t_0}^{t_0+T_s} g_1^2(t)\,dt = E_1.$$

22 2. Let $g_2(t) = s_2(t) - \langle s_2(t), \phi_1(t)\rangle\,\phi_1(t)$ and $\phi_2(t) = \dfrac{g_2(t)}{\|g_2(t)\|}$, where
$$\langle u(t), v(t)\rangle \triangleq \int_{t_0}^{t_0+T_s} u(t)v^*(t)\,dt.$$
3. Let $g_3(t) = s_3(t) - \langle s_3(t), \phi_1(t)\rangle\,\phi_1(t) - \langle s_3(t), \phi_2(t)\rangle\,\phi_2(t)$, and $\phi_3(t) = \dfrac{g_3(t)}{\|g_3(t)\|}$.

k. Let $g_k(t) = s_k(t) - \sum_{i=1}^{k-1}\langle s_k(t), \phi_i(t)\rangle\,\phi_i(t)$ and $\phi_k(t) = \dfrac{g_k(t)}{\|g_k(t)\|}$, $k \le M$.

Note that $\langle s_k(t), \phi_i(t)\rangle$ can be interpreted as the projection of $s_k(t)$ onto $\phi_i(t)$. The set of basis functions $\{\phi_i(t)\}_{i=1}^{N}$ forms an orthonormal set. The procedure outlined above is known as the Gram-Schmidt orthogonalization procedure.

23 Remark 1: The set of signals $\{s_i(t)\}_{i=1}^{M}$ is a linearly independent set iff $N = M$.

Remark 2: The signals $s_1(t), \ldots, s_M(t)$ are not linearly independent if $N < M$ and $g_k(t) = 0$ for some $k \le M$.

Example: Consider the signals $s_i(t)$, $i = 1, 2, 3, 4$ ($M = 4$), described by

[Figure: four rectangular waveforms of unit amplitude: $s_1(t)$ on $[0,1]$, $s_2(t)$ on $[0,2]$, $s_3(t)$ on $[1,3]$, and $s_4(t)$ on $[0,3]$.]

In this case $T_s = 3$ seconds. Let us now construct an orthonormal basis for this set of signals.

1. $g_1(t) = s_1(t)$.
$$\left\|g_1(t)\right\|^2 = \int_0^3 g_1^2(t)\,dt = \int_0^1 dt = 1 = E_1$$
$$\phi_1(t) = \frac{g_1(t)}{\|g_1(t)\|} = \frac{g_1(t)}{\sqrt{E_1}} = g_1(t) = s_1(t) \;\Rightarrow\; s_1(t) = \phi_1(t)$$

24 2. $g_2(t) = s_2(t) - \langle s_2(t), \phi_1(t)\rangle\,\phi_1(t)$.
$$\langle s_2(t), \phi_1(t)\rangle = \int_0^3 s_2(t)\phi_1(t)\,dt = \int_0^1 dt = 1 \;\Rightarrow\; g_2(t) = s_2(t) - \phi_1(t).$$
Now,
$$\left\|g_2(t)\right\|^2 = \int_0^3 g_2^2(t)\,dt = \int_0^3 \left[s_2(t) - s_1(t)\right]^2 dt = \int_1^2 dt = 1 = E_2$$
or
$$\phi_2(t) = \frac{g_2(t)}{\|g_2(t)\|} = s_2(t) - \phi_1(t) \;\Rightarrow\; s_2(t) = \phi_1(t) + \phi_2(t).$$

25 3. $g_3(t) = s_3(t) - \langle s_3(t), \phi_1(t)\rangle\,\phi_1(t) - \langle s_3(t), \phi_2(t)\rangle\,\phi_2(t)$.
$$\langle s_3(t), \phi_1(t)\rangle = \int_0^3 s_3(t)\phi_1(t)\,dt = \int_0^3 s_3(t)s_1(t)\,dt = 0$$
$$\langle s_3(t), \phi_2(t)\rangle = \int_0^3 s_3(t)\phi_2(t)\,dt = \int_0^3 s_3(t)\left[s_2(t) - s_1(t)\right]dt = \int_1^2 dt = 1$$
$$g_3(t) = s_3(t) - \phi_2(t) = s_3(t) - s_2(t) + \phi_1(t) = s_3(t) - s_2(t) + s_1(t)$$
$$\left\|g_3(t)\right\|^2 = \int_0^3 g_3^2(t)\,dt = \int_0^3 \left[s_3(t) - s_2(t) + s_1(t)\right]^2 dt = \int_2^3 dt = 1 = E_3$$
$$\phi_3(t) = \frac{g_3(t)}{\|g_3(t)\|} = s_3(t) - \phi_2(t) \;\Rightarrow\; s_3(t) = \phi_2(t) + \phi_3(t)$$
4. $g_4(t) = s_4(t) - \langle s_4(t), \phi_1(t)\rangle\,\phi_1(t) - \langle s_4(t), \phi_2(t)\rangle\,\phi_2(t) - \langle s_4(t), \phi_3(t)\rangle\,\phi_3(t)$.
$$\langle s_4(t), \phi_1(t)\rangle = \int_0^3 s_4(t)\phi_1(t)\,dt = \int_0^1 dt = 1$$
$$\langle s_4(t), \phi_2(t)\rangle = \int_0^3 s_4(t)\phi_2(t)\,dt = \int_1^2 dt = 1$$

26 $$\langle s_4(t), \phi_3(t)\rangle = \int_0^3 s_4(t)\phi_3(t)\,dt = \int_2^3 dt = 1$$
$$g_4(t) = s_4(t) - \left[\phi_1(t) + \phi_2(t) + \phi_3(t)\right] = 0 \;\Rightarrow\; s_4(t) = \phi_1(t) + \phi_2(t) + \phi_3(t).$$
But $\phi_i(t)$, $i = 1, 2, 3$, are described by

[Figure: the basis functions: $\phi_1(t)$ is a unit rectangle on $[0,1]$, $\phi_2(t)$ on $[1,2]$, and $\phi_3(t)$ on $[2,3]$.]

Hence,
$$s_1(t) = \phi_1(t), \qquad s_2(t) = \phi_1(t) + \phi_2(t), \qquad s_3(t) = \phi_2(t) + \phi_3(t), \qquad s_4(t) = \phi_1(t) + \phi_2(t) + \phi_3(t).$$
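The whole example can be checked in discrete time (a sketch, assuming the four rectangular signals described above, sampled on $[0,3]$): Gram-Schmidt recovers exactly three basis functions because $g_4(t) = 0$.

```python
# Sketch: discrete-time Gram-Schmidt on the four example signals
# (unit-amplitude rectangles on [0,1], [0,2], [1,3], [0,3]).
import numpy as np

n = 3000                               # samples over Ts = 3 s
dt = 3.0 / n
t = (np.arange(n) + 0.5) * dt

s = np.array([
    (t < 1).astype(float),              # s1(t)
    (t < 2).astype(float),              # s2(t)
    ((t >= 1) & (t < 3)).astype(float), # s3(t)
    (t < 3).astype(float),              # s4(t)
])

basis = []
for sk in s:
    # g_k = s_k minus its projections onto the basis found so far
    g = sk - sum(np.dot(sk, phi) * dt * phi for phi in basis)
    energy = np.dot(g, g) * dt
    if energy > 1e-9:                  # g4 = 0, so s4 adds no new dimension
        basis.append(g / np.sqrt(energy))

print(len(basis))                      # 3 basis functions (N = 3 < M = 4)
```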

27 [Figure: the signal constellation in the three-dimensional space spanned by $\phi_1$, $\phi_2$, $\phi_3$, with signal vectors $\mathbf{s}_1 = (1,0,0)$, $\mathbf{s}_2 = (1,1,0)$, $\mathbf{s}_3 = (0,1,1)$, and $\mathbf{s}_4 = (1,1,1)$.]

Clearly, $\{s_i(t)\}_{i=1}^{4}$ is defined on the 3-dimensional Euclidean space represented by the coordinates $\phi_1$, $\phi_2$ and $\phi_3$.

Let the signal arriving at the receiver be described by
$$x(t) = s_i(t) + W(t), \quad i = 1, \ldots, M,$$
where $s_i(t)$ is the transmitted signal and $W(t)$ is WGN with zero mean and power spectral density $S_W(f) = N_0/2$ Watts/Hz for all $f$. Let $\{\phi_j(t)\}_{j=1}^{N}$ be an orthonormal basis for the signal space $S$, i.e.,
$$s_i(t) = \sum_{j=1}^{N} s_{ij}\phi_j(t), \quad i = 1, \ldots, M.$$

28 Consider a coherent correlator receiver and the observed output at the $k$-th correlator, i.e.,

[Figure: coherent correlator receiver. The received signal $x(t)$ is multiplied by each basis function $\phi_1(t), \phi_2(t), \ldots, \phi_N(t)$; each product is integrated over $(0, T_s)$ and sampled at $t = T_s$, producing the outputs $X_1, X_2, \ldots, X_N$.]

29 Let $s_i(t)$, or symbol $m_i$, be transmitted through the channel; then the output of the $k$-th correlator is given by
$$X_k \mid m_i = \int_0^{T_s} x(t)\phi_k(t)\,dt = \int_0^{T_s} \left[s_i(t) + W(t)\right]\phi_k(t)\,dt = \int_0^{T_s} \left[\sum_{j=1}^{N} s_{ij}\phi_j(t) + W(t)\right]\phi_k(t)\,dt$$
$$= \sum_{j=1}^{N} s_{ij}\int_0^{T_s} \phi_j(t)\phi_k(t)\,dt + \int_0^{T_s} W(t)\phi_k(t)\,dt = s_{ik} + W_k, \quad k = 1, \ldots, N,$$
where
$$s_{ik} = \int_0^{T_s} s_i(t)\phi_k(t)\,dt \qquad \text{and} \qquad W_k = \int_0^{T_s} W(t)\phi_k(t)\,dt.$$
Define a new r.p. $x'(t)$ by
$$x'(t) \triangleq x(t) - \sum_{k=1}^{N} X_k\phi_k(t) = x(t) - \mathbf{X}^T\boldsymbol{\Phi}(t),$$
where $\mathbf{X} = [X_1 \;\cdots\; X_N]^T$ is the projection of $x(t)$ onto the signal space $S$ and $\boldsymbol{\Phi}(t) = [\phi_1(t) \;\cdots\; \phi_N(t)]^T$.
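As an illustration (a sketch built on the basis functions of the earlier example; the per-sample noise scaling below is the standard discrete-time approximation of psd $N_0/2$), the correlator outputs come out as $X_k = s_{ik} + W_k$ with $W_k \sim \mathcal{N}(0, N_0/2)$:

```python
# Sketch: simulate the correlator bank X_k = s_ik + W_k for s4(t) of the
# earlier example; the noise projections W_k are i.i.d. N(0, N0/2).
import numpy as np

rng = np.random.default_rng(0)
N0 = 2.0
n, Ts = 3000, 3.0
dt = Ts / n
t = (np.arange(n) + 0.5) * dt

# phi_k: unit rectangles on [0,1), [1,2), [2,3) (the basis found above).
phi = np.array([((t >= k) & (t < k + 1)) for k in range(3)], dtype=float)

# Sampled white noise with variance N0/(2*dt) approximates psd N0/2.
w = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=n)

x = phi.sum(axis=0) + w        # x(t) = s4(t) + W(t), s4 = phi1 + phi2 + phi3
X = phi @ x * dt               # correlator outputs X_k = s_4k + W_k
print(X)                       # near [1, 1, 1], perturbed by N(0, N0/2) noise
```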

30 Hence,
$$x'(t) = s_i(t) + W(t) - \sum_{k=1}^{N} X_k\phi_k(t) = s_i(t) + W(t) - \sum_{k=1}^{N} \left(s_{ik} + W_k\right)\phi_k(t)$$
$$= \sum_{k=1}^{N} s_{ik}\phi_k(t) + W(t) - \sum_{k=1}^{N} s_{ik}\phi_k(t) - \sum_{k=1}^{N} W_k\phi_k(t) = W(t) - \sum_{k=1}^{N} W_k\phi_k(t) = W(t) - \boldsymbol{\Phi}^T(t)\mathbf{W} \triangleq W'(t),$$
where $\mathbf{W} = [W_1 \;\cdots\; W_N]^T$, $\mathbf{W}$ is the projection of the noise onto the space $S$ (signal space), and $W'(t)$ is the part of the noise $W(t)$ that does not lie on the signal space $S$. Therefore,
$$x(t) = \sum_{k=1}^{N} X_k\phi_k(t) + W'(t),$$
which means that we must only worry about the part of the noise that lies on the signal space; namely, the part of the noise which is not in the signal space does not affect the output of the correlators.

31 Define $W(t) \triangleq W_r(t) + W_p(t)$, where $W_p(t) \triangleq \sum_{k=1}^{N} W_k\phi_k(t)$. Then
$$x(t) = s_i(t) + W_p(t) + W_r(t), \qquad \text{where } W_r(t) = W'(t).$$
Now, if $x(t)$ is a Gaussian r.p., then $X_k \mid m_i$ is a Gaussian r.v. with mean
$$E\{X_k \mid m_i\} = E\left\{\int_0^{T_s} x(t)\phi_k(t)\,dt \,\Big|\, m_i\right\} = E\left\{\int_0^{T_s} \left[\sum_{j=1}^{N} s_{ij}\phi_j(t) + W(t)\right]\phi_k(t)\,dt \,\Big|\, m_i\right\}$$
$$= \sum_{j=1}^{N} s_{ij}\int_0^{T_s} \phi_j(t)\phi_k(t)\,dt + E\left\{\int_0^{T_s} W(t)\phi_k(t)\,dt\right\} = s_{ik}$$
and variance

32 $$\sigma_{X_k \mid m_i}^2 = E\left\{\left(X_k - s_{ik}\right)^2 \,\big|\, m_i\right\} = E\left\{W_k^2 \,\big|\, m_i\right\} = E\left\{\int_0^{T_s}\!\!\int_0^{T_s} W(t)W(\tau)\phi_k(\tau)\phi_k(t)\,dt\,d\tau\right\}$$
$$= \int_0^{T_s}\!\!\int_0^{T_s} E\{W(t)W(\tau)\}\,\phi_k(t)\phi_k(\tau)\,dt\,d\tau = \frac{N_0}{2}\int_0^{T_s}\!\!\int_0^{T_s} \delta(t-\tau)\,\phi_k(t)\phi_k(\tau)\,dt\,d\tau$$
$$= \frac{N_0}{2}\int_0^{T_s} \phi_k(\tau)\phi_k(\tau)\,d\tau = \frac{N_0}{2}, \quad k = 1, \ldots, N.$$
Also, for $j \ne k$,
$$E\left\{\left(X_j - s_{ij}\right)\left(X_k - s_{ik}\right)^* \,\big|\, m_i\right\} = E\left\{W_j W_k^* \,\big|\, m_i\right\} = \int_0^{T_s}\!\!\int_0^{T_s} E\{W(t)W^*(\tau)\}\,\phi_j(t)\phi_k(\tau)\,dt\,d\tau$$
$$= \frac{N_0}{2}\int_0^{T_s}\!\!\int_0^{T_s} \delta(t-\tau)\,\phi_j(t)\phi_k(\tau)\,dt\,d\tau = \frac{N_0}{2}\int_0^{T_s} \phi_j(\tau)\phi_k(\tau)\,d\tau = 0.$$

33 This means that the $X_k$'s are mutually uncorrelated $\Rightarrow$ the $X_k$'s are statistically independent, because they are Gaussian. Hence, the joint density function of $\mathbf{X} = [X_1 \;\cdots\; X_N]^T$, given that message $m_i$ has been transmitted, is given by
$$f_{\mathbf{X}}(\mathbf{x} \mid m_i) = f_{\mathbf{X}}(x_1, \ldots, x_N \mid m_i) = \prod_{k=1}^{N} f_{X_k}(x_k \mid m_i) = \prod_{k=1}^{N} \frac{1}{\sqrt{\pi N_0}}\,e^{-(x_k - s_{ik})^2/N_0} = \frac{1}{\left(\pi N_0\right)^{N/2}}\,e^{-\sum_{k=1}^{N}(x_k - s_{ik})^2/N_0}, \quad i = 1, \ldots, M.$$
Define the Euclidean distance between vectors $\mathbf{u}$ and $\mathbf{v}$ by
$$\left\|\mathbf{u} - \mathbf{v}\right\| = \left[\left(u_1 - v_1\right)^2 + \cdots + \left(u_N - v_N\right)^2\right]^{1/2},$$
where $\mathbf{u} = [u_1 \;\cdots\; u_N]^T$ and $\mathbf{v} = [v_1 \;\cdots\; v_N]^T$. Then
$$f_{\mathbf{X}}(\mathbf{x} \mid m_i) = \frac{1}{\left(\pi N_0\right)^{N/2}}\exp\!\left(-\frac{\left\|\mathbf{x} - \mathbf{s}_i\right\|^2}{N_0}\right), \quad i = 1, \ldots, M,$$
where $\mathbf{s}_i = [s_{i1} \;\cdots\; s_{iN}]^T$.

34 Let $\mathbf{s}_i$ be transmitted and $\mathbf{X}$ be the observation vector of the sampled values of the correlators; then
$$\mathbf{X} = [X_1 \;\cdots\; X_N]^T = \mathbf{s}_i + \mathbf{W}, \qquad \mathbf{W} = [W_1 \;\cdots\; W_N]^T.$$
Decision strategy: Given the observation vector $\mathbf{X} = \mathbf{x}$, choose the symbol $\hat{m}$ so that the probability of making a decision error is minimum. Let symbol (signal) $m_i$ be sent through the channel and let $P_e(m_i \mid \mathbf{x})$ denote the conditional probability of making a decision error given that $\mathbf{x}$ is observed; then
$$P_e(m_i \mid \mathbf{x}) = P\{\text{select } \hat{m} \ne m_i \mid \mathbf{x}\} = 1 - P\{\text{select } m_i \mid \mathbf{x}\}.$$
Optimum Decision Rule: $P_e(m_i \mid \mathbf{x})$ is minimum whenever $P\{\text{select } m_i \mid \mathbf{x}\}$ is maximum. Equivalently, choose $m_i$ if
$$P\{\text{select } m_i \mid \mathbf{x}\} \ge P\{\text{select } m_k \mid \mathbf{x}\}, \quad k \ne i, \; k = 1, \ldots, M.$$
This is known as the Maximum A Posteriori (MAP) probability rule. Equivalently, applying Bayes' rule yields:

Choose symbol $m_i$ if $\dfrac{p_k\,f_{\mathbf{X}}(\mathbf{x} \mid m_k)}{f_{\mathbf{X}}(\mathbf{x})}$ is maximum for $k = i$, $k = 1, \ldots, M$,

35 where $p_k$ is the a priori probability of occurrence of the symbol $m_k$, $f_{\mathbf{X}}(\mathbf{x} \mid m_k)$ is the likelihood function that results when $m_k$ is transmitted, and $f_{\mathbf{X}}(\mathbf{x})$ is the unconditional p.d.f. of $\mathbf{X}$.

The equivalent rule comes from the fact that, in the limit, Bayes' rule, as applied to a continuous r.v., is given by
$$P\{A \mid \mathbf{X} = \mathbf{x}\} = \frac{f_{\mathbf{X}}(\mathbf{x} \mid A)\,P\{A\}}{f_{\mathbf{X}}(\mathbf{x})},$$
where $A$ is the event of selecting symbol $m_k$. The distribution of $\mathbf{X}$ is independent of the transmitted signal. Therefore, if $p_k = p$ for all $k$, i.e., all symbols are transmitted with equal probability, then the optimum decision rule can be stated as:

Choose $m_i$ if $f_{\mathbf{X}}(\mathbf{x} \mid m_k)$ is maximum for $k = i$, $k = 1, \ldots, M$.

This is the maximum likelihood decision rule and is based on Bayesian statistics.

36 Finally, since the likelihood function is non-negative because it is a probability density function, we can restate the optimum decision rule as: choose $m_i$ if $\ln f_{\mathbf{X}}(\mathbf{x} \mid m_k)$ is maximum for $k = i$, $k = 1, \ldots, M$, since $\ln(\cdot)$ is a monotonically increasing function of the argument.

Remark: The maximum likelihood decision rule differs from the MAP decision rule in that it assumes equally likely message symbols.

For an AWGN channel, the conditional pdf of the observation vector $\mathbf{x}$ given that symbol $m_k$ was transmitted is described by
$$f_{\mathbf{X}}(\mathbf{x} \mid m_k) = \frac{1}{\left(\pi N_0\right)^{N/2}}\,e^{-\|\mathbf{x} - \mathbf{s}_k\|^2/N_0}, \quad k = 1, \ldots, M.$$
Hence,
$$\ln f_{\mathbf{X}}(\mathbf{x} \mid m_k) = -\frac{N}{2}\ln\left(\pi N_0\right) - \frac{1}{N_0}\left\|\mathbf{x} - \mathbf{s}_k\right\|^2, \quad k = 1, \ldots, M.$$

37 But,
$$\max_k \ln f_{\mathbf{X}}(\mathbf{x} \mid m_k) \;\Leftrightarrow\; \max_k \left\{-\frac{1}{N_0}\left\|\mathbf{x} - \mathbf{s}_k\right\|^2\right\} = \min_k \left\{\frac{1}{N_0}\left\|\mathbf{x} - \mathbf{s}_k\right\|^2\right\} = \min_k \left\|\mathbf{x} - \mathbf{s}_k\right\|^2,$$
since multiplication by the positive constant $1/N_0$ does not change the location of the minimum.

Geometrically speaking, if we partition the $N$-dimensional signal space into $M$ regions, $R_1, \ldots, R_M$, then the decision rule can be reformulated as follows:

$\mathbf{X}$ lies inside $R_i$ if $\ln f_{\mathbf{X}}(\mathbf{x} \mid m_k)$ is maximum for $k = i$, $k = 1, \ldots, M$.

Therefore, $\mathbf{X}$ lies inside $R_i$ if the Euclidean distance $\|\mathbf{x} - \mathbf{s}_k\|$ is minimum for $k = i$, $k = 1, \ldots, M$, i.e., choose $m_i$ if the distance between $\mathbf{x}$ and $\mathbf{s}_i$ is minimum.
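The minimum-distance rule is essentially one line of code; a sketch using the signal vectors from the earlier four-signal example:

```python
# Sketch: the minimum-distance (ML) decision rule for an AWGN channel:
# choose m_i whose signal vector s_i is closest to the observation x.
import numpy as np

# Rows are s1..s4 in phi-coordinates (from the earlier example).
S = np.array([[1, 0, 0], [1, 1, 0], [0, 1, 1], [1, 1, 1]], dtype=float)

def decide(x):
    """Index i minimizing ||x - s_i||^2, i.e. maximizing ln f_X(x | m_i)."""
    return int(np.argmin(((S - x) ** 2).sum(axis=1)))

print(decide(np.array([0.9, 0.8, 0.2])))   # -> 1, i.e. s2 = (1, 1, 0)
```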

38 Error Performance of MAP Receivers

If symbol $m_i$ (signal vector $\mathbf{s}_i$) is transmitted and $\mathbf{x}$ does not lie in $R_i$, then an error occurs. Therefore, the average probability of symbol error (SER) is
$$P_e = \sum_{i=1}^{M} P\{\mathbf{X} \text{ does not lie in } R_i \text{ and } m_i \text{ was sent}\} = \sum_{i=1}^{M} P\{\mathbf{X} \text{ does not lie in } R_i \mid m_i \text{ was sent}\}\,P\{m_i \text{ was sent}\}.$$
If $P\{m_i \text{ was sent}\} = \dfrac{1}{M}$, $i = 1, \ldots, M$, then
$$P_e = \frac{1}{M}\sum_{i=1}^{M} P\{\mathbf{X} \text{ does not lie in } R_i \mid m_i \text{ sent}\}$$
or
$$P_e = 1 - \frac{1}{M}\sum_{i=1}^{M} P\{\mathbf{X} \text{ lies in } R_i \mid m_i \text{ sent}\}, \qquad \text{where } P\{\mathbf{X} \text{ lies in } R_i \mid m_i \text{ sent}\} = \int_{R_i} f_{\mathbf{X}}(\mathbf{x} \mid m_i)\,d\mathbf{x}.$$

39 Example: Let $m(t)$ be a binary signal transmitted over an AWGN channel. Let $m(t)$ be represented by a bipolar nonreturn-to-zero waveform with amplitude $A$. Then
$$s_1(t) = \begin{cases} A, & 0 \le t \le T_b \\ 0, & \text{otherwise} \end{cases} \qquad \text{and} \qquad s_2(t) = \begin{cases} -A, & 0 \le t \le T_b \\ 0, & \text{otherwise.} \end{cases}$$
Let us apply the Gram-Schmidt procedure. Let $g_1(t) = s_1(t)$; then
$$\left\|g_1(t)\right\| = \left(\int_0^{T_b} A^2\,dt\right)^{1/2} = \left(A^2 T_b\right)^{1/2} = A\sqrt{T_b}$$
$$\phi_1(t) = \frac{g_1(t)}{\|g_1(t)\|} = \begin{cases} \dfrac{1}{\sqrt{T_b}}, & 0 \le t \le T_b \\ 0, & \text{otherwise} \end{cases} \;\Rightarrow\; s_1(t) = A\sqrt{T_b}\,\phi_1(t).$$

40 Let $g_2(t) = s_2(t) - \langle s_2(t), \phi_1(t)\rangle\,\phi_1(t)$; then
$$\langle s_2(t), \phi_1(t)\rangle = \int_0^{T_b} s_2(t)\phi_1(t)\,dt = \int_0^{T_b} \frac{-A}{\sqrt{T_b}}\,dt = -A\sqrt{T_b}.$$
Hence,
$$g_2(t) = s_2(t) - \left(-A\sqrt{T_b}\right)\phi_1(t) = s_2(t) + A\sqrt{T_b}\,\phi_1(t) = s_2(t) + s_1(t) = 0.$$
Therefore,
$$s_2(t) = -s_1(t) = -A\sqrt{T_b}\,\phi_1(t).$$
This is called antipodal signaling (one binary symbol is represented by a signal which is the negative of the other). Furthermore, only one basis function is needed to represent the two binary signals. Consider the following correlator receiver:

[Figure: a single-branch correlator receiver: $x(t)$ is multiplied by $\phi_1(t)$, integrated over $(0, T_b)$, and sampled at $t = T_b$ to produce $X_1$.]

41 Then the signal constellation diagram and the observation space are shown in the figure below.

[Figure: signal constellation diagram and observation space on the $\phi_1$ ($X_1$) axis: $s_{11} = A\sqrt{T_b}$ and $s_{21} = -A\sqrt{T_b}$, separated by the distance $d = 2A\sqrt{T_b}$; $R_1$ is the right half-line and $R_2$ the left half-line.]

Clearly, $R_1 = \{x : x \ge 0\}$ and $R_2 = \{x : x < 0\}$. If the two symbols are equally likely to be transmitted, then the average probability of symbol error is given by
$$P_e = \frac{1}{2}\sum_{i=1}^{2} P\{\mathbf{X} \text{ does not lie in } R_i \mid m_i \text{ sent}\}.$$

42 The pdf of the observation $X$, given that $m_1$ was transmitted, is
$$f_{X \mid m_1}(x \mid m_1) = \frac{1}{\sqrt{\pi N_0}}\,e^{-(x - d/2)^2/N_0}, \qquad \frac{d}{2} = A\sqrt{T_b}.$$
Then the probability of a correct decision, given that $m_1$ was transmitted, is
$$P(X \in R_1 \mid m_1) = \int_0^{\infty} \frac{1}{\sqrt{\pi N_0}}\,e^{-(x - d/2)^2/N_0}\,dx.$$
Let
$$u = \frac{x - d/2}{\sqrt{N_0/2}}, \qquad du = \frac{dx}{\sqrt{N_0/2}}, \qquad dx = \sqrt{N_0/2}\,du,$$
and note that $x = 0$ corresponds to $u = -\dfrac{d/2}{\sqrt{N_0/2}} = -\dfrac{d}{\sqrt{2N_0}}$.

43 Hence,
$$P(X \in R_1 \mid m_1) = \frac{1}{\sqrt{2\pi}}\int_{-d/\sqrt{2N_0}}^{\infty} e^{-u^2/2}\,du = 1 - \frac{1}{\sqrt{2\pi}}\int_{d/\sqrt{2N_0}}^{\infty} e^{-u^2/2}\,du = 1 - Q\!\left(\frac{d}{\sqrt{2N_0}}\right)$$
and
$$P(X \notin R_1 \mid m_1) = Q\!\left(\frac{d}{\sqrt{2N_0}}\right) = P(X \notin R_2 \mid m_2).$$
Therefore, the average probability of a binary error (BER) is given by
$$P_e = \frac{1}{2}\,Q\!\left(\frac{d}{\sqrt{2N_0}}\right) + \frac{1}{2}\,Q\!\left(\frac{d}{\sqrt{2N_0}}\right) = Q\!\left(\frac{d}{\sqrt{2N_0}}\right),$$
where $d$ is the distance between signal points 1 and 2.
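A Monte Carlo sketch (with made-up values of $A$, $T_b$, and $N_0$) confirms $P_e = Q\!\left(d/\sqrt{2N_0}\right)$ for antipodal signaling:

```python
# Sketch: Monte Carlo check of P_e = Q(d/sqrt(2*N0)) for antipodal signaling.
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)
A, Tb, N0 = 1.0, 1.0, 1.0
d = 2 * A * sqrt(Tb)                        # distance between signal points

bits = rng.integers(0, 2, size=200_000)
s = np.where(bits == 1, A * sqrt(Tb), -A * sqrt(Tb))   # s_i1 coordinate
X = s + rng.normal(0.0, sqrt(N0 / 2), size=bits.size)  # correlator output
ber_sim = np.mean((X >= 0).astype(int) != bits)

ber_theory = 0.5 * erfc((d / sqrt(2 * N0)) / sqrt(2))  # Q(d/sqrt(2*N0))
print(ber_sim, ber_theory)                  # both ~0.079 for these values
```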

44 Effect of Rotation: Let $\mathbf{s}_{r,i} = \mathbf{Q}\mathbf{s}_i$ and $\mathbf{x}_r = \mathbf{Q}\mathbf{s}_i + \mathbf{W}$; then
$$\left\|\mathbf{x}_r - \mathbf{s}_{r,i}\right\| = \left\|\mathbf{Q}\mathbf{s}_i + \mathbf{W} - \mathbf{Q}\mathbf{s}_i\right\| = \left\|\mathbf{W}\right\| = \left\|\mathbf{x} - \mathbf{s}_i\right\|, \quad i = 1, 2:$$
the distance depends on the noise alone and $P_e$ is invariant to rotation!

Effect of Translation: Suppose now that $\mathbf{s}_{t,i} = \mathbf{s}_i - \mathbf{a}$, $i = 1, 2$, and $\mathbf{x}_t = \mathbf{x} - \mathbf{a}$; then
$$\left\|\mathbf{x}_t - \mathbf{s}_{t,i}\right\| = \left\|\mathbf{x} - \mathbf{a} - \mathbf{s}_i + \mathbf{a}\right\| = \left\|\mathbf{x} - \mathbf{s}_i\right\| = \left\|\mathbf{s}_i + \mathbf{w} - \mathbf{s}_i\right\| = \left\|\mathbf{w}\right\|:$$
the distance again depends on the noise alone and $P_e$ is invariant to translation.

Remark: Rotational invariance holds only when the rotation is caused by an orthonormal transformation matrix $\mathbf{Q}$, i.e., for $\mathbf{x}_r = \mathbf{Q}\mathbf{x}$ with $\mathbf{Q}\mathbf{Q}^T = \mathbf{I}$.

45 Let $P_e(m_i)$ be the conditional probability of symbol error when symbol $m_i$ is sent. Let $A_{ik}$, $k \ne i$, $k = 1, \ldots, M$, denote the event that the observation vector $\mathbf{x}$ is closer to the signal vector $\mathbf{s}_k$ than to $\mathbf{s}_i$, when $m_i$ ($\mathbf{s}_i$) is sent. Then
$$P_e(m_i) = P\left\{\bigcup_{k \ne i} A_{ik}\right\} \le \sum_{\substack{k=1 \\ k \ne i}}^{M} P\{A_{ik}\}, \quad i = 1, \ldots, M.$$
Equality holds only when the events $A_{ik}$ are mutually exclusive.

Note that $P\{A_{ik}\}$ is a pair-wise probability of a data transmission system that uses only a pair of signals, $\mathbf{s}_i$ and $\mathbf{s}_k$. This is different than $P\{\hat{m} = m_k \mid m_i\}$, the probability that the observation vector $\mathbf{x}$ is closer to the signal vector $\mathbf{s}_k$ than any other when $\mathbf{s}_i$ ($m_i$) is sent.

46 Example: A message source outputs 1 of 4 symbols every $T_s$ seconds with equal probability. The 4 symbols have the signal constellation and observation space shown in the figure below.

[Figure: four signal vectors $\mathbf{s}_1, \mathbf{s}_2, \mathbf{s}_3, \mathbf{s}_4$ on the $(\phi_1, \phi_2)$ plane (coordinates $X_1$, $X_2$), with the decision regions $R_1, R_2, R_3, R_4$.]

47 Suppose the observation vector $\mathbf{x}$ at the input of the decision device at the receiver lies in the following region:

[Figure: the observation $\mathbf{x}$ falling outside $R_1$, even though $\mathbf{s}_1$ was sent.]

Then
$$P_e(m_1) = P\{\mathbf{X} \text{ does not lie in } R_1 \mid m_1\}.$$

48 Now, the events $A_{12}$, $A_{13}$, and $A_{14}$ are equivalent to having $\mathbf{x}$ lie in each one of the following regions:

[Figure: two half-planes. For $A_{12}$, the half-plane on the $\mathbf{s}_2$ side of the perpendicular bisector of the segment joining $\mathbf{s}_1$ and $\mathbf{s}_2$ (at distance $d_{12}/2$ from each point); for $A_{13}$, the corresponding half-plane for the pair $\mathbf{s}_1$, $\mathbf{s}_3$ (at distance $d_{13}/2$).]

49 [Figure: the corresponding half-plane for $A_{14}$, on the $\mathbf{s}_4$ side of the perpendicular bisector of the segment joining $\mathbf{s}_1$ and $\mathbf{s}_4$ (at distance $d_{14}/2$).]

$$P\{A_{12}\} + P\{A_{13}\} + P\{A_{14}\} = P\{\mathbf{X} \text{ lies in } R_2' \mid m_1\} + P\{\mathbf{X} \text{ lies in } R_3' \mid m_1\} + P\{\mathbf{X} \text{ lies in } R_4' \mid m_1\}$$
$$\ge P\{\mathbf{X} \text{ does not lie in } R_1 \mid m_1\} = P_e(m_1).$$

50 Finally, $P\{\hat{m} = m_3 \mid m_1\} = P\{\mathbf{x} \text{ lies in } R_3 \mid m_1\}$, where $R_3$ is depicted below:

[Figure: the decision region $R_3$ of points closer to $\mathbf{s}_3$ than to any other signal vector, with the observation $\mathbf{x}$ shown inside it.]

51 We already know that in an AWGN channel an error is caused by the noise. Moreover, WGN is identically distributed along any set of orthogonal axes. According to these criteria, an error is made when $m_i$ is sent (vector $\mathbf{s}_i$) and $\mathbf{x}$ lies in the half-plane of $A_{ik}$, or
$$P\{A_{ik}\} = \int_{d_{ik}/2}^{\infty} \frac{1}{\sqrt{\pi N_0}}\,e^{-u^2/N_0}\,du = \frac{1}{\sqrt{2\pi}}\int_{d_{ik}/\sqrt{2N_0}}^{\infty} e^{-v^2/2}\,dv,$$
where $d_{ik} = \|\mathbf{s}_i - \mathbf{s}_k\|$ and $v = u/\sqrt{N_0/2}$. Using the definition of the $Q$ function and the error function complement, we get
$$P\{A_{ik}\} = Q\!\left(\frac{d_{ik}}{\sqrt{2N_0}}\right) = \frac{1}{2}\,\text{erfc}\!\left(\frac{d_{ik}}{2\sqrt{N_0}}\right),$$
since
$$\text{erfc}(z) \triangleq \frac{2}{\sqrt{\pi}}\int_{z}^{\infty} e^{-\lambda^2}\,d\lambda, \quad z \ge 0.$$

52 Thus,
$$P_e(m_i) \le \sum_{\substack{k=1 \\ k \ne i}}^{M} Q\!\left(\frac{d_{ik}}{\sqrt{2N_0}}\right), \quad i = 1, 2, \ldots, M,$$
and
$$P_e = \sum_{i=1}^{M} p_i\,P_e(m_i) \le \sum_{i=1}^{M} p_i \sum_{\substack{k=1 \\ k \ne i}}^{M} Q\!\left(\frac{d_{ik}}{\sqrt{2N_0}}\right), \qquad p_i = P\{m_i \text{ sent}\}.$$
Up to this point, our signal-space analysis has been carried out assuming a correlator receiver architecture, even though we already claimed that the optimum receiver for an AWGN channel uses a matched filter.
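Numerically, the union bound is a double loop over pairwise distances; a sketch for the four-signal constellation of the earlier example with an assumed noise level:

```python
# Sketch: union bound P_e(m_i) <= sum_{k != i} Q(d_ik/sqrt(2*N0)), averaged
# over equiprobable symbols, for the earlier 4-signal constellation.
import numpy as np
from math import erfc, sqrt

def Q(x):
    return 0.5 * erfc(x / sqrt(2.0))

S = np.array([[1, 0, 0], [1, 1, 0], [0, 1, 1], [1, 1, 1]], dtype=float)
N0 = 0.1                                   # assumed noise level
M = len(S)

bound = sum(Q(np.linalg.norm(S[i] - S[k]) / sqrt(2 * N0))
            for i in range(M) for k in range(M) if k != i) / M
print(bound)                               # upper bound on the average SER
```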

53 Consider the following matched filter detector for M-ary transmission over an AWGN channel.

[Figure: matched filter bank for M-ary signaling. The received signal $x(t) = s_i(t) + W(t)$ is applied to a bank of filters with impulse responses $\phi_1(T-t), \phi_2(T-t), \ldots, \phi_N(T-t)$; the outputs $y_1(t), \ldots, y_N(t)$ are sampled at $t = T$ to produce $X_1, \ldots, X_N$.]

54 Now,
$$y_k(t) = \int x(\tau)\,\phi_k\bigl(T - (t - \tau)\bigr)\,d\tau.$$
At $t = T$,
$$y_k(T) = \int x(\tau)\phi_k(\tau)\,d\tau.$$
Moreover, $\phi_k(t) = 0$ for $t \notin [0, T]$. Hence,
$$y_k(T) = \int_0^T x(\tau)\phi_k(\tau)\,d\tau = X_k, \quad k = 1, \ldots, N,$$
which is the same as the output of the $k$-th branch of the correlator receiver! $\Rightarrow$ the two receivers are equivalent. This means that we have the choice of using either a generalized correlator receiver or a bank of matched filters matched to each basis function of the set that describes the signal space. In our work, we shall mostly use correlator receiver implementations to analyze and assess communication system performance.
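The equivalence is easy to verify in discrete time (a sketch with an arbitrary waveform and a unit-energy rectangular basis function): the matched filter output sampled at $t = T$ equals the correlator integral.

```python
# Sketch: a filter matched to phi(t), sampled at t = T, reproduces the
# correlator output integral of x(t)*phi(t) over [0, T].
import numpy as np

rng = np.random.default_rng(2)
n, T = 1000, 1.0
dt = T / n

phi = np.ones(n) / np.sqrt(T)        # unit-energy rectangle on [0, T]
x = rng.normal(size=n)               # arbitrary received waveform samples

correlator = np.sum(x * phi) * dt    # integral of x(t)*phi(t) over [0, T]
h = phi[::-1]                        # matched filter h(t) = phi(T - t)
matched = np.convolve(x, h)[n - 1] * dt   # filter output sampled at t = T
print(np.isclose(correlator, matched))    # True
```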
