Chapter 7 Channel Capacity and Coding
Contents
7.1 Channel models and channel capacity
7.1.1 Channel models: binary symmetric channel; discrete memoryless channels; discrete-input, continuous-output channel; waveform channels
7.1.2 Channel capacity
7.1.1 Channel Models
Binary symmetric channel (BSC): If the channel noise and other disturbances cause statistically independent errors in the transmitted binary sequence with average probability p, then

P(Y = 0 | X = 1) = P(Y = 1 | X = 0) = p,
P(Y = 1 | X = 1) = P(Y = 0 | X = 0) = 1 - p.
Discrete memoryless channels (DMC): The BSC is a special case of a more general discrete-input, discrete-output channel. The output symbols from the channel encoder are q-ary, i.e., X = {x_0, x_1, ..., x_{q-1}}, and the output of the detector consists of Q-ary symbols, where Q ≥ q. If the channel and modulation are memoryless, we have a set of qQ conditional probabilities

P(Y = y_i | X = x_j) ≡ P(y_i | x_j),   i = 0, 1, ..., Q-1,  j = 0, 1, ..., q-1.

Such a channel is called a discrete memoryless channel (DMC).
For a DMC with input sequence u_1, u_2, ..., u_n and output sequence v_1, v_2, ..., v_n, the conditional probability is given by

P(Y_1 = v_1, Y_2 = v_2, ..., Y_n = v_n | X_1 = u_1, ..., X_n = u_n) = Π_{k=1}^{n} P(Y_k = v_k | X_k = u_k).

In general, the conditional probabilities P(y_i | x_j) can be arranged in the matrix form P = [p_{ji}], called the probability transition matrix.
[Figure: discrete q-ary input, Q-ary output channel]
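The product rule above can be sketched in a few lines of code. This is an illustrative sketch only: the 2-input, 3-output transition matrix below is hypothetical, chosen just to exercise the formula.

```python
# A sketch of the memoryless-channel product rule. The 2-input, 3-output
# transition matrix is hypothetical; rows are inputs x_j, columns are
# outputs y_i, entries P(y_i | x_j), so each row sums to 1.
P = [
    [0.80, 0.15, 0.05],  # P(y_i | x = 0)
    [0.05, 0.15, 0.80],  # P(y_i | x = 1)
]

def sequence_prob(inputs, outputs, P):
    """P(Y_1=v_1,...,Y_n=v_n | X_1=u_1,...,X_n=u_n) = prod_k P(v_k | u_k)."""
    prob = 1.0
    for u, v in zip(inputs, outputs):
        prob *= P[u][v]
    return prob

print(sequence_prob([0, 1, 0], [0, 2, 1], P))  # 0.80 * 0.80 * 0.15 = 0.096
```

Because the channel is memoryless, the probability of the whole sequence factors into per-symbol terms, which is exactly what the loop computes.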
Discrete-input, continuous-output channel: The input is drawn from the discrete alphabet X = {x_0, x_1, ..., x_{q-1}}, while the output of the detector is unquantized (Q = ∞). The most important channel of this type is the additive white Gaussian noise (AWGN) channel, for which

Y = X + G,

where G is a zero-mean Gaussian random variable with variance σ² and X = x_k, k = 0, 1, ..., q-1. Thus,

p(y | X = x_k) = (1/sqrt(2πσ²)) exp[-(y - x_k)²/(2σ²)].

For a memoryless channel,

p(y_1, y_2, ..., y_n | X_1 = u_1, X_2 = u_2, ..., X_n = u_n) = Π_{i=1}^{n} p(y_i | X_i = u_i).
Waveform channels: Assume that a channel has a given bandwidth W, with ideal frequency response C(f) = 1 within the bandwidth W, and that the signal at its output is corrupted by AWGN: y(t) = x(t) + n(t). Expand y(t), x(t), and n(t) into a complete set of orthonormal functions:

y(t) = Σ_i y_i f_i(t),   x(t) = Σ_i x_i f_i(t),   n(t) = Σ_i n_i f_i(t),

where

y_i = ∫_0^T y(t) f_i*(t) dt = ∫_0^T [x(t) + n(t)] f_i*(t) dt = x_i + n_i,
∫_0^T f_i*(t) f_j(t) dt = δ_{ij}.
Since y_i = x_i + n_i, it follows that

p(y_i | x_i) = (1/sqrt(2πσ²)) exp[-(y_i - x_i)²/(2σ²)],   i = 1, 2, ...

Since the functions {f_i(t)} are orthonormal, the {n_i} are uncorrelated; since they are Gaussian, they are also statistically independent:

p(y_1, y_2, ..., y_N | x_1, x_2, ..., x_N) = Π_{i=1}^{N} p(y_i | x_i).

Samples of x(t) and y(t) may be taken at the Nyquist rate of 2W samples per second. Thus, in a time interval of length T, there are N = 2WT samples.
7.1.2 Channel Capacity
Consider a DMC with input alphabet X = {x_0, x_1, ..., x_{q-1}}, output alphabet Y = {y_0, y_1, ..., y_{Q-1}}, and the set of transition probabilities P(y_i | x_j). The mutual information provided about the event X = x_j by the occurrence of the event Y = y_i is log[P(y_i | x_j)/P(y_i)], where

P(y_i) = P(Y = y_i) = Σ_{k=0}^{q-1} P(x_k) P(y_i | x_k).

Hence, the average mutual information provided by the output Y about the input X is

I(X; Y) = Σ_{j=0}^{q-1} Σ_{i=0}^{Q-1} P(x_j) P(y_i | x_j) log [P(y_i | x_j)/P(y_i)].
The value of I(X;Y) maximized over the set of input symbol probabilities P(x_j) is a quantity that depends only on the characteristics of the DMC through the conditional probabilities P(y_i | x_j). This quantity is called the capacity of the channel and is denoted by C:

C = max_{P(x_j)} I(X; Y) = max_{P(x_j)} Σ_{j=0}^{q-1} Σ_{i=0}^{Q-1} P(x_j) P(y_i | x_j) log [P(y_i | x_j)/P(y_i)].

The maximization of I(X;Y) is performed under the constraints P(x_j) ≥ 0 and Σ_{j=0}^{q-1} P(x_j) = 1.
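As a sketch of this maximization, the average mutual information can be computed directly from the transition probabilities, and for a binary-input channel the capacity can be found by brute-force search over P(x_0). The BSC with p = 0.1 used below is a hypothetical test channel, not one from the text.

```python
import math

def mutual_information(p_x, P):
    """I(X;Y) in bits for a DMC with input distribution p_x[j] = P(x_j)
    and transition matrix P[j][i] = P(y_i | x_j)."""
    Q = len(P[0])
    p_y = [sum(p_x[j] * P[j][i] for j in range(len(p_x))) for i in range(Q)]
    I = 0.0
    for j, pj in enumerate(p_x):
        for i in range(Q):
            if pj > 0 and P[j][i] > 0:
                I += pj * P[j][i] * math.log2(P[j][i] / p_y[i])
    return I

# Brute-force the maximization over P(x_0) = a on a fine grid.
bsc = [[0.9, 0.1], [0.1, 0.9]]  # hypothetical BSC with p = 0.1
best = max((mutual_information([a / 1000, 1 - a / 1000], bsc), a / 1000)
           for a in range(1001))
print(best)  # maximum ~ 0.531 bits, attained at P(x_0) = 0.5
```

The grid search is only practical for small input alphabets; for general DMCs the Blahut-Arimoto algorithm is the standard numerical approach.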
Example 7.1-1. Consider a BSC with transition probabilities P(0|1) = P(1|0) = p. The average mutual information is maximized when the input probabilities are P(0) = P(1) = ½. The capacity of the BSC is

C = 1 + p log_2 p + (1 - p) log_2 (1 - p) = 1 - H(p),

where H(p) is the binary entropy function.
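The closed form C = 1 - H(p) is straightforward to evaluate. A minimal sketch:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p, in bits per channel use."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 (noiseless channel)
print(bsc_capacity(0.5))   # 0.0 (output independent of input)
print(bsc_capacity(0.11))  # ~ 0.5
```

Note the symmetry C(p) = C(1 - p): a channel that inverts bits with probability 1 - p is exactly as useful as one with crossover p, since the inversion can be undone.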
Consider the discrete-time AWGN memoryless channel described by

p(y | X = x_k) = (1/sqrt(2πσ²)) exp[-(y - x_k)²/(2σ²)].

The capacity of this channel in bits per channel use is the maximum average mutual information between the discrete input X = {x_0, x_1, ..., x_{q-1}} and the output Y = (-∞, ∞):

C = max_{P(x_k)} Σ_{k=0}^{q-1} ∫_{-∞}^{∞} p(y | x_k) P(x_k) log_2 [p(y | x_k)/p(y)] dy,

where

p(y) = Σ_{k=0}^{q-1} p(y | x_k) P(x_k).
Example 7.1-2. Consider a binary-input AWGN memoryless channel with possible inputs X = A and X = -A. The average mutual information I(X;Y) is maximized when the input probabilities are P(X = A) = P(X = -A) = ½. Then

C = ½ ∫_{-∞}^{∞} p(y | A) log_2 [p(y | A)/p(y)] dy + ½ ∫_{-∞}^{∞} p(y | -A) log_2 [p(y | -A)/p(y)] dy.
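This capacity has no closed form, but the two integrals can be evaluated numerically. The sketch below uses a simple midpoint rule; the truncation limit `y_lim` and step count `n` are numerical knobs of this illustration, not quantities from the text.

```python
import math

def binary_awgn_capacity(A, sigma, y_lim=30.0, n=20000):
    """Midpoint-rule evaluation of
    C = 1/2 int p(y|A) log2[p(y|A)/p(y)] dy + 1/2 int p(y|-A) log2[p(y|-A)/p(y)] dy,
    where p(y) = 1/2 p(y|A) + 1/2 p(y|-A)."""
    def pdf(y, mean):
        return (math.exp(-(y - mean) ** 2 / (2 * sigma ** 2))
                / math.sqrt(2 * math.pi * sigma ** 2))

    dy = 2 * y_lim / n
    C = 0.0
    for k in range(n):
        y = -y_lim + (k + 0.5) * dy   # midpoint of the k-th subinterval
        pa, pb = pdf(y, A), pdf(y, -A)
        p_y = 0.5 * (pa + pb)
        for cond in (pa, pb):
            if cond > 0:
                C += 0.5 * cond * math.log2(cond / p_y) * dy
    return C

print(binary_awgn_capacity(A=1.0, sigma=1.0))  # ~ 0.49 bits per channel use
print(binary_awgn_capacity(A=5.0, sigma=1.0))  # approaches 1 bit at high SNR
```

At high SNR the two conditional densities barely overlap and C approaches 1 bit per use, the most a binary input can carry; at low SNR it falls well below the unconstrained Gaussian-input capacity.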
The channel capacity is not always obtained by assuming that the input symbols are equally probable, and nothing can be said in general about the input probability assignment that maximizes the average mutual information. It can be shown that the necessary and sufficient conditions for the set of input probabilities {P(x_j)} to maximize I(X;Y) and to achieve capacity on a DMC are

I(x_j; Y) = C   for all j with P(x_j) > 0,
I(x_j; Y) ≤ C   for all j with P(x_j) = 0,

where C is the capacity of the channel and

I(x_j; Y) = Σ_{i=0}^{Q-1} P(y_i | x_j) log [P(y_i | x_j)/P(y_i)].
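These conditions can be checked numerically for a candidate input distribution. As a sketch (using a hypothetical BSC with p = 0.1), the per-input information I(x_j;Y) is computed for each input at the uniform distribution; both values coincide, confirming that uniform inputs achieve the BSC capacity.

```python
import math

def per_input_information(j, p_x, P):
    """I(x_j;Y) = sum_i P(y_i|x_j) log2[P(y_i|x_j)/P(y_i)], in bits."""
    Q = len(P[0])
    p_y = [sum(p_x[k] * P[k][i] for k in range(len(p_x))) for i in range(Q)]
    return sum(P[j][i] * math.log2(P[j][i] / p_y[i])
               for i in range(Q) if P[j][i] > 0)

p = 0.1
bsc = [[1 - p, p], [p, 1 - p]]   # hypothetical test channel
I0 = per_input_information(0, [0.5, 0.5], bsc)
I1 = per_input_information(1, [0.5, 0.5], bsc)
print(I0, I1)  # equal, so the capacity condition I(x_j;Y) = C holds
```

If the two values differed, probability mass should be shifted toward the input with the larger I(x_j;Y); this observation is the basis of iterative capacity algorithms.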
Consider a band-limited waveform channel with AWGN. The capacity of the channel per unit time has been defined by Shannon (1948) as

C = lim_{T→∞} max_{p(x)} (1/T) I(X; Y).

Alternatively, we may use the samples or the coefficients {y_i}, {x_i}, and {n_i} in the series expansions of y(t), x(t), and n(t) to determine the average mutual information between x_N = [x_1 x_2 ... x_N] and y_N = [y_1 y_2 ... y_N], where N = 2WT and y_i = x_i + n_i:

I(X; Y) = ∫...∫ p(y_N | x_N) p(x_N) log [p(y_N | x_N)/p(y_N)] dx_N dy_N
        = Σ_{i=1}^{N} ∫_{-∞}^{∞} ∫_{-∞}^{∞} p(y_i | x_i) p(x_i) log [p(y_i | x_i)/p(y_i)] dy_i dx_i.
where

p(y_i | x_i) = (1/sqrt(πN_0)) exp[-(y_i - x_i)²/N_0].

The maximum of I(X;Y) over the input PDFs p(x_i) is obtained when the {x_i} are statistically independent zero-mean Gaussian random variables, i.e.,

p(x_i) = (1/sqrt(2πσ_x²)) exp[-x_i²/(2σ_x²)],

in which case

max_{p(x)} I(X; Y) = Σ_{i=1}^{N} ½ log_2 (1 + 2σ_x²/N_0) = (N/2) log_2 (1 + 2σ_x²/N_0) = WT log_2 (1 + 2σ_x²/N_0).
If we put a constraint on the average power in x(t), i.e.,

P_av = (1/T) ∫_0^T E[x²(t)] dt = N σ_x²/T = 2W σ_x²,

then

σ_x² = P_av T/N = P_av/(2W),

and

max_{p(x)} I(X; Y) = WT log_2 (1 + P_av/(W N_0)).

Dividing both sides by T, we obtain the capacity of the band-limited AWGN waveform channel with a band-limited and average-power-limited input:

C = W log_2 (1 + P_av/(W N_0)).
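The final formula is easy to evaluate. A minimal sketch, with illustrative (hypothetical) numbers, also shows the bandwidth trade-off: doubling W at fixed power does not double C, because the SNR per hertz, P_av/(W N_0), drops.

```python
import math

def awgn_capacity(P_av, W, N0):
    """C = W log2(1 + P_av/(W N0)), in bits/s, for the band-limited AWGN channel."""
    return W * math.log2(1 + P_av / (W * N0))

print(awgn_capacity(P_av=1e-6, W=1e6, N0=1e-12))  # ~ 1.0e6 bits/s (SNR = 1)
print(awgn_capacity(P_av=1e-6, W=2e6, N0=1e-12))  # ~ 1.17e6 bits/s, not 2e6
```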
[Figure: normalized channel capacity as a function of SNR for the band-limited AWGN channel]
[Figure: channel capacity as a function of bandwidth with a fixed transmitted average power]
Note that as W approaches infinity, the capacity of the channel approaches the asymptotic value

C_∞ = (P_av/N_0) log_2 e = P_av/(N_0 ln 2)  bits/s.

Since P_av represents the average transmitted power and C is the rate in bits/s, it follows that

P_av = C ε_b,

where ε_b is the energy per bit. Hence, we have

C/W = log_2 [1 + (ε_b/N_0)(C/W)].

Consequently,

ε_b/N_0 = (2^{C/W} - 1)/(C/W).
When C/W = 1, ε_b/N_0 = 1 (0 dB). When C/W → ∞,

ε_b/N_0 ≈ (W/C) 2^{C/W} = (W/C) exp[(C/W) ln 2],

so the required ε_b/N_0 grows exponentially with C/W. When C/W → 0,

lim_{C/W→0} ε_b/N_0 = ln 2 ≈ 0.693 (-1.6 dB).
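The three regimes above can be checked numerically from the single expression ε_b/N_0 = (2^{C/W} - 1)/(C/W):

```python
import math

def ebn0_required(r):
    """Minimum eb/N0 = (2**r - 1)/r for spectral efficiency r = C/W (linear scale)."""
    return (2 ** r - 1) / r

print(ebn0_required(1.0))                # 1.0, i.e. 0 dB
print(ebn0_required(10.0))               # 102.3: grows exponentially with C/W
print(ebn0_required(1e-6), math.log(2))  # approaches ln 2 ~ 0.693 (-1.6 dB)
```

The -1.6 dB value is the Shannon limit: no reliable communication scheme, at any spectral efficiency, can operate below it.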
The channel capacity formulas serve as upper limits on the transmission rate for reliable communication over a noisy channel.
Noisy channel coding theorem (Shannon, 1948): There exist channel codes (and decoders) that make it possible to achieve reliable communication, with as small an error probability as desired, if the transmission rate R < C, where C is the channel capacity. If R > C, it is not possible to make the probability of error tend toward zero with any code.