Channel Coding II


1 Channel Coding
Dr.-Ing. Dirk Wübben
Institute for Telecommunications and High-Frequency Techniques
Department of Communications Engineering
Room: N3, Phone: 4/
Lecture: Tuesday, 8:30 - 10:00 in N33
Exercise: Wednesday, 14:00 - 16:00 in N4
Dates for exercises will be announced during lectures.
Tutor: Shayan Hassanpour, Room: N39, hassanpour@ant.uni-bremen.de

2 Outline Channel Coding II
1. Concatenated Codes
   Serial concatenation & parallel concatenation (Turbo codes)
   Iterative decoding with soft-in/soft-out decoding algorithms
   EXIT charts, BICM, LDPC codes
2. Trellis-Coded Modulation (TCM)
   Motivation by information theory
   TCM of Ungerböck, pragmatic approach by Viterbi, multilevel codes
   Distance properties and error rate performance
   Applications (data transmission via modems)
3. Adaptive Error Control
   Automatic Repeat Request (ARQ)
   Performance for perfect and disturbed feedback channel
   Hybrid FEC/ARQ schemes

3 Chapter 1: Concatenated Codes
Introduction: serial and parallel concatenation, interleaving
Serial concatenation: direct approach, product codes, choice of component codes
Parallel concatenation: modification of product codes, Turbo codes, choice of component codes
Distance properties and performance approximation
Decoding of concatenated codes: definition of soft information, L-algebra, general approach for soft-output decoding, BCJR algorithm, iterative decoding, general concept of iterative decoding
EXtrinsic Information Transfer (EXIT) charts
Bit-Interleaved Coded Modulation (BICM)
Low-Density Parity-Check (LDPC) codes

4 Introduction
Achieving Shannon's channel capacity is the general goal of coding theory.
- Block and convolutional codes of CC-I are far away from achieving this limit
- Decoding effort increases (exponentially) with performance
- Questionable whether Shannon's limit can be achieved by these codes
Concatenation of codes
- Forney (1966): proposed the combination of simple codes
- Berrou, Glavieux, Thitimajshima (1993): Turbo codes - clever parallel concatenation of two convolutional codes achieving only 0.5 dB loss against channel capacity at P_b = 10^-5
Principal idea: clever concatenation of simple codes in order to generate a total code with high performance that still enables efficient decoding.
Example: a convolutional code with L_C = 9 has 2^8 = 256 states; two convolutional codes with L_C = 3 have 2·4 = 8 states, a complexity reduction by a factor of 32; with repeated decoding (6 iterations), 6·8 = 48 states are processed, still a reduction by a factor of about 5.

5 Serial and Parallel Code Concatenation
Serial code concatenation
[Figure: outer encoder C1 -> inner encoder C2 -> channel -> inner decoder D2 -> outer decoder D1]
- The subsequent encoder obtains the whole output stream of the previous encoder, i.e., redundancy bits are also encoded.
Parallel code concatenation
[Figure: information bits fed in parallel to encoders C1, C2, ..., Cq; a parallel-serial converter merges the outputs]
- Each encoder obtains only information bits.
- The parallel-serial converter generates the serial data stream.
- Example: Turbo codes

6 Interleaving
An interleaver performs a permutation of the symbol sequence.
- Strong impact on the performance of concatenated codes
- Also used to split burst errors into single errors for fading channels
Block interleaver (column-wise write, row-wise read):
    x0  x3  x6  x9   x12
    x1  x4  x7  x10  x13
    x2  x5  x8  x11  x14
Column-wise write-in but row-wise read-out leads to a permutation of the symbol sequence with interleaving depth L_I = 5: neighboring symbols of the input stream have a distance of 5 in the output stream, given by the number of columns.
input sequence:  x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, x13, x14
output sequence: x0, x3, x6, x9, x12, x1, x4, x7, x10, x13, x2, x5, x8, x11, x14
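As an illustration, a minimal NumPy sketch of this write/read rule, reproducing the permutation above (the helper name block_interleave is illustrative, not from the lecture):

```python
import numpy as np

def block_interleave(seq, rows, cols):
    """Write the sequence column-wise into a (rows x cols) array,
    then read it out row-wise."""
    assert len(seq) == rows * cols
    matrix = np.reshape(seq, (rows, cols), order='F')  # column-wise write
    return matrix.flatten(order='C')                   # row-wise read

x = np.arange(15)                                      # x0 ... x14
print(block_interleave(x, rows=3, cols=5))
# [ 0  3  6  9 12  1  4  7 10 13  2  5  8 11 14] -> interleaving depth 5
```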

7 Interleaving
Assumption: burst errors of length b should be separated. Aspects of dimensioning a block interleaver:
- Number of columns: directly determines the interleaver depth L_I. L_I >= b is required, so that a burst error of length b is broken into single errors by Π.
- Number of rows: Example: for a convolutional code with L_C = 5, five successive code words are correlated; for R_c = 1/2, ten successive code bits are correlated. In order to separate these ten bits (by L_I each, to protect them from burst errors), the number of rows should correspond to L_C/R_c = 10.
- Time delay (latency): the memory is read out only after the whole memory has been written. Notice: for duplex speech communication only an overall delay of 125 ms is tolerable. Example: data rate 9.6 kbit/s and interleaver size 400 bits: t = 2·400 / (9600 1/s) = 83.3 ms.

8 Interleaving
Convolutional interleaver
[Figure: N parallel branches with register lengths 0, L, 2L, ..., (N-1)L at the transmitter, mirrored at the receiver, selected by synchronized multiplexers]
- Consists of N registers and a multiplexer
- Each register stores L symbols more than the previous register
- The principle is similar to the block interleaver
Random interleaver
- A block interleaver has a regular structure: the output distance is directly given by the input distance, leading to bad distance properties for Turbo codes
- Random interleavers are constructed as block interleavers where the data positions are determined randomly
- A pseudo-random generator can be utilized for constructing these interleavers
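A pseudo-random interleaver of this kind reduces to drawing one fixed permutation from a seeded generator; a small sketch (names are illustrative, not from the lecture):

```python
import numpy as np

def make_random_interleaver(length, seed=0):
    """One fixed pseudo-random permutation and its inverse (deinterleaver)."""
    perm = np.random.default_rng(seed).permutation(length)
    return perm, np.argsort(perm)

perm, inv = make_random_interleaver(8, seed=42)
x = np.arange(8)
y = x[perm]                    # interleave
assert np.all(y[inv] == x)     # deinterleaving restores the original order
```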

9 Serial Code Concatenation: Direct Approach
Concatenation of a (3,2,2)-SPC and a (4,3,2)-SPC code
[Table: info words u, outer code words c1, inner code words c2 and their Hamming weights w_H(c2)]
R_c = 2/4 = 1/2, d_min = 2
Concatenation of a (4,3,2)-SPC and a (7,4,3)-Hamming code
[Table: info words u, code words c1 and c2 with their Hamming weights w_H(c1), w_H(c2)]
Concatenation does not automatically result in a code with larger distance:
R_c = 3/7; original concatenation: d_min = 3; optimized concatenation: d_min = 4

10 Serial Code Concatenation: Product Codes
[Figure: (n_V x n_H) code array with information block u, horizontal parity p_H, vertical parity p_V and the "checks on checks" block; C = C_H followed by a block interleaver Π and C_V]
- Information bits arranged in a (k_V, k_H) matrix u
- Row-wise encoding with a systematic (n_H, k_H, d_H) code C_H of rate k_H/n_H: each row contains a code word
- Column-wise encoding with a systematic (n_V, k_V, d_V) code C_V of rate k_V/n_V: each column contains a code word
- Entire code rate: R_c = (k_H·k_V)/(n_H·n_V) = R_c,H · R_c,V
- Minimum Hamming distance: d_min = d_min,H · d_min,V
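For the SPC-based product codes used in the following examples, encoding is just row and column parity; a minimal sketch under that assumption (including the checks on checks):

```python
import numpy as np

def spc_product_encode(u, kv, kh):
    """Encode a (kv x kh) info block with single parity checks per row and
    per column; the corner bit is the check on checks."""
    u = np.asarray(u).reshape(kv, kh)
    c = np.zeros((kv + 1, kh + 1), dtype=int)
    c[:kv, :kh] = u
    c[:kv, kh] = u.sum(axis=1) % 2         # horizontal parity p_H
    c[kv, :] = c[:kv, :].sum(axis=0) % 2   # vertical parity p_V
    return c

# (12,6,4) product code: horizontal (3,2,2)-SPC, vertical (4,3,2)-SPC
print(spc_product_encode([1, 0, 1, 1, 0, 0], kv=3, kh=2))
```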

11 Serial Code Concatenation: Examples of Product Codes
(12,6,4) product code
- Horizontal: (3,2,2)-SPC code; vertical: (4,3,2)-SPC code
- Code rate: 1/2; d_min = 2·2 = 4
- Correction of 1 error or detection of 3 errors possible
- The interleaver combines 3 info words: increase of the effective block length
[Figure: 4x3 code arrays illustrating a burst error that exceeds the detection capability of the horizontal code but is resolved column-wise]
(28,12,6) product code
- Horizontal: (4,3,2)-SPC code; vertical: (7,4,3)-Hamming code
- d_min = 2·3 = 6: correction of 2 errors possible
[Figure: 7x4 code arrays with the corresponding bit positions]

12 Parallel Code Concatenation: Modified Product Codes
[Figure: code array with information block u, horizontal parity p_H and vertical parity p_V, but without the "checks on checks" block; C_H and C_V in parallel, C_V fed via interleaver Π]
- Information bits u are encoded row-wise and column-wise with C_H and C_V, respectively
- Parity check bits of the component codes are not encoded twice (no checks on checks)
- Entire code rate: R_c = (k_H·k_V) / (n_H·n_V - (n_H - k_H)(n_V - k_V)) = 1 / (1/R_c,H + 1/R_c,V - 1)
- Minimum Hamming distance: d_min = d_min,H + d_min,V - 1

13 Parallel Code Concatenation: Examples
Modified (11,6,3) product code
- Horizontal: (3,2,2) SPC code; vertical: (4,3,2) SPC code
- Code rate: 6/11; d_min = 2 + 2 - 1 = 3: 1 error correctable
Modified (25,12,4) product code
- Horizontal: (4,3,2) SPC code; vertical: (7,4,3) Hamming code
- d_min = 2 + 3 - 1 = 4: 1 error correctable
[Figure: code arrays of both examples]

14 Union Bound on Bit Error Rate for Product Codes
- Product codes using the same (n,k,3)-Hamming code for both component codes
- Only taking into account the minimum distance d_min = 3 + 3 - 1 = 5: results are only valid for high signal-to-noise ratios
[Figure: analytical bit error rate P_b of the (7,4), (15,11) and (31,26) Hamming based product codes versus 10·log10(E_s/N_0) and versus 10·log10(E_b/N_0)]

15 Parallel Code Concatenation: Turbo Codes
[Figure: general structure with q constituent encoders C1 ... Cq, each preceded by an interleaver Π1 ... Πq; special case with 2 constituent codes: u -> C1 and u -> Π2 -> C2, parity outputs punctured by P]
- Presented in 1993 by Berrou, Glavieux, Thitimajshima
- Interleaver Π1 can be neglected (identity)
- Information bits are generally not punctured
- Code rate: R_c = 1 / (1/R_c,1 + 1/R_c,2 - 1)

16 Potential of Turbo Codes
Comparison of convolutional codes and turbo codes for R_c = 1/2
[Figure: P_b versus 10·log10(E_b/N_0) for convolutional codes with L_c = 3, 5, 7, 9 and a turbo code (TC)]
- Optimized interleaver of length 256 x 256 = 65536 bits
- For this interleaver, a gain of nearly 3 dB over the convolutional code with L_c = 9
- Gap to Shannon's channel capacity only 0.5 dB (C = 0.5 at E_b/N_0 = 0.19 dB)
- Tremendous performance loss for smaller interleavers
- World record: 0.08 dB gap to Shannon capacity by Stephan ten Brink

17 Influence of Constituent Codes
- Systematic recursive convolutional encoders are employed in turbo codes
- Constituent codes generate only parity bits
- Conventionally codes with small constraint length (3 <= L_c <= 5) and rate R_c = 1/n are used (codes of larger rate can be achieved by puncturing)
- The error probability depends on the interleaver size L_π and the minimum input weight w_min of the constituent encoders that leads to finite output weight: P_b ~ L_π^(1 - w_min)
- Only recursive encoders require at least w_min = 2 for finite output weight
- Interleaving gain is only achievable for recursive encoders, due to P_b ~ L_π^(-1)
- Nonrecursive encoders with w_min = 1 do not gain from enlarging the interleaver size (P_b ~ L_π^0)
RSC encoders are used as constituent codes: performance improves with the length of the interleaver!

18 Influence of Constituent Codes
Instead of the free distance d_f, the effective distance d_eff is crucial:
    d_eff = w_min + 2·c_min = 2 + 2·c_min
Interpretation:
- Turbo codes are systematic codes: the total weight of a code word depends on the weight w_min of the information bits
- c_min denotes the minimum weight of the parity bits of one encoder for input weight w_min = 2
- Assuming identical constituent codes, the minimum weight for w_min = 2 is given by d_eff
Consequence: suitable constituent codes should maximize the parity weight for input weight w_min = 2.
This aim is achieved if the feedback polynomial of the constituent encoders is prime: the shift register then generates a sequence of maximum length (m-sequence), which may have larger weight than shorter sequences.
The feedback polynomial of the constituent encoders should be prime!

19 Example of Turbo Code with 2 Codes (L_c = 3), R_c = 1/2
[Figure: two identical RSC encoders (two memory elements T each), the second fed via interleaver Π; generator polynomials g1 = 5_8 and g2 = 7_8; the parity streams are punctured alternately by P to achieve R_c = 1/2]

20 Example of Turbo Code with 2 Codes (L_c = 3), R_c = 1/2
Recursive polynomial g2(D) = 1 + D + D^2:
- g2(D) is prime: g2(0) = 1 and g2(1) = 1 + 1 + 1 = 1
- The shift register achieves a sequence of maximum length (m-sequence) with L = 2^2 - 1 = 3
- Maximum distance: d_eff^max = w_min + 2(L + 1) = 2 + 2·4 = 10
Recursive polynomial g1(D) = 1 + D^2:
- g1(D) = (1 + D)(1 + D) is non-prime
- The shift register generates a sequence of length L = 2
- Maximum distance: d_eff^max = w_min + 2(L + 1) = 2 + 2·3 = 8
The feedback polynomial g1(D) would lead to degraded performance!
[Figure: example input sequences u and resulting code sequences c for both feedback polynomials, with the corresponding trellis transitions]

21 Example of Turbo Code with 2 Codes (L_c = 5), R_c = 1/3
[Figure: two identical RSC encoders with four memory elements each (generator polynomials g1, g2 in octal notation), the second encoder fed via interleaver Π; puncturing matrix P]

22 LTE Turbo Code with 2 Codes (L_c = 4)
Generator polynomials: g0 = 1 + D^2 + D^3 = 13_8 (feedback), g1 = 1 + D + D^3 = 15_8
[Figure: two identical RSC encoders with three memory elements each, the second fed via interleaver Π; outputs c0 (systematic), c1 and c2 followed by rate matching]

23 Influence of Interleaver
Union bound: P_b <= Σ_d c_d · 1/2 · erfc( sqrt( d · R_c · E_b/N_0 ) )
- Avoiding output sequences with low Hamming weight at both encoders: if the output c1 of C1 has low Hamming weight, the permutation of the input sequence u for C2 should result in an output sequence c2 with high Hamming weight -> higher total average Hamming weight / Hamming distance d
- The interleaver directly influences the minimum distance
- The number of sequences with low weight is reduced by interleaving -> small coefficients c_d; this is even more important than the minimum distance, which acts only asymptotically (c_d: total number of nonzero info bits associated with code sequences of Hamming weight d)
- Randomness of the interleaver is important: simple block interleavers perform badly due to their symmetry; pseudo-random interleavers are much better -> random codes (Shannon)

24 Distance Properties of Turbo Codes: Definitions
General IOWEF (Input Output Weight Enumerating Function) of an encoder:
    A(W,D) = Σ_{w=0}^{k} Σ_{d=0}^{n} A_{w,d} · W^w · D^d
where A_{w,d} is the number of code words with input weight w and output weight d.
Conditioned IOWEFs (specific input weight w or specific output weight d):
    A(w,D) = Σ_{d=0}^{n} A_{w,d} · D^d,    A(W,d) = Σ_{w=0}^{k} A_{w,d} · W^w
Important for parallel concatenation: weight c of the parity bits, with d = w + c:
    A(W,C) = Σ_w Σ_c A_{w,c} · W^w · C^c
Corresponding conditioned IOWEF: A(w,C) = Σ_c A_{w,c} · C^c
- All encoders have the same input weight w
- Encoders generate only parity bits -> consider the weight c of the parity bits

25 Distance Properties of Turbo Codes: Uniform Interleaver
- Problem: a concrete interleaver has to be considered for the distance spectrum / IOWEF -> determination of the IOWEF is computationally expensive
- Uniform interleaver (UI): theoretic device comprising all possible permutations
- The UI provides the average distance spectrum (including good and bad interleavers)
[Figure: a weight-2 input word of length 4 is mapped by the uniform interleaver onto all (4 choose 2) = 6 possible positions, each with probability 1/6]

26 Distance Properties of Turbo Codes: Results
Parallel concatenation: both encoders have the same input weight w; the weights c1 and c2 of the encoder outputs are added (c = c1 + c2, d = w + c):
    A^par(w,C) = A1(w,C) · A2(w,C) / binom(L_π, w),   i.e.,   A^par_{w,c} = Σ_{c1+c2=c} A1_{w,c1} · A2_{w,c2} / binom(L_π, w)
- The product A1(w,C)·A2(w,C) combines output sequences with the same input weight w and covers all possible combinations of output sequences (uniform interleaver)
- The denominator performs the averaging w.r.t. the number of permutations of w ones in length L_π
Serial concatenation: the output weight l of the outer encoder equals the input weight of the inner encoder:
    A^ser(W,D) = Σ_w Σ_d A^ser_{w,d} · W^w · D^d   with   A^ser_{w,d} = Σ_{l=0}^{L_π} A1_{w,l} · A2_{l,d} / binom(L_π, l)

27 Distance Properties of Turbo Codes
[Figure: coefficients c_d versus distance d for the turbo code with two interleaver lengths L_π and for the convolutional code]
Codes: turbo code with g1 = 5_8, g2 = 7_8; convolutional code with L_c = 9; R_c = 1/3
Observations:
- With the uniform interleaver, c_d < 1 is possible
- The TC has a lower d_f, but its coefficients c_d are much smaller
- The effect becomes more obvious with increasing interleaver length L_π

28 Analytical Error Rate Estimation of Turbo Codes
[Figure: P_b versus 10·log10(E_b/N_0) for turbo codes with two interleaver lengths and for the convolutional code, on the AWGN channel and with flat Rayleigh fading]
Observations:
- For small SNR the TC outperforms the CC significantly; the gain increases with L_π
- For increasing SNR the BER of the TC flattens, whereas the curve of the CC keeps decreasing
Explanations:
- d_f dominates the BER for large SNR
- For small SNR the number of sequences with a specific weight is of larger importance

29 Decoding of Concatenated Codes
- Definition of soft information
- L-algebra
- General approach for soft-output decoding
- Soft-output decoding using the dual code
- Soft-output decoding for the (4,3,2)-SPC code
- BCJR algorithm for convolutional codes

30 Decoding of Concatenated Codes
- Optimum maximum likelihood decoding of concatenated codes is too complex
- The constituent codes C1 and C2 are decoded by separate decoders D1 and D2
- Decoders D1 and D2 are allowed to exchange information to improve their performance: the probability of information and/or code bits is of interest -> soft-output decoding is required!
What is a useful soft output?
Assumption: uncoded transmission over the AWGN channel with BPSK modulation: u = 0 -> x = +1, u = 1 -> x = -1; y = x + n
The MAP criterion (maximum a posteriori) considers the unequal distribution of the symbols:
    Pr{u = 0 | y} >< Pr{u = 1 | y}   <=>   Pr{x = +1 | y} >< Pr{x = -1 | y}

31 Decoding of Concatenated Codes: Log-Likelihood Ratio
Conditional probability:
    Pr{x = +1 | y} >< Pr{x = -1 | y}   <=>   p(x = +1, y) >< p(x = -1, y)
Log-likelihood ratio (LLR, or L-value), introduced by Hagenauer:
    L(x̂) = L(x,y) = L(x|y) = ln [ p(x = +1, y) / p(x = -1, y) ]
          = ln [ p(y | x = +1) / p(y | x = -1) ] + ln [ Pr{x = +1} / Pr{x = -1} ]
          = L(y|x) + L_a(x)
- The sign sgn(L(x̂)) corresponds to the hard decision
- The magnitude |L(x̂)| indicates the reliability of the hard decision
- Another possible definition would be L(x) = Pr{x = +1} - Pr{x = -1} (not used here)
Note: the addition of LLRs requires statistical independence of the variables!

32 Log-Likelihood Ratio
For an uncoded transmission the LLR consists of two components:
- L(y|x) depends on the channel statistics and therefore on the received signal y
- L_a(x) represents the a priori knowledge about the symbol x
[Figure: L_a(x) = ln( Pr{x = +1} / Pr{x = -1} ) as a function of Pr{x = +1}]
- Point-symmetric with respect to (0.5; 0)
- Pr{x = +1} > 0.5: +1 more likely than -1 -> positive L_a(x)
- The larger the difference between Pr{x = +1} and Pr{x = -1}, the larger |L_a(x)| -> suitable value for the reliability
- Pr{x = +1} = 0.5 -> L_a(x) = 0: a decision would be random

33 LLR for a Memoryless Channel
Memoryless channel (AWGN or 1-path fading channel): y = α·x + n with σ_N^2 = N_0/(2T_s)
Channel information:
    L(y|x) = ln [ p(y | x = +1) / p(y | x = -1) ]
           = ln [ exp( -(y - α·sqrt(E_s/T_s))^2 / (2σ_N^2) ) / exp( -(y + α·sqrt(E_s/T_s))^2 / (2σ_N^2) ) ]
           = [ (y + α·sqrt(E_s/T_s))^2 - (y - α·sqrt(E_s/T_s))^2 ] / (2σ_N^2)
           = 4·α·y·sqrt(E_s/T_s) / (2σ_N^2) = 4·α·ȳ·E_s/N_0 = L_ch·ȳ
with the normalized received signal ȳ = y / sqrt(E_s/T_s) and
    L_ch = 4·α·E_s/N_0 = reliability of the channel (depends on the SNR E_s/N_0 and the channel gain α)
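A short numerical sketch of this result, assuming the usual normalization E_s = 1 (so σ_n^2 = N_0/2 and L_ch = 4·α·E_s/N_0):

```python
import numpy as np

def channel_llr(y_norm, alpha, esn0_lin):
    """L(y|x) = L_ch * y with L_ch = 4 * alpha * Es/N0 (normalized signals)."""
    return 4.0 * alpha * esn0_lin * y_norm

rng = np.random.default_rng(1)
esn0 = 10 ** (2.0 / 10)                  # Es/N0 = 2 dB
x = rng.choice([-1.0, 1.0], 10)          # BPSK symbols
sigma = np.sqrt(1.0 / (2.0 * esn0))      # sigma_n^2 = N0/(2*Es)
y = x + sigma * rng.standard_normal(10)  # AWGN channel, gain alpha = 1
print(channel_llr(y, alpha=1.0, esn0_lin=esn0))
```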

34 LLR for a Memoryless Channel
Reliability of the channel: L_ch = 4·α·E_s/N_0
[Figure: L(y|x) = 4·α·ȳ·E_s/N_0 versus ȳ for E_s/N_0 = 0, 2, 4, 6, 8 dB: high channel reliability gives steep LLRs]
The LLR is simply a scaled version of the matched filter output.

35 LLRs for BSC and BEC
Binary symmetric channel (BSC) with error probability P_e:
[Figure: BSC transition diagram with crossover probability P_e]
    L(y|x) = ln [ p(y | x = +1) / p(y | x = -1) ] = { +ln((1 - P_e)/P_e) for y = Y_0 ;  -ln((1 - P_e)/P_e) for y = Y_1 }
Binary erasure channel (BEC) with erasure probability P_q and output alphabet {Y_0, Δ, Y_1}:
[Figure: BEC transition diagram]
    L(y|x) = { +∞ for y = Y_0 ;  0 for y = Δ ;  -∞ for y = Y_1 }

36 Relation between LLRs and Probabilities (1)
The matched filter output corresponds to an LLR. Task: find an arithmetic to perform operations with respect to LLRs instead of probabilities -> L-algebra by Hagenauer.
Basic relation, using completeness (Pr{x = +1} + Pr{x = -1} = 1) in the LLR
    L(x̂) = L(x|y) = ln [ Pr{x = +1 | y} / Pr{x = -1 | y} ] = ln [ Pr{x = +1 | y} / (1 - Pr{x = +1 | y}) ]:
    Pr{x = +1 | y} = e^{L(x̂)} / (1 + e^{L(x̂)}),    Pr{x = -1 | y} = 1 / (1 + e^{L(x̂)})
With respect to a symbol x ∈ {+1, -1}, the general relation holds:
    Pr{x = ξ | y} = e^{ξ·L(x̂)/2} / (e^{+L(x̂)/2} + e^{-L(x̂)/2}),   ξ ∈ {+1, -1}

37 Relation between LLRs and Probabilities (2)
Probability of a correct decision:
- For x = +1 the decision is correct if L(x̂) is positive: Pr{x̂ correct | x = +1} = e^{L(x̂)} / (1 + e^{L(x̂)})
- For x = -1 the decision is correct if L(x̂) is negative: Pr{x̂ correct | x = -1} = e^{-L(x̂)} / (1 + e^{-L(x̂)}) = 1 / (1 + e^{L(x̂)})
- In general: Pr{x̂ correct} = e^{|L(x̂)|} / (1 + e^{|L(x̂)|})
Soft bit: expected value for antipodal transmit signals:
    λ = E{x̂} = (+1)·Pr{x̂ = +1} + (-1)·Pr{x̂ = -1} = (e^{L(x̂)} - 1) / (e^{L(x̂)} + 1) = tanh( L(x̂)/2 )
with λ ∈ [-1, +1] and Pr{x̂ = +1} = (λ + 1)/2.

38 L-Algebra
Parity bits are generated by modulo-2 sums of certain information bits: how can we calculate the L-value of a parity bit? (Hagenauer)
Assumption: single parity check code (SPC) with p = u1 ⊕ u2; L(p) = ?; x1 and x2 are statistically independent.
    L(p) = L(u1 ⊕ u2) = ln [ Pr{u1 ⊕ u2 = 0} / Pr{u1 ⊕ u2 = 1} ] = ln [ Pr{x1·x2 = +1} / Pr{x1·x2 = -1} ] = L(x1·x2)
    L(x1·x2) = ln [ (Pr{x1 = +1}·Pr{x2 = +1} + Pr{x1 = -1}·Pr{x2 = -1}) / (Pr{x1 = +1}·Pr{x2 = -1} + Pr{x1 = -1}·Pr{x2 = +1}) ]
             = ln [ (1 + e^{L(x1)}·e^{L(x2)}) / (e^{L(x1)} + e^{L(x2)}) ]
             = 2·artanh( tanh( L(x1)/2 ) · tanh( L(x2)/2 ) ) = 2·artanh( λ1·λ2 )
             = L(x1) ⊞ L(x2)   ("boxplus" operation)
using artanh(x) = 1/2 · ln( (1 + x)/(1 - x) ).

39 L-Algebra
Modulo-2 sum of two statistically independent random variables:
    L(u1 ⊕ u2) = L(x1) ⊞ L(x2) = 2·artanh( tanh( L(x1)/2 ) · tanh( L(x2)/2 ) )
Approximation:
    L(x1) ⊞ L(x2) ≈ sgn[L(x1)] · sgn[L(x2)] · min{ |L(x1)|, |L(x2)| }
Modulo-2 sum of n variables:
    L(u1 ⊕ ... ⊕ un) = 2·artanh( Π_{i=1}^{n} tanh( L(x_i)/2 ) ) ≈ min_i{ |L(x_i)| } · Π_{i=1}^{n} sgn[ L(x_i) ]
[Figure: tanh(x/2) and artanh(x) as used in the boxplus operation]
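Both the exact boxplus and its min-sum approximation are easy to state in code; a minimal sketch:

```python
import numpy as np

def boxplus(l1, l2):
    """Exact boxplus: 2*artanh(tanh(L1/2)*tanh(L2/2))."""
    return 2.0 * np.arctanh(np.tanh(l1 / 2.0) * np.tanh(l2 / 2.0))

def boxplus_min(llrs):
    """Approximation: product of signs times minimum magnitude."""
    llrs = np.asarray(llrs, dtype=float)
    return np.prod(np.sign(llrs)) * np.min(np.abs(llrs))

print(boxplus(2.0, -3.0))        # approx -1.69
print(boxplus_min([2.0, -3.0]))  # -2.0: slightly optimistic magnitude
```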

40 General Approach for Soft-Output Decoding
For an FEC-encoded sequence the MAP criterion should be fulfilled.
Symbol-by-symbol MAP criterion: L-value for the estimation of the information bit u_i given the receive sequence y:
    L(û_i) = ln [ p(u_i = 0, y) / p(u_i = 1, y) ]
The joint probability density function p(u_i = 0/1, y) is not available -> elementary conversions.
Using completeness, P(a) = Σ_b P(a,b), the code space Γ is split into two subsets: Γ_i^(0) contains all c with u_i = 0, Γ_i^(1) contains all c with u_i = 1:
    L(û_i) = ln [ Σ_{c ∈ Γ_i^(0)} p(c, y) / Σ_{c ∈ Γ_i^(1)} p(c, y) ] = ln [ Σ_{c ∈ Γ_i^(0)} p(y|c)·Pr{c} / Σ_{c ∈ Γ_i^(1)} p(y|c)·Pr{c} ]
-> sum over 2^k/2 = 2^{k-1} code words in the numerator and in the denominator

41 General Approach for Soft-Output Decoding
Assuming statistical independence of the y_j (transmission over the AWGN channel):
- Succeeding noise terms n_j are independent, but of course not the succeeding code bits c_j (interdependencies introduced by the encoder)!
- p(y|c) represents the probability density conditioned on the hypothesis c -> the y_j are statistically independent random variables:
    p(y|c) = Π_{j=0}^{n-1} p(y_j | c_j)
- Each code word c is uniquely determined by the corresponding info word u, and the u_j are statistically independent:
    Pr{c} = Pr{u} = Π_{j=0}^{k-1} Pr{u_j}
Symbol-by-symbol MAP:
    L(û_i) = ln [ Σ_{c ∈ Γ_i^(0)} Π_{j=0}^{n-1} p(y_j|c_j) · Π_{j=0}^{k-1} Pr{u_j} / Σ_{c ∈ Γ_i^(1)} Π_{j=0}^{n-1} p(y_j|c_j) · Π_{j=0}^{k-1} Pr{u_j} ]

42 General Approach for Soft-Output Decoding
Symbol-by-symbol MAP for systematic encoders: u_i = c_i holds for i <= k-1, so the i-th term p(y_i|c_i) is constant in numerator and denominator and can be separated together with Pr{u_i}:
    L(û_i) = ln [ p(y_i | u_i = 0) / p(y_i | u_i = 1) ] + ln [ Pr{u_i = 0} / Pr{u_i = 1} ]
             + ln [ Σ_{c ∈ Γ_i^(0)} Π_{j≠i} p(y_j|c_j) · Π_{j≠i, j<k} Pr{u_j} / Σ_{c ∈ Γ_i^(1)} Π_{j≠i} p(y_j|c_j) · Π_{j≠i, j<k} Pr{u_j} ]
           = L_ch·ȳ_i + L_a(u_i) + L_e(u_i)
The soft output can be split into 3 statistically independent parts:
- systematic part L_ch·ȳ_i
- a priori information L_a(u_i)
- extrinsic information L_e(u_i): information provided by the code bits connected with u_i

43 General Approach for Soft-Output Decoding
Compact description of the extrinsic information with
    p(y_j; c_j) = p(y_j|c_j)·Pr{u_j} for j < k,    p(y_j; c_j) = p(y_j|c_j) for k <= j < n:
    L_e(û_i) = ln [ Σ_{c ∈ Γ_i^(0)} Π_{j≠i} p(y_j; c_j) / Σ_{c ∈ Γ_i^(1)} Π_{j≠i} p(y_j; c_j) ]
Calculation of the extrinsic information with LLRs, using
    L(c_j; y_j) = L_ch·ȳ_j + L_a(u_j) for j < k,    L(c_j; y_j) = L_ch·ȳ_j for k <= j < n:
    L_e(û_i) = ln [ Σ_{c ∈ Γ_i^(0)} Π_{j≠i} exp( -c_j·L(c_j; y_j) ) / Σ_{c ∈ Γ_i^(1)} Π_{j≠i} exp( -c_j·L(c_j; y_j) ) ]

44 Soft-Output Decoding of Repetition Codes
A code word c = [c_0 ... c_{n-1}] contains n repetitions of the information word u = [u_0]. The set of all code words for n = 3 is Γ = {000, 111}:
    L(û) = ln [ p(y | c = 000)·Pr{c = 000} / p(y | c = 111)·Pr{c = 111} ]
         = ln [ p(y_0|0)·p(y_1|0)·p(y_2|0)·Pr{u = 0} / p(y_0|1)·p(y_1|1)·p(y_2|1)·Pr{u = 1} ]
         = L_ch·ȳ_0 + L_ch·ȳ_1 + L_ch·ȳ_2 + L_a(u)
-> corresponds to an averaging of the LLRs

45 Soft-Output Decoding using the Dual Code
- The calculation of the extrinsic information requires a summation over all code words c of the code space Γ
- The (255,247,3) Hamming code contains 2^247 ≈ 2.3·10^74 code words
- Instead of calculating the LLR over all code words c of the code, it is also possible to perform this calculation with respect to the dual code
- Beneficial if the number of parity bits is relatively small: the dual code of the (255,247,3) Hamming code contains only 2^8 = 256 code words
Calculation of the extrinsic information with the dual code (summation over all 2^{n-k} code words c' of the dual code Γ⊥):
    L_e(û_i) = ln [ Σ_{c' ∈ Γ⊥} Π_{j≠i} tanh( L(c_j; y_j)/2 )^{c'_j} / Σ_{c' ∈ Γ⊥} (-1)^{c'_i} · Π_{j≠i} tanh( L(c_j; y_j)/2 )^{c'_j} ]

46 Soft-Output Decoding of the (4,3,2)-SPC using the Dual Code
- Direct calculation of the extrinsic information requires a summation over 2^3 = 8 code words. Instead, the dual code contains only 2^{n-k} = 2 words: Γ⊥ = {0000, 1111}.
- The first term in numerator and denominator (c' = 0000) equals one, so with ln( (1 + x)/(1 - x) ) = 2·artanh(x):
    L(û_i) = L_ch·ȳ_i + ln [ (1 + Π_{j≠i} tanh( L(c_j; y_j)/2 )) / (1 - Π_{j≠i} tanh( L(c_j; y_j)/2 )) ]
           = L_ch·ȳ_i + 2·artanh( Π_{j≠i} tanh( L(c_j; y_j)/2 ) )
           ≈ L_ch·ȳ_i + min_{j≠i} |L(c_j; y_j)| · Π_{j≠i} sgn[ L(c_j; y_j) ]
- Interpretation: each c ∈ Γ fulfills c·H^T = 0, i.e., each code bit is given by the modulo-2 sum of all other code bits:
    c_i = ⊕_{j≠i} c_j   ->   L_e(c_i) = ⊞_{j≠i} L(x_j)

47 Soft-Output Decoding for the (4,3,2)-SPC Code
Example at E_s/N_0 = 2 dB: L_ch = 4·E_s/N_0 = 4·10^{2/10} = 6.34
[Figure: worked example: info bits u -> encoding -> BPSK x -> AWGN -> y; channel LLRs L_ch·ȳ; extrinsic LLRs L_e(û_j) = ⊞_{i≠j} L(x_i), approximated by minimum magnitude and sign product; total LLR L(û) = L_ch·ȳ + L_e(û); hard decision and Pr{û correct}; one case shows an error that is detected but not corrected, another a corrected error]
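The complete soft-output decoder of this example fits in a few lines; a sketch using the min-sum approximation (the received values below are made up for illustration, not the slide's):

```python
import numpy as np

def spc_soft_decode(llr_ch):
    """Total LLRs of an SPC code: channel LLR plus extrinsic boxplus
    (min-sum approximation) over all other bits."""
    llr_e = np.empty(len(llr_ch))
    for i in range(len(llr_ch)):
        others = np.delete(llr_ch, i)
        llr_e[i] = np.prod(np.sign(others)) * np.min(np.abs(others))
    return llr_ch + llr_e

l_ch = 4 * 10 ** (2 / 10)                  # L_ch = 6.34 at Es/N0 = 2 dB
y = np.array([0.8, -0.2, 0.6, 0.9])        # received samples (illustrative)
print(spc_soft_decode(l_ch * y))
```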

48 BCJR Algorithm for Convolutional Codes
Symbol-by-symbol MAP decoding: Bahl, Cocke, Jelinek, Raviv (1972)
    L(û_i) = ln [ p(u_i = 0, y) / p(u_i = 1, y) ]
           = ln [ Σ_{(s',s): u_i=0} p(s', s, y) / Σ_{(s',s): u_i=1} p(s', s, y) ]
           = ln [ Σ_{(s',s): u_i=0} p(s', s, y_{k<i}, y_i, y_{k>i}) / Σ_{(s',s): u_i=1} p(s', s, y_{k<i}, y_i, y_{k>i}) ]
Efficient calculation of the LLR based on the trellis diagram (exploiting the Markov property), with y = [y_0, y_1, ..., y_{N-1}]:
[Figure: trellis of an RSC code with L_c = 3; a transition from state s' to state s at time instant i splits the sequence into p(s', y_{k<i}), p(y_i, s | s') and p(y_{k>i} | s)]

49 BCJR Algorithm for Convolutional Codes
Splitting up the observations (y_{k<i}, y_i, y_{k>i}):
    p(s', s, y_{k<i}, y_i, y_{k>i}) = p(y_{k>i} | s', s, y_{k<i}, y_i) · p(s', s, y_{k<i}, y_i)
Backward probability: probability of the sequence y_{k>i} if the trellis is in state s at time instant i; if state s is known, the parameters s', y_{k<i}, y_i are not relevant:
    β_i(s) = p(y_{k>i} | s', s, y_{k<i}, y_i) = p(y_{k>i} | s)
Transition probability: probability of observing y_i under the condition that the transition from s' to s takes place at time instant i (y_{k<i} not relevant):
    γ_i(s', s) = p(s, y_i | s', y_{k<i}) = p(s, y_i | s') = p(y_i | s', s) · Pr{s | s'}
- p(y_i | s', s): transition probability of the channel
- Pr{s | s'}: a priori information ~ u_i -> possibility to use a priori knowledge within the decoding process

50 BCJR Algorithm for Convolutional Codes
Forward probability: probability of the sequence y_{k<i}, if the trellis is in state s' at time instant i-1:
    α_{i-1}(s') = p(s', y_{k<i})
The probability density splits into three terms:
    p(s', s, y_{k<i}, y_i, y_{k>i}) = α_{i-1}(s') · γ_i(s', s) · β_i(s)
Compact description of the symbol-by-symbol MAP:
    L(û_i) = ln [ Σ_{(s',s): u_i=0} α_{i-1}(s')·γ_i(s', s)·β_i(s) / Σ_{(s',s): u_i=1} α_{i-1}(s')·γ_i(s', s)·β_i(s) ]
Recursive calculation:
- Forward:  α_i(s) = p(s, y_{k<i+1}) = Σ_{s'} γ_i(s', s) · α_{i-1}(s')
- Backward: β_{i-1}(s') = p(y_{k>i-1} | s') = Σ_{s} γ_i(s', s) · β_i(s)
Initialization: α_0(s') = 1 for s' = 0, else 0; terminated code: β_N(s) = 1 for s = 0, else 0; otherwise β_N(s) = const.
[Figure: trellis section with m memory elements illustrating α, γ and β]

51 BCJR Algorithm for Convolutional Codes
Symbol-by-symbol MAP decoding:
    L(û_i) = ln [ Σ_{(s',s): u_i=0} α_{i-1}(s')·γ_i(s', s)·β_i(s) / Σ_{(s',s): u_i=1} α_{i-1}(s')·γ_i(s', s)·β_i(s) ]
[Figure: trellis example illustrating the forward recursion (e.g., α_2(2) = α_1(0)·γ_2(0,2) + α_1(1)·γ_2(1,2)) and the backward recursion (β_{N-1}(s') = Σ_s γ_N(s', s)·β_N(s)) over all states]

52 Calculation in the Logarithmic Domain
Implementation with respect to probabilities is complicated -> numerical problems; implementation in the logarithmic domain is favorable.
Transition variable:
    γ̄_i(s', s) = ln γ_i(s', s) = ln p(y_i | s', s) + ln Pr{s | s'} = C + y_i·x(s', s)/σ_n^2 + ln Pr{u_i = u(s', s)}
Forward variable:
    ᾱ_i(s) = ln α_i(s) = ln Σ_{s'} γ_i(s', s)·α_{i-1}(s') = ln Σ_{s'} exp( γ̄_i(s', s) + ᾱ_{i-1}(s') )
Backward variable:
    β̄_{i-1}(s') = ln β_{i-1}(s') = ln Σ_{s} γ_i(s', s)·β_i(s) = ln Σ_{s} exp( γ̄_i(s', s) + β̄_i(s) )
Initialization: ᾱ_0(s') = 0 for s' = 0, else -∞; terminated code: β̄_N(s) = 0 for s = 0, else -∞; otherwise β̄_N(s) = const.

53 Calculation in the Logarithmic Domain: Jacobian Logarithm
In the recursions, the logarithm of a sum of exponentials occurs:
    max*(x1, x2) = ln( e^{x1} + e^{x2} ) = max(x1, x2) + ln( 1 + e^{-|x1 - x2|} )
Proof:
- For x1 > x2:  ln( e^{x1} + e^{x2} ) = ln( e^{x1}·(1 + e^{x2-x1}) ) = x1 + ln( 1 + e^{-(x1-x2)} )
- For x1 <= x2: ln( e^{x1} + e^{x2} ) = ln( e^{x2}·(1 + e^{x1-x2}) ) = x2 + ln( 1 + e^{-(x2-x1)} )
The second term has a small range between 0 and ln 2 -> it can efficiently be implemented by a lookup table w.r.t. |x1 - x2|.

54 Calculation in the Logarithmic Domain: Jacobian Logarithm
Simplify the logarithm of sums: ln( e^{x1} + e^{x2} ) = max*(x1, x2) = max(x1, x2) + ln( 1 + e^{-|x1 - x2|} )
Forward variable:
    ᾱ_i(s) = ln Σ_{s'} exp( γ̄_i(s', s) + ᾱ_{i-1}(s') ) = max*_{s'} [ γ̄_i(s', s) + ᾱ_{i-1}(s') ]
           = max_{s'} [ γ̄_i(s', s) + ᾱ_{i-1}(s') ] + correction term
Backward variable:
    β̄_{i-1}(s') = max*_{s} [ γ̄_i(s', s) + β̄_i(s) ] = max_{s} [ γ̄_i(s', s) + β̄_i(s) ] + correction term
Declaration:
- Log-MAP: implementation of the BCJR in the log-domain with correction term
- Max-Log-MAP: implementation in the log-domain without correction term
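The Jacobian logarithm and its Max-Log simplification in code (a sketch; np.log1p keeps the correction term numerically stable):

```python
import numpy as np

def max_star(x1, x2):
    """Log-MAP kernel: ln(e^x1 + e^x2) = max(x1,x2) + ln(1 + e^-|x1-x2|)."""
    return np.maximum(x1, x2) + np.log1p(np.exp(-np.abs(x1 - x2)))

def max_log(x1, x2):
    """Max-Log-MAP kernel: correction term dropped."""
    return np.maximum(x1, x2)

print(max_star(1.0, 1.2), np.log(np.exp(1.0) + np.exp(1.2)))  # identical
print(max_log(1.0, 1.2))                                      # 1.2
```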

55 Iterative Decoding
- General structure for parallel concatenated codes
- Turbo decoding for the (24,16,3) product code
- Simulation results
- Turbo decoding for serially concatenated codes

56 General Concept for Iterative Decoding of Parallel Concatenated Codes
[Figure: encoder u -> C1 (parity c1) and u -> Π -> C2 (parity c2); decoder D1 processes the systematic (message) bits y_s and the parity y_1 plus the a priori information L_a,1 = L_e,2 and outputs L_1(u) and the extrinsic information L_e,1; D2 correspondingly works on Π(y_s), y_2 and L_a,2 = Π(L_e,1)]
- A priori information: extrinsic soft-decision estimates for the message bits from the respective other decoder

57 Turbo Decoding for the (24,16,3) Modified Product Code (1)
[Figure: worked example: info bits u, BPSK symbols x, AWGN channel output and channel LLRs L_ch·ȳ at the given SNR]
The vertical extrinsic information serves as horizontal a priori information:
    L_a,1(û) = L_e,2(û),    L(û) = L_ch·ȳ + L_e,2(û)
[Figure: vertical decoding yields the extrinsic information L_e,2(û)]

58 Turbo Decoding for the (24,16,3) Modified Product Code (2)
[Figure: horizontal decoding of L_ch·ȳ + L_a,1(û) yields the horizontal extrinsic information L_e,1(û), which serves as vertical a priori information L_a,2(û) = L_e,1(û); total LLR L_1(û) = L_ch·ȳ + L_e,1(û) + L_a,1(û) and hard decisions û]

59 Turbo Decoding for the (24,16,3) Modified Product Code (3)
[Figure: second iteration: vertical decoding of L_ch·ȳ + L_a,2(u) yields L_e,2(û); horizontal decoding with the updated a priori information follows; total LLR L_2(û) = L_ch·ȳ + L_e,2(û) + L_a,2(û) and hard decisions û]

60 Turbo Decoding for the (24,16,3) Modified Product Code (4)
[Figure: third iteration: vertical and horizontal decoding with a priori information L_a,3(u); total LLR L_3(û) = L_ch·ȳ + L_e,3(û) + L_a,3(û) and hard decisions û_3]

61 Turbo Decoding for Parallel Concatenated Codes
[Figure: decoder D1 receives L_ch·ȳ_s, L_ch·ȳ_1 and L_a,1(û) = L_e,2(û) and outputs L_1(û) = L_ch·ȳ_s + L_a,1(û) + L_e,1(û); D2 receives the interleaved systematic LLRs and L_a,2(û) = Π(L_e,1(û)) and outputs L_2(û) = L_ch·ȳ_s + L_a,2(û) + L_e,2(û); only L_ch·ȳ_2 is fed directly to decoder D2]
- Both decoders estimate the same information word u, and each decoder receives the corresponding channel outputs
- The systematic information bits y_s are fed to D2 via D1 and Π
- Each decoder generates extrinsic information for the bits u, serving as a priori LLRs for the other decoder
- The a priori LLRs improve the decoder performance in each iteration as long as they are statistically independent of the regular inputs

62 Simulation Results for Modified Product Codes
(7,4,3)-Hamming codes
[Figure: P_b versus 10·log10(E_b/N_0) for iterations 1-3 and the analytical bound]
Observations:
- Gains decrease with the number of iterations: the same info bits are estimated and the correlation of the a priori information increases
- With larger interleaver length the gains of subsequent iterations are generally larger: statistical independence of the bits is required

63 Simulation Results for Modified Product Codes
(15,11,3)-Hamming codes
[Figure: P_b versus 10·log10(E_b/N_0) for iterations 1-3 and the analytical bound]
Observations:
- The larger interleaver leads to improved statistics: gains up to iteration 3

64 Simulation Results for Modified Product Codes
(31,26,3)-Hamming codes
[Figure: P_b versus 10·log10(E_b/N_0) for iterations 1-3 and the analytical bound]
Observations:
- The larger interleaver leads to improved statistics: gains up to iteration 3
- For larger SNR the BER flattens: the minimum distance dominates the error rate at high SNR

65 Simulation Results for Modified Product Codes
- Hamming codes have d_min = 3 for all lengths n
- The analyzed product codes therefore have the same d_min: similar error rates versus E_s/N_0
- The code rates are different: longer product codes are better versus E_b/N_0
[Figure: P_b of the (7,4), (15,11) and (31,26) based product codes versus 10·log10(E_s/N_0) and versus 10·log10(E_b/N_0)]

66 Simulation Results for Turbo Codes (L_c = 3)
[Figure: P_b versus 10·log10(E_b/N_0) for a 10x10 and a 30x30 block interleaver, iterations 1-6]
- Gains decrease with the number of iterations
- Increasing the interleaver size leads to reduced BER

67 Simulation Results for Turbo Codes (L_c = 3)
[Figure: P_b for the 900-bit random interleaver with R_c = 1/3 over the iterations; comparison of different interleavers: CC with L_c = 9, block interleavers BIL-100/400/900, random interleavers RIL-900 and RIL-900 with R_c = 1/2]
- Usage of a random interleaver (RIL) leads to significant performance improvements in comparison to the block interleaver (BIL)

68 Turbo Decoding for Serially Concatenated Codes
[Figure: outer encoder C1 -> Π -> systematic inner encoder C2; inner decoder D2 computes L_2(ĉ1) = L_ch·ȳ_s + L_a,2(ĉ1) + L_e,2(ĉ1) using the a priori information L_a,2(ĉ1) = Π(L_e,1(ĉ1)); the outer decoder D1 processes Π^{-1}(L_e,2(ĉ1)) and outputs L(û) as well as L_1(ĉ1) = L_a,1(ĉ1) + L_e,1(ĉ1)]
- The outer decoder receives information only from the inner decoder
- The outer decoder delivers estimates of the information bits u as well as extrinsic LLRs of its code bits c1, which are the information bits of the inner code C2
- The extrinsic LLRs of the code bits c1 serve as a priori LLRs for the inner code C2

69 Comparison of Serial and Parallel Concatenation
[Figure: BER versus E_b/N_0 for serial and parallel concatenation with different interleaver sizes n]
Results are for a specific setup, no generalization possible!

70 Repeat Accumulate Code by ten Brink
- Approximately 100 decoding iterations are needed
- Half-rate outer repetition encoder and rate-one inner recursive convolutional encoder
[Figure: BER versus E_b/N_0]

71 Repeat Accumulate Code by Stephan ten Brink
[Figure: EXIT chart with the decoding trajectory of the repeat accumulate code]

72 EXtrinsic Information Transfer Charts (EXIT Charts)
Introduced by Stephan ten Brink

73 Parallel Concatenation: Mutual Information for the Turbo Decoder
[Figure: turbo decoder with decoders D1 and D2 and interleavers Π, Π^{-1}; mutual information is measured at the a priori inputs and the extrinsic outputs]

74 Mutual Information for a Single Decoder
[Figure: encoder C, BPSK mapping, channel and decoder D; mutual information between the info bits and the decoder input and output LLRs]

75 General Concept of Iterative Turbo Decoding
The BER curve shows three different regions:
- At low SNR the iterative decoding performs worse than uncoded transmission
- At low to medium SNR the iterative decoding is extremely effective: waterfall region
- At high SNR the decoding converges within a few iterations: error floor
How to understand this varying behavior? Extrinsic information is exchanged between the decoders -> analysis of the iterative process by a semi-analytic approach:
- Determine analytically the mutual information I(u; L_a(u)) between the information bits and the a priori input of the decoder
- Determine by simulation the mutual information I(u; L_e(u)) between the information bits and the extrinsic output of the decoder for specific a priori information at the input
- Draw the relationship between both mutual informations
- Combine the diagrams of both contributing decoders into one chart: the EXtrinsic Information Transfer (EXIT) chart

76 Distribution of Extrinsic Information
Investigation of the extrinsic decoder output L_e(û) = L(û) - L_ch·ȳ - L_a(u)
Example: [7,5]-RSC code at E_b/N_0 = 0, 0.5, 1, 1.5, 2 dB
[Figure: pdfs p_e(ξ | x = +1) and p_e(ξ | x = -1) of the extrinsic estimate, shown separately for x = +1 and x = -1]
- The extrinsic information is nearly Gaussian distributed
- With increasing SNR, the mean's absolute value increases and the variance increases
Iterative decoding: with an increasing number of iterations the extrinsic information approaches a Gaussian distribution

77 Analytical Model for the A Priori Information
Extrinsic information of decoder 1 becomes a priori information of decoder 2 and vice versa. For the EXIT analysis the a priori information A = L_a is modeled as
    A = μ_A·x + n_A
i.e., a Gaussian random variable n_A of zero mean and variance σ_A^2 is added to the value x of the transmitted systematic bit, multiplied by μ_A = σ_A^2/2:
    p_A(ξ | x) = 1/sqrt(2π·σ_A^2) · exp( -(ξ - σ_A^2/2·x)^2 / (2σ_A^2) )
The a priori information is thus parameterized by σ_A alone.
[Figure: conditional pdfs for σ_A^2 = 1, 6, 36: with increasing variance the probability functions are more separated and do not overlap anymore]
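Generating a priori LLRs according to this model takes one line; a sketch:

```python
import numpy as np

def apriori_llrs(x, sigma_a, rng):
    """A = mu_A*x + n_A with mu_A = sigma_a^2/2 and n_A ~ N(0, sigma_a^2);
    x holds antipodal bits in {+1, -1}."""
    return 0.5 * sigma_a**2 * x + sigma_a * rng.standard_normal(x.shape)

rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], 5)
print(apriori_llrs(x, sigma_a=2.0, rng=rng))
```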

78 Motivation for Modeling the A Priori Information
The LLR for uncoded transmission over the AWGN channel, y = x + n with n ~ N(0, σ_n^2), is given by
    L(y|x) = ln [ p(y | x = +1) / p(y | x = -1) ] = 2/σ_n^2 · y = L_ch·y = L_ch·(x + n)   with   L_ch = 4·E_s/N_0 = 2/σ_n^2
The LLR is Gaussian distributed with mean μ_A and variance σ_A^2:
    μ_A = E{ L(y|x)·x } = 2/σ_n^2,    σ_A^2 = E{ (2/σ_n^2 · n)^2 } = 4/σ_n^2
The mean's absolute value equals half of the variance -> model for the a priori LLR:
    A = μ_A·x + n_A ~ N( ±σ_A^2/2, σ_A^2 )

79 Mutual Information of A Priori Information and Info Bits
Mutual information between the systematic bits and the a priori LLR:
    I_A(σ_A) = I(X; A) = Σ_{x = ±1} 1/2 ∫ p_A(ξ|x) · log2[ 2·p_A(ξ|x) / (p_A(ξ|+1) + p_A(ξ|-1)) ] dξ
             = 1 - ∫ 1/sqrt(2π·σ_A^2) · exp( -(ξ - σ_A^2/2)^2 / (2σ_A^2) ) · log2( 1 + e^{-ξ} ) dξ
             = 1 - E{ log2( 1 + e^{-ξ} ) } = J(σ_A)
- The integral has to be solved numerically
- J(σ_A) is monotonically increasing in σ_A -> it has a unique inverse function σ_A = J^{-1}(I_A)
Close approximation for the J-function and its inverse:
    J(σ_A) ≈ ( 1 - 2^{-0.3073·σ_A^{1.787}} )^{1.1064},    σ_A = J^{-1}(I_A) ≈ ( -1/0.3073 · log2( 1 - I_A^{1/1.1064} ) )^{1/1.787}
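A sketch of the approximation pair (the three constants are the ones commonly quoted in the EXIT literature and are assumed to match the slide's partly garbled values):

```python
import numpy as np

H1, H2, H3 = 0.3073, 0.8935, 1.1064   # assumed approximation constants

def J(sigma):
    """I_A as a function of sigma_A."""
    return (1.0 - 2.0 ** (-H1 * sigma ** (2 * H2))) ** H3

def J_inv(i_a):
    """sigma_A as a function of I_A."""
    return (-np.log2(1.0 - i_a ** (1.0 / H3)) / H1) ** (1.0 / (2 * H2))

print(J(J_inv(0.5)))   # approx 0.5: the pair is consistent
```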

80 Mutual Information of Extrinsic Information and Info Bits
Mutual information between the systematic bits and the extrinsic LLR:
    I_E = I(X; E) = Σ_{x = ±1} 1/2 ∫ p_E(ξ|x) · log2[ 2·p_E(ξ|x) / (p_E(ξ|+1) + p_E(ξ|-1)) ] dξ
Semi-analytical approach to determine the dependency of the mutual information at decoder input and output:
- Perform encoding for a random information sequence u: c = f(u) and x = 1 - 2c
- Transmit the BPSK signals over the AWGN channel: y = x + n
- For given I_A determine σ_A using the inverse J-function: σ_A = J^{-1}(I_A)
- Model the a priori information using the analytical model: A = μ_A·x + n_A
- Perform decoding of the noisy receive signal y using the a priori information A
- Determine the mutual information I_E of the extrinsic information, using histograms to approximate the pdfs p_E(ξ|x)
The transfer characteristic shows the dependency of I_E on I_A: I_E = Tr(I_A, E_b/N_0)

81 Measurement of the Mutual Information
By application of the ergodic theorem (the expectation is replaced by the time average), the mutual information can be measured for a large number N of samples:
    I(L; X) = 1 - E{ log2( 1 + e^{-x·L} ) } ≈ 1 - 1/N · Σ_{n=1}^{N} log2( 1 + e^{-x_n·L_n} )
Measurement setup:
[Figure: systematic bits u ∈ {0,1} mapped to x ∈ {±1}; the channel LLR L(y|x) with noise variance σ_n^2 and the modeled a priori LLR A ~ N(±σ_A^2/2, σ_A^2) with σ_A^2 = 4/σ_n^2 are fed to the decoder; averaging log2(1 + e^{-x·L_e(x)}) and log2(1 + e^{-x·L_a(x)}) yields I(L_E; X) and I(L_A; X)]
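The time-average measurement itself, applied to the Gaussian a priori model as a self-check; a sketch:

```python
import numpy as np

def mutual_information(x, llr):
    """I(X;L) = 1 - E[log2(1 + exp(-x*L))], estimated over N samples."""
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-x * llr)))

rng = np.random.default_rng(0)
sigma_a = 2.0
x = rng.choice([-1.0, 1.0], 200000)
a = 0.5 * sigma_a**2 * x + sigma_a * rng.standard_normal(x.size)
print(mutual_information(x, a))   # approx 0.49 = J(2.0)
```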

82 Dependency of the Mutual Information at Decoder Input and Output
[Figure: transfer characteristic I_E = I(u; L_e(u)) versus I_A = I(u; L_a(u)) of the rate-1/2 RSC code (37,23_r)_8 for E_b/N_0 = -0.5 ... 3.0 dB in steps of 0.5 dB; the decoder processes L_ch·ȳ and L_a]
Observations:
- I_E increases with growing SNR and growing I_A
- I_A = 0: no a priori information available
- I_A = 1: perfect a priori information, I_E is reliable regardless of the SNR
- For high SNR, nearly no a priori information is required for good decoding results

83 Behavior of Different Convolutional Codes
Transfer characteristic I_E = I(u; L_e(u)) versus I_A = I(u; L_a(u)) if only a priori information is provided to the decoder (cf. serial concatenation)
[Figure: transfer characteristics for convolutional codes with different constraint lengths, e.g. L_c = 3 and L_c = 5]
- Weak codes are better for low a priori information
- Strong codes are better for high a priori information
- The point of intersection of all convolutional codes is close to (0.5, 0.5) (an explanation for this behavior is unknown!)
Serial concatenation: the outer decoder gets only the a priori information from the inner decoder -> the transfer function of the outer decoder is independent of the SNR

84 Comparison of MAP and Max-Log-MAP Decoding
[Figure: transfer characteristics for E_b/N_0 = -1, 0, 1, 2, 3 dB]
- High channel SNR leads to high extrinsic information
- Large a priori information can compensate bad channel conditions
- The Max-Log-MAP decoder performs nearly as well as the optimal MAP decoder

85 EXtrinsic Information Transfer (EXIT) Charts
Extrinsic information provided by one decoder is used as a priori information by the other decoder.
For EXIT charts the transfer functions of both constituent codes are drawn into one diagram, with abscissa and ordinate exchanged for the second code.
Assumptions:
- A large interleaver is assumed to assure statistical independence of I_A and I_E
- For inner decoders in a serially concatenated scheme and for parallel concatenated schemes, the input parameters are L_ch and I_A
- For outer decoders in a serial concatenation, only I_A^(outer) appears as input, which is taken from the interleaved signal I_E^(inner) (the transfer function of the outer decoder is independent of the SNR)

86 EXIT Charts for Serial Concatenation
Pinch-off SNR: minimum SNR for convergence of the turbo decoder
[Figure: EXIT chart with the inner decoder characteristics for several E_b/N_0 values and the outer decoder characteristic]
- Outer non-recursive convolutional encoder (5,3)_8 with R_c = 3/4
- Inner recursive convolutional encoder (3,5_r)_8 with R_c = 2/3

87 EXIT Charts for Serial Concatenation
[Figure: EXIT chart with the decoding trajectory for the same configuration: outer non-recursive convolutional encoder (5,3)_8 with R_c = 3/4, inner recursive convolutional encoder (3,5_r)_8 with R_c = 2/3]

88 EXtrinsic Information Transfer (EXIT) Charts
Outer convolutional code, inner Walsh-Hadamard code
BER estimate from the mutual informations:
    P_b ≈ 1/2 · erfc( sqrt( 8·R_c·E_b/N_0 + [J^{-1}(I_A)]^2 + [J^{-1}(I_E)]^2 ) / (2·sqrt(2)) )
[Figure: EXIT chart; the axes I(u; L_e(u)) of one decoder equal I(u; L_a(u)) of the other]

89 EXtrinsic Information Transfer (EXIT) Charts
Determining the pinch-off SNR: the minimum SNR for which convergence is maintained
[Figure: EXIT chart with a just-open decoding tunnel at 10·log10(E_b/N_0) = -0.3 dB]

90 Code Design for the Half-Rate Repeat-Accumulate Code
[Figure: EXIT chart: the transfer characteristics of the outer repetition code and of the inner recursive convolutional encoder are matched to each other for the targeted signal-to-noise ratio]

91 Bit-Interleaved Coded Modulation
- General structure of the serially concatenated blocks
- Calculation of LLRs
- Simulation results

92 Bit-Interleaved Coded Modulation (BICM)
[Figure: channel encoder -> Π -> mapper -> channel -> demapper -> Π^{-1} -> channel decoder]
Coded transmission with higher order modulation:
- A binary vector of length m is mapped to one of 2^m symbols of the alphabet X
- Usually Gray mapping is employed: it minimizes the bit error probability without channel coding and has good properties regarding the capacity of a BICM system
Interpretation as a serially concatenated system:
- The insertion of the interleaver between encoder and mapper leads to a pseudo-random mapping of the bits onto the specific levels and is crucial for iterative turbo detection
- Iterative detection and decoding: demapper and decoder exchange extrinsic information
Questions: How to perform turbo detection / decoding? Are there better mapping strategies than Gray mapping?

93 Soft-Output Demapping
LLR for each of the m bits (for one specific time instant k):
    L^dem(c_μ) = L(c_μ | y) = ln [ p(y, c_μ = 0) / p(y, c_μ = 1) ]
               = ln [ Σ_{x: c_μ(x)=0} p(y|x)·Pr{x} / Σ_{x: c_μ(x)=1} p(y|x)·Pr{x} ]
with p(y|x) ∝ exp( -|y - x|^2 / σ_n^2 ) and the a priori information provided by the decoder:
    Pr{x} = Π_{ν=1}^{m} Pr{c_ν(x)}   with   Pr{c_ν(x)} = e^{-c_ν(x)·L_a(c_ν)} / (1 + e^{-L_a(c_ν)})

94 Soft-Output Demapping
The denominator of the a priori probabilities cancels when inserted into L^dem:
    L^dem(c_μ) = ln [ Σ_{x: c_μ(x)=0} exp( -|y - x|^2/σ_n^2 ) · Π_ν e^{-c_ν(x)·L_a(c_ν)} / Σ_{x: c_μ(x)=1} exp( -|y - x|^2/σ_n^2 ) · Π_ν e^{-c_ν(x)·L_a(c_ν)} ]
The extrinsic information L^dem_e(c_μ) is independent of the a priori information L_a(c_μ) of the bit c_μ itself:
    L^dem_e(c_μ) = L^dem(c_μ) - L_a(c_μ)
                 = ln [ Σ_{x: c_μ(x)=0} exp( -|y - x|^2/σ_n^2 ) · Π_{ν≠μ} e^{-c_ν(x)·L_a(c_ν)} / Σ_{x: c_μ(x)=1} exp( -|y - x|^2/σ_n^2 ) · Π_{ν≠μ} e^{-c_ν(x)·L_a(c_ν)} ]
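A compact max-log variant of this demapper (a sketch, not the lecture's implementation; constellation, labeling and noise variance are assumed inputs):

```python
import numpy as np

def demap_maxlog(y, const, labels, sigma2, la):
    """Extrinsic max-log LLRs of the m label bits of one received symbol y.
    const: 2^m complex symbols; labels: (2^m, m) 0/1 bit labels;
    la: a priori LLRs L_a(c_nu) of the m bits."""
    # hypothesis metric: channel part plus a priori part (unnormalized)
    metric = -np.abs(y - const) ** 2 / sigma2 - labels @ la
    llr = np.array([metric[labels[:, mu] == 0].max()
                    - metric[labels[:, mu] == 1].max()
                    for mu in range(la.size)])
    return llr - la   # remove the own a priori part -> extrinsic LLRs

# Example: Gray-labeled 4-QAM (illustrative labeling), no a priori knowledge
const = np.array([1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j]) / np.sqrt(2)
labels = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(demap_maxlog(0.9 + 0.8j, const, labels, sigma2=0.5, la=np.zeros(2)))
```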

95 Soft-Output Demapping for 16-QAM
[Figure: LLRs L^dem(c_μ) of selected label bits over the complex plane (Re/Im) of the received symbol for 16-QAM]
    L^dem(c_μ) = ln [ Σ_{x: c_μ(x)=0} exp( -|y - x|^2/σ_n^2 ) · Π_ν Pr{c_ν(x)} / Σ_{x: c_μ(x)=1} exp( -|y - x|^2/σ_n^2 ) · Π_ν Pr{c_ν(x)} ]

96 System Model for BICM
[Figure: transmitter: u -> channel encoder -> Π -> c_k = (c_1[k], ..., c_m[k]) -> mapper -> x[k] -> channel; receiver: y[k] -> soft demapper (outputs L^dem(c_ν), uses a priori L_a(c_ν)) -> Π^{-1} -> channel decoder -> û; the decoder extrinsic LLRs L^dec(c_ν) = L_a(c_ν) are fed back via Π]

97 Selected Bit Mappings for 8-PSK
[Figure: 8-PSK constellations with Gray, natural, d1, d3 and anti-Gray labeling]

98 EXtrinsic Information Transfer Charts
[Figure: demapper transfer characteristics I(c; L^dem) versus the a priori input I(c; L_a^dem) = I(c; L^dec) at E_b/N_0 = 5 dB for Gray, natural, d1, d3 and anti-Gray mapping, together with the characteristic of the BCH(8,4) decoder]
- Detection and decoding performed only once: Gray mapping is best
- Iterative detection and decoding: anti-Gray mapping is best

99 Bit Error Rates
[Figure: BER versus E_b/N_0 for Gray, natural, d1, d3 and anti-Gray mapping after 4 iterations]
Simulation parameters: BCH(8,4), 8-PSK, Alamouti scheme, 36 coded bits per frame, independent Rayleigh fading, channel constant for 4 symbols
- First detection and decoding: Gray good, anti-Gray bad
- After four iterations anti-Gray is best
- Same results as predicted by the EXIT charts

100 Low-Density Parity-Check Codes
- Definition and properties of LDPC codes
- Iterative decoding
- Simulation results

101 LDPC Codes
Low-Density Parity-Check codes
- Invented by Robert G. Gallager in his PhD thesis, 1963
- Re-invented by David J.C. MacKay in 1999
- LDPC codes are linear block codes with a sparse parity check matrix: H contains relatively few 1s spread among many 0s (for binary codes)
- Iteratively decoded on a factor graph of the check matrix
Advantages: good codes, low decoding complexity

102 Introduction
Recall: for every linear binary (n,k) code with code rate R_c = k/n
- There is a generator matrix G ∈ GF(q)^{k×n} such that code words x ∈ GF(q)^n and info words u ∈ GF(q)^k are related by x = u·G
- There is a parity-check matrix H ∈ GF(q)^{m×n} of rank{H} = n-k, such that x·H^T = 0
- Relation of generator and parity check matrix: G·H^T = 0

103 Regular LDPC Codes
Definition: a regular (d_v, d_c) LDPC code of length n is defined by a parity-check matrix H ∈ GF(q)^{m×n} with d_v ones in each column and d_c ones in each row. The dimension of the code (info word length) is k = n - rank{H}.
Example: n = 8, m = 6, k = n - rank{H} = 4 (!), R_C = 1/2, d_v = 3, d_c = 4
[Figure: the corresponding 6x8 parity-check matrix H]

104 Regular LDPC Codes
Design rate: the true rate R_C and the design rate R_d are defined as
    R_C = k/n   and   R_d = 1 - d_v/d_c   with   R_C >= R_d
Proof: the number of ones in the check matrix is m·d_c = n·d_v. Some parity check equations may be redundant, i.e., m >= n-k, and thus
    k/n >= 1 - m/n = 1 - d_v/d_c
The check matrices can be constructed randomly or deterministically.
Encoding: LDPC codes are usually systematically encoded, i.e., by a systematic generator matrix
    G = [ I_k | P_{k×(n-k)} ]
The matrix P can be found by transforming H into another check matrix of the code that has the form
    H' = [ P^T_{(n-k)×k} | I_{n-k} ]

105 Factor Graph
A factor graph of a code is a graphical representation of the code constraints defined by a parity-check matrix of this code: x·H^T = 0.
The factor graph is a bipartite graph with
- a variable node for each code symbol,
- a check node for each check equation,
- an edge between a variable node and a check node if the code symbol participates in the check equation.
Notice that each edge corresponds to one 1 in the check matrix.

106 Factor Graph
Example: x·H^T = [x_0 x_1 ... x_7]·H^T = 0 with six check equations chk_0, ..., chk_5, each being the modulo-2 sum of four code bits (e.g., chk_0: x_2 ⊕ x_3 ⊕ x_4 ⊕ x_5 = 0)
[Figure: factor graph with variable nodes x_0 ... x_7 and check nodes chk_0 ... chk_5]
- n = 8 columns (code word length)
- m = 6 parity check equations
- Each check node represents one row of the parity check matrix

107 Decoding with the Sum-Product Algorithm
Similar to turbo decoding, extrinsic information is exchanged:
- Check nodes collect extrinsic information from the connected variable nodes
- Variable nodes collect extrinsic information from the connected check nodes
Check node example (chk_0: x_2 ⊕ x_3 ⊕ x_4 ⊕ x_5 = 0): the extrinsic message for one bit is the boxplus of the other incoming LLRs, e.g. E_0 = L_3 ⊞ L_4 ⊞ L_5 for bit x_2
Variable node example: L_k = L_ch·y_k + Σ_j E_j, where each outgoing message excludes the contribution of the targeted check node
Iterative decoding procedure, also called message passing or belief propagation; stop if x̂·H^T = 0
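A compact sum-product sketch for a small dense H (assumed interface; the tanh/arctanh clipping avoids infinities):

```python
import numpy as np

def sum_product(H, llr_ch, iterations=10):
    """Belief propagation on a 0/1 parity-check matrix H (m x n) with
    channel LLRs llr_ch; returns total LLRs and hard decisions."""
    m, n = H.shape
    msg_vc = np.tile(llr_ch, (m, 1)) * H             # variable-to-check
    for _ in range(iterations):
        # check node update: boxplus of all incoming messages but one
        t = np.tanh(np.where(H == 1, msg_vc / 2.0, 1.0))
        prod = t.prod(axis=1, keepdims=True)
        msg_cv = 2.0 * np.arctanh(np.clip(prod / t, -0.999999, 0.999999)) * H
        # variable node update: channel LLR plus all other check messages
        total = llr_ch + msg_cv.sum(axis=0)
        msg_vc = (total - msg_cv) * H
        x_hat = (total < 0).astype(int)
        if not np.any((H @ x_hat) % 2):              # all checks fulfilled
            break
    return total, x_hat
```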

108 Decoding with the Sum-Product Algorithm
First check equation: x_2 ⊕ x_3 ⊕ x_4 ⊕ x_5 = 0. Is the check equation fulfilled?
Initialization: L(x_i) = L_ch·y_i for all variable nodes.
Extrinsic information of the first check node:
    L_e(x_2) = L(x_3) ⊞ L(x_4) ⊞ L(x_5)
    L_e(x_3) = L(x_2) ⊞ L(x_4) ⊞ L(x_5)
    L_e(x_4) = L(x_2) ⊞ L(x_3) ⊞ L(x_5)
    L_e(x_5) = L(x_2) ⊞ L(x_3) ⊞ L(x_4)
[Figure: factor graph with the messages of check node chk_0]

109 Decoding with the Sum-Product Algorithm
Second and third check equations analogously: each of the four variables connected to a check node receives the boxplus combination of the LLRs of the three other variables participating in that check.
[Figure: factor graph with the messages of check nodes chk_1 and chk_2]

110 Decoding with the Sum-Product Algorithm
Variable update: collect the extrinsic information of the check nodes and update the variable nodes:
    L(x_i) = L_ch·y_i + A_i   with   A_i = Σ_k E_k
[Figure: factor graph with the accumulated messages at the variable nodes x_0 ... x_7]

111 Example: BEC
Binary erasure channel: L(y) = +∞ for y = Y_0, 0 for y = Δ (erasure), -∞ for y = Y_1
[Figure: BEC transition diagram with erasure probability P_q, and the factor graph; two code bits are erased (L = 0), all others are known (L = ±∞)]

112 Example: BEC
The check equations calculate the extrinsic information: the boxplus of known bits (±∞) is again ±∞, and it is 0 as soon as one participating LLR is 0. An erased bit is therefore recovered as soon as all other bits of one of its checks are known, e.g.
    L_e^chk(x) = L(x') ⊞ L(x'') ⊞ L(x''') = ±∞   if all three LLRs are ±∞
Variable update: the channel LLR and the extrinsic check messages are added, so each erased bit obtains a reliable value once at least one of its checks delivers ±∞.

113 Irregular LDPC Codes
Properties:
- Generalization of regular LDPC codes
- Lower error rates, i.e., better performance
- Irregular number of ones per column and per row
- Variable nodes of different degrees, check nodes of different degrees
[Figure: example parity-check matrix H and the corresponding factor graph with variable nodes x_0 ... x_7 and check nodes chk_0 ... chk_5]

114 Irregular LDPC Codes
Irregular number of ones per column and per row:
- v_i: proportion of left (variable) nodes of degree i
- r_i: proportion of right (check) nodes of degree i
In the example: v_3 = 5/8, v_4 = 1/8, v_5 = 2/8 and r_4 = 3/6, r_5 = 1/6, r_6 = 2/6
Proportions of edges:
- λ_i: proportion of edges incident to left nodes of degree i
- ρ_i: proportion of edges incident to right nodes of degree i
In the example: λ_3 = 15/29, λ_4 = 4/29, λ_5 = 10/29 and ρ_4 = 12/29, ρ_5 = 5/29, ρ_6 = 12/29
[Figure: factor graph of the example code]

115 Irregular LDPC Codes
- LDPC codes are optimized via density evolution or EXIT analysis
- Probability density functions describe the distribution of check and variable node degrees in a parity check matrix
- Specific codes can be found via random code generation following these distributions
- The PDFs will only be nearly fulfilled due to the finite number of checks and variables
- The quality may vary in such an ensemble of codes due to the random generation
Example: R_c = 1/2 LDPC code with n = 4096 and k = 2048
[Table: variable node and check node degree distributions (degree, PDF, number of nodes); the check nodes have degrees 8 and 9]


More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

MACHINE APPLIED MACHINE LEARNING LEARNING. Gaussian Mixture Regression

MACHINE APPLIED MACHINE LEARNING LEARNING. Gaussian Mixture Regression 11 MACHINE APPLIED MACHINE LEARNING LEARNING MACHINE LEARNING Gaussan Mture Regresson 22 MACHINE APPLIED MACHINE LEARNING LEARNING Bref summary of last week s lecture 33 MACHINE APPLIED MACHINE LEARNING

More information

Inductance Calculation for Conductors of Arbitrary Shape

Inductance Calculation for Conductors of Arbitrary Shape CRYO/02/028 Aprl 5, 2002 Inductance Calculaton for Conductors of Arbtrary Shape L. Bottura Dstrbuton: Internal Summary In ths note we descrbe a method for the numercal calculaton of nductances among conductors

More information

ECE559VV Project Report

ECE559VV Project Report ECE559VV Project Report (Supplementary Notes Loc Xuan Bu I. MAX SUM-RATE SCHEDULING: THE UPLINK CASE We have seen (n the presentaton that, for downlnk (broadcast channels, the strategy maxmzng the sum-rate

More information

DC-Free Turbo Coding Scheme Using MAP/SOVA Algorithms

DC-Free Turbo Coding Scheme Using MAP/SOVA Algorithms Proceedngs of the 5th WSEAS Internatonal Conference on Telecommuncatons and Informatcs, Istanbul, Turkey, May 27-29, 26 (pp192-197 DC-Free Turbo Codng Scheme Usng MAP/SOVA Algorthms Prof. Dr. M. Amr Mokhtar

More information

Neuro-Adaptive Design - I:

Neuro-Adaptive Design - I: Lecture 36 Neuro-Adaptve Desgn - I: A Robustfyng ool for Dynamc Inverson Desgn Dr. Radhakant Padh Asst. Professor Dept. of Aerospace Engneerng Indan Insttute of Scence - Bangalore Motvaton Perfect system

More information

Comparison of Regression Lines

Comparison of Regression Lines STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence

More information

Motion Perception Under Uncertainty. Hongjing Lu Department of Psychology University of Hong Kong

Motion Perception Under Uncertainty. Hongjing Lu Department of Psychology University of Hong Kong Moton Percepton Under Uncertanty Hongjng Lu Department of Psychology Unversty of Hong Kong Outlne Uncertanty n moton stmulus Correspondence problem Qualtatve fttng usng deal observer models Based on sgnal

More information

Linear Regression Analysis: Terminology and Notation

Linear Regression Analysis: Terminology and Notation ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented

More information

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016 U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and

More information

This column is a continuation of our previous column

This column is a continuation of our previous column Comparson of Goodness of Ft Statstcs for Lnear Regresson, Part II The authors contnue ther dscusson of the correlaton coeffcent n developng a calbraton for quanttatve analyss. Jerome Workman Jr. and Howard

More information

Outline. Communication. Bellman Ford Algorithm. Bellman Ford Example. Bellman Ford Shortest Path [1]

Outline. Communication. Bellman Ford Algorithm. Bellman Ford Example. Bellman Ford Shortest Path [1] DYNAMIC SHORTEST PATH SEARCH AND SYNCHRONIZED TASK SWITCHING Jay Wagenpfel, Adran Trachte 2 Outlne Shortest Communcaton Path Searchng Bellmann Ford algorthm Algorthm for dynamc case Modfcatons to our algorthm

More information

Maximum Likelihood Estimation of Binary Dependent Variables Models: Probit and Logit. 1. General Formulation of Binary Dependent Variables Models

Maximum Likelihood Estimation of Binary Dependent Variables Models: Probit and Logit. 1. General Formulation of Binary Dependent Variables Models ECO 452 -- OE 4: Probt and Logt Models ECO 452 -- OE 4 Maxmum Lkelhood Estmaton of Bnary Dependent Varables Models: Probt and Logt hs note demonstrates how to formulate bnary dependent varables models

More information

EGR 544 Communication Theory

EGR 544 Communication Theory EGR 544 Communcaton Theory. Informaton Sources Z. Alyazcoglu Electrcal and Computer Engneerng Department Cal Poly Pomona Introducton Informaton Source x n Informaton sources Analog sources Dscrete sources

More information

Lecture 10 Support Vector Machines II

Lecture 10 Support Vector Machines II Lecture 10 Support Vector Machnes II 22 February 2016 Taylor B. Arnold Yale Statstcs STAT 365/665 1/28 Notes: Problem 3 s posted and due ths upcomng Frday There was an early bug n the fake-test data; fxed

More information

Uncertainty in measurements of power and energy on power networks

Uncertainty in measurements of power and energy on power networks Uncertanty n measurements of power and energy on power networks E. Manov, N. Kolev Department of Measurement and Instrumentaton, Techncal Unversty Sofa, bul. Klment Ohrdsk No8, bl., 000 Sofa, Bulgara Tel./fax:

More information

Notes on Frequency Estimation in Data Streams

Notes on Frequency Estimation in Data Streams Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to

More information

Statistical pattern recognition

Statistical pattern recognition Statstcal pattern recognton Bayes theorem Problem: decdng f a patent has a partcular condton based on a partcular test However, the test s mperfect Someone wth the condton may go undetected (false negatve

More information

4DVAR, according to the name, is a four-dimensional variational method.

4DVAR, according to the name, is a four-dimensional variational method. 4D-Varatonal Data Assmlaton (4D-Var) 4DVAR, accordng to the name, s a four-dmensonal varatonal method. 4D-Var s actually a drect generalzaton of 3D-Var to handle observatons that are dstrbuted n tme. The

More information

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement Markov Chan Monte Carlo MCMC, Gbbs Samplng, Metropols Algorthms, and Smulated Annealng 2001 Bonformatcs Course Supplement SNU Bontellgence Lab http://bsnuackr/ Outlne! Markov Chan Monte Carlo MCMC! Metropols-Hastngs

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Time-Varying Systems and Computations Lecture 6

Time-Varying Systems and Computations Lecture 6 Tme-Varyng Systems and Computatons Lecture 6 Klaus Depold 14. Januar 2014 The Kalman Flter The Kalman estmaton flter attempts to estmate the actual state of an unknown dscrete dynamcal system, gven nosy

More information

Quantum and Classical Information Theory with Disentropy

Quantum and Classical Information Theory with Disentropy Quantum and Classcal Informaton Theory wth Dsentropy R V Ramos rubensramos@ufcbr Lab of Quantum Informaton Technology, Department of Telenformatc Engneerng Federal Unversty of Ceara - DETI/UFC, CP 6007

More information

EEE 241: Linear Systems

EEE 241: Linear Systems EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they

More information

Tutorial 2. COMP4134 Biometrics Authentication. February 9, Jun Xu, Teaching Asistant

Tutorial 2. COMP4134 Biometrics Authentication. February 9, Jun Xu, Teaching Asistant Tutoral 2 COMP434 ometrcs uthentcaton Jun Xu, Teachng sstant csjunxu@comp.polyu.edu.hk February 9, 207 Table of Contents Problems Problem : nswer the questons Problem 2: Power law functon Problem 3: Convoluton

More information

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number

More information

Section 8.3 Polar Form of Complex Numbers

Section 8.3 Polar Form of Complex Numbers 80 Chapter 8 Secton 8 Polar Form of Complex Numbers From prevous classes, you may have encountered magnary numbers the square roots of negatve numbers and, more generally, complex numbers whch are the

More information

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.

More information

Transform Coding. Transform Coding Principle

Transform Coding. Transform Coding Principle Transform Codng Prncple of block-wse transform codng Propertes of orthonormal transforms Dscrete cosne transform (DCT) Bt allocaton for transform coeffcents Entropy codng of transform coeffcents Typcal

More information

Errors for Linear Systems

Errors for Linear Systems Errors for Lnear Systems When we solve a lnear system Ax b we often do not know A and b exactly, but have only approxmatons  and ˆb avalable. Then the best thng we can do s to solve ˆx ˆb exactly whch

More information

Lecture 5 Decoding Binary BCH Codes

Lecture 5 Decoding Binary BCH Codes Lecture 5 Decodng Bnary BCH Codes In ths class, we wll ntroduce dfferent methods for decodng BCH codes 51 Decodng the [15, 7, 5] 2 -BCH Code Consder the [15, 7, 5] 2 -code C we ntroduced n the last lecture

More information

Basically, if you have a dummy dependent variable you will be estimating a probability.

Basically, if you have a dummy dependent variable you will be estimating a probability. ECON 497: Lecture Notes 13 Page 1 of 1 Metropoltan State Unversty ECON 497: Research and Forecastng Lecture Notes 13 Dummy Dependent Varable Technques Studenmund Chapter 13 Bascally, f you have a dummy

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function Advanced Scence and Technology Letters, pp.83-87 http://dx.do.org/10.14257/astl.2014.53.20 A Partcle Flter Algorthm based on Mxng of Pror probablty densty and UKF as Generate Importance Functon Lu Lu 1,1,

More information

Appendix B: Resampling Algorithms

Appendix B: Resampling Algorithms 407 Appendx B: Resamplng Algorthms A common problem of all partcle flters s the degeneracy of weghts, whch conssts of the unbounded ncrease of the varance of the mportance weghts ω [ ] of the partcles

More information

Contents 1 Basics of Convolutional Coding.

Contents 1 Basics of Convolutional Coding. Contents 1 Bascs of Convolutonal Codng 1 Φ 11 Relatng G 1 wth the mpulse responses g j 1 ;j 12 TransformAnalyssn ConvolutonalCodes context 4 13 Constructng a convolutonal encoder structure from the generator

More information

Week3, Chapter 4. Position and Displacement. Motion in Two Dimensions. Instantaneous Velocity. Average Velocity

Week3, Chapter 4. Position and Displacement. Motion in Two Dimensions. Instantaneous Velocity. Average Velocity Week3, Chapter 4 Moton n Two Dmensons Lecture Quz A partcle confned to moton along the x axs moves wth constant acceleraton from x =.0 m to x = 8.0 m durng a 1-s tme nterval. The velocty of the partcle

More information

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin

LOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin Proceedngs of the 007 Wnter Smulaton Conference S G Henderson, B Bller, M-H Hseh, J Shortle, J D Tew, and R R Barton, eds LOW BIAS INTEGRATED PATH ESTIMATORS James M Calvn Department of Computer Scence

More information

Hidden Markov Models & The Multivariate Gaussian (10/26/04)

Hidden Markov Models & The Multivariate Gaussian (10/26/04) CS281A/Stat241A: Statstcal Learnng Theory Hdden Markov Models & The Multvarate Gaussan (10/26/04) Lecturer: Mchael I. Jordan Scrbes: Jonathan W. Hu 1 Hdden Markov Models As a bref revew, hdden Markov models

More information

2016 Wiley. Study Session 2: Ethical and Professional Standards Application

2016 Wiley. Study Session 2: Ethical and Professional Standards Application 6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton

More information

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud Resource Allocaton wth a Budget Constrant for Computng Independent Tasks n the Cloud Wemng Sh and Bo Hong School of Electrcal and Computer Engneerng Georga Insttute of Technology, USA 2nd IEEE Internatonal

More information

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl RECURSIVE SPLINE INTERPOLATION METHOD FOR REAL TIME ENGINE CONTROL APPLICATIONS A. Stotsky Volvo Car Corporaton Engne Desgn and Development Dept. 97542, HA1N, SE- 405 31 Gothenburg Sweden. Emal: astotsky@volvocars.com

More information

Limited Dependent Variables

Limited Dependent Variables Lmted Dependent Varables. What f the left-hand sde varable s not a contnuous thng spread from mnus nfnty to plus nfnty? That s, gven a model = f (, β, ε, where a. s bounded below at zero, such as wages

More information

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix Lectures - Week 4 Matrx norms, Condtonng, Vector Spaces, Lnear Independence, Spannng sets and Bass, Null space and Range of a Matrx Matrx Norms Now we turn to assocatng a number to each matrx. We could

More information

Supporting Information

Supporting Information Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to

More information

Problem Set 9 Solutions

Problem Set 9 Solutions Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem

More information

Outline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique

Outline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique Outlne and Readng Dynamc Programmng The General Technque ( 5.3.2) -1 Knapsac Problem ( 5.3.3) Matrx Chan-Product ( 5.3.1) Dynamc Programmng verson 1.4 1 Dynamc Programmng verson 1.4 2 Dynamc Programmng

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

Lecture 4: Universal Hash Functions/Streaming Cont d

Lecture 4: Universal Hash Functions/Streaming Cont d CSE 5: Desgn and Analyss of Algorthms I Sprng 06 Lecture 4: Unversal Hash Functons/Streamng Cont d Lecturer: Shayan Oves Gharan Aprl 6th Scrbe: Jacob Schreber Dsclamer: These notes have not been subjected

More information

Lecture 6: Introduction to Linear Regression

Lecture 6: Introduction to Linear Regression Lecture 6: Introducton to Lnear Regresson An Manchakul amancha@jhsph.edu 24 Aprl 27 Lnear regresson: man dea Lnear regresson can be used to study an outcome as a lnear functon of a predctor Example: 6

More information

Prof. Dr. I. Nasser Phys 630, T Aug-15 One_dimensional_Ising_Model

Prof. Dr. I. Nasser Phys 630, T Aug-15 One_dimensional_Ising_Model EXACT OE-DIMESIOAL ISIG MODEL The one-dmensonal Isng model conssts of a chan of spns, each spn nteractng only wth ts two nearest neghbors. The smple Isng problem n one dmenson can be solved drectly n several

More information

Negative Binomial Regression

Negative Binomial Regression STATGRAPHICS Rev. 9/16/2013 Negatve Bnomal Regresson Summary... 1 Data Input... 3 Statstcal Model... 3 Analyss Summary... 4 Analyss Optons... 7 Plot of Ftted Model... 8 Observed Versus Predcted... 10 Predctons...

More information

Transfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system

Transfer Functions. Convenient representation of a linear, dynamic model. A transfer function (TF) relates one input and one output: ( ) system Transfer Functons Convenent representaton of a lnear, dynamc model. A transfer functon (TF) relates one nput and one output: x t X s y t system Y s The followng termnology s used: x y nput output forcng

More information

Chapter 3 Describing Data Using Numerical Measures

Chapter 3 Describing Data Using Numerical Measures Chapter 3 Student Lecture Notes 3-1 Chapter 3 Descrbng Data Usng Numercal Measures Fall 2006 Fundamentals of Busness Statstcs 1 Chapter Goals To establsh the usefulness of summary measures of data. The

More information

Quantifying Uncertainty

Quantifying Uncertainty Partcle Flters Quantfyng Uncertanty Sa Ravela M. I. T Last Updated: Sprng 2013 1 Quantfyng Uncertanty Partcle Flters Partcle Flters Appled to Sequental flterng problems Can also be appled to smoothng problems

More information

Probability-Theoretic Junction Trees

Probability-Theoretic Junction Trees Probablty-Theoretc Juncton Trees Payam Pakzad, (wth Venkat Anantharam, EECS Dept, U.C. Berkeley EPFL, ALGO/LMA Semnar 2/2/2004 Margnalzaton Problem Gven an arbtrary functon of many varables, fnd (some

More information

Statistical Inference. 2.3 Summary Statistics Measures of Center and Spread. parameters ( population characteristics )

Statistical Inference. 2.3 Summary Statistics Measures of Center and Spread. parameters ( population characteristics ) Ismor Fscher, 8//008 Stat 54 / -8.3 Summary Statstcs Measures of Center and Spread Dstrbuton of dscrete contnuous POPULATION Random Varable, numercal True center =??? True spread =???? parameters ( populaton

More information

Report on Image warping

Report on Image warping Report on Image warpng Xuan Ne, Dec. 20, 2004 Ths document summarzed the algorthms of our mage warpng soluton for further study, and there s a detaled descrpton about the mplementaton of these algorthms.

More information

CSE4210 Architecture and Hardware for DSP

CSE4210 Architecture and Hardware for DSP 4210 Archtecture and Hardware for DSP Lecture 1 Introducton & Number systems Admnstratve Stuff 4210 Archtecture and Hardware for DSP Text: VLSI Dgtal Sgnal Processng Systems: Desgn and Implementaton. K.

More information

Resource Allocation and Decision Analysis (ECON 8010) Spring 2014 Foundations of Regression Analysis

Resource Allocation and Decision Analysis (ECON 8010) Spring 2014 Foundations of Regression Analysis Resource Allocaton and Decson Analss (ECON 800) Sprng 04 Foundatons of Regresson Analss Readng: Regresson Analss (ECON 800 Coursepak, Page 3) Defntons and Concepts: Regresson Analss statstcal technques

More information

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal

Inner Product. Euclidean Space. Orthonormal Basis. Orthogonal Inner Product Defnton 1 () A Eucldean space s a fnte-dmensonal vector space over the reals R, wth an nner product,. Defnton 2 (Inner Product) An nner product, on a real vector space X s a symmetrc, blnear,

More information

Homework Assignment 3 Due in class, Thursday October 15

Homework Assignment 3 Due in class, Thursday October 15 Homework Assgnment 3 Due n class, Thursday October 15 SDS 383C Statstcal Modelng I 1 Rdge regresson and Lasso 1. Get the Prostrate cancer data from http://statweb.stanford.edu/~tbs/elemstatlearn/ datasets/prostate.data.

More information

Hidden Markov Models

Hidden Markov Models CM229S: Machne Learnng for Bonformatcs Lecture 12-05/05/2016 Hdden Markov Models Lecturer: Srram Sankararaman Scrbe: Akshay Dattatray Shnde Edted by: TBD 1 Introducton For a drected graph G we can wrte

More information

The Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD

The Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD he Gaussan classfer Nuno Vasconcelos ECE Department, UCSD Bayesan decson theory recall that we have state of the world X observatons g decson functon L[g,y] loss of predctng y wth g Bayes decson rule s

More information

Hidden Markov Models

Hidden Markov Models Hdden Markov Models Namrata Vaswan, Iowa State Unversty Aprl 24, 204 Hdden Markov Model Defntons and Examples Defntons:. A hdden Markov model (HMM) refers to a set of hdden states X 0, X,..., X t,...,

More information

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson

More information