Channel Coding II


Channel Coding
Dr.-Ing. Dirk Wübben
Institute for Telecommunications and High-Frequency Techniques
Department of Communications Engineering
Room: N3, Phone: 4/8-6385, wuebben@ant.uni-bremen.de
Lecture: Tuesday, 8:30 - 10:00 in N33
Exercise: Wednesday, 14:00 - 16:00 in N4 (dates for exercises will be announced during lectures)
Tutor: Shayan Hassanpour, Room: N39, Phone: 8-6387, hassanpour@ant.uni-bremen.de
www.ant.uni-bremen.de/courses/cc2/

Outline Channel Coding II
1. Concatenated Codes
   Serial concatenation & parallel concatenation (Turbo Codes)
   Iterative decoding with soft-in/soft-out decoding algorithms
   EXIT charts, BICM, LDPC codes
2. Trellis-Coded Modulation (TCM)
   Motivation by information theory
   TCM of Ungerböck, pragmatic approach by Viterbi, multilevel codes
   Distance properties and error rate performance
   Applications (data transmission via modems)
3. Adaptive Error Control
   Automatic Repeat Request (ARQ)
   Performance for perfect and disturbed feedback channel
   Hybrid FEC/ARQ schemes

Chapter 1: Concatenated Codes
Introduction: serial and parallel concatenation, interleaving
Serial concatenation: direct approach, product codes, choice of component codes
Parallel concatenation: modification of product codes, Turbo Codes, choice of component codes, distance properties and performance approximation
Decoding of concatenated codes: definition of soft information, L-algebra, general approach for soft-output decoding, BCJR algorithm, iterative decoding, general concept of iterative decoding
EXtrinsic Information Transfer (EXIT) charts
Bit-Interleaved Coded Modulation (BICM)
Low Density Parity Check (LDPC) codes

Introduction
Achieving Shannon's channel capacity is the general goal of coding theory:
- The block and convolutional codes of CC-I are far away from this limit.
- The decoding effort increases (exponentially) with performance.
- It is questionable whether Shannon's limit can be achieved by these codes at all.
Concatenation of codes:
- Forney (1966): proposed the combination of simple codes.
- Berrou, Glavieux, Thitimajshima (1993): Turbo Codes, a clever parallel concatenation of two convolutional codes achieving only 0.5 dB loss to channel capacity at P_b = 10^-5.
(Claude E. Shannon, David Forney, Claude Berrou, Alain Glavieux, Punya Thitimajshima)
Principal idea: clever concatenation of simple codes in order to generate a total code with high performance while still enabling efficient decoding.
Example: a convolutional code with L_c = 9 has 2^8 = 256 states; two convolutional codes with L_c = 3 have 2 * 2^2 = 8 states, a complexity reduction by a factor of 32; with repeated decoding (6 iterations), 6 * 8 = 48 states, still a reduction by a factor of 5.

Serial and Parallel Code Concatenation
Serial code concatenation (outer code C1, inner code C2, with decoders D2, D1): each subsequent encoder obtains the whole output stream of the previous encoder, i.e. the redundancy bits are also encoded.
Parallel code concatenation (C1, ..., Cq): each encoder obtains only the information bits; a parallel-serial converter generates the serial data stream. Example: Turbo Codes.

Interleaving
The interleaver performs a permutation of the symbol sequence:
- strong impact on the performance of concatenated codes
- also used to split burst errors into single errors for fading channels
Block interleaver: column-wise write-in, but row-wise read-out leads to a permutation of the symbol sequence. Interleaving depth L_I = 5: neighboring symbols of the input stream have a distance of 5 in the output stream, given by the number of columns.
input sequence:  x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, x13, x14
output sequence: x0, x3, x6, x9, x12, x1, x4, x7, x10, x13, x2, x5, x8, x11, x14
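The write/read rule of the block interleaver can be sketched in a few lines (a minimal sketch assuming the 3-rows-by-5-columns geometry of the example; the function name is our own):

```python
# Block interleaver: write column-wise, read row-wise.
def block_interleave(seq, rows, cols):
    assert len(seq) == rows * cols
    # write column-wise: matrix entry (r, c) receives seq[c * rows + r]
    matrix = [[seq[c * rows + r] for c in range(cols)] for r in range(rows)]
    # read row-wise
    return [sym for row in matrix for sym in row]

symbols = [f"x{i}" for i in range(15)]
print(block_interleave(symbols, rows=3, cols=5))
# neighboring input symbols end up 5 positions apart (interleaving depth L_I = 5)
```

Deinterleaving is the same operation with rows and columns swapped.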

Interleaving
Assumption: burst errors of length b should be separated. Aspects of dimensioning a block interleaver:
- Number of columns: directly determines the interleaver depth L_I. L_I >= b is required so that a burst error of length b is broken into single errors by Π.
- Number of rows: for a convolutional code with L_c = 5, five successive code words are correlated; for R_c = 1/2, ten successive code bits are correlated. In order to separate these ten bits (by L_I each, to protect them from burst errors), the number of rows should correspond to L_c / R_c = 10.
- Time delay (latency): the memory is read out only after the whole memory has been written. Notice: for duplex speech communication only an overall delay of 125 ms is tolerable. Example: data rate 9.6 kbit/s and interleaver size 400 bits give Δt = 2 * 400 / (9600 1/s) = 83.3 ms.

Interleaving
Convolutional interleaver: consists of N registers and multiplexers; each register stores L symbols more than the previous register. The principle is similar to the block interleaver.
Random interleaver: the block interleaver has a regular structure, so the output distance is directly given by the input distance, leading to bad distance properties for Turbo Codes. Random interleavers are constructed as block interleavers where the data positions are determined randomly; a pseudo-random generator can be utilized for constructing these interleavers.

Serial Code Concatenation: Direct Approach
Concatenation of a (3,2,2)-SPC and a (4,3,2)-SPC code: R_c = 2/4 = 1/2, d_min = 2.
Concatenation of a (4,3,2)-SPC and a (7,4,3)-Hamming code: R_c = 3/7; original concatenation: d_min = 3; optimized concatenation: d_min = 4.
(Tables listing the code words c1, c2 and their Hamming weights w_H(c2) for all information words u are omitted here.)
Concatenation does not automatically result in a code with larger distance!

Serial Code Concatenation: Product Codes
Information bits are arranged in a (k_V, k_H) matrix u:
- row-wise encoding with a systematic (n_H, k_H, d_H) code C_H of rate k_H/n_H: each row contains a code word
- column-wise encoding with a systematic (n_V, k_V, d_V) code C_V of rate k_V/n_V: each column contains a code word
- the parity block is completed by "checks on checks"
The product code corresponds to the serial concatenation C_H, block interleaver Π, C_V.
Entire code rate: R_c = (k_H k_V) / (n_H n_V) = R_c,H * R_c,V
Minimum Hamming distance: d_min = d_min,H * d_min,V
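As a sketch of this construction, the following encodes a product code from two single parity check component codes, matching the (12,6,4) example of the next slide (horizontal (3,2,2)-SPC, vertical (4,3,2)-SPC); the helper names are our own:

```python
# Product-code encoder from two systematic SPC component codes.
def spc_encode(bits):
    # systematic SPC: append one even-parity bit
    return bits + [sum(bits) % 2]

def product_encode(u_rows):
    # u_rows: k_V x k_H information matrix (here 3 x 2)
    rows = [spc_encode(r) for r in u_rows]                     # horizontal encoding
    cols = [[r[j] for r in rows] for j in range(len(rows[0]))]
    full_cols = [spc_encode(c) for c in cols]                  # vertical encoding, incl. checks on checks
    # return the n_V x n_H code matrix
    return [[full_cols[j][i] for j in range(len(full_cols))] for i in range(len(full_cols[0]))]

c = product_encode([[1, 0], [0, 1], [1, 1]])
# every row and every column of c now has even parity
```

Because every row of the information block has even row parity, the "checks on checks" row automatically has even parity as well, which is why the construction is consistent.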

Serial Code Concatenation: Examples of Product Codes
(12,6,4) product code; horizontal: (3,2,2)-SPC code, vertical: (4,3,2)-SPC code:
- code rate 1/2, d_min = 2 * 2 = 4
- correction of 1 error and detection of 3 errors possible
- the interleaver combines 3 info words: increase of the effective block length
(28,12,6) product code; horizontal: (4,3,2)-SPC code, vertical: (7,4,3)-Hamming code:
- d_min = 2 * 3 = 6, correction of 2 errors possible
- the shown error pattern exceeds the detection capability of the horizontal code alone
(Code-word matrices with bit positions x0, x1, ... omitted.)

Parallel Code Concatenation: Modified Product Codes
Information bits u are encoded row-wise and column-wise with C_H and C_V, respectively. The parity check bits of the component codes are not encoded twice (no checks on checks).
Entire code rate: R_c = (k_H k_V) / (n_H n_V - (n_H - k_H)(n_V - k_V)) = 1 / (1/R_c,H + 1/R_c,V - 1)
Minimum Hamming distance: d_min = d_min,H + d_min,V - 1

Parallel Code Concatenation: Examples
Modified (11,6,3) product code; horizontal: (3,2,2) SPC code, vertical: (4,3,2) SPC code:
- code rate 6/11, d_min = 2 + 2 - 1 = 3, 1 error correctable
Modified (25,12,4) product code; horizontal: (4,3,2) SPC code, vertical: (7,4,3) Hamming code:
- d_min = 2 + 3 - 1 = 4, 1 error correctable
(Code-word matrices omitted.)

Union Bound on Bit Error Rate for Product Codes
Product codes using the same (n,k,3) Hamming code in both dimensions: (7,4), (15,11), (31,26). Taking into account only the minimum distance d_min = 3 + 3 - 1 = 5, the results are only valid for high signal-to-noise ratios.
(Figures: P_b versus 10 log10(E_s/N_0) and versus 10 log10(E_b/N_0).)

Parallel Code Concatenation: Turbo Codes
General structure with q constituent codes (u -> Π_i -> C_i -> c_i); special case with 2 constituent codes.
- presented in 1993 by Berrou, Glavieux, Thitimajshima
- the interleaver Π_1 is neglectable (the first encoder may operate on u_1 = u directly)
- the information bits are generally not punctured
- code rate: R_c = 1 / (1/R_c,1 + 1/R_c,2 - 1)

Potential of Turbo Codes
Comparison of convolutional codes (L_c = 3, 5, 7, 9) and a turbo code (TC) for R_c = 1/2:
- optimized interleaver of length 256 x 256 = 65536 bits
- for this interleaver, a gain of nearly 3 dB over the convolutional code with L_c = 9
- gap to Shannon's channel capacity only 0.5 dB (C = 0.5 at E_b/N_0 = 0.19 dB)
- tremendous performance loss for smaller interleavers
- world record: 0.08 dB gap to the Shannon capacity by Stephan ten Brink
(Figure: P_b versus 10 log10(E_b/N_0).)

Influence of Constituent Codes
Systematic recursive convolutional encoders are employed in turbo codes:
- the constituent codes generate only parity bits
- conventionally codes with small constraint length (3 <= L_c <= 5) and rate R_c = 1/n are used (codes of larger rate can be achieved by puncturing)
The error probability depends on the interleaver size L_π and on the minimum input weight w_min of the constituent encoders that leads to finite output weight: P_b ~ L_π^(1 - w_min).
- only recursive encoders require at least w_min = 2 for finite output weight
- an interleaving gain is only achievable for recursive encoders, due to P_b ~ L_π^(-1)
- nonrecursive encoders with w_min = 1 do not gain from enlarging the interleaver size (P_b ~ L_π^0)
RSC encoders are used as constituent codes; the performance improves with the length of the interleaver!

Influence of Constituent Codes
Instead of the free distance d_f, the effective distance d_eff = w_min + 2 c_min is crucial.
Interpretation: turbo codes are systematic codes:
- the total weight of a code word depends on the weight of the information bits, w_min = 2
- c_min denotes the minimum weight of the parity bits of one encoder for input weight w_min = 2
- assuming identical constituent codes, the minimum weight for w_min = 2 is given by d_eff
Consequence: suitable constituent codes should maximize the parity weight for input weight w_min = 2. This aim is achieved if the feedback polynomial of the constituent encoders is prime: the shift register then generates a sequence of maximum length (m-sequence), which may have larger weight than shorter sequences.
The feedback polynomial of the constituent encoders should be prime!

Example of Turbo Code with 2 Codes (L_c = 3), R_c = 1/2
Two identical RSC encoders with generators g_1 = 5_8 and g_2 = 7_8, connected by the interleaver Π; the parity sequences are punctured with P = [1 0; 0 1] (alternating parity bits) to obtain R_c = 1/2.
(Encoder block diagram with shift registers omitted.)

Example of Turbo Code with 2 Codes (L_c = 3), R_c = 1/2
Recursive polynomial g(D) = 1 + D + D^2: g(D) is prime, since g(0) = 1 and g(1) = 1 + 1 + 1 = 1. The shift register achieves a sequence of maximum length (m-sequence) with L = 2^2 - 1 = 3. Maximum effective distance: d_eff,max = w_min + 2 (L + 1) = 2 + 2 * 4 = 10.
Recursive polynomial g(D) = 1 + D^2 = (1 + D)(1 + D): non-prime. The shift register generates a sequence of length L = 2 only. Maximum effective distance: d_eff,max = w_min + 2 (L + 1) = 2 + 2 * 3 = 8.
The feedback polynomial g(D) = 1 + D^2 would lead to a degraded performance!
(Trellis diagrams and example sequences u, c omitted.)
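The m-sequence property can be checked by feeding a single 1 into the recursive part of the encoder: with the prime feedback 1 + D + D^2 the feedback sequence repeats with period 2^2 - 1 = 3, with the non-prime 1 + D^2 only with period 2. A small sketch (the tap representation is our own choice):

```python
# Impulse response of the feedback shift register of an RSC encoder,
# illustrating why a prime feedback polynomial yields an m-sequence.
def feedback_sequence(taps, n):
    # taps: feedback coefficients for D^1, D^2, ... (e.g. [1, 1] for 1 + D + D^2)
    state = [0] * len(taps)
    out = []
    for k in range(n):
        u = 1 if k == 0 else 0                       # single impulse at k = 0
        a = (u + sum(t * s for t, s in zip(taps, state))) % 2
        out.append(a)
        state = [a] + state[:-1]                     # shift register update
    return out

print(feedback_sequence([1, 1], 9))  # 1 + D + D^2: 1,1,0 repeating (period 3)
print(feedback_sequence([0, 1], 9))  # 1 + D^2:     1,0   repeating (period 2)
```

The longer period means a weight-2 input needs a larger spacing of its two ones to terminate the parity sequence, which is exactly what raises c_min and hence d_eff.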

Example of Turbo Code with 2 Codes (L_c = 5), R_c = 2/3
Constituent RSC encoders with generators g_1 = 23_8 and g_2 = 35_8 (four memory elements each), connected by the interleaver Π; the parity bits are punctured (matrix P) to obtain R_c = 2/3.
(Encoder block diagram omitted.)

LTE Turbo Code with 2 Codes (L_c = 4)
Constituent RSC encoders with g_1 = 1 + D^2 + D^3 = 13_8 (feedback) and g_2 = 1 + D + D^3 = 15_8, connected by the interleaver Π and followed by rate matching.
(Encoder block diagram omitted.)

Influence of the Interleaver
P_b ≈ Σ_d c_d * (1/2) erfc( sqrt( d * R_c * E_b/N_0 ) )
Avoiding output sequences with low Hamming weight at both encoders:
- if the output c_1 of C_1 has low Hamming weight, the permutation of the input sequence u for C_2 should result in an output sequence c_2 with high Hamming weight
- higher total average Hamming weight / Hamming distance d: the interleaver directly influences the minimum distance
- the number of sequences with low weight is reduced due to interleaving, giving small coefficients c_d (c_d: total number of nonzero info bits associated with code sequences of Hamming weight d)
- this is even more important than the minimum distance, which acts only asymptotically
The randomness of the interleaver is important: simple block interleavers perform badly due to their symmetry; pseudo-random interleavers are much better ("random codes", Shannon).
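The union bound above is easy to evaluate numerically once a distance spectrum {d: c_d} is known. A sketch with a purely hypothetical toy spectrum (the function name and the numbers are our own, chosen only to show the mechanics):

```python
# Union bound P_b ≈ sum_d c_d * 0.5 * erfc(sqrt(d * Rc * Eb/N0)).
import math

def union_bound_ber(spectrum, rate, ebn0_db):
    ebn0 = 10.0 ** (ebn0_db / 10.0)           # dB -> linear
    return sum(c_d * 0.5 * math.erfc(math.sqrt(d * rate * ebn0))
               for d, c_d in spectrum.items())

# hypothetical spectrum: a few low-weight terms dominate at high SNR
spectrum = {10: 0.05, 12: 0.3, 14: 1.2}
for snr_db in (1.0, 2.0, 3.0):
    print(snr_db, union_bound_ber(spectrum, rate=0.5, ebn0_db=snr_db))
```

Note how the term with the smallest d (and its coefficient c_d) dominates as the SNR grows, which is the asymptotic role of the minimum distance discussed on the slide.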

Distance Properties of Turbo Codes: Definitions
General IOWEF (Input Output Weight Enumerating Function) of an encoder:
A(W,D) = Σ_{w=0..k} Σ_{d=0..n} A_{w,d} W^w D^d
where A_{w,d} is the number of code words with input weight w and output weight d.
Conditioned IOWEFs (specific input weight w or specific output weight d):
A(w,D) = Σ_{d=0..n} A_{w,d} D^d,   A(W,d) = Σ_{w=0..k} A_{w,d} W^w
Important for parallel concatenation: the weight c of the parity bits:
A(W,C) = Σ_w Σ_c A_{w,c} W^w C^c with d = w + c
Corresponding conditioned IOWEF: A(w,C) = Σ_c A_{w,c} C^c
All encoders have the same input weight w; since the encoders generate only parity bits, the weight c of the parity bits is considered.

Distance Properties of Turbo Codes: Uniform Interleaver
Problem: a concrete interleaver would have to be considered for the distance spectrum / IOWEF, and the determination of the IOWEF is computationally expensive.
Uniform interleaver (UI): a theoretic device comprising all possible permutations. For an input of weight w and length L_π, each of the (L_π choose w) permutations occurs with the same probability; e.g. for w = 2 and L_π = 4 there are (4 choose 2) = 6 possibilities, each with probability 1/6.
The UI provides the average distance spectrum (including good and bad interleavers).

Distance Properties of Turbo Codes: Results
Parallel concatenation: both encoders have the same input weight w, and the weights c_1 and c_2 of the encoder outputs are added. The product A_1(w,C) * A_2(w,C) combines output sequences with the same input weight w and covers all possible combinations of output sequences (uniform interleaver); the denominator performs the averaging w.r.t. the number of permutations of w ones in the length L_π:
A_par(w,C) = A_1(w,C) * A_2(w,C) / (L_π choose w),   with d = w + c
Serial concatenation: the output weight l of the outer encoder equals the input weight of the inner encoder:
A_ser(W,D) = Σ_w Σ_d A_ser_{w,d} W^w D^d with A_ser_{w,d} = Σ_l A_1_{w,l} * A_2_{l,d} / (L_π choose l)

Distance Properties of Turbo Codes
Codes: turbo code with g_1 = 5_8, g_2 = 7_8, and a convolutional code (CC) with L_c = 9; both with R_c = 1/3.
Observations (figure: coefficients c_d versus distance d, for two interleaver lengths L_π):
- with the uniform interleaver, c_d < 1 is possible
- the TC has a lower d_f, but its coefficients c_d are much smaller
- the effect becomes more obvious with increasing interleaver length L_π

Analytical Error Rate Estimation of Turbo Codes
(Figures: P_b versus 10 log10(E_b/N_0) for the AWGN channel and for flat Rayleigh fading.)
Observations:
- for small SNR the TC outperforms the CC significantly; the gain increases with L_π
- for increasing SNR the BER curve of the TC flattens, whereas the curve of the CC keeps decreasing
Explanations:
- d_f dominates the BER for large SNR
- for small SNR the number of sequences with a specific weight is of larger importance

Decoding of Concatenated Codes
Definition of Soft-Information
L-Algebra
General Approach for Soft-Output Decoding
Soft-Output Decoding using the Dual Code
Soft-Output Decoding for the (4,3,2)-SPC Code
BCJR Algorithm for Convolutional Codes

Decoding of Concatenated Codes
Optimum maximum likelihood decoding of concatenated codes is too complex. Instead, the constituent codes C_1 and C_2 are decoded by separate decoders D_1 and D_2, which are allowed to exchange information in order to improve their performance. The probability of information and/or code bits is of interest: soft-output decoding is required!
What is a useful soft output? Assumption: uncoded transmission over the AWGN channel with BPSK modulation: u = 0 -> x = +1, u = 1 -> x = -1, y = x + n.
The MAP criterion (maximum a posteriori) considers the unequal distribution of the symbols:
Pr{u = 0 | y} >< Pr{u = 1 | y}  <=>  Pr{x = +1 | y} >< Pr{x = -1 | y}

Decoding of Concatenated Codes
Conditional probabilities: p(x = +1, y) >< p(x = -1, y)  <=>  Pr{x = +1 | y} >< Pr{x = -1 | y}
Log-likelihood ratio (LLR, or L-value), introduced by Hagenauer:
L(x̂) = L(x,y) = L(x|y) = ln[ p(x = +1, y) / p(x = -1, y) ]
     = ln[ p(y | x = +1) / p(y | x = -1) ] + ln[ Pr{x = +1} / Pr{x = -1} ] = L(y|x) + L_a(x)
- the sign sgn(L(x̂)) corresponds to the hard decision
- the magnitude |L(x̂)| indicates the reliability of the hard decision
Another possible definition would be L(x) = Pr{x = +1} - Pr{x = -1} (not used here).
The addition of LLRs requires statistical independence of the variables! (Joachim Hagenauer)

Log-Likelihood Ratio
For an uncoded transmission the LLR consists of two components:
- L(y|x) depends on the channel statistics and therefore on the received signal y
- L_a(x) = ln( Pr{x = +1} / Pr{x = -1} ) represents the a priori knowledge about the symbol x
Properties of L_a(x) (figure: L_a(x) versus Pr{x = +1}):
- symmetric with respect to the point (0.5, 0)
- Pr{x = +1} > 0.5: +1 is more likely than -1, hence positive L_a(x)
- the larger the difference between Pr{x = +1} and Pr{x = -1}, the larger |L_a(x)|: a suitable value for the reliability
- Pr{x = +1} = 0.5 gives L_a(x) = 0: the decision would be random

LLR for a Memoryless Channel
Memoryless channel (AWGN or 1-path fading channel): y = α x + n with σ_n^2 = N_0 / (2 T_s).
Channel information:
L(y|x) = ln[ p(y | x = +1) / p(y | x = -1) ]
       = ln[ exp(-(y - α sqrt(E_s/T_s))^2 / (2 σ_n^2)) / exp(-(y + α sqrt(E_s/T_s))^2 / (2 σ_n^2)) ]
       = [ (y + α sqrt(E_s/T_s))^2 - (y - α sqrt(E_s/T_s))^2 ] / (2 σ_n^2)
       = 4 α y' E_s / N_0 = L_ch * y'
with the normalized received signal y' = y / sqrt(E_s/T_s) and the reliability of the channel L_ch = 4 α E_s / N_0 (depends on the SNR E_s/N_0 and on the channel gain α).

LLR for a Memoryless Channel
Reliability of the channel: L_ch = 4 α E_s / N_0.
The LLR L(y|x) = L_ch * y' is simply a scaled version of the matched filter output, which motivates the name "reliability of the channel".
(Figure: L(y|x) versus y' for E_s/N_0 = 0 dB ... 8 dB; high channel reliability gives a steep slope.)
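The scaling can be sketched directly from the formula above (a minimal sketch; the function name is our own, α = 1 by default):

```python
# Channel LLR for BPSK over AWGN: L(y|x) = L_ch * y' with L_ch = 4 * alpha * Es/N0.
def channel_llr(y_norm, esn0_db, alpha=1.0):
    esn0 = 10.0 ** (esn0_db / 10.0)     # dB -> linear
    l_ch = 4.0 * alpha * esn0           # reliability of the channel
    return l_ch * y_norm

llr = channel_llr(0.8, esn0_db=2.0)
# sign -> hard decision, magnitude -> reliability of that decision
print(llr, "bit 0" if llr > 0 else "bit 1")
```

At E_s/N_0 = 0 dB the LLR is exactly 4 y', and each additional 3 dB roughly doubles the slope, matching the figure described on the slide.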

LLRs for BSC and BEC
Binary symmetric channel (BSC) with error probability P_e (inputs x = +/-1, outputs Y_0, Y_1):
L(y|x) = ln[ Pr{y | x = +1} / Pr{y | x = -1} ] = +ln((1 - P_e)/P_e) for y = Y_0, -ln((1 - P_e)/P_e) for y = Y_1
Binary erasure channel (BEC) with erasure probability P_q (outputs Y_0, Y_q, Y_1):
L(y|x) = +infinity for y = Y_0, 0 for the erasure y = Y_q, -infinity for y = Y_1
(Figure: LLR of the BSC versus P_e.)

Relation between LLRs and Probabilities (1)
The matched filter output corresponds to an LLR. Task: find an arithmetic to perform operations on LLRs instead of probabilities, the L-algebra by Hagenauer.
Basic relation, using the completeness Pr{x = +1} + Pr{x = -1} = 1:
L(x̂) = ln[ Pr{x = +1 | y} / Pr{x = -1 | y} ] = ln[ Pr{x = +1 | y} / (1 - Pr{x = +1 | y}) ]
=> Pr{x = +1 | y} = e^{L(x̂)} / (1 + e^{L(x̂)}),   Pr{x = -1 | y} = 1 / (1 + e^{L(x̂)})
With respect to a symbol x in {+1, -1} the general relation holds:
Pr{x | y} = e^{x L(x̂)/2} / (e^{+L(x̂)/2} + e^{-L(x̂)/2})

Relation between LLRs and Probabilities (2)
Probability of a correct decision:
- for x = +1 the decision is correct if L(x̂) is positive: Pr{x̂ correct | x = +1} = e^{L(x̂)} / (1 + e^{L(x̂)})
- for x = -1 the decision is correct if L(x̂) is negative: Pr{x̂ correct | x = -1} = 1 / (1 + e^{L(x̂)})
=> Pr{x̂ is correct} = e^{|L(x̂)|} / (1 + e^{|L(x̂)|})
Soft bit: expected value of the antipodal tx signal:
λ = E{x̂} = (+1) Pr{x̂ = +1} + (-1) Pr{x̂ = -1} = (e^{L(x̂)} - 1) / (e^{L(x̂)} + 1) = tanh( L(x̂)/2 )
and conversely Pr{x̂ = +/-1} = (1 +/- λ) / 2.
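These conversion rules translate directly into code (a minimal sketch; function names are our own):

```python
# LLR <-> probability relations for BPSK symbols.
import math

def prob_plus_one(llr):
    # Pr{x = +1 | y} = e^L / (1 + e^L)
    return math.exp(llr) / (1.0 + math.exp(llr))

def soft_bit(llr):
    # lambda = E{x} = tanh(L/2)
    return math.tanh(llr / 2.0)

def prob_correct(llr):
    # Pr{hard decision correct} = e^|L| / (1 + e^|L|)
    a = abs(llr)
    return math.exp(a) / (1.0 + math.exp(a))

print(prob_plus_one(0.0))   # 0.5: an LLR of 0 carries no information
print(soft_bit(2.0))
print(prob_correct(-3.0))
```

Note the consistency check Pr{x = +1} = (1 + λ)/2, which ties the two representations together.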

L-Algebra
Parity bits are generated by modulo-2 sums of certain information bits; how can we calculate the L-value of a parity bit? (Hagenauer)
Assumption: single parity check code (SPC) with p = u_1 XOR u_2, where x_1 and x_2 are statistically independent. What is L(p)?
L(p) = L(u_1 XOR u_2) = ln[ Pr{u_1 XOR u_2 = 0} / Pr{u_1 XOR u_2 = 1} ] = ln[ Pr{x_1 x_2 = +1} / Pr{x_1 x_2 = -1} ] = L(x_1 x_2)
Expanding the probabilities (with λ_i = tanh(L(x_i)/2) and artanh(x) = (1/2) ln((1+x)/(1-x))):
L(x_1 x_2) = ln[ (1 + e^{L(x_1)} e^{L(x_2)}) / (e^{L(x_1)} + e^{L(x_2)}) ]
           = 2 artanh( tanh(L(x_1)/2) * tanh(L(x_2)/2) ) = 2 artanh(λ_1 λ_2)
           = L(x_1) [+] L(x_2)   (boxplus operation)

L-Algebra
Modulo-2 sum of 2 statistically independent random variables:
L(u_1 XOR u_2) = L(x_1) [+] L(x_2) = 2 artanh( tanh(L(x_1)/2) * tanh(L(x_2)/2) )
              ≈ sgn[L(x_1)] * sgn[L(x_2)] * min{ |L(x_1)|, |L(x_2)| }
Modulo-2 sum of n variables:
L(u_1 XOR ... XOR u_n) = 2 artanh( Π_{i=1..n} tanh(L(x_i)/2) )
                       ≈ min_i{ |L(x_i)| } * Π_{i=1..n} sgn[L(x_i)]
(Figures: tanh(x/2) and artanh(x).)
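Both the exact boxplus and the min-sign approximation are short enough to sketch directly (function names are our own):

```python
# Boxplus operation of the L-algebra and its min-sign approximation.
import math

def boxplus(l1, l2):
    # exact: 2 * artanh(tanh(l1/2) * tanh(l2/2))
    return 2.0 * math.atanh(math.tanh(l1 / 2.0) * math.tanh(l2 / 2.0))

def boxplus_approx(llrs):
    # min-sign approximation for the mod-2 sum of n variables
    sign = 1.0
    for l in llrs:
        sign *= math.copysign(1.0, l)
    return sign * min(abs(l) for l in llrs)

print(boxplus(2.0, -1.0))               # negative, magnitude below min(2.0, 1.0)
print(boxplus_approx([2.0, -1.0, 3.0])) # -1.0
```

The approximation always reports a magnitude at least as large as the exact value, i.e. it is slightly optimistic; this is the same simplification used later in the Max-Log-MAP.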

General Approach for Soft-Output Decoding
For an FEC-encoded sequence the symbol-by-symbol MAP criterion should be fulfilled:
L(û_i) = ln[ p(u_i = 0, y) / p(u_i = 1, y) ]
i.e. the L-value for the estimation of the information bit u_i given the received sequence y. The joint probability density p(u_i = 0/1, y) is not directly available, so elementary conversions are needed.
Using the completeness Pr(a) = Σ_b Pr(a, b), the code space Γ is split into two subsets:
- Γ_i^(0) contains all code words c with u_i = 0
- Γ_i^(1) contains all code words c with u_i = 1
p(u_i = 0, y) = Σ_{c in Γ_i^(0)} p(c, y),   p(u_i = 1, y) = Σ_{c in Γ_i^(1)} p(c, y)
L(û_i) = ln[ Σ_{c in Γ_i^(0)} p(y|c) Pr{c} / Σ_{c in Γ_i^(1)} p(y|c) Pr{c} ]
The sums run over 2^k / 2 = 2^{k-1} code words in the numerator and in the denominator.

General Approach for Soft-Output Decoding
Assuming statistical independence of the y_j (transmission over the AWGN channel): successive noise terms n_j are independent, but of course successive code bits c_j are not (interdependencies introduced by the encoder)! Since p(y|c) is the probability density conditioned on the hypothesis c, the y_j are statistically independent random variables:
p(y|c) = Π_{j=0..n-1} p(y_j | c_j)
Each code word c is uniquely determined by the corresponding info word u, whose bits u_j are statistically independent:
Pr{c} = Pr{u} = Π_{j=0..k-1} Pr{u_j}
Symbol-by-symbol MAP:
L(û_i) = ln[ Σ_{c in Γ_i^(0)} Π_{j=0..n-1} p(y_j|c_j) * Π_{j=0..k-1} Pr{u_j} / Σ_{c in Γ_i^(1)} Π_{j=0..n-1} p(y_j|c_j) * Π_{j=0..k-1} Pr{u_j} ]

General Approach for Soft-Output Decoding
Symbol-by-symbol MAP for systematic encoders: u_i = c_i holds for 0 <= i <= k-1, so the i-th term p(y_i|c_i) is constant in numerator and denominator and can be separated together with Pr(u_i).
The soft output can be split into 3 statistically independent parts:
L(û_i) = ln[ p(y_i | u_i = 0) / p(y_i | u_i = 1) ] + ln[ Pr{u_i = 0} / Pr{u_i = 1} ] + L_e(û_i)
       = L_ch y_i + L_a(u_i) + L_e(û_i)
- systematic part L_ch y_i
- a priori information L_a(u_i)
- extrinsic information L_e(û_i): the information provided by the code bits connected with u_i, i.e. the ratio of the sums over Γ_i^(0) and Γ_i^(1) of Π_{j != i} p(y_j|c_j) Pr{u_j}

General Approach for Soft-Output Decoding
Compact description of the extrinsic information with
p(y_j; c_j) = p(y_j|c_j) Pr{u_j} for 0 <= j < k and p(y_j; c_j) = p(y_j|c_j) for k <= j < n:
L_e(û_i) = ln[ Σ_{c in Γ_i^(0)} Π_{j != i} p(y_j; c_j) / Σ_{c in Γ_i^(1)} Π_{j != i} p(y_j; c_j) ]
Calculation of the extrinsic information with LLRs:
L_e(û_i) = ln[ Σ_{c in Γ_i^(0)} exp( (1/2) Σ_{j != i} L(c_j; y_j) x_j ) / Σ_{c in Γ_i^(1)} exp( (1/2) Σ_{j != i} L(c_j; y_j) x_j ) ]
with L(c_j; y_j) = L_ch y_j + L_a(u_j) for 0 <= j < k, L(c_j; y_j) = L_ch y_j for k <= j < n, and x_j = +/-1 the antipodal representation of c_j.

Soft-Output Decoding of Repetition Codes
A code word c = [c_0 c_1 ... c_{n-1}] contains n repetitions of the information word u = [u_0]. The set of all code words for n = 3 is Γ = {000, 111}:
L(û) = ln[ p(y_0|0) p(y_1|0) p(y_2|0) Pr{u = 0} / ( p(y_0|1) p(y_1|1) p(y_2|1) Pr{u = 1} ) ]
     = L_ch y_0 + L_ch y_1 + L_ch y_2 + L_a(u)
This corresponds to an averaging of the LLRs.

Soft-Output Decoding using the Dual Code
The calculation of the extrinsic information requires a summation over all code words c of the code space Γ; the (255,247,3) Hamming code contains 2^247 ≈ 2.3 * 10^74 code words. Instead of calculating the LLR over all code words of the code, it is also possible to perform this calculation with respect to the dual code. This is beneficial if the number of parity bits is relatively small: the dual code of the (255,247,3) Hamming code contains only 2^8 = 256 code words.
Calculation of the extrinsic information with the dual code:
L_e(û_i) = ln[ Σ_{c' in Γ⊥} Π_{j != i} tanh( L(c_j; y_j)/2 )^{c'_j} / Σ_{c' in Γ⊥} (-1)^{c'_i} Π_{j != i} tanh( L(c_j; y_j)/2 )^{c'_j} ]
The summation runs over all 2^{n-k} code words c' of the dual code Γ⊥.

Soft-Output Decoding of the (4,3,2)-SPC using the Dual Code
The calculation of the extrinsic information would require a summation over 2^3 = 8 code words. Instead, the dual code contains only 2^{n-k} = 2 words: Γ⊥ = {0000, 1111}.
Calculation of the LLR (using ln((1+x)/(1-x)) = 2 artanh(x)):
L(û_i) = L_ch y_i + ln[ (1 + Π_{j != i} tanh( L(c_j; y_j)/2 )) / (1 - Π_{j != i} tanh( L(c_j; y_j)/2 )) ]
       = L_ch y_i + 2 artanh( Π_{j != i} tanh( L(c_j; y_j)/2 ) )
       ≈ L_ch y_i + min_{j != i} |L(c_j; y_j)| * Π_{j != i} sgn[ L(c_j; y_j) ]
The first term in numerator and denominator (c' = 0) is one. Each c in Γ fulfills c c'^T = 0, i.e. c_i is given by the modulo-2 sum of all other code bits c_j: c_i = Σ_{j != i} c_j, so L_e(c_i) = [+]_{j != i} L(x_j).

Soft-Output Decoding for the (4,3,2)-SPC Code
Example at E_s/N_0 = 2 dB, i.e. L_ch = 4 E_s/N_0 = 4 * 10^0.2 = 6.34:
- info word u = [1 0 1], encoding: c = [1 0 1 0], BPSK: x = [-1 +1 -1 +1]
- AWGN channel output: y = [-0.8 +1.1 +0.3 +0.4]; hard decision on y: [- + + +], i.e. one error; the parity check alone only detects this error but cannot correct it
- channel LLRs: L_ch y = [-5.1 +7.0 +1.9 +2.5]
- extrinsic information with the approximation L_e(c_i) = [+]_{j != i} L(x_j) ≈ min_{j != i} |L_ch y_j| * Π_{j != i} sgn(L_ch y_j): L_e(û) = [+1.9 -1.9 -2.5 -1.9]
- L(û) = L_ch y + L_e(û) = [-3.2 +5.1 -0.6 +0.6]; hard decision: [- + - +], the error is corrected
- Pr{û correct} = e^{|L|} / (1 + e^{|L|}) = [0.96 0.99 0.65 0.65]
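The numbers of this example can be reproduced with a few lines under the min-sign approximation (a sketch; the function name is our own):

```python
# Soft-output decoding of a single parity check code with the
# min-sign approximation of the boxplus rule.
import math

def spc_soft_decode(y, l_ch):
    l_in = [l_ch * v for v in y]                      # channel LLRs L_ch * y
    l_e = []
    for i in range(len(l_in)):
        others = [l for j, l in enumerate(l_in) if j != i]
        sign = 1.0
        for l in others:
            sign *= math.copysign(1.0, l)
        l_e.append(sign * min(abs(l) for l in others))  # extrinsic information
    return [li + le for li, le in zip(l_in, l_e)], l_e

l_ch = 4 * 10 ** 0.2                                  # Es/N0 = 2 dB -> L_ch ≈ 6.34
l_out, l_e = spc_soft_decode([-0.8, 1.1, 0.3, 0.4], l_ch)
hard = [0 if l > 0 else 1 for l in l_out]
print([round(l, 1) for l in l_e], [round(l, 1) for l in l_out], hard)
```

With the received values of the slide this yields L_e(û) = [+1.9, -1.9, -2.5, -1.9] and the hard decisions [1, 0, 1, 0], i.e. the single channel error is corrected.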

BCJR Algorithm for Convolutional Codes
Symbol-by-symbol MAP decoding: Bahl, Cocke, Jelinek, Raviv (1972).
L(û_i) = ln[ p(u_i = 0, y) / p(u_i = 1, y) ]
       = ln[ Σ_{(s',s), u_i=0} p(s', s, y) / Σ_{(s',s), u_i=1} p(s', s, y) ]
       = ln[ Σ_{(s',s), u_i=0} p(s', s, y_{k<i}, y_i, y_{k>i}) / Σ_{(s',s), u_i=1} p(s', s, y_{k<i}, y_i, y_{k>i}) ]
Efficient calculation of the LLR based on the trellis diagram (exploiting the Markov property), with the factors p(s', y_{k<i}), p(y_i, s | s') and p(y_{k>i} | s).
(Figure: trellis of an RSC with L_c = 3; received sequence y = [y_0 y_1 ... y_{N-1}], y_i = [y_i,0 ... y_i,n-1].)

BCJR Algorithm for Convolutional Codes
Splitting up the observations y_{k>i}:
p(s', s, y_{k<i}, y_i, y_{k>i}) = p(y_{k>i} | s', s, y_{k<i}, y_i) * p(s', s, y_{k<i}, y_i)
Backward probability β_i(s) = p(y_{k>i} | s', s, y_{k<i}, y_i) = p(y_{k>i} | s): the probability of the sequence y_{k>i} if the trellis is in state s at time instant i. If state s at time instant i is known, the parameters s', y_i, y_{k<i} are not relevant.
Splitting up the observations y_{k<i}:
p(s', s, y_{k<i}, y_i) = p(s, y_i | s', y_{k<i}) * p(s', y_{k<i})
Transition probability γ_i(s', s) = p(s, y_i | s', y_{k<i}) = p(s, y_i | s'): the probability of observing y_i under the condition that the transition from s' to s takes place at time instant i; y_{k<i} is not relevant:
γ_i(s', s) = p(y_i | s', s) * Pr{s | s'}
- p(y_i | s', s): transition probability of the channel
- Pr{s | s'} ~ u_i: a priori information; this is the possibility to use a priori knowledge within the decoding process

BCJR Algorithm for Convolutional Codes
Forward probability α_{i-1}(s') = p(s', y_{k<i}): the probability of the sequence y_{k<i} if the trellis is in state s' at time instant i-1.
The probability density splits into three terms:
p(s', s, y_{k<i}, y_i, y_{k>i}) = α_{i-1}(s') * γ_i(s', s) * β_i(s)
Compact description of the symbol-by-symbol MAP:
L(û_i) = ln[ Σ_{(s',s), u_i=0} α_{i-1}(s') γ_i(s', s) β_i(s) / Σ_{(s',s), u_i=1} α_{i-1}(s') γ_i(s', s) β_i(s) ]
Recursive calculation:
- forward: α_i(s) = p(s, y_{k<i+1}) = Σ_{s'} γ_i(s', s) α_{i-1}(s'), initialization α_0(s') = 1 for s' = 0, else 0
- backward: β_{i-1}(s') = p(y_{k>i-1} | s') = Σ_s γ_i(s', s) β_i(s), initialization for a terminated code β_N(s) = 1 for s = 0, else 0; otherwise β_N(s) = const.
(Figure: trellis segment with α_{i-1}(s'), γ_i(s', s), β_i(s); 2^m states for m memory elements.)

BCJR Algorithm for Convolutional Codes

Symbol-by-symbol MAP decoding:

L(û_k) = ln [ p(u_k = 0, y) / p(u_k = 1, y) ] = ln [ Σ_{(s',s), u_k=0} α_{k-1}(s') γ_k(s', s) β_k(s) / Σ_{(s',s), u_k=1} α_{k-1}(s') γ_k(s', s) β_k(s) ]

(Figure: numerical trellis example — the forward values α_k(s) are accumulated from left to right, the backward values β_k(s) from right to left, and the LLR is formed from the products α_{k-1} γ_k β_k along the branches.)

Calculation in Logarithmic Domain

Implementation with respect to probabilities is complicated → numerical problems → implementation in the logarithmic domain is favorable.

Transition variable:
γ̄_k(s', s) = ln γ_k(s', s) = ln p(y_k | s', s) + ln Pr{s | s'} = C + y_k·x(s', s)/σ_n² + ln Pr{u_k = u(s', s)}

Forward variable:
ᾱ_k(s) = ln α_k(s) = ln Σ_{s'} γ_k(s', s) α_{k-1}(s') = ln Σ_{s'} exp[ γ̄_k(s', s) + ᾱ_{k-1}(s') ]

Backward variable:
β̄_{k-1}(s') = ln β_{k-1}(s') = ln Σ_s γ_k(s', s) β_k(s) = ln Σ_s exp[ γ̄_k(s', s) + β̄_k(s) ]

Initialization:
ᾱ_0(s') = 0 for s' = 0, −∞ otherwise.
β̄_N(s) = 0 for s = 0, −∞ otherwise (terminated code); β̄_N(s) = const. for all s otherwise.

Calculation in Logarithmic Domain: Jacobian Logarithm

In the recursions, logarithms of sums of exponentials occur:

max*(x_1, x_2) := ln( e^{x_1} + e^{x_2} ) = max(x_1, x_2) + ln( 1 + e^{−|x_1 − x_2|} )

Proof:
For x_1 > x_2: ln( e^{x_1} + e^{x_2} ) = ln( e^{x_1} (1 + e^{x_2 − x_1}) ) = x_1 + ln( 1 + e^{−(x_1 − x_2)} )
For x_1 ≤ x_2: ln( e^{x_1} + e^{x_2} ) = ln( e^{x_2} (1 + e^{x_1 − x_2}) ) = x_2 + ln( 1 + e^{−(x_2 − x_1)} )

The second term has a small range between 0 and ln 2 → it can efficiently be implemented by a lookup table w.r.t. |x_1 − x_2|.

Calculation in Logarithmic Domain: Jacobian Logarithm

Simplify the logarithm of sums: ln( e^{x_1} + e^{x_2} ) = max*(x_1, x_2) = max(x_1, x_2) + ln( 1 + e^{−|x_1 − x_2|} )

Forward variable:
ᾱ_k(s) = ln Σ_{s'} exp[ γ̄_k(s', s) + ᾱ_{k-1}(s') ]
       = max*_{s'} [ γ̄_k(s', s) + ᾱ_{k-1}(s') ]
       = max_{s'} [ γ̄_k(s', s) + ᾱ_{k-1}(s') ] + correction term

Backward variable:
β̄_{k-1}(s') = max*_s [ γ̄_k(s', s) + β̄_k(s) ] = max_s [ γ̄_k(s', s) + β̄_k(s) ] + correction term

Log-MAP: implementation of BCJR in the log domain with correction term.
Max-Log-MAP: implementation in the log domain without correction term.
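The Jacobian logarithm and its Max-Log approximation are small enough to write out directly; a minimal sketch (function names are my own):

```python
import math

def max_star(x1, x2):
    """Jacobian logarithm: ln(e^x1 + e^x2) = max(x1, x2) + ln(1 + e^-|x1-x2|).

    This is the exact operation used by the Log-MAP decoder; in hardware the
    correction term ln(1 + e^-|x1-x2|), bounded by ln 2, is a small lookup table.
    """
    return max(x1, x2) + math.log1p(math.exp(-abs(x1 - x2)))

def max_log(x1, x2):
    """Max-Log-MAP approximation: the correction term is simply dropped."""
    return max(x1, x2)
```

Because the correction term lies in (0, ln 2], max_log underestimates max_star by at most ln 2 ≈ 0.69.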

Iterative Decoding

General Structure for Parallel Concatenated Codes
Turbo Decoding for the (4,6,3)-Product Code
Simulation Results
Turbo Decoding for Serially Concatenated Codes

General Concept for Iterative Decoding of Parallel Concatenated Codes

(Figure: systematic (message) bits and parity bits of the constituent encoders C_1 and C_2 reach the decoders D_1 and D_2, D_2 via the interleaver Π. Each decoder produces soft-decision estimates L(û) for the message bits and extrinsic LLRs L_e(û), which serve as a priori information L_a(û) for the respective other decoder.)

Turbo Decoding for the (4,6,3) Modified Product Code (1)

(Figure: numerical example — encoding of the info bits u, BPSK mapping x, transmission over the AWGN channel, and the resulting channel LLRs L_ch·y.)

Vertical decoding yields the vertical extrinsic information L_e,1(û), which serves as horizontal a priori information:

L_a,2(û) = L_e,1(û),   L(û) = L_ch·y + L_a(û) + L_e(û)

(Tables: channel LLRs L_ch·y and vertical extrinsic decoding information L_e,1(û) of the example.)

Turbo Decoding for the (4,6,3) Modified Product Code (2)

Horizontal decoding: the input L_ch·y + L_a,2(û) yields the horizontal extrinsic information L_e,2(û), which in turn serves as vertical a priori information, L_a,1(û) = L_e,2(û).

The total LLR combines all three parts:

L(û) = L_ch·y + L_e(û) + L_a(û)

(Tables: numerical values of L_ch·y + L_a(û), the extrinsic LLRs L_e(û), and the resulting soft outputs L(û) with hard decisions û.)

Turbo Decoding for the (4,6,3) Modified Product Code (3)

Second iteration: vertical decoding with L_ch·y + L_a(u) as input yields new extrinsic LLRs L_e(û); horizontal decoding follows with the updated a priori information.

(Tables: numerical values of the second iteration — inputs L_ch·y + L_a(u), extrinsic LLRs L_e(û), and soft outputs L(û) = L_ch·y + L_e(û) + L_a(û).)

Turbo Decoding for the (4,6,3) Modified Product Code (4)

Third iteration: vertical and horizontal decoding with inputs L_ch·y + L_a,3(u), producing extrinsic LLRs L_e,3(û) and soft outputs L_3(û).

(Tables: numerical values of the third iteration; the hard decisions û_3 now satisfy all parity checks.)

Turbo Decoding for Parallel Concatenated Codes

L_a,2(û) = Π{ L_e,1(û) },   L_1(û) = L_ch·y_s + L_a,1(û) + L_e,1(û),   L_2(û) = L_ch·y_s + L_a,2(û) + L_e,2(û)

- Both decoders estimate the same information word u, and each decoder receives the corresponding channel outputs; only L_ch·y is fed directly to decoder D_1
- The systematic information bits y_s are fed to D_2 via D_1 and the interleaver Π
- Each decoder generates extrinsic information for bit u_k, serving as a priori LLRs for the other decoder
- The a priori LLRs improve the decoders' performance in each iteration as long as they are statistically independent of the regular inputs

Simulation Results for Modified Product Codes

(Figure: bit error rate P_b versus 10·log10(E_b/N_0) for (7,4,3)-Hamming component codes, iterations 1-3 and analytical bound.)

Observations
- Gains decrease with the number of iterations: the same info bits are estimated, and the correlation of the a priori information increases
- With a larger interleaver length, the gains of subsequent iterations are generally larger → statistical independence of the bits is required

Simulation Results for Modified Product Codes

(Figure: bit error rate P_b versus 10·log10(E_b/N_0) for (15,11,3)-Hamming component codes, iterations 1-3 and analytical bound.)

Observations
- The larger interleaver leads to improved statistics → gains for iteration 3

Simulation Results for Modified Product Codes

(Figure: bit error rate P_b versus 10·log10(E_b/N_0) for (31,26,3)-Hamming component codes, iterations 1-3 and analytical bound.)

Observations
- The larger interleaver leads to improved statistics → gains for iteration 3
- For larger SNR the BER flattens → the minimum distance dominates the error rate at high SNR

Simulation Results for Modified Product Codes

- Hamming codes have d_min = 3 for all lengths n
- The analyzed product codes have the same d_min → similar error rates versus E_s/N_0
- The code rates are different → longer product codes are better versus E_b/N_0

(Figures: P_b of the (7,4)-, (15,11)- and (31,26)-based product codes, once versus 10·log10(E_s/N_0) and once versus 10·log10(E_b/N_0).)

Simulation Results for Turbo Codes (L_c = 3)

(Figures: P_b versus 10·log10(E_b/N_0) for iterations 1-6, with a small and a large block interleaver.)

- Gains decrease with the number of iterations
- Increasing the interleaver size leads to reduced BER

Simulation Results for Turbo Codes (L_c = 3)

(Figures: P_b versus E_b/N_0 in dB for a random interleaver with R_c = 1/3, iterations 1-10, and a comparison of different interleavers — block interleavers (BIL) and random interleavers (RIL) of various sizes, plus a convolutional code with L_c = 9 as reference.)

- Usage of a random interleaver leads to significant performance improvements in comparison to a block interleaver
- The random interleaver (RIL) achieves larger gains in comparison to the block interleaver (BIL)

Turbo Decoding for Serially Concatenated Codes

(Figure: outer encoder C_1, interleaver Π, systematic inner encoder C_2; at the receiver, the inner decoder D_2 and the outer decoder D_1 exchange extrinsic LLRs: L_a,2(c) = Π{ L_e,1(c) }, L_2(c) = L_ch·y_s + L_a,2(c) + L_e,2(c).)

- The outer decoder receives information only from the inner decoder
- The outer decoder delivers estimates of the information bits u as well as extrinsic LLRs of its code bits c, which are the information bits of the inner code C_2
- The extrinsic LLRs of the code bits c serve as a priori LLRs for the inner code C_2

Comparison of Serial and Parallel Concatenation

(Figure: BER versus E_b/N_0 in dB for serial and parallel concatenation with different interleaver lengths n.)

Results hold for this specific setup — no generalization possible!

Repeat Accumulate Code by ten Brink

- Half-rate outer repetition encoder and rate-one inner recursive convolutional encoder
- A large number of decoding iterations is needed

(Figure: BER versus E_b/N_0 in dB, showing the very steep waterfall of the code.)

Repeat Accumulate Code by Stephan ten Brink

(Figure.)

EXtrinsic Information Transfer Charts (EXIT Charts)

Stephan ten Brink

Parallel Concatenation

Mutual Information for the Turbo Decoder

(Figure: decoders D_1 and D_2 connected via the interleaver Π and deinterleaver Π⁻¹, with the LLRs whose mutual information is tracked.)

Mutual Information for a Single Decoder

(Figure: encoder C, BPSK mapping, channel, and decoder D with the LLRs whose mutual information is analyzed.)

General Concept of Iterative Turbo Decoding

The BER curve shows three different regions:
- At low SNR, iterative decoding performs worse than uncoded transmission
- At low to medium SNR, iterative decoding is extremely effective → waterfall region
- At high SNR, decoding converges already in a few iterations → error floor

How to understand this varying behavior? Extrinsic information is exchanged between the decoders → analysis of the iterative process by a semi-analytic approach:
- Determine analytically the mutual information I(u; L_a(u)) between the information bits and the a priori input of the decoder
- Determine by simulation the mutual information I(u; L_e(u)) between the information bits and the extrinsic output of the decoder for a specific a priori information at the input
- Draw the relationship between both mutual informations
- Combine the diagrams of both contributing decoders into one chart → EXIT chart: EXtrinsic Information Transfer chart

Distribution of Extrinsic Information

Investigation of the extrinsic decoder output L_e(û) = L(û) − L_ch·y − L_a(û)

Example: [7,5]-RSC code at several E_b/N_0 values between 0 and 2 dB
- The PDF of the extrinsic estimate is given for x = +1 and x = −1 separately
- The extrinsic information is nearly Gaussian distributed
- With increasing SNR, the mean's absolute value and the variance increase

Iterative decoding: with an increasing number of iterations, the extrinsic information approaches a Gaussian distribution.

(Figure: conditional PDFs p(ξ | x = +1) and p(ξ | x = −1) for the different SNRs.)

Analytical Model for the A Priori Information

Extrinsic information of decoder 1 becomes a priori information of decoder 2, and vice versa.

For the EXIT analysis, the a priori information A = L_a is modeled as A = µ_A·x + n_A: a Gaussian random variable n_A of zero mean and variance σ_A² is added to the value x of the transmitted systematic bit, multiplied by µ_A = σ_A²/2:

p_A(ξ | x) = 1/√(2π σ_A²) · exp( −(ξ − (σ_A²/2)·x)² / (2σ_A²) )

Normalization of the a priori information with µ_A = σ_A²/2: with increasing variance, the two conditional probability functions are more separated and do not overlap anymore.

(Figure: p_A(ξ | x = ±1) for increasing values of σ_A².)
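The a priori model A = µ_A·x + n_A with µ_A = σ_A²/2 is easy to sample; a minimal sketch (function name and parameters are my own, the "consistency" property µ_A = σ_A²/2 is from the slide):

```python
import numpy as np

def apriori_llrs(x, sigma_a, rng=None):
    """Sample modeled a priori LLRs A = mu_A * x + n_A for BPSK bits x in {+1,-1}.

    mu_A = sigma_a**2 / 2 and n_A ~ N(0, sigma_a**2), i.e. the mean's absolute
    value equals half the variance (Gaussian-consistent LLRs).
    """
    rng = rng or np.random.default_rng()
    mu_a = sigma_a ** 2 / 2.0
    return mu_a * x + sigma_a * rng.standard_normal(np.shape(x))
```

Sampling many values for x = +1 should reproduce mean σ_A²/2 and standard deviation σ_A.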

Motivation for Modeling the A Priori Information

The LLR for uncoded transmission over the AWGN channel, y = x + n with x ∈ {±1} and n ~ N(0, σ_n²), is given by

L(y|x) = ln [ p{y | x = +1} / p{y | x = −1} ] = 4·E_s/N_0 · y = L_ch·y = L_ch·(x + n)

with L_ch = 4·E_s/N_0 = 2/σ_n², so L(y|x) = (2/σ_n²)·x + (2/σ_n²)·n.

The LLR is Gaussian distributed with mean µ_A = E{ L(y|x) | x } = (2/σ_n²)·x and variance σ_A² = E{ (L_ch·n)² } = 4/σ_n².

The mean's absolute value equals half of the variance: |µ_A| = σ_A²/2.

Model for the a priori LLR: A = µ_A·x + n_A ~ N( ±σ_A²/2, σ_A² ).

Mutual Information of A Priori Information and Info Bits

Mutual information between the systematic bits and the a priori LLR:

I_A(σ_A) = I(X; A) = Σ_{x∈{−1,+1}} 1/2 ∫ p_A(ξ|x) · log2 [ 2·p_A(ξ|x) / (p_A(ξ|−1) + p_A(ξ|+1)) ] dξ
         = 1 − ∫ 1/√(2πσ_A²) · exp( −(ξ − σ_A²/2)² / (2σ_A²) ) · log2(1 + e^{−ξ}) dξ
         = 1 − E{ log2(1 + e^{−ξ}) } =: J(σ_A)

- The integral has to be solved numerically
- J(σ_A) is monotonically increasing in σ_A → it has a unique inverse function σ_A = J⁻¹(I_A)

Close approximation for the J-function and its inverse:

J(σ_A) ≈ ( 1 − 2^{−0.3073·σ_A^{2·0.8935}} )^{1.1064}
σ_A = J⁻¹(I_A) ≈ ( −(1/0.3073) · log2( 1 − I_A^{1/1.1064} ) )^{1/(2·0.8935)}

(Figure: I_A(σ_A) = J(σ_A) versus σ_A.)
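The closed-form approximation of J and its algebraic inverse can be coded directly; the constants below are the fitted values the slide's garbled digits appear to correspond to, so treat them as an assumption:

```python
import math

H1, H2, H3 = 0.3073, 0.8935, 1.1064  # fitted constants of the J-approximation

def J(sigma):
    """Approximate mutual information I_A = J(sigma_A) of consistent Gaussian LLRs."""
    if sigma == 0.0:
        return 0.0
    return (1.0 - 2.0 ** (-H1 * sigma ** (2.0 * H2))) ** H3

def J_inv(I):
    """Algebraic inverse of the approximation: sigma_A = J^{-1}(I_A), 0 < I_A < 1."""
    return (-(1.0 / H1) * math.log2(1.0 - I ** (1.0 / H3))) ** (1.0 / (2.0 * H2))
```

Since J_inv is the exact algebraic inverse of the approximation, J(J_inv(I)) returns I up to floating-point precision.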

Mutual Information of Extrinsic Information and Info Bits

Mutual information between the systematic bits and the extrinsic LLR:

I_E = I(X; E) = Σ_{x∈{−1,+1}} 1/2 ∫ p_E(ξ|x) · log2 [ 2·p_E(ξ|x) / (p_E(ξ|−1) + p_E(ξ|+1)) ] dξ

Semi-analytical approach to determine the dependency of the mutual information at the decoder input and output:
- Perform encoding for a random information sequence u: c = f(u), BPSK mapping x
- Transmit the BPSK signals over the AWGN channel: y = x + n
- For given I_A, determine σ_A using the inverse J-function: σ_A = J⁻¹(I_A)
- Model the a priori information using the analytical model: A = µ_A·x + n_A
- Perform decoding of the noisy receive signal y using the a priori information A
- Determine the mutual information I_E of the extrinsic information, using histograms to approximate the PDFs p_E(ξ|x)

The transfer characteristic shows the dependency of I_E on I_A and the SNR: I_E = Tr(I_A, E_b/N_0).

Measurement of the Mutual Information

By application of the ergodic theorem (the expectation is replaced by a time average), the mutual information can be measured for a large number N of samples:

I(X; L) = 1 − E{ log2(1 + e^{−x·L}) } ≈ 1 − (1/N) Σ_{n=1}^{N} log2( 1 + e^{−x_n·L_n} )

(Figure: measurement setup — systematic bits u ∈ {0,1}, BPSK symbols x ∈ {±1}, channel LLRs L(y|x) with σ_A² = 4/σ_n², modeled a priori LLRs L_a(x) = A ~ N(±σ_A²/2, σ_A²), and the decoder's extrinsic output L_e(x); averaging the terms log2(1 + e^{−x·L}) yields I(X; L_e) and I(X; L_a).)
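The time-average estimator is a one-liner; a minimal sketch assuming moderate LLR magnitudes (so that e^{−x·L} neither over- nor underflows badly):

```python
import numpy as np

def measure_mutual_information(x, L):
    """Time-average estimate I(X; L) ~= 1 - (1/N) * sum log2(1 + exp(-x_n * L_n))
    for BPSK bits x in {+1, -1} and LLR samples L (ergodicity assumed)."""
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-x * L)))
```

Sanity checks: L = 0 gives I = 0 bit, very large correctly-signed LLRs give I = 1 bit, and consistent Gaussian LLRs of standard deviation σ_A reproduce J(σ_A).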

Dependency of Mutual Information at Decoder Input and Output

(Figure: I_E = I(u; L_e(u)) versus I_A = I(u; L_a(u)) — transfer characteristic of the (37,3_r)_8 RSC code for E_b/N_0 from −0.5 dB to 3.0 dB in 0.5 dB steps; the decoder processes L_ch·y and L_a(x).)

Observations
- I_E increases with growing SNR and I_A
- I_A = 0 → no a priori information available; I_A = 1 → perfect a priori information, I_E is reliable regardless of the SNR
- For high SNR, nearly no a priori information is required for good decoding results

Behavior of Different Convolutional Codes

(Figure: I_E = I(u; L_e(u)) versus I_A = I(u; L_a(u)) for constraint lengths L_c = 3, 5, 7 — transfer characteristic if only a priori information is provided to the decoder, cf. serial concatenation.)

- Weak codes are better for low a priori information; strong codes are better for high a priori information
- The point of intersection of all convolutional codes is close to (0.5, 0.5) (an explanation for this behavior is unknown!)
- Serial concatenation: the outer decoder gets only the a priori information from the inner decoder → the transfer function of the outer decoder is independent of the SNR

Comparison of MAP and Max-Log-MAP Decoding

(Figure: transfer characteristics for E_b/N_0 = −1, 0, 1, 2 and 3 dB.)

- A high channel SNR leads to high extrinsic information
- Large a priori information can compensate bad channel conditions
- The Max-Log-MAP decoder performs nearly as well as the optimal MAP decoder

EXtrinsic Information Transfer (EXIT) Charts

Extrinsic information provided by one decoder is used as a priori information by the other decoder (D_1 ↔ D_2).

For EXIT charts, the transfer functions of both constituent codes are drawn into one diagram, with abscissa and ordinate exchanged for the second code.

Assumptions
- A large interleaver is assumed, to assure statistical independence of I_A and I_E
- For inner decoders in a serially concatenated scheme, and for parallel concatenated schemes, the input parameters are L_ch and I_A
- For outer decoders in a serial concatenation, only I_A^(outer) appears as input, which is taken from the interleaved signal I_E^(inner) (the transfer function of the outer decoder is independent of the SNR)

EXIT Charts for Serial Concatenation

Pinch-off SNR: minimum SNR for convergence of the turbo decoder.

(Figure: EXIT chart with inner decoder curves for E_b/N_0 from −1 dB to 3 dB and the SNR-independent outer decoder curve.)

- Outer non-recursive convolutional encoder (5,3)_8, R_c = 3/4
- Inner recursive convolutional encoder (3,5_r)_8, R_c = 2/3

EXIT Charts for Serial Concatenation

(Figure: EXIT chart with the decoding trajectory stepping between the inner and outer decoder transfer curves.)

- Outer non-recursive convolutional encoder (5,3)_8, R_c = 3/4
- Inner recursive convolutional encoder (3,5_r)_8, R_c = 2/3

EXtrinsic Information Transfer (EXIT) Charts

Outer convolutional code, inner Walsh-Hadamard code.

BER estimate from the mutual information via the J-function:

P_b ≈ 1/2 · erfc( √( [ 8·R_c·E_b/N_0 + (J⁻¹(I_A))² + (J⁻¹(I_E))² ] / 8 ) )

(Figure: EXIT chart I(u; L_e(u)) = I(u; L_a(u)) axes with the transfer curves of both decoders.)

EXtrinsic Information Transfer (EXIT) Charts

Determining the pinch-off SNR: the minimum SNR for which convergence is maintained.

(Figure: EXIT chart I(u; L_e(u)) versus I(u; L_a(u)) with the two transfer curves just touching at 10·log10(E_b/N_0) = −0.3 dB.)

Code Design for the Half-Rate Repeat-Accumulate Code

(Figure: EXIT chart — transfer curve of the outer repetition code and curves of the inner code for varying signal-to-noise ratio.)

- Outer repetition code
- Inner recursive convolutional encoder

Bit-Interleaved Coded Modulation

General Structure for Serially Concatenated Blocks
Calculation of LLRs
Simulation Results

Bit-Interleaved Coded Modulation (BICM)

(Figure: transmitter — channel encoder, interleaver Π, mapper; receiver — demapper, deinterleaver Π⁻¹, channel decoder, with feedback through Π for iterative detection.)

Coded transmission with higher-order modulation: a binary vector of length m is mapped to one of 2^m symbols x of the alphabet X.

- Usually Gray mapping is employed → it minimizes the bit error probability without channel coding and has good properties regarding the capacity of a BICM system
- Interpretation as a serially concatenated system: inserting the interleaver between encoder and mapper leads to a pseudo-random mapping of bits onto specific levels and is crucial for iterative turbo detection
- Iterative detection and decoding: demapper and decoder exchange extrinsic information

How to perform turbo detection / decoding? Are there better mapping strategies than Gray mapping?

Soft-Output Demapping

LLR for each of the m bits (for one specific time instant k):

L_dem(c_µ) = L(c_µ | y) = ln [ p(y | c_µ = 0) Pr{c_µ = 0} / p(y | c_µ = 1) Pr{c_µ = 1} ]
           = ln [ Σ_{x: c_µ(x)=0} p(y|x) Pr{x} / Σ_{x: c_µ(x)=1} p(y|x) Pr{x} ]

with the channel likelihood

p(y|x) ~ exp( −|y − x|² / σ_n² )

and the a priori probability of symbol x, built from the decoder's a priori LLRs L_a(c_ν):

Pr{x} = Π_{ν=1}^{m} Pr{c_ν(x)} = Π_{ν=1}^{m} e^{−c_ν(x)·L_a(c_ν)} / ( 1 + e^{−L_a(c_ν)} )

Soft-Output Demapping

The denominator of the a priori probability cancels when inserted into L_dem(c_µ):

L_dem(c_µ) = ln [ Σ_{x: c_µ(x)=0} exp(−|y−x|²/σ_n²) Π_ν e^{−c_ν(x)·L_a(c_ν)} / Σ_{x: c_µ(x)=1} exp(−|y−x|²/σ_n²) Π_ν e^{−c_ν(x)·L_a(c_ν)} ]

The intrinsic information L̄_dem(c_µ) is independent of the a priori information L_a(c_µ) of bit c_µ itself:

L̄_dem(c_µ) = L_dem(c_µ) − L_a(c_µ) = ln [ Σ_{x: c_µ(x)=0} exp(−|y−x|²/σ_n²) Π_{ν≠µ} e^{−c_ν(x)·L_a(c_ν)} / Σ_{x: c_µ(x)=1} exp(−|y−x|²/σ_n²) Π_{ν≠µ} e^{−c_ν(x)·L_a(c_ν)} ]
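The extrinsic demapper formula can be sketched for a small constellation. The following uses a hypothetical Gray-labeled 4-ASK alphabet (2 bits per symbol) rather than the lecture's 16-QAM, and my own function and variable names:

```python
import numpy as np

# Hypothetical 2-bit ASK alphabet with Gray labeling, unit average energy.
SYMBOLS = np.array([-3.0, -1.0, 1.0, 3.0]) / np.sqrt(5.0)
LABELS = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])   # bits (c_1, c_2) per symbol

def demap_llrs(y, sigma2, La):
    """Extrinsic demapper LLRs L_dem(c_mu) - La(c_mu) for one received sample y.

    Numerator/denominator sum exp(-(y-x)^2 / sigma2) * prod_{nu != mu}
    exp(-c_nu(x) * La[nu]) over the symbols with c_mu = 0 and c_mu = 1;
    the a priori LLR of bit mu itself cancels and is excluded.
    """
    m = LABELS.shape[1]
    Le = np.empty(m)
    logp = -(y - SYMBOLS) ** 2 / sigma2               # channel log-likelihood per symbol
    for mu in range(m):
        # a priori contribution of all other bits of the label
        prior = sum(-LABELS[:, nu] * La[nu] for nu in range(m) if nu != mu)
        metric = logp + prior
        num = np.logaddexp.reduce(metric[LABELS[:, mu] == 0])
        den = np.logaddexp.reduce(metric[LABELS[:, mu] == 1])
        Le[mu] = num - den
    return Le
```

Note the extrinsic property: changing La[mu] leaves Le[mu] unchanged, since only the other bits' a priori LLRs enter the metric for bit mu.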

Soft-Output Demapping for 16-QAM

(Figure: 16-QAM constellations in the complex plane illustrating the symbol subsets entering L_dem(c_1) and L_dem(c_4).)

L_dem(c_µ) = ln [ Σ_{x: c_µ(x)=0} exp(−|y−x|²/σ_n²) Π_ν Pr{c_ν(x)} / Σ_{x: c_µ(x)=1} exp(−|y−x|²/σ_n²) Π_ν Pr{c_ν(x)} ]

System Model for BICM

(Figure: transmitter — channel encoder, interleaver Π, mapper producing x[k] from c_1[k], …, c_m[k]; receiver — soft demapper computing L_dem(c_ν) from y[k], deinterleaver Π⁻¹, channel decoder; the decoder's extrinsic LLRs L_dec(c_ν) = L_a(c_ν) are fed back through Π to the demapper.)

Selected Bit Mappings for 8-PSK

(Figure: 8-PSK constellations with the bit labelings Gray, natural, d1, d3, and Anti-Gray.)

EXtrinsic Information Transfer Charts

(Figure: EXIT chart I(c; L_dem) = I(c; L_a,dec) versus I(c; L_a,dem) = I(c; L_dec) at E_b/N_0 = 5 dB for the 8-PSK mappings Gray, natural, d1, d3 and Anti-Gray, together with the decoder curve of the BCH(8,4) code.)

- Demapper: a priori information → mutual information I(c; L_a,dem)
- Detection and decoding only once → Gray is best
- Iterative detection and decoding → Anti-Gray is best

Bit Error Rates

(Figure: BER versus E_b/N_0 in dB for the mappings Gray, natural, d1, d3 and Anti-Gray after 4 iterations.)

Simulation parameters: BCH(8,4) code, 8-PSK, Alamouti scheme, 36 coded bits per frame, independent Rayleigh fading, channel constant for 4 symbols.

- First detection and decoding: Gray good, Anti-Gray bad
- After four iterations, Anti-Gray is best
- Same results as predicted by the EXIT charts

Low Density Parity Check Codes

Definition and Properties of LDPC Codes
Iterative Decoding
Simulation Results

LDPC Codes

Low Density Parity Check Codes
- Invented by Robert G. Gallager in his PhD thesis, 1963
- Re-invented by David J.C. MacKay in 1999

LDPC codes are linear block codes with a sparse parity check matrix
- H contains relatively few ones spread among many zeros (for binary codes)
- Iteratively decoded on a factor graph of the check matrix

Advantages
- Good codes
- Low decoding complexity

Introduction

Recall: for every linear binary (n, k) code with code rate R_c = k/n
- There is a generator matrix G ∈ GF(q)^{k×n} such that code words x ∈ GF(q)^n and info words u ∈ GF(q)^k are related by x = u·G
- There is a parity-check matrix H ∈ GF(q)^{m×n} of rank{H} = n−k, such that x·H^T = 0
- Relation of generator and parity check matrix: G·H^T = 0

Regular LDPC Codes

Definition: A regular (d_v, d_c)-LDPC code of length n is defined by a parity-check matrix H ∈ GF(q)^{m×n} with d_v ones in each column and d_c ones in each row. The dimension of the code (info word length) is k = n − rank{H}.

Example: n = 8, m = 6, k = n − rank{H} = 4 (!), R_C = 1/2, d_v = 3, d_c = 4

(The 6×8 check matrix H of the example is shown on the slide.)

Regular LDPC Codes

Design rate: the true rate R_C and the design rate R_d are defined as

R_C = k/n   and   R_d = 1 − d_v/d_c   with   R_C ≥ R_d

Proof: counting the ones in the check matrix gives m·d_c = n·d_v, so m/n = d_v/d_c. Some parity check equations may be redundant, i.e., m ≥ n−k, and thus

k/n = 1 − (n−k)/n ≥ 1 − m/n = 1 − d_v/d_c

The check matrices can be constructed randomly or deterministically.

Encoding: LDPC codes are usually systematically encoded, i.e., by a systematic generator matrix

G = [ I_k | P ]  with  P ∈ GF(q)^{k×(n−k)}

The matrix P can be found by transforming H into another check matrix of the code that has the form

H̃ = [ P^T | I_{n−k} ]
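The systematic construction G = [I_k | P], H = [P^T | I_{n-k}] satisfies G·H^T = I·P + P·I = 2P = 0 over GF(2); a minimal sketch with an illustrative P of my own choosing:

```python
import numpy as np

def systematic_matrices(P):
    """Build G = [I_k | P] and H = [P^T | I_{n-k}] over GF(2) from a binary
    k x (n-k) matrix P. By construction G @ H.T = P + P = 0 (mod 2)."""
    k, r = P.shape
    G = np.hstack([np.eye(k, dtype=int), P]) % 2
    H = np.hstack([P.T, np.eye(r, dtype=int)]) % 2
    return G, H
```

Any encoded word x = u·G mod 2 then satisfies the parity checks x·H^T = 0 mod 2.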

Factor Graph

A factor graph of a code is a graphical representation of the code constraints defined by a parity-check matrix of this code: x·H^T = 0.

The factor graph is a bipartite graph with
- a variable node for each code symbol,
- a check node for each check equation,
- an edge between a variable node and a check node if the code symbol participates in the check equation.

Notice that each edge corresponds to one 1 in the check matrix.

Factor Graph

Example: x·H^T = [x_0, x_1, …, x_7]·H^T = 0 with n = 8 columns (code word length) and n−k = 6 parity check equations, e.g.

x_2 ⊕ x_3 ⊕ x_4 ⊕ x_5 = 0   (check node chk_0)

Each check node represents one row of the parity check matrix.

(Figure: bipartite factor graph with variable nodes x_0, …, x_7 on the left and check nodes chk_0, …, chk_5 on the right.)

Decoding with the Sum-Product Algorithm

Similar to turbo decoding, extrinsic information is exchanged:
- Check nodes collect extrinsic information from the connected variable nodes
- Variable nodes collect extrinsic information from the connected check nodes

Check node example: for the check x_2 ⊕ x_3 ⊕ x_4 ⊕ x_5 = 0, the extrinsic LLR toward x_2 is the boxed-plus of the other LLRs: E = L_3 ⊞ L_4 ⊞ L_5.

Variable node example: a variable node adds the channel LLR and the extrinsic LLRs E_k of all connected check nodes: A = Σ_k E_k, L = L_ch·y + A; the message toward check k excludes that check's own contribution E_k.

Iterative decoding procedure — also called message passing or belief propagation. Stop if x̂·H^T = 0.
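The check node update can be sketched with the tanh form of the boxed-plus; function names are my own:

```python
import math

def box_plus(L1, L2):
    """Boxed-plus of two LLRs (the LLR of the XOR of the two bits):
    L1 [+] L2 = 2 * atanh( tanh(L1/2) * tanh(L2/2) )."""
    return 2.0 * math.atanh(math.tanh(L1 / 2.0) * math.tanh(L2 / 2.0))

def check_node_extrinsic(llrs):
    """Extrinsic LLR toward each edge of a check node: the boxed-plus of all
    incoming LLRs except the edge's own."""
    out = []
    for i in range(len(llrs)):
        acc = None
        for j, L in enumerate(llrs):
            if j != i:
                acc = L if acc is None else box_plus(acc, L)
        out.append(acc)
    return out
```

Two handy properties: the sign of the result is the product of the input signs, and its magnitude never exceeds the smallest input magnitude (the least reliable bit dominates).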

Decoding with the Sum-Product Algorithm

First check equation: x_2 ⊕ x_3 ⊕ x_4 ⊕ x_5 = 0 — is the check equation fulfilled?

Extrinsic information (boxed-plus of the respective other LLRs):

L_e(x_2) = L(x_3) ⊞ L(x_4) ⊞ L(x_5)
L_e(x_3) = L(x_2) ⊞ L(x_4) ⊞ L(x_5)
L_e(x_4) = L(x_2) ⊞ L(x_3) ⊞ L(x_5)
L_e(x_5) = L(x_2) ⊞ L(x_3) ⊞ L(x_4)

The variable nodes are initialized with the channel LLRs L(x_i) = L_ch·y_i.

Decoding with the Sum-Product Algorithm

The second check equation (containing x_4 and x_5 among its variables) and the third check equation (containing x_3 and x_5 among its variables) are processed in the same way: each participating variable node receives as extrinsic information the boxed-plus combination of the LLRs of the other variables in that check.

Decoding with the Sum-Product Algorithm

Variable update: collect the extrinsic information of the check nodes and update the variable nodes:

L(x_i) = L_ch·y_i + A_i   with   A_i = Σ_k E_k(x_i)

where the sum runs over all check nodes connected to variable node x_i.
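The variable node update is a plain sum, with a leave-one-out rule for the messages sent back to the checks; a minimal sketch (names are my own):

```python
import numpy as np

def variable_node_update(Lch, E):
    """Total LLR and extrinsic variable-to-check messages for one variable node.

    Lch: channel LLR of the code bit; E: incoming check-node LLRs E_k.
    The message toward check k omits that check's own contribution E[k],
    so it equals total - E[k] (channel LLR plus the sum of the others).
    """
    total = Lch + E.sum()
    to_checks = total - E
    return total, to_checks
```

The hard decision is taken on the sign of the total LLR; decoding stops once the decisions satisfy all checks.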

Example: Binary Erasure Channel (BEC)

Channel model: input X ∈ {0, 1}; output Y ∈ {0, ?, 1}. A bit is erased with probability P_q and received correctly with probability 1 − P_q.

LLRs of the channel output:

L(y) = +∞ for y = 0,   L(y) = 0 for y = ?,   L(y) = −∞ for y = 1

(Figure: BEC transition diagram and the factor graph of the example with two erased code bits.)

Example: BEC

The check equations calculate extrinsic information. On the BEC, the boxed-plus takes a simple form: the extrinsic LLR is ±∞ if all other bits of the check are known (their XOR determines the erased bit), and 0 if at least one other bit is erased, e.g.

L_e(x_2) = L(x_3) ⊞ L(x_4) ⊞ L(x_5)

Variable check: a variable node recovers its bit as soon as one connected check delivers a non-zero extrinsic LLR:

L(x_i) = L_ch·y_i + Σ_k L_e,k(x_i)

In the example, the erasures are resolved successively in this way.

Irregular LDPC Codes

Properties:
- Generalization of regular LDPC codes
- Lower error rates, i.e., better performance
- Irregular number of ones per column and per row
- Variable nodes of different degrees
- Check nodes of different degrees

Example: 5 variable nodes of degree 3, 1 of degree 4, 2 of degree 5; 3 check nodes of degree 4, 1 of degree 5, 2 of degree 6.

(Figure: irregular check matrix H and the corresponding factor graph with variable nodes x_0, …, x_7 and check nodes chk_0, …, chk_5.)

Irregular LDPC Codes

Irregular number of ones per column and per row:
- ν_i: proportion of left (variable) nodes of degree i
- r_i: proportion of right (check) nodes of degree i

In the example: ν_3 = 5/8, ν_4 = 1/8, ν_5 = 2/8 and r_4 = 3/6, r_5 = 1/6, r_6 = 2/6

Proportions of edges:
- λ_i: proportion of edges incident to left nodes of degree i
- ρ_i: proportion of edges incident to right nodes of degree i

In the example (29 edges): λ_3 = 15/29, λ_4 = 4/29, λ_5 = 10/29 and ρ_4 = 12/29, ρ_5 = 5/29, ρ_6 = 12/29
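The node-perspective and edge-perspective distributions are related by weighting each degree-d fraction with d (a degree-d node carries d edges); a minimal sketch, checked against the example numbers above as I reconstructed them:

```python
from fractions import Fraction

def edge_perspective(node_fracs):
    """Convert node-degree fractions {degree: fraction of nodes} into
    edge-degree fractions {degree: fraction of edges}: lambda_d is
    proportional to d * nu_d, normalized over all degrees."""
    total = sum(d * f for d, f in node_fracs.items())
    return {d: Fraction(d) * f / total for d, f in node_fracs.items()}
```

The same conversion applies to the check side (r_i to rho_i), since the normalizing constant is the average degree on that side.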

Irregular LDPC Codes

- LDPC codes are optimized via density evolution or EXIT analysis: probability density functions describe the distribution of check and variable node degrees in a parity check matrix
- Specific codes can be found via random code generation following these distributions
- The PDFs will only be approximately fulfilled due to the finite number of checks and variables
- Quality may vary within such an ensemble of codes due to the random generation

Example: R_c = 1/2 LDPC code with n = 4096 and k = 2048; the variable node degree distribution includes the degrees 3, 6 and 7, the check node degree distribution the degrees 8 and 9, each with tabulated fractions and node counts.