State-of-the-Art Channel Coding


State-of-the-Art Channel Coding
Prof. Dr.-Ing. Volker Kühn
Institute of Communications Engineering, University of Rostock, Germany
Email: volker.kuehn@uni-rostock.de
http://www.int.uni-rostock.de/

Outline of Lectures
Lesson 1: One Lesson of Information Theory
- Principle structure of communication systems
- Definitions of entropy and mutual information
- Channel coding theorem of Shannon
Lesson 2: Introduction to Error Correcting Codes
- Basics of error correcting codes
- Linear block codes
- Convolutional codes (if time permits)
Lesson 3: State-of-the-Art Channel Coding
- Coding strategies to approach the capacity limits
- Definition of soft information and the turbo decoding principle
- Examples of state-of-the-art error correcting codes

Short Review of Milestones
1948: Shannon defines his information theory
- Definition of entropy and mutual information
- Channel coding theorem
1963: Robert G. Gallager: Low-Density Parity-Check Codes
1966: G. David Forney: Concatenated Codes
- Computers at that time were not powerful enough to demonstrate the potential of the investigated coding schemes
- Turbo decoding was implicitly already invented
1993: First presentation of Turbo Codes by Berrou, Glavieux, et al.
- Approaching Shannon's capacity for a half-rate code to within 0.5 dB
2001: Stephan ten Brink: EXIT Chart Analysis
- Leads to further understanding of iterative decoding principles
- Allows design / optimization of powerful concatenated codes
- Repeat Accumulate code approaches capacity up to 0.8 dB

Potential of Turbo Codes
[Plot: bit error rate P_b versus 10 log10(Eb/N0), comparing rate Rc = 1/2 convolutional codes with constraint lengths Lc = 3, 5, 7, 9 against a turbo code (TC)]
- Optimized interleaver of length 256 x 256 = 65536 bits
- For this interleaver, a gain of nearly 3 dB over the convolutional code with Lc = 9
- Gap to Shannon's channel capacity only 0.5 dB
- Tremendous performance loss for smaller interleavers
- World record: 0.1 dB gap to the Shannon capacity by Stephan ten Brink

Serial and Parallel Code Concatenation
Serial code concatenation, example: Repeat Accumulate Codes
[Block diagram: outer code C1, interleaver, inner code C2, with corresponding decoders D1 and D2]
Parallel code concatenation, example: Turbo Codes
[Block diagram: encoders C1, C2, ..., Cq fed with interleaved versions of the input; outputs punctured (P) and multiplexed]

Interleaving
Simple block interleaver (interleaving depth 5): the input is written column-wise and read row-wise

x0  x3  x6  x9   x12
x1  x4  x7  x10  x13
x2  x5  x8  x11  x14

Input sequence:  x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, x13, x14
Output sequence: x0, x3, x6, x9, x12, x1, x4, x7, x10, x13, x2, x5, x8, x11, x14
Further types: convolutional interleaver, random interleaver
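
As a small illustration of this read/write pattern, the following sketch implements the block interleaver above and its inverse (plain Python; the function names are my own):

```python
def block_interleave(seq, depth):
    """Block interleaver: write the input column-wise into a matrix with
    `depth` columns, then read it out row-wise."""
    assert len(seq) % depth == 0, "sequence length must be a multiple of the depth"
    rows = len(seq) // depth
    matrix = [[seq[c * rows + r] for c in range(depth)] for r in range(rows)]
    return [x for row in matrix for x in row]

def block_deinterleave(seq, depth):
    """Inverse operation: write row-wise, read column-wise."""
    rows = len(seq) // depth
    matrix = [seq[r * depth:(r + 1) * depth] for r in range(rows)]
    return [matrix[r][c] for c in range(depth) for r in range(rows)]

x = [f"x{i}" for i in range(15)]
y = block_interleave(x, depth=5)
# y == ['x0', 'x3', 'x6', 'x9', 'x12', 'x1', 'x4', ...] as on the slide
assert block_deinterleave(y, depth=5) == x
```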

Simple Example of Serial Concatenation
Concatenation of a (3,2,2)-SPC code C1 (outer) and a (4,3,2)-SPC code C2 (inner)
Total code rate: Rc = 2/4 = 0.5

u    c1    c2     w_H(c2)
00   000   0000   0
01   011   0110   2
10   101   1010   2
11   110   1100   2

d_min = 2
Minimum Hamming distance is not improved by code concatenation
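
This can be checked by enumerating all codewords of the concatenation; the sketch below does so for the two single parity-check codes (helper names are my own):

```python
from itertools import product

def spc_encode(bits):
    """Single parity-check (n, n-1, 2) code: append the XOR of all input bits."""
    parity = 0
    for b in bits:
        parity ^= b
    return list(bits) + [parity]

# serial concatenation: (3,2,2)-SPC (outer) followed by (4,3,2)-SPC (inner)
codewords = [spc_encode(spc_encode(u)) for u in product((0, 1), repeat=2)]

# for a linear code, d_min equals the minimum nonzero Hamming weight
d_min = min(sum(c) for c in codewords if any(c))
print(d_min)   # -> 2: the concatenation does not improve the minimum distance
```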

Another Example of Serial Concatenation
Concatenation of a (4,3,2)-SPC code C1 (outer) and a (7,4,3)-Hamming code C2 (inner)
Total code rate: Rc = 3/7
[Table: the 8 information words u, the outer codewords c1, the inner codewords c2 and their Hamming weights w_H(c2), for the original mapping and for an optimized mapping of outer codewords onto inner information words]
- Original concatenation: d_min = 3
- Optimized concatenation: d_min = 4
Minimum Hamming distance can only be improved by careful selection of a subset of the inner code

Serial Code Concatenation: Product Codes
- Information bits arranged in a (k2 x k1) matrix u
- Row-wise encoding with code C1 of rate k1/n1 yields the parity block p1
- Column-wise encoding with code C2 of rate k2/n2 yields the parity block p2 and the checks on checks p+
- Entire code rate: Rc = (k1 k2) / (n1 n2) = Rc,1 * Rc,2
- Minimum Hamming distance: d_min = d_min,1 * d_min,2
[Diagram: n2 x n1 code matrix partitioned into the information block u, horizontal parity p1, vertical parity p2 and checks on checks p+]

Examples of Product Codes (1)
(12,6,4) product code

x0  x4  x8
x1  x5  x9
x2  x6  x10
x3  x7  x11

- Horizontal: (3,2,2)-SPC code (no error correction possible)
- Vertical: (4,3,2)-SPC code (no error correction possible)
- Code rate: 1/2
- d_min = 2 * 2 = 4, so correction of 1 error is possible
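
The sketch below builds exactly this (12,6,4) product code from its two single parity-check component codes; the encoding order and helper names are illustrative assumptions:

```python
import numpy as np

def spc_encode_rows(bits):
    """Append a parity column: each row becomes a single parity-check codeword."""
    parity = bits.sum(axis=1, keepdims=True) % 2
    return np.hstack([bits, parity])

def product_encode(u):
    """Product code: encode the information matrix row-wise, then encode every
    column (including the new parity column) column-wise (checks on checks)."""
    rows_encoded = spc_encode_rows(u)            # rows -> (3,2,2)-SPC codewords
    return spc_encode_rows(rows_encoded.T).T     # columns -> (4,3,2)-SPC codewords

u = np.array([[1, 0],
              [0, 1],
              [1, 1]])        # 3 x 2 information bits (k = 6)
c = product_encode(u)         # 4 x 3 codeword matrix of the (12,6,4) product code, Rc = 1/2
print(c)
```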

Examples of Product Codes (2)
(28,12,6) product code

x0  x7   x14  x21
x1  x8   x15  x22
x2  x9   x16  x23
x3  x10  x17  x24
x4  x11  x18  x25
x5  x12  x19  x26
x6  x13  x20  x27

- Horizontal: (4,3,2)-SPC code (no error correction possible)
- Vertical: (7,4,3)-Hamming code (single error correction possible)
- d_min = 2 * 3 = 6, so 2 errors are correctable

Parallel Code Concatenation
[Diagram: k2 x k1 information matrix u with horizontal parity block p1 (from C1) and vertical parity block p2 (from C2); no checks on checks]
- Information bits u row-wise encoded with C1 and column-wise encoded with C2
- Parity check bits of the component codes are not encoded again (no checks on checks)
- Entire code rate: Rc = (k1 k2) / (n1 n2 - (n1 - k1)(n2 - k2)) = 1 / (1/Rc,1 + 1/Rc,2 - 1)
- Minimum Hamming distance: d_min = d_min,1 + d_min,2 - 1

Example of Turbo Code
- 2 systematic, recursive convolutional encoders (Lc = 3)
- Constituent code rates Rc,1 = Rc,2 = 2/3, total code rate Rc = 1/2
[Block diagram: the information sequence u is passed through as systematic output and fed to encoder C1 (shift register with two delay elements T); an interleaved copy u2 is fed to encoder C2; the two parity streams c1 and c2 are punctured (P)]
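
A minimal sketch of this structure with memory-2 RSC constituent encoders, a random interleaver and alternating puncturing of the parity streams to reach rate 1/2; the generator polynomials and helper names are illustrative assumptions, not taken from the slides:

```python
import random

def rsc_parity(u):
    """Parity stream of a memory-2 recursive systematic convolutional encoder
    (Lc = 3), here with feedback 1 + D + D^2 and feedforward 1 + D^2
    (an illustrative generator choice; trellis termination is omitted)."""
    s1 = s2 = 0
    parity = []
    for bit in u:
        a = bit ^ s1 ^ s2          # recursive feedback
        parity.append(a ^ s2)      # feedforward combination
        s1, s2 = a, s1
    return parity

def turbo_encode(u, interleaver):
    """Parallel concatenation: systematic bits plus two RSC parity streams,
    with alternate parity bits punctured to obtain an overall rate of 1/2."""
    p1 = rsc_parity(u)
    p2 = rsc_parity([u[i] for i in interleaver])
    out = []
    for i, bit in enumerate(u):
        out.append(bit)                                # systematic bit
        out.append(p1[i] if i % 2 == 0 else p2[i])     # punctured parity bit
    return out

u = [random.randint(0, 1) for _ in range(16)]
interleaver = random.sample(range(len(u)), len(u))
c = turbo_encode(u, interleaver)
print(len(u), len(c))    # 16 information bits -> 32 code bits (Rc = 1/2)
```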

Turbo Code from Berrou and Glavieux
- 2 systematic, recursive convolutional encoders (Lc = 5)
- Constituent code rates Rc,1 = Rc,2 = 2/3, total code rate Rc = 1/2
- Pseudo-random interleaver of length 65536 bits
[Block diagram: as before, but each constituent encoder C1, C2 contains four delay elements T; parity streams punctured (P)]

Repeat Accumulate Code from ten Brink
- Outer half-rate repetition code
- Inner convolutional code (scrambler) of rate 1, so the total code rate is Rc = 1/2
- Random interleaver of different lengths
- Code doping: replace a few code bits (1%) by information bits to improve the convergence of the iterative decoding process
[Block diagram: u -> repetition encoder -> interleaver -> rate-1 inner recursive convolutional code (delay elements T)]
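
A minimal sketch of such an encoder (repetition factor 2, random interleaver, and a 1/(1+D) accumulator as the rate-1 inner scrambler; the accumulator choice and the omission of code doping are my own simplifications):

```python
import random

def ra_encode(u, interleaver):
    """Repeat accumulate encoder: rate-1/2 repetition, interleaving,
    then a rate-1 accumulator (recursive scrambler 1/(1+D))."""
    repeated = [bit for bit in u for _ in range(2)]       # outer repetition code
    permuted = [repeated[i] for i in interleaver]         # random interleaver
    acc, out = 0, []
    for bit in permuted:                                  # inner rate-1 accumulator
        acc ^= bit
        out.append(acc)
    return out

u = [random.randint(0, 1) for _ in range(8)]
interleaver = random.sample(range(2 * len(u)), 2 * len(u))
c = ra_encode(u, interleaver)
print(len(u), len(c))    # 8 information bits -> 16 code bits (Rc = 1/2)
```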

Outline of Lectures
Lesson 1: One Lesson of Information Theory
- Principle structure of communication systems
- Definitions of entropy and mutual information
- Channel coding theorem of Shannon
Lesson 2: Introduction to Error Correcting Codes
- Basics of error correcting codes
- Linear block codes
- Convolutional codes (if time permits)
Lesson 3: State-of-the-Art Channel Coding
- Coding strategies to approach the capacity limits
- Definition of soft information and the turbo decoding principle
- Examples of state-of-the-art error correcting codes

Log-Likelihood Ratios (LLRs)
Definition of the log-likelihood ratio:
  L(x) = log( Pr{x = +1} / Pr{x = -1} )
- Sign determines the hard decision
- Magnitude represents the reliability of the hard decision
Probability of a correct decision:
  P_correct = e^|L(x)| / (1 + e^|L(x)|)
Expectation of x (soft bit):
  E{x} = Pr{x = +1} - Pr{x = -1} = (e^L(x) - 1) / (e^L(x) + 1) = tanh( L(x)/2 )
[Plots: P_correct and the soft bit E{x} as functions of L(x)]
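
These relations can be checked numerically; a small sketch (the example probability is my own choice):

```python
import math

def llr(p_plus):
    """LLR of a binary variable x in {+1, -1} with Pr{x = +1} = p_plus."""
    return math.log(p_plus / (1.0 - p_plus))

def prob_correct(L):
    """Probability that the hard decision sign(L) is correct."""
    return math.exp(abs(L)) / (1.0 + math.exp(abs(L)))

def soft_bit(L):
    """Soft bit E{x} = Pr{x = +1} - Pr{x = -1} = tanh(L/2)."""
    return math.tanh(L / 2.0)

L = llr(0.9)                              # Pr{x = +1} = 0.9  ->  L ~ 2.2
print(prob_correct(L), soft_bit(L))       # -> 0.9, 0.8
```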

Log-Likelihood Ratios at the AWGN Channel Output
Scaled matched filter output equals the LLR:
  L(y|x) = log( p(y|x = +1) / p(y|x = -1) )
         = log( exp[-(y - 1)^2 / (2 sigma_N^2)] / exp[-(y + 1)^2 / (2 sigma_N^2)] )
         = 2/sigma_N^2 * y
[Plot: L(y|x) versus y for different SNRs (2 dB, 4 dB, 6 dB, 8 dB, ...)]
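
A small sketch of this scaling for BPSK over an AWGN channel (the noise variance is an example value):

```python
import numpy as np

def channel_llr(y, sigma2):
    """Channel LLR for BPSK (x in {+1, -1}) over an AWGN channel with
    noise variance sigma2: L(y|x) = 2 * y / sigma2."""
    return 2.0 * y / sigma2

sigma2 = 0.5                                   # example noise variance
x = np.array([+1, -1, +1, +1])                 # BPSK symbols
y = x + np.sqrt(sigma2) * np.random.randn(4)   # received matched filter outputs
print(channel_llr(y, sigma2))                  # soft channel values L_ch * y
```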

Example for Soft-Output Decoding
Single parity check code with codeword (u1, u2, p), where p = u1 XOR u2
Question: What is the LLR of u1, given the LLRs of u2 and p?
Resolving the parity check equation with respect to u1: u1 = u2 XOR p
The extrinsic LLR does not depend on u1 itself:
  L_e(u1) = log( Pr{u2 XOR p = 0} / Pr{u2 XOR p = 1} )
          = log( (Pr{u2 = 0, p = 0} + Pr{u2 = 1, p = 1}) / (Pr{u2 = 0, p = 1} + Pr{u2 = 1, p = 0}) )
          = log( (Pr{u2 = 0} Pr{p = 0} + Pr{u2 = 1} Pr{p = 1}) / (Pr{u2 = 0} Pr{p = 1} + Pr{u2 = 1} Pr{p = 0}) )
          = 2 atanh( tanh(L(u2)/2) * tanh(L(p)/2) )
          = 2 atanh( E{u2} * E{p} )

L-Algebra
Mod-2 sum of 2 statistically independent random variables:
  L(x1 XOR x2) = 2 atanh( tanh(L(x1)/2) * tanh(L(x2)/2) )
               ≈ sgn(L(x1)) * sgn(L(x2)) * min( |L(x1)|, |L(x2)| )
Mod-2 sum of n variables:
  L(x1 XOR ... XOR xn) = 2 atanh( prod_{i=1..n} tanh(L(xi)/2) )
                       ≈ prod_{i=1..n} sgn(L(xi)) * min_i |L(xi)|
[Plots: tanh(x/2) and its inverse 2 atanh(x)]
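
Both the exact tanh rule and the min-sum approximation are easy to state in code; a sketch with my own helper names:

```python
import math
from functools import reduce

def boxplus(llrs):
    """Exact LLR of the mod-2 sum of independent binary variables:
    L(x1 xor ... xor xn) = 2 * atanh( prod_i tanh(L(xi)/2) )."""
    prod = reduce(lambda a, b: a * b, (math.tanh(L / 2.0) for L in llrs))
    prod = max(min(prod, 1 - 1e-12), -1 + 1e-12)   # numerical guard for atanh
    return 2.0 * math.atanh(prod)

def boxplus_minsum(llrs):
    """Min-sum approximation: product of the signs times the smallest magnitude."""
    sign = reduce(lambda a, b: a * b, (1 if L >= 0 else -1 for L in llrs))
    return sign * min(abs(L) for L in llrs)

llrs = [2.5, -1.0, 4.0]
print(boxplus(llrs), boxplus_minsum(llrs))   # ~ -0.80 (exact) vs. -1.0 (approximation)
```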

General Approach for Soft-Output Decoding
For systematic encoders, the soft output of the decoder can be split into 3 statistically independent parts:

  L(û_i) = log[ p(u_i = 0, y) / p(u_i = 1, y) ]
         = log[ sum_{c in Γ_i^(0)} p(y|x) Pr{c} / sum_{c in Γ_i^(1)} p(y|x) Pr{c} ]
         = log[ p(y_i | x_i = +1) / p(y_i | x_i = -1) ]       <- intrinsic LLR L_ch y_i (systematic part)
         + log[ Pr{u_i = 0} / Pr{u_i = 1} ]                   <- a-priori LLR L_a(u_i)
         + log[ sum_{c in Γ_i^(0)} prod_{j=1, j!=i}^{n} p(y_j | x_j) prod_{j=1, j!=i}^{k} Pr{c_j}
              / sum_{c in Γ_i^(1)} prod_{j=1, j!=i}^{n} p(y_j | x_j) prod_{j=1, j!=i}^{k} Pr{c_j} ]   <- extrinsic LLR L_e(û_i)

Here Γ_i^(0) and Γ_i^(1) denote the sets of codewords with u_i = 0 and u_i = 1, respectively.

Outline of Lectures
Lesson 1: One Lesson of Information Theory
- Principle structure of communication systems
- Definitions of entropy and mutual information
- Channel coding theorem of Shannon
Lesson 2: Introduction to Error Correcting Codes
- Basics of error correcting codes
- Linear block codes
- Convolutional codes (if time permits)
Lesson 3: State-of-the-Art Channel Coding
- Coding strategies to approach the capacity limits
- Definition of soft information and the turbo decoding principle
- Examples of state-of-the-art error correcting codes

Soft-Output Decoding for the (4,3,2)-SPC Code
Es/N0 = 2 dB
BPSK codeword:                   x = (-1, +1, -1, +1)
Received values (AWGN):          y = (-0.8, +1.1, +0.3, +0.4)
Hard decision on y:              (-, +, +, +)   -> error detected, but not corrected
Channel LLRs:                    L_ch y = (-5.1, +7.0, +1.9, +2.5)
Extrinsic LLRs (approximation):  L_e(û) = (+1.9, -1.9, -2.5, -1.9)
Soft output:                     L(û) = L_ch y + L_e(û) = (-3.2, +5.1, -0.6, +0.6)
Hard decision on L(û):           (-, +, -, +)   -> error corrected
Pr{û correct}:                   (0.96, 0.99, 0.65, 0.65)
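
The numbers in this example can be reproduced with a few lines of code (a sketch; it uses the min-sum box-plus approximation from the L-algebra slide and the standard relation L_ch = 4 Es/N0 for BPSK on the AWGN channel):

```python
import numpy as np

y     = np.array([-0.8, +1.1, +0.3, +0.4])   # received values (transmitted BPSK word: -1, +1, -1, +1)
es_n0 = 10 ** (2 / 10)                       # Es/N0 = 2 dB
Lch_y = 4 * es_n0 * y                        # channel LLRs, L_ch = 2/sigma^2 = 4*Es/N0

# extrinsic LLR of bit i: min-sum box-plus over all other bits of the SPC codeword
Le = np.empty_like(Lch_y)
for i in range(len(Lch_y)):
    others = np.delete(Lch_y, i)
    Le[i] = np.prod(np.sign(others)) * np.min(np.abs(others))

L = Lch_y + Le                               # soft output
print(np.round(Lch_y, 1))   # [-5.1  7.   1.9  2.5]
print(np.round(Le, 1))      # [ 1.9 -1.9 -2.5 -1.9]
print(np.round(L, 1))       # [-3.2  5.1 -0.6  0.6]  -> hard decisions -, +, -, + (error corrected)
```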

Turbo Decoding of the (24,16,3) Product Code (1)
[Worked example: a 4 x 4 block of information bits u, protected by one parity bit per row and per column (parallel concatenation, no checks on checks), is BPSK-modulated and transmitted over an AWGN channel at an SNR of 2 dB; the slides show the matrices of channel LLRs L_ch y and of all intermediate LLRs.]
Step 1: vertical extrinsic decoding; the column-wise extrinsic information L_e,1(û) is used as a-priori information L_a,1(û) = L_e,1(û) for the following decoding step, and the soft output is L_1(û) = L_ch y + L_e,1(û).

Turbo Decoding of the (24,16,3) Product Code (2)
Step 2: horizontal decoding with input L_ch y + L_a,1(û); the row-wise extrinsic information becomes the a-priori information L_a,2(û) for the next iteration, and a first hard decision û_1 is taken from the accumulated soft output L_ch y + L_e(û) + L_a(û).

Turbo Decoding of the (24,16,3) Product Code (3)
Second iteration: vertical decoding with input L_ch y + L_a,2(û), followed by horizontal decoding, again exchanging extrinsic information L_e,2(û); the hard decision û_2 is taken after this iteration.

Turbo Decoding of the (24,16,3) Product Code (4)
Third iteration: vertical decoding with input L_ch y + L_a,3(û), followed by horizontal decoding with extrinsic information L_e,3(û); the hard decision û_3 is taken after this iteration.

General Concept of Turbo Decoding
[Block diagram: decoders D1 and D2 exchange extrinsic information; each decoder receives the channel LLRs L_ch y and the extrinsic output of the other decoder as a-priori information L_a(û), and splits its soft output into L(û) = L_ch y_s + L_a(û) + L_e(û)]
- Each decoder supplies its extrinsic information as a-priori information to the other decoder
- L_e(û) is incorporated in L(û) for systematic encoders
- Improvement by additional decoding iterations with a-priori knowledge if L_e(û), L_a(û) and L_ch y_s are statistically independent
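
A compact sketch of this exchange, applied to the parallel SPC product code from the preceding example (min-sum extrinsic values; the helper names, the decoding schedule details and the test LLRs are illustrative assumptions):

```python
import numpy as np

def spc_extrinsic(L):
    """Min-sum extrinsic LLRs for one SPC codeword, given its input LLRs."""
    Le = np.empty_like(L)
    for i in range(len(L)):
        others = np.delete(L, i)
        Le[i] = np.prod(np.sign(others)) * np.min(np.abs(others))
    return Le

def turbo_decode(Lch_info, Lch_rows, Lch_cols, iterations=3):
    """Iterative decoding of a parallel SPC product code (no checks on checks).
    Lch_info: channel LLRs of the information block, Lch_rows / Lch_cols:
    channel LLRs of the row / column parity bits."""
    La = np.zeros_like(Lch_info)                  # a-priori info for the vertical decoder
    for _ in range(iterations):
        # vertical (column-wise) decoding
        Le_col = np.array([spc_extrinsic(np.append(Lch_info[:, c] + La[:, c], Lch_cols[c]))[:-1]
                           for c in range(Lch_info.shape[1])]).T
        # horizontal (row-wise) decoding with the vertical extrinsic info as a-priori
        Le_row = np.array([spc_extrinsic(np.append(Lch_info[r] + Le_col[r], Lch_rows[r]))[:-1]
                           for r in range(Lch_info.shape[0])])
        La = Le_row                               # a-priori for the next iteration
    L = Lch_info + Le_col + Le_row                # final soft output
    return (L < 0).astype(int)                    # hard decision (L > 0 -> bit 0)

# example: all-zero codeword, one badly corrupted information bit
Lch_info = np.array([[4.0, 3.0, 5.0],
                     [2.5, -1.5, 4.5],            # negative LLR: channel favors the wrong bit
                     [3.5, 2.0, 6.0]])
Lch_rows = np.array([3.0, 2.0, 4.0])
Lch_cols = np.array([2.5, 3.0, 3.5])
print(turbo_decode(Lch_info, Lch_rows, Lch_cols))   # -> all zeros, the error is corrected
```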

Simulation Results for Product Codes
(7,4,3)-Hamming codes, parallel concatenation
[Plot: bit error rate P_b versus 10 log10(Eb/N0) for decoding iterations 1, 2, 3 and the analytical bound]

Simulation Results for Product Codes
(15,11,3)-Hamming codes, parallel concatenation
[Plot: bit error rate P_b versus 10 log10(Eb/N0) for decoding iterations 1, 2, 3 and the analytical bound]

Simulation Results for Product Codes
(31,26,3)-Hamming codes, parallel concatenation
[Plot: bit error rate P_b versus 10 log10(Eb/N0) for decoding iterations 1, 2, 3 and the analytical bound]

Simulation Results for Turbo Codes (Lc = 3)
Simple block interleaver: no significant improvements after the third decoding iteration
[Plots: bit error rate P_b versus 10 log10(Eb/N0) for iterations 1 to 6, for a 10x10 and a 30x30 block interleaver]

Simulation Results for Turbo Codes (Lc = 3)
Block and random interleavers
- Iterative process gains significantly even after the sixth iteration
- Increasing the interleaver size improves performance remarkably
[Plots: bit error rate P_b versus Eb/N0 in dB; left: 900-bit random interleaver with Rc = 1/3 over several iterations; right: comparison of different interleavers (convolutional code with Lc = 9, block interleavers BIL-100, BIL-400, BIL-900, random interleaver RIL-900, and RIL-900 with Rc = 1/3)]

Repeat Accumulate Code by ten Brink
- Half-rate outer repetition encoder
- Rate-one inner recursive convolutional encoder
- Approximately 100 decoding iterations are needed
[Plot: BER versus Eb/N0 in dB (roughly 0.1 to 0.6 dB)]

Repeat Accumulate Code by ten Brink (figure-only slide)

Application Areas of Turbo Detection
- Application of turbo processing is not restricted to concatenated codes; it is applicable to any concatenated system
- Concatenation of source and channel coding (exploitation of residual redundancy from source coding)
- Concatenation of coding and modulation (bit-interleaved coded modulation)
- Channel equalization and decoding can be performed iteratively
- Multi-user detection and decoding can be performed iteratively

Thanks for your attention!
Volker Kühn - State-of-the-Art Channel Coding, Universität Rostock