An Introduction to Low-Density Parity-Check Codes


An Introduction to Low-Density Parity-Check Codes. Paul H. Siegel, Electrical and Computer Engineering, University of California, San Diego. Copyright 2007 by Paul H. Siegel.

Outline: Shannon's Channel Coding Theorem; Error-Correcting Codes: State-of-the-Art; LDPC Code Basics: Encoding, Decoding; LDPC Code Design: Asymptotic performance analysis, Design optimization.

Outline (continued): EXIT Chart Analysis; Applications: Binary Erasure Channel, Binary Symmetric Channel, AWGN Channel, Rayleigh Fading Channel, Partial-Response Channel; Basic References.

A Noisy Communication System. [Shannon's block diagram: an information source produces a message, the transmitter maps it to a signal, the channel adds noise, and the receiver maps the received signal back to a message for the destination.]

Channels. Binary erasure channel BEC(ε): each transmitted bit is received correctly with probability 1 − ε and erased (output ?) with probability ε. Binary symmetric channel BSC(p): each transmitted bit is received correctly with probability 1 − p and flipped with probability p.

More Channels. Additive white Gaussian noise channel AWGN: inputs ±√P (power P); the output is the input plus zero-mean Gaussian noise, so the two conditional output densities f(y | +√P) and f(y | −√P) are Gaussians centered at ±√P.

Shannon Capacity. Every communication channel is characterized by a single number C, called the channel capacity. It is possible to transmit information over this channel reliably (with probability of error → 0) if and only if R ≜ (# information bits)/(channel use) < C.

Channels and Capacities. Binary erasure channel BEC(ε): C = 1 − ε. Binary symmetric channel BSC(p): C = 1 − H₂(p), where H₂(p) = −p log₂ p − (1 − p) log₂(1 − p). [Plots of capacity versus ε and p.]

More Channels and Capacities. Additive white Gaussian noise channel AWGN: C = ½ log₂(1 + P/σ²). [Plot of the two conditional output densities.]
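A minimal Python sketch evaluating these capacity formulas numerically (function names are illustrative, not from the slides):

```python
import math

def capacity_bec(eps):
    """Capacity of the binary erasure channel BEC(eps): C = 1 - eps."""
    return 1.0 - eps

def binary_entropy(p):
    """H2(p) = -p log2 p - (1-p) log2(1-p), with H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def capacity_bsc(p):
    """Capacity of the binary symmetric channel BSC(p): C = 1 - H2(p)."""
    return 1.0 - binary_entropy(p)

def capacity_awgn(P, sigma2):
    """Shannon capacity of the AWGN channel with power P and noise variance sigma^2."""
    return 0.5 * math.log2(1.0 + P / sigma2)

print(capacity_bec(0.3))        # 0.7
print(capacity_bsc(0.11))       # ~0.5
print(capacity_awgn(1.0, 1.0))  # 0.5
```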

Coding. We use a code to communicate over the noisy channel. Source: x = (x₁, x₂, …, x_k) → Encoder: c = (c₁, c₂, …, c_n) → Channel → y = (y₁, y₂, …, y_n) → Decoder: x̂ = (x̂₁, x̂₂, …, x̂_k) → Sink. Code rate: R = k/n.

Shannon's Coding Theorems. If C is a code with rate R > C, then the probability of error in decoding this code is bounded away from 0. (In other words, at any rate R > C, reliable communication is not possible.) C = max (H(x) − H_y(x)). For any information rate R < C and any δ > 0, there exists a code C of length n_δ and rate R, such that the probability of error in maximum-likelihood decoding of this code is at most δ. Proof: non-constructive!

Review of Shannon's Paper. A pioneering paper: Shannon, C. E., "A mathematical theory of communication," Bell System Tech. J. 27 (1948), 379-423, 623-656. A regrettable review: Doob, J. L., Mathematical Reviews, MR0026286 (10,133e): "The discussion is suggestive throughout, rather than mathematical, and it is not always clear that the author's mathematical intentions are honorable." Cover, T., "Shannon's Contributions to Shannon Theory," AMS Notices, vol. 49, no. 1, January 2002: "Doob has recanted this remark many times, saying that it and his naming of super martingales (processes that go down instead of up) are his two big regrets."

Finding Good Codes. Ingredients of Shannon's proof: random code, large block length, optimal decoding. Problem: randomness + large block length + optimal decoding = COMPLEXITY!

Solution: State-of-the-Art. Long, structured, pseudorandom codes; practical, near-optimal decoding algorithms. Examples: Turbo codes (1993), low-density parity-check (LDPC) codes (1960, 1999). State-of-the-art: Turbo codes and LDPC codes have brought Shannon limits to within reach on a wide range of channels.

Evolution of Coding Technology. [Chart showing the place of LDPC codes in coding history, from Trellis and Turbo Coding, Schlegel and Perez, IEEE Press, 2004.]

Linear Block Codes - Basics. Parameters of a binary linear block code C: k = number of information bits, n = number of code bits, R = k/n, d_min = minimum distance. There are many ways to describe C: codebook (list), parity-check matrix / generator matrix, graphical representation ("Tanner graph").

Example: (7,4) Hamming Code. (n,k) = (7,4), R = 4/7, d_min = 3: single-error correcting, double-erasure correcting. Encoding rule: 1. Insert data bits in positions 1, 2, 3, 4. 2. Insert parity bits in positions 5, 6, 7 to ensure an even number of 1s in each circle. [Venn diagram of three overlapping circles containing bit positions 1-7.]

Example: (7,4) Hamming Code. 2^k = 16 codewords. A systematic encoder places the input bits in positions 1, 2, 3, 4; the parity bits are in positions 5, 6, 7.

Hamming Code Parity Checks. [The three circles of the Venn diagram correspond to the three rows of the parity-check matrix; each row indicates which of positions 1-7 that check involves.]

Hamming Code: Matrix Perspective. Parity-check matrix H: H cᵀ = 0, where c = [c₁, c₂, c₃, c₄, c₅, c₆, c₇] and
H = [ 1 1 1 0 1 0 0
      1 0 1 1 0 1 0
      1 1 0 1 0 0 1 ].
Generator matrix G: c = u G, where u = [u₁, u₂, u₃, u₄] and
G = [ 1 0 0 0 1 1 1
      0 1 0 0 1 0 1
      0 0 1 0 1 1 0
      0 0 0 1 0 1 1 ].

Parity-Check Equations. The parity-check matrix implies a system of linear equations:
c₁ + c₂ + c₃ + c₅ = 0
c₁ + c₃ + c₄ + c₆ = 0
c₁ + c₂ + c₄ + c₇ = 0.
The parity-check matrix is not unique: any set of vectors that spans the rowspace generated by H can serve as the rows of a parity-check matrix (including sets with more than 3 vectors).

Hamming Code: Tanner Graph. Bipartite graph representing the parity-check equations. Variable nodes c₁, …, c₇ connect to the three check nodes enforcing
c₁ + c₂ + c₃ + c₅ = 0
c₁ + c₃ + c₄ + c₆ = 0
c₁ + c₂ + c₄ + c₇ = 0.
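To make the matrix description concrete, here is a small numpy sketch (illustrative, using the H and G written out above) that encodes a message and verifies H cᵀ = 0 over GF(2):

```python
import numpy as np

# Parity-check and generator matrices of the (7,4) Hamming code,
# matching the parity-check equations on the Tanner graph slide.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]], dtype=int)

G = np.array([[1, 0, 0, 0, 1, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 0, 1, 1]], dtype=int)

def encode(u):
    """Systematic encoding c = u G (mod 2)."""
    return (np.array(u) @ G) % 2

def is_codeword(c):
    """c is a codeword iff all parity checks H c^T = 0 (mod 2) are satisfied."""
    return bool(np.all((H @ c) % 2 == 0))

u = [1, 0, 1, 1]
c = encode(u)
print(c, is_codeword(c))                   # [1 0 1 1 0 1 0] True
print(is_codeword([1, 0, 0, 0, 0, 0, 0]))  # False
```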

Tanner Graph Terminology. Variable nodes (bit, left); check nodes (constraint, right). The degree of a node is the number of edges connected to it.

Low-Density Parity-Check Codes. Proposed by Gallager (1960). Sparseness of matrix and graph descriptions: the number of 1s in H grows linearly with block length; the number of edges in the Tanner graph grows linearly with block length. Randomness of construction in: placement of 1s in H; connectivity of variable and check nodes. Iterative, message-passing decoder: simple local decoding at nodes; iterative exchange of information (message-passing).

Another pioneering work: Review of Gallager's Paper. Gallager, R. G., Low-Density Parity-Check Codes, M.I.T. Press, Cambridge, Mass.: 1963. A more enlightened review: Horstein, M., IEEE Trans. Inform. Theory, vol. 10, no. 2, p. 172, April 1964: "This book is an extremely lucid and circumspect exposition of an important piece of research. A comparison with other coding and decoding procedures designed for high-reliability transmission... is difficult... Furthermore, many hours of computer simulation are needed to evaluate a probabilistic decoding scheme... It appears, however, that LDPC codes have a sufficient number of desirable features to make them highly competitive with... other schemes..."

Gallager's LDPC Codes. Now called regular LDPC codes. Parameters (n,j,k): n = codeword length; j = number of parity-check equations involving each code bit = degree of each variable node; k = number of code bits involved in each parity-check equation = degree of each check node. Locations of 1s can be chosen randomly, subject to the (j,k) constraints.

Gallager's Construction. (n,j,k) = (20,3,4). The first n/k = 5 rows have k = 4 1s each, descending. The next j − 1 = 2 submatrices of size n/k × n = 5 × 20 are obtained by applying randomly chosen column permutations π₁, π₂ to the first submatrix. Result: a jn/k × n = 15 × 20 parity-check matrix for an (n,j,k) = (20,3,4) LDPC code.
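A sketch of Gallager's construction in Python (an illustration of the procedure described above, not code from the slides): stack the base submatrix with j − 1 randomly column-permuted copies.

```python
import numpy as np

def gallager_ldpc(n, j, k, seed=0):
    """Random (n, j, k)-regular parity-check matrix via Gallager's construction.
    Requires k | n.  Returns a (j*n//k) x n binary matrix."""
    assert n % k == 0, "k must divide n"
    rng = np.random.default_rng(seed)
    rows = n // k
    # Base submatrix: row i has 1s in positions i*k, ..., i*k + k - 1.
    base = np.zeros((rows, n), dtype=int)
    for i in range(rows):
        base[i, i * k:(i + 1) * k] = 1
    # Stack j-1 column-permuted copies of the base submatrix below it.
    blocks = [base]
    for _ in range(j - 1):
        perm = rng.permutation(n)
        blocks.append(base[:, perm])
    return np.vstack(blocks)

H = gallager_ldpc(20, 3, 4)
print(H.shape)         # (15, 20)
print(H.sum(axis=0))   # every column has weight j = 3
print(H.sum(axis=1))   # every row has weight k = 4
```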

Regular LDPC Code Tanner Graph. n = 20 variable nodes of left degree j = 3 (nj = 60 edges); nj/k = 15 check nodes of right degree k = 4 (nj = 60 edges).

Properties of Regular LDPC Codes. Design rate: R(j,k) = 1 − j/k. Linear dependencies can increase the rate; the design rate is achieved with high probability as n increases. Example: (n,j,k) = (20,3,4) with R = 1 − 3/4 = 1/4. For j ≥ 3, the typical minimum distance of codes in the (j,k) ensemble grows linearly in the codeword length n. Their performance under maximum-likelihood decoding on BSC(p) is "at least as good... as the optimum code of a somewhat higher rate." [Gallager, 1960]

Performance of Regular LDPC Codes. [Three slides of simulation results from Gallager, 1963.]

Performance of Regular LDPC Codes. [Comparison of an optimized irregular LDPC code with the (3,6)-regular code; Richardson, Shokrollahi, and Urbanke, 2001; n = 10⁶, R = 1/2.]

Irregular LDPC Codes. Irregular LDPC codes are a natural generalization of Gallager's LDPC codes: the degrees of variable and check nodes need not be constant. An ensemble is defined by node degree distribution functions
Λ(x) = Σ_{i=1}^{d_v} Λ_i x^i, where Λ_i = number of variable nodes of degree i,
P(x) = Σ_{i=2}^{d_c} P_i x^i, where P_i = number of check nodes of degree i.
Normalizing for the fraction of nodes of a specified degree: L(x) = Λ(x)/Λ(1), R(x) = P(x)/P(1).

Irregular LDPC Codes. Often we use the degree distributions from the edge perspective:
λ(x) = Σ_{i=1}^{d_v} λ_i x^{i−1}, where λ_i = fraction of edges connected to variable nodes of degree i,
ρ(x) = Σ_{i=2}^{d_c} ρ_i x^{i−1}, where ρ_i = fraction of edges connected to check nodes of degree i.
Conversion rule: λ(x) = Λ′(x)/Λ′(1) = L′(x)/L′(1), ρ(x) = P′(x)/P′(1) = R′(x)/R′(1).

Irregular LDPC Codes. Design rate:
R(λ, ρ) = 1 − (Σ_i ρ_i/i)/(Σ_i λ_i/i) = 1 − (∫₀¹ ρ(x) dx)/(∫₀¹ λ(x) dx).
Under certain conditions related to codewords of weight n/2, the design rate is achieved with high probability as n increases.

Examples of Degree Distribution Pairs. Hamming (7,4) code: Λ(x) = 3x + 3x² + x³, P(x) = 3x⁴, # edges = 12; λ(x) = 1/4 + (1/2)x + (1/4)x², ρ(x) = x³; R(λ, ρ) = 1 − 3/7 = 4/7. (j,k)-regular LDPC code of length n: Λ(x) = n x^j, P(x) = (jn/k) x^k; λ(x) = x^{j−1}, ρ(x) = x^{k−1}; R(λ, ρ) = 1 − (1/k)/(1/j) = 1 − j/k.
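The node- and edge-perspective quantities above are easy to check numerically. Below is a short Python sketch (illustrative helper names) that converts node-degree counts into the edge-perspective fractions and computes the design rate; it reproduces R = 4/7 for the Hamming example.

```python
import numpy as np

def edge_perspective(node_counts):
    """node_counts[i] = number of nodes of degree i (index 0 unused).
    Returns the fraction of edges attached to degree-i nodes (lambda_i or rho_i)."""
    counts = np.asarray(node_counts, dtype=float)
    degrees = np.arange(len(counts))
    edges = counts * degrees          # edges attached to nodes of each degree
    return edges / edges.sum()

def design_rate(lam, rho):
    """R(lambda, rho) = 1 - (sum_i rho_i/i) / (sum_i lambda_i/i); index 0 unused."""
    iv = np.arange(len(lam))
    jv = np.arange(len(rho))
    num = np.sum(rho[1:] / jv[1:])
    den = np.sum(lam[1:] / iv[1:])
    return 1.0 - num / den

# Hamming (7,4): Lambda(x) = 3x + 3x^2 + x^3, P(x) = 3x^4
lam = edge_perspective([0, 3, 3, 1])      # -> [0, 1/4, 1/2, 1/4]
rho = edge_perspective([0, 0, 0, 0, 3])   # -> [0, 0, 0, 0, 1]
print(lam, rho)
print(design_rate(lam, rho))              # 4/7 = 0.5714...
```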

Encoding LDPC Codes. Convert H into an equivalent upper-triangular form: an upper-triangular (n−k) × (n−k) block in the first n−k columns and an arbitrary (n−k) × k block in the remaining k columns (e.g., by Gaussian elimination and column swapping; complexity ~ O(n³)). This is a pre-processing step only.

Encoding LDPC Codes. Set c_{n−k+1}, …, c_n equal to the data bits x₁, …, x_k. Solve for the parities c_ℓ, ℓ = 1, …, n−k, in reverse order; i.e., starting with ℓ = n−k, compute
c_ℓ = Σ_{j=ℓ+1}^{n−k} H_{ℓ,j} c_j + Σ_{j=1}^{k} H_{ℓ, n−k+j} x_j (mod 2)
(complexity ~ O(n²)). Another general encoding technique, based upon approximate lower triangulation, has complexity no more than O(n²), with the constant coefficient small enough to allow practical encoding for block lengths on the order of n = 10⁵.
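A minimal sketch of this back-substitution encoder, assuming H has already been put in the upper-triangular form described above (the preprocessing itself is not shown). The example uses the (7,4) Hamming H after Gaussian elimination, where no column swaps happen to be needed.

```python
import numpy as np

def encode_upper_triangular(H, x):
    """Encode data bits x (length k) with an (n-k) x n parity-check matrix H
    whose left (n-k) x (n-k) block is upper triangular with unit diagonal.
    Returns the codeword c = (c_1, ..., c_{n-k}, x_1, ..., x_k)."""
    m, n = H.shape          # m = n - k parity checks
    c = np.zeros(n, dtype=int)
    c[m:] = x               # systematic part: c_{n-k+1..n} = data bits
    # Back-substitution: solve rows l = m-1, m-2, ..., 0 (0-based).
    for l in range(m - 1, -1, -1):
        c[l] = np.dot(H[l, l + 1:], c[l + 1:]) % 2
    return c

# (7,4) Hamming parity-check matrix after Gaussian elimination
# (upper-triangular left block).
H_ut = np.array([[1, 1, 1, 0, 1, 0, 0],
                 [0, 1, 0, 1, 1, 1, 0],
                 [0, 0, 1, 1, 1, 0, 1]], dtype=int)
x = np.array([1, 0, 1, 1])
c = encode_upper_triangular(H_ut, x)
print(c, (H_ut @ c) % 2)    # [0 0 0 1 0 1 1] [0 0 0]
```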

Linear Encoding Complexity. It has been shown that optimized ensembles of irregular LDPC codes can be encoded with preprocessing complexity at most O(n^{3/2}) and subsequent complexity ~ O(n). It has been shown that a necessary condition for the ensemble of (λ, ρ)-irregular LDPC codes to be linear-time encodable is λ′(0) ρ′(1) > 1. Alternatively, LDPC code ensembles with additional structure have linear encoding complexity, such as irregular repeat-accumulate (IRA) codes.

Decoding of LDPC Codes. Gallager introduced the idea of iterative, message-passing decoding of LDPC codes. The idea is to iteratively share the results of local node decoding by passing them along edges of the Tanner graph. We will first demonstrate this decoding method for the binary erasure channel BEC(ε). The performance and optimization of LDPC codes for the BEC will tell us a lot about other channels, too.

Decoding for the BEC. Recall: binary erasure channel BEC(ε): x_i is received as y_i = x_i with probability 1 − ε and as y_i = ? with probability ε. x = (x₁, x₂, …, x_n) is the transmitted codeword; y = (y₁, y₂, …, y_n) is the received word. Note: if y_i ∈ {0,1}, then x_i = y_i.

Optimal Block Decoding - BEC. The maximum a posteriori (MAP) block decoding rule minimizes the block error probability: x̂^MAP(y) = argmax_{x ∈ C} P_{X|Y}(x | y). Assume that codewords are transmitted equiprobably; then x̂^MAP(y) = argmax_{x ∈ C} P_{Y|X}(y | x). If the (non-empty) set X(y) of codewords compatible with y contains only one codeword x, then x̂^MAP(y) = x. If X(y) contains more than one codeword, then declare a block erasure.

Optimal Bit Decoding - BEC. The maximum a posteriori (MAP) bit decoding rule minimizes the bit error probability: x̂_i^MAP(y) = argmax_{b ∈ {0,1}} P_{X_i|Y}(b | y) = argmax_{b ∈ {0,1}} Σ_{x ∈ C, x_i = b} P_{X|Y}(x | y). Assume that codewords are transmitted equiprobably. If every codeword x ∈ X(y) satisfies x_i = b, then set x̂_i^MAP(y) = b. Otherwise, declare a bit erasure in position i.

MAP Decoding Complexity. Let E ⊆ {1,…,n} denote the positions of erasures in y, and let F denote its complement in {1,…,n}. Let w_E and w_F denote the corresponding sub-words of a word w, and let H_E and H_F denote the corresponding submatrices of the parity-check matrix H. Then X(y), the set of codewords compatible with y, satisfies X(y) = { x ∈ C : x_F = y_F and H_E x_Eᵀ = H_F y_Fᵀ }. So optimal (MAP) decoding can be done by solving a set of linear equations, requiring complexity at most O(n³). For large block length n, this can be prohibitive!

Simpler Decoding. We now describe an alternative decoding procedure that can be implemented very simply. It is a local decoding technique that tries to fill in erasures one parity-check equation at a time. We will illustrate it using a very simple and familiar linear code, the (7,4) Hamming code, and compare its performance to that of optimal bitwise decoding. Then we'll reformulate it as a message-passing decoding algorithm and apply it to LDPC codes.

Local Decoding of Erasures. d_min = 3, so any two erasures can be uniquely filled to get a codeword. Decoding can be done locally: given any pattern of one or two erasures, there will always be a parity check (circle) involving exactly one erasure. The parity check represented by that circle can be used to fill in the erased bit. This leaves at most one more erasure, and any parity check (circle) involving it can be used to fill it in.

Local Decoding - Example. The all-0s codeword is transmitted, and two erasures occur as shown. Start with either the red parity circle or the green parity circle. The red parity circle requires that the erased symbol inside it be 0.

Local Decoding - Example. Next, the green parity circle or the blue parity circle can be selected. Either one requires that the remaining erased symbol be 0.

Local Decoding - Example. Estimated codeword: [0 0 0 0 0 0 0]. Decoding successful!! This procedure would have worked no matter which codeword was transmitted.

Decoding with the Tanner Graph: an a-Peeling Decoder. Initialization: forward known variable node values along outgoing edges; accumulate forwarded values at check nodes and record the parity; delete known variable nodes and all outgoing edges.

Peeling Decoder - Initialization. [Known variable node values are forwarded along their edges.]

Peeling Decoder - Initialization. [Parities are accumulated at the check nodes; known variable nodes and their edges are deleted.]

Decoding with the Tanner Graph: an a-Peeling Decoder. Decoding step: select, if possible, a check node with one edge remaining; forward its parity, thereby determining the connected variable node; delete the check node and its outgoing edge; follow the procedure from the initialization process at the now-known variable node. Termination: if the remaining graph is empty, the codeword is determined; if the decoding step gets stuck, declare a decoding failure. (A code sketch of this peeling decoder follows the worked example below.)

Peeling Decoder - Step 1. [A degree-1 check node is found; its accumulated parity determines a variable node value; the check node and edge are deleted and the new value is forwarded.]

Peeling Decoder - Step 1. [Parity is accumulated; the newly known variable node and its edges are deleted.]

Peeling Decoder - Step 2. [Another degree-1 check node determines a variable node value; the check node and edge are deleted and the new value is forwarded.]

Peeling Decoder - Step 2. [Parity is accumulated; the newly known variable node and its edges are deleted.]

Peeling Decoder - Step 3. [A final degree-1 check node determines the last variable node value; decoding complete.]
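A compact Python sketch of the peeling (erasure message-passing) decoder just illustrated; the data representation (lists of check-node neighborhoods) and function names are illustrative, not from the slides.

```python
def peel_decode(H_rows, y):
    """Peeling decoder for the BEC.
    H_rows: list of checks, each a list of variable indices it involves.
    y: received word with entries 0, 1, or None (erasure).
    Returns the decoded word; unfilled erasures remain None."""
    c = list(y)
    erased = {i for i, b in enumerate(c) if b is None}
    # Residual graph: for each check, the erased positions it still touches,
    # plus the running parity of its known positions.
    residual = []
    for check in H_rows:
        unknown = {i for i in check if i in erased}
        parity = sum(c[i] for i in check if i not in erased) % 2
        residual.append((unknown, parity))
    progress = True
    while erased and progress:
        progress = False
        for unknown, parity in residual:
            if len(unknown) == 1:                 # degree-1 check: peel it
                i = next(iter(unknown))
                c[i] = parity
                erased.discard(i)
                # Update every check touching variable i.
                for k, (unk, par) in enumerate(residual):
                    if i in unk:
                        unk.discard(i)
                        residual[k] = (unk, (par + c[i]) % 2)
                progress = True
    return c   # any remaining None entries form the largest stopping set in E

# (7,4) Hamming code checks (0-based); erase positions 5 and 7 of the all-zero word.
checks = [[0, 1, 2, 4], [0, 2, 3, 5], [0, 1, 3, 6]]
print(peel_decode(checks, [0, 0, 0, 0, None, 0, None]))  # [0, 0, 0, 0, 0, 0, 0]
```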

Message-Passing Decoding The local decoding procedure can be described in terms of an iterative, message-passing algorithm in which all variable nodes and all check nodes in parallel iteratively pass messages along their adjacent edges. The values of the code bits are updated accordingly. The algorithm continues until all erasures are filled in, or until the completion of a specified number of iterations.

Variable-to-Check Node Message. Variable-to-check message on edge e: if all other incoming messages are ?, send the message v = ?; if any other incoming message u is 0 or 1, send v = u and, if the bit was an erasure, fill it with u, too. (Note that there are no errors on the BEC, so a message that is 0 or 1 must be correct; messages cannot be inconsistent.)

Check-to-Variable Node Message. Check-to-variable message on edge e: if any other incoming message is ?, send u = ?; if all other incoming messages are in {0,1}, send their XOR, u = v₁ + v₂ + v₃.

Message-Passing Example - Initialization. [The received word, with erasures, initializes the variable nodes; variable-to-check messages are sent.]

Message-Passing Example - Round 1. [Check-to-variable messages fill in an erasure; updated variable-to-check messages are sent.]

Message-Passing Example - Round 2. [Check-to-variable messages fill in another erasure; updated variable-to-check messages are sent.]

Message-Passing Example - Round 3. [Check-to-variable messages fill in the last erasure; decoding complete.]

Sub-optimality of the Message-Passing Decoder. Hamming code: decoding of 3 erasures. There are 7 patterns of 3 erasures that correspond to the support of a weight-3 codeword; these cannot be decoded by any decoder! The other 28 patterns of 3 erasures can be uniquely filled in by the optimal decoder. We just saw a pattern of 3 erasures that was corrected by the local decoder. Are there any that it cannot correct? Test: erase positions 1, 2, 3.

Sub-optimality of the Message-Passing Decoder. Test: erasures in positions 1, 2, 3. There is a unique way to fill the erasures and get a codeword, and the optimal decoder would find it. But every parity check has at least 2 erasures, so local decoding will not work!

Stopping Sets. A stopping set is a subset S of the variable nodes such that every check node connected to S is connected to S at least twice. The empty set is a stopping set (trivially). The support set (i.e., the positions of 1s) of any codeword is a stopping set (parity condition). A stopping set need not be the support of a codeword.

Stopping Sets. Example 1: (7,4) Hamming code. Codeword support set S = {4,6,7}.

Stopping Sets. Example 2: (7,4) Hamming code. [Tanner graph with a highlighted subset of variable nodes.]

Stopping Sets. Example 2: (7,4) Hamming code. S = {1,2,3} is a stopping set that is not the support set of a codeword.

Stopping Set Properties. Every set of variable nodes contains a largest stopping set (since the union of stopping sets is also a stopping set). The message-passing decoder needs a check node with at most one edge connected to an erasure in order to proceed. So, if the remaining erasures form a stopping set, the decoder must stop. Let E be the initial set of erasures. When the message-passing decoder stops, the remaining set of erasures is the largest stopping set S contained in E. If S is empty, the codeword has been recovered; if not, the decoder has failed.

Suboptimality of the Message-Passing Decoder. An optimal (MAP) decoder for a code C on the BEC fails if and only if the set of erased variables includes the support set of a codeword. The message-passing decoder fails if and only if the set of erased variables includes a non-empty stopping set. Conclusion: message-passing may fail where optimal decoding succeeds!! Message-passing is suboptimal!!

Comments on Message-Passing Decoding. Bad news: message-passing decoding on a Tanner graph is not always optimal... Good news: for any code C, there is a parity-check matrix on whose Tanner graph message-passing is optimal, e.g., the matrix whose rows are the codewords of the dual code. Bad news: that Tanner graph may be very dense, so even message-passing decoding is too complex.

Another (7,4) Code. [Parity-check matrix H with a cycle-free Tanner graph.] R = 4/7, d_min = 2. All stopping sets contain codeword supports, so the message-passing decoder on this graph is optimal! (A cycle-free Tanner graph implies this.)

Comments on Message-Passing Decoding. Good news: if a Tanner graph is cycle-free, the message-passing decoder is optimal! Bad news: binary linear codes with cycle-free Tanner graphs are necessarily weak... Good news: the Tanner graph of a long LDPC code behaves almost like a cycle-free graph!

Analysis of LDPC Codes on the BEC. In the spirit of Shannon, we can analyze the performance of message-passing decoding on ensembles of LDPC codes with specified degree distributions (λ,ρ). The results of the analysis allow us to design LDPC codes that transmit reliably with MP decoding at rates approaching the Shannon capacity of the BEC. In fact, sequences of LDPC codes have been designed that actually achieve the Shannon capacity. The analysis can assume that the all-0s codeword is sent.

Key Results - 1. Concentration: with high probability, the performance of ℓ rounds of MP decoding on a randomly selected (n, λ, ρ) code converges to the ensemble average performance as the length n → ∞. Convergence to cycle-free performance: the average performance of ℓ rounds of MP decoding on the (n, λ, ρ) ensemble converges to the performance on a graph with no cycles of length ≤ 2ℓ as the length n → ∞.

Key Results - 2. Computing the cycle-free performance: the cycle-free performance can be computed by a tractable algorithm, density evolution. Threshold calculation: there is a threshold probability p*(λ,ρ) such that, for channel erasure probability ε < p*(λ,ρ), the cycle-free error probability approaches 0 as the number of iterations ℓ → ∞.

Asymptotic Performance Analysis. We assume a cycle-free (λ,ρ) Tanner graph. Let p₀ = ε, the channel erasure probability. We find a recursion formula for p_ℓ, the probability that a randomly chosen edge carries a variable-to-check erasure message in round ℓ. We then find the largest ε such that p_ℓ converges to 0 as ℓ → ∞; this value is called the threshold. This procedure is called density evolution analysis.

Density Evolution - 1. Consider a check node of degree d with independent incoming messages v₁, …, v_{d−1}:
Pr(u = ?) = Pr(v_i = ? for some i = 1, …, d−1) = 1 − Pr(v_i ≠ ? for all i = 1, …, d−1) = 1 − (1 − p_{ℓ−1})^{d−1}.
The probability that edge e connects to a check node of degree d is ρ_d, so
Pr(u = ?) = Σ_{d=2}^{d_c} ρ_d [1 − (1 − p_{ℓ−1})^{d−1}] = 1 − Σ_d ρ_d (1 − p_{ℓ−1})^{d−1} = 1 − ρ(1 − p_{ℓ−1}).

Density Evolution - 2. Consider a variable node of degree d with independent incoming messages: the channel message u₀ and check messages u₁, …, u_{d−1}:
Pr(v = ?) = Pr(u₀ = ?) · Pr(u_i = ? for all i = 1, …, d−1) = p₀ [1 − ρ(1 − p_{ℓ−1})]^{d−1}.
The probability that edge e connects to a variable node of degree d is λ_d, so
Pr(v = ?) = Σ_d λ_d p₀ [1 − ρ(1 − p_{ℓ−1})]^{d−1} = p₀ λ(1 − ρ(1 − p_{ℓ−1})).
Thus p_ℓ = p₀ λ(1 − ρ(1 − p_{ℓ−1})).
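The recursion p_ℓ = p₀ λ(1 − ρ(1 − p_{ℓ−1})) is straightforward to iterate numerically. Here is a minimal Python sketch (illustrative function names) for the (3,4)-regular ensemble, where λ(x) = x² and ρ(x) = x³.

```python
def density_evolution_bec(eps, lam, rho, iterations=200):
    """Iterate p_l = eps * lam(1 - rho(1 - p_{l-1})) starting from p_0 = eps.
    lam, rho: edge-perspective degree distribution polynomials as callables."""
    p = eps
    for _ in range(iterations):
        p = eps * lam(1.0 - rho(1.0 - p))
    return p

lam = lambda x: x ** 2    # (3,4)-regular: lambda(x) = x^2
rho = lambda x: x ** 3    #                rho(x)   = x^3

for eps in (0.5, 0.6, 0.7):
    print(eps, density_evolution_bec(eps, lam, rho))
# Below the threshold (~0.6474) the erasure probability decays to 0;
# above it, the iteration stalls at a positive fixed point.
```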

Threshold Property. p_ℓ = p₀ λ(1 − ρ(1 − p_{ℓ−1})). There is a threshold probability p*(λ, ρ) such that if p₀ = ε < p*(λ, ρ), then lim_{ℓ→∞} p_ℓ = 0.

Threshold Interpretation. Operationally, this means that using a code drawn from the ensemble of length-n LDPC codes with degree distribution pair (λ, ρ), we can transmit as reliably as desired over the BEC(ε) channel if ε < p*(λ, ρ), for sufficiently large block length n.

Computing the Threshold. Define f(p, x) = p λ(1 − ρ(1 − x)). The threshold p*(λ, ρ) is the largest probability p such that f(p, x) − x < 0 on the interval x ∈ (0, 1]. This leads to a graphical interpretation of the threshold p*(λ, ρ).

Graphical Determination of the Threshold. Example: (j,k) = (3,4). [Plot of f(x, p) − x = p[1 − (1 − x)³]² − x for p = 0.5, 0.6, 0.6474, 0.7; the curve just touches zero at p* ≈ 0.6474.]
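The same condition can be checked numerically: bisect on p for the largest value such that f(p, x) < x on a fine grid of x ∈ (0, 1]. A short sketch (grid-based, so only approximate):

```python
import numpy as np

def bec_threshold(lam, rho, grid=10000, tol=1e-7):
    """Largest p such that p*lam(1 - rho(1 - x)) < x for all x in (0, 1]."""
    x = np.linspace(1e-9, 1.0, grid)
    def converges(p):
        return np.all(p * lam(1.0 - rho(1.0 - x)) < x)
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if converges(mid):
            lo = mid
        else:
            hi = mid
    return lo

lam = lambda x: x ** 2    # (3,4)-regular: lambda(x) = x^2
rho = lambda x: x ** 3    #                rho(x)   = x^3
print(bec_threshold(lam, rho))   # ~0.6474, matching the slide
```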

(j,k)-Regular LDPC Code Thresholds. There is a closed-form expression for the thresholds of (j,k)-regular LDPC codes; for example, p*(3,4) ≈ 0.647426. Examples:
(j,k)   R      p_Sh          p*(j,k)
(3,4)   1/4    3/4 = 0.75    0.6474
(3,5)   2/5    3/5 = 0.6     0.5176
(3,6)   1/2    1/2 = 0.5     0.4294
(4,6)   1/3    2/3 ≈ 0.67    0.506
(4,8)   1/2    1/2 = 0.5     0.3834

Degree Distribution Optimization. Two approaches: fix the design rate R(λ,ρ) and find degree distributions λ(x), ρ(x) to maximize the threshold p*(λ,ρ); or fix the threshold p* and find degree distributions λ(x), ρ(x) to maximize the rate R(λ,ρ). For the latter, we can start with a specific ρ(x) and optimize λ(x); then, for the optimal λ(x), find the optimal check distribution; and ping-pong back and forth until satisfied with the results.

Variable Degree Distribution Optimization. Fix a check degree distribution ρ(x) and a threshold ε. Fix the maximum variable degree l_max. Define
g(x, λ₂, …, λ_{l_max}) = ε λ(1 − ρ(1 − x)) − x = ε Σ_{i=2}^{l_max} λ_i (1 − ρ(1 − x))^{i−1} − x.
Use linear programming to find
max { Σ_{i=2}^{l_max} λ_i / i : λ_i ≥ 0, Σ_{i=2}^{l_max} λ_i = 1, g(x, λ₂, …, λ_{l_max}) ≤ 0 for x ∈ [0,1] }.
Since the rate R(λ,ρ) is an increasing function of Σ_i λ_i/i, this maximizes the design rate.
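A sketch of this linear program using scipy.optimize.linprog on a discretized grid of x values. The grid size, ε = 0.48, and ρ(x) = x⁵ are illustrative choices echoing the example on the next slide, not values prescribed here.

```python
import numpy as np
from scipy.optimize import linprog

def optimize_lambda(eps, rho, l_max=8, grid=200):
    """Maximize sum_i lambda_i/i subject to eps*lambda(1-rho(1-x)) <= x on a grid,
    lambda_i >= 0, sum lambda_i = 1.  Variables are lambda_2, ..., lambda_{l_max}."""
    degrees = np.arange(2, l_max + 1)
    x = np.linspace(1e-4, 1.0, grid)
    y = 1.0 - rho(1.0 - x)                        # argument of lambda(.)
    # Inequality constraints: eps * sum_i lambda_i * y^(i-1) <= x  (one row per x).
    A_ub = eps * np.power.outer(y, degrees - 1)   # shape (grid, #degrees)
    b_ub = x
    # Equality constraint: sum_i lambda_i = 1.
    A_eq = np.ones((1, len(degrees)))
    b_eq = np.array([1.0])
    # Objective: maximize sum lambda_i/i  ->  minimize -sum lambda_i/i.
    c = -1.0 / degrees
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, 1)] * len(degrees))
    return dict(zip(degrees, res.x)), -res.fun    # lambda_i and sum lambda_i/i

rho = lambda t: t ** 5                            # check degree 6: rho(x) = x^5
lam_coeffs, lam_integral = optimize_lambda(0.48, rho)
print(lam_coeffs)
# Design rate = 1 - (1/6) / lam_integral, since the integral of rho(x) = x^5 is 1/6.
print(1.0 - (1.0 / 6.0) / lam_integral)
```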

Practical Optimization. In practice, good performance is found for a check degree distribution concentrated on two consecutive degrees, of the form ρ(x) = a x^{r−1} + (1 − a) x^{r−2}. Example 1: l_max = 8, r = 6, design rate ≈ 1/2:
λ(x) = 0.409x + 0.202x² + 0.0768x³ + 0.1971x⁶ + 0.1151x⁷, ρ(x) = x⁵.
Rate: R(λ,ρ) ≈ 0.5004. Threshold: p*(λ,ρ) ≈ 0.4810.

Bound on the Threshold. Taylor series analysis yields the general upper bound p*(λ, ρ) ≤ 1/(λ′(0) ρ′(1)). For the previous example, with p*(λ,ρ) ≈ 0.4810, the upper bound gives 1/(λ′(0) ρ′(1)) = 1/((0.409)(5)) ≈ 0.489.

EXIT Chart Analysis. Extrinsic information transfer (EXIT) charts provide a nice graphical depiction of density evolution and MP decoding [ten Brink, 1999]. Rewrite the density evolution recursion as f(x, p) = p λ(1 − ρ(1 − x)) = v_p(c(x)), where v_p(x) = p λ(x) and c(x) = 1 − ρ(1 − x).

EXIT Chart Analysis. Recall that the MP convergence condition was f(x, p) < x for all x ∈ (0,1). Since λ(x) is invertible, the condition becomes c(x) < v_p^{−1}(x) for all x ∈ (0,1). Graphically, this says that the curve for c(x) must lie below the curve for v_p^{−1}(x) for all p < p*.

EXIT Chart Example. Example: (3,4)-regular LDPC code, p* ≈ 0.6474. λ(x) = x², ρ(x) = x³; c(x) = 1 − ρ(1 − x) = 1 − (1 − x)³; v_p(x) = p λ(x) = p x²; v_p^{−1}(x) = (x/p)^{1/2}.

EXIT Chart Example. Example: (3,4)-regular LDPC code, p* ≈ 0.6474. [Plot of v_p^{−1}(x) = (x/p)^{1/2} for various values of the initial erasure probability p = 0.5, 0.6, 0.6474, 0.7, 0.8, together with c(x) = 1 − (1 − x)³.]

EXIT Charts and Density Evolution. EXIT charts can be used to visualize density evolution. Assume an initial fraction of erasure messages p₀ = p. The fractions of erasures emitted successively by the check nodes, q_i, and by the variable nodes, p_i, are obtained by successively applying c(x) and v_p(x):
q₁ = c(p₀), p₁ = v_p(q₁) = v_p(c(p₀)) [note: p₁ = f(p₀, p)];
q₂ = c(p₁), p₂ = v_p(q₂) = v_p(c(p₁)) [note: p₂ = f(p₁, p)]; and so on.

EXIT Charts and Density Evolution. Graphically, this computation describes a staircase function. If p < p*, there is a tunnel between v_p^{−1}(x) and c(x) through which the staircase descends to ground level, i.e., no erasures. If p > p*, the tunnel closes, stopping the staircase descent at a positive fraction of erasures.
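The staircase is just the alternating application of c(·) and v_p(·). A few lines of Python reproduce the trajectory plotted on the following slides for the (3,4)-regular code with p = 0.6:

```python
c   = lambda x: 1.0 - (1.0 - x) ** 3   # check-node transfer, (3,4)-regular
v_p = lambda x, p: p * x ** 2          # variable-node transfer

p = 0.6
x = p                                  # p_0 = p
for step in range(1, 6):
    q = c(x)                           # q_l = c(p_{l-1})
    x = v_p(q, p)                      # p_l = v_p(q_l)
    print(f"q{step} = {q:.4f}, p{step} = {x:.4f}")
# q1 = 0.9360, p1 = 0.5257
# q2 = 0.8933, p2 = 0.4788
# q3 = 0.8584, p3 = 0.4421  ... and the trajectory continues down toward 0.
```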

Density Evolution Visualization. Example: (3,4)-regular LDPC code, p = 0.6. [Staircase plot between v_{0.6}^{−1}(x) = (x/0.6)^{1/2} and c(x) = 1 − (1 − x)³, with the fraction of erasures from variable nodes, p, on the horizontal axis and the fraction of erasures from check nodes, q, on the vertical axis. Successive slides trace the trajectory q₁ ≈ 0.936, p₁ ≈ 0.5257, q₂ ≈ 0.8933, p₂ ≈ 0.4788, q₃ ≈ 0.8584, p₃ ≈ 0.4421, after which p_ℓ continues through the tunnel to 0.]

Matching Condition For capacity-achieving sequences of LDPC codes for the BEC, the EXIT chart curves must match. This is called the matching condition. Such sequences have been developed: Tornado codes Right-regular LDPC codes Accumulate-Repeat-Accumulate codes

Decoding for Other Channels. We now consider analysis and design of LDPC codes for the BSC(p) and BiAWGN(σ) channels; we call p and σ the channel parameters for these two channels, respectively. Many concepts, results, and design methods have natural (but non-trivial) extensions to these channels. The messages are probability mass functions or log-likelihood ratios. The message-passing paradigm at variable and check nodes will be applied. The decoding method is called belief propagation, or BP for short.

Belief Propagation. Consider transmission of binary inputs X ∈ {±1} over a memoryless channel using a linear code C. Assume codewords are transmitted equiprobably. Then
x̂_i^MAP(y) = argmax_{x_i ∈ {±1}} P_{X_i|Y}(x_i | y)
 = argmax_{x_i ∈ {±1}} Σ_{~x_i} P_{X|Y}(x | y)
 = argmax_{x_i ∈ {±1}} Σ_{~x_i} P_{Y|X}(y | x) P_X(x)
 = argmax_{x_i ∈ {±1}} Σ_{~x_i} ( Π_{j=1}^{n} P_{Y_j|X_j}(y_j | x_j) ) f_C(x),
where f_C(x) is the indicator function for C and Σ_{~x_i} denotes the sum over all components of x except x_i.

Belief Propagation. For codes with cycle-free Tanner graphs, there is a message-passing approach to bit-wise MAP decoding. The messages are essentially conditional bit distributions, denoted u = [u(1), u(−1)]. The initial messages presented by the channel to the variable nodes are of the form
u_{ch,i} = [u_{ch,i}(1), u_{ch,i}(−1)] = [p_{Y_i|X_i}(y_i | 1), p_{Y_i|X_i}(y_i | −1)].
The variable-to-check and check-to-variable message updates are determined by the sum-product update rule. The BEC decoder can be formulated as a BP decoder.

Sum-Product Update Rule. Variable-to-check:
v(b) = u_ch(b) Π_{k=1}^{d−1} u_k(b), for b ∈ {±1}.
Check-to-variable:
u(b) = Σ_{x₁, …, x_{d−1}} f(b, x₁, …, x_{d−1}) Π_{k=1}^{d−1} v_k(x_k),
where f is the parity-check indicator function.

Variable Node Update - Heuristic. v(b) = u_ch(b) Π_{k=1}^{d−1} u_k(b), for b ∈ {±1}. Suppose the incoming messages u₁, …, u_{d−1} from check nodes 1, …, d−1 and the message u_ch from the channel are independent estimates of [P(x = 1), P(x = −1)]. Then a reasonable estimate to send to check node 0, based upon the other estimates, would be the product of those estimates (suitably normalized):
P̂(x = b) = P_ch(x = b) Π_{k=1}^{d−1} P_k(x = b).
We do not use the intrinsic information u₀ provided by check node 0; the estimate v represents extrinsic information.

Check-Node Update - Heuristic. u(b) = Σ_{x₁,…,x_{d−1}} f(b, x₁, …, x_{d−1}) Π_{k=1}^{d−1} v_k(x_k). Parity-check node equation: r ⊕ s ⊕ t = 0; over {−1, +1}, this translates to r · s · t = 1.
P(r = 1) = P(s = 1, t = 1) + P(s = −1, t = −1) = P(s = 1)P(t = 1) + P(s = −1)P(t = −1) [by the independence assumption].
Similarly, P(r = −1) = P(s = 1, t = −1) + P(s = −1, t = 1) = P(s = 1)P(t = −1) + P(s = −1)P(t = 1).

Log-Likelihood Formulation. The sum-product update is simplified using log-likelihood ratios. For a message u, define L(u) = log[u(1)/u(−1)]. Note that u(1) = e^{L(u)}/(1 + e^{L(u)}) and u(−1) = 1/(1 + e^{L(u)}).

Log-Likelihood Formulation - Variable Node. The variable-to-check update rule then takes the form
L(v) = L(u_ch) + Σ_{k=1}^{d−1} L(u_k).

Log-Likelihood Formulation - Check Node. The check-to-variable update rule then takes the form
L(u) = 2 tanh⁻¹( Π_{k=1}^{d−1} tanh(L(v_k)/2) ).

Log-Likelihood Formulation - Check Node. To see this, consider the special case of a degree-3 check node. It is easy to verify that P_r − Q_r = (P_s − Q_s)(P_t − Q_t), where P_a = P(a = 1) and Q_a = P(a = −1) for node a. This can be generalized to a check node of any degree by a simple inductive argument.

Log-Likelihood Formulation - Check Node. Translating to log-likelihood ratios, this becomes
(e^{L(u)} − 1)/(e^{L(u)} + 1) = [(e^{L(v₁)} − 1)/(e^{L(v₁)} + 1)] · [(e^{L(v₂)} − 1)/(e^{L(v₂)} + 1)].
Noting that (e^{L(a)} − 1)/(e^{L(a)} + 1) = (e^{L(a)/2} − e^{−L(a)/2})/(e^{L(a)/2} + e^{−L(a)/2}) = tanh(L(a)/2),
we conclude that tanh(L(u)/2) = tanh(L(v₁)/2) · tanh(L(v₂)/2).
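These two LLR update rules are the core of a sum-product decoder. A minimal, self-contained sketch (generic helper names, not from the slides):

```python
import numpy as np

def variable_update(L_ch, incoming):
    """Variable-to-check LLR: channel LLR plus all other incoming check LLRs.
    incoming: array of LLRs from the d-1 other check nodes."""
    return L_ch + np.sum(incoming)

def check_update(incoming):
    """Check-to-variable LLR: 2 * atanh( prod tanh(L_k / 2) ) over the d-1 other edges."""
    t = np.tanh(np.asarray(incoming) / 2.0)
    # Clip to avoid atanh(+/-1) overflow from saturated messages.
    prod = np.clip(np.prod(t), -0.999999999999, 0.999999999999)
    return 2.0 * np.arctanh(prod)

# Example: a degree-3 check node receiving two fairly reliable "+1" messages.
print(check_update([2.0, 3.0]))            # ~1.69, weaker than either input
print(variable_update(1.5, [2.0, -0.5]))   # 3.0
```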

Key Results - 1. Concentration: with high probability, the performance of ℓ rounds of BP decoding on a randomly selected (n, λ, ρ) code converges to the ensemble average performance as the length n → ∞. Convergence to cycle-free performance: the average performance of ℓ rounds of MP decoding on the (n, λ, ρ) ensemble converges to the performance on a graph with no cycles of length ≤ 2ℓ as the length n → ∞.

Key Results - 2. Computing the cycle-free performance: the cycle-free performance can be computed by a somewhat more complex, but still tractable, algorithm: density evolution. Threshold calculation: there is a threshold channel parameter p*(λ,ρ) such that, for any better channel parameter p, the cycle-free error probability approaches 0 as the number of iterations ℓ → ∞.

Density Evolution (AWGN). Assume the all-1s sequence is transmitted. The density evolution algorithm computes the probability distribution, or density, of the LLR messages after each round of BP decoding. Let P₀ denote the initial LLR message density; it depends on the channel parameter σ. Let P_ℓ denote the density after ℓ iterations. The density evolution equation for a (λ,ρ) degree distribution pair is
P_ℓ = P₀ ⊛ λ(Γ⁻¹(ρ(Γ(P_{ℓ−1})))).

Density Evolution. P_ℓ = P₀ ⊛ λ(Γ⁻¹(ρ(Γ(P_{ℓ−1})))). Here ⊛ denotes convolution of densities, and Γ is interpreted as an invertible operator on probability densities. We interpret λ(P) and ρ(P) as operations on densities:
λ(P) = Σ_{i≥2} λ_i P^{⊛(i−1)} and ρ(P) = Σ_{i≥2} ρ_i P^{⊛(i−1)}.
The fraction of incorrect (i.e., negative) messages after ℓ iterations is ∫_{−∞}^{0} P_ℓ(z) dz.
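Exact density evolution tracks full densities, but the recursion above can also be approximated by sampling (sometimes called Monte Carlo density evolution). The sketch below does this for a (j,k)-regular ensemble on the BiAWGN channel; it is an approximation, with sample size and iteration count chosen arbitrarily.

```python
import numpy as np

def mc_density_evolution(sigma, j, k, n_samples=100_000, iterations=100, seed=0):
    """Sampled density evolution for a (j,k)-regular LDPC ensemble on BiAWGN(sigma).
    Returns the fraction of negative (incorrect) messages after `iterations` rounds,
    assuming the all-(+1) sequence is transmitted."""
    rng = np.random.default_rng(seed)
    # Channel LLRs for y = +1 + noise:  L_ch = 2*y / sigma^2.
    L_ch = 2.0 * (1.0 + sigma * rng.standard_normal(n_samples)) / sigma ** 2
    v = L_ch.copy()                       # round-0 variable-to-check messages
    for _ in range(iterations):
        # Check update: combine k-1 independently sampled variable messages.
        t = np.tanh(rng.choice(v, size=(n_samples, k - 1)) / 2.0)
        u = 2.0 * np.arctanh(np.clip(t.prod(axis=1), -1 + 1e-12, 1 - 1e-12))
        # Variable update: channel LLR plus j-1 independently sampled check messages.
        v = L_ch + rng.choice(u, size=(n_samples, j - 1)).sum(axis=1)
    return np.mean(v < 0)

# (3,6)-regular ensemble: the error fraction should decay toward zero for sigma
# comfortably below the threshold (~0.88) and stay bounded away from zero above it.
for sigma in (0.80, 0.90):
    print(sigma, mc_density_evolution(sigma, 3, 6))
```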

Threshold. The threshold σ* is the maximum σ such that lim_{ℓ→∞} ∫_{−∞}^{0} P_ℓ(z) dz = 0. Operationally, this represents the minimum SNR such that a code drawn from the (λ,ρ) ensemble will ensure reliable transmission as the block length approaches infinity.

Degree Distribution Optimization. For a given rate, the objective is to optimize λ(x) and ρ(x) for the best threshold p*. The maximum left and right degrees are fixed. For some channels the optimization procedure is not trivial, but there are some techniques that can be applied in practice.

Thresholds - (j,k)-Regular. BSC(p):
(j,k)   R       p*(j,k)   p_Sh
(3,4)   0.25    0.167     0.215
(4,6)   0.333   0.116     0.174
(3,5)   0.4     0.113     0.146
(3,6)   0.5     0.084     0.11
(4,8)   0.5     0.076     0.11
BiAWGN(σ):
(j,k)   R       σ*        σ_Sh
(3,4)   0.25    1.26      1.549
(4,6)   0.333   —         1.295
(3,5)   0.4     —         1.148
(3,6)   0.5     0.88      0.979
(4,8)   0.5     0.83      0.979

Thresholds and Optimized Irregular Codes. BiAWGN, rate R = 1/2, σ_Sh ≈ 0.979. [Table of maximum variable degree λ_max versus threshold σ*: as the maximum degree increases, the threshold increases toward the Shannon limit, with σ* ≈ 0.9622, 0.9646, 0.969, 0.978 for the listed degrees.]

Irregular Codes vs. Turbo Codes. [BER curves on the AWGN channel, R = 1/2, n = 10³, 10⁴, 10⁵, 10⁶; Richardson, Shokrollahi, and Urbanke, 2001.]

Density Evolution. Density evolution must track the probability distributions/densities of the log-likelihood ratio messages. A discretized version of the sum-product algorithm, and the associated discretized density evolution, speeds code design considerably. This design method has produced rate-1/2 LDPC ensembles with thresholds within 0.0045 dB of the Shannon limit on the AWGN channel! A rate-1/2 code with block length 10⁷ provided a BER of 10⁻⁶ within 0.04 dB of the Shannon limit!

Some Really Good LDPC Codes. [Chung et al., 2001: 0.0045 dB from the Shannon limit!]

Good Code Performance. [BER curves from Chung et al., 2001.]

Applications of LDPC Codes The performance benefits that LDPC codes offer on the BEC, BSC, and AWGN channels have been shown empirically (and sometimes analytically) to extend to many other channels, including Fading channels Channels with memory Coded modulation for bandwidth-limited channels MIMO Systems

Rayleigh Fading Channels. [Performance results from Hou et al., 2001: R = 1/2, (3,6).]

Rayleigh Fading Channels. [Further performance plots.]

Rayleigh Fading Channels. [Further performance plots.]

Partial-Response Channels. [Kurkoski et al., 2002.]

Dicode (1−D) Channel Results. [Rate 7/8, regular j = 3, n = 495.]

EPR4 (1+D−D²−D³) Channel Results. [Rate 7/8, regular j = 3, n = 495.]

Optimized Codes for Partial Response. [Varnica and Kavcic, 2003.]

Optimized Codes for Partial Response. [Further results.]

Optimized Codes for Partial Response. [R = 0.7, n = 10⁶.]

Some Basic References. R. G. Gallager, Low-Density Parity-Check Codes. Cambridge, MA: MIT Press, 1963 (Sc.D. thesis, MIT, 1960). T. Richardson and R. Urbanke, Modern Coding Theory. Cambridge University Press (preliminary version, May 2007). Special Issue on Codes on Graphs and Iterative Algorithms, IEEE Transactions on Information Theory, February 2001. S.-Y. Chung, et al., "On the design of low-density parity-check codes within 0.0045 dB of the Shannon limit," IEEE Commun. Letters, vol. 5, no. 2, pp. 58-60, February 2001.

Additional References. J. Hou, P. H. Siegel, and L. B. Milstein, "Performance analysis and code optimization of low density parity-check codes on Rayleigh fading channels," IEEE J. Select. Areas Commun., Issue on The Turbo Principle: From Theory to Practice I, vol. 19, no. 5, pp. 924-934, May 2001. B. M. Kurkoski, P. H. Siegel, and J. K. Wolf, "Joint message passing decoding of LDPC coded partial-response channels," IEEE Trans. Inform. Theory, vol. 48, no. 6, pp. 1410-1422, June 2002. (See also the correction, IEEE Trans. Inform. Theory, vol. 49, no. 8, p. 2076, August 2003.) N. Varnica and A. Kavcic, "Optimized LDPC codes for partial response channels," IEEE Communications Letters, vol. 7, pp. 168-170, April 2003.

Concluding Remarks LDPC codes are very powerful codes with enormous practical potential, founded upon deep and rich theory. There continue to be important advances in all of the key aspects of LDPC code design, analysis, and implementation. LDPC codes are now finding their way into many applications: Satellite broadcast Cellular wireless Data storage And many more