Belief-Propagation Decoding of LDPC Codes

Amir Bennatan, Princeton University

LDPC Codes: Motivation

A revolution in coding theory: reliable transmission at rates approaching capacity, with low-complexity decoding (belief propagation).
BIAWGN, rate = 0.5: threshold within 0.0045 dB of the Shannon limit.
BIAWGN, rate = 0.88: threshold within 0.88 dB of the Shannon limit.
BSC, rate = 0.5: threshold 0.5 of the maximum crossover.
BEC, any rate: achieves capacity!

History

1963: Invented by Gallager.
1988: Belief propagation, by Pearl.
1993: Turbo codes (Berrou, Glavieux, Thitimajshima).
1996: Rediscovered (MacKay, Neal; Sipser, Spielman).

Discrete-time, memoryless channel

Discrete time instances n = 1, 2, ..., N. Output y_n depends only on x_n.

Example: Binary Symmetric Channel (BSC)

Decoding, no-code case. Assume y_n = 1. Which x_n was transmitted? (Assume p = 1/4.)

Pr[X_n = 0 | Y_n = 1] = ?
Pr[X_n = 1 | Y_n = 1] = ?

Assumption: equal a priori probabilities, Pr[X_n = 0] = Pr[X_n = 1] = 1/2. Then

Pr[X_n = 1 | Y_n = 1] = Pr[Y_n = 1, X_n = 1] / Pr[Y_n = 1]
                      = Pr[Y_n = 1 | X_n = 1] Pr[X_n = 1] / Pr[Y_n = 1]
                      = Pr[Y_n = 1 | X_n = 1] (1/2) / Pr[Y_n = 1]

With p = 1/4:

Pr[X_n = 1 | Y_n = 1] = 0.75
Pr[X_n = 0 | Y_n = 1] = 0.25

Decoder decides: x̂_n = 1.
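The slide's arithmetic can be reproduced in a few lines of Python (a sketch under the slide's assumption of equiprobable inputs; the function name `bsc_posterior` is ours, not from the lecture):

```python
# Posterior Pr[X = x | Y = y] for a binary symmetric channel (BSC)
# with crossover probability p and equal a priori input probabilities.

def bsc_posterior(x, y, p):
    """Pr[X = x | Y = y], assuming Pr[X = 0] = Pr[X = 1] = 1/2."""
    likelihood = (1 - p) if x == y else p   # Pr[Y = y | X = x]
    evidence = 0.5 * (1 - p) + 0.5 * p      # Pr[Y = y] = 1/2 by symmetry
    return likelihood * 0.5 / evidence

print(bsc_posterior(1, 1, 0.25))  # 0.75, so the decoder decides x_hat = 1
print(bsc_posterior(0, 1, 0.25))  # 0.25
```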

Decoding, no-code case: maximum likelihood rule,

x̂_n = argmax_{d=0,1} Pr[X_n = d | Y_n = y_n]

Decoding, code case. Example: C = … , y = … (the codebook and received word are shown on the slide). Let's focus on x_2: which x_2 was transmitted?

Bit by bit, as before:
Pr[X_2 = 1 | Y_2 = 1] = 0.75
Pr[X_2 = 0 | Y_2 = 1] = 0.25
Decoder decides: x̂_2 = 1. Bad!!!!!

Old decoding rule,
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}
Better decoding rule,
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, ..., Y_N = y_N, [X_1, ..., X_N] is a codeword}

Decoding, code case. With the new decoding rule,

Pr[X_2 = 1 | Y = y, X is a codeword] = 0.0357   (was 0.75)
Pr[X_2 = 0 | Y = y, X is a codeword] = 0.9643   (was 0.25)

Decoder decides: x̂_2 = 0 (instead of x̂_2 = 1).

Word error vs. bit error.
Possibility 1: minimize the probability of word error, Pr[error] = Pr[x̂ ≠ x].
Possibility 2: at each bit n, minimize the probability of bit error, Pr[error in bit n] = Pr[x̂_n ≠ x_n].
Our focus: bit error.

Decoding, code case. Status:
Old decoding rule, x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}: bad performance, excellent complexity.
New decoding rule, x̂_n = argmax_{d=0,1} Pr{X_n = d | Y = y, X is a codeword}: excellent performance, terrible complexity, Θ(2^{RN}) (R > 0 is the rate of the code).
Any compromise?
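The codebook and received word of the slide's example are not in the transcription, but the "better" rule itself is easy to state in code. Below is a brute-force sketch of bitwise MAP decoding over an explicit codebook on a BSC; the toy 3-bit repetition code in the example is our stand-in, not the lecture's code. Its cost grows with the codebook size, which is exactly the complexity problem the following slides address.

```python
def bitwise_map(codebook, y, p):
    """For each bit n, decide the d maximizing
    Pr[X_n = d | Y = y, X is a codeword] over a BSC with crossover p.
    Brute force over the whole codebook -- exponential in the code dimension."""
    def likelihood(x):
        flips = sum(a != b for a, b in zip(x, y))          # Pr[Y = y | X = x]
        return (p ** flips) * ((1 - p) ** (len(y) - flips))

    weights = {x: likelihood(x) for x in codebook}         # uniform prior on codewords
    total = sum(weights.values())
    x_hat = []
    for n in range(len(y)):
        p1 = sum(w for x, w in weights.items() if x[n] == 1) / total
        x_hat.append(1 if p1 > 0.5 else 0)
    return x_hat

# Toy stand-in code: 3-bit repetition. A single flipped bit is corrected.
print(bitwise_map([(0, 0, 0), (1, 1, 1)], (1, 0, 0), 0.25))  # [0, 0, 0]
```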

Linear binary block codes

Parity check matrix: a binary M × N matrix H, with rows h_1, h_2, ..., h_M.

x is a codeword if and only if H·x = 0, i.e. h_m·x = 0 for m = 1, ..., M.
Each equation h_m·x = 0 is called a parity check.

Hence

x̂_n = argmax_{d=0,1} Pr{X_n = d | Y = y, X is a codeword}
     = argmax_{d=0,1} Pr{X_n = d | Y = y, H·X = 0}
     = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, Y_2 = y_2, ..., Y_N = y_N, h_1·X = 0, h_2·X = 0, ..., h_M·X = 0}

Decoding, code case.
Old decoding rule, x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}.
New decoding rule, x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, ..., Y_N = y_N, h_1·X = 0, ..., h_M·X = 0}.
Compromise: use some of the {y_n}, some of the parity checks!

x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_{l_1} = y_{l_1}, ..., Y_{l_L} = y_{l_L}, h_{m_1}·X = 0, ..., h_{m_K}·X = 0}
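The parity-check characterization of codeword membership is a one-liner (a minimal sketch; the example H with checks x_1+x_2 = 0 and x_2+x_3 = 0 is ours):

```python
def is_codeword(H, x):
    """x is a codeword iff H x = 0 over GF(2), i.e. every parity check
    h_m . x sums to an even number among the participating bits."""
    return all(
        sum(h_mj * x_j for h_mj, x_j in zip(h_m, x)) % 2 == 0
        for h_m in H
    )

# Checks x1+x2 = 0 and x2+x3 = 0: the only codewords are 000 and 111.
H = [[1, 1, 0],
     [0, 1, 1]]
print(is_codeword(H, [1, 1, 1]))  # True
print(is_codeword(H, [1, 0, 1]))  # False
```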

Compromise: iterative decoding.
1. Start with the old decoding rule,
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}
2. Iteratively add more h's and y's,
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_{l_1} = y_{l_1}, ..., Y_{l_L} = y_{l_L}, h_{m_1}·X = 0, ..., h_{m_K}·X = 0}
How? Belief propagation.

Some formal stuff... Let w = [w_1, ..., w_N], and assume w ∉ C. What is Pr[X = w]?
Answer (the formal probability model):
Pr[X = w] = (1/2)^N
Pr[X = w | X is a codeword] = 0

Properties of the formal probability model:
1. Assumes no code.
2. Valid mathematically.
3. Non-restrictive.
4. We can express other useful values.

Concepts of belief propagation:
1. Graph based
2. Beliefs
3. Iterative message passing
4. Extrinsic information rule
5. Ignore loops

Graph based. Variable node #n corresponds to time slot n: to the unknown code bit X_n and to the received channel output y_n. Check node #m corresponds to parity check h_m.

Parity check: h_1·X = 0, i.e. X_1 + X_2 + X_3 + X_6 = 0. A check node is connected to the variables participating in its check.

Belief propagation. The knowledge ("beliefs") we have:
Y_1 = y_1, Y_2 = y_2, ..., Y_N = y_N
h_1·X = 0, h_2·X = 0, ..., h_M·X = 0
Divide it between the nodes. Variable nodes know the channel outputs: variable n knows the value of y_n. Check nodes know the parity checks: check m knows that h_m·X = 0.

Iterative message passing. Nodes communicate using messages, sent through edges to neighboring nodes. Each message is a number in [0, 1].

Message from variable n to check m:
V_{n→m} = Pr[X_n = 1 | some h's and some y's]
Message from check m to variable n:
C_{m→n} = Pr[X_n = 1 | other h's and other y's]

Rightbound and leftbound iterations. In a rightbound iteration, variables send messages to checks. In a leftbound iteration, checks send messages to variables.

Iterative message passing. At each node:
1. Collect all incoming messages (from the previous iteration).
2. Add my knowledge.

Rightbound iteration #1: at variable n,
1. Collect all incoming messages from the previous iteration (none).
2. Add my knowledge (channel output y_n):
V_{n→m} = Pr[X_n = 1 | Y_n = y_n]

Leftbound iteration #1: at check node 1, to variable 3,
1. Collect all incoming messages (V_{1→1}, V_{2→1}, V_{3→1}, V_{6→1}).
2. Add my knowledge (parity check X_1 + X_2 + X_3 + X_6 = 0):
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_3 = y_3, Y_6 = y_6]?

No. Extrinsic information rule: a message to a node is never a function of the message from that node. So V_{3→1} and y_3 must not be used.

Leftbound iteration #1: at check node 1, to variable 3,
1. Collect the incoming messages (V_{1→1}, V_{2→1}, V_{6→1}).
2. Add my knowledge (parity check X_1 + X_2 + X_3 + X_6 = 0):
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6] = ?

Some formal stuff... In the formal probability model,
Pr[X_1 = 1, X_2 = 1] = Pr[X_1 = 1] Pr[X_2 = 1]
Pr[X_1 = 1, Y_2 = 1, X_2 = 1] = Pr[X_1 = 1] Pr[Y_2 = 1, X_2 = 1]
Pr[X_1 + X_3 = 0, Y_3 = 1, X_2 + X_4 = 0, Y_4 = 1] = Pr[X_1 + X_3 = 0, Y_3 = 1] Pr[X_2 + X_4 = 0, Y_4 = 1]

Using this,
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
        = 1/2 - 1/2 ∏_{i=1,2,6} (1 - 2 V_{i→1})
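The check-to-variable rule just derived is one line of code (a sketch; the incoming values are the extrinsic messages V_i = Pr[X_i = 1 | ...]):

```python
import math

def check_to_variable(incoming):
    """Check-node update from the slide (Gallager's rule):
    Pr[X_n = 1 | parity check satisfied, extrinsic knowledge]
    = 1/2 - 1/2 * prod_i (1 - 2 V_i)."""
    return 0.5 - 0.5 * math.prod(1.0 - 2.0 * v for v in incoming)

# Sanity checks: if the other bits are surely 0, X_n must be 0;
# if exactly one other bit is surely 1, X_n must be 1;
# one uninformative (1/2) input makes the output uninformative.
print(check_to_variable([0.0, 0.0, 0.0]))  # 0.0
print(check_to_variable([1.0, 0.0]))       # 1.0
print(check_to_variable([0.5, 0.9]))       # 0.5
```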

Leftbound iteration #1: at check node 2, to variable 3,
1. Collect the incoming messages (V_{4→2}, V_{5→2}).
2. Add my knowledge (parity check X_3 + X_4 + X_5 = 0):
C_{2→3} = Pr[X_3 = 1 | X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
        = 1/2 - 1/2 ∏_{i=4,5} (1 - 2 V_{i→2})

Rightbound iteration #2: from variable 3, to check node 3,
1. Collect all incoming messages (C_{1→3}, C_{2→3}).
2. Add my knowledge (channel output y_3):
V_{3→3} = Pr[X_3 = 1 | some knowledge]

Notation:
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
C_{2→3} = Pr[X_3 = 1 | X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
V_{3→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6, X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5, Y_3 = y_3]

Therefore, in shorthand,
V_{3→3} = Pr[X_3 = 1 | h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]

Expanding with Bayes,

Pr[X_3 = 1 | h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
  = Pr[X_3 = 1, h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
    / Pr[h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]

For the numerator,

Pr[X_3 = 1, h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
  = Pr[X_3 = 1, Y_3 = y_3, X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6, X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
  = Pr[X_3 = 1, Y_3 = y_3, X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6, 1 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
  = Pr[X_3 = 1, Y_3 = y_3] Pr[X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6] Pr[1 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]

(substituting X_3 = 1 into the two checks, then using the independence of the formal model).

Continuing,

Pr[X_3 = 1, h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
  = Pr[X_3 = 1, Y_3 = y_3] Pr[X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6] Pr[1 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
  = Pr[X_3 = 1 | Y_3 = y_3]
    · Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
    · Pr[X_3 = 1 | X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
    · fun(y_1, y_2, y_6, y_4, y_5)
  = P_3 · C_{1→3} · C_{2→3} · fun(y_1, y_2, y_6, y_4, y_5)

where P_3 = Pr[X_3 = 1 | Y_3 = y_3]. Dividing by the denominator, the fun(·) factor cancels:

V_{3→3} = Pr[X_3 = 1 | h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
        = P_3 ∏_{i=1,2} C_{i→3} / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )

Rightbound iteration #2: from variable 3, to check 3,
1. Collect all incoming messages (C_{1→3}, C_{2→3}).
2. Add my knowledge (channel output y_3):
V_{3→3} = P_3 ∏_{i=1,2} C_{i→3} / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )
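The variable-to-check rule has the same shape in code (a sketch; p_n is the channel posterior P_n = Pr[X_n = 1 | Y_n = y_n] and incoming holds the extrinsic check messages C_{i→n}):

```python
import math

def variable_to_check(p_n, incoming):
    """Variable-node update from the slide:
    V = P prod_i C_i / (P prod_i C_i + (1 - P) prod_i (1 - C_i)).
    The unknown fun(...) factor cancels in this normalization."""
    num = p_n * math.prod(incoming)                       # math.prod([]) == 1
    den = (1.0 - p_n) * math.prod(1.0 - c for c in incoming)
    return num / (num + den)

print(variable_to_check(0.75, []))      # 0.75: no checks, just the channel
print(variable_to_check(0.75, [0.75]))  # 0.9: an agreeing check sharpens the belief
```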

Let's change the problem... In the information flow graph: a loop!

Suppose the second check is X_3 + X_2 + X_5 = 0, so that X_2 participates in both checks. Then

Pr[X_3 = 1, h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
  = Pr[X_3 = 1, Y_3 = y_3, X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6, X_3 + X_2 + X_5 = 0, Y_2 = y_2, Y_5 = y_5]
  = Pr[X_3 = 1, Y_3 = y_3, X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6, 1 + X_2 + X_5 = 0, Y_2 = y_2, Y_5 = y_5]
  ≠ Pr[X_3 = 1, Y_3 = y_3] Pr[X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6] Pr[1 + X_2 + X_5 = 0, Y_2 = y_2, Y_5 = y_5]

because X_2 and Y_2 appear in both factors, so the independence used before fails. Hence

Pr[X_3 = 1 | h_1·X = 0, Y_1 = y_1, h_2·X = 0, Y_2 = y_2, Y_3 = y_3]
  ≠ P_3 ∏_{i=1,2} C_{i→3} / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )

What to do?

Ignore the loop: compute V_{3→3} as if there were no loop!

Why is ignoring loops okay?
The number of loops is small.
Simulation results are okay even when there are some loops.

Low-density parity checks: d << N, with the graph randomly generated, where d = average check degree and N = block length.

Belief Propagation Algorithm

Rightbound iteration #t: at variable node n,
V_{n→m} = P_n ∏_{i∈A(n)\{m}} C_{i→n} / ( P_n ∏_{i∈A(n)\{m}} C_{i→n} + (1 - P_n) ∏_{i∈A(n)\{m}} (1 - C_{i→n}) )

Leftbound iteration #t: at check node m,
C_{m→n} = 1/2 - 1/2 ∏_{i∈A(m)\{n}} (1 - 2 V_{i→m})

where A(n), A(m) are the sets of nodes adjacent to n and m.
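Putting the two update rules together gives the full decoder. Below is a compact sketch for a BSC (our own assembly of the slides' rules, not the lecture's code); the example H is built from the two parity checks used throughout, X_1 + X_2 + X_3 + X_6 = 0 and X_3 + X_4 + X_5 = 0, and the final bit decision combines the channel posterior with all incoming check messages.

```python
import math

def bp_decode(H, y, p, iters=10):
    """Belief-propagation decoding over a BSC with crossover probability p.
    H: parity-check rows (0/1 lists); y: received bits.
    V[(n, m)]: variable-to-check messages; C[(m, n)]: check-to-variable."""
    M, N = len(H), len(y)
    A_chk = [[n for n in range(N) if H[m][n]] for m in range(M)]  # A(m)
    A_var = [[m for m in range(M) if H[m][n]] for n in range(N)]  # A(n)
    P = [(1.0 - p) if bit == 1 else p for bit in y]               # Pr[X_n = 1 | y_n]
    V = {(n, m): P[n] for n in range(N) for m in A_var[n]}        # rightbound #1
    C = {}
    for _ in range(iters):
        for m in range(M):            # leftbound: checks to variables
            for n in A_chk[m]:
                C[(m, n)] = 0.5 - 0.5 * math.prod(
                    1.0 - 2.0 * V[(i, m)] for i in A_chk[m] if i != n)
        for n in range(N):            # rightbound: variables to checks
            for m in A_var[n]:
                num = P[n] * math.prod(C[(i, n)] for i in A_var[n] if i != m)
                den = (1.0 - P[n]) * math.prod(1.0 - C[(i, n)] for i in A_var[n] if i != m)
                V[(n, m)] = num / (num + den)
    x_hat = []                        # final decision: use all incoming messages
    for n in range(N):
        num = P[n] * math.prod(C[(m, n)] for m in A_var[n])
        den = (1.0 - P[n]) * math.prod(1.0 - C[(m, n)] for m in A_var[n])
        x_hat.append(1 if num > den else 0)
    return x_hat

# Checks from the slides: X1+X2+X3+X6 = 0 and X3+X4+X5 = 0 (1-indexed).
H = [[1, 1, 1, 0, 0, 1],
     [0, 0, 1, 1, 1, 0]]
print(bp_decode(H, [0, 1, 1, 0, 1, 0], p=0.1))  # [0, 1, 1, 0, 1, 0]: a codeword is kept
```

This particular graph happens to be loop-free, so the messages compute the bitwise posteriors exactly; on a loopy graph the same code realizes the "ignore loops" approximation described above.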