Status of Knowledge on Non-Binary LDPC Decoders


Status of Knowledge on Non-Binary LDPC Decoders. Part I: From Binary to Non-Binary Belief Propagation Decoding. D. Declercq, ETIS - UMR8051, ENSEA / Cergy-University / CNRS, France. IEEE SSC SCV Tutorial, Santa Clara, October 21st, 2010.

Outline: 1. Introduction; 2. Belief Propagation on a General Graph; 3. Binary Belief Propagation Decoder; 4. Non-Binary Belief Propagation Decoding.

Small History of Binary LDPC Codes: Landmarks.
Gallager 1962: regular LDPC codes, proof of convergence (MLD), algorithm A (bit flipping), algorithm B.
Tanner 1981: composite codes on graphs, link between product codes and LDPC codes.
MacKay 1995: Belief Propagation (BP) decoding, link with iterative turbo-decoding.
Richardson and Urbanke 2001: irregular LDPC codes, proof of convergence (BP), optimization of the irregularity, codes approaching capacity (BEC, BI-AWGN).
Since then: optimization for other types of channels (frequency selective, multilevel, multi-user, turbo-equalization, joint source-channel coding), finding good matrices for small sizes, lowering the error floor. Golden age of LDPC codes, application in many standards.

Small History of Non-Binary LDPC Codes: Landmarks.
Gallager 1963: LDPC codes in Galois fields, iterative hard decoding (algorithm B) for $d_v = 3$.
MacKay 1998: advantages for small blocks/high rates, ultra-sparse $d_v = 2$ LDPC codes in high-order fields.
2003-2006: development of practical decoders for non-binary LDPC codes.
2006-2010: attempts to find applications where NB-LDPC codes outperform binary LDPC codes.
2010-xxx: golden age of non-binary LDPC codes?
[Davey 98] M. Davey and D.J.C. MacKay, "Low Density Parity Check Codes over GF(q)", IEEE Communications Letters, vol. 2, pp. 165-167, June 1998.
[MacKay 99] D.J.C. MacKay and M. Davey, "Evaluation of Gallager Codes for Short Block Length and High Rate Applications", Proc. IMA Workshop on Codes, Systems and Graphical Models, 1999.

Definitions and Quantities.
LDPC code defined by its parity-check matrix: $\mathcal{C}_H = \{ c \in GF(q)^N : H \cdot c = 0 \text{ in } GF(q) \}$, with $H$ an $(M \times N)$ parity-check matrix.
LDPC code defined by its generator matrix: $\mathcal{C}_G = \{ c = G \cdot u,\ u \in GF(q)^K \}$, with $G$ an $(N \times K)$ generator matrix.
Sizes: $N$ for a codeword, $K$ for the information (rate $R = K/N$ if $H$ is full rank), $M$ for the redundancy.
Density of $H$: $d_H = \frac{\text{nb. of nonzero elements in } H}{M \cdot N}$; LDPC means $d_H \to 0$ as $N \to +\infty$.

Tanner Graph Representation of a Binary LDPC Code. The Tanner graph is a bipartite graph whose adjacency matrix is $H$ (LDPC: Low-Density Parity-Check codes). Example:
$H = \begin{bmatrix} 1 & 0 & 1 & 1 & 0 & 0 & 0 & 1 \\ 1 & 1 & 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 1 & 1 & 0 & 1 \\ 0 & 1 & 0 & 0 & 1 & 1 & 1 & 0 \end{bmatrix}$
[Figure: codeword bits $c_0, \ldots, c_7$ (variable nodes) connected through an interleaver Π to the parity-check nodes.]
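
As a small illustration (a minimal Python sketch, using the example H above over GF(2)), one can list the Tanner-graph edges and test the parity-check equations:

```python
import numpy as np

# The example 4x8 parity-check matrix from the slide (binary case).
H = np.array([
    [1, 0, 1, 1, 0, 0, 0, 1],
    [1, 1, 0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 1, 0, 0, 1, 1, 1, 0],
])

# Tanner-graph edges: one edge per nonzero entry H[m, n].
edges = [(m, n) for m in range(H.shape[0]) for n in range(H.shape[1]) if H[m, n]]
print("edges (check m, variable n):", edges)

def satisfies_all_checks(H, c):
    """True if H.c = 0 over GF(2), i.e. c is a codeword."""
    return not np.any(H.dot(c) % 2)

print(satisfies_all_checks(H, np.zeros(8, dtype=int)))  # all-zero word -> True
```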

Tanner Graph Representation of a Non-Binary LDPC Code. Tanner graph of an irregular non-binary LDPC code in GF(8). Example:
$H = \begin{bmatrix} 3 & 0 & 7 & 1 & 0 & 0 & 0 & 2 \\ 4 & 3 & 0 & 5 & 0 & 0 & 1 & 0 \\ 1 & 0 & 5 & 0 & 3 & 6 & 0 & 7 \\ 0 & 6 & 0 & 0 & 1 & 2 & 0 & 0 \end{bmatrix}$
[Figure: codeword symbols $c_0, \ldots, c_7$ connected through an interleaver Π to the parity-check nodes.]

Parameters for Non-Binary LDPC Code Irregularity. Irregularity distribution / irregularity profile:
1. Edge proportions: $\lambda_i$ = proportion of nonzero values $\{H_{kl}\}$ in degree-$i$ columns; $\rho_j$ = proportion of nonzero values $\{H_{kl}\}$ in degree-$j$ rows.
2. Node proportions: $\lambda_i$ = proportion of columns of $H$ with degree $i$; $\rho_j$ = proportion of rows of $H$ with degree $j$.
Generating polynomials: $\lambda(x) = \sum_{i=2}^{d_{v,\max}} \lambda_i x^{i-1}$, $\rho(x) = \sum_{j=2}^{d_{c,\max}} \rho_j x^{j-1}$.
3. Non-zero values distribution: $h_{ij}(\omega)$ uniformly distributed in GF(q)\{0}.
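
A small sketch of how these proportions can be computed from a parity-check matrix (reusing the binary example H from the earlier Tanner-graph slide as a stand-in):

```python
import numpy as np
from collections import Counter

H = np.array([
    [1, 0, 1, 1, 0, 0, 0, 1],
    [1, 1, 0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 1, 1, 0, 1],
    [0, 1, 0, 0, 1, 1, 1, 0],
])

col_deg = H.sum(axis=0)           # variable-node degrees
row_deg = H.sum(axis=1)           # check-node degrees
E = H.sum()                       # total number of edges (nonzero entries)

# Node proportions: fraction of columns (rows) with a given degree.
node_lambda = {int(i): c / H.shape[1] for i, c in Counter(col_deg).items()}
node_rho    = {int(j): c / H.shape[0] for j, c in Counter(row_deg).items()}

# Edge proportions: fraction of edges attached to degree-i columns / degree-j rows.
edge_lambda = {int(i): col_deg[col_deg == i].sum() / E for i in set(col_deg)}
edge_rho    = {int(j): row_deg[row_deg == j].sum() / E for j in set(row_deg)}

print(node_lambda, node_rho)
print(edge_lambda, edge_rho)
```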

Notion of Code Family. An LDPC code family is defined by $(\lambda(x), \rho(x), N)$: used for characterization, proofs and theoretical studies. A fixed LDPC code is defined by $(\lambda(x), \rho(x), N, \Pi, \{h_{ij}\})$: used for practical applications. Example: $\lambda(x) = \frac{4}{24} x + \frac{3}{24} x^2 + \frac{8}{24} x^3 + \frac{9}{24} x^8$, $\rho(x) = x^5$.

Outline: 1. Introduction; 2. Belief Propagation on a General Graph; 3. Binary Belief Propagation Decoder; 4. Non-Binary Belief Propagation Decoding. (Current section: 2. Belief Propagation on a General Graph.)

Concept of Iterative Decoder on Graph: From Local Computation to Global Optimization. The general concept of LDPC decoders is based on message passing between nodes in the Tanner graph of the code, so that iterative updates of the messages lead to a stable state of the messages: convergence to a fixed point. Messages represent probability density functions of the random variables. For a discrete random variable in a set of $q$ elements: $\mu(0) = \mathrm{Prob}(x_n = 0), \ldots, \mu(q-1) = \mathrm{Prob}(x_n = q-1)$, with $\sum_{k=0}^{q-1} \mu(k) = 1$. The decoding result is the a posteriori probability of one random variable $x_n$: $\mathrm{Prob}(x_n \mid y_0, y_1, \ldots, y_{N-1})$. A particular scheduling of the computation of the messages defines a decoding iteration.

Terminology and Some History. Belief Propagation (BP):
1. Artificial Intelligence. Statistical learning: Pearl's Belief Propagation (1981-86); neural networks: sum-product algorithm (ΣΠ) (1985-86).
2. Information Theory. Gallager iterative decoders for LDPC (1963), Viterbi (1967), BCJR (1974): can be analysed as BP on factor graphs.
3. Statistical Physics. BP = Bethe approximation of the global free energy of complex systems (1935); Generalized BP = Kikuchi approximation of the free energy (1951).

Example. The Tanner graph (for parity-check codes) is a special case of factor graph. Let A(ω), B(ω), C(ω), D(ω) be dependent random variables, and let A, B, C, D denote their noisy observations. [Figure: factor graph linking A, B, C, D, their noisy observations, and the factor p(B | A, C, D).] With belief propagation on a tree, we get the a posteriori density: the optimal solution.

Notations for bipartite graphs. Variable node $x_n$, function node $F(.)$; $(\mu_{x \to f}, \mu_{f \to x})$: messages = p.d.f.s. The graph is not oriented: messages are needed in both directions. Two types of nodes = two types of local updates: data (variable) node update and function node update.

Concept of Computational Tree. Expansion of the graph from a symbol/check node. [Figure: computational tree rooted at function nodes F1, F2, showing the nodes seen after 1 iteration and after 2 iterations, with channel LLRs at the leaves.]

Concept of Computational Tree. Past of the graph = set of nodes. [Figure: the same computational tree, with the past of F1 denoted $S_{F1}$ and the past of F2 denoted $S_{F2}$.]

Concept of Computational Tree. Independence assumption. [Figure: the messages arriving at F1 and F2 are independent when their pasts $S_{F1}$ and $S_{F2}$ are disjoint.]

Variable Node Update: Bayesian Merging (bitnode/symbol updates for LDPC codes). [Figure: variable node $x$ connected to four function nodes $F(.)$, with outgoing message $\mu^1_{x \to f}$.]
$\mu^1_{x \to f}[k] \propto \prod_{i=2}^{4} \mu^i_{f \to x}[k], \quad k = 0, \ldots, q-1$
ASSUMPTION: the input messages $\mu^i_{f \to x}$ are independent. ASSUMPTION: the noisy symbol sets leading to $\mu^i_{f \to x}$ are disjoint! The update equation is NOT normalized.
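
A minimal sketch of this merging step in the probability domain (the final normalization is an implementation choice here, since the raw update is not normalized; the message values are invented for the example):

```python
import numpy as np

def variable_node_update(incoming, channel_prior=None):
    """Merge incoming function-to-variable messages (length-q probability
    vectors) by term-wise product; optionally include a channel prior."""
    out = np.ones_like(incoming[0], dtype=float)
    for msg in incoming:
        out *= msg
    if channel_prior is not None:
        out *= channel_prior
    return out / out.sum()   # normalize so the message stays a p.d.f.

# Example with q = 4 and two incoming messages.
m1 = np.array([0.7, 0.1, 0.1, 0.1])
m2 = np.array([0.4, 0.4, 0.1, 0.1])
print(variable_node_update([m1, m2]))
```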

Function Node Update: Bayesian Marginalization (checknode updates for LDPC codes). [Figure: function node $F(x_1, x_2, x_3, x_4)$ connected to variable nodes $x_1, \ldots, x_4$.]
$\mu^1_{f \to x}[k_1] = \sum_{k_2, k_3, k_4} F(x_1 = k_1, x_2 = k_2, x_3 = k_3, x_4 = k_4) \prod_{i=2}^{4} \mu^i_{x \to f}[k_i], \quad k_1 = 0, \ldots, q-1$
ASSUMPTION: the input messages $\mu^i_{x \to f}$ are independent. ASSUMPTION: the noisy symbol sets leading to $\mu^i_{x \to f}$ are disjoint.

Function Node Update: Bayesian Marginalization. Non-binary parity-check case in GF(q).
$\mu^1_{f \to x}[k_1] = \sum_{k_2, k_3, k_4} F(x_1 = k_1, x_2 = k_2, x_3 = k_3, x_4 = k_4) \prod_{i=2}^{4} \mu^i_{x \to f}[k_i]$
Let $\alpha_k \in GF(q) = \{0, 1, \alpha, \alpha^2, \ldots, \alpha^{q-2}\}$.
Parity-check case: the function node reduces to an indicator function:
$F(x_1 = \alpha_1, x_2 = \alpha_2, x_3 = \alpha_3, x_4 = \alpha_4) = 1$ if $\alpha_1 + \alpha_2 + \alpha_3 + \alpha_4 = 0$, and $= 0$ otherwise.
Parity-check case: this removes one sum dimension in the marginalization:
$\mu^1_{f \to x}[\alpha_1] = \sum_{\alpha_2, \alpha_3} \mu^2_{x \to f}[\alpha_2]\, \mu^3_{x \to f}[\alpha_3]\, \mu^4_{x \to f}[\alpha_1 + \alpha_2 + \alpha_3]$
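
A brute-force sketch of this marginalization for a degree-4 parity-check node over GF(2^p), where field addition is bitwise XOR in the binary-map representation introduced later in the talk (the incoming messages are random placeholders):

```python
import numpy as np
from itertools import product

def check_node_update_gf(incoming, q):
    """Output message toward one edge of a GF(2^p) parity-check node, given the
    incoming messages on all the other edges (each a length-q p.d.f.).
    Field addition in the binary-map representation is bitwise XOR."""
    out = np.zeros(q)
    for combo in product(range(q), repeat=len(incoming)):
        sym = 0
        prob = 1.0
        for k, msg in zip(combo, incoming):
            sym ^= k              # the parity check forces the output symbol
            prob *= msg[k]        # probability of this combination
        out[sym] += prob
    return out

q = 8
rng = np.random.default_rng(0)
incoming = [rng.dirichlet(np.ones(q)) for _ in range(3)]   # degree-4 check node
print(check_node_update_gf(incoming, q))
```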

Scheduling and Definition of Iteration. This ordering of the messages is called the flooding schedule. One decoding iteration = [Figure: one full update of all messages $\mu_{x \to f}$, then all messages $\mu_{f \to x}$, then the APP computation].

Concept of Computational Tree (cont'd). [Figure: computational tree rooted at the APP node, with nodes seen after 1 and 2 iterations and LLRs at the leaves.] Computational span of L iterations: in L iterations, a maximum of $d_v (d_v - 1)^{L-1} (d_c - 1)^{L}$ nodes are seen from the top of the tree. As a consequence, a usual assumption is that the BP decoder needs at least $L = \log(N)$ iterations to converge (to see all LLRs). As a consequence, the independence assumption for the BP decoder breaks after at most $L = \log(N)$ iterations.

Concept of Computational Tree (cont'd). Breaking the independence assumption. [Figure: a cycle in the graph makes some message updates and the APP computation wrong after a few iterations.] A crucial parameter of the graph is its girth $g$, i.e. the size of the smallest closed path/cycle. As a consequence, only $g/4$ decoding iterations correspond to exact inference!

Alternate Scheduling of Messages (1): Layered BP or Shuffled Scheduling. [Figure.]

Alternate Scheduling of Messages (2): Layered BP or Shuffled Scheduling. [Figure.]

Alternate Scheduling of Messages (3): Layered BP or Shuffled Scheduling. [Figure.]

Alternate Scheduling of Messages (4): Layered BP or Shuffled Scheduling. Advantage: for bitnodes with degree $d_v \geq 3$, messages are computed several times during ONE iteration, giving faster convergence.

Outline: 1. Introduction; 2. Belief Propagation on a General Graph; 3. Binary Belief Propagation Decoder; 4. Non-Binary Belief Propagation Decoding. (Current section: 3. Binary Belief Propagation Decoder.)

Binary Belief Propagation Algorithm in the Log-Domain. Definition of the messages in the log-domain:
$u_k = \log \frac{\mu^k_{f \to x}[0]}{\mu^k_{f \to x}[1]}, \qquad v_k = \log \frac{\mu^k_{x \to f}[0]}{\mu^k_{x \to f}[1]}$
Message update through the two types of nodes (with $u_0$ the channel LLR):
$v_m = u_0 + \sum_{k=1, k \neq m}^{d_v} u_k, \qquad \tanh \frac{u_k}{2} = \prod_{m=1; m \neq k}^{d_c} \tanh \frac{v_m}{2}$

From the Probability Domain to the Log Domain (1): Bitnode Update.
$\mu^{d_v}_{x \to f}[k] = \prod_{i=1}^{d_v - 1} \mu^i_{f \to x}[k], \quad k = 0, 1$
Let us consider a $d_v = 3$ bitnode with $v_3$ as output message:
$v_3 = \log \frac{\mu^3_{x \to f}[0]}{\mu^3_{x \to f}[1]} = \log \frac{\mu^0[0]\, \mu^1_{f \to x}[0]\, \mu^2_{f \to x}[0]}{\mu^0[1]\, \mu^1_{f \to x}[1]\, \mu^2_{f \to x}[1]} = u_0 + u_1 + u_2$

From the Probability Domain to the Log Domain (2): Checknode Update.
$\mu^{d_c}_{f \to x}[\alpha_{d_c}] = \sum_{\alpha_1, \ldots, \alpha_{d_c - 1}} \left( \prod_{i=1}^{d_c - 1} \mu^i_{x \to f}[\alpha_i] \right) \mathbb{1}\!\left[ \textstyle\sum_k \alpha_k = 0 \right], \quad \alpha_k \in \{0, 1\}$
Let us consider a $d_c = 3$ checknode with $u_3$ as output message:
$\mu^3_{f \to x}[0] = \mu^1_{x \to f}[0]\mu^2_{x \to f}[0] + \mu^1_{x \to f}[1]\mu^2_{x \to f}[1]$
$\mu^3_{f \to x}[1] = \mu^1_{x \to f}[0]\mu^2_{x \to f}[1] + \mu^1_{x \to f}[1]\mu^2_{x \to f}[0]$

From the Probability Domain to the Log Domain (3): Checknode Update. Intermediate step: decoding in the Fourier domain. Now compute the factorization of the sum:
$\mu^3_{f \to x}[0] + \mu^3_{f \to x}[1] = (\mu^1_{x \to f}[0] + \mu^1_{x \to f}[1])(\mu^2_{x \to f}[0] + \mu^2_{x \to f}[1])$
and the factorization of the difference:
$\mu^3_{f \to x}[0] - \mu^3_{f \to x}[1] = (\mu^1_{x \to f}[0] - \mu^1_{x \to f}[1])(\mu^2_{x \to f}[0] - \mu^2_{x \to f}[1])$
In vector form (with a component-wise product of the transformed vectors):
$\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} \mu^3_{f \to x}[0] \\ \mu^3_{f \to x}[1] \end{bmatrix} = \left( \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} \mu^1_{x \to f}[0] \\ \mu^1_{x \to f}[1] \end{bmatrix} \right) \odot \left( \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} \mu^2_{x \to f}[0] \\ \mu^2_{x \to f}[1] \end{bmatrix} \right)$

From the Probability Domain to the Log Domain (4): Checknode Update. Intermediate step: decoding in the Fourier domain. With the definitions of the Fourier transforms
$F = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}, \qquad F^{-1} = \frac{1}{2} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$
we obtain the checknode update in the Fourier domain (component-wise product in the Fourier domain):
$\begin{bmatrix} \mu^3_{f \to x}[0] \\ \mu^3_{f \to x}[1] \end{bmatrix} = F^{-1} \left( F \begin{bmatrix} \mu^1_{x \to f}[0] \\ \mu^1_{x \to f}[1] \end{bmatrix} \odot F \begin{bmatrix} \mu^2_{x \to f}[0] \\ \mu^2_{x \to f}[1] \end{bmatrix} \right)$
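
A minimal numerical check of this Fourier-domain identity (message values chosen only for illustration):

```python
import numpy as np

F = np.array([[1.0, 1.0],
              [1.0, -1.0]])
F_inv = 0.5 * F          # F^{-1} = (1/2) F in the 2x2 case

def checknode_fourier(m1, m2):
    """Binary checknode update computed in the Fourier domain."""
    return F_inv.dot(F.dot(m1) * F.dot(m2))

def checknode_direct(m1, m2):
    """Same update computed by direct marginalization (convolution over GF(2))."""
    return np.array([m1[0]*m2[0] + m1[1]*m2[1],
                     m1[0]*m2[1] + m1[1]*m2[0]])

m1 = np.array([0.8, 0.2])
m2 = np.array([0.6, 0.4])
print(checknode_fourier(m1, m2))   # both prints give the same vector
print(checknode_direct(m1, m2))
```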

From the Probability Domain to the Log Domain (5): link between the probability domain and the log-domain.
$\mu^3_{f \to x}[0] = \frac{e^{u_3}}{e^{u_3} + 1}, \qquad \mu^3_{f \to x}[1] = \frac{1}{e^{u_3} + 1}$
From the previous equations, we have:
$\mu^3_{f \to x}[0] - \mu^3_{f \to x}[1] = (\mu^1_{x \to f}[0] - \mu^1_{x \to f}[1])(\mu^2_{x \to f}[0] - \mu^2_{x \to f}[1])$
$\frac{e^{u_3} - 1}{e^{u_3} + 1} = \frac{e^{u_3/2} - e^{-u_3/2}}{e^{u_3/2} + e^{-u_3/2}} = \left( \frac{e^{v_1} - 1}{e^{v_1} + 1} \right)\!\left( \frac{e^{v_2} - 1}{e^{v_2} + 1} \right) = \left( \frac{e^{v_1/2} - e^{-v_1/2}}{e^{v_1/2} + e^{-v_1/2}} \right)\!\left( \frac{e^{v_2/2} - e^{-v_2/2}}{e^{v_2/2} + e^{-v_2/2}} \right)$
$\tanh \frac{u_3}{2} = \tanh \frac{v_1}{2} \tanh \frac{v_2}{2}$

Final Step: Remove All Products. Let's compute the BP checknode update in the log-domain:
$\log \left| \tanh \frac{u_3}{2} \right| = \log \left| \tanh \frac{v_1}{2} \right| + \log \left| \tanh \frac{v_2}{2} \right|$
The sign of the message is computed in a parallel stream:
$\operatorname{sign}\!\left( \tanh \frac{u_3}{2} \right) = \operatorname{sign}\!\left( \tanh \frac{v_1}{2} \right) \operatorname{sign}\!\left( \tanh \frac{v_2}{2} \right) \;\Rightarrow\; \operatorname{sign}(u_3) = \operatorname{sign}(v_1)\, \operatorname{sign}(v_2)$

Binary Belief Propagation Algorithm in the Log-Domain. Definition of the messages in the log-domain:
$u_k = \log \frac{\mu^k_{f \to x}[0]}{\mu^k_{f \to x}[1]}, \qquad v_k = \log \frac{\mu^k_{x \to f}[0]}{\mu^k_{x \to f}[1]}$
Message update through the two types of nodes (with $u_0$ the channel LLR):
$v_m = u_0 + \sum_{k=1, k \neq m}^{d_v} u_k$
$\log \left| \tanh \frac{u_k}{2} \right| = \sum_{m=1; m \neq k}^{d_c} \log \left| \tanh \frac{v_m}{2} \right|, \qquad \operatorname{sign}(u_k) = \prod_{m=1; m \neq k}^{d_c} \operatorname{sign}(v_m)$
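
A compact sketch of these two update rules on LLR messages (example values are invented; the function names are not from the talk):

```python
import numpy as np

def bitnode_update(u0, u_in, m):
    """v_m = u0 + sum of all incoming check-to-variable LLRs except u_in[m]."""
    return u0 + sum(u for k, u in enumerate(u_in) if k != m)

def checknode_update(v_in, k):
    """u_k from the tanh rule, using all incoming variable-to-check LLRs
    except v_in[k]; magnitude and sign are handled in parallel streams."""
    others = [v for m, v in enumerate(v_in) if m != k]
    sign = np.prod(np.sign(others))
    log_mag = sum(np.log(np.abs(np.tanh(v / 2.0))) for v in others)
    return sign * 2.0 * np.arctanh(np.exp(log_mag))

print(bitnode_update(1.2, [0.5, -0.3, 2.0], m=0))   # example bitnode output
print(checknode_update([1.5, -0.7, 2.2], k=0))      # example checknode output
```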

From the Log-Domain BP to the Min-Sum Decoder. From the previous equations:
$\mu^3_{f \to x}[0] = \mu^1_{x \to f}[0]\mu^2_{x \to f}[0] + \mu^1_{x \to f}[1]\mu^2_{x \to f}[1]$
$\mu^3_{f \to x}[1] = \mu^1_{x \to f}[0]\mu^2_{x \to f}[1] + \mu^1_{x \to f}[1]\mu^2_{x \to f}[0]$
$u_3 = \log \frac{\mu^3_{f \to x}[0]}{\mu^3_{f \to x}[1]} = \log\!\left( \frac{e^{v_1}}{e^{v_1}+1}\,\frac{e^{v_2}}{e^{v_2}+1} + \frac{1}{e^{v_1}+1}\,\frac{1}{e^{v_2}+1} \right) - \log\!\left( \frac{e^{v_1}}{e^{v_1}+1}\,\frac{1}{e^{v_2}+1} + \frac{1}{e^{v_1}+1}\,\frac{e^{v_2}}{e^{v_2}+1} \right)$
$= \log\!\left( e^{v_1 + v_2} + 1 \right) - \log\!\left( e^{v_1} + e^{v_2} \right) = \max{}^*(v_1 + v_2,\, 0) - \max{}^*(v_1,\, v_2)$
where $\max^*(x, y) = \log(e^x + e^y)$ denotes the Jacobian logarithm.
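
A small sketch of the Jacobian logarithm, using the standard identity $\max^*(x, y) = \max(x, y) + \log(1 + e^{-|x-y|})$, together with a numerical check against the tanh rule:

```python
import numpy as np

def max_star(x, y):
    """Jacobian logarithm: log(e^x + e^y), computed in a numerically stable way."""
    return max(x, y) + np.log1p(np.exp(-abs(x - y)))

def u3_exact(v1, v2):
    """u_3 = max*(v1 + v2, 0) - max*(v1, v2)."""
    return max_star(v1 + v2, 0.0) - max_star(v1, v2)

v1, v2 = 1.5, -0.7
print(u3_exact(v1, v2))
print(2 * np.arctanh(np.tanh(v1 / 2) * np.tanh(v2 / 2)))  # same value via the tanh rule
```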

From the Log-Domain BP to the Min-Sum Decoder. After some transformations:
$u_3 = \max{}^*(v_1 + v_2,\, 0) - \max{}^*(v_1,\, v_2)$
$= \max(v_1 + v_2,\, 0) - \max(v_1,\, v_2) + \log \frac{1 + e^{-|v_1 + v_2|}}{1 + e^{-|v_1 - v_2|}}$
$= \operatorname{sign}(v_1)\operatorname{sign}(v_2) \min(|v_1|, |v_2|) + \log \frac{1 + e^{-|v_1 + v_2|}}{1 + e^{-|v_1 - v_2|}}$
Noting that the additional term $\log \frac{1 + e^{-|v_1 + v_2|}}{1 + e^{-|v_1 - v_2|}}$ is negative when $v_1$ and $v_2$ have the same sign and positive when they have different signs, it can be replaced by a constant value.

We finally get the Corrected Min-Sum Decoder.
1. Bitnode update, same as for BP: $v_m = u_0 + \sum_{k=1, k \neq m}^{d_v} u_k$
2. Checknode update: $u_k = \left( \prod_{m=1; m \neq k}^{d_c} \operatorname{sign}(v_m) \right) \cdot \min_{m \neq k} |v_m|$
3. Compensation/Correction: $\tilde{u}_k = \max(0, u_k - \gamma)$ if $u_k > 0$; $\tilde{u}_k = \min(0, u_k + \gamma)$ if $u_k < 0$.
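
A minimal sketch of this corrected (offset) Min-Sum checknode update, with the offset γ left as a free parameter:

```python
import numpy as np

def minsum_checknode(v_in, k, gamma=0.0):
    """Offset Min-Sum checknode output toward edge k: sign product times the
    minimum magnitude of the other inputs, then offset correction toward zero."""
    others = np.array([v for m, v in enumerate(v_in) if m != k])
    u = np.prod(np.sign(others)) * np.min(np.abs(others))
    if u > 0:
        return max(0.0, u - gamma)
    if u < 0:
        return min(0.0, u + gamma)
    return 0.0

print(minsum_checknode([1.5, -0.7, 2.2], k=0, gamma=0.3))  # compare with the BP value above
```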

Comments on the Different Decoding Algorithms. Shuffled scheduling can be parallelized if the LDPC code is properly designed: increased throughput. Shuffled scheduling converges approximately 2 to 3 times faster than the flooding schedule: reduced latency. Bit-flipping, Gallager-A and Gallager-B: easier to obtain theorems on theoretical performance. Min-Sum with a proper offset correction approaches BP for regular or slightly irregular LDPC codes. In some particular cases, the Min-Sum decoder can surpass the BP decoder in the error floor region.

Outline: 1. Introduction; 2. Belief Propagation on a General Graph; 3. Binary Belief Propagation Decoder; 4. Non-Binary Belief Propagation Decoding. (Current section: 4. Non-Binary Belief Propagation Decoding.)

Belief Propagation in the Probability Domain. Check node equations: augmenting the factor graph representation. Now the code is defined from non-binary parity-check equations
$\sum_{j=1}^{d_c} h_{ij} \cdot c_j = 0 \ \text{ in } GF(q), \qquad GF(q) = \{0, \alpha^0, \alpha^1, \ldots, \alpha^{q-2}\}$
[Figure: the factor graph is augmented with permutation nodes between each variable node $c_j$ and the check node, carrying the messages $\mu_{v \to p}$, $\mu_{p \to v}$, $\mu_{p \to c}$, $\mu_{c \to p}$; the check node enforces $h_1 c_1 + h_2 c_2 + h_3 c_3 = 0$.]

Belief Propagation in the Probability Domain. Variable node equations. The code is defined from non-binary parity-check equations $\sum_{j=1}^{d_c} h_{ij} \cdot c_j = 0$ in $GF(q)$, with $GF(q) = \{0, \alpha^0, \alpha^1, \ldots, \alpha^{q-2}\}$. The variable node update is the term-by-term product of all incoming messages:
$\mu^{d_v}_{v \to p}[k] = \prod_{i=1}^{d_v - 1} \mu^i_{p \to v}[k], \quad k = 0, \ldots, q-1$
or in vector form: $\mu^{d_v}_{v \to p} = \mu^0 \odot \mu^1_{p \to v} \odot \cdots \odot \mu^{d_v - 1}_{p \to v}$.

Explaining the Permutation Step: cyclic permutation / rotation. The multiplicative group of GF(q) is cyclic; as such, multiplication by $h_{ij}$ acts on the nonzero symbols as a cyclic permutation of the field elements:
$\mu^i_{p \to c}[k'] = \mu^i_{v \to p}[k] \quad \text{with } \alpha^{k'} = h_{ij}\, \alpha^k, \quad k = 0, \ldots, q-1$
[Figure: GF(8) example of the internal operation $h_{ji} \cdot c_i$, showing how multiplication by a field element cyclically shifts the nonzero symbols.]
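
A small sketch of the permutation node for GF(8); the multiplication uses log/antilog tables for the primitive polynomial $x^3 + x + 1$, which is an assumption made only for this example:

```python
import numpy as np

# GF(8) exp/log tables for the primitive polynomial x^3 + x + 1 (assumed here).
EXP = [1, 2, 4, 3, 6, 7, 5]                 # EXP[k] = alpha^k as an integer (binary map)
LOG = {v: k for k, v in enumerate(EXP)}     # LOG[alpha^k] = k

def gf8_mul(a, b):
    """Multiplication in GF(8); 0 times anything is 0."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 7]

def permute_message(msg, h):
    """Permutation node: the probability of symbol c (binary-map index)
    is moved to the index of the symbol h*c."""
    out = np.zeros_like(msg)
    for c in range(8):
        out[gf8_mul(h, c)] = msg[c]
    return out

msg = np.arange(8, dtype=float) / 28.0      # toy message indexed by GF(8) symbols
print(permute_message(msg, h=3))            # multiplication by h = 3 permutes indices 1..7
```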

Belief Propagation in the Probability Domain. Check node equations. The code is defined from non-binary parity-check equations $\sum_{j=1}^{d_c} h_{ij} \cdot c_j = 0$ in $GF(q)$. The check node update is still a Bayesian marginalization. Case of $d_c = 3$ and GF(4), with the GF(4) addition table:
c1 \ c2 | 0 1 2 3
   0    | 0 1 2 3
   1    | 1 0 3 2
   2    | 2 3 0 1
   3    | 3 2 1 0
[Figure: check node $h_1 c_1 + h_2 c_2 + h_3 c_3$ with messages $\mu^1_{p \to c}$, $\mu^2_{p \to c}$, $\mu^3_{c \to p}$.]

Belief Propagation in the Probability Domain. Check node equations (cont'd). The check node update is still a Bayesian marginalization. Case of $d_c = 3$ and GF(4):
$\mu^3_{c \to p}[0] = \mu^1_{p \to c}[0]\mu^2_{p \to c}[0] + \mu^1_{p \to c}[1]\mu^2_{p \to c}[1] + \mu^1_{p \to c}[2]\mu^2_{p \to c}[2] + \mu^1_{p \to c}[3]\mu^2_{p \to c}[3]$
$\mu^3_{c \to p}[1] = \mu^1_{p \to c}[0]\mu^2_{p \to c}[1] + \mu^1_{p \to c}[1]\mu^2_{p \to c}[0] + \mu^1_{p \to c}[2]\mu^2_{p \to c}[3] + \mu^1_{p \to c}[3]\mu^2_{p \to c}[2]$
$\mu^3_{c \to p}[2] = \mu^1_{p \to c}[0]\mu^2_{p \to c}[2] + \mu^1_{p \to c}[2]\mu^2_{p \to c}[0] + \mu^1_{p \to c}[1]\mu^2_{p \to c}[3] + \mu^1_{p \to c}[3]\mu^2_{p \to c}[1]$
$\mu^3_{c \to p}[3] = \mu^1_{p \to c}[0]\mu^2_{p \to c}[3] + \mu^1_{p \to c}[3]\mu^2_{p \to c}[0] + \mu^1_{p \to c}[1]\mu^2_{p \to c}[2] + \mu^1_{p \to c}[2]\mu^2_{p \to c}[1]$
The number of terms in these equations grows as $q^2$.

Belief Propagation in the Probability Domain. How to simplify the checknode update? [Figure: the same degree-3 check node with messages $\mu^1_{p \to c}$, $\mu^2_{p \to c}$, $\mu^3_{c \to p}$.] Fourier?

Tensorial Notation of Messages. In the case of binary extension fields GF(2^p), the symbols $c \in GF(q)$ can be represented by a binary map, or a polynomial:
$c = [c_1, \ldots, c_p]$ with $\{c_1, \ldots, c_p\} \in \{0, 1\}$, or $c(x) = \sum_{i=1}^{p} c_i x^{i-1}$ with $\{c_1, \ldots, c_p\} \in \{0, 1\}$.
Let us put the probability weights $\mu(c = \alpha_k)$ in a size-2, p-dimensional tensor indexed by the binary values $\{c_1, \ldots, c_p\}$. For GF(4):
$C = \begin{bmatrix} \mathrm{Prob}(c(x) = 0) & \mathrm{Prob}(c(x) = 1) \\ \mathrm{Prob}(c(x) = x) & \mathrm{Prob}(c(x) = 1 + x) \end{bmatrix} = \begin{bmatrix} C[0,0] & C[1,0] \\ C[0,1] & C[1,1] \end{bmatrix}$
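
A small sketch of this tensor view, assuming the length-q message is indexed by the integer whose bits are the polynomial coefficients (least significant bit = $c_1$):

```python
import numpy as np

p = 3                                  # GF(8) = GF(2^3)
q = 2 ** p
msg = np.arange(q, dtype=float) / np.arange(q).sum()   # toy message, indexed 0..q-1

# Tensor view: integer index c1 + 2*c2 + 4*c3. With numpy's reshape, the first
# axis corresponds to the highest-weight bit, so C[c3, c2, c1] = Prob(c).
C = msg.reshape((2,) * p)
print(C[0, 1, 1])                      # probability of the symbol 1 + x (integer 3)
```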

Tensorial Notation of Messages: a GF(8) example. The length-8 message is stored in a 2×2×2 tensor $C[c_1, c_2, c_3]$ indexed by the coefficients of $c(x) = c_1 + c_2 x + c_3 x^2$:
$C[0,0,0] = \mathrm{Prob}(c(x)=0)$, $C[1,0,0] = \mathrm{Prob}(c(x)=1)$, $C[0,1,0] = \mathrm{Prob}(c(x)=x)$, $C[0,0,1] = \mathrm{Prob}(c(x)=x^2)$, $C[1,1,0] = \mathrm{Prob}(c(x)=1+x)$, $C[0,1,1] = \mathrm{Prob}(c(x)=x+x^2)$, $C[1,1,1] = \mathrm{Prob}(c(x)=1+x+x^2)$, $C[1,0,1] = \mathrm{Prob}(c(x)=1+x^2)$.


Fast Fourier Transform Applied to Tensors. Expression of the Fourier transform:
$\hat{C} = \mathcal{F}(C) = C \times_1 F \times_2 F \cdots \times_p F, \qquad F = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$
where $\times_k$ denotes the tensor product in the k-th dimension of the tensor $C(i_1, \ldots, i_p)$. In the k-th dimension, for $(i_1, \ldots, i_{k-1}, i_{k+1}, \ldots, i_p) \in \{0,1\}^{p-1}$:
$\hat{C}(i_1, \ldots, i_{k-1}, 0, i_{k+1}, \ldots, i_p) = C(i_1, \ldots, i_{k-1}, 0, i_{k+1}, \ldots, i_p) + C(i_1, \ldots, i_{k-1}, 1, i_{k+1}, \ldots, i_p)$
$\hat{C}(i_1, \ldots, i_{k-1}, 1, i_{k+1}, \ldots, i_p) = C(i_1, \ldots, i_{k-1}, 0, i_{k+1}, \ldots, i_p) - C(i_1, \ldots, i_{k-1}, 1, i_{k+1}, \ldots, i_p)$
For the Fourier transform in one dimension we perform $2^p = q$ operations, so the total number of operations for $\mathcal{F}(.)$ is $p \cdot 2^p = q \log(q)$ operations: Fast Fourier Transform.
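
A minimal sketch of this transform (the ±1 butterfly applied along each of the p tensor dimensions), checked against the brute-force GF(2^p) marginalization used earlier:

```python
import numpy as np
from itertools import product

def gf_fourier(msg, p):
    """Apply the butterfly of F = [[1, 1], [1, -1]] along each of the
    p dimensions of the message, seen as a (2, ..., 2) tensor."""
    C = msg.reshape((2,) * p).astype(float)
    for axis in range(p):
        a = np.take(C, 0, axis=axis)
        b = np.take(C, 1, axis=axis)
        C = np.stack([a + b, a - b], axis=axis)
    return C.reshape(-1)

def checknode_bruteforce(m1, m2, q):
    """Direct marginalization: out[k1 ^ k2] += m1[k1] * m2[k2] (GF(2^p) addition = XOR)."""
    out = np.zeros(q)
    for k1, k2 in product(range(q), repeat=2):
        out[k1 ^ k2] += m1[k1] * m2[k2]
    return out

p, q = 3, 8
rng = np.random.default_rng(1)
m1, m2 = rng.dirichlet(np.ones(q)), rng.dirichlet(np.ones(q))

# Convolution theorem: transform, multiply term by term, transform back (F^{-1} = F / q).
fast = gf_fourier(gf_fourier(m1, p) * gf_fourier(m2, p), p) / q
print(np.allclose(fast, checknode_bruteforce(m1, m2, q)))   # True
```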

Illustration of the FFT in Multiple Dimensions. [Figure: butterfly diagrams of the transform for GF(4) (2 dimensions) and GF(8) (3 dimensions).]

Belief Propagation Decoding Steps in the Fourier Domain. [Figure: message flow $V_{pv}$ / $U_{vp}$ at the information symbols (product), through the permutation nodes (permutation), the interleaver Π, the Fourier transforms F, and the Fourier product node (product), with messages $V_{cp}$ / $U_{pc}$ on the check side.]

Belief Propagation in the Log-Domain (1): recursive use of the max* operator. Quantization impacts on the performance are very strong in the probability domain. Define the log-domain messages:
$u(k) = \log \frac{\mu_{c \to p}[k]}{\mu_{c \to p}[0]}, \qquad v(k) = \log \frac{\mu_{p \to c}[k]}{\mu_{p \to c}[0]}, \qquad k = 0, \ldots, q-1$
Case of $d_c = 3$ and GF(4): the same four checknode marginalization equations as in the probability domain, now to be computed on log-domain messages.

Belief Propagation in the Log-Domain (2): Limitations. After some manipulations:
$u_3(1) = \max{}^*(v_1(1),\ v_2(1),\ v_1(2) + v_2(3),\ v_1(3) + v_2(2)) - K$
$u_3(2) = \max{}^*(v_1(2),\ v_2(2),\ v_1(1) + v_2(3),\ v_1(3) + v_2(1)) - K$
$u_3(3) = \max{}^*(v_1(3),\ v_2(3),\ v_1(1) + v_2(2),\ v_1(2) + v_2(1)) - K$
$K = \max{}^*(0,\ v_1(1) + v_2(1),\ v_1(2) + v_2(2),\ v_1(3) + v_2(3))$
The number of max* operators grows in $O(q^2)$. It is a recursive implementation: approximations (e.g. use of max instead of max*, small LUTs) rapidly become catastrophic. The log-domain implementation and the FFT complexity reduction $O(q^2) \to O(q \log(q))$ are not compatible.

Conclusion on Non-Binary Belief Propagation Decoding. The bottleneck of the decoder complexity is the check node update.
1. Belief Propagation in the Time/Probability Domain: [Davey, 1998] M. Davey and D.J.C. MacKay, "Low Density Parity Check Codes over GF(q)", IEEE Communications Letters, vol. 2, pp. 165-167, June 1998.
2. Belief Propagation in the Time/Log Domain (limited to GF(16)): [Wymeersch, 2004] H. Wymeersch, H. Steendam and M. Moeneclaey, "Log-Domain Decoding of LDPC Codes over GF(q)", Proc. IEEE ICC, Paris, France, June 2004.
3. Belief Propagation in the Frequency/Probability Domain: [Davey, 1998] (see above); [Barnault, 2003] L. Barnault and D. Declercq, "Fast Decoding Algorithm for LDPC Codes over GF(2^q)", Proc. IEEE Information Theory Workshop, Paris, France, March 2003.
4. Belief Propagation in the Frequency/Log Domain (partially; limited to GF(16)): [Song, 2003] H. Song and J.R. Cruz, "Reduced-Complexity Decoding of Q-ary LDPC Codes for Magnetic Recording", IEEE Transactions on Magnetics, vol. 39(2), March 2003.