Belief-Propagation Decoding of LDPC Codes
Amir Bennatan, Princeton University

LDPC Codes: Motivation

A revolution in coding theory: reliable transmission at rates approaching capacity.
- BIAWGN, rate 0.5: threshold within 0.45 dB of the Shannon limit.
- BIAWGN, rate 0.88: threshold within 0.88 dB of the Shannon limit.
- BSC, rate 0.5: threshold within 0.5 of the maximum crossover probability.
- BEC, any rate: achieves capacity!
Low-complexity decoding: belief propagation.

History

1963: Invented by Gallager.
1988: Belief propagation, by Pearl.
1993: Turbo codes (Berrou, Glavieux, Thitimajshima).
1996: Rediscovered (MacKay, Neal; Sipser, Spielman).

Discrete-time, memoryless channel

Discrete time instances 1, ..., N. Output y_n depends only on x_n.
Example: Binary Symmetric Channel (BSC)

Crossover probability p: Pr[Y_n = 1 | X_n = 0] = Pr[Y_n = 0 | X_n = 1] = p.

Decoding, no code case

Assume y_n = 1. Which x_n was transmitted? (Assume p = 1/4.)

Maximum likelihood rule:
Pr[X_n = 1 | Y_n = 1] = ?
Pr[X_n = 0 | Y_n = 1] = ?

Decoding, no code case

Pr[X_n = 1 | Y_n = 1] = Pr[Y_n = 1, X_n = 1] / Pr[Y_n = 1]
                      = Pr[Y_n = 1 | X_n = 1] Pr[X_n = 1] / Pr[Y_n = 1]
                      = Pr[Y_n = 1 | X_n = 1] (1/2) / Pr[Y_n = 1]

Assumption: equal a priori probabilities, Pr[X_n = 0] = Pr[X_n = 1] = 1/2.

Decoding, no code case

Assume y_n = 1. Which x_n was transmitted? (Assume p = 1/4.)
Pr[X_n = 1 | Y_n = 1] = 0.75
Pr[X_n = 0 | Y_n = 1] = 0.25
Decoder decides: x̂_n = 1
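The Bayes computation above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original slides; the function name `bsc_posterior` is an assumption, and it hard-codes the slides' equal-priors assumption Pr[X=0] = Pr[X=1] = 1/2.

```python
def bsc_posterior(y, p=0.25):
    """Return (Pr[X=0 | Y=y], Pr[X=1 | Y=y]) for a BSC with crossover p."""
    # Likelihoods: Pr[Y=y | X=x] is (1-p) when y == x, else p.
    lik0 = (1 - p) if y == 0 else p
    lik1 = (1 - p) if y == 1 else p
    # Equal priors 1/2, so Pr[Y=y] = (lik0 + lik1)/2 and the 1/2 cancels.
    total = lik0 + lik1
    return lik0 / total, lik1 / total

p0, p1 = bsc_posterior(1)   # received y_n = 1, p = 1/4
# p1 = 0.75, p0 = 0.25, so the decoder decides x_hat = 1
```

With p = 1/4 and y = 1 this reproduces the slides' values 0.75 and 0.25.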
Decoding, no code case

Maximum likelihood rule:
x̂_n = argmax_{d=0,1} Pr[X_n = d | Y_n = y_n]

Decoding, code case

Example: C = ..., y = ... . Let's focus on x_2. Which x_2 was transmitted?

Pr[X_2 = 1 | Y_2 = 1] = 0.75
Pr[X_2 = 0 | Y_2 = 1] = 0.25
Decoder decides: x̂_2 = 1. Bad!!!!!

Decoding, code case

Old decoding rule:
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}

Better decoding rule:
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, ..., Y_N = y_N,
                          [X_1, ..., X_N] is a codeword}
Decoding, code case

With the new decoding rule,
Pr[X_2 = 1 | Y = y, X is a codeword] = 0.0357   (was 0.75)
Pr[X_2 = 0 | Y = y, X is a codeword] = 0.9643   (was 0.25)
Decoder decides: x̂_2 = 0 (not 1).

Word error vs. bit error

Possibility 1: Minimize the probability of word error,
Pr[error] = Pr[x̂ ≠ x].
Possibility 2: At each bit n, minimize the probability of bit error,
Pr[error in bit n] = Pr[x̂_n ≠ x_n].
Our focus: bit error.

Decoding, code case

Old decoding rule:
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}
Bad performance, excellent complexity.

Better decoding rule:
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, ..., Y_N = y_N,
                          [X_1, ..., X_N] is a codeword}
Excellent performance, terrible complexity: Θ(2^{RN}) (R > 0 is the rate of the code).

Any compromise?
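The "better" rule can be sketched by brute force: sum the channel likelihood over the whole codebook, which is exactly where the Θ(2^{RN}) complexity comes from. The three-word repetition codebook and received word below are hypothetical stand-ins (the slides' code C is not reproduced here), and `bitwise_map` is an assumed helper name.

```python
def bitwise_map(code, y, n, p=0.25):
    """Pr[X_n = 1 | Y = y, X is a codeword]: sum over the whole codebook."""
    def likelihood(x):
        # Pr[Y = y | X = x] on a memoryless BSC with crossover p.
        prob = 1.0
        for xi, yi in zip(x, y):
            prob *= (1 - p) if xi == yi else p
        return prob
    total = sum(likelihood(x) for x in code)             # ~ 2^{RN} terms
    ones = sum(likelihood(x) for x in code if x[n] == 1)
    return ones / total

code = [(0, 0, 0), (1, 1, 1)]    # hypothetical codebook, NOT the slides' C
y = (1, 0, 0)
p1 = bitwise_map(code, y, 0)     # Pr[X_1 = 1 | Y = y, X is a codeword]
x_hat = 1 if p1 > 0.5 else 0     # the two 0s outvote the single 1
```

Even though y_1 = 1 alone favors x_1 = 1, conditioning on the codeword constraint flips the decision, mirroring the x̂_2 example above.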
Linear binary block codes

Parity check matrix: a binary M × N matrix H, with rows h_1, h_2, ..., h_M:

        [ h_1 ]   [ x_1 ]   [ 0 ]
    H x = [ h_2 ] · [ x_2 ] = [ 0 ]
        [ ... ]   [ ... ]   [...]
        [ h_M ]   [ x_N ]   [ 0 ]

x is a codeword  <=>  H·x = 0  <=>  h_m·x = 0 for m = 1, ..., M.

Each equation h_m·x = 0 is called a parity check.

Linear binary block codes

X is a codeword  <=>  H·X = 0  <=>  h_m·X = 0, m = 1, ..., M. Therefore

x̂_n = argmax_{d=0,1} Pr{X_n = d | Y = y, X is a codeword}
    = argmax_{d=0,1} Pr{X_n = d | Y = y, H·X = 0}
    = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, Y_2 = y_2, ..., Y_N = y_N,
                          h_1·X = 0, h_2·X = 0, ..., h_M·X = 0}

Decoding, code case

Old decoding rule:
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}

New decoding rule:
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_1 = y_1, Y_2 = y_2, ..., Y_N = y_N,
                          h_1·X = 0, h_2·X = 0, ..., h_M·X = 0}

Compromise: use some of the {y_n} and some of the parity checks!
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_{l_1} = y_{l_1}, ..., Y_{l_L} = y_{l_L},
                          h_{m_1}·X = 0, ..., h_{m_K}·X = 0}
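The codeword test "x is a codeword <=> H·x = 0" is easy to state in code. A minimal sketch: the two rows below encode the parity checks X_1 + X_2 + X_3 + X_6 = 0 and X_3 + X_4 + X_5 = 0 used as the running example later in the slides; the function name `is_codeword` is an assumption.

```python
def is_codeword(H, x):
    """True iff every parity check h_m . x = 0 holds over GF(2)."""
    return all(sum(h_i * x_i for h_i, x_i in zip(h, x)) % 2 == 0 for h in H)

H = [[1, 1, 1, 0, 0, 1],   # h_1: X_1 + X_2 + X_3 + X_6 = 0
     [0, 0, 1, 1, 1, 0]]   # h_2: X_3 + X_4 + X_5 = 0

is_codeword(H, [1, 1, 0, 0, 0, 0])   # both checks satisfied -> True
is_codeword(H, [1, 0, 0, 0, 0, 0])   # violates h_1 -> False
```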
Compromise: Iterative Decoding

1. Start with the old decoding rule,
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_n = y_n}

2. Iteratively add more h's and y's,
x̂_n = argmax_{d=0,1} Pr{X_n = d | Y_{l_1} = y_{l_1}, ..., Y_{l_L} = y_{l_L},
                          h_{m_1}·X = 0, ..., h_{m_K}·X = 0}

How? Belief propagation.

Some formal stuff...

Let w = [w_1, ..., w_N], and assume w ∉ C.
Pr[X = w] = ?

Answer (the formal probability model):
Pr[X = w] = (1/2)^N
Pr[X = w | X is a codeword] = 0

Properties of the formal probability model:
1. Assumes no code
2. Valid mathematically
3. Non-restrictive
4. We can express other useful values

Concepts of belief propagation:
1. Graph based
2. Beliefs
3. Iterative message passing
4. Extrinsic information rule
5. Ignore loops
Graph based

Variable node #n corresponds:
- to time slot n,
- to the unknown code bit X_n,
- to the received channel output y_n.

Check node #m corresponds:
- to the parity check h_m.

Parity check

h_1·X = X_1 + X_2 + X_3 + X_6 = 0

Each check node is connected to the variables participating in its parity check.
Belief Propagation

The knowledge ("beliefs") we have:
Y_1 = y_1, Y_2 = y_2, ..., Y_N = y_N
h_1·X = 0, h_2·X = 0, ..., h_M·X = 0
Divide it between the nodes.

Variable nodes know the channel outputs: variable n knows the value of y_n.
Check nodes know the parity checks: check m knows that h_m·X = 0.
Iterative message passing

Nodes communicate using messages, sent through the edges to neighboring nodes.
Each message is a number m ∈ [0, 1].

Message from variable n to check m:
V_{n→m} = Pr[X_n = 1 | some h's and some y's]

Message from check m to variable n:
C_{m→n} = Pr[X_n = 1 | other h's and other y's]

Rightbound and leftbound iterations:
Rightbound iteration: variables send messages to checks.
Leftbound iteration: checks send messages to variables.
Iterative message passing

At each node:
1. Collect all incoming messages from the previous iteration.
2. Add my knowledge.

Rightbound iteration #1: At variable n,
1. Collect all incoming messages from the previous iteration (none).
2. Add my knowledge (channel output y_n).
V_{n→m} = Pr[X_n = 1 | Y_n = y_n]

Leftbound iteration #1: At check node 1, to variable 3,
1. Collect all incoming messages (V_1, V_2, V_3, V_6).
2. Add my knowledge (parity check X_1 + X_2 + X_3 + X_6 = 0).
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0,
              Y_1 = y_1, Y_2 = y_2, Y_3 = y_3, Y_6 = y_6]

Extrinsic information rule: a message to a node is never a function of the
message from that node.
Iterative message passing

Leftbound iteration #1: At check node 1, to variable 3,
1. Collect all incoming messages (V_1, V_2, V_6) — not V_3, by the extrinsic rule.
2. Add my knowledge (parity check X_1 + X_2 + X_3 + X_6 = 0).
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0,
              Y_1 = y_1, Y_2 = y_2, Y_6 = y_6] = ?

Some formal stuff...

Under the formal probability model, disjoint groups of variables are independent:
Pr[X_1 = 1, X_2 = 1] = Pr[X_1 = 1] · Pr[X_2 = 1]
Pr[X_1 = 1, Y_2 = 1, X_2 = 1] = Pr[X_1 = 1] · Pr[Y_2 = 1, X_2 = 1]
Pr[X_1 + X_3 = 1, Y_3 = 1, X_2 + X_4 = 1, Y_4 = 1]
    = Pr[X_1 + X_3 = 1, Y_3 = 1] · Pr[X_2 + X_4 = 1, Y_4 = 1]

Iterative message passing

Leftbound iteration #1: At check node 1, to variable 3,
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0,
              Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
        = 1/2 - (1/2) ∏_{i=1,2,6} (1 - 2V_i)
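The check-node formula above translates directly into code: if each incoming V_i = Pr[X_i = 1 | ...] and the incoming bits are independent, the probability that their mod-2 sum equals 1 is 1/2 - (1/2)∏(1 - 2V_i). A minimal sketch; the function name `check_update` is an assumption.

```python
def check_update(vs):
    """C = Pr[X_n = 1 | parity check holds], from incoming messages vs."""
    prod = 1.0
    for v in vs:
        prod *= 1 - 2 * v
    return 0.5 - 0.5 * prod

# Sanity checks: one certain input passes through; 0.5 absorbs everything.
check_update([1.0])               # -> 1.0
check_update([0.5, 0.9])          # -> 0.5
check_update([0.25, 0.25, 0.25])  # -> 0.4375
```

The last line reproduces the structure of C_{1→3} with three incoming messages.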
Iterative message passing

Rightbound iteration #2: From variable 3, to check node 3,
1. Collect all incoming messages (C_{1→3}, C_{2→3}).
2. Add my knowledge (channel output y_3).
V_{3→3} = Pr[X_3 = 1 | some knowledge]

Leftbound iteration #1: At check node 2, to variable 3,
1. Collect all incoming messages (V_{4→2}, V_{5→2}) — not V_{3→2}, by the extrinsic rule.
2. Add my knowledge (parity check X_3 + X_4 + X_5 = 0).
C_{2→3} = Pr[X_3 = 1 | X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
        = 1/2 - (1/2) ∏_{i=4,5} (1 - 2V_{i→2})

Notation:

V_{3→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6,
              X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5, Y_3 = y_3]
C_{1→3} = Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
C_{2→3} = Pr[X_3 = 1 | X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]

Writing Y^(1) = (Y_1, Y_2, Y_6) and Y^(2) = (Y_4, Y_5) for the channel outputs
seen by checks 1 and 2,
V_{3→3} = Pr[X_3 = 1 | h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]
Iterative message passing

Rightbound iteration #2: From variable 3, to check node 3,
1. Collect all incoming messages (C_{1→3}, C_{2→3}).
2. Add my knowledge (channel output y_3).
V_{3→3} = Pr[X_3 = 1 | h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]

By Bayes' rule,
Pr[X_3 = 1 | h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]
  = Pr[X_3 = 1, h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]
    / Pr[h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]

For the numerator, substitute X_3 = 1 into both parity checks:
Pr[X_3 = 1, h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]
  = Pr[X_3 = 1, Y_3 = y_3,
       X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6,
       X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
  = Pr[X_3 = 1, Y_3 = y_3,
       X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6,
       1 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
  = Pr[X_3 = 1, Y_3 = y_3]
    · Pr[X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
    · Pr[1 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
(the last step uses the independence of the three disjoint groups of variables).
Rewriting each factor as a conditional probability,
Pr[X_3 = 1, h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]
  = Pr[X_3 = 1 | Y_3 = y_3]
    · Pr[X_3 = 1 | X_1 + X_2 + X_3 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
    · Pr[X_3 = 1 | X_3 + X_4 + X_5 = 0, Y_4 = y_4, Y_5 = y_5]
    · fun(y_1, y_2, y_6, y_4, y_5)
  = P_3 · C_{1→3} · C_{2→3} · fun(y_1, y_2, y_6, y_4, y_5)
where P_3 = Pr[X_3 = 1 | Y_3 = y_3] and fun(·) collects terms that do not
depend on the value of X_3.

Dividing by the denominator (the same fun(·) appears in both and cancels),
V_{3→3} = Pr[X_3 = 1 | h_1·X = 0, Y^(1) = y^(1), h_2·X = 0, Y^(2) = y^(2), Y_3 = y_3]
        = P_3 ∏_{i=1,2} C_{i→3}
          / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )

Iterative message passing

Rightbound iteration #2: From variable 3, to check 3,
1. Collect all incoming messages (C_{1→3}, C_{2→3}).
2. Add my knowledge (channel output y_3).
V_{3→3} = P_3 ∏_{i=1,2} C_{i→3}
          / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )
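The variable-node formula just derived can also be sketched in code: combine the channel posterior P_n with the incoming check messages, assuming independence. The function name `variable_update` is an assumption.

```python
def variable_update(P, cs):
    """V = Pr[X_n = 1 | y_n and the incoming check messages cs]."""
    num = P        # accumulates P_n * prod(C_i)
    den = 1 - P    # accumulates (1 - P_n) * prod(1 - C_i)
    for c in cs:
        num *= c
        den *= 1 - c
    return num / (num + den)

# Sanity checks: no incoming messages returns the channel posterior;
# an uninformative channel (P = 0.5) returns the single check message.
variable_update(0.75, [])     # -> 0.75
variable_update(0.5, [0.9])   # -> 0.9
```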
Let's change the problem...

Suppose check 2 is replaced by X_3 + X_2 + X_5 = 0, so that X_2 participates
in both checks. The information flow graph now contains a loop!

The factorization step breaks down:
Pr[X_3 = 1, h_1·X = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6, h_2·X = 0, Y_5 = y_5, Y_3 = y_3]
  = Pr[X_3 = 1, Y_3 = y_3,
       X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6,
       1 + X_2 + X_5 = 0, Y_2 = y_2, Y_5 = y_5]
  ≠ Pr[X_3 = 1, Y_3 = y_3]
    · Pr[X_1 + X_2 + 1 + X_6 = 0, Y_1 = y_1, Y_2 = y_2, Y_6 = y_6]
    · Pr[1 + X_2 + X_5 = 0, Y_2 = y_2, Y_5 = y_5]
because the two groups share X_2 (and Y_2), so they are no longer independent.
Consequently,
Pr[X_3 = 1 | h_1·X = 0, h_2·X = 0, Y_1 = y_1, ..., Y_3 = y_3]
  ≠ P_3 ∏_{i=1,2} C_{i→3}
    / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )
What to do?
Iterative message passing

Rightbound iteration #2: From variable 3, to check 3,
1. Collect all incoming messages (C_{1→3}, C_{2→3}).
2. Add my knowledge (channel output y_3).
V_{3→3} = P_3 ∏_{i=1,2} C_{i→3}
          / ( P_3 ∏_{i=1,2} C_{i→3} + (1 - P_3) ∏_{i=1,2} (1 - C_{i→3}) )
Ignore the loop: compute V_{3→3} as if there were no loop!

Ignoring loops

Why is ignoring loops okay?
- The number of loops is small.
- Simulation results are okay even when some loops exist.

Low-density parity checks: d << N, graph randomly generated.
d = average check degree
N = block length

Belief Propagation Algorithm

Rightbound iteration #t: At variable node n,
V_{n→m} = P_n ∏_{i ∈ A(n)\{m}} C_{i→n}
          / ( P_n ∏_{i ∈ A(n)\{m}} C_{i→n} + (1 - P_n) ∏_{i ∈ A(n)\{m}} (1 - C_{i→n}) )

Leftbound iteration #t: At check node m,
C_{m→n} = 1/2 - (1/2) ∏_{i ∈ A(m)\{n}} (1 - 2V_{i→m})

where A(n), A(m) are the sets of nodes adjacent to n and m.
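The two update rules combine into a complete decoder. Below is a minimal sketch, assuming the parity checks are given as lists of variable indices and the channel posteriors P_n = Pr[X_n = 1 | Y_n = y_n] are precomputed; `decode_bp`, the flat-dictionary message storage, and the fixed iteration count are illustrative choices, not the slides' exact formulation.

```python
def decode_bp(H, P, iters=10):
    """Belief-propagation decoding. H: list of checks, each a list of
    variable indices. P[n] = Pr[X_n = 1 | Y_n = y_n]. Returns hard bits."""
    M, N = len(H), len(P)
    # Rightbound iteration #1: variables send their channel posteriors.
    V = {(n, m): P[n] for m in range(M) for n in H[m]}
    C = {}
    for _ in range(iters):
        # Leftbound: check m to variable n, extrinsic (exclude V from n).
        for m in range(M):
            for n in H[m]:
                prod = 1.0
                for i in H[m]:
                    if i != n:
                        prod *= 1 - 2 * V[(i, m)]
                C[(m, n)] = 0.5 - 0.5 * prod
        # Rightbound: variable n to check m, extrinsic (exclude C from m).
        for m in range(M):
            for n in H[m]:
                num, den = P[n], 1 - P[n]
                for j in range(M):
                    if j != m and n in H[j]:
                        num *= C[(j, n)]
                        den *= 1 - C[(j, n)]
                V[(n, m)] = num / (num + den)
    # Final beliefs: combine P[n] with ALL incoming check messages.
    bits = []
    for n in range(N):
        num, den = P[n], 1 - P[n]
        for m in range(M):
            if n in H[m]:
                num *= C[(m, n)]
                den *= 1 - C[(m, n)]
        bits.append(1 if num > den else 0)
    return bits

# The two checks of the running example, 0-indexed:
# X_1+X_2+X_3+X_6 = 0 and X_3+X_4+X_5 = 0.
H = [[0, 1, 2, 5], [2, 3, 4]]
# BSC posteriors (p = 1/4) for a received word matching codeword 110000.
decode_bp(H, [0.75, 0.75, 0.25, 0.25, 0.25, 0.25])   # -> [1, 1, 0, 0, 0, 0]
```

This graph is loop-free, so here BP computes the exact bitwise posteriors; on a loopy LDPC graph the same schedule is run while ignoring the loops, as argued above.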