The Concept of Soft Channel Encoding and its Applications in Wireless Relay Networks
Gerald Matz
Institute of Telecommunications, Vienna University of Technology
Acknowledgements
People: Andreas Winkelbauer, Clemens Novak, Stefan Schwandter
Funding: SISE - Information Networks (FWF Grant S10606)
Outline
1. Introduction
2. Soft channel encoding
3. Approximations
4. Applications
5. Conclusions and outlook
Introduction: Basic Idea
Soft information processing
- popular in modern receivers
- idea: use soft information also for transmission
Soft-input soft-output (SISO) encoding
- extension of hard-input hard-output (HIHO) encoding
- data only known in terms of probabilities
- how to (efficiently) encode noisy data?
Applications
- physical layer network coding
- distributed (turbo) coding
- joint source-channel coding
Introduction: Motivation (1)
Example: relay channel
[Figure: source S transmits x_S; relay R decodes the noisy S→R data ũ, interleaves (π), re-encodes, and transmits x_R; destination D receives both]
- relay R assists the S→D transmission
- decode-and-forward yields distributed channel coding
- D can perform iterative (turbo) decoding
What if the relay fails to decode?
- forwarding soft information might help
- how to transmit soft information?
Introduction: Motivation (2)
Soft information forwarding
- H. Sneessens and L. Vandendorpe, "Soft decode and forward improves cooperative communications," in Proc. Workshop on Computational Advances in Multi-Sensor Adaptive Processing, 2005.
- P. Weitkemper, D. Wübben, V. Kühn, and K.-D. Kammeyer, "Soft information relaying for wireless networks with error-prone source-relay link," in Proc. ITG Conference on Source and Channel Coding, 2008.
Transmission of soft information
- LLR quantization via the information bottleneck
  N. Tishby, F. Pereira, and W. Bialek, "The information bottleneck method," in Proc. 37th Allerton Conference on Communication and Computation, 1999.
- quantizer labels & symbol mapping via binary switching
  K. Zeger and A. Gersho, "Pseudo-Gray coding," IEEE Trans. Comm., vol. 38, no. 12, 1990.
Hard Encoding/Decoding Revisited
Notation
- information bit sequence: u = (u_1 ... u_K)^T ∈ {0, 1}^K
- code bit sequence: c = (c_1 ... c_N)^T ∈ {0, 1}^N, with N ≥ K
Linear binary channel code
- one-to-one mapping φ between data bits u and code bits c
- encoding: c = φ(u)
- codebook: C = φ({0, 1}^K)
Hard decoding
- observed code bit sequence c' = c ⊕ e = φ(u) ⊕ e
- decoder: mapping ψ such that ψ(c') is close to u
Soft Decoding Revisited
Word-level
- observations: code bit sequence probabilities p_in(c')
- enforce the code constraint:
  p'(c) = p_in(c) / Σ_(c' ∈ C) p_in(c') for c ∈ C, and p'(c) = 0 else
- info bit sequence probabilities: p_out(u) = p'(φ(u))
- conceptually simple, computationally infeasible
Bit-level
- observed: code bit probabilities p_in(c_n) = Σ_(~c_n) p_in(c') (marginalizing over all bits except c_n)
- desired: info bit probabilities p_out(u_k) = Σ_(~u_k) p_out(φ(u))
- conceptually more difficult, computationally feasible
- example: equivalent to BCJR for convolutional codes
Soft Encoding: Basics
Word-level
- given: info bit sequence probabilities p_in(u)
- code bit sequence probabilities:
  p_out(c) = p_in(φ^(-1)(c)) for c ∈ C, and p_out(c) = 0 else
- conceptually simple, computationally infeasible
Bit-level
- given: info bit probabilities p_in(u_k), with p_in(u) = Π_(k=1)^K p_in(u_k)
- desired: code bit probabilities p_out(c_n) = Σ_(~c_n) p_out(c) = Σ_(c ∈ C with c_n fixed) p_in(φ^(-1)(c))
Main question: efficient implementation?
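The bit-level marginalization above can be made concrete by brute-force enumeration of the codebook. The following sketch (illustrative, not from the talk; exponential in K, so only for tiny codes) computes p_out(c_n = 0) from independent bit priors p_in(u_k = 0):

```python
from itertools import product

def soft_encode_bruteforce(G, p_in):
    """Bit-level soft encoding by enumerating all 2^K info words.

    G    : list of N generator-matrix rows, each a length-K list over {0, 1}
    p_in : p_in(u_k = 0) for each info bit (bits assumed independent)
    Returns p_out(c_n = 0) for each code bit.  Exponential in K -- demo only.
    """
    K, N = len(p_in), len(G)
    p_out0 = [0.0] * N
    for u in product((0, 1), repeat=K):
        # independent prior: p_in(u) = prod_k p_in(u_k)
        pu = 1.0
        for k, bit in enumerate(u):
            pu *= p_in[k] if bit == 0 else 1.0 - p_in[k]
        # HIHO encoding c = Gu over F_2
        c = [sum(row[k] * u[k] for k in range(K)) % 2 for row in G]
        for n in range(N):
            if c[n] == 0:
                p_out0[n] += pu
    return p_out0
```

For a parity bit c_n = u_k ⊕ u_l this reproduces the XOR statistics, e.g. p_out(c_n = 0) = p(u_k = 0)p(u_l = 0) + p(u_k = 1)p(u_l = 1).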
Soft Encoding: LLRs
System model: noisy binary source → L(u) → SISO channel encoder → L(c)
Log-likelihood ratios (LLRs)
- definition: L(u_k) = log[p_in(u_k = 0) / p_in(u_k = 1)], L(c_n) = log[p_out(c_n = 0) / p_out(c_n = 1)]
- encoder input: L(u) = (L(u_1) ... L(u_K))^T
- encoder output: L(c) = (L(c_1) ... L(c_N))^T
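For reference, a minimal sketch of this LLR definition and its inverse (function names are illustrative, not from the talk):

```python
import math

def p0_to_llr(p0):
    """L = log(p(0) / p(1)) with p(1) = 1 - p(0)."""
    return math.log(p0 / (1.0 - p0))

def llr_to_p0(L):
    """Inverse map: p(0) = 1 / (1 + e^(-L))."""
    return 1.0 / (1.0 + math.exp(-L))
```

Positive LLRs favor bit 0, negative LLRs favor bit 1, and L = 0 means complete uncertainty.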
HIHO versus SISO
- HIHO encoding is reversible; SISO encoding is irreversible
  - except if |L(u_k)| = ∞ for all k
  - however, sign information is preserved: sign(L_1(u_k)) = sign(L_2(u_k))
- post-sliced SISO is identical to pre-sliced HIHO:
[Diagram: L(u) → SISO encoder → L(c) → slicer → ĉ, equal to L(u) → slicer → û → HIHO encoder → ĉ]
Block Codes (1)
Binary (N, K) block code C with generator matrix G ∈ F_2^(N×K)
- HIHO encoding: c = Gu, involving XOR (modulo-2 sums)
Statistics of the XOR:
- p(u_k ⊕ u_l = 0) = p(u_k = 0) p(u_l = 0) + p(u_k = 1) p(u_l = 1)
Boxplus:
- L(u_k ⊕ u_l) = L(u_k) ⊞ L(u_l) = log[(1 + e^(L(u_k)+L(u_l))) / (e^(L(u_k)) + e^(L(u_l)))]
- ⊞ is associative and commutative
- |L_1 ⊞ L_2| ≤ min{|L_1|, |L_2|}
J. Hagenauer, E. Offer, and L. Papke, "Iterative decoding of binary block and convolutional codes," IEEE Trans. Inf. Theory, vol. 42, no. 2, Feb. 1996.
SISO encoder: replace ⊕ in the HIHO encoder by ⊞
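A possible implementation of the boxplus operator, written in the numerically stable min-sum-plus-corrections form rather than by direct evaluation (a hypothetical helper, not code from the talk):

```python
import math

def boxplus(a, b):
    """a [+] b = log((1 + e^(a+b)) / (e^a + e^b)),
    evaluated as sign(a)sign(b)min(|a|,|b|) plus two bounded correction terms."""
    sign = math.copysign(1.0, a) * math.copysign(1.0, b)
    return (sign * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))
```

Because ⊞ is associative and commutative, chains of XORs translate into folds of this two-argument operator in any order.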
Block Codes (2)
Example: systematic (5, 3) block code, c = Gu:
  [c_1]   [1 0 0]           [u_1      ]
  [c_2]   [0 1 0]   [u_1]   [u_2      ]
  [c_3] = [0 0 1] · [u_2] = [u_3      ]
  [c_4]   [1 1 0]   [u_3]   [u_1 ⊕ u_2]
  [c_5]   [0 1 1]           [u_2 ⊕ u_3]
SISO version: L(c_1) = L(u_1), L(c_2) = L(u_2), L(c_3) = L(u_3), L(c_4) = L(u_1) ⊞ L(u_2), L(c_5) = L(u_2) ⊞ L(u_3)
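The SISO version of this example simply applies boxplus where the HIHO encoder applies XOR; a sketch (function names are illustrative):

```python
import math

def boxplus(a, b):
    # stable form of log((1 + e^(a+b)) / (e^a + e^b))
    sign = math.copysign(1.0, a) * math.copysign(1.0, b)
    return (sign * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def siso_encode_53(L_u):
    """SISO encoder for the systematic (5, 3) example: the HIHO parities
    c4 = u1 + u2 and c5 = u2 + u3 (mod 2) become boxplus operations."""
    L1, L2, L3 = L_u
    return [L1, L2, L3, boxplus(L1, L2), boxplus(L2, L3)]
```

For near-hard inputs (large |LLR|), slicing the output reproduces the HIHO codeword, consistent with the post-slicing equivalence.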
Soft Encoding: Convolutional Codes (1)
Code C with given trellis: use the BCJR algorithm
[Trellis figure: states m = (m^(1), m^(0)) ∈ {0, 1, 2, 3}; forward metrics α_k(m), backward metrics β_k(m); time steps k = 0, 1, 2, 3, ..., T-3, T-2, T-1, T]
- γ_k(m', m) = P{S_(k+1) = m | S_k = m'}
- α_k(m) = Σ_(m'=0)^(M-1) α_(k-1)(m') γ_(k-1)(m', m)
- β_k(m) = Σ_(m'=0)^(M-1) β_(k+1)(m') γ_k(m, m')
- p_b(c_k^(j)) = Σ_((m',m) ∈ A_b^(j)) α_k(m') γ_k(m', m) β_(k+1)(m)
A. Winkelbauer and G. Matz, "On efficient soft-input soft-output encoding of convolutional codes," in Proc. ICASSP 2011.
Soft Encoding: Convolutional Codes (2)
Observe: only 2 transition probabilities per time instant
- the backward recursion β_k(m) is rendered superfluous
[Trellis figure as before, with the backward metrics removed]
Simplified forward recursion encoder (FRE) reduces
- computational complexity
- memory requirements
- encoding delay
Recursions:
- s_(k+1)(m) = Σ_(m' ∈ B_m) s_k(m') γ_k(m', m)
- p_b(c_k^(j)) = Σ_((m',m) ∈ A_b^(j)) s_k(m') γ_k(m', m)
Soft Encoding: Convolutional Codes (3)
Special case: non-recursive shift register encoder (SRE)
- soft encoder with shift register implementation
- linear complexity with minimal memory requirements
Example: (7, 5)_8 convolutional code
[Figure: shift register with two delay elements D; input L(u_k), outputs L(c_2k) and L(c_2k+1)]
[Plot: operations per unit time (0-1000) vs. constraint length ν (2-10) for BCJR, FRE, and SRE (upper bound)]
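A minimal sketch of a soft shift-register encoder for the non-recursive (7, 5)_8 code, assuming the usual generator convention c_2k = u_k ⊕ u_(k-1) ⊕ u_(k-2) and c_2k+1 = u_k ⊕ u_(k-2), with every XOR replaced by boxplus and the register initialized to perfectly known zero bits (LLR = +∞); names are illustrative:

```python
import math

def boxplus(a, b):
    # stable form of log((1 + e^(a+b)) / (e^a + e^b))
    sign = math.copysign(1.0, a) * math.copysign(1.0, b)
    return (sign * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def soft_sre_75(L_u):
    """Soft shift-register encoder for the non-recursive (7,5)_8 code.
    Two delay elements hold the LLRs of the two previous info bits;
    they start as known zeros, i.e. LLR = +inf."""
    d1 = d2 = math.inf
    L_c = []
    for L in L_u:
        L_c.append(boxplus(boxplus(L, d1), d2))  # L(c_2k)
        L_c.append(boxplus(L, d2))               # L(c_2k+1)
        d1, d2 = L, d1                           # shift the register
    return L_c
```

The per-step cost is a constant number of boxplus operations, which is what gives the SRE its linear complexity.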
Soft Encoding: LDPC Codes (1)
Sparse parity-check matrix H given: v ∈ C iff H^T v = 0
Graphical representation of H: Tanner graph
- bipartite graph with variable nodes and check nodes
- let V denote the set of variable nodes
Encoding: iterative erasure filling
- matrix multiplication Gu is infeasible for large block lengths
- the erasure pattern is known
Soft Encoding: First Approach
Consider a systematic code: c = (u^T p^T)^T
Erasure channel view
- the deterministic erasure channel maps (L(u)^T L(p)^T)^T to (L(u)^T 0^T)^T
- SISO encoding = decoding for the erasure channel (erasure filling)
- or, equivalently, decoding without channel observation but with prior soft information on u
Exploiting the special problem structure
- yields an efficient implementation
- (much) less complex than SISO decoding
- adjustable accuracy/complexity trade-off
Soft Encoding: LDPC Codes (2)
Iterative erasure filling algorithm
1. find all check nodes involving a single erasure
2. fill the erasures found in step 1
3. repeat steps 1-2 until there are no more (recoverable) erasures
[Encoding example: Tanner graph animation]
Problem: stopping sets → non-recoverable erasures
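The three steps above can be sketched as follows (a toy implementation; the data layout and the use of the min-sum approximation of boxplus are assumptions, not the talk's implementation):

```python
import math

def erasure_fill(H, L):
    """Iterative erasure filling on a Tanner graph.

    H : list of checks, each a list of variable-node indices
    L : list of LLRs, with None marking erased (parity) positions
    A check with exactly one erased neighbor determines that value as the
    boxplus of the remaining LLRs; the min-sum approximation is used here.
    Non-recoverable erasures (stopping sets) remain None.
    """
    L = list(L)
    progress = True
    while progress:
        progress = False
        for check in H:
            erased = [v for v in check if L[v] is None]
            if len(erased) == 1:
                others = [L[v] for v in check if L[v] is not None]
                sign = math.prod(math.copysign(1.0, x) for x in others)
                L[erased[0]] = sign * min(abs(x) for x in others)
                progress = True
    return L
```

If every remaining check has two or more erased neighbors, the loop stops making progress, which is exactly the stopping-set situation on the next slide.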
Soft Encoding: LDPC Codes (3)
Definition: S ⊆ V is a stopping set if all neighbors of S are connected to S at least twice
[Example: Tanner graph with a stopping set highlighted]
Solution: modify H such that stopping sets vanish
- constraints: the code C remains unchanged and H remains sparse
M. Shaqfeh and N. Görtz, "Systematic Modification of Parity-Check Matrices for Efficient Encoding of LDPC Codes," in Proc. ICC 2007.
Approximations: Boxplus Operator
Boxplus operator is used frequently in SISO encoding
- recall: a ⊞ b = log[(1 + e^(a+b)) / (e^a + e^b)]
Min-sum approximation: a ⊞ b ≈ sign(a) sign(b) min(|a|, |b|)
- small error if |a| and |b| are well separated
- overestimates the true result: |sign(a) sign(b) min(|a|, |b|)| ≥ |a ⊞ b|
- suitable for hardware implementation
Correction terms
- additive: a ⊞ b = sign(a) sign(b) min(|a|, |b|) + log(1 + e^(-|a+b|)) - log(1 + e^(-|a-b|)), where each correction term lies in [0, log 2]
- multiplicative: a ⊞ b = sign(a) sign(b) min(|a|, |b|) · [1 - (1 / min(|a|, |b|)) log((1 + e^(-|a-b|)) / (1 + e^(-|a+b|)))], where the bracketed factor lies in [0, 1]
- store the correction term in a (small) lookup table
- decrease the lookup table size by LLR clipping
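A sketch of the min-sum approximation with the additive correction stored in a small lookup table; the step size and clipping value are illustrative choices, not taken from the talk:

```python
import math

def boxplus_exact(a, b):
    # reference: stable form of log((1 + e^(a+b)) / (e^a + e^b))
    sign = math.copysign(1.0, a) * math.copysign(1.0, b)
    return (sign * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def boxplus_minsum(a, b):
    """Hardware-friendly approximation: sign(a) sign(b) min(|a|, |b|)."""
    return math.copysign(1.0, a) * math.copysign(1.0, b) * min(abs(a), abs(b))

# Additive correction log(1 + e^(-x)) tabulated on a coarse grid,
# with LLR clipping at 8.0 to keep the table small.
STEP, CLIP = 0.25, 8.0
TABLE = [math.log1p(math.exp(-i * STEP)) for i in range(int(CLIP / STEP) + 1)]

def corr(x):
    # nearest-grid-point lookup; beyond the clip the correction is ~0
    return TABLE[min(int(abs(x) / STEP + 0.5), len(TABLE) - 1)]

def boxplus_corrected(a, b):
    return boxplus_minsum(a, b) + corr(a + b) - corr(a - b)
```

Since each correction term is at most log 2, a coarse table with clipping already recovers most of the min-sum loss.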
Approximations: Max-log Approximation
FRE: perform computations in the log domain (f̄ = log f)
- s̄_(k+1)(m) = log Σ_(m' ∈ B_m) exp(s̄_k(m') + γ̄_k(m', m))
- p̄_b(c_k^(j)) = log Σ_((m',m) ∈ A_b^(j)) exp(s̄_k(m') + γ̄_k(m', m))
Approximation: log Σ_k exp(a_k) ≈ max_k a_k = a_M
Correction term: log(e^a + e^b) = max(a, b) + log(1 + e^(-|a-b|))
- nesting yields log Σ_k exp(a_k) = a_M + log Σ_k exp(a_k - a_M)
- the correction term depends only on |a - b| → lookup table
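The Jacobian logarithm and its nesting can be sketched as follows (function names are illustrative):

```python
import math

def maxstar(a, b):
    """Jacobian logarithm: log(e^a + e^b) = max(a, b) + log(1 + e^-|a-b|).
    Dropping the correction term gives the max-log approximation."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def log_sum_exp(values):
    """Nest maxstar pairwise; the result is exact up to rounding."""
    acc = values[0]
    for v in values[1:]:
        acc = maxstar(acc, v)
    return acc
```

Because the correction depends only on |a - b|, one small one-dimensional lookup table suffices in fixed-point implementations.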
Applications: Soft Re-encoding (1)
Soft network coding (NC) for the two-way relay channel
- users A and B exchange independent messages
- relay R performs network coding with soft re-encoding
[Figure: multiple-access phase (A and B transmit x_A, x_B to R) followed by broadcast phase (R transmits x_R to A and B)]
[Relay block diagram: y_AR and y_BR → demapper → SISO decoder → interleaver π → SISO encoder → puncturer → network encoder → quantizer Q(·) → modulator → x_R]
Transmission of quantized soft information is critical
A. Winkelbauer and G. Matz, "Soft-Information-Based Joint Network-Channel Coding for the Two-Way Relay Channel," submitted to NETCOD 2011.
Applications: Soft Re-encoding (2)
BER simulation results
- symmetric channel conditions, R halfway between A and B
- rate 1 bpcu, 256 info bits, 4-level quantization, 1 decoder iteration
[Plot: BER vs. γ_AB (dB) for PCCC (5 iterations), hard NC, and soft NC]
SNR gain of 4.5 dB at BER 10^-3 over hard NC
Applications: Convolutional Network Coding
Physical layer NC for the multiple-access relay channel
[Figure: sources A and B transmit x_A and x_B; relay R transmits x_R; destination D receives all three]
NC at the relay
- hard NC: XOR of the hard decisions, ĉ_R = ĉ_A ⊕ ĉ_B
- soft NC: boxplus of the LLRs, L_R = L(c_A) ⊞ L(c_B)
- convolutional NC: SISO encoding of the LLRs L(c_A), L(c_B) yields L_R
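The difference between the first two relay options can be sketched as a toy comparison (illustrative code, not the paper's implementation): hard NC combines decisions, soft NC combines LLRs.

```python
import math

def boxplus(a, b):
    # stable form of log((1 + e^(a+b)) / (e^a + e^b))
    sign = math.copysign(1.0, a) * math.copysign(1.0, b)
    return (sign * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def hard_nc(c_a, c_b):
    """Hard NC: bitwise XOR of the relay's hard decisions."""
    return [a ^ b for a, b in zip(c_a, c_b)]

def soft_nc(L_a, L_b):
    """Soft NC: boxplus of the per-bit LLRs, so the relay's
    uncertainty about each bit is carried along in L_R."""
    return [boxplus(a, b) for a, b in zip(L_a, L_b)]
```

Slicing the soft-NC output reproduces the hard-NC bits, but the magnitudes additionally tell the destination how reliable each network-coded bit is.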
Conclusions and Outlook
Conclusions
- efficient methods for soft encoding
- approximations facilitate practical implementation
- applications show the usefulness of soft encoding
Outlook
- frequency domain soft encoding
- code and transceiver design for physical layer NC
- performance analysis of physical layer NC schemes
Thank you!