The Concept of Soft Channel Encoding and its Applications in Wireless Relay Networks

The Concept of Soft Channel Encoding and its Applications in Wireless Relay Networks
Gerald Matz, Institute of Telecommunications, Vienna University of Technology

Acknowledgements
People: Andreas Winkelbauer, Clemens Novak, Stefan Schwandter
Funding: SISE - Information Networks (FWF Grant S10606)

Outline
1. Introduction
2. Soft channel encoding
3. Approximations
4. Applications
5. Conclusions and outlook

Introduction: Basic Idea
Soft information processing
- popular in modern receivers
- idea: use soft information also for transmission
Soft-input soft-output (SISO) encoding
- extension of hard-input hard-output (HIHO) encoding
- data only known in terms of probabilities
- how to (efficiently) encode noisy data?
Applications
- physical layer network coding
- distributed (turbo) coding
- joint source-channel coding

Introduction: Motivation (1)
Example: relay channel
[Figure: source S transmits x_S to relay R and destination D; the relay decodes ũ from the S→R channel, interleaves (π), re-encodes, and forwards x_R to D.]
- relay R assists the S → D transmission
- decode-and-forward: distributed channel coding
- D can perform iterative (turbo) decoding
What if the relay fails to decode?
- forwarding soft information might help
- how to transmit soft information?

Introduction: Motivation (2)
Soft information forwarding
- H. Sneessens and L. Vandendorpe, "Soft decode and forward improves cooperative communications," in Proc. Workshop on Computational Advances in Multi-Sensor Adaptive Processing, 2005.
- P. Weitkemper, D. Wübben, V. Kühn, and K.D. Kammeyer, "Soft information relaying for wireless networks with error-prone source-relay link," in Proc. ITG Conference on Source and Channel Coding, 2008.
Transmission of soft information
- LLR quantization via the information bottleneck: N. Tishby, F. Pereira, and W. Bialek, "The information bottleneck method," in Proc. 37th Allerton Conference on Communication and Computation, 1999.
- quantizer labels & symbol mapping via binary switching: K. Zeger and A. Gersho, "Pseudo-Gray coding," IEEE Trans. Comm., vol. 38, no. 12, 1990.

Hard Encoding/Decoding Revisited
Notation
- information bit sequence: u = (u_1 ... u_K)^T ∈ {0,1}^K
- code bit sequence: c = (c_1 ... c_N)^T ∈ {0,1}^N, assume N ≥ K
Linear binary channel code
- one-to-one mapping φ between data bits u and code bits c
- encoding: c = φ(u)
- codebook: C = φ({0,1}^K)
Hard decoding
- observed code bit sequence: c′ = c ⊕ e = φ(u) ⊕ e
- decoder: mapping ψ such that ψ(c′) is close to u

Soft Decoding Revisited
Word-level
- observation: code bit sequence probabilities p_in(c′)
- enforce code constraint: p(c) = p_in(c) / Σ_{c′∈C} p_in(c′) for c ∈ C, and p(c) = 0 otherwise
- info bit sequence probabilities: p_out(u) = p(φ(u))
- conceptually simple, computationally infeasible
Bit-level
- observed: code bit probabilities p_in(c_n) = Σ_{∼c_n} p_in(c′)  (marginalization over all code bits except c_n)
- desired: info bit probabilities p_out(u_k) = Σ_{∼u_k} p_out(φ(u))
- conceptually more difficult, computationally feasible
- example: equivalent to BCJR for convolutional codes

Soft Encoding: Basics
Word-level
- given: info bit sequence probabilities p_in(u)
- code bit sequence probabilities: p_out(c) = p_in(φ^{-1}(c)) for c ∈ C, and p_out(c) = 0 otherwise
- conceptually simple, computationally infeasible
Bit-level
- given: info bit probabilities p_in(u_k), with p_in(u) = Π_{k=1}^K p_in(u_k)
- desired: code bit probabilities p_out(c_n) = Σ_{∼c_n} p_out(c) = Σ_{c∈C: c_n fixed} p_in(φ^{-1}(c))
Main question: efficient implementation?
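As a sanity check, the word-level formulas can be evaluated directly by enumerating the codebook of a small code. The sketch below (Python; the generator matrix and function name are illustrative, using a toy systematic (5,3) code) makes the 2^K cost explicit, which is why the word-level approach is infeasible for realistic K.

```python
from itertools import product

# Toy systematic (5,3) generator matrix (rows give code bits):
# c = (u1, u2, u3, u1 XOR u2, u2 XOR u3)
G = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1],
     [1, 1, 0],
     [0, 1, 1]]

def soft_encode_bruteforce(G, p_in):
    # word-level soft encoding by codebook enumeration (cost 2^K):
    # p_out(c_n = 0) = sum over all info words u whose n-th code bit is 0
    # of prod_k p_in(u_k), where p_in[k] = p(u_k = 0)
    K, N = len(G[0]), len(G)
    p0 = [0.0] * N
    for u in product((0, 1), repeat=K):
        pu = 1.0
        for k, bit in enumerate(u):
            pu *= p_in[k] if bit == 0 else 1.0 - p_in[k]
        c = [sum(G[n][k] & u[k] for k in range(K)) % 2 for n in range(N)]
        for n in range(N):
            if c[n] == 0:
                p0[n] += pu
    return p0  # p0[n] = p_out(c_n = 0)
```

For p_in = (0.9, 0.8, 1.0) this yields p_out(c_4 = 0) = 0.9·0.8 + 0.1·0.2 = 0.74, matching the XOR statistics that motivate the boxplus operator later in the talk.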

Soft Encoding: LLRs
System model: a noisy binary source delivers L(u) to the SISO channel encoder, which outputs L(c|u).
Log-likelihood ratios (LLRs)
- definition: L(u_k) = log[ p_in(u_k=0) / p_in(u_k=1) ],  L(c_n) = log[ p_out(c_n=0) / p_out(c_n=1) ]
- encoder input: L(u) = (L(u_1) ... L(u_K))^T
- encoder output: L(c) = (L(c_1) ... L(c_N))^T
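The LLR definition above is a bijection between bit probabilities and the real line; a minimal sketch (Python, illustrative function names):

```python
import math

def prob_to_llr(p0):
    # L = log( p(bit=0) / p(bit=1) ); positive LLR means bit 0 is more likely
    return math.log(p0 / (1.0 - p0))

def llr_to_prob(L):
    # inverse mapping: p(bit=0) = 1 / (1 + e^{-L})
    return 1.0 / (1.0 + math.exp(-L))
```

A hard bit corresponds to L = ±∞, total uncertainty to L = 0.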

HIHO versus SISO
HIHO encoding is reversible; SISO encoding is irreversible
- except if |L(u_k)| = ∞ for all k
- however: sign(L_1(u_k)) = sign(L_2(u_k))
Post-sliced SISO is identical to pre-sliced HIHO:
[Figure: L(u) → SISO encoder → L(c) → slicer → ĉ, equivalent to L(u) → slicer → û → HIHO encoder → ĉ]

Block Codes (1)
Binary (N, K) block code C with generator matrix G ∈ F_2^{N×K}
HIHO encoding: c = Gu, involves XOR (modulo-2 sums)
Statistics of XOR: p(u_k ⊕ u_l = 0) = p(u_k=0) p(u_l=0) + p(u_k=1) p(u_l=1)
Boxplus:
  L(u_k ⊕ u_l) = L(u_k) ⊞ L(u_l) = log[ (1 + e^{L(u_k)+L(u_l)}) / (e^{L(u_k)} + e^{L(u_l)}) ]
- ⊞ is associative and commutative
- L_1 ⊞ L_2 ≈ sign(L_1) sign(L_2) min{|L_1|, |L_2|}
J. Hagenauer, E. Offer, and L. Papke, "Iterative decoding of binary block and convolutional codes," IEEE Trans. Inf. Theory, vol. 42, no. 2, Feb. 1996.
SISO encoder: replace ⊕ in the HIHO encoder by ⊞

Block Codes (2)
Example: systematic (5, 3) block code, c = Gu with

  G = [ 1 0 0
        0 1 0
        0 0 1
        1 1 0
        0 1 1 ]

i.e. c_1 = u_1, c_2 = u_2, c_3 = u_3, c_4 = u_1 ⊕ u_2, c_5 = u_2 ⊕ u_3.
SISO encoding replaces each XOR by a boxplus:
  L(c_1) = L(u_1), L(c_2) = L(u_2), L(c_3) = L(u_3), L(c_4) = L(u_1) ⊞ L(u_2), L(c_5) = L(u_2) ⊞ L(u_3)
[Figure: HIHO encoder circuit on bits u_1, u_2, u_3 and the corresponding SISO encoder circuit on LLRs L(u_1), L(u_2), L(u_3)]
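In code, the SISO encoder for this example simply replaces each XOR gate by a boxplus on LLRs. A sketch in Python (numerically stable boxplus; function names are illustrative):

```python
import math

def boxplus(a, b):
    # L(u_k XOR u_l), written in a numerically stable form:
    # sign(a) sign(b) min(|a|,|b|) plus two bounded correction terms
    return (math.copysign(1.0, a) * math.copysign(1.0, b) * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def siso_encode_53(L):
    # systematic (5,3) code: c = (u1, u2, u3, u1 XOR u2, u2 XOR u3)
    l1, l2, l3 = L
    return [l1, l2, l3, boxplus(l1, l2), boxplus(l2, l3)]
```

With near-certain inputs the output signs reproduce HIHO encoding; with uncertain inputs the parity LLRs shrink in magnitude, reflecting the accumulated unreliability.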

Soft Encoding: Convolutional Codes (1)
Code C with given trellis: use the BCJR algorithm.
[Trellis figure: four states m = (m^(1), m^(0)) ∈ {(0,0), (0,1), (1,0), (1,1)} with forward metrics α_k(m) and backward metrics β_k(m) over time steps k = 0, 1, 2, 3, ..., T-3, T-2, T-1, T]
  γ_k(m′, m) = P{ S_{k+1} = m | S_k = m′ }
  α_k(m) = Σ_{m′=0}^{M-1} α_{k-1}(m′) γ_{k-1}(m′, m)
  β_k(m) = Σ_{m′=0}^{M-1} β_{k+1}(m′) γ_k(m, m′)
  p_b(c_k^(j)) = Σ_{(m′,m) ∈ A_b^(j)} α_k(m′) γ_k(m′, m) β_{k+1}(m)
A. Winkelbauer and G. Matz, "On efficient soft-input soft-output encoding of convolutional codes," in Proc. ICASSP 2011.

Soft Encoding: Convolutional Codes (2)
Observe: only 2 transition probabilities per time instant, so the backward recursion β_k(m) is rendered superfluous.
[Trellis figure as on the previous slide]
Simplified forward recursion encoder (FRE), which reduces
- computational complexity
- memory requirements
- encoding delay
  s_{k+1}(m) = Σ_{m′ ∈ B_m} s_k(m′) γ_k(m′, m)
  p_b(c_k^(j)) = Σ_{(m′,m) ∈ A_b^(j)} s_k(m′) γ_k(m′, m)
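The two FRE equations can be sketched in the probability domain for the (7,5)_8 code used on the next slide (Python; the state is the pair of past input bits, and all names are illustrative). Only s_k(m) is tracked, and code bit probabilities are emitted on the fly with no backward pass:

```python
def fre_75(p_in):
    # forward-recursion soft encoder (FRE) for the non-recursive (7,5)_8 code;
    # p_in[k] = p(u_k = 0); returns (p(c_2k = 0), p(c_2k+1 = 0)) per time step
    s = {(0, 0): 1.0, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.0}  # start in zero state
    out = []
    for p0 in p_in:
        s_next = dict.fromkeys(s, 0.0)
        pc0 = pc1 = 0.0
        for (a, b), sm in s.items():            # state m' = (u_{k-1}, u_{k-2})
            for u, pu in ((0, p0), (1, 1.0 - p0)):
                w = sm * pu                     # s_k(m') * gamma_k(m', m)
                if u ^ a ^ b == 0: pc0 += w     # generator 7 = 111
                if u ^ b == 0:     pc1 += w     # generator 5 = 101
                s_next[(u, a)] += w             # forward update of s_{k+1}(m)
        s = s_next
        out.append((pc0, pc1))
    return out
```

For p_in = (0.9, 0.8), the second output pair is (0.74, 0.8): p(u_1 ⊕ u_0 = 0) = 0.8·0.9 + 0.2·0.1 = 0.74, and the second code bit at that step depends only on u_1.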

Soft Encoding: Convolutional Codes (3)
Special case: non-recursive shift register encoder (SRE)
- soft encoder with shift register implementation
- linear complexity with minimal memory requirements
Example: (7,5)_8 convolutional code
[Figures: shift-register soft encoder L(u_k) → D → D with outputs L(c_2k) and L(c_2k+1); plot of operations per unit time (0 to 1000) versus constraint length ν (2 to 10) for BCJR, FRE, and SRE (upper bound)]
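For the non-recursive case, the soft encoder collapses to a shift register over LLRs: the delay elements store past input LLRs and every XOR tap becomes a boxplus. A sketch for the (7,5)_8 example (Python; initializing the registers with a large LLR as a stand-in for "bit known to be 0" is an assumption of this sketch):

```python
import math

def boxplus(a, b):
    # numerically stable exact boxplus
    return (math.copysign(1.0, a) * math.copysign(1.0, b) * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def sre_75(llr_u, known=1e3):
    # shift-register soft encoder (SRE): XOR taps replaced by boxplus
    d1 = d2 = known            # delay elements, initialized to "certainly 0"
    out = []
    for L in llr_u:
        out.append(boxplus(boxplus(L, d1), d2))  # c_2k:   taps 111 (octal 7)
        out.append(boxplus(L, d2))               # c_2k+1: taps 101 (octal 5)
        d1, d2 = L, d1
    return out
```

For hard inputs (large |LLR|) the output signs match the hard shift-register encoder, while the magnitudes track the reliability of the taps involved.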

Soft Encoding: LDPC Codes (1)
Sparse parity check matrix H given: v ∈ C iff H^T v = 0
Graphical representation of H: Tanner graph
- bipartite graph with variable nodes and check nodes
- let V denote the set of variable nodes
Encoding: iterative erasure filling
- matrix multiplication Gu is infeasible for large block lengths
- erasure pattern is known

Soft Encoding: First Approach
Consider a systematic code: c = (u^T p^T)^T, so L(c) = (L(u)^T L(p)^T)^T
Erasure channel view
- deterministic erasure channel: (L(u)^T L(p)^T)^T → (L(u)^T 0^T)^T
- SISO encoding = decoding for the erasure channel (erasure filling): (L(u)^T 0^T)^T → (L(u)^T L(p)^T)^T
- or, equivalently, decoding without channel observation but with prior soft information on u
Exploiting the special problem structure
- yields an efficient implementation, (much) less complex than SISO decoding
- adjustable accuracy/complexity trade-off

Soft Encoding: LDPC Codes (2)
Iterative erasure filling algorithm:
1. find all check nodes involving a single erasure
2. fill the erasures found in step 1
3. repeat steps 1-2 until there are no more (recoverable) erasures
[Encoding example figure: Tanner graph with erased parity nodes filled iteratively]
Problem: stopping sets lead to non-recoverable erasures
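The three steps above can be sketched directly on a list of parity checks (Python; using LLR 0.0 as the erasure marker and the names below are choices of this sketch). Each single-erasure check is filled with the boxplus of its other bits, which is exactly SISO re-encoding of the parity positions:

```python
import math

def boxplus(a, b):
    # numerically stable exact boxplus
    return (math.copysign(1.0, a) * math.copysign(1.0, b) * min(abs(a), abs(b))
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))

def erasure_fill(llr, checks):
    # checks: tuples of bit indices whose XOR is 0; llr[i] == 0.0 marks an erasure
    llr = list(llr)
    progress = True
    while progress:
        progress = False
        for chk in checks:
            erased = [i for i in chk if llr[i] == 0.0]
            if len(erased) == 1:                 # step 1: single-erasure check
                e = erased[0]
                vals = [llr[i] for i in chk if i != e]
                acc = vals[0]
                for v in vals[1:]:
                    acc = boxplus(acc, v)
                llr[e] = acc                     # step 2: fill the erasure
                progress = True                  # step 3: iterate until done
    return llr
```

For the systematic (5,3) example, the checks are (0,1,3) and (1,2,4); starting from (L(u)^T 0^T)^T both parity LLRs are recovered in one sweep. If a stopping set remains, the loop terminates with some positions still erased.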

Soft Encoding: LDPC Codes (3)
Definition: S ⊆ V is a stopping set if all neighbors of S are connected to S at least twice.
[Example figure: Tanner graph containing a stopping set]
Solution: modify H such that the stopping sets vanish
- constraints: the code C remains unchanged and H remains sparse
M. Shaqfeh and N. Görtz, "Systematic Modification of Parity-Check Matrices for Efficient Encoding of LDPC Codes," in Proc. ICC 2007.

Approximations: Boxplus Operator
The boxplus operator is used frequently in SISO encoding. Recall
  a ⊞ b = log[ (1 + e^{a+b}) / (e^a + e^b) ]
Approximation: a ⊞ b ≈ a ⊡ b = sign(a) sign(b) min(|a|, |b|)
- small error if |a| and |b| are far apart
- overestimates the true result: |a ⊡ b| ≥ |a ⊞ b|
- suitable for hardware implementation
Correction terms
  a ⊞ b = a ⊡ b + log(1 + e^{-|a+b|}) - log(1 + e^{-|a-b|})   (additive correction; each log term lies in [0, log 2])
  a ⊞ b = a ⊡ b [ 1 - (1/min(|a|,|b|)) log( (1 + e^{-|a-b|}) / (1 + e^{-|a+b|}) ) ]   (multiplicative correction, between 0 and 1)
- store the correction term in a (small) lookup table
- decrease the lookup table size by LLR clipping
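The additive correction can be checked numerically against the exact boxplus; a minimal sketch (Python, illustrative names):

```python
import math

def minsum(a, b):
    # hardware-friendly approximation: sign(a) sign(b) min(|a|, |b|)
    return math.copysign(1.0, a) * math.copysign(1.0, b) * min(abs(a), abs(b))

def boxplus_corrected(a, b):
    # min-sum plus the additive correction term; each log term lies in [0, log 2]
    return (minsum(a, b)
            + math.log1p(math.exp(-abs(a + b)))
            - math.log1p(math.exp(-abs(a - b))))
```

boxplus_corrected reproduces log[(1 + e^{a+b}) / (e^a + e^b)] exactly, while plain min-sum overestimates the magnitude; the bound log 2 on the correction terms is what makes a small lookup table sufficient.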

Approximations: Max-log Approximation
FRE: perform the computation in the log domain (f̄ = log f)
  s̄_{k+1}(m) = log Σ_{m′ ∈ B_m} exp( s̄_k(m′) + γ̄_k(m′, m) )
  p̄_b(c_k^(j)) = log Σ_{(m′,m) ∈ A_b^(j)} exp( s̄_k(m′) + γ̄_k(m′, m) )
Approximation: log Σ_k exp(a_k) ≈ max_k a_k =: a_M
Correction term: log(e^a + e^b) = max(a, b) + log(1 + e^{-|a-b|})
- nesting yields log Σ_k exp(a_k) = a_M + log Σ_k exp(a_k - a_M)
- the correction term depends only on |a - b|, so a lookup table suffices
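The correction-term identity is the Jacobian logarithm; a minimal sketch of both the pairwise and the nested form (Python, illustrative names):

```python
import math

def jacobian_log(a, b):
    # log(e^a + e^b) = max(a, b) + log(1 + e^{-|a-b|});
    # the second term depends only on |a - b| and can live in a lookup table
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def log_sum_exp(values):
    # nested/stable form: a_M + log sum_k e^{a_k - a_M}
    a_max = max(values)
    return a_max + math.log(sum(math.exp(v - a_max) for v in values))
```

Dropping the correction term entirely gives the max-log approximation, log Σ_k exp(a_k) ≈ max_k a_k.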

Applications: Soft Re-encoding (1)
Soft network coding (NC) for the two-way relay channel
- users A and B exchange independent messages
- relay R performs network coding with soft re-encoding
[Figure: two-way relay channel; A and B transmit x_A and x_B to R, which broadcasts x_R back to both]
[Relay processing block diagram: y_AR and y_BR → demapper → SISO decoder → interleaver π → SISO encoder → puncturer → network encoder → quantizer Q(·) → modulator → x_R]
Transmission of the quantized soft information is critical.
A. Winkelbauer and G. Matz, "Soft-Information-Based Joint Network-Channel Coding for the Two-Way Relay Channel," submitted to NETCOD 2011.

Applications: Soft Re-encoding (2)
BER simulation results: symmetric channel conditions, R halfway between A and B, rate 1 bpcu, 256 info bits, 4-level quantization, 1 decoder iteration.
[Figure: BER (10^0 down to 10^-5) versus γ_AB [dB] (-8 to 6) for PCCC (5 iterations), hard NC, and soft NC]
SNR gain of 4.5 dB at BER = 10^-3 over hard NC.

Applications: Convolutional Network Coding
Physical layer NC for the multiple-access relay channel
[Figure: sources A and B transmit x_A and x_B; relay R forwards x_R to destination D]
NC at the relay:
- hard NC: combine the hard decisions ĉ_A and ĉ_B into ĉ_R
- soft NC: combine the LLRs L(c_A) and L(c_B) into L_R
- convolutional NC: feed L(c_A) and L(c_B) through a SISO encoder to obtain L_R

Conclusions and Outlook
Conclusions:
- Efficient methods for soft encoding
- Approximations facilitate practical implementation
- Applications show the usefulness of soft encoding
Outlook:
- Frequency-domain soft encoding
- Code and transceiver design for physical layer NC
- Performance analysis of physical layer NC schemes
Thank you!