Tightened Upper Bounds on the ML Decoding Error Probability of Binary Linear Block Codes and Applications


Tightened Upper Bounds on the ML Decoding Error Probability of Binary Linear Block Codes and Applications. Moshe Twitto, Department of Electrical Engineering, Technion-Israel Institute of Technology, Haifa 32000, Israel. Joint work with Igal Sason and Shlomo Shamai. 2006 IEEE International Symposium on Information Theory, Seattle, Washington, USA, July 2006.

Outline
1. Background
2. The DS2 Bound and the Shulman and Feder Bound
3. Tightened upper bounds on the block and bit error probabilities
4. Examples and summary

Background
The error performance of coded communication systems rarely admits exact expressions. Tight analytical bounds therefore emerge as a useful tool for assessing performance. The union bound is useless at rates above the cutoff rate of the channel. Improved upper bounds which are not subject to the cutoff-rate limitation are needed, depending solely on the distance spectrum of a code/ensemble.

The DS2 Bound and the Shulman and Feder Bound: Fixed Codes
Maximum-likelihood (ML) decoding:
$$P_{e|m} \le \sum_{y} p_N(y|c_m) \left[ \sum_{m' \ne m} \left( \frac{p_N(y|c_{m'})}{p_N(y|c_m)} \right)^{\lambda} \right]^{\rho}, \qquad \lambda, \rho \ge 0.$$
Example: $\rho = 1$, $\lambda = 1/2$ gives the Bhattacharyya union bound. The substitution $\lambda = \frac{1}{1+\rho}$ and optimization over $0 \le \rho \le 1$ give a tight bound for orthogonal codes. Usually impractical to evaluate in terms of the distance spectrum of the codes.

Gallager '65 Bound: Random Codes
Memoryless channel: $p_N(y|x) = \prod_{l=1}^{N} p(y_l|x_l)$.
Memoryless input distribution: $q_N(x) = \prod_{l=1}^{N} q(x_l)$.
The 1965 Gallager random coding bound:
$$\overline{P_e} \le 2^{-N E_r(R)}$$
where
$$E_r(R) = \max_{0 \le \rho \le 1,\, q} \left( E_0(\rho, q) - \rho R \right), \qquad E_0(\rho, q) \triangleq -\log_2 \sum_{y} \left[ \sum_{x} q(x)\, p(y|x)^{\frac{1}{1+\rho}} \right]^{1+\rho}.$$
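As a concrete illustration (our own, not from the talk), the random coding exponent can be evaluated numerically for a binary symmetric channel with the uniform input distribution; the function names and the grid search over ρ are our own choices:

```python
import math

def e0_bsc(rho: float, p: float) -> float:
    """Gallager's E_0(rho) for a BSC(p) with the uniform input distribution:
    E_0(rho, q) = -log2 sum_y ( sum_x q(x) p(y|x)^{1/(1+rho)} )^{1+rho}."""
    s = 1.0 / (1.0 + rho)
    inner = 0.5 * (1 - p) ** s + 0.5 * p ** s   # same value for y = 0 and y = 1
    return -math.log2(2 * inner ** (1 + rho))

def random_coding_exponent(rate: float, p: float, grid: int = 2000) -> float:
    """E_r(R) = max over 0 <= rho <= 1 of [E_0(rho) - rho * R], by grid search."""
    return max(e0_bsc(i / grid, p) - (i / grid) * rate for i in range(grid + 1))

# Sanity check: the cutoff rate R_0 = E_0(1) = 1 - log2(1 + 2*sqrt(p(1-p))).
p = 0.1
r0 = 1 - math.log2(1 + 2 * math.sqrt(p * (1 - p)))
print(e0_bsc(1.0, p), r0)              # the two values coincide
print(random_coding_exponent(0.3, p))  # positive for rates below capacity
```

For rates above capacity the maximizing ρ is 0 and the exponent vanishes, which is why the union bound (and not the random coding bound) is the limiting factor only below the cutoff rate.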


The Second Version of the Duman and Salehi Bound (DS2)
Suggested by Sason and Shamai as a generalization of the Duman and Salehi bound, which was originally derived for the AWGN channel (Duman, Ph.D. dissertation, 1998). Let $\psi_N^m(y)$ be an arbitrary probability measure (which may depend on $c_m$).

The Second Version of the Duman and Salehi Bound (cont.)
$$P_{e|m} \le \sum_{y} \psi_N^m(y) \left[ \frac{p_N(y|c_m)}{\psi_N^m(y)} \left( \sum_{m' \ne m} \left( \frac{p_N(y|c_{m'})}{p_N(y|c_m)} \right)^{\lambda} \right)^{\rho} \right], \qquad \lambda, \rho \ge 0$$
and, by Jensen's inequality (since $t \mapsto t^{\rho}$ is concave for $0 \le \rho \le 1$),
$$P_{e|m} \le \left[ \sum_{y} \psi_N^m(y)^{1-\frac{1}{\rho}}\; p_N(y|c_m)^{\frac{1}{\rho}} \sum_{m' \ne m} \left( \frac{p_N(y|c_{m'})}{p_N(y|c_m)} \right)^{\lambda} \right]^{\rho}, \qquad 0 \le \rho \le 1, \ \lambda \ge 0.$$

The Second Version of the Duman and Salehi Bound (cont.)
Let
$$\psi_N^m(y) = \frac{G_N^m(y)\, p_N(y|c_m)}{\sum_{y} G_N^m(y)\, p_N(y|c_m)}.$$
Then
$$P_{e|m} \le \left( \sum_{y} G_N^m(y)\, p_N(y|c_m) \right)^{1-\rho} \left[ \sum_{m' \ne m} \sum_{y} G_N^m(y)^{1-\frac{1}{\rho}}\; p_N(y|c_m) \left( \frac{p_N(y|c_{m'})}{p_N(y|c_m)} \right)^{\lambda} \right]^{\rho}, \qquad 0 \le \rho \le 1, \ \lambda \ge 0.$$

Advantages
- Gives results in terms of basic code features (such as the distance spectrum) for structured codes and ensembles.
- Achieves capacity for the ensemble of fully random codes.
- Many reported upper bounds are particular cases of this bound.
Difficulty
- Computationally heavy, since it requires numerical optimization over some parameters and numerical integration for each constant-weight subcode.


The Shulman and Feder Bound (SFB)
An adaptation of the random-coding bound to structured ensembles of block codes. For a binary linear block code (or ensemble) $\mathcal{C}$ with distance spectrum $\{A_l\}$, the SFB reads
$$P_e \le 2^{-N E_r\left( R + \frac{\log_2 \alpha(\mathcal{C})}{N} \right)}, \qquad \alpha(\mathcal{C}) \triangleq \max_{1 \le l \le N} \frac{A_l}{2^{-N(1-R)} \binom{N}{l}}$$
where $E_r$ is the random coding error exponent.
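A minimal numerical sketch (our own illustration, assuming a BSC) of how the factor α(C) penalizes the rate inside the random coding exponent. For the (7,4) Hamming code, α(C) = 8, so the penalty log₂α/N = 3/7 pushes the effective rate from 4/7 up to exactly 1 and renders the bound trivial, consistent with the drawback noted on the next slide:

```python
import math

def e0_bsc(rho: float, p: float) -> float:
    """Gallager's E_0(rho) for a BSC(p) with uniform inputs."""
    s = 1.0 / (1.0 + rho)
    inner = 0.5 * (1 - p) ** s + 0.5 * p ** s
    return -math.log2(2 * inner ** (1 + rho))

def sfb_exponent(rate: float, alpha: float, n: int, p: float, grid: int = 2000) -> float:
    """SFB exponent E_r(R + log2(alpha)/N) for a BSC(p), via grid search over rho."""
    shifted = rate + math.log2(alpha) / n
    return max(e0_bsc(i / grid, p) - (i / grid) * shifted for i in range(grid + 1))

# (7,4) Hamming code: alpha(C) = 8, so the shifted rate is 4/7 + 3/7 = 1,
# and the resulting exponent collapses to zero.
print(sfb_exponent(4 / 7, 8.0, 7, 0.05))
```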

The Shulman and Feder Bound (cont.)
Advantage: the SFB clearly reproduces the random coding bound for the ensemble of fully random linear block codes.
Drawback: it depends on the maximal ratio between the distance spectrum of the code and the binomial distribution, and hence may not be tight for some efficient codes.
The SFB was reproduced as a particular case of the DS2 bound [Sason & Shamai, IEEE Trans. on IT, Dec. 2002].

Distance Spectra of Turbo-Like Ensembles
Let
$$B_l \triangleq 2^{-N(1-R)} \binom{N}{l}, \qquad l = 0, 1, \ldots, N$$
designate the average distance spectrum of the ensemble of fully random block codes. The tightness of the SFB depends on the maximal ratio between the distance spectrum of the code (or ensemble) and the average distance spectrum of random codes of the same rate and length.
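The ratio $A_l/B_l$ is easy to tabulate for a code with a known weight enumerator. The sketch below (our own illustration, not from the talk) uses the (7,4) Hamming code, whose weight enumerator is $1 + 7x^3 + 7x^4 + x^7$:

```python
from math import comb

# Known weight enumerator of the (7,4) Hamming code.
N, K = 7, 4
A = {0: 1, 3: 7, 4: 7, 7: 1}

# B_l = 2^{-N(1-R)} * C(N, l): average spectrum of fully random codes;
# here N(1 - R) = N - K = 3.
B = {l: 2 ** (-(N - K)) * comb(N, l) for l in range(N + 1)}

ratios = {l: A[l] / B[l] for l in A if l > 0}
alpha = max(ratios.values())   # the Shulman-Feder factor alpha(C)
print(ratios)                  # moderate at l = 3, 4; large at l = 7
print(alpha)
```

The maximum is attained at the all-ones codeword (l = 7), a single "atypical" weight that alone determines α(C), which is exactly the sensitivity that motivates the partitioning idea below.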

Distance Spectra of Turbo-Like Ensembles (cont.)
Example: random turbo-block codes with systematic component codes.
[Figure: two plots of the ratio A_l/B_l versus the normalized Hamming weight l/N for turbo-random codes of rate R = 0.72 bits/symbol, with block lengths N = 100 (left) and N = 1000 (right).]

Partitioning of C into Two Subcodes
Observation: for a relatively large portion of the Hamming weights, the distance spectrum of the code resembles the binomial distribution of random codes.
Suggestion (Miller & Burshtein): partition the original code into two subcodes, C' and C''; C' contains all the codewords with Hamming weight $l \in U \subseteq \{1, 2, \ldots, N\}$, while C'' contains the other codewords. Both subcodes also contain the all-zero codeword. The union bound provides
$$P_e = P_{e|0} \le P_{e|0}(\mathcal{C}') + P_{e|0}(\mathcal{C}'').$$
Use the SFB as an upper bound on $P_{e|0}(\mathcal{C}')$, and apply the union bound (UB) on $P_{e|0}(\mathcal{C}'')$.
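The partitioning idea can be sketched numerically. In the illustration below, U is chosen by a simple threshold on the ratio A_l/B_l (a hypothetical rule for demonstration only, not the partitioning rule proposed in the paper), and the complement subcode is handled by the union-Bhattacharyya bound $\sum_l A_l z^l$, assuming a BSC:

```python
from math import comb, sqrt

def partition_and_bound(spectrum, N, K, p, threshold):
    """Split the nonzero Hamming weights into U (spectrum close to binomial,
    to be handled by the SFB) and its complement (handled by the union bound).
    The threshold rule is an illustrative choice, not the rule of the paper."""
    z = 2 * sqrt(p * (1 - p))                     # Bhattacharyya parameter of a BSC(p)
    B = {l: 2 ** (-(N - K)) * comb(N, l) for l in range(N + 1)}
    U = [l for l in spectrum if l > 0 and spectrum[l] / B[l] <= threshold]
    comp = [l for l in spectrum if l > 0 and l not in U]
    ub = sum(spectrum[l] * z ** l for l in comp)  # union-Bhattacharyya bound on C''
    return U, comp, ub

# (7,4) Hamming code (weight enumerator 1 + 7x^3 + 7x^4 + x^7) on a BSC(0.01)
U, comp, ub = partition_and_bound({0: 1, 3: 7, 4: 7, 7: 1}, 7, 4, 0.01, threshold=2.0)
print(U, comp, ub)
```

Here only the high-weight outlier (l = 7) is expelled to C'', and its union-bound contribution is tiny, so the SFB on C' no longer pays the α(C) = 8 penalty.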

Partitioning (cont.)
The problem of finding the optimal partitioning is very complex. In our work, we suggest a particular partitioning which depends on the ratio between the distance spectrum of the code and the binomial distribution (where the latter characterizes the average distance spectrum of the ensemble of fully random block codes).


Upper Bound on the Block Error Probability
Theorem (Modified Shulman and Feder Bound). Let C be partitioned into two subcodes C' and C'', as mentioned above. Then, for an MBIOS channel,
$$P_e \le P_{e|0}(\mathcal{C}') + P_{e|0}(\mathcal{C}'')$$
$$P_{e|0}(\mathcal{C}') \le \mathrm{SFB}(\rho) \left[ \sum_{l \in U} \binom{N}{l} \left( \frac{A(\rho)}{A(\rho)+B(\rho)} \right)^{l} \left( \frac{B(\rho)}{A(\rho)+B(\rho)} \right)^{N-l} \right]^{\rho}, \qquad 0 \le \rho \le 1$$
where SFB(ρ) designates the Shulman and Feder bound with the parameter ρ held fixed, and
$$A(\rho) \triangleq \sum_{y} \left[ p(y|0)\, p(y|1) \right]^{\frac{1}{1+\rho}} \left( \frac{1}{2}\, p(y|0)^{\frac{1}{1+\rho}} + \frac{1}{2}\, p(y|1)^{\frac{1}{1+\rho}} \right)^{\rho-1}$$
$$B(\rho) \triangleq \sum_{y} p(y|0)^{\frac{2}{1+\rho}} \left( \frac{1}{2}\, p(y|0)^{\frac{1}{1+\rho}} + \frac{1}{2}\, p(y|1)^{\frac{1}{1+\rho}} \right)^{\rho-1}.$$
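For an explicit channel, the quantities A(ρ) and B(ρ) reduce to short closed-form sums. A sketch for the BSC follows (our own illustration); it also checks numerically that A(ρ) + B(ρ) = 2 Σ_y [½p(y|0)^{1/(1+ρ)} + ½p(y|1)^{1/(1+ρ)}]^ρ p(y|0)^{1/(1+ρ)}, which follows by factoring out p(y|0)^{1/(1+ρ)}:

```python
def a_b_terms(rho: float, p: float):
    """A(rho) and B(rho) of the modified SFB, specialized to a BSC(p),
    where the channel output y runs over {0, 1}."""
    s = 1.0 / (1.0 + rho)
    py0 = [1 - p, p]   # p(y|0) for y = 0, 1
    py1 = [p, 1 - p]   # p(y|1) for y = 0, 1
    mix = [0.5 * py0[y] ** s + 0.5 * py1[y] ** s for y in (0, 1)]
    A = sum((py0[y] * py1[y]) ** s * mix[y] ** (rho - 1) for y in (0, 1))
    B = sum(py0[y] ** (2 * s) * mix[y] ** (rho - 1) for y in (0, 1))
    return A, B

rho, p = 0.5, 0.1
A, B = a_b_terms(rho, p)

# Consistency check: A + B = 2 * sum_y mix(y)^rho * p(y|0)^{1/(1+rho)}
# (on a BSC, mix(y) takes the same value for y = 0 and y = 1).
s = 1 / (1 + rho)
mix0 = 0.5 * (1 - p) ** s + 0.5 * p ** s
check = 2 * sum(mix0 ** rho * t ** s for t in (1 - p, p))
print(A, B, check)
```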

Upper Bound on the Block Error Probability: The Essence of the Proof
We start the derivation of the bound by writing
$$P_{e|0}(\mathcal{C}') \le \left( \sum_{y} g(y)\, p(y|0) \right)^{N(1-\rho)} 2^{-N(1-R)\rho} \left\{ \sum_{l \in U} \frac{A_l}{2^{-N(1-R)} \binom{N}{l}} \binom{N}{l} \left( \sum_{y} g(y)^{1-\frac{1}{\rho}}\, p(y|0)^{1-\lambda}\, p(y|1)^{\lambda} \right)^{l} \left( \sum_{y} g(y)^{1-\frac{1}{\rho}}\, p(y|0) \right)^{N-l} \right\}^{\rho}$$
for $0 \le \rho \le 1$ and $\lambda \ge 0$, and rely on the symmetry properties of the MBIOS channel. Bounding the spectrum ratio by its maximum over $l \in U$ and extending the remaining sum over all Hamming weights then gives
$$P_{e|0}(\mathcal{C}') \le \max_{l \in U} \left\{ \frac{A_l}{2^{-N(1-R)} \binom{N}{l}} \right\}^{\rho} \left( \sum_{y} g(y)\, p(y|0) \right)^{N(1-\rho)} 2^{-N(1-R)\rho} \left\{ \sum_{l=1}^{N} \binom{N}{l} \left( \sum_{y} g(y)^{1-\frac{1}{\rho}}\, p(y|0)^{1-\lambda}\, p(y|1)^{\lambda} \right)^{l} \left( \sum_{y} g(y)^{1-\frac{1}{\rho}}\, p(y|0) \right)^{N-l} \right\}^{\rho}.$$
Note: since typically C'' contains only a small fraction of the codewords in C, we use the simple union bound as an upper bound on $P_{e|0}(\mathcal{C}'')$.

Upper Bounds on the BER
The original SFB was derived as an upper bound on the block error probability. Sason and Shamai derived the bit-error version of the DS2 bound for fully interleaved fading channels with perfect channel state information at the receiver. We generalize the result of Sason and Shamai to arbitrary MBIOS channels.

The SFB on the BER
Theorem (The SFB Version on the BER). Let C be an (N, K) binary linear block code. Assume an MBIOS channel. Let $A_{w,l}$ designate the number of codewords in C which are encoded by information bits whose Hamming weight is w and whose Hamming weight after encoding is l. The BER is upper bounded by
$$P_b \le 2^{-N E_r\left( R + \frac{\log_2 \alpha_b(\mathcal{C})}{N} \right)}$$
where
$$\alpha_b(\mathcal{C}) \triangleq \max_{0 < l \le N} \frac{A'_l}{2^{-N(1-R)} \binom{N}{l}}, \qquad A'_l \triangleq \sum_{w=1}^{K} \frac{w}{K}\, A_{w,l}.$$
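The bit-weighted spectrum $A'_l$ can be computed exactly for a small code by enumerating all information words. The sketch below (our own illustration) uses one standard systematic generator matrix of the (7,4) Hamming code; note that $A_{w,l}$, and hence $\alpha_b$, depends on the choice of encoder:

```python
from itertools import product
from math import comb

# Systematic generator matrix of the (7,4) Hamming code (one standard choice).
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
N, K = 7, 4

# Input-output weight enumerator A_{w,l}, by brute-force encoding.
Awl = {}
for u in product((0, 1), repeat=K):
    c = [sum(u[i] * G[i][j] for i in range(K)) % 2 for j in range(N)]
    w, l = sum(u), sum(c)
    Awl[(w, l)] = Awl.get((w, l), 0) + 1

# A'_l = sum_w (w/K) A_{w,l}, and alpha_b = max_l A'_l / (2^{-N(1-R)} C(N,l)).
Ap = {l: sum((w / K) * Awl.get((w, l), 0) for w in range(1, K + 1))
      for l in range(1, N + 1)}
alpha_b = max(Ap[l] / (2 ** (-(N - K)) * comb(N, l))
              for l in range(1, N + 1) if Ap[l] > 0)
print(Ap, alpha_b)
```

Since $A'_l \le A_l$, the bit-error factor never exceeds its block-error counterpart; for this code both maxima are attained at the all-ones codeword.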

Modified SFB on the Bit Error Probability In order to tighten the resulting bound, we obtain the bit-error version of the modified SFB, analogously to the derivation for the block error probability.

Theorem (Simplified DS2 Bound). Let C be a binary linear block code of length N and rate R, and assume that the communication takes place over an MBIOS channel. Let
$$A'_l(\mathcal{C}') \triangleq \begin{cases} \displaystyle\sum_{w=1}^{NR} \frac{w}{NR}\, A_{w,l} & \text{if } l \in U \\ 0 & \text{otherwise.} \end{cases}$$
The bit error probability is upper bounded by
$$P_b \le P_{b|0}(\mathcal{C}') + P_{b|0}(\mathcal{C}'')$$
$$P_{b|0}(\mathcal{C}') \le 2^{-N \left[ E_0(\rho) - \rho \left( R + \frac{\log_2 \bar{\alpha}_{\rho}(\mathcal{C}')}{N} \right) \right]}, \qquad 0 \le \rho \le 1$$
where
$$\bar{\alpha}_{\rho}(\mathcal{C}') \triangleq \sum_{l=0}^{N} \frac{A'_l(\mathcal{C}')}{2^{-N(1-R)} \binom{N}{l}} \binom{N}{l} \left( \frac{A(\rho)}{A(\rho)+B(\rho)} \right)^{l} \left( \frac{B(\rho)}{A(\rho)+B(\rho)} \right)^{N-l}.$$

The Simplified DS2 Bound (cont.)
Notes:
- The same bound can be applied to the block error probability, with $A'_l(\mathcal{C}')$ replaced by $A_l(\mathcal{C}')$.
- The bound depends on the average ratio of $A_l/B_l$ (instead of the maximal ratio), which yields a tighter bounding technique than the SFB.
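The gap between the maximal and the average ratio can be seen numerically. The sketch below (our own illustration) evaluates both for the (7,4) Hamming code on a BSC, taking U = {1, ..., N} and the block-error spectrum A_l in place of A'_l:

```python
from math import comb

# Hamming(7,4) on a BSC(p): compare the maximal spectrum ratio (used by the
# SFB) with the binomially weighted average ratio (simplified DS2 bound).
N, K, p, rho = 7, 4, 0.1, 0.5
spectrum = {0: 1, 3: 7, 4: 7, 7: 1}          # known weight enumerator

s = 1 / (1 + rho)
mix = 0.5 * (1 - p) ** s + 0.5 * p ** s      # identical for y = 0, 1 on a BSC
A_rho = 2 * ((1 - p) * p) ** s * mix ** (rho - 1)
B_rho = ((1 - p) ** (2 * s) + p ** (2 * s)) * mix ** (rho - 1)
q = A_rho / (A_rho + B_rho)                  # parameter of the binomial weighting

ratio = {l: spectrum.get(l, 0) / (2 ** (-(N - K)) * comb(N, l))
         for l in range(1, N + 1)}
alpha_max = max(ratio.values())
alpha_avg = sum(ratio[l] * comb(N, l) * q ** l * (1 - q) ** (N - l)
                for l in range(1, N + 1))
print(alpha_max, alpha_avg)                  # the average ratio is much smaller
```

The binomial weighting discounts the atypical high-weight term that dominates the maximum, which is exactly why the averaged factor yields a tighter exponent.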

Examples: Random Turbo-Block Codes
[Figure: upper bounds on the BLOCK error probability versus E_b/N_0 (3 to 4.6 dB), comparing the TSB with UB+MSFB; N = 1072, R = 0.932 bits per channel use.]

Examples: Random Turbo-Block Codes
[Figure: upper bounds on the BIT error probability versus E_b/N_0 (3 to 4.6 dB), comparing the TSB with UB+simplified DS2; N = 1072, R = 0.932 bits per channel use.]

Summary
Tightened versions of the Shulman and Feder bound were derived, based on the general concept of the DS2 bounding technique and by revisiting the alternative proof which reproduces the Shulman and Feder bound as a particular case of the DS2 bound. An expurgation of the distance spectrum of the code further tightens the resulting upper bounds (as exemplified in the full paper version). The tightened upper bounds derived in this work were demonstrated to outperform the TSB in some cases, especially for ensembles of turbo-like codes of high rate.

Further Reading
M. Twitto, I. Sason and S. Shamai, "Tightened upper bounds on the ML decoding error probability of binary linear codes," submitted to the IEEE Trans. on Information Theory, February 2006. [Online]. Available: http://www.ee.technion.ac.il/people/sason/sfb.pdf
I. Sason and S. Shamai, "Performance analysis of linear codes under maximum-likelihood decoding: a tutorial," Foundations and Trends in Communications and Information Theory, vol. 3, no. 1-2, pp. 1-222, NOW Publishers, Delft, the Netherlands, July 2006.