Iterative Quantization Using Codes On Graphs

Iterative Quantization Using Codes On Graphs. Emin Martinian (Massachusetts Institute of Technology) and Jonathan S. Yedidia (Mitsubishi Electric Research Labs).

Lossy Data Compression: Encoding: map the source x = [x_1 x_2 ... x_n] to bits. Decoding: map the bits to y = [y_1 y_2 ... y_n]. Goal: construct the mapping so that y is near x. [Figure: block diagram, x -> Encoder -> bits -> Decoder -> y.]
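To make the encoder/decoder mapping concrete, here is a minimal sketch (mine, not from the talk) of a one-bit-per-sample scalar quantizer in Python; the reconstruction levels are arbitrary illustrative values.

    import random

    def encode(x, levels=(-0.8, 0.8)):
        """Map each source sample to one bit: the index of the nearer level."""
        return [0 if abs(s - levels[0]) <= abs(s - levels[1]) else 1 for s in x]

    def decode(bits, levels=(-0.8, 0.8)):
        """Map each bit back to its reconstruction level."""
        return [levels[b] for b in bits]

    x = [random.gauss(0, 1) for _ in range(10)]
    y = decode(encode(x))
    mse = sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)
    print("rate = 1 bit/sample, mean-square distortion =", round(mse, 3))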

Lossy Compression Performance: The minimum rate is R(D) = inf_{p(y|x): E[d(x,y)] <= D} I(x; y). Gap to R(D) for mean-square distortion measures: at asymptotically high rates, the entropy-coded scalar quantization (ECSQ) gap is 1.53 dB; at moderate rates for a Gaussian source, the ECSQ gap is 1.6-3.4 dB; 256-state trellis-coded quantization reduces the gap to 0.5-1.4 dB. How do we approach R(D) with low complexity?
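The 1.53 dB figure is the classical high-rate penalty of entropy-coded uniform scalar quantization relative to the Shannon limit under mean-square distortion; a quick numerical check (a sketch, not from the talk):

    import math

    # At high rate, ECSQ distortion is Delta^2/12 at rate ~ h(X) - log2(Delta),
    # while the Shannon bound gives distortion 2^{2h(X)} 2^{-2R} / (2*pi*e).
    # The ratio of the two distortions is therefore 2*pi*e/12, independent of the source.
    gap_db = 10 * math.log10(2 * math.pi * math.e / 12)
    print(f"high-rate ECSQ gap = {gap_db:.2f} dB")  # about 1.53 dB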

Compressing a source with erasures: The source takes values x in {0, 1, *} with Pr[x = *] = s and Pr[x = 0] = Pr[x = 1] = (1 - s)/2; the reconstruction is y in {0, 1}. The distortion measure charges nothing for erasures, d(*, 0) = d(*, 1) = 0, and nothing for exact matches, while quantizing 0 to 1 or 1 to 0 causes non-zero distortion.

Performance for erasure models: [Figure: the zero-distortion rate R(D = 0) and the binary erasure channel capacity, plotted against the erasure probability Pr[x = *].]
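For this source the zero-distortion rate works out to R(0) = 1 - s, the mirror image of the binary erasure channel capacity 1 - e. The check below (my own sketch, not part of the talk) evaluates I(x; y) for the test channel that copies non-erased symbols and maps * to a fair coin flip; the result equals 1 - s.

    import math

    def H(p):
        """Entropy in bits of a probability vector."""
        return -sum(q * math.log2(q) for q in p if q > 0)

    s = 0.3  # Pr[x = *]; any value in (0, 1) works
    # Joint distribution over (x, y): non-erased symbols are copied,
    # erasures map to a uniformly random bit.
    joint = {('0', 0): (1 - s) / 2, ('1', 1): (1 - s) / 2,
             ('*', 0): s / 2, ('*', 1): s / 2}
    p_y = [sum(p for (x, y), p in joint.items() if y == b) for b in (0, 1)]
    # I(x; y) = H(y) - H(y|x); y is random only when x = *, so H(y|x) = s bits.
    mutual_info = H(p_y) - s
    print(mutual_info, "vs 1 - s =", 1 - s)  # both 0.7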

Coding Complexity: [Figure: complexity versus block length; a random linear code costs O(n^3), while codes on graphs aim for O(n).]

Data Compression / Error Correction Duality: [Figure: block diagrams of a data compression code (source encoder and decoder mapping x to y) and an error correction code (channel encoder, channel, and channel decoder), drawn side by side to highlight the duality.]

An LDPC Quantizer: [Figure, animated over several slides: the Tanner graph of an LDPC code with the source symbols to quantize (0, 1, *) attached to the variable nodes.]

Problems with LDPC Quantizers: Theorem: LDPC quantizers fail if the parity check density is o(log n). Proof idea: suppose f*n checks have degree d. For each such low-degree check, Pr[check unsatisfied] >= (1/2)(1 - e)^d, so E[number of bad checks] >= f*n*(1/2)(1 - e)^d. Driving this expectation to zero forces d to grow at least like log n.
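The bound in the proof sketch is easy to tabulate. The snippet below (the parameter values are illustrative, not from the talk) evaluates E[bad checks] >= f*n*(1/2)(1 - e)^d at a fixed degree, and the degree needed to push the expectation below one, which grows like log n.

    import math

    f, e = 0.5, 0.5  # fraction of low-degree checks, source erasure probability
    for n in (10**3, 10**4, 10**5, 10**6):
        expected_bad = f * n * 0.5 * (1 - e) ** 8  # fix degree d = 8
        # Smallest degree making the expected number of unsatisfied checks < 1:
        d_needed = math.ceil(math.log(f * n * 0.5) / math.log(1 / (1 - e)))
        print(f"n={n:>8}  E[bad checks | d=8]={expected_bad:8.1f}  d needed ~ {d_needed}")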

Dual LDPC Quantizer: [Figure, animated over several slides: the Tanner graph of the dual code, with the uncompressed bits (0, 1, *) entering on one side and the compressed bits read off the other.]

Optimal Decoding/Quantization Duality: Any data encoded with C and received with erasures e can be decoded if and only if any source with the complementary erasure pattern ē can be quantized by the dual code C⊥.
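This duality can be verified by brute force on a small code. The sketch below (my own check, not from the talk) uses the [7,4] Hamming code: C corrects an erasure set exactly when no nonzero codeword of C lives inside it, and C⊥ quantizes a don't-care set exactly when its codewords realize every pattern on the constrained positions; the script confirms the two conditions coincide under complementation of the pattern.

    from itertools import product

    # Generator matrices over GF(2): the [7,4] Hamming code C and its dual C_perp.
    G = [[1,0,0,0,0,1,1],
         [0,1,0,0,1,0,1],
         [0,0,1,0,1,1,0],
         [0,0,0,1,1,1,1]]
    H = [[0,1,1,1,1,0,0],   # rows are the parity checks of C, i.e. a generator of C_perp
         [1,0,1,1,0,1,0],
         [1,1,0,1,0,0,1]]

    def codewords(gen):
        n = len(gen[0])
        return {tuple(sum(c * row[i] for c, row in zip(coeffs, gen)) % 2 for i in range(n))
                for coeffs in product([0, 1], repeat=len(gen))}

    C, C_dual = codewords(G), codewords(H)
    n = 7

    def decodable(erasures):
        """C corrects erasure set E iff no nonzero codeword of C is supported inside E."""
        return all(any(c[i] for i in range(n) if i not in erasures) for c in C if any(c))

    def quantizable(dont_care):
        """C_perp quantizes don't-care set F iff its projection onto the remaining
        positions takes every possible value, so any source can be matched there."""
        fixed = [i for i in range(n) if i not in dont_care]
        return len({tuple(c[i] for i in fixed) for c in C_dual}) == 2 ** len(fixed)

    for bits in product([0, 1], repeat=n):
        E = {i for i, b in enumerate(bits) if b}
        assert decodable(E) == quantizable(set(range(n)) - E)
    print("duality verified on all", 2 ** n, "erasure patterns")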

Message-Passing Rules for Erasure Decoding: [Table: local update rules at check (+) and equality (=) nodes for messages in {0, 1, *}. At a check node the outgoing message is the XOR of the incoming messages when none is erased, and * otherwise; at an equality node the outgoing message is any non-erased incoming value, and * only if all incoming messages are erased.]
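These two local rules can be written directly as functions on messages in {0, 1, None}, with None standing for the erasure *; this is a sketch of the standard binary-erasure update rules rather than a transcription of the slide's table.

    def check_node_out(incoming):
        """Check (+) node: XOR of the inputs if all are known, otherwise erasure."""
        if any(m is None for m in incoming):
            return None
        return sum(incoming) % 2

    def variable_node_out(incoming):
        """Equality (=) node: any known input value (they must agree), otherwise erasure."""
        known = [m for m in incoming if m is not None]
        assert len(set(known)) <= 1, "inconsistent messages"
        return known[0] if known else None

    print(check_node_out([0, 1, 1]))     # 0
    print(check_node_out([0, None, 1]))  # None (still erased)
    print(variable_node_out([None, 1]))  # 1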

Erasure Decoding For LDPC Code: [Figure, animated: the message-passing rules applied to the Tanner graph of an LDPC code, progressively resolving the erased (*) bits.]

Erasure Quantization For Dual LDPC Code: [Figure, animated: attempting erasure quantization by message passing on the Tanner graph of the dual LDPC code.]

Erasure Decoding vs. Erasure Quantization: [Figure: the message-passing rules for erasure decoding and erasure quantization shown side by side.]

Message-Passing Rules for Erasure Quantization: [Table: local update rules at check (+) and equality (=) nodes for messages in {0, 1, *}, modified from the decoding rules to suit the quantization problem.]

Erasure Quantization For Dual LDPC Code: [Figure, animated over several slides: the quantization message-passing rules applied to the dual LDPC code's Tanner graph, step by step, until all compressed bits are determined.]
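For comparison with the iterative procedure above, here is a non-iterative reference quantizer (my own sketch, not the algorithm from the talk): it finds any codeword of the dual code that matches the source on its non-erased positions by Gaussian elimination over GF(2), i.e. exactly the O(n^3) baseline that the message-passing rules are meant to avoid.

    def quantize_with_dual(G_dual, source):
        """Find u with u*G_dual matching `source` on non-erased positions (None = *),
        via Gaussian elimination over GF(2). Returns the codeword, or None if impossible."""
        k, n = len(G_dual), len(G_dual[0])
        # One linear equation per constrained (non-erased) source position.
        rows = [[G_dual[i][j] for i in range(k)] + [source[j]]
                for j in range(n) if source[j] is not None]
        pivots = []
        for col in range(k):
            piv = next((r for r in range(len(pivots), len(rows)) if rows[r][col]), None)
            if piv is None:
                continue
            rows[len(pivots)], rows[piv] = rows[piv], rows[len(pivots)]
            for r in range(len(rows)):
                if r != len(pivots) and rows[r][col]:
                    rows[r] = [(a + b) % 2 for a, b in zip(rows[r], rows[len(pivots)])]
            pivots.append(col)
        if any(not any(row[:-1]) and row[-1] for row in rows):
            return None  # inconsistent: zero-distortion quantization is impossible
        u = [0] * k
        for idx, col in enumerate(pivots):
            u[col] = rows[idx][-1]
        return [sum(u[i] * G_dual[i][j] for i in range(k)) % 2 for j in range(n)]

    # Quantize with the dual of the [7,4] Hamming code (its parity checks as generators):
    H = [[0,1,1,1,1,0,0], [1,0,1,1,0,1,0], [1,1,0,1,0,0,1]]
    source = [1, None, None, 0, None, None, 1]  # * written as None
    print(quantize_with_dual(H, source))        # e.g. [1, 0, 1, 0, 1, 0, 1]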

Iterative Decoding/Quantization Duality: Theorem: the code C can iteratively decode the erasure pattern e if and only if the dual code C⊥ can iteratively quantize the complementary erasure pattern ē. Corollary: duals of capacity-achieving codes for erasure decoding achieve the minimum rate for erasure quantization.
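The rate bookkeeping behind the corollary: if C achieves capacity on a BEC with erasure probability e, its rate is 1 - e, so the dual code has rate e; by the duality it quantizes sources with erasure (don't-care) probability s = 1 - e, whose minimum rate is R(0) = 1 - s = e. A short sanity check with an illustrative value of e:

    e = 0.4                 # BEC erasure probability (illustrative)
    rate_C = 1 - e          # rate of a capacity-achieving code for BEC(e)
    rate_dual = 1 - rate_C  # rate of the dual code
    s = 1 - e               # erasure probability of the source the dual quantizes
    R0 = 1 - s              # minimum rate for zero-distortion erasure quantization
    assert abs(rate_dual - R0) < 1e-12
    print(f"dual rate {rate_dual} matches R(0) = {R0}")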

"In theory, there is no difference between theory and practice. But, in practice, there is." (Jan L.A. van de Snepscheut)

Concluding Remarks: LDPC codes are bad erasure quantizers; their dual codes are good ones. There is a duality between optimal/iterative decoding and optimal/iterative quantization. Future work: extend the analysis, algorithms, and duality to general source models and distortion measures.