Low-Density Parity-Check Codes


Department of Computer Sciences Applied Algorithms Lab. July 24, 2011

Outline
1. Introduction
2. Algorithms for LDPC
3. Properties
4. Iterative Learning in Crowds
5. Algorithm
6. Results
7. Conclusion

PART I: LDPC Codes

Coding Theory Basics
- We wish to send an original message over a noisy channel.
- The message is encoded into a code; redundant bits are added for robustness.
- The code gets corrupted by the noisy channel. Under the bit-flip error model (BSC: Binary Symmetric Channel), each bit flips with probability p (the error probability).
- Restoring the corrupted code to the original code is called decoding.

[Figure: Communication channel diagram]
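To make the BSC concrete, here is a minimal numpy sketch (the bit vector below is an arbitrary example, not a codeword of any particular code):

```python
import numpy as np

rng = np.random.default_rng(0)
sent = np.array([0, 1, 0, 1, 1, 0, 0, 1])   # arbitrary example bits
p = 0.1                                      # BSC crossover probability
# Each bit flips independently with probability p:
received = sent ^ (rng.random(sent.size) < p)
```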

LDPC Codes
- Low-Density: each bit participates in only a few (sparse) constraints.
- Parity-Check: the bits of each constraint must sum to 0 (modulo 2).
- The graphical representation is a regular bipartite graph.
- Because the code is linear, the multiset of Hamming distances from any codeword to all the others is the same for every codeword.

[Figure: Parity-check diagram for a (7,2,4)-code. Square nodes are check nodes; circles are message nodes.]

Matrix Representation
An LDPC code has a $c \times n$ matrix representation $H$, where there are $c$ check nodes and $n$ message nodes (i.e., length-$n$ codewords):

$$H = \begin{pmatrix} 0 & 1 & 0 & 1 & 1 & 0 & 0 & 1 \\ 1 & 1 & 1 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 & 1 & 1 \\ 1 & 0 & 0 & 1 & 1 & 0 & 1 & 0 \end{pmatrix}$$

Characteristics:
- Every row has the same number of 1s, and every column has the same number of 1s.
- $H$ specifies the parity constraints: e.g., by the first row, the 2nd, 4th, 5th, and 8th bits of each codeword must sum to 0, with similar constraints from the subsequent rows.

Code Verification: Constraints on Codewords
Given a word $w$, how do we know it is a valid codeword? It must satisfy the parity constraints, i.e., sum to 0 for each row. For a word $w$ of length $n$, the following must hold (arithmetic modulo 2):

$$Hw = 0, \qquad H \in \{0,1\}^{c \times n},\ w \in \{0,1\}^n.$$

That is, the valid codewords are exactly the null space of the LDPC matrix.
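A minimal numpy sketch of this check, using the $H$ from the previous slide (the candidate words are arbitrary examples):

```python
import numpy as np

H = np.array([[0, 1, 0, 1, 1, 0, 0, 1],
              [1, 1, 1, 0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0, 1, 1, 1],
              [1, 0, 0, 1, 1, 0, 1, 0]])

def is_codeword(H, w):
    """A word is valid iff every parity check sums to 0 mod 2, i.e. Hw = 0."""
    return not ((H @ w) % 2).any()

print(is_codeword(H, np.zeros(8, dtype=int)))               # True: 0 is always a codeword
print(is_codeword(H, np.array([1, 0, 0, 0, 0, 0, 0, 0])))   # False: violates rows 2 and 4
```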

Encoding
A generator matrix $G$ for an LDPC code provides a basis for the code, so that

$$w = \sum_i m_i g_i = mG,$$

where $m$ is the $k$-bit original message and $w$ is a valid codeword. $G$ and $H$ have a dual relationship in that $G = [I_k \mid P]$ and $H = [P^T \mid I_{n-k}]$. Hence, to derive $G$ from $H$:
- put $H$ in the form $[P^T \mid I_{n-k}]$ through row operations...
- ... and set $G = [I_k \mid P]$.

But this encoding scheme could take $O(n^2)$ operations; linear-time encoding schemes exist.
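A minimal sketch of this derivation over GF(2) (the helper names are mine). Note that for the example $H$ above, the rows are linearly dependent (every column has exactly two 1s, so the rows sum to 0 mod 2); a redundant row would have to be dropped before the right block can be reduced to the identity.

```python
import numpy as np

def generator_from_parity(H):
    """Derive G = [I_k | P] from H by reducing H to [P^T | I_{n-k}] over GF(2).
    Sketch only: assumes row operations alone suffice; in general a column
    permutation (an equivalent code) may also be needed."""
    H = H.copy() % 2
    c, n = H.shape
    k = n - c
    for col in range(c):                     # Gauss-Jordan on the right block
        j = k + col
        pivots = [r for r in range(col, c) if H[r, j] == 1]
        if not pivots:
            raise ValueError("singular right block; column swaps required")
        H[[col, pivots[0]]] = H[[pivots[0], col]]
        for r in range(c):
            if r != col and H[r, j] == 1:
                H[r] = (H[r] + H[col]) % 2
    P = H[:, :k].T                           # H is now [P^T | I_{n-k}]
    return np.hstack([np.eye(k, dtype=int), P])

def encode(G, m):
    """w = mG (mod 2)."""
    return (m @ G) % 2
```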

Decoding
- Exact decoding of LDPC codes is NP-hard.
- Instead, we approximate it by belief propagation (message passing).
- The messages passed are log-likelihoods of the bits.
- Message from a codeword node $v$ to a check node $c$: the likelihood of $v$ given the observed value of $v$ (and, in later rounds, the messages from its other checks).
- Message from a check node $c$ to a codeword node $v$: the likelihood of $v$ given all the messages passed to $c$ by the other codeword nodes.

Decoding: Formula Derivation [3]
Define the likelihood of a binary r.v. $x$ as

$$L(x) \triangleq \frac{P(x=0)}{P(x=1)}, \qquad L(x \mid y) \triangleq \frac{P(x=0 \mid y)}{P(x=1 \mid y)}.$$

If $P(x=0) = P(x=1)$, then $L(x \mid y) = L(y \mid x)$ by Bayes' rule. Therefore, if $y_1, \dots, y_d$ are independent r.v.'s,

$$\ln L(x \mid y_1, \dots, y_d) = \sum_i \ln L(x \mid y_i). \tag{1}$$

Eqn. (1) will become the message from codeword node $v$ to check node $c$.

Decoding Derivation (Cont.)
Let $l_i = \ln L(x_i \mid y_i)$. Consider the log-likelihood (sums are mod 2):

$$\ln L(x_1 + \cdots + x_l \mid y_1, \dots, y_l) = \ln \frac{1 + \prod_{i=1}^{l} \tanh(l_i/2)}{1 - \prod_{i=1}^{l} \tanh(l_i/2)}$$

This equation holds due to

$$2P(x_1 + \cdots + x_l = 0 \mid y_1, \dots, y_l) - 1 = \prod_{i=1}^{l} \big(2P(x_i = 0 \mid y_i) - 1\big)$$

and

$$2P(x_i = 0 \mid y_i) - 1 = \frac{L(x_i \mid y_i) - 1}{L(x_i \mid y_i) + 1} = \tanh(l_i/2).$$
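A quick numeric sanity check of the product identity (a sketch; the probabilities below are made up):

```python
import numpy as np

p = np.array([0.9, 0.7, 0.6])        # assumed values of P(x_i = 0 | y_i)
l = np.log(p / (1 - p))              # log-likelihoods l_i

# Left side: 2 * P(x_1 + x_2 + x_3 = 0 mod 2 | y) - 1, by direct enumeration
# over all even-parity bit patterns.
P0 = sum(
    np.prod([p[i] if (bits >> i) & 1 == 0 else 1 - p[i] for i in range(3)])
    for bits in range(8)
    if bin(bits).count("1") % 2 == 0
)
print(2 * P0 - 1)                    # 0.064
print(np.prod(np.tanh(l / 2)))       # 0.064, matching the identity
```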

Defining Messages
At round 0, $m_v$ is set to the log-likelihood of $v$ conditioned on its observed value. At round $k$:

$$m_{vc}^{(k)} = \begin{cases} m_v, & k = 0 \\ m_v + \sum_{c' \in C_v \setminus \{c\}} m_{c'v}^{(k-1)}, & k \ge 1 \end{cases}$$

$$m_{cv}^{(k)} = \ln \frac{1 + \prod_{v' \in V_c \setminus \{v\}} \tanh\big(m_{v'c}^{(k)}/2\big)}{1 - \prod_{v' \in V_c \setminus \{v\}} \tanh\big(m_{v'c}^{(k)}/2\big)}$$

To see that $m_{cv}$ makes sense, notice that $x_1 + \cdots + x_{v-1} + x_{v+1} + \cdots + x_p = 0$ (or 1) $\iff x_v = 0$ (or 1), due to the mod-2 operation and parity constraints. Hence, $m_{cv}$ is a likelihood for the node $v$.

Decoding: Final Decision
Once the messages converge, we can set

$$\ln L(x_v \mid y) = m_v + \sum_{c \in C(v)} m_{cv},$$

and make the decision on bit $v$ by:

$$\hat{x}_v = \begin{cases} 0, & \text{if } \ln L(x_v \mid y) > 0 \\ 1, & \text{if } \ln L(x_v \mid y) < 0 \end{cases}$$

If the log-likelihood is positive, it means $P(x=0 \mid y) > P(x=1 \mid y)$, and similarly for negative.
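Putting the message updates and the final decision together, a minimal vectorized numpy sketch (a sum-product sketch under the LLR convention above; it assumes nonzero channel LLRs, and uses the simple flooding schedule):

```python
import numpy as np

def decode_bp(H, llr, iters=50):
    """Belief-propagation (sum-product) decoding sketch.
    H: c x n parity-check matrix; llr: per-bit channel log-likelihoods m_v."""
    m_vc = H * llr                               # round-0 messages v -> c
    for _ in range(iters):
        # Check -> variable: tanh rule, product over the *other* neighbours.
        t = np.tanh(m_vc / 2)
        t[H == 0] = 1.0                          # neutral element for the product
        ext = np.prod(t, axis=1, keepdims=True) / t
        ext = np.clip(ext, -0.999999, 0.999999)  # numerical safety
        m_cv = H * (np.log1p(ext) - np.log1p(-ext))
        # Variable -> check: channel LLR plus the *other* check messages.
        total = llr + m_cv.sum(axis=0)           # full log-likelihood per bit
        m_vc = H * (total - m_cv)
        w = (total < 0).astype(int)              # decide 1 where the LLR < 0
        if not ((H @ w) % 2).any():              # all parity checks satisfied
            return w
    return w
```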

Error Bounds [1]
The error bound is given in terms of the depth of the tree (i.e., the number of iterations):

Theorem. Given an $(n, j, k)$-LDPC code, with

$$\frac{\ln\left[\frac{n}{2k} - \frac{n}{2j(k-1)}\right]}{2\ln((j-1)(k-1))} \le m \le \frac{\ln n}{\ln((j-1)(k-1))},$$

the probability of a decoding error after $m$ iterations is

$$P_m \le \exp\left(-c_{jk}\left[\frac{n}{2k} - \frac{n}{2j(k-1)}\right]^{\alpha}\right), \qquad \alpha = \begin{cases} \dfrac{\ln\frac{j-1}{2}}{2\ln((j-1)(k-1))}, & \text{if } j \text{ odd} \\[2mm] \dfrac{\ln\frac{j}{2}}{2\ln((j-1)(k-1))}, & \text{if } j \text{ even}, \end{cases}$$

for suitable positive constants $c_{jk}$.

PART II: Iterative Learning in Crowds [2]

Tasks and Workers
Workers: there are $n$ workers, $w_a$ for $a \in \{1, \dots, n\}$. Each worker gets $r$ tasks at random.
Tasks: there are $m$ tasks, $t_i$ for $i \in \{1, \dots, m\}$. Each task is assigned to $l$ workers.
NOTE: the above settings imply $lm = rn$ (both sides count the task-worker assignments).

Responses
For simplicity, assume a binary response: worker $w_a$ completing task $t_i$ yields response $A_{ia} \in \{\pm 1\}$. The correct answer for $t_i$ is $s_i = 1$. Let $p_a \triangleq P(A_{ia} = s_i)$, drawn i.i.d. from some probability distribution.

Task Allocation
Given this setting, we can construct a bipartite graph $G(\{t_i\}, \{w_a\}, E, A)$:
- $E \subseteq [m] \times [n]$: $(i, a)$ is an edge if $t_i$ is assigned to $w_a$.
- $A_{ia}$ is the weight assigned to edge $(i, a)$.

How to assign: according to a random $(l, r)$-regular bipartite graph, i.e., all tasks have degree $l$ and all workers have degree $r$. Among all possible $(l, r)$-regular graphs, we sample one uniformly at random.

Message Passing
Messages on the $k$-th iteration:
- $x^{(k)}_{i \to a}$: from task to worker.
- $y^{(k)}_{a \to i}$: from worker to task.
- Initialization: $y^{(0)}_{a \to i} \sim \mathcal{N}(1, 1)$, i.i.d.

$$x^{(k)}_{i \to a} = \sum_{b \in N(i) \setminus \{a\}} A_{ib}\, y^{(k-1)}_{b \to i}, \qquad y^{(k)}_{a \to i} = \sum_{j \in N(a) \setminus \{i\}} A_{ja}\, x^{(k)}_{j \to a}$$

Final Decision
For task $i$, run a predefined number of iterations $k$. The decision is made according to

$$\hat{s}_i = \mathrm{sign}\left(x^{(k)}_i\right), \qquad x^{(k)}_i = \sum_{b \in N(i)} A_{ib}\, y^{(k-1)}_{b \to i}.$$

If $x^{(k)}_i = 0$, flip a fair coin.
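A compact numpy sketch of the whole procedure (a sketch only; the function name is mine, and within each round x is updated before y, consistent with initializing only the y messages):

```python
import numpy as np

def crowd_estimate(A, k=10, seed=0):
    """Iterative estimation sketch for the crowdsourcing model above.
    A: m x n response matrix with A[i, a] in {+1, -1} if worker a answered
    task i, and 0 if the pair was not assigned."""
    rng = np.random.default_rng(seed)
    mask = (A != 0)
    y = rng.normal(1.0, 1.0, size=A.shape) * mask      # y_{a->i} ~ N(1, 1)
    for _ in range(k):
        # Task -> worker: sum over the *other* workers b of A_ib * y_{b->i}.
        x = mask * ((A * y).sum(axis=1, keepdims=True) - A * y)
        # Worker -> task: sum over the *other* tasks j of A_ja * x_{j->a}.
        y = mask * ((A * x).sum(axis=0, keepdims=True) - A * x)
    x_final = (A * y).sum(axis=1)                      # full sum for each task
    coin = rng.choice([-1, 1], size=A.shape[0])        # fair coin for ties
    return np.where(x_final != 0, np.sign(x_final), coin).astype(int)
```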

Theorems
From the random variables defined previously, we can derive probability bounds on mis-prediction:

Theorem. Assume $\hat{l}\hat{r}q^2 > 1$. Then

$$\lim_{k \to \infty} P\left(x^{(k)}_i \le 0\right) \le \exp\left(-\frac{1}{2} \cdot \frac{l^3(\hat{l}\hat{r}q^2 - 1)}{\hat{l}^3(4 + \hat{r}q)}\right),$$

where $\hat{x} \triangleq x - 1$ and $q \triangleq E[(2p-1)^2]$.

Interpretations
- With message passing, the error decreases exponentially with the number of rounds, regardless of how tasks are assigned (as long as $(l, r)$-regularity is satisfied).
- Initial messages can be random, as long as their distribution has non-zero mean.

But also:
- This is a strict setting: binary responses, and a uniform prior on the $p_a$'s.
- The theorem holds whenever $q^2\hat{l}\hat{r} > 1$, which can hold even if $r$ is large, i.e., even if workers process many irrelevant tasks (counter-intuitive).

References
[1] Robert Gallager. Low-Density Parity-Check Codes, 1963.
[2] David Karger, Sewoong Oh, and Devavrat Shah. Iterative learning from a crowd. In WIDS, 2011.
[3] Amin Shokrollahi. LDPC codes: An introduction, 2003.