Recognition of a code in a noisy environment


Université de Limoges - XLIM - DMI / INRIA Rocquencourt - Projet CODES. ISIT, Nice - June 29, 2007.

Outline: 1. Introduction; 2. Recognition of a code (problem, theoretical results, experimental results); 3. Conclusion.

Introduction. Context: data transmission. Several transformations are applied to the data, and the transmitted data is corrupted by noise. General problem: finding information about the transformations used, from an intercepted noisy sequence only.

Reconstruction of a code. Recovering the first layer, i.e. the error-correcting code used: this is the problem of code reconstruction. State of the art: convolutional codes and turbo codes: [Fil], [Bar], [Din]; linear block codes: [Val], [Clu], [Bar]. This problem is NP-hard [Val], with complexity exponential in the error rate. An easier problem: recognition of a code.

Problem. Given a binary sequence S and a binary linear code C, is the sequence S composed of noisy codewords of C?

Binary linear code. Let C be a binary linear code of length n and dimension k, i.e. C is a vector subspace of $\mathbb{F}_2^n$ of dimension k. It can be represented by: its generator matrix (whose rows span the vector subspace C), or its parity check matrix (whose rows span the dual code $C^\perp$). We will use a basis of the dual code of C, $H = (h_1, h_2, \ldots, h_{n-k})$; then, for all $j \in \{1, \ldots, n-k\}$ and all $c \in C$, $\langle h_j, c \rangle = \sum_{i=1}^{n} h_{j,i}\, c_i = 0$. For $x \in \mathbb{F}_2^n$, we write $\mathrm{supp}(x) = \{i \in \{1, \ldots, n\} \mid x_i \neq 0\}$ and $w(x) = \#\,\mathrm{supp}(x)$ its Hamming weight.
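To make the generator / parity-check relation concrete, here is a minimal Python sketch (illustrative, not from the talk): it uses the [7,4] Hamming code as an assumed example and checks that every codeword is orthogonal to every row of the parity-check matrix.

```python
import numpy as np

# Generator matrix of the [7,4] Hamming code in standard form G = [I_4 | A]
A = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), A])

# Parity-check matrix H = [A^T | I_3]: its rows span the dual code C-perp
H = np.hstack([A.T, np.eye(3, dtype=int)])

# Every codeword c = m G satisfies <h_j, c> = 0 for every row h_j of H
for m in range(16):
    msg = np.array([(m >> i) & 1 for i in range(4)])
    c = msg @ G % 2
    assert np.all(H @ c % 2 == 0)

# Hamming weight of a word, as defined on the slide: w(x) = #supp(x)
def weight(x):
    return int(np.sum(np.asarray(x) != 0))
```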

Binary channel. We consider a binary symmetric channel with error rate p. $S = (s_j)_{1 \le j \le M}$ is a sequence taken at the output of the channel, with $s_j \in \mathbb{F}_2^n$. Note: if p = 1/2, S is random (one-time pad).

Main idea. If S comes from C: if p = 0, then $\langle h_i, s_j \rangle = 0$ for all i, j; if $0 < p \ll 1/2$, then $\langle h_i, s_j \rangle = 1$ for only a small number of pairs (i, j). Else: $\langle h_i, s_j \rangle = 1$ half of the time.

Different cases. Let h be a non-zero word of $C^\perp$. If S comes from a random sequence: $\dim(h^\perp) = n - 1$, so $\frac{\#(h^\perp)}{\#(\mathbb{F}_2^n)} = \frac{1}{2}$ and $P[\langle h, s_j \rangle = 1] = 1/2$. If S comes from words of C: for all j, $s_j = c_j + e_j$, with $c_j \in C$ and $e_j$ the error vector; $\langle h, s_j \rangle = 1 \iff \#(\mathrm{supp}(h) \cap \mathrm{supp}(e_j))$ is odd, hence $P[\langle h, s_j \rangle = 1] = \frac{1 - (1-2p)^{w(h)}}{2}$.
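A quick way to see this bias numerically is to simulate the BSC. The following sketch (illustrative, not from the paper; the function name and parameters are assumptions) estimates $P[\langle h, s_j \rangle = 1]$ empirically and compares it with $\frac{1-(1-2p)^{w(h)}}{2}$.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_bias(w, p, trials=200_000):
    """Estimate P[<h, c + e> = 1] for a parity check h of weight w.

    Since <h, c> = 0 for every codeword c, only the error bits on supp(h)
    matter: the inner product is 1 iff an odd number of the w positions of
    supp(h) are flipped by the BSC with crossover probability p.
    """
    errors = rng.random((trials, w)) < p     # error bits restricted to supp(h)
    parity = errors.sum(axis=1) % 2          # <h, e> computed over F_2
    return parity.mean()

w, p = 20, 0.01
print("empirical :", empirical_bias(w, p))
print("predicted :", (1 - (1 - 2 * p) ** w) / 2)
```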

Different cases. If S comes from a sequence of words of a code $C' \not\subset C$: there exists at least one $h_0$ in a basis of $C^\perp$ such that $h_0 \notin C'^\perp$; for this $h_0$, $P[\langle h_0, s_j \rangle = 1] = 1/2$. Other cases: heuristic result, but confirmed by experiments: $P[\langle h, s_j \rangle = 1] = 1/2$.

Statistical test. Summary: if S does not come from C, there exists at least one $h \in \{h_1, \ldots, h_{n-k}\}$ such that $P[\langle h, s_j \rangle = 1] = 1/2$; if S comes from C, then for all $h \in \{h_1, \ldots, h_{n-k}\}$, $P[\langle h, s_j \rangle = 1] = \frac{1}{2} - \frac{1}{2}(1-2p)^{w(h)}$. Idea: is $\sum_{j=1}^{M} \langle h, s_j \rangle$ (sum in $\mathbb{Z}$) close to $\frac{M}{2} - \frac{M}{2}(1-2p)^{w(h)}$?

Theorem: The statistical test consisting in deciding that $S = (s_j)_{1 \le j \le M}$ comes from a sequence of M words of $h^\perp$ if and only if $\sum_{j=1}^{M} \langle h, s_j \rangle \le T$, with
$$M = \left( \frac{b \sqrt{1 - (1-2p)^{2 w(h)}} - a}{(1-2p)^{w(h)}} \right)^{2}, \qquad T = \frac{1}{2}\left( M + a \sqrt{M} \right),$$
where $a = \phi^{-1}(\alpha)$ and $b = \phi^{-1}(1 - \beta)$, verifies:
$$P\left[ \sum_{j=1}^{M} \langle h, s_j \rangle \le T \;\middle|\; S \text{ is random} \right] = \alpha \quad \text{(false alarm)},$$
$$P\left[ \sum_{j=1}^{M} \langle h, s_j \rangle \ge T \;\middle|\; S \text{ comes from } h^\perp \right] = \beta \quad \text{(non-detection)}.$$
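As a sanity check on how M and T can be computed, here is a small Python sketch (an illustration, assuming $\phi^{-1}$ is the inverse standard normal CDF, i.e. scipy.stats.norm.ppf; function and variable names are assumptions). With p = 0.005 and w(h) = 140 it gives M = 1462 and T ≈ 640, matching the first BCH row of the results table below.

```python
from math import ceil, sqrt

from scipy.stats import norm

def test_parameters(p, w, alpha=1e-6, beta=1e-6):
    """Number of words M and threshold T for a parity check of weight w,
    on a BSC with crossover probability p (formulas of the theorem)."""
    a = norm.ppf(alpha)          # a = phi^{-1}(alpha), negative for small alpha
    b = norm.ppf(1 - beta)       # b = phi^{-1}(1 - beta)
    bias = (1 - 2 * p) ** w      # (1 - 2p)^{w(h)}
    M = ceil(((b * sqrt(1 - bias ** 2) - a) / bias) ** 2)
    T = 0.5 * (M + a * sqrt(M))
    return M, T

# Example: checks of weight 140 at p = 0.005, as for the BCH code of length 511
print(test_parameters(0.005, 140))   # (1462, approx. 640)
```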

Input: C an [n, k] binary linear code, a binary symmetric channel with error rate p, $S = (s_j)_{1 \le j \le M}$ a sequence taken at the output of the channel, α, β the false alarm and non-detection probabilities.
Initialization: compute $(h_1, \ldots, h_{n-k})$, a basis of $C^\perp$ made of low-weight words; compute $M_i$ and $T_i$ for each $i \in \{1, \ldots, n-k\}$.
Algorithm: $N_i = \sum_{j=1}^{M_i} \langle h_i, s_j \rangle$ (sum in $\mathbb{Z}$) for $i \in \{1, \ldots, n-k\}$.
Output: if $N_i \le T_i$ for all $i \in \{1, \ldots, n-k\}$, say that S comes from a sequence of words of C; if $N_i > T_i$ for at least one $i \in \{1, \ldots, n-k\}$, say that S does not come from a sequence of words of C.
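A compact sketch of the whole recognition test might look as follows (illustrative Python; it reuses the hypothetical test_parameters helper from the previous sketch, and finding a low-weight basis of $C^\perp$ is a separate problem taken as given here).

```python
import numpy as np

def recognize(S, H_low_weight, p, alpha=1e-6, beta=1e-6):
    """Decide whether the noisy sequence S (an M x n binary array) is made of
    codewords of the code C whose dual basis rows are H_low_weight ((n-k) x n).

    Returns True ("S comes from C") or False ("S does not come from C")."""
    S = np.asarray(S) % 2
    for h in np.asarray(H_low_weight) % 2:
        w = int(h.sum())                                # weight w(h_i)
        M_i, T_i = test_parameters(p, w, alpha, beta)   # from the theorem
        M_i = min(M_i, len(S))                          # at most the words we have
                                                        # (fewer words weakens the test)
        N_i = int(((S[:M_i] @ h) % 2).sum())            # N_i = sum_j <h_i, s_j>, in Z
        if N_i > T_i:                                   # one failed check is enough to
            return False                                # say S does not come from C
    return True
```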

Computing results (here $\alpha = \beta = 10^{-6}$):

Code     n      w(h) max   Error rate p   M used   T       Time (s)
BCH      511    140        0.005          1462     640     1
BCH      511    140        0.01           25823    12529   17
RM       1024   136        0.005          1345     585     4
RM       1024   136        0.01           21963    10629   28
Random   2000   535        0.001          724      298     9
Random   2000   535        0.002          6540     3078    62

Synchronization.
Input: C, p, S, α and β.
Initialization: compute $S^{(l)} = (s_j^{(l)})_{1 \le j \le M}$ for $0 \le l \le n-1$ (the sequence re-synchronized with offset l); compute $(h_1, \ldots, h_r)$, some words of a basis of $C^\perp$; compute $M_i$ and $T_i$ for each $i \in \{1, \ldots, r\}$.
Algorithm: for l from 0 to n-1, compute $N_i^{(l)} = \sum_{j=1}^{M_i} \langle h_i, s_j^{(l)} \rangle$ (sum in $\mathbb{Z}$) for $i \in \{1, \ldots, r\}$.
Output: if $N_i^{(l)} \le T_i$ for all $i \in \{1, \ldots, r\}$, say that S seems to come from a sequence of words of C with synchronization l; check it!
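To illustrate, here is a sketch of the synchronization search (illustrative Python, reusing the hypothetical recognize helper above; it assumes the intercepted data is one long bit stream that is re-cut into n-bit words at each candidate offset l, which is one plausible reading of $S^{(l)}$).

```python
import numpy as np

def find_synchronization(bitstream, n, H_low_weight, p, alpha=1e-6, beta=1e-6):
    """Try every offset l in {0, ..., n-1}: cut the stream into n-bit words
    starting at position l and run the recognition test on that candidate."""
    bitstream = np.asarray(bitstream) % 2
    accepted = []
    for l in range(n):
        usable = (len(bitstream) - l) // n
        S_l = bitstream[l:l + usable * n].reshape(usable, n)   # S^(l)
        if recognize(S_l, H_low_weight, p, alpha, beta):
            accepted.append(l)                                 # "check it!"
    return accepted
```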

Conclusion. Easy and fast algorithm to recognize a code. Can reach a high error rate (compared to reconstruction). Application to synchronization.

Thanks for your attention.