Channel Coding for Secure Transmissions

1 Channel Coding for Secure Transmissions (March 27) 1 / 51

2 Contents:
McEliece Cryptosystem
Coding Approach: Noiseless Main Channel
Coding Approach: Noisy Main Channel
2 / 51

3 Outline
We present an overview of linear block codes that can be used for secure transmissions. A cryptographic algorithm based on linear block codes is presented for computational security. We also show that linear block codes can achieve perfect secrecy; information-theoretic approaches are used to prove it. 3 / 51

4 For convenience, we only consider binary codes. Channel codes provide protection of message sequences against channel errors. Denote by m ∈ {0,1}^k a message sequence and by c ∈ {0,1}^n a codeword. We assume that n > k; encoding finds c for a given m. The code rate is r = k/n. A code is a set of codewords: C = {c_1, ..., c_{2^k}}. 4 / 51

5 Examples
A very simple block code is the repetition code. For (n, k) = (3, 1), the bit 0 becomes 000 and 1 becomes 111. If one bit error happens, it can be corrected by majority vote (e.g., 010 is decoded as 0). (The slide also tabulates a k = 2, n = 3 input/output example.) 5 / 51
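As a quick sketch (not from the slides), the (3, 1) repetition code and its majority-vote decoder can be written in a few lines:

```python
def rep_encode(bit, n=3):
    """(n, 1) repetition code: repeat the single message bit n times."""
    return [bit] * n

def rep_decode(word):
    """Majority vote: corrects up to (n - 1) // 2 bit errors."""
    return 1 if sum(word) > len(word) // 2 else 0

# A single bit error in 000 is still decoded as 0.
assert rep_encode(0) == [0, 0, 0]
assert rep_decode([0, 1, 0]) == 0
assert rep_decode([1, 1, 0]) == 1
```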

6 There are some requirements for good block codes: good error detection and correction performance, and computationally efficient encoding and decoding rules. In order to find good block codes, mathematical approaches are used (algebraic codes). Linear codes are easy to build and analyze; thus, we focus on linear binary block codes. 6 / 51

7 Modulo-2 operations:
Addition: 0 ⊕ 0 = 0, 0 ⊕ 1 = 1, 1 ⊕ 0 = 1, 1 ⊕ 1 = 0.
Multiplication: ac = c if a = 1, and 0 if a = 0.
Inverse: -a = a.
7 / 51

8 A linear code
An (n, k) linear block code consists of 2^k codewords of n-tuples: C = {c_1, c_2, ..., c_{2^k}}, with c_i = [c_{i,1} c_{i,2} ... c_{i,n}]. Every linear combination of codewords yields another codeword. A linear block code must include the all-zero codeword 0 because c_i + c_i = 0. Due to the linear property, c_i ∈ Span(c_1, c_2, ..., c_{2^k}). 8 / 51

9 With a linear code, any codeword is given by
c_i = Σ_{p=1}^{k} b_{i,p} g_p,
where the g_p are binary row vectors of n-tuples and b_{i,p} ∈ {0, 1}. Let G = [g_1; g_2; ...; g_k], which is called the generator matrix (of size k × n). 9 / 51

10 An analogy
Consider a vector x = [x_1 x_2 x_3] in a 3-dimensional space. A set of vectors can be defined as X = {x = [x_1 x_2 0] : x_1, x_2 ∈ R}. Clearly, any linear combination of vectors in X is again in X: if x_1, x_2 ∈ X, then c_1 x_1 + c_2 x_2 ∈ X. Any vector in X can also be written as
x = x_1 [1 0 0] + x_2 [0 1 0] = [x_1 x_2] [[1 0 0]; [0 1 0]] = bG.
10 / 51

11 Encoding using the generator matrix
Any linear code can be characterized by a generator matrix G, which is a k × n binary matrix. Encoding is the matrix-vector multiplication c = mG, where m is a 1 × k message vector and c is its codeword. (The slide tabulates an example G with its input/coded vector pairs.) 11 / 51
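The encoding c = mG can be sketched directly; the generator matrix below is a hypothetical systematic (6, 3) example, since the slide's own matrix did not survive extraction:

```python
def encode(m, G):
    """Codeword c = m G, with all arithmetic over GF(2) (mod 2)."""
    n = len(G[0])
    return [sum(m[i] * G[i][j] for i in range(len(m))) % 2 for j in range(n)]

# Hypothetical systematic (6, 3) generator matrix G = [I | P].
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]

c = encode([1, 0, 1], G)
assert c == [1, 0, 1, 0, 1, 1]   # systematic: the message is the first k bits
```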

12 If G has the form G = [I_{k×k} P_{k×(n-k)}], the linear block code is called systematic. If G is systematic, the message vector is a subvector of its codeword. 12 / 51

13 Error detection and correction
Define the (n-k) × n parity-check matrix as H = [P^T I_{(n-k)×(n-k)}]. It can be shown that
GH^T = [I P] [P; I] = P + P = 0.
If c is a codeword, then cH^T = mGH^T = 0. 13 / 51
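The identity GH^T = 0 can be checked numerically; the matrices below are the same hypothetical systematic (6, 3) example, not the slide's:

```python
def matmul2(A, B):
    """Matrix product over GF(2)."""
    return [[sum(a * b for a, b in zip(row, col)) % 2
             for col in zip(*B)] for row in A]

# Hypothetical systematic (6, 3) code: G = [I | P] and H = [P^T | I].
P = [[1, 1, 0], [0, 1, 1], [1, 0, 1]]
G = [[1, 0, 0] + P[0], [0, 1, 0] + P[1], [0, 0, 1] + P[2]]
Pt = [list(col) for col in zip(*P)]
H = [Pt[0] + [1, 0, 0], Pt[1] + [0, 1, 0], Pt[2] + [0, 0, 1]]

Ht = [list(col) for col in zip(*H)]
assert matmul2(G, Ht) == [[0, 0, 0]] * 3   # G H^T = 0, hence c H^T = m G H^T = 0
```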

14 Example: for a systematic generator matrix G, the parity-check matrix H follows directly from the form H = [P^T I]. (The slide shows a concrete G and the corresponding H.) 14 / 51

15 Error detection
Suppose that the received signal vector is r = c ⊕ e, where e is the error vector. Then we have the 1 × (n-k) syndrome
s = rH^T = (c ⊕ e)H^T = cH^T ⊕ eH^T = eH^T (syndrome test),
since cH^T = 0.
Detection based on the syndrome test (error detection): we assume that there is no error if s = 0; otherwise, there are errors. Note that there are cases where s = 0 even though e ≠ 0 (errors are not detected in this case). 15 / 51

16 Error correction
For error correction, we need the 2^(n-k) × 2^k standard array:

Coset leaders                                              | Syndromes
u_1 (+0)      u_2 (+0)        ...  u_{2^k} (+0)            | s_1 = 0
e_2           u_2 + e_2       ...  u_{2^k} + e_2           | s_2 = e_2 H^T
...                                                        | ...
e_{2^(n-k)}   u_2 + e_{2^(n-k)} ... u_{2^k} + e_{2^(n-k)}  | s_{2^(n-k)} = e_{2^(n-k)} H^T

In this array, all possible n-tuple binary vectors are listed. The vectors in the same row have the same syndrome. 16 / 51

17 Procedure:
1. Calculate the syndrome of the received vector r, and find the corresponding row in the array.
2. Locate the coset leader (error pattern) ê corresponding to the syndrome rH^T.
3. Provide the corrected coded block as û = r ⊕ ê = (c ⊕ e) ⊕ ê = c, if ê = e.
17 / 51
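The three steps can be sketched end to end; the parity-check matrix is a hypothetical single-error-correcting (6, 3) example, and the coset-leader table is built by brute force under the minimum-weight criterion:

```python
from itertools import product

def syndrome(v, H):
    """s = v H^T over GF(2)."""
    return tuple(sum(vi * hi for vi, hi in zip(v, row)) % 2 for row in H)

# Hypothetical (6, 3) parity-check matrix H = [P^T | I].
H = [[1, 0, 1, 1, 0, 0],
     [1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 0, 1]]
n = 6

# Map each syndrome to its minimum-weight coset leader.
leader = {}
for e in product([0, 1], repeat=n):
    s = syndrome(e, H)
    if s not in leader or sum(e) < sum(leader[s]):
        leader[s] = e

def correct(r):
    """Steps 1-3: compute the syndrome, look up ê, and return r + ê."""
    e_hat = leader[syndrome(r, H)]
    return [(ri + ei) % 2 for ri, ei in zip(r, e_hat)]

c = [1, 0, 1, 0, 1, 1]       # a codeword of this code (c H^T = 0)
r = c[:]; r[4] ^= 1          # introduce a single bit error
assert correct(r) == c
```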

18 Example: (6, 3) block code
Let us consider a (6, 3) generator matrix G; from it we can construct a standard array. (The slide shows the matrix entries and the full array.) The coset leaders are chosen under the minimum-weight criterion; here, the weight means the number of 1s. This criterion is vital to minimize the BER. 18 / 51

19 Hamming distance and error correction
Consider two codewords, c_i = [0 1 0 1 1] and c_j = [1 0 1 0 1]. The Hamming distance between them, denoted d_H(c_i, c_j), is the number of bits in which they differ: here, 4. The Hamming weight of a codeword, denoted w(c_i), is its number of 1s; for c_i it is 3. 19 / 51

20 A key result: d_H(c_i, c_j) = w(c_i ⊕ c_j).
The error-correction performance is determined by the minimum Hamming distance of the code,
d_min = min_{i ≠ j} d_H(c_i, c_j), c_i, c_j ∈ C.
It can also be shown that d_min = min_{c_i ≠ 0} w(c_i), c_i ∈ C.
Remarks: if d_min = 3, we can correct up to 1 bit error; if d_min = 11, up to 5 bit errors. The number of bit errors that can be corrected is t = ⌊(d_min - 1)/2⌋. 20 / 51
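For a linear code, d_min can be found by scanning the weights of the nonzero codewords; the generator matrix below is a hypothetical (6, 3) example:

```python
from itertools import product

def encode(m, G):
    """c = m G over GF(2)."""
    return [sum(m[i] * G[i][j] for i in range(len(m))) % 2
            for j in range(len(G[0]))]

# Hypothetical (6, 3) code: d_min equals the minimum nonzero codeword weight.
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]
weights = [sum(encode(list(m), G)) for m in product([0, 1], repeat=3) if any(m)]
d_min = min(weights)
t = (d_min - 1) // 2
assert (d_min, t) == (3, 1)   # this code corrects up to 1 bit error
```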

21 Error detection
Consider a simplified channel model including transmitter and receiver. Due to noise, there can be errors, characterized by a single parameter p, called the crossover probability. A simple example is the binary symmetric channel (BSC): 0 → 0 and 1 → 1 with probability 1 - p, and 0 → 1, 1 → 0 with probability p. In this case,
f_{Y|X}(y, x) = 1 - p if y = x, and p if y ≠ x.
For binary ASK, we have p = Q(a / σ_w) = Q(√(2 E_b / N_0)). Note that f_X(x) is not given yet. 21 / 51

22 Through the BSC, the received coded signal can have errors: r = c_i + e, where e is the error vector (the zero vector if there is no error). With r, we can consider several cases:
- no error: e = 0, with probability (1 - p)^n.
- undetected error: r = c_j, where c_j ≠ c_i. For this, there must be at least d_min bit errors. The undetected-error probability is
P_ud ≈ C(n, d_min) p^{d_min} (1 - p)^{n - d_min} ≈ C(n, d_min) p^{d_min} when p ≪ 1.
- errors not corrected: (next slide). 22 / 51

23 Consider the probability that errors cannot be corrected. If there are more than t errors, the codeword cannot be recovered, and this probability is
P_uc ≈ Σ_{j=t+1}^{n} C(n, j) p^j (1 - p)^{n-j} ≈ C(n, t+1) p^{t+1}, for p ≪ 1.
In most cases, a codeword is incorrectly decoded to a neighboring codeword that differs in 2t + 1 bits. 23 / 51

24 Hence, among the n bits, 2t + 1 are in error after incorrect decoding. This gives the approximate bit error probability
P_e^coded ≈ ((2t + 1)/n) C(n, t+1) p^{t+1} = (2t + 1) ((n-1)! / ((t+1)! (n-t-1)!)) p^{t+1}.
24 / 51

25 Perfect code
If d_min is odd, then any error pattern with Hamming weight w(e) ≤ t = (d_min - 1)/2 can be corrected. For even d_min, we have w(e) ≤ t = (d_min - 2)/2. There are 2^k codewords c_i, and the number of possible received vectors r is 2^n in the n-dimensional space. If every r lies within Hamming distance t = (d_min - 1)/2 of one of the codewords, the code is called perfect. 25 / 51

26 In this case, the spheres of center c_i and radius t must cover the full n-dimensional space. Since t = (d_min - 1)/2, the spheres do not overlap; d_min must be odd in a perfect code. There are Σ_{i=0}^{t} C(n, i) received vectors in the sphere of center c_i. Since there are 2^k codewords, in a perfect code
2^n = 2^k Σ_{i=0}^{t} C(n, i), or Σ_{i=0}^{t} C(n, i) = 2^{n-k}.
26 / 51

27 In any code, we have Σ_{i=0}^{t} C(n, i) ≤ 2^{n-k}, which is called the Hamming bound.
Hamming codes: let t = 1 (single error correction). Then 1 + n ≤ 2^{n-k}. Hamming codes are single-error-correcting perfect codes (i.e., 2^{n-k} = 1 + n):
n - k = 2: (3, 1); n - k = 3: (7, 4); n - k = 4: (15, 11); n - k = 5: (31, 26).
27 / 51
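The perfect-code condition 1 + n = 2^(n-k) is easy to verify for the listed Hamming code parameters:

```python
from math import comb

# Hamming codes meet the Hamming bound with equality: 1 + n = 2^(n - k).
for n, k in [(3, 1), (7, 4), (15, 11), (31, 26)]:
    assert 1 + n == 2 ** (n - k)
    # Equivalently, sum_{i=0}^{t} C(n, i) = 2^(n - k) with t = 1.
    assert sum(comb(n, i) for i in range(2)) == 2 ** (n - k)
```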

28 Maximum likelihood decoding
The likelihood function of the qth bit over the BSC is
f(r_q | c_q) = 1 - p if r_q = c_q, and p if r_q ≠ c_q, for q = 1, 2, ..., n.
For independent errors, the likelihood function of c becomes
f(r | c) = Π_{q=1}^{n} f(r_q | c_q) = p^{d_H(r,c)} (1 - p)^{n - d_H(r,c)}.
In terms of the Hamming distance between r and c, the log-likelihood function is
log f(r | c) = d_H(r, c) log p + (n - d_H(r, c)) log(1 - p),
or
log f(r | c) = d_H(r, c) log(p / (1 - p)) + const,
where p / (1 - p) < 1.
28 / 51

29 ML decoding finds the codeword of minimum distance from r:
ĉ = arg max_{c ∈ C} f(r | c)
  = arg max_{c ∈ C} log f(r | c)
  = arg max_{c ∈ C} d_H(r, c) log(p / (1 - p))
  = arg min_{c ∈ C} d_H(r, c).
An exhaustive search requires a complexity of 2^k.
29 / 51
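The exhaustive minimum-distance search can be sketched directly; the codebook comes from a hypothetical (6, 3) code with d_min = 3:

```python
from itertools import product

def hamming(a, b):
    """Number of positions in which a and b differ."""
    return sum(x != y for x, y in zip(a, b))

def ml_decode(r, codebook):
    """Over a BSC with p < 1/2, ML decoding = nearest-codeword decoding,
    here by exhaustive search over all 2^k codewords."""
    return min(codebook, key=lambda c: hamming(r, c))

# Hypothetical (6, 3) code with d_min = 3.
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]
codebook = [tuple(sum(m[i] * G[i][j] for i in range(3)) % 2 for j in range(6))
            for m in product([0, 1], repeat=3)]

r = (1, 0, 1, 0, 0, 1)   # codeword (1,0,1,0,1,1) with bit 4 flipped
assert ml_decode(r, codebook) == (1, 0, 1, 0, 1, 1)
```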

30 Error probability of ML decoding
Error correction is possible up to t bits; decoding errors happen if there are more than t errors. The probability of a particular j-bit error pattern over a BSC with crossover probability p is p^j (1 - p)^{n-j}, and the number of error patterns with j bit errors is C(n, j). Hence, the probability of (any) j-bit error is
P_j = C(n, j) p^j (1 - p)^{n-j},
and the decoding error probability is
P_d = Σ_{j=t+1}^{n} P_j = Σ_{j=t+1}^{n} C(n, j) p^j (1 - p)^{n-j}.
30 / 51
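P_d can be evaluated numerically; as an illustrative case (not from the slides), take the (7, 4) Hamming code with t = 1 on a BSC with p = 0.01, where P_d is just the probability of 2 or more bit errors:

```python
from math import comb

def p_decoding_error(n, t, p):
    """P_d = sum_{j=t+1}^{n} C(n, j) p^j (1 - p)^(n - j)."""
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j)
               for j in range(t + 1, n + 1))

p, n = 0.01, 7
pd = p_decoding_error(n, 1, p)
# Cross-check: P_d = 1 - P(0 errors) - P(1 error).
assert abs(pd - (1 - (1 - p) ** n - n * p * (1 - p) ** (n - 1))) < 1e-12
assert 0.002 < pd < 0.0021
```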

31 Information theory (revisited): noisy channel theorem
The channel capacity of the BSC is C = 1 - h_b(p). We want to know whether we can actually achieve this capacity using a block code.
Fact: Σ_{k=0}^{⌊np⌋} C(n, k) < 2^{n h_b(p)},
where h_b(p) = -p log_2 p - (1 - p) log_2(1 - p), n is sufficiently large, and p < 1/2.
Let M be the number of length-n codewords and R the rate. We have R = k/n, so M = 2^k = 2^{nR}. 31 / 51

32 In order to have a good performance, the M decoding spheres of radius np around the codewords should fit within the space of 2^n received vectors:
M Σ_{k=0}^{⌊np⌋} C(n, k) ≤ 2^n,
where p is the bit error probability. Since M = 2^{nR}, we need Σ_{k=0}^{⌊np⌋} C(n, k) < 2^{n(1-R)}. Using the Fact, it suffices that
2^{n h_b(p)} ≤ 2^{n(1-R)}, i.e., R < 1 - h_b(p) = C.
32 / 51

33 McEliece Cryptosystem
Using the notion of channel coding, a cryptosystem can be built. The approach was proposed by McEliece in 1978 (an asymmetric cipher). It does not provide perfect secrecy, but it is computationally secure. Bob chooses an (n, k) linear code with generator matrix G, a k × k invertible matrix S, and an n × n permutation matrix P. Define G_1 = SGP, which is the public key (available to Alice as well as Eve). 33 / 51

34 McEliece Cryptosystem
The code with G is powerful and can correct up to t errors. Alice does not know S and P, only the public key G_1.
At Alice: Alice encrypts the plaintext x to find the ciphertext y = xG_1 + e, where e is a 1 × n random binary vector of weight t. 34 / 51

35 McEliece Cryptosystem
At Bob: compute yP^{-1} = (xSGP + e)P^{-1} = xSG + eP^{-1}. Since P is a permutation matrix, its inverse is also a permutation matrix, and eP^{-1} is a 1 × n vector of weight t. Let z = xS. Perform decoding to estimate z: Decoding(yP^{-1}) = z, then x = zS^{-1} (= xSS^{-1}). Note that the decoding is successful because the code can correct up to t errors. 35 / 51
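A toy end-to-end sketch of these steps, with hypothetical parameters: a (6, 3) single-error-correcting code, a self-inverse S, and a small permutation P. A real system uses a much larger code (e.g., a Goppa code), and the nearest-codeword search below stands in for an efficient decoder:

```python
from itertools import product

def vecmat2(v, M):
    """Row vector times matrix over GF(2)."""
    return [sum(vi * mij for vi, mij in zip(v, col)) % 2 for col in zip(*M)]

def matmul2(A, B):
    return [vecmat2(row, B) for row in A]

# Toy private key: (6, 3) code correcting t = 1 error, invertible S, permutation P.
G = [[1, 0, 0, 1, 1, 0], [0, 1, 0, 0, 1, 1], [0, 0, 1, 1, 0, 1]]
S = [[1, 1, 0], [0, 1, 0], [0, 0, 1]]          # S * S = I, so S_inv = S
S_inv = S
perm = [2, 0, 5, 1, 3, 4]                      # position i -> perm[i]
P = [[1 if j == perm[i] else 0 for j in range(6)] for i in range(6)]
P_inv = [list(col) for col in zip(*P)]         # for a permutation, P^-1 = P^T

G1 = matmul2(matmul2(S, G), P)                 # public key G_1 = S G P

# Alice: y = x G_1 + e with wt(e) = t = 1.
x = [1, 0, 1]
e = [0, 0, 0, 1, 0, 0]
y = [(a + b) % 2 for a, b in zip(vecmat2(x, G1), e)]

# Bob: undo P, decode (nearest codeword works since wt(e P^-1) = 1), undo S.
y2 = vecmat2(y, P_inv)
codebook = {tuple(vecmat2(list(m), G)): list(m) for m in product([0, 1], repeat=3)}
z = min(codebook, key=lambda c: sum(a != b for a, b in zip(c, y2)))
x_hat = vecmat2(codebook[z], S_inv)
assert x_hat == x
```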

36 McEliece Cryptosystem
At Eve: y and G_1 are given. For each possible ê of weight t, Eve attempts to solve y = xG_1 + ê. There are C(n, t) such ê of weight t. For example, if n = 1024 and t = 50, there are C(1024, 50) candidates, an astronomically large number. Eve's attack with known ciphertext y is computationally infeasible. 36 / 51
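The size of Eve's search space for the quoted parameters can be computed exactly:

```python
from math import comb, log2

# Number of weight-t error vectors Eve must consider for n = 1024, t = 50.
trials = comb(1024, 50)
assert 280 < log2(trials) < 290   # far beyond any exhaustive search
```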

37 Coding Approach: Noiseless Main Channel
We consider a simple example under the following setting: X (Alice) → Y (Bob) → Z (Eve), where X = Y (noise-free main channel) and the channel from Bob (or Alice) to Eve is a binary erasure channel (BEC) with erasure probability ɛ. 37 / 51

38 Coding Approach: Noiseless Main Channel
Message M ∈ {0, 1}. Random coding: M → X^n uniformly from B_0 if M = 0, and from B_1 if M = 1, where
B_m = {b : ⊕_{i=1}^{n} b_i = b_1 ⊕ ... ⊕ b_n = m}, m ∈ {0, 1}.
Let Z^n denote the received signal at Eve. Define E = 0 if Z^n has no erasures, and E = 1 otherwise. 38 / 51

39 Coding Approach: Noiseless Main Channel
We have
H(M | Z^n) = H(M | Z^n, E)
= H(M | Z^n, E = 0) Pr(E = 0) + H(M | Z^n, E = 1) Pr(E = 1)
= H(M | Z^n, E = 1)(1 - (1 - ɛ)^n)
= H(M)(1 - (1 - ɛ)^n)
= H(M) - H(M)(1 - ɛ)^n.
Thus, lim_{n→∞} H(M | Z^n) = H(M), which implies that random coding achieves perfect secrecy over Eve's BEC. In general, random coding is a key element of secure communications that exploit Eve's degraded channel. 39 / 51
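The step H(M | Z^n, E = 1) = H(M) can be made concrete by counting: for any observation with at least one erasure, exactly as many words of parity 0 as of parity 1 are consistent with Eve's view. The observation below is a hypothetical example:

```python
from itertools import product

def consistent(z, m):
    """Count length-n words with overall (XOR) parity m that agree with
    Eve's observation z, where None marks an erased position."""
    return sum(1 for x in product([0, 1], repeat=len(z))
               if sum(x) % 2 == m
               and all(zi is None or zi == xi for zi, xi in zip(z, x)))

# Hypothetical Eve observation with two erasures: both messages remain
# equally likely, so H(M | Z^n, E = 1) = H(M).
z = [1, 0, None, 1, None, 0]
assert consistent(z, 0) == consistent(z, 1) == 2

# With no erasures, the parity (the message) is fully revealed.
z_full = [1, 0, 1, 1, 0, 0]
assert consistent(z_full, 1) == 1
assert consistent(z_full, 0) == 0
```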

40 Coding Approach: Noiseless Main Channel
Suppose that Eve has a degraded channel and can only distinguish bigger blocks, while Bob can see details and distinguish all blocks. Alice randomly chooses any red block among 16 red blocks to send a (secret) message 00 (red for 00, blue for 01, yellow for 11, and green for 10), while Eve can only see a triangle. (Figure: "Wiretap Codes: Confusion at the eavesdropper," from Ashish Khisti, Information Theoretic Security: Fundamentals and Applications, University of Toronto.) A general approach can be obtained for the BSC using coset codes. 40 / 51

41 Coding Approach: Noiseless Main Channel
Coset Codes for Secure Transmission over BSCs
Let C_M = 1 (noiseless main channel) and C_W < 1 (over Eve's BSC). Let H be a parity-check matrix of size K × N, where K < N.
(Encoding) For a given message s, we choose x randomly from C(s) = {x : Hx = s}. Since K < N, there is more than one x in C(s): |C(s)| = 2^N / 2^K = 2^(N-K). For convenience, let S^K = s and X^N = x. Thus, for given S^K, we have H(X^N | S^K) = N - K. 41 / 51
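The coset structure and the count |C(s)| = 2^(N-K) can be checked by brute force; the parity-check matrix below is a hypothetical full-rank K = 3, N = 6 example:

```python
from itertools import product

# Hypothetical K x N parity-check matrix (K = 3, N = 6, full rank).
H = [[1, 0, 1, 1, 0, 0],
     [1, 1, 0, 0, 1, 0],
     [0, 1, 1, 0, 0, 1]]

def coset(s, H, n=6):
    """C(s) = {x : H x = s} over GF(2)."""
    return [x for x in product([0, 1], repeat=n)
            if all(sum(h * xi for h, xi in zip(row, x)) % 2 == si
                   for row, si in zip(H, s))]

# |C(s)| = 2^N / 2^K = 2^(N-K) for every syndrome s; Alice transmits
# a randomly chosen member of C(s).
for s in product([0, 1], repeat=3):
    assert len(coset(s, H)) == 2 ** (6 - 3)
```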

42 Coding Approach: Noiseless Main Channel
Consider Eve's capacity: S^K → X^N (Alice) = X^N (Bob) → Z^N (Eve), with
C_W = max_{P(X)} I(X^N; Z^N) / N, so that I(X^N; Z^N) ≤ N C_W.
By the chain rule,
I(S^K, X^N; Z^N) = I(S^K; Z^N) + I(X^N; Z^N | S^K) = I(X^N; Z^N) + I(S^K; Z^N | X^N),
where I(S^K; Z^N | X^N) = 0 because S^K → X^N → Z^N forms a Markov chain.
42 / 51

43 Coding Approach: Noiseless Main Channel
I(S^K; Z^N) = I(X^N; Z^N) - I(X^N; Z^N | S^K)
≤ N C_W - (H(X^N | S^K) - H(X^N | Z^N, S^K))
= N C_W - H(X^N | S^K) + H(X^N | Z^N, S^K)
= N C_W - (N - K) + ɛ_N
= K - N(1 - C_W) + ɛ_N.
If we set K = N(1 - C_W), i.e., R = K/N = 1 - C_W, then I(S^K; Z^N) → 0 as N → ∞.
43 / 51

44 Coding Approach: Noiseless Main Channel
Fano's inequality: H(X^N | Z^N, S^K) = ɛ_N → 0 as N → ∞.
For given S^K, there are 2^(N-K) possible X^N in C(S^K); the code rate is r = (N - K)/N = 1 - R = C_W. Since Eve's channel capacity is C_W, Eve can decode X^N from Z^N when S^K is available, with a low error probability p_N. By Fano's inequality,
H(X^N | Z^N, S^K) ≤ h_b(p_N) + p_N log_2(|C(S^K)| - 1) → 0 as N → ∞.
44 / 51

45 Coding Approach: Noisy Main Channel
Consider BSCs for both Bob and Eve: S → X^N → Y^N (Bob) → Z^N (Eve). Each channel has independent bit errors. We now assume that C_M = I(X; Y) < 1 (i.e., a noisy main channel), and that C_M > C_W. Due to the noisy main channel, we also need a capacity-achieving code so that Bob can decode reliably. 45 / 51

46 Coding Approach: Noisy Main Channel
Two key ingredients:
(1) Random coding (coset codes): s is an M × 1 binary message and H is an M × N parity-check matrix (N > M). Denote by C(s) = {x : Hx = s} the coset associated with s. We have |C(s)| = 2^N / 2^M = 2^(N-M). For a given s, random coding randomly chooses one of the elements of C(s); thus, H(x | s) = N - M.
(2) Capacity-achieving codes: a linear code {x : Hx = 0} whose rate approaches the main-channel capacity C_M, so that Bob can decode reliably.
46 / 51

47 Coding Approach: Noisy Main Channel
Combining the two ingredients over BSCs. Code design:
[H_1; H_2] x = [0; s], where H_1 is R'N × N and H_2 is RN × N.
For given s, Alice randomly chooses x from
C(s) = {x : [H_1; H_2] x = [0; s]}.
Rate: H_2 x = s ∈ {0, 1}^{RN}. That is, the secrecy rate is R, while the total rate is R + R'.
47 / 51

48 Coding Approach: Noisy Main Channel
Decoding at Bob: find x̂ from y satisfying H_1 x = 0, a code of rate 1 - R' < C_M, so that the error probability is negligible. Then ŝ = H_2 x̂ uniquely recovers s.
Uncertainty at Eve: the size of C(s) is
|C(s)| = 2^N / 2^{N(R+R')} = 2^{N(1-(R+R'))}.
Thus, we have the following key result:
H(x | s) = N(1 - (R + R')) = N(1 - R' - R) = N(C_M - δ_N - R),
where R' = 1 - C_M + δ_N (so that Bob's code has rate 1 - R' = C_M - δ_N < C_M).
48 / 51

49 Coding Approach: Noisy Main Channel
Using the key result, we have
I(s; z) = I(x; z) - I(x; z | s)
= I(x; z) - H(x | s) + H(x | s, z)
≤ N C_W - N(C_M - δ_N - R) + ɛ_N (Fano's inequality)
= N(R - (C_M - C_W)) + N δ_N + ɛ_N.
Thus, if R < C_M - C_W, we have I(s; z) → 0.
49 / 51

50 Coding Approach: Noisy Main Channel
Nested codes (Thangaraj et al.): consider a code consisting of 2^{NR} disjoint subcodes,
C = ∪_{i=1}^{2^{NR}} C_i,
where the ith subcode C_i is a capacity-achieving code over the eavesdropper's channel:
(1/N) I(X^N; Z^N | M = i) ≥ C_W - ɛ.
Suppose that all the subcodes are capacity-achieving and i is the index of the secret message M. 50 / 51

51 Coding Approach: Noisy Main Channel
It can be shown that
I(M; Z^N) = I(Z^N; M, X^N) - I(Z^N; X^N | M)
= I(X^N; Z^N) + I(Z^N; M | X^N) - I(Z^N; X^N | M).
Since M → X^N → Z^N, we have I(Z^N; M | X^N) = 0. Thus,
(1/N) I(M; Z^N) = (1/N) I(X^N; Z^N) - (1/N) I(Z^N; X^N | M) ≤ C_W - (C_W - ɛ) = ɛ.
This implies that a nested code can achieve weak secrecy. 51 / 51


More information

Cryptanalysis of the McEliece Public Key Cryptosystem Based on Polar Codes

Cryptanalysis of the McEliece Public Key Cryptosystem Based on Polar Codes Cryptanalysis of the McEliece Public Key Cryptosystem Based on Polar Codes Magali Bardet 1 Julia Chaulet 2 Vlad Dragoi 1 Ayoub Otmani 1 Jean-Pierre Tillich 2 Normandie Univ, France; UR, LITIS, F-76821

More information

Chapter 7. Error Control Coding. 7.1 Historical background. Mikael Olofsson 2005

Chapter 7. Error Control Coding. 7.1 Historical background. Mikael Olofsson 2005 Chapter 7 Error Control Coding Mikael Olofsson 2005 We have seen in Chapters 4 through 6 how digital modulation can be used to control error probabilities. This gives us a digital channel that in each

More information

On Function Computation with Privacy and Secrecy Constraints

On Function Computation with Privacy and Secrecy Constraints 1 On Function Computation with Privacy and Secrecy Constraints Wenwen Tu and Lifeng Lai Abstract In this paper, the problem of function computation with privacy and secrecy constraints is considered. The

More information

CSCI 2570 Introduction to Nanocomputing

CSCI 2570 Introduction to Nanocomputing CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication

More information

MATH/MTHE 406 Homework Assignment 2 due date: October 17, 2016

MATH/MTHE 406 Homework Assignment 2 due date: October 17, 2016 MATH/MTHE 406 Homework Assignment 2 due date: October 17, 2016 Notation: We will use the notations x 1 x 2 x n and also (x 1, x 2,, x n ) to denote a vector x F n where F is a finite field. 1. [20=6+5+9]

More information

Coset Decomposition Method for Decoding Linear Codes

Coset Decomposition Method for Decoding Linear Codes International Journal of Algebra, Vol. 5, 2011, no. 28, 1395-1404 Coset Decomposition Method for Decoding Linear Codes Mohamed Sayed Faculty of Computer Studies Arab Open University P.O. Box: 830 Ardeya

More information

Lecture 1: Perfect Secrecy and Statistical Authentication. 2 Introduction - Historical vs Modern Cryptography

Lecture 1: Perfect Secrecy and Statistical Authentication. 2 Introduction - Historical vs Modern Cryptography CS 7880 Graduate Cryptography September 10, 2015 Lecture 1: Perfect Secrecy and Statistical Authentication Lecturer: Daniel Wichs Scribe: Matthew Dippel 1 Topic Covered Definition of perfect secrecy One-time

More information

APPLICATIONS. Quantum Communications

APPLICATIONS. Quantum Communications SOFT PROCESSING TECHNIQUES FOR QUANTUM KEY DISTRIBUTION APPLICATIONS Marina Mondin January 27, 2012 Quantum Communications In the past decades, the key to improving computer performance has been the reduction

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Code Based Cryptology at TU/e

Code Based Cryptology at TU/e Code Based Cryptology at TU/e Ruud Pellikaan g.r.pellikaan@tue.nl University Indonesia, Depok, Nov. 2 University Padjadjaran, Bandung, Nov. 6 Institute Technology Bandung, Bandung, Nov. 6 University Gadjah

More information

PERFECT SECRECY AND ADVERSARIAL INDISTINGUISHABILITY

PERFECT SECRECY AND ADVERSARIAL INDISTINGUISHABILITY PERFECT SECRECY AND ADVERSARIAL INDISTINGUISHABILITY BURTON ROSENBERG UNIVERSITY OF MIAMI Contents 1. Perfect Secrecy 1 1.1. A Perfectly Secret Cipher 2 1.2. Odds Ratio and Bias 3 1.3. Conditions for Perfect

More information

Outline. Computer Science 418. Number of Keys in the Sum. More on Perfect Secrecy, One-Time Pad, Entropy. Mike Jacobson. Week 3

Outline. Computer Science 418. Number of Keys in the Sum. More on Perfect Secrecy, One-Time Pad, Entropy. Mike Jacobson. Week 3 Outline Computer Science 48 More on Perfect Secrecy, One-Time Pad, Mike Jacobson Department of Computer Science University of Calgary Week 3 2 3 Mike Jacobson (University of Calgary) Computer Science 48

More information

Linear Codes and Syndrome Decoding

Linear Codes and Syndrome Decoding Linear Codes and Syndrome Decoding These notes are intended to be used as supplementary reading to Sections 6.7 9 of Grimaldi s Discrete and Combinatorial Mathematics. The proofs of the theorems are left

More information

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem

LECTURE 15. Last time: Feedback channel: setting up the problem. Lecture outline. Joint source and channel coding theorem LECTURE 15 Last time: Feedback channel: setting up the problem Perfect feedback Feedback capacity Data compression Lecture outline Joint source and channel coding theorem Converse Robustness Brain teaser

More information

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions

EE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where

More information

MATH 291T CODING THEORY

MATH 291T CODING THEORY California State University, Fresno MATH 291T CODING THEORY Spring 2009 Instructor : Stefaan Delcroix Chapter 1 Introduction to Error-Correcting Codes It happens quite often that a message becomes corrupt

More information

PERFECTLY secure key agreement has been studied recently

PERFECTLY secure key agreement has been studied recently IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 2, MARCH 1999 499 Unconditionally Secure Key Agreement the Intrinsic Conditional Information Ueli M. Maurer, Senior Member, IEEE, Stefan Wolf Abstract

More information

1 Classical Information Theory and Classical Error Correction

1 Classical Information Theory and Classical Error Correction 1 Classical Information Theory and Classical Error Correction Markus Grassl 1.1 Introduction Information theory establishes a framework for any kind of communication and information processing. It allows

More information

Lecture 4: Proof of Shannon s theorem and an explicit code

Lecture 4: Proof of Shannon s theorem and an explicit code CSE 533: Error-Correcting Codes (Autumn 006 Lecture 4: Proof of Shannon s theorem and an explicit code October 11, 006 Lecturer: Venkatesan Guruswami Scribe: Atri Rudra 1 Overview Last lecture we stated

More information

Lecture Notes, Week 6

Lecture Notes, Week 6 YALE UNIVERSITY DEPARTMENT OF COMPUTER SCIENCE CPSC 467b: Cryptography and Computer Security Week 6 (rev. 3) Professor M. J. Fischer February 15 & 17, 2005 1 RSA Security Lecture Notes, Week 6 Several

More information

Shannon s noisy-channel theorem

Shannon s noisy-channel theorem Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for

More information

Low-Density Parity-Check Codes

Low-Density Parity-Check Codes Department of Computer Sciences Applied Algorithms Lab. July 24, 2011 Outline 1 Introduction 2 Algorithms for LDPC 3 Properties 4 Iterative Learning in Crowds 5 Algorithm 6 Results 7 Conclusion PART I

More information

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University

Chapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission

More information

National University of Singapore Department of Electrical & Computer Engineering. Examination for

National University of Singapore Department of Electrical & Computer Engineering. Examination for National University of Singapore Department of Electrical & Computer Engineering Examination for EE5139R Information Theory for Communication Systems (Semester I, 2014/15) November/December 2014 Time Allowed:

More information

MATH 291T CODING THEORY

MATH 291T CODING THEORY California State University, Fresno MATH 291T CODING THEORY Fall 2011 Instructor : Stefaan Delcroix Contents 1 Introduction to Error-Correcting Codes 3 2 Basic Concepts and Properties 6 2.1 Definitions....................................

More information

Algebraic Decoding of Rank Metric Codes

Algebraic Decoding of Rank Metric Codes Algebraic Decoding of Rank Metric Codes Françoise Levy-dit-Vehel ENSTA Paris, France levy@ensta.fr joint work with Ludovic Perret (UCL Louvain) Special Semester on Gröbner Bases - Workshop D1 Outline The

More information

Can You Hear Me Now?

Can You Hear Me Now? Can You Hear Me Now? An Introduction to Coding Theory William J. Turner Department of Mathematics & Computer Science Wabash College Crawfordsville, IN 47933 19 October 2004 W. J. Turner (Wabash College)

More information

Computer Science A Cryptography and Data Security. Claude Crépeau

Computer Science A Cryptography and Data Security. Claude Crépeau Computer Science 308-547A Cryptography and Data Security Claude Crépeau These notes are, largely, transcriptions by Anton Stiglic of class notes from the former course Cryptography and Data Security (308-647A)

More information

9 THEORY OF CODES. 9.0 Introduction. 9.1 Noise

9 THEORY OF CODES. 9.0 Introduction. 9.1 Noise 9 THEORY OF CODES Chapter 9 Theory of Codes After studying this chapter you should understand what is meant by noise, error detection and correction; be able to find and use the Hamming distance for a

More information

Division Property: a New Attack Against Block Ciphers

Division Property: a New Attack Against Block Ciphers Division Property: a New Attack Against Block Ciphers Christina Boura (joint on-going work with Anne Canteaut) Séminaire du groupe Algèbre et Géometrie, LMV November 24, 2015 1 / 50 Symmetric-key encryption

More information

Maximum Likelihood Decoding of Codes on the Asymmetric Z-channel

Maximum Likelihood Decoding of Codes on the Asymmetric Z-channel Maximum Likelihood Decoding of Codes on the Asymmetric Z-channel Pål Ellingsen paale@ii.uib.no Susanna Spinsante s.spinsante@univpm.it Angela Barbero angbar@wmatem.eis.uva.es May 31, 2005 Øyvind Ytrehus

More information

MATH 433 Applied Algebra Lecture 22: Review for Exam 2.

MATH 433 Applied Algebra Lecture 22: Review for Exam 2. MATH 433 Applied Algebra Lecture 22: Review for Exam 2. Topics for Exam 2 Permutations Cycles, transpositions Cycle decomposition of a permutation Order of a permutation Sign of a permutation Symmetric

More information

Information-Theoretic Security: an overview

Information-Theoretic Security: an overview Information-Theoretic Security: an overview Rui A Costa 1 Relatório para a disciplina de Seminário, do Mestrado em Informática da Faculdade de Ciências da Universidade do Porto, sob a orientação do Prof

More information

Secret Key and Private Key Constructions for Simple Multiterminal Source Models

Secret Key and Private Key Constructions for Simple Multiterminal Source Models Secret Key and Private Key Constructions for Simple Multiterminal Source Models arxiv:cs/05050v [csit] 3 Nov 005 Chunxuan Ye Department of Electrical and Computer Engineering and Institute for Systems

More information

An Introduction. Dr Nick Papanikolaou. Seminar on The Future of Cryptography The British Computer Society 17 September 2009

An Introduction. Dr Nick Papanikolaou. Seminar on The Future of Cryptography The British Computer Society 17 September 2009 An Dr Nick Papanikolaou Research Fellow, e-security Group International Digital Laboratory University of Warwick http://go.warwick.ac.uk/nikos Seminar on The Future of Cryptography The British Computer

More information

Error Correcting Codes: Combinatorics, Algorithms and Applications Spring Homework Due Monday March 23, 2009 in class

Error Correcting Codes: Combinatorics, Algorithms and Applications Spring Homework Due Monday March 23, 2009 in class Error Correcting Codes: Combinatorics, Algorithms and Applications Spring 2009 Homework Due Monday March 23, 2009 in class You can collaborate in groups of up to 3. However, the write-ups must be done

More information

L7. Diffie-Hellman (Key Exchange) Protocol. Rocky K. C. Chang, 5 March 2015

L7. Diffie-Hellman (Key Exchange) Protocol. Rocky K. C. Chang, 5 March 2015 L7. Diffie-Hellman (Key Exchange) Protocol Rocky K. C. Chang, 5 March 2015 1 Outline The basic foundation: multiplicative group modulo prime The basic Diffie-Hellman (DH) protocol The discrete logarithm

More information

LDPC Codes. Slides originally from I. Land p.1

LDPC Codes. Slides originally from I. Land p.1 Slides originally from I. Land p.1 LDPC Codes Definition of LDPC Codes Factor Graphs to use in decoding Decoding for binary erasure channels EXIT charts Soft-Output Decoding Turbo principle applied to

More information

The extended coset leader weight enumerator

The extended coset leader weight enumerator The extended coset leader weight enumerator Relinde Jurrius Ruud Pellikaan Eindhoven University of Technology, The Netherlands Symposium on Information Theory in the Benelux, 2009 1/14 Outline Codes, weights

More information

ECEN 655: Advanced Channel Coding

ECEN 655: Advanced Channel Coding ECEN 655: Advanced Channel Coding Course Introduction Henry D. Pfister Department of Electrical and Computer Engineering Texas A&M University ECEN 655: Advanced Channel Coding 1 / 19 Outline 1 History

More information

RSA RSA public key cryptosystem

RSA RSA public key cryptosystem RSA 1 RSA As we have seen, the security of most cipher systems rests on the users keeping secret a special key, for anyone possessing the key can encrypt and/or decrypt the messages sent between them.

More information

Cyclic Redundancy Check Codes

Cyclic Redundancy Check Codes Cyclic Redundancy Check Codes Lectures No. 17 and 18 Dr. Aoife Moloney School of Electronics and Communications Dublin Institute of Technology Overview These lectures will look at the following: Cyclic

More information

1.6: Solutions 17. Solution to exercise 1.6 (p.13).

1.6: Solutions 17. Solution to exercise 1.6 (p.13). 1.6: Solutions 17 A slightly more careful answer (short of explicit computation) goes as follows. Taking the approximation for ( N K) to the next order, we find: ( N N/2 ) 2 N 1 2πN/4. (1.40) This approximation

More information

channel of communication noise Each codeword has length 2, and all digits are either 0 or 1. Such codes are called Binary Codes.

channel of communication noise Each codeword has length 2, and all digits are either 0 or 1. Such codes are called Binary Codes. 5 Binary Codes You have already seen how check digits for bar codes (in Unit 3) and ISBN numbers (Unit 4) are used to detect errors. Here you will look at codes relevant for data transmission, for example,

More information

Roll No. :... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/ CODING & INFORMATION THEORY. Time Allotted : 3 Hours Full Marks : 70

Roll No. :... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/ CODING & INFORMATION THEORY. Time Allotted : 3 Hours Full Marks : 70 Name : Roll No. :.... Invigilator's Signature :.. CS/B.TECH(ECE)/SEM-7/EC-703/2011-12 2011 CODING & INFORMATION THEORY Time Allotted : 3 Hours Full Marks : 70 The figures in the margin indicate full marks

More information

MULTITERMINAL SECRECY AND TREE PACKING. With Imre Csiszár, Sirin Nitinawarat, Chunxuan Ye, Alexander Barg and Alex Reznik

MULTITERMINAL SECRECY AND TREE PACKING. With Imre Csiszár, Sirin Nitinawarat, Chunxuan Ye, Alexander Barg and Alex Reznik MULTITERMINAL SECRECY AND TREE PACKING With Imre Csiszár, Sirin Nitinawarat, Chunxuan Ye, Alexander Barg and Alex Reznik Information Theoretic Security A complementary approach to computational security

More information

UTA EE5362 PhD Diagnosis Exam (Spring 2011)

UTA EE5362 PhD Diagnosis Exam (Spring 2011) EE5362 Spring 2 PhD Diagnosis Exam ID: UTA EE5362 PhD Diagnosis Exam (Spring 2) Instructions: Verify that your exam contains pages (including the cover shee. Some space is provided for you to show your

More information

SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land

SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land SIPCom8-1: Information Theory and Coding Linear Binary Codes Ingmar Land Ingmar Land, SIPCom8-1: Information Theory and Coding (2005 Spring) p.1 Overview Basic Concepts of Channel Coding Block Codes I:

More information

exercise in the previous class (1)

exercise in the previous class (1) exercise in the previous class () Consider an odd parity check code C whose codewords are (x,, x k, p) with p = x + +x k +. Is C a linear code? No. x =, x 2 =x =...=x k = p =, and... is a codeword x 2

More information

Side-channel analysis in code-based cryptography

Side-channel analysis in code-based cryptography 1 Side-channel analysis in code-based cryptography Tania RICHMOND IMATH Laboratory University of Toulon SoSySec Seminar Rennes, April 5, 2017 Outline McEliece cryptosystem Timing Attack Power consumption

More information