
Lecture 6. SG based on noisy channels [4]. Detection of SG with exactly known CO.

The peculiarities of the model:
- the attacker is allowed to use only a noisy version of the stegosignal C_w(n),
- the CO can be known exactly to the attacker.

Practical applications:
- transmission of SG over satellite, mobile or optical fiber channels,
- interception of SG by an attacker over side channels,
- simulation of noisy channels,
- slightly noised musical files placed on some sites on the Internet.

The main idea in the design of SG based on noisy channels: camouflage the embedded message by the noise of the channel.

The attacker's problem in this setting: to distinguish whether there is channel noise only, or both channel noise and SG.

Simplification in the design of SG based on noisy channels in comparison with the conventional case: a description of the channel noise distribution is simpler than a description of the CO distribution.

We consider two types of channels:
1. Binary symmetric channel without memory (BSC).
2. Continuous channel with additive white Gaussian noise.

Condition: we will assume that the legal and illegal channels have the same noise distributions.

1. SG based on BSC.

Embedding:
C_w(n) = C(n) ⊕ b·π(n), n = 1,2,…,N,  (1)
where C(n) ∈ {0,1}; π(n) ∈ {0,1} i.i.d. with P(π(n)=1) = P₀, P(π(n)=0) = 1 − P₀; b ∈ {0,1} is the embedded bit, and "⊕" denotes modulo-two addition.

After passing C_w(n) over the BSC we get:
C′_w(n) = C_w(n) ⊕ ε(n), n = 1,2,…,N,  (2)
where ε(n) ∈ {0,1} i.i.d., P(ε(n)=1) = P, P(ε(n)=0) = 1 − P, P < 1/2.

Attack in order to detect SG:
1. C̃(n) = C′_w(n) ⊕ C(n) (C(n) is known exactly by the attacker).  (3)
2. Two-hypothesis testing:
H₀: C̃(n), n = 1,2,…,N is a Bernoulli sequence with parameter P,
H₁: C̃(n), n = 1,2,…,N is a Bernoulli sequence with parameter P₁ = P(1 − P₀) + P₀(1 − P).
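
To make the model concrete, here is a minimal NumPy sketch (function names and parameter values are illustrative, not from the lecture): it embeds by (1), passes the result through a BSC as in (2), and forms the attacker's difference sequence (3), whose empirical bias is P under H₀ and P₁ under H₁.

```python
import numpy as np

rng = np.random.default_rng(1)

def embed_bsc(C, b, P0):
    """Eq. (1): XOR the cover with b*pi(n), where pi(n) ~ Bernoulli(P0)."""
    pi = (rng.random(C.size) < P0).astype(np.uint8)
    return C ^ (b * pi)

def bsc(x, P):
    """Eq. (2): memoryless binary symmetric channel with crossover P."""
    eps = (rng.random(x.size) < P).astype(np.uint8)
    return x ^ eps

N, P0, P = 100_000, 0.3, 0.1
C = rng.integers(0, 2, N, dtype=np.uint8)    # cover, known to the attacker
C_tilde = bsc(embed_bsc(C, 1, P0), P) ^ C    # difference sequence (3)

P1 = P * (1 - P0) + P0 * (1 - P)             # Bernoulli parameter under H1
print(C_tilde.mean(), "vs", P1)
```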

Evident testing method:
decide H₀ if t < τ, decide H₁ if t ≥ τ,  (4)
where t = H_w(C̃(n), n = 1,2,…,N), H_w(X) is the Hamming weight of X, and τ is some threshold.

The performance of SG detection is determined by the probabilities P_fa and P_m:
P_m = Σ_{i=0}^{τ−1} C(N,i) P₁^i (1 − P₁)^{N−i},  P_fa = Σ_{i=τ}^{N} C(N,i) P^i (1 − P)^{N−i},  (5)
where C(N,i) = N!/(i!(N−i)!).

But the exact calculation by (5) is hard for N > 10³. Thus we will use the criterion of relative entropy (see Lecture 5).
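
For moderate N the sums in (5) can be evaluated directly; a short sketch with assumed parameter values:

```python
from math import comb

def detection_probs(N, tau, P, P1):
    """Eq. (5): exact miss and false-alarm probabilities of the
    Hamming-weight test (4)."""
    Pm = sum(comb(N, i) * P1**i * (1 - P1)**(N - i) for i in range(tau))
    Pfa = sum(comb(N, i) * P**i * (1 - P)**(N - i) for i in range(tau, N + 1))
    return Pm, Pfa

# Beyond N of a few thousand these sums get unwieldy, which is why the
# lecture switches to the relative-entropy criterion below.
print(detection_probs(N=1000, tau=120, P=0.1, P1=0.12))
```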

D = D(P_{H₀} ‖ P_{H₁}) = Σ_{x∈X} P_{H₀}(x) log (P_{H₀}(x)/P_{H₁}(x)).
For the BSC model, where X = {0,1}, we easily get
D = N [ P log(P/P₁) + (1 − P) log((1 − P)/(1 − P₁)) ].  (6)

Then the bound for P_fa and P_m (see Lecture 5) is
P_fa log (P_fa/(1 − P_m)) + (1 − P_fa) log ((1 − P_fa)/P_m) ≤ D,  (7)
where D is calculated by (6).

Next we will find the probability of error P_e under extraction of the embedded bit by the informed legal decoder.
Decoding scheme:
Λ = Σ_{n=1}^{N₀} (C′_w(n) ⊕ C(n)) π(n),  b̂ = 1 if Λ ≥ λ₀, b̂ = 0 if Λ < λ₀,  (8)
where λ₀ is some threshold and N₀ is the number of samples which were used to embed one information bit.
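
A numerical illustration of (6) and (7) (a sketch; logarithms are base 2 as in Lecture 5, and the parameter values are assumed): given the security level D, the bound (7) yields the smallest miss probability any detector can achieve at a fixed false-alarm rate.

```python
import numpy as np

def kl_bsc(N, P, P1):
    """Eq. (6): relative entropy between H0 and H1 for N BSC samples."""
    return N * (P * np.log2(P / P1) + (1 - P) * np.log2((1 - P) / (1 - P1)))

def lhs7(Pfa, Pm):
    """Left-hand side of the bound (7)."""
    return Pfa * np.log2(Pfa / (1 - Pm)) + (1 - Pfa) * np.log2((1 - Pfa) / Pm)

D = kl_bsc(N=10_000, P=0.1, P1=0.101)
Pfa = 0.05
grid = np.linspace(1e-6, 1 - 1e-6, 200_000)
Pm_min = grid[lhs7(Pfa, grid) <= D].min()   # smallest Pm compatible with (7)
print(f"D = {D:.4f} bits; with Pfa = {Pfa}, any detector has Pm >= {Pm_min:.3f}")
```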

Substituting (1) and (2) into (8), we get Λ = Λ_b for b = 0 and b = 1, where
Λ₀ = Σ_{n=1}^{N₀} ε(n) π(n),  Λ₁ = Σ_{n=1}^{N₀} (ε(n) ⊕ π(n)) π(n).  (9)

Then
P(0/1) = Σ_s P(0/1, s) P(H_w(π) = s),  (10)
where H_w(π) is the Hamming weight of the sequence π(n), n = 1,…,N₀, and
P(0/1, s) = P(Λ₁ < λ₀ | s) = Σ_{j=s−λ₀+1}^{s} C(s,j) P^j (1 − P)^{s−j},  (11)
P(H_w(π) = s) = C(N₀,s) P₀^s (1 − P₀)^{N₀−s}.  (12)

Substituting (11) and (12) into (10), we get
P(0/1) = Σ_{s=1}^{N₀} C(N₀,s) P₀^s (1 − P₀)^{N₀−s} Σ_{j=s−λ₀+1}^{s} C(s,j) P^j (1 − P)^{s−j}.  (13)
In a similar manner we can obtain that
P(1/0) = Σ_{s=1}^{N₀} C(N₀,s) P₀^s (1 − P₀)^{N₀−s} Σ_{j=λ₀}^{s} C(s,j) P^j (1 − P)^{s−j}.  (14)

Because the sequence π(n) is known at the decoder, it is possible to get P(0/1) = P(1/0) by selecting λ₀ = s/2, which gives
P_e = P(0/1) = P(1/0) = Σ_{s=1}^{N₀} C(N₀,s) P₀^s (1 − P₀)^{N₀−s} Σ_{j=s/2}^{s} C(s,j) P^j (1 − P)^{s−j}.  (15)

We apply the Chernoff bound to the inner sum in (15) in order to simplify it:
Σ_{j=s/2}^{s} C(s,j) P^j (1 − P)^{s−j} ≤ [4P(1 − P)]^{s/2},  (16)
and substituting (16) into (15) we get
P_e ≤ Σ_{s=0}^{N₀} C(N₀,s) P₀^s (1 − P₀)^{N₀−s} [4P(1 − P)]^{s/2}.  (17)

Finally, using the binomial Newton formula in (17), we have
P_e ≤ [1 + P₀(2√(P(1 − P)) − 1)]^{N₀},  (18)
because (a + b)^N = Σ_{K=0}^{N} C(N,K) a^K b^{N−K}.
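
A sketch comparing the exact double sum (15) with the closed-form bound (18) for small N₀ (illustrative values; ties at the threshold s/2 are resolved upward):

```python
from math import comb, sqrt, ceil

def pe_exact(N0, P0, P):
    """Eq. (15) with lambda0 = s/2."""
    total = 0.0
    for s in range(1, N0 + 1):
        inner = sum(comb(s, j) * P**j * (1 - P)**(s - j)
                    for j in range(ceil(s / 2), s + 1))
        total += comb(N0, s) * P0**s * (1 - P0)**(N0 - s) * inner
    return total

def pe_bound(N0, P0, P):
    """Eq. (18): Chernoff / binomial-formula upper bound."""
    return (1 + P0 * (2 * sqrt(P * (1 - P)) - 1)) ** N0

print(pe_exact(200, 0.3, 0.1), "<=", pe_bound(200, 0.3, 0.1))
```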

Optimization problem in the design of SG based on noisy channels: given D (security), P (state of the BSC) and P_e (the desired reliability), it is necessary to find the parameters P₀ and N₀ that provide the maximum number m = N/N₀ of securely embedded bits:

P_e ≤ [1 + P₀(2√(P(1 − P)) − 1)]^{N/m},
D = N [ P log(P/P₁) + (1 − P) log((1 − P)/(1 − P₁)) ],
P₁ = P(1 − P₀) + P₀(1 − P).  (19)

Example. D = 0.1, P = 0.1, P_e = 0.1. Then m = 63 for N = 137391, P₀ = …. If we take the restriction N ≤ 10⁶, then m = 19 for P₀ = …. A brute-force search for such designs is sketched below.
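
The sketch (illustrative, not the lecture's own procedure): for each candidate P₀ it takes the largest N allowed by the security level via (6), the per-bit length N₀ needed for P_e via the bound (18), and keeps the best m.

```python
import numpy as np

def design_sg(D, P, Pe, N_max=None):
    """Maximize m = N / N0 over the embedding probability P0, eq. (19)."""
    best = (0, None, None)
    for P0 in np.logspace(-5, -0.5, 300):
        P1 = P * (1 - P0) + P0 * (1 - P)
        d = P * np.log2(P / P1) + (1 - P) * np.log2((1 - P) / (1 - P1))
        N = int(D / d)                                # largest N keeping (6) <= D
        if N_max is not None:
            N = min(N, N_max)
        q = 1 + P0 * (2 * np.sqrt(P * (1 - P)) - 1)   # base of bound (18)
        N0 = int(np.ceil(np.log(Pe) / np.log(q)))     # samples per embedded bit
        if N // N0 > best[0]:
            best = (N // N0, P0, N)
    return best   # (m, P0, N)

print(design_sg(D=0.1, P=0.1, Pe=0.1))
print(design_sg(D=0.1, P=0.1, Pe=0.1, N_max=10**6))
```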

2. SG based on Gaussian channels.

Embedding:
C_w(n) = C(n) + (−1)^b σ_w π(n), n = 1,2,…,N, π(n) i.i.d. N(0,1),  (21)
where σ_w is the amplitude coefficient.

After passing of the SG C_w(n) over the Gaussian channel we get:
C′_w(n) = C_w(n) + ε(n), n = 1,2,…,N,  (22)
where ε(n) i.i.d. N(0, σ_ε²).

The attack to detect SG consists of two steps:
1. C̃(n) = C′_w(n) − C(n) (C(n) is known exactly to the attacker).
2. Two-hypothesis testing:
H₀: C̃(n), n = 1,2,…,N i.i.d. N(0, σ_ε²),
H₁: C̃(n), n = 1,2,…,N i.i.d. N(0, σ_ε² + σ_w²).  (23)

There exist well known testing methods for this model, but it is more convenient to apply the relative entropy technique for two Gaussian distributions.
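
A NumPy sketch of this setting (illustrative values): the attacker's difference sequence has variance σ_ε² under H₀ and σ_ε² + σ_w² under H₁, which is exactly the pair of hypotheses in (23).

```python
import numpy as np

rng = np.random.default_rng(7)

N, sigma_w, sigma_e = 200_000, 0.1, 1.0
C = 5.0 * rng.standard_normal(N)     # cover object (its statistics do not matter)
pi = rng.standard_normal(N)          # i.i.d. N(0,1) reference pattern

b = 0
Cw = C + (-1) ** b * sigma_w * pi                    # embedding (21)
Cw_prime = Cw + sigma_e * rng.standard_normal(N)     # Gaussian channel (22)

C_tilde = Cw_prime - C               # attacker subtracts the known cover
print(C_tilde.var(), "vs", sigma_e**2 + sigma_w**2)  # variance under H1
```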

D = D(P_{H₀} ‖ P_{H₁}) = ∫_{−∞}^{+∞} P_{H₀}(x) log (P_{H₀}(x)/P_{H₁}(x)) dx,  (24)
where P_{H₀}(x) ~ N(0, σ_ε²) and P_{H₁}(x) ~ N(0, σ_ε² + σ_w²).

After a calculation of the integral in (24) and simple transforms we get
D = 0.72 N [ ln(1 + 1/η) − 1/(1 + η) ],  (25)
where η = σ_ε²/σ_w² (the noise-to-watermark ratio, NWR).

For large NWR, which provides high security of SGs, we have from (25)
D ≈ 0.36 N/η².  (26)
Expressing η from (26) as a function of N and D:
η ≈ 0.6 √(N/D).  (27)

Next we find the probability of error P_e under extraction of the embedded bit by the legal informed decoder.
Decoding scheme:
Λ = Σ_{n=1}^{N} (C′_w(n) − C(n)) π(n),  b̂ = 1 if Λ ≥ 0, b̂ = 0 if Λ < 0.  (28)

Substituting (21) and (22) into (28), we get
Λ = Σ_{n=1}^{N} (ε(n) + (−1)^b σ_w π(n)) π(n).  (29)

The probability of error P_e = P(0/1) = P(1/0) can easily be found from (29), taking into account that Λ is a Gaussian random variable owing to the Central Limit Theorem:
P_e = Q( E{Λ} / √Var{Λ} ),  (30)
where Q(x) = (1/√(2π)) ∫_x^∞ e^{−t²/2} dt.

After a simple calculation of E{Λ} and Var{Λ} we get
P_e = Q( √N σ_w / √(2σ_w² + σ_ε²) ) ≈ Q( √(N/η) ).  (31)
Substituting (27) into the last equality, we obtain
P_e = Q( 1.29 (ND)^{1/4} / √m ).  (32)

Remark. There is a simple bound for the function Q(x) [35]:
Q(x) ≤ ½ exp(−x²/2).  (33)
Substituting (33) into (32), we get
P_e ≤ ½ exp( −0.83 √(ND) / m ).  (34)

Example. Given D = 0.1 and N = 10^…, we get P_e ≈ … (see the sketch below).

Important remark. In a Gaussian noisy channel it is possible to embed any number of bits m given any security level D and any reliability P_e (in contrast to the BSC case!).
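
A quick evaluation of (32) and (34) (a sketch; Q(x) is computed from the standard normal CDF, and the parameter values are assumed). It also illustrates the remark above: for fixed D and m, increasing N drives P_e down without limit.

```python
from math import sqrt, exp
from statistics import NormalDist

Q = lambda x: 1.0 - NormalDist().cdf(x)   # Gaussian tail function from (30)

def pe_gauss(N, D, m):
    """Eq. (32) and its exponential bound (34)."""
    pe = Q(1.29 * (N * D) ** 0.25 / sqrt(m))
    bound = 0.5 * exp(-0.83 * sqrt(N * D) / m)
    return pe, bound

for N in (10**5, 10**6, 10**7):
    print(N, pe_gauss(N, D=0.1, m=100))
```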

General conclusions for SG based on noisy channels.
1. This is the unique case where it is possible to provide secrecy of SG under an attack with an exactly known CO.
2. The choice of channel model (BSC or Gaussian) depends on the conditions: at which place of the telecommunication line it is allowed to embed and extract the additional information.
3. The design of SG does not require knowledge of the CO statistics.
4. In the case of the BSC the number of securely and reliably embedded bits is limited, whereas in the Gaussian case it is unlimited (at the cost of increasing N), but the embedding rate approaches zero as N → ∞.
5. The use of error correcting codes allows a slight increase of the embedding rate, but the rate still approaches zero as N approaches infinity, contrary to conventional communication systems, where due to Shannon's theorem the data rate can be kept fixed whenever it is less than the channel capacity. (See Lecture 15, "Capacity of SG and WM systems".)
6. The larger the number of embedded bits m in the Gaussian case, the smaller the embedding amplitude, which in turn creates problems for practical implementation.

Spread-time stegosystems (SG-ST) [4].

C_w(n) = C(n) + (−1)^b σ_w π(n) with probability P_s,
C_w(n) = C(n) with probability 1 − P_s.  (35)

Fig. 1. SG-ST system with embedding into pseudorandom samples. (Here N_s = 8, N = 38, P_s = 8/38.)

In order to take a decision about the presence or absence of SG under the condition of known C(n), the attacker has to test two hypotheses for the difference sequence δ(n) = C′_w(n) − C(n):
H₀: δ(n), n = 1,2,…,N i.i.d. N(0, σ_ε²),
H₁: δ(n), n = 1,2,…,N i.i.d. with the mixture distribution (1 − P_s) N(0, σ_ε²) + P_s N(0, σ_ε² + σ_w²).  (36)
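
A sketch of the spread-time embedding (35) with illustrative parameters; only a pseudorandom fraction P_s of the samples carries the watermark, and the legal decoder knows which ones.

```python
import numpy as np

rng = np.random.default_rng(3)

def embed_st(C, b, sigma_w, Ps, rng):
    """Eq. (35): each sample independently carries the watermark
    with probability Ps, otherwise it is left untouched."""
    pi = rng.standard_normal(C.size)
    on = rng.random(C.size) < Ps     # pseudorandom embedding positions
    return C + on * (-1) ** b * sigma_w * pi, on

C = rng.standard_normal(10_000)
Cw, on = embed_st(C, b=1, sigma_w=0.3, Ps=0.05, rng=rng)
print(int(on.sum()), "of", C.size, "samples carry the watermark")
```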

Optimal hypothesis testing based on the maximum likelihood ratio is
Λ(δ₁,…,δ_N) ≥ λ ⇒ H₁,  Λ(δ₁,…,δ_N) < λ ⇒ H₀,  (37)
where Λ(δ₁,…,δ_N) = P(δ₁,…,δ_N | H₁) / P(δ₁,…,δ_N | H₀).

Suboptimal testing:
Λ̃ ≥ λ̃ ⇒ H₁,  Λ̃ < λ̃ ⇒ H₀,  where Λ̃ = (1/N) Σ_{n=1}^{N} δ²(n).  (38)
This method is intuitively reasonable.

The efficiency of SG-ST can be estimated (as usual) by two probabilities:
P_m — the probability of SG-ST missing,
P_fa — the probability of SG-ST false detection (false alarm).
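
A simulation sketch of the suboptimal energy test (38); placing the threshold midway between the mean statistics under the two hypotheses is an assumption of this sketch, not the lecture's prescription.

```python
import numpy as np

rng = np.random.default_rng(11)

N, sigma_e, sigma_w, Ps = 100_000, 1.0, 1.0, 0.05

def energy(delta):
    """Test statistic of (38): average energy of the difference sequence."""
    return np.mean(delta**2)

lam = sigma_e**2 + 0.5 * Ps * sigma_w**2    # threshold between the two means

delta_h0 = sigma_e * rng.standard_normal(N)                    # noise only
on = rng.random(N) < Ps
delta_h1 = delta_h0 + on * sigma_w * rng.standard_normal(N)    # noise + SG-ST
print(energy(delta_h0) >= lam, energy(delta_h1) >= lam)        # False, True
```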

It was proved in [4] that if we let (for simplicity) P_m = P_fa = P, then the following lower bound for P holds true:
P ≥ Q( √N P_s / (√2 η²) ),  or equivalently  P ≥ Q( N_s / (√(2N) η²) ),  (39)
where η = σ_ε²/σ_w² and N_s = N P_s (the total number of samples with embedding).

If √N P_s/η² → 0 as N → ∞, then P → 1/2, and hence SG-ST approaches an undetectable SG. The probability of bit error after its extraction by the legal user is still given by relation (31).

Example. N = 10⁷, η = 10, P_s = 0.00456 (N_s = 45600). Then 150 bits can be embedded with a bit error probability of at most 10⁻³ and P ≥ 0.4.

The use of error correcting codes in SG-ST.

The embedding procedure is then performed as follows:
C_w(n_j) = C(n_j) + (−1)^{b_ij} σ_w π(n_j) with probability P_s,
C_w(n_j) = C(n_j) with probability 1 − P_s,
where b_ij denotes the j-th bit of the i-th codeword of length l, j = 1,…,l, and n_j is some integer value.

The decoder takes a decision about the embedded codeword by making
î = argmax_{1≤i≤2^k} Σ_{j=1}^{l} (C′_w(n_j) − C(n_j)) (−1)^{b_ij} π(n_j).  (40)

The probability of block error, based on the well known union bound [43], can be expressed as
P_be ≤ (2^k − 1) Q( √(d/η) ) ≤ exp( −(d/(2η) − k ln 2) ),  (41)
where R = k/l is the code rate and d is the minimum distance of the code.

Example. N = 10⁷, P ≥ 0.4, η = 10, k = 10. Then, using the simplex code (1023, 10, 512), it is possible to embed 440 bits.
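
Evaluating the reconstructed union bound (41) for the simplex code of the example (a sketch; Q(x) is taken from the standard normal CDF):

```python
from math import sqrt
from statistics import NormalDist

Q = lambda x: 1.0 - NormalDist().cdf(x)

def block_error_bound(k, d, eta):
    """Eq. (41): 2^k - 1 pairwise error terms, each bounded by Q(sqrt(d/eta))."""
    return (2**k - 1) * Q(sqrt(d / eta))

# Simplex code (1023, 10, 512) with eta = 10:
print(block_error_bound(k=10, d=512, eta=10))
```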

Optimal detection of SG-ST.

Following eq. (37) and the notation of (36), we get the logarithmic normalized likelihood ratio as
Λ_L(δ₁,…,δ_N) = (1/N) Σ_{n=1}^{N} log[ P_s exp( δ²(n)/(2η σ_ε²) ) + (1 − P_s) ].  (42)

Since N is sufficiently large, we can apply the Central Limit Theorem to the sum in (42). Then, for the choice of the threshold λ which provides P_m = P_fa = P, we get the following upper bound:
P ≤ Q( (μ̃₁ − μ̃₀) / σ̃ ),  (43)
where, for j = 0, 1,
μ̃_j = E{ log[ P_s exp( δ²(n)/(2η σ_ε²) ) + (1 − P_s) ] | H_j },
σ̃² = (1/N) Var{ log[ P_s exp( δ²(n)/(2η σ_ε²) ) + (1 − P_s) ] | H_j },
and the random values δ(n) have the probability distributions given by (36).

Since it is very hard to find the values μ̃₀, μ̃₁ and σ̃ analytically, we estimate them by simulation.

Table 1. Simulated values of μ̃₀, μ̃₁ and σ̃, with the resulting P = Q((μ̃₁ − μ̃₀)/σ̃), for various σ_ε, η and P_s.

Table 2. Simulated values of μ̃₀, μ̃₁ and σ̃, with the resulting P = Q((μ̃₁ − μ̃₀)/σ̃), for further combinations of σ_ε, η and N_s.
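
A Monte Carlo sketch of that estimation (illustrative parameters; taking the variance under H₁ is a convention of this sketch): it draws δ(n) from the two distributions in (36), evaluates the per-sample terms of (42), and plugs the estimates into the bound (43).

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(5)

def llr_term(delta, Ps, eta, sigma_e):
    """Per-sample term of the normalized log-likelihood ratio (42)."""
    return np.log(Ps * np.exp(delta**2 / (2 * eta * sigma_e**2)) + 1 - Ps)

def bound_P(N, Ps, eta, sigma_e=1.0, trials=200):
    """Estimate mu0, mu1 and sigma by simulation; return Q((mu1-mu0)/sigma)."""
    sigma_w = sigma_e / np.sqrt(eta)          # since eta = sigma_e^2 / sigma_w^2
    mu = {}
    for hyp in (0, 1):
        delta = sigma_e * rng.standard_normal((trials, N))
        if hyp == 1:                          # mixture distribution from (36)
            on = rng.random((trials, N)) < Ps
            delta = delta + on * sigma_w * rng.standard_normal((trials, N))
        t = llr_term(delta, Ps, eta, sigma_e)
        mu[hyp] = t.mean()
        sigma = np.sqrt(t.var() / N)          # std of the normalized sum (under H1)
    return 1.0 - NormalDist().cdf((mu[1] - mu[0]) / sigma)

print("P <=", bound_P(N=10_000, Ps=0.01, eta=10))
```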

It can be seen that the use of the optimal decision rule does not break the undetectability of SG-ST; hence it can indeed be considered a secure SG system.

The waveforms of an audio signal after passing over a noisy channel with a signal-to-noise ratio of 10 dB (a), and the waveform of the same signal after embedding by the SG-ST algorithm with NWR = … dB (b). (The arrows show the samples with embedding.)

We can see that the presence of the embedding can be detected neither visually nor by ear. The fact of undetectability of SG-ST by optimal statistical methods was proved before.
