
Lecture 6. SG based on noisy channels [4]. Detection of SG with known CO.

The peculiarities of the model:
- the attacker is allowed to use only a noisy version of the SG C_w(n),
- the CO can be known exactly to the attacker.

Practical applications:
- transmission of SG over satellite, mobile or optical fiber channels,
- interception of SG by an attacker over side channels,
- simulation of noisy channels,
- slightly noised musical files placed on some sites on the Internet.

The main idea of the design of SG based on noisy channels: camouflage the embedded message by the noise of the channel.

The attacker's problem in this setting: to distinguish whether there is channel noise only, or both channel noise and SG.

Simplification in the design of SG based on noisy channels in comparison with the conventional case: the description of the channel-noise distribution is simpler than the description of the CO distribution.

We consider two types of channels:
1. Binary symmetric channel without memory (BSC).
2. Continuous channel with additive white Gaussian noise.

Condition. We will assume that the legal and illegal channels have the same noise distributions.

1. SG based on BSC.

Embedding:

C_w(n) = C(n) ⊕ b(n)π(n), n = 1, 2, …, N,  (1)

where C(n) ∈ {0,1}; π(n) ∈ {0,1}, i.i.d., with P(π(n) = 1) = P₀, P(π(n) = 0) = 1 − P₀; b(n) = b, n = 1, 2, …, N, is the embedded bit; "⊕" is modulo-two addition.

After a passing of C_w(n) over the BSC we get:

C′_w(n) = C_w(n) ⊕ ε(n), n = 1, 2, …, N,  (2)

where ε(n) ∈ {0,1}, i.i.d., P(ε(n) = 1) = P, P(ε(n) = 0) = 1 − P, P < 1/2.

Attack in order to detect SG:
1. C̃(n) = C′_w(n) ⊕ C(n) (C(n) is known exactly by the attacker).  (3)
2. Two-hypothesis testing:
H₀: C̃(n), n = 1, 2, …, N, is a Bernoulli sequence with the parameter P,
H₁: C̃(n), n = 1, 2, …, N, is a Bernoulli sequence with the parameter P₁ = P(1 − P₀) + P₀(1 − P).
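As a quick illustration, the chain (1)-(3) can be simulated directly. The following Python sketch is mine (the parameter values are arbitrary); it checks that the attacker's sequence C̃(n) is Bernoulli with parameter P₁ when b = 1:

```python
# Sketch of (1)-(3): BSC-based embedding and the attacker's difference statistic.
import numpy as np

rng = np.random.default_rng(0)
N, P0, P, b = 100_000, 0.01, 0.1, 1

C = rng.integers(0, 2, N)                         # cover object, known to the attacker
pi = (rng.random(N) < P0).astype(int)             # key sequence, P(pi(n) = 1) = P0
Cw = C ^ (b * pi)                                 # embedding (1)
Cw_noisy = Cw ^ (rng.random(N) < P).astype(int)   # BSC output (2)

C_tilde = Cw_noisy ^ C                            # attacker's statistic (3)
P1 = P * (1 - P0) + P0 * (1 - P)                  # Bernoulli parameter under H1
print(C_tilde.mean(), P1)                         # the two values should be close
```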

Evident testing method:

decide H₀ if t < τ, decide H₁ if t ≥ τ,  (4)

where t = H_w(C̃(n), n = 1, 2, …, N), H_w(X) is the Hamming weight of X, and τ is some threshold.

The performance of SG detection is determined by the probabilities P_fa and P_m:

P_m = Σ_{i=0}^{τ−1} C(N,i) P₁^i (1 − P₁)^{N−i},  P_fa = Σ_{i=τ}^{N} C(N,i) P^i (1 − P)^{N−i},  (5)

where C(N,i) = N!/(i!(N − i)!).

But the exact calculation by (5) is hard for N > 10³. Thus we will use the criterion of relative entropy (see Lecture 5).
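Formula (5) can be evaluated directly for moderate N. A small sketch (illustrative parameter values, math.comb for the binomial coefficients):

```python
# Direct evaluation of P_m and P_fa from (5); as noted above, this is only
# practical for moderate N.
from math import comb

def pm_pfa(N, tau, P, P0):
    P1 = P * (1 - P0) + P0 * (1 - P)
    pm = sum(comb(N, i) * P1**i * (1 - P1)**(N - i) for i in range(tau))
    pfa = sum(comb(N, i) * P**i * (1 - P)**(N - i) for i in range(tau, N + 1))
    return pm, pfa

print(pm_pfa(N=1000, tau=115, P=0.1, P0=0.05))
```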

D = D(P_H₀ ∥ P_H₁) = Σ_{x ∈ Xᴺ} P_H₀(x) log ( P_H₀(x) / P_H₁(x) ).  (6)

For the BSC model, where X = {0,1}, we get easily

D = N [ P log(P/P₁) + (1 − P) log((1 − P)/(1 − P₁)) ].

Then the bound for P_fa and P_m (see Lecture 5) is:

P_fa log( P_fa / (1 − P_m) ) + (1 − P_fa) log( (1 − P_fa) / P_m ) ≤ D,  (7)

where D is calculated by (6).

Next we will find the probability of error P_e under extraction of the embedded bit by the informed legal decoder.

Decoding scheme:

Λ = Σ_{n=1}^{N} (C′_w(n) ⊕ C(n)) π(n);  b̂ = 1 if Λ ≥ λ, b̂ = 0 if Λ < λ,  (8)

where λ is some threshold and N is the number of samples which were used to embed one information bit.
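A sketch of how the bound (7) can be used in practice: compute D from (6), then solve (7) for the smallest P_m consistent with a given P_fa. The root-finding step via scipy is my own choice, not part of the lecture:

```python
# Relative entropy (6) for the BSC model (in bits) and the smallest missing
# probability P_m allowed by bound (7) for a given P_fa.
from math import log2
from scipy.optimize import brentq

def D_bsc(N, P, P1):
    return N * (P * log2(P / P1) + (1 - P) * log2((1 - P) / (1 - P1)))

def pm_bound(pfa, D):
    d = lambda pm: (pfa * log2(pfa / (1 - pm))
                    + (1 - pfa) * log2((1 - pfa) / pm) - D)
    # d decreases from +inf to -D on (0, 1-pfa], so the root is unique
    return brentq(d, 1e-12, 1 - pfa)

print(pm_bound(0.01, D_bsc(1000, 0.1, 0.105)))   # small D forces P_m near 1
```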

Substituting (1) and (2) into (8), we get Λ = Λ_b for b = 0 and b = 1, where

Λ₀ = Σ_{n=1}^{N} ε(n)π(n),  (9)
Λ₁ = Σ_{n=1}^{N} (ε(n) ⊕ π(n))π(n).  (10)

Denote S = H_w(π),  (11)

where H_w(π) is the Hamming weight of the sequence π(n), n = 1, 2, …, N. Then

P(0/1) = Σ_s P(0/1, s) P(H_w(π) = s),  (12)

where P(0/1, s) = P(Λ₁ < λ | S = s) = Σ_{j=s−λ+1}^{s} C(s,j) P^j (1 − P)^{s−j} (j counts channel errors among the s embedded positions), and

P(H_w(π) = s) = C(N,s) P₀^s (1 − P₀)^{N−s}.  (13)

Substituting (10) and (11) into (12) we get

P(0/1) = Σ_{s=1}^{N} C(N,s) P₀^s (1 − P₀)^{N−s} Σ_{j=s−λ+1}^{s} C(s,j) P^j (1 − P)^{s−j}.  (14)

In a similar manner we can obtain

P(1/0) = Σ_{s=1}^{N} C(N,s) P₀^s (1 − P₀)^{N−s} Σ_{j=λ}^{s} C(s,j) P^j (1 − P)^{s−j}.
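A quick Monte-Carlo check of (12)-(14), assuming the mid-weight threshold λ = S/2 that is introduced below (a sketch with illustrative parameter values):

```python
# Monte-Carlo estimate of P(0/1): simulate the informed decoder when b = 1.
import numpy as np

rng = np.random.default_rng(1)
N, P0, P, trials = 500, 0.05, 0.3, 20_000
errors = 0
for _ in range(trials):
    pi = (rng.random(N) < P0).astype(int)
    eps = (rng.random(N) < P).astype(int)
    lam1 = ((eps ^ pi) * pi).sum()      # Lambda_1 from (10): bit b = 1 was sent
    errors += lam1 < pi.sum() / 2       # decoder outputs 0: an extraction error
print(errors / trials)
```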

Because the sequence π(n) is known at the decoder, it is possible to get P(0/1) = P(1/0) by selecting λ = S/2, which gives

P_e = P(0/1) = P(1/0) = Σ_{s=1}^{N} C(N,s) P₀^s (1 − P₀)^{N−s} Σ_{j=s/2}^{s} C(s,j) P^j (1 − P)^{s−j}.  (15)

We apply the Chernoff bound to the inner sum in (15) in order to simplify it:

Σ_{j=s/2}^{s} C(s,j) P^j (1 − P)^{s−j} ≤ [4P(1 − P)]^{s/2},  (16)

and substituting (16) into (15) we get

P_e ≤ Σ_{s=0}^{N} C(N,s) P₀^s (1 − P₀)^{N−s} [4P(1 − P)]^{s/2}.  (17)

Finally, using Newton's binomial formula in (17), we have

P_e ≤ [ 1 + P₀(√(4P(1 − P)) − 1) ]^N,  (18)

because (a + b)^N = Σ_{K=0}^{N} C(N,K) a^K b^{N−K}.
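The exact sum (15) and the closed-form bound (18) can be put side by side (a sketch with illustrative parameter values):

```python
# Exact P_e from (15) versus the closed-form Chernoff bound (18).
from math import comb, sqrt, ceil

def pe_exact(N, P0, P):
    total = 0.0
    for s in range(1, N + 1):
        inner = sum(comb(s, j) * P**j * (1 - P)**(s - j)
                    for j in range(ceil(s / 2), s + 1))
        total += comb(N, s) * P0**s * (1 - P0)**(N - s) * inner
    return total

def pe_chernoff(N, P0, P):
    return (1 + P0 * (sqrt(4 * P * (1 - P)) - 1))**N

print(pe_exact(200, 0.05, 0.3), pe_chernoff(200, 0.05, 0.3))
```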

Optimization problem in the design of SG based on noisy channels: given D (security), P (state of the BSC) and P_e (the desired reliability), it is necessary to find the parameters P₀ and N that provide the maximum number m of securely embedded bits (each bit is embedded into N/m samples):

P_e ≤ [ 1 + P₀(√(4P(1 − P)) − 1) ]^{N/m},
D = N [ P log(P/P₁) + (1 − P) log((1 − P)/(1 − P₁)) ],  (19)
P₁ = P(1 − P₀) + P₀(1 − P).

Example. D = 0.1, P = 0.1, P_e = 0.01. Then m = 63 for N = 137391 with P₀ = 6.7·10⁻⁷. If we take the restriction N ≤ 10⁶, then m = 19 for P₀ = 8.63·10⁻⁶.
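A numerical sketch of the constraints (19): for a chosen P₀, the security level D fixes N, and the reliability target P_e then yields m. The parameter values below are my own illustration, not the lecture's optimum:

```python
# Design constraints (19) evaluated for a given embedding rate P0.
from math import log, log2, sqrt

def design(D, P, Pe, P0):
    P1 = P * (1 - P0) + P0 * (1 - P)
    g = P * log2(P / P1) + (1 - P) * log2((1 - P) / (1 - P1))  # D per sample
    N = D / g                                    # largest N allowed by security level D
    beta = 1 + P0 * (sqrt(4 * P * (1 - P)) - 1)  # per-sample factor in (18)
    m = int(N * log(beta) / log(Pe))             # largest m with beta**(N/m) <= Pe
    return int(N), m

print(design(D=0.1, P=0.1, Pe=0.01, P0=1e-5))
```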

2. SG based on Gaussian channels.

Embedding:

C_w(n) = C(n) + (−1)^b σ_w π(n), n = 1, 2, …, N, π(n) i.i.d. N(0,1),  (21)

where σ_w is the amplitude coefficient.

After a passing of the SG C_w(n) over the Gaussian channel we get:

C′_w(n) = C_w(n) + ε(n), n = 1, 2, …, N,  (22)

where ε(n) i.i.d. N(0, σ_ε²).

The attack to detect SG consists of two steps:
1. C̃(n) = C′_w(n) − C(n) (C(n) is known exactly by the attacker).
2. Two-hypothesis testing:

H₀: C̃(n), n = 1, 2, …, N, i.i.d. N(0, σ_ε²),
H₁: C̃(n), n = 1, 2, …, N, i.i.d. N(0, σ_ε² + σ_w²).  (23)

There exist well-known testing methods for this model, but it is more convenient to apply the relative entropy technique for two Gaussian distributions.
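A simulation sketch of (21)-(23) (illustrative parameter values): after subtracting the known cover, the attacker's samples have variance σ_ε² + σ_w² when embedding is present:

```python
# Gaussian model (21)-(23): the attacker sees i.i.d. noise whose variance
# grows from sigma_eps^2 to sigma_eps^2 + sigma_w^2 under embedding.
import numpy as np

rng = np.random.default_rng(2)
N, sigma_w, sigma_eps, b = 100_000, 0.3, 1.0, 0
C = rng.normal(0, 1, N)                        # cover samples
pi = rng.normal(0, 1, N)                       # i.i.d. N(0,1) key sequence
Cw = C + (-1)**b * sigma_w * pi                # embedding (21)
Cw_noisy = Cw + rng.normal(0, sigma_eps, N)    # Gaussian channel (22)
delta = Cw_noisy - C                           # attacker's statistic
print(delta.var(), sigma_eps**2 + sigma_w**2)  # variance under H1, cf. (23)
```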

D = D(P_H₀ ∥ P_H₁) = ∫_{−∞}^{+∞} P_H₀(x) log ( P_H₀(x) / P_H₁(x) ) dx,  (24)

where P_H₀(x) ~ N(0, σ_ε²) and P_H₁(x) ~ N(0, σ_ε² + σ_w²).

After a calculation of the integral in (24) and simple transforms we get:

D = (N / 1.4) [ ln(1 + 1/η) − 1/(1 + η) ],  (25)

where η = σ_ε² / σ_w² (the noise-to-watermark ratio, NWR).

For large NWR, which provides high security of the SG system, we have from (25)

D ≈ 0.36 N / η².  (26)

Expressing η from (26) as a function of N and D:

η ≈ 0.6 √(N/D).  (27)
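The divergence (25), its large-NWR approximation (26) and the inversion (27) in code; the 1/1.4 = 1/(2 ln 2) factor expresses D in bits, which matches 0.36 ≈ 1/(4 ln 2) in (26). A small sketch:

```python
# Relative entropy (25) between the two Gaussian hypotheses, in bits,
# against the large-NWR approximation (26) and the inversion (27).
from math import log, sqrt

def D_gauss(N, eta):
    return (N / 1.4) * (log(1 + 1 / eta) - 1 / (1 + eta))   # (25)

def D_approx(N, eta):
    return 0.36 * N / eta**2                                # (26)

def eta_for(N, D):
    return 0.6 * sqrt(N / D)                                # (27)

print(D_gauss(1000, 10.0), D_approx(1000, 10.0), eta_for(1000, 0.1))
```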

Next we find the probability of error P_e under extraction of the embedded bit by the legal informed decoder.

Decoding scheme:

Λ = Σ_{n=1}^{N} (C′_w(n) − C(n)) π(n);  b̂ = 1 if Λ ≥ 0, b̂ = 0 if Λ < 0.  (28)

Substituting (21) and (22) into (28) we get

Λ = Σ_{n=1}^{N} ( ε(n) + (−1)^b σ_w π(n) ) π(n).  (29)

The probability of error P_e = P(0/1) = P(1/0) can easily be found from (29), taking into account that Λ is a Gaussian random variable owing to the Central Limit Theorem:

P_e = Q( E{Λ} / √(Var{Λ}) ),  (30)

where Q(x) = (1/√(2π)) ∫_x^∞ e^{−t²/2} dt.
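A Monte-Carlo check of the decoder (28)-(29) against the Gaussian approximation (30)-(31). Note that the exact variance of Λ is N(σ_ε² + 2σ_w²), so the sketch (mine, with illustrative parameters) compares against Q(√(N/(η + 2))):

```python
# Simulated error rate of the informed decoder versus the Q-function value.
import numpy as np
from math import sqrt, erfc

rng = np.random.default_rng(3)
N, sigma_w, sigma_eps, b, trials = 50, 0.3, 1.0, 1, 50_000
errors = 0
for _ in range(trials):
    pi = rng.normal(0, 1, N)
    eps = rng.normal(0, sigma_eps, N)
    lam = ((eps + (-1)**b * sigma_w * pi) * pi).sum()   # statistic (29)
    errors += (lam >= 0)                                # b = 1 sent: error if lam >= 0
eta = sigma_eps**2 / sigma_w**2
print(errors / trials, 0.5 * erfc(sqrt(N / (eta + 2)) / sqrt(2)))
```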

After a simple calculation of E{Λ} and Var{Λ} we get

P_e = Q( σ_w √N / √(σ_ε² + 2σ_w²) ) ≈ Q( √(N/η) ).  (31)

Substituting (27) into the last equality we obtain

P_e = Q( 1.29 (ND)^{1/4} / √m ).  (32)

Remark. There is a simple bound for the function Q(x) [35]:

Q(x) ≤ exp(−x²/2).  (33)

Substituting (33) into (32) we get

P_e ≤ exp( −0.83 (ND)^{1/2} / m ).  (34)

Example. Given D = 0.1 and N = 1000 (a single embedded bit, m = 1), we get P_e ≈ 3·10⁻⁴.

Important remark. In a Gaussian noisy channel it is possible to embed any number of bits m given any security level D and any reliability P_e (in contrast to the BSC case!).
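Formulas (32) and (34) evaluated for the example above (Q implemented via erfc):

```python
# Evaluation of (32) and (34) for D = 0.1, N = 1000, m = 1.
from math import exp, sqrt, erfc

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

def pe_q(N, D, m=1):
    return Q(1.29 * (N * D) ** 0.25 / sqrt(m))      # (32)

def pe_exp(N, D, m=1):
    return exp(-0.83 * sqrt(N * D) / m)             # (34)

print(pe_q(1000, 0.1), pe_exp(1000, 0.1))           # the second is ~3e-4
```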

General conclusions for SG based on noisy channels.

1. This is the unique case where it is possible to provide secrecy of the SG against an attack with exactly known CO.
2. The choice of the channel model (BSC or Gaussian) depends on the conditions, i.e. in which place of the telecommunication line it is allowed to embed and extract the additional information.
3. The design of the SG does not require knowledge of the CO statistics.
4. In the case of the BSC the number of securely and reliably embedded bits is limited, whereas in the Gaussian case it is unlimited (at the cost of increasing N), but the embedding rate approaches zero as N → ∞.
5. The use of error-correcting codes allows a slight increase of the embedding rate, but the rate still approaches zero as N approaches infinity, contrary to conventional communication systems, where due to Shannon's theorem the data rate can be kept fixed whenever it is less than the channel capacity. (See Lecture 15, "Capacity of SG and WM systems".)
6. The larger the number of embedded bits m in the Gaussian case, the smaller the amplitude of embedding, which results in turn in problems for practical implementation.

Spread-time stegosystems (SG-ST) [4].

Embedding:

C_w(n) = C(n) + (−1)^b σ_w π(n) with probability P_s,
C_w(n) = C(n) with probability 1 − P_s.  (35)

Fig. 1. SG-ST system with embedding into pseudorandom samples. (Here N_s = 8, N = 38, P_s = 8/38.)

In order to take a decision about the presence or absence of the SG under the condition of known C(n), the attacker has to test two hypotheses about the difference δ(n) = C′_w(n) − C(n):

H₀: δ(n) ~ N(0, σ_ε²),
H₁: δ(n) ~ N(0, σ_ε² + σ_w²) with probability P_s, δ(n) ~ N(0, σ_ε²) with probability 1 − P_s.  (36)
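A sketch of the spread-time embedding (35): only a pseudorandom fraction P_s of the samples carries the watermark (the amplitude here is my own illustrative choice; P_s is taken from the example further below):

```python
# Spread-time embedding (35): a key-selected subset of samples is modified.
import numpy as np

rng = np.random.default_rng(4)
N, Ps, sigma_w, b = 1_000_000, 4.56e-4, 0.5, 0
C = rng.normal(0, 1, N)
pi = rng.normal(0, 1, N)
mask = rng.random(N) < Ps                      # pseudorandom embedding positions
Cw = C + mask * (-1) ** b * sigma_w * pi       # (35)
print(mask.sum(), "samples carry embedding")
```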

Optimal hypothesis testing based on the maximum likelihood ratio is:

Λ(δ(1), …, δ(N)) ≥ λ → H₁,  Λ(δ(1), …, δ(N)) < λ → H₀,  (37)

where Λ(δ(1), …, δ(N)) = Π_{n=1}^{N} [ P(δ(n) | H₁) / P(δ(n) | H₀) ].

Suboptimal testing:

Λ̃ ≥ λ̃ → H₁,  Λ̃ < λ̃ → H₀,  where Λ̃ = (1/N) Σ_{n=1}^{N} δ²(n).  (38)

This method is intuitively reasonable.

The efficiency of SG-ST can be estimated (as usual) by two probabilities: P_m, the probability of SG-ST missing, and P_fa, the probability of SG-ST false detection (false alarm).
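The suboptimal energy statistic (38) under both hypotheses: the difference between the two means is only P_s σ_w², which is far below the sampling fluctuations of Λ̃ for small P_s. A sketch with illustrative parameters:

```python
# Statistic (38) under H0 and H1: the two values are almost indistinguishable.
import numpy as np

rng = np.random.default_rng(5)
N, Ps, sigma_w, sigma_eps = 1_000_000, 4.56e-4, 0.5, 1.0

eps = rng.normal(0, sigma_eps, N)                       # H0: noise only
mask = rng.random(N) < Ps
delta1 = eps + mask * sigma_w * rng.normal(0, 1, N)     # H1: noise + SG-ST

lam0, lam1 = np.mean(eps**2), np.mean(delta1**2)        # statistic (38)
print(lam0, lam1, sigma_eps**2 + Ps * sigma_w**2)       # theoretical H1 mean
```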

It was proved in [4] that if we let (for simplicity) P_m = P_fa = P, then the following lower bound for P holds true:

P ≥ Q( N_s / (2η √(2N)) ),  (39)

where η = σ_ε² / σ_w² and N_s = N P_s is the total number of samples with embedding.

If N → ∞ while N_s grows slower than √N, then P → 1/2, and hence SG-ST approaches an undetectable SG. The probability of bit error after extraction by the legal user is still given by relation (31).

Example. N = 10⁷, η = 2, P_s = 4.56·10⁻⁴ (N_s = 4560). Then 15 bits can be embedded with a bit error probability of at most 3·10⁻³, and P ≈ 0.4.
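Bound (39) evaluated at the example's parameters (a sketch; the form of (39) is reconstructed from the garbled transcription, but it reproduces the quoted P ≈ 0.4):

```python
# Lower bound (39) on P = P_m = P_fa for the suboptimal detector.
from math import sqrt, erfc

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

def P_bound(N, Ns, eta):
    return Q(Ns / (2 * eta * sqrt(2 * N)))     # (39)

print(P_bound(10_000_000, 4560, 2))            # ~0.4, as in the example
```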

The use of error-correcting codes in SG-ST.

The embedding procedure has to be performed as follows:

C_w(n_j) = C(n_j) + (−1)^{b_ij} σ_w π(n_j) with probability P_s,
C_w(n_j) = C(n_j) with probability 1 − P_s, j = 1, 2, …, l,

where b_ij denotes the j-th bit of the i-th codeword of length l, and n_j is some integer value.

The decoder takes a decision about the embedded codeword by

î = argmax_{1 ≤ i ≤ 2^k} Σ_{j=1}^{l} ( C′_w(n_j) − C(n_j) ) (−1)^{b_ij} π(n_j).  (40)

The probability of block error P_be, based on the well-known union bound [43], can be expressed as

P_be ≤ (2^k − 1) Q(√(2d/η)) ≤ exp( −d/η + lR ln 2 ),  (41)

where R = k/l and d is the minimum distance of the code.

Example. N = 10⁷, P ≈ 0.4, η = 2, P_be ≤ 10⁻³. Then, using the simplex code (1023, 10, 512), it is possible to embed 440 bits.
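A sketch evaluating the union bound (41) for the simplex code (1023, 10, 512) at η = 2; the exact constant in (41) is reconstructed from a garbled transcription, so treat the numbers as indicative only:

```python
# Union bound (41) on the block error probability of the coded SG-ST decoder.
from math import sqrt, erfc

def Q(x):
    return 0.5 * erfc(x / sqrt(2))

def block_error_bound(k, d, eta):
    return (2**k - 1) * Q(sqrt(2 * d / eta))   # (41)

print(block_error_bound(k=10, d=512, eta=2))   # far below the 1e-3 target
```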

Optimal detection of SG-ST.

Following eq. (37) and the notation of (36), we get the logarithmically normalized likelihood ratio as

Λ_L = (1/N) log Λ(δ(1), …, δ(N)) = (1/N) Σ_{n=1}^{N} log[ (1 − P_s) + P_s √(η/(1+η)) exp( δ²(n)/(2(1+η)σ_ε²) ) ],  (42)

where the random values δ(n) have the probability distributions given by (36).

Since N is sufficiently large, we can apply the Central Limit Theorem to the sum in (42). Then, for the choice of the threshold λ that provides P_m = P_fa = P, we get the following upper bound:

P ≤ Q( (μ̃₁ − μ̃₀) / (2σ̃) ),  (43)

where, for j = 0, 1,

μ̃_j = E{ log[(1 − P_s) + P_s √(η/(1+η)) exp( δ²(n)/(2(1+η)σ_ε²) )] | H_j },
σ̃² = (1/N) Var{ log[(1 − P_s) + P_s √(η/(1+η)) exp( δ²(n)/(2(1+η)σ_ε²) )] | H₀ }.

Since it is very hard to find the values μ̃₀, μ̃₁ and σ̃ analytically, we estimate them by simulation.

[Table: simulation estimates of μ̃₀, μ̃₁ and σ̃ for several values of the noise variance σ_ε² and the NWR η, together with the resulting bound P = Q((μ̃₁ − μ̃₀)/(2σ̃)). In all tested cases the estimated P is about 0.42.]
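The quantities μ̃₀, μ̃₁ and σ̃ can also be obtained by numerical integration instead of simulation. The following sketch is my own (integration in place of simulation, using the example parameters η = 2, N = 10⁷, P_s = 4.56·10⁻⁴ and the exact per-sample likelihood ratio); it reproduces the table's conclusion that P stays near 1/2:

```python
# Numerical evaluation of mu0, mu1, sigma for the log-likelihood ratio (42)
# and the resulting bound (43).
from math import sqrt, exp, log1p, erfc, pi
from scipy.integrate import quad

Ps, s0, eta = 4.56e-4, 1.0, 2.0
s1 = s0 * (1 + 1 / eta)                    # variance of an embedded sample

def pdf(x, s):
    return exp(-x * x / (2 * s)) / sqrt(2 * pi * s)

def loglr(x):                              # per-sample term of (42)
    r = sqrt(s0 / s1) * exp(x * x * (1 / s0 - 1 / s1) / 2)
    return log1p(Ps * (r - 1))

def f1(x):                                 # density of delta(n) under H1, eq. (36)
    return (1 - Ps) * pdf(x, s0) + Ps * pdf(x, s1)

mu0 = quad(lambda x: loglr(x) * pdf(x, s0), -12, 12, epsabs=1e-15)[0]
mu1 = quad(lambda x: loglr(x) * f1(x), -12, 12, epsabs=1e-15)[0]
var0 = quad(lambda x: (loglr(x) - mu0)**2 * pdf(x, s0), -12, 12, epsabs=1e-15)[0]

N = 10_000_000
P = 0.5 * erfc((sqrt(N) * (mu1 - mu0) / (2 * sqrt(var0))) / sqrt(2))   # (43)
print(P)   # close to 1/2: the optimal test also fails to detect SG-ST
```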

It can be seen that the use of the optimal decision rule does not break the undetectability of SG-ST; hence it can indeed be declared a secure SG system.

The waveforms of an audio signal after passing over a noisy channel with a signal-to-noise ratio of 10 dB (a), and the waveform of the same signal after embedding by the SG-ST algorithm with NWR = 20 dB (b). (The arrows show the samples with embedding.)

We can see that the presence of the embedding can be detected neither visually nor by ear. The fact of undetectability of SG-ST by optimal statistical methods was proved above.