Information Theoretical Analysis of Digital Watermarking. Multimedia Security


Definitions:

- $X$: the output of a source with alphabet $\mathcal{X}$.
- $W$: a message in a discrete alphabet $\mathcal{W} = \{1, 2, \ldots, M\}$.
- $S \in \{0, 1\}$: a random variable which indicates whether $X$ will be watermarked.

Assumption: $\mathcal{X}$ is a discrete alphabet and $X$ follows a discrete distribution $P_X$.

The variable $S$ is introduced in the model only to provide the possibility of expressing mathematically, in a simple way, the existence or nonexistence of a watermark.

$K$: a secret key defined on a discrete alphabet $\mathcal{K}$.

- If $S = 1$ (watermarked version): $Y = f_1(X, W, K)$, the output of the watermarking function $f_1 : \mathcal{X} \times \mathcal{W} \times \mathcal{K} \to \mathcal{Y}$.
- If $S = 0$ (non-watermarked version): $Y = f_0(X)$, with $f_0 : \mathcal{X} \to \mathcal{Y}$.

The output of the watermarking function $f_1$ depends on the value of $K$, a secret key which uniquely identifies the copyright owner.

[Figure: general model of a watermarking system. The source output $X$ and the message $W$ enter the watermarking function $f_S$ (selected by $S$, keyed by $K$) to produce $Y$; $Y$ passes through the attack channel $P_{Z|Y}$ to give $Z$; from $Z$, the function $g$ recovers $\hat{X}$, the decoder $\psi$ produces $\hat{W}$, and the detector $q$ produces $\hat{S}$.]

The watermarked version $Y$ then passes through a noisy channel and is transformed into $Z \in \mathcal{Y}$. This channel models both unintentional distortions suffered by $Y$ and attacks aimed at deleting or corrupting the watermark information. In both cases we assume that the secret key is not known, so the noisy channel can be defined by a distribution $P_{Z|Y}(z|y)$ which is independent of $K$.

Finally, $Z$ is processed to obtain a point $\hat{x} \in \mathcal{X}$ which will be used by the recipient instead of $X$. There are two tests that can serve to verify the ownership of $Z$:

- the watermark detection test $q : \mathcal{Y} \times \mathcal{K} \to \{0, 1\}$, used to obtain an estimate $\hat{s}$ of $S$ (to decide whether $Z$ has been watermarked using key $K$);
- the watermark decoding test $\psi : \mathcal{Y} \times \mathcal{K} \to \mathcal{W}$, used to obtain an estimate $\hat{w}$ of $W$.
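As a concrete illustration of this model, here is a minimal sketch in Python, assuming a spread-spectrum-style additive embedding with a correlation detector; the lecture does not fix any particular scheme, so `pattern`, `f0`, `f1`, `attack_channel`, `q`, `psi` and all parameters below are illustrative choices, not part of the original model.

```python
import random

N = 256  # length of the source output (illustrative)

def pattern(k, n=N):
    """Key-dependent pseudorandom +/-1 pattern: the key selects the watermark."""
    rng = random.Random(k)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def f0(x):
    """Non-watermarked version (S = 0): identity."""
    return list(x)

def f1(x, w, k, alpha=0.5):
    """Watermarked version (S = 1): additive embedding; the message bit
    w in {0, 1} sets the sign of the key pattern."""
    sgn = 1.0 if w == 1 else -1.0
    return [xi + alpha * sgn * pi for xi, pi in zip(x, pattern(k))]

def attack_channel(y, sigma=0.3):
    """P_{Z|Y}: additive Gaussian noise, independent of K."""
    return [yi + random.gauss(0.0, sigma) for yi in y]

def correlation(z, k):
    return sum(zi * pi for zi, pi in zip(z, pattern(k))) / len(z)

def q(z, k, thresh=0.25):
    """Detection test: decide s_hat = 1 iff the correlation with the
    key pattern is large in magnitude."""
    return 1 if abs(correlation(z, k)) > thresh else 0

def psi(z, k):
    """Decoding test: the sign of the correlation estimates the message bit."""
    return 1 if correlation(z, k) > 0 else 0

# One pass through the model:
x = [random.gauss(0.0, 1.0) for _ in range(N)]  # source output X
y = f1(x, w=1, k=42)                            # embed message 1 with key 42
z = attack_channel(y)                           # attacked/noisy version Z
print(q(z, 42), psi(z, 42))  # right key: typically (1, 1)
print(q(z, 7))               # wrong key: typically 0
```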

Imperceptibility: Let $d : \mathcal{X} \times \mathcal{X} \to \mathbb{R}^{+}$ be a perceptually significant distortion measure. A watermarking system must guarantee that the functions $f_0$, $f_1$ and $g$ introduce imperceptible alterations with respect to $X$:

$$E[d(X, g(f_0(X)))] \le D_0$$

$$E[d(X, g(f_1(X, W, K)))] \le D_1$$

with the expectations taken w.r.t. $X$, $W$, $K$ (mean distortion constraints).

or (maximum distortion constraints):

$$d(x, g(f_0(x))) \le D_0 \quad \forall x \in \mathcal{X}$$

$$d(x, g(f_1(x, w, k))) \le D_1 \quad \forall x \in \mathcal{X},\; w \in \mathcal{W},\; k \in \mathcal{K}$$
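As a quick empirical check of the mean constraint, the sketch below estimates $E[d(X, g(f_1(X, W, K)))]$ for the toy additive embedding, taking squared error as the distortion measure and $g$ as the identity; the measure, the embedding and the bound $D_1$ are all illustrative assumptions.

```python
import random

N, ALPHA, D1 = 256, 0.5, 0.3  # illustrative parameters and bound

def d(x, y):
    """Squared-error distortion (one possible perceptual measure)."""
    return sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / len(x)

def f1(x, w, k):
    rng = random.Random(k)
    sgn = 1.0 if w == 1 else -1.0
    return [xi + ALPHA * sgn * rng.choice((-1.0, 1.0)) for xi in x]

# Monte Carlo estimate of E[d(X, g(f1(X, W, K)))], with g = identity.
trials = []
for _ in range(1000):
    x = [random.gauss(0.0, 1.0) for _ in range(N)]
    trials.append(d(x, f1(x, random.randint(0, 1), random.randrange(1 << 16))))

print(f"mean distortion ~= {sum(trials) / len(trials):.3f} (bound D1 = {D1})")
print(f"max distortion  ~= {max(trials):.3f}")  # relevant to the max constraint
```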

Hiding Information

The performance of the watermark decoding process is measured by the probability of error, defined as

$$P_e = \Pr\{\hat{W} \ne W\}, \qquad \hat{W} = \psi(Z, K)$$

For each value of $K$, the space $\mathcal{Y}$ is partitioned into decision regions $D_1, \ldots, D_M$, where $M = |\mathcal{W}|$ is the number of possible hidden messages. Decoding errors are due to the uncertainty about the source output $X$ from which the watermarked version was obtained.
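A Monte Carlo estimate of $P_e$ for the toy correlation decoder: with binary messages, the two decision regions are simply the half-spaces separated by the sign of the correlation with the key pattern. Short sequences and strong channel noise are chosen here so that errors are actually visible; all parameters are illustrative.

```python
import random

N, ALPHA, SIGMA = 64, 0.2, 1.0  # short sequence + strong noise => visible errors

def pattern(k, n=N):
    rng = random.Random(k)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def psi(z, k):
    """Decision regions for M = 2: D_1 = {z : <z, pattern(k)> > 0},
    D_0 its complement."""
    return 1 if sum(zi * pi for zi, pi in zip(z, pattern(k))) > 0 else 0

errors, trials = 0, 20000
for _ in range(trials):
    k = random.randrange(1 << 16)
    w = random.randint(0, 1)
    sgn = 1.0 if w == 1 else -1.0
    z = [random.gauss(0.0, 1.0)              # source output X
         + ALPHA * sgn * pi                  # embedding
         + random.gauss(0.0, SIGMA)          # attack channel
         for pi in pattern(k)]
    errors += (psi(z, k) != w)

print(f"P_e ~= {errors / trials:.4f}")
```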

Detecting the Watermark

For each value of $k$, the watermark detection test can be mathematically defined as a binary hypothesis test in which we have to decide whether $Z$ was generated by the distribution of $f_0(X)$ or by the distribution of $f_1(X, W, k)$, where $X \sim P_X$ and $W$ is modeled as a random variable.

Let $\Omega_k = \{z \in \mathcal{Y} : q(z, k) = 1\}$ be the critical region for the watermark detection test performed with key $k$, i.e. the set of points in $\mathcal{Y}$ where $\hat{s} = 1$ is decided for that key. The watermark detection test is completely defined by the sets $\Omega_k$, $k \in \mathcal{K}$.

The performance of the watermark detection test is measured by the probabilities of false alarm and detection, defined as:

$$P_F = \Pr\{\hat{s} = 1 \mid s = 0\} = \Pr\{Z \in \Omega_K \mid s = 0\}$$

$$P_D = \Pr\{\hat{s} = 1 \mid s = 1\} = \Pr\{Z \in \Omega_K \mid s = 1\}$$
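The sketch below estimates both probabilities for the toy threshold detector by simulating the two hypotheses ($s = 0$: no watermark; $s = 1$: watermarked with the tested key); scheme, threshold and noise level are illustrative assumptions.

```python
import random

N, ALPHA, SIGMA, THRESH = 256, 0.5, 0.3, 0.25  # illustrative parameters

def pattern(k, n=N):
    rng = random.Random(k)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def q(z, k):
    """Critical region Omega_k: |correlation with pattern(k)| > THRESH."""
    c = sum(zi * pi for zi, pi in zip(z, pattern(k))) / len(z)
    return 1 if abs(c) > THRESH else 0

def sample_z(s, k):
    """Draw Z under hypothesis s (0: no watermark, 1: watermarked with k)."""
    x = [random.gauss(0.0, 1.0) for _ in range(N)]
    if s == 1:
        sgn = 1.0 if random.randint(0, 1) else -1.0  # random message bit
        x = [xi + ALPHA * sgn * pi for xi, pi in zip(x, pattern(k))]
    return [xi + random.gauss(0.0, SIGMA) for xi in x]  # attack channel

trials, k = 5000, 42
p_f = sum(q(sample_z(0, k), k) for _ in range(trials)) / trials
p_d = sum(q(sample_z(1, k), k) for _ in range(trials)) / trials
print(f"P_F ~= {p_f:.4f}   P_D ~= {p_d:.4f}")
```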

Suppose there is no distortion during distribution, so $Z = Y$. Optimizing the performance of the watermark detection test in terms of $P_F$ and $P_D$ is in a way equivalent to maximizing the Kullback-Leibler distance between the distributions $P_{Y|S=1,K}$ and $P_{Y|S=0}$. The maximum achievable distance is limited by the perceptual distortion constraint and by the entropy of the source.
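For discrete distributions the Kullback-Leibler distance is directly computable; a minimal helper (the two example pmfs are arbitrary):

```python
from math import log2

def kl(p, q):
    """D(P || Q) = sum_y P(y) log2(P(y) / Q(y)), with 0 log 0 = 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: pmf of Y under s = 1 (for some key) vs. under s = 0.
p_y_s1 = [0.70, 0.20, 0.10]
p_y_s0 = [0.25, 0.50, 0.25]
print(f"D(P_Y|s=1 || P_Y|s=0) = {kl(p_y_s1, p_y_s0):.3f} bits")
```

Stein's lemma makes the connection to detection precise: over $n$ i.i.d. observations with $P_D$ held fixed, the best achievable $P_F$ decays as $2^{-n D(P_{Y|s=1} \| P_{Y|s=0})}$, so a larger distance means a faster-vanishing false-alarm probability.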

The probability of collision between keys $K_1$ and $K_2$: the probability of deciding $\hat{s} = 1$ in the watermark detection test for a certain key $K_1$ when $Z$ has been watermarked using a different key $K_2$. In the context of copyright protection, this probability should be constrained below a maximum allowed value for all pairs $(K_1, K_2)$, since otherwise the author in possession of $K_1$ could claim authorship of information watermarked by the author who owns $K_2$.

This constraint imposes a limit on the cardinality of the key space, since the minimum achievable maximum probability of collision between keys increases with the number of keys for fixed $P_F$ and $P_D$.
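A Monte Carlo sketch of the collision probability for the toy correlation detector: content watermarked with key $k_2$ is tested with a different key $k_1$. Short patterns and a low threshold are deliberately chosen so that collisions are visible; all parameters are illustrative.

```python
import random

N, ALPHA, THRESH = 64, 0.5, 0.2  # short patterns make collisions visible

def pattern(k, n=N):
    rng = random.Random(k)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def q(z, k):
    c = sum(zi * pi for zi, pi in zip(z, pattern(k))) / len(z)
    return 1 if abs(c) > THRESH else 0

collisions, trials = 0, 20000
for _ in range(trials):
    k1, k2 = random.sample(range(1 << 16), 2)  # two distinct keys
    x = [random.gauss(0.0, 1.0) for _ in range(N)]
    y = [xi + ALPHA * pi for xi, pi in zip(x, pattern(k2))]  # watermark with k2
    collisions += q(y, k1)                                   # detect with k1

print(f"P(collision) ~= {collisions / trials:.4f}")
```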

Attacks

In the following discussion we will assume that the attacker has unlimited computation power and that the algorithms for watermarking, detection and decoding are public (Kerckhoffs's principle). The security of the watermarking system relies exclusively on the secret key $K$ of the copyright owner.

The Elimination Attack

Alter a watermarked source output $Y$ to obtain a negative result $\hat{s} = 0$ in the watermark detection test for the secret key used by the legitimate owner. The alteration made by the attacker should not be perceptible, since the resulting output $Z$ will be used as a substitute for the watermarked source output $Y$.

This constraint can be expressed in mathematical form as an average distortion constraint $E[d(Z, Y)] \le D_E$, or as a maximum distortion constraint $d(z, y) \le D_E$ for all $z$, $y$, where $d(\cdot, \cdot)$ is a distortion function and $D_E$ is the maximum distortion allowed by the attacker.

The elimination attack can be represented by a game-theoretic model: given a certain watermarked source output $Y$, the attacker will choose the point $Z \in \mathcal{Y}$, subject to the distortion constraint, which maximizes his probability of success.

Under a maximum distortion constraint, this maximum probability of success for a given $Y$ can be expressed as

$$P_E(Y) = \max_{z :\, d(z, Y) \le D_E} \sum_{k} P_{K|Y}(k|Y)\,\bigl[1 - q(z, k)\bigr]$$

After averaging over $\mathcal{Y}$, the average probability of success of the elimination attack is

$$P_E = \sum_{y} P_Y(y) \max_{z :\, d(z, y) \le D_E} \sum_{k} P_{K|Y}(k|y)\,\bigl[1 - q(z, k)\bigr]$$
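A direct, brute-force evaluation of $P_E(Y)$ on a deliberately tiny discrete model, so the maximization over $\{z : d(z, y) \le D_E\}$ can be done exhaustively; the alphabet, the keyed detector and the distribution $P_{K|Y}$ below are all illustrative assumptions.

```python
import itertools

Y_ALPHABET = list(itertools.product((0, 1), repeat=6))  # tiny space: {0,1}^6
KEYS = [0, 1, 2, 3]

def d(z, y):
    """Hamming distortion."""
    return sum(zi != yi for zi, yi in zip(z, y))

def q(z, k):
    """Toy keyed detector: parity of two key-selected positions."""
    return z[k % 6] ^ z[(k + 3) % 6]

def p_e_given_y(y, p_key_given_y, d_e):
    """P_E(y) = max over {z : d(z, y) <= D_E} of sum_k P(k|y) (1 - q(z, k))."""
    return max(
        sum(p * (1 - q(z, k)) for k, p in zip(KEYS, p_key_given_y))
        for z in Y_ALPHABET if d(z, y) <= d_e
    )

y = (1, 0, 1, 1, 0, 0)
uniform = [1.0 / len(KEYS)] * len(KEYS)  # attacker's uncertainty about K
for d_e in (0, 1, 2):
    print(f"D_E = {d_e}: P_E(y) = {p_e_given_y(y, uniform, d_e):.2f}")
```

As expected, the attacker's success probability grows with the distortion budget $D_E$.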

We can model the transformation made by the attacker as a channel with conditional pdf $P_{Z|Y}$. The optimal elimination strategy can then be seen as a worst-case channel $P_{Z|Y}$, in the sense that it minimizes $P_D$ for given critical regions and watermarking function $f_1$. Note that the attacker is limited to those channels which satisfy the average distortion constraint.

The minimum achievable $P_D$ is a non-increasing function of $D_E$. The optimum watermarking strategy consists in choosing the watermarking function $f_1$ and the critical regions so as to maximize the minimum $P_D$ achievable by the attacker through the choice of a channel $P_{Z|Y}$. Hence, the design of the watermarking system is a robust hypothesis testing problem.
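The one-bit sketch below makes the worst-case-channel view concrete: the attacker searches over binary symmetric channels whose average distortion stays within $D_E$ and picks the one minimizing $P_D$. The alphabet, detector and all numbers are illustrative assumptions.

```python
# Worst-case-channel view on a one-bit toy: Y in {0, 1}, critical region
# Omega = {1}, and the attacker chooses a binary symmetric channel BSC(p)
# whose average distortion E[d(Z, Y)] = p stays within the budget D_E.

def p_d_under_bsc(p, p1=0.9):
    """P_D = P(Z in Omega | s = 1); p1 = P(Y = 1 | s = 1) before the attack."""
    return p1 * (1 - p) + (1 - p1) * p

for d_e in (0.0, 0.1, 0.3, 0.5):
    worst = min(p_d_under_bsc(i / 1000) for i in range(int(d_e * 1000) + 1))
    print(f"D_E = {d_e:.1f}: min P_D over admissible channels = {worst:.3f}")
```

The printed values decrease as $D_E$ grows, matching the statement that the minimum achievable $P_D$ is non-increasing in $D_E$.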

The Corruption Attack

The attacker is not interested in eliminating the watermark, but in increasing the probability of error in the watermark decoding process.

Cryptographic Security

The security level of the system can be measured by the uncertainty about the key given a watermarked source output $Y$. In information-theoretic terminology, this uncertainty is the conditional entropy $H(K|Y)$, also called equivocation.
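Computing the equivocation $H(K|Y)$ from a joint pmf, as a minimal sketch (the joint distribution below is an arbitrary example):

```python
from math import log2

def equivocation(p_ky):
    """H(K|Y) = -sum_{k,y} P(k, y) log2 P(k|y), joint pmf as {(k, y): prob}."""
    p_y = {}
    for (k, y), p in p_ky.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * log2(p / p_y[y]) for (k, y), p in p_ky.items() if p > 0)

# Example: 2 keys, 2 observable outputs. With no leakage H(K|Y) would be
# 1 bit; correlation between K and Y reduces the equivocation.
joint = {(0, 'a'): 0.4, (0, 'b'): 0.1,
         (1, 'a'): 0.1, (1, 'b'): 0.4}
print(f"H(K|Y) = {equivocation(joint):.3f} bits")  # ~0.722 < 1
```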

Size of Key Space

A minimum cardinality of the key space $\mathcal{K}$ is a necessary condition for achieving a given equivocation $H(K|Y)$, since $H(K|Y) \le \log_2 |\mathcal{K}|$. Increasing the equivocation helps in increasing the robustness against elimination attacks. However, increasing the number of available keys also increases the probability of collision among keys. Therefore, if we specify a maximum allowable probability of collision, this constraint will impose a limit on the maximum number of keys.

Summary

- Decoding of hidden information is affected by uncertainty due to the source output (not available at the receiver), distortion and attacks. We can think of a channel between $W$ and $Z$ which can be characterized by a certain capacity.
- Watermarking and watermark detection under a constrained maximum probability of collision between keys can be seen as an application of identification via channels, with additional constraints derived from the limited admissible perceptual distortion in the watermarking process.
- The combination of watermark detection and data hiding can be related to the theory of identification plus transmission codes.