Imperfect Stegosystems

1 DISSERTATION DEFENSE Imperfect Stegosystems: Asymptotic Laws and Near-Optimal Practical Constructions. Tomáš Filler, Dept. of Electrical and Computer Engineering, SUNY Binghamton, New York, April 1, 2011

2 Steganography / Steganalysis. Steganography is a mode of covert communication. [Block diagram: Alice encrypts a message (shared encryption/decryption keys) and embeds it into a cover image drawn from the cover source using a shared stego key; the stego image travels to Bob, who extracts and decrypts the message; Eve observes the channel with a steganalyzer.] GOAL: embed messages so that the source of stego images is statistically indistinguishable from the source of cover images.

3 Why is steganography relevant? First officially reported case: in June 2010 the FBI reported that a Russian spy ring hid secret information in images posted on the Internet. The method was broken by revealing the secret keys, and the spies were deported. Other known cases in history: a message tattooed on a slave's head (ancient Greeks), microdots (Germans in WW2). Steganalysis is even more important for law-enforcement agencies, but it cannot be advanced without advancing steganography.

4 Security in Steganography. Security is defined in terms of statistical distinguishability. Steganalysis is a statistical test between cover and stego: H_0: given image is cover; H_1: given image is stego. Perfectly secure stegosystem (Cachin): the cover distribution P and the stego distribution Q satisfy $D_{KL}(P\|Q) = \sum_x P(x)\log\frac{P(x)}{Q(x)} = 0$. This implies P = Q, so every detector is reduced to random guessing. A stegosystem is imperfect (ε-secure) if $D_{KL}(P\|Q) = \varepsilon > 0$.
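As a concrete illustration of the security definition, here is a minimal Python sketch (toy distributions, not values from the dissertation) that computes $D_{KL}(P\|Q)$ and shows it vanishes exactly when P = Q:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P||Q) = sum_x P(x) log(P(x)/Q(x)); equals 0 iff P = Q (perfect security)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                        # terms with P(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy cover pmf and a slightly perturbed stego pmf (illustrative values only).
cover = np.array([0.25, 0.25, 0.25, 0.25])
stego = np.array([0.27, 0.23, 0.26, 0.24])

print(kl_divergence(cover, cover))   # 0.0 -> perfectly secure
print(kl_divergence(cover, stego))   # > 0 -> epsilon-secure (imperfect)
```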

5 Imperfect Stegosystems. In an imperfect stegosystem the cover and stego distributions are different, so statistical detectors exist. Perfectly secure stegosystems exist for artificial cover sources, but all known stegosystems for digital media are imperfect: digital-media cover sources will hardly ever be understood well enough to allow perfectly secure stegosystems. It is therefore important to study imperfect stegosystems for complex cover sources.

6 Dissertation Outline. [Overview diagram] Imperfect Stegosystems: Asymptotic Laws & Practical Constructions. Asymptotic laws (How does the secure payload scale with the number of pixels?): Markov chains, MI embedding, Square-Root Law, Fisher information. Practical constructions (How to design and implement practical embedding schemes?): Gibbs random fields, Gibbs sampler, Gibbs construction, multi-layered constructions, syndrome-trellis codes, convolutional codes, Viterbi algorithm, distortion design, steganalysis features, SVMs.

7 Part I IMPERFECT STEGOSYSTEMS: ASYMPTOTIC LAWS How does the secure payload scale with the number of cover elements?

8 Asymptotic Laws for Secure Payload. Secure payload: the number of bits one can hide in an n-element cover while keeping the detectability by the passive warden at a fixed level. Perfectly secure stegosystems: assuming full knowledge of the cover source, the secure payload of perfectly secure stegosystems scales linearly in n; their communication rate is positive. Imperfect stegosystems: how does the secure payload scale? Many hints suggest that it is sublinear in n.

9 Mutually Independent Embedding Operation. The impact of embedding is modeled as a probabilistic mapping acting on each cover element independently (MI embedding): $\Pr(Y_l = j \mid X_l = i) = b_{ij}(\beta)$, where $X_l$ is the l-th cover element, $Y_l$ the l-th stego element, and β the change rate (determined by the relative payload). The matrix $B = (b_{ij})$ is a transition probability matrix for every admissible β ≥ 0. LSB embedding: $B = I + \beta C$ with $b_{ii} = 1-\beta$ and $b_{i\bar{i}} = \beta$, where $\bar{i}$ denotes i with its least significant bit flipped.
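A small sketch of MI embedding as a matrix, assuming a toy 4-letter alphabet and LSB replacement; the helper name and parameter values are illustrative only:

```python
import numpy as np

def lsb_transition_matrix(beta, k=4):
    """B = I + beta*C for LSB embedding: each value keeps its LSB with
    probability 1 - beta and has it flipped with probability beta."""
    C = np.zeros((k, k))
    for i in range(k):
        C[i, i] = -1.0
        C[i, i ^ 1] = 1.0            # i ^ 1 flips the least significant bit
    return np.eye(k) + beta * C

B = lsb_transition_matrix(beta=0.1)
print(B)
print(B.sum(axis=1))                 # rows sum to 1 -> B is stochastic

# Impact on an i.i.d. cover distribution p: the stego distribution is p @ B.
p = np.array([0.1, 0.2, 0.3, 0.4])
print(p @ B)
```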

10 Perfectly Secure Cover Source. A cover source is perfectly secure with respect to a given MI embedding if the resulting stegosystem is perfectly secure.

13 Perfectly Secure Covers w.r.t. MI Embedding. Invariant distributions of MI embedding: the matrix B is stochastic, so it has $k \ge 1$ left eigenvectors (invariant distributions) $\pi^{(a)}$, $a \in \{1,\dots,k\}$, corresponding to eigenvalue 1: $\pi^{(a)} B = \pi^{(a)}$. Example (perfectly secure cover): if $P(X_1 = i, X_2 = j) = \pi^{(a)}_i \pi^{(a')}_j$, then P is perfectly secure. Elements distributed independently, each with some invariant distribution, form a perfectly secure cover source. The set of all perfectly secure distributions forms the convex hull of such sources; there are exactly $k^n$ linearly independent perfectly secure cover sources on n elements.
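The invariant distributions can be obtained numerically as the left eigenvectors of B associated with eigenvalue 1. A sketch, reusing the LSB matrix from above (toy alphabet, illustrative β):

```python
import numpy as np

def invariant_distributions(B, tol=1e-6):
    """Left eigenvectors of B with eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(B.T)      # left eigenvectors of B = right eigenvectors of B^T
    pis = []
    for lam, v in zip(vals, vecs.T):
        if abs(lam - 1.0) < tol:
            v = np.real(v)
            pis.append(v / v.sum())
    return pis

B = np.array([[0.9, 0.1, 0.0, 0.0],
              [0.1, 0.9, 0.0, 0.0],
              [0.0, 0.0, 0.9, 0.1],
              [0.0, 0.0, 0.1, 0.9]])    # LSB embedding with beta = 0.1
for pi in invariant_distributions(B):
    print(pi, np.allclose(pi @ B, pi))  # checks pi B = pi
```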

14 Assumptions Summary. (1) COVER SOURCE: first-order Markov chain with transition probability matrix $A = (a_{ij})$. (2) EMBEDDING ALGORITHM: MI embedding, i.e., independent substitution of states. (3) STEGOSYSTEM: the stegosystem is IMPERFECT (ε-secure). [Diagram: the cover ... i → j → k → l ... is a Markov chain (MC) with transition probabilities $a_{ij}, a_{jk}, a_{kl}, \dots$; after MI embedding the stego sequence is a hidden Markov chain (HMC).]

15 Square Root Law of Markov Covers. Under the assumptions of Markov covers, MI embedding, and an imperfect stegosystem, we prove the following theorem. Theorem (SRL for Markov covers): (1) an embedding payload that grows slower than $\sqrt{n}$ leads to eventual ε-security for arbitrarily small ε > 0; (2) an embedding payload that grows exactly as $\sqrt{n}$ leads to an ε-secure stegosystem with fixed ε > 0; (3) an embedding payload that grows faster than $\sqrt{n}$ leads to eventual detection with arbitrary $P_{FA}$ and $P_{MD}$. This implies that the secure payload of imperfect stegosystems with Markov covers and MI embedding scales as $\sqrt{n}$. BEST PAPER AWARD.

16 Fisher Information Rate. From a Taylor expansion (nβ = number of changes): $D_{KL}\big(P^{(n)} \| Q^{(n)}_{\beta}\big) = \tfrac{1}{2}\, n\beta^2 I + O(\beta^3) = \varepsilon$, where I is the Fisher information (rate) at β = 0. The secure payload of ε-secure stegosystems scales as $r\sqrt{n}$. Root rate (a more refined measure of capacity): $r \propto 1/\sqrt{I}$. I can be expressed as a quadratic form w.r.t. C and f(A).
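A quick numeric sanity check of the leading-order term (per cover element, i.e., n = 1), assuming the i.i.d. toy source and LSB embedding matrix from above; for an i.i.d. source the Fisher information comes out of the same Taylor expansion as $I = \sum_i (pC)_i^2 / p_i$:

```python
import numpy as np

# Toy i.i.d. cover pmf and the LSB embedding derivative C (illustrative values).
p = np.array([0.1, 0.2, 0.3, 0.4])
C = np.array([[-1,  1,  0,  0],
              [ 1, -1,  0,  0],
              [ 0,  0, -1,  1],
              [ 0,  0,  1, -1]], dtype=float)

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

# Fisher information rate at beta = 0 for this i.i.d. source.
I = float(np.sum((p @ C) ** 2 / p))

for beta in (1e-1, 1e-2, 1e-3):
    q = p @ (np.eye(4) + beta * C)            # stego pmf for change rate beta
    print(beta, kl(p, q), 0.5 * beta**2 * I)  # the two columns agree as beta -> 0
```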

17 Convex Combination of LSB and ±1 Embedding. Use ±1 embedding in the first λn pixels and LSB embedding in the rest. Root rate: $c_\lambda = \lambda c_{\pm1} + (1-\lambda) c_{LSB}$, $r(\lambda) = 1/\sqrt{c_\lambda^T F c_\lambda}$. [Plot: r(λ) for λ between 0 (pure LSB) and 1 (pure ±1) on three cover sources: NRCS, NRCS-JPEG, CAMRAW.] A higher root rate means a better method.

18 Part II IMPERFECT STEGOSYSTEMS: NEAR-OPTIMAL PRACTICAL CONSTRUCTIONS. From steganography to statistical physics. Motivated by BOSS (Break Our Steganographic System).

20 Gibbs Construction. For a FIXED cover x, the stego image $y \in \mathcal{Y}$ is distributed according to π. General detectability measure D(x, y): the cost of changing cover x into stego y; assume $D(x,y) < \infty$ for all $y \in \mathcal{Y}$. What distortion measures are interesting? $D(x,y) = \|F(x) - F(y)\|$ ... HUGO [IH 2010], where F(·) is a feature vector used in blind steganalysis; D(x, y) can also be derived from some side information. PAYLOAD-LIMITED SENDER (PLS): choose π such that the entropy H(π) = m bits and the average distortion $E_\pi[D(x,y)]$ is minimal. DISTORTION-LIMITED SENDER (DLS): choose π such that the average distortion $E_\pi[D(x,y)] = D_\varepsilon$ and the entropy H(π) is maximal.

23 The Separation Principle. Three basic problems to solve. (1) What distribution π solves the embedding problems? PLS: $\min_\pi E_\pi[D(x,y)]$ subject to H(π) = m; DLS: $\max_\pi H(\pi)$ subject to $E_\pi[D(x,y)] = D_\varepsilon$. (2) How to simulate the best possible embedding algorithm? An algorithm based on D can then be tested by blind steganalysis without being able to embed a real message in practice. (3) How to implement the embedding algorithm in practice? Implement a practical algorithm with payload close to H(π) and average distortion close to $E_\pi[D(x,y)]$.

24 (1) Optimality of the Gibbs Random Field. Let $D(x,y) < \infty$ for all $y \in \mathcal{Y}$ be a detectability measure. The optimal distribution π solving either PLS ($\min_\pi E_\pi[D(x,y)]$ subject to H(π) = m) or DLS ($\max_\pi H(\pi)$ subject to $E_\pi[D(x,y)] = D_\varepsilon$) has the form of a Gibbs random field: $\pi(y) = \frac{1}{Z(\lambda)} \exp(-\lambda D(x,y))$, $Z(\lambda) = \sum_{y \in \mathcal{Y}} \exp(-\lambda D(x,y))$, with λ ≥ 0 chosen to satisfy the embedding constraint (PLS/DLS). Z(λ) is the partition function, a sum of $|\mathcal{Y}|$ terms (HUGE). Since $D(x,y) < \infty$, we have π(y) > 0 for all $y \in \mathcal{Y}$.
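For a pixel-additive distortion the Gibbs distribution factorizes over pixels, and the PLS constraint can be met by a one-dimensional search over λ. A minimal sketch, assuming binary changes and made-up per-pixel costs ρ_i (the function name and parameters are illustrative):

```python
import numpy as np

def pls_change_rates(rho, m_bits, iters=60):
    """Payload-limited sender for a pixel-additive distortion with binary changes:
    pi factorizes, pixel i is changed with probability
    beta_i = exp(-lambda*rho_i) / (1 + exp(-lambda*rho_i)),
    and lambda >= 0 is found by binary search so that sum_i H(beta_i) = m_bits."""
    rho = np.asarray(rho, dtype=float)

    def betas(lam):
        z = np.minimum(lam * rho, 50.0)            # avoid overflow in exp
        return 1.0 / (1.0 + np.exp(z))

    def total_entropy(beta):
        b = np.clip(beta, 1e-12, 1 - 1e-12)
        return float(np.sum(-b * np.log2(b) - (1 - b) * np.log2(1 - b)))

    lo, hi = 0.0, 100.0                            # entropy is decreasing in lambda
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if total_entropy(betas(mid)) < m_bits:
            hi = mid                               # too little payload -> smaller lambda
        else:
            lo = mid
    lam = 0.5 * (lo + hi)
    return betas(lam), lam

rho = np.random.default_rng(0).uniform(0.1, 5.0, size=1000)   # toy per-pixel costs
beta, lam = pls_change_rates(rho, m_bits=200)
print("lambda =", lam, " E[D] =", float(np.sum(beta * rho)))
```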

25 (2) Sampling from the Gibbs Random Field. Sampling from a GRF is a well-known and well-studied problem in statistical physics. One such algorithm is the Gibbs sampler (Markov chain Monte Carlo) [Geman & Geman 1984]: (1) start with an arbitrary image, for example x; (2) apply a series of sweeps; in every sweep, each pixel is updated by sampling from its conditional distribution given the current values of the other pixels. Properties of the GS algorithm ($y^k$ = sample obtained after k sweeps): (1) $y^k \to \pi$ in distribution (allows sampling from π); (2) $\frac{1}{k}\sum_{l \le k} D(x, y^l) \to E_\pi[D]$ (allows computing the expected distortion); (3) $H(y^k \mid y^{k-1}) \le H(\pi)$ (not necessarily capacity achieving).
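A compact sketch of such a sweep for a toy non-additive distortion (a penalty γ is added whenever two horizontally adjacent pixels are both changed); this only illustrates the sampler's mechanics, not the dissertation's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_sweeps(rho, lam, gamma, sweeps=20):
    """Sample a change pattern c (c_ij = 1 means pixel (i,j) is changed) from
    pi(c) proportional to exp(-lam * D(c)) with the toy non-additive distortion
    D(c) = sum_ij rho_ij * c_ij + gamma * (number of horizontally adjacent changed pairs)."""
    h, w = rho.shape
    c = np.zeros((h, w), dtype=int)               # start from the cover (no changes)
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                nbr = 0
                if j > 0:     nbr += c[i, j - 1]
                if j < w - 1: nbr += c[i, j + 1]
                cost = rho[i, j] + gamma * nbr    # extra distortion if this pixel flips
                p_change = 1.0 / (1.0 + np.exp(lam * cost))
                c[i, j] = rng.random() < p_change
    return c

rho = rng.uniform(0.5, 3.0, size=(16, 16))        # toy per-pixel costs
c = gibbs_sweeps(rho, lam=1.0, gamma=2.0)
print("changed pixels:", int(c.sum()))
```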

26 (3) Practical Embedding Algorithm. The embedding algorithm is equivalent to a coset quantizer. Pixel-additive distortion function: $D(x,y) = \sum_i \rho_i(x, y_i)$. Syndrome-Trellis Codes (SPIE 2010, WIFS 2010, TIFS 2011): the message is communicated as the syndrome of a convolutional code represented in the dual domain; embedding (finding the closest stego) is solved using the Viterbi algorithm. A fast and very efficient solution. [Diagram: the STC encoder maps (m, x, costs $\{\rho_i\}$) to $y = \arg\min_{y': Hy' = m} D(x, y')$; the STC decoder recovers the message as $m = Hy$.] BEST PAPER AWARD 2010.
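To illustrate the coset-quantizer view without reproducing the trellis machinery, the sketch below brute-forces the minimizer $y = \arg\min_{Hy'=m} D(x,y')$ for a tiny, made-up parity-check matrix H; STCs find essentially the same y in linear time with the Viterbi algorithm:

```python
import itertools
import numpy as np

def embed_bruteforce(x_bits, m, H, rho):
    """Return y minimizing sum_i rho_i * [y_i != x_i] subject to H y = m (mod 2)."""
    n = H.shape[1]
    best_y, best_cost = None, np.inf
    for y in itertools.product((0, 1), repeat=n):
        y = np.array(y)
        if np.array_equal(H @ y % 2, m):
            cost = float(np.sum(rho * (y != x_bits)))
            if cost < best_cost:
                best_y, best_cost = y, cost
    return best_y, best_cost

# Toy example: hide 2 message bits in the LSBs of 4 pixels (illustrative H and costs).
H = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0]])
x_bits = np.array([0, 1, 1, 0])          # cover LSBs
m = np.array([1, 0])                     # message (syndrome to communicate)
rho = np.array([1.0, 3.0, 0.5, 2.0])     # per-pixel embedding costs

y, cost = embed_bruteforce(x_bits, m, H, rho)
print("stego LSBs:", y, " distortion:", cost)
print("extracted message:", H @ y % 2)   # the recipient simply computes H y
```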

27 Application to Spatial-Domain Digital Images. [Plot: average error of an SVM-based steganalyzer with SPAM features vs. relative payload α (bits per pixel) on the BOWS2 database, payload-limited sender; 0.5 = undetectable, lower = detectable. Curves compare simulated embedding with STC/ML-STC implementations for binary (±1), ternary (±1), and pentary (±2) embedding operations and the LSB matching algorithm; the gap between simulated and coded curves is the coding loss.]

28 Distortion Optimization Using Blind Steganalysis. Idea: (1) define a class of distortion functions with parameter θ; (2) find the θ that maximizes the steganalyzer error. How to evaluate the detectability of a given embedding method? (1) Embed the chosen payload into a large database of images; (2) extract features from cover and stego images; (3) train a machine-learning classifier (Gaussian SVM); (4) report a chosen error metric, such as the trustworthy measure $P_E = \min_{P_{FA}} \frac{1}{2}(P_{FA} + P_{MD})$. This is very slow (requires many images, and the classifier is slow to train).
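The metric $P_E$ can be computed from a detector's soft outputs by sweeping the decision threshold. A short sketch with synthetic scores (in practice the scores come from the trained Gaussian SVM):

```python
import numpy as np

def p_e(cover_scores, stego_scores):
    """P_E = min over thresholds of (P_FA + P_MD)/2, deciding 'stego' when the
    score exceeds the threshold; 0.5 means the detector is no better than guessing."""
    thresholds = np.unique(np.concatenate([cover_scores, stego_scores, [-np.inf, np.inf]]))
    best = 0.5
    for t in thresholds:
        p_fa = np.mean(cover_scores > t)       # cover images flagged as stego
        p_md = np.mean(stego_scores <= t)      # stego images missed
        best = min(best, 0.5 * (p_fa + p_md))
    return best

rng = np.random.default_rng(0)
cover = rng.normal(0.0, 1.0, 500)              # synthetic detector outputs
stego = rng.normal(0.8, 1.0, 500)
print("P_E =", p_e(cover, stego))
```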

30 Detectability Criterion: Margin of a Linear SVM. [Diagram: decision boundary with normal vector w separating classes $g_i = -1$ and $g_i = +1$; support vectors and misclassified samples marked.] SVM training problem: $\min_{w \in \mathbb{R}^d} \frac{1}{2} w^T w + C \sum_i \xi(w; f_i, g_i)$, where the first (regularization) term controls the margin. L2R_L2LOSS detectability criterion: (1) embed the chosen payload in N images and extract features; (2) L2R_L2LOSS = size of the margin obtained from a soft-margin, L2-regularized, L2-loss linear SVM. We used N = 80 images to evaluate L2R_L2LOSS; it takes 1-2 seconds on a 40-CPU cluster.
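A sketch of a margin-style criterion in this spirit, using scikit-learn's LinearSVC (L2 penalty, squared-hinge loss) on synthetic features; the dissertation's exact LIBLINEAR settings and feature preprocessing may differ:

```python
import numpy as np
from sklearn.svm import LinearSVC

def margin_criterion(cover_feats, stego_feats, C=1.0):
    """Train an L2-regularized, L2-loss (squared hinge) linear SVM on cover vs.
    stego features and return the geometric margin width 2/||w|| of the result."""
    X = np.vstack([cover_feats, stego_feats])
    g = np.concatenate([-np.ones(len(cover_feats)), np.ones(len(stego_feats))])
    svm = LinearSVC(penalty="l2", loss="squared_hinge", C=C, max_iter=10000)
    svm.fit(X, g)
    w = svm.coef_.ravel()
    return 2.0 / np.linalg.norm(w)

rng = np.random.default_rng(0)
cover = rng.normal(0.0, 1.0, size=(80, 20))    # synthetic feature vectors, N = 80
stego = rng.normal(0.3, 1.0, size=(80, 20))    # shifted mean stands in for embedding
print("margin =", margin_criterion(cover, stego))
```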

31 Application to DCT-Domain Digital Images. Distortion optimized w.r.t. CC-PEV features and tested with the CDF feature set. [Plot: average error $P_E$ of a Gaussian SVM vs. relative payload α (bits per non-zero AC coefficient); 0.5 = undetectable, lower = detectable. Curves: simulated embedding and real implementation (STC, h = 10) of the inter-block-optimized distortion, compared with non-shrinkage F5.]

32 Conclusion. [Overview diagram repeated from the outline slide, with the BOSS (Break Our Steganographic System) logo: asymptotic laws (Markov chains, MI embedding, Square-Root Law, Fisher information) answering how the secure payload scales with the number of pixels; practical constructions (Gibbs random fields, Gibbs sampler, Gibbs construction, multi-layered constructions, syndrome-trellis codes, convolutional codes, Viterbi algorithm, distortion design, steganalysis features, SVMs) answering how to design and implement practical embedding schemes.]

33 List of Published Papers 1/2
Square-Root Law:
[1] The Square Root Law of Steganographic Capacity for Markov Covers, with A. D. Ker and J. Fridrich, Proc. SPIE.
[2] Complete Characterization of Perfectly Secure Stegosystems with Mutually Independent Embedding Operation, with J. Fridrich, IEEE ICASSP.
[3] Capacity of ε-secure Steganography, with J. Fridrich, Information Hiding.
Syndrome coding:
[4] Binary Quantization using Belief Propagation with Decimation over Factor Graphs of LDGM Codes, with J. Fridrich, Proc. Allerton Conference.
[5] Wet ZZW Construction for Steganography, with J. Fridrich, IEEE WIFS.
[6] Minimizing Embedding Impact in Steganography using Trellis-Coded Quantization, with J. Judas and J. Fridrich, SPIE.
[7] Minimizing Additive Distortion Functions With Non-Binary Embedding Operation in Steganography, with J. Fridrich, IEEE WIFS.
[8] Minimizing Additive Distortion in Steganography using Syndrome-Trellis Codes, with J. Judas and J. Fridrich, IEEE T-IFS.
Steganography:
[9] Design of Adaptive Steganographic Schemes for Digital Images, with J. Fridrich, SPIE.
[10] Using High-Dimensional Image Models to Perform Highly Undetectable Steganography, with T. Pevný and P. Bas, Information Hiding.
[11] Steganography Using Gibbs Random Fields, with J. Fridrich, ACM MMSec.
[12] Gibbs Construction in Steganography, with J. Fridrich, IEEE T-IFS 2010.

34 List of Published Papers 2/2
Digital forensics:
[13] Managing Large Database of Camera Fingerprints, with J. Fridrich and M. Goljan, SPIE.
[14] Camera Identification - Large Scale Test, with J. Fridrich and M. Goljan, SPIE.
[15] Using Sensor Pattern Noise for Camera Model Identification, with J. Fridrich and M. Goljan, IEEE ICIP.
Reports:
[16] Important Properties of Normalized KL Divergence under the HMC Model.
[17] Capacity of ε-secure Steganography - Proofs.
Best paper awards:
[1] Digital Watermarking Alliance - SPIE 2009: The Square Root Law of Steganographic Capacity for Markov Covers.
[2] Digital Watermarking Alliance - SPIE 2010: Minimizing Embedding Impact in Steganography using Trellis-Coded Quantization.

35 "We can only see a short distance ahead, but we can see plenty there that needs to be done." ALAN TURING (1950)
