Covert Communication with Channel-State Information at the Transmitter

Covert Communication with Channel-State Information at the Transmitter Si-Hyeon Lee Joint Work with Ligong Wang, Ashish Khisti, and Gregory W. Wornell July 27, 2017 1 / 21

Covert Communication Transmitter and receiver wish to communicate reliably, while ensuring that their communication is not detected by the warden. 2 / 21

Covert Communication
Hypothesis test by the warden:
Probability of false alarm: α
Probability of missed detection: β
Covertness is guaranteed if D(P_{Z,sending} ‖ P_{Z,not sending}) is small.
Blind test: α + β = 1
Optimal test: α + β ≥ 1 − √(D(P_{Z,sending} ‖ P_{Z,not sending}) / 2)
3 / 21
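
A quick numerical sketch (not from the talk; the Bernoulli output distributions are hypothetical) of how the covertness metric controls the warden's test: by Pinsker's inequality the total variation distance is at most √(D/2) (D in nats), which gives α + β ≥ 1 − √(D/2).

```python
import math

def kl_bern(a, b):
    """D(Bern(a) || Bern(b)) in nats."""
    d = 0.0
    for pa, pb in ((a, b), (1.0 - a, 1.0 - b)):
        if pa > 0.0:
            d += pa * math.log(pa / pb)
    return d

def sum_error_bound(a, b):
    """Lower bound on alpha + beta for the warden's optimal test:
    alpha + beta >= 1 - sqrt(D(P_sending || P_not_sending) / 2)."""
    return max(0.0, 1.0 - math.sqrt(kl_bern(a, b) / 2.0))

# If sending leaves the warden's output distribution unchanged,
# the optimal test is no better than a blind test (alpha + beta = 1).
print(sum_error_bound(0.10, 0.10))  # 1.0
print(sum_error_bound(0.11, 0.10))  # close to 1
print(sum_error_bound(0.30, 0.10))  # noticeably below 1
```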

Square Root Law
[Diagram: Transmitter sends X; Receiver observes Y (noise N_Y); Warden observes Z (noise N_Z)]
Maximum amount of information scales like the square root of the blocklength:
AWGN channel [Bash-Goekel-Towsley 13]
A broad class of DMCs [Bloch 16], [Wang-Wornell-Zheng 16]
4 / 21

When Can We Beat the Square Root Law?
Unknown channel statistics at the warden:
BSCs with unknown crossover probability [Che-Bakshi-Chan-Jaggi 14]
AWGN with unknown noise power [Lee-Baxley-Weitnauer-Walkenhorst 15]
This talk: state-dependent channel with CSI at the transmitter
5 / 21

Binary Symmetric Channel
[Diagram: BSC from X to Z with crossover probability p]
p fixed and known: zero capacity [Bloch 16], [Wang-Wornell-Zheng 16]
p random and unknown: positive capacity [Che-Bakshi-Chan-Jaggi 14]
p fixed and known, realizations known to the transmitter (our model): positive capacity by pre-cancellation
6 / 21

Model
[Diagram: (M, K) → Transmitter → X^n → p(y, z | s, x); Receiver observes Y^n and outputs M̂; Warden observes Z^n; state S_i or S^n available at the transmitter]
State-dependent DMC (X, S, Y, Z, P_S, P_{Y,Z|S,X})
Transmitter and receiver share a secret key K ∈ [1 : 2^{nR_K}]
Input cost function b(x^n) = (1/n) Σ_{i=1}^n b(x_i)
State sequence known to the transmitter causally or noncausally
(R, R_K, n) code for causal CSI: encoding function at time i maps (M, K, S^i) to X_i; decoding function maps (Y^n, K) to M̂
(R, R_K, n) code for noncausal CSI: encoding function maps (M, K, S^n) to X^n; decoding function maps (Y^n, K) to M̂
7 / 21

Model
[Diagram: same setup as before]
Covert communication: x_0 ∈ X is the "no input" symbol, and Q_0(·) = Σ_{s∈S} P_S(s) P_{Z|S,X}(· | s, x_0)
Small D(P̂_{Z^n} ‖ Q_0^n) ensures that the warden's best hypothesis test performs not much better than a blind test.
A covert rate R is achievable if there exists a sequence of (R, R_K, n) codes that simultaneously satisfies
the input cost constraint: limsup_{n→∞} E_{M,K,S^n}[b(X^n)] ≤ B,
the reliability constraint: lim_{n→∞} P(M̂ ≠ M) = 0, and
the covertness constraint: lim_{n→∞} D(P̂_{Z^n} ‖ Q_0^n) = 0.
Covert capacity C: supremum of all achievable covert rates
8 / 21
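
As a small illustration (hypothetical alphabets, not from the talk), the no-input distribution Q_0 can be computed directly from its definition:

```python
def no_input_dist(P_S, P_Z_SX, x0):
    """Q_0(z) = sum_s P_S(s) * P_{Z|S,X}(z | s, x0)."""
    q0 = {}
    for s, ps in P_S.items():
        for z, pz in P_Z_SX[(s, x0)].items():
            q0[z] = q0.get(z, 0.0) + ps * pz
    return q0

# Hypothetical example: Z = X XOR S with S ~ Bern(0.1) and x_0 = 0,
# so the warden's no-input distribution is Bern(0.1).
P_S = {0: 0.9, 1: 0.1}
P_Z_SX = {(s, x): {(s ^ x): 1.0, 1 - (s ^ x): 0.0}
          for s in (0, 1) for x in (0, 1)}
print(no_input_dist(P_S, P_Z_SX, x0=0))  # {0: 0.9, 1: 0.1}
```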

Main Results: Causal CSI
Theorem 1 (Upper Bound): For R_K ≥ 0 and B ≥ 0, the covert capacity is upper-bounded as
C ≤ max I(V; Y),
where the maximum is over P_V and x(v, s) such that P_Z = Q_0 and E[b(X)] ≤ B.
Theorem 2 (Lower Bound): For R_K ≥ 0 and B ≥ 0, the covert capacity is lower-bounded as
C ≥ max I(V; Y),
where the maximum is over P_V and x(v, s) such that P_Z = Q_0, E[b(X)] ≤ B, and I(V; Z) − I(V; Y) < R_K.
9 / 21

Main Results: Noncausal CSI
Theorem 3 (Upper Bound): For R_K ≥ 0 and B ≥ 0, the covert capacity is upper-bounded as
C ≤ max I(U; Y) − I(U; S),
where the maximum is over P_{U|S} and x(u, s) such that P_Z = Q_0 and E[b(X)] ≤ B.
Theorem 4 (Lower Bound): For R_K ≥ 0 and B ≥ 0, the covert capacity is lower-bounded as
C ≥ max I(U; Y) − I(U; S),
where the maximum is over P_{U|S} and x(u, s) such that P_Z = Q_0, E[b(X)] ≤ B, and I(U; Z) − I(U; Y) < R_K.
10 / 21

Example: Binary Symmetric Channels
[Diagram: Y = Z = X ⊕ S with S ~ Bern(p); R_K > 0, x_0 = 0]
Without CSI at the transmitter: C = 0
With CSI at the transmitter: C = H_b(p) for both causal and noncausal cases,
achieved by X = V ⊕ S with V ~ Bern(p) independent of S
11 / 21

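
A minimal simulation (not from the talk) of the pre-cancellation scheme X = V ⊕ S on this channel: the warden's observation Z = X ⊕ S = V has the same Bern(p) law as the no-input observation Z = S, while the receiver decodes V error-free, giving rate H_b(p).

```python
import math
import random

def h_b(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def simulate(p, n, seed=0):
    """Empirical P(Z = 1) with no input (Z = S) and with
    pre-cancellation (X = V XOR S, so Z = X XOR S = V)."""
    rng = random.Random(seed)
    ones_idle, ones_covert = 0, 0
    for _ in range(n):
        s = 1 if rng.random() < p else 0
        v = 1 if rng.random() < p else 0   # V ~ Bern(p), independent of S
        x = v ^ s                          # pre-cancel the state
        ones_idle += s                     # warden sees Z = S
        ones_covert += x ^ s               # warden sees Z = V
    return ones_idle / n, ones_covert / n

p = 0.1
f_idle, f_covert = simulate(p, 200_000)
print(f_idle, f_covert)   # both close to p = 0.1: warden cannot distinguish
print(h_b(p))             # covert rate H_b(p), about 0.469 bits/use
```
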
Example: AWGN Channels
[Diagram: Y = X + S + N_Y with N_Y ~ N(0, 1); Z = X + S + N_Z with N_Z ~ N(0, σ²); S ~ N(0, T); E[X²] ≤ P]
Without CSI at the transmitter: C = 0
With CSI at the transmitter: C > 0
Make room for message transmission by reducing the interference power, i.e., X = X′ − γ*S
Choose P′ and γ* to simultaneously satisfy the covertness constraint and the input power constraint, i.e.,
γ* = min{1, P/(2T)}, T′ = (1 − γ*)²T, P′ = T − T′
12 / 21

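
The parameter choice on this slide can be checked numerically; the sketch below (example values hypothetical) verifies that the power seen by the warden is unchanged (covertness) and that the input power budget is met.

```python
def awgn_params(P, T):
    """gamma* = min(1, P/(2T)), T' = (1 - gamma*)^2 * T, P' = T - T'."""
    gamma = min(1.0, P / (2.0 * T))
    T_prime = (1.0 - gamma) ** 2 * T
    P_prime = T - T_prime
    return gamma, T_prime, P_prime

P, T = 1.0, 4.0
g, Tp, Pp = awgn_params(P, T)
# Covertness: signal power plus residual interference power equals T,
# so the warden's marginal distribution is unchanged.
assert abs((Pp + Tp) - T) < 1e-12
# Input power: E[X^2] = P' + gamma*^2 * T <= P.
assert Pp + g ** 2 * T <= P + 1e-12
print(g, Tp, Pp)  # 0.125 3.0625 0.9375
```
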
Example: AWGN Channels
[Diagram as before]
Causal CSI at the transmitter:
Choose V = X and treat the interference as noise at the receiver
13 / 21

Example: AWGN Channels
γ* = min{1, P/(2T)}, T′ = (1 − γ*)²T, P′ = T − T′
Theorem 5 (Causal CSI): If
R_K > (1/2) log(1 + P′/(T′ + σ²)) − (1/2) log(1 + P′/(T′ + 1)),
the covert capacity with causal CSI at the transmitter is lower-bounded as
C_c ≥ (1/2) log(1 + P′/(T′ + 1)).
If the warden's channel is degraded, i.e., σ² > 1, a secret key is not needed to achieve this rate.
14 / 21
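
A numeric sanity check (hypothetical values, not from the talk) of Theorem 5: with V = X′ and interference treated as noise, the key-rate threshold is I(V; Z) − I(V; Y), which is nonpositive once σ² > 1.

```python
import math

def causal_rates(P, T, sigma2):
    """Achievable rate (1/2)log(1 + P'/(T'+1)) and the key-rate
    threshold (1/2)log(1 + P'/(T'+sigma^2)) minus that rate."""
    gamma = min(1.0, P / (2.0 * T))
    Tp = (1.0 - gamma) ** 2 * T
    Pp = T - Tp
    rate = 0.5 * math.log2(1.0 + Pp / (Tp + 1.0))
    key_threshold = 0.5 * math.log2(1.0 + Pp / (Tp + sigma2)) - rate
    return rate, key_threshold

r, k = causal_rates(P=1.0, T=4.0, sigma2=2.0)
print(r > 0, k <= 0)   # degraded warden: positive rate, no key needed
r, k = causal_rates(P=1.0, T=4.0, sigma2=0.5)
print(k > 0)           # warden better than receiver: a key is needed
```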

Example: AWGN Channels
[Diagram as before]
Noncausal CSI at the transmitter:
Choose U as in dirty paper coding, with input power constraint P′ and interference power T′
15 / 21

Example: AWGN Channels
γ* = min{1, P/(2T)}, T′ = (1 − γ*)²T, P′ = T − T′
Theorem 6 (Noncausal CSI): If
R_K > (1/2) log(1 + (P′ + P′T′/(P′+1))² / ((P′ + (P′/(P′+1))²T′)(P′ + T′ + σ²) − (P′ + P′T′/(P′+1))²))
− (1/2) log(1 + (P′ + P′T′/(P′+1))² / ((P′ + (P′/(P′+1))²T′)(P′ + T′ + 1) − (P′ + P′T′/(P′+1))²)),
the covert capacity with noncausal CSI at the transmitter is given by
C_nc = (1/2) log(1 + P′).
If the warden's channel is degraded, i.e., σ² > 1, a secret key is not needed to achieve the capacity.
16 / 21
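
Theorem 6's expressions are the Gaussian mutual informations for the dirty-paper auxiliary U = X′ + αS′ with α = P′/(P′+1); the hypothetical check below verifies I(U; Y) − I(U; S) = (1/2)log(1 + P′) and the sign of the key threshold I(U; Z) − I(U; Y).

```python
import math

def dpc_rates(Pp, Tp, sigma2):
    """Gaussian DPC with U = X' + a*S', a = P'/(P'+1):
    returns (I(U;Y) - I(U;S), I(U;Z) - I(U;Y))."""
    a = Pp / (Pp + 1.0)
    var_u = Pp + a ** 2 * Tp
    cov = Pp + a * Tp                 # Cov(U, X' + S' + noise)
    def mi(var_out):                  # I(U; output) for a jointly Gaussian pair
        return 0.5 * math.log2(var_u * var_out / (var_u * var_out - cov ** 2))
    i_us = 0.5 * math.log2(var_u / Pp)   # I(U; S)
    i_uy = mi(Pp + Tp + 1.0)             # receiver noise variance 1
    i_uz = mi(Pp + Tp + sigma2)          # warden noise variance sigma^2
    return i_uy - i_us, i_uz - i_uy

rate, key = dpc_rates(Pp=1.0, Tp=2.0, sigma2=2.0)
print(rate)  # ~0.5 = (1/2) log2(1 + P'): the covert capacity
print(key)   # <= 0 for a degraded warden (sigma^2 > 1): no key needed
```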

Example: AWGN Channels
For R_K sufficiently large,
C_c ≥ (1/2) log(1 + P′/(T′ + 1)) and C_nc = (1/2) log(1 + P′).
If T′ = 0, i.e., 2T ≤ P, then C_c = C_nc.
As T → ∞, P′ → P, and hence C_nc approaches the capacity without a covertness constraint.
17 / 21

Achievability with Noncausal CSI
Key idea: Gelfand–Pinsker coding, except that the likelihood encoding of [Song-Cuff-Poor 16], [Goldfeld-Kramer-Permuter-Cuff 17] is used instead of joint-typicality encoding.
Codebook generation: Fix P_{U|S} and x(u, s) such that P_Z = Q_0 and E[b(X)] ≤ B/(1 + ε). For each k ∈ [1 : 2^{nR_K}] and m ∈ [1 : 2^{nR}], randomly generate a subcodebook C(k, m) = {u^n(k, m, l) : l ∈ [1 : 2^{nR′}]} according to Π_{i=1}^n P_U(u_i).
Encoding: Given state sequence s^n, secret key k, and message m, evaluate the likelihood
g(l | s^n, k, m) = P^n_{S|U}(s^n | u^n(k, m, l)) / Σ_{l′ ∈ [1 : 2^{nR′}]} P^n_{S|U}(s^n | u^n(k, m, l′)).
The encoder randomly generates l according to g(· | s^n, k, m) and transmits x^n, where x_i = x(u_i(k, m, l), s_i).
18 / 21
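
The likelihood encoder admits a compact sketch (finite alphabets and distributions are hypothetical; the subcodebook is a plain list rather than a randomly generated codebook):

```python
import math
import random

def likelihood_encode(s_seq, subcodebook, P_S_given_U, rng):
    """Pick index l with probability proportional to
    P^n_{S|U}(s^n | u^n(l)) = prod_i P_{S|U}(s_i | u_i(l))."""
    log_w = [sum(math.log(P_S_given_U[(s, u)]) for s, u in zip(s_seq, u_seq))
             for u_seq in subcodebook]
    m = max(log_w)
    w = [math.exp(lw - m) for lw in log_w]   # normalize in the log domain first
    r, acc = rng.random() * sum(w), 0.0
    for l, wl in enumerate(w):
        acc += wl
        if r <= acc:
            return l
    return len(w) - 1

# Toy check: P_{S|U}(s | u) = 0.9 if s == u else 0.1; the codeword that
# matches the state sequence should be chosen almost always.
P = {(s, u): (0.9 if s == u else 0.1) for s in (0, 1) for u in (0, 1)}
rng = random.Random(0)
picks = [likelihood_encode([0, 0, 0, 0], [[1, 1, 1, 1], [0, 0, 0, 0]], P, rng)
         for _ in range(1000)]
print(picks.count(1) / 1000)  # close to 1
```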

Achievability with Noncausal CSI
Decoding: Upon receiving y^n, with access to the secret key k, the decoder declares that m̂ was sent if it is the unique message such that (u^n(k, m̂, l), y^n) ∈ T_ε^{(n)} for some l ∈ [1 : 2^{nR′}]; otherwise it declares an error.
Covertness analysis: Let Γ be the joint distribution obtained when the codeword index in the subcodebook is chosen uniformly and the state sequence is then generated i.i.d. according to P_{S|U}. Then
P̂_{Z^n} ≈ Γ_{Z^n} if R′ > I(U; S), and
Γ_{Z^n} ≈ Q_0^n if R_K + R + R′ > I(U; Z).
Reliability analysis: By the packing lemma, reliable decoding requires R + R′ < I(U; Y).
19 / 21

Converse with Noncausal CSI
Step (a): Bounding techniques for channels with noncausal CSI without a covertness constraint, where U_i := (M, K, Y^{i−1}, S^n_{i+1}).
Step (b): Characterization of the capacity function
C(A, B) := max_{P_{U|S}, P_{X|U,S}: E[b(X)] ≤ B, D(P_Z ‖ Q_0) ≤ A} (I(U; Y) − I(U; S)).
The function C(A, B) is non-decreasing in each of A and B, and concave and continuous in (A, B).
Step (c): Application of the covertness and input cost constraints:
R ≤ (1/n) Σ_{i=1}^n (I(U_i; Y_i) − I(U_i; S_i)) + ε_n      (by (a))
  ≤ (1/n) Σ_{i=1}^n C(D(P̂_{Z_i} ‖ Q_0), E[b(X_i)]) + ε_n   (by (b))
  ≤ C((1/n) Σ_{i=1}^n D(P̂_{Z_i} ‖ Q_0), (1/n) Σ_{i=1}^n E[b(X_i)]) + ε_n   (by concavity in (b))
  → C(0, B)   (by (c), as n → ∞)
20 / 21

Conclusion
Considered a state-dependent channel with state information at the transmitter
Characterized the covert capacity when a sufficiently long secret key is shared between the transmitter and the receiver
Derived lower bounds on the rate of the secret key needed to achieve the covert capacity
For certain channel models, showed that the covert capacity is positive with CSI at the transmitter but zero without it
Full version will be uploaded to arXiv soon
21 / 21