Common Randomness Principles of Secrecy
Himanshu Tyagi
Department of Electrical and Computer Engineering and Institute for Systems Research
Correlated Data, Distributed in Space and Time
- Sensor Networks
- Cloud Computing
- Biometric Security
- Hardware Security
Secure Processing of Distributed Data
Three classes of problems are studied:
1. Secure Function Computation with Trusted Parties
2. Communication Requirements for Secret Key Generation
3. Querying Eavesdropper

Our Approach
- Identify the underlying common randomness.
- Decompose common randomness into independent components.
Outline
1. Basic Concepts
2. Secure Computation
3. Minimal Communication for Optimum Rate Secret Keys
4. Querying Common Randomness
5. Principles of Secrecy Generation
Basic Concepts
- Multiterminal Source Model
- Interactive Communication Protocol
- Common Randomness
- Secret Key
Multiterminal Source Model
Terminal $i$ observes $X_i^n = (X_{i1},\dots,X_{in})$, for $i = 1,\dots,m$.
Assumptions on the data:
- Data observed at time instant $t$: $X_{\mathcal{M}t} = (X_{1t},\dots,X_{mt})$.
- The probability distribution of $X_1,\dots,X_m$ is known.
- Observations are i.i.d. across time: $X_{\mathcal{M}1},\dots,X_{\mathcal{M}n}$ are i.i.d. rvs.
- Observations are finite-valued.
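As a concrete instance of this model, the sketch below (a toy two-terminal example, not from the talk; the function name and parameter `delta` are illustrative) draws $n$ i.i.d. binary observation pairs with $\Pr(X_1 \ne X_2) = \delta$:

```python
import random

def sample_source(n, delta, seed=0):
    """Draw n i.i.d. observations of a two-terminal binary source:
    X1 ~ Bernoulli(1/2), and X2 differs from X1 with probability delta."""
    rng = random.Random(seed)
    x1 = [rng.randint(0, 1) for _ in range(n)]
    # True == 1 in Python, so XOR-ing with the comparison flips the bit
    x2 = [b ^ (rng.random() < delta) for b in x1]
    return x1, x2

x1, x2 = sample_source(100000, 0.1)
disagreement = sum(a != b for a, b in zip(x1, x2)) / len(x1)
```

The empirical disagreement rate concentrates around $\delta$, matching the i.i.d. assumption across time.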
Interactive Communication Protocol
The terminals communicate over a public network. Assumptions on the protocol:
- Each terminal has access to all the communication.
- Multiple rounds of interactive communication are allowed.
Communication from terminal 1: $F_{11} = f_{11}(X_1^n)$.
Communication from terminal 2: $F_{21} = f_{21}(X_2^n, F_{11})$.
After $r$ rounds of interactive communication: $F = (F_1,\dots,F_m)$.
Common Randomness
Definition. $L$ is an $\epsilon$-common randomness (CR) for $A$ from $F$ if
$P\big(L = L_i(X_i^n, F),\ i \in A\big) \ge 1 - \epsilon$.
Ahlswede-Csiszár 93 and 98.
Secret Key
Definition. An rv $K$ with values in $\mathcal{K}$ is an $\epsilon$-secret key (SK) for $A$ from $F$ if
1. Recoverability: $K$ is an $\epsilon$-CR for $A$ from $F$.
2. Security: $K$ is concealed from an observer of $F$, i.e., $P_{KF} \approx U_{\mathcal{K}} \times P_F$, where $U_{\mathcal{K}}$ is the uniform distribution on $\mathcal{K}$.
Maurer 93, Ahlswede-Csiszár 93, Csiszár-Narayan 04.
Notions of Security
Kullback-Leibler divergence: $s_{\mathrm{in}}(K, F) = D(P_{KF} \,\|\, U_{\mathcal{K}} \times P_F) = \log|\mathcal{K}| - H(K) + I(K \wedge F) \to 0$.
Variational distance: $s_{\mathrm{var}}(K, F) = \| P_{KF} - U_{\mathcal{K}} \times P_F \|_1 \to 0$.
Weak secrecy: $s_{\mathrm{weak}}(K, F) = \frac{1}{n}\, s_{\mathrm{in}}(K, F) \to 0$.
The two strong notions are related (Pinsker's inequality and continuity of entropy):
$\frac{1}{2}\, s_{\mathrm{var}}(K,F)^2 \;\le\; s_{\mathrm{in}}(K,F) \;\le\; s_{\mathrm{var}}(K,F) \log \frac{|\mathcal{K}|}{s_{\mathrm{var}}(K,F)}$.
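These security indices can be computed directly from a joint pmf. A minimal sketch (the joint pmf of a 1-bit key $K$ and a binary message $F$ is invented for illustration), working in nats so the Pinsker-type lower bound $s_{\mathrm{var}}^2/2 \le s_{\mathrm{in}}$ can be checked:

```python
from math import log

# Toy joint pmf of a 1-bit key K and a binary public message F:
# a slightly biased key, weakly correlated with F (illustrative numbers).
P_KF = {(0, 0): 0.26, (0, 1): 0.25, (1, 0): 0.24, (1, 1): 0.25}

P_F = {f: sum(p for (k, ff), p in P_KF.items() if ff == f) for f in (0, 1)}
K_size = 2  # |K|

# s_in = D(P_KF || U_K x P_F), in nats
s_in = sum(p * log(p / ((1 / K_size) * P_F[f]))
           for (k, f), p in P_KF.items() if p > 0)

# s_var = || P_KF - U_K x P_F ||_1
s_var = sum(abs(p - (1 / K_size) * P_F[f]) for (k, f), p in P_KF.items())
```

For a perfect key (uniform and independent of $F$) both indices are zero; the toy pmf above gives small positive values satisfying the lower bound.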
Secret Key Capacity
Definition. An rv $K$ with values in $\mathcal{K}$ is an $\epsilon$-SK for $A$ from $F$ if
1. Recoverability: $K$ is an $\epsilon$-CR for $A$ from $F$.
2. Security: $s(K, F) \to 0$ as $n \to \infty$.
Rate of $K$: $\frac{1}{n} \log|\mathcal{K}|$.
$\epsilon$-SK capacity: $C(\epsilon) =$ supremum over the rates of $\epsilon$-SKs.
SK capacity: $C = \inf_{0 < \epsilon < 1} C(\epsilon)$.
Secret Key Capacity
$R_{\mathrm{CO}}$: minimum rate of "communication for omniscience" for $A$.
Theorem (Csiszár-Narayan 04). The SK capacity is given by $C = H(X_{\mathcal{M}}) - R_{\mathrm{CO}}$, where
$R_{\mathrm{CO}} = \min \sum_{i=1}^m R_i$, such that $\sum_{i \in B} R_i \ge H(X_B \mid X_{B^c})$ for all $B \subsetneq \mathcal{M}$ with $A \not\subseteq B$.
(Maurer 93, Ahlswede-Csiszár 93) For $m = 2$: $C = I(X_1 \wedge X_2)$.
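For the running binary example, the two-terminal formula $C = I(X_1 \wedge X_2)$ specializes to $1 - h(\delta)$ when $X_1$ is an unbiased bit and $\Pr(X_1 \ne X_2) = \delta$. A small sketch (function names are illustrative):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def sk_capacity(delta):
    """Two-terminal SK capacity C = I(X1 ^ X2) for an unbiased binary X1
    with Pr(X1 != X2) = delta: C = 1 - h(delta)."""
    return 1.0 - h(delta)
```

Identical observations ($\delta = 0$) give one key bit per sample; independent observations ($\delta = 1/2$) give none.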
Secure Computation
Computing Functions of Distributed Data
Function computed at terminal $i$: $g_i(x_1,\dots,x_m)$; denote the random value of $g_i$ by $G_i$.
Recoverability: $P\big(G_i^n = \hat{G}_i^{(n)}(X_i^n, F),\ \text{for all } 1 \le i \le m\big) \ge 1 - \epsilon$.
Secure Function Computation
When are functions $g_0, g_1, \dots, g_m$ securely computable?
The value of the private function $g_0$ must not be revealed by the communication.
Definition. Functions $g_0, g_1, \dots, g_m$ are securely computable if
1. Recoverability: $P\big(G_i^n = \hat{G}_i^{(n)}(X_i^n, F),\ i \in \mathcal{M}\big) \to 1$.
2. Security: $I(G_0^n \wedge F) \to 0$.
Secure Function Computation: A Special Case
When is a single function $g$ securely computable? Here every terminal computes $g$, and the private function is $g_0 = g$.
Definition. Function $g$ is securely computable if
1. Recoverability: $P\big(G^n = \hat{G}_i^{(n)}(X_i^n, F),\ i \in \mathcal{M}\big) \to 1$.
2. Security: $I(G^n \wedge F) \to 0$.
A Necessary Condition
If $g$ is securely computable, then $G^n$ constitutes an SK for $\mathcal{M}$.
Therefore, the rate of $G$ cannot exceed the SK capacity, i.e., $H(G) \le C$.
When is $g$ securely computable?
Theorem. If $g$ is securely computable, then $H(G) \le C$. Conversely, $g$ is securely computable if $H(G) < C$.
For a securely computable function $g$:
- Omniscience can be obtained using communication $F$ that is (almost) independent of $G^n$.
- Noninteractive communication suffices.
- Randomization is not needed.
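For the binary example with $g = X_1 \oplus X_2$, the condition $H(G) < C$ reads $h(\delta) < 1 - h(\delta)$, so parity is securely computable exactly below the threshold $h(\delta) = 1/2$. A numerical sketch locating that threshold by bisection (all names illustrative):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def secure_computable(delta):
    """Parity g = X1 xor X2 is securely computable iff H(G) < C,
    i.e. h(delta) < 1 - h(delta)."""
    return h(delta) < 1 - h(delta)

# Bisect for the threshold h(delta) = 1/2 on (0, 1/2), where h is increasing.
lo, hi = 1e-9, 0.5
for _ in range(200):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if h(mid) < 0.5 else (lo, mid)
threshold = (lo + hi) / 2
```

The threshold comes out near $\delta \approx 0.11$, matching the $\tau = 1/2$ row of the binary-source table later in the talk.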
Example: Secure Computation using Secret Keys
Terminal 1 observes $X_1 = (B_1, K)$ and terminal 2 observes $X_2 = (B_2, K)$, where $B_1, B_2, K$ are independent unbiased bits; $K$ is a secret key with $H(K) = 1$. The function to be computed is $g(X_1, X_2) = B_1 \oplus B_2$.
Terminal 1 sends $F = B_1 \oplus K$; terminal 2 recovers $B_1 \oplus B_2 = F \oplus K \oplus B_2$. The communication $F$ is independent of $g$, so the computation is secure.
Example: Secure Computation using Secret Keys ($n$ instances)
For $n$ instances, terminal 1 observes $(B_{11},\dots,B_{1n})$ and keys $(K_1,\dots,K_n)$, and terminal 2 observes $(B_{21},\dots,B_{2n})$ and $(K_1,\dots,K_n)$; the function is $g(X_1^n, X_2^n) = (B_{11} \oplus B_{21},\dots,B_{1n} \oplus B_{2n})$.
Do fewer than $n$ key bits suffice? No: if parity is securely computable, then $1 = H(G) \le C = H(K) = 1$, so the key rate cannot be reduced.
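The one-time-pad scheme in this example can be simulated directly; the sketch below (illustrative, not the talk's code) checks both that terminal 2 always recovers the parity and that the public message $F$ is empirically independent of it:

```python
import random

rng = random.Random(1)
trials = 20000
# count occurrences of F conditioned on the parity g = B1 xor B2
count_f_given_g = {0: {0: 0, 1: 0}, 1: {0: 0, 1: 0}}
for _ in range(trials):
    b1, b2, k = rng.randint(0, 1), rng.randint(0, 1), rng.randint(0, 1)
    f = b1 ^ k           # public message from terminal 1
    g_hat = f ^ k ^ b2   # terminal 2 recovers b1 xor b2
    assert g_hat == b1 ^ b2
    count_f_given_g[b1 ^ b2][f] += 1
```

Since $K$ is uniform and independent of $(B_1, B_2)$, the conditional law of $F$ given the parity is exactly Bernoulli(1/2), which the empirical counts reflect.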
Characterization of Secure Computability
Theorem. The functions $g_0, g_1, \dots, g_m$ are securely computable if $H(X_{\mathcal{M}} \mid G_0) > R^*$, and only if $H(X_{\mathcal{M}} \mid G_0) \ge R^*$.
$R^*$: minimum rate of communication $F$ that attains omniscience (recovery of $X_{\mathcal{M}}^n$ at every terminal) when terminal $i$ decodes using $(X_i^n, G_0^n, G_i^n, F)$.
This is a data compression problem with no secrecy constraint.
Example: Functions of Binary Sources
$X_1, X_2$ are binary with $\Pr(X_1 \ne X_2) = \delta$ (binary symmetric correlation; $h(\delta)$ denotes the binary entropy function).
The functions below are securely computable iff $h(\delta) \le \tau$:

g_0                                 | g_1                                 | g_2               | τ
$X_1 \oplus X_2$                    | $X_1 \oplus X_2$                    | $\phi$            | 1
$X_1 \oplus X_2$                    | $X_1 \oplus X_2$                    | $X_1 \cdot X_2$   | 2/3
$X_1 \oplus X_2$                    | $X_1 \oplus X_2$                    | $X_1 \oplus X_2$  | 1/2
$(X_1 \oplus X_2,\ X_1 \cdot X_2)$  | $(X_1 \oplus X_2,\ X_1 \cdot X_2)$  | $X_1 \cdot X_2$   | $2\delta/3$
Computing the Private Function
Suppose $g_i = g_0$ for all $i$, i.e., every terminal computes the private function itself. Then $R^*$ is the minimum rate of $F$ such that $F$, together with $G_0^n$, yields omniscience at every terminal, and the condition becomes $H(X_{\mathcal{M}} \mid G_0) \ge R^*$.
Consequence: if $g_0$ is securely computable at a terminal, then the entire data can be recovered securely at that terminal.
Minimal Communication for an Optimum Rate Secret Key
Secret Key Generation for Two Terminals
Weak secrecy criterion: $\frac{1}{n}\, s_{\mathrm{in}}(K, F) \to 0$.
Secret key capacity: $C = I(X_1 \wedge X_2)$.
Maurer 93, Ahlswede-Csiszár 93.
Common Randomness for SK Capacity
What is the form of CR that yields an optimum rate SK?
Maurer-Ahlswede-Csiszár:
- Common randomness generated: $X_1^n$ or $X_2^n$.
- Rate of communication required: $\min\{H(X_1 \mid X_2), H(X_2 \mid X_1)\}$.
- Decomposition: $H(X_1) = H(X_1 \mid X_2) + I(X_1 \wedge X_2)$ and $H(X_2) = H(X_2 \mid X_1) + I(X_1 \wedge X_2)$.
Digression: Secret Keys and Biometric Security
Enrollment: from the biometric reading $X_1$, store the key $K(X_1)$ on a secure server and the helper data $F(X_1)$ on a public server.
Authentication: a fresh reading $X_2$, together with the public $F(X_1)$, is used to recover the key, which is checked against the stored $K(X_1)$.
A similar approach can be applied to physically unclonable functions (PUFs).
Common Randomness for SK Capacity (continued)
What is the form of CR that yields an optimum rate SK?

Maurer-Ahlswede-Csiszár:
- Common randomness generated: $X_1^n$ or $X_2^n$.
- Rate of communication required: $\min\{H(X_1 \mid X_2), H(X_2 \mid X_1)\}$.
- Decomposition: $H(X_1) = H(X_1 \mid X_2) + I(X_1 \wedge X_2)$; $H(X_2) = H(X_2 \mid X_1) + I(X_1 \wedge X_2)$.

Csiszár-Narayan:
- Common randomness generated: $(X_1^n, X_2^n)$.
- Rate of communication required: $H(X_1 \mid X_2) + H(X_2 \mid X_1)$.
- Decomposition: $H(X_1, X_2) = H(X_1 \mid X_2) + H(X_2 \mid X_1) + I(X_1 \wedge X_2)$.
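These decompositions are entropy identities and can be checked numerically; a minimal sketch (the binary symmetric joint pmf is chosen purely for illustration):

```python
from math import log2

def H(pmf):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Binary symmetric pair with crossover probability delta (uniform marginals).
delta = 0.1
P = {(0, 0): (1 - delta) / 2, (0, 1): delta / 2,
     (1, 0): delta / 2, (1, 1): (1 - delta) / 2}

P1 = {x: P[(x, 0)] + P[(x, 1)] for x in (0, 1)}
P2 = {y: P[(0, y)] + P[(1, y)] for y in (0, 1)}

H12 = H(P)                   # H(X1, X2)
H1g2 = H12 - H(P2)           # H(X1 | X2)
H2g1 = H12 - H(P1)           # H(X2 | X1)
I12 = H(P1) + H(P2) - H12    # I(X1 ^ X2)
```

The Csiszár-Narayan decomposition $H(X_1, X_2) = H(X_1 \mid X_2) + H(X_2 \mid X_1) + I(X_1 \wedge X_2)$ holds term by term for any joint pmf.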
Characterization of CR for Optimum Rate SK
Theorem. A CR $J$ recoverable from $F$ yields an optimum rate SK iff $\frac{1}{n} I(X_1^n \wedge X_2^n \mid J, F) \to 0$.
Examples: $X_1^n$, or $X_2^n$, or $(X_1^n, X_2^n)$.
Interactive Common Information
Let $J$ be a CR from an $r$-round communication $F$.
$CI_i^r(X_1; X_2)$: minimum rate of $L = (J, F)$ such that $\frac{1}{n} I(X_1^n \wedge X_2^n \mid L) \to 0$.  (*)
$CI_i(X_1 \wedge X_2) := \lim_{r \to \infty} CI_i^r(X_1; X_2)$.
Wyner's common information: $CI(X_1 \wedge X_2)$ is the minimum rate of $L(X_1^n, X_2^n)$ such that (*) holds.
Minimum Communication for Optimum Rate SK
$R_{\mathrm{SK}}^r$: minimum rate of an $r$-round communication $F$ needed to generate an optimum rate SK.
Theorem. The minimum rate $R_{\mathrm{SK}}^r$ is given by $R_{\mathrm{SK}}^r = CI_i^r(X_1; X_2) - I(X_1 \wedge X_2)$.
It follows upon taking the limit $r \to \infty$ that $R_{\mathrm{SK}} = CI_i(X_1 \wedge X_2) - I(X_1 \wedge X_2)$.
- A single letter characterization of $CI_i^r$ is available.
- Binary symmetric rvs: $CI_i^1 = \dots = CI_i^r = \min\{H(X_1), H(X_2)\}$.
- There is an example with $CI_i^1 > CI_i^2$: interaction does help!
Common Information Quantities
The quantities are ordered as
$CI_{\mathrm{GK}}(X_1 \wedge X_2) \le I(X_1 \wedge X_2) \le CI(X_1 \wedge X_2) \le CI_i(X_1 \wedge X_2) \le \min\{H(X_1), H(X_2)\}$,
where $CI_{\mathrm{GK}}$ is the Gács-Körner common information, $CI$ is Wyner's common information, and $CI_i$ is the interactive common information.
$CI_i$ is indeed a new quantity. Binary symmetric rvs: $CI < \min\{H(X_1), H(X_2)\} = CI_i$.
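The bottom end of this chain is easy to compute exactly: the Gács-Körner common information is the entropy of the maximal common function, obtained from the connected components of the bipartite support graph of the joint pmf. A sketch (the two example pmfs are invented for illustration):

```python
from math import log2

def gk_common_information(P):
    """Gacs-Korner common information of (X1, X2): entropy of the maximal
    common function, found by merging x1- and x2-values linked through
    the support of P (connected components of the bipartite support graph)."""
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    for (x, y), p in P.items():
        if p > 0:
            parent[find(('x', x))] = find(('y', y))  # union

    comp_prob = {}
    for (x, y), p in P.items():
        root = find(('x', x))
        comp_prob[root] = comp_prob.get(root, 0.0) + p
    return -sum(p * log2(p) for p in comp_prob.values() if p > 0)

# Block-diagonal joint pmf: one fair bit is common, the rest is local noise.
P_block = {(0, 0): 0.3, (0, 1): 0.2, (2, 2): 0.25, (2, 3): 0.25}

# Full-support binary symmetric pmf: the support graph is connected, so the
# Gacs-Korner common information is 0 even though I(X1 ^ X2) > 0.
P_bsc = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
```

The second example shows the gap at the bottom of the chain: $CI_{\mathrm{GK}} = 0$ while $I(X_1 \wedge X_2) = 1 - h(0.1) > 0$.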
Querying Common Randomness
Common Randomness (recap)
Definition. $L$ is an $\epsilon$-common randomness for $A$ from $F$ if $P\big(L = L_i(X_i^n, F),\ i \in A\big) \ge 1 - \epsilon$.
Ahlswede-Csiszár 93 and 98.
Query Strategy
(Figure: a querier observing $V = v$ asks "Is $U = u_1$?", "Is $U = u_2$?", ... until the answer is yes; the value queried $t$-th has query number $q(u_t \mid v) = t$.)
Massey 94, Arikan 96, Arikan-Merhav 99, Hanawal-Sundaresan 11.
Query Strategy
Given rvs $U, V$ with values in the sets $\mathcal{U}, \mathcal{V}$.
Definition. A query strategy $q$ for $U$ given $V = v$ is a bijection $q(\cdot \mid v) : \mathcal{U} \to \{1,\dots,|\mathcal{U}|\}$, where the querier, upon observing $V = v$, asks the question "Is $U = u$?" in the $q(u \mid v)$-th query.
$q(U \mid V)$: random query number for $U$ upon observing $V$.
Since $q(\cdot \mid v)$ is a bijection, $|\{u : q(u \mid v) < \gamma\}| < \gamma$.
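A natural strategy (the classical choice in the guessing literature, e.g. Massey's) queries values in decreasing order of posterior probability, which minimizes the expected number of queries. A sketch with an invented three-point posterior:

```python
def query_strategy(posterior):
    """Given P(u | v) as a dict u -> prob, return the query order
    q(. | v): u -> query index, guessing more likely values first."""
    order = sorted(posterior, key=posterior.get, reverse=True)
    return {u: i + 1 for i, u in enumerate(order)}

def expected_queries(posterior, q):
    """Expected query number E[q(U | v)] under the posterior."""
    return sum(posterior[u] * q[u] for u in posterior)

post = {'a': 0.5, 'b': 0.3, 'c': 0.2}
q = query_strategy(post)
```

The bijection property also gives the counting bound used in the converse: fewer than $\gamma$ values can have query number below $\gamma$.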
Optimum Query Exponent
Definition. $E \ge 0$ is an $\epsilon$-achievable query exponent if there exists an $\epsilon$-CR $L_n$ for $A$ from $F_n$ such that
$\sup_q P\big(q(L_n \mid F_n) < 2^{nE}\big) \to 0$ as $n \to \infty$,
where the sup is over every query strategy for $L_n$ given $F_n$.
$E^*(\epsilon) = \sup\{E : E \text{ is an } \epsilon\text{-achievable query exponent}\}$.
$E^* = \inf_{0 < \epsilon < 1} E^*(\epsilon)$: optimum query exponent.
Characterization of Optimum Query Exponent
Theorem. For $0 < \epsilon < 1$, the optimum query exponent $E^*$ equals $E^* = E^*(\epsilon) = C$.
Proof.
- Achievability, $E^*(\epsilon) \ge C(\epsilon)$: easy.
- Converse, $E^*(\epsilon) \le C$: main contribution.
Theorem (Strong converse for SK capacity). For $0 < \epsilon < 1$, the $\epsilon$-SK capacity is given by $C(\epsilon) = E^* = C$.
A Single-Shot Converse
For rvs $Y_1,\dots,Y_k$, let $L$ be an $\epsilon$-CR for $\{1,\dots,k\}$ from $F$.
Theorem. Let $\theta$ be such that
$P\big(\{(y_1,\dots,y_k) : P_{Y_1 \dots Y_k}(y_1,\dots,y_k) \le \theta \prod_{i=1}^k P_{Y_i}(y_i)\}\big) \approx 1$.
Then there exists a query strategy $q_0$ for $L$ given $F$ such that
$P\big(q_0(L \mid F) \le \theta^{\frac{1}{k-1}}\big) \ge \frac{1 - \epsilon}{2} > 0$.
Small Cardinality Sets with Large Probabilities
Rényi entropy of order $\alpha$ of a probability measure $\mu$ on $\mathcal{U}$:
$H_\alpha(\mu) = \frac{1}{1 - \alpha} \log \sum_{u \in \mathcal{U}} \mu(u)^\alpha$, $\alpha \ne 1$.
Lemma. There exists a set $\mathcal{U}_\delta \subseteq \mathcal{U}$ with $\mu(\mathcal{U}_\delta) \ge 1 - \delta$ such that $|\mathcal{U}_\delta| \lesssim \exp(H_\alpha(\mu))$, for $0 \le \alpha < 1$.
Conversely, for any set $\mathcal{U}_\delta \subseteq \mathcal{U}$ with $\mu(\mathcal{U}_\delta) \ge 1 - \delta$, $|\mathcal{U}_\delta| \gtrsim \exp(H_\alpha(\mu))$, for $\alpha > 1$.
(Figure: $H_\alpha(\mu)$ decreases in $\alpha$, from $\log|\mathcal{U}|$ at $\alpha = 0$ down through $H(\mu)$ at $\alpha = 1$.)
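The lemma can be illustrated on a single pmf: greedily collect the largest masses until $1 - \delta$ is covered, and compare the resulting set size against the Rényi bounds from both sides. A sketch (the truncated geometric pmf is invented for illustration; the bounds below hold numerically for it, ignoring the lemma's lower-order factors):

```python
from math import log2

def renyi(pmf, alpha):
    """Renyi entropy of order alpha (bits), alpha != 1."""
    return log2(sum(p ** alpha for p in pmf.values() if p > 0)) / (1 - alpha)

def smallest_covering_set_size(pmf, delta):
    """Size of the smallest set of mass >= 1 - delta: take the largest
    masses first (greedy is optimal for this covering problem)."""
    total, size = 0.0, 0
    for p in sorted(pmf.values(), reverse=True):
        total += p
        size += 1
        if total >= 1 - delta:
            break
    return size

# Truncated geometric pmf on {0, ..., 15}, padded so the masses sum to 1.
pmf = {i: 2.0 ** -(i + 1) for i in range(15)}
pmf[15] = 2.0 ** -15

size = smallest_covering_set_size(pmf, 0.1)
```

Here the smallest 0.9-mass set has 4 elements, squeezed between $2^{H_2}$ from below and $2^{H_{1/2}}$ from above.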
In Closing...
Our Approach
- Identify the underlying common randomness.
- Decompose common randomness into independent components.

Secure Computing
- Common randomness: omniscience, with side information $g_0$ for decoding.
- Decomposition: the private function, the communication, and the residual randomness.

Two Terminal Secret Key Generation
- Common randomness: renders the observations conditionally independent.
- Decomposition: the secret key and the communication.

Querying Eavesdropper
- Requiring the number of queries to be as large as possible is tantamount to decomposition into independent parts.
Principles of Secrecy Generation
- Computing the private function $g_0$ at a terminal is as difficult as securely recovering the entire data at that terminal.
- A CR yields an optimum rate SK iff it renders the observations of the two terminals (almost) conditionally independent.
- The almost-independence secrecy criterion is equivalent to imposing a lower bound on the complexity of a querier of the secret.
Supplementary Slides
Sufficiency
Share all data to compute $g$: omniscience, i.e., recovery of $X_{\mathcal{M}}^n$ at every terminal.
Can we attain omniscience using communication $F$ that is (almost) independent of $G^n$?
Total randomness: $H(X_{\mathcal{M}})$. Communication required to share the randomness: $R_{\mathrm{CO}}$. Entropy of $G$: $H(G)$.
Claim: omniscience can be attained using $F$ independent of $G^n$ if $H(G) < H(X_{\mathcal{M}}) - R_{\mathrm{CO}}$.
Random Mappings for Omniscience
$F_i = F_i(X_i^n)$: random mapping of rate $R_i$. With large probability, $F_1,\dots,F_m$ result in omniscience if
$\sum_{i \in B} R_i \ge H(X_B \mid X_{B^c})$, for all $B \subsetneq \mathcal{M}$,
and $R_{\mathrm{CO}} = \min \sum_{i \in \mathcal{M}} R_i$.
Csiszár-Körner 80, Han-Kobayashi 80, Csiszár-Narayan 04.
Independence Properties of Random Mappings
Let $\mathcal{P}$ be a family of $N$ pmfs on $\mathcal{X}$ such that $P\big(\{x \in \mathcal{X} : P(x) > 2^{-d}\}\big) \le \epsilon$ for all $P \in \mathcal{P}$.
Balanced Coloring Lemma. The probability that a random mapping $F : \mathcal{X} \to \{1,\dots,2^r\}$ fails to satisfy
$\sum_{i=1}^{2^r} \big| P(F(X) = i) - 2^{-r} \big| \le 3\epsilon$ for some $P \in \mathcal{P}$
is less than $\exp\{r + \log(2N) - (\epsilon^2/3)\, 2^{d - r}\}$.
This is a form of generalized privacy amplification.
Ahlswede-Csiszár 93 and 98, Bennett-Brassard-Crépeau-Maurer 95.
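The balancing phenomenon is easy to observe empirically: a random mapping from a large alphabet into a few bins nearly uniformizes any pmf with small maximum mass. A simulation sketch (alphabet size, rate, and seed are illustrative choices, not the lemma's constants):

```python
import random

def random_mapping(domain, r, seed=0):
    """Uniformly random coloring F: domain -> {0, ..., 2^r - 1}."""
    rng = random.Random(seed)
    return {x: rng.randrange(2 ** r) for x in domain}

def balance_gap(P, F, r):
    """L1 distance between the pushforward pmf of F(X) and the uniform
    distribution on 2^r colors."""
    out = [0.0] * (2 ** r)
    for x, p in P.items():
        out[F[x]] += p
    return sum(abs(q - 2 ** -r) for q in out)

# Uniform pmf on a large alphabet (maximum mass 2^-d), hashed to r bits.
d, r = 14, 3
domain = range(2 ** d)
P = {x: 1.0 / 2 ** d for x in domain}
F = random_mapping(domain, r)
gap = balance_gap(P, F, r)
```

With $2^{14}$ points spread over $2^3$ colors, the empirical L1 gap is on the order of a few percent, consistent with the exponentially small failure probability in the lemma.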
Sufficiency of $H(G) < H(X_{\mathcal{M}}) - R_{\mathrm{CO}}$
Consider random mappings $F_i = F_i(X_i^n)$ of rates $R_i$ such that $\sum_{i \in B} R_i \ge H(X_B \mid X_{B^c})$ for all $B \subsetneq \mathcal{M}$. Then:
- $F$ results in omniscience at all the terminals.
- $F$ is approximately independent of $G^n$.
Note: $I(F_1,\dots,F_m \wedge G^n) \le \sum_{i=1}^m I(F_i \wedge G^n, F_{\mathcal{M} \setminus i})$.
Show $I(F_i \wedge G^n, F_{\mathcal{M} \setminus i}) \to 0$ with probability close to 1, using an extension of the Balanced Coloring Lemma (Lemma 2.7).
More information4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information
4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk
More informationArimoto Channel Coding Converse and Rényi Divergence
Arimoto Channel Coding Converse and Rényi Divergence Yury Polyanskiy and Sergio Verdú Abstract Arimoto proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code
More informationSoft Covering with High Probability
Soft Covering with High Probability Paul Cuff Princeton University arxiv:605.06396v [cs.it] 20 May 206 Abstract Wyner s soft-covering lemma is the central analysis step for achievability proofs of information
More informationLecture 6: Gaussian Channels. Copyright G. Caire (Sample Lectures) 157
Lecture 6: Gaussian Channels Copyright G. Caire (Sample Lectures) 157 Differential entropy (1) Definition 18. The (joint) differential entropy of a continuous random vector X n p X n(x) over R is: Z h(x
More informationCorrelation Detection and an Operational Interpretation of the Rényi Mutual Information
Correlation Detection and an Operational Interpretation of the Rényi Mutual Information Masahito Hayashi 1, Marco Tomamichel 2 1 Graduate School of Mathematics, Nagoya University, and Centre for Quantum
More informationGuesswork Subject to a Total Entropy Budget
Guesswork Subject to a Total Entropy Budget Arman Rezaee, Ahmad Beirami, Ali Makhdoumi, Muriel Médard, and Ken Duffy arxiv:72.09082v [cs.it] 25 Dec 207 Abstract We consider an abstraction of computational
More informationInformation Projection Algorithms and Belief Propagation
χ 1 π(χ 1 ) Information Projection Algorithms and Belief Propagation Phil Regalia Department of Electrical Engineering and Computer Science Catholic University of America Washington, DC 20064 with J. M.
More informationSecret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper
Secret Key Agreement Using Conferencing in State- Dependent Multiple Access Channels with An Eavesdropper Mohsen Bahrami, Ali Bereyhi, Mahtab Mirmohseni and Mohammad Reza Aref Information Systems and Security
More informationOn the Secrecy Capacity of Fading Channels
On the Secrecy Capacity of Fading Channels arxiv:cs/63v [cs.it] 7 Oct 26 Praveen Kumar Gopala, Lifeng Lai and Hesham El Gamal Department of Electrical and Computer Engineering The Ohio State University
More informationChapter 4. Data Transmission and Channel Capacity. Po-Ning Chen, Professor. Department of Communications Engineering. National Chiao Tung University
Chapter 4 Data Transmission and Channel Capacity Po-Ning Chen, Professor Department of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30050, R.O.C. Principle of Data Transmission
More informationErgodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.
Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions
More informationOn the Simulatability Condition in Key Generation Over a Non-authenticated Public Channel
On the Simulatability Condition in Key Generation Over a Non-authenticated Public Channel Wenwen Tu and Lifeng Lai Department of Electrical and Computer Engineering Worcester Polytechnic Institute Worcester,
More informationCommon Information and Decentralized Inference with Dependent Observations
Syracuse University SURFACE Electrical Engineering and Computer Science - Dissertations College of Engineering and Computer Science 8-2013 Common Information and Decentralized Inference with Dependent
More informationSolutions to Set #2 Data Compression, Huffman code and AEP
Solutions to Set #2 Data Compression, Huffman code and AEP. Huffman coding. Consider the random variable ( ) x x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0. 0.04 0.04 0.03 0.02 (a) Find a binary Huffman code
More informationPrivacy Amplification Theorem for Noisy Main Channel
Privacy Amplification Theorem for Noisy Main Channel Valeri Korjik, Guillermo Morales-Luna, and Vladimir B. Balakirsky Telecommunications, CINVESTAV-IPN, Guadalajara Campus Prol. López Mateos Sur No. 590,
More informationCommon Information. Abbas El Gamal. Stanford University. Viterbi Lecture, USC, April 2014
Common Information Abbas El Gamal Stanford University Viterbi Lecture, USC, April 2014 Andrew Viterbi s Fabulous Formula, IEEE Spectrum, 2010 El Gamal (Stanford University) Disclaimer Viterbi Lecture 2
More informationLecture 2: August 31
0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy
More informationSecret-Key Agreement over Unauthenticated Public Channels Part I: Definitions and a Completeness Result
Secret-Key Agreement over Unauthenticated Public Channels Part I: Definitions and a Completeness Result Ueli Maurer, Fellow, IEEE Stefan Wolf Abstract This is the first part of a three-part paper on secret-key
More informationIN THIS paper, we consider a cooperative data exchange
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 62, NO. 7, JULY 2016 3785 Coded Cooperative Data Exchange for a Secret Key Thomas A. Courtade, Member, IEEE, and Thomas R. Halford, Member, IEEE Abstract We
More informationQuantifying Computational Security Subject to Source Constraints, Guesswork and Inscrutability
Quantifying Computational Security Subject to Source Constraints, Guesswork and Inscrutability Ahmad Beirami, Robert Calderbank, Ken Duffy, Muriel Médard Department of Electrical and Computer Engineering,
More informationDistributed Detection With Vector Quantizer Wenwen Zhao, Student Member, IEEE, and Lifeng Lai, Member, IEEE
IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, VOL 2, NO 2, JUNE 206 05 Distributed Detection With Vector Quantizer Wenwen Zhao, Student Member, IEEE, and Lifeng Lai, Member, IEEE
More informationSimple and Tight Bounds for Information Reconciliation and Privacy Amplification
Simple and Tight Bounds for Information Reconciliation and Privacy Amplification Renato Renner 1 and Stefan Wolf 2 1 Computer Science Department, ETH Zürich, Switzerland. renner@inf.ethz.ch. 2 Département
More informationLecture 3. Mathematical methods in communication I. REMINDER. A. Convex Set. A set R is a convex set iff, x 1,x 2 R, θ, 0 θ 1, θx 1 + θx 2 R, (1)
3- Mathematical methods in communication Lecture 3 Lecturer: Haim Permuter Scribe: Yuval Carmel, Dima Khaykin, Ziv Goldfeld I. REMINDER A. Convex Set A set R is a convex set iff, x,x 2 R, θ, θ, θx + θx
More informationIntroduction to Information Theory. B. Škorić, Physical Aspects of Digital Security, Chapter 2
Introduction to Information Theory B. Škorić, Physical Aspects of Digital Security, Chapter 2 1 Information theory What is it? - formal way of counting information bits Why do we need it? - often used
More informationHash Property and Fixed-rate Universal Coding Theorems
1 Hash Property and Fixed-rate Universal Coding Theorems Jun Muramatsu Member, IEEE, Shigeki Miyake Member, IEEE, Abstract arxiv:0804.1183v1 [cs.it 8 Apr 2008 The aim of this paper is to prove the achievability
More informationSource Coding. Master Universitario en Ingeniería de Telecomunicación. I. Santamaría Universidad de Cantabria
Source Coding Master Universitario en Ingeniería de Telecomunicación I. Santamaría Universidad de Cantabria Contents Introduction Asymptotic Equipartition Property Optimal Codes (Huffman Coding) Universal
More informationINFORMATION THEORY AND STATISTICS
CHAPTER INFORMATION THEORY AND STATISTICS We now explore the relationship between information theory and statistics. We begin by describing the method of types, which is a powerful technique in large deviation
More informationInformation measures in simple coding problems
Part I Information measures in simple coding problems in this web service in this web service Source coding and hypothesis testing; information measures A(discrete)source is a sequence {X i } i= of random
More informationRemote Source Coding with Two-Sided Information
Remote Source Coding with Two-Sided Information Basak Guler Ebrahim MolavianJazi Aylin Yener Wireless Communications and Networking Laboratory Department of Electrical Engineering The Pennsylvania State
More informationInformation-Theoretic Security: an overview
Information-Theoretic Security: an overview Rui A Costa 1 Relatório para a disciplina de Seminário, do Mestrado em Informática da Faculdade de Ciências da Universidade do Porto, sob a orientação do Prof
More informationMGMT 69000: Topics in High-dimensional Data Analysis Falll 2016
MGMT 69000: Topics in High-dimensional Data Analysis Falll 2016 Lecture 14: Information Theoretic Methods Lecturer: Jiaming Xu Scribe: Hilda Ibriga, Adarsh Barik, December 02, 2016 Outline f-divergence
More informationOn the Capacity Region for Secure Index Coding
On the Capacity Region for Secure Index Coding Yuxin Liu, Badri N. Vellambi, Young-Han Kim, and Parastoo Sadeghi Research School of Engineering, Australian National University, {yuxin.liu, parastoo.sadeghi}@anu.edu.au
More informationlossless, optimal compressor
6. Variable-length Lossless Compression The principal engineering goal of compression is to represent a given sequence a, a 2,..., a n produced by a source as a sequence of bits of minimal possible length.
More informationAPC486/ELE486: Transmission and Compression of Information. Bounds on the Expected Length of Code Words
APC486/ELE486: Transmission and Compression of Information Bounds on the Expected Length of Code Words Scribe: Kiran Vodrahalli September 8, 204 Notations In these notes, denotes a finite set, called the
More informationCoding of memoryless sources 1/35
Coding of memoryless sources 1/35 Outline 1. Morse coding ; 2. Definitions : encoding, encoding efficiency ; 3. fixed length codes, encoding integers ; 4. prefix condition ; 5. Kraft and Mac Millan theorems
More informationCapacity Bounds for. the Gaussian Interference Channel
Capacity Bounds for the Gaussian Interference Channel Abolfazl S. Motahari, Student Member, IEEE, and Amir K. Khandani, Member, IEEE Coding & Signal Transmission Laboratory www.cst.uwaterloo.ca {abolfazl,khandani}@cst.uwaterloo.ca
More informationAchieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel
Achieving Shannon Capacity Region as Secrecy Rate Region in a Multiple Access Wiretap Channel Shahid Mehraj Shah and Vinod Sharma Department of Electrical Communication Engineering, Indian Institute of
More informationLECTURE 3. Last time:
LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate
More informationSecret Message Capacity of Erasure Broadcast Channels with Feedback
Secret Message Capacity of Erasure Broadcast Channels with Feedback László Czap Vinod M. Prabhakaran Christina Fragouli École Polytechnique Fédérale de Lausanne, Switzerland Email: laszlo.czap vinod.prabhakaran
More informationSecure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel
Secure Degrees of Freedom of the MIMO Multiple Access Wiretap Channel Pritam Mukherjee Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 074 pritamm@umd.edu
More informationLecture 6. Today we shall use graph entropy to improve the obvious lower bound on good hash functions.
CSE533: Information Theory in Computer Science September 8, 010 Lecturer: Anup Rao Lecture 6 Scribe: Lukas Svec 1 A lower bound for perfect hash functions Today we shall use graph entropy to improve the
More informationEE/Stat 376B Handout #5 Network Information Theory October, 14, Homework Set #2 Solutions
EE/Stat 376B Handout #5 Network Information Theory October, 14, 014 1. Problem.4 parts (b) and (c). Homework Set # Solutions (b) Consider h(x + Y ) h(x + Y Y ) = h(x Y ) = h(x). (c) Let ay = Y 1 + Y, where
More informationContext tree models for source coding
Context tree models for source coding Toward Non-parametric Information Theory Licence de droits d usage Outline Lossless Source Coding = density estimation with log-loss Source Coding and Universal Coding
More informationSource Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime
Source Coding and Function Computation: Optimal Rate in Zero-Error and Vanishing Zero-Error Regime Solmaz Torabi Dept. of Electrical and Computer Engineering Drexel University st669@drexel.edu Advisor:
More informationLecture 4 Noisy Channel Coding
Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem
More informationA Formula for the Capacity of the General Gel fand-pinsker Channel
A Formula for the Capacity of the General Gel fand-pinsker Channel Vincent Y. F. Tan Institute for Infocomm Research (I 2 R, A*STAR, Email: tanyfv@i2r.a-star.edu.sg ECE Dept., National University of Singapore
More informationECE Information theory Final (Fall 2008)
ECE 776 - Information theory Final (Fall 2008) Q.1. (1 point) Consider the following bursty transmission scheme for a Gaussian channel with noise power N and average power constraint P (i.e., 1/n X n i=1
More informationFrans M.J. Willems. Authentication Based on Secret-Key Generation. Frans M.J. Willems. (joint work w. Tanya Ignatenko)
Eindhoven University of Technology IEEE EURASIP Spain Seminar on Signal Processing, Communication and Information Theory, Universidad Carlos III de Madrid, December 11, 2014 : Secret-Based Authentication
More informationMemory in Classical Information Theory: A Brief History
Beyond iid in information theory 8- January, 203 Memory in Classical Information Theory: A Brief History sergio verdu princeton university entropy rate Shannon, 948 entropy rate: memoryless process H(X)
More informationQuiz 2 Date: Monday, November 21, 2016
10-704 Information Processing and Learning Fall 2016 Quiz 2 Date: Monday, November 21, 2016 Name: Andrew ID: Department: Guidelines: 1. PLEASE DO NOT TURN THIS PAGE UNTIL INSTRUCTED. 2. Write your name,
More informationArimoto-Rényi Conditional Entropy. and Bayesian M-ary Hypothesis Testing. Abstract
Arimoto-Rényi Conditional Entropy and Bayesian M-ary Hypothesis Testing Igal Sason Sergio Verdú Abstract This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis
More informationGraph Coloring and Conditional Graph Entropy
Graph Coloring and Conditional Graph Entropy Vishal Doshi, Devavrat Shah, Muriel Médard, Sidharth Jaggi Laboratory for Information and Decision Systems Massachusetts Institute of Technology Cambridge,
More information