On the Correlation between Boolean Functions of Sequences of Random Variables


Farhad Shirani Chaharsooghi, Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, Michigan, 48105. Email: fshirani@umich.edu
S. Sandeep Pradhan, Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, Michigan, 48105. Email: pradhanv@umich.edu

arXiv:1702.01353v1 [cs.IT] 4 Feb 2017

Abstract: In this paper, we establish a new inequality tying together the effective length and the maximum correlation between the outputs of an arbitrary pair of Boolean functions which operate on two sequences of correlated random variables. We derive a new upper-bound on the correlation between the outputs of these functions. The upper-bound is useful in various disciplines which deal with common information. We build upon Witsenhausen's [2] bound on maximum correlation. The previous upper-bound did not take the effective length of the Boolean functions into account. One possible application of the new bound is to characterize the communication-cooperation tradeoff in multi-terminal communications. In this problem, there are lower bounds on the effective length of the Boolean functions due to the rate-distortion constraints in the problem, as well as lower bounds on the output correlation at different nodes due to the multi-terminal nature of the problem.

I. Introduction

A fundamental problem of broad theoretical and practical interest is to characterize the maximum correlation between the outputs of a pair of functions of random sequences. Consider the two distributed agents shown in Figure 1. A pair of correlated discrete memoryless sources (DMS) are fed to the two agents. These agents are each to make a binary decision. The goal of the problem is to maximize the correlation between the outputs of these agents subject to specific constraints on the decision functions.
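As a minimal numerical illustration of this setup (an addition for this transcription, not part of the original paper), the sketch below assumes a doubly symmetric binary source pair with crossover probability eps and estimates the output correlation for two choices of decision function: a single-letter decision and a parity (binary addition) decision of larger effective length. All function names and parameters here are illustrative.

```python
import random
from functools import reduce

def sample_dsbs(n, eps, rng):
    """Sample (x^n, y^n): X_i ~ Bernoulli(1/2), Y_i = X_i flipped w.p. eps."""
    x = [rng.randint(0, 1) for _ in range(n)]
    y = [xi ^ (rng.random() < eps) for xi in x]
    return x, y

def output_correlation(e, f, n, eps, trials=200_000, seed=0):
    """Monte Carlo estimate of the Pearson correlation between the
    binary outputs e(X^n) and f(Y^n)."""
    rng = random.Random(seed)
    pairs = [(e(x), f(y)) for x, y in
             (sample_dsbs(n, eps, rng) for _ in range(trials))]
    me = sum(a for a, _ in pairs) / trials
    mf = sum(b for _, b in pairs) / trials
    cov = sum((a - me) * (b - mf) for a, b in pairs) / trials
    return cov / ((me * (1 - me)) * (mf * (1 - mf))) ** 0.5

eps = 0.1
# Single-letter decision: each agent outputs the first element of its string.
rho_single = output_correlation(lambda x: x[0], lambda y: y[0], n=4, eps=eps)
# Parity of all 4 elements: effective length 4 preserves less correlation,
# (1 - 2*eps)**4 instead of (1 - 2*eps).
rho_parity = output_correlation(lambda x: reduce(lambda a, b: a ^ b, x),
                                lambda y: reduce(lambda a, b: a ^ b, y),
                                n=4, eps=eps)
print(rho_single, rho_parity)  # ≈ 0.80 and ≈ 0.41
```

The gap between the two estimates previews the theme of the paper: decisions of larger effective length preserve less correlation between the agents.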
Fig. 1: Correlated Boolean decision functions. Agent 1 observes X_1, X_2, ..., X_n and outputs e(X^n) ∈ {0, 1}; Agent 2 observes Y_1, Y_2, ..., Y_n and outputs f(Y^n) ∈ {0, 1}.

The study of this setup has had impact on a variety of disciplines, for instance, by taking the agents to be two encoders in the distributed source coding problem [3], [8], or two transmitters in the interference channel problem [8], or Alice and Bob in a secret key-generation problem [4], [5], or two agents in a distributed control problem [6]. A special case of the problem is the study of common information (CI) generated by the two agents. As an example, consider two encoders in a Slepian-Wolf (SW) setup. Let U_1, U_2, and V be independent, non-constant binary random variables. Then, an encoder observing the DMS X = (V, U_1) and an encoder observing Y = (V, U_2) agree on the value of V with probability one. The random variable V is called the CI observed by the two encoders. These encoders require a sum-rate equal to H(V) + H(U_1) + H(U_2) to transmit the source to the decoder. This gives a reduction in rate equal to the entropy of V, compared to the transmission of the sources over independent point-to-point channels. The gain in performance is directly related to the entropy of the CI. So, it is desirable to maximize the entropy of the CI between the encoders. In [1], the authors investigated multi-letterization as a method for increasing the CI. They showed that multi-letterization does not lead to an increase in the CI. More precisely, they prove the following statement: Let X and Y be two sequences of DMSs. Let f_n(X^n) and g_n(Y^n) be two sequences of functions which converge to one another in probability. Then, the normalized entropies (1/n)H(f_n(X^n)) and (1/n)H(g_n(Y^n)) are less than or equal to the entropy of the CI between X and Y for large n. A stronger version of the result was proved by Witsenhausen [2], where the maximum correlation between the outputs is upper-bounded subject to the following restrictions on the decision functions: 1) The entropy of the binary output is fixed. 2) The agents cooperate with each other.
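The rate accounting in the Slepian-Wolf example above can be checked in a few lines; the Bernoulli parameters below are arbitrary choices for illustration, not values from the paper.

```python
from math import log2

def h2(p):
    """Binary entropy of a Bernoulli(p) source, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# V, U1, U2 independent binary sources; X = (V, U1), Y = (V, U2).
pV, pU1, pU2 = 0.5, 0.3, 0.2

# Transmitting X and Y over independent point-to-point links costs H(X) + H(Y):
separate = (h2(pV) + h2(pU1)) + (h2(pV) + h2(pU2))
# A Slepian-Wolf scheme exploiting the common part V needs only:
sw_sum_rate = h2(pV) + h2(pU1) + h2(pU2)

print(separate - sw_sum_rate)  # rate reduction = H(V) = 1.0 bit
```

The saving is exactly H(V), the entropy of the common information, matching the discussion above.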
It was shown that maximum correlation is achieved if both users output a single element of the string without further processing (e.g. each user outputs the first element of its corresponding string). This was used to conclude that common information cannot be induced by multi-letterization. While the result was used extensively in a variety of areas such as information theory, security, and control [4], [5], [6], in many problems there are additional constraints on the set of admissible decision functions. For example, one can consider constraints on the effective length of the decision functions. This is a valid assumption, for instance, in the case

of communication systems, where the users have lower bounds on their effective lengths due to the rate-distortion requirements in the problem [8]. In this paper, the problem under these additional constraints is considered. A new upper-bound on the correlation between the outputs of arbitrary pairs of Boolean functions is derived. The bound is presented as a function of the dependency spectrum of the Boolean functions. This is done in several steps. First, the effective length of an additive Boolean function is defined. Then, we use a method similar to [2], and map the Boolean functions to the set of real-valued functions. Using tools in real analysis, we find an additive decomposition of these functions. The decomposition components have well-defined effective lengths. Using the decomposition, we find the dependency spectrum of the Boolean function. The dependency spectrum is a generalization of the effective length and is defined for non-additive Boolean functions. Lastly, we use the dependency spectrum to derive the new upper-bound. The rest of the paper is organized as follows: Section II presents the notation used in the paper. Section III develops useful mathematical machinery to analyze Boolean functions. Section IV contains the main result of the paper. Finally, Section V concludes the paper.

II. Notation

In this section, we introduce the notation used in this paper. We represent random variables by capital letters such as X, U. Sets are denoted by calligraphic letters such as 𝒳, 𝒰. In particular, the sets of natural numbers and real numbers are denoted by N and R, respectively. The n-length vector of random variables (X_1, X_2, ..., X_n), X_i ∈ 𝒳, is denoted by X^n ∈ 𝒳^n. The binary string i = (i_1, i_2, ..., i_n), i_j ∈ {0, 1}, is written as i. The vector of random variables (X_{j_1}, X_{j_2}, ..., X_{j_k}), j_i ∈ [1, n], j_1 < j_2 < ... < j_k, is denoted by X_i, where i_{j_l} = 1, ∀l ∈ [1, k]. For example, take n = 3; the vector (X_1, X_3) is denoted by X_101, and the vector (X_1, X_2) by X_110. For two binary strings i, j, we write i ≤ j if and only if i_k ≤ j_k, ∀k ∈ [1, n], and i < j if in addition i ≠ j. For a binary string i we define N_i ≜ w_H(i), where w_H denotes the Hamming weight.
Lastly, the vector ī is the element-wise complement of i.

III. The Dependency Spectrum of a Function

In this section, we study the correlation between the output of a Boolean function and subsets of the input. In particular, we are interested in the answers to questions such as: How strongly does the first element X_1 affect the output of e(X^n)? Is this effect amplified when we take X_2 into account as well? Is there a subset of random variables that (almost) determines the value of the output? We formulate these questions in mathematical terms, and find a characterization of the dependency spectrum of a Boolean function. The dependency spectrum is a vector which captures the correlation between different subsets of the input elements and each element of the output. As an intermediate step, we define the effective length of an additive Boolean function below:

Definition 1. For a Boolean function e : {0, 1}^n → {0, 1} defined by e(X^n) = ⊕_{i∈J} X_i, J ⊂ [1, n], where the addition operator is binary addition, the effective length is defined as the cardinality of the set J.

For a general Boolean function (e.g. a non-additive one), we find a decomposition of e into a set of functions e_i, i ∈ {0, 1}^n, whose effective lengths are well-defined. First, we provide a mapping from the set of Boolean functions to the set of real functions. This allows us to use the tools available in real analysis to analyze these functions. Fix a discrete memoryless source X, and a Boolean function e : {0, 1}^n → {0, 1}. Let P(e(X^n) = 1) = q. The real-valued function corresponding to e is represented by ẽ, and is defined as follows:

ẽ(X^n) = 1 − q, if e(X^n) = 1; ẽ(X^n) = −q, otherwise.   (1)

Remark 1. Note that ẽ has zero mean and variance q(1 − q).

The random variable ẽ(X^n) has finite variance on the probability space (𝒳^n, 2^{𝒳^n}, P_{X^n}). The set of all such functions is denoted by H_{X,n}. More precisely, we define H_{X,n} ≜ L²(𝒳^n, 2^{𝒳^n}, P_{X^n}) as the separable Hilbert space of all measurable functions h : 𝒳^n → R. Since X is a DMS, the isomorphism relation

H_{X,n} ≅ H_{X,1} ⊗ H_{X,1} ⊗ ... ⊗ H_{X,1}   (2)

holds [7], where ⊗ denotes the tensor product.

Example 1. Let n = 1.
The Hilbert space H_{X,1} is the space of all measurable functions h : 𝒳 → R. The space is spanned by the two linearly independent functions h_1(X) = 1(X) and h_2(X) = 1(X̄), where X = X_1. We conclude that the space is two-dimensional.

Remark 2. The tensor operation in H_{X,n} is real multiplication, i.e. for f_1, f_2 ∈ H_{X,1}: f_1(X_1) ⊗ f_2(X_2) = f_1(X_1) f_2(X_2). Let {f_i(X) | i ∈ [1, d]} be a basis for H_{X,1}; then a basis for H_{X,n} is the set of all real multiplications of these basis elements: {Π_{j∈[1,n]} f_{i_j}(X_j), i_j ∈ [1, d]}.

Example 1 gives a decomposition of the space H_{X,1}. Next, we introduce another decomposition of H_{X,1} which turns out to be very useful. Let I_{X,1} be the subset of all measurable functions of X which have zero mean, and let γ_{X,1} be the set of constant real functions of X. We argue that H_{X,1} = I_{X,1} ⊕ γ_{X,1} gives a decomposition of H_{X,1}. I_{X,1} and γ_{X,1} are linear subspaces of H_{X,1}. I_{X,1} is the null space of the linear functional which takes an arbitrary function f ∈ H_{X,1} to its expected value E_X(f). The null space of any non-zero linear functional is a hyperplane in H_{X,1}. So, I_{X,1} is a one-dimensional subspace of H_{X,1}. From Remark 1, ẽ_1 ∈ I_{X,1}. We conclude that any element of I_{X,1} can be written as c·ẽ_1(X^n), c ∈ R. γ_{X,1} is also one-dimensional; it is spanned by the function g(X) = 1. Consider an arbitrary element f ∈ H_{X,1}. One can write f = f_1 + f_2, where f_1 = f − E_X(f) ∈ I_{X,1}, and f_2 = E_X(f) ∈ γ_{X,1}. Replacing H_{X,1} with I_{X,1} ⊕ γ_{X,1} in (2), we

have:

H_{X,n} ≅ ⊗_{i=1}^n H_{X,1} ≅ ⊗_{i=1}^n (I_{X,1} ⊕ γ_{X,1}) ≅(a) ⊕_{i∈{0,1}^n} (G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}),   (3)

where G_{i_j} = γ_{X,1} if i_j = 0, G_{i_j} = I_{X,1} if i_j = 1, and in (a) we have used the distributive property of tensor products over direct sums.

Remark 3. Equation (3) can be interpreted as follows: for any ẽ ∈ H_{X,n}, n ∈ N, we can find a decomposition ẽ = Σ_i ẽ_i, where ẽ_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}. ẽ_i can be viewed as the component of ẽ which is only a function of {X_j | i_j = 1}. In this sense, the collection {ẽ_i | Σ_{j∈[1,n]} i_j = k} is the set of components of ẽ whose effective length is k.

In order to clarify the notation, we provide the following example:

Example 2. Let X be a binary symmetric source, and let e(X_1, X_2) = X_1 ∧ X_2 be the binary AND function. The corresponding real function is:

ẽ(X_1, X_2) = 3/4, if (X_1, X_2) = (1, 1); ẽ(X_1, X_2) = −1/4, otherwise.

Lagrange interpolation gives ẽ = X_1 X_2 − 1/4. The decomposition is given by:

ẽ_{1,1} = (X_1 − 1/2)(X_2 − 1/2), ẽ_{1,0} = (1/2)(X_1 − 1/2), ẽ_{0,1} = (1/2)(X_2 − 1/2), ẽ_{0,0} = 0.

The variances of these functions are given below:

Var(ẽ) = 3/16, Var(ẽ_{0,1}) = Var(ẽ_{1,0}) = Var(ẽ_{1,1}) = 1/16.

As we shall see in the next section, these variances play a major role in determining the correlation-preserving properties of ẽ. The vector whose elements are these variances is called the dependency spectrum of e. From the perspective of effective length, the function ẽ has 2/3 of its variance distributed between ẽ_{0,1} and ẽ_{1,0}, which have effective length one, and 1/3 of its variance on ẽ_{1,1}, which has effective length two.

Similar to the above examples, for an arbitrary ẽ ∈ H_{X,n}, n ∈ N, we find a decomposition ẽ = Σ_i ẽ_i, where ẽ_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}. We characterize ẽ_i in terms of products of the basis elements of the G_{i_j}, j ∈ [1, n], using the following result in linear algebra:

Lemma 1 ([7]). Let H_i, i ∈ [1, n], be vector spaces over a field F. Also, let B_i = {v_{i,j} | j ∈ [1, d_i]} be a basis for H_i, where d_i is the dimension of H_i. Then, any element v ∈ ⊗_{i∈[1,n]} H_i can be written as v = Σ_{j_1∈[1,d_1]} Σ_{j_2∈[1,d_2]} ... Σ_{j_n∈[1,d_n]} c_{j_1 j_2 ... j_n} v_{1,j_1} ⊗ v_{2,j_2} ⊗ ... ⊗ v_{n,j_n}.

Since the G_{i_j}'s, j ∈ [1, n], take values from the set {I_{X,1}, γ_{X,1}}, they are all one-dimensional. For the binary source X with P(X = 1) = q, define h̃ as:

h̃(X) = 1 − q, if X = 1; h̃(X) = −q, if X = 0.   (4)
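To make Example 2 concrete, the following sketch (an illustration added in this transcription, using plain Python over the uniform source on two bits) verifies numerically that the four components sum back to ẽ and that their variances reproduce the dependency spectrum 3/16 and 1/16.

```python
from itertools import product

# Uniform binary symmetric source on (X1, X2); e(X1, X2) = X1 AND X2.
outcomes = list(product([0, 1], repeat=2))  # each with probability 1/4

def e_tilde(x1, x2):       # real-valued version: 3/4 on (1,1), -1/4 otherwise
    return x1 * x2 - 0.25  # Lagrange interpolation from Example 2

# Decomposition components from Example 2, indexed by the binary string i.
comps = {
    (1, 1): lambda x1, x2: (x1 - 0.5) * (x2 - 0.5),
    (1, 0): lambda x1, x2: 0.5 * (x1 - 0.5),
    (0, 1): lambda x1, x2: 0.5 * (x2 - 0.5),
    (0, 0): lambda x1, x2: 0.0,
}

def var(f):
    mean = sum(f(*xy) for xy in outcomes) / 4
    return sum((f(*xy) - mean) ** 2 for xy in outcomes) / 4

# The components sum back to e_tilde on every outcome...
assert all(abs(sum(c(*xy) for c in comps.values()) - e_tilde(*xy)) < 1e-12
           for xy in outcomes)
# ...and the variances match the dependency spectrum of Example 2.
print(var(e_tilde), [var(c) for c in comps.values()])
# → 0.1875 [0.0625, 0.0625, 0.0625, 0.0], i.e. 3/16 and 1/16 each
```

One can also check that 2/3 of the total variance sits on the two effective-length-one components, as noted in the example.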
Then, the single-element set {h̃(X)} is a basis for I_{X,1}. Also, the function h(X) = 1 spans γ_{X,1}. So, using Lemma 1, ẽ_i(X^n) = c_i Π_{t: i_t=1} h̃(X_t), c_i ∈ R. We are interested in the variances of the ẽ_i's. In the next proposition, we show that the ẽ_i's are uncorrelated and we derive an expression for the variance of ẽ_i.

Proposition 1. Define P_i as the variance of ẽ_i. The following hold:
1) E(ẽ_i ẽ_j) = 0, ∀i ≠ j; in other words, the ẽ_i's are uncorrelated.
2) P_i = E(ẽ_i²) = c_i² (q(1 − q))^{w_H(i)}.

Proof: 1) follows by direct calculation. 2) holds from the independence of the X_i's.

Next, we find the characterization of ẽ_i.

Lemma 2. ẽ_i = E_{X^n|X_i}(ẽ | X_i) − Σ_{j<i} ẽ_j gives the unique orthogonal decomposition of ẽ into the Hilbert spaces G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}, i ∈ {0, 1}^n.

Proof: Please refer to the Appendix.

The following example clarifies the notation used in Lemma 2.

Example 3. Consider the case where n = 2. We have the following decomposition of H_{X,2}:

H_{X,2} ≅ (I_{X,1} ⊗ I_{X,1}) ⊕ (I_{X,1} ⊗ γ_{X,1}) ⊕ (γ_{X,1} ⊗ I_{X,1}) ⊕ (γ_{X,1} ⊗ γ_{X,1}).   (5)

Let ẽ(X_1, X_2) be an arbitrary function in H_{X,2}. The unique decomposition of ẽ in the form given in (5) is as follows:

ẽ = ẽ_{1,1} + ẽ_{1,0} + ẽ_{0,1} + ẽ_{0,0},
ẽ_{1,1} = ẽ − E_{X_2|X_1}(ẽ | X_1) − E_{X_1|X_2}(ẽ | X_2) + E_{X_1,X_2}(ẽ),
ẽ_{1,0} = E_{X_2|X_1}(ẽ | X_1) − E_{X_1,X_2}(ẽ),
ẽ_{0,1} = E_{X_1|X_2}(ẽ | X_2) − E_{X_1,X_2}(ẽ),
ẽ_{0,0} = E_{X_1,X_2}(ẽ).

It is straightforward to show that each of the ẽ_{i,j}'s, i, j ∈ {0, 1}, belongs to its corresponding subspace. For instance, ẽ_{0,1} is constant in X_1, and is a zero-mean function of X_2 (i.e. E_{X_2}(ẽ_{0,1}(x_1, X_2)) = 0, ∀x_1 ∈ {0, 1}), so ẽ_{0,1} ∈ γ_{X,1} ⊗ I_{X,1}.

The following proposition describes some of the properties of the ẽ_i's which are derived in the proof of Lemma 2:

Proposition 2. The following hold:
1) ∀i: E_{X^n}(ẽ_i) = 0.
2) ∀i ≤ k, we have E_{X^n|X_k}(ẽ_i | X_k) = ẽ_i.
3) E_{X^n}(ẽ_i ẽ_k) = 0, for i ≠ k.
4) ∀k < i: E_{X^n|X_k}(ẽ_i | X_k) = 0.

Lastly, we derive an expression for P_i:

Lemma 3. For an arbitrary e : {0, 1}^n → {0, 1}, let ẽ be the corresponding real function, and let ẽ = Σ_i ẽ_i be the decomposition in the form of Equation (3). The variance of each component in the decomposition is given by the following recursive formula:

P_i = E_{X_i}(E²_{X^n|X_i}(ẽ | X_i)) − Σ_{j<i} P_j, ∀i ∈ F_2^n, where P_0 = 0.

Proof:

P_i = Var(ẽ_i(X^n)) = E_{X^n}(ẽ_i²(X^n)) − E²_{X^n}(ẽ_i(X^n))
(a)= E_{X^n}(ẽ_i²(X^n))
(b)= E_{X^n}((E_{X^n|X_i}(ẽ | X_i) − Σ_{j<i} ẽ_j)²)
(c)= E_{X_i}(E²_{X^n|X_i}(ẽ | X_i)) − 2 Σ_{j<i} E_{X^n}(E_{X^n|X_i}(ẽ | X_i) ẽ_j) + Σ_{j<i} Σ_{k<i} E_{X^n}(ẽ_j ẽ_k)
(d)= E_{X_i}(E²_{X^n|X_i}(ẽ | X_i)) − 2 Σ_{j<i} E_{X^n}(ẽ ẽ_j) + Σ_{j<i} E_{X^n}(ẽ_j²)
(e)= E_{X_i}(E²_{X^n|X_i}(ẽ | X_i)) − 2 Σ_{j<i} P_j + Σ_{j<i} P_j
= E_{X_i}(E²_{X^n|X_i}(ẽ | X_i)) − Σ_{j<i} P_j,

where (a) follows from 1) in Proposition 4, (b) follows from Lemma 2, (c) uses linearity of expectation, (d) holds since for j < i, ẽ_j is a function of X_i by 2) in Proposition 4, so that E(E_{X^n|X_i}(ẽ | X_i) ẽ_j) = E(E_{X^n|X_i}(ẽ ẽ_j | X_i)) = E(ẽ ẽ_j), while the double sum collapses by 3) in Proposition 4, and (e) holds since E(ẽ ẽ_j) = Σ_k E(ẽ_k ẽ_j) = E(ẽ_j²) = P_j, again using Equation (3) together with 1) and 3) in Proposition 4.

Corollary 1. For an arbitrary e : {0, 1}^n → {0, 1} with corresponding real function ẽ and decomposition ẽ = Σ_j ẽ_j, let the variance of ẽ be denoted by P. Then, P = Σ_j P_j.

The corollary is a special case of Lemma 3, where we have taken i to be the all-ones vector. The following provides a definition of the dependency spectrum of a Boolean function:

Definition 2 (Dependency Spectrum). For a Boolean function e, the vector of variances (P_i)_{i∈{0,1}^n} is called the dependency spectrum of e.

In the next section, we will use the dependency spectrum to upper-bound the maximum correlation between the outputs of two arbitrary Boolean functions.

IV. Correlation Preservation in Arbitrary Functions

We proceed with presenting the main result of this paper. Let (X, Y) be a pair of DMSs. Consider two arbitrary Boolean functions e : 𝒳^n → {0, 1} and f : 𝒴^n → {0, 1}. Let q = P(e = 1), r = P(f = 1). Let ẽ = Σ_i ẽ_i and f̃ = Σ_i f̃_i give the decompositions of these functions as defined in the previous section. The following theorem provides an upper-bound on the probability of equality of e(X^n) and f(Y^n).

Theorem 1. Let ε = P(X ≠ Y). The following bounds hold:

P + Q − 2 Σ_i C_i P_i^{1/2} Q_i^{1/2} ≤ P(e(X^n) ≠ f(Y^n)) ≤ 1 − P − Q + 2 Σ_i C_i P_i^{1/2} Q_i^{1/2},

where C_i = (1 − 2ε)^{N_i}, P_i is the variance of ẽ_i (ẽ being the real function corresponding to e), Q_i is the variance of f̃_i, P and Q are the variances of ẽ and f̃, and finally, N_i = w_H(i).

Proof: Please refer to the Appendix.

Remark 4. C_i is decreasing in N_i. So, in order to increase P(e(X^n) = f(Y^n)), most of the variance P should be distributed on the ẽ_i which have lower N_i (i.e. which operate on smaller blocks). In particular, the lower bound is minimized by placing all of the variance on a component with N_i = 1, and setting P_j = 0 otherwise. This recovers the result in [2].

We derived a relation between the dependency spectrum of a Boolean function and its correlation-preserving properties. This can be used in a variety of disciplines. For example, in communication problems, cooperation among different nodes in a network requires correlated outputs, which can be linked to the dependency spectrum through the results derived here. On the other hand, there are restrictions on the dependency spectrum based on the rate-distortion requirements (better performance requires larger effective lengths). We investigate this in [9], and show that the large-blocklength single-letter coding strategies used in networks are sub-optimal in various problems.

V. Conclusion

We derived a new bound on the maximum correlation between Boolean functions operating on pairs of sequences of random variables. The bound was presented as a function of the dependency spectrum of the functions. We developed a new mathematical apparatus for analyzing Boolean functions,

provided formulas for decomposing the Boolean function into additive components, and for calculating the dependency spectrum of these functions. The new bound has wide-ranging applications in security, control, and information theory.

Appendix

A. Proof of Lemma 2

Proof: The uniqueness of such a decomposition follows from the isomorphism relation stated in Equation (3). We prove that the ẽ_i given in the lemma are indeed the decomposition into the components of the direct sum. Equivalently, we show that 1) ẽ = Σ_i ẽ_i, and 2) ẽ_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}, ∀i ∈ {0, 1}^n.

First we check the equality ẽ = Σ_i ẽ_i. Let t denote the n-length vector whose elements are all ones. We have:

ẽ_t = E_{X^n|X_t}(ẽ | X_t) − Σ_{i<t} ẽ_i (a)= ẽ − Σ_{i<t} ẽ_i, so that (b) Σ_{i∈{0,1}^n} ẽ_i = ẽ,

where in (a) we have used 1) X_t = X^n and 2) for any function f of X^n, E_{X^n|X^n}(f(X^n)) = f(X^n); and (b) holds since {i : i < t} ∪ {t} = {0, 1}^n.

It remains to show that ẽ_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}, ∀i ∈ {0, 1}^n. The next proposition provides a means to verify this property.

Proposition 3. Fix i ∈ {0, 1}^n, and define A_0 ≜ {s | i_s = 0} and A_1 ≜ {s | i_s = 1}. A function f is an element of G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n} if and only if 1) it is constant in all X_s, s ∈ A_0, and 2) it has zero mean with respect to all X_s, s ∈ A_1.

Proof: By definition, any element of G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n} satisfies the conditions in the proposition. Conversely, we show that any function f satisfying conditions 1) and 2) is in the tensor product. Write f = Σ_j f_j, f_j ∈ G_{j_1} ⊗ G_{j_2} ⊗ ... ⊗ G_{j_n}. Assume i_k = 1 for some k ∈ [1, n]. Then, by property 2):

0 = E_{X_k}(f) (a)= Σ_j E_{X_k}(f_j) = Σ_{j: j_k=0} f_j,

where we have used linearity of expectation in (a), and the last equality uses the fact that each f_j belongs to G_{j_1} ⊗ G_{j_2} ⊗ ... ⊗ G_{j_n}: the components with j_k = 1 have zero mean in X_k, and those with j_k = 0 are constant in X_k. So f = Σ_{j: j_k=1} f_j whenever i_k = 1. Now assume i_k = 0. Then, by property 1):

f = E_{X_k}(f) = Σ_j E_{X_k}(f_j) = Σ_{j: j_k=0} f_j,

so Σ_{j: j_k=1} f_j = 0 and f = Σ_{j: j_k=0} f_j. Applying the two cases for every k ∈ [1, n], the only surviving component is f = f_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}.

Returning to the original problem, it is enough to show that the ẽ_i's satisfy the conditions in Proposition 3. We prove the stronger result presented in the next proposition.

Proposition 4. The following hold:
1) ∀i: E_{X^n}(ẽ_i) = 0.
2) ∀i ≤ k, we have E_{X^n|X_k}(ẽ_i | X_k) = ẽ_i.
3) E_{X^n}(ẽ_i ẽ_k) = 0, for i ≠ k.
4) ∀k < i: E_{X^n|X_k}(ẽ_i | X_k) = 0.

Proof:
1) For two n-length binary vectors i and j, we write j ≤ i if j_k ≤ i_k, ∀k ∈ [1, n]. The set {0, 1}^n equipped with ≤ is a well-founded set (i.e. any subset of {0, 1}^n has at least one minimal element). The following presents the principle of Noetherian induction on well-founded sets:

Proposition 5 (Principle of Noetherian Induction). Let (A, ≤) be a well-founded set. To prove that a property P(x) is true for all elements x in A, it is sufficient to prove the following: 1) Induction basis: P(x) is true for all minimal elements in A. 2) Induction step: for any non-minimal element x in A, if P(y) is true for all y such that y < x, then P(x) is true.

We will use Noetherian induction to prove the result. Let i_j, j ∈ [1, n], be the j-th element of the standard basis (the binary string with a single one, in position j). Then ẽ_{i_j} = E_{X^n|X_{i_j}}(ẽ | X_{i_j}). By the smoothing property of expectation, E_{X^n}(ẽ_{i_j}) = E_{X^n}(ẽ) = 0. Now fix i, and assume that ∀j < i, E_{X^n}(ẽ_j) = 0. Then,

E_{X^n}(ẽ_i) = E_{X^n}(E_{X^n|X_i}(ẽ | X_i) − Σ_{j<i} ẽ_j) = E_{X^n}(ẽ) − Σ_{j<i} E_{X^n}(ẽ_j) = 0 − 0 = 0.

2) This statement is also proved by induction. E_{X^n|X_i}(ẽ | X_i) is a function of X_i, so, by induction, ẽ_i = E_{X^n|X_i}(ẽ | X_i) − Σ_{j<i} ẽ_j is also a function of X_i; conditioning it on X_k with i ≤ k leaves it unchanged.

3) Let i_j and i_k, j ≠ k, be the j-th and k-th elements of the standard basis. We have:

E_{X^n}(ẽ_{i_j} ẽ_{i_k}) = E_{X^n}(E_{X^n|X_j}(ẽ | X_j) E_{X^n|X_k}(ẽ | X_k)) (a)= E_{X^n}(E_{X^n|X_j}(ẽ | X_j)) E_{X^n}(E_{X^n|X_k}(ẽ | X_k)) (b)= E²_{X^n}(ẽ) = 0,

where we have used the memoryless property of the source in (a), and (b) results from the smoothing property of expectation. We extend the argument by Noetherian induction. Fix i ≠ k, and assume that E_{X^n}(ẽ_j ẽ_{j'}) = 1(j = j') E_{X^n}(ẽ_j²) for all j < i, j' ≤ k and all j ≤ i, j' < k. Then:

E_{X^n}(ẽ_i ẽ_k) = E_{X^n}((E_{X^n|X_i}(ẽ | X_i) − Σ_{j<i} ẽ_j)(E_{X^n|X_k}(ẽ | X_k) − Σ_{j'<k} ẽ_{j'}))
= E_{X^n}(E_{X^n|X_i}(ẽ | X_i) E_{X^n|X_k}(ẽ | X_k)) − Σ_{j'<k} E_{X^n}(ẽ_{j'} E_{X^n|X_i}(ẽ | X_i)) − Σ_{j<i} E_{X^n}(ẽ_j E_{X^n|X_k}(ẽ | X_k)) + Σ_{j<i} Σ_{j'<k} E_{X^n}(ẽ_j ẽ_{j'}).

The second and third terms in the above expression can be simplified as follows. First, note that:

ẽ_i = E_{X^n|X_i}(ẽ | X_i) − Σ_{j<i} ẽ_j ⟺ E_{X^n|X_i}(ẽ | X_i) = Σ_{j≤i} ẽ_j.   (6)

Our goal is to simplify E_{X^n}(ẽ_j E_{X^n|X_k}(ẽ | X_k)). We proceed by considering the different cases:

Case 1: i ≰ k and k ≰ i. Let j < i. Then:

E_{X^n}(ẽ_j E_{X^n|X_k}(ẽ | X_k)) (6)= Σ_{l≤k} E_{X^n}(ẽ_j ẽ_l) = Σ_{l≤k} 1(j = l) E_{X^n}(ẽ_j²) = 1(j ≤ k) E_{X^n}(ẽ_j²).

By the same arguments, for j' < k: E_{X^n}(ẽ_{j'} E_{X^n|X_i}(ẽ | X_i)) = 1(j' ≤ i) E_{X^n}(ẽ_{j'}²). Replacing the terms in the original equality, we get:

E_{X^n}(ẽ_i ẽ_k) = E_{X^n}(E_{X^n|X_i}(ẽ | X_i) E_{X^n|X_k}(ẽ | X_k)) − Σ_{j<i} 1(j ≤ k) E_{X^n}(ẽ_j²) − Σ_{j'<k} 1(j' ≤ i) E_{X^n}(ẽ_{j'}²) + Σ_{j<i, j'<k} 1(j = j') E_{X^n}(ẽ_j²)
(a)= E_{X^n}(E²_{X^n|X_{i∧k}}(ẽ | X_{i∧k})) − Σ_{j≤i∧k} E_{X^n}(ẽ_j²)
(b)= E_{X^n}((Σ_{j≤i∧k} ẽ_j)²) − Σ_{j≤i∧k} E_{X^n}(ẽ_j²) = 0,

where i∧k denotes the element-wise minimum of i and k. Here we have used that, in this case, j ≤ i∧k implies both j < i and j < k, so the three correction sums collapse to a single copy of Σ_{j≤i∧k} E_{X^n}(ẽ_j²); (b) follows from (6) together with the fact that the ẽ_j's are uncorrelated; and (a) is proved below:

E_{X^n}(E_{X^n|X_i}(ẽ | X_i) E_{X^n|X_k}(ẽ | X_k)) = E_{X^n}(E_{X^n|X_{i∧k}}(E_{X^n|X_i}(ẽ | X_i) | X_{i∧k}) E_{X^n|X_{i∧k}}(E_{X^n|X_k}(ẽ | X_k) | X_{i∧k})) = E_{X^n}(E²_{X^n|X_{i∧k}}(ẽ | X_{i∧k})),

where the first equality holds since, the source being memoryless, X_i and X_k are conditionally independent given X_{i∧k}, and the second follows from the smoothing property of expectation.

Case 2: Assume i < k. Then i∧k = i, and:

E_{X^n}(ẽ_i ẽ_k) = E_{X^n}(E²_{X^n|X_i}(ẽ | X_i)) − Σ_{j<i} E_{X^n}(ẽ_j²) − Σ_{j'≤i} E_{X^n}(ẽ_{j'}²) + Σ_{j<i} E_{X^n}(ẽ_j²) = E_{X^n}(E²_{X^n|X_i}(ẽ | X_i)) − Σ_{j≤i} E_{X^n}(ẽ_j²) (6)= 0.

Case 3: When k < i, the proof is similar to Case 2.

4) Clearly, when N_i = 1, the claim holds (the only k < i is the all-zeros string, and E_{X^n}(ẽ_i) = 0 by part 1). Assume the claim is true for all j such that j < i. Take i ∈ {0, 1}^n and t ∈ [1, n] with i_t = 1 arbitrarily. We first prove the claim for k equal to the string obtained from i by setting the t-th coordinate to zero:

E_{X^n|X_k}(ẽ_i | X_k) = E_{X^n|X_k}(E_{X^n|X_i}(ẽ | X_i) | X_k) − Σ_{j<i} E_{X^n|X_k}(ẽ_j | X_k)
(a)= E_{X^n|X_k}(ẽ | X_k) − Σ_{j<i, j≤k} ẽ_j − Σ_{j<i, j≰k} E_{X^n|X_k}(ẽ_j | X_k)
(b)= Σ_{j≤k} ẽ_j − Σ_{j≤k} ẽ_j − Σ_{j<i, j≰k} E_{X^n|X_k}(ẽ_j | X_k)
(c)= −Σ_{j<i, j≰k} E_{X^n|X_k}(ẽ_j | X_k) (d)= 0,

where (a) uses the smoothing property of expectation (X_k is a sub-vector of X_i) and 2) for the components with j ≤ k; (b) uses (6) and the fact that j ≤ k implies j < i (since k < i); and (d) holds since, for each j < i with j ≰ k, we have j∧k < j, so by the memoryless property E_{X^n|X_k}(ẽ_j | X_k) = E_{X^n|X_{j∧k}}(ẽ_j | X_{j∧k}) = 0 by the induction assumption.

Now we extend the result to a general k < i. Fix such a k. For each j < i, either j ≤ k, in which case E_{X^n|X_k}(ẽ_j | X_k) = ẽ_j by 2), or j ≰ k, in which case E_{X^n|X_k}(ẽ_j | X_k) = 0 as in (d) above. Therefore:

E_{X^n|X_k}(ẽ_i | X_k) = E_{X^n|X_k}(ẽ | X_k) − Σ_{j<i} E_{X^n|X_k}(ẽ_j | X_k) (6)= Σ_{j≤k} ẽ_j − Σ_{j≤k} ẽ_j = 0.

Remark 5. The second condition above is equivalent to condition 1) in Proposition 3, and the fourth condition is equivalent to condition 2) in Proposition 3.

Using Propositions 3 and 4, we conclude that ẽ_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n}, ∀i ∈ {0, 1}^n. This completes the proof of Lemma 2.

B. Proof of Theorem 1

Proof: The proof involves three main steps. First, we bound the Pearson correlation between the real-valued functions ẽ and f̃. In the second step, we relate the correlation to the probability that the two functions are equal and derive the lower bound. Finally, in the third step we use the lower bound proved in the first two steps to derive the upper bound.

Step 1: From Remark 1, the expectation of both functions is 0, so the Pearson correlation is given by E_{X^n,Y^n}(ẽ f̃)/(q(1 − q) r(1 − r))^{1/2}. Our goal is to bound this value. We have:

E_{X^n,Y^n}(ẽ f̃) (a)= E_{X^n,Y^n}((Σ_{i∈{0,1}^n} ẽ_i)(Σ_{k∈{0,1}^n} f̃_k)) (b)= Σ_{i∈{0,1}^n} Σ_{k∈{0,1}^n} E_{X^n,Y^n}(ẽ_i f̃_k).   (7)

In (a) we have used Remark 3, and in (b) we use linearity of expectation. Using the fact that ẽ_i ∈ G_{i_1} ⊗ G_{i_2} ⊗ ... ⊗ G_{i_n} and

Lemma 1, we have:

ẽ_i = c_i Π_{t: i_t=1} h̃(X_t), f̃_k = d_k Π_{t: k_t=1} g̃(Y_t), c_i, d_k ∈ R,   (8)

where h̃ is the basis function defined in (4) and g̃ is the analogous basis function for the source Y. We replace ẽ_i and f̃_k in (7):

E_{X^n,Y^n}(ẽ_i f̃_k) (8)= c_i d_k E_{X^n,Y^n}(Π_{t: i_t=1} h̃(X_t) · Π_{t: k_t=1} g̃(Y_t))
(a)= c_i d_k Π_{t: i_t=1, k_t=1} E_{X,Y}(h̃(X_t) g̃(Y_t)) · Π_{t: i_t=1, k_t=0} E_X(h̃(X_t)) · Π_{t: i_t=0, k_t=1} E_Y(g̃(Y_t))
(b)= 1(i = k) c_i d_k Π_{t: i_t=1} E_{X,Y}(h̃(X_t) g̃(Y_t))
(c)≤ 1(i = k) c_i d_k (1 − 2ε)^{N_i} Π_{t: i_t=1} E^{1/2}_X(h̃²(X_t)) E^{1/2}_Y(g̃²(Y_t))
(d)= 1(i = k) (1 − 2ε)^{N_i} P_i^{1/2} Q_i^{1/2} = 1(i = k) C_i P_i^{1/2} Q_i^{1/2}.   (9)

In (a) we have used the fact that in a pair of DMSs, X_i and Y_j are independent for i ≠ j. (b) holds since E_X(h̃) = E_Y(g̃) = 0. We prove (c) in Lemma 4 below. In (d) we have used Proposition 1.

Lemma 4. Let g(X) and h(Y) be two arbitrary zero-mean, real-valued functions; then:

E_{X,Y}(g(X) h(Y)) ≤ (1 − 2ε) E^{1/2}_X(g²(X)) E^{1/2}_Y(h²(Y)).

Proof: Please refer to [9].

Using Equations (7) and (9), we get:

E_{X^n,Y^n}(ẽ f̃) ≤ Σ_i C_i P_i^{1/2} Q_i^{1/2}.

Step 2: We use the result of Step 1 to derive a bound on P(e ≠ f). Define a ≜ P(e(X^n) = 1, f(Y^n) = 1), b ≜ P(e(X^n) = 0, f(Y^n) = 1), c ≜ P(e(X^n) = 1, f(Y^n) = 0), and d ≜ P(e(X^n) = 0, f(Y^n) = 0). Then:

E_{X^n,Y^n}(ẽ(X^n) f̃(Y^n)) = a(1 − q)(1 − r) − b q(1 − r) − c(1 − q)r + d q r.   (10)

We write this equation in terms of σ ≜ P(e(X^n) ≠ f(Y^n)), q, and r using the following relations: 1) a + c = q, 2) b + d = 1 − q, 3) a + b = r, 4) c + d = 1 − r, 5) b + c = σ. Solving the above, we get:

a = (q + r − σ)/2, b = (r + σ − q)/2, c = (q − r + σ)/2, d = 1 − (q + r + σ)/2.   (11)

We replace a, b, c, and d in (10) by their values in (11); the right-hand side simplifies to (q + r − σ)/2 − qr. Combining this with the bound of Step 1:

(q + r − σ)/2 − qr ≤ Σ_i C_i P_i^{1/2} Q_i^{1/2}
⟹ σ ≥ q(1 − r) + r(1 − q) − 2 Σ_i C_i P_i^{1/2} Q_i^{1/2} ≥ q(1 − q) + r(1 − r) − 2 Σ_i C_i P_i^{1/2} Q_i^{1/2},

where the last inequality holds since q(1 − r) + r(1 − q) − q(1 − q) − r(1 − r) = (q − r)² ≥ 0. On the other hand, E_X(ẽ²) = q(1 − q) = Σ_i P_i = P, where the last equality follows from the fact that the ẽ_i's are uncorrelated; similarly r(1 − r) = Σ_i Q_i = Q. This proves the lower bound. Next we use the lower bound to derive the upper bound.

Step 3: The upper bound can be derived by considering the function h(Y^n) which is the complement of f(Y^n) (i.e. h(Y^n) ≜ 1 − f(Y^n)). In this case P(h(Y^n) = 1) = P(f(Y^n) = 0) = 1 − r. The corresponding real function for h(Y^n) is:

h̃(Y^n) = r, if h(Y^n) = 1 (equivalently f(Y^n) = 0); h̃(Y^n) = −(1 − r), if h(Y^n) = 0 (equivalently f(Y^n) = 1).

So, h̃(Y^n) = −f̃(Y^n).
Using the same method as in the previous step, we have:

E_{X^n,Y^n}(ẽ h̃) = −E_{X^n,Y^n}(ẽ f̃) ≤ Σ_i C_i P_i^{1/2} Q_i^{1/2} ⟹ P(e(X^n) ≠ h(Y^n)) ≥ P + Q − 2 Σ_i C_i P_i^{1/2} Q_i^{1/2}.

On the other hand, P(e(X^n) ≠ h(Y^n)) = P(e(X^n) ≠ 1 − f(Y^n)) = P(e(X^n) = f(Y^n)) = 1 − P(e(X^n) ≠ f(Y^n)). So,

1 − P(e(X^n) ≠ f(Y^n)) ≥ P + Q − 2 Σ_i C_i P_i^{1/2} Q_i^{1/2} ⟹ P(e(X^n) ≠ f(Y^n)) ≤ 1 − P − Q + 2 Σ_i C_i P_i^{1/2} Q_i^{1/2}.

This completes the proof.

References

[1] P. Gacs and J. Körner, "Common information is far less than mutual information," Problems of Control and Information Theory, vol. 2, no. 2, pp. 119–162, 1972.
[2] H. S. Witsenhausen, "On sequences of pairs of dependent random variables," SIAM Journal of Applied Mathematics, vol. 28, no. 1, pp. 100–113, 1975.
[3] F. S. Chaharsooghi, A. G. Sahebi, and S. S. Pradhan, "Distributed source coding in absence of common components," in Information Theory Proceedings (ISIT), 2013 IEEE International Symposium on, July 2013, pp. 1362–1366.

[4] A. Bogdanov and E. Mossel, "On extracting common random bits from correlated sources," IEEE Transactions on Information Theory, vol. 57, no. 10, pp. 6351–6355, Oct 2011.
[5] I. Csiszar and P. Narayan, "Common randomness and secret key generation with a helper," IEEE Transactions on Information Theory, vol. 46, no. 2, pp. 344–366, Mar 2000.
[6] A. Mahajan, A. Nayyar, and D. Teneketzis, "Identifying tractable decentralized control problems on the basis of information structure," in 2008 46th Annual Allerton Conference on Communication, Control, and Computing, Sept 2008, pp. 1440–1449.
[7] M. Reed and B. Simon, Methods of Modern Mathematical Physics, I: Functional Analysis. New York: Academic Press Inc. Ltd., 1972.
[8] F. Shirani, M. Heidari, and S. S. Pradhan, "On the Sub-optimality of Single-letter Coding in Multi-letter Communications," arXiv.org, Jan 2017.
[9] F. Shirani and S. S. Pradhan, "On the Correlation between Boolean Functions of Sequences of Random Variables," arXiv.org, Jan 2017.