Hypothesis Testing with Communication Constraints
Dinesh Krithivasan
EECS 750
April 17, 2006
Presentation Outline
1. Hypothesis Testing
   - Stein's Lemma
2. Communication Constraints
   - Problem Formulation
   - General Bivariate Hypothesis Testing
   - A Single-Letter Lower Bound
   - Special Cases
3. Conclusions and References
Bivariate Hypothesis Testing
Given sensor measurements x_1, x_2, ..., x_n, did an earthquake occur or not?
Let X_1, X_2, ..., X_n be i.i.d.
Hypothesis H_0: the distribution is P(x)
Hypothesis H_1: the distribution is Q(x)
Statistician's task: decide on H_0 or H_1 based on x^n = (x_1, x_2, ..., x_n)
Decision rule: declare H_0 if x^n ∈ A ⊆ X^n, else declare H_1
Error Events
Two kinds of errors:
False alarm: declare H_1 when H_0 is true.
Miss: declare H_0 when H_1 is true.
Corresponding probabilities:
P(error of type 1) = α = P^n(A^c)
P(error of type 2) = β = Q^n(A)
Usually there is a trade-off between α and β.
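The trade-off can be seen numerically. The sketch below uses a hypothetical pair of Bernoulli distributions (not from the slides) and sweeps the threshold of a log-likelihood-ratio test, estimating α and β by Monte Carlo: raising the threshold shrinks β at the cost of a larger α.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary example: H0 says X ~ Bernoulli(0.2), H1 says X ~ Bernoulli(0.6)
p, q = 0.2, 0.6
n, trials = 200, 2000

def block_llr(x):
    """Per-block log-likelihood ratio log P(x^n)/Q(x^n) (natural log)."""
    x = x.astype(float)
    return (x * np.log(p / q) + (1 - x) * np.log((1 - p) / (1 - q))).sum(axis=1)

x0 = rng.random((trials, n)) < p   # blocks drawn under H0
x1 = rng.random((trials, n)) < q   # blocks drawn under H1
l0, l1 = block_llr(x0), block_llr(x1)

alphas, betas = [], []
for t in np.linspace(-100, 100, 9):    # acceptance region A = {x^n : llr >= t}
    alphas.append(float(np.mean(l0 < t)))    # type-1 (false alarm) probability
    betas.append(float(np.mean(l1 >= t)))    # type-2 (miss) probability
    print(f"t={t:7.1f}  alpha={alphas[-1]:.3f}  beta={betas[-1]:.3f}")
```

As the threshold grows, α climbs toward 1 while β collapses toward 0, illustrating the trade-off on the slide.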
Stein's Lemma
Let α go to 0 arbitrarily slowly with n.
What is the best we can do as regards the probability of type-2 error β?
The answer is given by Stein's lemma:

    θ(ε) := lim_{n→∞} −(1/n) log β_n(ε) = D(P‖Q)   for all ε ∈ (0, 1)

Can be proved by using the typical set as the acceptance region.
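A quick numerical sanity check of the exponent, using hypothetical Bernoulli choices of P and Q: with the typical set of P as the acceptance region, β_n can be computed exactly in the log domain. With a fixed typicality margin μ the exponent converges to min over |t − p| ≤ μ of D(t‖q), which approaches D(P‖Q) as μ → 0, so for small μ the measured exponent should sit close to D(P‖Q).

```python
import math

def log_binom_pmf(k, n, r):
    """log of the Binomial(n, r) pmf at k, computed stably via lgamma."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(r) + (n - k) * math.log(1 - r))

def logsumexp(vals):
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

p, q, mu = 0.2, 0.6, 0.02   # hypothetical P, Q and typicality margin
D = p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))  # D(P||Q), nats

exps = []
for n in (200, 1000, 5000):
    ks = [k for k in range(n + 1) if abs(k / n - p) <= mu]  # typical set under P
    log_beta = logsumexp([log_binom_pmf(k, n, q) for k in ks])  # beta_n = Q^n(A)
    exps.append(-log_beta / n)
    print(f"n={n:5d}  -log(beta)/n = {exps[-1]:.4f}   D(P||Q) = {D:.4f}")
```

The measured exponent is slightly below D(P‖Q) because μ > 0; shrinking μ (at the price of larger n for a small α) closes the gap, as Stein's lemma predicts.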
Problem Formulation
The common assumption is that all the data is known to the statistician in advance.
What if the statistician can be informed about the data only at a finite rate R?
This is not a significant constraint if the data is collected at a single location, or if only one random variable is present.
In that case, transmission of one bit is sufficient to enable the optimal decision.
The problem is interesting when different variables are measured at different locations.
Problem formulation n X X Encoder n f( X ) Statistician H 0 n Y Y Encoder g( Y n ) H 1 This notion of encoding is more general than standard source coding Dinesh Krithivasan (EECS 750) Hyp. testing with comm. constraints April 17, 2006 9 / 21
General Bivariate Hypothesis Testing
Hypothesis testing between two arbitrary hypotheses: P_XY (hypothesis H_0) and P_X̄Ȳ (hypothesis H_1)
The statistician observes X^n and Y^n via encoding functions of rates (R_1, R_2)
We are interested in the asymptotics of θ_(R_1,R_2)(n, ε)
Assume for simplicity that R_2 = ∞ (one-sided compression)
We will derive an achievable lower bound θ_L(R, ε) to θ(R, ε)
Key Ideas
Choose the acceptance region to be the typical set under hypothesis H_0
The decoder has access only to the types P_{u^n}, P_{y^n} and the joint type P_{u^n,y^n}
The decoder reproduces the largest family of random variables it can using the available information
The exponent will be the divergence between the families resulting from H_0 and H_1
The larger the family, the larger the divergence, since D(X_1 Y_1 ‖ X_2 Y_2) ≥ D(X_1 ‖ X_2)
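The monotonicity claim D(X_1 Y_1 ‖ X_2 Y_2) ≥ D(X_1 ‖ X_2) is the chain rule (marginalization can only lose divergence). A quick numerical check with random joint distributions on a hypothetical 3×4 alphabet:

```python
import numpy as np

rng = np.random.default_rng(1)

def kl(a, b):
    """KL divergence D(a||b) in nats between distributions given as flat arrays."""
    a, b = np.asarray(a, float).ravel(), np.asarray(b, float).ravel()
    m = a > 0
    return float(np.sum(a[m] * np.log(a[m] / b[m])))

# Two random joint distributions P(X1,Y1) and P(X2,Y2) on a 3x4 alphabet
P1 = rng.random((3, 4)); P1 /= P1.sum()
P2 = rng.random((3, 4)); P2 /= P2.sum()

joint_div = kl(P1, P2)                            # D(X1 Y1 || X2 Y2)
marg_div = kl(P1.sum(axis=1), P2.sum(axis=1))     # D(X1 || X2)
print(f"joint divergence = {joint_div:.4f}, marginal divergence = {marg_div:.4f}")
```

Rerunning with any seed, the joint divergence never falls below the marginal one, which is why enlarging the reproduced family can only increase the exponent.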
Lemma
We need the following lemma.
Let U, X, Y be finite-alphabet random variables such that U → X → Y. Then there exist u_1, ..., u_M ∈ T_μ^n(U), with M = exp[n(I(U;X) + η)], and M disjoint subsets C_1, ..., C_M with C_i ⊆ T_μ^n(X|u_i), for which

    P( X^n Y^n ∈ ∪_{i=1}^M C_i × T_μ^n(Y|u_i) ) ≥ 1 − δ

for any fixed η > 0 and δ > 0.
The proof uses standard information-theoretic ideas.
Hypothesis Testing Scheme
Let M, u_i, C_i be as given in the lemma.
The X-encoder is defined as

    f(x^n) = i if x^n ∈ C_i, and f(x^n) = 0 otherwise

The statistician has access to i ∈ {1, 2, ..., M} and y^n
Decision rule: declare H_0 if y^n ∈ T_μ^n(Y|u_i)
Acceptance Region
The decision rule induces an acceptance region A_n given by

    A_n = ∪_{i=1}^M ( C_i × T_μ^n(Y|u_i) )

No single module has all the information required to determine whether (x^n, y^n) ∈ A_n
The probability of type-1 error is bounded by the lemma: α_n = P(X^n Y^n ∈ A_n^c) ≤ δ
It remains to bound β_n = Σ_{(x^n,y^n) ∈ A_n} P( (X̄^n, Ȳ^n) = (x^n, y^n) )
This can be done by type-counting in A_n
An Achievable Lower Bound
Two sets of auxiliary random variables:

    S(R) = { U : I(U;X) ≤ R, U → X → Y }
    L(U) = { Ũ X̃ Ỹ : P(Ũ X̃) = P(UX), P(Ũ Ỹ) = P(UY) }

S(R) describes the X-encoder; the rate constraint is automatically met
The decoder can reproduce the set of joint types P_{(u^n,y^n)} and P_{(u^n,x^n)}
Define Ū to satisfy Ū → X̄ → Ȳ and P(Ū|X̄) = P(U|X)
The same encoder is used under either hypothesis
For every R ≥ 0 and 0 < ε < 1, the exponent

    θ_L(R, ε) = sup_{U ∈ S(R)} min_{Ũ X̃ Ỹ ∈ L(U)} D( Ũ X̃ Ỹ ‖ Ū X̄ Ȳ )

is achievable.
Special Case: Lower Bound of Ahlswede and Csiszár
For any U ∈ S(R), we have

    θ_L(R, ε) ≥ D(X ‖ X̄) + D(UY ‖ U Ŷ)

where Ŷ is such that U → X → Ŷ and P(Ŷ|X) = P(Ȳ|X̄)
This follows from simple algebraic manipulations
This lower bound does not exploit the joint type P_{(u^n,x^n)} and is consequently weaker
Special Case: Test Against Independence
Suppose P(X̄ Ȳ) = P(X) P(Y). Then for any 0 < ε < 1,

    θ_L(R, ε) ≥ max_{U ∈ S(R)} I(U;Y)

This follows from simple algebraic manipulations
This case was completely solved by Ahlswede and Csiszár, who proved the converse as well
Their proof used divergence-characterization techniques
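For a concrete feel (an illustrative setup, not from the slides), the bound max I(U;Y) subject to I(U;X) ≤ R can be evaluated for a doubly symmetric binary source: X uniform on {0,1} and Y the output of a BSC(0.1) driven by X. For simplicity the sketch restricts the search to symmetric binary test channels U, with P(U ≠ X) = d, so it is a lower estimate of the maximum rather than the exact optimum:

```python
import numpy as np

def h2(t):
    """Binary entropy in bits, clipped away from 0 and 1 for numerical safety."""
    t = np.clip(t, 1e-12, 1 - 1e-12)
    return -t * np.log2(t) - (1 - t) * np.log2(1 - t)

eps = 0.1   # Y = X through a BSC(0.1); X ~ Bernoulli(1/2)  (hypothetical setup)
R = 0.5     # rate constraint I(U;X) <= R, in bits

best = 0.0
for d in np.linspace(0.0, 0.5, 101):
    IUX = 1 - h2(d)            # symmetric test channel: I(U;X) = 1 - h2(d)
    if IUX > R:
        continue               # violates the rate constraint
    # U -> X -> Y: U sees Y through the cascade BSC(d) then BSC(eps)
    delta = d * (1 - eps) + (1 - d) * eps
    best = max(best, 1 - h2(delta))
print(f"max I(U;Y) over symmetric U with I(U;X) <= {R}: {best:.4f} bits")
```

By the data-processing inequality the result can never exceed R, and it shrinks to 0 as R → 0, matching the intuition that a lower description rate weakens the achievable exponent.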
Further Comments
If R ≥ H(X), then the lower bound becomes θ_L(R, ε) = D(XY ‖ X̄Ȳ)
The extension to two-sided compression is straightforward; it involves the introduction of further auxiliary random variables V and Ṽ
This approach seems best suited to obtaining achievability results; divergence-characterization techniques are better suited for converses
The lower bound can be significantly tightened: the above encoding scheme aims at zero-error reconstruction of the joint types, but one can consider encoders that reconstruct the joint types with exponentially small error probability
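One consistency check, using a hypothetical joint distribution: when H_1 is the independent pair with matching marginals, the full-rate exponent D(XY ‖ X̄Ȳ) is exactly I(X;Y), which agrees with the test-against-independence bound max I(U;Y) evaluated at R ≥ H(X) (where U = X is admissible):

```python
import numpy as np

def kl(a, b):
    """KL divergence D(a||b) in nats between flat distributions."""
    a, b = np.asarray(a, float).ravel(), np.asarray(b, float).ravel()
    m = a > 0
    return float(np.sum(a[m] * np.log(a[m] / b[m])))

def H(p):
    """Shannon entropy in nats."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(2)
Pxy = rng.random((3, 3)); Pxy /= Pxy.sum()     # joint under H0 (hypothetical)
Px, Py = Pxy.sum(axis=1), Pxy.sum(axis=0)
Pind = np.outer(Px, Py)                        # H1: independent, same marginals

full_rate_exp = kl(Pxy, Pind)                  # D(XY || Xbar Ybar)
I_xy = H(Px) + H(Py) - H(Pxy)                  # I(X;Y) via entropies
print(f"D(XY||XbarYbar) = {full_rate_exp:.4f}, I(X;Y) = {I_xy:.4f}")
```

The two quantities agree to numerical precision, as the identity D(P_XY ‖ P_X P_Y) = I(X;Y) requires.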
Conclusions
Bivariate hypothesis testing with one-sided data compression was studied
A single-letter lower bound to the power exponent was derived
This bound subsumes other known bounds and achievability results
It is easily extendable to the two-sided compression case
Other statistical problems, such as parameter estimation and pattern classification, have also been studied under similar rate constraints
References
R. Ahlswede and I. Csiszár, "Hypothesis Testing with Communication Constraints," IEEE Trans. on Inform. Theory, vol. IT-32, no. 4, July 1986
Te Sun Han, "Hypothesis Testing with Multiterminal Data Compression," IEEE Trans. on Inform. Theory, vol. IT-33, no. 6, November 1987
Te Sun Han and Shun-ichi Amari, "Statistical Inference Under Multiterminal Data Compression," IEEE Trans. on Inform. Theory, vol. 44, no. 6, October 1998
R. Ahlswede and J. Körner, "Source Coding with Side Information and a Converse for Degraded Broadcast Channels," IEEE Trans. on Inform. Theory, vol. IT-21, no. 6, November 1975