ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan
Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering
Washington University
2 Urbauer Hall
34-935-473 (Lynda Markham answers)
jao@wustl.edu
Announcements
Problem Set is due.
Other announcements? Questions?
Information Rate Functions and Performance Bounds
Motivation
Chernoff Bound
Binary Hypothesis Testing
Tilted Distributions
Relative Entropy
Information Rate Function
Additivity of Information
Examples: Gaussian, same covariance, different means; Poisson, different means; Gaussian, same mean, different variances
Information Rate Function: Motivation
The receiver operating characteristic is often not easily computable.
Performance is often a function of a few parameters: SNR, N, other statistics.
Bounds on performance may be more easily computed.
The bounds below guarantee a level of performance for optimal tests.
The bounds are exponential in the rate function.
The rate function is additive in information (proportional to N).
Chernoff Bound
Let $x$ be a random variable with probability density function $p_x(X)$. Define the moment generating function
$\Phi(s) = E[e^{sx}] = \int p_x(X) e^{sX}\,dX$
and the log-moment generating function $\varphi(s) = \ln \Phi(s)$ for real-valued $s$. The Chernoff bound is a bound on the tail probability:
$P(x \ge A) = \int_A^\infty p_x(X)\,dX \le \int_A^\infty p_x(X)\, e^{s(X-A)}\,dX \le e^{-sA}\,\Phi(s)$, for all $s \ge 0$,
so $\ln P(x \ge A) \le -sA + \ln \Phi(s)$, for all $s \ge 0$.
Optimizing over $s$ gives the (information) rate function
$I(A) = \sup_s [\, sA - \ln \Phi(s) \,], \qquad \ln P(x \ge A) \le -I(A)$.
The rate function equals the Legendre-Fenchel transform of the log-moment generating function. The probability of a rare event is closely approximated by the Chernoff bound.
For $x \in \mathbb{R}^N$, $\varphi(s) = \ln E[e^{s^T x}]$, and the rate function can be used to bound the probability of any open set in $N$ dimensions:
$\ln P(x \in A) \le -\inf_{X \in A} I(X), \qquad I(X) = \sup_s [\, s^T X - \ln \Phi(s) \,]$.
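To make the bound concrete, here is a minimal numerical sketch, not from the slides, for a standard Gaussian, where the log-moment generating function is $\varphi(s) = s^2/2$ and the rate function is $I(A) = A^2/2$ in closed form; the use of NumPy/SciPy and the tested values of $A$ are assumptions for illustration.

```python
# Minimal sketch of the scalar Chernoff bound for a standard Gaussian,
# where the log-MGF and rate function are known in closed form:
#   phi(s) = ln E[e^{sx}] = s^2/2,  I(A) = sup_s [sA - phi(s)] = A^2/2.
import numpy as np
from scipy.stats import norm

for A in [1.0, 2.0, 3.0, 4.0]:
    exact = norm.sf(A)           # true tail probability P(x >= A)
    bound = np.exp(-A**2 / 2)    # Chernoff bound e^{-I(A)}
    print(f"A={A:.0f}: P(x>=A)={exact:.3e} <= e^(-I(A))={bound:.3e}")
```

The bound is loose by a polynomial factor but captures the exponential decay of the tail, which is the point of the rate-function analysis.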
Binary Hypothesis Testing
Probabilities of miss and false alarm are tail probabilities. Upper bound them using Chernoff bounds and the information rate functions given hypotheses $H_0$ and $H_1$.
$l(r) = \ln \dfrac{p(r|H_1)}{p(r|H_0)}$
$\Phi_0(s) = E[e^{s\,l(r)} | H_0] = \int p_l(L|H_0)\, e^{sL}\,dL, \qquad \varphi_0(s) = \ln \Phi_0(s)$
$P_F = P(l(r) \ge \gamma | H_0), \qquad \ln P(l(r) \ge \gamma | H_0) \le -I_0(\gamma), \qquad I_0(\gamma) = \sup_s [\, s\gamma - \ln \Phi_0(s) \,]$
$\Phi_1(\sigma) = E[e^{\sigma\,l(r)} | H_1] = \int p_l(L|H_1)\, e^{\sigma L}\,dL, \qquad \varphi_1(\sigma) = \ln \Phi_1(\sigma)$
$P_M = P(l(r) < \gamma | H_1), \qquad \ln P_M \le -I_1(\gamma), \qquad I_1(\gamma) = \sup_\sigma [\, \sigma\gamma - \ln \Phi_1(\sigma) \,]$
Performance is better than the computed point: the optimal $(P_D, P_F)$ is up and to the left of the point.
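A hedged sketch of these two bounds for a case where everything is in closed form: a single Gaussian observation with an assumed mean shift $m$ and threshold $\gamma$ (both values are illustrative, not from the slides). Here $l(r) = mr - m^2/2$ is Gaussian under both hypotheses, so $I_0$ and $I_1$ are quadratics.

```python
# Sketch (values assumed) of the Chernoff bounds on P_F and P_M for a
# single Gaussian sample, H0: r ~ N(0,1) vs H1: r ~ N(m,1).  Under H0,
# l(r) ~ N(-m^2/2, m^2); under H1, l(r) ~ N(+m^2/2, m^2), so the rate
# function of a N(mu, v) variable, I(g) = (g - mu)^2 / (2v), applies.
import numpy as np
from scipy.stats import norm

m, gamma = 2.0, 0.5                      # assumed mean shift and threshold
I0 = (gamma + m**2 / 2)**2 / (2 * m**2)  # I0(gamma), valid for gamma >= -m^2/2
I1 = (gamma - m**2 / 2)**2 / (2 * m**2)  # I1(gamma), valid for gamma <= +m^2/2

PF = norm.sf((gamma + m**2 / 2) / m)     # exact P(l(r) >= gamma | H0)
PM = norm.cdf((gamma - m**2 / 2) / m)    # exact P(l(r) <  gamma | H1)
print(f"P_F = {PF:.3e} <= e^-I0 = {np.exp(-I0):.3e}")
print(f"P_M = {PM:.3e} <= e^-I1 = {np.exp(-I1):.3e}")
```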
Aside
Tail probabilities can be on either side. For the bounds, the variable in the (log-) moment generating function is a dummy variable. The variables in the log-moment generating functions under the two hypotheses are different.
$P_M = P(l(r) < \gamma | H_1), \qquad \ln P_M \le -I_1(\gamma)$
$\Phi_1(\sigma) = E[e^{\sigma\,l(r)} | H_1] = \int p_l(L|H_1)\, e^{\sigma L}\,dL \ge e^{\sigma\gamma} \int_{-\infty}^{\gamma} p_l(L|H_1)\,dL$, for all $\sigma \le 0$
$I_1(\gamma) = \sup_\sigma [\, \sigma\gamma - \ln \Phi_1(\sigma) \,]$
Tilted Distributions
The moment generating functions are for the log-likelihood ratio given the hypotheses. If the supremum defining the rate function is achieved at an interior point, then the derivative is zero. Tilt the original distribution until the mean of the log-likelihood function equals the threshold.
$l(r) = \ln \dfrac{p(R|H_1)}{p(R|H_0)}$
$\Phi_0(s) = E[e^{s\,l(r)} | H_0] = \int p(R|H_0) \exp\!\left(s \ln \frac{p(R|H_1)}{p(R|H_0)}\right) dR = \int [p(R|H_1)]^s\, [p(R|H_0)]^{1-s}\,dR$
$I_0(\gamma) = \sup_s [\, s\gamma - \ln \Phi_0(s) \,]$
$\gamma = \dfrac{d}{ds} \ln \Phi_0(s) \Big|_{s=s^*} = E_{s^*}[l(r)]$
$p_{s^*}(R) = \dfrac{p(R|H_0)\, e^{s^* l(R)}}{\Phi_0(s^*)} = \dfrac{[p(R|H_1)]^{s^*}\, [p(R|H_0)]^{1-s^*}}{\Phi_0(s^*)}$
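A sketch of the tilting step for the Poisson pair mentioned in the outline, with assumed rates and threshold: the tilted pmf is again Poisson with mean $\mu_s = \mu_1^s \mu_0^{1-s}$, and $s^*$ is found by making the tilted mean of $l(r)$ equal the threshold. The parameter values and the root-finding approach are illustrative assumptions.

```python
# Sketch of tilting for H0: Poisson(mu0) vs H1: Poisson(mu1).
# l(R) = R ln(mu1/mu0) - (mu1 - mu0), and a direct computation gives
# phi0(s) = ln Phi0(s) = mu_s - s*mu1 - (1-s)*mu0 with mu_s = mu1^s mu0^(1-s);
# the tilted pmf is Poisson(mu_s), so E_s[l(r)] is available in closed form.
import numpy as np
from scipy.optimize import brentq

mu0, mu1, gamma = 2.0, 5.0, 0.0          # assumed rates and threshold

def phi0(s):                             # log-MGF of l(r) under H0
    return mu1**s * mu0**(1 - s) - s * mu1 - (1 - s) * mu0

def tilted_mean_l(s):                    # E_s[l(r)] = d phi0 / ds
    mu_s = mu1**s * mu0**(1 - s)         # tilted distribution is Poisson(mu_s)
    return mu_s * np.log(mu1 / mu0) - (mu1 - mu0)

s_star = brentq(lambda s: tilted_mean_l(s) - gamma, 0.0, 1.0)
I0 = s_star * gamma - phi0(s_star)       # rate function I0(gamma)
print(f"s* = {s_star:.4f}, I0(gamma) = {I0:.4f}")
```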
Relationship to Relative Entropy
Relative entropy is a quantitative measure of information, given in bits or nats:
$D(p\|q) = \int p \log \dfrac{p}{q}\,dx = E_p\!\left[\ln \dfrac{p}{q}\right]$.
The information rate function equals the relative entropy between the tilted pdf and the pdf under the hypothesis. Duality of exponential families and their means: the mean determines the parameter; the parameter determines the mean.
$D(p_{s^*}\|p_0) = \int p_{s^*}(R) \ln \dfrac{p_{s^*}(R)}{p(R|H_0)}\,dR, \qquad p_{s^*}(R) = \dfrac{p(R|H_0)\, e^{s^* l(R)}}{\Phi_0(s^*)}$
$D(p_{s^*}\|p_0) = E_{s^*}[\, s^* l(r) - \ln \Phi_0(s^*) \,] = s^*\gamma - \ln \Phi_0(s^*) = I_0(\gamma)$, where $\gamma = E_{s^*}[l(r)]$.
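Continuing the assumed Poisson pair from the previous sketch, the identity $I_0(\gamma) = D(p_{s^*}\|p_0)$ can be checked numerically; the closed form $D(\mathrm{Poi}(a)\|\mathrm{Poi}(b)) = a\ln(a/b) + b - a$ is standard, and the value of $s^*$ is carried over from that sketch.

```python
# Sketch verifying I0(gamma) = D(p_s* || p0) for the assumed Poisson pair:
# the tilted pmf is Poisson(mu_s), and relative entropy between Poisson
# pmfs has the closed form used below.
import numpy as np

mu0, mu1, s_star = 2.0, 5.0, 0.538        # s* carried over (assumed values)
mu_s = mu1**s_star * mu0**(1 - s_star)    # tilted distribution: Poisson(mu_s)

gamma = mu_s * np.log(mu1 / mu0) - (mu1 - mu0)   # E_{s*}[l(r)]
phi0 = mu_s - s_star * mu1 - (1 - s_star) * mu0  # ln Phi0(s*)
I0 = s_star * gamma - phi0                       # rate function at gamma

D = mu_s * np.log(mu_s / mu0) + mu0 - mu_s       # D(Poi(mu_s) || Poi(mu0))
print(f"I0(gamma) = {I0:.4f}, D(p_s*||p0) = {D:.4f}")
```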
Relationship to Relative Entropy, Part 2
The relative entropy between the tilted density and the density under hypothesis $H_1$ is simply related to that under hypothesis $H_0$.
$\Phi_1(\sigma) = E[e^{\sigma\,l(r)} | H_1], \qquad I_1(\gamma) = \sup_\sigma [\, \sigma\gamma - \ln \Phi_1(\sigma) \,]$
$p(R|H_1)\, e^{\sigma l(R)} = p(R|H_0)\, e^{(\sigma+1) l(R)}$, so $\Phi_1(\sigma) = \Phi_0(\sigma+1)$ and $p_{1,\sigma}(R) = p_{0,\sigma+1}(R)$.
$D(p_\sigma\|p_1) = \int p_\sigma(R) \ln \dfrac{p_\sigma(R)}{p(R|H_1)}\,dR$
$\gamma = \dfrac{d\varphi_1(\sigma)}{d\sigma}\Big|_{\sigma=\sigma^*} = \dfrac{d\varphi_0(s)}{ds}\Big|_{s=\sigma^*+1}$
$D(p_{\sigma^*}\|p_1) = E_{\sigma^*}[\, \sigma^* l(r) - \ln \Phi_1(\sigma^*) \,] = \sigma^*\gamma - \ln \Phi_1(\sigma^*) = I_1(\gamma)$
Summary of Simple Relationships
Find $I_0$ as a function of the threshold; subtract the threshold to get $I_1$. Plot the bounds. Vary parameters to gain a better understanding. Information is additive: for $N$ i.i.d. observations, the pair is $(N I_0, N I_1)$.
$\Phi_0(s) = E[e^{s\,l(r)} | H_0], \qquad \Phi_1(\sigma) = E\!\left[\exp\!\left((\sigma+1)\ln\frac{p(R|H_1)}{p(R|H_0)}\right) \Big| H_0\right]$
$\varphi_1(\sigma) = \varphi_0(\sigma+1), \qquad I_1(\gamma) = -\gamma + I_0(\gamma)$
$\varphi_0(s) = \ln \Phi_0(s)$ is a convex function, and $\gamma = \dfrac{d \ln \Phi_0(s)}{ds}$.
$\gamma = -D(p_0\|p_1)$: $(0, D(p_0\|p_1))$ is a point.
$\gamma = D(p_1\|p_0)$: $(D(p_1\|p_0), 0)$ is a point.
$\gamma = 0$: $(D(p_{s^*}\|p_0), D(p_{s^*}\|p_1))$ is a point.
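The three labeled points can be checked numerically for the same assumed Poisson pair, as a sketch: at $\gamma = -D(p_0\|p_1)$ the curve hits $(0, D(p_0\|p_1))$, at $\gamma = D(p_1\|p_0)$ it hits $(D(p_1\|p_0), 0)$, and at $\gamma = 0$ both coordinates equal the Chernoff information. Parameter values are assumptions.

```python
# Sketch of the three special points on the (I0, I1) tradeoff curve for
# H0: Poisson(mu0) vs H1: Poisson(mu1); I1 is obtained as I0(gamma) - gamma.
import numpy as np
from scipy.optimize import minimize_scalar

mu0, mu1 = 2.0, 5.0

def phi0(s):                               # log-MGF of l(r) under H0
    return mu1**s * mu0**(1 - s) - s * mu1 - (1 - s) * mu0

def I0(gamma):                             # I0 = sup_{0<=s<=1} [s*gamma - phi0(s)]
    res = minimize_scalar(lambda s: -(s * gamma - phi0(s)),
                          bounds=(0, 1), method="bounded")
    return -res.fun

D01 = mu0 * np.log(mu0 / mu1) + mu1 - mu0  # D(p0||p1)
D10 = mu1 * np.log(mu1 / mu0) + mu0 - mu1  # D(p1||p0)

for g in (-D01, 0.0, D10):
    print(f"gamma={g:+.3f}: (I0, I1) = ({I0(g):.4f}, {I0(g) - g:.4f})")
```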
Information Rate Functions and Performance Bounds
Example: Exponential Distributions
Additivity of Information
Examples: Gaussian, same covariance, different means; Poisson, different means; Gaussian, same mean, different variances
Summary of Simple Relationships
Find $I_0$ as a function of the threshold; subtract the threshold to get $I_1$. Plot the bounds. Vary parameters to gain a better understanding.
$\Phi_0(s) = E[e^{s\,l(r)} | H_0], \qquad \varphi_1(\sigma) = \varphi_0(\sigma+1), \qquad I_1(\gamma) = -\gamma + I_0(\gamma)$
$\varphi_0(s)$ is a convex function; $\gamma = \dfrac{d \ln \Phi_0(s)}{ds}$.
$\gamma = -D(p_0\|p_1)$: $(0, D(p_0\|p_1))$ is a point; $\gamma = D(p_1\|p_0)$: $(D(p_1\|p_0), 0)$ is a point; $\gamma = 0$: $(D(p_{s^*}\|p_0), D(p_{s^*}\|p_1))$ is a point.
Example: Exponential Distributions
$H_1: \; p(R_i|H_1) = \lambda e^{-\lambda R_i}, \; R_i \ge 0, \; i = 1, 2, \ldots, N$, i.i.d.
$H_0: \; p(R_i|H_0) = \alpha e^{-\alpha R_i}, \; R_i \ge 0, \; i = 1, 2, \ldots, N$, i.i.d., with $\lambda > \alpha$.
$l(R) = -(\lambda - \alpha) \sum_i R_i + N \ln \dfrac{\lambda}{\alpha}$
$\Phi_0(s) = E[e^{s\,l(r)} | H_0] = \prod_{i=1}^N \int [p(R_i|H_1)]^s\, [p(R_i|H_0)]^{1-s}\,dR_i = \left( \int [p(R|H_1)]^s\, [p(R|H_0)]^{1-s}\,dR \right)^{\!N}$
Example: Exponential Distributions, Continued
Compute the moment generating function and the information rate function. Plot the parametric curve $(I_0(\gamma), I_1(\gamma))$ as a function of $s$, with $I_1(\gamma) = -\gamma + I_0(\gamma)$.
$\Phi_0(s)^{1/N} = \int_0^\infty \lambda^s \alpha^{1-s}\, e^{-(s\lambda + (1-s)\alpha)R}\,dR = \dfrac{\lambda^s \alpha^{1-s}}{s\lambda + (1-s)\alpha}$
$I_0(\gamma) = \sup_s [\, s\gamma - \ln \Phi_0(s) \,], \qquad \gamma = \dfrac{d}{ds} \ln \Phi_0(s)\Big|_{s=s^*} = E_{s^*}[l(r)]$, one component.
Per component, $\ln \Phi_0(s)^{1/N} = s \ln \lambda + (1-s) \ln \alpha - \ln(s\lambda + (1-s)\alpha)$.
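A sketch of the per-sample parametric sweep for this example, with assumed rates $\lambda > \alpha$: sweeping $s$ over $(0,1)$ traces $(I_0(\gamma), I_1(\gamma))$, with $\gamma(s) = \frac{d}{ds}\ln\Phi_0(s)$ evaluated here by a numerical derivative rather than the closed form.

```python
# Sketch of the per-sample bound curve for the exponential example.
# log_phi is the per-sample log-MGF derived on the slide; gamma(s) is
# its derivative, approximated by a central difference.
import numpy as np

lam, alpha = 3.0, 1.0                      # assumed rates, lambda > alpha

def log_phi(s):                            # per-sample ln Phi0(s)^{1/N}
    return s * np.log(lam) + (1 - s) * np.log(alpha) \
        - np.log(s * lam + (1 - s) * alpha)

s = np.linspace(0.01, 0.99, 9)
ds = 1e-6
gamma = (log_phi(s + ds) - log_phi(s - ds)) / (2 * ds)  # gamma(s) = phi0'(s)
I0 = s * gamma - log_phi(s)                # rate function at gamma(s)
I1 = I0 - gamma                            # miss-side rate function
for si, g, a, b in zip(s, gamma, I0, I1):
    print(f"s={si:.2f}  gamma={g:+.3f}  I0={a:.4f}  I1={b:.4f}")
```

Multiplying $I_0$ and $I_1$ by $N$ gives the exponents for $N$ i.i.d. samples, per the additivity noted on the summary slides.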
Example, Continued
We know exact performance for this case. Under each hypothesis, $L = \sum_i R_i$ has an Erlang density
$p_L(L|H_i) = \dfrac{\mu_i^N L^{N-1} e^{-\mu_i L}}{(N-1)!}, \; L \ge 0, \qquad \mu_0 = \alpha, \; \mu_1 = \lambda$.
$P_D = \int_{\gamma'}^\infty p_L(L|H_1)\,dL, \qquad P_F = \int_{\gamma'}^\infty p_L(L|H_0)\,dL$
$P_D = \sum_{k=0}^{N-1} \dfrac{e^{-\lambda\gamma'} (\lambda\gamma')^k}{k!} = \sum_{k=0}^{N-1} \dfrac{e^{-\gamma''} (\gamma'')^k}{k!}, \qquad \gamma'' = \lambda\gamma'$
$P_F = \sum_{k=0}^{N-1} \exp\!\left(-\dfrac{\alpha\gamma''}{\lambda}\right) \dfrac{(\alpha\gamma''/\lambda)^k}{k!}$
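The finite sums on this slide are Poisson cumulative probabilities, so they can be evaluated directly via the identity $\sum_{k=0}^{N-1} e^{-x} x^k/k! = F_{\mathrm{Poi}(x)}(N-1)$; a sketch with assumed rates and threshold $\gamma'$ follows.

```python
# Sketch evaluating the slide's Erlang-tail expressions through the
# Poisson cdf identity sum_{k<N} e^{-x} x^k / k! = poisson.cdf(N-1, x).
from scipy.stats import poisson

lam, alpha, N, gp = 3.0, 1.0, 10, 2.0   # assumed lambda, alpha, N, gamma'

PD = poisson.cdf(N - 1, lam * gp)       # sum_{k=0}^{N-1} e^{-lam*gp} (lam*gp)^k / k!
PF = poisson.cdf(N - 1, alpha * gp)     # same sum with alpha in place of lambda
print(f"P_D = {PD:.4f}, P_F = {PF:.4f}")
```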
Information Is Additive
The relative entropy of product distributions equals the sum of the relative entropies.
Log-moment generating functions add.
Information rate functions add.
Information is additive: for $N$ i.i.d. observations, the pair is $(N I_0, N I_1)$.
The error bounds are exponential in $N$.
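The exponential decay in $N$ can be illustrated with the exponential pair from the earlier sketch: at threshold $\gamma = 0$, both error-probability bounds equal $\exp(-NC)$, where $C = I_0(0) = -\min_s \varphi_0(s)$ is the Chernoff information. Parameter values are assumed.

```python
# Sketch of additivity for the exponential example: the per-sample rate
# function scales by N, so both Chernoff bounds at gamma = 0 decay as
# exp(-N*C), with C the per-sample Chernoff information.
import numpy as np
from scipy.optimize import minimize_scalar

lam, alpha = 3.0, 1.0                     # assumed rates, lambda > alpha

def log_phi(s):                           # per-sample log-MGF of l under H0
    return s * np.log(lam) + (1 - s) * np.log(alpha) \
        - np.log(s * lam + (1 - s) * alpha)

C = -minimize_scalar(log_phi, bounds=(0, 1), method="bounded").fun
for N in (1, 5, 10, 20):
    print(f"N={N:2d}: error bounds <= exp(-N*C) = {np.exp(-N * C):.3e}")
```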