Detection and Estimation Theory


ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan, Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
2 Urbauer Hall, 34-935-473 (Lynda Markham answers), jao@wustl.edu
J. A. O'S., ESE 524, Lecture 7, 2/3/9

Announcements
Problem Set 1 is due today. Problem Set 2 is posted, due 2/3/9. No class 2/, 2/2, and 2/9; make-up classes on Friday, 2/2 and 2/27. Class website: http://classes.engineering.wustl.edu/ese524/ Questions?

Last Class: Information Rate Functions and Performance Bounds
Motivation; Chernoff bound; binary hypothesis testing; tilted distributions; relative entropy; information rate function; additivity of information.
Examples: Gaussian with the same covariance but different means; Poisson with different means; Gaussian with the same mean but different variances.

Last Class: Information Rate Function: Motivation
The receiver operating characteristic is often not easily computable. Performance is often a function of a few parameters: SNR, N, other statistics. Bounds on performance may be more easily computed. The bounds below guarantee a level of performance for the optimal test. The bounds are exponential in the rate function, and the rate function is additive in information (proportional to N).

Last Class: Chernoff Bound
Let x be a random variable with probability density function p_x(X). Define the moment generating function

    Φ(s) = E[e^{sx}] = ∫ p_x(X) e^{sX} dX

and the log-moment generating function φ(s) = ln Φ(s) for real-valued s. The Chernoff bound is a bound on the tail probability: for all s ≥ 0,

    ln P(x ≥ A) ≤ −sA + ln Φ(s),

so, optimizing over s,

    ln P(x ≥ A) ≤ −I(A),   I(A) = sup_s [sA − ln Φ(s)].

The probability of a rare event is closely approximated by the Chernoff bound. The (information) rate function I equals the Legendre–Fenchel transform of the log-moment generating function. For a random vector x in R^N, φ(s) = ln E[e^{sᵀx}] and I(X) = sup_s [sᵀX − ln Φ(s)]; the rate function can be used to bound the probability of any open set A in N dimensions:

    ln P(x ∈ A) ≥ −inf_{X ∈ A} I(X)   (asymptotically, for open sets A).

Chernoff Bound Summary
For random variables:

    Φ(s) = E[e^{sx}] = ∫ p_x(X) e^{sX} dX
    φ(s) = ln Φ(s)
    I(A) = sup_s [sA − φ(s)]
    ln P(x ≥ A) ≤ −I(A)

For random vectors in R^N:

    φ(s) = ln E[e^{sᵀx}]
    I(X) = sup_s [sᵀX − ln Φ(s)]
    ln P(x ∈ A) ≥ −inf_{X ∈ A} I(X)   (asymptotically, for open sets A)
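The scalar bound can be checked numerically. A minimal Python sketch (illustrative values, not from the slides), using an Exp(λ) random variable, for which Φ(s) = λ/(λ−s) for s < λ and the rate function works out in closed form:

```python
import math

# Chernoff bound check for x ~ Exp(lam): Phi(s) = lam/(lam - s) for s < lam.
# The rate function is
#   I(A) = sup_s [s*A - ln Phi(s)] = lam*A - 1 - ln(lam*A),  for A > 1/lam,
# with the supremum at s* = lam - 1/A. The exact tail is P(x >= A) = e^{-lam*A}.

def rate_function(A, lam):
    """I(A) for Exp(lam), valid above the mean (A > 1/lam)."""
    return lam * A - 1.0 - math.log(lam * A)

lam, A = 1.0, 3.0
exact = math.exp(-lam * A)                 # P(x >= A)
bound = math.exp(-rate_function(A, lam))   # Chernoff bound on the same tail
assert exact <= bound                      # the bound always holds
```

Here exact = e^{−3} ≈ 0.050 while the bound is e^{−(2 − ln 3)} ≈ 0.41: loose by a subexponential factor, which is exactly what the rate-function viewpoint predicts.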

Binary Hypothesis Testing
The probabilities of miss and false alarm are tail probabilities of the log-likelihood ratio

    l(r) = ln [p(r|H1) / p(r|H0)].

Upper bound them using Chernoff-bound information rate functions given hypotheses H0 and H1:

    Φ0(s) = E[e^{s l(r)} | H0] = ∫ p_l(L|H0) e^{sL} dL,   φ0(s) = ln Φ0(s)
    P_F = P(l(r) ≥ γ | H0),   ln P_F ≤ −I0(γ),   I0(γ) = sup_s [sγ − ln Φ0(s)]

    Φ1(σ) = E[e^{σ l(r)} | H1] = ∫ p_l(L|H1) e^{σL} dL,   φ1(σ) = ln Φ1(σ)
    P_M = P(l(r) < γ | H1),   ln P_M ≤ −I1(γ),   I1(γ) = sup_σ [σγ − ln Φ1(σ)]

Performance is better than the computed point: the optimal (P_D, P_F) is up and to the left of the bound point.
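One case where everything is in closed form is the Gaussian mean-shift pair from last class's examples. A sketch with assumed illustrative values d and γ: under H0: r ~ N(0,1) and H1: r ~ N(d,1), the log-likelihood ratio is itself Gaussian under H0, so P_F is an exact Q-function and I0(γ) is a simple quadratic.

```python
import math

# Gaussian mean-shift sketch: H0: r ~ N(0,1), H1: r ~ N(d,1); d and gamma are
# illustrative assumptions. Under H0, l(r) = d*r - d^2/2 ~ N(-d^2/2, d^2), so
#   P_F = P(l >= gamma | H0) = Q((gamma + d^2/2)/d),  Q(x) = 0.5*erfc(x/sqrt(2)),
#   Phi0(s) = exp(-s*d^2/2 + s^2*d^2/2),  I0(gamma) = (gamma + d^2/2)^2/(2*d^2).

def q_func(x):
    """Standard normal tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

d, gamma = 1.0, 0.8
pf_exact = q_func((gamma + d**2 / 2) / d)
i0 = (gamma + d**2 / 2) ** 2 / (2 * d**2)
assert pf_exact <= math.exp(-i0)   # ln P_F <= -I0(gamma)
```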

An Aside
Tail probabilities can be on either side. For the bounds, the variable in the (log-)moment generating function is a dummy variable; the variables in the log-moment generating functions under the two hypotheses are different. For the miss probability,

    P_M = P(l(r) < γ | H1),   ln P_M ≤ −I1(γ)
    Φ1(σ) = E[e^{σ l(r)} | H1] = ∫ p_l(L|H1) e^{σL} dL
    e^{−σγ} ∫ p_l(L|H1) e^{σL} dL ≥ ∫_{L<γ} p_l(L|H1) dL,   for all σ ≤ 0
    I1(γ) = sup_σ [σγ − ln Φ1(σ)]

Tilted Distributions
The moment generating functions are for the log-likelihood ratio given the hypotheses:

    l(r) = ln [p(r|H1)/p(r|H0)]
    Φ0(s) = E[e^{s l(r)} | H0] = ∫ p(R|H0) exp(s ln [p(R|H1)/p(R|H0)]) dR
          = ∫ [p(R|H1)]^s [p(R|H0)]^{1−s} dR

If the supremum defining the rate function I0(γ) = sup_s [sγ − ln Φ0(s)] is achieved at an interior point s*, then the derivative is zero there:

    γ = d ln Φ0(s)/ds |_{s=s*} = Φ0′(s*)/Φ0(s*) = E_{s*}[l(r)],

where the expectation is with respect to the tilted distribution

    p_{s*}(R) = p(R|H0) e^{s* l(R)} / Φ0(s*) = [p(R|H1)]^{s*} [p(R|H0)]^{1−s*} / Φ0(s*).

Tilt the original distribution until the mean of the log-likelihood function equals the threshold.
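A quick numerical check of this stationarity condition, sketched for the exponential pair p0 = Exp(λ), p1 = Exp(α) that is developed later in the lecture (parameter values are illustrative): the tilted density is again exponential, with rate m = (1−s)λ + sα, and its mean log-likelihood should match d ln Φ0/ds.

```python
import math

# Tilting check for p0 = Exp(lam), p1 = Exp(alpha):
#   l(R) = ln(alpha/lam) + (lam - alpha)*R
#   ln Phi0(s) = s*ln(alpha) + (1-s)*ln(lam) - ln((1-s)*lam + s*alpha)
# The tilted density p_s ∝ p1^s * p0^(1-s) is Exp(m) with m = (1-s)*lam + s*alpha,
# so its mean log-likelihood is ln(alpha/lam) + (lam - alpha)/m.  That should
# equal the numerical derivative of ln Phi0 at s, i.e. the threshold gamma.

lam, alpha, s = 1.0, 0.5, 0.4
m = (1 - s) * lam + s * alpha

def log_phi0(s):
    return s * math.log(alpha) + (1 - s) * math.log(lam) \
        - math.log((1 - s) * lam + s * alpha)

h = 1e-6
gamma = (log_phi0(s + h) - log_phi0(s - h)) / (2 * h)   # d ln Phi0 / ds
tilted_mean_l = math.log(alpha / lam) + (lam - alpha) / m
assert abs(gamma - tilted_mean_l) < 1e-6
```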

Relationship to Relative Entropy

    D(p‖q) = ∫ p log (p/q)

Relative entropy is a quantitative measure of information, given in bits or nats. The information rate function equals the relative entropy between the tilted pdf and the pdf under the hypothesis:

    p_s(R) = p(R|H0) e^{s l(R)} / Φ0(s)
    D(p_{s*} ‖ p0) = ∫ p_{s*}(R) ln [p_{s*}(R)/p(R|H0)] dR
                   = E_{s*}[s* l(r)] − ln Φ0(s*)
                   = s* γ − ln Φ0(s*) = I0(γ),

using E_{s*}[l(r)] = γ. This reflects the duality of exponential families and their means: the mean determines the parameter, and the parameter determines the mean.

Relationship to Relative Entropy, Part 2
The relative entropy between the tilted density and the density under hypothesis H1 is simply related to that under hypothesis H0. Since p(R|H1) = p(R|H0) e^{l(R)},

    Φ1(σ) = E[e^{σ l(r)} | H1] = ∫ p(R|H0) e^{(σ+1) l(R)} dR = Φ0(σ + 1),

and the tilted density can be written

    p_{σ+1}(R) = p(R|H1) e^{σ l(R)} / Φ1(σ).

At the optimizing σ*, with γ = d ln Φ1(σ)/dσ |_{σ=σ*},

    D(p_{σ*+1} ‖ p1) = E_{σ*+1}[σ* l(r)] − ln Φ1(σ*) = σ* γ − ln Φ1(σ*) = I1(γ).

Summary of Simple Relationships
Find I0 as a function of the threshold; subtract the threshold to get I1. Plot the bounds. Vary parameters to gain a better understanding. Information is additive: for N i.i.d. observations, the rate pair is (N I0, N I1).

    Φ0(s) = E[e^{s l(r)} | H0]
    Φ1(σ) = E[exp((σ + 1) ln [p(r|H1)/p(r|H0)]) | H0] = Φ0(σ + 1)
    I1(γ) = −γ + I0(γ)
    Φ0(0) = Φ0(1) = 1;  ln Φ0(s) is a convex function
    γ = d ln Φ0(s)/ds
    γ = −D(p0‖p1): (I0, I1) = (0, D(p0‖p1)) is a point on the curve
    γ = D(p1‖p0): (I0, I1) = (D(p1‖p0), 0) is a point on the curve
    γ between these values: (I0(γ), I1(γ)) traces out the curve
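The key identity Φ1(σ) = Φ0(σ + 1) can be verified numerically. A sketch for the exponential pair p0 = Exp(λ), p1 = Exp(α) (illustrative parameters), comparing the closed-form ln Φ0 against a brute-force integral for ln Φ1:

```python
import math

# Check Phi1(sigma) = Phi0(sigma + 1) for p0 = Exp(lam), p1 = Exp(alpha):
#   l(R) = ln(alpha/lam) + (lam - alpha)*R
#   ln Phi0(s) = s*ln(alpha) + (1-s)*ln(lam) - ln((1-s)*lam + s*alpha)
# Phi1(sigma) = E[e^{sigma*l} | H1] is evaluated by trapezoid integration
# against the H1 density alpha*exp(-alpha*R).

lam, alpha = 1.0, 0.5

def log_phi0(s):
    return s * math.log(alpha) + (1 - s) * math.log(lam) \
        - math.log((1 - s) * lam + s * alpha)

def log_phi1_numeric(sigma, r_max=200.0, n=100000):
    dr = r_max / n
    total = 0.0
    for i in range(n + 1):
        r = i * dr
        l = math.log(alpha / lam) + (lam - alpha) * r
        w = 0.5 if i in (0, n) else 1.0          # trapezoid end weights
        total += w * alpha * math.exp(-alpha * r + sigma * l) * dr
    return math.log(total)

sigma = -0.3
assert abs(log_phi1_numeric(sigma) - log_phi0(sigma + 1)) < 1e-4
```

The same closed form also confirms the endpoint facts on this slide: log_phi0(0) = log_phi0(1) = 0, i.e. Φ0(0) = Φ0(1) = 1.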

Information Rate Functions and Performance Bounds
Example: exponential distributions. Additivity of information. Examples: Gaussian with the same covariance but different means; Poisson with different means; Gaussian with the same mean but different variances.


Example: Exponential Distributions

    H0: p(R_i|H0) = λ e^{−λ R_i},  R_i ≥ 0,  i = 1, 2, ..., N, i.i.d.
    H1: p(R_i|H1) = α e^{−α R_i},  R_i ≥ 0,  i = 1, 2, ..., N, i.i.d.,  λ > α

The log-likelihood ratio is

    l(R) = Σ_{i=1}^N R_i (λ − α) + N ln(α/λ),

and the moment generating function factors over the i.i.d. components:

    Φ0(s) = E[e^{s l(r)} | H0] = Π_{i=1}^N ∫ [p(R_i|H1)]^s [p(R_i|H0)]^{1−s} dR_i.

Example: Exponential Distributions, Continued
Compute the moment generating function and information rate function. Per component,

    Φ0(s) = ∫_0^∞ α^s λ^{1−s} e^{−R((1−s)λ + sα)} dR = α^s λ^{1−s} / ((1−s)λ + sα)
    I0(γ) = sup_s [sγ − ln Φ0(s)],  with the supremum where γ = d ln Φ0(s*)/ds
    γ = E_s[l(r)] = (λ − α)/((1−s)λ + sα) + ln(α/λ)   (one component)

Writing m = (1−s)λ + sα,

    I0(γ) = s(λ − α)/m + ln(m/λ) = λ/m − 1 − ln(λ/m)
    I1(γ) = I0(γ) − γ = α/m − 1 − ln(α/m)

Plot the parametric curves, as functions of s: (γ, I0(γ)) and (γ, I1(γ)) = (γ, I0(γ) − γ).
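These closed forms can be checked numerically, including the general identity I1(γ) = I0(γ) − γ. A sketch sweeping the tilting parameter s (λ = 1, α = 0.5 are assumed illustrative values):

```python
import math

# Parametric rate-function curves for the exponential example, per component:
#   m(s)     = (1-s)*lam + s*alpha
#   gamma(s) = (lam - alpha)/m + ln(alpha/lam)
#   I0       = lam/m - 1 - ln(lam/m)
#   I1       = alpha/m - 1 - ln(alpha/m)
# Check the identity I1(gamma) = I0(gamma) - gamma at every s.

lam, alpha = 1.0, 0.5
for s in [0.1 * k for k in range(1, 10)]:
    m = (1 - s) * lam + s * alpha
    gamma = (lam - alpha) / m + math.log(alpha / lam)
    i0 = lam / m - 1 - math.log(lam / m)
    i1 = alpha / m - 1 - math.log(alpha / m)
    assert abs(i1 - (i0 - gamma)) < 1e-12
```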

Example, Continued: Exact Performance
We know the exact performance, the thresholds are related, and the log-probabilities can be compared to the bounds. Under hypothesis H_i, the sum l′(R) = Σ_i R_i of the N i.i.d. exponentials is Erlang:

    p_{l′}(L|H_i) = μ_i^N L^{N−1} e^{−μ_i L} / (N−1)!,   μ_0 = λ,  μ_1 = α.

The test l(R) = (λ − α) Σ_i R_i + N ln(α/λ) ≥ γ is equivalent to Σ_i R_i ≥ γ′, with

    γ′ = (γ + N ln(λ/α)) / (λ − α),   γ″ = λ γ′.

The error probabilities are Erlang tails (finite Poisson sums):

    P_F = ∫_{γ′}^∞ p_{l′}(L|H0) dL = e^{−γ″} Σ_{k=0}^{N−1} (γ″)^k / k!
    P_D = e^{−αγ″/λ} Σ_{k=0}^{N−1} (αγ″/λ)^k / k!

so −(1/N) ln P_F can be compared directly to the rate-function bound.
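The exact-performance computation translates directly into code. A Python sketch of the same Erlang-tail formulas (the lecture's own implementation is in Matlab; the threshold value here is an illustrative assumption):

```python
import math

# Exact performance for the exponential example: under H_i the sum of N i.i.d.
# exponentials is Erlang, so with gamma'' = lam * gamma',
#   P_F = exp(-gamma'') * sum_{k=0}^{N-1} gamma''^k / k!
#   P_D = exp(-alpha*gamma''/lam) * sum_{k=0}^{N-1} (alpha*gamma''/lam)^k / k!

def erlang_tail(x, N):
    """P(Erlang(N, 1) >= x) = exp(-x) * sum_{k<N} x^k / k!."""
    return math.exp(-x) * sum(x**k / math.factorial(k) for k in range(N))

lam, alpha, N = 1.0, 0.5, 10
gpp = 12.0                        # gamma'' = lam * gamma' (illustrative value)
pf = erlang_tail(gpp, N)
pd = erlang_tail(alpha * gpp / lam, N)
assert 0 < pf < pd < 1            # detection beats false alarm since lam > alpha
```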

Matlab Code

function [pf,pd] = gammainforate(N,lambda,alpha)
% Exact error probabilities (Erlang tails) on a grid of thresholds gamma''
gamma = 0:N/20:(N+5);                 % threshold grid (step reconstructed)
k = 1:(N-1);
kfac = factorial(k);
gamexp = [gamma];
if N > 2
    for kk = 2:N-1
        gamexp = [gamexp; gamma.^kk]; % rows are gamma.^1, ..., gamma.^(N-1)
    end
end
pf = exp(-gamma).*(1 + (1./kfac)*gamexp);
gd = alpha*gamma/lambda;
gamexp = [gd];
if N > 2
    for kk = 2:N-1
        gamexp = [gamexp; gd.^kk];
    end
end
pd = exp(-gd).*(1 + (1./kfac)*gamexp);
% Information rate functions, parametrized by s in (0,1)
s = 0:.01:1;
m = (1-s)*lambda + s*alpha;
info0 = lambda./m - 1 - log(lambda./m);
gamma0 = (lambda-alpha)./m + log(alpha/lambda);
info1 = alpha./m - 1 - log(alpha./m);

[Figures: ROC in blue with the Chernoff bound in red; information rate functions I_0 (blue) and I_1 (red) versus the threshold γ.]

Matlab Code, Continued

figure1 = figure;
axes1 = axes('Parent',figure1,'LineWidth',1.5,'FontSize',16);
ylim([0 1]); box('on'); hold('all');
% ROC and bound
plot(pf,pd,'LineWidth',1.5,'Color',[0 0 1]);
plot(exp(-N*info0),1-exp(-N*info1),'LineWidth',1.5,'Color',[1 0 0]);
xlabel('P_F','FontSize',16); ylabel('P_D','FontSize',16);
title('ROC in blue, Bound in red','FontSize',16);

figure2 = figure;
axes2 = axes('Parent',figure2,'LineWidth',1.5,'FontSize',16);
box('on'); hold('all');
% Rate functions versus threshold
plot(gamma0,info0,'LineWidth',1.5,'Color',[0 0 1]);
plot(gamma0,info1,'LineWidth',1.5,'Color',[1 0 0]);
axis tight
xlabel('Threshold \gamma','FontSize',16);
ylabel('Information Rate','FontSize',16);
title('I_0 in blue, I_1 in red','FontSize',16);

[Figures: the resulting ROC with bound, and the rate functions I_0 and I_1 versus threshold.]

Relationship of Bounds to Log-Probabilities

gamma1 = gamma*(lambda-alpha)/(N*lambda) + log(alpha/lambda);
plot(gamma1,-log(pf)/N,'LineWidth',1.5,'Color',[0 0 1]);
plot(gamma1,-log(1-pd)/N,'LineWidth',1.5,'Color',[1 0 0]);

The per-component threshold corresponding to γ″ is

    γ = γ″ (λ − α)/(Nλ) + ln(α/λ),

and the Chernoff bound P_F ≤ e^{−N I0(γ)} gives

    −(1/N) ln P_F(γ″) ≥ I0(γ),

so the exact normalized log-probabilities can be plotted directly against the rate functions.

[Figure: −(1/N) ln P_F and −(1/N) ln P_M overlaid on I_0 (blue) and I_1 (red) versus the threshold γ.]
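The same comparison can be made for a fixed per-component threshold as N grows. A sketch (illustrative λ, α, and tilting parameter s) checking that −(1/N) ln P_F stays above I0(γ) for every N and tightens toward it:

```python
import math

# Fix a per-component threshold gamma via the tilting parameter s, then compare
# the exact normalized log false-alarm probability against the rate function:
# the Chernoff bound says P_F <= exp(-N*I0(gamma)) for every N.

lam, alpha, s = 1.0, 0.5, 0.4
m = (1 - s) * lam + s * alpha
gamma = (lam - alpha) / m + math.log(alpha / lam)   # per-component threshold
i0 = lam / m - 1 - math.log(lam / m)                # rate function I0(gamma)

def pf_exact(N):
    # gamma'' = N*lam*(gamma - ln(alpha/lam))/(lam - alpha); P_F is an Erlang tail
    gpp = N * lam * (gamma - math.log(alpha / lam)) / (lam - alpha)
    return math.exp(-gpp) * sum(gpp**k / math.factorial(k) for k in range(N))

rates = [-math.log(pf_exact(N)) / N for N in (10, 40, 100)]
assert all(r >= i0 for r in rates)          # the bound holds for every N
assert rates[0] > rates[1] > rates[2] > i0  # and tightens toward I0 as N grows
```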

Information Is Additive
The relative entropy of product distributions equals the sum of the relative entropies. Log-moment generating functions add. Information rate functions add. Information is additive: for N i.i.d. observations, the rate pair is (N I0, N I1), giving exponential error bounds.
