Detection and Estimation Theory


ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan, Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
2 Urbauer Hall, 34-935-473 (Lynda Markham answers), jao@wustl.edu
J. A. O'S., ESE 524, Lecture 6, 1/29/09

Announcements
Problem Set 1 is due. Other announcements? Questions?

Information Rate Functions and Performance Bounds
- Motivation
- Chernoff Bound
- Binary Hypothesis Testing
- Tilted Distributions
- Relative Entropy
- Information Rate Function
- Additivity of Information
- Examples: Gaussian, same covariance, different means; Poisson, different means; Gaussian, same mean, different variances

Information Rate Function: Motivation
- The Receiver Operating Characteristic is often not easily computable.
- Performance is often a function of a few parameters: SNR, N, other statistics.
- Bounds on performance may be more easily computed.
- The bounds below guarantee a level of performance for the optimal test.
- The bounds are exponential in the rate function.
- The rate function is additive in information (proportional to N).

Chernoff Bound
Let x be a random variable with probability density function p_x(X). Define the moment generating function
  Φ(s) = E[e^{s x}] = ∫ e^{s X} p_x(X) dX, for all real s,
and the log-moment generating function φ(s) = ln Φ(s). The Chernoff bound is a bound on the tail probability:
  P(x ≥ A) ≤ e^{-s A} Φ(s), so ln P(x ≥ A) ≤ -s A + ln Φ(s), for all s ≥ 0,
and optimizing over s,
  ln P(x ≥ A) ≤ -I(A),  I(A) = sup_s [s A - ln Φ(s)].
The probability of a rare event is closely approximated by the Chernoff bound. The (information) rate function I equals the Legendre-Fenchel transform of the log-moment generating function. For a random vector x in R^N, with φ(s) = ln E[e^{s^T x}], the rate function can be used to bound the probability of any open set A (asymptotically):
  ln P(x ∈ A) ≥ -inf_{X ∈ A} I(X),  I(X) = sup_s [s^T X - ln Φ(s)].
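As a quick numeric illustration of the Chernoff bound, here is a minimal Python sketch for a standard normal x (an assumed example, not one worked on the slides): there ln Φ(s) = s²/2, so I(A) = sup_s [s A - s²/2] = A²/2 and P(x ≥ A) ≤ e^{-A²/2}.

```python
import math

# Chernoff-bound sketch for a standard normal x (assumed example):
# ln Phi(s) = s^2/2, so I(A) = sup_s [s*A - s^2/2] = A^2/2.
def rate_function(A):
    return A * A / 2.0

def gaussian_tail(A):
    # Exact tail probability P(x >= A) for a standard normal.
    return 0.5 * math.erfc(A / math.sqrt(2.0))

for A in [1.0, 2.0, 3.0]:
    bound = math.exp(-rate_function(A))
    exact = gaussian_tail(A)
    print(f"A={A}: exact tail {exact:.3e} <= Chernoff bound {bound:.3e}")
```

As the slide notes, the bound is loose by a polynomial factor but captures the exponential rate of decay of the tail.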

Binary Hypothesis Testing
The probabilities of miss and false alarm are tail probabilities of the log-likelihood ratio
  l(r) = ln [ p(r|H1) / p(r|H0) ].
Upper bound them using the Chernoff bound and the information rate functions given hypotheses H0 and H1:
  Φ0(s) = E[e^{s l(r)} | H0] = ∫ p_l(L|H0) e^{s L} dL,  φ0(s) = ln Φ0(s),
  P_F = P(l(r) ≥ γ | H0),  ln P_F ≤ -I_0(γ),  I_0(γ) = sup_s [s γ - ln Φ0(s)],
  Φ1(σ) = E[e^{σ l(r)} | H1] = ∫ p_l(L|H1) e^{σ L} dL,  φ1(σ) = ln Φ1(σ),
  P_M = P(l(r) ≤ γ | H1),  ln P_M ≤ -I_1(γ),  I_1(γ) = sup_σ [σ γ - ln Φ1(σ)].
Actual performance is better than the computed points: the optimal (P_D, P_F) is up and to the left of the bound point.
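The false-alarm bound can be checked by simulation. The sketch below uses the Gaussian shift-in-mean case (means 0 and m, unit variance), which appears among the lecture's examples, with assumed values m = 2 and γ = 0; for that case l(r) = m r - m²/2, ln Φ0(s) = (m²/2)(s² - s), and the supremum gives I_0(γ) = (γ + m²/2)²/(2m²).

```python
import math
import random

# Monte Carlo check of the false-alarm bound P_F <= exp(-I_0(gamma))
# for a Gaussian shift-in-mean example (assumed m = 2, gamma = 0).
m, gamma = 2.0, 0.0
I0 = (gamma + m * m / 2.0) ** 2 / (2.0 * m * m)

random.seed(0)
trials = 100_000
false_alarms = sum(
    1 for _ in range(trials)
    # Under H0, r ~ N(0, 1) and the LLR is l(r) = m*r - m^2/2.
    if m * random.gauss(0.0, 1.0) - m * m / 2.0 >= gamma
)
p_f = false_alarms / trials
print(f"P_F ~ {p_f:.4f} <= Chernoff bound exp(-I_0) = {math.exp(-I0):.4f}")
```

For these values the empirical P_F is well under the bound, consistent with the slide's remark that actual performance is better than the bound point.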

Asides
- Tail probabilities can be on either side.
- For the bounds, the variable in the (log-)moment generating function is a dummy variable.
- The variables in the log-moment generating functions under the two hypotheses are different.
For the miss probability (a lower tail),
  P_M = P(l(r) ≤ γ | H1),  ln P_M ≤ -I_1(γ),
  Φ1(σ) = E[e^{σ l(r)} | H1] = ∫ p_l(L|H1) e^{σ L} dL,
  P(l(r) ≤ γ | H1) = ∫_{L ≤ γ} p_l(L|H1) dL ≤ e^{-σ γ} ∫ p_l(L|H1) e^{σ L} dL, for all σ ≤ 0,
  I_1(γ) = sup_{σ ≤ 0} [σ γ - ln Φ1(σ)].

Tilted Distributions
The moment generating functions are for the log-likelihood ratio given the hypotheses:
  l(r) = ln [ p(r|H1) / p(r|H0) ],
  Φ0(s) = E[e^{s l(r)} | H0] = ∫ p(R|H0) e^{s ln[p(R|H1)/p(R|H0)]} dR = ∫ [p(R|H1)]^s [p(R|H0)]^{1-s} dR.
If the supremum defining the rate function
  I_0(γ) = sup_s [s γ - ln Φ0(s)]
is achieved at an interior point s*, then the derivative is zero there:
  γ = d ln Φ0(s)/ds |_{s=s*} = E_{s*}[l(r)],
where the expectation is under the tilted distribution
  p_s(R) = [p(R|H1)]^s [p(R|H0)]^{1-s} / Φ0(s).
That is, tilt the original distribution until the mean of the log-likelihood ratio equals the threshold.
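A small numeric check of the tilting condition, using exponential densities with assumed rates λ = 3 under H1 and α = 1 under H0 (the example treated later in the lecture): the tilted density p_s ∝ p1^s p0^{1-s} is again exponential with rate sλ + (1-s)α, so E_s[l(r)] is available in closed form and can be compared against the derivative of ln Φ0(s).

```python
import math

# Tilting check for an exponential example (assumed rates lam = 3 under H1,
# alp = 1 under H0). The tilted density is Exp(s*lam + (1-s)*alp).
lam, alp = 3.0, 1.0

def log_phi0(s):
    # Per-sample log-MGF of the LLR under H0:
    # ln Phi_0(s) = s ln(lam) + (1-s) ln(alp) - ln(s lam + (1-s) alp).
    return s * math.log(lam) + (1 - s) * math.log(alp) - math.log(s * lam + (1 - s) * alp)

def tilted_mean_llr(s):
    # E_s[l(r)] with l(r) = ln(lam/alp) - (lam - alp) r and E_s[r] = 1/rate.
    rate = s * lam + (1 - s) * alp
    return math.log(lam / alp) - (lam - alp) / rate

s, h = 0.4, 1e-6
numeric = (log_phi0(s + h) - log_phi0(s - h)) / (2 * h)  # central difference
print(numeric, tilted_mean_llr(s))
```

The two printed values agree: the mean of the log-likelihood ratio under the tilted distribution is exactly the slope of the log-moment generating function, as the interior-optimum condition requires.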

Relationship to Relative Entropy
Relative entropy is a quantitative measure of information, given in bits or nats:
  D(p ‖ q) = Σ p log (p/q) = E_p[log (p/q)].
The information rate function equals the relative entropy between the tilted pdf and the pdf under the hypothesis. With
  p_s(R) = [p(R|H1)]^s [p(R|H0)]^{1-s} / Φ0(s) = p(R|H0) e^{s l(R)} / Φ0(s),
  D(p_s ‖ p(·|H0)) = ∫ p_s(R) ln [p_s(R) / p(R|H0)] dR = E_s[s l(r)] - ln Φ0(s),
and at the optimizing s = s*, where E_{s*}[l(r)] = γ,
  D(p_{s*} ‖ p(·|H0)) = s* γ - ln Φ0(s*) = I_0(γ).
There is a duality between an exponential family and its means: the mean determines the parameter; the parameter determines the mean.

Relationship to Relative Entropy, Part 2
The relative entropy between the tilted density and the density under hypothesis H1 is simply related to that under hypothesis H0. Since
  p(R|H1) e^{σ l(R)} = p(R|H0) e^{(σ+1) l(R)},
we have
  Φ1(σ) = E[e^{σ l(r)} | H1] = Φ0(σ+1),  p(R|H1) e^{σ l(R)} / Φ1(σ) = p_{σ+1}(R).
At the optimizing σ*, γ = d ln Φ1(σ)/dσ |_{σ=σ*} = E_{σ*+1}[l(r)], and
  D(p_{σ*+1} ‖ p(·|H1)) = E_{σ*+1}[σ* l(r)] - ln Φ1(σ*) = σ* γ - ln Φ1(σ*) = I_1(γ),
  I_1(γ) = sup_σ [σ γ - ln Φ1(σ)].

Summary of Simple Relationships
- Φ1(σ) = E[e^{σ l(r)} | H1] = E[e^{(σ+1) l(r)} | H0] = Φ0(σ+1), so ln Φ1(σ) = ln Φ0(σ+1).
- Find I_0 as a function of the threshold; subtract the threshold to get I_1: I_1(γ) = I_0(γ) - γ.
- Plot the bounds. Vary parameters to gain a better understanding.
- ln Φ0(s) is a convex function, and γ = d ln Φ0(s)/ds.
- At γ = -D(p0 ‖ p1) (s = 0), (I_0, I_1) = (0, D(p0 ‖ p1)) is a point on the curve.
- At γ = D(p1 ‖ p0) (s = 1), (I_0, I_1) = (D(p1 ‖ p0), 0) is a point on the curve.
- Information is additive: for N i.i.d. observations, the exponents are (N I_0, N I_1).
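The two endpoint claims can be verified numerically for the exponential-distributions example used later in the lecture (assumed rates λ = 3 under H1, α = 1 under H0), where γ(s) = d ln Φ0(s)/ds and, for exponentials, D(Exp(a) ‖ Exp(b)) = ln(a/b) + b/a - 1.

```python
import math

# Endpoint check for the rate-function curve (assumed exponential example,
# rates lam = 3 under H1, alp = 1 under H0): at s = 1 the tilt mean gamma
# equals D(p1||p0); at s = 0 it equals -D(p0||p1).
lam, alp = 3.0, 1.0

def gamma_of_s(s):
    # gamma(s) = d/ds ln Phi_0(s) = ln(lam/alp) - (lam-alp)/(s*lam + (1-s)*alp)
    return math.log(lam / alp) - (lam - alp) / (s * lam + (1 - s) * alp)

def kl_exp(a, b):
    # Relative entropy between exponential densities with rates a and b.
    return math.log(a / b) + b / a - 1.0

print(gamma_of_s(1.0), kl_exp(lam, alp))
print(gamma_of_s(0.0), -kl_exp(alp, lam))
```

The matching values confirm that the curve of achievable exponents runs between (0, D(p0 ‖ p1)) and (D(p1 ‖ p0), 0) as s sweeps from 0 to 1.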

Information Rate Functions and Performance Bounds (continued)
- Example: Exponential Distributions
- Additivity of Information
- Examples: Gaussian, same covariance, different means; Poisson, different means; Gaussian, same mean, different variances


Example: Exponential Distributions
  H1: r_i ~ λ e^{-λ R}, R ≥ 0, i = 1, 2, ..., N, i.i.d.,
  H0: r_i ~ α e^{-α R}, R ≥ 0, i = 1, 2, ..., N, i.i.d., with λ > α.
The log-likelihood ratio is
  l(R) = -(λ - α) Σ_{i=1}^N R_i + N ln(λ/α).
The moment generating function factors over the i.i.d. components:
  Φ0(s) = E[e^{s l(r)} | H0] = ∫ [p(R|H1)]^s [p(R|H0)]^{1-s} dR = ( ∫ [p(R_i|H1)]^s [p(R_i|H0)]^{1-s} dR_i )^N.

Example: Exponential Distributions (continued)
Compute the moment generating function and the information rate function. Per component,
  ∫ λ^s α^{1-s} e^{-(s λ + (1-s) α) R} dR = λ^s α^{1-s} / (s λ + (1-s) α),
  ln Φ0(s) = s ln λ + (1-s) ln α - ln(s λ + (1-s) α),
  I_0(γ) = sup_s [s γ - ln Φ0(s)].
At the optimizing s*, γ = d ln Φ0(s)/ds |_{s=s*} = E_{s*}[l(r)]; for one component,
  γ = ln λ - ln α - (λ - α) / (s λ + (1-s) α).
Plot the parametric curves as functions of s: (γ(s), I_0(γ)) and (γ(s), I_1(γ)), with I_1(γ) = I_0(γ) - γ.
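A grid-search sketch of the per-component rate function for this example (assumed rates λ = 3, α = 1; the sketch also assumes the optimizing s lies in [0, 1], which holds for thresholds between the two conditional means):

```python
import math

# Grid-search evaluation of I_0(gamma) = sup_s [s*gamma - ln Phi_0(s)]
# for the per-component exponential example (assumed lam = 3, alp = 1).
lam, alp = 3.0, 1.0

def log_phi0(s):
    # ln Phi_0(s) = s ln(lam) + (1-s) ln(alp) - ln(s lam + (1-s) alp)
    return s * math.log(lam) + (1 - s) * math.log(alp) - math.log(s * lam + (1 - s) * alp)

def rate_I0(gamma, steps=20_000):
    # Maximize over a uniform grid of s in [0, 1].
    return max(k / steps * gamma - log_phi0(k / steps) for k in range(steps + 1))

# At gamma = D(p1||p0) = ln(lam/alp) + alp/lam - 1 the supremum sits at s = 1,
# so I_0 = gamma and the miss exponent I_1 = I_0 - gamma = 0.
gamma = math.log(lam / alp) + alp / lam - 1.0
I0 = rate_I0(gamma)
print(I0, gamma, I0 - gamma)
```

Sweeping gamma over a range of thresholds and plotting (γ, I_0) and (γ, I_1 = I_0 - γ) reproduces the parametric curves described on the slide.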

Example Continued: Exact Performance
We know the exact performance for this case. The sum L = Σ_{i=1}^N R_i is Erlang under either hypothesis:
  p_{L|H}(L|H_i) = μ_i^N L^{N-1} e^{-μ_i L} / (N-1)!,  L ≥ 0,  μ_1 = λ, μ_0 = α.
Since λ > α, the test decides H1 when L ≤ γ', so
  P_D = ∫_0^{γ'} p_{L|H}(L|H1) dL = Σ_{k=N}^∞ e^{-λ γ'} (λ γ')^k / k! = Σ_{k=N}^∞ e^{-γ''} (γ'')^k / k!,  γ'' = λ γ',
  P_F = ∫_0^{γ'} p_{L|H}(L|H0) dL = Σ_{k=N}^∞ e^{-α γ''/λ} (α γ''/λ)^k / k!.
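These tail sums come from the Erlang CDF identity P(L ≤ t) = Σ_{k=N}^∞ e^{-μt}(μt)^k/k! = 1 - Σ_{k=0}^{N-1} e^{-μt}(μt)^k/k!. A sketch with assumed values λ = 3, α = 1, N = 10, γ' = 5:

```python
import math

# Exact detection and false-alarm probabilities for the exponential example,
# using the Erlang(N, mu) CDF (assumed lam = 3, alp = 1, N = 10, thresh = 5).
def erlang_cdf(t, n, mu):
    # P(sum of n i.i.d. Exp(mu) <= t) = 1 - sum_{k=0}^{n-1} e^{-mu t}(mu t)^k / k!
    x = mu * t
    return 1.0 - sum(math.exp(-x) * x**k / math.factorial(k) for k in range(n))

lam, alp, N, thresh = 3.0, 1.0, 10, 5.0
P_D = erlang_cdf(thresh, N, lam)  # decide H1 when sum R_i <= thresh (lam > alp)
P_F = erlang_cdf(thresh, N, alp)
print(f"P_D = {P_D:.4f}, P_F = {P_F:.4f}")
```

Sweeping the threshold traces out the exact ROC, which can then be compared against the Chernoff-bound points (N I_0, N I_1) from the rate-function calculation.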

Information is Additive
- The relative entropy of product distributions equals the sum of the relative entropies.
- Log-moment generating functions add.
- Information rate functions add.
- Information is additive: for N i.i.d. observations, the error exponents are (N I_0, N I_1).
- Exponential error bounds follow.