Detection and Estimation Theory
1 ESE 524 Detection and Estimation Theory
Joseph A. O'Sullivan, Samuel C. Sachs Professor
Electronic Systems and Signals Research Laboratory
Electrical and Systems Engineering, Washington University
211 Urbauer Hall (Lynda answers), jao@wustl.edu
J. A. O'S., ESE 524, Lecture 2
2 Detection Theory
Which model best fits the measured data? Decision making, detection, or hypothesis testing.
A principled approach:
- Define the concept of a model
- Define the concept of a decision
- Define an objective function to be minimized
- Given these, derive an optimal decision
- Quantify the performance (as a function of parameters)
3 Example: Throwing Darts
Suppose that when people throw darts at a blackboard, there is a distribution of the locations of the darts.
- The first individual is very good: his darts land in the vicinity of the target, with a circularly symmetric distribution.
- The second individual has a systematic error that he does not seem to be able to correct. His darts land in a circularly symmetric distribution offset from the truth.
- The third individual has no systematic error, but his darts, while still landing according to a circularly symmetric distribution around the truth, are more widespread than the first individual's.
Problem 1: Individual 1 or 2? Problem 2: Individual 1 or 3? Other problems are possible.
4 Example: Throwing Darts
These problems are still not well posed. Additional assumptions are needed, such as:
- A1: The distributions are two-dimensional Gaussian distributions around the truth with known covariance matrix.
- A2: All throws are independent of all other throws.
- A3: The covariance matrix is diagonal with equal variance in each direction.
(Slide figure: scatter plot of the three individuals' throws; axis labels lost in transcription.)
5 Matlab Code for Plots
Observation: we can decide among the hypotheses if more throws (data) are observed.
Code (the digits 0 and 1 and the sample counts were lost in transcription; 100 throws are assumed below):
function errf=dartslecture2
errf=0;
x=randn(2,100);
y=2*ones(2,100)+randn(2,100);
z=3*randn(2,100);
figure
plot(x(1,:),x(2,:),'bx')
hold on
plot(y(1,:),y(2,:),'ro')
plot(z(1,:),z(2,:),'gv')
axis equal;
figure
plot(x(1,:),x(2,:),'bx')
hold on
plot(y(1,:),y(2,:),'ro')
axis equal;
errf=1;
6 Probability Density Functions: Matlab Mesh Plots
7 Matlab Code for PDF Computation
function errf=dartslecture2pdf
errf=0;
x=-5:.1:5;
[x,y]=meshgrid(x);
p1=exp(-(x.^2+y.^2)/2)/(2*pi);
p2=exp(-((x-2).^2+(y-2).^2)/2)/(2*pi);
p3=exp(-(x.^2+y.^2)/8)/(8*pi);
figure
mesh(x,y,p1)
figure
mesh(x,y,p2)
figure
mesh(x,y,p3)
figure
mesh(x,y,max(max(p1,p2),p3))
errf=1;
8 Example: Throwing (One-Dimensional) Darts
Suppose that one of two possible known signals is transmitted during the interval [0,T]: s(t) or -s(t). The signal is measured in white Gaussian noise:
r(t) = a sqrt(E) s(t) + w(t), 0 <= t <= T, a in {+1, -1}
The receiver first computes the integral of r(t) times s(t) (unit energy) over the interval to get
r = a sqrt(E) + n, n ~ N(0, N0/2)
Problem: a = +1 or a = -1?
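The scalar model above can be checked numerically. A minimal sketch in Python (the course code is in Matlab; Python is used here so the example is self-contained), assuming illustrative values E = 1, N0 = 0.5 and the sign test "decide a = +1 when r > 0":

```python
import math
import random

def q_function(x):
    # Standard normal tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2))

def simulate_antipodal(E=1.0, N0=0.5, trials=20000, seed=1):
    """Monte Carlo error rate for r = a*sqrt(E) + n, n ~ N(0, N0/2),
    deciding a = +1 when r > 0 (illustrative parameter values)."""
    rng = random.Random(seed)
    sigma = math.sqrt(N0 / 2)
    errors = 0
    for _ in range(trials):
        a = rng.choice((+1, -1))
        r = a * math.sqrt(E) + rng.gauss(0, sigma)
        decision = 1 if r > 0 else -1
        errors += (decision != a)
    return errors / trials

p_sim = simulate_antipodal()
p_theory = q_function(math.sqrt(2 * 1.0 / 0.5))  # Q(sqrt(2E/N0))
```

The simulated rate should hover near the closed-form error probability Q(sqrt(2E/N0)) for antipodal signaling.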
9 Example: Throwing 2-D Darts
In QAM (quadrature amplitude modulation), one of four signals is sent over [0,T]:
s(t) in { sqrt(2/T) cos(wc t), -sqrt(2/T) cos(wc t), sqrt(2/T) sin(wc t), -sqrt(2/T) sin(wc t) }
Assume measurements are in white Gaussian noise:
r(t) = sqrt(E) s(t) + w(t), 0 <= t <= T
Assuming the cosine and sine are orthogonal over the interval, the measurement is integrated against each to get two values:
rI = integral of r(t) sqrt(2/T) cos(wc t) dt = sqrt(E) integral of s(t) sqrt(2/T) cos(wc t) dt + nI, nI ~ N(0, N0/2)
rQ = integral of r(t) sqrt(2/T) sin(wc t) dt = sqrt(E) integral of s(t) sqrt(2/T) sin(wc t) dt + nQ, nQ ~ N(0, N0/2)
Problem: which of the four signals was sent?
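The two correlator outputs reduce detection to choosing the nearest of four points in the (rI, rQ) plane. A hedged Python sketch (the unit-energy constellation and noise level are illustrative assumptions, not from the slides):

```python
import random

# The four QAM signals map to four points in the (r_I, r_Q) plane;
# unit energy is assumed here for illustration.
CONSTELLATION = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def detect(r_i, r_q):
    """Minimum-distance decision: pick the closest constellation point."""
    return min(CONSTELLATION, key=lambda s: (r_i - s[0]) ** 2 + (r_q - s[1]) ** 2)

random.seed(3)
sigma = 0.1  # small noise so this single-shot demo decides correctly
sent = CONSTELLATION[2]  # the "+sin" signal, say
r_i = sent[0] + random.gauss(0, sigma)
r_q = sent[1] + random.gauss(0, sigma)
decided = detect(r_i, r_q)
```

With white Gaussian noise and equal priors, this nearest-point rule is the same decision the likelihood ratio framework of the following slides produces.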
10 Finite-Dimensional Binary Detection Theory
Problem: given a measurement of a random vector r, decide which of two models it is drawn from.
Source -> System -> r -> Decision: H0 or H1
- The source produces H0 or H1 with probabilities P0 and P1 = 1 - P0. These probabilities may not be known.
- A random measurement vector results. The transition probabilities (system) are often assumed known.
- A decision rule is a partition of the measurement space into two sets Z0 and Z1 corresponding to decisions H0 and H1.
11 Outcomes, Bayes Criterion
There are four possible outcomes:
- Decide H0 and H0 is true: probability P0 (1 - PF)
- Decide H1 and H0 is true: probability P0 PF
- Decide H0 and H1 is true: probability P1 (1 - PD)
- Decide H1 and H1 is true: probability P1 PD
Probability of false alarm PF: decide H1 given H0 is true.
Probability of detection PD: decide H1 given H1 is true.
Probability of miss PM = 1 - PD: decide H0 given H1 is true.
These error probabilities equal integrals of the conditional (transition) probability density functions over the decision regions.
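The four outcome probabilities partition the total probability. A quick Python check with hypothetical numbers (the priors and operating point below are illustrative, not from the slides):

```python
# Hypothetical operating point for illustration
P0, P1 = 0.6, 0.4          # priors, P1 = 1 - P0
PF, PD = 0.1, 0.9          # false-alarm and detection probabilities

outcomes = {
    "decide H0, H0 true": P0 * (1 - PF),
    "decide H1, H0 true": P0 * PF,        # false alarm
    "decide H0, H1 true": P1 * (1 - PD),  # miss
    "decide H1, H1 true": P1 * PD,        # detection
}
total = sum(outcomes.values())  # the four outcomes are exhaustive
```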
12 Bayes Criterion
Assumptions:
- Source prior probabilities P0 and P1 are known.
- Transition densities are known.
- There are known costs of the decision outcomes: Cij is the cost of deciding Hi when Hj is true.
- CM = C01 - C11 > 0 is the relative cost of a miss; CF = C10 - C00 > 0 is the relative cost of a false alarm.
Bayes decision criterion: pick the decision rule to minimize the average risk (cost)
R = C00 P0 (1 - PF) + C01 P1 (1 - PD) + C10 P0 PF + C11 P1 PD
  = (C01 - C11) P1 (1 - PD) + (C10 - C00) P0 PF + C00 P0 + C11 P1
  = CM P1 (1 - PD) + CF P0 PF + C00 P0 + C11 P1   (simplified notation)
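The algebra collapsing the four-term risk into the simplified form can be verified numerically. A short Python check using assumed uniform costs and a hypothetical operating point:

```python
# Assumed costs and operating point (illustrative only)
C00, C01, C10, C11 = 0.0, 1.0, 1.0, 0.0
P0, P1 = 0.6, 0.4
PF, PD = 0.1, 0.9

CM = C01 - C11  # relative cost of a miss
CF = C10 - C00  # relative cost of a false alarm

# Full four-term risk and the simplified form from the slide
R_full = (C00 * P0 * (1 - PF) + C10 * P0 * PF
          + C01 * P1 * (1 - PD) + C11 * P1 * PD)
R_simplified = CM * P1 * (1 - PD) + CF * P0 * PF + C00 * P0 + C11 * P1
```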
13 Results
- Theorem: the likelihood ratio test is optimal.
- Corollary: the log-likelihood ratio test is optimal.
- For proofs, see the notes and text.
- Example: deterministic signal in Gaussian noise.
14 Decision Rules
Criterion                      | Transition densities | Priors P0 & P1 | Costs Cij
Bayes: minimize expected risk  | Yes                  | Yes            | Yes
Minimum probability of error   | Yes                  | Yes            | No
Minimax                        | Yes                  | No             | Yes
Neyman-Pearson                 | Yes                  | No             | No
R = CM P1 PM + CF P0 PF, with
PM = integral over Z0 of p(R|H1) dR
PF = integral over Z1 of p(R|H0) dR
The entire space is divided into two regions, decide H0 or decide H1. The integrals of the conditional probability density functions over the other ("wrong") region determine the error probabilities.
J. A. O'S., ESE 524, Lecture 4
15 Bayes Criterion
Define the two decision regions and minimize the Bayes risk over them.
- Every point in the space must be included in one integral or the other.
- Put a point in a region if it contributes less to that integral than to the other.
R = CM P1 PM + CF P0 PF
  = CM P1 (integral over Z0 of p(R|H1) dR) + CF P0 (integral over Z1 of p(R|H0) dR)
At each point R, compare CM P1 p(R|H1) and CF P0 p(R|H0).
16 Bayes Criterion
Compare CM P1 p(R|H1) and CF P0 p(R|H0) at each point R, and introduce notation for the result of the comparison:
- If the left side is bigger, decide H1.
- If the right side is bigger, decide H0.
- For continuous pdfs, do not worry about equality.
CM P1 p(R|H1) >< CF P0 p(R|H0)   (decide H1 if >, H0 if <)
17 Bayes Criterion: Derivation of the Likelihood Ratio Test
There are several equivalent ways to compare these values, including the likelihood ratio and the log-likelihood ratio:
CM P1 p(R|H1) >< CF P0 p(R|H0)
Likelihood ratio test: Lambda(R) = p(R|H1) / p(R|H0) >< (CF P0) / (CM P1) = eta
Log-likelihood ratio test: l(R) = ln[ p(R|H1) / p(R|H0) ] >< ln[(CF P0) / (CM P1)] = gamma
(decide H1 if >, H0 if <)
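For two unit-variance Gaussian densities the log-likelihood ratio test reduces to a threshold on x itself. A minimal Python sketch, assuming means m0 = 0 and m1 = 1 (illustrative values, not from the slides):

```python
import math

def log_likelihood_ratio(x, m0=0.0, m1=1.0, var=1.0):
    """l(x) = ln p(x|H1) - ln p(x|H0) for Gaussians with common variance."""
    return ((x - m0) ** 2 - (x - m1) ** 2) / (2 * var)

def lrt_decide(x, eta):
    """Return 1 (decide H1) when l(x) exceeds gamma = ln(eta), else 0."""
    return 1 if log_likelihood_ratio(x) > math.log(eta) else 0
```

With eta = 1, l(x) = x - 1/2 here, so the rule is simply comparing x against the midpoint (m0 + m1)/2.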
18 Bayes Criterion: Computation of Error Probabilities
- The likelihood ratio is a nonnegative-valued function of the observed data.
- Evaluated at a random realization of the data, the likelihood ratio is a random variable.
- Comparing the likelihood ratio to a threshold is the optimal test.
- The probabilities of error can be computed in terms of the probability density functions of either the likelihood ratio or the log-likelihood ratio.
19 Optimal Decision Rules
Likelihood ratio test threshold for the Bayes rule:
eta = (CF P0) / (CM P1) = [(C10 - C00) P0] / [(C01 - C11) P1]
Lambda(R) = p(R|H1)/p(R|H0) >< eta;  l(R) = ln[p(R|H1)/p(R|H0)] >< ln eta = gamma
Computations of the error probabilities, in terms of the density of Lambda under each hypothesis:
PF = integral from eta to infinity of p(Lambda|H0) dLambda
PM = integral from 0 to eta of p(Lambda|H1) dLambda
or, in terms of the log-likelihood ratio L with gamma = ln eta:
PF = integral from gamma to infinity of p(L|H0) dL
PM = integral from minus infinity to gamma of p(L|H1) dL
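For the Gaussian pair used in the earlier example, these integrals have closed forms through the Q-function. A hedged Python sketch with assumed means 0 and 1 and unit variance (illustrative parameters):

```python
import math

def q_function(x):
    """Standard normal tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def error_probs(t, m0=0.0, m1=1.0, sigma=1.0):
    """P_F and P_M when the LRT reduces to 'decide H1 if x > t'
    (Gaussian means m0 < m1, common variance; illustrative parameters)."""
    pf = q_function((t - m0) / sigma)
    pm = 1 - q_function((t - m1) / sigma)
    return pf, pm

pf, pm = error_probs(0.5)  # threshold at the midpoint
```

With the threshold at the midpoint the two error probabilities are equal by symmetry.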
20 Minimum Probability of Error
Minimize the probability of error
Pe = P0 PF + P1 PM
With CM = CF = 1, the Bayes threshold becomes eta = P0/P1, implemented as a likelihood (or log-likelihood) ratio test.
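For the same unit-variance Gaussian example, eta = P0/P1 translates to the threshold t = 1/2 + ln(P0/P1) on x. A sketch, assuming a hypothetical prior P0 = 0.7, that checks nearby thresholds do worse:

```python
import math

def q_function(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

def prob_error(t, p0=0.7, m0=0.0, m1=1.0):
    """P_e = P0*P_F + P1*P_M for the test 'decide H1 if x > t'
    (unit-variance Gaussians; illustrative parameters)."""
    p1 = 1 - p0
    pf = q_function(t - m0)
    pm = 1 - q_function(t - m1)
    return p0 * pf + p1 * pm

# Bayes threshold for eta = P0/P1 in this model: l(x) = x - 1/2 >= ln(eta)
t_map = 0.5 + math.log(0.7 / 0.3)
pe_map = prob_error(t_map)
```

Perturbing the threshold in either direction increases Pe, consistent with the optimality of the eta = P0/P1 test.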
21 Minimum Probability of Error
Alternative view of the decision rule: compute the posterior probabilities of the hypotheses and select the most likely hypothesis.
p(R|H1) P1 >< p(R|H0) P0
Dividing both sides by p(R) = p(R|H0) P0 + p(R|H1) P1 gives
P(H1|R) >< P(H0|R)
The comparison is unchanged if both sides are divided by any g(R) > 0.
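The posterior comparison can be written directly in code. A minimal sketch with hypothetical priors and likelihood values at an observed R (all numbers are illustrative):

```python
# Hypothetical priors and likelihood values at the observed R
P0, P1 = 0.6, 0.4
lik0, lik1 = 0.05, 0.20   # p(R|H0) and p(R|H1), assumed for illustration

evidence = lik0 * P0 + lik1 * P1          # p(R), the common divisor g(R)
post0 = lik0 * P0 / evidence              # P(H0|R)
post1 = lik1 * P1 / evidence              # P(H1|R)
decision = 1 if post1 > post0 else 0      # select the most likely hypothesis
```

Dividing by the evidence changes neither side's ordering, which is exactly the g(R) > 0 remark on the slide.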
22 Gaussian Example
23 Minimax Decision Rule
Find the decision rule that minimizes the maximum Bayes risk over all possible priors:
min over the decision regions Z, max over P1, of R
(Transition densities and costs are known; the priors P0 and P1 are not — see the criterion table on slide 14.)
24 Minimax Decision Rule: Analysis
- For any fixed decision rule, the risk is linear in P1.
- The maximum over P1 is achieved at an end point.
- To make that end point as low as possible, the risk should be constant with respect to P1.
- To minimize that constant value, the risk should achieve the minimum risk at some P1*. At that value of the prior, the best decision rule is a likelihood ratio test.
R = CM P1 PM + CF P0 PF, with PM = integral over Z0 of p(R|H1) dR, PF = integral over Z1 of p(R|H0) dR
25 Minimax Decision Rule: Analysis
- The Bayes risk is concave in the prior (it always lies below its tangents).
- The minimax point is achieved at an end point, or at an interior point on the Bayes risk curve where the tangent has zero slope.
26 Minimax Decision Rule: Example
Exponential example for X >= 0: p(x|H0) = 5 exp(-5x), p(x|H1) = exp(-x).
Log-likelihood ratio: l(X) = 4X - ln 5.
Thresholding X at gamma':
PF = integral from gamma' to infinity of 5 exp(-5x) dx = exp(-5 gamma')
PM = integral from 0 to gamma' of exp(-x) dx = 1 - exp(-gamma')
PD = 1 - PM = exp(-gamma'), so PD = PF^(1/5).
Minimax (equalizer) condition: CM PM = CF PF.
Matlab code (the digits 0 and 1 were lost in transcription and are restored from context):
pf=0:.01:1;
pd=pf.^.2;
eta=.2*(pf.^(-.8)); % eta=dP_D/dP_F
figure
plot(pf,pd);xlabel('P_F');ylabel('P_D')
cm=1;cf=1;
pstar=1./(1+cm*eta/cf);
riskoptimal=cm*(1-pd).*pstar+cf*pf.*(1-pstar);
figure
plot(pstar,riskoptimal,'b'), hold on
p=0:.01:1;
r1=cm*(1-pd(1))*p+cf*pf(1)*(1-p);
plot(p,r1,'r'), hold on
r2=cm*(1-pd(2))*p+cf*pf(2)*(1-p);
plot(p,r2,'g'), hold on
r3=cm*(1-pd(3))*p+cf*pf(3)*(1-p);
plot(p,r3,'c')
xlabel('P_1');ylabel('Risk')
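The equalizer condition CM PM = CF PF can be solved numerically for this example. A Python sketch with CM = CF = 1, as in the Matlab code above (bisection replaces the slide's graphical search):

```python
import math

def p_false_alarm(g):
    """P_F = exp(-5*gamma') for p(x|H0) = 5*exp(-5x)."""
    return math.exp(-5 * g)

def p_miss(g):
    """P_M = 1 - exp(-gamma') for p(x|H1) = exp(-x)."""
    return 1 - math.exp(-g)

# Bisection for the minimax threshold where P_F = P_M (CM = CF = 1).
# P_F decreases and P_M increases in gamma', so the crossing is unique.
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = (lo + hi) / 2
    if p_false_alarm(mid) > p_miss(mid):
        lo = mid
    else:
        hi = mid
gamma_star = (lo + hi) / 2
```

Equivalently, with u = exp(-gamma'), the crossing satisfies u^5 + u = 1 on the ROC PD = PF^(1/5).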
27 Neyman-Pearson Decision Rule
Minimize PM subject to PF <= alpha.
- Variational approach; the upper bound is usually achieved.
- The result is a likelihood ratio test; what is the threshold?
F = PM + eta (PF - alpha)
  = (integral over Z0 of p(R|H1) dR) + eta [ (integral over Z1 of p(R|H0) dR) - alpha ]
28 Neyman-Pearson Decision Rule
Minimize PM subject to PF <= alpha. To set the threshold:
- Plot the ROC: PD versus PF for the family of likelihood ratio tests.
- Draw a vertical line where PF = alpha and find the corresponding PD.
- At that point the threshold equals the derivative (slope) of the ROC:
eta = dPD/dPF = (dPD/d eta) / (dPF/d eta)
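For the exponential example of slide 26, the Neyman-Pearson design is explicit. A Python sketch (alpha = 0.01 is an arbitrary illustrative constraint):

```python
import math

def np_threshold(alpha):
    """gamma' with P_F = exp(-5*gamma') = alpha for p(x|H0) = 5*exp(-5x)."""
    return -math.log(alpha) / 5

alpha = 0.01
gamma = np_threshold(alpha)
pd = math.exp(-gamma)          # on this ROC, P_D = alpha**(1/5)
eta = 0.2 * alpha ** (-0.8)    # ROC slope dP_D/dP_F at P_F = alpha
```

The slope at the operating point is the implied LRT threshold, matching the eta = dPD/dPF relation above.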
29 Summary
- Several decision rules; the likelihood (and log-likelihood) ratio test is optimal.
- The receiver operating characteristic (ROC) plots probability of detection versus probability of false alarm with the threshold as a parameter: all possible optimal performance.
- Neyman-Pearson is a point on the ROC (PF = alpha).
- Minimax is a point on the ROC (PF CF = PM CM).
- Minimum probability of error is a point on the ROC (slope eta = P0/P1 = (1 - P1)/P1).
3464 JOURNAL OF CLIMATE VOLUME 14 Testing Distributed Parameter Hypotheses for the Detection of Climate Change HAROON S. KHESHGI AND BENJAMIN S. WHITE Corporate Strategic Research, ExxonMobil Research
More informationBAYESIAN DECISION THEORY
Last updated: September 17, 2012 BAYESIAN DECISION THEORY Problems 2 The following problems from the textbook are relevant: 2.1 2.9, 2.11, 2.17 For this week, please at least solve Problem 2.3. We will
More informationECE531 Lecture 13: Sequential Detection of Discrete-Time Signals
ECE531 Lecture 13: Sequential Detection of Discrete-Time Signals D. Richard Brown III Worcester Polytechnic Institute 30-Apr-2009 Worcester Polytechnic Institute D. Richard Brown III 30-Apr-2009 1 / 32
More informationMidterm, Fall 2003
5-78 Midterm, Fall 2003 YOUR ANDREW USERID IN CAPITAL LETTERS: YOUR NAME: There are 9 questions. The ninth may be more time-consuming and is worth only three points, so do not attempt 9 unless you are
More informationIN MOST problems pertaining to hypothesis testing, one
1602 IEEE SIGNAL PROCESSING LETTERS, VOL. 23, NO. 11, NOVEMBER 2016 Joint Detection and Decoding in the Presence of Prior Information With Uncertainty Suat Bayram, Member, IEEE, Berkan Dulek, Member, IEEE,
More informationMinimum Error Rate Classification
Minimum Error Rate Classification Dr. K.Vijayarekha Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur-613 401 Table of Contents 1.Minimum Error Rate Classification...
More informationEstimation Tasks. Short Course on Image Quality. Matthew A. Kupinski. Introduction
Estimation Tasks Short Course on Image Quality Matthew A. Kupinski Introduction Section 13.3 in B&M Keep in mind the similarities between estimation and classification Image-quality is a statistical concept
More informationESE 523 Information Theory
ESE 53 Inforation Theory Joseph A. O Sullivan Sauel C. Sachs Professor Electrical and Systes Engineering Washington University 11 Urbauer Hall 10E Green Hall 314-935-4173 (Lynda Marha Answers) jao@wustl.edu
More informationبسم الله الرحمن الرحيم
بسم الله الرحمن الرحيم Reliability Improvement of Distributed Detection in Clustered Wireless Sensor Networks 1 RELIABILITY IMPROVEMENT OF DISTRIBUTED DETECTION IN CLUSTERED WIRELESS SENSOR NETWORKS PH.D.
More informationECE521 week 3: 23/26 January 2017
ECE521 week 3: 23/26 January 2017 Outline Probabilistic interpretation of linear regression - Maximum likelihood estimation (MLE) - Maximum a posteriori (MAP) estimation Bias-variance trade-off Linear
More informationBayesian Decision Theory
Introduction to Pattern Recognition [ Part 4 ] Mahdi Vasighi Remarks It is quite common to assume that the data in each class are adequately described by a Gaussian distribution. Bayesian classifier is
More information40.530: Statistics. Professor Chen Zehua. Singapore University of Design and Technology
Singapore University of Design and Technology Lecture 9: Hypothesis testing, uniformly most powerful tests. The Neyman-Pearson framework Let P be the family of distributions of concern. The Neyman-Pearson
More informationLecture 2: Basic Concepts of Statistical Decision Theory
EE378A Statistical Signal Processing Lecture 2-03/31/2016 Lecture 2: Basic Concepts of Statistical Decision Theory Lecturer: Jiantao Jiao, Tsachy Weissman Scribe: John Miller and Aran Nayebi In this lecture
More informationPattern Classification
Pattern Classification All materials in these slides were taken from Pattern Classification (2nd ed) by R. O. Duda, P. E. Hart and D. G. Stork, John Wiley & Sons, 2000 with the permission of the authors
More informationClassification CE-717: Machine Learning Sharif University of Technology. M. Soleymani Fall 2012
Classification CE-717: Machine Learning Sharif University of Technology M. Soleymani Fall 2012 Topics Discriminant functions Logistic regression Perceptron Generative models Generative vs. discriminative
More informationMinimum Error-Rate Discriminant
Discriminants Minimum Error-Rate Discriminant In the case of zero-one loss function, the Bayes Discriminant can be further simplified: g i (x) =P (ω i x). (29) J. Corso (SUNY at Buffalo) Bayesian Decision
More informationL2: Review of probability and statistics
Probability L2: Review of probability and statistics Definition of probability Axioms and properties Conditional probability Bayes theorem Random variables Definition of a random variable Cumulative distribution
More informationON UNIFORMLY MOST POWERFUL DECENTRALIZED DETECTION
ON UNIFORMLY MOST POWERFUL DECENTRALIZED DETECTION by Uri Rogers A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Electrical and Computer Engineering
More informationProblem Set 3 Due Oct, 5
EE6: Random Processes in Systems Lecturer: Jean C. Walrand Problem Set 3 Due Oct, 5 Fall 6 GSI: Assane Gueye This problem set essentially reviews detection theory. Not all eercises are to be turned in.
More information10. Composite Hypothesis Testing. ECE 830, Spring 2014
10. Composite Hypothesis Testing ECE 830, Spring 2014 1 / 25 In many real world problems, it is difficult to precisely specify probability distributions. Our models for data may involve unknown parameters
More informationPubH 7470: STATISTICS FOR TRANSLATIONAL & CLINICAL RESEARCH
PubH 7470: STATISTICS FOR TRANSLATIONAL & CLINICAL RESEARCH The First Step: SAMPLE SIZE DETERMINATION THE ULTIMATE GOAL The most important, ultimate step of any of clinical research is to do draw inferences;
More informationLECTURE NOTE #3 PROF. ALAN YUILLE
LECTURE NOTE #3 PROF. ALAN YUILLE 1. Three Topics (1) Precision and Recall Curves. Receiver Operating Characteristic Curves (ROC). What to do if we do not fix the loss function? (2) The Curse of Dimensionality.
More information