Detection and Estimation Theory
1 ESE 524 Detection and Estimation Theory. Joseph A. O'Sullivan, Samuel C. Sachs Professor, Electronic Systems and Signals Research Laboratory, Electrical and Systems Engineering, Washington University, Urbauer Hall. (Lynda answers) jao@wustl.edu. J. A. O'S., ESE 524, Lecture 17, 03/19/09.
2 Outline: Expectation-Maximization (EM) Algorithm. Review of the EM algorithm. Alternate derivation using the convex decomposition lemma. Examples.
3 Maximum Likelihood Estimation. Maximum likelihood estimation often involves maximizing a complicated nonlinear function. Use standard optimization algorithms or the EM algorithm. The EM algorithm is matched to problems that can be modeled with hidden data. References: Dempster, Laird, and Rubin, J. Royal Statistical Society; M. I. Miller and D. L. Snyder, Proc. IEEE; DLS tutorial.
Maximum likelihood estimation: θ̂_ML = arg max_θ p(r|θ) = arg max_θ ln p(r|θ). Viewed as a function of θ: θ̂_ML = arg max_θ l(θ), where l(θ) = ln p(r|θ).
Model: θ → Source p(s|θ) → Channel p(r|s) → r.
4 EM Algorithm. Start with a reasonable guess as the initial estimate. Compute the expected value of the complete-data loglikelihood function given the incomplete (observed) data and the current estimate. Maximize this function over the parameters. Iterate.
EM Algorithm:
1. Initialization step: select θ^0, let k = 0.
2. E-step: compute the expected value Q(θ|θ^k) = E[ln p(s|θ) | r, θ^k] = ∫ p(s|r, θ^k) ln p(s|θ) ds.
3. M-step: maximize this function, θ^(k+1) = arg max_θ Q(θ|θ^k).
4. Iteration step: stop if converged; else k = k+1, go to step 2.
Model: θ → Source p(s|θ) → Channel p(r|s) → r.
5 Properties of the EM Algorithm. The EM algorithm monotonically increases the loglikelihood function at every iteration. Write
ln p(r|θ) = Q(θ|θ^k) + H(θ|θ^k) + C(θ^k),
where H(θ|θ^k) = -∫ p(s|r, θ^k) ln p(s|r, θ) ds and C(θ^k) = ∫ p(s|r, θ^k) ln p(r|s) ds does not depend on θ. Then
ln p(r|θ^(k+1)) - ln p(r|θ^k) = [Q(θ^(k+1)|θ^k) - Q(θ^k|θ^k)] + [H(θ^(k+1)|θ^k) - H(θ^k|θ^k)].
Q(θ^(k+1)|θ^k) ≥ Q(θ^k|θ^k) by the M-step, and
H(θ^(k+1)|θ^k) - H(θ^k|θ^k) = ∫ p(s|r, θ^k) ln [ p(s|r, θ^k) / p(s|r, θ^(k+1)) ] ds ≥ 0,
so ln p(r|θ^(k+1)) ≥ ln p(r|θ^k). Equality holds if and only if the current estimate is a maximum and the posterior density remains unchanged.
The algorithm is applicable to MAP problems by modifying the M-step. Maximum a posteriori (MAP) estimation: θ̂_MAP = arg max_θ [ln p(r|θ) + ln p(θ)]. New M-step: θ^(k+1) = arg max_θ [Q(θ|θ^k) + ln p(θ)].
6 Alternate Derivation of the EM Algorithm via Convex Optimization. ML estimation problem: θ̂_ML = arg max_θ ln p(r|θ). Hidden-variables model: p(r|θ) = ∫ p(r|s) p(s|θ) ds, so ln p(r|θ) = ln ∫ p(r|s) p(s|θ) ds. Define the set of probability density functions P = {Φ : Φ(s) ≥ 0, ∫ Φ(s) ds = 1}. Fix θ = θ^k; then by the convex decomposition lemma,
ln p(r|θ^k) = max_{Φ ∈ P} ∫ Φ(s) ln [ p(r|s) p(s|θ^k) / Φ(s) ] ds.
Double maximization for ML estimation:
max_θ ln p(r|θ) = max_θ max_{Φ ∈ P} ∫ Φ(s) ln [ p(r|s) p(s|θ) / Φ(s) ] ds.
7 EM Algorithm as an Alternating Maximization Algorithm. Alternately maximize over the posterior density and the parameter vector. This is based on a variational representation of the loglikelihood function from the convex decomposition lemma. We lift the original problem to a higher-dimensional problem where the optimization is easier.
1. Select an initial guess θ^0. Set k = 0.
2. E-step: maximize over Φ with θ^k fixed: Φ^k(s) = p(r|s) p(s|θ^k) / ∫ p(r|s') p(s'|θ^k) ds'.
3. M-step: maximize over θ with Φ^k fixed: θ^(k+1) = arg max_θ ∫ Φ^k(s) ln p(s|θ) ds = arg max_θ Q(θ|θ^k).
4. Check for convergence; else k = k + 1, go to step 2.
Double maximization for ML estimation: max_θ ln p(r|θ) = max_θ max_{Φ ∈ P} ∫ Φ(s) ln [ p(r|s) p(s|θ) / Φ(s) ] ds, where P = {Φ : Φ(s) ≥ 0, ∫ Φ(s) ds = 1}.
8 Convex Decomposition Lemma. Lemma: Suppose Σ_i q_i < ∞, q_i ≥ 0, and at least one q_i > 0. Then
-ln Σ_i q_i = min_{p ∈ P} Σ_i p_i ln (p_i / q_i), where P = {p : p_i ≥ 0, Σ_i p_i = 1}.
Proof: Form the Lagrangian L(p) = Σ_i p_i ln(p_i / q_i) - ν(Σ_i p_i - 1). Setting ∂L/∂p_l = ln(p_l / q_l) + 1 - ν = 0 gives p_l* = q_l e^(ν-1) = q_l / Σ_i q_i. Substituting,
Σ_l p_l* ln(p_l* / q_l) = -ln Σ_i q_i.
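The lemma is easy to check numerically. Below is a minimal sketch (assuming NumPy; the random q and the seed are arbitrary choices) that verifies both the value of the minimum and the claimed minimizer p_i* = q_i / Σ_j q_j:

```python
import numpy as np

rng = np.random.default_rng(0)
q = rng.uniform(0.1, 2.0, size=5)        # q_i >= 0, at least one > 0

p_star = q / q.sum()                     # claimed minimizer
lhs = -np.log(q.sum())                   # -ln(sum_i q_i)
min_val = np.sum(p_star * np.log(p_star / q))
assert np.isclose(lhs, min_val)          # minimum attains -ln(sum q)

# Any other p in the simplex gives a larger objective (KL-gap argument).
p_other = rng.dirichlet(np.ones(5))
assert np.sum(p_other * np.log(p_other / q)) >= min_val - 1e-12
```

The inequality for an arbitrary p follows because Σ_i p_i ln(p_i/q_i) = KL(p || p*) - ln Σ_i q_i, and the Kullback-Leibler divergence is nonnegative.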
9 Example of the Convex Decomposition Lemma: Poisson Data Model. Change the loglikelihood to a double maximization:
Σ_{i,j} y(i,j) ln λ(i,j), where λ(i,j) = Σ_{k=-K/2}^{K/2} Σ_{l=-L/2}^{L/2} h(k,l) c(i-k, j-l);
Σ_{i,j} y(i,j) ln λ(i,j) = max_Φ Σ_{i,j} y(i,j) Σ_{k,l} Φ(k,l | i,j) ln [ h(k,l) c(i-k, j-l) / Φ(k,l | i,j) ].
10 EM Algorithm. The E-step is a weighted Poisson mean; the M-step sets the next value to that mean:
c^(k+1)(k,l) = c^k(k,l) Σ_{i,j} [ h(i-k, j-l) y(i,j) / Σ_{k',l'} h(i-k', j-l') c^k(k',l') ] / Σ_{i,j} h(i-k, j-l).
This algorithm is widely used in astronomical imaging, where it is called the Lucy-Richardson algorithm; see also D. L. Snyder and T. Schulz. It also appears in positron emission tomography (PET) and many other situations. EML: expectation-maximization maximum likelihood.
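As an illustration, here is a 1-D sketch of this multiplicative EM (Richardson-Lucy) update for the Poisson model λ = h * c. It assumes NumPy; the kernel, the sparse test signal, the iteration count, and the small floor guarding the divide are all hypothetical choices:

```python
import numpy as np

def richardson_lucy(y, h, n_iter=50):
    """Richardson-Lucy / EM update for Poisson data y with mean h * c:
    c^{k+1} = c^k * correlate(y / (h * c^k), h) / sum(h),
    where * denotes 1-D convolution with 'same' output size."""
    h = np.asarray(h, dtype=float)
    c = np.full(len(y), float(np.mean(y)))   # positive initial estimate
    h_flip = h[::-1]                         # convolving with flipped h = correlation
    for _ in range(n_iter):
        blur = np.convolve(c, h, mode="same")
        ratio = np.asarray(y, dtype=float) / np.maximum(blur, 1e-12)
        c = c * np.convolve(ratio, h_flip, mode="same") / h.sum()
    return c

# Toy usage: blur a sparse nonnegative signal, add Poisson noise, deconvolve.
rng = np.random.default_rng(1)
c_true = np.zeros(64)
c_true[[20, 40]] = [50.0, 30.0]
h = np.array([0.25, 0.5, 0.25])
y = rng.poisson(np.convolve(c_true, h, mode="same")).astype(float)
c_hat = richardson_lucy(y, h, n_iter=200)
```

Because the update is multiplicative, a nonnegative starting point stays nonnegative at every iteration, which matches the Poisson-intensity constraint.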
11 Example: Poisson Plus Gaussian. The signal model is a Poisson random variable plus additive discrete-time white Gaussian noise. The mean of the Poisson represents the activity of interest. This is a model for the data available at the readout of many charge-coupled devices: the charges are shifted from one well to another, then read out serially at one location using an amplifier. The amplifier noise is well modeled as white Gaussian. The charge is modeled as resulting from counting photons and is modeled as Poisson.
y = n + w, P(n = k) = λ^k e^(-λ) / k!, k ≥ 0, w ~ N(0, σ²).
p_Y(y|λ) = Σ_{k=0}^∞ [ λ^k e^(-λ) / k! ] [ 1/√(2πσ²) ] e^(-(y-k)²/(2σ²)).
λ̂_ML = arg max_λ ln p_Y(y|λ).
12 Example: Poisson Plus Gaussian. Hidden data: the Poisson random variable n. The complete-data loglikelihood is Poisson. The expected value of the complete-data loglikelihood given the measured data and the current estimate depends only on the posterior mean of the hidden data, and the next estimate of the parameter equals that posterior mean:
Q(λ|λ^k) = E[ n ln λ - λ | y, λ^k ];
λ^(k+1) = arg max_λ Q(λ|λ^k) = E[ n | y, λ^k ].
13 Example: Poisson Plus Gaussian. The iterations involve a nonlinear function in this case; numerical, analytical, or lookup-table approximation may be required:
λ^(k+1) = E[ n | y, λ^k ]
= Σ_{n=0}^∞ n [ (λ^k)^n / n! ] e^(-(y-n)²/(2σ²)) / Σ_{n=0}^∞ [ (λ^k)^n / n! ] e^(-(y-n)²/(2σ²))
= λ^k Σ_{n=1}^∞ [ (λ^k)^(n-1) / (n-1)! ] e^(-(y-n)²/(2σ²)) / Σ_{n=0}^∞ [ (λ^k)^n / n! ] e^(-(y-n)²/(2σ²)).
(The factors e^(-λ) and 1/√(2πσ²) cancel in the ratio.)
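A numerical sketch of this iteration (assuming NumPy; the series truncation point kmax and the simulation parameters are arbitrary choices). For several i.i.d. readouts the M-step averages the per-sample posterior means, since Q sums over the samples:

```python
import numpy as np

def posterior_mean_n(y, lam, sigma, kmax=200):
    """E[n | y, lam] for y = n + w, n ~ Poisson(lam), w ~ N(0, sigma^2).
    The factors e^{-lam} and 1/sqrt(2 pi sigma^2) cancel in the ratio,
    so only lam^n / n! and the Gaussian kernel are kept (in log domain)."""
    n = np.arange(kmax)
    log_fact = np.cumsum(np.log(np.maximum(n, 1)))   # log n!
    log_w = (n * np.log(lam) - log_fact
             - (np.asarray(y, dtype=float)[..., None] - n) ** 2 / (2 * sigma ** 2))
    w = np.exp(log_w - log_w.max(axis=-1, keepdims=True))  # stable normalization
    return (w * n).sum(axis=-1) / w.sum(axis=-1)

def em_poisson_gaussian(y, sigma, lam0=1.0, n_iter=50):
    """EM iteration: lam^{k+1} = average of E[n_i | y_i, lam^k]."""
    lam = lam0
    for _ in range(n_iter):
        lam = posterior_mean_n(y, lam, sigma).mean()
    return lam

# Hypothetical simulation: 2000 readouts with lam = 5, sigma = 0.5.
rng = np.random.default_rng(2)
y = rng.poisson(5.0, size=2000) + rng.normal(0.0, 0.5, size=2000)
lam_hat = em_poisson_gaussian(y, 0.5)
```

With a small noise variance the posterior mean is close to a rounding of y, so the iteration settles near the true mean after only a few steps.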
14 Estimate the Variance of a Gaussian in White Gaussian Noise. Suppose that M i.i.d. measurements of zero-mean Gaussian random variables s_m are made in additive white Gaussian noise w_m of known variance. Find the maximum likelihood estimate of the unknown variance. Two approaches: analytical, and the EM algorithm.
r_m = s_m + w_m, m = 1, 2, ..., M; s_m i.i.d. N(0, P); w_m i.i.d. N(0, N); s and w independent; r_m i.i.d. N(0, P + N).
l(r|P) = -(M/2) ln(P + N) - [ 1/(2(P+N)) ] Σ_{m=1}^M r_m².
First-order necessary condition: ∂l/∂P = -M/(2(P+N)) + [ 1/(2(P+N)²) ] Σ_{m=1}^M r_m² = 0.
P̂_ML = max( (1/M) Σ_{m=1}^M r_m² - N, 0 ).
15 Gaussian Variance in AWGN. Analytical approach: the solution can be found directly. EM algorithm: the complete data comprise the pairs of random variables (s_m, w_m).
r_m = s_m + w_m, m = 1, 2, ..., M; s_m i.i.d. N(0, P); w_m i.i.d. N(0, N); r_m i.i.d. N(0, P + N).
Incomplete-data loglikelihood: l(r|P) = -(M/2) ln(P + N) - [ 1/(2(P+N)) ] Σ_m r_m².
Complete-data loglikelihood: l_cd(s|P) = -(M/2) ln P - (1/(2P)) Σ_{m=1}^M s_m².
Q(P|P^k) = E[ l_cd(s|P) | r, P^k ] = -(M/2) ln P - (1/(2P)) Σ_{m=1}^M E[ s_m² | r, P^k ].
Setting ∂Q/∂P = 0: P^(k+1) = (1/M) Σ_{m=1}^M E[ s_m² | r, P^k ].
16 Estimate the Variance of a Gaussian in White Gaussian Noise. From the M-step, P^(k+1) = (1/M) Σ_{m=1}^M E[ s_m² | r_m, P^k ]. The conditional second moment is
E[ s_m² | r_m, P^k ] = ( E[ s_m | r_m, P^k ] )² + var( s_m | r_m, P^k ) = [ P^k r_m / (P^k + N) ]² + P^k N / (P^k + N).
Substituting and simplifying,
P^(k+1) = P^k + [ P^k / (P^k + N) ]² [ (1/M) Σ_{m=1}^M r_m² - P^k - N ].
17 Estimate the Variance of a Gaussian in White Gaussian Noise. The iteration has a fixed point at the maximum likelihood solution. Region of convergence? All positive starting points. Rate of convergence? Linear; see below, where the equality holds for a positive estimate. If the estimate is 0, the convergence is sublinear.
P^(k+1) = P^k + [ P^k / (P^k + N) ]² [ (1/M) Σ_{m=1}^M r_m² - P^k - N ].
For a positive ML estimate P*, (1/M) Σ_m r_m² = P* + N, so
P^(k+1) - P* = (P^k - P*) [ 1 - (P^k / (P^k + N))² ].
Near convergence the contraction factor approaches 1 - [ P*/(P* + N) ]² = 1 - [ SNR/(1 + SNR) ]², where SNR = P*/N.
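The fixed point and the linear rate are easy to check numerically. A sketch (assuming NumPy; the true variance, noise variance, sample count, and starting point are hypothetical values) that runs the iteration and compares the observed error ratio with the predicted contraction factor:

```python
import numpy as np

rng = np.random.default_rng(3)
P_true, N_var, M = 4.0, 1.0, 5000        # signal variance, noise variance, samples
r = rng.normal(0.0, np.sqrt(P_true + N_var), size=M)

r2 = np.mean(r ** 2)
P_ml = max(r2 - N_var, 0.0)              # analytic ML estimate

# EM iteration: P^{k+1} = P^k + (P^k/(P^k+N))^2 * (mean(r^2) - P^k - N)
P, errs = 0.5, []
for _ in range(40):
    P = P + (P / (P + N_var)) ** 2 * (r2 - P - N_var)
    errs.append(abs(P - P_ml))

# Predicted asymptotic contraction factor: 1 - (P*/(P*+N))^2
rate = 1.0 - (P_ml / (P_ml + N_var)) ** 2
```

After the transient, successive error ratios errs[k+1]/errs[k] hover at the predicted rate, illustrating the linear convergence claimed above.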
18 Estimation of a Source Distribution from Sensor Array Data. Suppose that an array of sensors is distributed over some area on a two-dimensional plane. The location of each array element is denoted (x_k, y_k). A complex-valued signal is incident upon the array from angle (θ, φ) relative to the x-y axes (θ is pitch, φ is yaw). For radar, sonar, or radio communications, the complex values can represent the in-phase and quadrature components of the signal. A narrowband approximation is usually made. The signal is assumed to come from the far field. The sensors sample the incoming waves at their respective locations. There is assumed to be additive white Gaussian noise at the sensors.
Far-field assumption: the curvature of the wavefront relative to the extent of the array is negligible. Thus the phase shift relative to the center of the array at the center frequency is determined only by the direction cosines.
Narrowband assumption: the bandwidth of the signal relative to the center frequency is small, often taken as less than 10%. The sampling rate of the sensors satisfies the Nyquist criterion (at least twice the bandwidth); sampling is often done at an intermediate frequency or at baseband. The change in the signal across the sensor array is negligible; that is, relative to the maximum delay across the array, the signal does not change substantially.
19 Estimation of a Source Distribution from Sensor Array Data. The signal can come from a source or from reflections of a transmitted wave. For reflections, we assume diffuse and incoherent scatterers, so the samples of the reflectivity are independent and identically distributed random variables. If they come from a collection of smaller scatterers, then a complex Gaussian model is appropriate. For a source, we assume a distributed incoherent source whose samples are well modeled by samples of a complex Gaussian distribution. For complex Gaussian distributions, the real and imaginary parts are independent Gaussian random variables with zero mean and equal variance; this is also referred to as the Goodman class. The AWGN is complex Gaussian, independent of the signal.
20 Estimation of a Source Distribution from Sensor Array Data. Under the assumptions, a signal vector received by the array equals a linear combination of direction vectors times complex Gaussian random variables. The data vector that is available equals this signal vector plus a noise vector. If all sensors are identical and the noise is independent from sensor to sensor, the covariance matrix for the noise is a constant times an identity matrix.
The direction vector is determined by the time delay: a_x(θ, φ) = cos θ cos φ, a_y(θ, φ) = sin φ; d_k(θ, φ) = x_k a_x(θ, φ) + y_k a_y(θ, φ); τ_k(θ, φ) = d_k(θ, φ)/c.
The phase shift is determined by the time delay and the center frequency: exp(j 2π f₀ τ_k(θ, φ)) = exp(j 2π d_k(θ, φ)/λ₀).
The signal equals a linear combination: s_kn = Σ_{m=1}^M c_mn exp(j 2π d_k(θ_m, φ_m)/λ₀).
The data equal signal plus noise: r_kn = s_kn + w_kn, k = 1, 2, ..., K, n = 1, 2, ..., N.
21 Estimation of a Source Distribution from Sensor Array Data. The signal equals a linear combination s_kn = Σ_{m=1}^M c_mn exp(j 2π d_k(θ_m, φ_m)/λ₀), and the data equal signal plus noise: r_kn = s_kn + w_kn, k = 1, 2, ..., K, n = 1, 2, ..., N. In vector form,
r_n = s_n + w_n, w_n ~ CN(0, N₀ I), s_n = A c_n,
where A is the K × M steering matrix with entries A_km = exp(j 2π d_k(θ_m, φ_m)/λ₀).
22 Estimation of a Source Distribution from Sensor Array Data. The data equal signal plus noise:
r_n = s_n + w_n, w_n ~ CN(0, N₀ I), i.i.d., n = 1, 2, ..., N;
s_n = A c_n, c_n ~ CN(0, Σ), i.i.d., n = 1, 2, ..., N;
r_n ~ CN(0, A Σ A^H + N₀ I), i.i.d., n = 1, 2, ..., N,
where A^H is the complex conjugate transpose of A.
More on the complex Gaussian: let x and y be independent Gaussian random variables with zero mean and variances equal to σ². Then the joint probability density function is
[ 1/(2πσ²) ] e^(-(x²+y²)/(2σ²)) = [ 1/(π N₀) ] e^(-(x²+y²)/N₀), with N₀ = 2σ².
That is, the joint pdf is parameterized by the total variance.
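To illustrate the model, here is a simulation sketch (assuming NumPy). The half-wavelength linear array along the x-axis, the two arrival angles with φ = 0 so that d_k reduces to x_k cos θ, the source powers in Σ, and the noise level N₀ are all hypothetical choices; the sample covariance of the snapshots is compared against A Σ A^H + N₀ I:

```python
import numpy as np

rng = np.random.default_rng(4)
K, M, N = 8, 2, 200000                  # sensors, sources, snapshots
lam0 = 1.0                              # wavelength
x = np.arange(K) * lam0 / 2.0           # half-wavelength linear array, y_k = 0
theta = np.deg2rad([20.0, 60.0])        # arrival angles (phi = 0)

# Steering matrix: A[k, m] = exp(j 2 pi d_k(theta_m)/lam0), d_k = x_k cos(theta_m)
A = np.exp(1j * 2.0 * np.pi * np.outer(x, np.cos(theta)) / lam0)

Sigma = np.diag([2.0, 1.0])             # source covariance (incoherent sources)
N0 = 0.5                                # noise level

def crandn(shape, rng):
    """CN(0, 1) samples: real and imaginary parts each N(0, 1/2)."""
    return (rng.normal(size=shape) + 1j * rng.normal(size=shape)) / np.sqrt(2.0)

c = np.sqrt(np.diag(Sigma))[:, None] * crandn((M, N), rng)
w = np.sqrt(N0) * crandn((K, N), rng)
r = A @ c + w                           # snapshots r_n stacked as columns

R_hat = (r @ r.conj().T) / N            # sample covariance
R = A @ Sigma @ A.conj().T + N0 * np.eye(K)
```

With many snapshots the sample covariance converges to the model covariance, which is the statistic that source-distribution estimation methods for this model operate on.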
More informationarxiv: v1 [cs.lg] 8 Jan 2019
Data Masking with Privacy Guarantees Anh T. Pha Oregon State University phatheanhbka@gail.co Shalini Ghosh Sasung Research shalini.ghosh@gail.co Vinod Yegneswaran SRI international vinod@csl.sri.co arxiv:90.085v
More informationFinite-State Markov Modeling of Flat Fading Channels
International Telecounications Syposiu ITS, Natal, Brazil Finite-State Markov Modeling of Flat Fading Channels Cecilio Pientel, Tiago Falk and Luciano Lisbôa Counications Research Group - CODEC Departent
More informationPattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lesson 1 4 October 2017 Outline Learning and Evaluation for Pattern Recognition Notation...2 1. The Pattern Recognition
More informationA Low-Complexity Congestion Control and Scheduling Algorithm for Multihop Wireless Networks with Order-Optimal Per-Flow Delay
A Low-Coplexity Congestion Control and Scheduling Algorith for Multihop Wireless Networks with Order-Optial Per-Flow Delay Po-Kai Huang, Xiaojun Lin, and Chih-Chun Wang School of Electrical and Coputer
More informationData-Driven Imaging in Anisotropic Media
18 th World Conference on Non destructive Testing, 16- April 1, Durban, South Africa Data-Driven Iaging in Anisotropic Media Arno VOLKER 1 and Alan HUNTER 1 TNO Stieltjesweg 1, 6 AD, Delft, The Netherlands
More informationMulti-Scale/Multi-Resolution: Wavelet Transform
Multi-Scale/Multi-Resolution: Wavelet Transfor Proble with Fourier Fourier analysis -- breaks down a signal into constituent sinusoids of different frequencies. A serious drawback in transforing to the
More informationOn Constant Power Water-filling
On Constant Power Water-filling Wei Yu and John M. Cioffi Electrical Engineering Departent Stanford University, Stanford, CA94305, U.S.A. eails: {weiyu,cioffi}@stanford.edu Abstract This paper derives
More informationMAXIMUM LIKELIHOOD BASED TECHNIQUES FOR BLIND SOURCE SEPARATION AND APPROXIMATE JOINT DIAGONALIZATION
BEN-GURION UNIVERSITY OF TE NEGEV FACULTY OF ENGINEERING SCIENCE DEPARTENT OF ELECTRICAL AND COPUTER ENGINEERING AXIU LIKELIOOD BASED TECNIQUES FOR BLIND SOURCE SEPARATION AND APPROXIATE JOINT DIAGONALIZATION
More informationA Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair
Proceedings of the 6th SEAS International Conference on Siulation, Modelling and Optiization, Lisbon, Portugal, Septeber -4, 006 0 A Siplified Analytical Approach for Efficiency Evaluation of the eaving
More informationMSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE
Proceeding of the ASME 9 International Manufacturing Science and Engineering Conference MSEC9 October 4-7, 9, West Lafayette, Indiana, USA MSEC9-8466 MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL
More informationDetection and Estimation Theory
ESE 524 Detection and Estimation Theory Joseph A. O Sullivan Samuel C. Sachs Professor Electronic Systems and Signals Research Laboratory Electrical and Systems Engineering Washington University 2 Urbauer
More informationESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS. A Thesis. Presented to. The Faculty of the Department of Mathematics
ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS A Thesis Presented to The Faculty of the Departent of Matheatics San Jose State University In Partial Fulfillent of the Requireents
More informationSEISMIC FRAGILITY ANALYSIS
9 th ASCE Specialty Conference on Probabilistic Mechanics and Structural Reliability PMC24 SEISMIC FRAGILITY ANALYSIS C. Kafali, Student M. ASCE Cornell University, Ithaca, NY 483 ck22@cornell.edu M. Grigoriu,
More informationA BLOCK MONOTONE DOMAIN DECOMPOSITION ALGORITHM FOR A NONLINEAR SINGULARLY PERTURBED PARABOLIC PROBLEM
INTERNATIONAL JOURNAL OF NUMERICAL ANALYSIS AND MODELING Volue 3, Nuber 2, Pages 211 231 c 2006 Institute for Scientific Coputing and Inforation A BLOCK MONOTONE DOMAIN DECOMPOSITION ALGORITHM FOR A NONLINEAR
More informationCorrelated Bayesian Model Fusion: Efficient Performance Modeling of Large-Scale Tunable Analog/RF Integrated Circuits
Correlated Bayesian odel Fusion: Efficient Perforance odeling of Large-Scale unable Analog/RF Integrated Circuits Fa Wang and Xin Li ECE Departent, Carnegie ellon University, Pittsburgh, PA 53 {fwang,
More informationVariations on Backpropagation
2 Variations on Backpropagation 2 Variations Heuristic Modifications Moentu Variable Learning Rate Standard Nuerical Optiization Conjugate Gradient Newton s Method (Levenberg-Marquardt) 2 2 Perforance
More informationBiostatistics Department Technical Report
Biostatistics Departent Technical Report BST006-00 Estiation of Prevalence by Pool Screening With Equal Sized Pools and a egative Binoial Sapling Model Charles R. Katholi, Ph.D. Eeritus Professor Departent
More informationInference in the Presence of Likelihood Monotonicity for Polytomous and Logistic Regression
Advances in Pure Matheatics, 206, 6, 33-34 Published Online April 206 in SciRes. http://www.scirp.org/journal/ap http://dx.doi.org/0.4236/ap.206.65024 Inference in the Presence of Likelihood Monotonicity
More informationPattern Recognition and Machine Learning. Artificial Neural networks
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lessons 7 20 Dec 2017 Outline Artificial Neural networks Notation...2 Introduction...3 Key Equations... 3 Artificial
More informationBootstrapping Dependent Data
Bootstrapping Dependent Data One of the key issues confronting bootstrap resapling approxiations is how to deal with dependent data. Consider a sequence fx t g n t= of dependent rando variables. Clearly
More informationDERIVING PROPER UNIFORM PRIORS FOR REGRESSION COEFFICIENTS
DERIVING PROPER UNIFORM PRIORS FOR REGRESSION COEFFICIENTS N. van Erp and P. van Gelder Structural Hydraulic and Probabilistic Design, TU Delft Delft, The Netherlands Abstract. In probles of odel coparison
More informationSharp Time Data Tradeoffs for Linear Inverse Problems
Sharp Tie Data Tradeoffs for Linear Inverse Probles Saet Oyak Benjain Recht Mahdi Soltanolkotabi January 016 Abstract In this paper we characterize sharp tie-data tradeoffs for optiization probles used
More informationA remark on a success rate model for DPA and CPA
A reark on a success rate odel for DPA and CPA A. Wieers, BSI Version 0.5 andreas.wieers@bsi.bund.de Septeber 5, 2018 Abstract The success rate is the ost coon evaluation etric for easuring the perforance
More informationIntroduction to Discrete Optimization
Prof. Friedrich Eisenbrand Martin Nieeier Due Date: March 9 9 Discussions: March 9 Introduction to Discrete Optiization Spring 9 s Exercise Consider a school district with I neighborhoods J schools and
More information