A NEW ROBUST AND EFFICIENT ESTIMATOR FOR ILL-CONDITIONED LINEAR INVERSE PROBLEMS WITH OUTLIERS


Marta Martinez-Camara 1, Michael Muma 2, Abdelhak M. Zoubir 2, Martin Vetterli 1

1 School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL), CH-1015 Lausanne, Switzerland, marta.martinez-camara@epfl.ch, martin.vetterli@epfl.ch
2 Signal Processing Group, Technische Universität Darmstadt, Merckstr. 25, Darmstadt, Germany, muma@spg.tu-darmstadt.de, zoubir@spg.tu-darmstadt.de

ABSTRACT

Solving a linear inverse problem may include difficulties such as the presence of outliers and a mixing matrix with a large condition number. In such cases a regularized robust estimator is needed. We propose a new τ-type regularized robust estimator that is simultaneously highly robust against outliers, highly efficient in the presence of purely Gaussian noise, and also stable when the mixing matrix has a large condition number. We also propose an algorithm to compute the estimates, based on a regularized iterative reweighted least squares algorithm. A basic and a fast version of the algorithm are given. Finally, we test the performance of the proposed approach using numerical experiments and compare it with other estimators. Our estimator provides superior robustness, even up to 40% of outliers, while at the same time performing quite close to the optimal maximum likelihood estimator in the outlier-free case.

1. INTRODUCTION

Let a linear inverse problem be given by

y = Ax + n.  (1)

The unknown source signal x ∈ R^{n×1} is to be estimated. The measurement vector y ∈ R^{m×1} and the mixing matrix A ∈ R^{m×n} are known. Finally, the noise n ∈ R^{m×1} is in general unknown. This type of problem appears frequently in signal processing [1], as well as in physics, geosciences [2, 3], finance, and engineering in general. Traditionally, n is assumed to have independent and identically distributed (iid) Gaussian random entries. Under this assumption, the optimal estimator is the least squares (LS) estimator [4]:

x̂ = arg min_x ‖y − Ax‖₂².  (2)
However, in many modern engineering problems this idealistic noise model does not hold. The distribution of n is far from Gaussian when it contains outliers, which causes the distribution to be heavy-tailed [5].

(This work was supported by an SNF Grant, Non-linear Sampling Methods; by the project HANDiCAMS, which acknowledges the financial support of the Future and Emerging Technologies (FET) programme within the Seventh Framework Programme for Research of the European Commission, under an FET-Open grant; and by an ERC Advanced Grant, Support for Frontier Research, SPARSAM.)

In such scenarios, the performance of the LS-estimator drastically degrades. As a solution, many statistically robust estimators have been proposed in the literature. The first systematic approach to statistically robust estimation is M-estimation [6], which generalizes maximum likelihood (ML) estimation. However, M-estimators are non-robust, i.e., they have a breakdown point (BP) equal to zero, when leverage points (errors in the entries of A) are present. The BP is the proportion of outliers in the data that an estimator can handle before giving an uninformative (e.g., arbitrarily large) result. A second fundamental class of robust estimators is based on the minimization of a robust residual scale. This class includes estimators such as the least-trimmed-squares (LTS) [7], the least-median-of-squares (LMS) [7], and the S-estimators [8]. The latter minimize the M-estimate of scale of the residuals. S-estimators can be highly robust, i.e., have a high BP. However, in this case, they do not have high efficiency: the variance of the S-estimator, when it is tuned for high robustness, is much larger than that of the ML-estimator. This led to the development of a third class of robust estimators, which are simultaneously robust and efficient. The τ-estimator [9] belongs to this class. Another common difficulty is A having a large condition number.
In such cases, the LS solution in (2) does not depend continuously on the problem data y and A, i.e., it is unstable. Such inverse problems are said to be ill-posed. To stabilize them, it is possible to introduce additional a priori information into the problem. This is referred to as regularization. Tikhonov regularization is one of the oldest and most well-known techniques for dealing with ill-posed inverse problems [10]:

x̂ = arg min_x ‖y − Ax‖₂² + λ‖x‖₂².  (3)

Moreover, some inverse problems suffer from both outliers and a matrix A with a large condition number [11]. In these cases, a regularized robust estimator is needed. Recently, there has been increased interest in regularized estimators in the robust statistics community. In particular, regularized M-estimators [12], S-estimators [13], and MM-estimators [14] were proposed. Of these, only the MM-estimator is simultaneously efficient and robust against leverage points [5]. We propose a regularized estimator of the τ-type, which is simultaneously highly robust, efficient, and stable when A has a large condition number. An advantage of our proposed estimator over the

[© IEEE, ICASSP 2015]

regularized MM-estimator is that it provides a high BP and a highly efficient estimate of the scale of the errors, in addition to the regularized estimate. We also derive an algorithm to compute the estimates, which formulates the τ-minimization problem as a regularized iterative reweighted least squares (IRLS) problem of a non-convex function. A fast version of the algorithm is provided, which is highly practical. Simulation studies are also conducted to compare our estimator to the regularized LS-estimator and M-estimator in terms of solving an outlier-contaminated inverse problem, where the matrix A has a high condition number. A brief discussion about how the regularization affects the robustness of the estimators is provided by running the same experiment, but using a matrix A with a low condition number.

This paper is organized as follows. Section 2 introduces the proposed regularized τ-estimator. Section 3 contains the derivation of an algorithm that solves the τ-minimization problem as a regularized IRLS problem of a non-convex function in a computationally feasible way. Section 4 provides a Monte-Carlo simulation study that compares the proposed approach to the regularized LS-estimator and M-estimator in terms of the norm of the error. Section 5 concludes the paper and provides a brief outlook on future research.

2. THE REGULARIZED τ-ESTIMATOR

The τ-scale of the residuals σ_τ(r(x)) [9] is defined by

σ_τ²(r(x)) = σ_M²(r(x)) · (1/m) Σ_{i=1}^{m} ρ₂( r_i(x) / σ_M(r(x)) ).  (4)

Here, σ_M(r(x)) is the robust M-scale estimate, given by

(1/m) Σ_{i=1}^{m} ρ₁( r_i(x) / σ_M(r(x)) ) = b,  (5)

with b = E_F[ρ₁(r)], E_F[·] denoting the expectation w.r.t. the standard Gaussian distribution F, and r_i(x) = y_i − A_i x for i ∈ {1, …, m}, where A_i denotes the i-th row of A. The choice of ρ₂ controls the efficiency of the τ-estimator; e.g., full efficiency for the Gaussian distribution can be obtained by setting ρ₂(r) = r², which corresponds to the ML-estimator under the Gaussian assumption.
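As an illustration of (4) and (5), both scales can be computed numerically. The sketch below is ours, not the authors' code: it assumes the Tukey bisquare family for ρ₁ and ρ₂ (the paper's experiments use the optimal weight function instead), the standard bisquare tunings c₁ ≈ 1.547 with b = 0.5 for BP = 0.5 and c₂ ≈ 6.08 for high Gaussian efficiency, and hypothetical helper names.

```python
import numpy as np

def rho_bisquare(u, c):
    """Tukey bisquare loss, normalized so that rho(u) = 1 for |u| >= c."""
    v = np.minimum((u / c) ** 2, 1.0)
    return 1.0 - (1.0 - v) ** 3

def m_scale(r, c1=1.547, b=0.5, n_iter=100, tol=1e-10):
    """Solve (1/m) sum_i rho1(r_i / sigma) = b for sigma (eq. 5) by the
    standard fixed-point iteration, started from the normalized MAD."""
    sigma = np.median(np.abs(r)) / 0.6745
    for _ in range(n_iter):
        sigma_new = sigma * np.sqrt(np.mean(rho_bisquare(r / sigma, c1)) / b)
        if abs(sigma_new - sigma) < tol * sigma:
            return sigma_new
        sigma = sigma_new
    return sigma

def tau_scale_sq(r, c1=1.547, c2=6.08, b=0.5):
    """tau^2(r) = sigma_M^2 * (1/m) sum_i rho2(r_i / sigma_M) (eq. 4)."""
    sigma_m = m_scale(r, c1, b)
    return sigma_m ** 2 * np.mean(rho_bisquare(r / sigma_m, c2))
```

For standard Gaussian residuals, m_scale is consistent (close to 1 for large m), while replacing a large fraction of the residuals with gross outliers leaves it bounded — the BP = 0.5 behaviour described below.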
The BP of the τ-estimator, on the other hand, is controlled by ρ₁, and is the same as that of an S-estimator [8] that uses ρ₁. In this way, one can obtain BP = 0.5 simultaneously with high efficiency [9]. In this paper, we introduce the regularized τ-estimator of regression as the minimizer of the sum of the τ-estimate of scale of the residuals and the Tikhonov regularization term:

x̂ = arg min_x τ_R²(r(x)),  (6)

where τ_R²(r(x)) = σ_τ²(r(x)) + λ‖x‖₂² and λ ≥ 0 is the regularization parameter.

3. COMPUTING THE ESTIMATES

A major challenge of the regularized τ-estimator is its computation. We note that (6) is a non-convex function, so finding the exact global minimum may be computationally prohibitive. Thus, to minimize (6), we propose a modification of the heuristic algorithm given in [15]. The main idea behind this heuristic algorithm can be summarized as follows: find several local minima, and select the best one among them as the final solution. There is, of course, no guarantee that this solution is the global minimum of (6).

3.1. Finding local minima

The parameter values that locally minimize τ_R²(r(x)) are those at which the derivative w.r.t. x equals zero. Defining

r̃(x) := r(x) / σ_M(r(x)),  (7)

ψ_j(u) := ∂ρ_j(u)/∂u,  (8)

this yields

∂τ_R²(r(x))/∂x = 2 σ_M(r(x)) ∂σ_M(r(x))/∂x · (1/m) Σ_{i=1}^{m} ρ₂(r̃_i(x))
  + σ_M²(r(x)) · (1/m) Σ_{i=1}^{m} ψ₂(r̃_i(x)) [ −A_i σ_M(r(x)) − r_i(x) ∂σ_M(r(x))/∂x ] / σ_M²(r(x)) + 2λx = 0.  (9)

To find ∂σ_M(r(x))/∂x, we take the derivative of (5) w.r.t. x:

∂σ_M(r(x))/∂x = −σ_M(r(x)) · [ Σ_{i=1}^{m} ψ₁(r̃_i(x)) A_i ] / [ Σ_{i=1}^{m} ψ₁(r̃_i(x)) r_i(x) ].  (10)

Replacing (10) in (9), we obtain

−(1/m) Σ_{i=1}^{m} ( W(x) ψ₁(r̃_i(x)) + ψ₂(r̃_i(x)) ) σ_M(r(x)) A_i + 2λx = 0,  (11)

where

W(x) := [ Σ_{i=1}^{m} ( 2ρ₂(r̃_i(x)) − ψ₂(r̃_i(x)) r̃_i(x) ) ] / [ Σ_{i=1}^{m} ψ₁(r̃_i(x)) r̃_i(x) ].  (12)

We can see that (11) coincides with ∂f(x)/∂x = 0, where f(x) is the regularized weighted least squares objective

f(x) = Σ_{i=1}^{m} w_i(x) (y_i − A_i x)² + λ‖x‖₂²  (13)

with weights

w_i(x) = ψ_τ(r̃_i(x)) / ( 2m r̃_i(x) ),  (14)

where

ψ_τ(r̃_i(x)) = W(x) ψ₁(r̃_i(x)) + ψ₂(r̃_i(x)).  (15)
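To make the weight construction (12)–(15) concrete, here is a minimal numerical sketch (again ours, not the paper's code, with bisquare ρ-functions and invented names; the u → 0 limit of ψ_τ(u)/u is handled explicitly so that zero residuals do not produce divisions by zero):

```python
import numpy as np

def rho_bisquare(u, c):
    """Normalized Tukey bisquare loss: rho(u) = 1 for |u| >= c."""
    v = np.minimum((u / c) ** 2, 1.0)
    return 1.0 - (1.0 - v) ** 3

def psi_bisquare(u, c):
    """psi_j(u) = d rho_j / du for the normalized bisquare."""
    return np.where(np.abs(u) <= c,
                    6.0 * u * (1.0 - (u / c) ** 2) ** 2 / c ** 2, 0.0)

def irls_weights(r, sigma_m, c1=1.547, c2=6.08):
    """Weights w_i of eq. (14), with W(x) from (12) and psi_tau from (15)."""
    m = len(r)
    rt = r / sigma_m                                   # standardized residuals
    W = (np.sum(2.0 * rho_bisquare(rt, c2) - psi_bisquare(rt, c2) * rt)
         / np.sum(psi_bisquare(rt, c1) * rt))          # eq. (12)
    psi_tau = W * psi_bisquare(rt, c1) + psi_bisquare(rt, c2)   # eq. (15)
    # limit of psi_tau(u)/u as u -> 0, used for exactly-zero residuals
    limit = W * 6.0 / c1 ** 2 + 6.0 / c2 ** 2
    w = np.divide(psi_tau, rt, out=np.full_like(rt, limit), where=(rt != 0.0))
    return w / (2.0 * m)                               # eq. (14)
```

Note that for the bisquare family both ψ₁(u)u and 2ρ₂(u) − ψ₂(u)u are non-negative, so all weights are non-negative, and residuals clipped by both ρ-functions receive weight zero.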

So f(x) is just another way of writing (6). Hence, f(x) can be minimized iteratively as a regularized IRLS [16] problem with the weights given by (14). The regularized IRLS algorithm for our problem is detailed in Algorithm 1. As f(x) is a non-convex function, the result of this minimization depends on the initial condition x[0] given to the IRLS.

Algorithm 1 Regularized IRLS
INPUT: y ∈ R^m, A ∈ R^{m×n}, λ, ε, x[0], K
require λ ≥ 0, ε ≥ 0, K ≥ 0
for k = 0 to K − 1 do
    r[k] ← y − A x[k]
    Compute σ_M[k]
    Compute W[k] (diagonal, with W[k]ᵀW[k] = diag(w_i(x[k])), w_i from (14))
    x[k+1] ← (Aᵀ W[k]ᵀ W[k] A + λI)⁻¹ Aᵀ W[k]ᵀ W[k] y
    if ‖x[k+1] − x[k]‖ < ε then break
end for
return x[k+1]

From (11), we deduce that the regularized τ-estimator is asymptotically equivalent to a regularized M-estimator with a data-adaptive ψ-function. The basic algorithm is shown below as Algorithm 2.

Algorithm 2 Basic algorithm
INPUT: y ∈ R^m, A ∈ R^{m×n}, λ, ε, K, Q
require λ ≥ 0, ε ≥ 0, K ≥ 0, Q ≥ 0
for q = 0 to Q − 1 do
    x_q[0] ← random initial condition
    x_q ← Algorithm 1 (y, A, λ, ε, x_q[0], K)
end for
return arg min_{x_q} τ_R²(r(x_q))

Naturally, the value of K must be large enough to ensure sufficient convergence of the solutions, typically on the order of hundreds of iterations. Hence, Algorithm 2 may require long run times, especially when high numbers of candidate solutions Q are considered.

3.2. Fast algorithm

To reduce the computational cost of the basic version of the algorithm, we adopt the idea proposed in [15]. It is a two-step algorithm, as shown below.

Algorithm 3 Fast algorithm
INPUT: y ∈ R^m, A ∈ R^{m×n}, λ, ε, K, Q, M
require λ ≥ 0, ε ≥ 0, K ≥ 0, Q ≥ 0, M ≥ 0
for q = 0 to Q − 1 do
    x_q[0] ← random initial condition
    x_q ← Algorithm 1 (y, A, λ, ε, x_q[0], K)
end for
Z ← set of M best solutions x_q
for z ∈ Z do
    x_z ← Algorithm 1 (y, A, λ, ε, z, ∞)
end for
return arg min_{x_z} τ_R²(r(x_z))

In the first step, a very small K is chosen, on the order of 3 to 5. Hence, this step, though identical to the basic algorithm, takes much less time. After the first step, instead of choosing just one best solution, as in Algorithm 2, the M best solutions are kept as input to the second step.
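Algorithm 1 can be sketched end-to-end in a few lines. This is an illustrative implementation under the same assumptions as before (bisquare ρ-functions, standard tunings, invented names), not the authors' code; it solves the weighted ridge normal equations (Aᵀ diag(w) A + λI) x = Aᵀ diag(w) y at each iteration, which is the W[k] update of Algorithm 1 with diag(w) = W[k]ᵀW[k].

```python
import numpy as np

def rho(u, c):
    v = np.minimum((u / c) ** 2, 1.0)
    return 1.0 - (1.0 - v) ** 3

def psi(u, c):
    return np.where(np.abs(u) <= c,
                    6.0 * u * (1.0 - (u / c) ** 2) ** 2 / c ** 2, 0.0)

def m_scale(r, c1=1.547, b=0.5, n_iter=100):
    """Fixed-point solution of (1/m) sum rho1(r_i/sigma) = b (eq. 5)."""
    sigma = np.median(np.abs(r)) / 0.6745
    for _ in range(n_iter):
        sigma = sigma * np.sqrt(np.mean(rho(r / sigma, c1)) / b)
    return sigma

def weights(r, sigma_m, c1=1.547, c2=6.08):
    """IRLS weights of eq. (14), built from (12) and (15)."""
    m = len(r)
    rt = r / sigma_m
    W = (np.sum(2.0 * rho(rt, c2) - psi(rt, c2) * rt)
         / np.sum(psi(rt, c1) * rt))                    # eq. (12)
    psi_tau = W * psi(rt, c1) + psi(rt, c2)             # eq. (15)
    limit = W * 6.0 / c1 ** 2 + 6.0 / c2 ** 2           # psi_tau(u)/u, u -> 0
    w = np.divide(psi_tau, rt, out=np.full_like(rt, limit), where=(rt != 0.0))
    return w / (2.0 * m)                                # eq. (14)

def regularized_irls(y, A, lam, eps=1e-8, x0=None, K=200):
    """Algorithm 1: iterate weighted ridge solves until x stops moving."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else np.asarray(x0, float).copy()
    for _ in range(K):
        r = y - A @ x
        w = weights(r, m_scale(r))
        Aw = A * w[:, None]                             # diag(w) @ A
        x_new = np.linalg.solve(A.T @ Aw + lam * np.eye(n), Aw.T @ y)
        if np.linalg.norm(x_new - x) < eps:
            return x_new
        x = x_new
    return x
```

A multi-start wrapper in the spirit of Algorithm 2 would simply call regularized_irls from several random initial conditions and keep the solution with the smallest τ_R² value.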
Note that in the second step, all M solutions are forced to run to complete convergence by setting K = ∞. From these M converged solutions, the best one is selected as the final output.

4. NUMERICAL EXPERIMENTS

In this section, we compare the performance of the regularized τ-estimator with that of the regularized LS-estimator and the regularized M-estimator. The simulation setup is as follows: we generate a matrix A with random iid Gaussian entries and a large condition number. We use the same piece-wise constant source x for all the experiments. Additive outliers and Gaussian noise are included in generating y:

y = Ax + n_G + n_o,  (16)

where n_G ∈ R^m has random iid Gaussian entries. On the other hand, n_o ∈ R^m is a sparse vector. Its few non-zero, randomly selected entries have a random uniform value with zero mean and variance equal to 10 times that of Ax. These entries represent the outliers. With y and A as input, x is estimated with the three different estimators. To show the efficiency and robustness of the estimators in terms of the norm of the error e = x − x̂, several experiments with different percentages of outliers (0%, 10%, 20%, 30%, 40%) are carried out. For the loss function ρ(t) of the M-estimator, we choose the Huber function [17], while for the τ-estimator we choose the so-called optimal weight function with clipping parameters c₁ and c₂. They are plotted in Figure 1. These loss functions produce a τ-estimator with BP = 0.5 and an efficiency of 95% [15]. In each realization, the optimal λ was found experimentally and used in the experiment. Figure 2 shows the average of ‖e‖ evaluated over 100 Monte-Carlo runs. We can observe in Figure 2 that the ‖e‖ of the regularized LS-estimator and regularized M-estimator strongly increases already for 10% of outliers. The regularized τ-estimator, on the other hand, provides superior robustness, even up to 40% of outliers. The LS-estimator, which coincides with the ML-estimator, provides the best

performance in the outlier-free case. The τ-estimator has high relative efficiency, producing estimates which are quite close to the LS-estimates.

Fig. 1. Optimal weight [15] loss functions ρ₁(t) (solid line) and ρ₂(t) (dashed line) used in the regularized τ-estimator.

Fig. 2. Average of ‖e‖ produced by the regularized LS-estimator, M-estimator, and τ-estimator evaluated over 100 Monte-Carlo runs.

In order to demonstrate how the regularization affects the robustness of the estimators, we run the same experiment, but using a matrix A with a low condition number of the order of 10. Then, x is estimated using the non-regularized LS-estimator, M-estimator, and τ-estimator, respectively. Results are shown in Figure 3. First, we note that the errors in the non-regularized case are, in general, one order of magnitude larger than in the regularized case. This is due to the regularization limiting the maximum error: we chose the optimum regularization parameter, and the maximum error is obtained when the parameter is large enough to make x̂ = 0. Thus, the maximum ‖e‖ = ‖x‖ = 3.6 for the particular source we chose. This intuitively represents the BP of the regularized estimator. In the regularized case, the LS-estimator saturates to this maximum ‖e‖, while the M-estimator remains under this bound; its performance is much better than in the non-regularized case. The τ-estimator performs well up to 40% of outliers in the regularized case, but in the non-regularized case it is not robust for 40% of outliers. From these results, we can conclude that the regularization improves the robustness of the estimation. It includes in the problem more a priori information about the solution, which permits the estimator to compute a solution closer to the real one, and to discard unlikely solutions which would cause a breakdown.

Fig. 3. Average of ‖e‖ produced by the non-regularized LS-estimator, M-estimator, and τ-estimator evaluated over 200 Monte-Carlo runs.
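The measurement model (16) can be simulated with a short sketch. This is our reading of the setup above (in particular, that the outlier variance is 10 times that of the clean signal Ax); all names and defaults are illustrative.

```python
import numpy as np

def make_measurements(A, x, noise_std=0.1, outlier_frac=0.2,
                      outlier_var_ratio=10.0, rng=None):
    """Simulate y = A x + n_G + n_o (eq. 16): dense iid Gaussian noise n_G
    plus a sparse outlier vector n_o whose few non-zero entries are
    zero-mean uniform with variance outlier_var_ratio times that of A x."""
    rng = np.random.default_rng(rng)
    m = A.shape[0]
    clean = A @ x
    n_g = noise_std * rng.standard_normal(m)
    n_o = np.zeros(m)
    k = int(round(outlier_frac * m))
    idx = rng.choice(m, size=k, replace=False)
    # Var(U[-a, a]) = a^2 / 3, so pick a to match the target variance
    a = np.sqrt(3.0 * outlier_var_ratio * np.var(clean))
    n_o[idx] = rng.uniform(-a, a, size=k)
    return clean + n_g + n_o
```

Sweeping outlier_frac over 0 to 0.4 reproduces the kind of contamination levels used in the experiments above.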
5. CONCLUSIONS

In this paper, we proposed a new regularized robust estimator that is simultaneously highly robust against outliers, highly efficient in the presence of purely Gaussian noise, and stable when the mixing matrix has a large condition number. A large part of this paper was dedicated to deriving a fast way of computing this estimator, based on a regularized iterative reweighted least squares algorithm. We showed that our estimator provides superior robustness compared to existing regularized estimators, even up to 40% of outliers, while maintaining a performance quite close to that of the optimal maximum likelihood estimator in the outlier-free case. The proposed estimator seems well suited for inverse transport modelling of atmospheric emissions, where the mixing matrix may also contain outliers. This real-data application will be part of our future work. Furthermore, fundamental robustness concepts, such as the breakdown point, the maximum bias curve, and the influence function, are influenced by the regularization. We showed via numerical experiments that a breakdown occurs when the norm of the errors converges to a fixed value that depends on the regularization. A full statistical robustness analysis of the proposed estimator will provide new insights into the interrelation of robustness and regularization.

6. REFERENCES

[1] A. Ribes and F. Schmitt, "Linear inverse problems in imaging," IEEE Signal Process. Mag.
[2] R. H. Stolt and A. B. Weglein, Seismic Imaging and Inversion, Cambridge University Press.
[3] M. Martinez-Camara, I. Dokmanic, J. Ranieri, R. Scheibler, M. Vetterli, and A. Stohl, "The Fukushima inverse problem," in Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP), May 2013.
[4] P. J. Rousseeuw and A. M. Leroy, Robust Regression and Outlier Detection, John Wiley & Sons, Ltd.
[5] A. M. Zoubir, V. Koivunen, Y. Chakhchoukh, and M. Muma, "Robust estimation in signal processing: a tutorial-style treatment of fundamental concepts," IEEE Signal Process. Mag., vol. 29, no. 4.
[6] P. J. Huber, "Robust estimation of a location parameter," Ann. Math. Statist., vol. 35, no. 1.
[7] P. J. Rousseeuw, "Least median of squares regression," J. Am. Statist. Assoc., vol. 79.
[8] P. J. Rousseeuw and V. J. Yohai, "Robust regression by means of S-estimators," Lect. Notes Stat., vol. 26.
[9] V. J. Yohai and R. H. Zamar, "High breakdown-point estimates of regression by means of the minimization of an efficient scale," J. Amer. Statist. Assoc., vol. 83, no. 402.
[10] A. N. Tikhonov and V. Y. Arsenin, Solution of Ill-Posed Problems, Winston-Wiley, New York.
[11] M. Martinez-Camara, B. Béjar Haro, A. Stohl, and M. Vetterli, "A robust method for inverse transport modelling of atmospheric emissions using blind outlier detection," Geosci. Model Dev. Discuss., vol. 7, no. 3.
[12] M. J. Silvapulle, "Robust ridge regression based on an M-estimator," Aust. J. Stat., vol. 33, no. 3.
[13] K. Tharmaratnam, G. Claeskens, C. Croux, and M. Salibian-Barrera, "S-estimation for penalised regression splines," J. Comput. Graph. Stat., vol. 19.
[14] R. A. Maronna, "Robust ridge regression for high-dimensional data," Technometrics, vol. 53, no. 1.
[15] M. Salibian-Barrera, G. Willems, and R. H. Zamar, "The fast-tau estimator for regression," J. Comput. Graph. Stat., vol. 17.
[16] R. Wolke and H. Schwetlick, "Iteratively reweighted least squares: Algorithms, convergence analysis, and numerical comparisons," SIAM J. Sci. and Stat. Comput., vol. 9, no. 5.
[17] P. J. Huber and E. M. Ronchetti, Robust Statistics, 2nd ed., John Wiley & Sons.


More information

A Note on Scheduling Tall/Small Multiprocessor Tasks with Unit Processing Time to Minimize Maximum Tardiness

A Note on Scheduling Tall/Small Multiprocessor Tasks with Unit Processing Time to Minimize Maximum Tardiness A Note on Scheduling Tall/Sall Multiprocessor Tasks with Unit Processing Tie to Miniize Maxiu Tardiness Philippe Baptiste and Baruch Schieber IBM T.J. Watson Research Center P.O. Box 218, Yorktown Heights,

More information

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical IEEE TRANSACTIONS ON INFORMATION THEORY Large Alphabet Source Coding using Independent Coponent Analysis Aichai Painsky, Meber, IEEE, Saharon Rosset and Meir Feder, Fellow, IEEE arxiv:67.7v [cs.it] Jul

More information

The proofs of Theorem 1-3 are along the lines of Wied and Galeano (2013).

The proofs of Theorem 1-3 are along the lines of Wied and Galeano (2013). A Appendix: Proofs The proofs of Theore 1-3 are along the lines of Wied and Galeano (2013) Proof of Theore 1 Let D[d 1, d 2 ] be the space of càdlàg functions on the interval [d 1, d 2 ] equipped with

More information

Boosting with log-loss

Boosting with log-loss Boosting with log-loss Marco Cusuano-Towner Septeber 2, 202 The proble Suppose we have data exaples {x i, y i ) i =... } for a two-class proble with y i {, }. Let F x) be the predictor function with the

More information

OBJECTIVES INTRODUCTION

OBJECTIVES INTRODUCTION M7 Chapter 3 Section 1 OBJECTIVES Suarize data using easures of central tendency, such as the ean, edian, ode, and idrange. Describe data using the easures of variation, such as the range, variance, and

More information

Optimum Value of Poverty Measure Using Inverse Optimization Programming Problem

Optimum Value of Poverty Measure Using Inverse Optimization Programming Problem International Journal of Conteporary Matheatical Sciences Vol. 14, 2019, no. 1, 31-42 HIKARI Ltd, www.-hikari.co https://doi.org/10.12988/ijcs.2019.914 Optiu Value of Poverty Measure Using Inverse Optiization

More information

Recovering Data from Underdetermined Quadratic Measurements (CS 229a Project: Final Writeup)

Recovering Data from Underdetermined Quadratic Measurements (CS 229a Project: Final Writeup) Recovering Data fro Underdeterined Quadratic Measureents (CS 229a Project: Final Writeup) Mahdi Soltanolkotabi Deceber 16, 2011 1 Introduction Data that arises fro engineering applications often contains

More information

Extension of CSRSM for the Parametric Study of the Face Stability of Pressurized Tunnels

Extension of CSRSM for the Parametric Study of the Face Stability of Pressurized Tunnels Extension of CSRSM for the Paraetric Study of the Face Stability of Pressurized Tunnels Guilhe Mollon 1, Daniel Dias 2, and Abdul-Haid Soubra 3, M.ASCE 1 LGCIE, INSA Lyon, Université de Lyon, Doaine scientifique

More information

Figure 1: Equivalent electric (RC) circuit of a neurons membrane

Figure 1: Equivalent electric (RC) circuit of a neurons membrane Exercise: Leaky integrate and fire odel of neural spike generation This exercise investigates a siplified odel of how neurons spike in response to current inputs, one of the ost fundaental properties of

More information

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE Proceeding of the ASME 9 International Manufacturing Science and Engineering Conference MSEC9 October 4-7, 9, West Lafayette, Indiana, USA MSEC9-8466 MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL

More information

Kernel Methods and Support Vector Machines

Kernel Methods and Support Vector Machines Intelligent Systes: Reasoning and Recognition Jaes L. Crowley ENSIAG 2 / osig 1 Second Seester 2012/2013 Lesson 20 2 ay 2013 Kernel ethods and Support Vector achines Contents Kernel Functions...2 Quadratic

More information

Gradient-Based Cuckoo Search for Global Optimization

Gradient-Based Cuckoo Search for Global Optimization Instituto Tecnologico de Aguascalientes Fro the SelectedWorks of Adrian Bonilla-Petriciolet 4 Gradient-Based Cuckoo Search for Global Optiization Seif E.K. Fateen, Cairo University Adrian Bonilla-Petriciolet

More information

Tracking using CONDENSATION: Conditional Density Propagation

Tracking using CONDENSATION: Conditional Density Propagation Tracking using CONDENSATION: Conditional Density Propagation Goal Model-based visual tracking in dense clutter at near video frae rates M. Isard and A. Blake, CONDENSATION Conditional density propagation

More information

On the Use of A Priori Information for Sparse Signal Approximations

On the Use of A Priori Information for Sparse Signal Approximations ITS TECHNICAL REPORT NO. 3/4 On the Use of A Priori Inforation for Sparse Signal Approxiations Oscar Divorra Escoda, Lorenzo Granai and Pierre Vandergheynst Signal Processing Institute ITS) Ecole Polytechnique

More information

Optimal nonlinear Bayesian experimental design: an application to amplitude versus offset experiments

Optimal nonlinear Bayesian experimental design: an application to amplitude versus offset experiments Geophys. J. Int. (23) 155, 411 421 Optial nonlinear Bayesian experiental design: an application to aplitude versus offset experients Jojanneke van den Berg, 1, Andrew Curtis 2,3 and Jeannot Trapert 1 1

More information

Combining Classifiers

Combining Classifiers Cobining Classifiers Generic ethods of generating and cobining ultiple classifiers Bagging Boosting References: Duda, Hart & Stork, pg 475-480. Hastie, Tibsharini, Friedan, pg 246-256 and Chapter 10. http://www.boosting.org/

More information

Best Linear Unbiased and Invariant Reconstructors for the Past Records

Best Linear Unbiased and Invariant Reconstructors for the Past Records BULLETIN of the MALAYSIAN MATHEMATICAL SCIENCES SOCIETY http:/athusy/bulletin Bull Malays Math Sci Soc (2) 37(4) (2014), 1017 1028 Best Linear Unbiased and Invariant Reconstructors for the Past Records

More information

THE KALMAN FILTER: A LOOK BEHIND THE SCENE

THE KALMAN FILTER: A LOOK BEHIND THE SCENE HE KALMA FILER: A LOOK BEHID HE SCEE R.E. Deain School of Matheatical and Geospatial Sciences, RMI University eail: rod.deain@rit.edu.au Presented at the Victorian Regional Survey Conference, Mildura,

More information

Machine Learning Basics: Estimators, Bias and Variance

Machine Learning Basics: Estimators, Bias and Variance Machine Learning Basics: Estiators, Bias and Variance Sargur N. srihari@cedar.buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Basics

More information

Pattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition

Pattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lesson 1 4 October 2017 Outline Learning and Evaluation for Pattern Recognition Notation...2 1. The Pattern Recognition

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lessons 7 20 Dec 2017 Outline Artificial Neural networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

Approximation in Stochastic Scheduling: The Power of LP-Based Priority Policies

Approximation in Stochastic Scheduling: The Power of LP-Based Priority Policies Approxiation in Stochastic Scheduling: The Power of -Based Priority Policies Rolf Möhring, Andreas Schulz, Marc Uetz Setting (A P p stoch, r E( w and (B P p stoch E( w We will assue that the processing

More information

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are,

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are, Page of 8 Suppleentary Materials: A ultiple testing procedure for ulti-diensional pairwise coparisons with application to gene expression studies Anjana Grandhi, Wenge Guo, Shyaal D. Peddada S Notations

More information

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions Journal of Matheatical Research with Applications Jul., 207, Vol. 37, No. 4, pp. 496 504 DOI:0.3770/j.issn:2095-265.207.04.0 Http://jre.dlut.edu.cn An l Regularized Method for Nuerical Differentiation

More information

ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS. A Thesis. Presented to. The Faculty of the Department of Mathematics

ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS. A Thesis. Presented to. The Faculty of the Department of Mathematics ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS A Thesis Presented to The Faculty of the Departent of Matheatics San Jose State University In Partial Fulfillent of the Requireents

More information

Interactive Markov Models of Evolutionary Algorithms

Interactive Markov Models of Evolutionary Algorithms Cleveland State University EngagedScholarship@CSU Electrical Engineering & Coputer Science Faculty Publications Electrical Engineering & Coputer Science Departent 2015 Interactive Markov Models of Evolutionary

More information

Examining the Effects of Site Selection Criteria for Evaluating the Effectiveness of Traffic Safety Countermeasures

Examining the Effects of Site Selection Criteria for Evaluating the Effectiveness of Traffic Safety Countermeasures Exaining the Effects of Site Selection Criteria for Evaluating the Effectiveness of Traffic Safety Countereasures Doinique Lord* Associate Professor and Zachry Developent Professor I Zachry Departent of

More information

Recovery of Sparsely Corrupted Signals

Recovery of Sparsely Corrupted Signals TO APPEAR IN IEEE TRANSACTIONS ON INFORMATION TEORY 1 Recovery of Sparsely Corrupted Signals Christoph Studer, Meber, IEEE, Patrick Kuppinger, Student Meber, IEEE, Graee Pope, Student Meber, IEEE, and

More information

Lower Bounds for Quantized Matrix Completion

Lower Bounds for Quantized Matrix Completion Lower Bounds for Quantized Matrix Copletion Mary Wootters and Yaniv Plan Departent of Matheatics University of Michigan Ann Arbor, MI Eail: wootters, yplan}@uich.edu Mark A. Davenport School of Elec. &

More information

On Constant Power Water-filling

On Constant Power Water-filling On Constant Power Water-filling Wei Yu and John M. Cioffi Electrical Engineering Departent Stanford University, Stanford, CA94305, U.S.A. eails: {weiyu,cioffi}@stanford.edu Abstract This paper derives

More information

A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds

A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds 646 IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, VOL 6, NO 3, MARCH 05 A Unified Approach to Universal Prediction: Generalized Upper and Lower Bounds Nuri Denizcan Vanli and Suleyan S Kozat,

More information

Adapting the Pheromone Evaporation Rate in Dynamic Routing Problems

Adapting the Pheromone Evaporation Rate in Dynamic Routing Problems Adapting the Pheroone Evaporation Rate in Dynaic Routing Probles Michalis Mavrovouniotis and Shengxiang Yang School of Coputer Science and Inforatics, De Montfort University The Gateway, Leicester LE1

More information

A Poisson process reparameterisation for Bayesian inference for extremes

A Poisson process reparameterisation for Bayesian inference for extremes A Poisson process reparaeterisation for Bayesian inference for extrees Paul Sharkey and Jonathan A. Tawn Eail: p.sharkey1@lancs.ac.uk, j.tawn@lancs.ac.uk STOR-i Centre for Doctoral Training, Departent

More information

Bisection Method 8/11/2010 1

Bisection Method 8/11/2010 1 Bisection Method 8/11/010 1 Bisection Method Basis of Bisection Method Theore An equation f()=0, where f() is a real continuous function, has at least one root between l and u if f( l ) f( u ) < 0. f()

More information

OPTIMIZATION in multi-agent networks has attracted

OPTIMIZATION in multi-agent networks has attracted Distributed constrained optiization and consensus in uncertain networks via proxial iniization Kostas Margellos, Alessandro Falsone, Sione Garatti and Maria Prandini arxiv:603.039v3 [ath.oc] 3 May 07 Abstract

More information

A Low-Complexity Congestion Control and Scheduling Algorithm for Multihop Wireless Networks with Order-Optimal Per-Flow Delay

A Low-Complexity Congestion Control and Scheduling Algorithm for Multihop Wireless Networks with Order-Optimal Per-Flow Delay A Low-Coplexity Congestion Control and Scheduling Algorith for Multihop Wireless Networks with Order-Optial Per-Flow Delay Po-Kai Huang, Xiaojun Lin, and Chih-Chun Wang School of Electrical and Coputer

More information

13.2 Fully Polynomial Randomized Approximation Scheme for Permanent of Random 0-1 Matrices

13.2 Fully Polynomial Randomized Approximation Scheme for Permanent of Random 0-1 Matrices CS71 Randoness & Coputation Spring 018 Instructor: Alistair Sinclair Lecture 13: February 7 Disclaier: These notes have not been subjected to the usual scrutiny accorded to foral publications. They ay

More information

An Approximate Model for the Theoretical Prediction of the Velocity Increase in the Intermediate Ballistics Period

An Approximate Model for the Theoretical Prediction of the Velocity Increase in the Intermediate Ballistics Period An Approxiate Model for the Theoretical Prediction of the Velocity... 77 Central European Journal of Energetic Materials, 205, 2(), 77-88 ISSN 2353-843 An Approxiate Model for the Theoretical Prediction

More information

Bayesian Approach for Fatigue Life Prediction from Field Inspection

Bayesian Approach for Fatigue Life Prediction from Field Inspection Bayesian Approach for Fatigue Life Prediction fro Field Inspection Dawn An and Jooho Choi School of Aerospace & Mechanical Engineering, Korea Aerospace University, Goyang, Seoul, Korea Srira Pattabhiraan

More information

8.1 Force Laws Hooke s Law

8.1 Force Laws Hooke s Law 8.1 Force Laws There are forces that don't change appreciably fro one instant to another, which we refer to as constant in tie, and forces that don't change appreciably fro one point to another, which

More information

Ensemble Based on Data Envelopment Analysis

Ensemble Based on Data Envelopment Analysis Enseble Based on Data Envelopent Analysis So Young Sohn & Hong Choi Departent of Coputer Science & Industrial Systes Engineering, Yonsei University, Seoul, Korea Tel) 82-2-223-404, Fax) 82-2- 364-7807

More information

A general forulation of the cross-nested logit odel Michel Bierlaire, Dpt of Matheatics, EPFL, Lausanne Phone: Fax:

A general forulation of the cross-nested logit odel Michel Bierlaire, Dpt of Matheatics, EPFL, Lausanne Phone: Fax: A general forulation of the cross-nested logit odel Michel Bierlaire, EPFL Conference paper STRC 2001 Session: Choices A general forulation of the cross-nested logit odel Michel Bierlaire, Dpt of Matheatics,

More information

Novel Realization of Adaptive Sparse Sensing With Sparse Least Mean Fourth Algorithms

Novel Realization of Adaptive Sparse Sensing With Sparse Least Mean Fourth Algorithms ovel Realization of Adaptive Sparse Sensing With Sparse Least ean Fourth Algoriths Guan Gui Li Xu Xiao-ei Zhu and Zhang-in Chen 3. Dept. of Electronics and Inforation Systes Akita Prefectural University

More information

A Self-Organizing Model for Logical Regression Jerry Farlow 1 University of Maine. (1900 words)

A Self-Organizing Model for Logical Regression Jerry Farlow 1 University of Maine. (1900 words) 1 A Self-Organizing Model for Logical Regression Jerry Farlow 1 University of Maine (1900 words) Contact: Jerry Farlow Dept of Matheatics Univeristy of Maine Orono, ME 04469 Tel (07) 866-3540 Eail: farlow@ath.uaine.edu

More information

AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS

AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS Statistica Sinica 6 016, 1709-178 doi:http://dx.doi.org/10.5705/ss.0014.0034 AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS Nilabja Guha 1, Anindya Roy, Yaakov Malinovsky and Gauri

More information

EMPIRICAL COMPLEXITY ANALYSIS OF A MILP-APPROACH FOR OPTIMIZATION OF HYBRID SYSTEMS

EMPIRICAL COMPLEXITY ANALYSIS OF A MILP-APPROACH FOR OPTIMIZATION OF HYBRID SYSTEMS EMPIRICAL COMPLEXITY ANALYSIS OF A MILP-APPROACH FOR OPTIMIZATION OF HYBRID SYSTEMS Jochen Till, Sebastian Engell, Sebastian Panek, and Olaf Stursberg Process Control Lab (CT-AST), University of Dortund,

More information

Computational and Statistical Learning Theory

Computational and Statistical Learning Theory Coputational and Statistical Learning Theory TTIC 31120 Prof. Nati Srebro Lecture 2: PAC Learning and VC Theory I Fro Adversarial Online to Statistical Three reasons to ove fro worst-case deterinistic

More information

Forecasting Financial Indices: The Baltic Dry Indices

Forecasting Financial Indices: The Baltic Dry Indices International Journal of Maritie, Trade & Econoic Issues pp. 109-130 Volue I, Issue (1), 2013 Forecasting Financial Indices: The Baltic Dry Indices Eleftherios I. Thalassinos 1, Mike P. Hanias 2, Panayiotis

More information

Asynchronous Gossip Algorithms for Stochastic Optimization

Asynchronous Gossip Algorithms for Stochastic Optimization Asynchronous Gossip Algoriths for Stochastic Optiization S. Sundhar Ra ECE Dept. University of Illinois Urbana, IL 680 ssrini@illinois.edu A. Nedić IESE Dept. University of Illinois Urbana, IL 680 angelia@illinois.edu

More information

A remark on a success rate model for DPA and CPA

A remark on a success rate model for DPA and CPA A reark on a success rate odel for DPA and CPA A. Wieers, BSI Version 0.5 andreas.wieers@bsi.bund.de Septeber 5, 2018 Abstract The success rate is the ost coon evaluation etric for easuring the perforance

More information

Bayes Decision Rule and Naïve Bayes Classifier

Bayes Decision Rule and Naïve Bayes Classifier Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2016 Lessons 7 14 Dec 2016 Outline Artificial Neural networks Notation...2 1. Introduction...3... 3 The Artificial

More information

Robust polynomial regression up to the information theoretic limit

Robust polynomial regression up to the information theoretic limit 58th Annual IEEE Syposiu on Foundations of Coputer Science Robust polynoial regression up to the inforation theoretic liit Daniel Kane Departent of Coputer Science and Engineering / Departent of Matheatics

More information

Why it was necessary? Geodetic Datums. Type of Geodetic Datum. Nothing but spheroid of origin. We need. Local Datum Definitions in Dubai Emirate

Why it was necessary? Geodetic Datums. Type of Geodetic Datum. Nothing but spheroid of origin. We need. Local Datum Definitions in Dubai Emirate Why it was necessary? Dubai Eirate Between Clark880 and Spheroid.Al Marzooqi, H.Fashir, Syed Iliyas Ahed ( Dubai, United Arab Eirates ) About 9000 control points were available on Clark880 UTM Whole ilitary

More information

DESIGN OF THE DIE PROFILE FOR THE INCREMENTAL RADIAL FORGING PROCESS *

DESIGN OF THE DIE PROFILE FOR THE INCREMENTAL RADIAL FORGING PROCESS * IJST, Transactions of Mechanical Engineering, Vol. 39, No. M1, pp 89-100 Printed in The Islaic Republic of Iran, 2015 Shira University DESIGN OF THE DIE PROFILE FOR THE INCREMENTAL RADIAL FORGING PROCESS

More information

An Inverse Interpolation Method Utilizing In-Flight Strain Measurements for Determining Loads and Structural Response of Aerospace Vehicles

An Inverse Interpolation Method Utilizing In-Flight Strain Measurements for Determining Loads and Structural Response of Aerospace Vehicles An Inverse Interpolation Method Utilizing In-Flight Strain Measureents for Deterining Loads and Structural Response of Aerospace Vehicles S. Shkarayev and R. Krashantisa University of Arizona, Tucson,

More information

Kernel-Based Nonparametric Anomaly Detection

Kernel-Based Nonparametric Anomaly Detection Kernel-Based Nonparaetric Anoaly Detection Shaofeng Zou Dept of EECS Syracuse University Eail: szou@syr.edu Yingbin Liang Dept of EECS Syracuse University Eail: yliang6@syr.edu H. Vincent Poor Dept of

More information