KNN Particle Filters for Dynamic Hybrid Bayesian Networks


KNN Particle Filters for Dynamic Hybrid Bayesian Networks

H. D. Chen and K. C. Chang
Dept. of Systems Engineering and Operations Research
George Mason University, MS 4A6, 4400 University Dr., Fairfax, VA

Abstract - In state estimation of dynamic systems, sequential Monte Carlo methods, also known as particle filters, have been introduced to deal with practical problems in nonlinear, non-Gaussian situations. They allow us to treat any type of probability distribution, nonlinearity, and non-stationarity, although they usually suffer the major drawbacks of sample degeneracy and inefficiency in high-dimensional cases. In this paper, we show how the structure of partially dynamic hybrid Bayesian networks (PD-HBN) can be exploited to reduce sample depletion and increase the efficiency of particle filtering, by combining the well-known KNN majority-voting strategy with the concept of evolutionary algorithms. Essentially, the novel method re-samples part of the variables and randomly combines them with the existing samples of the other variables to produce new particles. As new observations become available, the algorithm allows the particles to incorporate the latest information, so that the top K fittest particles according to a proposed objective rule are kept for re-sampling. With simulations, we show that this new approach has superior estimation/classification performance compared to other related algorithms.

Keywords: Dynamic Bayesian Networks, Hybrid Bayesian Networks, and Particle Filters.

1 Introduction

Bayesian networks represent a probability distribution using the graphical model of a directed acyclic graph (DAG). Every node in the graph corresponds to a random variable in the domain and is annotated with a conditional probability distribution (CPD) defining the conditional distribution of the variable given its parents. Purely discrete networks are the most popular ones used in practice. However, discrete networks are inadequate, since many practical problems involve continuous attributes as well as discrete ones.
Hybrid Bayesian networks (HBN) contain both discrete and continuous nodes [1] and are general in the sense that they allow arbitrary relationships between any random variables in the network. Dynamic systems in practical applications usually involve HBN models. Examples include vehicle navigation, target tracking, parameter estimation, and system identification and/or classification. This paper focuses specifically on HBN dynamic systems that can be described by a partially dynamic HBN model (PD-HBN), in which not all nodes in the network are dynamic (see Figure 1).

Figure 1. A Systematic Model of PD-HBN.

The hidden system state S_k (including both dynamic and static nodes) evolves over time, with initial distribution p(S_0), as an indirectly or partially observed first-order Markov process according to the conditional probability density p(S_k | S_{k-1}). The observations {e_k} are conditionally independent given the state and behave according to the probability density p(e_k | S_k). In general, the PD-HBN can be described by the following dynamic equations:

S_k = psi(S_{k-1}, U_{k-1})   (1)
e_k = h(S_k, W_k)   (2)

where U_k denotes the process noise vector that drives the dynamic system through the state transition function psi(.), and W_k is the measurement noise vector corrupting the observations of the state through the function h(.). The state transition density p(S_k | S_{k-1}) is fully specified by psi(.) and the process noise distribution p(U_k), whereas h(.) and the observation noise distribution p(W_k) fully specify the observation likelihood p(e_k | S_k).

One of the most fundamental issues for Bayesian networks is the problem of probabilistic inference. Given a Bayesian network that represents a joint probability distribution over the variables involved, inference is the task of computing the probability distribution over one set of random variables given a second set as evidence. For example, in Figure 1, the posterior density p(S_k | E_k) of the state given all the observations E_k = {e_1, e_2, ..., e_k} constitutes the complete solution to the sequential probabilistic inference problem. The optimal method for recursively updating the posterior probability density as new observations arrive is given by the recursive Bayesian estimation algorithm [2]. First, one computes the prior p(S_k | E_{k-1}) based on the dynamic model:

p(S_k | E_{k-1}) = Integral p(S_{k-1} | E_{k-1}) p(S_k | S_{k-1}) dS_{k-1}   (3)

Then one incorporates the latest noisy measurement using the observation likelihood to compute the updated posterior:

p(S_k | E_k) = p(S_k | E_{k-1}) p(e_k | S_k) / Integral p(S_k | E_{k-1}) p(e_k | S_k) dS_k   (4)

Although this is the optimal recursive solution, it is usually tractable only for linear Gaussian systems, in which case the closed-form recursive solution is the well-known Kalman filter. For most general real-world (nonlinear, non-Gaussian) systems, however, the multi-dimensional integrals are intractable and approximate solutions must be used [1]. One approach is to discretize the continuous variables by partitioning their domains into a finite number of subsets [4]. However, this simple approach is often very problematic and might lead to unacceptable performance in both computation and accuracy [5]. Recently, a popular solution strategy for the general filtering problem has been to use sequential Monte Carlo methods, also known as particle filters (PF) [4], which allow a complete representation of the posterior distribution of the states, so that any statistical estimate, such as the mean or variance, can easily be computed. They can deal with any nonlinearity and/or distribution.
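For intuition, the recursion (3)-(4) can be run directly on a small discretized state space; the transition matrix and likelihood values below are illustrative only, not taken from the paper's network:

```python
import numpy as np

def predict(prior, transition):
    # Eq. (3): p(S_k | E_{k-1}) = sum_{S_{k-1}} p(S_{k-1} | E_{k-1}) p(S_k | S_{k-1})
    # transition[i, j] = p(S_k = j | S_{k-1} = i), rows sum to 1
    return transition.T @ prior

def update(predicted, likelihood):
    # Eq. (4): multiply by the observation likelihood p(e_k | S_k) and renormalize
    posterior = predicted * likelihood
    return posterior / posterior.sum()

# toy example: 3 discrete states, near-diagonal random-walk transition
T = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
p = np.array([1/3, 1/3, 1/3])        # uniform p(S_0)
lik = np.array([0.9, 0.05, 0.05])    # p(e_1 | S_1), evidence favoring state 0
p = update(predict(p, T), lik)       # posterior after one prediction/update cycle
```

For continuous states the sum in `predict` becomes the integral of (3), which is exactly what particle methods approximate by sampling.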
However, particle filters rely on importance sampling and, as a result, require the design of appropriate proposal distributions that can approximate the posterior distribution reasonably well. In general, it is hard to design such an importance distribution, and without special correction strategies particle depletion is unavoidable in many situations. This is particularly the case for PD-HBN, due to their hybrid nature. To overcome this problem, several techniques have been proposed in the literature, such as pruning and enrichment to throw out bad and boost good particles [9], directed enrichment [4], and mutation (kernel smoothing) [7]. In this paper, we focus on the inference problem for PD-HBN using a novel re-sampling approach based on a K-nearest-neighbors (KNN) decision strategy. KNN is a non-parametric classification method, simple but effective in many cases. It is also a memory-based model defined by a set of examples for which the outcomes are known (i.e., the examples are labeled). Each example consists of a data case with a set of independent values labeled by a set of dependent outcomes; the independent and dependent variables can be either continuous or categorical. A majority vote among the data records in the neighborhood then decides the classification of the test data, with or without distance-based weighting.

2 K-Nearest-Neighbors Decision

KNN is a case-based learning method, which keeps all the training data for decision applications. To employ the KNN decision strategy, one needs to define a metric for measuring the distance between the test data and the training samples. One of the most popular choices for this distance is the Euclidean metric. For instance, consider the task of classifying a new test object (2-dimensional data) among m classes, each with n training samples. There are then a total of mn corresponding Euclidean distances.
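A minimal Python sketch of this distance-plus-majority-vote rule, on illustrative 2-D data (distance-based weighting omitted):

```python
import numpy as np
from collections import Counter

def knn_classify(test_point, train_X, train_y, K):
    # Euclidean distances from the test point to every training sample
    d = np.linalg.norm(train_X - test_point, axis=1)
    nearest = np.argsort(d)[:K]                  # indices of the K nearest
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]            # majority vote

# two illustrative clusters: class 0 near the origin, class 1 near (5, 5)
X = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],
              [5.0, 5.1], [5.2, 4.9]])
y = np.array([0, 0, 0, 1, 1])
label = knn_classify(np.array([0.3, 0.3]), X, y, K=3)
```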
To apply KNN one also needs to choose an appropriate value for K, and the success of the classification depends heavily on this value. In fact, K can be regarded as one of the most important factors in the model, one that can strongly influence the quality of predictions. In general, K should be set to a value large enough to minimize the probability of misclassification, yet small enough that the K nearest points remain close to the test data. There are many ways of choosing the K value; one simple idea is to run the algorithm many times with different K values and choose the one with the best performance.

Figure 2. Illustrative diagram for the KNN algorithm: the 10-nearest-neighbor outcome is a circle, while the 20-nearest-neighbor outcome is a square.

Figure 2 shows a possible distribution of 2-D Euclidean distances. Our task is to estimate the outcome of the test data based on a selected number of its nearest neighbors. In other words, we want to know whether the test data can be classified as one of the depicted marks. To proceed, consider the outcome of KNN based on the 10 nearest neighbors.

It is clear that in this example KNN will classify the test data into the circle group, since circles form the majority of the 10 selected neighbors. If the number of nearest neighbors is increased to 20, KNN instead reports a square, which is the majority among the 20 neighbors shown in the figure. Since KNN predictions are based on the intuitive assumption that objects closer in distance are potentially similar, it makes good sense to discriminate among the K nearest neighbors when making predictions.

3 KNN-PF Re-sampling Algorithm

In this paper, we focus on the inference problem for PD-HBN [7], which contain both discrete and continuous variables. The hybrid model considered is general in the sense that it allows arbitrary relationships between any random variables, as long as they can be expressed in a certain form. In equation (4), the posterior probability distribution constitutes the complete solution to the sequential estimation problem. However, in many applications, such as target tracking and identification, it is difficult to obtain a closed-form solution. In a PD-HBN, one may express the unobserved state as a set S = {F, D, T}, where F and D represent the static and dynamic features in the PD-HBN, and T denotes the target node of interest. For simplicity, we define the dynamic model (1)-(2) as a combined function:

PDBN[F_{k-1}, D_{k-1}, T] := h{psi(F_{k-1}, D_{k-1}, T, U_{k-1}), W_k}   (5)

Assume that there are n discrete state values in the target node T, and that the evidence space has d dimensions, i.e., e_k = [e_k^1, e_k^2, ..., e_k^d]. In traditional Monte Carlo simulation and particle filtering, a set of weighted particles (importance samples), drawn from a proposal distribution, is used to represent the state distribution and is propagated over the dynamic steps. Typically, after a few iterations, one normalized importance weight, or the sum of a few of them, approaches 1, while the remaining weights approach zero.
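The weight-degeneracy effect just described is easy to reproduce. The following toy snippet (with arbitrary random per-step likelihoods, purely for illustration) shows the effective sample size collapsing after a few dozen steps:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1000
w = np.ones(N) / N                   # start with uniform importance weights

for _ in range(30):
    w *= rng.uniform(0.0, 1.0, N)    # accumulate per-step likelihood factors
    w /= w.sum()                     # renormalize

# effective sample size: 1 / sum(w_i^2); equals N for uniform weights,
# and approaches 1 as the mass concentrates on a single particle
ess = 1.0 / np.sum(w ** 2)
```

After 30 multiplicative updates the log-weights spread widely, so only a handful of particles carry numerically significant weight.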
A large number of samples are thus effectively removed from the sample set because their weights become numerically insignificant. This is particularly the case when static discrete variables are present in the model. Many mitigation strategies have been proposed to avoid this degeneration or depletion of samples. Besides the pruning and enrichment techniques, most methods are based on estimating the posterior distributions, such as minimum-variance sampling [12], sampling-importance re-sampling [13], and residual resampling [14]. In this paper, we propose a novel method, termed the K-nearest-neighbor particle filter (KNN-PF), to mitigate this degeneration problem. Instead of considering posterior distributions, we select particles based on KNN decision results. At each iteration, the new particles are constructed by randomly combining the re-sampled particles of the dynamic nodes D with the existing samples of the static nodes F and T. The idea is to randomly permute and cross over the dynamic and static particles in such a way that the particles corresponding to the static variables remain relatively stable, so that the problem of particle depletion is minimized. The detailed algorithm is as follows.

Step 1. Set k = 1, and generate N particles by sampling from the prior of the Bayesian network. Assuming a uniform distribution for the static discrete target node, there will be N/n particles for each target class.

Step 2. Make the KNN decision and select the K best particles based on the squared Euclidean distances to the current observed evidence.

Step 3. If k is the last level of the PD-HBN, stop; else, set k = k + 1 and go to Step 4.

Step 4. Extrapolate the dynamic features of each of the K surviving particles to the next time slice.

Step 5. Generate a new set of N particles, where the newly generated dynamic features from Step 4 are reproduced and randomly combined with the static features of the previous time slice.

Step 6.
Of the newly generated N particles in Step 5, replace the dynamic features of the K particles corresponding to the ones chosen in Step 2 with the extrapolated ones obtained in Step 4. Go to Step 2.

In the algorithm description above, Step 5 introduces a concept similar to an evolutionary algorithm, and Step 6 guarantees that the previously selected good particles are extended to the next dynamic step. This novel method re-samples the dynamic variables and keeps the samples of the static variables unchanged. As a new set of noisy or incomplete observations becomes available, the algorithm allows the particle filter to incorporate the latest observations so that the dynamic variables are updated accordingly. Additionally, in Step 5, the static variables may be partially modified in order to achieve better diversity, especially for a large-dimensional state space. The strategy for re-sampling the static nodes in the PD-HBN is discussed later. Next, we present the squared Euclidean measurement and the KNN strategy in the KNN-PF re-sampling algorithm. Consider two sets of random variables e_k, y_k drawn from the same distribution p(e_k | S_k).

It is obvious that E[y_k - e_k] is zero. According to the law of large numbers, the statistical mean of the difference converges almost surely to its expectation. Moreover, if the variance of p(e_k | S_k) is bounded, then the difference (y_k - e_k) is multivariate zero-mean Gaussian by the central limit theorem. The probability distribution may therefore be expressed as

p(y_k - e_k | S') = (2 pi)^{-d/2} |Sigma|^{-1/2} exp{ -(1/2) (y_k - e_k)^T Sigma^{-1} (y_k - e_k) }   (6)

The new algorithm uses the KNN strategy to select the particles whose state S is as close as possible to the true state S', and then makes the classification decision by majority vote. The majority group is selected from those particles that have the higher likelihoods (6), or, equivalently, the smaller distances (y_k - e_k)^T Sigma^{-1} (y_k - e_k), which is the objective function of our KNN decision strategy. With multiple independent samples over time, the joint density function may be written as

p({y_k - e_k} | S') = C exp{ -(1/2) sum_k (y_k - e_k)^T Sigma^{-1} (y_k - e_k) }   (7)

The constant factor C can be derived directly from (6). More specifically, suppose that there are a total of N particles, each containing a sample in the observation space at the k-th dynamic step, such that

x_k[1] = [x_1^1(k), x_1^2(k), ..., x_1^d(k)]
x_k[2] = [x_2^1(k), x_2^2(k), ..., x_2^d(k)]   (8)
...
x_k[N] = [x_N^1(k), x_N^2(k), ..., x_N^d(k)]

where the observations are based on x_k[i] = h(S', W_k), and y_k = {x_k[1], x_k[2], ..., x_k[N]}. From equation (7), we may recursively calculate the squared Euclidean distance for each particle,

R_k^i = R_{k-1}^i + (x_k[i] - e_k)^T Sigma^{-1} (x_k[i] - e_k)   (9)

where Sigma denotes the covariance of the evidence space. Now we sort the distances (9) in ascending order. Suppose that {lambda_1, lambda_2, ..., lambda_N} is the index set corresponding to R_k^{lambda_1} <= R_k^{lambda_2} <= ... <= R_k^{lambda_N}, respectively. We then pick the top K minimal distances and locate the corresponding dynamic features, namely

Z_k^{q(1:K,k)} = arg min_{i} sort(R_k^i)   (10)

where q(1:K, k) = {lambda_1, lambda_2, ..., lambda_K} represents the index set of the K nearest neighbors at the k-th dynamic step.
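A small Python sketch of the bookkeeping in (9)-(10), using a diagonal evidence covariance; the particle observations here are random placeholders, not samples from the paper's network:

```python
import numpy as np

def knn_select(R_prev, X, e, sigma2, K):
    """Accumulate squared distances per eq. (9) and pick the K nearest per eq. (10).

    R_prev: (N,) distances accumulated so far; X: (N, d) per-particle observations;
    e: (d,) current evidence; sigma2: (d,) diagonal of the evidence covariance.
    """
    R = R_prev + (((X - e) ** 2) / sigma2).sum(axis=1)   # eq. (9), diagonal Sigma
    q = np.argsort(R)[:K]                                # eq. (10): K smallest
    return R, q

rng = np.random.default_rng(1)
N, d, K = 100, 3, 5
X = rng.normal(0.0, 1.0, (N, d))                         # placeholder observations
R, q = knn_select(np.zeros(N), X, np.zeros(d), np.ones(d), K)
```

Because (9) only accumulates, each dynamic step costs one vectorized distance evaluation plus a sort, independent of the history length.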
In the KNN-PF, we start with a fixed number of particles obtained from the prior distribution. We then choose the top K (say 20) particles Z_k^{q(1:K,k)} for decision-making (voting) and also use them for dynamic propagation. The idea is that, assuming there are enough samples of the static variables to start with, we only need to modify the samples of the dynamic nodes. For example, with 10^4 particles, the dynamic parts of the top 20 particles are duplicated and merged randomly with the static particles (1 for 500, and therefore 50 for each target class). This is essentially re-sampling without the static nodes, and it has the flavor of the random mutation and crossover of evolutionary algorithms. Thus, we have

D_k^i = Z_k^{q(l,k)},  l = mod(i, K)   (11)

On the other hand, to ensure diversity, we may choose to re-sample the static nodes partially or wholly as well. Suppose that we re-sample the corresponding static features F for every c-th particle. Dynamically, the static nodes can be expressed as

F_k^i = F_{k-1}^i,  mod(i, c) != 0
F_k^i = F*,         mod(i, c) = 0   (12)

where F* denotes the re-sampled static nodes. In the meantime, in order to guarantee that the top K particles survive into the next time step, we keep and propagate those K particles forward, namely

x_{k+1}^i = PDBN[F_k^i, D_k^i, T^i],  i not in q(1:K, k)
x_{k+1}^i = PDBN[F_k^i, Z_k^i, T^i],  i in q(1:K, k)   (13)

This novel method re-samples the dynamic variables and mixes them with the existing samples of the static variables. This has proven to be an effective way of dealing with sample depletion for the static variables. Note that the static variables can also be re-sampled at a different rate to improve performance.

4 Numerical Results

In this section we present results from a set of experiments that test the efficacy and tradeoffs of our model and the classification algorithms. We designed the hybrid dynamic network shown in Figure 3 for testing. In this example, there are two discrete variables (SDA, SDB), with eight and ten state values respectively, and the remaining variables are continuous.
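Before turning to the numerical results, the recombination and periodic-refresh rules in (11) and (12) above can be sketched as follows; the 8-state prior used to refresh the static samples is an illustrative placeholder:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, c = 1000, 20, 100

Z = rng.normal(0.0, 1.0, K)       # extrapolated dynamic features Z_k (placeholders)
F = rng.integers(0, 8, N)         # existing static samples (8 states, placeholder prior)

# eq. (11): tile the K selected dynamic features across all N particles,
# D_k^i = Z_k^{q(l,k)} with l = mod(i, K)
D_new = Z[np.arange(N) % K]

# eq. (12): refresh every c-th static sample from the prior for diversity
refresh = (np.arange(N) % c) == 0
F[refresh] = rng.integers(0, 8, refresh.sum())
```

With N = 10^4 and c = 100 this refreshes 1% of the static particles per step, matching the setting used in the experiments below, while the other 99% keep their static samples unchanged.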
Assuming conditional linear Gaussian (CLG) distributions, we have conditional Gaussians between continuous and discrete nodes, and linear Gaussians among continuous nodes. The target node of interest is SDB, and the observable evidence nodes are COA, COB, COC, COD, COE, and COF. Figure 3 shows the corresponding PD-HBN over two time slices, where DA and DB are the dynamic nodes and CFA is a dynamic feature that changes over time. However, the static nodes SDA and SDB, as well as the static features CFB, CFC, CFD, and CFE, remain unchanged over time. This Bayes net can

be extended to a multi-slice PD-HBN easily by introducing additional copies of the dynamic nodes and the proper transitional relationships between them [7].

Figure 3. A Two-step PD-HBN Example.

Figure 4. The pdf Curves of the Observation Nodes.

To evaluate the proposed KNN particle filter based inference algorithms, we conducted extensive simulations to examine the computation and performance tradeoffs. The two dynamic nodes, DA and DB, are modeled as two sinusoidal waveforms x = A sin(2 pi f_a t + phi_a) + n_a and y = B sin(2 pi f_b t + phi_b) + n_b, where the additive noises are Gaussian with n_a ~ N(100, 0.2) and n_b ~ N(30, 0.1), respectively. The parameters are given as A = 20, B = 10, f_a = 0.6, and f_b = 1. The sampling interval is assumed to be 0.5. Assuming that the initial phases are uniformly distributed, i.e., phi_a ~ U(0, 2 pi) and phi_b ~ U(0, 2 pi), the probability density functions (pdf) of the 6-D evidence given the 10 different target types are shown in Figure 4. In each subplot, every curve is a pdf corresponding to a particular target type. It should be noted that these pdf curves correspond to the Bayesian network in its initial state, i.e., time slice t = 1. As can be seen from the figures, some of them can be approximated by a single Gaussian, but most require a Gaussian mixture with multiple terms. Since the KNN-PF keeps the surviving particles as the dynamic slices move forward, a precise covariance of the evidence variables in equation (9) is not critical. We may only need to estimate the diagonal covariance to construct the matrix

Sigma = diag(sigma_1^2, sigma_2^2, ..., sigma_d^2)   (14)

Figure 5 shows the average probabilities of correct detection (Pc), based on 1,000 randomly generated evidence sets, using the KNN-PF algorithm under three different scenarios. K is selected as 20, and the number of particles is either 10^4 or 10^5. Additionally, the static nodes are partially re-sampled with parameter c = 100, meaning that 1% (of 10^4) of the static particles are re-sampled.
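For concreteness, the two sinusoidal dynamic nodes can be generated as below. This sketch reads the printed noise terms as N(mean, variance), i.e., N(100, 0.2) and N(30, 0.1), which is an assumption about the notation:

```python
import numpy as np

rng = np.random.default_rng(3)

# parameters as stated in the text
A, B, fa, fb, dt = 20.0, 10.0, 0.6, 1.0, 0.5
steps = 20
t = np.arange(steps) * dt

# uniformly distributed initial phases
phi_a = rng.uniform(0.0, 2.0 * np.pi)
phi_b = rng.uniform(0.0, 2.0 * np.pi)

# additive Gaussian noises, interpreted as N(mean, variance)
x = A * np.sin(2 * np.pi * fa * t + phi_a) + rng.normal(100.0, np.sqrt(0.2), steps)
y = B * np.sin(2 * np.pi * fb * t + phi_b) + rng.normal(30.0, np.sqrt(0.1), steps)
```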
The figure shows that the two cases (with and without re-sampling the static nodes) are very close to each other after the fifth dynamic time step. As a by-product of the KNN-PF inference, the dynamic nodes DA and DB may also be estimated. The estimates are taken from the DA/DB values corresponding to the best particle at each dynamic step. Alongside the detection performance in Figure 5, we have plotted the mean square error (MSE) of the DA/DB estimates based on the best particle in the KNN-PF algorithm. As seen in Figure 6, the estimation performance clearly improves with more particles.

Figure 5. Probabilities of Correct Detection.

Figure 6. DA/DB MSE via the KNN-PF.

Now, we compare the KNN-PF with other related algorithms, namely the traditional PF and the likelihood-weighting (LW) [15] algorithm. The traditional PF relies on importance sampling and, as a result, requires the design of appropriate proposal distributions that can approximate the posterior distribution reasonably well. However, without special treatment, the depletion of samples is unavoidable, particularly for the static variables. One of the main differences between KNN-PF and PF/LW is that KNN-PF simulates the particles all the way to the leaf/evidence nodes, whereas PF/LW only simulates down to the parent nodes of the observed evidence. Between PF and LW, instead of re-sampling at every time slice based on the importance function learned so far as in PF, LW generates the samples globally from the prior distribution of the entire N-step Bayes net and uses the accumulated evidence likelihoods to weight each sample. Note that we use LW here merely as a benchmark; it is not realistic to use this type of batch processing in practice.

Figure 7 shows the detection performance comparison for the traditional PF, LW, and KNN-PF. In this experiment, the initial DA/DB values are assumed to be known, and the K value is set to 20 in the KNN-PF. Note that the traditional PF reaches its peak performance at around 87% due to the depletion of samples. LW reaches about 89% using 10^5 particles; we observe that with 10 times more samples, LW is able to increase its performance up to 97%. Notice that with 10^5 particles, KNN-PF was able to outperform both PF and LW significantly. After the 15th time slice, it even outperforms LW with 10^6 particles, and it converges to almost 100% performance at the end of the simulation.

5 Conclusions

We have presented the KNN-PF algorithm, an efficient and effective new solution to the sequential inference problems for PD-HBN, as well as for other nonlinear, non-Gaussian dynamic estimation problems.
This algorithm partitions the particles into dynamic and static variables and randomly combines the re-sampled particles together. It allows the particles to incorporate the latest observations, so that the target variables are updated by the surviving top K fittest particles. It was shown that this algorithm outperforms standard particle filters and global likelihood-weighting techniques. Furthermore, the KNN-PF mitigates the effects of sample depletion by keeping the particles of the static nodes unchanged. In summary, this novel method allows us to exploit the architecture of any dynamic system, and it has the potential to perform efficient and accurate inference on large, complex real-world systems.

Figure 7. Performance Comparison of Different Algorithms.

References

[1] K. G. Olesen, "Causal Probabilistic Networks with both Discrete and Continuous Variables," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15.

[2] B. D. Anderson and J. B. Moore, Optimal Filtering, Prentice-Hall.

[3] D. Heckerman, A. Mamdani, and M. P. Wellman, "Real-world applications of Bayesian networks," Comm. of the ACM, vol. 38, no. 3, March.

[4] A. V. Kozlov and D. Koller, "Nonuniform Dynamic Discretization in Hybrid Networks," Proceedings of the 13th Uncertainty in AI Conference, 1997.

[5] K. C. Chang and W. Sun, "Comparing probabilistic inference for mixed Bayesian networks," Proc. SPIE Conference.

[6] M. K. Pitt and N. Shephard, "Filtering via simulation: Auxiliary particle filters," Journal of the American Statistical Association, 94(446).

[7] M. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A Tutorial on Particle Filters for On-line Nonlinear/Non-Gaussian Bayesian Tracking," IEEE Transactions on Signal Processing, no. 50.

[8] A. Doucet, N. de Freitas, and N. Gordon, Sequential Monte Carlo Methods in Practice, Springer.

[9] J. L. Zhang and J. S. Liu, "A new sequential importance sampling method and its application to the two-dimensional hydrophobic-hydrophilic model," Journal of Chemical Physics, vol. 117, no. 7, August.

[10] U. Kjaerulff, "A Computational Scheme for Reasoning in Dynamic Probabilistic Networks," Proceedings of the 8th UAI Conference.

[11] X. Boyen and D. Koller, "Tractable Inference for Complex Stochastic Processes," Proceedings of the 14th UAI Conference, Seattle.

[12] A. J. Bayes, "A Minimum Variance Sampling Technique for Simulation Models," Journal of the ACM, vol. 19, issue 4, October.

[13] N. Gordon, D. Salmond, and A. F. Smith, "Novel approach to nonlinear and non-Gaussian Bayesian state estimation," IEE Proceedings-F, vol. 140.

[14] S. Hong, M. Bolic, and P. M. Djuric, "An Efficient Fixed-Point Implementation of Residual Resampling Scheme for High-Speed Particle Filters," IEEE Signal Processing Letters, vol. 11, no. 5, May.

[15] R. Fung and K. C. Chang, "Weighing and integrating evidence for stochastic simulation in Bayesian networks," in Uncertainty in Artificial Intelligence 5, Elsevier Science Publishing Company, New York, NY.

[16] U. N. Lerner and R. Parr, "Inference in Hybrid Networks: Theoretical Limits and Practical Algorithms," Proceedings of the 17th UAI Conference, Seattle, 2001.

[17] M. Takikawa, B. D'Ambrosio, and E. Wright, "Real-Time Inference with Large-Scale Temporal Bayes Nets," Proceedings of the 18th UAI Conference.


More information

Expected Value of Partial Perfect Information

Expected Value of Partial Perfect Information Expecte Value of Partial Perfect Information Mike Giles 1, Takashi Goa 2, Howar Thom 3 Wei Fang 1, Zhenru Wang 1 1 Mathematical Institute, University of Oxfor 2 School of Engineering, University of Tokyo

More information

Robust Low Rank Kernel Embeddings of Multivariate Distributions

Robust Low Rank Kernel Embeddings of Multivariate Distributions Robust Low Rank Kernel Embeings of Multivariate Distributions Le Song, Bo Dai College of Computing, Georgia Institute of Technology lsong@cc.gatech.eu, boai@gatech.eu Abstract Kernel embeing of istributions

More information

7.1 Support Vector Machine

7.1 Support Vector Machine 67577 Intro. to Machine Learning Fall semester, 006/7 Lecture 7: Support Vector Machines an Kernel Functions II Lecturer: Amnon Shashua Scribe: Amnon Shashua 7. Support Vector Machine We return now to

More information

Gaussian processes with monotonicity information

Gaussian processes with monotonicity information Gaussian processes with monotonicity information Anonymous Author Anonymous Author Unknown Institution Unknown Institution Abstract A metho for using monotonicity information in multivariate Gaussian process

More information

Improving Estimation Accuracy in Nonrandomized Response Questioning Methods by Multiple Answers

Improving Estimation Accuracy in Nonrandomized Response Questioning Methods by Multiple Answers International Journal of Statistics an Probability; Vol 6, No 5; September 207 ISSN 927-7032 E-ISSN 927-7040 Publishe by Canaian Center of Science an Eucation Improving Estimation Accuracy in Nonranomize

More information

Cascaded redundancy reduction

Cascaded redundancy reduction Network: Comput. Neural Syst. 9 (1998) 73 84. Printe in the UK PII: S0954-898X(98)88342-5 Cascae reunancy reuction Virginia R e Sa an Geoffrey E Hinton Department of Computer Science, University of Toronto,

More information

Quantum mechanical approaches to the virial

Quantum mechanical approaches to the virial Quantum mechanical approaches to the virial S.LeBohec Department of Physics an Astronomy, University of Utah, Salt Lae City, UT 84112, USA Date: June 30 th 2015 In this note, we approach the virial from

More information

A Novel Decoupled Iterative Method for Deep-Submicron MOSFET RF Circuit Simulation

A Novel Decoupled Iterative Method for Deep-Submicron MOSFET RF Circuit Simulation A Novel ecouple Iterative Metho for eep-submicron MOSFET RF Circuit Simulation CHUAN-SHENG WANG an YIMING LI epartment of Mathematics, National Tsing Hua University, National Nano evice Laboratories, an

More information

Modelling and simulation of dependence structures in nonlife insurance with Bernstein copulas

Modelling and simulation of dependence structures in nonlife insurance with Bernstein copulas Moelling an simulation of epenence structures in nonlife insurance with Bernstein copulas Prof. Dr. Dietmar Pfeifer Dept. of Mathematics, University of Olenburg an AON Benfiel, Hamburg Dr. Doreen Straßburger

More information

Table of Common Derivatives By David Abraham

Table of Common Derivatives By David Abraham Prouct an Quotient Rules: Table of Common Derivatives By Davi Abraham [ f ( g( ] = [ f ( ] g( + f ( [ g( ] f ( = g( [ f ( ] g( g( f ( [ g( ] Trigonometric Functions: sin( = cos( cos( = sin( tan( = sec

More information

WEIGHTING A RESAMPLED PARTICLES IN SEQUENTIAL MONTE CARLO (EXTENDED PREPRINT) L. Martino, V. Elvira, F. Louzada

WEIGHTING A RESAMPLED PARTICLES IN SEQUENTIAL MONTE CARLO (EXTENDED PREPRINT) L. Martino, V. Elvira, F. Louzada WEIGHTIG A RESAMLED ARTICLES I SEQUETIAL MOTE CARLO (ETEDED RERIT) L. Martino, V. Elvira, F. Louzaa Dep. of Signal Theory an Communic., Universia Carlos III e Mari, Leganés (Spain). Institute of Mathematical

More information

u!i = a T u = 0. Then S satisfies

u!i = a T u = 0. Then S satisfies Deterministic Conitions for Subspace Ientifiability from Incomplete Sampling Daniel L Pimentel-Alarcón, Nigel Boston, Robert D Nowak University of Wisconsin-Maison Abstract Consier an r-imensional subspace

More information

Hybrid Fusion for Biometrics: Combining Score-level and Decision-level Fusion

Hybrid Fusion for Biometrics: Combining Score-level and Decision-level Fusion Hybri Fusion for Biometrics: Combining Score-level an Decision-level Fusion Qian Tao Raymon Velhuis Signals an Systems Group, University of Twente Postbus 217, 7500AE Enschee, the Netherlans {q.tao,r.n.j.velhuis}@ewi.utwente.nl

More information

State estimation for predictive maintenance using Kalman filter

State estimation for predictive maintenance using Kalman filter Reliability Engineering an System Safety 66 (1999) 29 39 www.elsevier.com/locate/ress State estimation for preictive maintenance using Kalman filter S.K. Yang, T.S. Liu* Department of Mechanical Engineering,

More information

Group Importance Sampling for particle filtering and MCMC

Group Importance Sampling for particle filtering and MCMC Group Importance Sampling for particle filtering an MCMC Luca Martino, Víctor Elvira, Gustau Camps-Valls Image Processing Laboratory, Universitat e València (Spain). IMT Lille Douai CRISTAL (UMR 989),

More information

Equilibrium in Queues Under Unknown Service Times and Service Value

Equilibrium in Queues Under Unknown Service Times and Service Value University of Pennsylvania ScholarlyCommons Finance Papers Wharton Faculty Research 1-2014 Equilibrium in Queues Uner Unknown Service Times an Service Value Laurens Debo Senthil K. Veeraraghavan University

More information

This module is part of the. Memobust Handbook. on Methodology of Modern Business Statistics

This module is part of the. Memobust Handbook. on Methodology of Modern Business Statistics This moule is part of the Memobust Hanbook on Methoology of Moern Business Statistics 26 March 2014 Metho: Balance Sampling for Multi-Way Stratification Contents General section... 3 1. Summary... 3 2.

More information

OPTIMAL CONTROL PROBLEM FOR PROCESSES REPRESENTED BY STOCHASTIC SEQUENTIAL MACHINE

OPTIMAL CONTROL PROBLEM FOR PROCESSES REPRESENTED BY STOCHASTIC SEQUENTIAL MACHINE OPTIMA CONTRO PROBEM FOR PROCESSES REPRESENTED BY STOCHASTIC SEQUENTIA MACHINE Yaup H. HACI an Muhammet CANDAN Department of Mathematics, Canaale Onseiz Mart University, Canaale, Turey ABSTRACT In this

More information

ON MULTIPLE TRY SCHEMES AND THE PARTICLE METROPOLIS-HASTINGS ALGORITHM

ON MULTIPLE TRY SCHEMES AND THE PARTICLE METROPOLIS-HASTINGS ALGORITHM ON MULTIPLE TRY SCHEMES AN THE PARTICLE METROPOLIS-HASTINGS ALGORITHM L. Martino, F. Leisen, J. Coraner University of Helsinki, Helsinki (Finlan). University of Kent, Canterbury (UK). ABSTRACT Markov Chain

More information

Lower Bounds for the Smoothed Number of Pareto optimal Solutions

Lower Bounds for the Smoothed Number of Pareto optimal Solutions Lower Bouns for the Smoothe Number of Pareto optimal Solutions Tobias Brunsch an Heiko Röglin Department of Computer Science, University of Bonn, Germany brunsch@cs.uni-bonn.e, heiko@roeglin.org Abstract.

More information

Spatio-temporal fusion for reliable moving vehicle classification in wireless sensor networks

Spatio-temporal fusion for reliable moving vehicle classification in wireless sensor networks Proceeings of the 2009 IEEE International Conference on Systems, Man, an Cybernetics San Antonio, TX, USA - October 2009 Spatio-temporal for reliable moving vehicle classification in wireless sensor networks

More information

Introduction to Markov Processes

Introduction to Markov Processes Introuction to Markov Processes Connexions moule m44014 Zzis law Gustav) Meglicki, Jr Office of the VP for Information Technology Iniana University RCS: Section-2.tex,v 1.24 2012/12/21 18:03:08 gustav

More information

Multi-View Clustering via Canonical Correlation Analysis

Multi-View Clustering via Canonical Correlation Analysis Technical Report TTI-TR-2008-5 Multi-View Clustering via Canonical Correlation Analysis Kamalika Chauhuri UC San Diego Sham M. Kakae Toyota Technological Institute at Chicago ABSTRACT Clustering ata in

More information

ensembles When working with density operators, we can use this connection to define a generalized Bloch vector: v x Tr x, v y Tr y

ensembles When working with density operators, we can use this connection to define a generalized Bloch vector: v x Tr x, v y Tr y Ph195a lecture notes, 1/3/01 Density operators for spin- 1 ensembles So far in our iscussion of spin- 1 systems, we have restricte our attention to the case of pure states an Hamiltonian evolution. Toay

More information

Exponential Tracking Control of Nonlinear Systems with Actuator Nonlinearity

Exponential Tracking Control of Nonlinear Systems with Actuator Nonlinearity Preprints of the 9th Worl Congress The International Feeration of Automatic Control Cape Town, South Africa. August -9, Exponential Tracking Control of Nonlinear Systems with Actuator Nonlinearity Zhengqiang

More information

Level Construction of Decision Trees in a Partition-based Framework for Classification

Level Construction of Decision Trees in a Partition-based Framework for Classification Level Construction of Decision Trees in a Partition-base Framework for Classification Y.Y. Yao, Y. Zhao an J.T. Yao Department of Computer Science, University of Regina Regina, Saskatchewan, Canaa S4S

More information

New Statistical Test for Quality Control in High Dimension Data Set

New Statistical Test for Quality Control in High Dimension Data Set International Journal of Applie Engineering Research ISSN 973-456 Volume, Number 6 (7) pp. 64-649 New Statistical Test for Quality Control in High Dimension Data Set Shamshuritawati Sharif, Suzilah Ismail

More information

Monte Carlo Methods with Reduced Error

Monte Carlo Methods with Reduced Error Monte Carlo Methos with Reuce Error As has been shown, the probable error in Monte Carlo algorithms when no information about the smoothness of the function is use is Dξ r N = c N. It is important for

More information

PD Controller for Car-Following Models Based on Real Data

PD Controller for Car-Following Models Based on Real Data PD Controller for Car-Following Moels Base on Real Data Xiaopeng Fang, Hung A. Pham an Minoru Kobayashi Department of Mechanical Engineering Iowa State University, Ames, IA 5 Hona R&D The car following

More information

Topic 7: Convergence of Random Variables

Topic 7: Convergence of Random Variables Topic 7: Convergence of Ranom Variables Course 003, 2016 Page 0 The Inference Problem So far, our starting point has been a given probability space (S, F, P). We now look at how to generate information

More information

Generalizing Kronecker Graphs in order to Model Searchable Networks

Generalizing Kronecker Graphs in order to Model Searchable Networks Generalizing Kronecker Graphs in orer to Moel Searchable Networks Elizabeth Boine, Babak Hassibi, Aam Wierman California Institute of Technology Pasaena, CA 925 Email: {eaboine, hassibi, aamw}@caltecheu

More information

05 The Continuum Limit and the Wave Equation

05 The Continuum Limit and the Wave Equation Utah State University DigitalCommons@USU Founations of Wave Phenomena Physics, Department of 1-1-2004 05 The Continuum Limit an the Wave Equation Charles G. Torre Department of Physics, Utah State University,

More information

Estimation of the Maximum Domination Value in Multi-Dimensional Data Sets

Estimation of the Maximum Domination Value in Multi-Dimensional Data Sets Proceeings of the 4th East-European Conference on Avances in Databases an Information Systems ADBIS) 200 Estimation of the Maximum Domination Value in Multi-Dimensional Data Sets Eleftherios Tiakas, Apostolos.

More information

Radar Sensor Management for Detection and Tracking

Radar Sensor Management for Detection and Tracking Raar Sensor Management for Detection an Tracking Krüger White, Jason Williams, Peter Hoffensetz Defence Science an Technology Organisation PO Box 00, Einburgh, SA Australia Email: Kruger.White@sto.efence.gov.au,

More information

Error Floors in LDPC Codes: Fast Simulation, Bounds and Hardware Emulation

Error Floors in LDPC Codes: Fast Simulation, Bounds and Hardware Emulation Error Floors in LDPC Coes: Fast Simulation, Bouns an Harware Emulation Pamela Lee, Lara Dolecek, Zhengya Zhang, Venkat Anantharam, Borivoje Nikolic, an Martin J. Wainwright EECS Department University of

More information

Image Denoising Using Spatial Adaptive Thresholding

Image Denoising Using Spatial Adaptive Thresholding International Journal of Engineering Technology, Management an Applie Sciences Image Denoising Using Spatial Aaptive Thresholing Raneesh Mishra M. Tech Stuent, Department of Electronics & Communication,

More information

CONTROL CHARTS FOR VARIABLES

CONTROL CHARTS FOR VARIABLES UNIT CONTOL CHATS FO VAIABLES Structure.1 Introuction Objectives. Control Chart Technique.3 Control Charts for Variables.4 Control Chart for Mean(-Chart).5 ange Chart (-Chart).6 Stanar Deviation Chart

More information

Optimal Cooperative Spectrum Sensing in Cognitive Sensor Networks

Optimal Cooperative Spectrum Sensing in Cognitive Sensor Networks Optimal Cooperative Spectrum Sensing in Cognitive Sensor Networks Hai Ngoc Pham, an Zhang, Paal E. Engelsta,,3, Tor Skeie,, Frank Eliassen, Department of Informatics, University of Oslo, Norway Simula

More information

Survey-weighted Unit-Level Small Area Estimation

Survey-weighted Unit-Level Small Area Estimation Survey-weighte Unit-Level Small Area Estimation Jan Pablo Burgar an Patricia Dörr Abstract For evience-base regional policy making, geographically ifferentiate estimates of socio-economic inicators are

More information

THE VAN KAMPEN EXPANSION FOR LINKED DUFFING LINEAR OSCILLATORS EXCITED BY COLORED NOISE

THE VAN KAMPEN EXPANSION FOR LINKED DUFFING LINEAR OSCILLATORS EXCITED BY COLORED NOISE Journal of Soun an Vibration (1996) 191(3), 397 414 THE VAN KAMPEN EXPANSION FOR LINKED DUFFING LINEAR OSCILLATORS EXCITED BY COLORED NOISE E. M. WEINSTEIN Galaxy Scientific Corporation, 2500 English Creek

More information

Bayesian Estimation of the Entropy of the Multivariate Gaussian

Bayesian Estimation of the Entropy of the Multivariate Gaussian Bayesian Estimation of the Entropy of the Multivariate Gaussian Santosh Srivastava Fre Hutchinson Cancer Research Center Seattle, WA 989, USA Email: ssrivast@fhcrc.org Maya R. Gupta Department of Electrical

More information

Lecture 2: Correlated Topic Model

Lecture 2: Correlated Topic Model Probabilistic Moels for Unsupervise Learning Spring 203 Lecture 2: Correlate Topic Moel Inference for Correlate Topic Moel Yuan Yuan First of all, let us make some claims about the parameters an variables

More information

TIME-DELAY ESTIMATION USING FARROW-BASED FRACTIONAL-DELAY FIR FILTERS: FILTER APPROXIMATION VS. ESTIMATION ERRORS

TIME-DELAY ESTIMATION USING FARROW-BASED FRACTIONAL-DELAY FIR FILTERS: FILTER APPROXIMATION VS. ESTIMATION ERRORS TIME-DEAY ESTIMATION USING FARROW-BASED FRACTIONA-DEAY FIR FITERS: FITER APPROXIMATION VS. ESTIMATION ERRORS Mattias Olsson, Håkan Johansson, an Per öwenborg Div. of Electronic Systems, Dept. of Electrical

More information

LATTICE-BASED D-OPTIMUM DESIGN FOR FOURIER REGRESSION

LATTICE-BASED D-OPTIMUM DESIGN FOR FOURIER REGRESSION The Annals of Statistics 1997, Vol. 25, No. 6, 2313 2327 LATTICE-BASED D-OPTIMUM DESIGN FOR FOURIER REGRESSION By Eva Riccomagno, 1 Rainer Schwabe 2 an Henry P. Wynn 1 University of Warwick, Technische

More information

Thermal conductivity of graded composites: Numerical simulations and an effective medium approximation

Thermal conductivity of graded composites: Numerical simulations and an effective medium approximation JOURNAL OF MATERIALS SCIENCE 34 (999)5497 5503 Thermal conuctivity of grae composites: Numerical simulations an an effective meium approximation P. M. HUI Department of Physics, The Chinese University

More information

APPROXIMATE SOLUTION FOR TRANSIENT HEAT TRANSFER IN STATIC TURBULENT HE II. B. Baudouy. CEA/Saclay, DSM/DAPNIA/STCM Gif-sur-Yvette Cedex, France

APPROXIMATE SOLUTION FOR TRANSIENT HEAT TRANSFER IN STATIC TURBULENT HE II. B. Baudouy. CEA/Saclay, DSM/DAPNIA/STCM Gif-sur-Yvette Cedex, France APPROXIMAE SOLUION FOR RANSIEN HEA RANSFER IN SAIC URBULEN HE II B. Bauouy CEA/Saclay, DSM/DAPNIA/SCM 91191 Gif-sur-Yvette Ceex, France ABSRAC Analytical solution in one imension of the heat iffusion equation

More information

TEMPORAL AND TIME-FREQUENCY CORRELATION-BASED BLIND SOURCE SEPARATION METHODS. Yannick DEVILLE

TEMPORAL AND TIME-FREQUENCY CORRELATION-BASED BLIND SOURCE SEPARATION METHODS. Yannick DEVILLE TEMPORAL AND TIME-FREQUENCY CORRELATION-BASED BLIND SOURCE SEPARATION METHODS Yannick DEVILLE Université Paul Sabatier Laboratoire Acoustique, Métrologie, Instrumentation Bât. 3RB2, 8 Route e Narbonne,

More information

Track Initialization from Incomplete Measurements

Track Initialization from Incomplete Measurements Track Initialiation from Incomplete Measurements Christian R. Berger, Martina Daun an Wolfgang Koch Department of Electrical an Computer Engineering, University of Connecticut, Storrs, Connecticut 6269,

More information

A study on ant colony systems with fuzzy pheromone dispersion

A study on ant colony systems with fuzzy pheromone dispersion A stuy on ant colony systems with fuzzy pheromone ispersion Louis Gacogne LIP6 104, Av. Kenney, 75016 Paris, France gacogne@lip6.fr Sanra Sanri IIIA/CSIC Campus UAB, 08193 Bellaterra, Spain sanri@iiia.csic.es

More information

A New Minimum Description Length

A New Minimum Description Length A New Minimum Description Length Soosan Beheshti, Munther A. Dahleh Laboratory for Information an Decision Systems Massachusetts Institute of Technology soosan@mit.eu,ahleh@lis.mit.eu Abstract The minimum

More information

Adaptive Adjustment of Noise Covariance in Kalman Filter for Dynamic State Estimation

Adaptive Adjustment of Noise Covariance in Kalman Filter for Dynamic State Estimation Aaptive Ajustment of Noise Covariance in Kalman Filter for Dynamic State Estimation Shahroh Ahlaghi, Stuent Member, IEEE Ning Zhou, Senior Member, IEEE Electrical an Computer Engineering Department, Binghamton

More information

We G Model Reduction Approaches for Solution of Wave Equations for Multiple Frequencies

We G Model Reduction Approaches for Solution of Wave Equations for Multiple Frequencies We G15 5 Moel Reuction Approaches for Solution of Wave Equations for Multiple Frequencies M.Y. Zaslavsky (Schlumberger-Doll Research Center), R.F. Remis* (Delft University) & V.L. Druskin (Schlumberger-Doll

More information

6 General properties of an autonomous system of two first order ODE

6 General properties of an autonomous system of two first order ODE 6 General properties of an autonomous system of two first orer ODE Here we embark on stuying the autonomous system of two first orer ifferential equations of the form ẋ 1 = f 1 (, x 2 ), ẋ 2 = f 2 (, x

More information

Introduction to Machine Learning

Introduction to Machine Learning How o you estimate p(y x)? Outline Contents Introuction to Machine Learning Logistic Regression Varun Chanola April 9, 207 Generative vs. Discriminative Classifiers 2 Logistic Regression 2 3 Logistic Regression

More information

An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback

An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback Journal of Machine Learning Research 8 07) - Submitte /6; Publishe 5/7 An Optimal Algorithm for Banit an Zero-Orer Convex Optimization with wo-point Feeback Oha Shamir Department of Computer Science an

More information

Topic Modeling: Beyond Bag-of-Words

Topic Modeling: Beyond Bag-of-Words Hanna M. Wallach Cavenish Laboratory, University of Cambrige, Cambrige CB3 0HE, UK hmw26@cam.ac.u Abstract Some moels of textual corpora employ text generation methos involving n-gram statistics, while

More information

Fast Resampling Weighted v-statistics

Fast Resampling Weighted v-statistics Fast Resampling Weighte v-statistics Chunxiao Zhou Mar O. Hatfiel Clinical Research Center National Institutes of Health Bethesa, MD 20892 chunxiao.zhou@nih.gov Jiseong Par Dept of Math George Mason Univ

More information

Parameter estimation: A new approach to weighting a priori information

Parameter estimation: A new approach to weighting a priori information Parameter estimation: A new approach to weighting a priori information J.L. Mea Department of Mathematics, Boise State University, Boise, ID 83725-555 E-mail: jmea@boisestate.eu Abstract. We propose a

More information

Event based Kalman filter observer for rotary high speed on/off valve

Event based Kalman filter observer for rotary high speed on/off valve 28 American Control Conference Westin Seattle Hotel, Seattle, Washington, USA June 11-13, 28 WeC9.6 Event base Kalman filter observer for rotary high spee on/off valve Meng Wang, Perry Y. Li ERC for Compact

More information

Estimation of District Level Poor Households in the State of. Uttar Pradesh in India by Combining NSSO Survey and

Estimation of District Level Poor Households in the State of. Uttar Pradesh in India by Combining NSSO Survey and Int. Statistical Inst.: Proc. 58th Worl Statistical Congress, 2011, Dublin (Session CPS039) p.6567 Estimation of District Level Poor Househols in the State of Uttar Praesh in Inia by Combining NSSO Survey

More information

Subspace Estimation from Incomplete Observations: A High-Dimensional Analysis

Subspace Estimation from Incomplete Observations: A High-Dimensional Analysis Subspace Estimation from Incomplete Observations: A High-Dimensional Analysis Chuang Wang, Yonina C. Elar, Fellow, IEEE an Yue M. Lu, Senior Member, IEEE Abstract We present a high-imensional analysis

More information

APPLICATION of compressed sensing (CS) in radar signal

APPLICATION of compressed sensing (CS) in radar signal A Novel Joint Compressive Single Target Detection an Parameter Estimation in Raar without Signal Reconstruction Alireza Hariri, Massou Babaie-Zaeh Department of Electrical Engineering, Sharif University

More information

Relative Entropy and Score Function: New Information Estimation Relationships through Arbitrary Additive Perturbation

Relative Entropy and Score Function: New Information Estimation Relationships through Arbitrary Additive Perturbation Relative Entropy an Score Function: New Information Estimation Relationships through Arbitrary Aitive Perturbation Dongning Guo Department of Electrical Engineering & Computer Science Northwestern University

More information

The new concepts of measurement error s regularities and effect characteristics

The new concepts of measurement error s regularities and effect characteristics The new concepts of measurement error s regularities an effect characteristics Ye Xiaoming[1,] Liu Haibo [3,,] Ling Mo[3] Xiao Xuebin [5] [1] School of Geoesy an Geomatics, Wuhan University, Wuhan, Hubei,

More information

Similarity Measures for Categorical Data A Comparative Study. Technical Report

Similarity Measures for Categorical Data A Comparative Study. Technical Report Similarity Measures for Categorical Data A Comparative Stuy Technical Report Department of Computer Science an Engineering University of Minnesota 4-92 EECS Builing 200 Union Street SE Minneapolis, MN

More information

On conditional moments of high-dimensional random vectors given lower-dimensional projections

On conditional moments of high-dimensional random vectors given lower-dimensional projections Submitte to the Bernoulli arxiv:1405.2183v2 [math.st] 6 Sep 2016 On conitional moments of high-imensional ranom vectors given lower-imensional projections LUKAS STEINBERGER an HANNES LEEB Department of

More information

CUSTOMER REVIEW FEATURE EXTRACTION Heng Ren, Jingye Wang, and Tony Wu

CUSTOMER REVIEW FEATURE EXTRACTION Heng Ren, Jingye Wang, and Tony Wu CUSTOMER REVIEW FEATURE EXTRACTION Heng Ren, Jingye Wang, an Tony Wu Abstract Popular proucts often have thousans of reviews that contain far too much information for customers to igest. Our goal for the

More information

Lower bounds on Locality Sensitive Hashing

Lower bounds on Locality Sensitive Hashing Lower bouns on Locality Sensitive Hashing Rajeev Motwani Assaf Naor Rina Panigrahy Abstract Given a metric space (X, X ), c 1, r > 0, an p, q [0, 1], a istribution over mappings H : X N is calle a (r,

More information

Crossover-Cat Swarm Optimization

Crossover-Cat Swarm Optimization Australian Journal of Basic an Applie Sciences, 0(5) Special 06, Pages: -6 AUSTRALIAN JOURNAL OF BASIC AND APPLIED SCIENCES ISSN:99-878 EISSN: 309-844 Journal home page: www.abasweb.com RBFNN Equalizer

More information

Slide10 Haykin Chapter 14: Neurodynamics (3rd Ed. Chapter 13)

Slide10 Haykin Chapter 14: Neurodynamics (3rd Ed. Chapter 13) Slie10 Haykin Chapter 14: Neuroynamics (3r E. Chapter 13) CPSC 636-600 Instructor: Yoonsuck Choe Spring 2012 Neural Networks with Temporal Behavior Inclusion of feeback gives temporal characteristics to

More information

On the Value of Partial Information for Learning from Examples

On the Value of Partial Information for Learning from Examples JOURNAL OF COMPLEXITY 13, 509 544 (1998) ARTICLE NO. CM970459 On the Value of Partial Information for Learning from Examples Joel Ratsaby* Department of Electrical Engineering, Technion, Haifa, 32000 Israel

More information

Flexible High-Dimensional Classification Machines and Their Asymptotic Properties

Flexible High-Dimensional Classification Machines and Their Asymptotic Properties Journal of Machine Learning Research 16 (2015) 1547-1572 Submitte 1/14; Revise 9/14; Publishe 8/15 Flexible High-Dimensional Classification Machines an Their Asymptotic Properties Xingye Qiao Department

More information

Necessary and Sufficient Conditions for Sketched Subspace Clustering

Necessary and Sufficient Conditions for Sketched Subspace Clustering Necessary an Sufficient Conitions for Sketche Subspace Clustering Daniel Pimentel-Alarcón, Laura Balzano 2, Robert Nowak University of Wisconsin-Maison, 2 University of Michigan-Ann Arbor Abstract This

More information

A Unified Approach for Learning the Parameters of Sum-Product Networks

A Unified Approach for Learning the Parameters of Sum-Product Networks A Unifie Approach for Learning the Parameters of Sum-Prouct Networks Han Zhao Machine Learning Dept. Carnegie Mellon University han.zhao@cs.cmu.eu Pascal Poupart School of Computer Science University of

More information

Balancing Expected and Worst-Case Utility in Contracting Models with Asymmetric Information and Pooling

Balancing Expected and Worst-Case Utility in Contracting Models with Asymmetric Information and Pooling Balancing Expecte an Worst-Case Utility in Contracting Moels with Asymmetric Information an Pooling R.B.O. erkkamp & W. van en Heuvel & A.P.M. Wagelmans Econometric Institute Report EI2018-01 9th January

More information

State observers and recursive filters in classical feedback control theory

State observers and recursive filters in classical feedback control theory State observers an recursive filters in classical feeback control theory State-feeback control example: secon-orer system Consier the riven secon-orer system q q q u x q x q x x x x Here u coul represent

More information

A Hybrid Approach for Modeling High Dimensional Medical Data

A Hybrid Approach for Modeling High Dimensional Medical Data A Hybri Approach for Moeling High Dimensional Meical Data Alok Sharma 1, Gofrey C. Onwubolu 1 1 University of the South Pacific, Fii sharma_al@usp.ac.f, onwubolu_g@usp.ac.f Abstract. his work presents

More information