A Monte Carlo scheme for diffusion estimation


A Monte Carlo scheme for diffusion estimation

L. Martino, J. Plata, F. Louzada
Institute of Mathematical Sciences and Computing, Universidade de São Paulo (ICMC-USP), Brazil.
Dep. of Electrical Engineering (ESAT-STADIUS), Leuven, Belgium.

Abstract: In this work, we design an efficient Monte Carlo scheme for diffusion estimation, where global and local parameters are involved in a unique inference problem. This scenario often appears in distributed inference problems in wireless sensor networks. The proposed scheme uses parallel local MCMC chains and then an importance sampling (IS) fusion for obtaining an efficient estimation of the global parameters. The resulting algorithm is simple and flexible. It can be easily applied iteratively, or extended in a sequential framework. In order to apply the novel scheme, the only assumption required about the model is that the measurements are conditionally independent given the related parameters.

Keywords: Diffusion estimation; Distributed inference; Parallel MCMC; Importance sampling.

I. INTRODUCTION

Distributed inference has received massive attention in the last decades, for different applications [1], [2], [3], [4], [5], [6], [7], [8]. For instance, in signal processing several algorithms have been proposed for solving inference problems in wireless sensor networks, where each node provides measurements about global and local parameters. Namely, all the received measurements are statistically related to a global parameter, whereas only a subset of the observations provides statistical information about the local variables. Furthermore, Bayesian methods have become very popular in signal processing during the last years and, with them, Monte Carlo (MC) techniques, which are often necessary for the implementation of optimal a posteriori estimators [9], [10], [11]. Indeed, MC methods are powerful tools for numerical inference and optimization [12], [13], [14], [15], [16]. They are very flexible techniques.
The only requirement needed for applying an MC technique is the ability to evaluate point-wise the posterior probability density function (pdf) [9], [10]. In this work, we introduce a simple and flexible approach for the diffusion estimation of global parameters, providing simultaneously the inference of the local parameters. The proposed solution employs parallel MCMC algorithms for analyzing the local features of the network, and an importance sampling (IS) fusion for obtaining complete estimators of the global parameters. Each MCMC method addresses a different target posterior function, obtained by considering a subset of observations. This approach also presents several computational benefits from a Monte Carlo point of view (as remarked exhaustively in Section III-A). For instance, the mixing of the MCMC methods is facilitated by the reduced number of measurements involved in each partial posterior, since this partial posterior distribution is implicitly tempered [17], [18], [19], [6]. Furthermore, several parallel or related schemes proposed in the literature could be adapted for this framework [20], [21], [22], [18], [19], [23], [6], [7], [8]. Numerical simulations show the advantages of the proposed approach. (This work has been supported by Grant 2014/… of the São Paulo Research Foundation (FAPESP), by Grant 30536/2013-3 of the National Council for Scientific and Technological Development (CNPq) and by the ERC Consolidator Grant SEDAL ERC-2014-CoG.)

II. PROBLEM STATEMENT

In this work, we are interested in making inference about the following variable of interest,

Θ = [Θ^(G), Θ^(L)] = [x, v_1, ..., v_M] ∈ R^{D_θ},   (1)

composed by the vectors x = [x_1, ..., x_{d_X}] ∈ R^{d_X} and v_m = [v_{m,1}, ..., v_{m,d_m}] ∈ R^{d_m}, for m = 1, ..., M. The vector Θ^(G) = x represents a global parameter, whereas Θ^(L) = [v_1, ..., v_M] ∈ R^{d_L} are the local parameters. We receive a set of d_Y measurements, Y = {z_1, z_2, ..., z_{d_Y}}, with each z_j ∈ R (we assume the z_j's to be scalar only for simplicity), related to the variable of interest Θ. We consider M disjoint subsets of Y, i.e., we can write Y = ∪_{m=1}^M y_m, with y_j ∩ y_k = ∅ for all j ≠ k.

We assume that the observations are conditionally independent, i.e., the likelihood function can be factorized as

L(y_{1:M} | Θ) = ∏_{m=1}^M ℓ_m(y_m | x, v_m).   (2)

Considering a prior probability density function (pdf) p(Θ) = ∏_{m=1}^M [p(x, v_m)]^{1/M}, the complete posterior pdf can be written as

Ω(Θ | y_{1:M}) ∝ ∏_{m=1}^M π_m(x, v_m | y_m),   (3)

where the partial posteriors are

π_m(x, v_m | y_m) ∝ ℓ_m(y_m | x, v_m) [p(x, v_m)]^{1/M}.   (4)
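Because of the factorization in Eqs. (2)-(4), the log-posterior is a sum of partial log-posteriors, each of which can be evaluated locally from one data subset. A minimal sketch in Python (the unit-variance Gaussian likelihood, the flat prior and the subset sizes are illustrative assumptions, not the paper's model):

```python
import numpy as np

# Illustrative toy model (assumed): M = 3 subsets y_m of scalar observations,
# a scalar global parameter x and scalar local parameters v_m, with
# unit-variance Gaussian likelihoods of mean x + v_m.
rng = np.random.default_rng(0)
M = 3
y = [rng.normal(loc=1.0 + m, scale=1.0, size=5) for m in range(M)]

def log_partial_posterior(m, x, v_m):
    """log pi_m(x, v_m | y_m) up to an additive constant, cf. Eq. (4),
    with a flat prior assumed for simplicity."""
    return -0.5 * np.sum((y[m] - (x + v_m)) ** 2)

def log_complete_posterior(x, v):
    """log Omega(Theta | y_{1:M}): by Eq. (3), the sum of the M partial
    log-posteriors, so each term can be evaluated locally at a node."""
    return sum(log_partial_posterior(m, x, v[m]) for m in range(M))
```

This additive structure is exactly what allows each node (or each parallel sampler) to work with its own subset of measurements only.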

For the inference of x, an important role is played by the marginal posterior density

G(x | y_{1:M}) = ∫_{R^{d_L}} Ω(Θ | y_{1:M}) dv_1 ... dv_M   (5)
             ∝ ∫_{R^{d_L}} ∏_{m=1}^M π_m(x, v_m | y_m) dv_1 ... dv_M   (6)
             ∝ ∏_{m=1}^M g_m(x | y_m),   (7)

where

g_m(x | y_m) = ∫_{R^{d_m}} π_m(x, v_m | y_m) dv_m.   (8)

Note that the integrals above cannot be computed analytically, in general. This framework appears naturally in several applications, for instance in the so-called Node-Specific Parameter Estimation (NSPE) problem within a sensor network [4], [5], [24], [25] (see Figure 1). Furthermore, a similar approach is considered in the Big Data context [23], [6], [7], [8].

Figure 1. Diffusion estimation in a sensor network. In this graphical example, the network is composed of 8 nodes, divided into 3 clusters providing 3 different vectors of measurements y_1, y_2 and y_3. In this case, Y = y_1 ∪ y_2 ∪ y_3 and y_j ∩ y_k = ∅ for j ≠ k (here, d_{Y_1} = 5, d_{Y_2} = 3, d_{Y_3} = 2). Each sensor can provide more than one measurement.

Remark 1. Note that we could easily assume that each partial posterior π_m(x, v_m | y_m) depends also on other local variables, e.g., π_m(x, v_m, v_k, v_j | y_m), with k, j ≠ m. The algorithm proposed in this work can be automatically extended to this case. For simplicity, we have considered only one local variable in each partial posterior.

III. BAYESIAN INFERENCE

Our purpose is to make inference about Θ = [x, v_{1:M}] given Y = y_{1:M}. For instance, we desire to compute the Minimum Mean Square Error (MMSE) estimators of x and v_{1:M}, i.e.,

x̂ = ∫ x Ω(x, v_{1:M} | y_{1:M}) dx dv_{1:M}   (9)
  = ∫_{R^{d_X}} x G(x | y_{1:M}) dx,   (10)

v̂_m = ∫ v_m Ω(x, v_{1:M} | y_{1:M}) dx dv_{1:M},   (11)

for m = 1, ..., M. In general, we are not able to calculate analytically the integrals above. Thus, we apply a Monte Carlo (MC) approach for computing approximately x̂ and v̂_{1:M}.

A. Benefits of the parallel MC implementation

The previous factorization of the posterior pdf suggests the use of M parallel algorithms and then the combination of the corresponding outputs. This is convenient from a Monte Carlo point of view. Namely, the use of M parallel MC methods, each one addressing one partial posterior π_m, presents several computational benefits:

a) Each partial posterior π_m is embedded in a state space of lower dimensionality, specifically [x, v_m] ∈ R^{d_X + d_m}. This clearly helps the exploration of the space by the MC algorithm.

b) Each partial posterior involves a smaller number of measurements. This is an advantage since the probability mass is in general more disperse than when a big number of observations is jointly considered, producing a tempering effect (data-tempering) [17], [6], [19]. This again helps the exploration of the state space (as suggested, e.g., in [19]).

c) This scenario automatically allows a possible parallel implementation.

IV. DISTRIBUTED MONTE CARLO INFERENCE

Let us consider that we are able to draw samples θ_m^(n) = [x_m^(n), v_m^(n)], with n = 1, ..., N, directly from each π_m, i.e.,

θ_m^(1), ..., θ_m^(N) ~ π_m(x, v_m | y_m), with m = 1, ..., M.

Due to the factorization of the complete target pdf Ω(Θ | y_{1:M}), for inferring the local variable v_m we use only the samples v_m^(n), n = 1, ..., N, obtained from π_m. For the global variable x, we can build M different partial Monte Carlo estimators. However, all the information contained in the different partial posteriors should be employed for providing a unique, more efficient estimator of x. This can be done by adequately combining the M partial Monte Carlo estimators of x. Let us consider that we are able to draw samples from each marginal pdf g_m(x | y_m) in Eq. (8), and also assume that we are able to evaluate it. In this case, we can use the following IS scheme:

1) Draw x_m^(1), ..., x_m^(N) ~ g_m(x | y_m).
2) Assign to each sample x_m^(n), for n = 1, ..., N, the weight

w_m^(n) = G(x_m^(n) | y_{1:M}) / g_m(x_m^(n) | y_m)   (12)
        = ∏_{k=1, k≠m}^M g_k(x_m^(n) | y_k).   (13)
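Under the idealized assumption that each marginal g_m(x | y_m) can be both sampled and evaluated, the two-step scheme above, together with the resulting self-normalized estimator, can be sketched as follows (the Gaussian marginals N(mu_m, s_m^2) are an illustrative stand-in, not part of the paper's model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Idealized setting (assumed for illustration): the marginals g_m(x | y_m)
# of Eq. (8) are known Gaussians that we can sample and evaluate.
mu = np.array([0.8, 1.1, 1.3])
s = np.array([0.5, 0.4, 0.6])
M, N = len(mu), 5000

def g(m, x):
    """Evaluate the m-th marginal g_m(x | y_m)."""
    return np.exp(-0.5 * ((x - mu[m]) / s[m]) ** 2) / (s[m] * np.sqrt(2 * np.pi))

# Step 1: draw N samples from each marginal g_m.
x_s = np.stack([rng.normal(mu[m], s[m], size=N) for m in range(M)])  # (M, N)

# Step 2: weight each sample by the product of the *other* marginals,
# i.e. w_m^(n) = prod_{k != m} g_k(x_m^(n) | y_k), Eq. (13).
w = np.ones((M, N))
for m in range(M):
    for k in range(M):
        if k != m:
            w[m] *= g(k, x_s[m])

# Self-normalized IS fusion of all M chains into one estimator of x.
x_tilde = float(np.sum(w * x_s) / np.sum(w))
```

Since a product of Gaussian densities is proportional to a Gaussian, in this toy case x_tilde can be checked against the precision-weighted mean (∑_m μ_m/s_m²)/(∑_m 1/s_m²).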

Then, the IS approximation of the MMSE estimator x̂ is

x̃ = (∑_{m=1}^M ∑_{n=1}^N w_m^(n) x_m^(n)) / (∑_{m=1}^M ∑_{n=1}^N w_m^(n)).   (14)

The previous approach has two main problems:
- We are not able to draw from g_m(x | y_m) in Eq. (8).
- It is not possible to evaluate g_m(x | y_m) and G(x | y_{1:M}).

A. Proposed Algorithm

A possible approximate solution consists in the following procedure. First, we use M parallel MCMC algorithms for drawing samples θ_m^(n) = [x_m^(n), v_m^(n)] from each partial posterior π_m(x, v_m | y_m). Given an index m ∈ {1, ..., M}, after the chain has converged, note that the samples x_m^(n) are distributed as g_m(x | y_m), whereas θ_m^(n) is distributed as π_m(x, v_m | y_m). Then, we build a kernel density approximation [26],

ĝ_m(x | y_m) = (1/N) ∑_{n=1}^N φ(x | x_m^(n), C),   (15)

for each m ∈ {1, ..., M}, using the generated samples x_m^(n), n = 1, ..., N, as means of a kernel function φ with bandwidth matrix C. In this way, we can compute an approximate weight

ŵ_m^(n) = ∏_{k=1, k≠m}^M ĝ_k(x_m^(n) | y_k),   (16)

so that the IS estimator x̃ in Eq. (14) can be approximated with

x̄ = (∑_{m=1}^M ∑_{n=1}^N ŵ_m^(n) x_m^(n)) / (∑_{m=1}^M ∑_{n=1}^N ŵ_m^(n)).   (17)

Table I and Figure 2 summarize the proposed algorithm. Table II summarizes the notation for the estimators of x.

Figure 2. Graphical representation of the Parallel Marginal Markov Importance Sampling scheme: M local MCMC chains are run, each one addressing one partial posterior π_m. A kernel density estimation (KDE) for approximating the marginal partial posteriors g_m(x | y_m) is performed locally. Then, an importance sampling fusion is employed for providing a global estimator of the global parameter x.

Table I. PARALLEL MARGINAL MARKOV IMPORTANCE SAMPLER (PMIS).
1. Local MCMC samplers: Generate M chains of length N, i.e., θ_m^(1) = [x_m^(1), v_m^(1)], ..., θ_m^(N) = [x_m^(N), v_m^(N)], with target pdf π_m(x, v_m | y_m), m = 1, ..., M.
2. Kernel density estimation: Build

ĝ_m(x | y_m) = (1/N) ∑_{n=1}^N φ(x | x_m^(n), C),   (18)

given a kernel function φ and a scale parameter C.
3. Global IS fusion: Compute the weights

ŵ_m^(n) = ∏_{k=1, k≠m}^M ĝ_k(x_m^(n) | y_k),   (19)

for m = 1, ..., M and n = 1, ..., N.
4. Outputs: Return the Monte Carlo estimators

x̄ = (∑_{m=1}^M ∑_{n=1}^N ŵ_m^(n) x_m^(n)) / (∑_{m=1}^M ∑_{n=1}^N ŵ_m^(n)),   (20)

ṽ_m = (1/N) ∑_{n=1}^N v_m^(n), m = 1, ..., M.   (21)

Table II. NOTATION OF THE DIFFERENT ESTIMATORS OF x.
- x̂: MMSE estimator in Eq. (9).
- x̃: Monte Carlo estimator (approximation of x̂) in Eq. (14).
- x̄: Approximate Monte Carlo estimator (approximation of x̃) in Eqs. (17)-(20).

Theoretical support. After a burn-in period, the MCMC chains converge to their invariant target pdfs (for N → ∞, the convergence is ensured [9], [10]). Namely, after some iterations, the m-th MCMC method yields samples {x_m^(n), v_m^(n)} distributed according to π_m, so that the {x_m^(n)} are distributed as the marginal partial posterior g_m(x | y_m) [9]. There exists an optimal bandwidth C* [26] such that ĝ_m(x | y_m) → g_m(x | y_m) for N → ∞, and hence x̄ → x̃ for N → ∞. Moreover, the IS estimator is consistent [9], so that x̄ → x̃ → x̂ as N → ∞. In general, the optimal bandwidth C* is unknown. However, using a bandwidth C ≠ C*, Eq. (15) provides an estimator of g_m(x | y_m) [26] in any case.

Alternative IS weights. Other proper IS weights can be employed in our framework, providing consistent estimators [27], [28], [29]. For instance, a full deterministic mixture approach [28], [30], [31] for multiple importance sampling (MIS) schemes can be used, i.e.,

w_m^(n) = G(x_m^(n) | y_{1:M}) / ((1/M) ∑_{j=1}^M g_j(x_m^(n) | y_j)).   (22)

As a consequence, we have that

w_m^(n) ∝ (∏_{j=1}^M g_j(x_m^(n) | y_j)) / (∑_{j=1}^M g_j(x_m^(n) | y_j)).   (23)

It is possible to show that the application of these DM-MIS weights provides more efficient IS estimators [29], [28] (i.e.,

with smaller variance).

B. Parallel Metropolis-Hastings algorithms

For simplicity, we consider Metropolis-Hastings (MH) methods in the first step of the novel scheme. However, more sophisticated algorithms can be employed. More specifically, starting with a randomly chosen θ_m^(0), we perform the following steps:

For m = 1, ..., M:
  For n = 1, ..., N:
    1) Draw θ' from a proposal pdf q_m(θ | θ_m^(n-1)).
    2) Set θ_m^(n) = θ' with probability

α = min[1, (π_m(θ' | y_m) q_m(θ_m^(n-1) | θ')) / (π_m(θ_m^(n-1) | y_m) q_m(θ' | θ_m^(n-1)))],   (24)

otherwise (with probability 1 − α) set θ_m^(n) = θ_m^(n-1).

V. NUMERICAL SIMULATIONS

In order to test PMIS, we consider Gaussian likelihoods

f_m(y_m | x, v_m) = ∏_{j=1}^{d_{Y_m}} N(z_j | [x, v_m], Σ_m),   (25)

with y_m = [z_1, ..., z_{d_{Y_m}}], m = 1, ..., M, where each z_j is a bivariate observation and x, v_m ∈ R. Note that d_{Y_m} = |y_m|. We consider flat improper priors over x and the v_m's, m = 1, ..., M. We set M = 10, so that we have 10 different partial target pdfs π_m(x, v_m | y_m) ∝ f_m(y_m | x, v_m). The covariance matrices are Σ_m = [σ²_{1,m}, ρ_m; ρ_m, σ²_{2,m}] with

σ_{1,1:M} = [1/2, 3/2, 1/4, 5/2, 3, 7/2, 3, 5/2, 2, 1],
σ_{2,1:M} = [1/3, 2/3, 1, 4/3, 5/3, 2, 7/3, 8/3, 3, 10/3],
ρ_{1:M} = [0, 1/10, 2/10, 3/10, 4/10, 5/10, 6/10, 7/10, 8/10, 9/10].

We set x = 1 and v_{1:M} = [−5, −4, −3, −2, −1, 0, 1, 2, 3, 4] as the true values of the parameters. Then, given these values, we generate (according to the model in Eq. (25)) different numbers of measurements for each partial likelihood, specifically d_{Y_{1:M}} = [2, 2, 50, 2, 5, 20, 5, 100, 2, 10]. Given a set Y = y_{1:M} of generated observations, in this toy example we can compute the MMSE estimator x̂ = 0.9 by a costly deterministic numerical procedure using a thin grid (approximating the marginal posterior and then the corresponding expected value). Thus, we apply PMIS in 400 independent runs and compare the obtained estimator x̄ with x̂ = 0.9, computing the corresponding MSE. We consider Gaussian functions φ for the kernel density estimation, with the optimal bandwidth suggested in [26]. For the proposal pdfs q_m of the MH algorithms, we employ standard Gaussian random walk proposals (with identity covariance matrix [1, 0; 0, 1]).
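A compact end-to-end sketch of this pipeline (random-walk MH chains, Gaussian KDE, IS fusion) on a reduced version of the experiment; M = 3, the common covariance, the data sizes and the Silverman rule-of-thumb bandwidth are assumptions made for the sketch, not the exact settings above:

```python
import numpy as np

rng = np.random.default_rng(2)

# Reduced, assumed setup: M = 3 partial targets pi_m(x, v_m | y_m) with
# bivariate Gaussian likelihoods of mean [x, v_m], cf. Eq. (25).
M, N, burn = 3, 1000, 500
x_true, v_true = 1.0, np.array([-1.0, 0.0, 1.0])
Sigma = np.array([[1.0, 0.2], [0.2, 1.0]])
P = np.linalg.inv(Sigma)
data = [rng.multivariate_normal([x_true, v_true[m]], Sigma, size=30)
        for m in range(M)]

def log_pi(m, theta):
    """Log partial posterior (flat prior): Gaussian log-likelihood of y_m."""
    d = data[m] - theta  # theta = [x, v_m]
    return -0.5 * np.einsum('ij,jk,ik->', d, P, d)

def rw_mh(m):
    """Random-walk Metropolis-Hastings chain targeting pi_m, cf. Eq. (24)."""
    theta, out = np.zeros(2), []
    for _ in range(N + burn):
        prop = theta + rng.normal(size=2)  # identity-covariance random walk
        if np.log(rng.uniform()) < log_pi(m, prop) - log_pi(m, theta):
            theta = prop
        out.append(theta.copy())
    return np.array(out[burn:])

xs = np.stack([rw_mh(m)[:, 0] for m in range(M)])  # x-samples, shape (M, N)

# Gaussian KDE of each marginal g_m(x | y_m), cf. Eqs. (15)/(18), with a
# Silverman rule-of-thumb bandwidth (an assumed, generally suboptimal choice).
bw = [1.06 * np.std(xs[m]) * N ** (-0.2) for m in range(M)]

def g_hat(k, x):
    z = (x[:, None] - xs[k][None, :]) / bw[k]
    return np.mean(np.exp(-0.5 * z**2), axis=1) / (bw[k] * np.sqrt(2 * np.pi))

# Approximate IS weights, cf. Eqs. (16)/(19), and fused estimator, Eq. (17).
w = np.ones((M, N))
for m in range(M):
    for k in range(M):
        if k != m:
            w[m] *= g_hat(k, xs[m])
x_bar = float(np.sum(w * xs) / np.sum(w))
```

With enough iterations per chain, x_bar concentrates near the true global parameter, while samples that fall in regions unsupported by the other marginals receive negligible weight.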
We test PMIS for different values of the length N of the chains, starting from N = 5. Furthermore, at each run, we also compute a trivial Monte Carlo approximation of x̂, given by

x̄_trivial = (1/M) ∑_{m=1}^M x̄_m, with x̄_m = (1/N) ∑_{n=1}^N x_m^(n),

where x̄_m is the Monte Carlo approximation of x̂ obtained using only the samples of the m-th chain. The results are shown in Figure 3, in terms of MSE versus N (the length of the chains, i.e., the number of MH iterations for each partial target). Three curves are shown, corresponding to the use of the standard IS weights (dashed line), the multiple IS weights (solid line) and the trivial solution (dotted-dashed line). We can observe that PMIS provides good results, outperforming the trivial solution for N > 40. This means that, for N ≤ 40, the samples generated by the MH methods still belong to the burn-in period and the convergence to the invariant pdf has not yet been reached. However, with an adequate number of iterations of the chains, PMIS provides good results. In this example, the standard IS and DM-MIS weights perform similarly (with a slight advantage for the DM-MIS weights).

Figure 3. MSE in the estimation of x̂ obtained by the Monte Carlo approximation x̄ of PMIS, using standard IS weights (dashed line), DM-MIS weights (solid line) and the trivial weights (dotted-dashed line) for the final fusion (semilog scale representation).

VI. CONCLUSIONS

In this work, we have introduced a novel Monte Carlo scheme in order to obtain an efficient distributed estimation. The new Bayesian method provides a simultaneous estimation of the local and global parameters. The estimation of the global parameters takes into account all the possible statistical information. The proposed algorithm is an importance sampler that assigns weights to the samples obtained by the application of parallel MCMC methods. Each MCMC chain addresses a different partial target distribution, considering only a subset of the measurements.
As a future line, we consider the possible design of an iterative implementation of the proposed scheme, where the proposal pdfs employed by the MCMC algorithms are adapted online, generating in this way an interaction among the parallel chains.

REFERENCES

[1] J. Tsitsiklis, D. Bertsekas, M. Athans, "Distributed asynchronous deterministic and stochastic gradient optimization algorithms," IEEE Transactions on Automatic Control, vol. 31, no. 9, 1986.
[2] F. S. Cattivelli, C. G. Lopes, A. H. Sayed, "Diffusion recursive least-squares for distributed estimation over adaptive networks," IEEE Transactions on Signal Processing, vol. 56, no. 5, 2008.
[3] F. S. Cattivelli and A. H. Sayed, "Diffusion LMS strategies for distributed estimation," IEEE Transactions on Signal Processing, vol. 58, no. 3, 2010.
[4] J. Plata-Chaves, N. Bogdanovic, K. Berberidis, "Distributed diffusion-based LMS for node-specific parameter estimation over adaptive networks," IEEE Transactions on Signal Processing, vol. 63, 2015.
[5] N. Bogdanovic, J. Plata-Chaves, K. Berberidis, "Distributed incremental-based LMS for node-specific adaptive parameter estimation," IEEE Transactions on Signal Processing, vol. 62, no. 20, 2014.
[6] S. L. Scott, A. W. Blocker, F. V. Bonassi, H. A. Chipman, E. I. George, R. E. McCulloch, "Bayes and big data: The consensus Monte Carlo algorithm," in EFaBBayes 250th conference, vol. 16, 2013.
[7] W. Neiswanger, C. Wang, E. Xing, "Asymptotically exact, embarrassingly parallel MCMC," arXiv:1311.4780, 2014.
[8] R. Bardenet, A. Doucet, C. Holmes, "On Markov chain Monte Carlo methods for tall data," arXiv preprint, 2015.
[9] C. P. Robert and G. Casella, Monte Carlo Statistical Methods, Springer.
[10] J. S. Liu, Monte Carlo Strategies in Scientific Computing, Springer.
[11] A. Doucet, N. de Freitas, N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, New York (USA), 2001.
[12] L. Martino and J. Míguez, "A generalization of the adaptive rejection sampling algorithm," Statistics and Computing, vol. 21, July 2011.
[13] W. J. Fitzgerald, "Markov chain Monte Carlo methods with applications to signal processing," Signal Processing, vol. 81, no. 1, pp. 3-18, January 2001.
[14] M. Hong, M. F. Bugallo, P. M. Djurić, "Joint model selection and parameter estimation by population Monte Carlo simulation," IEEE Journal of Selected Topics in Signal Processing, vol. 4, no. 3, 2010.
[15] P. M. Djurić, B. Shen, M. F. Bugallo, "Population Monte Carlo methodology a la Gibbs sampling," in EUSIPCO, 2011.
[16] P. Del Moral, A. Doucet, A. Jasra, "Sequential Monte Carlo samplers," Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 68, no. 3, 2006.
[17] E. Marinari and G. Parisi, "Simulated tempering: a new Monte Carlo scheme," Europhysics Letters, vol. 19, no. 6, July 1992.
[18] A. Jasra, D. A. Stephens, C. C. Holmes, "On population-based simulation for static inference," Statistics and Computing, vol. 17, no. 3, 2007.
[19] N. Chopin, "A sequential particle filter method for static models," Biometrika, vol. 89, no. 3, 2002.
[20] L. Martino, V. Elvira, D. Luengo, J. Corander, "Layered adaptive importance sampling," Statistics and Computing, to appear, 2016.
[21] L. Martino, V. Elvira, D. Luengo, A. Artés, J. Corander, "Orthogonal MCMC algorithms," IEEE Statistical Signal Processing Workshop (SSP), 2014.
[22] L. Martino, V. Elvira, D. Luengo, A. Artés, J. Corander, "Smelly parallel MCMC chains," IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2015.
[23] D. Luengo, L. Martino, V. Elvira, M. Bugallo, "Efficient linear combination of partial Monte Carlo estimators," IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2015.
[24] J. Plata-Chaves, N. Bogdanovic, K. Berberidis, "Distributed incremental-based RLS for node-specific parameter estimation over adaptive networks," in 21st European Signal Processing Conference (EUSIPCO 2013), 2013.
[25] J. Plata-Chaves, A. Bertrand, M. Moonen, "Distributed signal estimation in a wireless sensor network with partially-overlapping node-specific interests or source observability," in IEEE 40th International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2015), 2015.
[26] M. P. Wand and M. C. Jones, Kernel Smoothing, Chapman and Hall, 1994.
[27] V. Elvira, L. Martino, D. Luengo, M. F. Bugallo, "Efficient multiple importance sampling estimators," IEEE Signal Processing Letters, vol. 22, no. 10, 2015.
[28] V. Elvira, L. Martino, D. Luengo, M. F. Bugallo, "Generalized multiple importance sampling," arXiv preprint, 2015.
[29] L. Martino, V. Elvira, D. Luengo, J. Corander, "An adaptive population importance sampler: Learning from the uncertainty," IEEE Transactions on Signal Processing, vol. 63, no. 16, 2015.
[30] E. Veach and L. Guibas, "Optimally combining sampling techniques for Monte Carlo rendering," in SIGGRAPH 1995 Proceedings, 1995.
[31] A. Owen and Y. Zhou, "Safe and effective importance sampling," Journal of the American Statistical Association, vol. 95, no. 449, 2000.

Pseudo-marginal Metropolis-Hastings: a simple explanation and (partial) review of theory

Pseudo-marginal Metropolis-Hastings: a simple explanation and (partial) review of theory Pseudo-arginal Metropolis-Hastings: a siple explanation and (partial) review of theory Chris Sherlock Motivation Iagine a stochastic process V which arises fro soe distribution with density p(v θ ). Iagine

More information

Estimating Parameters for a Gaussian pdf

Estimating Parameters for a Gaussian pdf Pattern Recognition and achine Learning Jaes L. Crowley ENSIAG 3 IS First Seester 00/0 Lesson 5 7 Noveber 00 Contents Estiating Paraeters for a Gaussian pdf Notation... The Pattern Recognition Proble...3

More information

Non-Parametric Non-Line-of-Sight Identification 1

Non-Parametric Non-Line-of-Sight Identification 1 Non-Paraetric Non-Line-of-Sight Identification Sinan Gezici, Hisashi Kobayashi and H. Vincent Poor Departent of Electrical Engineering School of Engineering and Applied Science Princeton University, Princeton,

More information

CS Lecture 13. More Maximum Likelihood

CS Lecture 13. More Maximum Likelihood CS 6347 Lecture 13 More Maxiu Likelihood Recap Last tie: Introduction to axiu likelihood estiation MLE for Bayesian networks Optial CPTs correspond to epirical counts Today: MLE for CRFs 2 Maxiu Likelihood

More information

An Improved Particle Filter with Applications in Ballistic Target Tracking

An Improved Particle Filter with Applications in Ballistic Target Tracking Sensors & ransducers Vol. 72 Issue 6 June 204 pp. 96-20 Sensors & ransducers 204 by IFSA Publishing S. L. http://www.sensorsportal.co An Iproved Particle Filter with Applications in Ballistic arget racing

More information

Identical Maximum Likelihood State Estimation Based on Incremental Finite Mixture Model in PHD Filter

Identical Maximum Likelihood State Estimation Based on Incremental Finite Mixture Model in PHD Filter Identical Maxiu Lielihood State Estiation Based on Increental Finite Mixture Model in PHD Filter Gang Wu Eail: xjtuwugang@gail.co Jing Liu Eail: elelj20080730@ail.xjtu.edu.cn Chongzhao Han Eail: czhan@ail.xjtu.edu.cn

More information

Effective joint probabilistic data association using maximum a posteriori estimates of target states

Effective joint probabilistic data association using maximum a posteriori estimates of target states Effective joint probabilistic data association using axiu a posteriori estiates of target states 1 Viji Paul Panakkal, 2 Rajbabu Velurugan 1 Central Research Laboratory, Bharat Electronics Ltd., Bangalore,

More information

Experimental Design For Model Discrimination And Precise Parameter Estimation In WDS Analysis

Experimental Design For Model Discrimination And Precise Parameter Estimation In WDS Analysis City University of New York (CUNY) CUNY Acadeic Works International Conference on Hydroinforatics 8-1-2014 Experiental Design For Model Discriination And Precise Paraeter Estiation In WDS Analysis Giovanna

More information

Distributed Subgradient Methods for Multi-agent Optimization

Distributed Subgradient Methods for Multi-agent Optimization 1 Distributed Subgradient Methods for Multi-agent Optiization Angelia Nedić and Asuan Ozdaglar October 29, 2007 Abstract We study a distributed coputation odel for optiizing a su of convex objective functions

More information

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are,

are equal to zero, where, q = p 1. For each gene j, the pairwise null and alternative hypotheses are, Page of 8 Suppleentary Materials: A ultiple testing procedure for ulti-diensional pairwise coparisons with application to gene expression studies Anjana Grandhi, Wenge Guo, Shyaal D. Peddada S Notations

More information

Inspection; structural health monitoring; reliability; Bayesian analysis; updating; decision analysis; value of information

Inspection; structural health monitoring; reliability; Bayesian analysis; updating; decision analysis; value of information Cite as: Straub D. (2014). Value of inforation analysis with structural reliability ethods. Structural Safety, 49: 75-86. Value of Inforation Analysis with Structural Reliability Methods Daniel Straub

More information

Tracking using CONDENSATION: Conditional Density Propagation

Tracking using CONDENSATION: Conditional Density Propagation Tracking using CONDENSATION: Conditional Density Propagation Goal Model-based visual tracking in dense clutter at near video frae rates M. Isard and A. Blake, CONDENSATION Conditional density propagation

More information

SPECTRUM sensing is a core concept of cognitive radio

SPECTRUM sensing is a core concept of cognitive radio World Acadey of Science, Engineering and Technology International Journal of Electronics and Counication Engineering Vol:6, o:2, 202 Efficient Detection Using Sequential Probability Ratio Test in Mobile

More information

Asynchronous Gossip Algorithms for Stochastic Optimization

Asynchronous Gossip Algorithms for Stochastic Optimization Asynchronous Gossip Algoriths for Stochastic Optiization S. Sundhar Ra ECE Dept. University of Illinois Urbana, IL 680 ssrini@illinois.edu A. Nedić IESE Dept. University of Illinois Urbana, IL 680 angelia@illinois.edu

More information

Probability Distributions

Probability Distributions Probability Distributions In Chapter, we ephasized the central role played by probability theory in the solution of pattern recognition probles. We turn now to an exploration of soe particular exaples

More information

A Simple Regression Problem

A Simple Regression Problem A Siple Regression Proble R. M. Castro March 23, 2 In this brief note a siple regression proble will be introduced, illustrating clearly the bias-variance tradeoff. Let Y i f(x i ) + W i, i,..., n, where

More information

Warning System of Dangerous Chemical Gas in Factory Based on Wireless Sensor Network

Warning System of Dangerous Chemical Gas in Factory Based on Wireless Sensor Network 565 A publication of CHEMICAL ENGINEERING TRANSACTIONS VOL. 59, 07 Guest Editors: Zhuo Yang, Junie Ba, Jing Pan Copyright 07, AIDIC Servizi S.r.l. ISBN 978-88-95608-49-5; ISSN 83-96 The Italian Association

More information

Intelligent Systems: Reasoning and Recognition. Perceptrons and Support Vector Machines

Intelligent Systems: Reasoning and Recognition. Perceptrons and Support Vector Machines Intelligent Systes: Reasoning and Recognition Jaes L. Crowley osig 1 Winter Seester 2018 Lesson 6 27 February 2018 Outline Perceptrons and Support Vector achines Notation...2 Linear odels...3 Lines, Planes

More information

A method to determine relative stroke detection efficiencies from multiplicity distributions

A method to determine relative stroke detection efficiencies from multiplicity distributions A ethod to deterine relative stroke detection eiciencies ro ultiplicity distributions Schulz W. and Cuins K. 2. Austrian Lightning Detection and Inoration Syste (ALDIS), Kahlenberger Str.2A, 90 Vienna,

More information

DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS

DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS ISSN 1440-771X AUSTRALIA DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS An Iproved Method for Bandwidth Selection When Estiating ROC Curves Peter G Hall and Rob J Hyndan Working Paper 11/00 An iproved

More information

Bayes Decision Rule and Naïve Bayes Classifier

Bayes Decision Rule and Naïve Bayes Classifier Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.

More information

Using EM To Estimate A Probablity Density With A Mixture Of Gaussians

Using EM To Estimate A Probablity Density With A Mixture Of Gaussians Using EM To Estiate A Probablity Density With A Mixture Of Gaussians Aaron A. D Souza adsouza@usc.edu Introduction The proble we are trying to address in this note is siple. Given a set of data points

More information

Bayesian Approach for Fatigue Life Prediction from Field Inspection

Bayesian Approach for Fatigue Life Prediction from Field Inspection Bayesian Approach for Fatigue Life Prediction fro Field Inspection Dawn An and Jooho Choi School of Aerospace & Mechanical Engineering, Korea Aerospace University, Goyang, Seoul, Korea Srira Pattabhiraan

More information

J11.3 STOCHASTIC EVENT RECONSTRUCTION OF ATMOSPHERIC CONTAMINANT DISPERSION

J11.3 STOCHASTIC EVENT RECONSTRUCTION OF ATMOSPHERIC CONTAMINANT DISPERSION J11.3 STOCHASTIC EVENT RECONSTRUCTION OF ATMOSPHERIC CONTAMINANT DISPERSION Inanc Senocak 1*, Nicolas W. Hengartner, Margaret B. Short 3, and Brent W. Daniel 1 Boise State University, Boise, ID, Los Alaos

More information

Block designs and statistics

Block designs and statistics Bloc designs and statistics Notes for Math 447 May 3, 2011 The ain paraeters of a bloc design are nuber of varieties v, bloc size, nuber of blocs b. A design is built on a set of v eleents. Each eleent

More information

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization Recent Researches in Coputer Science Support Vector Machine Classification of Uncertain and Ibalanced data using Robust Optiization RAGHAV PAT, THEODORE B. TRAFALIS, KASH BARKER School of Industrial Engineering

More information

Supplementary to Learning Discriminative Bayesian Networks from High-dimensional Continuous Neuroimaging Data

Supplementary to Learning Discriminative Bayesian Networks from High-dimensional Continuous Neuroimaging Data Suppleentary to Learning Discriinative Bayesian Networks fro High-diensional Continuous Neuroiaging Data Luping Zhou, Lei Wang, Lingqiao Liu, Philip Ogunbona, and Dinggang Shen Proposition. Given a sparse

More information

Quantum algorithms (CO 781, Winter 2008) Prof. Andrew Childs, University of Waterloo LECTURE 15: Unstructured search and spatial search

Quantum algorithms (CO 781, Winter 2008) Prof. Andrew Childs, University of Waterloo LECTURE 15: Unstructured search and spatial search Quantu algoriths (CO 781, Winter 2008) Prof Andrew Childs, University of Waterloo LECTURE 15: Unstructured search and spatial search ow we begin to discuss applications of quantu walks to search algoriths

More information

A Poisson process reparameterisation for Bayesian inference for extremes

A Poisson process reparameterisation for Bayesian inference for extremes A Poisson process reparaeterisation for Bayesian inference for extrees Paul Sharkey and Jonathan A. Tawn Eail: p.sharkey1@lancs.ac.uk, j.tawn@lancs.ac.uk STOR-i Centre for Doctoral Training, Departent

More information

Introduction to Machine Learning. Recitation 11

Introduction to Machine Learning. Recitation 11 Introduction to Machine Learning Lecturer: Regev Schweiger Recitation Fall Seester Scribe: Regev Schweiger. Kernel Ridge Regression We now take on the task of kernel-izing ridge regression. Let x,...,

More information

Weighted- 1 minimization with multiple weighting sets

Weighted- 1 minimization with multiple weighting sets Weighted- 1 iniization with ultiple weighting sets Hassan Mansour a,b and Özgür Yılaza a Matheatics Departent, University of British Colubia, Vancouver - BC, Canada; b Coputer Science Departent, University

More information

Boosting with log-loss

Boosting with log-loss Boosting with log-loss Marco Cusuano-Towner Septeber 2, 202 The proble Suppose we have data exaples {x i, y i ) i =... } for a two-class proble with y i {, }. Let F x) be the predictor function with the

More information

On the Communication Complexity of Lipschitzian Optimization for the Coordinated Model of Computation

On the Communication Complexity of Lipschitzian Optimization for the Coordinated Model of Computation journal of coplexity 6, 459473 (2000) doi:0.006jco.2000.0544, available online at http:www.idealibrary.co on On the Counication Coplexity of Lipschitzian Optiization for the Coordinated Model of Coputation

More information

TEST OF HOMOGENEITY OF PARALLEL SAMPLES FROM LOGNORMAL POPULATIONS WITH UNEQUAL VARIANCES

S. E. Ahmed, R. J. Tomkins and A. I. Volodin, Department of Mathematics and Statistics, University of Regina, Regina,

Machine Learning Basics: Estimators, Bias and Variance

Sargur N. Srihari, srihari@cedar.buffalo.edu. This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/CSE676. Topics in Basics
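As a concrete companion to the estimator bias/variance topic, here is a quick Monte Carlo check (my own example, not from the slides) of the classic fact that the divide-by-n variance estimator is biased by −σ²/n while the divide-by-(n−1) version is unbiased:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, sigma2 = 10, 200_000, 4.0
x = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))

mle = x.var(axis=1, ddof=0)    # divide by n: the ML estimator, biased
unb = x.var(axis=1, ddof=1)    # divide by n-1: unbiased

bias_mle = float(mle.mean() - sigma2)   # theory: -sigma2/n = -0.4
bias_unb = float(unb.mean() - sigma2)   # theory: 0
```

Averaging the estimator over many simulated samples and subtracting the true parameter is the simplest way to see bias empirically.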

ASSUME a source over an alphabet size m, from which a sequence of n independent samples are drawn. The classical

IEEE TRANSACTIONS ON INFORMATION THEORY. Large Alphabet Source Coding using Independent Component Analysis. Amichai Painsky, Member, IEEE, Saharon Rosset and Meir Feder, Fellow, IEEE. arXiv:67.7v [cs.IT] Jul

AN OPTIMAL SHRINKAGE FACTOR IN PREDICTION OF ORDERED RANDOM EFFECTS

Statistica Sinica 26 (2016), 1709-1728, doi: http://dx.doi.org/10.5705/ss.2014.0034. Nilabja Guha, Anindya Roy, Yaakov Malinovsky and Gauri

Fast Montgomery-like Square Root Computation over GF(2 m ) for All Trinomials

Fast Montgomery-like Square Root Computation over GF(2 m ) for All Trinomials Fast Montgoery-like Square Root Coputation over GF( ) for All Trinoials Yin Li a, Yu Zhang a, a Departent of Coputer Science and Technology, Xinyang Noral University, Henan, P.R.China Abstract This letter

More information

A Low-Complexity Congestion Control and Scheduling Algorithm for Multihop Wireless Networks with Order-Optimal Per-Flow Delay

A Low-Complexity Congestion Control and Scheduling Algorithm for Multihop Wireless Networks with Order-Optimal Per-Flow Delay A Low-Coplexity Congestion Control and Scheduling Algorith for Multihop Wireless Networks with Order-Optial Per-Flow Delay Po-Kai Huang, Xiaojun Lin, and Chih-Chun Wang School of Electrical and Coputer

More information

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks Intelligent Systes: Reasoning and Recognition Jaes L. Crowley MOSIG M1 Winter Seester 2018 Lesson 7 1 March 2018 Outline Artificial Neural Networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

TWO-STAGE SAMPLE DESIGN WITH SMALL CLUSTERS. 1. Introduction

C na (1) a=l. c = CO + Clm + CZ TWO-STAGE SAMPLE DESIGN WITH SMALL CLUSTERS. 1. Introduction TWO-STGE SMPLE DESIGN WITH SMLL CLUSTERS Robert G. Clark and David G. Steel School of Matheatics and pplied Statistics, University of Wollongong, NSW 5 ustralia. (robert.clark@abs.gov.au) Key Words: saple

More information

COS 424: Interacting with Data. Written Exercises

Homework #4, Spring 2007: Regression. Due: Wednesday, April 18. Written Exercises. See the course website for important information about collaboration and late policies, as well

State Estimation Problem for the Action Potential Modeling in Purkinje Fibers

State Estimation Problem for the Action Potential Modeling in Purkinje Fibers APCOM & ISCM -4 th Deceber, 203, Singapore State Estiation Proble for the Action Potential Modeling in Purinje Fibers *D. C. Estuano¹, H. R. B.Orlande and M. J.Colaço Federal University of Rio de Janeiro

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2016 Lessons 7 14 Dec 2016 Outline Artificial Neural networks Notation...2 1. Introduction...3... 3 The Artificial

More information

Bayesian inference for stochastic differential mixed effects models - initial steps

Bayesian inference for stochastic differential mixed effects models - initial steps Bayesian inference for stochastic differential ixed effects odels - initial steps Gavin Whitaker 2nd May 2012 Supervisors: RJB and AG Outline Mixed Effects Stochastic Differential Equations (SDEs) Bayesian

More information

Ch 12: Variations on Backpropagation

Ch 12: Variations on Backpropagation Ch 2: Variations on Backpropagation The basic backpropagation algorith is too slow for ost practical applications. It ay take days or weeks of coputer tie. We deonstrate why the backpropagation algorith

More information
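One of the standard speed-ups such a chapter discusses is gradient descent with momentum; a toy sketch (illustrative, not the book's code) on an ill-conditioned quadratic, where the momentum term damps the oscillation along the steep axis and accelerates the slow axis:

```python
import numpy as np

def gd(grad, w0, lr, steps, momentum=0.0):
    # steepest descent with an optional momentum ("heavy ball") term
    w = np.array(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        v = momentum * v - lr * grad(w)
        w = w + v
    return w

# ill-conditioned quadratic f(w) = 0.5*(w1^2 + 50*w2^2): gradient is (w1, 50*w2)
grad = lambda w: np.array([w[0], 50.0 * w[1]])

plain = gd(grad, [1.0, 1.0], lr=0.02, steps=200)
heavy = gd(grad, [1.0, 1.0], lr=0.02, steps=200, momentum=0.9)
```

After the same number of steps, the momentum iterate is far closer to the minimum at the origin than plain steepest descent.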

1 Bounding the Margin

1 Bounding the Margin COS 511: Theoretical Machine Learning Lecturer: Rob Schapire Lecture #12 Scribe: Jian Min Si March 14, 2013 1 Bounding the Margin We are continuing the proof of a bound on the generalization error of AdaBoost

More information

Sharp Time Data Tradeoffs for Linear Inverse Problems

Sharp Time Data Tradeoffs for Linear Inverse Problems Sharp Tie Data Tradeoffs for Linear Inverse Probles Saet Oyak Benjain Recht Mahdi Soltanolkotabi January 016 Abstract In this paper we characterize sharp tie-data tradeoffs for optiization probles used

More information

Biostatistics Department Technical Report

Biostatistics Department Technical Report Biostatistics Departent Technical Report BST006-00 Estiation of Prevalence by Pool Screening With Equal Sized Pools and a egative Binoial Sapling Model Charles R. Katholi, Ph.D. Eeritus Professor Departent

More information

Bootstrapping Dependent Data

Bootstrapping Dependent Data Bootstrapping Dependent Data One of the key issues confronting bootstrap resapling approxiations is how to deal with dependent data. Consider a sequence fx t g n t= of dependent rando variables. Clearly

More information
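A common answer to the dependence issue raised above is the moving block bootstrap, which resamples contiguous blocks so that short-range dependence is preserved inside each block; a minimal sketch (the block length, AR(1) data, and statistic are illustrative choices, not from the text):

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng):
    # resample overlapping blocks and concatenate until length n is reached
    n = len(x)
    n_blocks = -(-n // block_len)  # ceiling division
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

rng = np.random.default_rng(0)
n = 500
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):            # AR(1) series: strong positive dependence
    x[t] = 0.7 * x[t - 1] + e[t]

# bootstrap distribution of the sample mean, respecting the dependence
reps = np.array([moving_block_bootstrap(x, 25, rng).mean() for _ in range(500)])
se_hat = float(reps.std())
```

An i.i.d. bootstrap on the same series would understate the standard error of the mean, because positive autocorrelation inflates the variance of the sample mean.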

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE Proceeding of the ASME 9 International Manufacturing Science and Engineering Conference MSEC9 October 4-7, 9, West Lafayette, Indiana, USA MSEC9-8466 MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL

More information

Pattern Recognition and Machine Learning. Artificial Neural networks

Pattern Recognition and Machine Learning. Artificial Neural networks Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lessons 7 20 Dec 2017 Outline Artificial Neural networks Notation...2 Introduction...3 Key Equations... 3 Artificial

More information

Recursive Algebraic Frisch Scheme: a Particle-Based Approach

Recursive Algebraic Frisch Scheme: a Particle-Based Approach Recursive Algebraic Frisch Schee: a Particle-Based Approach Stefano Massaroli Renato Myagusuku Federico Califano Claudio Melchiorri Atsushi Yaashita Hajie Asaa Departent of Precision Engineering, The University

More information

Extension of CSRSM for the Parametric Study of the Face Stability of Pressurized Tunnels

Guilhem Mollon 1, Daniel Dias 2, and Abdul-Hamid Soubra 3, M.ASCE. 1 LGCIE, INSA Lyon, Université de Lyon, Domaine scientifique

Keywords: Estimator, Bias, Mean-squared error, normality, generalized Pareto distribution

Keywords: Estimator, Bias, Mean-squared error, normality, generalized Pareto distribution Testing approxiate norality of an estiator using the estiated MSE and bias with an application to the shape paraeter of the generalized Pareto distribution J. Martin van Zyl Abstract In this work the norality

More information

Probabilistic Machine Learning

Probabilistic Machine Learning Probabilistic Machine Learning by Prof. Seungchul Lee isystes Design Lab http://isystes.unist.ac.kr/ UNIST Table of Contents I.. Probabilistic Linear Regression I... Maxiu Likelihood Solution II... Maxiu-a-Posteriori

More information

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions Journal of Matheatical Research with Applications Jul., 207, Vol. 37, No. 4, pp. 496 504 DOI:0.3770/j.issn:2095-265.207.04.0 Http://jre.dlut.edu.cn An l Regularized Method for Nuerical Differentiation

More information

UNIVERSITY OF TRENTO ON THE USE OF SVM FOR ELECTROMAGNETIC SUBSURFACE SENSING. A. Boni, M. Conci, A. Massa, and S. Piffer.

UNIVERSITY OF TRENTO ON THE USE OF SVM FOR ELECTROMAGNETIC SUBSURFACE SENSING. A. Boni, M. Conci, A. Massa, and S. Piffer. UIVRSITY OF TRTO DIPARTITO DI IGGRIA SCIZA DLL IFORAZIO 3823 Povo Trento (Italy) Via Soarive 4 http://www.disi.unitn.it O TH US OF SV FOR LCTROAGTIC SUBSURFAC SSIG A. Boni. Conci A. assa and S. Piffer

More information

Support recovery in compressed sensing: An estimation theoretic approach

Support recovery in compressed sensing: An estimation theoretic approach Support recovery in copressed sensing: An estiation theoretic approach Ain Karbasi, Ali Horati, Soheil Mohajer, Martin Vetterli School of Coputer and Counication Sciences École Polytechnique Fédérale de

More information

Paul M. Goggans Department of Electrical Engineering, University of Mississippi, Anderson Hall, University, Mississippi 38677

Paul M. Goggans Department of Electrical Engineering, University of Mississippi, Anderson Hall, University, Mississippi 38677 Evaluation of decay ties in coupled spaces: Bayesian decay odel selection a),b) Ning Xiang c) National Center for Physical Acoustics and Departent of Electrical Engineering, University of Mississippi,

More information

Feature Extraction Techniques

Feature Extraction Techniques Feature Extraction Techniques Unsupervised Learning II Feature Extraction Unsupervised ethods can also be used to find features which can be useful for categorization. There are unsupervised ethods that

More information
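One widely used unsupervised feature-extraction method of the kind the excerpt mentions is PCA; a minimal sketch (the synthetic data and function names are my own):

```python
import numpy as np

def pca_features(X, k):
    # project centered data onto the top-k principal directions (via SVD)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
# two informative directions embedded in 10-D, plus small isotropic noise
Z = rng.normal(size=(300, 2)) * np.array([5.0, 3.0])
W = rng.normal(size=(2, 10))
X = Z @ W + 0.1 * rng.normal(size=(300, 10))

F = pca_features(X, 2)
Xc = X - X.mean(axis=0)
explained = float(np.var(F, axis=0).sum() / np.var(Xc, axis=0).sum())
```

When the data really lie near a low-dimensional subspace, a handful of principal components captures almost all of the variance, so the extracted features are useful inputs for later categorization.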

On the theoretical analysis of cross validation in compressive sensing

On the theoretical analysis of cross validation in compressive sensing MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.erl.co On the theoretical analysis of cross validation in copressive sensing Zhang, J.; Chen, L.; Boufounos, P.T.; Gu, Y. TR2014-025 May 2014 Abstract

More information

Optimal nonlinear Bayesian experimental design: an application to amplitude versus offset experiments

Optimal nonlinear Bayesian experimental design: an application to amplitude versus offset experiments Geophys. J. Int. (23) 155, 411 421 Optial nonlinear Bayesian experiental design: an application to aplitude versus offset experients Jojanneke van den Berg, 1, Andrew Curtis 2,3 and Jeannot Trapert 1 1

More information

IN modern society, where various systems have become more

IN modern society that various systems have become more Developent of Reliability Function in -Coponent Standby Redundant Syste with Priority Based on Maxiu Entropy Principle Ryosuke Hirata, Ikuo Arizono, Ryosuke Toohiro, Satoshi Oigawa, and Yasuhiko Takeoto

More information

On Hyper-Parameter Estimation in Empirical Bayes: A Revisit of the MacKay Algorithm

On Hyper-Parameter Estimation in Empirical Bayes: A Revisit of the MacKay Algorithm On Hyper-Paraeter Estiation in Epirical Bayes: A Revisit of the MacKay Algorith Chune Li 1, Yongyi Mao, Richong Zhang 1 and Jinpeng Huai 1 1 School of Coputer Science and Engineering, Beihang University,

More information

In this chapter, we consider several graph-theoretic and probabilistic models

In this chapter, we consider several graph-theoretic and probabilistic models THREE ONE GRAPH-THEORETIC AND STATISTICAL MODELS 3.1 INTRODUCTION In this chapter, we consider several graph-theoretic and probabilistic odels for a social network, which we do under different assuptions

More information

Tail Estimation of the Spectral Density under Fixed-Domain Asymptotics

Tail Estimation of the Spectral Density under Fixed-Domain Asymptotics Tail Estiation of the Spectral Density under Fixed-Doain Asyptotics Wei-Ying Wu, Chae Young Li and Yiin Xiao Wei-Ying Wu, Departent of Statistics & Probability Michigan State University, East Lansing,

More information

PAC-Bayes Analysis Of Maximum Entropy Learning

PAC-Bayes Analysis Of Maximum Entropy Learning PAC-Bayes Analysis Of Maxiu Entropy Learning John Shawe-Taylor and David R. Hardoon Centre for Coputational Statistics and Machine Learning Departent of Coputer Science University College London, UK, WC1E

More information

A Note on the Applied Use of MDL Approximations

Daniel J. Navarro, Department of Psychology, Ohio State University. Abstract: An applied problem is discussed in which two nested psychological models of retention

A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION

A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION A eshsize boosting algorith in kernel density estiation A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION C.C. Ishiekwene, S.M. Ogbonwan and J.E. Osewenkhae Departent of Matheatics, University

More information

Detection and Estimation Theory

Detection and Estimation Theory ESE 54 Detection and Estiation Theory Joseph A. O Sullivan Sauel C. Sachs Professor Electronic Systes and Signals Research Laboratory Electrical and Systes Engineering Washington University 11 Urbauer

More information

RANDOM GRADIENT EXTRAPOLATION FOR DISTRIBUTED AND STOCHASTIC OPTIMIZATION

GUANGHUI LAN AND YI ZHOU. Abstract. In this paper, we consider a class of finite-sum convex optimization problems defined over a distributed

Average Consensus and Gossip Algorithms in Networks with Stochastic Asymmetric Communications

Average Consensus and Gossip Algorithms in Networks with Stochastic Asymmetric Communications Average Consensus and Gossip Algoriths in Networks with Stochastic Asyetric Counications Duarte Antunes, Daniel Silvestre, Carlos Silvestre Abstract We consider that a set of distributed agents desire

More information
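The consensus setting in the excerpt can be illustrated with the textbook linear iteration x ← W x for a doubly stochastic weight matrix on a ring (a generic sketch of average consensus, not this paper's stochastic asymmetric protocol):

```python
import numpy as np

# ring of 5 agents; doubly stochastic weights guarantee convergence to the average
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

x = np.array([1.0, 4.0, 2.0, 8.0, 5.0])
target = x.mean()
for _ in range(200):
    x = W @ x   # each agent averages with its two neighbours
```

Because W is doubly stochastic and the ring is connected, every agent's value converges to the network-wide average; asymmetric or randomized communication, as studied in the paper, requires more care precisely because double stochasticity can be lost.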

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair Proceedings of the 6th SEAS International Conference on Siulation, Modelling and Optiization, Lisbon, Portugal, Septeber -4, 006 0 A Siplified Analytical Approach for Efficiency Evaluation of the eaving

More information

Support Vector Machines. Machine Learning Series Jerry Jeychandra Blohm Lab

Support Vector Machines. Machine Learning Series Jerry Jeychandra Blohm Lab Support Vector Machines Machine Learning Series Jerry Jeychandra Bloh Lab Outline Main goal: To understand how support vector achines (SVMs) perfor optial classification for labelled data sets, also a

More information

Classical and Bayesian Inference for an Extension of the Exponential Distribution under Progressive Type-II Censored Data with Binomial Removals

Classical and Bayesian Inference for an Extension of the Exponential Distribution under Progressive Type-II Censored Data with Binomial Removals J. Stat. Appl. Pro. Lett. 1, No. 3, 75-86 (2014) 75 Journal of Statistics Applications & Probability Letters An International Journal http://dx.doi.org/10.12785/jsapl/010304 Classical and Bayesian Inference

More information

Vulnerability of MRD-Code-Based Universal Secure Error-Correcting Network Codes under Time-Varying Jamming Links

Jun Kurihara, KDDI R&D Laboratories, Inc., 2 5 Ohara, Fujimino, Saitama 356-8502, Japan. Email: kurihara@kddilabs.jp

A Self-Organizing Model for Logical Regression Jerry Farlow 1 University of Maine. (1900 words)

Contact: Jerry Farlow, Dept of Mathematics, University of Maine, Orono, ME 04469. Tel (207) 866-3540. Email: farlow@math.umaine.edu

W-BASED VS LATENT VARIABLES SPATIAL AUTOREGRESSIVE MODELS: EVIDENCE FROM MONTE CARLO SIMULATIONS

Introduction. When it comes to applying econometric models to analyze georeferenced data, researchers are well

DERIVING PROPER UNIFORM PRIORS FOR REGRESSION COEFFICIENTS

N. van Erp and P. van Gelder, Structural Hydraulic and Probabilistic Design, TU Delft, Delft, The Netherlands. Abstract. In problems of model comparison

RESOLUTION ISSUES IN SOUNDFIELD IMAGING: A MULTIRESOLUTION APPROACH TO MULTIPLE SOURCE LOCALIZATION

RESOLUTION ISSUES IN SOUNDFIELD IMAGING: A MULTIRESOLUTION APPROACH TO MULTIPLE SOURCE LOCALIZATION IEEE Workshop on Applications of Signal Processing to Audio and Acoustics October 8-,, New Paltz, NY RESOLUTION ISSUES IN SOUNDFIELD IMAGING: A MULTIRESOLUTION APPROACH TO MULTIPLE SOURCE LOCALIZATION

More information

Correlated Bayesian Model Fusion: Efficient Performance Modeling of Large-Scale Tunable Analog/RF Integrated Circuits

Correlated Bayesian Model Fusion: Efficient Performance Modeling of Large-Scale Tunable Analog/RF Integrated Circuits Correlated Bayesian odel Fusion: Efficient Perforance odeling of Large-Scale unable Analog/RF Integrated Circuits Fa Wang and Xin Li ECE Departent, Carnegie ellon University, Pittsburgh, PA 53 {fwang,

More information

This model assumes that the probability of a gap of size i is proportional to 1/i; then E[gap size] = Σ_i i · Pr(i) = N/f_t.

This model assumes that the probability of a gap has size i is proportional to 1/i. i.e., i log m e. j=1. E[gap size] = i P r(i) = N f t. CS 493: Algoriths for Massive Data Sets Feb 2, 2002 Local Models, Bloo Filter Scribe: Qin Lv Local Models In global odels, every inverted file entry is copressed with the sae odel. This work wells when

More information
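The lecture also covers Bloom filters; a minimal sketch of one (the bit-array size, hash count, and hashing scheme are my own choices):

```python
import hashlib

class BloomFilter:
    # bit array of m bits, k hash positions per item
    def __init__(self, m_bits=4096, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = 0  # a Python big int used as the bit array

    def _positions(self, item):
        # derive k positions by salting the item with the hash index
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        # False means definitely absent; True means present or a (rare) false positive
        return all((self.bits >> p) & 1 for p in self._positions(item))

bf = BloomFilter()
for word in ["gap", "inverted", "index"]:
    bf.add(word)
```

The structure never reports a stored item as absent, and the false-positive rate is controlled by the ratio of bits to items and the number of hashes.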

Department of Electronic and Optical Engineering, Ordnance Engineering College, Shijiazhuang, , China

Department of Electronic and Optical Engineering, Ordnance Engineering College, Shijiazhuang, , China 6th International Conference on Machinery, Materials, Environent, Biotechnology and Coputer (MMEBC 06) Solving Multi-Sensor Multi-Target Assignent Proble Based on Copositive Cobat Efficiency and QPSO Algorith

More information

Smoothing Framework for Automatic Track Initiation in Clutter

Smoothing Framework for Automatic Track Initiation in Clutter Soothing Fraework for Autoatic Track Initiation in Clutter Rajib Chakravorty Networked Sensor Technology (NeST) Laboratory Faculty of Engineering University of Technology, Sydney Broadway -27,Sydney, NSW

More information

Kernel Methods and Support Vector Machines

Kernel Methods and Support Vector Machines Intelligent Systes: Reasoning and Recognition Jaes L. Crowley ENSIAG 2 / osig 1 Second Seester 2012/2013 Lesson 20 2 ay 2013 Kernel ethods and Support Vector achines Contents Kernel Functions...2 Quadratic

More information

Lecture 12: Ensemble Methods. Introduction. Weighted Majority. Mixture of Experts/Committee. Σ_k α_k = 1. Isabelle Guyon

Lecture 12: Ensemble Methods. Introduction. Weighted Majority. Mixture of Experts/Committee. Σ k α k =1. Isabelle Guyon Lecture 2: Enseble Methods Isabelle Guyon guyoni@inf.ethz.ch Introduction Book Chapter 7 Weighted Majority Mixture of Experts/Coittee Assue K experts f, f 2, f K (base learners) x f (x) Each expert akes

More information
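The weighted committee with weights summing to 1 can be written in a few lines; a generic sketch (the experts here are just fixed prediction vectors, not trained base learners):

```python
import numpy as np

def weighted_committee(predictions, alphas):
    # predictions: (K, n) array of K expert outputs in {-1, +1}; weights sum to 1
    alphas = np.asarray(alphas, dtype=float)
    assert np.isclose(alphas.sum(), 1.0)
    return np.sign(alphas @ np.asarray(predictions, dtype=float))

preds = [
    [ 1,  1, -1, 1],   # expert 1
    [ 1, -1, -1, 1],   # expert 2
    [-1,  1,  1, 1],   # expert 3
]
combined = weighted_committee(preds, [0.5, 0.3, 0.2])
```

Taking the sign of the weighted sum is exactly weighted majority voting; with learned weights it is the decision rule used by boosting-style ensembles.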

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Course Notes for EE7C (Spring 018: Convex Optiization and Approxiation Instructor: Moritz Hardt Eail: hardt+ee7c@berkeley.edu Graduate Instructor: Max Sichowitz Eail: sichow+ee7c@berkeley.edu October 15,

More information

Order Recursion Introduction Order versus Time Updates Matrix Inversion by Partitioning Lemma Levinson Algorithm Interpretations Examples

Order Recursion Introduction Order versus Time Updates Matrix Inversion by Partitioning Lemma Levinson Algorithm Interpretations Examples Order Recursion Introduction Order versus Tie Updates Matrix Inversion by Partitioning Lea Levinson Algorith Interpretations Exaples Introduction Rc d There are any ways to solve the noral equations Solutions

More information
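For normal equations R c = d with a symmetric Toeplitz R of autocorrelations, the Levinson(-Durbin) recursion solves the Yule-Walker special case d = (r_1, ..., r_p) in O(p²) rather than the O(p³) of a dense solve; a compact sketch (my own implementation, checked against a dense solve):

```python
import numpy as np

def levinson_durbin(r):
    # solve the Yule-Walker normal equations T a = (r_1, ..., r_p),
    # where T[i, j] = r[|i - j|] is symmetric Toeplitz, in O(p^2)
    p = len(r) - 1
    a = np.zeros(p)
    e = r[0]                       # prediction-error power
    for k in range(p):
        lam = (r[k + 1] - a[:k] @ r[k:0:-1]) / e   # reflection coefficient
        a[:k] = a[:k] - lam * a[:k][::-1]          # update lower-order coefficients
        a[k] = lam
        e *= 1.0 - lam * lam
    return a, e

r = np.array([2.0, 1.2, 0.6, 0.2])
a, e = levinson_durbin(r)

# check against a dense solve of the same Toeplitz system
T = np.array([[r[abs(i - j)] for j in range(3)] for i in range(3)])
direct = np.linalg.solve(T, r[1:])
```

Each order update reuses the order-k solution to build the order-(k+1) one, which is the order-recursion idea the outline refers to.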

ANALYTICAL INVESTIGATION AND PARAMETRIC STUDY OF LATERAL IMPACT BEHAVIOR OF PRESSURIZED PIPELINES AND INFLUENCE OF INTERNAL PRESSURE

ANALYTICAL INVESTIGATION AND PARAMETRIC STUDY OF LATERAL IMPACT BEHAVIOR OF PRESSURIZED PIPELINES AND INFLUENCE OF INTERNAL PRESSURE DRAFT Proceedings of the ASME 014 International Mechanical Engineering Congress & Exposition IMECE014 Noveber 14-0, 014, Montreal, Quebec, Canada IMECE014-36371 ANALYTICAL INVESTIGATION AND PARAMETRIC

More information

Lower Bounds for Quantized Matrix Completion

Lower Bounds for Quantized Matrix Completion Lower Bounds for Quantized Matrix Copletion Mary Wootters and Yaniv Plan Departent of Matheatics University of Michigan Ann Arbor, MI Eail: wootters, yplan}@uich.edu Mark A. Davenport School of Elec. &

More information

ANALYSIS OF HALL-EFFECT THRUSTERS AND ION ENGINES FOR EARTH-TO-MOON TRANSFER

ANALYSIS OF HALL-EFFECT THRUSTERS AND ION ENGINES FOR EARTH-TO-MOON TRANSFER IEPC 003-0034 ANALYSIS OF HALL-EFFECT THRUSTERS AND ION ENGINES FOR EARTH-TO-MOON TRANSFER A. Bober, M. Guelan Asher Space Research Institute, Technion-Israel Institute of Technology, 3000 Haifa, Israel

More information

Multiscale Entropy Analysis: A New Method to Detect Determinism in a Time. Series. A. Sarkar and P. Barat. Variable Energy Cyclotron Centre

A. Sarkar and P. Barat, Variable Energy Cyclotron Centre, 1/AF Bidhan Nagar, Kolkata 700064, India. PACS numbers: 05.45.Tp, 89.75.-k,

Fairness via priority scheduling

Fairness via priority scheduling Fairness via priority scheduling Veeraruna Kavitha, N Heachandra and Debayan Das IEOR, IIT Bobay, Mubai, 400076, India vavitha,nh,debayan}@iitbacin Abstract In the context of ulti-agent resource allocation

More information

Interactive Markov Models of Evolutionary Algorithms

Interactive Markov Models of Evolutionary Algorithms Cleveland State University EngagedScholarship@CSU Electrical Engineering & Coputer Science Faculty Publications Electrical Engineering & Coputer Science Departent 2015 Interactive Markov Models of Evolutionary

More information

ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS. A Thesis. Presented to. The Faculty of the Department of Mathematics

A Thesis Presented to The Faculty of the Department of Mathematics, San Jose State University, In Partial Fulfillment of the Requirements

Best Arm Identification: A Unified Approach to Fixed Budget and Fixed Confidence

Best Arm Identification: A Unified Approach to Fixed Budget and Fixed Confidence Best Ar Identification: A Unified Approach to Fixed Budget and Fixed Confidence Victor Gabillon Mohaad Ghavazadeh Alessandro Lazaric INRIA Lille - Nord Europe, Tea SequeL {victor.gabillon,ohaad.ghavazadeh,alessandro.lazaric}@inria.fr

More information

Neural Network-Aided Extended Kalman Filter for SLAM Problem

Neural Network-Aided Extended Kalman Filter for SLAM Problem 7 IEEE International Conference on Robotics and Autoation Roa, Italy, -4 April 7 ThA.5 Neural Network-Aided Extended Kalan Filter for SLAM Proble Minyong Choi, R. Sakthivel, and Wan Kyun Chung Abstract

More information

Multiple Testing Issues & K-Means Clustering. Definitions related to the significance level (or type I error) of multiple tests

Multiple Testing Issues & K-Means Clustering. Definitions related to the significance level (or type I error) of multiple tests StatsM254 Statistical Methods in Coputational Biology Lecture 3-04/08/204 Multiple Testing Issues & K-Means Clustering Lecturer: Jingyi Jessica Li Scribe: Arturo Rairez Multiple Testing Issues When trying

More information

Constrained Consensus and Optimization in Multi-Agent Networks arxiv: v2 [math.oc] 17 Dec 2008

Constrained Consensus and Optimization in Multi-Agent Networks arxiv: v2 [math.oc] 17 Dec 2008 LIDS Report 2779 1 Constrained Consensus and Optiization in Multi-Agent Networks arxiv:0802.3922v2 [ath.oc] 17 Dec 2008 Angelia Nedić, Asuan Ozdaglar, and Pablo A. Parrilo February 15, 2013 Abstract We

More information