Deep Gaussian Processes for Multi-fidelity Modeling

Kurt Cutajar (EURECOM, Sophia Antipolis, France), Mark Pullin, Andreas Damianou, Neil Lawrence, Javier González

Abstract

Multi-fidelity models are prominently used in various science and engineering applications where cheaply-obtained, but possibly biased and noisy, observations must be effectively combined with limited or expensive true data in order to construct reliable models. The notion of applying deep Gaussian processes (DGPs) to this setting has recently shown great promise by capturing complex nonlinear correlations across fidelities. However, the architectures explored thus far are burdened by structural assumptions and constraints which deter such models from performing to the best of their expected capabilities. In this paper we propose a novel approach for DGP multi-fidelity modeling which treats DGP layers as fidelity levels and uses a variational inference scheme to propagate uncertainty across them. In our experiments, we show that this approach makes substantial improvements in quantifying and propagating uncertainty in multi-fidelity set-ups, which in turn improves their effectiveness in decision-making pipelines.

1 Introduction

Multi-fidelity models [4, 7] are designed to fuse limited true observations (high-fidelity) with cheaply-obtained lower granularity representations (low-fidelity). Gaussian processes [GPs; 9] are well-suited to multi-fidelity problems due to their ability to encode prior beliefs about how fidelities are related, yielding predictions accompanied by uncertainty estimates. GPs formed the basis of the seminal autoregressive models (AR1) investigated by [4] and [6], which are suitable when the mapping between fidelities is linear, i.e. the high-fidelity function f_t can be modeled as:

    f_t(x) = ρ f_{t−1}(x) + δ_t(x),    (1)

where ρ is a constant scaling the contribution of samples f_{t−1}(x) drawn from the GP modeling the data at the preceding fidelity, and δ_t(x) models the bias between fidelities. However, this is insufficient when the mapping is nonlinear, i.e. ρ is now a nonlinear transformation such that:

    f_t(x) = ρ_t(f_{t−1}(x)) + δ_t(x).    (2)

The additive structure and independence assumption between the GPs for modeling ρ_t(f_{t−1}(x)) and δ_t(x) permit us to combine these as a single GP that takes as inputs both x and f_{t−1}(x), which here denotes a sample from the posterior of the GP modeling the preceding fidelity evaluated at x. This can be expressed as f_t(x) = g_t(f_{t−1}(x), x).

Third workshop on Bayesian Deep Learning (NeurIPS 2018), Montréal, Canada. Work carried out during an internship at Amazon, Cambridge.
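To make the distinction between Equations 1 and 2 concrete, the following is a minimal numpy sketch (not the implementation used in the paper) contrasting an AR1-style prediction, which scales the low-fidelity posterior mean by a constant ρ and adds a GP-modelled bias, with a NARGP-style prediction that conditions a single GP on the augmented input (x, f_{t−1}(x)). The toy data, the fixed kernel hyperparameters and the least-squares estimate of ρ are illustrative assumptions; in practice these quantities are learned from the data.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior_mean(X, y, Xs, noise=1e-4, **kern):
    """Posterior mean of a zero-mean GP regression with an RBF kernel."""
    K = rbf(X, X, **kern) + noise * np.eye(len(X))
    return rbf(Xs, X, **kern) @ np.linalg.solve(K, y)

# Toy two-fidelity data: the high fidelity is a scaled and shifted low fidelity.
X_lo = np.linspace(0, 1, 30)[:, None]
y_lo = np.sin(8 * X_lo[:, 0])
X_hi = np.linspace(0, 1, 6)[:, None]
y_hi = 1.5 * np.sin(8 * X_hi[:, 0]) + 0.3 * X_hi[:, 0]
X_test = np.linspace(0, 1, 100)[:, None]

# Low-fidelity GP, evaluated at the high-fidelity inputs and at the test inputs.
f_lo_at_hi = gp_posterior_mean(X_lo, y_lo, X_hi, lengthscale=0.2)
f_lo_at_test = gp_posterior_mean(X_lo, y_lo, X_test, lengthscale=0.2)

# AR1 (Eq. 1): constant scaling rho plus a GP on x for the bias delta_t.
rho = (f_lo_at_hi @ y_hi) / (f_lo_at_hi @ f_lo_at_hi)   # least-squares estimate of rho
delta = gp_posterior_mean(X_hi, y_hi - rho * f_lo_at_hi, X_test, lengthscale=0.3)
f_hi_ar1 = rho * f_lo_at_test + delta

# NARGP-style fusion (Eq. 2): a single GP on the augmented input [x, f_{t-1}(x)].
Z_hi = np.hstack([X_hi, f_lo_at_hi[:, None]])
Z_test = np.hstack([X_test, f_lo_at_test[:, None]])
f_hi_nargp = gp_posterior_mean(Z_hi, y_hi, Z_test, lengthscale=0.3)
```

The structural difference is visible in the last lines: AR1 combines the fidelities additively with a constant scaling, whereas the NARGP-style construction lets the GP learn an arbitrary nonlinear mapping from the low-fidelity output to the high-fidelity one.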

Figure 1: Limitations addressed and resolved jointly by MF-DGP. (a) Left: overfitting in the NARGP model. Right: well-calibrated fit using the proposed MF-DGP model. (b) Left: AR1 cannot capture nonlinear mappings. Right: fixed by the compositional structure of MF-DGP. Blue and red markers denote low and high-fidelity observations respectively. Shaded regions indicate the 95% confidence interval.

Deep Gaussian processes [DGPs; 2] are a natural candidate for handling such relationships, allowing for uncertainty propagation in a nested structure of GPs where each GP models the transition from one fidelity to the next. However, DGPs are cumbersome to develop and approximations are necessary for enabling tractable inference. While motivated by the structure of DGPs, the nonlinear multi-fidelity model (NARGP) proposed in [8] amounts to a disjointed architecture whereby each GP is fitted in an isolated hierarchical manner, preventing GPs at lower fidelities from being updated once they have been fit. Consider the example given in Figure 1a. In the boxed area, we would expect the model to return high uncertainty to reflect the lack of available data, but overfitting in NARGP results in predicting an incorrect result with reasonably high confidence.

Contribution: In this work, we propose the first complete interpretation of multi-fidelity modeling using DGPs, which we refer to as MF-DGP. In particular, we leverage the sparse DGP approximation proposed in [10] for constructing a multi-fidelity DGP model which can be trained end-to-end, overcoming the constraints that hinder existing attempts at using DGP structure for this purpose. Returning to the example given in Figure 1a, we see that our model fits the true function properly while also returning sensibly conservative uncertainty estimates. Additionally, our model inherits the compositional structure of NARGP, alleviating a crucial limitation of AR1 (Figure 1b).

2 Multi-fidelity Deep Gaussian Process

The application of DGPs to the multi-fidelity setting is particularly appealing because, if we assume that each layer corresponds to a fidelity level, then the latent functions at the intermediate layers are given a meaningful interpretation which is not always available in standard DGP models. The first attempt at using compositions of GPs in a multi-fidelity setting [8] relied on structural assumptions on the data to circumvent the intractability of DGPs, but this heavily impairs their expected flexibility. Recent advances in the DGP literature [1, 10] have leveraged traditional GP approximations to construct scalable DGP models which are easier to specify and train; we build our extension atop the model presented in [10] to avoid the constraints imposed on selecting kernel functions in [1].

2.1 Model Specification

Let us assume a dataset D having observations at T fidelities, where X_t and y_t denote the n_t inputs and corresponding outputs observed with fidelity level t:

    D = { (X_1, y_1), ..., (X_t, y_t), ..., (X_T, y_T) }.

For enhanced interpretability, we assume that each layer of our MF-DGP model corresponds to the process modeling the observations available at fidelity level t, and that the bias or deviation from the true function decreases from one level to the next. We use the notation F_t^l to denote the evaluation at layer l for inputs observed with fidelity t; for example, the evaluation of the process at layer 1 for the inputs observed with fidelity 3 is denoted as F_3^1. A conceptual illustration of the proposed MF-DGP architecture is given in Figure 2 (left) for a dataset with three fidelities.
Note that the GP at each layer is conditioned on the data belonging to that level, as well as the evaluation of that same input data at the preceding fidelity. This gives greater purpose to the notion of feeding forward the original inputs at each layer, as originally suggested in [3] for avoiding pathologies in deep architectures.
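The bookkeeping implied by this conditioning can be sketched as follows, under the assumption (consistent with the sets {F_t^l} in Figure 2) that the GP at layer l is evaluated for every input observed at fidelity t ≥ l. The container layout and function name are placeholders introduced for this sketch rather than the paper's API; the point is simply that each layer's inputs are the original inputs concatenated with their evaluation at the preceding layer.

```python
import numpy as np

def layer_evaluation_inputs(X_per_fidelity, F_prev_per_fidelity, layer):
    """
    Assemble the inputs at which the GP at a given layer (1-indexed) is
    evaluated, mirroring the sets {F_t^l} of Figure 2 (structural sketch only).

    X_per_fidelity[t-1] holds the inputs observed at fidelity t, and
    F_prev_per_fidelity[t-1] holds their evaluation at the preceding layer
    (F_t^{l-1} in the paper's notation; unused at the first layer).
    """
    inputs = []
    for t, X_t in enumerate(X_per_fidelity, start=1):
        if t < layer:                       # lower-fidelity data stops at its own layer
            continue
        if layer == 1:
            inputs.append(X_t)              # first layer: original inputs only
        else:
            F_prev = F_prev_per_fidelity[t - 1]
            inputs.append(np.hstack([X_t, F_prev]))   # augmented input [x, f^{l-1}(x)]
    return np.vstack(inputs)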

Figure 2: Left: the MF-DGP architecture with 3 fidelity levels, where the inputs X_1, X_2, X_3 are propagated through the layer evaluations {F_t^1}_{t=1}^3, {F_t^2}_{t=2}^3 and {F_t^3}_{t=3}^3 towards the outputs y_1, y_2, y_3. Right: predictions using the same.

At each layer we rely on the sparse variational approximation of a GP for inference, thus obtaining the following variational posterior distribution:

    q(F_t^l, U^l) = p(F_t^l | U^l; {F_t^{l−1}, X_t}, Z^{l−1}) q(U^l),    (3)

where Z^{l−1} denotes the inducing inputs for layer l, U^l their corresponding function evaluations, and q(U^l) = N(U^l | µ^l, Σ^l) is the variational approximation of the inducing points. The mean and variance defining this variational approximation, i.e. µ^l and Σ^l, are optimized during training. Furthermore, if U^l is marginalized out from Equation 3, the resulting variational posterior is once again Gaussian and fully defined by its mean, m^l, and variance, S^l:

    q(F_t^l | µ^l, Σ^l; {F_t^{l−1}, X_t}, Z^{l−1}) = N(F_t^l | m_t^l, S_t^l),    (4)

which can be derived analytically. The likelihood noise at lower fidelity levels is encoded as additive white noise in the kernel function of the GP at that layer. We can then formulate the variational lower bound on the marginal likelihood as follows:

    L = Σ_{t=1}^{T} Σ_{i=1}^{n_t} E_{q(f_{i,t}^t)} [ log p(y_{i,t} | f_{i,t}^t) ] − Σ_{l=1}^{L} D_KL( q(U^l) || p(U^l; Z^{l−1}) ),

where we assume that the likelihood is factorized across fidelities and observations, and D_KL denotes the Kullback-Leibler divergence. Samples from the model are obtained recursively using the reparameterization trick [5] to draw samples from the variational posterior. Model predictions at different fidelities are also obtained recursively by propagating the input through the model up to the chosen fidelity. At all intermediate layers, the output from the preceding layer is augmented with the original input, as will be made evident by the choice of kernel explained in the next section. The output of a test point x_* can then be predicted with fidelity level t as follows:

    q(f_*^t) ≈ (1/S) Σ_{s=1}^{S} q(f_*^{(s),t} | µ^t, Σ^t; {f_*^{(s),t−1}, x_*}, Z^{t−1}),    (5)

where S denotes the number of Monte Carlo samples and t replaces L as the layer indicator. This procedure is illustrated in Figure 2 (right).
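The following sketch spells out the control flow of this recursive prediction. Here `layer_predict` is a hypothetical stand-in for the per-layer variational posteriors of Equation 4, so the snippet only illustrates the recursion of Equation 5 (propagate, augment with the original input, sample with the reparameterization trick) and not the paper's actual implementation.

```python
import numpy as np

def mf_dgp_predict(x, layer_predict, fidelity, num_samples=10):
    """
    Recursive Monte Carlo prediction through the fidelity hierarchy (cf. Eq. 5).

    `layer_predict(l, x_aug)` is an assumed interface returning the mean and
    variance of the sparse GP at layer l for the augmented input x_aug; at the
    first layer x_aug is just x, afterwards it is [x, f^{l-1}(x)]. Each sample
    is drawn with the reparameterization trick and fed to the next layer
    together with the original input x.
    """
    samples = [None] * num_samples
    for l in range(1, fidelity + 1):
        new_samples = []
        for f_prev in samples:
            x_aug = x if f_prev is None else np.hstack([x, f_prev])
            mean, var = layer_predict(l, x_aug)
            eps = np.random.randn(*mean.shape)
            new_samples.append(mean + np.sqrt(var) * eps)   # reparameterization trick
        samples = new_samples
    return np.stack(samples)   # S samples from the fidelity-t predictive distribution
```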

2.2 Multi-fidelity Covariance

For every GP at an intermediate layer, we opt for the multi-fidelity kernel function proposed in [8], since this captures both the potentially nonlinear mapping between outputs as well as the correlation in the original input space:

    k^l = k_ρ^l(x_i, x_j; θ_ρ^l) · k_{f^{l−1}}(f^{l−1}(x_i), f^{l−1}(x_j); θ_{f^{l−1}}) + k_δ^l(x_i, x_j; θ_δ^l),    (6)

where k_{f^{l−1}} denotes the covariance between outputs obtained from the preceding fidelity level, k_ρ^l is a space-dependent scaling factor, and k_δ^l captures the bias at that fidelity level. At the first layer this reduces to k^1 = k_δ^1(x_i, x_j; θ_δ^1). In [8], it was assumed that each individual component of the composite kernel function is an RBF kernel, and we shall also assume this to be the default setting for MF-DGP. However, this may not be appropriate when the mapping between fidelities is linear. In such instances, we propose to replace k_{f^{l−1}} with an alternate linear kernel such that the composite intermediate-layer covariance becomes:

    k^l = k_ρ^l(x_i, x_j; θ_ρ^l) · f^{l−1}(x_i) f^{l−1}(x_j) + k_δ^l(x_i, x_j; θ_δ^l).    (7)

A code sketch of this composite covariance is given after Table 1 below.

Figure 3: Comparison of AR1, NARGP, and MF-DGP (default and alternate kernels) across the Linear 1, Linear 2, Nonlinear 1 and Nonlinear 2 benchmarks for challenging multi-fidelity scenarios. The importance of choosing an appropriate kernel for MF-DGP is also reinforced here.

3 Experimental Evaluation

In the preceding sections, we demonstrated how the formulation of state-of-the-art DGP models can be adapted to the multi-fidelity setting. Through a series of experiments, we validate that beyond its novelty and theoretic appeal, the proposed model also works well in practice.

Improved UQ: We empirically validate MF-DGP's well-calibrated uncertainty quantification by considering experimental set-ups where the available data is generally insufficient to yield confident predictions, and higher uncertainty is prized. In Figure 3, we consider multi-fidelity scenarios where the allocation of high-fidelity data is limited or constrained to lie in one area of the input domain. In all of the examples, our model yields appropriately conservative estimates in regions where insufficient observations are available. As evidenced by the overfitting exhibited by AR1 for the LINEAR 2 example, deep models can also be useful for problems having linear mappings.

Table 1: Model comparison on multi-fidelity benchmark examples. Default indicates use of the kernel listed in Equation 6, while alternate indicates that the covariance in Equation 7 was used. (Rows: Linear 1 and Linear 2 with the alternate kernel, Nonlinear 1 and Nonlinear 2 with the default kernel; columns: n_low, n_high, and the mean squared error of AR1, NARGP and MF-DGP.)
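Returning to the covariance of Section 2.2, a direct transcription of Equations 6 and 7 into code could look as follows. This is an unoptimized sketch operating on precomputed previous-layer outputs: the RBF components, the dictionary-style hyperparameters and the absence of a noise term are simplifications made for illustration.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def mf_kernel(Xi, Xj, Fi, Fj, theta_rho, theta_f, theta_delta, linear=False):
    """
    Multi-fidelity covariance of Eq. (6)/(7) on augmented inputs.
    Xi, Xj are the original inputs; Fi, Fj the previous-layer outputs f^{l-1}.
    theta_* are dicts of RBF hyperparameters (an illustrative convention).
    """
    k_rho = rbf(Xi, Xj, **theta_rho)                          # space-dependent scaling
    k_f = Fi @ Fj.T if linear else rbf(Fi, Fj, **theta_f)     # Eq. (7) vs. Eq. (6)
    k_delta = rbf(Xi, Xj, **theta_delta)                      # fidelity-specific bias
    return k_rho * k_f + k_delta
```

Setting linear=True swaps the RBF covariance on the previous-layer outputs for the linear kernel of Equation 7, which corresponds to the alternate configuration used for the linear benchmarks in Table 1.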

Figure 4: Experimental design loop: mean squared error against iterations for AR1, NARGP, MF-DGP and a high-fidelity GP. Figure 5: MF-DGP fit to the Borehole function (predicted versus actual water flow).

Benchmark Comparison: We also compare the predictive performance of MF-DGP to AR1 and NARGP on the same selection of benchmark examples. Twenty randomly-generated training sets are prepared for each example function, following the allocation of low and high-fidelity points listed in Table 1. The results denote the average mean squared error obtained using each model over a fixed test set covering the entire input domain. The obtained results give credence to our intuition that MF-DGP balances out issues in the two modeling approaches; it performs as well as NARGP on the nonlinear examples where AR1 falters, and outperforms the former on linear examples.

Multi-fidelity in the Loop: We further assess MF-DGP using an expository experimental design loop whereby points are sequentially chosen to reduce uncertainty about a function of interest. Starting with 20 low-fidelity and 3 high-fidelity observations, we learn the NONLINEAR 1 function by selecting to observe points where the variance of the predictive distribution at the high fidelity is largest. Figure 4 shows how the mean squared error against a constant test set evolves as more points are collected, averaged over 5 runs with different initial training data. Here we also compare against a standard GP trained on the high-fidelity observations only. As expected, NARGP and MF-DGP perform best as the model structure better represents the underlying data. Although NARGP and MF-DGP both converge to a similar solution once enough points are sampled, the benefit of using MF-DGP is evidenced in the initial steps of the procedure, whereby it fits the data sensibly after only a few iterations.
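As an illustration of this loop, the sketch below implements variance-driven sequential acquisition around a generic multi-fidelity model. The fit/predict interface, the evaluate_high oracle and the finite candidate pool are assumptions made for the sketch; they stand in for retraining MF-DGP on the updated dataset and for querying the expensive high-fidelity function.

```python
import numpy as np

def variance_driven_loop(model, X_pool, X_init, y_init, evaluate_high, steps=20):
    """
    Sequential experimental design by maximum predictive variance (sketch).

    `model` is any multi-fidelity regressor exposing fit(X, y) and
    predict(X) -> (mean, var) at the highest fidelity; `evaluate_high(x)`
    queries the expensive high-fidelity function. Both interfaces are
    assumptions for this sketch, not the paper's code.
    """
    X_train = np.array(X_init, dtype=float)
    y_train = np.array(y_init, dtype=float)
    for _ in range(steps):
        model.fit(X_train, y_train)
        _, var = model.predict(X_pool)
        idx = int(np.argmax(var))                  # most uncertain candidate
        x_new = X_pool[idx:idx + 1]
        y_new = np.atleast_1d(evaluate_high(x_new))
        X_train = np.vstack([X_train, x_new])
        y_train = np.concatenate([y_train, y_new])
        X_pool = np.delete(X_pool, idx, axis=0)    # avoid re-selecting the same point
    return model, X_train, y_train
```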

Real-world Simulation: We fit MF-DGP to a two-level function that simulates stochastic water flow through a borehole [11] and depends on eight input parameters, for which a dataset of 150 low and 40 high-fidelity points was generated. Figure 5 illustrates the performance of MF-DGP for a test set containing 1000 high-fidelity points, where it achieves a high R² score.

4 Conclusion

Reliable decision making under uncertainty is a core requirement in multi-fidelity scenarios where unbiased observations are scarce or difficult to obtain. In this paper, we proposed the first complete specification of a multi-fidelity model as a DGP that is capable of capturing nonlinear relationships between fidelities with reduced overfitting. By providing end-to-end training across all fidelity levels, MF-DGP yields superior quantification and propagation of uncertainty that is crucial in iterative methods such as experimental design. In spite of being prevalent in engineering applications, we believe that multi-fidelity modeling has been under-explored by the machine learning community, and hope that this work can reignite further interest in this direction.

References

[1] K. Cutajar, E. V. Bonilla, P. Michiardi, and M. Filippone. Random feature expansions for deep Gaussian processes. In Proceedings of the 34th International Conference on Machine Learning, ICML 2017, Sydney, NSW, Australia, 6-11 August 2017.

[2] A. C. Damianou and N. D. Lawrence. Deep Gaussian processes. In Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, AISTATS 2013, Scottsdale, AZ, USA, April 29 - May 1, 2013.

[3] D. K. Duvenaud, O. Rippel, R. P. Adams, and Z. Ghahramani. Avoiding pathologies in very deep networks. In Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, AISTATS 2014, Reykjavik, Iceland, April 22-25, 2014.

[4] M. C. Kennedy and A. O'Hagan. Predicting the output from a complex computer code when fast approximations are available. Biometrika, 87(1):1-13, 2000.

[5] D. P. Kingma and M. Welling. Auto-encoding variational Bayes. In Proceedings of the Second International Conference on Learning Representations, ICLR 2014, Banff, Canada, April 14-16, 2014.

[6] L. Le Gratiet and J. Garnier. Recursive co-kriging model for design of computer experiments with multiple levels of fidelity. International Journal for Uncertainty Quantification, 4(5), 2014.

[7] B. Peherstorfer, K. Willcox, and M. Gunzburger. Survey of multifidelity methods in uncertainty propagation, inference, and optimization. SIAM Review, 60(3), 2018.

[8] P. Perdikaris, M. Raissi, A. Damianou, N. D. Lawrence, and G. E. Karniadakis. Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 473(2198), 2017.

[9] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning. MIT Press, 2006.

[10] H. Salimbeni and M. P. Deisenroth. Doubly stochastic variational inference for deep Gaussian processes. In Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4-9 December 2017, Long Beach, CA, USA.

[11] S. Xiong, P. Z. G. Qian, and C. F. J. Wu. Sequential design and analysis of high-accuracy and low-accuracy computer codes. Technometrics, 55(1):37-46, 2013.
