Gaussian Process Vine Copulas for Multivariate Dependence
1 Gaussian Process Vine Copulas for Multivariate Dependence
José Miguel Hernández-Lobato (1,2), joint work with David López-Paz (2,3) and Zoubin Ghahramani (1)
(1) Department of Engineering, Cambridge University, Cambridge, UK
(3) Max Planck Institute for Intelligent Systems, Tübingen, Germany
April 29. Both authors are equal contributors.
2 What is a Copula? Informal Definition
A copula is a function that links univariate marginal distributions into a joint multivariate one.
[Figure: marginal densities combined through a copula into a joint density.]
The copula specifies the dependencies among the random variables.
3 What is a Copula? Formal Definition
A copula is a distribution function with marginals uniform in [0, 1]. Let U_1, ..., U_d be random variables uniformly distributed in [0, 1] with copula C; then

    C(u_1, ..., u_d) = P(U_1 ≤ u_1, ..., U_d ≤ u_d).

Sklar's theorem (connection between joints, marginals and copulas): any joint cdf F(x_1, ..., x_d) with marginal cdfs F_1(x_1), ..., F_d(x_d) satisfies

    F(x_1, ..., x_d) = C(F_1(x_1), ..., F_d(x_d)),

where C is the copula of F. It is easy to show that the joint pdf f can be written as

    f(x_1, ..., x_d) = c(F_1(x_1), ..., F_d(x_d)) ∏_{i=1}^d f_i(x_i),

where c(u_1, ..., u_d) and f_1(x_1), ..., f_d(x_d) are the copula and marginal densities.
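As a quick numerical illustration of the formal definition (not part of the slides), one can sample from a bivariate Gaussian copula and check that each margin is uniform while the dependence survives the marginal transform:

```python
# Sketch: sampling a bivariate Gaussian copula. Pushing each Gaussian margin
# through its own cdf gives (U1, U2) with uniform marginals on [0, 1]; the
# dependence structure is exactly the copula.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]
x = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = norm.cdf(x)                       # map each margin through its own cdf

print(u.min() > 0.0 and u.max() < 1.0)   # True: samples live in (0, 1)^2
print(np.round(u.mean(axis=0), 2))       # ~[0.5, 0.5]: uniform marginals
```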
4 Why are Copulas Useful in Machine Learning?
The converse of Sklar's theorem is also true: given a copula C : [0, 1]^d → [0, 1] and margins F_1(x_1), ..., F_d(x_d), then C(F_1(x_1), ..., F_d(x_d)) is a valid joint cdf.
Copulas are a powerful tool for modeling multivariate data: we can easily extend univariate models to the multivariate regime, and copulas simplify the estimation process for multivariate models.
1 - Estimate the marginal distributions.
2 - Map the data to [0, 1]^d using the estimated marginals.
3 - Estimate a copula function given the mapped data.
Learning the marginals: easily done using standard univariate methods.
Learning the copula: difficult; it requires copula models that i) can represent a broad range of dependencies and ii) are robust to overfitting.
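The three-step recipe above can be sketched in a few lines. This is an illustrative implementation assuming a Gaussian copula as the dependence model (the function name and setup are hypothetical, not from the slides); the empirical cdfs play the role of the estimated marginals:

```python
# Minimal sketch of the three-step estimation recipe with a Gaussian copula.
import numpy as np
from scipy.stats import norm, rankdata

def fit_gaussian_copula(data):
    """data: (n, d) array. Returns the estimated copula correlation matrix."""
    n, d = data.shape
    # Steps 1-2: empirical marginal cdfs map each column into (0, 1).
    u = np.column_stack([rankdata(col) / (n + 1) for col in data.T])
    # Step 3: for a Gaussian copula, estimate the correlation of the
    # normal scores of the mapped data (a simple, consistent estimator).
    z = norm.ppf(u)
    return np.corrcoef(z.T)

rng = np.random.default_rng(1)
x = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=5000)
x[:, 1] = np.exp(x[:, 1])      # a nonlinear margin: the copula is unchanged
R = fit_gaussian_copula(x)
print(np.round(R[0, 1], 2))    # close to 0.8 despite the distorted margin
```

Because the estimator only uses ranks in steps 1-2, monotone distortions of the margins do not affect the recovered dependence.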
5 Parametric Copula Models
There are many parametric 2D copulas. Some examples are the Gaussian, Clayton, Frank, t, Gumbel and Joe copulas.
They usually depend on a single scalar parameter θ, which is in a one-to-one relationship with Kendall's tau rank correlation coefficient, defined as

    τ = P[(U_1 − U_1')(U_2 − U_2') > 0] − P[(U_1 − U_1')(U_2 − U_2') < 0] = P[concordance] − P[discordance],

where (U_1, U_2) and (U_1', U_2') are independent samples from the copula.
However, in higher dimensions, the number and expressiveness of parametric copulas is more limited.
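The one-to-one link between θ and τ can be checked numerically. As an illustration (not from the slides), for the Clayton copula the relationship is τ = θ/(θ + 2); sampling via the standard conditional-inverse method and estimating τ recovers it:

```python
# Illustrative check of the theta <-> Kendall's tau link for the Clayton
# copula, for which tau = theta / (theta + 2).
import numpy as np
from scipy.stats import kendalltau

theta = 2.0                        # Clayton parameter -> tau = 0.5
rng = np.random.default_rng(2)
n = 20_000
u = rng.uniform(size=n)
w = rng.uniform(size=n)
# Standard conditional-inverse sampler for the Clayton copula:
# invert C(v | u) = w in closed form.
v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)

tau_hat, _ = kendalltau(u, v)
print(round(tau_hat, 2))           # close to theta / (theta + 2) = 0.5
```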
6 Vine Copulas
Vine copulas are hierarchical graphical models that factorize c(u_1, ..., u_d) into a product of d(d − 1)/2 bivariate conditional copula densities.
We can factorize c(u_1, u_2, u_3) using the product rule of probability as

    c(u_1, u_2, u_3) = f_{3|12}(u_3 | u_1, u_2) f_{2|1}(u_2 | u_1) f_1(u_1),

where f_1(u_1) = 1 because the marginals are uniform, and we can express each of the conditional factors in terms of bivariate copula functions.
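The factorization can be verified numerically in the Gaussian case, where everything is available in closed form. The sketch below (illustrative, not the slides' code) checks that the three-dimensional Gaussian copula density equals the product c_12 · c_23 · c_13|2, with the conditional copula parameterized by the partial correlation; for the Gaussian copula the conditional pair copula happens not to depend on the conditioning value:

```python
# Numerical check of the pair-copula (vine) factorization for a 3D Gaussian copula.
import numpy as np
from scipy.stats import norm, multivariate_normal

def c_pair(u, v, rho):
    """Bivariate Gaussian copula density."""
    x, y = norm.ppf(u), norm.ppf(v)
    return multivariate_normal.pdf([x, y], cov=[[1, rho], [rho, 1]]) / (norm.pdf(x) * norm.pdf(y))

def h(u, v, rho):
    """Conditional cdf F(u | v) under a Gaussian pair copula."""
    x, y = norm.ppf(u), norm.ppf(v)
    return norm.cdf((x - rho * y) / np.sqrt(1 - rho ** 2))

r12, r23, r13 = 0.5, 0.4, 0.3
R = np.array([[1, r12, r13], [r12, 1, r23], [r13, r23, 1]])
u3 = np.array([0.2, 0.7, 0.55])
x3 = norm.ppf(u3)

# Full 3D Gaussian copula density.
full = multivariate_normal.pdf(x3, cov=R) / np.prod(norm.pdf(x3))

# Vine factorization c12 * c23 * c13|2 with the partial correlation.
r13_2 = (r13 - r12 * r23) / np.sqrt((1 - r12 ** 2) * (1 - r23 ** 2))
vine = (c_pair(u3[0], u3[1], r12) * c_pair(u3[1], u3[2], r23)
        * c_pair(h(u3[0], u3[1], r12), h(u3[2], u3[1], r23), r13_2))
print(np.isclose(full, vine))   # True: the two densities agree
```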
7 Computing Conditional cdfs
Computing c_{31|2}[F_{3|2}(u_3 | u_2), F_{1|2}(u_1 | u_2) | u_2] requires evaluating the conditional marginal cdfs F_{3|2}(u_3 | u_2) and F_{1|2}(u_1 | u_2). This can be done using the following recursive relationship:

    F_{j|A}(u_j | u_A) = ∂C_{jk|B}[F_{j|B}(u_j | u_B), x] / ∂x evaluated at x = F_{k|B}(u_k | u_B),

where A is a set of variable indices not containing j, k ∈ A and B = A \ {k}. For example,

    F_{3|2}(u_3 | u_2) = ∂C_{32}(u_3, x)/∂x |_{x=u_2},    F_{1|2}(u_1 | u_2) = ∂C_{21}(x, u_1)/∂x |_{x=u_2}.
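For the Gaussian copula this partial derivative (often called the h-function in the vine literature) has a well-known closed form, which the sketch below checks against a numerical derivative of the copula cdf; this is an illustration, not code from the talk:

```python
# The Gaussian-copula h-function F(u | v) = dC(u, v)/dv in closed form,
# checked against a central finite difference of the copula cdf.
import numpy as np
from scipy.stats import norm, multivariate_normal

def h_gauss(u, v, rho):
    """Conditional cdf F_{U|V}(u | v) for a Gaussian copula with parameter rho."""
    x, y = norm.ppf(u), norm.ppf(v)
    return norm.cdf((x - rho * y) / np.sqrt(1.0 - rho ** 2))

rho, u, v, eps = 0.5, 0.3, 0.6, 1e-3
C = lambda a, b: multivariate_normal.cdf([norm.ppf(a), norm.ppf(b)],
                                         cov=[[1, rho], [rho, 1]])
numeric = (C(u, v + eps) - C(u, v - eps)) / (2 * eps)
print(round(h_gauss(u, v, rho), 2), round(numeric, 2))   # the two values agree
```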
8 Regular Vines
A regular vine specifies a factorization of c(u_1, ..., u_d). It is formed by d − 1 trees T_1, ..., T_{d−1} with node and edge sets V_i and E_i. Each edge e in any tree has three associated sets of variables C(e), D(e), N(e) ⊆ {1, ..., d}, called the conditioned, conditioning and constraint sets.
V_1 = {1, ..., d} and E_1 forms a spanning tree over a complete graph G_1 over V_1. For any e ∈ E_1, C(e) = N(e) = e and D(e) = ∅.
For i > 1, V_i = E_{i−1} and E_i forms a spanning tree over a graph G_i with nodes V_i and edges e = {e_1, e_2} such that e_1, e_2 ∈ E_{i−1} and e_1 ∩ e_2 ≠ ∅.
For any e = {e_1, e_2} ∈ E_i, i > 1, we have C(e) = N(e_1) Δ N(e_2) (the symmetric difference), D(e) = N(e_1) ∩ N(e_2) and N(e) = N(e_1) ∪ N(e_2).

    c(u_1, ..., u_d) = ∏_{i=1}^{d−1} ∏_{e ∈ E_i} c_{C(e)|D(e)}.
9 Example of a Regular Vine
[Figure: the trees of a regular vine factorization.]
10 Using Regular Vines in Practice
Selecting a particular factorization: there are many possible factorizations, each determined by the specific choices of spanning trees T_1, ..., T_{d−1}. In practice, each tree T_i is chosen by assigning a weight to each edge in G_i and then selecting the corresponding maximum spanning tree. The weight for edge e is usually related to the level of dependence between the variables in C(e), often measured in terms of Kendall's tau. It is common to prune the vine and consider only the first few trees.
Dealing with conditional bivariate copulas: use the simplifying assumption, i.e. assume that c_{C(e)|D(e)} does not depend on D(e).
Our main contribution: avoid making use of the simplifying assumption.
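The tree-selection step can be sketched concretely. The function below (illustrative; names are hypothetical) builds the first vine tree T_1 by weighting each edge with |Kendall's tau| and extracting a maximum spanning tree, here via SciPy's minimum spanning tree on negated weights:

```python
# Sketch of selecting the first vine tree T1 by maximum spanning tree
# with |Kendall's tau| edge weights.
import numpy as np
from scipy.stats import kendalltau
from scipy.sparse.csgraph import minimum_spanning_tree

def first_vine_tree(data):
    """data: (n, d) array. Returns the edges (i, j) of T1."""
    d = data.shape[1]
    w = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            tau, _ = kendalltau(data[:, i], data[:, j])
            w[i, j] = -abs(tau)      # negate: MST of -|tau| = max spanning tree
    mst = minimum_spanning_tree(w)   # zero entries are treated as absent edges
    ii, jj = mst.nonzero()
    return sorted(zip(ii.tolist(), jj.tolist()))

rng = np.random.default_rng(3)
z = rng.normal(size=(2000, 1))
# Kendall's tau is rank-based, so raw (unmapped) columns work here too.
data = np.column_stack([z + 0.1 * rng.normal(size=(2000, 1)) for _ in range(4)])
edges = first_vine_tree(data)
print(edges)                         # a spanning tree: d - 1 = 3 edges
```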
11 A Semi-parametric Model for Conditional Copulas
We describe c_{C(e)|D(e)} using a parametric model specified in terms of Kendall's tau τ ∈ [−1, 1]. Let z be a vector with the values of the variables in D(e). Then we assume

    τ = σ[f(z)],

where f is an arbitrary non-linear function and σ(x) = 2Φ(x) − 1 is a sigmoid function.
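The link σ(x) = 2Φ(x) − 1 squashes the unconstrained function value f(z) into (−1, 1), the valid range for Kendall's tau; its inverse, Φ^{−1}((τ + 1)/2), is the map the implementation slide later uses for the GP prior mean. A small check (illustrative only):

```python
# The tau link sigma(x) = 2*Phi(x) - 1 and its inverse Phi^{-1}((tau+1)/2).
import numpy as np
from scipy.stats import norm

sigma = lambda x: 2.0 * norm.cdf(x) - 1.0
sigma_inv = lambda tau: norm.ppf((tau + 1.0) / 2.0)

f = np.linspace(-3, 3, 7)
tau = sigma(f)
print(np.all((tau > -1) & (tau < 1)))    # True: tau stays inside (-1, 1)
print(np.allclose(sigma_inv(tau), f))    # True: the two maps are inverses
```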
12 Bayesian Inference on f
We are given a sample D_UV = {U_i, V_i}_{i=1}^n from C_{C(e)|D(e)}, with corresponding values for the variables in D(e) given by D_z = {z_i}_{i=1}^n. We want to identify the value of f that was used to generate the data. We assume that, a priori, f follows a Gaussian process.
13 Posterior and Predictive Distributions
The posterior distribution for f = (f_1, ..., f_n)^T, where f_i = f(z_i), is

    p(f | D_UV, D_z) = [ ∏_{i=1}^n c(U_i, V_i | τ = σ[f_i]) ] p(f | D_z) / p(D_UV | D_z),

where p(f | D_z) = N(f | m_0, K) is the Gaussian process prior on f. Given z_{n+1}, the predictive distribution for U_{n+1} and V_{n+1} is

    p(u_{n+1}, v_{n+1} | z_{n+1}, D_UV, D_z) = ∫ c(u_{n+1}, v_{n+1} | τ = σ[f_{n+1}]) p(f_{n+1} | f, z_{n+1}, D_z) p(f | D_UV, D_z) df_{n+1} df.

For efficient approximate inference, we use Expectation Propagation.
14 Expectation Propagation
EP approximates p(f | D_UV, D_z) by Q(f) = N(f | m, V), in which each exact likelihood factor q_i(f_i) = c(U_i, V_i | τ = σ[f_i]) is replaced by a Gaussian approximate factor q̂_i(f_i) with parameters m̂_i and v̂_i. EP tunes m̂_i and v̂_i by minimizing

    KL[ q_i(f_i) Q(f) [q̂_i(f_i)]^{−1}  ||  Q(f) ].

We use numerical integration methods for this task. The kernel parameters are fixed by maximizing the EP approximation of p(D_UV | D_z). The total cost is O(n^3).
15 Implementation Details
We choose the following covariance function for the GP prior:

    Cov[f(z_i), f(z_j)] = σ exp{ −(z_i − z_j)^T diag(λ) (z_i − z_j) } + σ_0.

The mean of the GP prior is constant and equal to Φ^{−1}((τ̂_MLE + 1)/2), where τ̂_MLE is the MLE of τ for an unconditional Gaussian copula.
We use the FITC approximation: K is approximated by K' = Q + diag(K − Q), where Q = K_{n n_0} K_{n_0 n_0}^{−1} K_{n n_0}^T. Here K_{n_0 n_0} is the n_0 × n_0 covariance matrix for n_0 << n pseudo-inputs, and K_{n n_0} contains the covariances between training points and pseudo-inputs. The cost of EP is now O(n n_0^2). We choose n_0 = 20.
The predictive distribution is approximated using sampling.
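The covariance function above (an ARD squared-exponential plus a constant bias term) is straightforward to transcribe; the sketch below is illustrative, with hypothetical parameter defaults rather than the values tuned in the paper:

```python
# Direct transcription of the GP covariance: ARD squared exponential + bias.
import numpy as np

def kernel(Z1, Z2, sigma=1.0, lam=None, sigma0=0.1):
    """Cov[f(z_i), f(z_j)] = sigma * exp(-(z_i-z_j)^T diag(lam) (z_i-z_j)) + sigma0."""
    lam = np.ones(Z1.shape[1]) if lam is None else np.asarray(lam)
    d2 = ((Z1[:, None, :] - Z2[None, :, :]) ** 2 * lam).sum(-1)
    return sigma * np.exp(-d2) + sigma0

Z = np.random.default_rng(4).normal(size=(5, 2))
K = kernel(Z, Z)
print(np.allclose(K, K.T))                  # True: symmetric
print(np.linalg.eigvalsh(K).min() > 0)      # True here: positive definite
```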
16 Experiments I
We compare the proposed method, GPVINE, with two baselines:
1 - SVINE, based on the simplifying assumption.
2 - MLLVINE, based on the maximization of the local likelihood. It can only capture dependencies on a single random variable and is limited to regular vines with at most two trees.
All the data are mapped to [0, 1]^d using the empirical cdfs.
Synthetic data: Z uniform in [−6, 6] and (U, V) Gaussian with correlation (3/4) sin(z). Data set of size 50.
[Figure: true and estimated τ_{U,V|Z} as a function of Z for GPVINE and MLLVINE.]
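One plausible reading of the synthetic-data generator described above can be sketched as follows (the exact generator used in the paper may differ in details such as the final marginal transform):

```python
# Sketch of the synthetic generator: Z uniform on [-6, 6], (U, V) from a
# bivariate Gaussian with correlation (3/4) sin(z), mapped to [0, 1] margins.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n = 50
z = rng.uniform(-6, 6, size=n)
rho = 0.75 * np.sin(z)                       # z-dependent correlation
x1 = rng.normal(size=n)
x2 = rho * x1 + np.sqrt(1 - rho ** 2) * rng.normal(size=n)
u, v = norm.cdf(x1), norm.cdf(x2)            # map to [0, 1] margins
print(u.shape, v.shape)                      # (50,) (50,)
```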
17 Experiments II
Real-world data: UCI datasets, meteorological data, mineral concentrations and financial data.
Data split into training and test sets (50 times), with half of the data in each.
[Table: average test log-likelihood when limited to two trees in the vine.]
18 Results for More than Two Trees
[Figure: average test log-likelihood of GPVINE and SVINE as more trees are added.]
19 Conditional Dependencies in Weather Data
Conditional Kendall's tau for atmospheric pressure and cloud percentage cover when conditioned on latitude and longitude near Barcelona on 11/19/2012 at 8pm.
20 Summary and Conclusions
Vine copulas are flexible models for multivariate dependencies which specify a factorization of the copula density into a product of conditional bivariate copulas.
In practical implementations of vines, some of the conditional dependencies in the bivariate copulas are usually ignored.
To avoid this, we have proposed a method for the estimation of fully conditional vines using Gaussian processes (GPVINE).
GPVINE outperforms a baseline that ignores conditional dependencies (SVINE) and other alternatives based on maximum local-likelihood methods (MLLVINE).
21 References
Lopez-Paz, D., Hernandez-Lobato, J. M. and Ghahramani, Z. Gaussian process vine copulas for multivariate dependence. International Conference on Machine Learning (ICML 2013).
Acar, E. F., Craiu, R. V., and Yao, F. Dependence calibration in conditional copulas: a nonparametric approach. Biometrics, 67(2), 2011.
Bedford, T. and Cooke, R. M. Vines: a new graphical model for dependent random variables. The Annals of Statistics, 30(4), 2002.
Minka, T. P. Expectation Propagation for approximate Bayesian inference. Proceedings of the 17th Conference in Uncertainty in Artificial Intelligence, 2001.
Naish-Guzman, A. and Holden, S. B. The generalized FITC approximation. In Advances in Neural Information Processing Systems 20, 2008.
Patton, A. J. Modelling asymmetric exchange rate dependence. International Economic Review, 47(2), 2006.
22 Thank you for your attention!
More informationIntroduction to Probabilistic Graphical Models: Exercises
Introduction to Probabilistic Graphical Models: Exercises Cédric Archambeau Xerox Research Centre Europe cedric.archambeau@xrce.xerox.com Pascal Bootcamp Marseille, France, July 2010 Exercise 1: basics
More informationSimulation of Tail Dependence in Cot-copula
Int Statistical Inst: Proc 58th World Statistical Congress, 0, Dublin (Session CPS08) p477 Simulation of Tail Dependence in Cot-copula Pirmoradian, Azam Institute of Mathematical Sciences, Faculty of Science,
More informationBlack-box α-divergence Minimization
Black-box α-divergence Minimization José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Thang Bui, Richard Turner, Harvard University, University of Cambridge, Universidad Autónoma de Madrid.
More informationTalk on Bayesian Optimization
Talk on Bayesian Optimization Jungtaek Kim (jtkim@postech.ac.kr) Machine Learning Group, Department of Computer Science and Engineering, POSTECH, 77-Cheongam-ro, Nam-gu, Pohang-si 37673, Gyungsangbuk-do,
More informationSpatial Statistics 2013, S2.2 6 th June Institute for Geoinformatics University of Münster.
Spatial Statistics 2013, S2.2 6 th June 2013 Institute for Geoinformatics University of Münster http://ifgi.uni-muenster.de/graeler Vine Vine 1 Spatial/spatio-temporal data Typically, spatial/spatio-temporal
More informationGaussian Process Regression with Censored Data Using Expectation Propagation
Sixth European Workshop on Probabilistic Graphical Models, Granada, Spain, 01 Gaussian Process Regression with Censored Data Using Expectation Propagation Perry Groot, Peter Lucas Radboud University Nijmegen,
More informationMachine Learning Lecture 2
Machine Perceptual Learning and Sensory Summer Augmented 6 Computing Announcements Machine Learning Lecture 2 Course webpage http://www.vision.rwth-aachen.de/teaching/ Slides will be made available on
More informationRecent Advances in Bayesian Inference Techniques
Recent Advances in Bayesian Inference Techniques Christopher M. Bishop Microsoft Research, Cambridge, U.K. research.microsoft.com/~cmbishop SIAM Conference on Data Mining, April 2004 Abstract Bayesian
More informationPractical Bayesian Optimization of Machine Learning. Learning Algorithms
Practical Bayesian Optimization of Machine Learning Algorithms CS 294 University of California, Berkeley Tuesday, April 20, 2016 Motivation Machine Learning Algorithms (MLA s) have hyperparameters that
More informationGeneralized Information Matrix Tests for Copulas
Generalized Information Matri Tests for Copulas Artem Prokhorov Ulf Schepsmeier Yajing Zhu October 014 Abstract We propose a family of goodness-of-fit tests for copulas. The tests use generalizations of
More informationMULTIDIMENSIONAL POVERTY MEASUREMENT: DEPENDENCE BETWEEN WELL-BEING DIMENSIONS USING COPULA FUNCTION
Rivista Italiana di Economia Demografia e Statistica Volume LXXII n. 3 Luglio-Settembre 2018 MULTIDIMENSIONAL POVERTY MEASUREMENT: DEPENDENCE BETWEEN WELL-BEING DIMENSIONS USING COPULA FUNCTION Kateryna
More informationWeb-based Supplementary Material for. Dependence Calibration in Conditional Copulas: A Nonparametric Approach
1 Web-based Supplementary Material for Dependence Calibration in Conditional Copulas: A Nonparametric Approach Elif F. Acar, Radu V. Craiu, and Fang Yao Web Appendix A: Technical Details The score and
More informationA Brief Introduction to Copulas
A Brief Introduction to Copulas Speaker: Hua, Lei February 24, 2009 Department of Statistics University of British Columbia Outline Introduction Definition Properties Archimedean Copulas Constructing Copulas
More information