Gaussian Process Vine Copulas for Multivariate Dependence
1 Gaussian Process Vine Copulas for Multivariate Dependence
José Miguel Hernández-Lobato 1,2, David López-Paz 3,2 and Zoubin Ghahramani 1. June 27.
1 University of Cambridge, 2 Equal Contributor, 3 Max-Planck-Institute for Intelligent Systems.
2 Copulas and Multivariate Probabilistic Models
Copulas link univariate marginal distributions into joint multivariate probabilistic models.
[Figure: marginal densities combined through a copula into a joint density]
Copulas specify the dependencies between the random variables.
3 Separating Marginals and Dependencies Using Copulas
$$p(x_1, \ldots, x_d) = \underbrace{c(u_1, \ldots, u_d)}_{\text{unique copula}} \prod_{i=1}^{d} p_i(x_i),$$
where $u_i := P_i(x_i)$ and $P_i$ denotes the cumulative distribution function (cdf) of the variable $x_i$.
We can model each of the $d$ marginal distributions and the copula function independently.
There exists a broad catalog of two-dimensional copula models... However, in the high-dimensional regime, the number of available copula models and their expressiveness are more limited.
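The slides contain no code; as a concrete illustration of this decomposition, here is a minimal Python sketch (not from the talk) in which two arbitrary marginals are linked by a bivariate Gaussian copula. The marginal families and the correlation value are made up for the example.

```python
from scipy import stats

def gaussian_copula_density(u1, u2, rho):
    """Bivariate Gaussian copula density c(u1, u2) with correlation rho."""
    z1, z2 = stats.norm.ppf(u1), stats.norm.ppf(u2)
    cov = [[1.0, rho], [rho, 1.0]]
    num = stats.multivariate_normal(mean=[0.0, 0.0], cov=cov).pdf([z1, z2])
    den = stats.norm.pdf(z1) * stats.norm.pdf(z2)
    return num / den

# Joint density p(x1, x2) = c(P1(x1), P2(x2)) * p1(x1) * p2(x2)
p1, p2 = stats.expon(scale=2.0), stats.t(df=5)   # two arbitrary marginals
x1, x2 = 1.3, -0.4
u1, u2 = p1.cdf(x1), p2.cdf(x2)                  # u_i = P_i(x_i)
joint = gaussian_copula_density(u1, u2, rho=0.6) * p1.pdf(x1) * p2.pdf(x2)
print(joint)
```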
4 More than Two Dimensions: Vine Factorizations
[Figure: vine trees $T_1$, $T_2$, $T_3$ over four variables, with edges $e_1 = (1,3)$, $e_2 = (2,3)$, $e_3 = (3,4)$ in $T_1$; $e_4 = (1,2\,|\,3)$, $e_5 = (1,4\,|\,3)$ in $T_2$; and $e_6 = (2,4\,|\,1,3)$ in $T_3$]
$$c_{1234} = \underbrace{c_{13}\, c_{23}\, c_{34}}_{T_1}\; \underbrace{c_{12|3}\, c_{14|3}}_{T_2}\; \underbrace{c_{24|13}}_{T_3}$$
Parameters: bivariate copula families, tree construction method.
5 Example of Vine Factorizations: Three Variables
We can factorize $c(u_1, u_2, u_3)$ using the product rule of probability as
$$c(u_1, u_2, u_3) = p(u_3|u_1, u_2)\, p(u_2|u_1)$$
and we can express each factor in terms of bivariate copula functions:
$$p(u_3|u_1, u_2) = p(u_3, u_1|u_2)\,[p(u_1|u_2)]^{-1} = c_{31|2}(u_{3|\{u_2\}}, u_{1|\{u_2\}})\, p(u_3|u_2),$$
$$p(u_3|u_2) = p(u_3, u_2) = c_{32}(u_3, u_2), \qquad p(u_2|u_1) = p(u_2, u_1) = c_{21}(u_2, u_1),$$
where $u_{i|\{u_j\}}$ is the conditional cdf of $u_i$ given $u_j$. Finally, we obtain
$$c(u_1, u_2, u_3) = c_{31|2}(u_{3|\{u_2\}}, u_{1|\{u_2\}})\, c_{32}(u_3, u_2)\, c_{21}(u_2, u_1).$$
6 Example of Vine Factorizations: Four Variables
Using vines, we can factorize $p(x_1, x_2, x_3, x_4)$ as
$$p(x_1, x_2, x_3, x_4) = p_1(x_1)\, p_2(x_2)\, p_3(x_3)\, p_4(x_4)\; c_{13}(u_1, u_3)\, c_{23}(u_2, u_3)\, c_{34}(u_3, u_4)\; c_{12|3}(u_{1|\{u_3\}}, u_{2|\{u_3\}})\, c_{14|3}(u_{1|\{u_3\}}, u_{4|\{u_3\}})\; c_{24|13}(u_{2|\{u_1,u_3\}}, u_{4|\{u_1,u_3\}}).$$
We need to use conditional copulas and conditional cdfs! The conditional cdfs can be computed recursively using
$$u_{i|\mathbf{v}} = \frac{\partial\, C_{ij|\mathbf{v}_{-j}}(u_{i|\mathbf{v}_{-j}}, u_{j|\mathbf{v}_{-j}})}{\partial\, u_{j|\mathbf{v}_{-j}}}.$$
The vine simplifying assumption: the conditional copulas are independent of their conditioning variables. Can we do better?
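To make the factorization and the conditional-cdf recursion concrete, here is a hedged Python sketch (not from the talk) that evaluates the three-variable factorization of slide 5 with Gaussian pair copulas. For the Gaussian copula the partial derivative above has the closed form implemented as `h_gauss` (the "h-function" of the vine literature); the correlation values are made up.

```python
import numpy as np
from scipy import stats

def c_gauss(u, v, rho):
    """Bivariate Gaussian copula density."""
    x, y = stats.norm.ppf(u), stats.norm.ppf(v)
    q = (rho**2 * (x**2 + y**2) - 2.0 * rho * x * y) / (2.0 * (1.0 - rho**2))
    return np.exp(-q) / np.sqrt(1.0 - rho**2)

def h_gauss(u, v, rho):
    """Conditional cdf u_{i|j} = dC(u_i, u_j; rho)/du_j for a Gaussian pair copula."""
    x, y = stats.norm.ppf(u), stats.norm.ppf(v)
    return stats.norm.cdf((x - rho * y) / np.sqrt(1.0 - rho**2))

def vine_density_3d(u1, u2, u3, rho21, rho32, rho31_2):
    # c(u1,u2,u3) = c_{31|2}(u_{3|2}, u_{1|2}) * c_{32}(u3, u2) * c_{21}(u2, u1)
    u3_2 = h_gauss(u3, u2, rho32)
    u1_2 = h_gauss(u1, u2, rho21)
    return c_gauss(u3_2, u1_2, rho31_2) * c_gauss(u3, u2, rho32) * c_gauss(u2, u1, rho21)

print(vine_density_3d(0.3, 0.7, 0.55, rho21=0.5, rho32=-0.2, rho31_2=0.4))
```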
7 A Semi-parametric Model for Conditional Copulas
Assumption: the conditional copula is parametric and the conditioning variables have a direct effect on the copula parameter, specified in terms of Kendall's tau (4), $\tau \in [-1, 1]$.
Example: the Gaussian copula,
$$c(u_1, u_2|\tau) = \frac{\phi_2\!\left[\Phi^{-1}(u_1), \Phi^{-1}(u_2)\,\middle|\,\rho = \sin(\tau \pi / 2)\right]}{\phi(\Phi^{-1}(u_1))\, \phi(\Phi^{-1}(u_2))},$$
where $\phi$ and $\Phi$ are the standard Gaussian pdf and cdf, and $\phi_2$ is a 2D Gaussian pdf with zero mean, correlation $\rho$ and unit variances.
Let $\mathbf{z}$ be a vector of conditioning variables. Then we assume the conditional dependence $\tau = \sigma[f(\mathbf{z})]$, where $f$ is a real function and $\sigma(x) = 2\Phi(x) - 1$ is a sigmoid function.
(4) Many one-parameter copulas can be specified in terms of Kendall's τ rank correlation coefficient.
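As an illustration of this parameterization (a sketch, not the authors' code), the Gaussian copula log-density below is written directly in terms of Kendall's tau via $\rho = \sin(\tau\pi/2)$, together with the link $\sigma(f) = 2\Phi(f) - 1$; the example value of the latent function is arbitrary.

```python
import numpy as np
from scipy import stats

def sigma(f):
    """tau = 2 * Phi(f) - 1 maps a real latent value to Kendall's tau in (-1, 1)."""
    return 2.0 * stats.norm.cdf(f) - 1.0

def gauss_copula_logpdf(u1, u2, tau):
    """log c(u1, u2 | tau) for the Gaussian copula, with rho = sin(tau * pi / 2)."""
    rho = np.sin(0.5 * np.pi * tau)
    x, y = stats.norm.ppf(u1), stats.norm.ppf(u2)
    return (-0.5 * np.log1p(-rho**2)
            - (rho**2 * (x**2 + y**2) - 2.0 * rho * x * y) / (2.0 * (1.0 - rho**2)))

f = 0.8                                    # latent function value f(z)
print(sigma(f), gauss_copula_logpdf(0.2, 0.35, sigma(f)))
```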
8 Estimation of Conditional Copulas with GP+EP
Given the data $D_{u,v} = \{(u_i, v_i)\}_{i=1}^{n}$ and $D_z = \{\mathbf{z}_i\}_{i=1}^{n}$, we place a Gaussian process prior on $f$ and do Bayesian inference. The posterior for $\mathbf{f} = (f_1, \ldots, f_n)^T$, where $f_i = f(\mathbf{z}_i)$, is
$$\underbrace{p(\mathbf{f}|D_{u,v}, D_z)}_{\text{posterior}} = \frac{\overbrace{p(\mathbf{f}|D_z)}^{\text{prior}}\;\overbrace{p(D_{u,v}|\mathbf{f})}^{\text{likelihood}}}{\underbrace{p(D_{u,v}|D_z)}_{\text{evidence}}},$$
where $p(\mathbf{f}|D_z) = \mathcal{N}(\mathbf{f}|\mathbf{m}, \mathbf{K})$ is the GP prior and $p(D_{u,v}|\mathbf{f}) = \prod_{i=1}^{n} c(u_i, v_i|\tau = \sigma(f_i))$ is the copula likelihood model.
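A compact sketch (illustrative only) of the unnormalised log-posterior that the rest of the talk approximates; `copula_logpdf` stands in for any vectorised one-parameter copula log-density in tau, for instance the Gaussian one from the previous sketch.

```python
import numpy as np
from scipy import stats

def log_posterior_unnorm(f, m, K, u, v, copula_logpdf):
    """Unnormalised log p(f | D_uv, D_z): GP prior log N(f | m, K) plus the
    copula log-likelihood sum_i log c(u_i, v_i | tau = sigma(f_i))."""
    log_prior = stats.multivariate_normal(mean=m, cov=K).logpdf(f)
    tau = 2.0 * stats.norm.cdf(f) - 1.0          # sigma(f) = 2 * Phi(f) - 1
    return log_prior + np.sum(copula_logpdf(u, v, tau))
```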
9 Ingredient 1: The Gaussian Process Prior
$$p(\mathbf{f}|D_{u,v}, D_z) \propto \mathcal{N}(\mathbf{f}|\mathbf{m}, \mathbf{K}) \prod_{i=1}^{n} c(u_i, v_i|\tau = \sigma(f_i)).$$
The prior distribution of any set of values of $f$ is jointly Gaussian. A covariance function specifies the value of $\mathbf{K} \in \mathbb{R}^{n \times n}$:
$$K_{ij} = k(\mathbf{z}_i, \mathbf{z}_j) = s \exp\!\left(-(\mathbf{z}_i - \mathbf{z}_j)^T \mathrm{diag}(\boldsymbol{\lambda}) (\mathbf{z}_i - \mathbf{z}_j)\right) + s_0. \quad (1)$$
This imposes smoothness properties on the possible prior values of $f$. In the GP prior, $\mathbf{m}$ is given by a constant mean function.
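A direct Python transcription of Eq. (1); a sketch, with arbitrary hyperparameter values.

```python
import numpy as np

def covariance_matrix(Z, s, lam, s0):
    """K_ij = s * exp(-(z_i - z_j)^T diag(lam) (z_i - z_j)) + s0, for Z of shape (n, d)."""
    diff = Z[:, None, :] - Z[None, :, :]              # (n, n, d) pairwise differences
    sq = np.einsum('ijk,k,ijk->ij', diff, lam, diff)  # quadratic form per pair
    return s * np.exp(-sq) + s0

Z = np.random.randn(5, 2)                             # 5 conditioning vectors in 2D
K = covariance_matrix(Z, s=1.0, lam=np.array([0.5, 2.0]), s0=0.1)
```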
10 Ingredient 2: The Copula Likelihood
$$p(\mathbf{f}|D_{u,v}, D_z) \propto \mathcal{N}(\mathbf{f}|\mathbf{0}, \mathbf{K}) \prod_{i=1}^{n} c(u_i, v_i|\tau = \sigma(f_i)).$$
If several copula families are available as candidates, build one posterior for each of them and perform Bayesian model selection.
11 Ingredient 3: Bayesian Inference
To make predictions at $\mathbf{z}_{n+1}$ for $u_{n+1}$ and $v_{n+1}$, average over all possible functions $f$, weighting each one by its posterior probability:
$$p(u_{n+1}, v_{n+1}|\mathbf{z}_{n+1}, D_{u,v}, D_z) = \int\!\!\int \underbrace{c(u_{n+1}, v_{n+1}|\tau = \sigma[f_{n+1}])}_{\text{new likelihood}}\, \underbrace{p(f_{n+1}|\mathbf{f}, \mathbf{z}_{n+1}, D_z)}_{\text{conditional prior}}\, \underbrace{p(\mathbf{f}|D_{u,v}, D_z)}_{\text{old posterior}}\, d\mathbf{f}\, df_{n+1},$$
where the last two factors, integrated over $\mathbf{f}$, give the new posterior for $f_{n+1}$.
Problem 1: the likelihood is not Gaussian. We cannot perform exact inference (solve the integral analytically).
Problem 2: working with GPs is expensive. It often requires $O(n^3)$ operations (computing Cholesky decompositions of $n \times n$ matrices).
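Once a Gaussian approximation to the posterior of $f_{n+1}$ is available (from EP and FITC, below), the double integral reduces to a one-dimensional average that simple Monte Carlo handles well. A hedged sketch follows; `copula_pdf` is a vectorised one-parameter copula density and `mu_star`, `var_star` denote the approximate predictive mean and variance (the names are illustrative, not from the paper's code).

```python
import numpy as np
from scipy import stats

def predictive_density(u_new, v_new, mu_star, var_star, copula_pdf, n_samples=10000):
    """Monte Carlo estimate of p(u_new, v_new | z_new, data): draw f_{n+1} from its
    approximate Gaussian posterior N(mu_star, var_star) and average the copula density."""
    f = np.random.normal(mu_star, np.sqrt(var_star), size=n_samples)
    tau = 2.0 * stats.norm.cdf(f) - 1.0
    return np.mean(copula_pdf(u_new, v_new, tau))
```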
12 Ingredient 4: Expectation Propagation
Solution for Problem 1: approximate the likelihood factors with Gaussians. EP tunes the site parameters $\hat{m}_i$ and $\hat{v}_i$ of each Gaussian approximate factor $\hat{q}_i(f_i)$ by minimizing
$$\mathrm{KL}\!\left[\, q_i(f_i)\, Q(\mathbf{f})\, [\hat{q}_i(f_i)]^{-1} \,\big\|\, Q(\mathbf{f}) \,\right],$$
where $q_i$ is the exact likelihood factor and $Q$ is the current Gaussian posterior approximation. After convergence, the new (approximate) posterior is Gaussian. An approximation of the evidence and its gradient is returned.
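For intuition, here is a hedged sketch of a single EP site update for a model like this one, using numerical integration to match the moments of the tilted distribution. `log_t` stands for the log copula likelihood of one data point as a function of $f_i$; real implementations add damping and handle negative cavity variances.

```python
import numpy as np

def ep_site_update(mu_i, var_i, m_hat, v_hat, log_t, n_grid=200):
    """One EP update for site i with exact factor t(f_i) = exp(log_t(f_i)).
    (mu_i, var_i): current marginal of f_i; (m_hat, v_hat): current Gaussian site."""
    # Cavity distribution: current marginal divided by the site approximation.
    prec_cav = 1.0 / var_i - 1.0 / v_hat              # assumed positive in this sketch
    v_cav = 1.0 / prec_cav
    m_cav = v_cav * (mu_i / var_i - m_hat / v_hat)
    # Moments of the tilted distribution t(f) * N(f | m_cav, v_cav) by quadrature.
    f = m_cav + np.sqrt(v_cav) * np.linspace(-8.0, 8.0, n_grid)
    w = np.exp(log_t(f)) * np.exp(-0.5 * (f - m_cav) ** 2 / v_cav)
    w /= np.trapz(w, f)
    m_new = np.trapz(w * f, f)
    v_new = np.trapz(w * (f - m_new) ** 2, f)
    # New site parameters so that site * cavity matches the tilted moments.
    v_hat_new = 1.0 / (1.0 / v_new - 1.0 / v_cav)
    m_hat_new = v_hat_new * (m_new / v_new - m_cav / v_cav)
    return m_hat_new, v_hat_new

# Toy usage with a Gaussian pseudo-likelihood centred at 1.
print(ep_site_update(0.0, 1.0, 0.0, 10.0, lambda f: -0.5 * (f - 1.0) ** 2))
```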
13 Ingredient 5: FITC Approximation for GPs
Solution for Problem 2: use a covariance matrix $\tilde{\mathbf{K}}$ with a simple form,
$$\mathbf{K} \approx \tilde{\mathbf{K}} = \mathbf{Q} + \mathrm{diag}(\mathbf{K} - \mathbf{Q}), \qquad \mathbf{Q} = \mathbf{K}_{n n_0} \mathbf{K}_{n_0 n_0}^{-1} \mathbf{K}_{n_0 n},$$
where a subset of $n_0 \ll n$ pseudo-inputs is used. The $n_0$ pseudo-inputs are tuned by maximizing the evidence returned by EP. New cost: $O(n n_0^2)$.
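A sketch of the FITC covariance construction. The pseudo-inputs are taken here as a random subset and the kernel is an isotropic exponentiated quadratic for brevity; in GPCC the pseudo-inputs are tuned by maximising the EP evidence.

```python
import numpy as np

def rbf(A, B, s=1.0, lam=1.0):
    """Isotropic exponentiated-quadratic kernel between row sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return s * np.exp(-lam * d2)

def fitc_covariance(Z, Z0, s=1.0, lam=1.0, jitter=1e-8):
    """K_tilde = Q + diag(K - Q), with Q = K_{n n0} K_{n0 n0}^{-1} K_{n0 n}."""
    K = rbf(Z, Z, s, lam)
    K_nn0 = rbf(Z, Z0, s, lam)
    K_n0n0 = rbf(Z0, Z0, s, lam) + jitter * np.eye(len(Z0))
    Q = K_nn0 @ np.linalg.solve(K_n0n0, K_nn0.T)
    return Q + np.diag(np.diag(K - Q))

Z = np.random.randn(200, 2)
Z0 = Z[np.random.choice(200, 20, replace=False)]   # n0 = 20 pseudo-inputs
K_tilde = fitc_covariance(Z, Z0)
```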
14 All Together: The GPCC Method
Input: data $\{(u_i, v_i, \mathbf{z}_i)\}_{i=1}^{n}$, a parametric copula density function $c(u, v|\tau)$, and the number of pseudo-inputs $n_0$.
Algorithm: until convergence, use EP to get the approximate Gaussian posterior, the evidence and the evidence gradient; then follow the evidence gradient to tune the pseudo-inputs and the parameters of the covariance function.
Prediction of $(u_\star, v_\star)$ at $\mathbf{z}_\star$: sample $f_\star = f(\mathbf{z}_\star)$ from the EP posterior and average over copulas with $\tau = \sigma(f_\star)$.
Output: conditional copula model.
15 Experimental Results: Synthetic Test
$Z$ is uniform in $[-6, 6]$; $(U, V)$ are Gaussian with correlation $\frac{3}{4}\sin(Z)$. We sample the data $\{(\hat{P}_U(u_i), \hat{P}_V(v_i), \hat{P}_Z(z_i))\}_{i=1}^{50}$.
[Figure: estimated $\tau_{U,V|Z}$ as a function of $P_Z(Z)$ for GPVINE, MLLVINE and the true dependence]
MLLVINE is a nearest-neighbor algorithm proposed by Acar (2011).
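A sketch of how a synthetic data set with these properties can be generated; the random seed and the use of average ranks for the empirical cdfs are my own choices, not specified on the slide.

```python
import numpy as np
from scipy import stats

n = 50
rng = np.random.default_rng(0)
z = rng.uniform(-6.0, 6.0, size=n)
rho = 0.75 * np.sin(z)                                     # correlation (3/4) sin(Z)
eps = rng.standard_normal((n, 2))
u_raw = eps[:, 0]
v_raw = rho * eps[:, 0] + np.sqrt(1.0 - rho**2) * eps[:, 1]  # corr(u_raw, v_raw | z) = rho
# Empirical cdfs (pseudo-observations): hat P(x_i) = rank_i / (n + 1)
to_unit = lambda x: stats.rankdata(x) / (n + 1)
data = np.column_stack([to_unit(u_raw), to_unit(v_raw), to_unit(z)])
```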
16 Experimental Results: Real-World Data
Real-world data: UCI datasets, meteorological data, mineral concentrations and financial data. Average test log-likelihood when the vine is limited to two trees; results are averages over 50 random train/test splits. SVINE denotes vines that use the simplifying assumption.
17 Experimental Results: Using More Trees
[Figure: comparison of GPVINE and SVINE as more trees are used]
18 Experimental Results: Weather in Barcelona
$\tau$(atmospheric pressure, cloud percentage $|$ {latitude, longitude}).
19 GP Conditional Copulas and Financial Time Series
We learn time-changing dependencies in financial returns. Three parametric copulas: the Gaussian copula, the Student's t copula and the symmetrized Joe-Clayton copula.
The marginals are described using AR-GARCH processes:
$$x_t = \phi_0 + \phi_1 x_{t-1} + \sigma_t \epsilon_t, \qquad \sigma_t = \omega + \alpha\, \sigma_{t-1}\!\left(|\epsilon_{t-1}| - \gamma\, \epsilon_{t-1}\right) + \beta\, \sigma_{t-1}.$$
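A simulation sketch of this marginal model as read from the slide; the exact asymmetric GARCH variant (absolute-value recursion for the standard deviation) is my interpretation, and the parameter values and initialisation are arbitrary.

```python
import numpy as np

def simulate_ar_garch(T, phi0=0.0, phi1=0.1, omega=0.05, alpha=0.1,
                      gamma=0.3, beta=0.85, seed=0):
    """Simulate x_t = phi0 + phi1*x_{t-1} + sigma_t*eps_t with
    sigma_t = omega + alpha*sigma_{t-1}*(|eps_{t-1}| - gamma*eps_{t-1}) + beta*sigma_{t-1}."""
    rng = np.random.default_rng(seed)
    x = np.zeros(T)
    sigma = np.full(T, omega / (1.0 - beta))   # crude initialisation of the volatility
    eps = rng.standard_normal(T)
    for t in range(1, T):
        sigma[t] = (omega + alpha * sigma[t - 1] * (abs(eps[t - 1]) - gamma * eps[t - 1])
                    + beta * sigma[t - 1])
        x[t] = phi0 + phi1 * x[t - 1] + sigma[t] * eps[t]
    return x, sigma
```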
20 Experimental Results: Conditioning on Time: Equities
[Table: average test log-likelihood for pairs of equity/index returns involving NASDAQ, HON, HD, BA, AXP, CSX, CNW, EIX, ED, IBM, HPQ, HSBC, BARC and RBS; one column per pair and one row per method (GPCC-G, GPCC-T, GPCC-SJC, HMM, TVC, DSJCC, CONST-G, CONST-T, CONST-SJC); numeric entries omitted]
Data is preprocessed using AR(1)-GARCH(1,1) models.
GPCC-G: GPs on $\tau(t)$ for the Gaussian copula. GPCC-T: GPs on $\nu(t)$ and $\tau(t)$ for the Student's t copula. GPCC-SJC: GPs on $\tau_L(t)$ and $\tau_U(t)$ for the symmetrized Joe-Clayton copula. HMM: high/low-correlation t copula (Jondeau 2006). TVC: dynamic t copula (Tse 2002). DSJCC: dynamic SJC copula (Patton 2006).
21 Experimental Results: Conditioning on Time: FOREX
[Figure: time-varying dependence estimates for EUR-CHF under GPCC-T and GPCC-SJC (posterior means), October 2006 to August 2010]
[Table: average test log-likelihood for AUD, CAD, JPY, NOK, SEK, EUR, GBP and NZD against USD; one row per method (GPCC-G, GPCC-T, GPCC-SJC, HMM, TVC, DSJCC, CONST-G, CONST-T, CONST-SJC); numeric entries omitted]
Data against USD from 02/01/1990 to 15/01/2013 (6011 points). Decorrelations occur in Fall 2008 (onset of the global recession) and Summer 2010 (worsening of the European sovereign debt crisis).
22 Summary and Conclusions
Vines are flexible dependence models. They factorize a copula density into a product of conditional bivariate copulas.
In practical implementations of vines, some of the conditional dependencies in the bivariate copulas are usually ignored. To avoid this, we have proposed a method for the estimation of fully conditional vines using Gaussian processes (GPVINE).
GPVINE outperforms a baseline that ignores conditional dependencies (SVINE) and other alternatives (MLLVINE).
The building blocks of GPVINE are GP conditional copulas (GPCC). GPCC obtains state-of-the-art performance in the task of predicting time-changing dependencies in financial data.
23 Thanks!
Lopez-Paz, D., Hernandez-Lobato, J. M. and Ghahramani, Z. Gaussian Process Vine Copulas for Multivariate Dependence. International Conference on Machine Learning (ICML 2013).
Acar, E. F., Craiu, R. V., and Yao, F. Dependence calibration in conditional copulas: A nonparametric approach. Biometrics, 67(2), 2011.
Bedford, T. and Cooke, R. M. Vines: a new graphical model for dependent random variables. The Annals of Statistics, 30(4), 2002.
Minka, T. P. Expectation Propagation for approximate Bayesian inference. Proceedings of the 17th Conference in Uncertainty in Artificial Intelligence, 2001.
Naish-Guzman, A. and Holden, S. B. The generalized FITC approximation. Advances in Neural Information Processing Systems 20.
Patton, A. J. Modelling asymmetric exchange rate dependence. International Economic Review, 47(2), 2006.