Hierarchical Time-series Modelling for Haemodialysis. Tingting Zhu, 24th October 2016
2 Content
- Background on haemodialysis
- Methods
  o Gaussian Process Regression (GPR)
  o Hierarchical Gaussian Process Regression (HGPR)
- Application of GPR to the haemodialysis dataset
  o Pre-processing and fitting time-series using GPR
  o Univariate GPR forecasting: prediction of deterioration
  o HGPR forecasting for multiple time-series
  o Clustering of time-series trajectories
3 Background - Haemodialysis
Background:
- Prevalence of end-stage renal failure is 861 per million population in the UK;
- 50% of these patients require renal replacement therapy with haemodialysis.
The process:
(1) Remove excess fluid from the body;
(2) Restore electrolyte (such as sodium, potassium, and phosphate) balance;
(3) Clear waste products (such as urea) from the blood;
(4) 3 sessions per week, each lasting up to 4 hours.
4 Background - Haemodialysis
The problem(s): Rapid changes in blood fluid levels occur during treatment, causing:
- cramps, nausea, and vomiting;
- intra-dialytic hypotension (IDH): a sudden fall in blood pressure [a reduction in blood volume resulting from an imbalance between rapid fluid withdrawal and vascular refilling from the interstitial space into the blood stream].
The challenges:
- Discrepancy in the definition of IDH:
  - haemodynamic (fluid dynamics of blood flow) status varies between sessions;
  - inaccurate measurement of non-invasive systolic blood pressure (derived from mean arterial pressure).
- Real-time detection/early warning of IDH.
Definition of IDH from the Kidney Disease Outcomes Quality Initiative: a decrease in systolic blood pressure by 20 mmHg, or a decrease in mean arterial pressure by 10 mmHg, associated with symptoms that include: abdominal discomfort; yawning; sighing; nausea; vomiting; muscle cramps; restlessness; dizziness or fainting; and anxiety.
5 Gaussian Process Regression (GPR)
Gaussian Process for Regression: y can be considered as related to an underlying function f(x) through a Gaussian noise model:
    y = f(x) + ε,  ε ~ N(0, σ_n²)   (equivalently, y ~ N(f(x), σ_n²))
Assuming f(x) has a Gaussian Process (GP) prior:
    f(x) ~ GP(m(x), k(x, x'))
where m(x) is the mean function of the GP and k(x, x') is a covariance function describing the relationship among the y values, determined according to the distance between the x values. Hence, we can define y as:
    y ~ GP(m(x), k(x, x') + σ_n² I)
Noting that different covariance functions are available, a common one is the squared-exponential covariance function:
    k(x, x') = α exp{−γ(x − x')²}
where the amplitude (α) and relative length-scale (γ) are hyperparameters.
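The noisy-observation covariance above can be assembled directly in a few lines of NumPy. This is an illustrative sketch (the hyperparameter values and the prior-sample step are our own, not the settings used in the talk):

```python
import numpy as np

def sq_exp_kernel(x1, x2, alpha=1.0, gamma=1.0):
    """Squared-exponential covariance: k(x, x') = alpha * exp(-gamma * (x - x')^2)."""
    d = x1[:, None] - x2[None, :]
    return alpha * np.exp(-gamma * d**2)

# Covariance of the noisy observations y = f(x) + N(0, sigma_n^2):
x = np.linspace(0.0, 10.0, 50)
sigma_n = 0.1
K_y = sq_exp_kernel(x, x) + sigma_n**2 * np.eye(len(x))

# K_y is symmetric positive definite, so it admits a Cholesky factor,
# which lets us draw a sample path from the GP prior over y.
L = np.linalg.cholesky(K_y + 1e-9 * np.eye(len(x)))
rng = np.random.default_rng(0)
sample = L @ rng.standard_normal(len(x))
```

Each diagonal entry of K_y equals α + σ_n², and nearby inputs get covariance close to α, which is what makes the sampled paths smooth.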
6 Gaussian Process Regression (GPR)
GPR Prediction: Conditioned on the training set {x, y}, the distribution of an unknown output y* at x* is defined as:
    y* | x*, x, y ~ N(E[y*], Var[y*])
where
    E[y*] = k(x*, x) k(x, x)⁻¹ y
    Var[y*] = k(x*, x*) − k(x*, x) k(x, x)⁻¹ k(x*, x)ᵀ
Log Marginal Likelihood: To infer the hyperparameters (denoted θ) in the covariance function, we can compute the probability of the data given the hyperparameters (i.e., the marginal likelihood):
    p(y | x, θ) = ∫ p(y | f, x) p(f | x, θ) df
where we have marginalised over the function values f. Taking the log of the likelihood:
    L = log p(y | x, θ) = −(1/2) log|k(x, x) + σ_n² I| − (1/2) (y − μ)ᵀ [k(x, x) + σ_n² I]⁻¹ (y − μ) − (N/2) log(2π)
                           [complexity penalty]          [data-fit measure]                                 [constant]
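The predictive equations and the log marginal likelihood above translate almost line-for-line into NumPy. A minimal sketch, assuming a zero mean function (μ = 0) and fixed hypothetical hyperparameters rather than ones optimised from L:

```python
import numpy as np

def gpr_predict_and_lml(x, y, x_star, alpha=1.0, gamma=1.0, sigma_n=0.1):
    """Predictive mean/covariance and log marginal likelihood for zero-mean GPR."""
    def k(a, b):
        return alpha * np.exp(-gamma * (a[:, None] - b[None, :])**2)

    K = k(x, x) + sigma_n**2 * np.eye(len(x))   # k(x, x) + sigma_n^2 I
    K_inv_y = np.linalg.solve(K, y)

    # E[y*] = k(x*, x) K^{-1} y ;  Var[y*] = k(x*, x*) - k(x*, x) K^{-1} k(x*, x)^T
    K_star = k(x_star, x)
    mean = K_star @ K_inv_y
    var = k(x_star, x_star) - K_star @ np.linalg.solve(K, K_star.T)

    # L = -1/2 log|K| - 1/2 y^T K^{-1} y - N/2 log(2 pi)
    sign, logdet = np.linalg.slogdet(K)
    lml = -0.5 * logdet - 0.5 * y @ K_inv_y - 0.5 * len(x) * np.log(2 * np.pi)
    return mean, var, lml

x = np.linspace(0, 2 * np.pi, 30)
y = np.sin(x)
mean, var, lml = gpr_predict_and_lml(x, y, x)
```

Note the three LML terms map onto the slide's labels: the log-determinant is the complexity penalty, the quadratic form is the data-fit measure, and the last term is constant in θ.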
7 Hierarchical Gaussian Process Regression (HGPR)
Assume we have N groups of time-series (such as physiological measurements over a time vector t_nr), and that they are similar to each other. The observed data for the N groups of time-series can be written as Y = {y_nr} taken at times T = {t_nr}, for groups n = 1, …, N and replicates r.
Under the model assumption, there is a latent GP function g_n(t) which governs all the time-series in group n. Given a draw of g_n, each series is then drawn from a GP as:
    f_nr(t) ~ GP(g_n(t), k_f(t, t'))
The hierarchical structure of GPs can be written as:
    g_n(t) ~ GP(0, k_g(t, t')),
    f_nr(t) ~ GP(g_n(t), k_f(t, t')).
Note that two points on the same series f_nr(t) are jointly Gaussian with zero mean and covariance k_g(t, t') + k_f(t, t'), whereas two points in different time-series are jointly distributed with covariance k_g(t, t').
[Figure: one subject with N = 3 sessions (Sessions 1-3) sharing a latent function; taken from Hensman et al. 2013]
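The covariance rule just stated (k_g + k_f within a series, k_g alone across series) fully determines the joint covariance matrix for one group of replicate sessions. A sketch, with squared-exponential choices for both k_g and k_f and made-up hyperparameter values:

```python
import numpy as np

def se(t1, t2, alpha, gamma):
    """Squared-exponential covariance."""
    d = t1[:, None] - t2[None, :]
    return alpha * np.exp(-gamma * d**2)

def hgpr_cov(times, series_id, ag=1.0, gg=0.5, af=0.3, gf=2.0):
    """Joint covariance for one group of replicate time-series.

    All pairs of points share k_g(t, t') (the common latent g_n);
    pairs within the same series f_nr additionally share k_f(t, t')."""
    same_series = series_id[:, None] == series_id[None, :]
    return se(times, times, ag, gg) + same_series * se(times, times, af, gf)

# Three replicate sessions observed on the same 10-point grid:
t = np.tile(np.linspace(0, 5, 10), 3)
sid = np.repeat([0, 1, 2], 10)
K = hgpr_cov(t, sid)
```

With these values the within-series diagonal is α_g + α_f = 1.3, while two different sessions observed at the same time have covariance α_g = 1.0, matching the note above.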
8 Deeper HGPR
[Figure: Clusters 1…G, each containing subjects (Subject ID = 1…N), each with Sessions 1…K/M]
What does a cluster mean:
- Normal vs abnormal clusters of subjects;
- Different clusters of an abnormal population;
- Different clusters of a normal population.
The deeper hierarchy adds a cluster-level function h_i(t):
    h_i(t) ~ GP(0, k_h(t, t')),
    g_n(t) ~ GP(h_i(t), k_g(t, t')),
    f_nr(t) ~ GP(g_n(t), k_f(t, t')).
9 Haemodialysis Dataset
60 patients were recruited for the haemodialysis study; however, only 35 subjects had continuous blood pressure measurements. 4 vital signs (HR, SBP, MAP, and SpO2) were considered, each extracted from a different sensor. Data taken from Clare R. MacEwen, PhD thesis, 2016.
Definition of an intradialytic hypotension (IDH) event:
o MAP < 60 mmHg (cerebral ischemia) AND SBP < 80% SBP_0 (i.e., SBP at baseline);
o OR MAP < 60 mmHg when there is no baseline SBP.
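The two-branch event definition above can be written as a small helper. This is a sketch, not the study's code; the function name and array interface are our own:

```python
import numpy as np

def idh_events(map_vals, sbp_vals, sbp_baseline=None):
    """Flag IDH samples: MAP < 60 mmHg AND SBP < 80% of baseline SBP;
    if no baseline SBP is available, MAP < 60 mmHg alone."""
    low_map = np.asarray(map_vals, float) < 60.0
    if sbp_baseline is None:
        return low_map
    return low_map & (np.asarray(sbp_vals, float) < 0.8 * sbp_baseline)

# With baseline SBP = 130 mmHg, the SBP threshold is 0.8 * 130 = 104 mmHg:
flags = idh_events([70, 55, 58], [120, 90, 100], sbp_baseline=130)
# -> [False, True, True]: only the last two samples satisfy both conditions
```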
10 Pre-processing and Fitting Time-series using GPR
11 Raw session data → Downsample → GPR outlier removal → GPR-fitted time-series
12 Raw session data → Downsample → GPR outlier removal → GPR-fitted time-series
[Figure: normalised LML and log SBP (mmHg) against time (min)]
GPR Outlier Removal:
- Estimate the normalised LMLs with respect to a user-defined window (such as a ±5 min window with 10 min overlap);
- Identify and remove outliers using a threshold;
- Iteratively refit the GPR after removal of outliers;
- Stop when over 5% of the data has been removed.
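The remove-refit-stop loop above can be sketched as follows. For self-containedness this sketch scores each point with a robust local z-score instead of the windowed normalised LML the talk uses (an assumption, labelled as such), but the control flow (threshold, iterative removal, 5% stopping rule) is the one described:

```python
import numpy as np

def iterative_outlier_removal(t, y, window=5.0, z_thresh=3.0, max_frac=0.05):
    """Iteratively drop the worst-scoring point until none exceed the
    threshold, or more than max_frac of the data has been removed.
    NOTE: local z-score used as a stand-in for the windowed normalised LML."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    keep = np.ones(len(y), bool)
    n0 = len(y)
    while True:
        if (n0 - keep.sum()) / n0 > max_frac:   # stop: >5% removed
            break
        scores = np.zeros(len(y))
        for i in np.flatnonzero(keep):
            # score point i against its neighbours within +/- window minutes
            near = keep & (np.abs(t - t[i]) <= window)
            near[i] = False
            if near.sum() < 3:
                continue
            mu, sd = y[near].mean(), y[near].std() + 1e-9
            scores[i] = abs(y[i] - mu) / sd
        worst = scores.argmax()
        if scores[worst] < z_thresh:            # stop: nothing left to remove
            break
        keep[worst] = False                      # remove, then refit/rescore
    return keep
```

On a smooth trace with one injected spike, the spike is removed on the first pass and the loop then terminates with the rest of the data intact.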
14 GPR fitting for an individual session: intervention at SBP < 80% SBP_baseline + MAP < 60 mmHg
15 Univariate GPR Forecasting - Prediction of Deterioration
16 Prediction of Deterioration using the Log Marginal Likelihood (LML)
Adaptive Training and Forecasting: [Figure: a training window and a forecast window sliding over the session, with the LML computed over each window]
17 Mean Predictive Log-Likelihood
[Figure: original signal with the IDH event marked, training vs forecasting regions, and the LML against time (mins)]
For each training set:
- estimate the mean of the LMLs in a 3-min forecast window.
Abnormal blood pressure (MAP < 60 mmHg) occurred at 684 mins; nurse intervention occurred at 687 mins.
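The per-window statistic above (the mean predictive log-likelihood of the incoming observations under the GP's forecast) can be sketched in a few lines, assuming the GP's Gaussian predictive marginals are given. The values below are invented for illustration:

```python
import numpy as np
from scipy.stats import norm

def mean_predictive_ll(y_obs, mu_pred, var_pred):
    """Mean per-sample predictive log-likelihood of the observations in a
    forecast window under the GP's Gaussian predictive marginals."""
    sd = np.sqrt(np.maximum(np.asarray(var_pred, float), 1e-12))
    return norm.logpdf(np.asarray(y_obs, float), loc=mu_pred, scale=sd).mean()

# Forecast: mean 0, predictive sd 0.2, over a 6-sample window.
mu, var = np.zeros(6), np.full(6, 0.04)
ll_normal = mean_predictive_ll([0.1, -0.1, 0.0, 0.05, -0.05, 0.1], mu, var)
ll_deteriorating = mean_predictive_ll([0.8, 1.0, 1.2, 1.1, 0.9, 1.0], mu, var)
```

Observations that track the forecast score high; a drifting (deteriorating) signal scores sharply lower, which is what makes the statistic usable as an early-warning score.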
18 HGPR Forecasting for Multiple Time-series
19 GPR Forecasting for Multiple Vital Signs
[Figure: HR, MAP, SBP, and SpO2 against time (mins), with LML_HR, LML_MAP, LML_SBP, and LML_SPO2 combined into a fused/latent LML]
20 Data Fusion on Forecasted LMLs using BCLA-MAP
Assumptions: the LML values of the vital signs are independent, and each vital sign's LML is conditionally independent.
21 Data Fusion on Forecasted LMLs using HGPR
[Figure: raw LML values for HR, MAP, SBP, and SPO2, and the normalised LML values (zero mean, unit variance)]
22 Data Fusion on the GPR Mean Function of the Forecasted LMLs using HGPR
[Figure: GPR mean functions of LML_HR, LML_MAP, LML_SBP, and LML_SPO2, combined into a fused/latent LML]
23 [Figure: raw LML values vs GPR mean functions of the LML values, for HR, MAP, SBP, and SPO2]
24 Clustering of Time-series Trajectories
25 Derivation of Latent Trajectories using HGPR
(1) Derive the latent GPR mean function from the session-wise GPR mean functions.
[Figure: abnormal sessions vs normal sessions]
26 Hierarchical Clustering of Latent Trajectories
(2) Cluster the latent mean functions using hierarchical clustering.
[Figure: abnormal population (N = 29), normal population (N = 24), and abnormal + normal populations (N = 53)]
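Step (2) above is standard agglomerative clustering applied to whole trajectories. A sketch using SciPy, with synthetic stand-ins for the latent mean functions (the two trajectory shapes and the cluster count here are illustrative, not the study's data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)

# Two synthetic families of latent mean trajectories:
# flat-ish ("normal") vs steadily declining ("abnormal").
normal = np.array([0.05 * rng.standard_normal(t.size) for _ in range(5)])
abnormal = np.array([-2.0 * t + 0.05 * rng.standard_normal(t.size) for _ in range(5)])
trajectories = np.vstack([normal, abnormal])

# Agglomerative (hierarchical) clustering on Euclidean distances
# between whole trajectories, cut at two clusters.
Z = linkage(pdist(trajectories), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

Each row of `trajectories` plays the role of one latent mean function; cutting the dendrogram at two clusters recovers the two families.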
27 Deeper HGPR of Latent Trajectories
[Figure: Cluster 1 containing Subject ID = 1 (Sessions 1…K) through Subject ID = 10 (Sessions 1…M)]
28 Future Work
- Normalise the LML trajectory of each vital sign to better infer a latent LML trajectory for a session;
- Time-series modelling across sessions;
- Focus on windowed HGPR (2 hrs) prior to an event;
- Deeper windowed HGPR;
- Non-parametric clustering:
  o using a mixture of hierarchical GPRs with a Dirichlet distribution/process;
  o overlapping mixtures of GPRs.
29 Thank You
Acknowledgements:
o Kate Niehaus and Glen Colopy;
o Prof David Clifton and Prof Chris Pugh;
o The CHI lab;
o Funding bodies: NIHR and the EPSRC.
References:
o Ebden, M.: Gaussian processes: A quick introduction, arXiv: v2.
o Rasmussen, C.E.: Gaussian Processes for Machine Learning (2006).
o MacEwen, C.R.: Can data fusion techniques predict adverse physiological events during haemodialysis? PhD thesis.
o Colopy, G.W., Pimentel, M.A.F., Roberts, S.J., and Clifton, D.A.: Bayesian Gaussian Processes for Identifying the Deteriorating Patient. IEEE Engineering in Medicine & Biology Conference, Orlando, Florida, USA, 2016.
o Zhu, T.T., Dunkley, N., Behar, J., Clifton, D.A., and Clifford, G.D.: Fusing Continuous-Valued Medical Labels Using a Bayesian Model. Annals of Biomedical Engineering 43(12), 2015.
o Hensman, J., Lawrence, N.D., and Rattray, M.: Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters. BMC Bioinformatics 14(1):252, 2013.
o Hensman, J., Rattray, M., and Lawrence, N.D.: Fast nonparametric clustering of structured time-series. IEEE Transactions on Pattern Analysis and Machine Intelligence 37(2), 2015.
o Lázaro-Gredilla, M., Van Vaerenbergh, S., and Lawrence, N.D.: Overlapping mixtures of Gaussian processes for the data association problem. Pattern Recognition 45(4), 2012.
More informationIntroduction to Gaussian Processes
Introduction to Gaussian Processes Neil D. Lawrence GPSS 10th June 2013 Book Rasmussen and Williams (2006) Outline The Gaussian Density Covariance from Basis Functions Basis Function Representations Constructing
More informationModel Based Clustering of Count Processes Data
Model Based Clustering of Count Processes Data Tin Lok James Ng, Brendan Murphy Insight Centre for Data Analytics School of Mathematics and Statistics May 15, 2017 Tin Lok James Ng, Brendan Murphy (Insight)
More informationNon-parametric Bayesian Modeling and Fusion of Spatio-temporal Information Sources
th International Conference on Information Fusion Chicago, Illinois, USA, July -8, Non-parametric Bayesian Modeling and Fusion of Spatio-temporal Information Sources Priyadip Ray Department of Electrical
More informationUsing Tactile Feedback and Gaussian Process Regression in a Dynamic System to Learn New Motions
Using Tactile Feedback and Gaussian Process Regression in a Dynamic System to Learn New Motions MCE 499H Honors Thesis Cleveland State University Washkewicz College of Engineering Department of Mechanical
More informationMaximum Likelihood Estimation. only training data is available to design a classifier
Introduction to Pattern Recognition [ Part 5 ] Mahdi Vasighi Introduction Bayesian Decision Theory shows that we could design an optimal classifier if we knew: P( i ) : priors p(x i ) : class-conditional
More informationTree-structured Gaussian Process Approximations
Tree-structured Gaussian Process Approximations Thang Bui joint work with Richard Turner MLG, Cambridge July 1st, 2014 1 / 27 Outline 1 Introduction 2 Tree-structured GP approximation 3 Experiments 4 Summary
More informationCPSC 540: Machine Learning
CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is
More informationLearning with Noisy Labels. Kate Niehaus Reading group 11-Feb-2014
Learning with Noisy Labels Kate Niehaus Reading group 11-Feb-2014 Outline Motivations Generative model approach: Lawrence, N. & Scho lkopf, B. Estimating a Kernel Fisher Discriminant in the Presence of
More informationOnline Bayesian Transfer Learning for Sequential Data Modeling
Online Bayesian Transfer Learning for Sequential Data Modeling....? Priyank Jaini Machine Learning, Algorithms and Theory Lab Network for Aging Research 2 3 Data of personal preferences (years) Data (non-existent)
More informationGaussian Processes (10/16/13)
STA561: Probabilistic machine learning Gaussian Processes (10/16/13) Lecturer: Barbara Engelhardt Scribes: Changwei Hu, Di Jin, Mengdi Wang 1 Introduction In supervised learning, we observe some inputs
More informationSparse Linear Models (10/7/13)
STA56: Probabilistic machine learning Sparse Linear Models (0/7/) Lecturer: Barbara Engelhardt Scribes: Jiaji Huang, Xin Jiang, Albert Oh Sparsity Sparsity has been a hot topic in statistics and machine
More informationGaussian Process Regression with K-means Clustering for Very Short-Term Load Forecasting of Individual Buildings at Stanford
Gaussian Process Regression with K-means Clustering for Very Short-Term Load Forecasting of Individual Buildings at Stanford Carol Hsin Abstract The objective of this project is to return expected electricity
More information