Hierarchical expectation propagation for Bayesian aggregation of average data
Transcription
1 Hierarchical expectation propagation for Bayesian aggregation of average data
Andrew Gelman, Columbia University
Sebastian Weber, Novartis
also Bob Carpenter, Daniel Lee, Frédéric Bois, Aki Vehtari, and Amy Racine
23 Sept 2015
2 Motivation
- Trial design with hierarchical models using patient data and summaries of external data
- More generally: incorporating aggregate data into statistical analysis
3 Scenario
- Non-inferiority and bio-similarity trials: test of a candidate substance against an active control
- Candidate substance developed in-house: lots of raw data, individual patient-level longitudinal data
- Active control developed externally: we only have data summaries from publications or submission documents
- New trial will be similar to earlier trials of the active control drug, but still to some extent different
- Also, the conditions differ in the two experiments
4 Meta-analysis
- Partial pooling of information from two studies
- Statistical challenges:
  - Only summary data are available for one study
  - Nonlinear model, so no closed-form model for the average data
  - Only 2 groups, so hierarchical modeling is not so easy; it requires strong prior information
5 Drug disease modeling of drug responses
- Key application is clinical trial simulation for study design:
  - Simulation of drug responses of patients over time
  - New designs can be considered
  - Different endpoints can be explored, including time to event
- Hierarchical (population) models require patient-level data
- Problem: same disease progression, same patient population, and likely similar mechanism of action, but the population model describes only the in-house drug
- How to learn from published summaries of longitudinal data in the context of nonlinear hierarchical models?
6 Semi-Mechanistic Turn-Over Models
- Linking pharmacokinetics (PK) with pharmacodynamics (PD)
- PD response «R» can be safety or efficacy related, driven by the PK effect on a «bio-compartment»
- Zero-order «production» / first-order «elimination» of response R
- 4 variants: zero- or first-order inhibition or stimulation due to PK
- Drug response with respect to a reference state (placebo)
- Some regimens may lead to PK causing oscillations, and hence oscillations in response
Source: Peletier LA et al., J Pharmacokinet Pharmacodyn
Bayes Pharma, S. Weber et al., 20 May 15, Bayesian aggregation of summary data
7 General formulation
- Direct data y = (y_jt; j = 1,...,J; t = 1,...,T) with model p(y_j | α_j, φ)
- Hierarchical model p(α, φ) = p(φ) ∏_{j=1}^J p(α_j | φ)
- Use Stan to draw posterior simulations from p(α, φ | y) ∝ p(φ) ∏_{j=1}^J p(α_j | φ) ∏_{j=1}^J p(y_j | α_j, φ)
- External dataset y′ = (y′_jt; j = 1,...,J′; t = 1,...,T′)
- Observe only the time series of averages: ȳ′ = (ȳ′_1,...,ȳ′_T)
- External dataset has parameters φ′ = φ + δ
- Informative prior on δ
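As a toy illustration of this formulation (not the authors' code), the generative process for the direct data can be sketched in Python; here α_j is a patient-level intercept and φ = (μ_α, τ, σ_y), all names being illustrative assumptions:

```python
import numpy as np

def simulate_direct_data(J, T, phi, rng):
    """Simulate the hierarchical model: draw patient-level parameters
    alpha_j ~ p(alpha | phi), then observations y_jt ~ p(y | alpha_j, phi).
    Toy instance: alpha_j is a patient intercept, y_jt is Gaussian noise
    around it."""
    mu_alpha, tau, sigma_y = phi
    alpha = rng.normal(mu_alpha, tau, size=J)             # alpha_j | phi
    y = rng.normal(alpha[:, None], sigma_y, size=(J, T))  # y_jt | alpha_j, phi
    return alpha, y

rng = np.random.default_rng(0)
alpha, y = simulate_direct_data(J=50, T=10, phi=(1.0, 0.5, 0.3), rng=rng)
```

Fitting p(α, φ | y) to such data would be done in Stan, as the slide says; the sketch above only shows the data-generating side used throughout the later simulation steps.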
8 Simulated Example Data Set
- 50 patients per treatment arm: placebo, treatment 1 & 2
- Solid line is the true population mean
- No population differences
- No between-trial variation
9 Integrating external summary data
- Simplest case: the 2 datasets follow the same model
  - Populations must be comparable
  - Parameters must be identical
  - Natural disease progression must be the same
- Relax these assumptions: φ′ = φ + δ
- Extreme cases:
  - δ ≡ 0: complete pooling
  - p(δ) ∝ 1: no pooling
10 Natural Bayesian approach doesn't work
- Consider the external data as latent variables: y′ is not observed; all we see is ȳ′
- Computationally expensive: for example, 300 patients and 15 measurements per patient means 4500 latent variables
- Instead, we'll model ȳ′ directly
- Take advantage of the central limit theorem
11 Multivariate normal approximation to p(ȳ′ | φ′)
- Given φ′, simulate data from 1000 hypothetical patients
- Compute the mean M and the T × T covariance matrix S
- Approximate p(ȳ′ | φ′) by N(ȳ′ | M, S/J′)
- J′ = number of patients in the external data, not the same as the 1000
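A minimal sketch of this approximation step (illustrative Python; `toy_patient` and its parameters are made-up assumptions, not the authors' model):

```python
import numpy as np

def mvn_approx(simulate_patient, phi_ext, rng, n_sim=1000):
    """Approximate p(ybar' | phi') via the central limit theorem:
    simulate n_sim hypothetical patients at phi'; the average of J'
    patients is then approximately N(M, S / J'), where M and S are the
    mean and T x T covariance of a single patient's time series."""
    sims = np.array([simulate_patient(phi_ext, rng) for _ in range(n_sim)])
    M = sims.mean(axis=0)
    S = np.cov(sims, rowvar=False)
    return M, S

# Toy patient model: linear trend with a patient-specific slope.
def toy_patient(phi, rng):
    t = np.arange(5.0)
    slope = rng.normal(phi, 0.2)                     # between-patient variation
    return slope * t + rng.normal(0.0, 0.5, size=5)  # measurement noise

rng = np.random.default_rng(1)
M, S = mvn_approx(toy_patient, phi_ext=1.0, rng=rng)
```

Note that S is the single-patient covariance; dividing by J′ (the external sample size) when evaluating the likelihood is what makes the approximation reflect the precision of an average, not of one patient.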
12 Importance sampling algorithm
- Fit the model to the direct data; get draws from p(α, φ | y)
- For each draw of (α, φ):
  - Draw δ from the prior p(δ)
  - Compute φ′ = φ + δ
  - Simulate data from 1000 hypothetical patients
  - Approximate p(ȳ′ | φ′) by N(ȳ′ | M, S/J′)
  - Compute the importance ratio N(ȳ′ | M, S/J′)
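The loop above might look like the following sketch (illustrative Python with a scalar φ for simplicity; the toy patient model and all names are assumptions):

```python
import numpy as np

def mvn_logpdf(x, M, S):
    """Log density of N(x | M, S), computed with numpy only."""
    d = x - M
    sign, logdet = np.linalg.slogdet(S)
    quad = d @ np.linalg.solve(S, d)
    return -0.5 * (len(x) * np.log(2 * np.pi) + logdet + quad)

def importance_weights(phi_draws, ybar_ext, J_ext, delta_sd,
                       simulate_patient, rng, n_sim=1000):
    """For each posterior draw phi (from the fit to the direct data):
    draw delta from its prior, form phi' = phi + delta, simulate
    hypothetical patients, and weight the draw by N(ybar' | M, S/J')."""
    log_w = np.empty(len(phi_draws))
    for i, phi in enumerate(phi_draws):
        delta = rng.normal(0.0, delta_sd)   # delta ~ p(delta)
        phi_ext = phi + delta               # phi' = phi + delta
        sims = np.array([simulate_patient(phi_ext, rng) for _ in range(n_sim)])
        M, S = sims.mean(axis=0), np.cov(sims, rowvar=False)
        log_w[i] = mvn_logpdf(ybar_ext, M, S / J_ext)
    w = np.exp(log_w - log_w.max())         # stabilize before normalizing
    return w / w.sum()

# Toy patient model: T = 3 noisy measurements around phi.
def toy_patient(phi, rng):
    return rng.normal(phi, 0.5, size=3)

rng = np.random.default_rng(2)
w = importance_weights(np.array([0.9, 1.0, 1.1]), ybar_ext=np.ones(3),
                       J_ext=100, delta_sd=0.1,
                       simulate_patient=toy_patient, rng=rng)
```

Subtracting the maximum log weight before exponentiating is the standard trick to avoid underflow when the likelihood terms are small.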
13 Hierarchical expectation propagation (EP) algorithm
- Simple importance sampling won't work if p(δ) is broad
- Need an iterative algorithm
- At each step, sample from a pseudo-prior g(φ, δ)
- Multiply the importance ratios by p(φ, δ) / g(φ, δ)
- EP: match moments to get the update for g
- Use smoothed importance weights and stable moment matching
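The moment-matching step can be sketched as follows (illustrative Python; simple weight truncation is used here as a crude stand-in for the Pareto smoothed importance sampling the slide refers to):

```python
import numpy as np

def truncate_weights(w):
    """Crude stabilization: truncate raw weights at mean(w) * sqrt(S).
    A stand-in for Pareto smoothed importance weights."""
    return np.minimum(w, w.mean() * np.sqrt(len(w)))

def moment_match(draws, w):
    """EP-style update: given draws of (phi, delta) from the pseudo-prior
    g and their importance weights, return the mean and covariance of the
    tilted distribution; these moments define the next Gaussian g."""
    w = truncate_weights(w)
    w = w / w.sum()
    mean = w @ draws
    centered = draws - mean
    cov = centered.T @ (centered * w[:, None])  # sum_i w_i c_i c_i^T
    return mean, cov

rng = np.random.default_rng(3)
draws = rng.normal(size=(5000, 2))         # stand-in draws of (phi, delta)
w = np.exp(draws @ np.array([1.0, 0.0]))   # tilt toward larger first coordinate
mean, cov = moment_match(draws, w)
```

In this toy check the tilted distribution is N(1, 1) in the first coordinate and unchanged in the second, so the matched moments should come out near ([1, 0], I).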
14 Examples
- Hierarchical linear model:
  - Local data: y_jt ~ N(α_j1 + α_j2 x_t + β x_t², σ_y²), α_j ~ N(μ_α, Σ_α)
  - External data: y′_jt ~ N(α′_j1 + α′_j2 x_t + β x_t², σ_y²), α′_j ~ N(μ′_α, Σ_α), with δ = μ′_α − μ_α
- Hierarchical logistic
- Hierarchical PK/PD
15 Hierarchical linear model example: posterior mean ± sd from the EP algorithm from 3 starting points (red lines show the estimate from local data, blue includes aggregate data, green uses complete data). [Figure: per-step traces of the updating of mu_a[1], mu_a[2], b, sigma_a[1], sigma_a[2], delta[1], and delta[2], plus the Pareto shape k̂ and the efficiency of the importance weights.]
16 Hierarchical logistic example: posterior mean ± sd from the EP algorithm from 3 starting points (red lines show the estimate from local data, green uses complete data). [Figure: per-step traces of the updating of mu_a[1], mu_a[2], b, sigma_a[1], sigma_a[2], delta[1], and delta[2], plus the Pareto shape k̂ and the efficiency of the importance weights.]
17 Hierarchical PK/PD example: posterior mean ± sd from the EP algorithm from 3 starting points (red lines show the estimate from local data, green uses complete data). [Figure: per-step traces of the updating of lalpha_0, lalpha_s, lkappa, lemax_s, sigma_lalpha_0, sigma_lalpha_s, and delta[1], plus the Pareto shape k̂ and the efficiency of the importance weights.]
18 Discussion
- Use fake-data simulation to build trust in the results
- Hope to improve the efficiency of EP by approximating p(φ, δ) jointly (currently approximating p(φ) and p(δ) separately)
- Shift parameter δ instead of a "cut"
- A new way to think about meta-analysis
More information