Causal Inference: Discussion


1 Causal Inference: Discussion
Mladen Kolar, The University of Chicago Booth School of Business
Sept 23, 2016

2 Types of machine learning problems
Based on the information available:
- Supervised learning
- Reinforcement learning
- Unsupervised learning
M. Kolar (Chicago Booth), Causal Inference: Discussion, Sept 23, 2016

3 Bayesian networks
P(F, A, S, H, N) = P(F) P(A) P(S | F, A) P(H | S) P(N | S)
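The factorization can be checked mechanically: multiplying the local conditional tables and summing over all assignments must give total probability one. A minimal Python sketch with made-up CPT values (the numbers below are purely illustrative, not from the slides):

```python
import itertools

# Hypothetical CPTs for the five binary variables F, A, S, H, N; the numeric
# values are made up for illustration only.
P_F = {1: 0.1, 0: 0.9}                      # P(F)
P_A = {1: 0.2, 0: 0.8}                      # P(A)
P_S = {(1, 1): 0.9, (1, 0): 0.7,
       (0, 1): 0.4, (0, 0): 0.05}           # P(S = 1 | F, A)
P_H = {1: 0.8, 0: 0.1}                      # P(H = 1 | S)
P_N = {1: 0.6, 0: 0.2}                      # P(N = 1 | S)

def bern(p1, x):
    # probability of outcome x for a Bernoulli with P(X = 1) = p1
    return p1 if x == 1 else 1.0 - p1

def joint(f, a, s, h, n):
    # P(F, A, S, H, N) = P(F) P(A) P(S | F, A) P(H | S) P(N | S)
    return (P_F[f] * P_A[a] * bern(P_S[(f, a)], s)
            * bern(P_H[s], h) * bern(P_N[s], n))

# products of normalized factors must sum to one over all 2^5 assignments
total = sum(joint(*x) for x in itertools.product([0, 1], repeat=5))
```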

4 Bayesian networks
Probabilistic interpretation: a Bayesian network represents a distribution P when each variable is independent of its non-descendants conditional on its parents in the DAG.
Causal interpretation: there is a directed edge from A to B (relative to V) when A is a direct cause of B.

5 Markov Networks
Random vector X = (X_1, ..., X_p); graph G = (V, E) with p nodes represents conditional independence relationships between nodes. Useful for exploring associations between measured variables:
(a, b) ∉ E ⟺ X_a ⟂ X_b | X_{∖ab} (∖ab := V \ {a, b}),
i.e., P[X_a | X_b, X_{∖ab}] = P[X_a | X_{∖ab}]
(Koller and Friedman, 2009)

7 Two Common Markov Networks
Gaussian Markov network: X ∼ N(µ, Σ),
p(x) ∝ exp( −(1/2) (x − µ)^T Σ^{-1} (x − µ) )
The precision matrix Ω = Σ^{-1} encodes both the parameters and the graph structure.
Discrete Markov network (Ising model): X ∈ {−1, 1}^p,
p(x; Θ) ∝ exp( ∑_{a ∈ V} x_a θ_aa + ∑_{(a,b) ∈ V×V} x_a x_b θ_ab )
Θ = (θ_ab)_{ab} encodes the conditional independence relationships.
(Lauritzen, 1996; Koller and Friedman, 2009)
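The Gaussian case is easy to explore numerically: zeros of the precision matrix Ω are exactly the missing edges, even though the covariance Σ is dense. A small numpy sketch on a hypothetical chain graph (the 0.4 edge weight is illustrative):

```python
import numpy as np

# Hypothetical chain graph 1 - 2 - 3 - 4: the precision matrix Omega is
# tridiagonal (zeros off the chain), while the covariance Sigma is dense.
p = 4
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)

# Inverting Sigma recovers the zeros, i.e. the missing edges:
Omega_back = np.linalg.inv(Sigma)

# Partial correlation of nodes 1 and 3 given the rest is zero:
partial_corr_13 = -Omega_back[0, 2] / np.sqrt(Omega_back[0, 0] * Omega_back[2, 2])
```

Nodes 1 and 3 are marginally correlated (Σ[0, 2] ≠ 0) yet conditionally independent given the rest, which is exactly what the zero pattern of Ω expresses.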


10 Structure Learning Problem
Given an i.i.d. sample D_n = {x_i}_{i=1}^n from a distribution P ∈ P, learn the set of conditional independence relationships: Ĝ = Ĝ(D_n).
Gaussian Markov networks (Drton and Perlman, 2007):
- Form the maximum likelihood estimator of the covariance matrix
- Test for zeros in the precision matrix
Discrete Markov networks (Chickering, 1996):
- Hard to learn structure, since the log partition function cannot be evaluated efficiently

11 Structure Learning in High Dimensions
(Some) existing work:
- Gaussian graphical models: GLasso (Yuan and Lin, 2007), CLIME (Cai et al., 2011), neighborhood selection (Meinshausen and Bühlmann, 2006)
- Ising models: neighborhood selection (Ravikumar et al., 2009), composite likelihood (Xue et al., 2012)
- Exponential family graphical models: exponential (Yang et al., 2012, 2015), Poisson (Yang et al., 2013), mixed (Yang et al., 2014), ...
Recent overview: Drton and Maathuis (2016)

12 Neighborhood Selection
Local structure estimation:
θ̂_a = arg max_{θ_a ∈ R^p} l(θ_a; D_n) − λ ‖θ_a‖_1
Estimated neighborhood: N̂_a = {b ∈ V : θ̂_ab ≠ 0}

13-20 Neighborhood Selection (animated example)
[Figure: the local estimation is illustrated on an eight-node graph; for node 1, the penalized estimate θ̂_1 is nonzero only in the coordinates for nodes 2, 3, and 5, giving N̂_1 = {2, 3, 5}.]
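For the Gaussian case, this local estimation reduces to a lasso for each node's neighborhood. A population-level sketch using proximal gradient descent on a hypothetical chain graph (λ and the graph are illustrative choices, not from the slides):

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def neighborhood_lasso(Sigma, a, lam, iters=5000):
    """Minimize 0.5 g'S g - g's + lam*||g||_1 over g, where S = Sigma_II and
    s = Sigma_Ia (population-level nodewise lasso), via proximal gradient."""
    p = Sigma.shape[0]
    I = [j for j in range(p) if j != a]
    S, s = Sigma[np.ix_(I, I)], Sigma[I, a]
    step = 1.0 / np.linalg.eigvalsh(S)[-1]          # 1 / Lipschitz constant
    g = np.zeros(p - 1)
    for _ in range(iters):
        g = soft_threshold(g - step * (S @ g - s), step * lam)
    return {I[j] for j in range(p - 1) if abs(g[j]) > 1e-8}

# Hypothetical chain graph 0 - 1 - 2 - 3 - 4
p = 5
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)
nbhd = neighborhood_lasso(Sigma, a=0, lam=0.05)     # node 0 has the single neighbor 1
```

Because exact population covariances are used, the lasso support matches the true neighborhood; with sample covariances the same code gives the estimated neighborhood N̂_a.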

21 Implications for Science
Some questions remain unanswered:
- How can we quantify the uncertainty of an estimated graph structure?
- How certain are we that there is an edge between nodes a and b?
- How do we construct honest, robust tests about edge parameters?

22 Quantifying uncertainty
For the Gaussian graphical model: inference on values of the precision matrix Ω using an asymptotically normal estimator (Ren et al., 2015)
- covariate-adjusted version (Chen et al., 2015)
- time-varying extension (Wang and Kolar, 2014)
Exponential family graphical models (Wang and Kolar, 2016; Yu, Gupta, and Kolar, 2016)
Quantile graphical models (Belloni et al., 2016)

23 Transelliptical Graphical Models

24 Background: Nonparanormal model / Gaussian copula
Nonparanormal distribution: X ∼ NPN_p(Σ; f_1, ..., f_p) if
(f_1(X_1), ..., f_p(X_p))^T ∼ N(0, Σ)
(Liu et al., 2009)

25 Background: Transelliptical Distribution
Transelliptical distribution: X ∼ TE_p(Σ, ξ; f_1, ..., f_p) if
(f_1(X_1), ..., f_p(X_p))^T ∼ EC_p(0, Σ, ξ),
where Σ = [σ_ab]_{a,b} ∈ R^{p×p} is a correlation matrix and P[ξ = 0] = 0.
Elliptical distribution: Z ∼ EC_p(µ, Σ, ξ) if
Z = µ + ξ Σ^{1/2} U,
with ξ a random radius and U a random unit vector.
(Liu et al., 2012b)
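The stochastic representation Z = µ + ξ Σ^{1/2} U is easy to simulate: draw a direction uniformly on the sphere, scale by a random radius, and correlate through Σ^{1/2}. A numpy sketch (the choice ξ = √(χ²_p), which recovers the Gaussian, and the Σ below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_elliptical(n, mu, Sigma, radius):
    """Z = mu + xi * Sigma^{1/2} U with U uniform on the unit sphere and
    xi = radius(n) a positive random radius: a sketch of EC_p(mu, Sigma, xi)."""
    p = len(mu)
    G = rng.standard_normal((n, p))
    U = G / np.linalg.norm(G, axis=1, keepdims=True)   # uniform direction
    xi = radius(n)[:, None]                            # random radius
    A = np.linalg.cholesky(Sigma)                      # A A^T = Sigma
    return mu + xi * (U @ A.T)

mu = np.zeros(3)
Sigma = np.array([[1.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.0]])
# With xi = sqrt(chi^2_p) the draw is exactly N(mu, Sigma); other radii
# (e.g. heavy-tailed) give non-Gaussian elliptical laws with the same Sigma.
Z = sample_elliptical(100_000, mu, Sigma, lambda n: np.sqrt(rng.chisquare(3, n)))
```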

26 Tail dependence
Elliptical and transelliptical distributions allow for heavy tail dependence between variables.
Example: (X_1, X_2) multivariate t-distribution with d degrees of freedom.
Tail correlations: Corr( 1I{X_1 ≥ q_α^{X_1}}, 1I{X_2 ≥ q_α^{X_2}} )
[Figure: tail correlation as a function of the quantile α for t-distributions with several degrees of freedom (d = 0.1, 1, 10, ...) and the Gaussian limit; heavier tails give larger tail correlation.]
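The tail correlation above can be approximated by Monte Carlo: compare exceedance-indicator correlations for a Gaussian pair and a bivariate t pair with the same correlation. A sketch (ρ = 0.8, α = 0.95, and d = 3 are illustrative settings):

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho, alpha, d = 200_000, 0.8, 0.95, 3
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

def tail_corr(X, alpha):
    # Corr( 1I{X_1 >= q_alpha(X_1)}, 1I{X_2 >= q_alpha(X_2)} )
    ind = (X >= np.quantile(X, alpha, axis=0)).astype(float)
    return np.corrcoef(ind[:, 0], ind[:, 1])[0, 1]

Z = rng.standard_normal((n, 2)) @ L.T      # Gaussian pair
W = rng.chisquare(d, size=(n, 1)) / d      # shared mixing variable
T = Z / np.sqrt(W)                         # bivariate t with d degrees of freedom
gauss_tc, t_tc = tail_corr(Z, alpha), tail_corr(T, alpha)
```

The shared mixing variable W is what couples the tails: both coordinates blow up together, so the t pair exceeds its quantiles jointly more often than the Gaussian pair.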

27 Robust Graphical Modeling
Data: X_1, ..., X_n ∼ TE_p(Σ, ξ; f_1, ..., f_p)
Underlying graph: edge (a, b) ∈ E if ω_ab ≠ 0, where Ω = Σ^{-1} = [ω_kl].
Construct Σ̂ = [σ̂_ab], where σ̂_ab = sin( (π/2) τ̂_ab ) and
τ̂_ab = (n choose 2)^{-1} ∑_{i<i'} sign( (X_ia − X_i'a)(X_ib − X_i'b) )
is Kendall's tau. Plug into, for example, the GLasso objective:
Ω̂ = arg max_{Ω ≻ 0} log det Ω − tr(Σ̂ Ω) − λ ‖Ω‖_1
(Liu et al., 2012a)
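The plug-in pipeline (pairwise Kendall's tau, then the sine transform) can be sketched directly; the O(n²) pairwise loop below is for clarity, not speed, and the data are illustrative:

```python
import numpy as np

def kendall_tau(x, y):
    # Kendall's tau over all pairs i < i' (O(n^2), fine for illustration)
    dx = np.sign(x[:, None] - x[None, :])
    dy = np.sign(y[:, None] - y[None, :])
    n = len(x)
    return np.triu(dx * dy, k=1).sum() / (n * (n - 1) / 2)

def rank_correlation_matrix(X):
    # Plug-in correlation estimate: sigma_hat_ab = sin(pi/2 * tau_hat_ab)
    p = X.shape[1]
    S = np.eye(p)
    for a in range(p):
        for b in range(a + 1, p):
            S[a, b] = S[b, a] = np.sin(np.pi / 2 * kendall_tau(X[:, a], X[:, b]))
    return S

# Toy data: column 1 is a monotone transform of column 0, column 2 is anti-monotone
X = np.array([[1.0, 10.0, 4.0],
              [2.0, 20.0, 3.0],
              [3.0, 40.0, 2.0],
              [4.0, 80.0, 1.0]])
S = rank_correlation_matrix(X)
```

Because tau depends only on ranks, monotone marginal transforms f_j leave Σ̂ unchanged, which is exactly what makes the plug-in robust over the transelliptical family.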

28 ROCKET: Robust Confidence Intervals via Kendall's Tau
Idea: for nodes a and b, let J = {a, b} and I = V \ J, and define residuals ε_c = Y_c − γ_c^T Y_I, where γ_a = Σ_II^{-1} Σ_Ia and γ_b = Σ_II^{-1} Σ_Ib. Then
Θ_ab = [ θ_aa θ_ab ; θ_ba θ_bb ] = (Ω_JJ)^{-1} = Cov(ε_a, ε_b),
and
θ_ab = E[ε_a ε_b] = E[ (Y_a − γ_a^T Y_I)(Y_b − γ_b^T Y_I) ]
= E[Y_a Y_b] + γ_a^T E[Y_I Y_I^T] γ_b − E[Y_a Y_I^T] γ_b − E[Y_b Y_I^T] γ_a
Our procedure constructs γ̂_a and γ̂_b and plugs in:
θ̂_ab = Σ̂_ab + γ̂_a^T Σ̂_II γ̂_b − Σ̂_aI γ̂_b − Σ̂_bI γ̂_a
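These identities can be verified numerically with population quantities (no estimation involved); the chain precision matrix below is a hypothetical example:

```python
import numpy as np

# Numerical check of the slide's identities on a hypothetical chain graph.
p = 6
Omega = np.eye(p) + 0.3 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)

a, b = 1, 4
J = [a, b]
I = [j for j in range(p) if j not in J]

# gamma_c = Sigma_II^{-1} Sigma_Ic : regression of Y_c on Y_I
gamma_a = np.linalg.solve(Sigma[np.ix_(I, I)], Sigma[I, a])
gamma_b = np.linalg.solve(Sigma[np.ix_(I, I)], Sigma[I, b])

# Theta_ab = Cov(eps) = Sigma_JJ - Sigma_JI Sigma_II^{-1} Sigma_IJ
Theta_ab = (Sigma[np.ix_(J, J)]
            - Sigma[np.ix_(J, I)]
            @ np.linalg.solve(Sigma[np.ix_(I, I)], Sigma[np.ix_(I, J)]))

# theta_ab via the expanded moment formula
theta_ab = (Sigma[a, b] + gamma_a @ Sigma[np.ix_(I, I)] @ gamma_b
            - Sigma[a, I] @ gamma_b - Sigma[b, I] @ gamma_a)
```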

29 Main results
Estimation consistency: if n ≳ k_n^2 log(p), ‖γ_a‖_1 ≲ √k_n ‖γ_a‖_2, λ_max(Σ)/λ_min(Σ) ≤ C_cov,
‖γ̂_a − γ_a‖_2 ≲ √( k_n log(p_n) / n ), and ‖γ̂_a − γ_a‖_1 ≲ k_n √( log(p_n) / n ),
then
|Θ̂ − Θ̃| ≲ k_n log(p_n) / n,
where Θ̃ is an oracle estimator that knows γ_c exactly.
Asymptotic normality:
sup_{t ∈ R} | P{ √n (ω̂_ab − ω_ab) / Ŝ_ab ≤ t } − Φ(t) | = o(1)

30 How To Estimate γ_a?
Lasso:
γ̂_a = arg min_{γ: ‖γ‖_1 ≤ R} { (1/2) γ^T Σ̂_II γ − γ^T Σ̂_Ia + λ ‖γ‖_1 }
This is a non-convex problem, since the rank-based Σ̂ need not be positive semidefinite; however, Loh and Wainwright (2015) applies. Need R so that ‖γ_a‖_1 ≤ R.
Dantzig selector:
γ̂_a = arg min { ‖γ‖_1 s.t. ‖Σ̂_II γ − Σ̂_Ia‖_∞ ≤ λ }
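The Dantzig selector is a linear program: minimize ∑_j u_j subject to |γ_j| ≤ u_j and the ℓ∞ constraint. A sketch assuming scipy is available (the chain-graph Σ and λ = 0.05 are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def dantzig(S, s, lam):
    """min ||g||_1 subject to ||S g - s||_inf <= lam, as an LP in (g, u)
    with the standard |g_j| <= u_j reformulation."""
    d = len(s)
    c = np.concatenate([np.zeros(d), np.ones(d)])        # minimize sum(u)
    A_ub = np.block([[ S,          np.zeros((d, d))],    #  S g - s <= lam
                     [-S,          np.zeros((d, d))],    # -S g + s <= lam
                     [ np.eye(d), -np.eye(d)],           #  g - u  <= 0
                     [-np.eye(d), -np.eye(d)]])          # -g - u  <= 0
    b_ub = np.concatenate([lam + s, lam - s, np.zeros(2 * d)])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * d + [(0, None)] * d, method="highs")
    return res.x[:d]

# Hypothetical chain graph; recover gamma_0 from the population covariance
p = 5
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Omega)
I = list(range(1, p))
g = dantzig(Sigma[np.ix_(I, I)], Sigma[I, 0], lam=0.05)
```

With population inputs the true γ_0 = (−0.4, 0, 0, 0) is feasible, so the LP solution stays close to it; with Σ̂ from data the same program gives γ̂_a.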

31 Minimax optimality
G_0(M, k_n) = { Ω = (Ω_ab)_{a,b ∈ [p]} : max_{a ∈ [p]} ∑_{b ≠ a} 1I{Ω_ab ≠ 0} ≤ k_n, and M^{-1} ≤ λ_min(Ω) ≤ λ_max(Ω) ≤ M },
where M is a constant greater than one.
Theorem 1 in Ren et al. (2015) states that
inf_{ω̂_ab} inf_{a,b} sup_{G_0(M, k_n)} P{ |ω̂_ab − ω_ab| ≥ ε_0 ( n^{-1} k_n log(p_n) + n^{-1/2} ) } ≥ ε_0.

32 Simulations
Data generated from a grid, sample size n = 400:
Ω_aa = 1, Ω_ab = 0.24 for edges and 0 for non-edges, X ∼ EC(0, Ω^{-1}, t_5)

33 Simulations
Check whether the estimator is asymptotically normal (over 1000 trials):
[Figure: normal QQ plots of the studentized statistics Ť_(2,2),(2,3), Ť_(2,2),(3,3), and Ť_(2,2),(10,10) for the ROCKET, Pearson, and nonparanormal estimators, each plotted against standard normal quantiles.]

34 Simulations
[Table: empirical coverage and confidence-interval width for the ROCKET, Pearson, and nonparanormal estimators at a true edge, a near non-edge, and a far non-edge.]

35 Simulations
Results for Gaussian data with the same Ω (grid graph):
[Table: coverage and width for the ROCKET, Pearson, and nonparanormal estimators at a true edge, a near non-edge, and a far non-edge.]
All methods have 95% coverage; ROCKET confidence intervals are only slightly wider.

36 The ROCKET method
- Theoretical guarantees for asymptotic normality over the transelliptical family
- Confidence intervals have the right coverage
- Practical recommendation: we should use the transelliptical family in practice
Code:
Preprint: arxiv: , with Rina Foygel Barber
Extension to dynamic model: arxiv: , with Junwei Lu and Han Liu

37 No Kool-Aid Assumptions
The literature often requires a lot of assumptions:
- pros: they make the math work out
- cons: they are hard to verify in practice
Negative result from Wasserman, Kolar, and Rinaldo (2014):
inf_{C_n} sup_{P ∈ P_{n,p}} E[W_n^2(C_n)] ≥ C(α)
See also: Cai and Guo (2015)

38 Technical Difficulties
Many machine learning methods come with knobs.
- When the goal is prediction, we can use cross-validation to select them.
- When the goal is inference, it is not clear that cross-validation gives valid confidence intervals.
So, how do we choose these tuning parameters?
My personal experience:
- the tuning parameters affect finite-sample properties a lot
- many procedures are asymptotically normal; however, higher-order biases may significantly affect finite-sample properties

39 Extensions/Ideas
- Other tree ensembles: gradient boosted trees (Friedman, 2001)
- Uniform convergence results
- Finite-sample results
- Adaptive estimation of nuisance parameters
- Running competitions for estimating causal effects (Jennifer Hill)

40 Thank you!

41 References I
A. Belloni, M. Chen, and V. Chernozhukov. Quantile graphical models: Prediction and conditional independence with applications to financial risk management. ArXiv e-prints, July 2016.
T. T. Cai and Z. Guo. Confidence intervals for high-dimensional linear regression: Minimax rates and adaptivity. ArXiv e-prints, June 2015.
T. T. Cai, W. Liu, and X. Luo. A constrained l_1 minimization approach to sparse precision matrix estimation. J. Am. Stat. Assoc., 106(494), 2011.
M. Chen, Z. Ren, H. Zhao, and H. H. Zhou. Asymptotically normal and efficient estimation of covariate-adjusted Gaussian graphical model. J. Am. Stat. Assoc., 2015.

42 References II
D. M. Chickering. Learning Bayesian networks is NP-complete. In Learning from Data: Artificial Intelligence and Statistics V. Springer-Verlag, 1996.
M. Drton and M. H. Maathuis. Structure learning in graphical modeling. To appear in Annual Review of Statistics and Its Application, 3, 2016.
M. Drton and M. D. Perlman. Multiple testing and error control in Gaussian graphical model selection. Statistical Science, 22(3), 2007.
J. H. Friedman. Greedy function approximation: A gradient boosting machine. Ann. Statist., 29(5), 2001.
D. Koller and N. Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
S. L. Lauritzen. Graphical Models (Oxford Statistical Science Series). Oxford University Press, 1996.

43 References III
H. Liu, J. D. Lafferty, and L. A. Wasserman. The nonparanormal: Semiparametric estimation of high dimensional undirected graphs. J. Mach. Learn. Res., 10, 2009.
H. Liu, F. Han, M. Yuan, J. D. Lafferty, and L. A. Wasserman. High-dimensional semiparametric Gaussian copula graphical models. Ann. Stat., 40(4), 2012a.
H. Liu, F. Han, and C.-H. Zhang. Transelliptical graphical models. In Proc. of NIPS, 2012b.
P.-L. Loh and M. J. Wainwright. Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima. J. Mach. Learn. Res., 16, 2015.
N. Meinshausen and P. Bühlmann. High dimensional graphs and variable selection with the lasso. Ann. Stat., 34(3), 2006.

44 References IV
P. Ravikumar, M. J. Wainwright, and J. D. Lafferty. High-dimensional Ising model selection using l_1-regularized logistic regression. Annals of Statistics, to appear.
Z. Ren, T. Sun, C.-H. Zhang, and H. H. Zhou. Asymptotic normality and optimalities in estimation of large Gaussian graphical models. Ann. Stat., 43(3), 2015.
J. Wang and M. Kolar. Inference for high-dimensional exponential family graphical models. In Proc. of AISTATS, volume 51, 2016.
J. Wang and M. Kolar. Inference for sparse conditional precision matrices. ArXiv e-prints, December 2014.
L. A. Wasserman, M. Kolar, and A. Rinaldo. Berry-Esseen bounds for estimating undirected graphs. Electron. J. Stat., 8, 2014.

45 References V
L. Xue, H. Zou, and T. Cai. Nonconcave penalized composite conditional likelihood estimation of sparse Ising models. Ann. Stat., 40(3), 2012.
E. Yang, G. I. Allen, Z. Liu, and P. Ravikumar. Graphical models via generalized linear models. In Advances in Neural Information Processing Systems 25. Curran Associates, Inc., 2012.
E. Yang, P. Ravikumar, G. I. Allen, and Z. Liu. On Poisson graphical models. In Advances in Neural Information Processing Systems 26. Curran Associates, Inc., 2013.
E. Yang, Y. Baker, P. Ravikumar, G. I. Allen, and Z. Liu. Mixed graphical models via exponential families. In Proc. 17th Int. Conf. Artif. Intel. Stat., 2014.

46 References VI
E. Yang, P. Ravikumar, G. I. Allen, and Z. Liu. On graphical models via univariate exponential family distributions. J. Mach. Learn. Res., 16, 2015.
M. Yu, V. Gupta, and M. Kolar. Statistical inference for pairwise graphical models using score matching. In Advances in Neural Information Processing Systems 29. Curran Associates, Inc., 2016.
M. Yuan and Y. Lin. Model selection and estimation in the Gaussian graphical model. Biometrika, 94(1):19-35, 2007.


Downloaded by Stanford University Medical Center Package from online.liebertpub.com at 10/25/17. For personal use only. ABSTRACT 1. JOURNAL OF COMPUTATIONAL BIOLOGY Volume 24, Number 7, 2017 # Mary Ann Liebert, Inc. Pp. 721 731 DOI: 10.1089/cmb.2017.0053 A Poisson Log-Normal Model for Constructing Gene Covariation Network Using RNA-seq

More information

Variations on Nonparametric Additive Models: Computational and Statistical Aspects

Variations on Nonparametric Additive Models: Computational and Statistical Aspects Variations on Nonparametric Additive Models: Computational and Statistical Aspects John Lafferty Department of Statistics & Department of Computer Science University of Chicago Collaborators Sivaraman

More information

WEIGHTED QUANTILE REGRESSION THEORY AND ITS APPLICATION. Abstract

WEIGHTED QUANTILE REGRESSION THEORY AND ITS APPLICATION. Abstract Journal of Data Science,17(1). P. 145-160,2019 DOI:10.6339/JDS.201901_17(1).0007 WEIGHTED QUANTILE REGRESSION THEORY AND ITS APPLICATION Wei Xiong *, Maozai Tian 2 1 School of Statistics, University of

More information

Marginal Regression For Multitask Learning

Marginal Regression For Multitask Learning Mladen Kolar Machine Learning Department Carnegie Mellon University mladenk@cs.cmu.edu Han Liu Biostatistics Johns Hopkins University hanliu@jhsph.edu Abstract Variable selection is an important and practical

More information

MATH 829: Introduction to Data Mining and Analysis Graphical Models II - Gaussian Graphical Models

MATH 829: Introduction to Data Mining and Analysis Graphical Models II - Gaussian Graphical Models 1/13 MATH 829: Introduction to Data Mining and Analysis Graphical Models II - Gaussian Graphical Models Dominique Guillot Departments of Mathematical Sciences University of Delaware May 4, 2016 Recall

More information

Learning in Bayesian Networks

Learning in Bayesian Networks Learning in Bayesian Networks Florian Markowetz Max-Planck-Institute for Molecular Genetics Computational Molecular Biology Berlin Berlin: 20.06.2002 1 Overview 1. Bayesian Networks Stochastic Networks

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models Lecture 11 CRFs, Exponential Family CS/CNS/EE 155 Andreas Krause Announcements Homework 2 due today Project milestones due next Monday (Nov 9) About half the work should

More information

Approximation. Inderjit S. Dhillon Dept of Computer Science UT Austin. SAMSI Massive Datasets Opening Workshop Raleigh, North Carolina.

Approximation. Inderjit S. Dhillon Dept of Computer Science UT Austin. SAMSI Massive Datasets Opening Workshop Raleigh, North Carolina. Using Quadratic Approximation Inderjit S. Dhillon Dept of Computer Science UT Austin SAMSI Massive Datasets Opening Workshop Raleigh, North Carolina Sept 12, 2012 Joint work with C. Hsieh, M. Sustik and

More information

Lecture 6: Graphical Models: Learning

Lecture 6: Graphical Models: Learning Lecture 6: Graphical Models: Learning 4F13: Machine Learning Zoubin Ghahramani and Carl Edward Rasmussen Department of Engineering, University of Cambridge February 3rd, 2010 Ghahramani & Rasmussen (CUED)

More information

Learning Quadratic Variance Function (QVF) DAG Models via OverDispersion Scoring (ODS)

Learning Quadratic Variance Function (QVF) DAG Models via OverDispersion Scoring (ODS) Journal of Machine Learning Research 18 2018 1-44 Submitted 4/17; Revised 12/17; Published 4/18 Learning Quadratic Variance Function QVF DAG Models via OverDispersion Scoring ODS Gunwoong Park Department

More information

Ordinal Data Analysis via Graphical Models

Ordinal Data Analysis via Graphical Models Ordinal Data Analysis via Graphical Models Arun Sai Suggala Sunday 5 th November, 27 Abstract Background. Undirected graphical models or Markov random fields (MRFs) are very popular for modeling multivariate

More information

arxiv: v1 [stat.me] 13 Jun 2016

arxiv: v1 [stat.me] 13 Jun 2016 Tuning-free heterogeneity pursuit in massive networs Zhao Ren, Yongjian Kang 2, Yingying Fan 2 and Jinchi Lv 2 University of Pittsburgh and University of Southern California 2 arxiv:606.03803v [stat.me]

More information

Does Better Inference mean Better Learning?

Does Better Inference mean Better Learning? Does Better Inference mean Better Learning? Andrew E. Gelfand, Rina Dechter & Alexander Ihler Department of Computer Science University of California, Irvine {agelfand,dechter,ihler}@ics.uci.edu Abstract

More information

Structure Learning of Mixed Graphical Models

Structure Learning of Mixed Graphical Models Jason D. Lee Institute of Computational and Mathematical Engineering Stanford University Trevor J. Hastie Department of Statistics Stanford University Abstract We consider the problem of learning the structure

More information

Machine Learning Summer School

Machine Learning Summer School Machine Learning Summer School Lecture 3: Learning parameters and structure Zoubin Ghahramani zoubin@eng.cam.ac.uk http://learning.eng.cam.ac.uk/zoubin/ Department of Engineering University of Cambridge,

More information

Non-Asymptotic Analysis for Relational Learning with One Network

Non-Asymptotic Analysis for Relational Learning with One Network Peng He Department of Automation Tsinghua University Changshui Zhang Department of Automation Tsinghua University Abstract This theoretical paper is concerned with a rigorous non-asymptotic analysis of

More information

Confidence Intervals for Low-dimensional Parameters with High-dimensional Data

Confidence Intervals for Low-dimensional Parameters with High-dimensional Data Confidence Intervals for Low-dimensional Parameters with High-dimensional Data Cun-Hui Zhang and Stephanie S. Zhang Rutgers University and Columbia University September 14, 2012 Outline Introduction Methodology

More information

Random Forests. These notes rely heavily on Biau and Scornet (2016) as well as the other references at the end of the notes.

Random Forests. These notes rely heavily on Biau and Scornet (2016) as well as the other references at the end of the notes. Random Forests One of the best known classifiers is the random forest. It is very simple and effective but there is still a large gap between theory and practice. Basically, a random forest is an average

More information

Divide-and-combine Strategies in Statistical Modeling for Massive Data

Divide-and-combine Strategies in Statistical Modeling for Massive Data Divide-and-combine Strategies in Statistical Modeling for Massive Data Liqun Yu Washington University in St. Louis March 30, 2017 Liqun Yu (WUSTL) D&C Statistical Modeling for Massive Data March 30, 2017

More information

High-dimensional regression with unknown variance

High-dimensional regression with unknown variance High-dimensional regression with unknown variance Christophe Giraud Ecole Polytechnique march 2012 Setting Gaussian regression with unknown variance: Y i = f i + ε i with ε i i.i.d. N (0, σ 2 ) f = (f

More information

High-Dimensional Learning of Linear Causal Networks via Inverse Covariance Estimation

High-Dimensional Learning of Linear Causal Networks via Inverse Covariance Estimation University of Pennsylvania ScholarlyCommons Statistics Papers Wharton Faculty Research 10-014 High-Dimensional Learning of Linear Causal Networks via Inverse Covariance Estimation Po-Ling Loh University

More information

Efficient Information Planning in Graphical Models

Efficient Information Planning in Graphical Models Efficient Information Planning in Graphical Models computational complexity considerations John Fisher & Giorgos Papachristoudis, MIT VITALITE Annual Review 2013 September 9, 2013 J. Fisher (VITALITE Annual

More information

Probabilistic Graphical Models

Probabilistic Graphical Models 2016 Robert Nowak Probabilistic Graphical Models 1 Introduction We have focused mainly on linear models for signals, in particular the subspace model x = Uθ, where U is a n k matrix and θ R k is a vector

More information

Towards an extension of the PC algorithm to local context-specific independencies detection

Towards an extension of the PC algorithm to local context-specific independencies detection Towards an extension of the PC algorithm to local context-specific independencies detection Feb-09-2016 Outline Background: Bayesian Networks The PC algorithm Context-specific independence: from DAGs to

More information

Stability Approach to Regularization Selection (StARS) for High Dimensional Graphical Models

Stability Approach to Regularization Selection (StARS) for High Dimensional Graphical Models Stability Approach to Regularization Selection (StARS) for High Dimensional Graphical Models Han Liu Kathryn Roeder Larry Wasserman Carnegie Mellon University Pittsburgh, PA 15213 Abstract A challenging

More information

An iterative hard thresholding estimator for low rank matrix recovery

An iterative hard thresholding estimator for low rank matrix recovery An iterative hard thresholding estimator for low rank matrix recovery Alexandra Carpentier - based on a joint work with Arlene K.Y. Kim Statistical Laboratory, Department of Pure Mathematics and Mathematical

More information

Graphical models. Christophe Giraud. Lecture Notes on High-Dimensional Statistics : Université Paris-Sud and Ecole Polytechnique Maths Department

Graphical models. Christophe Giraud. Lecture Notes on High-Dimensional Statistics : Université Paris-Sud and Ecole Polytechnique Maths Department Graphical Modeling Université Paris-Sud and Ecole Polytechnique Maths Department Lecture Notes on High-Dimensional Statistics : http://www.cmap.polytechnique.fr/ giraud/msv/lecturenotes.pdf 1/73 Please

More information

arxiv: v1 [stat.ap] 19 Oct 2015

arxiv: v1 [stat.ap] 19 Oct 2015 Submitted to the Annals of Applied Statistics STRUCTURE ESTIMATION FOR MIXED GRAPHICAL MODELS IN HIGH-DIMENSIONAL DATA arxiv:1510.05677v1 [stat.ap] 19 Oct 2015 By Jonas M. B. Haslbeck Utrecht University

More information

Sparse Permutation Invariant Covariance Estimation: Motivation, Background and Key Results

Sparse Permutation Invariant Covariance Estimation: Motivation, Background and Key Results Sparse Permutation Invariant Covariance Estimation: Motivation, Background and Key Results David Prince Biostat 572 dprince3@uw.edu April 19, 2012 David Prince (UW) SPICE April 19, 2012 1 / 11 Electronic

More information

arxiv: v1 [math.st] 13 Feb 2012

arxiv: v1 [math.st] 13 Feb 2012 Sparse Matrix Inversion with Scaled Lasso Tingni Sun and Cun-Hui Zhang Rutgers University arxiv:1202.2723v1 [math.st] 13 Feb 2012 Address: Department of Statistics and Biostatistics, Hill Center, Busch

More information

High dimensional ising model selection using l 1 -regularized logistic regression

High dimensional ising model selection using l 1 -regularized logistic regression High dimensional ising model selection using l 1 -regularized logistic regression 1 Department of Statistics Pennsylvania State University 597 Presentation 2016 1/29 Outline Introduction 1 Introduction

More information

High dimensional Ising model selection

High dimensional Ising model selection High dimensional Ising model selection Pradeep Ravikumar UT Austin (based on work with John Lafferty, Martin Wainwright) Sparse Ising model US Senate 109th Congress Banerjee et al, 2008 Estimate a sparse

More information

Introduction to Probabilistic Graphical Models

Introduction to Probabilistic Graphical Models Introduction to Probabilistic Graphical Models Kyu-Baek Hwang and Byoung-Tak Zhang Biointelligence Lab School of Computer Science and Engineering Seoul National University Seoul 151-742 Korea E-mail: kbhwang@bi.snu.ac.kr

More information

Learning Multiple Tasks with a Sparse Matrix-Normal Penalty

Learning Multiple Tasks with a Sparse Matrix-Normal Penalty Learning Multiple Tasks with a Sparse Matrix-Normal Penalty Yi Zhang and Jeff Schneider NIPS 2010 Presented by Esther Salazar Duke University March 25, 2011 E. Salazar (Reading group) March 25, 2011 1

More information

A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models

A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models Jingyi Jessica Li Department of Statistics University of California, Los

More information

Estimation of Graphical Models with Shape Restriction

Estimation of Graphical Models with Shape Restriction Estimation of Graphical Models with Shape Restriction BY KHAI X. CHIONG USC Dornsife INE, Department of Economics, University of Southern California, Los Angeles, California 989, U.S.A. kchiong@usc.edu

More information

Learning With Bayesian Networks. Markus Kalisch ETH Zürich

Learning With Bayesian Networks. Markus Kalisch ETH Zürich Learning With Bayesian Networks Markus Kalisch ETH Zürich Inference in BNs - Review P(Burglary JohnCalls=TRUE, MaryCalls=TRUE) Exact Inference: P(b j,m) = c Sum e Sum a P(b)P(e)P(a b,e)p(j a)p(m a) Deal

More information

Junction Tree, BP and Variational Methods

Junction Tree, BP and Variational Methods Junction Tree, BP and Variational Methods Adrian Weller MLSALT4 Lecture Feb 21, 2018 With thanks to David Sontag (MIT) and Tony Jebara (Columbia) for use of many slides and illustrations For more information,

More information

Sparse Covariance Matrix Estimation with Eigenvalue Constraints

Sparse Covariance Matrix Estimation with Eigenvalue Constraints Sparse Covariance Matrix Estimation with Eigenvalue Constraints Han Liu and Lie Wang 2 and Tuo Zhao 3 Department of Operations Research and Financial Engineering, Princeton University 2 Department of Mathematics,

More information

Multivariate Bernoulli Distribution 1

Multivariate Bernoulli Distribution 1 DEPARTMENT OF STATISTICS University of Wisconsin 1300 University Ave. Madison, WI 53706 TECHNICAL REPORT NO. 1170 June 6, 2012 Multivariate Bernoulli Distribution 1 Bin Dai 2 Department of Statistics University

More information

Directed and Undirected Graphical Models

Directed and Undirected Graphical Models Directed and Undirected Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Machine Learning: Neural Networks and Advanced Models (AA2) Last Lecture Refresher Lecture Plan Directed

More information

Bayesian Networks Inference with Probabilistic Graphical Models

Bayesian Networks Inference with Probabilistic Graphical Models 4190.408 2016-Spring Bayesian Networks Inference with Probabilistic Graphical Models Byoung-Tak Zhang intelligence Lab Seoul National University 4190.408 Artificial (2016-Spring) 1 Machine Learning? Learning

More information

Copula PC Algorithm for Causal Discovery from Mixed Data

Copula PC Algorithm for Causal Discovery from Mixed Data Copula PC Algorithm for Causal Discovery from Mixed Data Ruifei Cui ( ), Perry Groot, and Tom Heskes Institute for Computing and Information Sciences, Radboud University, Nijmegen, The Netherlands {R.Cui,

More information

High-dimensional statistics: Some progress and challenges ahead

High-dimensional statistics: Some progress and challenges ahead High-dimensional statistics: Some progress and challenges ahead Martin Wainwright UC Berkeley Departments of Statistics, and EECS University College, London Master Class: Lecture Joint work with: Alekh

More information

Shrinkage Tuning Parameter Selection in Precision Matrices Estimation

Shrinkage Tuning Parameter Selection in Precision Matrices Estimation arxiv:0909.1123v1 [stat.me] 7 Sep 2009 Shrinkage Tuning Parameter Selection in Precision Matrices Estimation Heng Lian Division of Mathematical Sciences School of Physical and Mathematical Sciences Nanyang

More information

Undirected Graphical Models

Undirected Graphical Models Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Introduction 2 Properties Properties 3 Generative vs. Conditional

More information

25 : Graphical induced structured input/output models

25 : Graphical induced structured input/output models 10-708: Probabilistic Graphical Models 10-708, Spring 2016 25 : Graphical induced structured input/output models Lecturer: Eric P. Xing Scribes: Raied Aljadaany, Shi Zong, Chenchen Zhu Disclaimer: A large

More information

On Semiparametric Exponential Family Graphical Models

On Semiparametric Exponential Family Graphical Models On Semiparametric Exponential Family Graphical Models arxiv:4.8697v [stat.ml] 5 Oct 05 Zhuoran Yang Yang Ning Han Liu Abstract We propose a new class of semiparametric exponential family graphical models

More information

Graphical Models and Independence Models

Graphical Models and Independence Models Graphical Models and Independence Models Yunshu Liu ASPITRG Research Group 2014-03-04 References: [1]. Steffen Lauritzen, Graphical Models, Oxford University Press, 1996 [2]. Christopher M. Bishop, Pattern

More information

Genetic Networks. Korbinian Strimmer. Seminar: Statistical Analysis of RNA-Seq Data 19 June IMISE, Universität Leipzig

Genetic Networks. Korbinian Strimmer. Seminar: Statistical Analysis of RNA-Seq Data 19 June IMISE, Universität Leipzig Genetic Networks Korbinian Strimmer IMISE, Universität Leipzig Seminar: Statistical Analysis of RNA-Seq Data 19 June 2012 Korbinian Strimmer, RNA-Seq Networks, 19/6/2012 1 Paper G. I. Allen and Z. Liu.

More information

Graphical Model Selection

Graphical Model Selection May 6, 2013 Trevor Hastie, Stanford Statistics 1 Graphical Model Selection Trevor Hastie Stanford University joint work with Jerome Friedman, Rob Tibshirani, Rahul Mazumder and Jason Lee May 6, 2013 Trevor

More information