Inferring Protein-Signaling Networks II
Lecture 15, Nov 16, 2011
CSE 527 Computational Biology, Fall 2011
Instructor: Su-In Lee; TA: Christopher Miles
Monday & Wednesday 12:00-1:20, Johnson Hall (JHN)

Closing the Loop
Thank you for your participation in the survey!
- Things that helped you: a very diverse set of topics; well-organized ("Who is with me?"); quality of the slides.
- Things that did not help you: lack of depth (intended to be achieved through the problem sets and projects); homework problems need improved clarity.
Outline
- Inferring the protein-signaling network: computational problem, structure score, structure search, interventional modeling, evaluation of results, conclusion.
- Key learning points: structure learning of Bayesian networks, intervention modeling, evaluation of the inferred biological network.

Inferring Signaling Networks
- Signaling networks are full of post-translational regulatory mechanisms (e.g., phosphorylation).
- (Figure: classic signaling network; highlighted nodes = measured proteins.)
Computational Problem
- Given data D: a cells-by-proteins matrix of measurements (praf, pmek, plcg, PIP2, PIP3, p44/42, pakts473, P38, pjnk).
- Goal: infer the causal interaction network among Raf, Mek, Plc, PIP2, PIP3, P44/42, Akt, P38, and Jnk.
- This is structure learning of a Bayesian network: a general machine learning problem, applicable to different areas of network biology.

Structure Search
- Score-based learning: given the set of all possible structures and a scoring function, select the highest-scoring network structure.
- Greedy hill-climbing search: from the current graph, make the one edge change (add an edge, remove an edge, or reverse an edge) that maximizes the graph score, constrained to modifications that keep the graph acyclic.
- Score decomposition is what makes this search efficient.
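The greedy hill-climbing search above can be sketched in a few dozen lines. This is a minimal sketch, not the lecture's implementation: the decomposable score is supplied as a per-family `family_score(node, parents)` function (the function and node names are illustrative), and the acyclicity constraint is enforced by a reachability check.

```python
import itertools

def creates_cycle(parents, child, new_parent):
    """Would adding new_parent -> child create a directed cycle?
    True iff child already reaches new_parent: walk backwards along
    parent pointers from new_parent and see if we hit child."""
    stack, seen = [new_parent], set()
    while stack:
        node = stack.pop()
        if node == child:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(parents[node])
    return False

def hill_climb(nodes, family_score, max_iter=1000):
    """Greedy structure search: repeatedly apply the single edge
    addition, removal, or reversal that most improves the total
    decomposable score; stop at a local maximum."""
    parents = {n: set() for n in nodes}
    score = {n: family_score(n, frozenset(parents[n])) for n in nodes}
    for _ in range(max_iter):
        best = None  # (gain, {node: new parent set})
        for x, y in itertools.permutations(nodes, 2):
            moves = ['remove', 'reverse'] if x in parents[y] else ['add']
            for op in moves:
                changed = {}
                if op == 'add':
                    if creates_cycle(parents, y, x):
                        continue
                    changed[y] = parents[y] | {x}
                elif op == 'remove':
                    changed[y] = parents[y] - {x}
                else:  # reverse x -> y into y -> x
                    parents[y].discard(x)            # tentatively drop x -> y
                    cyc = creates_cycle(parents, x, y)
                    parents[y].add(x)                # restore
                    if cyc:
                        continue
                    changed[y] = parents[y] - {x}
                    changed[x] = parents[x] | {y}
                # Decomposability: only the changed families are rescored.
                gain = sum(family_score(n, frozenset(ps)) - score[n]
                           for n, ps in changed.items())
                if best is None or gain > best[0]:
                    best = (gain, changed)
        if best is None or best[0] <= 0:
            break  # no single-edge change improves the score
        for n, ps in best[1].items():
            parents[n] = set(ps)
            score[n] = family_score(n, frozenset(ps))
    return parents
```

Plugging in a toy score that rewards matching a known parent set recovers that graph; in practice `family_score` would be the ML or Bayesian FamScore discussed on the following slides.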
Score Decomposition
- In greedy hill climbing we compare the current graph G and a neighbor G' that differ by one edge change in terms of the structure score.
- If score decomposability holds, Score(G) = S_1(X_1, Pa_1) + S_2(X_2, Pa_2) + ... + S_6(X_6, Pa_6), where S_i(X_i, Pa_i) is the family score (FamScore) of node i.
- If G and G' differ only in node 4's parent set, we only need to recompute S_4(X_4, Pa_4).
- Do the commonly used structure scores decompose this way?

Maximum Likelihood
- Find the G that maximizes P(Data = D | Graph = G, Θ_MLE).
- Notation: D = data; G = graph; θ = CPT values for each node; θ_ijk = P(X_i = k | Parents(X_i) = j); N_ijk = number of times X_i = k and Parents(X_i) = j in the data; K = number of discrete levels of X_i.
- The ML estimate is θ_ijk^ML = N_ijk / Σ_k' N_ijk'; e.g., for a binary node X_3 with parents X_1, X_2: P(X_3 = 0 | X_1 = 0, X_2 = 0) = [# of (0,0,0)] / [# of (0,0,*)].
- Example data: D = [(0,0,0), (0,0,1), (1,1,1), (1,0,0), (1,1,1), (1,0,1), ..., (1,0,0), (1,0,1)].
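The counting estimate above fits in a few lines. This is a sketch with illustrative function and argument names, using the slide's example data:

```python
from collections import Counter

def mle_cpt(data, child, parents):
    """ML estimate of one node's CPT: theta[j, k] = N_jk / sum_k' N_jk',
    where j is a parent configuration and k a value of the child
    (the slide's theta_ijk = N_ijk / sum_k N_ijk for node i)."""
    joint, marg = Counter(), Counter()
    for row in data:
        j = tuple(row[p] for p in parents)
        joint[(j, row[child])] += 1   # N_jk
        marg[j] += 1                  # sum over k of N_jk
    return {(j, k): n / marg[j] for (j, k), n in joint.items()}

# Slide-style binary data over (X1, X2, X3); X3's parents are X1, X2.
D = [(0,0,0), (0,0,1), (1,1,1), (1,0,0), (1,1,1), (1,0,1), (1,0,0), (1,0,1)]
cpt = mle_cpt(D, child=2, parents=(0, 1))
# P(X3=0 | X1=0, X2=0) = #(0,0,0) / #(0,0,*) = 1/2
```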
Structure Score: Bayesian Score
- Bayesian score: P(Data = D | Graph = G) = ∫ P(D | G, θ) P(θ | G) dθ.
- Notation: D = data; G = graph; θ = CPT values for each node; θ_ijk = P(X_i = k | Parents(X_i) = j); N_ijk = number of times X_i = k and Parents(X_i) = j in the data; K = number of discrete levels of X_i.
- The likelihood P(X_1, ..., X_K) is multinomial; the prior on parameters is Dirichlet, ~ Dir(α), with the Dirichlet normalizer defined over the simplex Θ_ij = {θ_ij : Σ_k θ_ijk = 1}.
- References: D. Heckerman, "A Tutorial on Learning with Bayesian Networks" (1997); G. Cooper and E. Herskovits, "A Bayesian Method for the Induction of Probabilistic Networks from Data," Machine Learning, 9 (1992).
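The Dirichlet-multinomial integral on this slide has a closed form. Reconstructed in the slide's notation, following Cooper & Herskovits, and writing N_ij = Σ_k N_ijk, α_ij = Σ_k α_ijk, and q_i for the number of parent configurations of node i (q_i is an extra symbol introduced here):

```latex
P(D \mid G) \;=\; \prod_{i=1}^{n} \prod_{j=1}^{q_i}
  \frac{\Gamma(\alpha_{ij})}{\Gamma(\alpha_{ij} + N_{ij})}
  \prod_{k=1}^{K}
  \frac{\Gamma(\alpha_{ijk} + N_{ijk})}{\Gamma(\alpha_{ijk})}
```

Taking the log turns the double product over (i, j) into a sum of per-family terms, which is exactly the decomposability property the search relies on.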
Structure Score (continued)
- Taking logs, the Bayesian score becomes a sum of per-family terms plus log P(G) — decomposability holds!
- With a Dirichlet prior, the Bayesian parameter estimate uses "imaginary counts": θ_ijk^BS = (N_ijk + α_ijk) / Σ_k' (N_ijk' + α_ijk'); e.g., P(X_3 = 0 | X_1 = 0, X_2 = 0) = [# of (0,0,0) + α_0] / [# of (0,0,*) + α_0 + α_1].
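The "imaginary counts" estimate can be sketched the same way as the ML one. Function and argument names are illustrative, and a uniform pseudo-count `alpha` is assumed here for simplicity:

```python
from collections import Counter

def bayes_cpt(data, child, parents, alpha=1.0):
    """Posterior-mean CPT entry theta[j, k] = (N_jk + alpha) /
    (N_j + K * alpha): the Dirichlet pseudo-counts keep estimates
    away from 0 and 1 for rarely seen parent configurations."""
    values = sorted({row[child] for row in data})  # the K observed levels
    joint, marg = Counter(), Counter()
    for row in data:
        j = tuple(row[p] for p in parents)
        joint[(j, row[child])] += 1
        marg[j] += 1
    return {(j, k): (joint[(j, k)] + alpha) / (marg[j] + alpha * len(values))
            for j in marg for k in values}

D = [(0,0,0), (0,0,1), (1,1,1), (1,0,0), (1,1,1), (1,0,1), (1,0,0), (1,0,1)]
cpt = bayes_cpt(D, child=2, parents=(0, 1))
# P(X3=0 | X1=1, X2=1) = (0 + 1) / (2 + 2) = 0.25, rather than the MLE's 0
```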
Structure Learning Algorithm
- Greedy hill-climbing search: make the one edge change (add, remove, or reverse an edge) that maximizes the graph score.
- Thanks to decomposability, only the affected family scores are updated when moving from G to G': adding or removing an edge into node 4 updates S_4(X_4, Pa_4); reversing an edge between nodes 3 and 4 updates both S_3(X_3, Pa_3) and S_4(X_4, Pa_4).
- Repeat single-edge changes until the score no longer increases (a local maximum).

Model Averaging
- Generate N graphs by relearning the network on bootstrapped data.
- Confidence(feature f) = (# graphs with feature f) / (# graphs).
- Select a confidence threshold and keep the features above it.
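The model-averaging step can be sketched as follows. The structure learner is assumed to be provided as a `learn_graph(data)` function returning a collection of directed edges (both names are hypothetical):

```python
import random

def edge_confidence(data, learn_graph, n_boot=100, seed=0):
    """Bootstrap model averaging: relearn the network on resampled
    datasets and score each feature (here, each directed edge) by
    confidence(f) = (# graphs containing f) / (# graphs)."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_boot):
        # Resample the data with replacement, same size as the original.
        resample = [rng.choice(data) for _ in data]
        for edge in learn_graph(resample):
            counts[edge] = counts.get(edge, 0) + 1
    return {edge: c / n_boot for edge, c in counts.items()}

def high_confidence(conf, threshold=0.8):
    """Keep only the edges above the chosen confidence threshold."""
    return {e for e, c in conf.items() if c >= threshold}
```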
Causal Networks
- A Bayes net is NOT automatically a causal net; it encodes conditional independencies, e.g., (A ⊥ E), (B ⊥ D | A, E), (C ⊥ A, D, E | B), (D ⊥ B, C, E | A), (E ⊥ A, D).
- Does structure learning of the Bayesian network reveal causal relationships? Not necessarily.
- Pe'er D. "Bayesian Network Analysis of Signaling Networks: A Primer." Science STKE, April 2005.

Causal Networks: Simple Example
- Two-variable example with graphs G1: X → Y and G2: Y → X.
- Given observational data D = {(0,0), (0,1), (0,0), ..., (1,1)}, we can compute the Bayesian scores of both graphs — they are very similar (e.g., -6.78).
- Data D containing intervention samples can improve the accuracy of causal inference: D = {(0,0), (0,1), [do(X=1),1], (0,0), [do(X=0),0], [do(X=0),0], (1,1), ..., (1,1)}.
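The claim that observational data scores both directions almost equally can be checked numerically. For the ML log-likelihood the two scores are exactly equal on any data, since X → Y and Y → X are Markov equivalent; the helper names below are illustrative:

```python
import math
from collections import Counter

def family_loglik(data, child, parents):
    """Maximized log-likelihood of one family:
    sum over parent configs j and values k of N_jk * log(N_jk / N_j)."""
    joint, marg = Counter(), Counter()
    for row in data:
        j = tuple(row[p] for p in parents)
        joint[(j, row[child])] += 1
        marg[j] += 1
    return sum(n * math.log(n / marg[j]) for (j, _), n in joint.items())

def graph_loglik(data, parent_sets):
    """Decomposable ML score: sum of family log-likelihoods."""
    return sum(family_loglik(data, c, ps) for c, ps in parent_sets.items())

D = [(0, 0), (0, 1), (0, 0), (1, 1), (1, 1), (1, 0)]
g1 = graph_loglik(D, {0: (), 1: (0,)})   # G1: X -> Y
g2 = graph_loglik(D, {1: (), 0: (1,)})   # G2: Y -> X
# g1 == g2 (up to rounding): observations alone cannot pick a direction
```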
Observation vs. Intervention Data
- Training data D: a cells-by-proteins matrix (praf, pmek, plcg, PIP2, PIP3, p44/42, pakts473, P38, pjnk), with some rows measured under no intervention and others under intervention conditions.
- In each intervention condition, a chemical known to inhibit or activate a certain protein is added.
- How do we treat the data from the intervention conditions?

Intervention Modeling
- Say the real network is X → Y. Observational/interventional data would look like: D = { (0,0), (1,1), (0,0), (0,0), (1,1), (0,1), (1,1), (1,0), (0,0), [do(X=0),0], [do(X=1),1], [do(X=0),0], [do(X=0),0], [do(X=1),1], [1,do(Y=0)], [1,do(Y=1)], [0,do(Y=0)], [0,do(Y=1)], [0,do(Y=1)], ... }.
- How should the scores be computed in each case? Consider the ML score for G1: X → Y and G2: Y → X.
Intervention Modeling (continued)
- When computing the structure score, the training samples in D are assumed to come from the same underlying network. However, do(X=0/1) means that X is forced to take value 0/1 and does not depend on its parents.
- We should therefore treat the intervention samples differently when computing the score: when estimating the MLE for X's CPD, ignore the samples with do(X=0/1), and likewise ignore the do(Y=0/1) samples for Y's CPD.
- Which of G1 (X → Y) and G2 (Y → X) is likely to get a higher score? The dependency between X and Y is modeled better by G1 — under G2, the do(Y=·) samples show X varying independently of the forced Y, which weakens the fitted dependency. Intervention samples help!
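The "ignore the intervened node's counts" rule from these slides can be sketched by filtering samples before counting. The data layout used here (each row paired with the set of column indices forced by do()) is an assumption of this sketch, not the lecture's representation:

```python
import math
from collections import Counter

def family_loglik_iv(data, interventions, child, parents):
    """ML family score under interventions: a sample where `child`
    was forced by do() carries no information about its CPD, so it
    is skipped here (it still counts for the other families)."""
    joint, marg = Counter(), Counter()
    for row, done in zip(data, interventions):
        if child in done:          # child was set by do(child=...)
            continue
        j = tuple(row[p] for p in parents)
        joint[(j, row[child])] += 1
        marg[j] += 1
    return sum(n * math.log(n / marg[j]) for (j, _), n in joint.items())

# Truth X -> Y; columns are (X, Y); interventions list the forced columns.
data = [(0, 0), (1, 1), (0, 0), (1, 1), (1, 1), (0, 0)]
ivs  = [set(), {0}, set(), {0}, set(), set()]
g1 = (family_loglik_iv(data, ivs, 0, ()) +       # G1: X -> Y
      family_loglik_iv(data, ivs, 1, (0,)))
g2 = (family_loglik_iv(data, ivs, 1, ()) +       # G2: Y -> X
      family_loglik_iv(data, ivs, 0, (1,)))
```

On this toy data the two graphs tie if every sample is treated as observational, but with the do(X) samples handled correctly G1 scores strictly higher than G2, which is exactly the slide's point.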
Intervention Modeling in Practice
- Compute the score with intervention data using a modified FamScore S_i*: ignore the counts for the intervened node (G. Cooper and E. Herskovits, Machine Learning, 9, 1992).
- Example conditions on the protein data: no intervention; inhibit protein 3, do(X_3 = 0); activate protein 6, do(X_6 = 1).

Overview of the Experimental Approach
- Conditions (multi-well format): perturbation a, perturbation b, ..., perturbation n.
- Multiparameter flow cytometry yields a dataset of single cells for each condition (condition a, condition b, ..., condition n).
- From these datasets, an influence diagram of the measured variables is inferred by Bayesian network analysis.
- Source: "Causal Protein-Signaling Networks Derived from Multiparameter Single-Cell Data." Sachs et al. Science (2005).
Signaling Networks Example
- Classic signaling network and points of intervention in the human T cell (a white blood cell); the measured proteins are highlighted.
- Intervention conditions — stimuli: anti-CD3, anti-CD28, ICAM-2; inhibitors of Akt, PKC, PIP3, and Mek; activators of PKC and PKA.
- Source: Sachs et al. Science (2005).

Inferred Network
- The inferred network over phospho-proteins and phospho-lipids (several of which were perturbed in the data): Raf, Mek, Plc, PIP2, PIP3, P44/42, Akt, P38, Jnk.
- Source: Sachs et al. Science (2005).
Evaluation I
- Comparing the inferred network to known biology: edges such as Mek → P44/42 (Erk) correspond to direct phosphorylation.
- Source: Sachs et al. Science (2005).

Features of the Approach
- Direct phosphorylation (e.g., Mek → Erk) is difficult to detect using other forms of high-throughput data, such as protein-protein interaction data or microarrays.
- Source: Sachs et al. Science (2005).
Evaluation II
- The inferred network also contains indirect signaling edges in addition to direct phosphorylation edges.
- Source: Sachs et al. Science (2005).

Features of the Approach (continued)
- Indirect signaling: an edge into Jnk is recovered even though the intermediates (Mapkkk, Mapkk) are not measured — indirect connections can be found even when the intermediate molecule(s) are unobserved.
- "Explaining away": unnecessary edges do NOT appear; e.g., with Raf → Mek → Erk learned, no spurious shortcut edge is added.
Indirect Signaling: A Complex Example
- An inferred edge between Raf and Mek looks like a mistake — but the real picture involves Raf S497, Ras, Raf S259, and Mek: the regulation is phospho-protein specific, and there is more than one pathway of influence.

Evaluation
- Against the expected pathway, 15/17 classic edges were recovered (direct phosphorylation and indirect signaling), among the phospho-proteins and phospho-lipids perturbed in the data.
- Source: Sachs et al. Science (2005).
Evaluation (continued)
- Comparing the reported network to the expected pathway: 15/17 classic edges reported, 17/17 reported edges in the expected pathway (some reversed), and 3 edges missed.

Prediction
- Erk1/2 was unperturbed in the data, yet the model makes testable predictions: Erk1/2 influences Akt (an influence previously reported in colon cancer cell lines), and, while correlated with it, Erk1/2 does not influence PKA.
- Context: Raf → Mek → Erk1/2.
Validation
- siRNA against Erk1/Erk2: select transfected cells, stimulate, and measure Akt and PKA by flow cytometry.
- Result: p-Akt changes significantly in Erk1 siRNA, stimulated cells (P = 9.4e-5), while p-PKA (the control) does not — confirming the prediction.
- (Figure: flow cytometry histograms of p-Akt-647 (APC-A) and p-PKA-546 (PE-A), control vs. Erk1 siRNA, stimulated.)

Conclusions
- Many limitations: interventions are limited; flow cytometry measures only 4-12 proteins and gives no time-series data; Bayesian networks cannot represent feedback loops.
- Advantages: in vivo measurement; no a priori knowledge required.
- Enablers of accurate inference: network intervention and sufficient numbers of single cells.
What We've Covered So Far
- ML basics (Bayesian networks, MLE, EM)
- Genetics (association studies, phasing, linkage analysis)
- Systems biology (gene regulation, gene interaction)

What Next? Sequence Analysis (5 lectures)
- Sequencing techniques
- Sequence alignment
- Comparative genomics
CSC321 Lecture 18: Learning Probabilistic Models Roger Grosse Roger Grosse CSC321 Lecture 18: Learning Probabilistic Models 1 / 25 Overview So far in this course: mainly supervised learning Language modeling
More informationGWAS IV: Bayesian linear (variance component) models
GWAS IV: Bayesian linear (variance component) models Dr. Oliver Stegle Christoh Lippert Prof. Dr. Karsten Borgwardt Max-Planck-Institutes Tübingen, Germany Tübingen Summer 2011 Oliver Stegle GWAS IV: Bayesian
More informationChapter 3: Maximum-Likelihood & Bayesian Parameter Estimation (part 1)
HW 1 due today Parameter Estimation Biometrics CSE 190 Lecture 7 Today s lecture was on the blackboard. These slides are an alternative presentation of the material. CSE190, Winter10 CSE190, Winter10 Chapter
More informationQualifier: CS 6375 Machine Learning Spring 2015
Qualifier: CS 6375 Machine Learning Spring 2015 The exam is closed book. You are allowed to use two double-sided cheat sheets and a calculator. If you run out of room for an answer, use an additional sheet
More informationECE521 Tutorial 11. Topic Review. ECE521 Winter Credits to Alireza Makhzani, Alex Schwing, Rich Zemel and TAs for slides. ECE521 Tutorial 11 / 4
ECE52 Tutorial Topic Review ECE52 Winter 206 Credits to Alireza Makhzani, Alex Schwing, Rich Zemel and TAs for slides ECE52 Tutorial ECE52 Winter 206 Credits to Alireza / 4 Outline K-means, PCA 2 Bayesian
More informationExact model averaging with naive Bayesian classifiers
Exact model averaging with naive Bayesian classifiers Denver Dash ddash@sispittedu Decision Systems Laboratory, Intelligent Systems Program, University of Pittsburgh, Pittsburgh, PA 15213 USA Gregory F
More informationGenerative Clustering, Topic Modeling, & Bayesian Inference
Generative Clustering, Topic Modeling, & Bayesian Inference INFO-4604, Applied Machine Learning University of Colorado Boulder December 12-14, 2017 Prof. Michael Paul Unsupervised Naïve Bayes Last week
More informationLearning With Bayesian Networks. Markus Kalisch ETH Zürich
Learning With Bayesian Networks Markus Kalisch ETH Zürich Inference in BNs - Review P(Burglary JohnCalls=TRUE, MaryCalls=TRUE) Exact Inference: P(b j,m) = c Sum e Sum a P(b)P(e)P(a b,e)p(j a)p(m a) Deal
More informationFast Regularization Paths via Coordinate Descent
KDD August 2008 Trevor Hastie, Stanford Statistics 1 Fast Regularization Paths via Coordinate Descent Trevor Hastie Stanford University joint work with Jerry Friedman and Rob Tibshirani. KDD August 2008
More informationGenerating executable models from signaling network connectivity and semi-quantitative proteomic measurements
Generating executable models from signaling network connectivity and semi-quantitative proteomic measurements Derek Ruths 1 Luay Nakhleh 2 1 School of Computer Science, McGill University, Quebec, Montreal
More informationBayesian Methods: Naïve Bayes
Bayesian Methods: aïve Bayes icholas Ruozzi University of Texas at Dallas based on the slides of Vibhav Gogate Last Time Parameter learning Learning the parameter of a simple coin flipping model Prior
More informationCSE 473: Artificial Intelligence Autumn 2011
CSE 473: Artificial Intelligence Autumn 2011 Bayesian Networks Luke Zettlemoyer Many slides over the course adapted from either Dan Klein, Stuart Russell or Andrew Moore 1 Outline Probabilistic models
More informationIntroduction to Probabilistic Machine Learning
Introduction to Probabilistic Machine Learning Piyush Rai Dept. of CSE, IIT Kanpur (Mini-course 1) Nov 03, 2015 Piyush Rai (IIT Kanpur) Introduction to Probabilistic Machine Learning 1 Machine Learning
More informationLearning Bayes Net Structures
Learning Bayes Net Structures KF, Chapter 15 15.5 (RN, Chapter 20) Some material taken from C Guesterin (CMU), K Murphy (UBC) 1 2 Learning Bayes Nets Known Structure Unknown Data Complete Missing Easy
More informationBayesian Networks BY: MOHAMAD ALSABBAGH
Bayesian Networks BY: MOHAMAD ALSABBAGH Outlines Introduction Bayes Rule Bayesian Networks (BN) Representation Size of a Bayesian Network Inference via BN BN Learning Dynamic BN Introduction Conditional
More informationAnnouncements. CS 188: Artificial Intelligence Spring Probability recap. Outline. Bayes Nets: Big Picture. Graphical Model Notation
CS 188: Artificial Intelligence Spring 2010 Lecture 15: Bayes Nets II Independence 3/9/2010 Pieter Abbeel UC Berkeley Many slides over the course adapted from Dan Klein, Stuart Russell, Andrew Moore Current
More informationProbabilistic Graphical Models for Image Analysis - Lecture 1
Probabilistic Graphical Models for Image Analysis - Lecture 1 Alexey Gronskiy, Stefan Bauer 21 September 2018 Max Planck ETH Center for Learning Systems Overview 1. Motivation - Why Graphical Models 2.
More informationSTA 4273H: Sta-s-cal Machine Learning
STA 4273H: Sta-s-cal Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 2 In our
More informationCSE 473: Artificial Intelligence Autumn Topics
CSE 473: Artificial Intelligence Autumn 2014 Bayesian Networks Learning II Dan Weld Slides adapted from Jack Breese, Dan Klein, Daphne Koller, Stuart Russell, Andrew Moore & Luke Zettlemoyer 1 473 Topics
More information