Molecular Similarity Searching Using Inference Network
1 Molecular Similarity Searching Using Inference Network
Ammar Abdo, Naomie Salim*
Faculty of Computer Science & Information Systems, Universiti Teknologi Malaysia
2 Molecular Similarity Searching
Search for chemical compounds with similar structure or properties to a known compound.
A variety of methods are used in these searches:
- Graph theory; 1D, 2D and 3D shape similarity; docking similarity; electrostatic similarity; and others
- Machine learning methods, e.g. BKD, SVM, NBC, NN
The vector space model using 2D fingerprints and the Tanimoto coefficient is one of the most widely used molecular similarity measures.
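As a concrete illustration of the baseline measure these slides compare against, here is a minimal sketch (not from the slides) of the Tanimoto coefficient on binary fingerprints; the fingerprints used are made-up toy data.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient for binary fingerprints given as sets of on-bit
    positions: T = c / (a + b - c), where a and b are the on-bit counts of
    the two fingerprints and c is the number of on-bits they share."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

# Toy fingerprints: sets of on-bit positions (illustrative only)
query = {1, 4, 7, 9, 12}
compound = {1, 4, 8, 9, 13, 15}
print(f"Tanimoto similarity: {tanimoto(query, compound):.3f}")  # 3/8 = 0.375
```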
3 Rationale for Chemical Similarity
Similar property principle: structurally similar molecules are likely to have similar properties.
Given a known active molecule, a similarity search can identify further molecules in the database for testing.
4 Probabilistic models (an alternative approach)
Why probabilistic models?
- Information retrieval deals with uncertain information
- Query and compound characterizations are incomplete
- Probability theory seems to be the most natural way to quantify uncertainty
- Already applied in IR for text documents
5 Why Bayesian Networks
Bayesian networks are the most popular way of doing probabilistic inference in AI:
- A clear formalism to combine evidence
- They modularize the world (dependencies)
Bayesian network models for IR:
- Inference Network (Turtle & Croft, 1991)
- Belief Network (Ribeiro-Neto & Muntz, 1996)
- Simple
6 Bayesian inference
Bayes' rule, the heart of Bayesian techniques:
P(H|E) = P(E|H)P(H) / P(E)
where H is a hypothesis and E is evidence
- P(H): prior probability
- P(H|E): posterior probability
- P(E|H): probability of E if H is true
- P(E): a normalizing constant, so we can write P(H|E) ∝ P(E|H)P(H)
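A short worked example (not from the slides) may help fix the rule; all the numbers below are hypothetical rates for some screening test.

```python
# Hypothetical numbers, for illustration only: P(active) is the prior
# fraction of actives in a collection; P(hit | active) and
# P(hit | inactive) are assumed true/false positive rates of a screen.
p_active = 0.01
p_hit_given_active = 0.9
p_hit_given_inactive = 0.05

# Normalizing constant P(hit) via the law of total probability
p_hit = (p_hit_given_active * p_active
         + p_hit_given_inactive * (1 - p_active))

# Bayes' rule: posterior probability of activity given a hit
p_active_given_hit = p_hit_given_active * p_active / p_hit
print(f"P(active | hit) = {p_active_given_hit:.3f}")  # ~0.154
```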
7 Bayesian Networks
What is a Bayesian network? A directed acyclic graph (DAG) in which nodes represent random variables. The parents of a node are those judged to be direct causes of it. The roots of the network are the nodes without parents. The arcs represent causal relationships between these variables, and the strengths of these causal influences are expressed by conditional probabilities.
x_1, ..., x_n: parent nodes; X is the set of parents of y (in this case, root nodes)
y: child node; each x_i is a cause of y
The influence of X on y is quantified by the conditional probability F(y, X) = P(y|X)
8 Bayesian networks
[Diagram: root nodes a and b with priors p(a) and p(b); child node c with conditional distribution p(c|a,b) for all values of a, b, c - conditional dependence]
Running Bayesian nets: given probability distributions for the roots and conditional probabilities of the other nodes, we can compute the a priori probability of any instance. Changes in parents (e.g., b was observed) cause recomputation of probabilities.
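A minimal sketch (not from the slides) of running this three-node network: computing the joint probability p(a, b, c) = p(a) p(b) p(c|a,b) from made-up distributions.

```python
# Toy three-node network a -> c <- b with Boolean variables.
# All probabilities below are made up for illustration.
p_a = {True: 0.3, False: 0.7}
p_b = {True: 0.6, False: 0.4}
# Conditional probability table for p(c=True | a, b), indexed by (a, b)
p_c_given_ab = {(True, True): 0.9, (True, False): 0.5,
                (False, True): 0.4, (False, False): 0.1}

def joint(a, b, c):
    """Joint probability of one instance: p(a) * p(b) * p(c | a, b)."""
    pc_true = p_c_given_ab[(a, b)]
    return p_a[a] * p_b[b] * (pc_true if c else 1.0 - pc_true)

print(joint(True, False, True))  # 0.3 * 0.4 * 0.5 = 0.06
```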
9 Bayesian networks approach to molecular similarity searching
How to describe and compare molecules:
- Network model generation: description of the system in a suitable network form
- Representation of the importance of descriptors (weighting schemes)
- Probability estimation for the network model
- Calculation of the similarity scores
10 Bayesian inference network
Nodes: compounds (c_j), features (f_i), queries (q_1, q_2, ..., q_r), target (A)
Edges from c_j to its feature nodes f_i indicate that the observation of c_j increases the belief in the variables f_i.
11 Definitions
f_i, c_j, and q_k are random variables.
F = (f_1, f_2, ..., f_n) is an n-dimensional vector (n equal to the fingerprint length)
f_i ∈ {0, 1}, so F has 2^n possible states
c_j ∈ {0, 1}; q ∈ {0, 1}
The rank of a compound c_j is computed as P(q = true | c_j = true) (c_j stands for the state where c_j = true and c_i = false for all i ≠ j, because we observe one compound at a time)
12 Construct the compound network (once)
A directed acyclic graph (DAG) with:
- compound nodes as roots, containing the prior probability of observing each compound
- feature nodes as leaves, containing the probability associated with each node given its set of parent compounds
13 Construct a query network for each query
An inverted DAG with a single leaf for the target molecule and multiple roots that correspond to the features that express the query. A set of intermediate query nodes may also be used when multiple queries express the target. Attach it to the compound network.
14 Similarity calculation
Find the probability that the target molecule (A) is satisfied given that compound c_j has been observed.
Instantiate each c_j, which corresponds to attaching evidence to the network by stating that c_j is true and the rest of the compounds are false.
Find the subset of c_j's that maximizes the probability value of node A (the best subset). Retrieve these c_j's as the answer to the query.
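A schematic sketch of this retrieval loop (an interpretation, not the authors' code): score every compound by the belief it induces at the target node, then rank. The `belief_at_target` callable stands in for the layer-by-layer propagation detailed on the next slides.

```python
def rank_compounds(compounds, query, belief_at_target):
    """Instantiate one compound at a time (c_j = true, all others false),
    propagate belief through the network, and rank by P(A | c_j)."""
    scores = {}
    for cj in compounds:
        scores[cj] = belief_at_target(cj, query)  # P(A = true | c_j = true)
    # Highest-belief compounds are retrieved as the answer to the query
    return sorted(scores, key=scores.get, reverse=True)

# Toy usage with a stand-in belief function (illustration only)
toy_scores = {"c1": 0.42, "c2": 0.77, "c3": 0.13}
ranking = rank_compounds(toy_scores, query="q1",
                         belief_at_target=lambda cj, q: toy_scores[cj])
print(ranking)  # ['c2', 'c1', 'c3']
```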
15 Bayesian inference network
The retrieval of an active compound relative to a given target structure is obtained by means of an inference process through a network of dependences. To carry out the inference process we need to estimate the strength of the relationships represented by the network; this involves estimating and encoding a set of conditional probability distributions.
The inference network described here comprises four layers of nodes (four different random variables). The first layer comprises the compound nodes (roots); the prior probability associated with these nodes is defined as P(c_j) = 1/(collection size).
16 Bayesian inference network
The second layer comprises the feature nodes, so we need to compute P(f_i). P(f_i|c_j) is computed from the first layer (the parent nodes), since that is where the dependency lies. A weighting function is used to estimate the probability P(f_i|c_j), where α is a constant (experiments using the inference network (Turtle, 1991) show that the best value for α is 0.4), ff_ij is the frequency of the ith feature within the jth compound, icf_i is the inverse compound frequency of the ith feature in the collection, cl_j is the size of the jth compound, total_cl is the total length of the compounds in the collection, and m is the total number of compounds in the collection (this equation has been adapted from the Okapi retrieval system (Robertson et al., 1995)).
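The equation itself did not survive transcription. A plausible reconstruction in LaTeX, assuming the Okapi/BM25-style form commonly used with inference networks; the exact normalization, and the symbol cf_i (number of compounds containing feature i, used to define icf_i), are assumptions:

```latex
% Assumed Okapi-style form; cf_i (the number of compounds containing
% feature i) is introduced here only to define icf_i and is not named
% on the slide.
\[
P(f_i \mid c_j) = \alpha + (1-\alpha)\,
  \frac{ff_{ij}}{ff_{ij} + 0.5 + 1.5\,\frac{cl_j}{total\_cl/m}}
  \times icf_i,
\qquad
icf_i = \frac{\log\!\big((m + 0.5)/cf_i\big)}{\log(m + 1)}
\]
```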
17 Bayesian inference network
The third layer comprises only the query nodes, with belief p(q_k), where c_jk is the set of features in common between the jth compound and the kth query, cl_j is the size of the jth compound, ff_ij is the frequency of the ith feature within the jth compound, nff_ik is the normalized frequency of the ith feature within the kth query, nicf_i is the normalized inverse compound frequency of the ith feature in the collection, and p_i is the estimated probability at the ith feature node.
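This formula is also missing from the transcription. A sketch of the kind of weighted-sum link matrix Turtle and Croft use, under the assumption that beliefs are combined as a normalized weighted sum over the features shared by compound and query:

```latex
% Assumed weighted-sum combination over the features shared by
% compound j and query k; the normalization over the query's
% features is an assumption, not taken from the slide.
\[
P(q_k \mid c_j) = \frac{\sum_{i \in c_{jk}} nff_{ik}\, nicf_i\, p_i}
                       {\sum_{i \in q_k} nff_{ik}\, nicf_i}
\]
```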
18 Bayesian inference network
The last layer comprises only the activity-need node (the target), whose belief bel(A) is computed with a weighted-MAX or weighted-SUM operator in the case where more than one query is used, where c_jk is the set of features in common between the jth compound and the kth query, ql_k is the size of the kth query, p_jk is the estimated probability that the kth query is met by the jth compound, and r is the number of queries.
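The operators themselves are not transcribed; assuming Turtle-style weighted link matrices with the query sizes ql_k as weights, they would plausibly read:

```latex
% Assumed forms of the weighted operators; using the query sizes
% ql_k as weights follows the variable list on the slide, but the
% exact normalization is an assumption.
\[
bel_{wsum}(A) = \frac{\sum_{k=1}^{r} ql_k\, p_{jk}}{\sum_{k=1}^{r} ql_k},
\qquad
bel_{wmax}(A) = \frac{\max_{k}\, ql_k\, p_{jk}}{\max_{k}\, ql_k}
\]
```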
19 Experimental details
A subset of MDDR with 12 activity classes; in all, 6804 actives in the 12 classes.
10 sets of 10 randomly chosen compounds from each activity class (to form the sets of queries).
For comparison purposes, similarity calculation is also done using the non-binary Tanimoto coefficient.
Six different types of weighted fingerprints from SciTegic: atom type extended-connectivity counts (ECFC), functional class extended-connectivity counts (FCFC), atom type atom environment counts (EEFC), functional class atom environment counts (FEFC), atom type hashed atom environment counts (EHFC), and functional class hashed atom environment counts (FHFC).
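For count-based fingerprints such as these, the non-binary (continuous) Tanimoto coefficient generalizes the binary form; a minimal sketch with made-up count vectors:

```python
def tanimoto_counts(x, y):
    """Non-binary Tanimoto for count vectors:
    T = sum(x_i * y_i) / (sum(x_i^2) + sum(y_i^2) - sum(x_i * y_i))."""
    dot = sum(a * b for a, b in zip(x, y))
    return dot / (sum(a * a for a in x) + sum(b * b for b in y) - dot)

# Toy count fingerprints (illustrative only)
print(f"{tanimoto_counts([2, 0, 1, 3], [1, 1, 0, 2]):.3f}")  # 8/12 = 0.667
```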
20 MDDR Data
[Table: Code; Activity class; Actives; no. of unique AF(a)/MF(b); av. no. mols. AF/MF; diversity (mean, SD) - the numeric values did not survive transcription]
- 5H3: 5HT3 antagonists
- 5HA: 5HT1A agonists
- D2A: D2 antagonists
- Ren: Renin inhibitors
- Ang: Angiotensin II AT1 antagonists
- Thr: Thrombin inhibitors
- SPA: Substance P antagonists
- HIV: HIV-1 protease inhibitors
- Cyc: Cyclooxygenase inhibitors
- Kin: Tyrosine protein kinase inhibitors
- PAF: PAF antagonists
- HMG: HMG-CoA reductase inhibitors
(a) Unique AF is the number of unique atomic frameworks present in the class.
(b) Unique MF is the number of unique molecular frameworks present in the class.
21 Use of a single reference structure
[Figure; the most diverse class is highlighted]
22 Use of a single reference structure
Comparison of the average percentage of unique atomic frameworks obtained in the top 5% of the ranked test set using BIN and Tan with EHFC_4
[Figure; the most diverse class is highlighted]
23 Use of multiple reference structures
Comparison between BIN and Tan using the MAX rule and ECFC_4
24 Use of multiple reference structures
Comparison of the average percentage of atomic frameworks retrieved in the top 5% of the ranked test set using BIN-MAX and Tan-MAX with ECFC_4
26 BIN with multiple molecular descriptors
So far we have considered using just a single molecular descriptor and multiple reference structures as the basis for a search. Further work: searching with multiple molecular descriptors (ECFC4, EHFC4, FHFC4, FPFC4, PHPFC3) with single and multiple reference structures.
27 Use of a single molecular descriptor and a single reference structure
[Network diagram: compound nodes c_1 ... c_m; feature nodes f_1 ... f_n for each descriptor D_1, D_2, ..., D_s; query nodes q_1 ... q_r for each descriptor; weighted-max link matrices wmax_1, wmax_2, ..., wmax_s combined by a weighted-sum (wsum) link matrix into the target node A]
28 Use of multiple molecular descriptors and a single reference structure
Comparison between multiple descriptors and a single descriptor with a single reference structure using BIN
29 Use of multiple molecular descriptors and multiple reference structures
Comparison between multiple descriptors and a single descriptor with multiple reference structures using BIN
30 Summary I
- The BIN method with a single active reference structure outperforms the Tanimoto similarity method in 11 classes (by between 6% and 71%), a 19% overall improvement; only in one activity class (cyclooxygenase inhibitors) is BIN slightly inferior to Tan (-5%)
- BIN with multiple reference structures is superior to Tan in all activity classes (by between 5% and 118%), significantly outperforming Tan with a 35% improvement in the overall average recall rate
31 Summary II
- BIN with multiple descriptors and a single reference structure slightly outperforms BIN with a single descriptor and a single reference
- BIN with multiple descriptors and multiple reference structures slightly outperforms BIN with a single descriptor and multiple references
- BIN with multiple descriptors enhances performance by a high percentage when the sought actives are structurally heterogeneous, but only slightly when the sought actives are structurally homogeneous
32 Summary III
- Some evidence suggests that BIN is more effective at scaffold hopping for the more diverse data sets
- The networks do not impose additional inference costs because they contain no cycles
- The major strength is the network's combination of distinct evidential sources to support the ranking of a given compound
- BIN provides the ability to integrate several descriptors and several references into a single framework
33 Thank you