WELCOME BAYESIAN NETWORK


1 WELCOME BAYESIAN NETWORK 1

2 OUTLINE HOW IS IT DIFFERENT? REVIEW OF PROBABILITY THEORY INDEPENDENCE OF EVENTS JOINT PROBABILITY DISTRIBUTION MARKOV CONDITION BAYESIAN NETWORK NAÏVE BAYES CLASSIFIER 2

3 DIFFERENCES DEPENDENCE AMONG FEATURES ABLE TO PREDICT THE VALUE OF ANY FEATURE MORE INSIGHT INTO THE DATA ASSESSMENT OF PREDICTION ACCURACY 3

4 PROBABILITY THEORY P(A) = |A| / |S|, WHERE A IS AN EVENT IN THE SAMPLE SPACE S (A ⊆ S), SO THAT 0 <= P(A) <= 1 4

5 INDEPENDENCE OF EVENTS TWO EVENTS ARE INDEPENDENT WHEN KNOWING ONE DOES NOT HELP US TO KNOW THE OTHER ONE BETTER. INDEPENDENCE I_P(E, F): P(E | F) = P(E) AND P(E, F) = P(E) P(F). CONDITIONAL INDEPENDENCE I_P(E, F | G): P(E | F, G) = P(E | G) AND P(E, F | G) = P(E | G) P(F | G) 5

6 EXAMPLES FROM CARD PROBLEMS P(Queen) = 4 / 52 = 1 / 13 P(Spade) = 13 / 52 = 1 / 4 P(Queen | Spade) = 1 / 13 = P(Queen) P(Queen, Spade) = 1 / 52 = P(Queen) P(Spade) P(Royal) = 12 / 52 = 3 / 13 P(Queen | Royal) = 4 / 12 = 1 / 3 > P(Queen) 6
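
The card example can be checked by brute-force counting. Below is a minimal Python sketch (not part of the original slides) that enumerates a 52-card deck and verifies that Queen is independent of Spade but not of Royal:

    from fractions import Fraction

    ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
    suits = ["spades", "hearts", "diamonds", "clubs"]
    deck = [(rank, suit) for rank in ranks for suit in suits]

    def prob(event, given=None):
        # P(event | given) by counting cards in the restricted deck
        space = [c for c in deck if given(c)] if given else deck
        return Fraction(sum(event(c) for c in space), len(space))

    queen = lambda c: c[0] == "Q"
    spade = lambda c: c[1] == "spades"
    royal = lambda c: c[0] in ("J", "Q", "K")

    print(prob(queen))               # 1/13
    print(prob(queen, given=spade))  # 1/13 -> independent of Spade
    print(prob(queen, given=royal))  # 1/3  -> not independent of Royal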

7 MORE EXAMPLES P(One) = 5 / 13 P(One | Square) = 3 / 8 P(One | Black) = 3 / 9 = 1 / 3 P(One | Square, Black) = 2 / 6 = 1 / 3 P(One | White) = 2 / 4 = 1 / 2 P(One | Square, White) = 1 / 2 7

8 RANDOM VARIABLE IT IS EASY TO EXPRESS EVENTS IN TERMS OF RANDOM VARIABLES 8

9 JOINT PROBABILITY DISTRIBUTION (JPD) ALL POSSIBLE COMBINATIONS OF THE VALUES OF RANDOM VARIABLES P (x, y) = P (X = x, Y = y) 9

10 PROBLEM WITH JPD IF WE HAVE 10 RANDOM VARIABLES, EACH WITH 10 DIFFERENT VALUES, THE FULL TABLE HAS 10^10 ENTRIES: EXPONENTIAL COMPLEXITY. CONDITIONAL INDEPENDENCE CAN HELP 10
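
To make the storage argument concrete, here is a small back-of-the-envelope sketch in Python; the factored count assumes a hypothetical network in which every variable has at most two parents, which is only an illustration and not something stated on the slide:

    n_vars, n_vals = 10, 10

    full_jpd_entries = n_vals ** n_vars     # one entry per value combination
    print(full_jpd_entries)                 # 10_000_000_000

    # If conditional independence lets every variable depend on at most two
    # parents, each conditional table has n_vals ** (1 + 2) entries.
    factored_entries = n_vars * n_vals ** 3
    print(factored_entries)                 # 10_000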

11 CAUSAL GRAPH - DAG IT IS A DIRECTED ACYCLIC GRAPH EDGE REPRESENTS CAUSE EFFECT RELATIONSHIP DESCENDANT & NONDESCENDANT 11

12 MARKOV CONDITION WE HAVE A DAG, G = (V, E) PA_X = SET OF PARENTS OF X ND_X = SET OF NONDESCENDANTS OF X FOR ALL X ∈ V, I_P(X, ND_X | PA_X), I.E. EVERY X IS INDEPENDENT OF ITS NONDESCENDANTS GIVEN ITS PARENTS. WHEN A DAG SATISFIES THE MARKOV CONDITION, IT IS CALLED A BAYESIAN NETWORK 12

13 EXAMPLE P(v, s, c) = P(v | c) P(s | c) P(c) P(One, Square, Black) = P(One | Black) P(Square | Black) P(Black) = (1 / 3) (2 / 3) (9 / 13) = 2 / 13 13
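
A small Python sketch of this factorization; the conditional probability tables below are filled in from the numbers worked out on the slides, while the figure they come from is not reproduced here:

    from fractions import Fraction

    # Bayesian network C -> V and C -> S (color, value, shape)
    p_c   = {"black": Fraction(9, 13), "white": Fraction(4, 13)}
    p_v_c = {("one", "black"): Fraction(1, 3), ("one", "white"): Fraction(1, 2)}
    p_s_c = {("square", "black"): Fraction(2, 3), ("square", "white"): Fraction(1, 2)}

    def joint(v, s, c):
        # Markov-condition factorization: P(v, s, c) = P(v | c) P(s | c) P(c)
        return p_v_c[(v, c)] * p_s_c[(s, c)] * p_c[c]

    print(joint("one", "square", "black"))  # 2/13, as on the slide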

14 MORE EXAMPLES NOTICE THAT WE NEED ONLY 11 PROBABILITIES INSTEAD OF 32 14

15 THE PROBLEM FINDING THE STRUCTURE CALCULATING THE CONDITIONAL PROBABILITY TABLES 15

16 Biological basis DNA (Storage of Genetic Information) RNA Polymerase (Copies DNA into RNA) Ribosome (Translates Genetic Information into Proteins) mRNA (Storage & Transport of Genetic Information) Proteins (Expression of Genetic Information) *-PDB file 1L3A, Transcriptional Regulator Pbf-2 16

17 Biological basis How do proteins get regulated? E. coli lactose operon example: normally, E. coli uses glucose to get energy, but how does it react if there is no more glucose but only lactose? 17

18 Biological basis (Figure: the E. coli environment with glucose and lactose; labels: lactose decomposer (β-galactosidase), lactose getter (permease), RNA polymerase, gene lacI-associated protein.) Polymerase action is blocked because of a DNA lock 18

19 Biological basis (Figure: the E. coli environment, now with lactose present; labels: lactose decomposer (β-galactosidase), lactose getter (permease), RNA polymerase.) Lactose recruits the gene lacI-associated protein, unlocking the DNA, which is then accessible to the polymerase 19

20 Biological basis (Figure: regulatory links between the lactose decomposer (β-galactosidase) and the lactose getter (permease); legend: = inhibit.) 20

21 Biological basis (Figure: the E. coli environment without glucose; labels: RNA polymerase, lactose decomposer (β-galactosidase), cAMP, CAP, lactose getter (permease), lactose.) In the absence of glucose, a polymerase magnet binds to the DNA to accelerate the production of the proteins that help lactose decomposition 21

22 Biological basis (Figure: regulatory links between the lactose decomposer (β-galactosidase) and the lactose getter (permease); legend: = activate, = inhibit.) Research goal: infer these links 22

23 Why? Get insights about cellular processes Help understand diseases Find drug targets 23

24 How? Using gene expression data ([mRNA]) and tools for learning Bayesian networks: experiments* + tools for learning Bayesian networks yield a network over the lactose decomposer (β-galactosidase) and the lactose getter (permease). *-Spellman et al. (1998) Mol Biol Cell 9:

25 What is gene expression data? Data showing the concentration ([mRNA]) of a specific mRNA at a given time of the cell's life.* A real value comes from one spot and tells whether the concentration of a specific mRNA is higher (+) or lower (-) than the normal value; every column is the result of one image. *-Spellman et al. (1998) Mol Biol Cell 9:

26 What is a Bayesian network? A graphical representation of a joint distribution over a set of random variables. (Example over nodes A, B, C, D, E:) P(A,B,C,D,E) = P(A)*P(B)*P(C|A)*P(D|A,B)*P(E|D) Nodes represent gene expression while edges encode the interactions (cf. inhibition, activation) 26

27 Bayesian networks' little problem A Bayesian network must be a DAG (Directed Acyclic Graph), but there are many examples of regulatory networks having directed cycles: the hysteric oscillator*, the transcription factor dimer, the switch. *-Husmeier D., Bioinformatics, Vol. 19 no , pages

28 How can we deal with that? Using DBNs (Dynamic Bayesian Networks*) and sequential gene expression data. We unfold the network in time: a node A at times t and t+1 becomes two nodes A1 and A2, and likewise B becomes B1 and B2. A DBN is a BN with constraints on parent and child nodes. *-Friedman, Murphy, Russell, Learning the Structure of Dynamic Probabilistic Networks 28

29 What are we searching for? A Bayesian network that is most probable given the data D (gene expression). We find this BN as BN* = argmax_BN {P(BN | D)}, where by Bayes' rule P(BN | D) = P(D | BN) P(BN) / P(D): P(D | BN) is the marginal likelihood, P(BN) the prior on the network structure, and P(D) the data probability. Naïve approach to the problem: try all possible DAGs and keep the best one! 29

30 It is impossible to try all possible DAGs because the number of DAGs increases super-exponentially with the number of nodes: already 25 DAGs for n = 3, and the counts explode from there. We are interested in problems having around 60 nodes. 30
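
The super-exponential growth can be reproduced with Robinson's recurrence for the number of labeled DAGs; this short Python sketch is an illustration and not part of the slides:

    from functools import lru_cache
    from math import comb

    @lru_cache(maxsize=None)
    def n_dags(n):
        # Robinson's recurrence for the number of labeled DAGs on n nodes
        if n == 0:
            return 1
        return sum((-1) ** (k + 1) * comb(n, k) * 2 ** (k * (n - k)) * n_dags(n - k)
                   for k in range(1, n + 1))

    for n in (3, 4, 5, 6, 10):
        print(n, n_dags(n))   # n = 3 gives 25; the counts explode quickly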

31 Learning Bayesian Networks from data? Choosing a search-space method and a conditional distribution representation. Network-space search methods: greedy hill-climbing, beam search, stochastic hill-climbing, simulated annealing, MCMC simulation; these basically add, remove and reverse edges. Conditional distribution representation: linear Gaussian, or multinomial/binomial, e.g. for a network A -> C <- B we need P(a), P(b) and P(c | a, b). 31
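
The add/remove/reverse moves can be sketched as below; this is a hypothetical helper rather than the presentation's own code, and the simple reachability check keeps proposals acyclic:

    import random

    def creates_cycle(edges, new_edge):
        # Adding new_edge = (u, v) creates a cycle iff u is reachable from v.
        u, v = new_edge
        stack, seen = [v], set()
        while stack:
            node = stack.pop()
            if node == u:
                return True
            if node in seen:
                continue
            seen.add(node)
            stack.extend(child for (parent, child) in edges if parent == node)
        return False

    def random_move(nodes, edges):
        # One proposal for hill-climbing or MCMC: add, remove or reverse an edge.
        edges = set(edges)
        move = random.choice(["add", "remove", "reverse"])
        if move == "add":
            u, v = random.sample(nodes, 2)
            if (u, v) not in edges and not creates_cycle(edges, (u, v)):
                edges.add((u, v))
        elif move == "remove" and edges:
            edges.discard(random.choice(sorted(edges)))
        elif move == "reverse" and edges:
            u, v = random.choice(sorted(edges))
            if not creates_cycle(edges - {(u, v)}, (v, u)):
                edges.remove((u, v))
                edges.add((v, u))
        return edges

    print(random_move(["A", "B", "C"], {("A", "C"), ("B", "C")}))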


33 We use three levels of gene expression: sort the values, split the data into 3 equal buckets, and work with the discretized data. 33
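
A minimal sketch of that discretization step in Python; the three levels and the equal-sized buckets follow the slide, while the function name is ours:

    import numpy as np

    def discretize_tertiles(values):
        # Sort the expression values and split them into 3 equally sized buckets:
        # 0 = low, 1 = medium, 2 = high.
        values = np.asarray(values, dtype=float)
        order = values.argsort()
        levels = np.empty(len(values), dtype=int)
        for level, bucket in enumerate(np.array_split(order, 3)):
            levels[bucket] = level
        return levels

    print(discretize_tertiles([0.1, -1.2, 0.7, 0.0, 2.3, -0.4]))  # [1 0 2 1 2 0]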

34 Return on: P(BN | D) = P(D | BN) P(BN) / P(D), with P(D | BN) the marginal likelihood, P(BN) the prior on the network structure, and P(D) the data probability 34

35 Insight on each term P(BN), the prior on the network: in our research we always use a prior equal to 1. We could incorporate knowledge through it, e.g. if we know an edge must be present, set P(BN) = 1 if the edge is in the BN and P(BN) = 0 otherwise. Efforts are also made to reduce the search space by using knowledge, e.g. limiting the number of parents or children. 35

36 Insight on each term P(D | BN), the marginal likelihood: easy to calculate using a multinomial distribution with a Dirichlet prior*: P(d | bn) = prod_{i=1..n} prod_{j=1..q_i} [ Gamma(N_ij) / Gamma(N_ij + M_ij) ] prod_{k=1..r_i} [ Gamma(a_ijk + s_ijk) / Gamma(a_ijk) ], where a_ijk are the Dirichlet pseudo-counts, s_ijk the data counts, N_ij = sum_k a_ijk and M_ij = sum_k s_ijk. *-Heckerman, A Tutorial on Learning With Bayesian Networks and Neapolitan, Learning Bayesian Networks 36
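
For one node (one factor i of the product) the formula can be evaluated with log-Gamma functions; a minimal sketch, assuming a uniform Dirichlet pseudo-count a_ijk:

    import numpy as np
    from scipy.special import gammaln

    def log_family_score(counts, prior=1.0):
        # counts[j][k] = s_ijk, the number of cases where the node takes value k
        # under parent configuration j; prior = a_ijk (uniform pseudo-count).
        s = np.asarray(counts, dtype=float)
        a = np.full_like(s, prior)
        N = a.sum(axis=1)          # N_ij = sum_k a_ijk
        M = s.sum(axis=1)          # M_ij = sum_k s_ijk
        return (np.sum(gammaln(N) - gammaln(N + M))
                + np.sum(gammaln(a + s) - gammaln(a)))

    # Example: a binary node with one binary parent (two parent configurations)
    print(log_family_score([[12, 3], [4, 9]]))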

37 MCMC (Markov Chain Monte Carlo) simulation Markov Chain part: zoom on a node of the chain. (Figure: a current network over the nodes A, B and C together with its neighbouring networks, each reachable by a single edge change and assigned a proposal probability P(BN_new), here 1/5.) 37

38 MCMC (Markov Chain Monte Carlo) simulation Monte Carlo part: choose the next BN with probability P(BN_new), then accept the new BN with the following Metropolis-Hastings acceptance criterion: P_MH = min(1, P(BN_new | D) / P(BN_old | D)) = min(1, [P(D | BN_new) P(BN_new) / P(D)] / [P(D | BN_old) P(BN_old) / P(D)]) = min(1, P(D | BN_new) P(BN_new) / (P(D | BN_old) P(BN_old))). P(D) is gone! 38
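
In log space the acceptance step is a one-liner; a small sketch in which the scores and function names are ours:

    import math
    import random

    def metropolis_hastings_accept(log_score_new, log_score_old):
        # log_score_* = log(P(D | BN) P(BN)); P(D) has cancelled out of the ratio.
        log_ratio = log_score_new - log_score_old
        accept_prob = math.exp(min(0.0, log_ratio))   # min(1, ratio)
        return random.random() < accept_prob

    # A slightly worse-scoring proposal is still accepted some of the time.
    print(metropolis_hastings_accept(-105.2, -104.8))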

39 Monte Carlo part example: 1. Choose a random path, each path having a P(BN_new) of 1/5. 2. Choose another random number; if it is smaller than the Metropolis-Hastings criterion, accept BN_new, else return to BN_old. (Figure: the same current network over A, B and C and its neighbouring networks, each with proposal probability 1/5.) 39

40 MCMC (Markov Chain Monte Carlo) simulation recap: choose a starting BN at random; burning phase (generate 5*N BNs from MCMC without storing them); storing phase (get 100*N BN structures from MCMC). (Figure: log(P(D | BN) P(BN)) versus iteration, with the burning and storing phases marked.) 40
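
Putting the pieces together, a skeleton of the sampling loop following the slide's recipe; it reuses the random_move and metropolis_hastings_accept sketches above and is illustrative only:

    def mcmc_sample_structures(nodes, score, n, burn_factor=5, store_factor=100):
        # score(edges) should return log(P(D | BN) P(BN)).
        edges = set()                        # starting network (slide: pick one at random)
        samples = []
        for step in range(burn_factor * n + store_factor * n):
            proposal = random_move(nodes, edges)
            if metropolis_hastings_accept(score(proposal), score(edges)):
                edges = proposal
            if step >= burn_factor * n:      # keep structures only after the burning phase
                samples.append(frozenset(edges))
        return samples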

41 Why 100*N BNs and not only 1? Because we don't have enough data and there are a lot of high-scoring networks. Instead, we associate a confidence with each edge, e.g.: how many times in the sample do we find an edge going from A to B? We can then fix a threshold on confidence and retrieve a global network constructed from the edges whose confidence is over the threshold 41
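
The edge-confidence bookkeeping over the 100*N sampled structures can be written as a couple of small helpers (a sketch with toy data, not the presentation's code):

    from collections import Counter

    def edge_confidence(sampled_structures):
        # Fraction of sampled networks in which each directed edge appears.
        counts = Counter(edge for edges in sampled_structures for edge in edges)
        return {edge: n / len(sampled_structures) for edge, n in counts.items()}

    def consensus_network(sampled_structures, threshold=0.8):
        # Global network: keep the edges whose confidence is over the threshold.
        return {e for e, c in edge_confidence(sampled_structures).items()
                if c >= threshold}

    samples = [{("A", "B"), ("B", "C")}, {("A", "B")}, {("A", "B"), ("C", "B")}]
    print(edge_confidence(samples))
    print(consensus_network(samples, threshold=0.7))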

42 What we are working on: mixing both sequential and non-sequential data to retrieve interesting relations between genes. How? Using DBN and MCMC for the sequential data + BN and MCMC for the non-sequential data; the 100*N networks from the DBN and the 100*N networks from the BN feed an information tuner that learns the network 42

43 How to test the approach: Problem: there is no way to test it on real data because there is no completely known network. Solution: work on a realistic simulation where we know the network structure. Example:* simulate. *-Hartemink A. Using Bayesian Network Inference Algorithms to Recover Molecular Genetic Regulatory Networks 43

44 How to test the approach: simulate* sequential and non-sequential data, run DBN MCMC and BN MCMC, combine the results with the info tuner, and compare against the known network using ROC curves. *-Hartemink A. Using Bayesian Network Inference Algorithms to Recover Molecular Genetic Regulatory Networks 44

45 Test description: generate 60 sequential data points; generate 120 non-sequential data points (~the proportion found in reality); run DBN MCMC on the sequential data and keep 100*N sampled networks; run BN MCMC on the non-sequential data and keep 100*N sampled networks; test performance using weights on the two samples, from (0 BN, 1 DBN) through (0.05 BN, 0.95 DBN), ..., (0.95 BN, 0.05 DBN) to (1 BN, 0 DBN). The metric used is the area under the ROC curve: a perfect learner gets 1.0, random gets 0.5 and the worst one gets 0. 45
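
The area under the ROC curve for a set of edge confidences can be computed directly as the probability that a randomly chosen true edge outranks a false one (ties count one half); a self-contained sketch with toy data:

    def roc_auc(confidence, true_edges, candidate_edges):
        # 1.0 for a perfect ranking, 0.5 for a random one, 0.0 for the worst one.
        pos = [confidence.get(e, 0.0) for e in candidate_edges if e in true_edges]
        neg = [confidence.get(e, 0.0) for e in candidate_edges if e not in true_edges]
        wins = sum(1.0 if p > q else 0.5 if p == q else 0.0 for p in pos for q in neg)
        return wins / (len(pos) * len(neg))

    nodes = ["A", "B", "C"]
    candidates = [(u, v) for u in nodes for v in nodes if u != v]
    conf = {("A", "B"): 0.9, ("B", "C"): 0.6, ("C", "B"): 0.4}
    print(roc_auc(conf, {("A", "B"), ("B", "C")}, candidates))   # 1.0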

46 Results: (Figure: area under the ROC curve as a function of the BN/DBN weighting, from 0 BN / 1 DBN to 1 BN / 0 DBN.) 46

47 Perspective: working on more sophisticated ways to mix sequential and non-sequential data; working on real cases: yeast cell cycle, Arabidopsis thaliana circadian rhythm. Real data also means missing values, so evaluate missing-value solutions (EM, KNNimpute) 47

48 Acknowledgements: François Major 48

49 Why is there missing data? Experimental problems; low correlation 49

50 ROC Curve Receiver Operating Characteristic curve 50

51 MCMC simulation and number of sampled networks (Figure: ROC curve area as a function of the number of networks sampled from the MCMC simulation, for a fixed N.) 51

52 GRN Models Directed and undirected graphs Bayesian networks Boolean networks Generalized logical networks Non-linear ordinary differential equations Piecewise linear differential equations Qualitative differential equations Partial differential equations Stochastic master equations Rule based formalisms

53 GRN Models Hidde de Jong, Modeling and simulation of genetic regulatory systems: a literature review; J Comput Biol. 2002;9(1): Review. (Table comparing the models by dynamics, data, node states and computation complexity.) 53 23/2/2008

54 What class of models should be chosen? The selection should be made in view of the data requirements and the goals of modeling and analysis. 54 23/2/2008

55 Classical Tradeoff A fine model with many parameters may be able to capture detailed low-level phenomena (protein concentrations, reaction kinetics), but requires very large amounts of data for inference, lest the model be overfit. A coarse model with lower complexity may succeed in capturing high-level phenomena (which genes are ON/OFF) and requires smaller amounts of data. 55 23/2/2008

56 Occam's Razor 56 23/2/2008

57 Model Reliability and Adequacy P is the set of all possible observations, S the set of all observations made on the study system, M the set of all model outputs, Q = S ∩ M, and Q ⊆ P 57 23/2/2008

58 Model Reliability and Adequacy (Figure: two Venn diagrams inside P contrasting a useless model with the dream model.) 58 23/2/2008

59 Model Reliability and Adequacy Model reliability: Q / M. Model adequacy: Q / S. (Figure: Venn diagrams inside P contrasting an incomplete model with a complete, but erring, model.) 59 23/2/2008

60 GRN Models Directed and undirected graphs Bayesian networks Boolean networks Generalized logical networks Non-linear ordinary differential equations Piecewise linear differential equations Qualitative differential equations Partial differential equations Stochastic master equations Rule based formalisms 60 23/2/2008

61 Directed and undirected graphs Probably the most straightforward way to model a GRN: G = <V, E>, where V is the set of vertices and E is the set of edges <i, j> with i, j ∈ V the head and tail of the edge. Additional labels denote positive/negative influence 61 23/2/2008

62 Directed and undirected graphs Advantages: an intuitive way of visualization; common and well-explored graph algorithms can make biologically relevant predictions about GRSs: paths between genes may reveal missing regulatory interactions or provide clues about redundancy; cycles in the network point at feedback relations; connectivity characteristics give an indication of the complexity; loosely connected subgraphs point at functional modules. Disadvantages: time does not play a role; too much abstraction: a very simplified model, far from reality 62 23/2/2008
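
As an illustration of the "paths between genes" point, a small sketch of a signed, directed GRN and a depth-first path query; the gene names and signs are made up for the example:

    def find_path(edges, source, target):
        # Depth-first search for a directed path between two genes.
        stack, seen = [(source, [source])], set()
        while stack:
            node, path = stack.pop()
            if node == target:
                return path
            if node in seen:
                continue
            seen.add(node)
            stack.extend((child, path + [child])
                         for (parent, child, sign) in edges if parent == node)
        return None

    # (regulator, target, +1 = activation / -1 = inhibition)
    grn = [("geneA", "geneB", -1), ("geneB", "geneC", +1), ("geneD", "geneC", +1)]
    print(find_path(grn, "geneA", "geneC"))   # ['geneA', 'geneB', 'geneC']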

63 GRN Models Directed and undirected graphs Bayesian networks Boolean networks More popular and efficient Generalized logical networks Non-linear ordinary differential equations Piecewise linear differential equations Qualitative differential equations Partial differential equations Stochastic master equations Rule based formalisms 63 23/2/2008

64 Boolean Network Model A Boolean network is defined by a set of nodes, V = {x_1, x_2, ..., x_n}, and a list of Boolean functions, F = {f_1, f_2, ..., f_n}. Each x_k represents the state (expression) of a gene g_k: x_k = 1 means the gene is expressed, x_k = 0 means the gene is not expressed 64 23/2/2008

65 Boolean Network At any given time, combining the gene states gives a gene activity pattern (GAP). (Table: states x_1, x_2, x_3 at time t forming one GAP.) 65 23/2/2008

66 Boolean Network Given a GAP at time t, a deterministic function (a set of logical rules) provides the GAP at time t+1. (Table: GAP at time t mapped to the GAP at time t+1.) 66 23/2/2008

67-73 Boolean Network Example (A worked example built up across several slides: truth tables listing the gene states x_1, x_2, x_3 at time t, the logical update rules such as x_1(t+1) = x_1(t) OR x_2(t), and the resulting states at time t+1.) 23/2/2008

74 Boolean Network Example For each node with k inputs there will be 2^(2^k) possible Boolean functions. (Table: the example's states at times t and t+1 under the rule x_1(t+1) = x_1(t) OR x_2(t).) 74 23/2/2008

75 Boolean Network Example (Table: the three genes x_1, x_2, x_3 at times t and t+1, updated with the Boolean functions OR, NOR and NAND.) 75 23/2/2008
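
A minimal Python sketch of such a Boolean network; the wiring below is hypothetical, since the slide's exact truth table is not preserved in this transcription:

    def step(state, rules):
        # Apply every node's Boolean rule to the current GAP to get the next GAP.
        return tuple(rule(state) for rule in rules)

    # Hypothetical rules: x1' = x1 OR x2, x2' = x1 NOR x3, x3' = x2 NAND x3.
    rules = (
        lambda s: s[0] | s[1],
        lambda s: 1 - (s[0] | s[2]),
        lambda s: 1 - (s[1] & s[2]),
    )

    gap = (0, 1, 1)
    for t in range(5):
        print(t, gap)
        gap = step(gap, rules)   # the trajectory settles into an attractor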

76 Boolean Network Example (Figure: a network wired with AND, NAND and NOT functions.) 76 I. Shmulevich et al., Bioinformatics (2002), 18 (2): 23/2/2008

77 Boolean Networks Summary Advantages: efficient analysis of large regulatory networks; positive/negative feedback cycles can be modeled with Boolean networks. Disadvantages: strong simplifying assumptions (a gene is either on or off, with no in-between states); the computation time is very high, often impractical for constructing large-scale gene networks; very susceptible to noise; there are situations where the Boolean idealisation is not appropriate and more general methods are required. 77 23/2/2008

78 Bayesian Networks A gene regulatory network is represented by a directed acyclic graph: vertices correspond to genes; edges correspond to direct influence or interaction. For each gene x_i, a conditional distribution p(x_i | ancestors(x_i)) is defined. The graph and the conditional distributions uniquely specify the joint probability distribution. 78 23/2/2008

79 Bayesian Network Example (Graph over x_1, ..., x_5.) Conditional distributions: p(x_1), p(x_2), p(x_3 | x_2), p(x_4 | x_1, x_2), p(x_5 | x_4). p(x) = p(x_1) p(x_2) p(x_3 | x_2) p(x_4 | x_1, x_2) p(x_5 | x_4) 79 23/2/2008

80 Learning Bayesian Models Using gene expression data, the goal is to find the Bayesian network that best matches the data. Recovering the optimal conditional probability distributions when the graph is known is easy. Recovering the structure of the graph is NP-hard (non-deterministic polynomial-time hard). But good statistics are available: what is the likelihood of a specific assignment? What is the distribution of x_i given x_j? 80 23/2/2008

81 Issues with Bayesian Models Computationally intensive. Requires lots of data. Does not allow for feedback loops, which are known to play an important role in biological networks. Does not make use of the temporal aspect of the data. Dynamic Bayesian Networks aim at solving some of these issues, but they require even more data. 81 23/2/2008

82 Differential Equations Typically uses linear differential equations to model the gene trajectories: dx_i(t)/dt = a_0 + a_{i,1} x_1(t) + a_{i,2} x_2(t) + ... + a_{i,n} x_n(t). Several reasons for that choice: the lower number of parameters implies that we are less likely to overfit the data, yet it is sufficient to model complex interactions between the genes. 82 23/2/2008
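
A minimal forward-Euler sketch of such a linear system; the coefficient values below are made up for illustration only:

    import numpy as np

    def simulate_linear_grn(A, a0, x0, dt=0.1, steps=50):
        # Integrate dx_i/dt = a0[i] + sum_j A[i, j] * x_j(t) with Euler steps.
        x = np.array(x0, dtype=float)
        trajectory = [x.copy()]
        for _ in range(steps):
            x = x + dt * (a0 + A @ x)
            trajectory.append(x.copy())
        return np.array(trajectory)

    A = np.array([[-0.5, 0.0, 0.3],
                  [ 0.4, -0.6, 0.0],
                  [ 0.0, 0.2, -0.4]])
    a0 = np.array([0.1, 0.0, 0.05])
    print(simulate_linear_grn(A, a0, x0=[1.0, 0.0, 0.0])[-1])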

83-85 Small Network Example (Figure, repeated with different highlights over three slides: a four-gene network with signed (+/-) edges and the corresponding linear differential equations; each equation for dx_i(t)/dt involves only the genes that regulate x_i, e.g. dx_1(t)/dt depends on x_1(t), dx_2(t)/dt on x_3(t) and x_4(t), dx_3(t)/dt on x_1(t) and x_3(t), and dx_4(t)/dt on x_1(t), x_3(t) and x_4(t). Each edge corresponds to one interaction coefficient, and each equation also carries a constant coefficient.) 23/2/2008

86 Problem Revisited Given the time-series data for x_1, ..., x_4, can we find the interaction coefficients a_{0,i}, a_{1,i}, a_{2,i}, a_{3,i}, a_{4,i}? 86 23/2/2008

87 Issues with Differential Equations Even under the simplest linear model, there are m(m+1) unknown parameters to estimate: m(m-1) directional effects, m self-effects and m constant effects. The number of data points is mn, and we typically have n << m (few time points). To avoid overfitting, extra constraints must be incorporated into the model, such as smoothness of the equations or sparseness of the network (few non-null interaction coefficients). 87 23/2/2008
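
One common way to impose the sparseness constraint is an L1 (Lasso) penalty on the interaction coefficients; the following sketch fits one sparse linear model per gene to finite-difference derivatives, and is a generic illustration rather than the presentation's own method:

    import numpy as np
    from sklearn.linear_model import Lasso

    def fit_sparse_coefficients(X, dt=1.0, alpha=0.05):
        # X: time series, rows = time points, columns = genes.
        dX = np.diff(X, axis=0) / dt          # approximate dx_i/dt
        predictors = X[:-1]                   # gene states at the earlier time point
        coefficients = []
        for i in range(X.shape[1]):
            model = Lasso(alpha=alpha, fit_intercept=True).fit(predictors, dX[:, i])
            coefficients.append(np.concatenate(([model.intercept_], model.coef_)))
        return np.array(coefficients)         # row i: a_{0,i}, a_{1,i}, ..., a_{m,i}

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 4))              # toy data: 20 time points, 4 genes
    print(fit_sparse_coefficients(X).round(2))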

88 OUTLINES What is the Gene Regulatory Network? GRN Application of GRN GRN Construction Methodology GRN modeling steps GRN Models GRS Software Next work Reference 88 23/2/2008

89 GRN Software GNA: Genetic Network Analyzer Helix Bioinformatics /2/2008

90 GRN Software Probabilistic Boolean Networks (PBN) Matlab Tool Box Ilya Shmulevich Institute for Systems Biology 90 23/2/2008

91 OUTLINES What is the Gene Regulatory Network? GRN Application of GRN GRN Construction Methodology GRN modeling steps GRN Models GRS Software Next work Reference 91 23/2/2008

92 Future Work: Literature Review Study the noisy nature of microarray data. Study in depth the existing modeling methodology. Focus on specialized problems like cancer.

93 Future Work : GSP Statistics Books Genomics signal processing and statistics, Edward,2006 Introduction to genomics signal processing with control, Ily,2006 Computational and Statistical Approaches to Genomics (Springer, 2006), Ily

94 Future Work : Statistics Books Handbook of Computational Statistics An Introduction to Statistical Signal Processing, Robert M. Gray,2007 fundamentals of statistical signal processing :estimation theory, steven kay nonlinear signal processing a statistical approach, Gonzalo R,2005 Inference_in_HMM, Olivier Cappe, /2/2008

95 Future Work : Modeling Books Modeling and Control of Complex Systems (Control Engineering) by Petros A. Ioannou, Andreas Pitsillides,2008 MODELING BIOLOGICAL SYSTEMS: Principles and Applications2005 gene regulation and metabolism postgenomic computational approaches, Julio, /2/2008

96 Future Work: Resources IEEE Transactions on Computational Biology and Bioinformatics; IEEE International Workshop on Genomic Signal Processing and Statistics; IEEE Journal of Selected Topics in Signal Processing: Special Issue on Genomic and Proteomic Signal Processing; special issue of the EURASIP Journal of Bioinformatics and Systems Biology on Genetic Regulatory Networks; special issue of the IEEE Signal Processing Magazine on Signal Processing Methods in Genomics and Proteomics; special Genomic Signal Processing issue of the IEEE Transactions on Signal Processing; Workshop on Discrete Models for Genetic Regulatory Networks 96 23/2/2008

97 OUTLINES What is the Gene Regulatory Network? GRN Application of GRN GRN Construction Methodology GRN modeling steps GRN Models GRS Software Next work Reference 97 23/2/2008

98 Reference Hidde de Jong, Modeling and simulation of genetic regulatory systems: a literature review; J Comput Biol. 2002;9(1): Review. Ranadip Pal, Aniruddha Datta, Edward R. Dougherty, Bayesian robustness in the control of gene regulatory networks. Anastassiou, D. (2001). Genomic Signal Processing. IEEE Signal Processing. Dougherty, E. R. and A. Datta (2005). "Genomic signal processing: diagnosis and therapy." Signal Processing Magazine, IEEE 22(1): Vaidyanathan, P. P. (2004). Genomics and Proteomics: A Signal Processor's Tour. Circuits and Systems Magazine, IEEE. 4: 98 23/2/2008

99 Reference Vaidyanathan, P. P. and B.-J. Yoon (2004). "The role of signal-processing concepts in genomics and proteomics." Journal of the Franklin Institute (Special Issue on Genomics). 99 23/2/2008

100

101 NAÏVE BAYES FORMULATION WE NEED P(C | F1, F2, ..., Fn). IT CAN BE SHOWN (USING BAYES' RULE AND THE NAÏVE ASSUMPTION THAT THE FEATURES ARE CONDITIONALLY INDEPENDENT GIVEN THE CLASS) THAT P(C | F1, ..., Fn) IS PROPORTIONAL TO P(C) P(F1 | C) P(F2 | C) ... P(Fn | C) 101
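
A small self-contained sketch of a naïve Bayes classifier over discrete features; the toy data and the Laplace smoothing are our own additions, and the smoothing denominator assumes two possible values per feature:

    from collections import Counter, defaultdict

    def train_naive_bayes(examples):
        # examples: list of (feature_tuple, class_label); counts give P(C) and P(F_i | C).
        class_counts = Counter(label for _, label in examples)
        feature_counts = defaultdict(Counter)
        for features, label in examples:
            for i, value in enumerate(features):
                feature_counts[(label, i)][value] += 1
        return class_counts, feature_counts

    def predict(class_counts, feature_counts, features, alpha=1.0):
        # Score each class by P(C) * prod_i P(F_i | C), with Laplace smoothing.
        total = sum(class_counts.values())
        best, best_score = None, -1.0
        for label, count in class_counts.items():
            score = count / total
            for i, value in enumerate(features):
                counter = feature_counts[(label, i)]
                score *= (counter[value] + alpha) / (sum(counter.values()) + 2 * alpha)
            if score > best_score:
                best, best_score = label, score
        return best

    data = [(("high", "yes"), "expressed"), (("high", "no"), "expressed"),
            (("low", "no"), "silent"), (("low", "yes"), "silent")]
    model = train_naive_bayes(data)
    print(predict(*model, ("high", "yes")))   # expressed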

102 THANK YOU ANY QUESTIONS? 102
