Probabilistic Artificial Intelligence
Problem Set 3
Oct 27, 2018

1. Variable elimination

In this exercise you will use variable elimination to perform inference on a Bayesian network. Consider the network in Figure 1 and its corresponding conditional probability tables (CPTs).

Figure 1: Bayesian network for problem 1 (nodes A, B, C, D, E, F).

P(A = t) = 0.3    (1)
P(C = t) = 0.6    (2)

Table 1: CPTs for problem 1.

(a) P(B | A, C)

    A  C  | P(B = t)
    f  f  | 0.2
    f  t  | 0.8
    t  f  | 0.3
    t  t  | 0.5

(b) P(D | C)

    C  | P(D = t)
    f  | 0.9
    t  | 0.75

(c) P(E | B)

    B  | P(E = t)
    f  | 0.2
    t  | 0.4

(d) P(F | D, E)

    D  E  | P(F = t)
    f  f  | 0.95
    f  t  | 1
    t  f  | 0
    t  t  |
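For concreteness, these CPTs are small enough to encode and query directly. The following is a minimal Python sketch, written independently of the provided skeleton code, that answers the conditional queries on A posed below by variable elimination; the helper names (bern, query_a) are illustrative. Note that F and E sum out to constant factors here, since they lie below the evidence, so only C has to be eliminated explicitly.

```python
# CPT entries give P(X = t | parents); False stands for f, True for t.
p_a = 0.3                                        # P(A = t)
p_c = 0.6                                        # P(C = t)
p_b = {(False, False): 0.2, (False, True): 0.8,
       (True, False): 0.3, (True, True): 0.5}    # P(B = t | A, C)
p_d = {False: 0.9, True: 0.75}                   # P(D = t | C)

def bern(p_true, val):
    """P(X = val) for a binary X with P(X = t) = p_true."""
    return p_true if val else 1.0 - p_true

def query_a(b, d):
    """P(A | B = b, D = d) by variable elimination.

    Summing out F and then E yields constant factors, since they are
    leaves below the evidence; only C must be eliminated explicitly."""
    unnorm = {}
    for a in (False, True):
        # Intermediate factor g(a) = sum_c P(c) P(B = b | a, c) P(D = d | c)
        g = sum(bern(p_c, c) * bern(p_b[(a, c)], b) * bern(p_d[c], d)
                for c in (False, True))
        unnorm[a] = bern(p_a, a) * g
    z = sum(unnorm.values())
    return {a: p / z for a, p in unnorm.items()}

print(query_a(b=True, d=False)[True])   # e.g. query 1: P(A = t | B = t, D = f)
```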
Assuming a query on A with evidence for B and D, i.e. P(A | B, D), use the variable elimination algorithm to answer the following queries. Make explicit the selected ordering for the variables and compute the probability tables of the intermediate factors.

1. P(A = t | B = t, D = f)
2. P(A = f | B = f, D = f)
3. P(A = t | B = t, D = t)

Consider now the ordering C, E, F, D, B, A; use again the variable elimination algorithm and write down the intermediate factors, this time without computing their probability tables. Is this ordering better or worse than the one you used before? Why?

2. Belief propagation

In this exercise, you will implement the belief propagation algorithm for performing inference in Bayesian networks. As you have seen in the class lectures, the algorithm is based on converting the Bayesian network to a factor graph and then passing messages between variable and factor nodes of that graph until convergence. You are provided some skeleton Python code in the .zip file accompanying this document. Take the following steps for this exercise.

1. Install the Python dependencies listed in README.txt, if your system does not already satisfy them. After that, you should be able to run demo.py and produce some plots, albeit wrong ones for now.

2. Implement the missing code in bprop.py marked with TODO. In particular, you have to fill in parts of the two functions that are responsible for sending messages from variable to factor nodes and vice versa, as well as parts of the function that returns the resulting marginal distribution of a variable node after message passing has terminated.

3. Now, set up the full-fledged earthquake network, whose structure was introduced in Problem Set 2 and is shown again in Figure 2. Here is the story behind this network: While Fred is commuting to work, he receives a phone call from his neighbor saying that the burglar alarm in Fred's house is ringing. Upon hearing this, Fred immediately turns around to get back and check his home. A few minutes on his way back, however, he hears on the radio that there was an earthquake near his home earlier that day. Relieved by the news, he turns around again and continues his way to work.

To build up the conditional probability tables (CPTs) for the network of Figure 2 you may make the following assumptions about the variables involved:

All variables in the network are binary.

As can be seen from the network structure, burglaries and earthquakes are assumed to be independent. Furthermore, each of them is assumed to occur with probability 0.1%.
Figure 2: The earthquake network to be implemented (nodes: Earthquake, Burglar, Radio, Alarm, Phone).

The alarm is triggered in the following ways: (1) when a burglar enters the house, the alarm will ring 99% of the time; (2) when an earthquake occurs, there will be a false alarm 1% of the time; (3) the alarm might go off due to other causes (wind, rain, etc.) 0.1% of the time. These three types of causes are assumed to be independent of each other.

The neighbor is assumed to call only when the alarm is ringing, but only does so 70% of the time when it is actually ringing.

The radio is assumed to never falsely report an earthquake, but it might fail to report an earthquake that actually happened 50% of the time. (This includes the times that Fred fails to listen to the announcement.)

4. After having set up the network and its CPTs, answer the following questions using your belief propagation implementation:

(a) Before Fred gets the neighbor's call, what is the probability of a burglary having occurred? What is the probability of an earthquake having occurred?

(b) How do these probabilities change after Fred receives the neighbor's phone call?

(c) How do these probabilities change after Fred listens to the news on the radio?
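Under these assumptions the alarm CPT is a noisy-OR of three independent causes, P(Alarm = t | B, E) = 1 - (1 - 0.99·1[B = t])(1 - 0.01·1[E = t])(1 - 0.001). One possible encoding in Python is sketched below; the function names (p_alarm, p_phone, p_radio) and layout are illustrative and independent of the bprop.py interface.

```python
# A possible encoding of the earthquake CPTs under the stated assumptions.
P_BURGLAR = 0.001      # P(Burglar = t)
P_EARTHQUAKE = 0.001   # P(Earthquake = t)

def p_alarm(burglar, earthquake):
    """P(Alarm = t | Burglar, Earthquake) as a noisy-OR of three causes."""
    p_not = (1 - 0.99) if burglar else 1.0      # burglar triggers the alarm 99% of the time
    p_not *= (1 - 0.01) if earthquake else 1.0  # earthquake: false alarm 1% of the time
    p_not *= (1 - 0.001)                        # other causes (wind, rain, ...): 0.1%
    return 1.0 - p_not

def p_phone(alarm):
    """P(Phone = t | Alarm): the neighbor calls 70% of the time the alarm rings."""
    return 0.7 if alarm else 0.0

def p_radio(earthquake):
    """P(Radio = t | Earthquake): a real earthquake is reported 50% of the time."""
    return 0.5 if earthquake else 0.0
```

As a sanity check, p_alarm(False, False) evaluates to 0.001, the pure false-alarm rate, and p_alarm(True, False) evaluates to about 0.99, the burglary case.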
3. Belief propagation on tree factor graphs*

In this exercise we will prove that the belief propagation algorithm converges to the exact marginals after a fixed number of iterations, given that the factor graph is a tree, that is, given that the original Bayesian network is a polytree. We will assume that the factor graph contains no single-variable factors. (You have already seen that if those exist, they can easily be incorporated into multi-variable factors without increasing the complexity of the algorithm.)

Figure 3: An example factor graph and two of its subtrees.

Since the factor graph is a tree, we will designate a variable node, say a, as the root of the tree. We will consider subtrees T[r → t], where r and t are adjacent nodes in the factor graph and t is closer to the root than r, which are of two types:

if r is a factor node (and t a variable node), then T[r → t] denotes a subtree that has t as its root, contains the whole subtree under r and, additionally, the edge {r, t};

if r is a variable node (and t a factor node), then T[r → t] denotes the whole subtree under r with r as its root.

See Figure 3 for two example subtrees, one of each type. The depth of a tree is defined as the maximum distance between the root and any other node. Note that both types of subtrees T[r → t] defined above always have depths that are even numbers.

We will use the subscript notation [r → t] to refer to quantities constrained to the subtree T[r → t]. In particular, we denote by F_{[r → t]} the set of factors in the subtree and by P_{[r → t]}(x_v) the marginal distribution of v when we only consider the nodes of the subtree. More concretely, if r is a variable node, by the sum rule we get

    P_{[r \to t]}(x_r) \propto \sum_{x_{[r \to t]} \setminus \{r\}} \prod_{f \in F_{[r \to t]}} f(x_f),    (1)

where \propto denotes equality up to a normalization constant. Remember the form of the messages passed between variable and factor nodes at each iteration of the algorithm:
    \mu^{(t+1)}_{v \to f}(x_v) := \prod_{g \in N(v) \setminus \{f\}} \mu^{(t)}_{g \to v}(x_v)    (2)

    \mu^{(t+1)}_{f \to v}(x_v) := \sum_{x_f \setminus \{v\}} f(x_f) \prod_{w \in N(f) \setminus \{v\}} \mu^{(t)}_{w \to f}(x_w)    (3)

We also define the estimated marginal distribution of variable v at iteration t as

    \hat{P}^{(t)}(x_v) := \prod_{g \in N(v)} \mu^{(t)}_{g \to v}(x_v).    (4)

(A Python sketch of updates (2)-(4) appears at the end of this problem.)

Our ultimate goal is to show that the estimated marginals are equal to the true marginals for all variables after a number of iterations. However, we will first consider the rooted version of the factor graph and show that the previous statement holds for the root node a. More concretely, if we denote variable nodes with v and factor nodes with f, we will show using induction that for all subtrees T[f → v] of depth τ, it holds that, for all t ≥ τ,

    \mu^{(t)}_{f \to v}(x_v) \propto P_{[f \to v]}(x_v).    (5)

Figure 4: The three cases considered in the proof by induction. (a) Base case. (b) First induction step. (c) Second induction step.

(i) Consider the base case of T[f → v] being a subtree of depth τ = 2 (see Figure 4a). Show that (5) holds in this case for all t ≥ 2.

(ii) Now, assume that (5) holds for all subtrees of depth τ. As a first step, show that for any subtree T[v → f] of depth τ (see Figure 4b) it holds that, for all t ≥ τ + 1, \mu^{(t)}_{v \to f}(x_v) \propto P_{[v \to f]}(x_v).

(iii) Using the result of the previous step, show that, for any subtree T[f → v] of depth τ' = τ + 2, (5) holds for all t ≥ τ' (see Figure 4c).
(iv) Show that, if the factor graph rooted at a has depth d, then, for all t ≥ d,

    \hat{P}^{(t)}(x_a) \propto P(x_a).

(v) To generalize the statement above, assume that the factor graph has diameter D, i.e., the maximum distance between any two nodes in the graph is D. Show that for all t ≥ D the estimated marginal of any variable v is exact, that is,

    \hat{P}^{(t)}(x_v) \propto P(x_v).
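As referenced above, updates (2)-(4) translate almost directly into code. Below is a minimal Python sketch for binary variables with an assumed dictionary-based message layout; the function names and data structures are illustrative and deliberately independent of the provided bprop.py skeleton.

```python
# Minimal message updates for binary variables. A factor is described by its
# scope (a tuple of variable names) and a table mapping joint assignments
# (tuples of 0/1 values, one per scope variable) to nonnegative reals.

def var_to_factor(v, f, factors_of, msgs_fv):
    """Equation (2): product of incoming factor messages, excluding f."""
    msg = {0: 1.0, 1: 1.0}
    for g in factors_of[v]:
        if g != f:
            for x in (0, 1):
                msg[x] *= msgs_fv[(g, v)][x]
    return msg

def factor_to_var(f, scope, table, v, msgs_vf):
    """Equation (3): marginalize the factor against incoming variable messages."""
    i = scope.index(v)
    msg = {0: 0.0, 1: 0.0}
    for assignment, value in table.items():
        term = value
        for j, w in enumerate(scope):
            if w != v:
                term *= msgs_vf[(w, f)][assignment[j]]
        msg[assignment[i]] += term
    return msg

def marginal(v, factors_of, msgs_fv):
    """Equation (4), normalized: product of all incoming factor messages."""
    unnorm = {0: 1.0, 1: 1.0}
    for g in factors_of[v]:
        for x in (0, 1):
            unnorm[x] *= msgs_fv[(g, v)][x]
    z = sum(unnorm.values())
    return {x: p / z for x, p in unnorm.items()}

# Usage on the one-factor tree v1 -- f -- v2, starting from uniform messages:
scope, table = ("v1", "v2"), {(0, 0): 0.5, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.2}
msgs_vf = {(v, "f"): {0: 1.0, 1: 1.0} for v in scope}
msgs_fv = {("f", v): factor_to_var("f", scope, table, v, msgs_vf) for v in scope}
print(marginal("v1", {"v1": ["f"]}, msgs_fv))   # {0: 0.6, 1: 0.4}
```

On this depth-2 tree the marginal of v1 is already exact after a single round of factor-to-variable messages, which is exactly the flavor of the base case in part (i).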