26:010:680 Current Topics in Accounting Research


1 26:010:680 Current Topics in Accounting Research Dr. Peter R. Gillett Associate Professor Department of Accounting and Information Systems Rutgers Business School Newark and New Brunswick

2 OVERVIEW ADAPT Homework 4 Valuation Networks * Lauritzen & Spiegelhalter * Aalborg * Shenoy-Shafer * Gillett Homework 5 March 2, 2011 Dr. Peter R. Gillett 2

3 ADAPT Describes a project to create an expert system for audit program(me) tailoring Goals * To ensure that sufficient audit evidence is planned * To ensure that excessive audit evidence is not planned * To produce an optimal audit approach

4 ADAPT The principal design philosophies incorporated in ADAPT are:
* the user retains ultimate control of the process
* the software is capable both of generating programmes, and of assessing the effectiveness of programmes determined by the auditor
* two techniques are used in audit programme generation: heuristic rules may include appropriate audit procedures; to the extent that both the user and heuristics leave the programme indeterminate, a mathematical evaluation based on the assurance that is derivable from various procedures is used to plan the audit programme
* this combination of techniques is designed to provide a means by which relevant auditor expertise in the designing of audit programmes may be combined with detailed knowledge of the client: to ensure that the audit programme produced is relevant, effective and efficient; to ensure that ADAPT can nevertheless operate with a relatively small number of heuristic rules, requiring only limited effort in supplying responses to questions; and to generate an efficient and cost-effective audit programme by taking into account the structure of audit evidence, and the relationships between related assertions
* ADAPT was implemented initially with a small number of heuristic rules; more were added as necessary during pilot testing, particularly in response to issues arising during validation of the programmes produced by the system
* built-in expertise is used in conjunction with the auditor's own knowledge

5 ADAPT The Assertion Hierarchy [Figure: hierarchy combining financial statement areas F1, F2, F3, F4, etc. (via OR), assertions A1, A2, A3, A4, etc. (each with IR, CR and DR nodes), and procedures P1, P2, P3, etc.]

6 ADAPT Derivable Assurance, Evidential Power and Scope
* Evidential Power = Relevance * Reliability
* Maximum Derivable Assurance (MDA) = Evidential Power * Scope
* Maximum Derivable Assurance (MDA) = Relevance * Reliability * Scope
* Evidential Value = MDA for all relevant assertions
* Diversity and Diversity Factors: b = 1 - (30 + 2*min(p, 5) - f*(13-f))^2/1568
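The diversity factor on this slide can be written directly as a function. A minimal sketch of the arithmetic only: the parameters `p` and `f` are the counts used in ADAPT's diversity calculation, and their precise interpretation is not given on the slide.

```python
def diversity_factor(p: int, f: int) -> float:
    """Diversity factor from the slide:
    b = 1 - (30 + 2*min(p, 5) - f*(13 - f))**2 / 1568
    """
    return 1 - (30 + 2 * min(p, 5) - f * (13 - f)) ** 2 / 1568
```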

7 ADAPT Aggregation Rules
* Direct Assurance DA = AsC_b(X_1, X_2, ..., X_n) = 1 - (1-X_1)*[(1-X_2)*...*(1-X_n)]^b, where the procedures are sorted so that X_1 > X_2 > ... > X_n
* Indirect Assurance IA = Asxx(Y_1, Y_2, ..., Y_q) = Y_1*Y_2*...*Y_q
* Direct and Indirect Assurance AsC(DA, IA) = 1 - (1 - DA)*(1 - IA)
Model Rules
* Audit Risk = Inherent Risk * Control Risk * Detection Risk
* Detection Risk = Audit Risk / (Inherent Risk * Control Risk)
* Target Assurance = 1 - Detection Risk
* Bayesian and Belief Function alternatives
Hierarchical Rules
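A hedged sketch of these aggregation and model rules in Python; the function names are mine, not ADAPT's.

```python
def direct_assurance(assurances, b=1.0):
    """AsC_b: sort descending; the strongest procedure counts in full,
    the rest are damped by the diversity factor b."""
    xs = sorted(assurances, reverse=True)
    if not xs:
        return 0.0
    rest = 1.0
    for x in xs[1:]:
        rest *= 1 - x
    return 1 - (1 - xs[0]) * rest ** b

def indirect_assurance(ys):
    """Asxx: simple product of the indirect assurances."""
    out = 1.0
    for y in ys:
        out *= y
    return out

def combined_assurance(da, ia):
    """AsC(DA, IA) = 1 - (1 - DA)*(1 - IA)."""
    return 1 - (1 - da) * (1 - ia)

def target_assurance(audit_risk, inherent_risk, control_risk):
    """Target Assurance = 1 - Detection Risk = 1 - AR / (IR * CR)."""
    return 1 - audit_risk / (inherent_risk * control_risk)
```

Note that with b = 1 the direct-assurance rule reduces to the usual "noisy-or" combination, while b = 0 discards everything except the single strongest procedure.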

8 ADAPT Control Strategy: for each financial statement area in turn, as selected:
1 Ignore Unimportant assertions (ie those not designated as Important by the user) throughout.
2 Identify Balance Sheet assertions and Transaction Stream assertions from the Assertions Database.
3 Identify Strategic assertions: 3.1 assertions that are normally Strategic are identified on the Assertions Database 3.2 heuristics may modify this determination 3.3 users may override these determinations of Strategic assertions.
4 Set a Target Assurance for each assertion: 4.1 using model rules based on the Audit Risk Model (or other specified alternative) 4.2 applying any appropriate heuristics.
5 Include any procedures designated as Mandatory by heuristics.
6 Ignore any procedures designated as Ignored by heuristics.
7 Include any procedures designated as Necessary by heuristics, except if they have been designated as Excluded by the user.
8 Exclude any procedures designated as Unnecessary by heuristics, except if they have been designated as Included by the user.
9 Include any procedures designated as Included by the user.
10 Include any Additional procedures supplied by the user that are relevant to Important assertions, except if they have subsequently been designated as Excluded by the user.
11 Exclude any procedures designated as Excluded by the user.
12 Include any procedures required by heuristics as a consequence of any procedures already included or excluded.
13 Evaluate the Planned Assurance achieved for each assertion.

9 ADAPT 14 Select audit procedures for Strategic Balance Sheet assertions where the Target Assurance has not been achieved:
14.1 exclude available procedures as required by heuristics
14.2 for each assertion in turn select one of the available procedures: consider any procedures that, if added, would achieve the Target Assurance, and select the one with the lowest Cost; if several procedures have the same lowest Cost, select the one with the greatest Evidential Value; if there are several of these, select the first one encountered; however, if this is the first procedure selected by the Control Strategy, do not select a sample unless there is no qualifying alternative available
if there is no procedure that, if added, would achieve the Target Assurance and this is the first procedure selected by this algorithm, select the procedure with the greatest Maximum Derivable Assurance for the assertion; if several procedures have the same greatest Maximum Derivable Assurance, select the one with the lowest Cost; if several procedures have the same lowest Cost, select the one with the greatest Evidential Value; if there are several of these, select the first one encountered
if there is no procedure that, if added, would achieve the Target Assurance but this is not the first procedure selected by this algorithm, select the procedure with the highest ratio of Maximum Derivable Assurance/Cost for the assertion; if several procedures have the same highest ratio of Maximum Derivable Assurance/Cost, select the one with the greatest Maximum Derivable Assurance for the assertion; if several procedures have the same Maximum Derivable Assurance, select the one with the greatest Evidential Value; if there are several of these, select the first one encountered
but obey any heuristics overriding these choices
14.3 select any procedures now required by heuristics
14.4 revise the Planned Assurance for all appropriate assertions
14.5 iterate CS14.1 to CS14.4 until: there are no more procedures directly relevant or the Target Assurance for the assertion is achieved.
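The selection rule in CS14.2 can be sketched as a greedy choice over candidate procedures. This is a deliberate simplification: heuristic overrides and the no-sample-first rule are omitted, and the field names (`assurance_if_added`, `cost`, `mda`, `ev`) are hypothetical, not ADAPT's.

```python
def select_procedure(candidates, target, first_pick):
    """One pass of the CS14.2 selection rule (simplified sketch).

    candidates: list of dicts with hypothetical keys
      'assurance_if_added' (Planned Assurance if this procedure is added),
      'cost', 'mda' (Maximum Derivable Assurance), 'ev' (Evidential Value).
    first_pick: True if this is the first procedure chosen by the algorithm.
    """
    achieving = [c for c in candidates if c['assurance_if_added'] >= target]
    if achieving:
        # cheapest procedure that achieves the target;
        # ties broken by greatest Evidential Value, then first encountered
        return min(achieving, key=lambda c: (c['cost'], -c['ev']))
    if first_pick:
        # no single procedure achieves the target: take greatest MDA,
        # ties by lowest Cost, then greatest Evidential Value
        return min(candidates, key=lambda c: (-c['mda'], c['cost'], -c['ev']))
    # later picks: best MDA/Cost ratio, ties by MDA, then Evidential Value
    return min(candidates,
               key=lambda c: (-c['mda'] / c['cost'], -c['mda'], -c['ev']))
```

Because `min` returns the first minimal element, the "select the first one encountered" tie-break falls out of the list order for free.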

10 ADAPT 15 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
16 Select audit procedures for Strategic Transaction Stream assertions where the Target Assurance has not been achieved, as described in CS14.
17 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
18 Select audit procedures for non-strategic Balance Sheet assertions where the Target Assurance has not been achieved, as described in CS14.
19 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
20 Select audit procedures for non-strategic Transaction Stream assertions where the Target Assurance has not been achieved, as described in CS14.
21 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
22 For each assertion for which the Target Assurance has not been achieved, consider whether additional assurance can be achieved by extending the Scope of any selected non-sampling procedures:
22.1 consider Analytical Procedures only if the user has indicated that the quality of the accounting systems and the level of detailed data available may permit this
22.2 for each permitted Analytical Procedure or Test of Details that has been selected, ask the user to confirm whether it is possible to extend the Scope of the procedure
22.3 for the procedures where the user has confirmed the possibility of extending the Scope, select procedures as described in CS14.2 and increase the Scope as indicated in the ADAPT procedures database (AUDPROCS), until the Target Assurance for the assertion is achieved.
23 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.

11 ADAPT 24 For each assertion in turn for which the Target Assurance has not been achieved, select remaining procedures relevant to related assertions that provide appropriate indirect evidence as follows:
24.1 identify the related assertion with the lowest Planned Assurance for which there are relevant procedures available
24.2 select one of the available procedures relevant to this assertion, in accordance with CS14.2
24.3 revise the Planned Assurance for all appropriate assertions
24.4 iterate CS24.1 to CS24.3 until: there are no more relevant procedures available or the Target Assurance is achieved for the problem assertion or the Planned Assurances for all the assertions related to the problem assertion have reached the level set by the Governor.
25 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
26 For each assertion for which the Target Assurance has not been achieved, extend the Scope of any of the procedures added during CS24, in accordance with CS22.
27 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
28 Minimise the sample sizes for the selected sampling procedures as follows:
28.1 build a network of related samples by selecting any sample not already included in a previous network, identifying all the assertions to which it is directly relevant for which the Target Assurance has been achieved, and all the assertions to which they provide indirect assurance for which the Target Assurance has been reached, then including all the samples relevant to any of these assertions, and iterating until no further samples are added to the network

12 ADAPT 28.2 for each sample in the network, in descending order of Cost and, where Costs are equal, in ascending order of Evidential Value, consider whether it can be given a scope of zero (ie removed from the programme) without there being: an assertion for which the Target Assurance would no longer be achieved, although it would be achieved if the procedure were retained, or an assertion for which the Target Assurance was not previously achieved, where the Planned Assurance would be reduced if the procedure were eliminated; and if so, remove it, and iterate from CS28.2
28.3 for each of the remaining samples in the network, determine a minimum sample size assuming that the Scope of this procedure alone is reduced, as follows: determine an upper limit for the Sampling Confidence, which will be the lower of the Maximum Sampling Confidence permitted and the Sampling Confidence that would be needed for the procedure in question to achieve the Target Assurance alone; determine a lower limit for the Sampling Confidence, which will be the Sampling Confidence needed to ensure that the minimum sample size is not less than 10 items or, for statistical samples, the Sampling Confidence at which the sampling interval is equal to the acceptable level of imprecision (Touchstone), as this is greater; determine the minimum Sampling Confidence for the sample by using a Binary Chop, rejecting the lower limit whenever there is: an assertion for which the Target Assurance would no longer be achieved, although it would be achieved if the sample size were retained at the present upper limit, or an assertion for which the Target Assurance was not previously achieved, where the Planned Assurance would be reduced if the sample size were reduced to this lower limit; generate the sample size for the minimum Sampling Confidence (using the Grant Thornton Sampling Plan)

13 ADAPT 28.4 ascertain which of the possible reductions in sample sizes corresponding to the minima determined in CS28.3 achieves the greatest reduction in the total Cost of all the sampling, and reduce the Sampling Confidence for the appropriate procedure to the calculated minimum
28.5 iterate CS28.3 to CS28.4, calculating new minima following the previous Sampling Confidence reduction, until no further reductions are possible in the network
28.6 iterate CS28.1 to CS28.5 until all samples included in the audit programme have been included in a network, their Sampling Confidences have been reduced so far as possible, and appropriate sample sizes have been calculated.
29 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.
30 For each procedure in turn that has been selected by the system using the Aggregation rules, in descending order of Cost and, where Costs are equal, in ascending order of Evidential Value:
30.1 eliminate the procedure unless: there is an assertion for which the Target Assurance would no longer be achieved, although it would be achieved if the procedure were retained, or there is an assertion for which the Target Assurance was not previously achieved, where the Planned Assurance would be reduced if the procedure were eliminated
30.2 if there is any procedure selected by heuristic as a result of the procedure under consideration in CS30.1 being selected, then heuristics should enable both to be considered for elimination or retention together.
31 Revise the Planned Assurance for all assertions to take account of the direct and indirect evidence available from the procedures selected.

14 ADAPT 32 If there is any assertion for which the Target Assurance has not been reached, there is insufficient evidence planned.
33 If there is any procedure selected by the user, or as a result of a heuristic rule, that could be eliminated without there being: an assertion for which the Target Assurance would no longer be achieved, although it would be achieved if the procedure were retained, or an assertion for which the Target Assurance was not previously achieved, where the Planned Assurance would be reduced if the procedure were eliminated, there is excessive evidence planned.
Heuristics and Eval function
Control Panel
External representation of knowledge and heuristics * five system databases
Extensive knowledge engineering project involving many people
System verification and validation
Outstanding research issues

15 Homework 4

16 Assignments for Week 7
Lauritzen, Steffen L., and David J. Spiegelhalter. "Local Computations with Probabilities on Graphical Structures and their Application to Expert Systems." Journal of the Royal Statistical Society.
Shenoy, Prakash, and Glenn Shafer. "Axioms for Probability and Belief-Function Propagation." Uncertainty in Artificial Intelligence.
Gillett, Peter R. A Comparative Study of Audit Evidence and Audit Planning Models Using Uncertain Reasoning. UMI.

17 Valuation Networks Last week we studied approximate methods for propagating probabilities A brute force approach to computing probabilities is not realistic, and so we look for ways to carry out local propagation (i.e., to combine probabilities with neighboring probabilities in a network, but not combine all probabilities at once)

18 Valuation Networks Brute force
* 35 binary variables
* 2^35 = 34,359,738,368 rows
* 46 evidence nodes
* 16 relation nodes
* 2 calculation columns
* 64 x 2^35 = 2,199,023,255,552 cells
* = 2,199,023,255,552 bytes
* = 1,024 PCs with 2Gb memory each

19 Valuation Networks Brute force
* 2^35 x 61 = 2,095,944,040,448 multiplications
* 2^35 divisions
* x (2^34 - 1) additions
* 2^35 subtractions
* ==> 4 hours at 2 GHz
* ==> day(s) when overhead added
* ==> many months/years to build an audit plan!
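The counts on these two slides can be checked directly. One assumption is mine: storing one byte per cell, which is what reproduces the 1,024 two-gigabyte PCs quoted on the previous slide.

```python
rows = 2 ** 35            # 35 binary variables -> 34,359,738,368 rows
cols = 46 + 16 + 2        # 46 evidence nodes + 16 relation nodes + 2 calculation columns
cells = rows * cols       # 64 columns x 2**35 rows
mults = rows * 61         # the multiplication count quoted on the slide
pcs = cells // (2 * 2 ** 30)   # assuming one byte per cell and 2 GB per PC
print(rows, cells, mults, pcs)
```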

20 Valuation Networks A great deal of work on formal propagation of probabilities in a network has been carried out by a number of researchers Perhaps most notable among them is Judea Pearl The biggest obstacle was the problem created when the network did not have a tree structure and the key breakthrough came in Lauritzen & Spiegelhalter's paper

21 Lauritzen & Spiegelhalter Emphasis on efficient computational schemes Assumes a fixed model currently being entertained and the numerical assessments are precisely specified Based on development of MUNIN Acknowledges much prior work

22 Lauritzen & Spiegelhalter Main Issues * Initialization * Absorption of evidence * Global propagation * Hypothesizing * Planning * Influential findings

23 Lauritzen & Spiegelhalter As in Pearl's scheme, there is a fundamental assumption that the joint distribution is given as a product of priors and conditionals; e.g.,
P(α, τ, ξ, ε, δ, λ, β, σ) = P(α) P(τ|α) P(ξ|ε) P(ε|τ,λ) P(δ|ε,β) P(λ|σ) P(β|σ) P(σ)
Or, working with proportionality, as potentials:
ψ(α) ψ(τ,α) ψ(ξ,ε) ψ(ε,τ,λ) ψ(δ,ε,β) ψ(λ,σ) ψ(β,σ) ψ(σ)

24 Lauritzen & Spiegelhalter Evidence potentials are not uniquely determined They may not, in general, be easy to interpret other than through the joint probability being proportional to their product Initially, however, we may set the potentials equal to the probability factors

25 Lauritzen & Spiegelhalter Building the tree of cliques for propagation * Form the moral graph by: dropping directions on the links; and marrying parents * Add fill-ins to create a triangulated graph * Identify the cliques of the triangulated graph * Label nodes using maximal cardinality search, rank cliques according to the highest labeled node, to form a set chain with the running intersection property * This is, of course, a join tree; arbitrarily select a root

26 Lauritzen & Spiegelhalter A subset of nodes of a graph is complete if every member is connected to every other member by an edge A clique is a maximal complete subset

27 Lauritzen & Spiegelhalter Maximum cardinality search * Assign 1 to an arbitrary node * Number the nodes consecutively, choosing as the next a node with a maximum number of previously numbered neighbors * Break ties arbitrarily
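A minimal sketch of maximum cardinality search as stated above; `adj` maps each node to the set of its neighbors, and ties are broken by whichever candidate is encountered first.

```python
def max_cardinality_search(adj, start):
    """Number the nodes by maximum cardinality search.

    adj: dict mapping node -> set of neighbors.
    Returns the nodes in MCS order (position = assigned number - 1).
    """
    order = [start]
    numbered = {start}
    while len(order) < len(adj):
        # next node = an unnumbered node with the most numbered neighbors
        nxt = max((n for n in adj if n not in numbered),
                  key=lambda n: len(adj[n] & numbered))
        order.append(nxt)
        numbered.add(nxt)
    return order
```

On the moral graph of the student example used later in these slides (Overworked and Insomnia married, each linked to Tired and Cross), starting at Overworked reproduces the numbering 1 Overworked, 2 Insomnia, 3 Tired, 4 Cross.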

28 Lauritzen & Spiegelhalter Maximum cardinality fill-in * Assume the nodes are numbered as for maximum cardinality search * Working backwards from the last node to the first, consider the set bd(i) ∩ {1, ..., i-1}, where bd(i) is the set of numbers of the neighbors of the i-th node * If this set is not complete, add whatever fill-in edges are necessary to make it so
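The fill-in step above can be sketched as follows, taking an MCS ordering as input and returning whatever edges had to be added; the function name is mine.

```python
def fill_in(adj, order):
    """Maximum cardinality fill-in.

    adj: dict mapping node -> set of neighbors; order: nodes in MCS order.
    Working backwards, complete each node's set of earlier neighbors.
    Returns the list of added (fill-in) edges.
    """
    pos = {n: k for k, n in enumerate(order)}
    adj = {n: set(nbrs) for n, nbrs in adj.items()}   # copy; will be mutated
    added = []
    for n in reversed(order):
        earlier = {m for m in adj[n] if pos[m] < pos[n]}
        for a in earlier:
            for b in earlier:
                if a != b and b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    added.append((a, b))
    return added
```

On the already-triangulated student moral graph no edges are added; on a four-cycle one chord is added, as expected.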

29 Lauritzen & Spiegelhalter Running intersection property * Suppose we have an ordered set of cliques * Then the set of cliques has the running intersection property if, for each i, all the nodes of C_i that are in any earlier cliques are all members of one particular earlier clique; i.e., for all i >= 2 there exists j < i such that C_i ∩ (C_1 ∪ ... ∪ C_(i-1)) ⊆ C_j
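Checking the running intersection property for an ordered list of cliques is a direct translation of the definition:

```python
def has_running_intersection(cliques):
    """cliques: ordered list of sets.

    For each i >= 2, the nodes of C_i that appear in earlier cliques
    must all lie within a single earlier clique.
    """
    for i in range(1, len(cliques)):
        earlier = set().union(*cliques[:i])
        sep = cliques[i] & earlier
        if not any(sep <= c for c in cliques[:i]):
            return False
    return True
```

With only two cliques the property holds vacuously, as the later slides note for {O,I,T} and {O,I,C}.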

30 Lauritzen & Spiegelhalter Rules for propagation * Inward pass Each node waits to send its message to its inward neighbor until it has received messages from all its outward neighbors When a node is ready to send its message to its inward neighbor, it computes the message by marginalizing its current potential to its intersection with its inward neighbor; it sends this marginal to the inward neighbor, and then divides its own current potential by it When a node receives a message from its outward neighbor, it replaces its own current potential with the product of that potential and the message

31 Lauritzen & Spiegelhalter Rules for propagation * Outward pass Each node waits to send its message to its outward neighbors until it has received the message from its inward neighbor When a node is ready to send its message to its outward neighbor, it computes the message by marginalizing its current potential to its intersection with its outward neighbor; it sends this marginal to the outward neighbor When a node receives a message from its inward neighbor, it replaces its own current potential with the product of that potential and the message
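The two passes can be sketched on a made-up two-clique chain {A,B} - {B,C} with root {A,B}, holding P(A) P(B|A) and P(C|B) respectively; all the numbers here are invented for illustration, and after both passes each clique holds the joint marginalized to its own variables.

```python
from itertools import product

def marg(pot, keep):
    """Marginalize a potential (vars, table) onto the variables in keep."""
    vars_, table = pot
    idx = [vars_.index(v) for v in keep]
    out = {}
    for cfg, val in table.items():
        key = tuple(cfg[i] for i in idx)
        out[key] = out.get(key, 0.0) + val
    return (tuple(keep), out)

def pointwise(pot, msg, op):
    """Apply op(cell, msg cell) pointwise; msg lives on a subset of vars."""
    vars_, table = pot
    mvars, mtab = msg
    idx = [vars_.index(v) for v in mvars]
    return (vars_, {cfg: op(val, mtab[tuple(cfg[i] for i in idx)])
                    for cfg, val in table.items()})

mul = lambda a, b: a * b
div = lambda a, b: a / b if b else 0.0   # L&S convention: 0/0 = 0

phi1 = (('A', 'B'), {(a, b): (0.3 if a else 0.7)
        * {(1, 1): .9, (1, 0): .1, (0, 1): .2, (0, 0): .8}[(a, b)]
        for a, b in product((0, 1), repeat=2)})          # P(A) P(B|A)
phi2 = (('B', 'C'), {(b, c): {(1, 1): .5, (1, 0): .5, (0, 1): .1, (0, 0): .9}[(b, c)]
        for b, c in product((0, 1), repeat=2)})          # P(C|B)

# inward pass: {B,C} marginalizes to the separator {B}, sends the message,
# then divides its own potential by it; the root multiplies the message in
m_in = marg(phi2, ('B',))
phi2 = pointwise(phi2, m_in, div)
phi1 = pointwise(phi1, m_in, mul)

# outward pass: the root marginalizes to {B} and sends; {B,C} multiplies
m_out = marg(phi1, ('B',))
phi2 = pointwise(phi2, m_out, mul)

p_c = marg(phi2, ('C',))[1]   # marginal distribution of C
```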

32 Example Lauritzen & Spiegelhalter * Bayesian Network for student behavioral problems: Overworked, Insomnia, Tired, Cross
P(o) = 0.2; P(i) = 0.01;
P(t|o,i) = 0.98; P(t|o,~i) = 0.8; P(t|~o,i) = 0.9; P(t|~o,~i) = 0.05;
P(c|o,i) = 0.96; P(c|o,~i) = 0.9; P(c|~o,i) = 0.6; P(c|~o,~i) = 0.01

33 Lauritzen & Spiegelhalter Moral graph [Figure: Overworked, Insomnia, Tired, Cross, with Overworked and Insomnia married]

34 Lauritzen & Spiegelhalter This is clearly triangulated... [Figure: the moral graph of Overworked, Insomnia, Tired, Cross]

35 Lauritzen & Spiegelhalter Maximum cardinality: 1 Overworked, 2 Insomnia, 3 Tired, 4 Cross

36 Lauritzen & Spiegelhalter No fill-ins needed because
bd(4) ∩ {1, 2, 3} = {1, 2}
bd(3) ∩ {1, 2} = {1, 2}
bd(2) ∩ {1} = {1}
bd(1) ∩ {} = {}

37 Lauritzen & Spiegelhalter Cliques ordered by highest numbered node: {1, 2, 3}, {1, 2, 4} = {O,I,T}, {O,I,C}
Since there are only two cliques, they vacuously have the running intersection property!
i | Clique C_i | Residual R_i | Separator S_i | Parent(s)
1 | {O,I,T} | {O,I,T} | - | -
2 | {O,I,C} | {C} | {O,I} | 1

38 Assignment 4 - Question 2 {O,I,C} {O,I,T} It would be natural to take {O,I,T} as root, but since this is a tree I can in fact choose any node, and I have chosen {O,I,C} as root in the next slide!

39 Lauritzen & Spiegelhalter [Figure: the tree of cliques rooted at {O,I,C}]

40 Aalborg Architecture Building the Junction Tree * Construct a tree of cliques with the running intersection property as for the Lauritzen & Spiegelhalter architecture Note that more efficient techniques for fill-in such as a one-step look-ahead may be used instead of maximum cardinality Alternatively, a tree of cliques may be constructed by building a Join Tree (Markov Tree) using the Shenoy-Shafer architecture, and then dropping all nodes that are subsets of adjacent nodes * Place a separator between each two adjacent cliques containing their intersection * This is, of course, a join tree; arbitrarily select a root

41 Aalborg Architecture Rules for propagation * If separators initially contain tables of 1s, then for both the Inward pass and Outward pass: Each non-root node waits to send a message to a given neighbor until it has received messages from all its other neighbors The root waits to send messages to its neighbors until it has received messages from them all When a node is ready to send its message to a particular neighbor, it computes the message by marginalizing its current potential to its intersection with this neighbor, and then sends the message to the separator between it and the neighbor

42 Aalborg Architecture Rules for propagation * If separators initially contain tables of 1s, then for both the Inward pass and Outward pass: When a separator receives a message from one of its two nodes, it divides the message by its current potential, sends the quotient on to the other node, and replaces its current potential with the new message When a node receives a message, it replaces its current potential with the product of that potential and the message
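These rules can be sketched on the student example used on the following slides: cliques {O,I,C} (the root) and {O,I,T} joined by the separator {O,I}. One assumption is mine: the factors P(o), P(i) and P(c|o,i) are all assigned to the root clique.

```python
from itertools import product

P_O, P_I = 0.2, 0.01
P_C = {(1, 1): .96, (1, 0): .90, (0, 1): .60, (0, 0): .01}  # P(c=true | o, i)
P_T = {(1, 1): .98, (1, 0): .80, (0, 1): .90, (0, 0): .05}  # P(t=true | o, i)

# clique potentials: the root {O,I,C} holds P(o) P(i) P(c|o,i) (assumed),
# {O,I,T} holds P(t|o,i); the separator {O,I} starts as a table of 1s
oic = {(o, i, c): (P_O if o else 1 - P_O) * (P_I if i else 1 - P_I)
                  * (P_C[o, i] if c else 1 - P_C[o, i])
       for o, i, c in product((0, 1), repeat=3)}
oit = {(o, i, t): P_T[o, i] if t else 1 - P_T[o, i]
       for o, i, t in product((0, 1), repeat=3)}
sep = {(o, i): 1.0 for o, i in product((0, 1), repeat=2)}

def send(src, dst, sep):
    # the sender marginalizes onto {O,I}; the separator divides the message
    # by its current table, forwards the quotient, and stores the message
    msg = {(o, i): sum(src[o, i, x] for x in (0, 1)) for (o, i) in sep}
    quot = {k: msg[k] / sep[k] for k in sep}
    sep.update(msg)
    for (o, i, x) in dst:
        dst[o, i, x] *= quot[o, i]

send(oit, oic, sep)   # inward pass ({O,I,C} is the root)
send(oic, oit, sep)   # outward pass

p_o = sum(v for (o, i, c), v in oic.items() if o)   # marginal P(o = true)
p_i = sum(v for (o, i, t), v in oit.items() if i)   # marginal P(i = true)
```

With no evidence entered, both cliques end up holding the joint marginalized to their variables, so the marginals simply reproduce the priors.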

43 Aalborg Architecture Junction Tree
{O,I,C} - {O,I} - {O,I,T}
* Separator {O,I} initialized to 1s: (o,i) 1; (o,~i) 1; (~o,i) 1; (~o,~i) 1
* {O,I,T} holds P(t|o,i): (o,i) 0.98; (o,~i) 0.80; (~o,i) 0.90; (~o,~i) 0.05 (values for t true)
* {O,I,C} holds the remaining factors P(o) P(i) P(c|o,i)

44 Aalborg Architecture Inward Pass
* {O,I,T} marginalizes over T and sends the result to the separator: (o,i) 0.98 + 0.02 = 1; likewise 1 for (o,~i), (~o,i) and (~o,~i)
* the separator divides the message by its current table of 1s, passes the quotient on, and stores the message
* the root {O,I,C} multiplies its potential by the (all-1s) quotient, leaving it unchanged

45 Aalborg Architecture Outward Pass
* the root {O,I,C} marginalizes over C and sends the result to the separator: (o,i) 0.002; (o,~i) 0.198; (~o,i) 0.008; (~o,~i) 0.792
* the separator divides by its stored table, stores the new message, and passes it on
* {O,I,T} multiplies: (o,i,t) 0.98 x 0.002 = 0.00196; (o,~i,t) 0.80 x 0.198 = 0.1584; (~o,i,t) 0.90 x 0.008 = 0.0072; (~o,~i,t) 0.05 x 0.792 = 0.0396

46 Aalborg Architecture Hence, after normalization: P(o) = 0.002 + 0.198 = 0.2 and: P(i) = 0.002 + 0.008 = 0.01

47 Aalborg Architecture The Aalborg architecture has been implemented in Bayesian Network software called HUGIN Download a demonstration version from hugin-lite/200-hugin-lite-windows Recompute this example using HUGIN Add to the HUGIN model the observation that the student suffers from Insomnia, and examine what happens to the posterior probability of the student being Overworked as a result

48 Textbooks Aalborg Architecture * Introduction to Bayesian Networks Finn V. Jensen * Probabilistic Networks and Expert Systems Robert G. Cowell, Steffen L. Lauritzen, David J. Spiegelhalter * Expert Systems and Probabilistic Network Models Enrique Castillo, Jose M. Gutierrez, Ali S. Hadi

49 Shenoy-Shafer Architecture Building the Join Tree * Detailed algorithms were provided in the readings for Shenoy-Shafer Join Trees * Later work has proposed a refinement that is more efficient Binary Join Trees * The Shenoy-Shafer algorithm was designed to work with both probabilities and belief functions * The axioms are fairly general, and the algorithm has been found to apply to many other valuations, such as Possibilities and Spohn's Epistemic (Dis)beliefs

50 Shenoy-Shafer Architecture Rules for propagation * There is no distinction between inward and outward passes Each node waits to send its message to a given neighbor until it has received messages from all its other neighbors When a node is ready to send its message to a particular neighbor, it computes the message by collecting all its messages from other neighbors, multiplying its own potential by these messages, and marginalizing the product to its intersection with the neighbor to which it is sending
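A minimal sketch of the Shenoy-Shafer message rule on a tiny join tree {A} - {A,B} - {B}, with made-up numbers and an observation at {B}; note that, unlike the two previous architectures, there is no division step anywhere.

```python
pA = {1: 0.3, 0: 0.7}            # prior P(A) held at node {A}
pBA = {(1, 1): .9, (1, 0): .1,   # P(B=b | A=a), keyed (a, b), held at {A,B}
       (0, 1): .2, (0, 0): .8}
evB = {1: 1.0, 0: 0.0}           # observation B = true, held at node {B}

# message {A} -> {A,B}: its own potential (it has no other neighbors)
m_A = pA
# message {B} -> {A,B}: its own potential
m_B = evB
# message {A,B} -> {A}: combine its own potential with the message from
# its *other* neighbor {B}, then marginalize onto A
m_to_A = {a: sum(pBA[a, b] * m_B[b] for b in (0, 1)) for a in (0, 1)}
# marginal at {A}: own potential times the incoming message, normalized
unnorm = {a: pA[a] * m_to_A[a] for a in (0, 1)}
z = sum(unnorm.values())
post_A = {a: unnorm[a] / z for a in (0, 1)}
```

Here the posterior is P(A=1 | B=1) = 0.27 / 0.41, simply Bayes' rule recovered by local message passing.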

51 Shenoy-Shafer Architecture Let's return to our example of cross, tired doctoral students but now adding the knowledge that the student suffers from Insomnia

52 Shenoy-Shafer Architecture Join Tree [Figure: {O} and {C} attached to {O,I,C}; {O,I,C} - {O,I} - {O,I,T}; {I} attached to {O,I}; {T} attached to {O,I,T}]

53 Shenoy-Shafer Architecture Initially
* {O}: o 0.2; ~o 0.8
* {I}: prior i 0.01; ~i 0.99; observation i 1; ~i 0
* {T}: observation t 1; ~t 0
* {C}: observation c 1; ~c 0
* {O,I,T}: P(t|o,i): (o,i,t) 0.98; (o,i,~t) 0.02; (o,~i,t) 0.80; (o,~i,~t) 0.20; (~o,i,t) 0.90; (~o,i,~t) 0.10; (~o,~i,t) 0.05; (~o,~i,~t) 0.95
* {O,I,C}: P(c|o,i): (o,i,c) 0.96; (o,i,~c) 0.04; (o,~i,c) 0.90; (o,~i,~c) 0.10; (~o,i,c) 0.60; (~o,i,~c) 0.40; (~o,~i,c) 0.01; (~o,~i,~c) 0.99

54 Shenoy-Shafer Architecture Propagation [Figure: the leaves send their potentials: {O} sends (o 0.2; ~o 0.8), and {T}, {C} and {I} send their observation potentials (1; 0)]

55 Shenoy-Shafer Architecture Propagation [Figure: {O,I,T} absorbs t = true and sends (o,i) 0.98; (o,~i) 0.80; (~o,i) 0.90; (~o,~i) 0.05 toward {O,I}]

56 Shenoy-Shafer Architecture Propagation [Figure: {O,I,C} absorbs c = true and sends (o,i) 0.96; (o,~i) 0.90; (~o,i) 0.60; (~o,~i) 0.01 toward {O,I}]

57 Shenoy-Shafer Architecture Propagation [Figure: {O,I} combines the incoming messages with the Insomnia evidence and passes them on]

58 Shenoy-Shafer Architecture Propagation [Figure: the combined message arriving at {O}: o 0.9408; ~o 0.5400]

59 Shenoy-Shafer Architecture Hence the posterior for O is given by o: 0.2 x 0.9408 = 0.18816 and ~o: 0.8 x 0.5400 = 0.43200; after normalization o ≈ 0.3034, ~o ≈ 0.6966
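The propagated result can be cross-checked by brute-force enumeration of the joint distribution, conditioning on the student being cross, tired, and suffering from insomnia; the CPT numbers are those given on slide 32.

```python
P_O, P_I = 0.2, 0.01
P_T = {(1, 1): .98, (1, 0): .80, (0, 1): .90, (0, 0): .05}  # P(t=true | o, i)
P_C = {(1, 1): .96, (1, 0): .90, (0, 1): .60, (0, 0): .01}  # P(c=true | o, i)

def joint(o, i, t, c):
    """Joint probability of one configuration of the student network."""
    return ((P_O if o else 1 - P_O) * (P_I if i else 1 - P_I)
            * (P_T[o, i] if t else 1 - P_T[o, i])
            * (P_C[o, i] if c else 1 - P_C[o, i]))

# condition on Insomnia, Tired and Cross all observed true
den = sum(joint(o, 1, 1, 1) for o in (0, 1))
post_o = joint(1, 1, 1, 1) / den
```

This agrees with the locally propagated answer, since the joint here factors exactly as the join tree assumes.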

60 Chapter 3 A Comparative Study... * Introduces Shenoy's Valuation-Based Systems * Defines Variables Configurations Valuations Marginalization Combination * Introduces Valuation Networks

61 Chapter 3 A Comparative Study... * Axioms for Local Computation Order of deletion does not matter Combination is commutative and associative Marginalization distributes over combination * Fusion and Computation Algorithms * Join Tree construction * Relationship with Auditing research * Illustrative models

62 A Comparative Study... Chapter 4 Probability Theory * Introduction * Probability as VBS * AND nodes * Discounted AND nodes * Independent and non-independent evidence * Hierarchical aggregation * Example

63 A Comparative Study... Chapter 5 Belief Functions * Introduction * Belief Functions as VBS * AND nodes * Discounted AND nodes * Independent and non-independent evidence * Statistical Audit Evidence * Hierarchical aggregation * Example

64 A Comparative Study... Chapter 6 Spohn's Epistemic Beliefs * Introduction * Spohn's Epistemic Beliefs as VBS * AND nodes * No Discounted AND nodes! * Independent and non-independent evidence * Statistical Audit Evidence undesirable? * Hierarchical aggregation * Example

65 A Comparative Study... Chapter 7 Possibility Theory * Introduction * Possibility Theory as VBS * AND nodes * No Discounted AND nodes * Independent and non-independent evidence * Statistical Audit Evidence undesirable? * Hierarchical aggregation * Example

66 A Comparative Study... Chapter 8 The Decision problem * Compares VN Instantiations *...

67 Chapter 9 A Comparative Study... * Not in readings * Discusses knowledge elicitation difficulties for a comparative study, and implements proposed solutions

68 Possibility Theory Possibility Theory is a computational implementation (largely inspired by Dubois and Prade) of Zadeh's Fuzzy Logic. Shenoy showed that it may be re-formulated as a variant of Valuation Networks, so that possibilities may be propagated in Join Trees using the Shenoy-Shafer algorithm

69 Possibility Theory A possibility function is a function π: 2^Ω → [0,1] such that: ∃x ∈ Ω: π({x}) = 1, and ∀a ∈ 2^Ω: π(a) = max{π({x}) : x ∈ a}
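As a small sketch (the names and representation are mine, not from the slides), the two conditions translate directly into code, with a possibility function built from its singleton values:

```python
def make_possibility(singletons):
    """Build a possibility function from singleton values pi({x}).

    Requires the normalization condition: some element has possibility 1.
    """
    assert max(singletons.values()) == 1, "some singleton must have possibility 1"

    def pi(subset):
        # Second condition: pi(a) = max{ pi({x}) : x in a }
        return max((singletons[x] for x in subset), default=0)

    return pi

pi = make_possibility({"t": 1, "f": 0.3})
assert pi({"t", "f"}) == 1   # possibility of the whole frame is 1
assert pi({"f"}) == 0.3
```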

70 Possibility Theory By virtue of this second condition, possibility functions are completely determined by their values for singleton elements. Intuitively, the degree of possibility for a subset a is 1 − π(~a), and the degree of impossibility is 1 − π(a)

71 Possibility Theory
Marginalization: ∀y ∈ Ω_{a−{X}}: π^{↓a−{X}}(y) = max{π(y, x) : x ∈ Ω_X}
Combination: (π₁ ⊗ π₂)(x) = K⁻¹ π₁(x^{↓a}) π₂(x^{↓b}) if K ≠ 0, and 0 if K = 0, where K = max{π₁(x^{↓a}) π₂(x^{↓b}) : x ∈ Ω_{a∪b}}
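These two operations can be sketched in code (my own representation, not from the slides: a valuation is a dict from configuration tuples to possibility values, and combination is shown only for two valuations on the same domain):

```python
def marginalize_out(valuation, x_index):
    """Max-marginalize out the variable at tuple position x_index."""
    result = {}
    for config, value in valuation.items():
        reduced = config[:x_index] + config[x_index + 1:]
        result[reduced] = max(result.get(reduced, 0.0), value)
    return result

def combine(p1, p2):
    """Pointwise product, renormalized so that K (the maximum) becomes 1."""
    raw = {c: p1[c] * p2[c] for c in p1}
    k = max(raw.values())
    if k == 0:
        return raw                      # degenerate case: K = 0
    return {c: v / k for c, v in raw.items()}

# The marginalization example of slides 72-73: projecting out Objective 2
joint = {("t", "t"): 1, ("t", "f"): 0.5, ("f", "t"): 0.3, ("f", "f"): 0.1}
assert marginalize_out(joint, 1) == {("t",): 1, ("f",): 0.3}   # Objective 1
assert marginalize_out(joint, 0) == {("t",): 1, ("f",): 0.5}   # Objective 2
```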

72 Possibility Theory Example: Marginalization
Objective 1   Objective 2   Possibility
true          true          1
true          false         .5
false         true          .3
false         false         .1

73 Possibility Theory Example: Marginalization
Objective 1   Possibility
true          1
false         .3

Objective 2   Possibility
true          1
false         .5

74 Possibility Theory So for Objective 1, the degree of possibility for true is 0.7 and the degree of impossibility is 0. Similarly, for Objective 2, the degree of possibility for true is 0.5 and the degree of impossibility is 0
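This arithmetic can be checked mechanically, per the definitions on slide 70 (a sketch; the function name and rounding are mine):

```python
def degrees(marginal):
    """Degree of possibility and impossibility for 'true':
    possibility = 1 - pi(~a), impossibility = 1 - pi(a)."""
    return round(1 - marginal["f"], 2), round(1 - marginal["t"], 2)

assert degrees({"t": 1, "f": 0.3}) == (0.7, 0)   # Objective 1
assert degrees({"t": 1, "f": 0.5}) == (0.5, 0)   # Objective 2
```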

75 Possibility Theory Example: Combination
Objective 1   Possibility
true          1
false         .4

Objective 1   Possibility
true          1
false         .6

76 Possibility Theory Example: Combination
Objective 1   Possibility
true          1
false         .24

77 Possibility Theory Example: Combination
Objective 1   Possibility
true          1
false         .4

Objective 1   Possibility
true          .6
false         1

78 Possibility Theory Example: Combination
Objective 1   Possibility
true          .6
false         .4

Objective 1   Possibility
true          1
false         .67
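Both combination examples (slides 75 to 78) follow from the product-then-normalize rule; a quick check, with the two-place rounding added by me:

```python
def combine(p1, p2):
    """Pointwise product, divided by the normalization constant K."""
    raw = {c: p1[c] * p2[c] for c in p1}
    k = max(raw.values())                # normalization constant K
    return {c: round(v / k, 2) for c, v in raw.items()}

# Slides 75-76: K = 1, so the pointwise product stands
assert combine({"t": 1, "f": 0.4}, {"t": 1, "f": 0.6}) == {"t": 1, "f": 0.24}
# Slides 77-78: K = 0.6, so true rescales to 1 and false to 0.4/0.6 = 0.67
assert combine({"t": 1, "f": 0.4}, {"t": 0.6, "f": 1}) == {"t": 1, "f": 0.67}
```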

79 Possibility Theory We can represent complete ignorance using possibilities as follows:
Objective 1   Objective 2   Possibility
true          true          1
true          false         1
false         true          1
false         false         1

80 Possibility Theory We can define logical relationships such as AND nodes as follows: A = B & C:
A    B    C    Possibility
a    b    c    1
a    b    ~c   0
a    ~b   c    0
a    ~b   ~c   0
~a   b    c    0
~a   b    ~c   1
~a   ~b   c    1
~a   ~b   ~c   1
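One way to read the table (a sketch; the representation is mine): the AND node assigns possibility 1 exactly to the configurations consistent with A = B & C, and 0 otherwise:

```python
import itertools

# Possibility valuation for A = B & C over all configurations of (A, B, C)
and_node = {(a, b, c): 1 if a == (b and c) else 0
            for a, b, c in itertools.product([True, False], repeat=3)}

assert and_node[(True, True, True)] == 1     # a  b  c
assert and_node[(True, False, True)] == 0    # a ~b  c
assert and_node[(False, True, False)] == 1   # ~a b ~c
```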

81 Possibility Theory Discounted AND nodes as discussed for probabilities and belief functions CANNOT be defined. Dubois and Prade argue that statistical sampling is contrary to the non-probabilistic nature of possibility theory; however, it seems that it could be incorporated using normalized maximum likelihood functions

82 Possibility Theory Joint possibilities generally cannot be uniquely determined from marginals, as for probabilities and belief functions: three distinct joint possibility tables P1, P2, and P3 over X and Y (with rows x y, x ~y, ~x y, ~x ~y) can have the same marginals for X and Y. However, when combined with an AND node, all three produce the same results!

83 Possibility Theory When multiple nodes are combined in an AND node, the effect is that the degree of possibility for the conjunction is equal to the degree of possibility for the least possible conjunct. This is significantly different from probabilities and belief functions, and is very relevant, for example, to auditing

84 Spohn's Epistemic Calculus Spohn introduced his theory of epistemic states in order to represent plain human beliefs in a non-probabilistic way easily amenable to revision. Initially they were based on functions mapping into the ordinals; later he changed this to the natural numbers

85 Spohn's Epistemic Calculus For technical reasons, we will define disbeliefs as functions mapping to the natural numbers extended by adding a representation of infinity, ∞

86 Spohn's Epistemic Calculus A disbelief function is a function δ: 2^Ω → N ∪ {∞} such that: ∃x ∈ Ω: δ({x}) = 0, and ∀a ∈ 2^Ω: δ(a) = min{δ({x}) : x ∈ a}
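As with possibilities, the definition can be sketched directly (names and representation mine; the empty set defaults to infinite disbelief):

```python
import math

def make_disbelief(singletons):
    """Build a disbelief function from singleton values delta({x}).

    Requires the normalization condition: some element has disbelief 0.
    """
    assert min(singletons.values()) == 0, "some singleton must have disbelief 0"

    def delta(subset):
        # Second condition: delta(a) = min{ delta({x}) : x in a }
        return min((singletons[x] for x in subset), default=math.inf)

    return delta

delta = make_disbelief({"t": 0, "f": 7})
assert delta({"t", "f"}) == 0   # the whole frame is not disbelieved at all
assert delta({"f"}) == 7
```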

87 Spohn's Epistemic Calculus By virtue of this second condition, disbelief functions are completely determined by their values for singleton elements. Intuitively, the degree of disbelief for a subset a is δ(a), and the degree of belief is δ(~a)

88 Spohn's Epistemic Calculus
Marginalization: ∀y ∈ Ω_{a−{X}}: δ^{↓a−{X}}(y) = min{δ(y, x) : x ∈ Ω_X}
Combination: (δ₁ ⊕ δ₂)(x) = δ₁(x^{↓a}) + δ₂(x^{↓b}) − K, where K = min{δ₁(x^{↓a}) + δ₂(x^{↓b}) : x ∈ Ω_{a∪b}}
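The dict-based sketch used for possibilities carries over, with min replacing max and a subtractive normalization (again my own representation; combination is shown on a shared domain):

```python
def marginalize_out(valuation, x_index):
    """Min-marginalize out the variable at tuple position x_index."""
    result = {}
    for config, value in valuation.items():
        reduced = config[:x_index] + config[x_index + 1:]
        result[reduced] = min(result.get(reduced, value), value)
    return result

def combine(d1, d2):
    """Pointwise sum, shifted so the minimum disbelief K becomes 0."""
    raw = {c: d1[c] + d2[c] for c in d1}
    k = min(raw.values())
    return {c: v - k for c, v in raw.items()}

# The marginalization example of slides 89-90: projecting out each objective
joint = {("t", "t"): 0, ("t", "f"): 5, ("f", "t"): 7, ("f", "f"): 9}
assert marginalize_out(joint, 1) == {("t",): 0, ("f",): 7}   # Objective 1
assert marginalize_out(joint, 0) == {("t",): 0, ("f",): 5}   # Objective 2
```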

89 Spohn's Epistemic Calculus Example: Marginalization
Objective 1   Objective 2   Disbelief
true          true          0
true          false         5
false         true          7
false         false         9

90 Spohn's Epistemic Calculus Example: Marginalization
Objective 1   Disbelief
true          0
false         7

Objective 2   Disbelief
true          0
false         5

91 Spohn's Epistemic Calculus Example: Combination
Objective 1   Disbelief
true          0
false         6

Objective 1   Disbelief
true          0
false         4

92 Spohn's Epistemic Calculus Example: Combination
Objective 1   Disbelief
true          0
false         10

93 Spohn's Epistemic Calculus Example: Combination
Objective 1   Disbelief
true          0
false         6

Objective 1   Disbelief
true          4
false         0

94 Spohn's Epistemic Calculus Example: Combination
Objective 1   Disbelief
true          4
false         6

Objective 1   Disbelief
true          0
false         2
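Both combination examples (slides 91 to 94) can be checked against the sum-then-shift rule (a sketch, with names of my choosing):

```python
def combine(d1, d2):
    """Pointwise sum of disbeliefs, shifted down by the minimum K."""
    raw = {c: d1[c] + d2[c] for c in d1}
    k = min(raw.values())               # normalization constant K
    return {c: v - k for c, v in raw.items()}

# Slides 91-92: K = 0, so the pointwise sums stand
assert combine({"t": 0, "f": 6}, {"t": 0, "f": 4}) == {"t": 0, "f": 10}
# Slides 93-94: K = 4, shifting the sums (4, 6) down to (0, 2)
assert combine({"t": 0, "f": 6}, {"t": 4, "f": 0}) == {"t": 0, "f": 2}
```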

95 Spohn's Epistemic Calculus We can represent complete ignorance using disbeliefs as follows:
Objective 1   Objective 2   Disbelief
true          true          0
true          false         0
false         true          0
false         false         0

96 Spohn's Epistemic Calculus We can define logical relationships such as AND nodes as follows: A = B & C:
A    B    C    Disbelief
a    b    c    0
a    b    ~c   ∞
a    ~b   c    ∞
a    ~b   ~c   ∞
~a   b    c    ∞
~a   b    ~c   0
~a   ~b   c    0
~a   ~b   ~c   0
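Under the reading that logically impossible configurations carry infinite disbelief (my reconstruction of the table, using the extended naturals of slide 85), the AND node can be sketched as:

```python
import itertools, math

# Disbelief valuation for A = B & C: 0 where consistent, infinity where not
and_node = {(a, b, c): 0 if a == (b and c) else math.inf
            for a, b, c in itertools.product([True, False], repeat=3)}

assert and_node[(True, True, True)] == 0           # a b c
assert math.isinf(and_node[(True, True, False)])   # a b ~c is impossible
```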

97 Spohn's Epistemic Calculus Discounted AND nodes as discussed for probabilities and belief functions CANNOT be defined. Statistical sampling is contrary to the intended ordinal nature of epistemic (dis)beliefs

98 Spohn's Epistemic Calculus Joint disbeliefs generally cannot be uniquely determined from marginals, as for probabilities and belief functions: three distinct joint disbelief tables P1, P2, and P3 over X and Y (with rows x y, x ~y, ~x y, ~x ~y) can have the same marginals for X and Y. However, when combined with an AND node, all three produce the same results!

99 Spohn's Epistemic Calculus When multiple nodes are combined in an AND node, the effect is that the degree of belief for the conjunction is equal to the degree of belief for the least believed conjunct. This is significantly different from probabilities and belief functions, and is very relevant, for example, to auditing

100 Spohn's Epistemic Calculus In addition, Spohn's system offers the prospect of elicitation of ordinal rankings rather than highly sensitive real values

101 Homework 5 Homework 5 is a set of simple exercises using your knowledge of valuation networks. The important thing is not the answers, but your workings and whether you understand the manipulations you are carrying out; so by all means get help, but do the work yourself. On this occasion, the actual answers are all in the readings, so it is the workings that count! Post your solutions as a WORD document on Blackboard as usual. You will also need HUGIN


More information

Dynamic Importance Sampling in Bayesian Networks Based on Probability Trees

Dynamic Importance Sampling in Bayesian Networks Based on Probability Trees Dynamic Importance Sampling in Bayesian Networks Based on Probability Trees Serafín Moral Dpt. Computer Science and Artificial Intelligence University of Granada Avda. Andalucía 38, 807 Granada, Spain

More information

Improving Search-Based Inference in Bayesian Networks

Improving Search-Based Inference in Bayesian Networks From: Proceedings of the Eleventh International FLAIRS Conference. Copyright 1998, AAAI (www.aaai.org). All rights reserved. Improving Search-Based Inference in Bayesian Networks E. Castillo, J. M. Guti~rrez

More information

COMPSCI 276 Fall 2007

COMPSCI 276 Fall 2007 Exact Inference lgorithms for Probabilistic Reasoning; OMPSI 276 Fall 2007 1 elief Updating Smoking lung ancer ronchitis X-ray Dyspnoea P lung cancer=yes smoking=no, dyspnoea=yes =? 2 Probabilistic Inference

More information

The Limitation of Bayesianism

The Limitation of Bayesianism The Limitation of Bayesianism Pei Wang Department of Computer and Information Sciences Temple University, Philadelphia, PA 19122 pei.wang@temple.edu Abstract In the current discussion about the capacity

More information

Intelligent Systems: Reasoning and Recognition. Reasoning with Bayesian Networks

Intelligent Systems: Reasoning and Recognition. Reasoning with Bayesian Networks Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2016/2017 Lesson 13 24 march 2017 Reasoning with Bayesian Networks Naïve Bayesian Systems...2 Example

More information

Calculus at Rutgers. Course descriptions

Calculus at Rutgers. Course descriptions Calculus at Rutgers This edition of Jon Rogawski s text, Calculus Early Transcendentals, is intended for students to use in the three-semester calculus sequence Math 151/152/251 beginning with Math 151

More information

5. Sum-product algorithm

5. Sum-product algorithm Sum-product algorithm 5-1 5. Sum-product algorithm Elimination algorithm Sum-product algorithm on a line Sum-product algorithm on a tree Sum-product algorithm 5-2 Inference tasks on graphical models consider

More information

Handling imprecise and uncertain class labels in classification and clustering

Handling imprecise and uncertain class labels in classification and clustering Handling imprecise and uncertain class labels in classification and clustering Thierry Denœux 1 1 Université de Technologie de Compiègne HEUDIASYC (UMR CNRS 6599) COST Action IC 0702 Working group C, Mallorca,

More information

Inference as Optimization

Inference as Optimization Inference as Optimization Sargur Srihari srihari@cedar.buffalo.edu 1 Topics in Inference as Optimization Overview Exact Inference revisited The Energy Functional Optimizing the Energy Functional 2 Exact

More information

Variable Elimination: Algorithm

Variable Elimination: Algorithm Variable Elimination: Algorithm Sargur srihari@cedar.buffalo.edu 1 Topics 1. Types of Inference Algorithms 2. Variable Elimination: the Basic ideas 3. Variable Elimination Sum-Product VE Algorithm Sum-Product

More information

Mixtures of Polynomials in Hybrid Bayesian Networks with Deterministic Variables

Mixtures of Polynomials in Hybrid Bayesian Networks with Deterministic Variables Appeared in: J. Vejnarova and T. Kroupa (eds.), Proceedings of the 8th Workshop on Uncertainty Processing (WUPES'09), Sept. 2009, 202--212, University of Economics, Prague. Mixtures of Polynomials in Hybrid

More information

CHAPTER-17. Decision Tree Induction

CHAPTER-17. Decision Tree Induction CHAPTER-17 Decision Tree Induction 17.1 Introduction 17.2 Attribute selection measure 17.3 Tree Pruning 17.4 Extracting Classification Rules from Decision Trees 17.5 Bayesian Classification 17.6 Bayes

More information

A Graphical Model for Simultaneous Partitioning and Labeling

A Graphical Model for Simultaneous Partitioning and Labeling A Graphical Model for Simultaneous Partitioning and Labeling Philip J Cowans Cavendish Laboratory, University of Cambridge, Cambridge, CB3 0HE, United Kingdom pjc51@camacuk Martin Szummer Microsoft Research

More information

Inference in Bayesian Networks

Inference in Bayesian Networks Andrea Passerini passerini@disi.unitn.it Machine Learning Inference in graphical models Description Assume we have evidence e on the state of a subset of variables E in the model (i.e. Bayesian Network)

More information

Chapter 8 Cluster Graph & Belief Propagation. Probabilistic Graphical Models 2016 Fall

Chapter 8 Cluster Graph & Belief Propagation. Probabilistic Graphical Models 2016 Fall Chapter 8 Cluster Graph & elief ropagation robabilistic Graphical Models 2016 Fall Outlines Variable Elimination 消元法 imple case: linear chain ayesian networks VE in complex graphs Inferences in HMMs and

More information

Directed Graphical Models

Directed Graphical Models CS 2750: Machine Learning Directed Graphical Models Prof. Adriana Kovashka University of Pittsburgh March 28, 2017 Graphical Models If no assumption of independence is made, must estimate an exponential

More information

Learning in Bayesian Networks

Learning in Bayesian Networks Learning in Bayesian Networks Florian Markowetz Max-Planck-Institute for Molecular Genetics Computational Molecular Biology Berlin Berlin: 20.06.2002 1 Overview 1. Bayesian Networks Stochastic Networks

More information

Uncertainty and Bayesian Networks

Uncertainty and Bayesian Networks Uncertainty and Bayesian Networks Tutorial 3 Tutorial 3 1 Outline Uncertainty Probability Syntax and Semantics for Uncertainty Inference Independence and Bayes Rule Syntax and Semantics for Bayesian Networks

More information

Graphical models. Sunita Sarawagi IIT Bombay

Graphical models. Sunita Sarawagi IIT Bombay 1 Graphical models Sunita Sarawagi IIT Bombay http://www.cse.iitb.ac.in/~sunita 2 Probabilistic modeling Given: several variables: x 1,... x n, n is large. Task: build a joint distribution function Pr(x

More information

Operations for Inference in Continuous Bayesian Networks with Linear Deterministic Variables

Operations for Inference in Continuous Bayesian Networks with Linear Deterministic Variables Appeared in: International Journal of Approximate Reasoning, Vol. 42, Nos. 1--2, 2006, pp. 21--36. Operations for Inference in Continuous Bayesian Networks with Linear Deterministic Variables Barry R.

More information

Using Belief Functions in Software Agents to Test the Strength of Application Controls: A Conceptual Framework

Using Belief Functions in Software Agents to Test the Strength of Application Controls: A Conceptual Framework Using Belief Functions in Software Agents to Test the Strength of Application Controls: A Conceptual Framework Robert Nehmer Associate Professor Oakland University Rajendra P. Srivastava Ernst and Young

More information

Modeling and reasoning with uncertainty

Modeling and reasoning with uncertainty CS 2710 Foundations of AI Lecture 18 Modeling and reasoning with uncertainty Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square KB systems. Medical example. We want to build a KB system for the diagnosis

More information

Bayesian belief networks

Bayesian belief networks CS 2001 Lecture 1 Bayesian belief networks Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square 4-8845 Milos research interests Artificial Intelligence Planning, reasoning and optimization in the presence

More information

In Defense of Jeffrey Conditionalization

In Defense of Jeffrey Conditionalization In Defense of Jeffrey Conditionalization Franz Huber Department of Philosophy University of Toronto Please do not cite! December 31, 2013 Contents 1 Introduction 2 2 Weisberg s Paradox 3 3 Jeffrey Conditionalization

More information

Conditional Independence and Markov Properties

Conditional Independence and Markov Properties Conditional Independence and Markov Properties Lecture 1 Saint Flour Summerschool, July 5, 2006 Steffen L. Lauritzen, University of Oxford Overview of lectures 1. Conditional independence and Markov properties

More information

Lecture 8: Bayesian Networks

Lecture 8: Bayesian Networks Lecture 8: Bayesian Networks Bayesian Networks Inference in Bayesian Networks COMP-652 and ECSE 608, Lecture 8 - January 31, 2017 1 Bayes nets P(E) E=1 E=0 0.005 0.995 E B P(B) B=1 B=0 0.01 0.99 E=0 E=1

More information