Bayesian Networks. Course: CS40022 Instructor: Dr. Pallab Dasgupta


1 Bayesian Networks. Course: CS40022, Instructor: Dr. Pallab Dasgupta, Department of Computer Science & Engineering, Indian Institute of Technology Kharagpur

2 Example
A burglar alarm at home is fairly reliable at detecting a burglary, but also responds at times to minor earthquakes. Two neighbors, on hearing the alarm, call the police. John always calls when he hears the alarm, but sometimes confuses the telephone ringing with the alarm and calls then, too. Mary likes loud music and sometimes misses the alarm altogether.

3 Belief Network Example
Network structure: Burglary → Alarm ← Earthquake; Alarm → JohnCalls; Alarm → MaryCalls.
P(B) = 0.001, P(E) = 0.002
P(A | B, E): B=T, E=T: 0.95; B=T, E=F: 0.95; B=F, E=T: 0.29; B=F, E=F: 0.001
P(J | A): A=T: 0.90; A=F: 0.05
P(M | A): A=T: 0.70; A=F: 0.01

4 The joint probability distribution
A generic entry in the joint probability distribution P(x_1, ..., x_n) is given by:
P(x_1, ..., x_n) = ∏_{i=1}^{n} P(x_i | Parents(X_i))

5 The joint probability distribution
Probability of the event that the alarm has sounded but neither a burglary nor an earthquake has occurred, and both Mary and John call:
P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = P(J | A) P(M | A) P(A | ¬B, ¬E) P(¬B) P(¬E) = 0.90 × 0.70 × 0.001 × 0.999 × 0.998 ≈ 0.00062
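
A minimal Python sketch of this product, assuming the standard textbook CPT values shown on slide 3 (the transcription preserves only a few of them):

```python
# Minimal sketch of the joint-distribution product (slide 4), assuming the
# standard textbook CPT values shown on slide 3.

P_B = {True: 0.001, False: 0.999}                   # P(Burglary)
P_E = {True: 0.002, False: 0.998}                   # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.95,     # P(Alarm=T | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                     # P(JohnCalls=T | A)
P_M = {True: 0.70, False: 0.01}                     # P(MaryCalls=T | A)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) as a product of CPT entries."""
    p_a = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p_j = P_J[a] if j else 1 - P_J[a]
    p_m = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * p_a * p_j * p_m

# P(J and M and A and not B and not E), about 0.00062
print(joint(b=False, e=False, a=True, j=True, m=True))
```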

6 Conditional independence
By the chain rule,
P(x_1, ..., x_n) = P(x_n | x_{n-1}, ..., x_1) P(x_{n-1} | x_{n-2}, ..., x_1) ... P(x_2 | x_1) P(x_1) = ∏_{i=1}^{n} P(x_i | x_{i-1}, ..., x_1)
The belief network represents the conditional independence:
P(x_i | x_{i-1}, ..., x_1) = P(x_i | Parents(X_i))

7 Incremental Network Construction
1. Choose the set of relevant variables X_i that describe the domain.
2. Choose an ordering for the variables (a very important step).
3. While there are variables left:
   a. Pick a variable X_i and add a node for it.
   b. Set Parents(X_i) to some minimal set of existing nodes such that the conditional independence property is satisfied.
   c. Define the conditional probability table for X_i.
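
A minimal sketch of this construction loop, assuming a caller-supplied conditional-independence test; the toy oracle below simply encodes the known burglary-network structure so the loop can run end to end, and is not part of the slides.

```python
# Minimal sketch of incremental network construction (slide 7). The
# independence test is a stand-in: a real system would query data or an expert.
from itertools import combinations

TRUE_PARENTS = {              # hypothetical oracle encoding the known structure
    "Burglary": set(),
    "Earthquake": set(),
    "Alarm": {"Burglary", "Earthquake"},
    "JohnCalls": {"Alarm"},
    "MaryCalls": {"Alarm"},
}

def independent_given(var, predecessors, candidate_parents):
    """Toy test: var is independent of its remaining predecessors given the
    candidate parent set iff that set contains all of var's true parents."""
    return TRUE_PARENTS[var] <= set(candidate_parents)

def build_network(ordering):
    network = {}                                   # variable -> parent set
    for var in ordering:                           # step 3a: add a node
        existing = list(network)
        parents = None
        for k in range(len(existing) + 1):         # step 3b: smallest set first
            for cand in combinations(existing, k):
                if independent_given(var, existing, cand):
                    parents = set(cand)
                    break
            if parents is not None:
                break
        network[var] = parents
        # step 3c would fill in the conditional probability table for var here
    return network

print(build_network(["Burglary", "Earthquake", "Alarm", "JohnCalls", "MaryCalls"]))
```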

8 Conditional Independence Relations
If every undirected path from a node in X to a node in Y is d-separated by a given set of evidence nodes E, then X and Y are conditionally independent given E. A set of nodes E d-separates two sets of nodes X and Y if every undirected path from a node in X to a node in Y is blocked given E.

9 Conditional Independence Relations
A path is blocked given a set of nodes E if there is a node Z on the path for which one of three conditions holds:
1. Z is in E and Z has one arrow on the path leading in and one arrow out.
2. Z is in E and Z has both path arrows leading out.
3. Neither Z nor any descendant of Z is in E, and both path arrows lead in to Z.
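
A minimal sketch of these three blocking conditions, assuming a path is given as a list of nodes and the network as a set of directed edges; the helper names and the burglary-network example below are illustrative choices, not part of the slides.

```python
# Minimal sketch of the path-blocking test of slide 9.

def is_blocked(path, edges, E, descendants):
    """Return True if the path is blocked by the evidence set E."""
    for i in range(1, len(path) - 1):
        prev, z, nxt = path[i - 1], path[i], path[i + 1]
        into_z = ((prev, z) in edges, (nxt, z) in edges)   # arrow directions at Z
        if z in E and into_z in {(True, False), (False, True)}:
            return True                   # 1. chain: one arrow in, one arrow out
        if z in E and into_z == (False, False):
            return True                   # 2. fork: both arrows lead out of Z
        if into_z == (True, True) and z not in E and not (descendants(z) & E):
            return True                   # 3. collider: both arrows lead into Z,
                                          #    and neither Z nor a descendant is in E
    return False

# Burglary network: the path Burglary - Alarm - Earthquake is blocked given no
# evidence (Alarm is a collider), but unblocked once Alarm is observed.
edges = {("Burglary", "Alarm"), ("Earthquake", "Alarm"),
         ("Alarm", "JohnCalls"), ("Alarm", "MaryCalls")}
desc = lambda n: {c for p, c in edges if p == n} | \
                 {d for p, c in edges if p == n for d in desc(c)}
path = ["Burglary", "Alarm", "Earthquake"]
print(is_blocked(path, edges, E=set(), descendants=desc))        # True
print(is_blocked(path, edges, E={"Alarm"}, descendants=desc))    # False
```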

10 Conditional Independence in belief networks
(Figure: car network with nodes Battery, Radio, Ignition, Petrol, and Starts.)
Whether there is petrol and whether the radio plays are independent given evidence about whether the ignition takes place. Petrol and Radio are independent if it is known whether the battery works.

11 Conditional Independence in belief networks
(Figure: the same car network.)
Petrol and Radio are independent given no evidence at all. But they are dependent given evidence about whether the car starts. If the car does not start, then the radio playing is increased evidence that we are out of petrol.

12 Inferences using belief networks
Diagnostic inferences (from effects to causes): Given that JohnCalls, infer that P(Burglary | JohnCalls) = 0.016.
Causal inferences (from causes to effects): Given Burglary, infer that P(JohnCalls | Burglary) = 0.86 and P(MaryCalls | Burglary) = 0.67.

13 Inferences using belief networks
Intercausal inferences (between causes of a common effect): Given Alarm, we have P(Burglary | Alarm) = 0.376. If we add evidence that Earthquake is true, then P(Burglary | Alarm ∧ Earthquake) goes down to 0.003.
Mixed inferences: Setting the effect JohnCalls to true and the cause Earthquake to false gives P(Alarm | JohnCalls ∧ ¬Earthquake) = 0.03.
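
These four patterns can be checked by brute-force inference by enumeration over the joint distribution. The sketch below assumes the standard textbook CPT values of slide 3 and reproduces the quoted figures up to rounding.

```python
# Minimal sketch of inference by enumeration for the query patterns of
# slides 12-13, using the assumed textbook CPT values.
from itertools import product

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.95,   # P(Alarm=T | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                   # P(JohnCalls=T | A)
P_M = {True: 0.70, False: 0.01}                   # P(MaryCalls=T | A)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) as the product of slide 4."""
    p_a = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p_j = P_J[a] if j else 1 - P_J[a]
    p_m = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * p_a * p_j * p_m

def query(target, evidence):
    """P(target = True | evidence) by summing the joint over all worlds."""
    names = ["B", "E", "A", "J", "M"]
    num = den = 0.0
    for values in product([True, False], repeat=5):
        world = dict(zip(names, values))
        if any(world[k] != v for k, v in evidence.items()):
            continue
        p = joint(*values)
        den += p
        if world[target]:
            num += p
    return num / den

print(query("B", {"J": True}))              # diagnostic:  P(B | J)      ~ 0.016
print(query("J", {"B": True}))              # causal:      P(J | B)      ~ 0.86
print(query("B", {"A": True}))              # intercausal: P(B | A)      ~ 0.376
print(query("A", {"J": True, "E": False}))  # mixed:       P(A | J, ~E)  ~ 0.03
```

Enumeration is exponential in the number of variables, which is what motivates the polytree algorithm developed on the following slides.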

14 The four patterns
(Figure: four schematic networks showing the relative positions of the query Q and the evidence E for the Diagnostic, Causal, InterCausal, and Mixed patterns.)

15 Answering queries
We consider cases where the belief network is a polytree: there is at most one undirected path between any two nodes.

16 Answering queries
(Figure: the generic polytree fragment around the query node X, with parents U_1, ..., U_m, children Y_1, ..., Y_n, the children's other parents Z_1j, ..., Z_nj, the causal support E_X^+ above X, and the evidential support E_X^- below X.)

17 Answering queries
U = U_1, ..., U_m are the parents of node X. Y = Y_1, ..., Y_n are the children of node X. X is the query variable. E is the set of evidence variables. The aim is to compute P(X | E).

18 Definitions
E_X^+ is the causal support for X: the evidence variables "above" X that are connected to X through its parents.
E_X^- is the evidential support for X: the evidence variables "below" X that are connected to X through its children.
E_{Ui\X} refers to all the evidence connected to node U_i except via the path from X.
E_{Yi\X} refers to all the evidence connected to node Y_i other than through its parent X; these pieces make up E_X^-.

19 The computation of P(X | E)
P(X | E) = P(X | E_X^+, E_X^-) = P(E_X^- | X, E_X^+) P(X | E_X^+) / P(E_X^- | E_X^+)
Since X d-separates E_X^+ from E_X^-, we can use conditional independence to simplify the first term in the numerator. We can treat the denominator as a constant:
P(X | E) = α P(E_X^- | X) P(X | E_X^+)

20 The computation of P(X | E_X^+)
We consider all possible configurations of the parents of X and how likely they are given E_X^+. Let U be the vector of parents U_1, ..., U_m, and let u be an assignment of values to them.
P(X | E_X^+) = Σ_u P(X | u, E_X^+) P(u | E_X^+)

21 The computation of P(X | E_X^+)
P(X | E_X^+) = Σ_u P(X | u, E_X^+) P(u | E_X^+)
U d-separates X from E_X^+, so the first term simplifies to P(X | u). We can simplify the second term by noting that E_X^+ d-separates each U_i from the others, and that the probability of a conjunction of independent variables is equal to the product of their individual probabilities:
P(X | E_X^+) = Σ_u P(X | u) ∏_i P(u_i | E_X^+)

22 The computation of P(X | E_X^+)
P(X | E_X^+) = Σ_u P(X | u) ∏_i P(u_i | E_X^+)
The last term can be simplified by partitioning E_X^+ into E_{U1\X}, ..., E_{Um\X} and noting that E_{Ui\X} d-separates U_i from all the other evidence in E_X^+:
P(X | E_X^+) = Σ_u P(X | u) ∏_i P(u_i | E_{Ui\X})
P(X | u) is a lookup in the conditional probability table of X; P(u_i | E_{Ui\X}) is a recursive (smaller) sub-problem.
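
As an illustration, here is a minimal sketch of this formula evaluated for X = Alarm in the burglary example, with the CPT values assumed to be the standard textbook ones; the parent messages P(u_i | E_{Ui\X}) are supplied directly rather than computed by the recursive call they stand for.

```python
# Minimal sketch of the causal-support formula of slide 22:
#   P(X | E_X^+) = sum_u P(X | u) * prod_i P(u_i | E_{Ui\X})
from itertools import product

P_A = {(True, True): 0.95, (True, False): 0.95,    # assumed P(Alarm=T | B, E)
       (False, True): 0.29, (False, False): 0.001}

def causal_support(cpt, parent_messages):
    """P(X=True | E_X^+) given one message P(U_i=True | E_{Ui\X}) per parent."""
    total = 0.0
    for u in product([True, False], repeat=len(parent_messages)):
        weight = 1.0
        for value, msg in zip(u, parent_messages):
            weight *= msg if value else 1 - msg    # prod_i P(u_i | E_{Ui\X})
        total += cpt[u] * weight                   # P(X | u)
    return total

# No evidence above Alarm: the messages are just the priors P(B), P(E).
print(causal_support(P_A, [0.001, 0.002]))   # prior P(Alarm), about 0.0025
# Burglary observed true: its message becomes 1.0.
print(causal_support(P_A, [1.0, 0.002]))     # P(Alarm | Burglary) = 0.95
```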

23 The computation of P(E_X^- | X)
Let Z_i be the parents of Y_i other than X, and let z_i be an assignment of values to those parents. The evidence in each Y_i box is conditionally independent of the others given X:
P(E_X^- | X) = ∏_i P(E_{Yi\X} | X)

24 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i P(E_{Yi\X} | X)
Averaging over Y_i and Z_i yields:
P(E_X^- | X) = ∏_i Σ_{yi} Σ_{zi} P(E_{Yi\X} | X, y_i, z_i) P(y_i, z_i | X)

25 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i Σ_{yi} Σ_{zi} P(E_{Yi\X} | X, y_i, z_i) P(y_i, z_i | X)
Breaking E_{Yi\X} into the two independent components E_{Yi}^- and E_{Zi\Yi}:
P(E_X^- | X) = ∏_i Σ_{yi} Σ_{zi} P(E_{Yi}^- | X, y_i, z_i) P(E_{Zi\Yi} | X, y_i, z_i) P(y_i, z_i | X)

26 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i Σ_{yi} Σ_{zi} P(E_{Yi}^- | X, y_i, z_i) P(E_{Zi\Yi} | X, y_i, z_i) P(y_i, z_i | X)
E_{Yi}^- is independent of X and z_i given y_i, and E_{Zi\Yi} is independent of X and y_i given z_i:
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} P(E_{Zi\Yi} | z_i) P(y_i, z_i | X)

27 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} P(E_{Zi\Yi} | z_i) P(y_i, z_i | X)
Apply Bayes' rule to P(E_{Zi\Yi} | z_i):
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} ( P(z_i | E_{Zi\Yi}) P(E_{Zi\Yi}) / P(z_i) ) P(y_i, z_i | X)

28 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} ( P(z_i | E_{Zi\Yi}) P(E_{Zi\Yi}) / P(z_i) ) P(y_i, z_i | X)
Rewriting the conjunction of y_i and z_i as P(y_i, z_i | X) = P(y_i | X, z_i) P(z_i | X):
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} ( P(z_i | E_{Zi\Yi}) P(E_{Zi\Yi}) / P(z_i) ) P(y_i | X, z_i) P(z_i | X)

29 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} ( P(z_i | E_{Zi\Yi}) P(E_{Zi\Yi}) / P(z_i) ) P(y_i | X, z_i) P(z_i | X)
P(z_i | X) = P(z_i) because Z_i and X are d-separated. Also, P(E_{Zi\Yi}) is a constant β_i:
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} β_i P(z_i | E_{Zi\Yi}) P(y_i | X, z_i)

30 The computation of P(E_X^- | X)
P(E_X^- | X) = ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} β_i P(z_i | E_{Zi\Yi}) P(y_i | X, z_i)
The parents of Y_i (the Z_ij) are independent of each other. We also combine the β_i into one single β.

31 The computation of P(E_X^- | X)
P(E_X^- | X) = β ∏_i Σ_{yi} P(E_{Yi}^- | y_i) Σ_{zi} P(y_i | X, z_i) ∏_j P(z_ij | E_{Zij\Yi})
P(E_{Yi}^- | y_i) is a recursive instance of P(E_X^- | X); P(y_i | X, z_i) is a conditional probability table entry for Y_i; P(z_ij | E_{Zij\Yi}) is a recursive sub-instance of the P(X | E) calculation.
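
To make the two halves of the derivation concrete, here is a minimal sketch that combines the slide-31 evidential-support term with the causal-support term for the query P(Burglary | JohnCalls) in the burglary example. It assumes the standard textbook CPT values, E_B^+ is empty, and the recursion bottoms out at the observed child, so this only illustrates the message structure rather than giving a full polytree implementation.

```python
# Minimal sketch of P(X | E) = alpha * P(E_X^- | X) * P(X | E_X^+) for
# X = Burglary with evidence JohnCalls = true, using assumed textbook CPTs.

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.95,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}           # P(JohnCalls=T | A)

def p_a_given(b, e, a):
    return P_A[(b, e)] if a else 1 - P_A[(b, e)]

def evidential_support(b):
    """P(E_B^- | B=b): sum over the child Alarm's value and its other parent
    Earthquake, weighting the evidence J=true below Alarm (slide 31)."""
    total = 0.0
    for a in (True, False):
        below_a = P_J[a]                  # P(E_A^- | a) = P(JohnCalls=true | a)
        for e in (True, False):
            total += below_a * p_a_given(b, e, a) * P_E[e]
    return total

# E_B^+ is empty, so the causal-support term is just the prior P(B=b).
unnormalized = {b: evidential_support(b) * P_B[b] for b in (True, False)}
alpha = 1.0 / sum(unnormalized.values())
print(alpha * unnormalized[True])         # P(Burglary | JohnCalls), about 0.016
```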

32 Inference in multiply connected belief networks
Clustering methods: Transform the network into a probabilistically equivalent (but topologically different) polytree by merging offending nodes.
Conditioning methods: Instantiate variables to definite values, and then evaluate a polytree for each possible instantiation.

33 Inference in multiply connected belief networks
Stochastic simulation methods: Use the network to generate a large number of concrete models of the domain that are consistent with the network distribution. They give an approximation of the exact evaluation.
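
A minimal sketch of stochastic simulation by rejection: sample complete worlds in topological order, discard those inconsistent with the evidence, and estimate the query from the rest. The CPT values are again the assumed textbook ones.

```python
# Minimal sketch of stochastic simulation (slide 33), estimating
# P(Burglary | JohnCalls) by rejection sampling.
import random

P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.95,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}
P_M = {True: 0.70, False: 0.01}

def sample_world(rng):
    """Sample (B, E, A, J, M) in topological order, each given its parents."""
    b = rng.random() < P_B[True]
    e = rng.random() < P_E[True]
    a = rng.random() < P_A[(b, e)]
    j = rng.random() < P_J[a]
    m = rng.random() < P_M[a]
    return b, e, a, j, m

rng = random.Random(0)
accepted = hits = 0
for _ in range(1_000_000):
    b, e, a, j, m = sample_world(rng)
    if j:                        # reject samples inconsistent with the evidence
        accepted += 1
        hits += b
print(hits / accepted)           # approximates P(Burglary | JohnCalls), ~0.016
```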

34 Default reasoning
Some conclusions are made by default unless counter-evidence is obtained (non-monotonic reasoning).
Points to ponder:
What is the semantic status of default rules?
What happens when the evidence matches the premises of two default rules with conflicting conclusions?
If a belief is retracted later, how can a system keep track of which conclusions need to be retracted as a consequence?

35 Issues in Rule-based methods for Uncertain Reasoning
Locality: In logical reasoning systems, if we have A ⇒ B, then we can conclude B given evidence A, without worrying about any other rules. In probabilistic systems, we need to consider all available evidence.

36 Issues in Rule-based methods for Uncertain Reasoning
Detachment: Once a logical proof is found for a proposition B, we can use it regardless of how it was derived (it can be detached from its justification). In probabilistic reasoning, the source of the evidence is important for subsequent reasoning.

37 Issues in Rule-based methods for Uncertain Reasoning
Truth functionality: In logic, the truth of complex sentences can be computed from the truth of the components. Probability combination does not work this way, except under strong independence assumptions. A famous example of a truth-functional system for uncertain reasoning is the certainty factors model, developed for the MYCIN medical diagnostic program.

38 Dempster-Shafer Theory
Designed to deal with the distinction between uncertainty and ignorance. We use a belief function Bel(X): the probability that the evidence supports the proposition X. When we do not have any evidence about X, we assign Bel(X) = 0 as well as Bel(¬X) = 0.

39 Dempster-Shafer Theory
For example, if we do not know whether a coin is fair, then: Bel(Heads) = Bel(¬Heads) = 0.
If we are given that the coin is fair with 90% certainty, then: Bel(Heads) = 0.9 × 0.5 = 0.45 and Bel(¬Heads) = 0.9 × 0.5 = 0.45.
Note that we still have a gap of 0.1 that is not accounted for by the evidence.
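
A minimal sketch of the bookkeeping behind this example: a mass assignment over subsets of {Heads, Tails}, with Bel(A) summing the mass committed to subsets of A; the leftover 0.1 on the whole frame is the unaccounted-for gap mentioned on the slide.

```python
# Minimal sketch of a belief function for the coin example of slide 39.

def bel(mass, proposition):
    """Bel(A) = sum of the mass committed to subsets of A."""
    return sum(m for subset, m in mass.items() if set(subset) <= set(proposition))

# Coin known to be fair with 90% certainty; 10% of the mass stays uncommitted.
mass = {
    frozenset({"Heads"}): 0.45,
    frozenset({"Tails"}): 0.45,
    frozenset({"Heads", "Tails"}): 0.10,   # ignorance about the remaining 10%
}

print(bel(mass, {"Heads"}))                # 0.45
print(bel(mass, {"Tails"}))                # 0.45
print(1 - bel(mass, {"Tails"}))            # plausibility of Heads = 0.55
```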

40 Fuzzy Logic
Fuzzy set theory is a means of specifying how well an object satisfies a vague description. Truth is a value between 0 and 1. Uncertainty stems from lack of evidence, but given the dimensions of a man, concluding whether he is fat has no uncertainty involved.

41 Fuzzy Logic
The rules for evaluating the fuzzy truth, T, of a complex sentence are:
T(A ∧ B) = min(T(A), T(B))
T(A ∨ B) = max(T(A), T(B))
T(¬A) = 1 − T(A)
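
A minimal sketch of these three rules; the membership degrees in the usage example are hypothetical.

```python
# Minimal sketch of the fuzzy truth rules of slide 41.

def f_and(a: float, b: float) -> float:
    return min(a, b)          # T(A and B) = min(T(A), T(B))

def f_or(a: float, b: float) -> float:
    return max(a, b)          # T(A or B) = max(T(A), T(B))

def f_not(a: float) -> float:
    return 1.0 - a            # T(not A) = 1 - T(A)

t_tall, t_heavy = 0.7, 0.4    # hypothetical degrees of "tall" and "heavy"
print(f_and(t_tall, t_heavy)) # 0.4
print(f_or(t_tall, t_heavy))  # 0.7
print(f_not(t_tall))          # 0.3 (up to floating-point rounding)
```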
