Uncertainty
Chapter 13
Mausam
(Based on slides by UW-AI faculty)
Knowledge Representation

KR Language             Ontological Commitment             Epistemological Commitment
Propositional Logic     facts                              true / false / unknown
First-Order Logic       facts, objects, relations          true / false / unknown
Temporal Logic          facts, objects, relations, times   true / false / unknown
Probability Theory      facts                              degree of belief
Fuzzy Logic             facts, degree of truth             known interval values

Probabilistic Relational Models combine probability and first-order logic.
Need for Reasoning with Uncertainty

The world is full of uncertainty: chance nodes, sensor noise, actuator error, partial information...
Logic is brittle:
- can't encode exceptions to rules
- can't encode statistical properties of a domain
Computers need to be able to handle uncertainty.
Probability: a new foundation for AI & CS!
Massive amounts of data around today; statistics and CS are both about data.
- Statistics lets us summarize and understand it
- Statistics is the basis for most learning
- Statistics lets data do our work for us

UW CSE AI Faculty 3
Logic vs. Probability

Logic:
- Symbols: Q, R, ...
- Boolean values: T, F
- State of the world: assignment to Q, R, ..., Z

Probability:
- Random variable: Q
- Domain: you specify, e.g. {heads, tails} or [1, 6]
- Atomic event: complete specification of the world, Q ... Z; atomic events are mutually exclusive and exhaustive
- Prior probability (aka unconditional probability): P(Q)
- Joint distribution: probability of every atomic event
Probability Basics

Begin with a set S: the sample space, e.g. the 6 possible rolls of a die.
ω ∈ S is a sample point / possible world / atomic event.
A probability space (or probability model) is a sample space with an assignment P(ω) for every ω ∈ S such that 0 ≤ P(ω) ≤ 1 and Σ_ω P(ω) = 1.
An event A is any subset of S, e.g. A = "die roll < 4".
A random variable is a function from sample points to some range, e.g. the reals or Booleans.
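The definitions above can be sketched directly in code. This is a minimal illustration using the fair-die example from the slide; the variable names (`sample_space`, `P`, `is_even`) are my own, not from the slides.

```python
# A discrete probability space for a fair six-sided die.
from fractions import Fraction

# Sample space S: the six possible rolls.
sample_space = [1, 2, 3, 4, 5, 6]

# Assignment P(omega) for every sample point, with 0 <= P(omega) <= 1
# and the probabilities summing to 1.
P = {omega: Fraction(1, 6) for omega in sample_space}
assert sum(P.values()) == 1

# An event A is any subset of S, e.g. A = "die roll < 4".
A = {omega for omega in sample_space if omega < 4}
prob_A = sum(P[omega] for omega in A)   # 1/2

# A random variable is a function from sample points to some range,
# here the Booleans.
is_even = {omega: omega % 2 == 0 for omega in sample_space}
```

Using exact `Fraction` arithmetic avoids floating-point noise when checking that the probabilities sum to exactly 1.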
Types of Probability Spaces
Axioms of Probability Theory

- All probabilities are between 0 and 1: 0 ≤ P(A) ≤ 1
- P(true) = 1, P(false) = 0
- The probability of a disjunction: P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
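A quick numeric check of the disjunction axiom, a sketch assuming the fair die from the previous slide with events A = "roll < 4" and B = "roll is even" (my choice of events for illustration).

```python
# Verify P(A or B) = P(A) + P(B) - P(A and B) on a fair die.
from fractions import Fraction

P = {omega: Fraction(1, 6) for omega in range(1, 7)}

def prob(event):
    """Probability of an event, i.e. the sum over its sample points."""
    return sum(P[omega] for omega in event)

A = {1, 2, 3}   # roll < 4
B = {2, 4, 6}   # roll is even

lhs = prob(A | B)                      # P(A or B)
rhs = prob(A) + prob(B) - prob(A & B)  # inclusion-exclusion
assert lhs == rhs == Fraction(5, 6)
```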
Prior Probability

The joint distribution can answer any question about the domain.
Conditional Probability

Conditional or posterior probabilities, e.g. P(cavity | toothache) = 0.8, i.e. given that toothache is all I know, there is an 80% chance of a cavity.
Notation for conditional distributions: P(Cavity | Toothache) = 2-element vector of 2-element vectors.
If we know more, e.g. cavity is also given, then we have P(cavity | toothache, cavity) = 1.
New evidence may be irrelevant, allowing simplification: P(cavity | toothache, sunny) = P(cavity | toothache) = 0.8.
This kind of inference, sanctioned by domain knowledge, is crucial.
Conditional Probability: Definition

P(A | B) is the probability of A given B.
Assumes that B is the only information known.
Defined by: P(A | B) = P(A ∧ B) / P(B)
Chain Rule / Product Rule

P(X1, ..., Xn) = P(Xn | X1, ..., Xn-1) P(Xn-1 | X1, ..., Xn-2) ... P(X1)
               = Π_i P(Xi | X1, ..., Xi-1)
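The two-variable case of the chain rule can be checked numerically. This sketch uses a made-up joint over two binary variables (the names `rain`/`traffic` and the numbers are illustrative assumptions, not from the slides).

```python
# Verify P(X1, X2) = P(X1) * P(X2 | X1) on a tiny joint distribution.
P_joint = {
    (True,  True):  0.3,   # (rain, traffic)
    (True,  False): 0.1,
    (False, True):  0.2,
    (False, False): 0.4,
}

# Marginal P(X1 = rain): sum out X2.
P_rain = P_joint[(True, True)] + P_joint[(True, False)]   # 0.4

# Conditional P(X2 = traffic | X1 = rain), from the definition.
P_traffic_given_rain = P_joint[(True, True)] / P_rain     # 0.75

# Chain rule recovers the joint entry.
assert abs(P_rain * P_traffic_given_rain - P_joint[(True, True)]) < 1e-9
```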
Dilemma at the Dentist's

What is the probability of a cavity given a toothache?
What is the probability of a cavity given the probe catches?
Inference by Enumeration

P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.20, or 20%
(sum the probabilities of all atomic events in which toothache holds)
Inference by Enumeration

P(toothache ∨ cavity) = 0.20 + 0.072 + 0.008 = 0.28
Inference by Enumeration
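The enumeration queries from these slides can be sketched over the dentist joint distribution. The six entries used above (0.108, 0.012, 0.016, 0.064, 0.072, 0.008) come from the slides; the last two (0.144, 0.576) are the standard AIMA values that complete the table so it sums to 1, an assumption here.

```python
# Inference by enumeration over the full joint distribution.
JOINT = {
    # (toothache, catch, cavity): probability
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}
NAMES = ("toothache", "catch", "cavity")

def prob(condition):
    """Sum the atomic events where `condition` holds."""
    return sum(p for event, p in JOINT.items()
               if condition(dict(zip(NAMES, event))))

p_toothache = prob(lambda w: w["toothache"])                       # 0.20
p_tooth_or_cavity = prob(lambda w: w["toothache"] or w["cavity"])  # 0.28
p_cavity_given_tooth = (prob(lambda w: w["cavity"] and w["toothache"])
                        / p_toothache)                             # 0.60
```

Note that the conditional query is answered from the same table via the definition P(A | B) = P(A ∧ B) / P(B), which is exactly why the joint "can answer any question".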
Complexity of Enumeration

Worst-case time: O(d^n), where d = max arity and n = number of random variables.
Space complexity is also O(d^n): the size of the joint distribution.
Prohibitive!
Independence

A and B are independent iff:
P(A | B) = P(A)
P(B | A) = P(B)
These two constraints are logically equivalent.
Therefore, if A and B are independent:
P(A ∧ B) = P(A) P(B)
Independence

Complete independence is powerful but rare.
What to do if it doesn't hold?
Conditional Independence

Instead of 7 entries, only need 5.
Conditional Independence II

P(catch | toothache, cavity) = P(catch | cavity)
P(catch | toothache, ¬cavity) = P(catch | ¬cavity)
Why only 5 entries in the table? The joint factors as P(Toothache, Catch, Cavity) = P(Toothache | Cavity) P(Catch | Cavity) P(Cavity), which needs only 2 + 2 + 1 independent numbers.
Power of Conditional Independence

Often, using conditional independence reduces the storage complexity of the joint distribution from exponential to linear!
Conditional independence is the most basic & robust form of knowledge about uncertain environments.
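The storage savings can be demonstrated concretely: five numbers reconstruct all eight entries of the dentist joint. The five values below are derived from the slides' joint table (e.g. P(cavity) = 0.2, P(toothache | cavity) = 0.6); treat them as a sketch under that assumption.

```python
# Factored representation: 5 numbers instead of 2^3 joint entries.
P_cavity = 0.2
P_tooth = {True: 0.6, False: 0.1}    # P(toothache | Cavity = v)
P_catch = {True: 0.9, False: 0.2}    # P(catch | Cavity = v)

def joint(toothache, catch, cavity):
    """Rebuild P(toothache, catch, cavity) from the factored form:
    P(T, C, V) = P(T | V) * P(C | V) * P(V)."""
    p = P_cavity if cavity else 1 - P_cavity
    p *= P_tooth[cavity] if toothache else 1 - P_tooth[cavity]
    p *= P_catch[cavity] if catch else 1 - P_catch[cavity]
    return p

# Matches the joint table from the earlier enumeration slides.
assert abs(joint(True, True, True) - 0.108) < 1e-9
```

With n symptoms all conditionally independent given the disease, storage grows as O(n) rather than O(2^n).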
Bayes Rule

P(A | B) = P(B | A) P(A) / P(B)

posterior = likelihood × prior / evidence

Bayes rules!
Computing Diagnostic Probability from Causal Probability

E.g. let M be meningitis, S be stiff neck:
P(M) = 0.0001, P(S) = 0.1, P(S | M) = 0.8
P(M | S) = P(S | M) P(M) / P(S) = 0.8 × 0.0001 / 0.1 = 0.0008
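The meningitis calculation from this slide, written out:

```python
# Bayes rule: diagnostic probability from causal probability.
P_m = 0.0001       # P(M): prior probability of meningitis
P_s = 0.1          # P(S): probability of a stiff neck
P_s_given_m = 0.8  # P(S | M): causal/likelihood term

P_m_given_s = P_s_given_m * P_m / P_s   # P(M | S) = 0.0008
```

Even though the causal link is strong (0.8), the posterior stays tiny because the prior P(M) is so small; this is why the diagnostic direction is hard to estimate directly.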
Other Forms of Bayes Rule

P(A | B) = P(B | A) P(A) / P(B)     (posterior = likelihood × prior / evidence)
P(A | B) ∝ P(B | A) P(A)            (posterior ∝ likelihood × prior)
Conditional Bayes Rule

P(A | B, C) = P(B | A, C) P(A | C) / P(B | C)
Bayes Rule & Conditional Independence

P(Cavity | toothache ∧ catch) ∝ P(toothache ∧ catch | Cavity) P(Cavity)
                               = P(toothache | Cavity) P(catch | Cavity) P(Cavity)
Simple Example of State Estimation

Suppose a robot obtains a measurement z.
What is P(open | z)?
Causal vs. Diagnostic Reasoning

P(open | z) is diagnostic.
P(z | open) is causal.
Often causal knowledge is easier to obtain (count frequencies!).
Bayes rule allows us to use causal knowledge:
P(open | z) = P(z | open) P(open) / P(z)
Example

P(z | open) = 0.6    P(z | ¬open) = 0.3
P(open) = P(¬open) = 0.5

P(open | z) = P(z | open) P(open) / (P(z | open) P(open) + P(z | ¬open) P(¬open))
            = (0.6 · 0.5) / (0.6 · 0.5 + 0.3 · 0.5) = 2/3 ≈ 0.67

z raises the probability that the door is open.
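The door update above, as a single Bayes-rule computation using the slide's numbers:

```python
# One Bayes update of P(open) from a measurement z.
P_z_open = 0.6        # P(z | open), causal sensor model
P_z_not_open = 0.3    # P(z | not open)
P_open = 0.5          # prior P(open)

# Evidence term P(z), by total probability.
evidence = P_z_open * P_open + P_z_not_open * (1 - P_open)

P_open_given_z = P_z_open * P_open / evidence   # 2/3
```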
Combining Evidence

Suppose our robot obtains another observation z2.
How can we integrate this new information?
More generally, how can we estimate P(x | z1, ..., zn)?
Example: Second Measurement

P(z2 | open) = 0.5    P(z2 | ¬open) = 0.6
P(open | z1) = 2/3

P(open | z1, z2) = P(z2 | open) P(open | z1) / (P(z2 | open) P(open | z1) + P(z2 | ¬open) P(¬open | z1))
                 = (0.5 · 2/3) / (0.5 · 2/3 + 0.6 · 1/3) = (1/3) / (8/15) = 5/8 = 0.625

z2 lowers the probability that the door is open.
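The two door updates can be folded into one recursive procedure: after each measurement, the posterior becomes the prior for the next update. The sensor models are the ones given in the two example slides; the helper name `update` is my own.

```python
# Recursive Bayesian updating for the door-state example.
def update(prior_open, p_z_open, p_z_not_open):
    """One Bayes update of P(open) given a measurement's sensor model."""
    num = p_z_open * prior_open
    return num / (num + p_z_not_open * (1 - prior_open))

belief = 0.5                        # prior P(open)
belief = update(belief, 0.6, 0.3)   # after z1: 2/3
belief = update(belief, 0.5, 0.6)   # after z2: 0.625
```

This works because z1 and z2 are assumed conditionally independent given the door state, so each measurement can be incorporated one at a time; it is the core loop of a Bayes filter.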
These calculations seem laborious to do for each problem domain. Is there a general representation scheme for probabilistic inference?
Yes: Bayesian networks!