Lecture 7. Intro to Statistical Decision Analysis. Today: Valuing outcomes - Utility.
Lecture 7
Professor Scott Schmidler, Duke University
Stat 340, TuTh 11:45-1:00, Fall 2016

Today: Valuing outcomes - Utility.

Reference: Lectures draw heavily from Lindley, Making Decisions, 2nd ed.
Textbook reading: Smith, Bayesian Decision Analysis, Ch 3.
See also: Clemen & Reilly, Making Hard Decisions, Ch 13, 14.

A numerical measure of outcomes

We showed that uncertainty can be given a numerical measure obeying certain laws (probabilities). We'll now do the same for outcomes/values (utilities). Together these will provide the solution to a decision problem.

Consider possible decisions d_1, ..., d_m and possible outcomes o_1, ..., o_n. To fill out our decision model (influence diagram or tree) we require a numerical value for each possible (d_i, o_j) pair. Previously we used dollars and the EMV rule, but not all problems naturally lead to maximizing dollars.

Example

A manufacturer produces rolls of textiles (e.g. curtain material). Before selling, he must decide whether to inspect rolls for flaws.

  d_1 = inspect
  d_2 = sell without inspection

  o_1 = roll is free of flaws
  o_2 = roll has at least one flaw

Consequences:

           o_1     o_2
  d_1     c_11    c_12
  d_2     c_21    c_22
Consequences

          o_1: material good        o_2: material flawed
  d_1     satisfied customer;       must make a new roll,
          cost of inspection        plus cost of inspection
  d_2     satisfied customer        customer complaints;
                                    must replace with a new roll

Clearly c_21 is best for the manufacturer; c_11 is probably next best (only the cost of inspection); c_12 next (must discard the roll); c_22 is worst (only saved the inspection cost). Ranking: c_21 > c_11 > c_12 > c_22. Some consequences are preferred to others. (If the inspection cost were very high but a flaw could be fixed, the ranking might be different.)

Example (cont'd)

Which decision should the manufacturer choose? Not obvious - the presence of a flaw is uncertain. If o_1 (no flaws), d_2 is clearly better (c_21 preferred to c_11); if o_2 (flaws), d_1 is better (c_12 preferred to c_22).

The manufacturer can quantify uncertainty by assigning probabilities p(o_1) and p(o_2) = 1 - p(o_1). If p(o_1) ≈ 1, choose d_2. If p(o_1) ≈ 0, choose d_1. Somewhere in between the decision changes - but where? It depends not only on the ranking of consequences, but on how much better one is than another.

Numerical values

Again we need a standard to compare against. We use two reference consequences:

  - one better than (or not worse than) any consequence in the table (call this c_best)
  - one worse than (or not better than) any consequence in the table (call this c_worst)

We assume any pair of consequences can be compared (one preferred to the other, or both equally desirable), and that these comparisons are coherent. c_best (or c_worst) may be in the table, but need not be.

Consider any consequence c_ij, and an urn standard containing U black balls out of N total. For a ball drawn at random, suppose consequence c_best occurs if a black ball is drawn, and c_worst occurs if a white ball is drawn. So we receive c_best with probability u = U/N and c_worst with probability 1 - u.

How does this gamble compare to c_ij? If u = 1, the gamble is clearly better; if u = 0, worse. As U (and therefore u = U/N) increases, the gamble gets better, so there must be a value of U (hence of u) such that we are indifferent between c_ij and the gamble.
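The indifference search implied by the urn standard can be sketched as a bisection: the gamble improves monotonically as u rises from 0 to 1, so the crossover point is the utility. A minimal sketch, assuming a hypothetical preference oracle `prefers_gamble` standing in for the decision maker (the value 0.62 is illustrative, not from the lecture):

```python
TRUE_U = 0.62  # assumed "true" utility of c_ij, for illustration only

def prefers_gamble(u):
    """Hypothetical oracle: does the decision maker prefer the gamble
    (c_best w.p. u, c_worst w.p. 1-u) to the sure consequence c_ij?"""
    return u > TRUE_U

def elicit_utility(tol=1e-6):
    """Bisect on u; the indifference point is mu(c_ij).
    Monotonicity (the gamble improves as u grows) guarantees convergence."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if prefers_gamble(mid):
            hi = mid   # gamble already better: indifference point is lower
        else:
            lo = mid   # sure thing still better: indifference point is higher
    return (lo + hi) / 2

print(round(elicit_utility(), 4))  # converges to TRUE_U = 0.62
```

In practice only a handful of comparisons are asked, but the logic is the same: each answer halves the interval containing the indifference probability.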
This gives a unique number u ∈ (0, 1) such that c_ij is equally desirable as a chance u of getting a highly desirable outcome and 1 - u of getting a highly undesirable outcome. Write this as µ(c_ij); call it the utility of c_ij. By coherence we clearly have:

  c_ij preferred to c_kl           iff  µ(c_ij) > µ(c_kl)
  c_ij equally desirable to c_kl   iff  µ(c_ij) = µ(c_kl)
  c_ij worse than c_kl             iff  µ(c_ij) < µ(c_kl)

So we have a numerical measure of the desirability of any consequence in the table, called a utility. (Note that u is a probability, so it obeys the probability laws.)

In general, we will have a decision table of the form:

           o_1        o_2       ...   o_n
  d_1     µ(c_11)    µ(c_12)   ...   µ(c_1n)
  d_2     µ(c_21)    µ(c_22)   ...   µ(c_2n)
  ...
  d_m     µ(c_m1)    µ(c_m2)   ...   µ(c_mn)

E.g. for the inspection example we have µ(c_21) = 1 and µ(c_22) = 0, and might get, for example:

                        o_1: Good   o_2: Flawed
  d_1: Inspect             .9           .5
  d_2: Don't inspect      1.0           .0

Suppose further the manufacturer assesses p(flaw) = .2.

Utilities in medicine: QALYs

Often in medical decision making we face possible outcomes to which it is difficult to assign dollar values. Examples: chemotherapy side effects, chronic pain, bilateral mastectomy. Yet we need a quantitative measure of desirability for decision making. A solution: the quality-adjusted life year (QALY).

(Note: are there fates worse than death?)

We can assign a utility to any health state via the standard gamble. Let µ(1 yr in perfect health) = 1 and µ(immediate death) = 0. Then µ(health state) is the probability at which you are indifferent between living 1 year in that health state and a gamble with probability µ of 1 year of perfect health and (1 - µ) of immediate death.
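On this scale, a year lived in a given health state contributes µ(state) quality-adjusted life years. A small illustration with made-up numbers (the state utility 0.7 is hypothetical, not from the lecture):

```python
def qalys(years, state_utility):
    """QALYs = years lived in a state times that state's standard-gamble
    utility, on the scale mu(perfect health) = 1, mu(death) = 0."""
    return years * state_utility

# e.g. a patient indifferent between a year in chronic pain and a gamble
# giving perfect health w.p. 0.7 vs immediate death w.p. 0.3 assigns the
# state utility 0.7:
print(qalys(10, 0.7))  # 10 such years are worth 7.0 QALYs
```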
Combining probabilities and utilities

We now have numerical values for uncertainties and for consequences, and wish to combine them to make decisions. Key: both are probabilities!

If we choose d_i, the outcome depends on an uncertain event: if o_j occurs, the consequence is c_ij. But we can replace c_ij with a chance µ(c_ij) of c_best and 1 - µ(c_ij) of c_worst. So for any decision taken, and any event occurring, we can think of the result as either c_best or c_worst! Now if we take d_i and o_j occurs, the probability of c_best is µ(c_ij). Using the extension theorem:

  p(c_best | d_i) = Σ_j p(c_best | o_j, d_i) p(o_j | d_i)
                  = Σ_j µ(c_ij) p(o_j | d_i)

If c_best doesn't result, c_worst does, so p(c_best | d_i) measures the merit of d_i: higher p(c_best | d_i) means a better d_i. The best decision is the one with maximum p(c_best | d_i).

Expected utility

Notice that p(c_best | d_i) is the expected utility under d_i:

  p(c_best | d_i) = Σ_j µ(c_ij) p(o_j | d_i) = µ(d_i)

Therefore the best decision is the one with maximum expected utility:

  argmax_{d_i ∈ D} µ(d_i)

A decision problem is solved by maximizing expected utility.

Inspection example revisited

  µ(d_1) = .9(.8) + .5(.2) = .82
  µ(d_2) = 1.0(.8) + .0(.2) = .8

So d_1 (Inspect) is the better decision. Notice:

  - If production improves, lowering p(flaw) to .1, then µ(d_1) = .86 and µ(d_2) = .9, and we can skip inspection.
  - If inspection becomes more costly, so that µ(c_11) and µ(c_12) both decrease by .1 (to .8 under o_1 and .4 under o_2), then µ(d_1) = .72 and it is better not to inspect (but note the MEU drops from .82 to .8).
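These calculations are easy to check directly. A minimal sketch (function names are mine; the utilities and probabilities are the ones from the example):

```python
def expected_utility(utils, probs):
    """mu(d_i) = sum_j mu(c_ij) * p(o_j | d_i)."""
    return sum(u * p for u, p in zip(utils, probs))

def best_decision(utilities, probs):
    """Return the decision (a key of `utilities`) with maximum EU."""
    return max(utilities, key=lambda d: expected_utility(utilities[d], probs))

# rows: utilities over (o1 = good, o2 = flawed)
utilities = {"d1: inspect": [0.9, 0.5],
             "d2: don't inspect": [1.0, 0.0]}

probs = [0.8, 0.2]  # p(flaw) = .2
print(best_decision(utilities, probs))       # d1: inspect  (EU .82 vs .8)

probs = [0.9, 0.1]  # improved production: p(flaw) = .1
print(best_decision(utilities, probs))       # d2: don't inspect  (.86 vs .9)

# costlier inspection: mu(c11), mu(c12) each drop by .1
utilities["d1: inspect"] = [0.8, 0.4]
print(best_decision(utilities, [0.8, 0.2]))  # d2: don't inspect  (.72 vs .8)
```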
Key idea of decision theory: summary of decision analysis

A decision problem can be formulated as lists of decisions and uncertain events. Assuming coherent comparison of events and outcomes, probabilities can be assigned to events and utilities to consequences. Then each decision can be assigned a value (its expected utility), and the best decision is the one with the highest value (MEU).

So the recommended procedure for making decisions is:

  1. List the possible decisions (d_1, ..., d_m).
  2. List the uncertain events (o_1, ..., o_n).
  3. Assign probabilities to the events (p(o_1), p(o_2), ..., p(o_n)).
  4. Assign utilities µ(d_i, o_j) to the consequences (d_i, o_j).
  5. Choose the decision that maximizes expected utility µ(d_i) = Σ_j µ(d_i, o_j) p(o_j | d_i).

(Notice that all this follows more or less inevitably from coherence!)

Arguing against the existence of utilities

Recall that µ(c_ij) is the probability p at which c_ij is equally desirable as a gamble giving c_best with probability p and c_worst otherwise. Suppose c_ij = $1000 and c_best = $1001. Mike prefers the certain c_ij to [c_best with probability p, $0 otherwise] for any p < 1, since he prefers the certain $1000 to even the slightest risk of getting $0, the increase ($1) being negligible.

Claim: this is incoherent.

If Mike prefers
  (1) a certainty of $1000
to
  (2) a chance p of $1001 against 1 - p of $0
for any p < 1, then presumably he also prefers
  (3) a certainty of $1001
to
  (4) a chance p of $1002 against 1 - p of $0
for the same p, since the loss is greater while the gain is the same.

Now alter (2) by replacing the certain $1001 received when the probability-p event occurs with (4), a chance p of $1002. This makes (2) even worse, since (3) is preferred to (4), and now we get $1002 with probability p². So to be coherent Mike must prefer (1) to
  (5) a chance p² of $1002 against 1 - p² of $0.
Repeating the argument shows Mike must prefer (1) to
  (6) a chance p^1000 of $2000 against 1 - p^1000 of $0.
But p was arbitrary, so p^1000 can be as close to 1 as we want. Now we can distinguish between the rewards ($1000 vs $2000), and it becomes clear that, for some p, Mike should prefer (6) to (1). This contradicts the original claim: to be coherent, Mike must prefer (2) to (1) for some p sufficiently close to 1.

Choice of reference consequences

For constructing utilities we used reference consequences c_best and c_worst. Did the choice matter? Let c'_best and c'_worst be two other reference consequences such that

  c'_best > c_best   (and so better than all c_ij)
  c'_worst < c_worst (and so worse than all c_ij).

Recall c_ij was equivalent to the gamble [c_best with probability u, c_worst with probability 1 - u]. Since c'_worst < c_worst < c'_best, c_worst is equivalent to [c'_best with probability p, c'_worst with probability 1 - p] for some p. Similarly, c_best is equivalent to [c'_best with probability p + s, c'_worst with probability 1 - p - s] for some s > 0 (since c_best is preferred to c_worst). Substituting these into the gamble for c_ij:

  c_ij ~ [c'_best w.p. u(p + s) + (1 - u)p = us + p,  c'_worst w.p. 1 - us - p]

So the new utility for c_ij is us + p after replacing (c_best, c_worst) with (c'_best, c'_worst). (Note: p and s do not depend on c_ij; they are the same for every consequence.)

So for any decision the new expected utility is

  Σ_j (µ(d_i, o_j) s + p) p(o_j | d_i) = s µ(d_i) + p,

where µ(d_i) is the original expected utility. The choice of reference consequences (c_best, c_worst) is therefore irrelevant: the origin and scale of utilities don't matter, and MEU is invariant under affine transformations of the utility. It is convenient to use different origins/scales for different problems (just as we measure length in angstroms, yards, light-years, etc.).
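The invariance is easy to verify numerically: rescaling every utility by u → su + p turns each decision's expected utility into sµ(d_i) + p and leaves the argmax unchanged. A small check using the inspection example's numbers (the choices s = 3, p = 2 are arbitrary):

```python
def expected_utils(util_rows, probs):
    """Expected utility of each row (decision) of a utility table."""
    return [sum(u * q for u, q in zip(row, probs)) for row in util_rows]

probs = [0.8, 0.2]                # p(o1), p(o2)
U = [[0.9, 0.5], [1.0, 0.0]]      # rows d1, d2 from the inspection example

s, p = 3.0, 2.0                   # arbitrary positive scale and new origin
U_new = [[s * u + p for u in row] for row in U]

eu, eu_new = expected_utils(U, probs), expected_utils(U_new, probs)

# each new EU equals s * (old EU) + p, so the best decision is unchanged
assert all(abs(en - (s * e + p)) < 1e-9 for e, en in zip(eu, eu_new))
assert eu.index(max(eu)) == eu_new.index(max(eu_new))
print([round(x, 2) for x in eu], [round(x, 2) for x in eu_new])
```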
More informationPart 1: Logic and Probability
Part 1: Logic and Probability In some sense, probability subsumes logic: While a probability can be seen as a measure of degree of truth a real number between 0 and 1 logic deals merely with the two extreme
More informationP (B)P (A B) = P (AB) = P (A)P (B A), and dividing both sides by P (B) gives Bayes rule: P (A B) = P (A) P (B A) P (B),
Conditional probability 18.600 Problem Set 3, due March 2 Welcome to your third 18.600 problem set! Conditional probability is defined by P (A B) = P (AB)/P (B), which implies P (B)P (A B) = P (AB) = P
More informationPhysics 403. Segev BenZvi. Choosing Priors and the Principle of Maximum Entropy. Department of Physics and Astronomy University of Rochester
Physics 403 Choosing Priors and the Principle of Maximum Entropy Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Review of Last Class Odds Ratio Occam Factors
More informationCS 241 Analysis of Algorithms
CS 241 Analysis of Algorithms Professor Eric Aaron Lecture T Th 9:00am Lecture Meeting Location: OLB 205 Business Grading updates: HW5 back today HW7 due Dec. 10 Reading: Ch. 22.1-22.3, Ch. 25.1-2, Ch.
More information1 A simple example. A short introduction to Bayesian statistics, part I Math 217 Probability and Statistics Prof. D.
probabilities, we ll use Bayes formula. We can easily compute the reverse probabilities A short introduction to Bayesian statistics, part I Math 17 Probability and Statistics Prof. D. Joyce, Fall 014 I
More informationHST.582J / 6.555J / J Biomedical Signal and Image Processing Spring 2007
MIT OpenCourseWare http://ocw.mit.edu HST.582J / 6.555J / 16.456J Biomedical Signal and Image Processing Spring 2007 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
More informationThe Naïve Bayes Classifier. Machine Learning Fall 2017
The Naïve Bayes Classifier Machine Learning Fall 2017 1 Today s lecture The naïve Bayes Classifier Learning the naïve Bayes Classifier Practical concerns 2 Today s lecture The naïve Bayes Classifier Learning
More informationM17 MAT25-21 HOMEWORK 6
M17 MAT25-21 HOMEWORK 6 DUE 10:00AM WEDNESDAY SEPTEMBER 13TH 1. To Hand In Double Series. The exercises in this section will guide you to complete the proof of the following theorem: Theorem 1: Absolute
More informationMath 381 Midterm Practice Problem Solutions
Math 381 Midterm Practice Problem Solutions Notes: -Many of the exercises below are adapted from Operations Research: Applications and Algorithms by Winston. -I have included a list of topics covered on
More informationBasic notions of probability theory
Basic notions of probability theory Contents o Boolean Logic o Definitions of probability o Probability laws Why a Lecture on Probability? Lecture 1, Slide 22: Basic Definitions Definitions: experiment,
More informationCS 188 Introduction to Fall 2007 Artificial Intelligence Midterm
NAME: SID#: Login: Sec: 1 CS 188 Introduction to Fall 2007 Artificial Intelligence Midterm You have 80 minutes. The exam is closed book, closed notes except a one-page crib sheet, basic calculators only.
More information1 Uncertainty and Insurance
Uncertainty and Insurance Reading: Some fundamental basics are in Varians intermediate micro textbook (Chapter 2). A good (advanced, but still rather accessible) treatment is in Kreps A Course in Microeconomic
More informationProbabilities and Expectations
Probabilities and Expectations Ashique Rupam Mahmood September 9, 2015 Probabilities tell us about the likelihood of an event in numbers. If an event is certain to occur, such as sunrise, probability of
More information