Decision Making under Uncertainty
eLearning resources
Prof. Ahti Salo, Helsinki University of Technology

Contents
- Meanings of uncertainty
- Interpretations of probability
- Biases in probability elicitation
- Calibration of experts
- Updating of probabilities: discrete and continuous

Meanings of uncertainty

We frequently make statements about uncertainty:
- "It will rain tomorrow." (subjective probability)
- "The ...th decimal of π is 6." (a fact; the uncertainty lies in the available information)
- "I win in a lottery with probability 0,..." (frequentist or classical probability interpretation)

Uncertainty = an event with an unknown outcome.
Probability = a number for measuring uncertainty.

Interpretations of probability: Classical interpretation

P.S. Laplace (1825): probability is the ratio of the number of possible outcomes favourable to the event to the total number of possible outcomes, each assumed to be equally likely:

  P(A) = #(A) / #(S)

where #(A) is the number of possible outcomes favourable to A and #(S) is the total number of possible outcomes.

- The definition is circular: probability is defined in terms of "equally likely".
- Principle of indifference: events are equally likely if there is no known reason for predicting the occurrence of one event rather than another.
- Each event is defined as a collection of outcomes. For example, the probability of getting a 6 when tossing a die is 1/6.
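The classical definition above can be sketched directly as a ratio of counts. This is a minimal illustration, not part of the original slides:

```python
from fractions import Fraction

def classical_probability(favourable, sample_space):
    """Classical (Laplace) probability: #(A) / #(S), all outcomes equally likely."""
    outcomes = set(sample_space)
    return Fraction(len(set(favourable) & outcomes), len(outcomes))

# The probability of getting a 6 when tossing a fair die
die = range(1, 7)
p_six = classical_probability({6}, die)          # Fraction(1, 6)

# An event is a collection of outcomes, e.g. "even number" = {2, 4, 6}
p_even = classical_probability({2, 4, 6}, die)   # Fraction(1, 2)
```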
Interpretations of probability: Frequentist interpretation

Leslie Ellis, mid 19th century: probability is the relative frequency of trials in which the favourable event occurs as the number of trials approaches infinity:

  P(A) = lim_{n -> infinity} n(A) / n

where n(A) is the number of times that A occurs and n is the total number of trials. For example, you may determine the probability of getting heads by tossing a coin a very large number of times.

Interpretations of probability: Subjective (Bayesian) interpretation

De Finetti (1937): probability represents an individual's degree of belief in the occurrence of a particular outcome.
- The probability may change, e.g. when additional information is received.
- The event may have already occurred.
- "I believe there is a 50 % chance that it will rain tomorrow."
- "I'm 15 % sure that Martin Luther King was 34 years old when he died."

Elicitation of subjective probabilities

Prerequisite: the events must be well defined.
Goal: a probability number for any outcome of interest.
One can use 1) past evidence, 2) causal models, or 3) expert judgement.

Elicitation of discrete subjective probabilities
1) Direct assessment: ask the respondent to assign numerical values to events.
2) Betting approach: find a specific amount to win or lose such that the DM is indifferent about which side of the bet to take.
3) Reference lottery: compare the uncertain event to events with known probabilities.
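The betting approach in 2) can be sketched numerically. The stakes below are made up for illustration; the formula P(A) = Y / (X + Y) is the indifference condition derived on the next slide:

```python
def betting_probability(x, y):
    """Betting approach: at indifference between betting for A (win x, lose y)
    and against A (lose x, win y), the expected monetary values of the two
    bets are equal, which gives P(A) = y / (x + y).
    Assumes the respondent is risk neutral."""
    return y / (x + y)

# Suppose the respondent is indifferent at "win 30 if A, lose 70 if not A"
p_a = betting_probability(30, 70)                # 0.7

# Sanity check: both sides of the bet then have equal expected monetary value
ev_for     =  30 * p_a - 70 * (1 - p_a)
ev_against = -30 * p_a + 70 * (1 - p_a)
```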
Direct assessment

The respondent is asked to assess the desired probabilities directly. A wheel of fortune (probability wheel) can be used to support the elicitation process. Note that the mode of questioning may affect the results.

Betting approach

  Bet 1 (bet for A):     win X if A happens, lose Y if A does not happen.
  Bet 2 (bet against A): lose X if A happens, win Y if A does not happen.

Adjust X and Y until the respondent is indifferent about which bet to take. At that point the expected monetary values of the bets must be equal*:

  X P(A) - Y (1 - P(A)) = -X P(A) + Y (1 - P(A))

which gives

  P(A) = Y / (X + Y)

* The respondent is assumed to be concerned with expected monetary value only (i.e. risk neutral).

Reference lottery

  Lottery:           win X if A happens, win Y if A does not happen (X > Y).
  Reference lottery: win X with probability p, win Y with probability (1 - p).

The reference lottery can be visualised e.g. with a wheel of fortune, or with a box of red and blue balls from which you want to draw a red one. The probability p is changed by adjusting the wheel or the number of balls. Once the respondent is indifferent about which lottery to participate in, we have P(A) = p. Unlike in the betting approach, the respondent's risk attitude does not affect the result.

The Ellsberg paradox (1961) (1/2)

An urn contains red, blue and yellow balls; the proportion of red balls is known, but the split between blue and yellow is not. Which game would you choose, 1 or 2?
  Game 1: win 1000 if you pick a red ball.
  Game 2: win 1000 if you pick a blue ball.
What about 3 or 4?
  Game 3: win 1000 if you pick a red or yellow ball.
  Game 4: win 1000 if you pick a blue or yellow ball.
The Ellsberg paradox (1961) (2/2)

Most people choose games 1 and 4. But the yellow balls should not matter! The wins are:

  Game    Red     Blue    Yellow
  1       1000    0       0
  2       0       1000    0
  3       1000    0       1000
  4       0       1000    1000

Games 3 and 4 differ from games 1 and 2 only by the identical yellow payoff, yet the preference reverses. Ambiguity aversion: people tend to prefer events with known probabilities!

Assessment of continuous subjective probabilities (1/3)

Fractile method. The expert is asked to give
- the feasible range (min, max),
- the median f_50% (i.e. P(X < f_50%) = 0.5), and
- other fractiles (e.g. 5 %, 25 %, 75 %, 95 %).

Typical elicitation questions:
- min: "What is the least possible value you could imagine the variable X obtaining?"
- Median f_50%: "Give a number f_50% such that, in your opinion, X < f_50% is just as probable as X > f_50%." (i.e. P(X < f_50%) = P(X > f_50%))
- 5 % fractile f_5%: "Give a number f_5% such that, in your opinion, X < f_5% is just as probable as picking a red ball from an urn with 1 red and 19 blue balls." (i.e. P(X < f_5%) = 0.05)

Assessment of continuous subjective probabilities (2/3)

A cumulative distribution function is then obtained e.g. by interpolating between the fractiles or by fitting a curve from a specific class of functions. (Figure: elicited fractiles and a fitted distribution, with cumulative probability from 0 to 1 on the vertical axis and the variable value on the horizontal axis.)

Assessment of continuous subjective probabilities (3/3)

Histogram method. The possible values of the variable are divided into intervals, and the probability of the event corresponding to each interval is assessed. This yields a histogram, i.e. an estimate of the density function. An anchoring bias may occur (see the bias section).
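The interpolation step of the fractile method can be sketched as follows. The fractile values below are made up for illustration:

```python
# Build a piecewise-linear CDF from elicited fractiles and read
# probabilities off it.
import bisect

# (P(X < f), f) pairs: feasible range, 5 %, 25 %, 50 %, 75 %, 95 % fractiles
fractiles = [(0.00, 0.0), (0.05, 2.0), (0.25, 4.0), (0.50, 5.0),
             (0.75, 6.5), (0.95, 9.0), (1.00, 12.0)]

def cdf(x):
    """Cumulative probability P(X < x) by linear interpolation
    between the elicited fractiles."""
    ps, xs = zip(*fractiles)
    if x <= xs[0]:
        return 0.0
    if x >= xs[-1]:
        return 1.0
    i = bisect.bisect_left(xs, x)
    x0, x1, p0, p1 = xs[i - 1], xs[i], ps[i - 1], ps[i]
    return p0 + (p1 - p0) * (x - x0) / (x1 - x0)

# Interval probabilities follow directly from the fitted curve:
p_interval = cdf(6.5) - cdf(4.0)   # P(4 < X < 6.5) = 0.75 - 0.25 = 0.5
```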
Example: Histogram method

"How much will the GDP (volume) grow in Finland in year 2004?" (Average growth was 4.5 %.) (Figure: growth intervals in % with the elicited probability of each, shown as a histogram.)

Scoring rules

With scoring rules, e.g. the compensation of the respondent can be tied to the accuracy of his reply in such a way that he benefits from reporting his true estimate. A scoring rule S(r) is strictly proper if the true opinion of the respondent maximises his expected score:

  E[S(p)] > E[S(r)]  for all r != p,

where p = (p_1, p_2, ..., p_n) is the true opinion and r = (r_1, r_2, ..., r_n) the responded distribution. The three most commonly encountered proper scoring rules are (S_j is the score if event j occurs):

  Quadratic:   S_j(r_1, r_2, ..., r_n) = 2 r_j - sum_{i=1..n} r_i^2
  Logarithmic: S_j(r_1, r_2, ..., r_n) = log(r_j)
  Spherical:   S_j(r_1, r_2, ..., r_n) = r_j / (sum_{i=1..n} r_i^2)^(1/2)

Example: Scoring rules

A weather forecaster believes that there is a p = 0.7 probability of rain the next day. Which probability r should he report, when he gets S_1(r) = 2r - r^2 - (1-r)^2 if it does rain and S_2(r) = 2(1-r) - r^2 - (1-r)^2 if it does not? His expected compensation is

  E[S(r)] = 0.7 S_1(r) + 0.3 S_2(r),

which is maximised when he responds r = 0.7, his true estimate.

Biases in probability elicitation

People use rules of thumb, i.e. heuristics, in assessing probabilities. These are often useful, but may lead to systematically incorrect estimates, i.e. biases. For example, gamblers believe their luck will change:
- Independence: you toss HHHHHHH (H = heads); what comes next?
- Is P(HHHHTTTT) < P(HTTHHTHT)? (No: the two sequences are equally likely.)
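The forecaster example can be checked numerically: under the (strictly proper) quadratic scoring rule, the expected score is maximised by reporting the true probability.

```python
def expected_score(r, p=0.7):
    """Expected quadratic score when the true rain probability is p
    and the forecaster reports r."""
    s_rain    = 2 * r - r**2 - (1 - r)**2        # score if it rains
    s_no_rain = 2 * (1 - r) - r**2 - (1 - r)**2  # score if it does not
    return p * s_rain + (1 - p) * s_no_rain

# Search reported probabilities on a grid: honesty wins
grid = [i / 100 for i in range(101)]
best = max(grid, key=expected_score)   # 0.7, the true estimate
```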
Biases in probability elicitation: Representativeness

If x fits the description of a set A well, then P(x in A) is assumed to be large; the relative proportions (prior probabilities) are not taken into consideration. Example: you see a very good-looking young woman in a bar. Is she more probably a professional model or a nurse? Many people are tempted to answer "a model". Yet professional models are rare, while there are far more nurses: thus she is actually more likely to be a nurse.

Biases in probability elicitation: Availability

People assess the probability of an event by the ease with which instances or occurrences can be brought to mind. Example: in a typical sample of English text, is it more likely that a word starts with the letter K or that K is the third letter? Generally about 70 % think that words starting with K are more common. In truth, however, there are approximately twice as many words with K in the third position as there are words that begin with it.

Biases in probability elicitation: Anchoring and adjustment

In several situations people assess probabilities by adjusting a given starting value. Often the adjustment is not big enough, and the final assessment stays too close to the starting value. Example: Is the percentage of African countries in the UN
  a) greater or less than 65? What is the exact percentage? Average answer: less; 45 %.
  b) greater or less than 10? What is the exact percentage? Average answer: greater; 25 %.
How to reduce: avoid giving starting values.

Biases in probability elicitation: Conservatism

People tend to change previous probability estimates more slowly than warranted by new data (according to the Bayes theorem). Example: suppose we have two bags. One bag contains 30 white balls and 10 black balls; the other contains 30 black balls and 10 white. We choose one of these bags at random, and from it we select five balls at random, replacing each ball after it has been selected.
The result is that we find 4 white balls and one black. What is the probability that we were using the bag with mainly white balls? A typical subject answers between 0.7 and 0.8. The answer is in fact 0.96, which can be calculated using the Bayes theorem.
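The conservatism example can be worked out with the Bayes theorem as a quick sketch:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(k successes in n independent draws with success probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# "Mostly white" bag: P(white) = 30/40 = 0.75; the other bag: P(white) = 0.25.
# Each bag has prior probability 0.5; we drew 4 white balls out of 5.
like_white = binom_pmf(4, 5, 0.75)   # likelihood under the mostly-white bag
like_black = binom_pmf(4, 5, 0.25)   # likelihood under the mostly-black bag
posterior = 0.5 * like_white / (0.5 * like_white + 0.5 * like_black)
# posterior = 27/28, i.e. about 0.96, far above the intuitive 0.7-0.8
```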
Biases in probability elicitation: Hindsight bias

People falsely believe they would have predicted the outcome of an event. Once the outcomes are observed, the DM may assume that they are the only ones that could have happened and thus underestimate the uncertainty. This prevents us from learning from the past, and warning people of this bias has little effect. How to reduce: argue against the inevitability of the reported outcome and convince the assessor that it might have turned out otherwise.

Biases in probability elicitation: Overconfidence

People tend to be overconfident in their assessments. According to research:

  Claimed confidence    Relative frequency of correct answers
  100 %                 80 %
  90 %                  75 %
  80 %                  65 %

How to reduce: give feedback according to the quality of earlier assessments.

Calibration of experts

Goal: to know that if the DM says the probability is p, the real probability is f(p). Define a calibration curve p -> f(p). Calibration can be done e.g. by eliciting known probabilities. An expert assesses a cumulative subjective probability distribution function F_X(x) for a variable X prior to its realisation. When the outcome x* becomes known, define

  ζ = F_X(x*).

If the expert is well calibrated, ζ should follow the uniform distribution, ζ ~ U(0, 1).

Example: Calibration of experts (1/2)

On 10 recent occasions, a technical expert assessed normal probability distributions on the performance index x_i of a series of research experiments. The values ζ_i are calculated from the cumulative predictive distributions:

  ζ_i = F_N(x_i | µ_i, σ_i).

In a situation of perfect calibration, the ordered ζ_i should be close to 0.1, 0.2, ..., 0.9, 1.0.
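The calibration check can be sketched as follows. The assessed means, standard deviations and realised outcomes below are made up for illustration (five assessments instead of the slides' ten):

```python
from statistics import NormalDist

# (mu_i, sigma_i, x_i): the expert's normal assessment and the realised outcome
assessments = [(10.0, 2.0, 9.5), (12.0, 1.0, 13.1), (8.0, 1.5, 8.0),
               (11.0, 2.5, 14.8), (9.0, 1.0, 9.9)]

# zeta_i = F_N(x_i | mu_i, sigma_i); under perfect calibration these
# behave like an ordered sample from U(0, 1)
zetas = sorted(NormalDist(mu, sigma).cdf(x) for mu, sigma, x in assessments)

# Compare the ordered zeta_i with the uniform reference points i/n
# (here 0.2, 0.4, ..., 1.0); a large gap suggests miscalibration
n = len(zetas)
max_gap = max(abs(z - (i + 1) / n) for i, z in enumerate(zetas))
```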
Example: Calibration of experts (2/2)

From the ζ_i the calibration function can be drawn. If the expert assessed a distribution for an eleventh experiment, it could be corrected using this calibration function. (Figures: the calibration function on the left; the assessed and corrected distributions for the eleventh experiment on the right.)

Updating of probabilities

Assessed probability distributions can be improved, i.e. updated, when new information is gained.

Updating of discrete probabilities

1. We have a probability estimate for event H: the prior probability P(H).
2. New information D is gained.
3. Update the estimate using the Bayes theorem: the posterior probability P(H | D).

The Bayes theorem

The updating is done using the Bayes theorem:

  P(H | D) = P(D | H) P(H) / P(D)
Example: Using the Bayes theorem

1.5 % of the population suffer from schizophrenia: P(S) = 0.015 (prior probability). Brain atrophy is found in 30 % of the schizophrenic, P(A | S) = 0.30, and in 2 % of normal people, P(A | ¬S) = 0.02. If a person has brain atrophy, the probability that he is schizophrenic (the posterior probability) is

  P(S | A) = P(A | S) P(S) / (P(A | S) P(S) + P(A | ¬S) P(¬S))
           = 0.30 × 0.015 / (0.30 × 0.015 + 0.02 × 0.985) ≈ 0.186.

(Figure: posterior probability with different prior probabilities; picture from Clemen p. 250.)

Updating of continuous distributions (1/3)

1. Choose a theoretical distribution, P(X = x | θ), for the physical process of interest.
2. Assess the uncertainty about the parameter θ: the prior distribution f(θ).
3. Observe data x_1.
4. Update using the Bayes theorem: the posterior distribution of θ, f(θ | x_1).

Note: uncertainty about X has two parts: 1) that due to the process itself, P(X = x | θ), and 2) the uncertainty about θ, f(θ), later updated to f(θ | x_1).

Updating of continuous distributions (2/3)

The Bayes theorem for continuous θ:

  f(θ | x_1) = f(x_1 | θ) f(θ) / ∫ f(x_1 | θ) f(θ) dθ

where f(x_1 | θ) is called the likelihood function of θ given the observed data x_1. In most cases the posterior distribution cannot be calculated analytically but must be solved numerically. With natural conjugate distributions, however, the posterior distribution can be solved analytically.

Updating of continuous distributions (3/3)

If the distribution of the physical process of interest and the prior distribution of its parameter are suitably chosen, the prior and posterior distributions of the parameter have the same form:

  Physical process    θ prior              θ posterior
  X ~ Exp(λ)          λ ~ Gamma(α, β)      λ ~ Gamma(α*, β*)
  X ~ Bin(n, p)       p ~ Beta(r_0, n_0)   p ~ Beta(r*, n*)
  X ~ N(µ, σ²)        µ ~ N(m_0, σ_0²)     µ ~ N(m*, σ*²)
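The discrete updating step, including the schizophrenia example above, can be sketched as a one-line Bayes update:

```python
def bayes_posterior(prior, like_h, like_not_h):
    """Discrete Bayes update: P(H|D) = P(D|H) P(H) / P(D),
    with P(D) expanded over H and not-H."""
    evidence = like_h * prior + like_not_h * (1 - prior)
    return like_h * prior / evidence

# Schizophrenia example: P(S) = 0.015, P(A|S) = 0.30, P(A|not S) = 0.02
p_s_given_a = bayes_posterior(0.015, 0.30, 0.02)   # about 0.186
```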
Binomial distribution

1. The physical process of interest is binomially distributed: X ~ Bin(n, p).
2. The prior distribution for p is p ~ Beta(r_0, n_0):

  f_β(p) = [Γ(n_0) / (Γ(r_0) Γ(n_0 - r_0))] p^(r_0 - 1) (1 - p)^(n_0 - r_0 - 1),   0 < p < 1,   0 < r_0 < n_0.

3. Observe a sample of the physical process: sample size n_1, favourable cases r_1.
4. The posterior distribution, calculated using the Bayes theorem, reduces to

  p ~ Beta(r_0 + r_1, n_0 + n_1).

About the beta distribution

The beta distribution is very flexible. (Figure: example beta densities for various parameter values.)

Normal distribution

1. The physical process of interest is normally distributed: X ~ N(µ, σ²), where σ is assumed to be known.
2. The prior distribution for µ is µ ~ N(m_0, σ_0²) (notation: σ_0² = σ²/n_0).
3. Observe a sample of the physical process: sample size n_1, sample mean x̄_1.
4. The posterior distribution, calculated using the Bayes theorem, reduces to µ ~ N(m*, σ*²), where

  m*  = (m_0 σ² + n_1 x̄_1 σ_0²) / (σ² + n_1 σ_0²) = (n_0 m_0 + n_1 x̄_1) / (n_0 + n_1)
  σ*² = σ_0² σ² / (σ² + n_1 σ_0²) = σ² / (n_0 + n_1)
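The beta-binomial update above is just parameter addition in the slides' parameterisation; the numbers below are made up for illustration:

```python
def beta_binomial_update(r0, n0, r1, n1):
    """Conjugate update in the slides' parameterisation: prior Beta(r0, n0)
    plus r1 favourable cases in a sample of n1 gives Beta(r0+r1, n0+n1)."""
    return r0 + r1, n0 + n1

# Prior worth "2 successes in 10 trials"; then observe 7 successes in 20 trials
r_post, n_post = beta_binomial_update(r0=2, n0=10, r1=7, n1=20)   # Beta(9, 30)

# In this parameterisation the mean of Beta(r, n) is r/n, so the posterior
# mean blends the prior mean 2/10 = 0.2 with the sample frequency 7/20 = 0.35
posterior_mean = r_post / n_post   # 9/30 = 0.3
```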
More informationThe Bayesian Paradigm
Stat 200 The Bayesian Paradigm Friday March 2nd The Bayesian Paradigm can be seen in some ways as an extra step in the modelling world just as parametric modelling is. We have seen how we could use probabilistic
More informationProbability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides
Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides slide 1 Inference with Bayes rule: Example In a bag there are two envelopes one has a red ball (worth $100) and a black ball one
More informationCISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability. Outline. Terminology and background. Arthur G.
CISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability Arthur G. Werschulz Fordham University Department of Computer and Information Sciences Copyright Arthur G. Werschulz, 2017.
More informationLecture #13 Tuesday, October 4, 2016 Textbook: Sections 7.3, 7.4, 8.1, 8.2, 8.3
STATISTICS 200 Lecture #13 Tuesday, October 4, 2016 Textbook: Sections 7.3, 7.4, 8.1, 8.2, 8.3 Objectives: Identify, and resist the temptation to fall for, the gambler s fallacy Define random variable
More informationToss 1. Fig.1. 2 Heads 2 Tails Heads/Tails (H, H) (T, T) (H, T) Fig.2
1 Basic Probabilities The probabilities that we ll be learning about build from the set theory that we learned last class, only this time, the sets are specifically sets of events. What are events? Roughly,
More informationProbability Theory and Applications
Probability Theory and Applications Videos of the topics covered in this manual are available at the following links: Lesson 4 Probability I http://faculty.citadel.edu/silver/ba205/online course/lesson
More informationQuantitative Understanding in Biology 1.7 Bayesian Methods
Quantitative Understanding in Biology 1.7 Bayesian Methods Jason Banfelder October 25th, 2018 1 Introduction So far, most of the methods we ve looked at fall under the heading of classical, or frequentist
More informationCSC Discrete Math I, Spring Discrete Probability
CSC 125 - Discrete Math I, Spring 2017 Discrete Probability Probability of an Event Pierre-Simon Laplace s classical theory of probability: Definition of terms: An experiment is a procedure that yields
More information1 A simple example. A short introduction to Bayesian statistics, part I Math 217 Probability and Statistics Prof. D.
probabilities, we ll use Bayes formula. We can easily compute the reverse probabilities A short introduction to Bayesian statistics, part I Math 17 Probability and Statistics Prof. D. Joyce, Fall 014 I
More informationProbability 5-4 The Multiplication Rules and Conditional Probability
Outline Lecture 8 5-1 Introduction 5-2 Sample Spaces and 5-3 The Addition Rules for 5-4 The Multiplication Rules and Conditional 5-11 Introduction 5-11 Introduction as a general concept can be defined
More informationLecture notes for probability. Math 124
Lecture notes for probability Math 124 What is probability? Probabilities are ratios, expressed as fractions, decimals, or percents, determined by considering results or outcomes of experiments whose result
More informationBayesian analysis in nuclear physics
Bayesian analysis in nuclear physics Ken Hanson T-16, Nuclear Physics; Theoretical Division Los Alamos National Laboratory Tutorials presented at LANSCE Los Alamos Neutron Scattering Center July 25 August
More informationSome slides from Carlos Guestrin, Luke Zettlemoyer & K Gajos 2
Logistics CSE 446: Point Estimation Winter 2012 PS2 out shortly Dan Weld Some slides from Carlos Guestrin, Luke Zettlemoyer & K Gajos 2 Last Time Random variables, distributions Marginal, joint & conditional
More informationUncertain Knowledge and Bayes Rule. George Konidaris
Uncertain Knowledge and Bayes Rule George Konidaris gdk@cs.brown.edu Fall 2018 Knowledge Logic Logical representations are based on: Facts about the world. Either true or false. We may not know which.
More informationA Event has occurred
Statistics and probability: 1-1 1. Probability Event: a possible outcome or set of possible outcomes of an experiment or observation. Typically denoted by a capital letter: A, B etc. E.g. The result of
More informationPROBABILITY CHAPTER LEARNING OBJECTIVES UNIT OVERVIEW
CHAPTER 16 PROBABILITY LEARNING OBJECTIVES Concept of probability is used in accounting and finance to understand the likelihood of occurrence or non-occurrence of a variable. It helps in developing financial
More informationMAT 271E Probability and Statistics
MAT 271E Probability and Statistics Spring 2011 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 16.30, Wednesday EEB? 10.00 12.00, Wednesday
More informationCS 540: Machine Learning Lecture 2: Review of Probability & Statistics
CS 540: Machine Learning Lecture 2: Review of Probability & Statistics AD January 2008 AD () January 2008 1 / 35 Outline Probability theory (PRML, Section 1.2) Statistics (PRML, Sections 2.1-2.4) AD ()
More informationKDF2C QUANTITATIVE TECHNIQUES FOR BUSINESSDECISION. Unit : I - V
KDF2C QUANTITATIVE TECHNIQUES FOR BUSINESSDECISION Unit : I - V Unit I: Syllabus Probability and its types Theorems on Probability Law Decision Theory Decision Environment Decision Process Decision tree
More informationCHAPTER 15 PROBABILITY Introduction
PROBABILLITY 271 PROBABILITY CHAPTER 15 It is remarkable that a science, which began with the consideration of games of chance, should be elevated to the rank of the most important subject of human knowledge.
More informationComputational Perception. Bayesian Inference
Computational Perception 15-485/785 January 24, 2008 Bayesian Inference The process of probabilistic inference 1. define model of problem 2. derive posterior distributions and estimators 3. estimate parameters
More informationProbability and Probability Distributions. Dr. Mohammed Alahmed
Probability and Probability Distributions 1 Probability and Probability Distributions Usually we want to do more with data than just describing them! We might want to test certain specific inferences about
More informationMachine Learning
Machine Learning 10-701 Tom M. Mitchell Machine Learning Department Carnegie Mellon University January 13, 2011 Today: The Big Picture Overfitting Review: probability Readings: Decision trees, overfiting
More informationSTA Module 4 Probability Concepts. Rev.F08 1
STA 2023 Module 4 Probability Concepts Rev.F08 1 Learning Objectives Upon completing this module, you should be able to: 1. Compute probabilities for experiments having equally likely outcomes. 2. Interpret
More information18.600: Lecture 4 Axioms of probability and inclusion-exclusion
18.600: Lecture 4 Axioms of probability and inclusion-exclusion Scott Sheffield MIT Outline Axioms of probability Consequences of axioms Inclusion exclusion Outline Axioms of probability Consequences of
More informationConditional Probability. CS231 Dianna Xu
Conditional Probability CS231 Dianna Xu 1 Boy or Girl? A couple has two children, one of them is a girl. What is the probability that the other one is also a girl? Assuming 50/50 chances of conceiving
More informationChapter 8 Sequences, Series, and Probability
Chapter 8 Sequences, Series, and Probability Overview 8.1 Sequences and Series 8.2 Arithmetic Sequences and Partial Sums 8.3 Geometric Sequences and Partial Sums 8.5 The Binomial Theorem 8.6 Counting Principles
More informationBayesian Analysis for Natural Language Processing Lecture 2
Bayesian Analysis for Natural Language Processing Lecture 2 Shay Cohen February 4, 2013 Administrativia The class has a mailing list: coms-e6998-11@cs.columbia.edu Need two volunteers for leading a discussion
More informationEvent A: at least one tail observed A:
Chapter 3 Probability 3.1 Events, sample space, and probability Basic definitions: An is an act of observation that leads to a single outcome that cannot be predicted with certainty. A (or simple event)
More informationLecture 10 and 11: Text and Discrete Distributions
Lecture 10 and 11: Text and Discrete Distributions Machine Learning 4F13, Spring 2014 Carl Edward Rasmussen and Zoubin Ghahramani CUED http://mlg.eng.cam.ac.uk/teaching/4f13/ Rasmussen and Ghahramani Lecture
More informationParameter estimation and forecasting. Cristiano Porciani AIfA, Uni-Bonn
Parameter estimation and forecasting Cristiano Porciani AIfA, Uni-Bonn Questions? C. Porciani Estimation & forecasting 2 Temperature fluctuations Variance at multipole l (angle ~180o/l) C. Porciani Estimation
More information