Binomial Data, Axioms of Probability, and Bayes Rule
Prof. Green Intro Stats // Fall 1998

Binomial Data, Axioms of Probability, and Bayes Rule

In this class, we link two sets of topics, sampling and probability. To do so coherently, we must blend ideas from Moore and McCabe's chapters 3, 4, and 5. As you review this material, you may wish to read these chapters in tandem.

Let's start with a simple probability problem. Suppose that state legislators in the US came in two varieties, Democrats and Republicans. Suppose Republicans comprise 40% of all legislators. If we draw at random four members of the (large) population of state legislators, what is the probability that all four will be Republicans?

To answer this question, we reason that for the outcome to obtain, the first draw we make must be a Republican. The probability of this outcome is .40. And the second draw must come up Republican, too, which also has a probability of .40. Since the first and second draws are independent, the probability that both are Republican is (.40)(.40). Following this logic to its conclusion implies that the probability that all four draws are Republican = (.40)^4 = .0256.

Suppose the question were: What is the probability of drawing exactly three Republicans in four draws? One tedious way to answer this question is to list the sample space, that is, the set of all possible outcomes. A more convenient way is the binomial formula:

  Pr(k successes in n draws) = [n! / (k!(n-k)!)] π^k (1-π)^(n-k)

where π represents the true population proportion of Republicans. In this case (see Table C, p. T-7),

  Pr(3 Republicans in 4 draws) = [4! / (3!(4-3)!)] (.4)^3 (1-.4)^(4-3) = (4)(.064)(.6) = .1536

Note that this formula simplifies nicely when we want to know the probability of at least one Republican being selected in four draws. That is one minus the probability of no Republicans being selected:

  Pr(at least one Republican in 4 draws) = 1 - [4! / (0!(4-0)!)] (.4)^0 (1-.4)^(4-0) = 1 - (.6)^4 = .8704

This kind of simplification can be very useful for a large class of problems.
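The binomial arithmetic above can be checked with a short sketch in Python (the helper name `binom_pmf` is mine, not the notes'):

```python
from math import comb

def binom_pmf(k, n, pi):
    """Probability of exactly k successes in n independent draws,
    each with success probability pi (the binomial formula)."""
    return comb(n, k) * pi**k * (1 - pi)**(n - k)

# All four draws Republican: (.40)^4
p_all_four = binom_pmf(4, 4, 0.40)          # 0.0256

# Exactly three Republicans in four draws
p_three = binom_pmf(3, 4, 0.40)             # 0.1536

# At least one Republican: one minus the probability of none
p_at_least_one = 1 - binom_pmf(0, 4, 0.40)  # 1 - (.6)^4 = 0.8704
```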
Suppose we manage an airline that runs 100,000 flights each year. Suppose for the sake of argument that the probability of a crash is independent from flight to flight. Suppose that probability is .00001 (read: one-in-100,000). Contrary to the way laypeople often look at problems such as this, the probability of at least one crash each year is not equal to 1!
  Pr(at least one crash in 100,000 draws) = 1 - (1 - .00001)^100,000 = .6321

[Calculator note: use the y^x button.]

Now suppose that your airline tightens its safety procedures so that the probability of a crash is one-in-a-million per flight. That's more like it!

  Pr(at least one crash in 100,000 draws) = 1 - (1 - .000001)^100,000 = .0952

When working with binomial problems for cases where nπ > 30, assuming that the total population is large or that the cases are drawn with replacement, it is convenient to work with the normal approximation instead of the binomial formula. Let X be the observed number of successes in n draws. Let p be the sample proportion, calculated as X/n. For a population that has a proportion π of successes, the sample proportion p will be distributed normally with mean π and standard deviation equal to:

  Standard deviation of sampling distribution of a sample proportion = √[π(1-π)/n]

Notice that the population proportion is used here (if it is known). Thus, if we draw 100 legislators at random, we would expect to find 40/100 Republicans, and we would expect that across hypothetical replications of this sampling procedure, our sample estimates will vary with a standard deviation of .049. Using the +/- 2 standard deviation rule, we would suppose that 95% of our samples would fall between about 30% and 50% Republican.

When we do not know the population proportion, two strategies are available. One is to make a conservative assumption, namely that π = .50. Note that this value of π makes the numerator of the standard error as large as possible. Thus, the actual level of sampling variability will be no greater than the value we obtain under this assumption. An alternative is to assume, for purposes of calculation, that p = π.
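Both calculations, the at-least-one-crash complement rule and the standard deviation of a sample proportion, can be sketched in a few lines (the function names here are hypothetical helpers, not from the notes):

```python
from math import sqrt

def p_at_least_one(p_event, n):
    """Probability of at least one occurrence across n independent trials."""
    return 1 - (1 - p_event)**n

p_at_least_one(1e-5, 100_000)   # about 0.6321 (one-in-100,000 per flight)
p_at_least_one(1e-6, 100_000)   # about 0.0952 (one-in-a-million per flight)

def se_proportion(pi, n):
    """Standard deviation of the sampling distribution of a sample
    proportion, when the population proportion pi is known."""
    return sqrt(pi * (1 - pi) / n)

se = se_proportion(0.40, 100)          # about 0.049
lo, hi = 0.40 - 2*se, 0.40 + 2*se      # roughly 30% to 50% Republican
```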
In keeping with the discussion last time regarding two-sample comparison, we can write down a formula for the standard error of the difference between two proportions (p1 - p2):

  SE of difference in proportions = √[π1(1-π1)/n1 + π2(1-π2)/n2]

For example, the Outward Bound experiment showed that 39.5% of 86 all-white group subjects scored a 14 on the tolerance scale, as compared to 52.8% of the 178 integrated group subjects. The difference is therefore 13.3 percentage points, with a standard error of 6.5%. Note that the +/- 2 SE rule creates a 95% confidence interval [.3, 26.3] (in percentage points) that just barely excludes zero.
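The Outward Bound numbers can be reproduced directly from the formula (a sketch; `se_diff` is my name for the helper, and the sample proportions are plugged in for π1 and π2):

```python
from math import sqrt

def se_diff(p1, n1, p2, n2):
    """Standard error of the difference between two independent sample
    proportions, using the sample proportions in place of pi1 and pi2."""
    return sqrt(p1*(1 - p1)/n1 + p2*(1 - p2)/n2)

se = se_diff(0.395, 86, 0.528, 178)   # about 0.065
diff = 0.528 - 0.395                  # 0.133, i.e. 13.3 percentage points
ci = (diff - 2*se, diff + 2*se)       # roughly (.003, .263): excludes zero
```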
Unconditional and Conditional Probabilities

Central to the study of probability is the conceptual distinction between conditional and unconditional probabilities. Consider, by way of illustration, the following crosstabulation:

                    Study hard   Goof off   Total
  Get good grades       75          100       175
  Get bad grades        25          100       125
  Total                100          200       300

The probability that one gets good grades is 175/300 = .583. Call this Pr(G). The probability that one gets good grades given that one studies hard is 75/100 = .75. Call this Pr(G|S). If we want to know whether studying pays off, we would compare Pr(G|S) to Pr(G|~S), the probability that one gets good grades given that one does not study hard. That probability is .50. It should be clear that Pr(G) is not equal to Pr(G|S) unless getting good grades and studying are statistically independent events.

A different set of questions is answered if we percentage the table across the rows. The probability that one studies hard is 100/300 = .333 = Pr(S). The probability that one studies hard given that one gets good grades is 75/175 = .429. The fact that most people who get good grades goof off is quite compatible with the observation made above that those who study have a higher probability of getting good grades. People with a weak understanding of statistics often run these two points together. They should have studied harder.

Note the following axioms of probability:

1. Pr(G) + Pr(~G) = 1. In this example, 175/300 + 125/300 = 1.
2. Pr(G and S) = Pr(G)Pr(S|G) = Pr(S)Pr(G|S). Here, 75/300 = (175/300)(75/175) = (100/300)(75/100).
3. Pr(S) = Pr(S|G)Pr(G) + Pr(S|~G)Pr(~G). Here, (100/300) = (75/175)(175/300) + (25/125)(125/300).
4. Pr(G|S) = Pr(G)Pr(S|G) / Pr(S). Obtained by rearranging terms in axiom 2.

Putting lines 3 and 4 together generates Bayes rule, an alternative expression for Pr(G|S)...
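The axioms can be verified mechanically from the cell counts quoted in the text (75, 25, 100, 100; a sketch, with variable names of my own choosing):

```python
# Cell counts from the study-hard / goof-off crosstabulation
good_S, good_notS = 75, 100     # get good grades
bad_S,  bad_notS  = 25, 100     # get bad grades
N = good_S + good_notS + bad_S + bad_notS    # 300

P_G = (good_S + good_notS) / N               # Pr(G)   = .583
P_S = (good_S + bad_S) / N                   # Pr(S)   = .333
P_G_given_S = good_S / (good_S + bad_S)      # Pr(G|S) = .75
P_S_given_G = good_S / (good_S + good_notS)  # Pr(S|G) = .429
P_GS = good_S / N                            # Pr(G and S)

# Axiom 2: Pr(G and S) = Pr(G)Pr(S|G) = Pr(S)Pr(G|S)
assert abs(P_GS - P_G * P_S_given_G) < 1e-12
assert abs(P_GS - P_S * P_G_given_S) < 1e-12

# Axiom 4 (rearranging axiom 2): Pr(G|S) = Pr(G)Pr(S|G) / Pr(S)
assert abs(P_G_given_S - P_G * P_S_given_G / P_S) < 1e-12
```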
Bayes Rule

Terminology and Assumptions:

  Pr(H)    = probability that a hypothesis is true (a prior)
  Pr(E)    = probability that one observes a given form of evidence
  Pr(H|E)  = probability that the hypothesis is true given that one observed this evidence (posterior)
  Pr(E|H)  = probability that this evidence is observed given that the hypothesis is true (likelihood)
  Pr(E|~H) = probability that this evidence is observed given that the hypothesis is false (likelihood)
  Pr(~H)   = 1 - Pr(H) = probability that the hypothesis is false

Bayes Rule:

  Pr(H|E) = Pr(E|H)Pr(H) / [Pr(E|H)Pr(H) + Pr(E|~H)Pr(~H)]

Consequences of observing a study that supports the deterrent effect of the death penalty (E), for two observers with different priors (H):

OBSERVER 1
Input:
  Pr(H)    = 0.050   prior probability that death penalty deters
  Pr(E|H)  = 0.800   prob of coming up with evidence of deterrent effect, given deterrence
  Pr(E|~H) = 0.300   prob of coming up with evidence of deterrence, given no effect
Output:
  Pr(H|E)  = 0.123   posterior prob that death penalty deters

OBSERVER 2
Input:
  Pr(H)    = 0.950   prior probability that death penalty deters
  Pr(E|H)  = 0.800   prob of coming up with evidence of deterrent effect, given deterrence
  Pr(E|~H) = 0.300   prob of coming up with evidence of deterrence, given no effect
Output:
  Pr(H|E)  = 0.981   posterior prob that death penalty deters
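The two observers' posteriors follow mechanically from Bayes rule; a sketch (the priors .05 and .95 are inferred from the stated posteriors of .123 and .981, and `bayes_update` is my name for the helper):

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior Pr(H|E) via Bayes rule:
    Pr(E|H)Pr(H) / [Pr(E|H)Pr(H) + Pr(E|~H)Pr(~H)]."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# Same likelihoods (.8 and .3), very different priors
post1 = bayes_update(0.05, 0.8, 0.3)   # about 0.123 (Observer 1)
post2 = bayes_update(0.95, 0.8, 0.3)   # about 0.981 (Observer 2)
```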
Notice the following features of the foregoing illustration:

- If Pr(E|H) > Pr(E|~H) for both people, then a given piece of evidence will push them both in the same direction; that is, both become more convinced of the hypothesis. In this case, for two people with the same likelihoods, the percentage-point gap between their posteriors (here .858) will be smaller than the initial gap between their priors (here .900).

- If a person believes that Pr(E|H) = Pr(E|~H), then Pr(H) = Pr(H|E); new information has no effect on someone who believes this evidence to be nondiagnostic. This fact follows directly from Bayes rule.

- Bayes rule has nothing to say about where priors come from; it focuses solely on the process by which these priors are updated in light of new information.

Note also that Bayes rule can be applied to a succession of learning experiences. Consider, for example, what happens when Observer 1 is exposed to another piece of information. The prior is now .123. Plugging the numbers into the formula gives a new posterior of .272. If the process is repeated again, this time with a prior of .272, the posterior is .499. And so on, until eventually the two observers' beliefs converge.

Relevance? To the real world? A normative or descriptive model? Can it be both?

Prospective Economic Evaluations of the Parties, by Party Identification in the Initial Wave of Each Panel Study
(Entries are percentage of each partisan group saying that the Democratic Party does a better job of handling the economy.)

  Panel Study, Party Identification in 1990: N of cases (405) (319) (261)
  Panel Study, Party Identification in 1992: N of cases (201) (233) (179)
  Panel Study, Party Identification in 1994: N of cases (232) (215) (182)

[The table's percentage entries were not preserved in this transcription.]
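The succession of learning experiences described above (.123, then .272, then .499) can be verified with a short loop (a sketch; the likelihoods .8 and .3 are taken from the Observer 1 example):

```python
# Repeated Bayesian updating: each posterior becomes the prior for the
# next piece of evidence.
p_e_h, p_e_not_h = 0.8, 0.3

posterior = 0.123               # Observer 1 after the first study
chain = []
for _ in range(2):
    posterior = (p_e_h * posterior) / (
        p_e_h * posterior + p_e_not_h * (1 - posterior))
    chain.append(round(posterior, 3))

# chain is now [0.272, 0.499], matching the arithmetic in the notes
```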
Fall 2015 Population versus Sample Population: data for every possible relevant case Sample: a subset of cases that is drawn from an underlying population Inference Parameters and Statistics A parameter
More informationA Brief Introduction to Bayesian Statistics
A Brief Introduction to Bayesian Statistics Outline According to its designers, Apollo was three nines reliable: The odds of an astronaut s survival were 0.999, or only one chance in a thousand of being
More informationAnswering. Question Answering: Result. Application of the Theorem Prover: Question. Overview. Question Answering: Example.
! &0/ $% &) &% &0/ $% &) &% 5 % 4! pplication of the Theorem rover: Question nswering iven a database of facts (ground instances) and axioms we can pose questions in predicate calculus and answer them
More informationUnit 19 Formulating Hypotheses and Making Decisions
Unit 19 Formulating Hypotheses and Making Decisions Objectives: To formulate a null hypothesis and an alternative hypothesis, and to choose a significance level To identify the Type I error and the Type
More information18.05 Final Exam. Good luck! Name. No calculators. Number of problems 16 concept questions, 16 problems, 21 pages
Name No calculators. 18.05 Final Exam Number of problems 16 concept questions, 16 problems, 21 pages Extra paper If you need more space we will provide some blank paper. Indicate clearly that your solution
More informationConsider an experiment that may have different outcomes. We are interested to know what is the probability of a particular set of outcomes.
CMSC 310 Artificial Intelligence Probabilistic Reasoning and Bayesian Belief Networks Probabilities, Random Variables, Probability Distribution, Conditional Probability, Joint Distributions, Bayes Theorem
More informationOne-sample categorical data: approximate inference
One-sample categorical data: approximate inference Patrick Breheny October 6 Patrick Breheny Biostatistical Methods I (BIOS 5710) 1/25 Introduction It is relatively easy to think about the distribution
More informationCOMP 551 Applied Machine Learning Lecture 19: Bayesian Inference
COMP 551 Applied Machine Learning Lecture 19: Bayesian Inference Associate Instructor: (herke.vanhoof@mcgill.ca) Class web page: www.cs.mcgill.ca/~jpineau/comp551 Unless otherwise noted, all material posted
More informationThe Bayesian Paradigm
Stat 200 The Bayesian Paradigm Friday March 2nd The Bayesian Paradigm can be seen in some ways as an extra step in the modelling world just as parametric modelling is. We have seen how we could use probabilistic
More informationOrdinary Least Squares Regression Explained: Vartanian
Ordinary Least Squares Regression Explained: Vartanian When to Use Ordinary Least Squares Regression Analysis A. Variable types. When you have an interval/ratio scale dependent variable.. When your independent
More informationNaive Bayes classification
Naive Bayes classification Christos Dimitrakakis December 4, 2015 1 Introduction One of the most important methods in machine learning and statistics is that of Bayesian inference. This is the most fundamental
More informationChapter Three. Hypothesis Testing
3.1 Introduction The final phase of analyzing data is to make a decision concerning a set of choices or options. Should I invest in stocks or bonds? Should a new product be marketed? Are my products being
More informationRefresher on Probability Theory
Much of this material is adapted from Chapters 2 and 3 of Darwiche s book January 16, 2014 1 Preliminaries 2 Degrees of Belief 3 Independence 4 Other Important Properties 5 Wrap-up Primitives The following
More informationWith our knowledge of interval estimation, we can consider hypothesis tests
Chapter 10 Hypothesis Testing 10.1 Testing Hypotheses With our knowledge of interval estimation, we can consider hypothesis tests An Example of an Hypothesis Test: Statisticians at Employment and Immigration
More information