Probability calculus and statistics
A Probability calculus and statistics

Risk Analysis: Assessing Uncertainties beyond Expected Values and Probabilities, T. Aven. 2008 John Wiley & Sons, Ltd.

A.1 The meaning of a probability

A probability can be interpreted in different ways. In this book, we understand a probability to be an expression of how likely it is that an event will occur. Let us look at an example. Let A represent the event that a patient develops an illness, S, over the next year, when the patient shows symptoms, V. We do not know whether A will occur or not; there is uncertainty associated with the outcome. However, we can have an opinion on how likely it is that the patient will develop the illness. Statistics show that about 5 out of 100 patients develop this illness over the course of 1 year if they show the symptoms V. Is it then reasonable to say that the probability that A will occur is equal to 5%? Yes, if this is all the information we have available, then it is reasonable to say that the probability that the patient will become ill next year is 0.05 if the symptoms V are present.

If we have other relevant information about the patient, our probability can be entirely different. Imagine, for example, that the particular patient also has an illness B and that his/her general condition is somewhat weakened. Then it would be far more likely that the patient will develop illness S. The physician who analyses the patient may, perhaps, assign a probability of 75% for this case: for four such cases that are relatively similar, he/she predicts that three out of the four will develop the illness.

To make this a bit more formalised, we let P(A|K) represent our probability that event A will occur, based on the background knowledge K. Often, we simplify the notation and write only P(A). It is then implicit that the probability is based on the background knowledge K. If we say that the probability is 75%, then we
mean that it is just as likely for event A to occur as it is to draw a red ball out of an urn that contains three red balls and one white ball. The uncertainty is the same. We see that a probability can also be understood as an expression of the uncertainty about what the outcome will be. It is, however, easier to think of probability as an expression of how likely it is that the event will occur.

Based on this line of thought, a correct or true probability does not exist. Even if one throws a die, there is no correct probability. This may seem strange, but one must differentiate between proportions, observed or imaginary, and probability in the meaning in which we use the term here. Imagine throwing a die a great many times, say 6000 times. We would then obtain (if the die is fair) about 1000 throws showing a 1, about 1000 showing a 2, and so on. In the population of 6000 throws, the distribution will be rather close to 1/6 for each of the numbers. But imagine that we did an infinite number of trials. Then the theory says that we would obtain exactly 1/6. However, these are proportions, observed or resulting from imaginary experiments. They are not probabilities in our way of thinking.

A probability applies to a defined event that we do not know will occur or not, and which is normally associated with the future. We will throw a die. The die can show a 4, or it can show a different number. Prior to throwing the die, one can express one's belief that the die will show a 4. As a rule, this probability is set at 1/6, because it will yield the best prediction of the number of fours if we make many throws. Using a fair die, we expect four to be the outcome in about 1/6 of cases in the long run. However, there is nothing automatic in our assignment of the probability 1/6. We have to make a choice.
We are the ones who must express how likely it is to obtain a four, given our background knowledge. If we know that the die is fair, then 1/6 is the natural choice. However, it is possible that one is convinced that the die is not fair, and that it will give many more fours than usual. Then we may, for example, assign a probability P(four) = 0.2. No one can say that this is wrong, even though one can check the proportion of fours for this die in retrospect and verify whether it is fair. When one originally assigned the probability, the background knowledge was different. Probability must always be seen in relation to the background knowledge.

Classical statistics builds on an entirely different understanding of what probability is. Here, a probability is defined as the limit of a relative frequency, meaning the proportion given above when the number of trials becomes infinitely large. In this manner, true probabilities are established. These are then estimated using experiments and analyses. The reader is referred to Aven (2003) for a discussion of this approach and the problems associated with it (see also Section 13.7).

A.2 Probability calculus

The rules of probability are widely known. We will not repeat them all here, but will only briefly summarise some of the most important ones. The reader is referred to textbooks on probability theory.
Probabilities are numbers between 0 and 1. If the event A cannot occur, then P(A) = 0, and if A will occur for certain, then P(A) = 1. If the probability of an event is p, the probability that this event does not occur is 1 − p. If we have two events, A and B, then the following formulae hold:

P(A or B) = P(A) + P(B) − P(A and B)
P(A and B) = P(A) P(B|A).     (A.1)

Here P(B|A) represents our probability for B when it is known that A has occurred. If A and B are independent, then P(B|A) = P(B); in other words, the fact that we know that A has occurred does not affect our probability that B will occur.

Suppose that we want to express the probability that two persons will both develop the illness S, if they both have the symptoms V. In other words, we would like to determine P(A1 and A2|K), where A1 represents patient 1 becoming ill and A2 represents patient 2 becoming ill. We base our analysis on the assignments P(A1|K) = P(A2|K) = 0.05. Is then P(A1 and A2|K) = P(A1|K) P(A2|K) = 0.05 · 0.05 = 0.25%? The answer is yes if A1 and A2 are independent. But are they independent? If it were known to you that patient 1 had become ill, would it not alter your probability that patient 2 would become ill? Not necessarily; it depends on what your background knowledge is: what is known to us initially, and whether there is a coupling between these patients in some way or another. For example, if they are both in a weakened physical condition or are related, then it is clear that we know more about patient 2 if we find out that patient 1 has become ill. If our background knowledge is very limited, knowledge that patient 1 has become ill will provide information to us about patient 2. In practice, however, we have so much knowledge about this illness that we can ignore the information associated with A1. We therefore obtain independence, since P(A2|K, A1) = P(A2|K).
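The addition and multiplication rules above can be sketched numerically. The 0.05 assignments are the ones from the two-patient example; treating A1 and A2 as independent given K is itself a modelling judgement, as the text stresses:

```python
# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
# Multiplication rule under independence: P(A and B) = P(A) * P(B)
p_a1 = 0.05  # P(A1 | K), patient 1 becomes ill
p_a2 = 0.05  # P(A2 | K), patient 2 becomes ill

# If A1 and A2 are judged independent given K:
p_both = p_a1 * p_a2             # P(A1 and A2 | K): 0.0025, i.e. 0.25%
p_either = p_a1 + p_a2 - p_both  # P(A1 or A2 | K): 0.0975

print(p_both, p_either)
```

If the patients are coupled, the product must instead use the conditional probability P(A2 | K, A1), which would be larger than 0.05.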
If there is a coupling between the patients, as illustrated above, then P(A2|K, A1) will be different from P(A2|K). Thus we have a dependence between the events A1 and A2.

A conditional probability, P(A|B), is defined by the formula

P(A|B) = P(A and B)/P(B).

We see that this formula is simply a rewriting of formula (A.1). By substituting P(A and B) with P(A) P(B|A) (again we use formula (A.1)), the well-known Bayes formula is established:

P(A|B) = P(A) P(B|A)/P(B).

We will show the application of this formula in Section A.4.
A.3 Probability distributions: expected value

Let X denote the number of persons who become ill in the course of 1 year for a group of four persons. Assume that you have established the following probabilities that X will take the value i, i = 0, 1, 2, 3, 4:

i           0     1     2     3     4
P(X = i)    0.05  0.40  0.40  0.10  0.05

The expectation, EX, is defined by

EX = 0 · 0.05 + 1 · 0.40 + 2 · 0.40 + 3 · 0.10 + 4 · 0.05 = 1.7.

The expected value is the centre of gravity of the distribution of X. If a lever is set up over the point 1.7, then the masses 0.05, 0.40, ..., 0.05 over the points 0, 1, ..., 4 will be perfectly balanced. If X can assume one of the values x1, x2, ..., one can find EX by multiplying x1 with the corresponding probability P1, likewise multiplying x2 with probability P2, etc., and summing over all values xj, i.e.

EX = x1 P1 + x2 P2 + ···.

If X denotes the number of events of a certain type, and this number is either 0 or 1, then the associated probability equals the expected value. This is evident from the formula for the expected value, as in this case EX = 1 · P(the event will occur) + 0 · P(the event will not occur) = P(the event will occur). In many situations, we are concerned about rare events, for which we can, for all practical purposes, disregard the possibility of two or more such events occurring during the time interval under consideration. The expected number of events will then be approximately equal to the probability that the event will occur once.

In applications, we often use the term frequency for the expected value of the number of events. We speak about the frequency of gas leakages, for example, when we actually mean the expected value. We can also regard the frequency as an observation, or prediction, of the number of events during the course of a certain period of time. If we say, for example, that the frequency is 2 per year, we have observed, or we predict, two events per year on average.
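The expectation (and the spread around it) can be checked numerically. The full set of probabilities used below, (0.05, 0.40, 0.40, 0.10, 0.05), is an assumption: it is the distribution consistent with the stated masses at 0 and 4, EX = 1.7, and the 90% prediction interval [1, 3]:

```python
# Example distribution for X, the number of persons (out of four)
# who become ill during the year.
probs = {0: 0.05, 1: 0.40, 2: 0.40, 3: 0.10, 4: 0.05}

# Expectation: EX = sum over i of i * P(X = i)
ex = sum(i * p for i, p in probs.items())

# Variance = E[(X - EX)^2]; standard deviation = its square root
var = sum((i - ex) ** 2 * p for i, p in probs.items())
sd = var ** 0.5

# Probability of the interval [1, 3]: P(1 <= X <= 3)
p_interval = probs[1] + probs[2] + probs[3]

print(round(ex, 2), round(var, 2), round(p_interval, 2))
```

With these values, EX comes out as 1.7 and P(1 ≤ X ≤ 3) as 0.90, matching the prediction interval discussed next.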
The expectation constitutes the centre of gravity of the distribution, as mentioned above, and we see from the example distribution that the actual outcome can be far from the expected value. To describe the uncertainties, a prediction interval is often used. A 90% prediction interval for X is an interval [a, b], where a and b are constants, such that P(a ≤ X ≤ b) = 0.90. In cases where the probabilities cannot be determined such that the interval has probability exactly 0.90, the interval boundaries are specified such that the probability is larger than, and as close as possible to, 0.90. In our example, we see that [1, 3] is a 90% prediction interval. We are 90% certain that X will assume one of the values 1, 2 or 3.
The variance and standard deviation are used to express the spread around the expected value. The variance of X, Var X, is defined as the expectation of (X − EX)^2, while the standard deviation is defined as the square root of the variance.

A.3.1 Binomial distribution

Let us assume that we have a large population, I, of people (for example, patients) and that we are studying the proportion q of them that become ill over the course of the next year. Let us assume further that we have another similar population, II, that is composed of n persons. Let X represent the number that develop the illness in this population. What is then our probability that all of those in population II will develop the illness, i.e. P(X = n)? Alternatively, we may think of the populations as comprising technical units, for example machines. In the Bayesian literature, it is common to refer to q as a chance (Singpurwalla 2006).

To answer this question, first assume that q is known. You know that the proportion within the larger population I is 0.10, say. Then the problem boils down to determining P(X = n|q). If we do not have any other information, it would be natural to say that P(X = n|q) = q^n. We have n independent trials, and our probability of success (illness) is q in each of these trials. We see that when q is known, X has a so-called binomial probability distribution, i.e.

P(X = i|q) = [n!/(i!(n − i)!)] q^i (1 − q)^(n−i),   i = 0, 1, 2, ..., n,     (A.2)

where i! = 1 · 2 · 3 ··· i. The reader is referred to a textbook on probability calculus if understanding this is difficult. When q is small and n is large, we can approximate the binomial probability distribution by the Poisson distribution:

P(X = i|r) = (r^i/i!) e^(−r),   i = 0, 1, 2, ...,

where r = nq. We know, for example, that (1 − q)^n is approximately equal to e^(−r). This can be checked using a pocket calculator. We refer to q and r as parameters of the probability distributions.
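The Poisson approximation can be checked numerically instead of on a pocket calculator; n = 100 and q = 0.02 below are illustrative values chosen for the sketch, not values from the text:

```python
from math import comb, exp, factorial

def binom_pmf(i, n, q):
    """Binomial probability, formula (A.2)."""
    return comb(n, i) * q**i * (1 - q)**(n - i)

def poisson_pmf(i, r):
    """Poisson probability with parameter r = n*q."""
    return r**i * exp(-r) / factorial(i)

n, q = 100, 0.02  # illustrative: q small, n large
r = n * q         # r = 2

# The two columns of probabilities should nearly agree.
for i in range(5):
    print(i, round(binom_pmf(i, n, q), 4), round(poisson_pmf(i, r), 4))
```

In particular, binom_pmf(0, n, q) = (1 − q)^n ≈ e^(−r) = poisson_pmf(0, r), which is the approximation mentioned above.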
By varying the parameters, we obtain a class of distributions. What do we do if q is unknown? Let us imagine that q can be 0.1, 0.2, 0.3, 0.4 or 0.5. We then use the total probability rule and obtain:

P(X = i) = P(X = i|q = 0.1) P(q = 0.1) + P(X = i|q = 0.2) P(q = 0.2) + ··· + P(X = i|q = 0.5) P(q = 0.5).

By assigning values for P(q = 0.1), P(q = 0.2), etc., we obtain the probability distribution for X, i.e. P(X = i) for various values of i.
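A sketch of this total-probability computation, using n = 10 and the prior weights assigned in Section A.4 (5%, 20%, 50%, 20%, 5%):

```python
from math import comb

def binom_pmf(i, n, q):
    """Binomial probability, formula (A.2)."""
    return comb(n, i) * q**i * (1 - q)**(n - i)

# Prior weights over the possible values of q (as in Section A.4)
prior = {0.1: 0.05, 0.2: 0.20, 0.3: 0.50, 0.4: 0.20, 0.5: 0.05}
n = 10

# Total probability rule: P(X = i) = sum over q of P(X = i | q) P(q)
p_x = {i: sum(binom_pmf(i, n, q) * w for q, w in prior.items())
       for i in range(n + 1)}

print({i: round(p, 3) for i, p in p_x.items()})
```

The resulting unconditional probabilities P(X = i) sum to 1, as they must.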
A.4 Statistics (Bayesian statistics)

In statistics, the focus is often on properties of large populations, for example q in the above example, i.e. the proportion of the large population I that will develop the illness in question. The problem is how to express our knowledge of q based on the available data X, i.e. to establish a probability distribution for q when we observe X. We call this distribution the posterior probability distribution of q. We begin with the so-called prior distribution, before we perform the measurements X. Here, let us suppose that we only allow q to assume one of the following five values: 0.1, 0.2, 0.3, 0.4 or 0.5. We understand these values such that, for example for 0.5, this means that q lies in the interval [0.45, 0.55). Based on the available knowledge, we assign a prior probability distribution for the proportion q:

q′           0.1   0.2   0.3   0.4   0.5
P(q = q′)    0.05  0.20  0.50  0.20  0.05

This means that we have the greatest confidence that the proportion q is 0.3 (50%), then 0.2 and 0.4 (20% each), and, least likely, 0.1 and 0.5 (5% each). Suppose now that we observe 10 persons and that among these persons there is only 1 that has the illness. How will we then express our uncertainty regarding q? We use Bayes formula and establish the posterior distribution of q. Bayes formula states that P(A|B) = P(A) P(B|A)/P(B) for the events A and B. If we apply this formula, we see that the probability that the proportion will be equal to q′ when we have observed that 1 out of 10 has the illness is given by

P(q = q′|X = 1) = c f(1|q′) P(q = q′),     (A.3)

where c is a constant such that the sum over the q′s is equal to 1, and f is given by f(i|q′) = P(X = i|q = q′), cf. formula (A.2); the quantity X is binomially distributed with parameters 10 and q′ when q = q′ is given. Using formula (A.3), we find the following posterior distribution for q:

q′               0.1   0.2   0.3   0.4   0.5
P(q = q′|X = 1)  0.14  0.38  0.43  0.06  0.00

We see that the probability mass has shifted to the left, towards smaller values.
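A minimal sketch of the posterior computation via formula (A.3), using the prior and the observation (1 ill out of 10) from the text:

```python
from math import comb

def binom_pmf(i, n, q):
    """Binomial probability, formula (A.2)."""
    return comb(n, i) * q**i * (1 - q)**(n - i)

# Prior distribution over the possible proportions q
prior = {0.1: 0.05, 0.2: 0.20, 0.3: 0.50, 0.4: 0.20, 0.5: 0.05}

# Observation: X = 1 ill person out of n = 10
unnorm = {q: binom_pmf(1, 10, q) * w for q, w in prior.items()}

# The constant c normalises the posterior so that it sums to 1
c = 1 / sum(unnorm.values())
posterior = {q: c * u for q, u in unnorm.items()}

for q, p in sorted(posterior.items()):
    print(q, round(p, 2))
```

The mass moves towards smaller values of q, as the text describes: the probability of q = 0.1 rises from the prior 0.05 to about 0.14, while that of q = 0.4 drops from 0.20 to about 0.06.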
This was as expected, since we observed that only 1 out of 10 became ill, while at the start we expected the proportion q to be closer to 30%. If we had a larger observation set, the data would have dominated the distribution to an even greater degree.
More informationProbability Year 9. Terminology
Probability Year 9 Terminology Probability measures the chance something happens. Formally, we say it measures how likely is the outcome of an event. We write P(result) as a shorthand. An event is some
More informationChapter 4: An Introduction to Probability and Statistics
Chapter 4: An Introduction to Probability and Statistics 4. Probability The simplest kinds of probabilities to understand are reflected in everyday ideas like these: (i) if you toss a coin, the probability
More informationRandom variables, Expectation, Mean and Variance. Slides are adapted from STAT414 course at PennState
Random variables, Expectation, Mean and Variance Slides are adapted from STAT414 course at PennState https://onlinecourses.science.psu.edu/stat414/ Random variable Definition. Given a random experiment
More informationReview of Probability. CS1538: Introduction to Simulations
Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed
More informationk P (X = k)
Math 224 Spring 208 Homework Drew Armstrong. Suppose that a fair coin is flipped 6 times in sequence and let X be the number of heads that show up. Draw Pascal s triangle down to the sixth row (recall
More informationProbabilistic models
Probabilistic models Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became
More informationTake the measurement of a person's height as an example. Assuming that her height has been determined to be 5' 8", how accurate is our result?
Error Analysis Introduction The knowledge we have of the physical world is obtained by doing experiments and making measurements. It is important to understand how to express such data and how to analyze
More informationSTAT2201. Analysis of Engineering & Scientific Data. Unit 3
STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random
More informationp. 4-1 Random Variables
Random Variables A Motivating Example Experiment: Sample k students without replacement from the population of all n students (labeled as 1, 2,, n, respectively) in our class. = {all combinations} = {{i
More informationProbability Theory and Random Variables
Probability Theory and Random Variables One of the most noticeable aspects of many computer science related phenomena is the lack of certainty. When a job is submitted to a batch oriented computer system,
More informationP (E) = P (A 1 )P (A 2 )... P (A n ).
Lecture 9: Conditional probability II: breaking complex events into smaller events, methods to solve probability problems, Bayes rule, law of total probability, Bayes theorem Discrete Structures II (Summer
More informationProbability Experiments, Trials, Outcomes, Sample Spaces Example 1 Example 2
Probability Probability is the study of uncertain events or outcomes. Games of chance that involve rolling dice or dealing cards are one obvious area of application. However, probability models underlie
More informationthe time it takes until a radioactive substance undergoes a decay
1 Probabilities 1.1 Experiments with randomness Wewillusethetermexperimentinaverygeneralwaytorefertosomeprocess that produces a random outcome. Examples: (Ask class for some first) Here are some discrete
More informationExpectation MATH Expectation. Benjamin V.C. Collins, James A. Swenson MATH 2730
MATH 2730 Expectation Benjamin V.C. Collins James A. Swenson Average value Expectation Definition If (S, P) is a sample space, then any function with domain S is called a random variable. Idea Pick a real-valued
More informationContents. Decision Making under Uncertainty 1. Meanings of uncertainty. Classical interpretation
Contents Decision Making under Uncertainty 1 elearning resources Prof. Ahti Salo Helsinki University of Technology http://www.dm.hut.fi Meanings of uncertainty Interpretations of probability Biases in
More informationLectures on Statistics. William G. Faris
Lectures on Statistics William G. Faris December 1, 2003 ii Contents 1 Expectation 1 1.1 Random variables and expectation................. 1 1.2 The sample mean........................... 3 1.3 The sample
More informationGrundlagen der Künstlichen Intelligenz
Grundlagen der Künstlichen Intelligenz Uncertainty & Probabilities & Bandits Daniel Hennes 16.11.2017 (WS 2017/18) University Stuttgart - IPVS - Machine Learning & Robotics 1 Today Uncertainty Probability
More information2. AXIOMATIC PROBABILITY
IA Probability Lent Term 2. AXIOMATIC PROBABILITY 2. The axioms The formulation for classical probability in which all outcomes or points in the sample space are equally likely is too restrictive to develop
More informationCOMP61011 : Machine Learning. Probabilis*c Models + Bayes Theorem
COMP61011 : Machine Learning Probabilis*c Models + Bayes Theorem Probabilis*c Models - one of the most active areas of ML research in last 15 years - foundation of numerous new technologies - enables decision-making
More informationCSE 312, 2011 Winter, W.L.Ruzzo. 6. random variables
CSE 312, 2011 Winter, W.L.Ruzzo 6. random variables random variables 23 numbered balls Ross 4.1 ex 1b 24 first head 25 probability mass functions 26 head count Let X be the number of heads observed in
More informationProbability (10A) Young Won Lim 6/12/17
Probability (10A) Copyright (c) 2017 Young W. Lim. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later
More informationQuestion Bank In Mathematics Class IX (Term II)
Question Bank In Mathematics Class IX (Term II) PROBABILITY A. SUMMATIVE ASSESSMENT. PROBABILITY AN EXPERIMENTAL APPROACH. The science which measures the degree of uncertainty is called probability.. In
More informationAn introduction to biostatistics: part 1
An introduction to biostatistics: part 1 Cavan Reilly September 6, 2017 Table of contents Introduction to data analysis Uncertainty Probability Conditional probability Random variables Discrete random
More informationTopic 2: Review of Probability Theory
CS 8850: Advanced Machine Learning Fall 2017 Topic 2: Review of Probability Theory Instructor: Daniel L. Pimentel-Alarcón c Copyright 2017 2.1 Why Probability? Many (if not all) applications of machine
More informationStatistics for Economists. Lectures 3 & 4
Statistics for Economists Lectures 3 & 4 Asrat Temesgen Stockholm University 1 CHAPTER 2- Discrete Distributions 2.1. Random variables of the Discrete Type Definition 2.1.1: Given a random experiment with
More informationProbability Year 10. Terminology
Probability Year 10 Terminology Probability measures the chance something happens. Formally, we say it measures how likely is the outcome of an event. We write P(result) as a shorthand. An event is some
More informationWith Question/Answer Animations. Chapter 7
With Question/Answer Animations Chapter 7 Chapter Summary Introduction to Discrete Probability Probability Theory Bayes Theorem Section 7.1 Section Summary Finite Probability Probabilities of Complements
More informationReview of Basic Probability
Review of Basic Probability Erik G. Learned-Miller Department of Computer Science University of Massachusetts, Amherst Amherst, MA 01003 September 16, 2009 Abstract This document reviews basic discrete
More informationTwo-sample Categorical data: Testing
Two-sample Categorical data: Testing Patrick Breheny October 29 Patrick Breheny Biostatistical Methods I (BIOS 5710) 1/22 Lister s experiment Introduction In the 1860s, Joseph Lister conducted a landmark
More informationChapter 2.5 Random Variables and Probability The Modern View (cont.)
Chapter 2.5 Random Variables and Probability The Modern View (cont.) I. Statistical Independence A crucially important idea in probability and statistics is the concept of statistical independence. Suppose
More informationDiscrete Probability and State Estimation
6.01, Fall Semester, 2007 Lecture 12 Notes 1 MASSACHVSETTS INSTITVTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science 6.01 Introduction to EECS I Fall Semester, 2007 Lecture 12 Notes
More informationProbability- describes the pattern of chance outcomes
Chapter 6 Probability the study of randomness Probability- describes the pattern of chance outcomes Chance behavior is unpredictable in the short run, but has a regular and predictable pattern in the long
More information