Introduction to Probability
Salvatore Pace
September 2, 2018

1 Introduction

In a frequentist interpretation of probability, a probability measure P(A) says that if I do something N times, I should see event A happen P(A)·N times. This intuition is not wrong, and everyday life tells us that as N → ∞, the frequentist interpretation becomes correct. For example, flip a fair coin and count how many times you get heads and tails. The more you flip the coin, the more the counts of heads and tails will converge to similar numbers. It was Andrey Kolmogorov, a 20th century Soviet mathematician, who first successfully quantified this intuition with the following probability axioms:

1. The probability of an event must be a non-negative real number (P(A) ≥ 0).

2. The sum of the probabilities of all possible events is one (P(Ω) = Σ_{i=1}^N P(A_i) = 1, where Ω = {A_1, A_2, ..., A_N} is the set of all possible events).

3. The probability of the sum of two mutually exclusive events (two events that cannot happen at the same time) is equal to the sum of their individual probabilities (P(A_1 + A_2) = P(A_1) + P(A_2) iff events A_1 and A_2 are mutually exclusive).

That is all there is. Everything in probability can technically be derived from these three axioms. To state one of the important results that follow: if two events A and B are independent (event A happening does not affect the probability that event B happens), then the probability that both events happen is given by

P(A and B) = P(A) P(B).    (1)

That is, if two events are independent, the probability for both to happen is equal to the product of their individual probabilities (for an explanation of why this is, look up conditional probability). Furthermore, another useful tool is that if asked for the probability that event A occurs, one could instead find one minus the probability that event A does not occur, which by axiom 2 will give you the same result.
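These notes contain no code, but the frequentist picture above is easy to try for yourself. The short Python sketch below (the function name and seed are my own choices, not from the notes) flips a simulated fair coin N times and prints the observed fraction of heads, which drifts toward P(heads) = 1/2 as N grows.

```python
import random

def heads_fraction(n_flips, seed=0):
    """Flip a simulated fair coin n_flips times; return the fraction of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# Frequentist intuition: as N grows, the observed frequency of heads
# approaches P(heads) = 1/2.
for n in (10, 1000, 100000):
    print(n, heads_fraction(n))
```

With a fixed seed the output is deterministic, but the qualitative behavior (convergence toward 1/2) is the point, not the particular numbers.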
2 Discrete random variables

This section will deal with sample spaces that are countable. Countability will be discussed more at the beginning of Section 3.

2.1 Equally-likely probability

When all elements in your sample space Ω are equally likely, the probability to get an event A_n will be proportional to the number of ways event A_n can occur. To make this
more clear, let us consider flipping a fair coin three times. Denoting tails as T and heads as H, I can get the following outcomes:

Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT},

where, for example, HHT means I first got heads, then another heads, and lastly tails. This coin-flipping example fits under our equally-likely probability since I am equally likely to get a heads as a tails each time I flip my coin. HHH is just as likely as HHT, as each time I flip the coin, my result does not depend on my past results. So, if I am interested in the question "what is the probability I get two heads if I flip a fair coin three times?", my initial statement of this section says that it should be proportional to the number of ways I can get two heads (HHT or HTH or THH). We can figure out its exact form by looking at the second axiom of probability. If P(Ω) is proportional to the number of ways I can get Ω (the number of outcomes one can get from flipping a coin three times), then in order to make sure P(Ω) = 1, I must normalize my probability measure by dividing by the number of outcomes one can get from flipping a coin three times. Therefore, in this example, the probability to get two heads will be:

P(2 heads) = N(two heads) / N(total outcomes) = 3/8,    (2)

where N(A) denotes the number of ways for event A to occur. In general, the probability to get event A if each outcome of my experiment/trial is equally likely is

P(A) = N(A) / N(Ω).    (3)

The game of counting is a rich field of math known as combinatorics, and is very important in statistical physics. In these equally-likely cases, all one must do is simply count, but as you can imagine, this counting can become extremely nontrivial fast (for the interested reader, Google "Catalan numbers")!

2.2 Binomial Distribution

A more general probability game is allowing the probability of outcomes to be unequal.
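Before the binomial machinery, equation (2) can be checked by brute force: list Ω, count the outcomes in the event, and, for the unequal-probability game just introduced, weight each outcome instead of merely counting it. The Python sketch below is my own illustration (the weighted probabilities p = 0.6, q = 0.4 are an arbitrary choice).

```python
from itertools import product
from fractions import Fraction

# Sample space for three flips of a coin: 8 outcomes, e.g. ('H', 'H', 'T').
omega = list(product("HT", repeat=3))

# Equally-likely case (section 2.1): P(A) = N(A) / N(Omega).
two_heads = [w for w in omega if w.count("H") == 2]
p_fair = Fraction(len(two_heads), len(omega))
print(p_fair)            # 3/8

# Unequal case: weight each outcome by p^(#heads) * q^(#tails).
p, q = 0.6, 0.4
p_weighted = sum(p**w.count("H") * q**w.count("T") for w in two_heads)
print(p_weighted)        # 3 * 0.6^2 * 0.4 = 0.432
```

The weighted sum over configurations is exactly the structure the binomial distribution packages into a single formula.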
For example, let us go back to our friendly coin flipping game as a motivation, but this time the probability to get heads is p and the probability to get tails is q (which by axiom 2, q = 1 − p). What if someone now asked: what is the probability that if I flip a coin N times, I get n heads? Armed with equation 1, one may make a guess that

P(# heads = n) ∝ p^n q^(N−n) = p^n (1 − p)^(N−n),    (4)

where the N − n term denotes the number of tails I get (if I don't get heads, I must get tails). While this takes into account the given probabilities, it does not take into account that there are multiple configurations that give the same result. For example, if I flip a coin N times and want to get N − 1 heads, there are N different ways, or combinations, that get me my result (one for each position the tails could be). Thus, we should multiply equation 4 by the total number of configurations that give n heads. The reason for this factor can also be seen by considering axiom 3. To figure out what this factor is, imagine I have N boxes, each needing to be filled with a head or tail. When deciding where my first head should go, there are N boxes
to choose from. For my second, there are N − 1 boxes, and generally, for my kth head, there are N − k + 1 boxes to choose from. Thus by the basic counting rule (which states that if there are m_1 ways of doing one thing, m_2 ways of doing a second thing, m_3 ways of doing a third thing, etc., then the total number of ways to do all of the things is m_1 · m_2 · ... · m_k = Π_{i=1}^k m_i), the number of ways to put n things in N boxes is

N(N − 1)...(N − n + 1) = N! / (N − n)!.    (5)

However, if the things you are putting in boxes are indistinguishable, then we are accidentally overcounting with the above. For example, if I switch the heads in box 1 with the heads in box 2, I do not get a new result (it is not a separate element in my sample space). Therefore, we must divide equation 5 by the number of permutations I can make with the n heads, namely n!. The reason it is n! is given by the same argument by which we came to equation 5 (at first I can choose n different heads to switch, then - since I already chose one to switch - I can choose n − 1, etc.). Therefore, the factor I must multiply equation 4 by is N! / ((N − n)! n!). This leads to what is called the Binomial Distribution:

P_N(n) = N! / ((N − n)! n!) · p^n (1 − p)^(N−n).    (6)

It measures the probability for cases such as: I have N independent trials in my experiment; what is the probability that I get n successful trials if each trial has a probability p to succeed?

Another discrete probability distribution is the Poisson distribution. We can derive it as a limiting case of the binomial distribution for when N is large, p is small, and Np = λ is a constant of about order one. This limiting case occurs when the likelihood of an event to occur is rare. The Poisson distribution is given by:

P_λ(n) = λ^n e^(−λ) / n!.    (7)

2.3 Proof: Binomial → Poisson

Using Stirling's approximation, x! ≈ x^x exp[−x] √(2πx), we can rewrite the binomial distribution (written here with n trials and k successes) as

P = n! / (k!(n − k)!) · p^k (1 − p)^(n−k)
  ≈ [√(2πn) (n/e)^n] / [√(2π(n − k)) ((n − k)/e)^(n−k) k!] · p^k (1 − p)^(n−k)
  ≈ [n^n e^(−k)] / [(n − k)^(n−k) k!] · p^k (1 − p)^(n−k),

where in the last step the ratio of square-root factors, √(n/(n − k)), tends to 1 for fixed k as n grows.
Now, letting n → ∞ while keeping np = λ, which is a constant:

[n^n p^k (1 − p)^(n−k) e^(−k)] / [(n − k)^(n−k) k!]
  = [n^n (λ/n)^k (1 − λ/n)^(n−k) e^(−k)] / [n^(n−k) (1 − k/n)^(n−k) k!]
  = [λ^k (1 − λ/n)^(n−k) e^(−k)] / [(1 − k/n)^(n−k) k!].

Reminding ourselves that lim_{n→∞} (1 − x/n)^n = e^(−x),

P = λ^k e^(−λ) e^(−k) / (e^(−k) k!) = λ^k e^(−λ) / k!,

which is the Poisson distribution.
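The Binomial → Poisson limit can also be checked numerically. The Python sketch below (parameter choices λ = 2 and k = 3 are mine) implements equations (6) and (7) directly and shows the binomial pmf approaching the Poisson pmf as n grows with np = λ held fixed.

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """Equation (6): C(n, k) p^k (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Equation (7): lambda^k e^(-lambda) / k!."""
    return lam**k * exp(-lam) / factorial(k)

# Sanity check of equation (6): axiom 2 holds (probabilities sum to 1).
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1.0) < 1e-12

# The limit: fix lambda = n*p = 2 and let n grow; the binomial pmf at
# k = 3 approaches the Poisson value.
lam, k = 2.0, 3
for n in (10, 100, 10000):
    print(n, binomial_pmf(k, n, lam / n), poisson_pmf(k, lam))
```

For n = 10000 the two columns already agree to several decimal places, consistent with the derivation above.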
3 Continuous random variables

So far, the random variables (variables that follow some sort of probability function) we have dealt with have been discrete (i.e., the sample spaces have been countable). This idea of discreteness is important in many fields of physics and can be hard to wrap your head around at first. If something is discrete, you can assign each value of it an integer. If something is continuous, you cannot. Hence the math word "countable". With this, you can get into the realm of some infinities being bigger than others, which is also hard to wrap your head around at first. For the interested reader, for a famous proof that the infinite set of all real numbers is larger than the infinite set of all integers, look up Cantor's diagonal argument. Now, if you have a continuous random variable, then the sample space of possible outcomes must be uncountable, and therefore the probability to get exactly one element from this set is zero. Some textbooks use this as the definition of a continuous random variable. However, it can also be interpreted as a result of a single element of an uncountable set having zero measure. Therefore, for a continuous random variable x, since P(x = a) = 0 for any single value a, we instead ask questions about P(a < x < b). For discrete random variables, something like P(a < x < b) would be answered with a sum Σ P(x_i), which sums over the countable set of x_i satisfying a < x_i < b. Therefore, one may expect that instead of sums, for continuous random variables we use integrals! That is, if x is a continuous random variable, then

P(a < x < b) = ∫_a^b ds p(s),    (8)

where p(s) is called the probability density function. This definition can be made more rigorous with the aid of Riemann sums. Following from the 3 axioms of probability, one can see that the only limitations on p(s) are that it must be a nonnegative function and properly normalized such that axiom 2 is satisfied.
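Equation (8) can be illustrated with exactly the Riemann sums mentioned above. The Python sketch below is my own illustration; the exponential density p(s) = e^(−s) for s ≥ 0 is an example choice, not a distribution from the notes. It approximates P(a < x < b) with a midpoint Riemann sum.

```python
import math

def prob_between(pdf, a, b, steps=200000):
    """Approximate P(a < x < b) = integral of pdf over (a, b),
    equation (8), by a midpoint Riemann sum."""
    dx = (b - a) / steps
    return sum(pdf(a + (i + 0.5) * dx) for i in range(steps)) * dx

# Example density (my choice): exponential, p(s) = e^(-s) for s >= 0,
# which is nonnegative and normalized, as axiom 2 requires.
expo = lambda s: math.exp(-s)

print(prob_between(expo, 0.0, 1.0))   # close to 1 - 1/e ≈ 0.6321
```

Shrinking dx (more steps) makes the sum converge to the exact integral, which is the rigorous content behind the Riemann-sum remark above.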
An extremely important probability distribution is the Gaussian distribution (also called the normal distribution). It is the classical bell curve we all know and love. It's not only essential in physics but in all science fields (its universality is due to something called the central limit theorem). It is given by:

p(x; μ, σ) = 1/√(2πσ²) · e^(−(x−μ)²/(2σ²)).    (9)

There are many ways to arrive at this, but one which is particularly enlightening is the de Moivre-Laplace theorem, which shows that the Gaussian distribution is a limiting case of the Binomial distribution for large N.

3.1 Proof: Binomial → Gaussian

Starting with the binomial distribution:

P_bin(k) = N! / (k!(N − k)!) · p^k q^(N−k),

we can rewrite it by using Stirling's approximation to a factorial: x! ≈ x^x exp[−x] √(2πx). This leads to:
[N^N exp[−N] √(2πN)] / [k^k exp[−k] √(2πk) · (N − k)^(N−k) exp[−(N − k)] √(2π(N − k))] · p^k q^(N−k),

which simplifies to:

√(N / (2πk(N − k))) · (N/k)^k · (N/(N − k))^(N−k) · p^k q^(N−k).

In the case of large N, we can approximate p ≈ k/N (and hence q ≈ (N − k)/N) inside the square-root amplitude. We can use this approximation by rewriting our current approximation as:

1/√(2πNpq) · (Np/k)^k · (Nq/(N − k))^(N−k).

Our amplitude is now correct, but we eventually want to have an exponential term, so we do the following:

1/√(2πNpq) · exp{k Log(Np/k)} · exp{(N − k) Log(Nq/(N − k))}
  = 1/√(2πNpq) · exp{−k Log(k/(Np)) − (N − k) Log((N − k)/(Nq))}.

At this point, we will define a new variable that makes the algebra in the future easier. Let z = (k − μ)/σ = (k − Np)/√(Npq). The term z is what we want to have in the exponent of our Gaussian, so making this substitution will make it easier to get our exponential into the correct final form. With this new variable, it is easy to verify:

k = Np + z√(Npq)  and  N − k = Nq − z√(Npq).

Let us look at each logarithm's argument with this new variable:

k/(Np) = 1 + z√(Npq)/(Np) = 1 + z√(q/(Np))

and

(N − k)/(Nq) = 1 − z√(Npq)/(Nq) = 1 − z√(p/(Nq)).

Plugging this in, and pulling the factor of −1 in the exponent out in front of each logarithm's coefficient, our approximation now becomes:

1/√(2πNpq) · exp{−Np(1 + z√(q/(Np))) Log(1 + z√(q/(Np))) − Nq(1 − z√(p/(Nq))) Log(1 − z√(p/(Nq)))}.
Using Taylor's theorem, Log(1 + x) ≈ x − x²/2, stopping at 2nd order because the leading term in a Gaussian exponent is of 2nd order, our expression is now:

1/√(2πNpq) · exp{−Np(1 + z√(q/(Np)))(z√(q/(Np)) − z²q/(2Np)) − Nq(1 − z√(p/(Nq)))(−z√(p/(Nq)) − z²p/(2Nq))}.

Distributing, and dropping terms that vanish as N → ∞, leads to:

1/√(2πNpq) · exp{−(z√(Npq) + z²q − z²q/2) + (z√(Npq) − z²p + z²p/2)}
  = 1/√(2πNpq) · exp{−(p + q) z²/2},

which, using p + q = 1 and plugging in our expression for z, simplifies the approximation to:

1/√(2πNpq) · exp{−(k − Np)² / (2Npq)}.

This is the Gaussian distribution of equation 9 with μ = Np and σ² = Npq.

4 Expectation value and variance

As a final part of the notes, I will quickly define what the expectation value and variance are. For discrete random variables, the expectation value is given by

⟨f(x)⟩ = Σ_s f(s) P(s),    (10)

and for continuous random variables it is given by

⟨f(x)⟩ = ∫ ds f(s) p(s),    (11)

where the sum and integral run over all possible values of the random variable. By these definitions, the expectation value is essentially a weighted sum, such that the most likely values will be nearest to the expectation value. Nevertheless, the expectation value does not have to be a possible outcome of the trial/experiment you are doing. Some people may call the expectation value the mean or the average. The variance of a random variable x is defined by

Var(x) = ⟨(x − ⟨x⟩)²⟩    (12)

and can be written in the more friendly form

Var(x) = ⟨(x − ⟨x⟩)²⟩ = ⟨x² − 2x⟨x⟩ + ⟨x⟩²⟩ = ⟨x²⟩ − 2⟨x⟩⟨x⟩ + ⟨x⟩² = ⟨x²⟩ − ⟨x⟩²,    (13)

where we have used the linearity of the expectation, the fact that the expectation value is simply a number, and that the expectation value of a number is the number itself. Conceptually, the variance measures how spread out the random numbers are from their expectation value.
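Equations (10) and (13) translate directly into code. The Python sketch below (the parameter choices N = 20, p = 0.3 are mine) computes the mean and variance of a binomial random variable from its pmf and recovers Np and Np(1 − p).

```python
from math import comb

def binomial_pmf(n, N, p):
    """Equation (6): C(N, n) p^n (1 - p)^(N - n)."""
    return comb(N, n) * p**n * (1 - p)**(N - n)

def expectation(f, weighted_values):
    """Equation (10): <f(x)> = sum over s of f(s) P(s)."""
    return sum(f(s) * P for s, P in weighted_values)

N, p = 20, 0.3
pmf = [(n, binomial_pmf(n, N, p)) for n in range(N + 1)]

mean = expectation(lambda s: s, pmf)
var = expectation(lambda s: s**2, pmf) - mean**2   # equation (13)

print(mean, var)   # recovers Np = 6 and Np(1 - p) = 4.2
```

Note the expectation value 6 here happens to be a possible outcome, but for, say, N = 21 it would be 6.3, illustrating the remark above that the mean need not be attainable.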
The table below shows the expectation value and variance of the distributions discussed in these notes. I very much encourage you to prove some of these yourself!

Distribution    Expectation Value    Variance
Binomial        Np                   Np(1 − p)
Poisson         λ                    λ
Gaussian        μ                    σ²

5 Exercises

1) Given the probability measure P(i, j) = e^(−5) 2^i 3^j / (i! j!), for the discrete random variables i, j ∈ {0, 1, 2, 3, ...}, show that P(i, j) satisfies the first two probability axioms.

2) A box contains four balls numbered 1, 2, 3, and 4. A ball is chosen at random, its number noted, and the ball is returned to the box. This process is then repeated one more time.
a) Determine the sample space Ω.
b) If each outcome is assigned the same probability, what is the common probability?
c) Using the probability assignment in part (b), find the probability that the two numbers chosen are different.

3) Given the probability measure P(x) = C 4^x / x!, with x ∈ {0, 1, 2, 3, ...}:
a) Find the normalizing constant C such that the second probability axiom is satisfied.
b) Find the probability that x is at least 2 (i.e., P(x ≥ 2)) (Hint: look at the 2nd and 3rd probability axioms).

4) You throw two 6-sided dice. Find the expectation value of their sum given that they are fair dice (each side is equally likely).

5) You flip a weighted coin 100 times, which has the property that it is two times more likely to come up heads than tails. What is the probability that you get 42 heads?

6) Prove that the Gaussian distribution (equation 9) is properly normalized. (Hint: if it is properly normalized, then P(−∞ < x < ∞) = 1, so P(−∞ < x < ∞) P(−∞ < y < ∞) = 1, where x and y both follow the Gaussian distribution. Therefore, showing P(−∞ < x < ∞) P(−∞ < y < ∞) = 1 implies P(−∞ < x < ∞) = 1.)
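As a closing numerical illustration of the de Moivre-Laplace limit proved in section 3.1 (the choices of N, p, and the sampled k values are mine), the Python sketch below compares the binomial pmf against the Gaussian density with μ = Np and σ² = Np(1 − p).

```python
from math import comb, exp, pi, sqrt

def binomial_pmf(k, N, p):
    """Equation (6): C(N, k) p^k q^(N - k)."""
    return comb(N, k) * p**k * (1 - p)**(N - k)

def gaussian(x, mu, sigma2):
    """Equation (9), parametrized by the variance sigma2 = sigma^2."""
    return exp(-(x - mu)**2 / (2.0 * sigma2)) / sqrt(2.0 * pi * sigma2)

# De Moivre-Laplace: for large N the binomial pmf at k is close to the
# Gaussian density with mu = Np, sigma^2 = Npq, evaluated at x = k.
N, p = 1000, 0.4
mu, sigma2 = N * p, N * p * (1 - p)
for k in (380, 400, 420):
    print(k, binomial_pmf(k, N, p), gaussian(k, mu, sigma2))
```

Already at N = 1000 the two agree to a fraction of a percent near the peak, consistent with the table's entries for the binomial and Gaussian distributions.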
More informationProbability: from intuition to mathematics
1 / 28 Probability: from intuition to mathematics Ting-Kam Leonard Wong University of Southern California EPYMT 2017 2 / 28 Probability has a right and a left hand. On the right is the rigorous foundational
More informationProbability Theory and Simulation Methods
Feb 28th, 2018 Lecture 10: Random variables Countdown to midterm (March 21st): 28 days Week 1 Chapter 1: Axioms of probability Week 2 Chapter 3: Conditional probability and independence Week 4 Chapters
More informationMethods of Mathematics
Methods of Mathematics Kenneth A. Ribet UC Berkeley Math 10B February 23, 2016 Office hours Office hours Monday 2:10 3:10 and Thursday 10:30 11:30 in Evans; Tuesday 10:30 noon at the SLC Kenneth A. Ribet
More informationFourier and Stats / Astro Stats and Measurement : Stats Notes
Fourier and Stats / Astro Stats and Measurement : Stats Notes Andy Lawrence, University of Edinburgh Autumn 2013 1 Probabilities, distributions, and errors Laplace once said Probability theory is nothing
More informationProbabilistic models
Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became the definitive formulation
More informationLecture 2: Discrete Probability Distributions
Lecture 2: Discrete Probability Distributions IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge February 1st, 2011 Rasmussen (CUED) Lecture
More information1: PROBABILITY REVIEW
1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following
More informationHomework 4 Solution, due July 23
Homework 4 Solution, due July 23 Random Variables Problem 1. Let X be the random number on a die: from 1 to. (i) What is the distribution of X? (ii) Calculate EX. (iii) Calculate EX 2. (iv) Calculate Var
More information3rd IIA-Penn State Astrostatistics School July, 2010 Vainu Bappu Observatory, Kavalur
3rd IIA-Penn State Astrostatistics School 19 27 July, 2010 Vainu Bappu Observatory, Kavalur Laws of Probability, Bayes theorem, and the Central Limit Theorem Bhamidi V Rao Indian Statistical Institute,
More informationp. 4-1 Random Variables
Random Variables A Motivating Example Experiment: Sample k students without replacement from the population of all n students (labeled as 1, 2,, n, respectively) in our class. = {all combinations} = {{i
More information6.042/18.062J Mathematics for Computer Science November 28, 2006 Tom Leighton and Ronitt Rubinfeld. Random Variables
6.042/18.062J Mathematics for Computer Science November 28, 2006 Tom Leighton and Ronitt Rubinfeld Lecture Notes Random Variables We ve used probablity to model a variety of experiments, games, and tests.
More informationCSC Discrete Math I, Spring Discrete Probability
CSC 125 - Discrete Math I, Spring 2017 Discrete Probability Probability of an Event Pierre-Simon Laplace s classical theory of probability: Definition of terms: An experiment is a procedure that yields
More informationCountability. 1 Motivation. 2 Counting
Countability 1 Motivation In topology as well as other areas of mathematics, we deal with a lot of infinite sets. However, as we will gradually discover, some infinite sets are bigger than others. Countably
More informationProbability (Devore Chapter Two)
Probability (Devore Chapter Two) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 0 Preliminaries 3 0.1 Motivation..................................... 3 0.2 Administrata...................................
More informationCS 125 Section #10 (Un)decidability and Probability November 1, 2016
CS 125 Section #10 (Un)decidability and Probability November 1, 2016 1 Countability Recall that a set S is countable (either finite or countably infinite) if and only if there exists a surjective mapping
More informationDiscrete Probability. Chemistry & Physics. Medicine
Discrete Probability The existence of gambling for many centuries is evidence of long-running interest in probability. But a good understanding of probability transcends mere gambling. The mathematics
More information18.175: Lecture 17 Poisson random variables
18.175: Lecture 17 Poisson random variables Scott Sheffield MIT 1 Outline More on random walks and local CLT Poisson random variable convergence Extend CLT idea to stable random variables 2 Outline More
More informationSTAT 712 MATHEMATICAL STATISTICS I
STAT 72 MATHEMATICAL STATISTICS I Fall 207 Lecture Notes Joshua M. Tebbs Department of Statistics University of South Carolina c by Joshua M. Tebbs TABLE OF CONTENTS Contents Probability Theory. Set Theory......................................2
More informationWhy study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables
ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section
More informationProbability reminders
CS246 Winter 204 Mining Massive Data Sets Probability reminders Sammy El Ghazzal selghazz@stanfordedu Disclaimer These notes may contain typos, mistakes or confusing points Please contact the author so
More informationDiscrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14
CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 Introduction One of the key properties of coin flips is independence: if you flip a fair coin ten times and get ten
More informationCS 124 Math Review Section January 29, 2018
CS 124 Math Review Section CS 124 is more math intensive than most of the introductory courses in the department. You re going to need to be able to do two things: 1. Perform some clever calculations to
More informationProbability Distributions - Lecture 5
Probability Distributions - Lecture 5 1 Introduction There are a number of mathematical models of probability density functions that represent the behavior of physical systems. In this lecture we explore
More informationExam 3, Math Fall 2016 October 19, 2016
Exam 3, Math 500- Fall 06 October 9, 06 This is a 50-minute exam. You may use your textbook, as well as a calculator, but your work must be completely yours. The exam is made of 5 questions in 5 pages,
More informationALL TEXTS BELONG TO OWNERS. Candidate code: glt090 TAKEN FROM
How are Generating Functions used in finding the closed form of sequences involving recurrence relations and in the analysis of probability distributions? Mathematics Extended Essay Word count: 3865 Abstract
More informationChapter 4 : Discrete Random Variables
STAT/MATH 394 A - PROBABILITY I UW Autumn Quarter 2015 Néhémy Lim Chapter 4 : Discrete Random Variables 1 Random variables Objectives of this section. To learn the formal definition of a random variable.
More informationLecture 1: Probability Fundamentals
Lecture 1: Probability Fundamentals IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge January 22nd, 2008 Rasmussen (CUED) Lecture 1: Probability
More informationDept. of Linguistics, Indiana University Fall 2015
L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would
More informationn N CHAPTER 1 Atoms Thermodynamics Molecules Statistical Thermodynamics (S.T.)
CHAPTER 1 Atoms Thermodynamics Molecules Statistical Thermodynamics (S.T.) S.T. is the key to understanding driving forces. e.g., determines if a process proceeds spontaneously. Let s start with entropy
More informationElementary Probability. Exam Number 38119
Elementary Probability Exam Number 38119 2 1. Introduction Consider any experiment whose result is unknown, for example throwing a coin, the daily number of customers in a supermarket or the duration of
More information5.2 Fisher information and the Cramer-Rao bound
Stat 200: Introduction to Statistical Inference Autumn 208/9 Lecture 5: Maximum likelihood theory Lecturer: Art B. Owen October 9 Disclaimer: These notes have not been subjected to the usual scrutiny reserved
More information