Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com

School of Oriental and African Studies, Department of Economics, September 2015

Readings: Gujarati D., Basic Econometrics, Appendix A.2 and A.3; Barrow M., Statistics for Economics, Accounting and Business Studies, Ch. 2.

1 Introduction

We need to analyse cases where we do not know what is going to happen: where there are risks, randomness, chances, hazards, gambles, etc. Probabilities provide a way of doing this. Probabilities are also essential for statistical inference, since we must make probabilistic statements to infer about the population (a random variable) from a sample (just a subset of the values of that random variable).

2 Approaches to Probability

There are two main approaches to probability theory: the frequentist view and the subjective view.

Relative frequency (empirical definition of probability):

    P(A) = m/n = (number of outcomes favourable to A) / (total number of observations, or trials)

Subjective probability (Bayesian definition of probability): probability as a degree of belief.

We will concentrate on the first of these.

3 Terminology

An experiment (or trial) is a process leading to a unique outcome (numerical value) with uncertainty. An outcome is the result of an experiment. For example:

    Experiment           Outcome
    Tossing a coin       Head (H) or Tail (T)
    Tossing two coins    HH, HT, TH, TT
    Throwing a die       1, 2, 3, 4, 5, 6

All possible outcomes of an experiment define the sample space. Each member (outcome) of the sample space is called a sample point. A collection of possible outcomes of an experiment is called an event; an event is a subset of the sample space. Events are mutually exclusive if the occurrence of one event prevents the occurrence of another event at the same time.
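The relative-frequency definition can be illustrated by simulation: estimate P(A) as m/n and watch it approach the classical value. A minimal sketch; the experiment (two fair coins), the event, and the trial count are illustrative choices, not from the lecture:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def empirical_probability(event, experiment, n_trials):
    """Estimate P(A) = m/n: the fraction of trials whose outcome satisfies the event."""
    m = sum(1 for _ in range(n_trials) if event(experiment()))
    return m / n_trials

# Experiment: toss two fair coins. Event A: at least one head.
toss_two_coins = lambda: (random.choice("HT"), random.choice("HT"))
at_least_one_head = lambda outcome: "H" in outcome

p_hat = empirical_probability(at_least_one_head, toss_two_coins, 100_000)
print(round(p_hat, 2))  # close to the classical value 3/4
```

With 100,000 trials the estimate settles near 3/4, the classical probability obtained by counting the three favourable outcomes (HH, HT, TH) out of four equally likely ones.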

Two events are said to be equally likely if one event is as likely to occur as the other. Events are said to be collectively exhaustive if they exhaust all possible outcomes of an experiment. Two events are said to be independent if the occurrence of one does not depend on the other: one occurring does not make the other's occurrence any more or less likely.

4 Probability Axioms and Rules

Probabilities are numbers which represent the chance of an event happening. Denote the probability of event A happening as P(A) and the probability of event B happening as P(B).

The probability of an event always lies between 0 and 1: 0 ≤ P(A) ≤ 1. P(A) is often known as the marginal probability of A.

The probability of event A not happening is 1 − P(A). This is the probability of the complement of A, sometimes denoted Ā. It holds that P(A) + P(Ā) = 1.

The probability of two events A and B both happening, A AND B, is denoted P(A ∩ B), the intersection, and is known as the joint probability. It is the probability of observing both events happening together. For two mutually exclusive events A and B, their intersection is empty by definition, so P(A ∩ B) = 0.

Denote the probability of event A OR event B happening as P(A ∪ B), the union. It is the probability of one event or the other happening. The union of two events A and B is calculated as

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Notice that if the events are mutually exclusive, P(A ∩ B) = 0, so P(A ∪ B) = P(A) + P(B).

Two events are said to be independent if the probability of one occurring does not depend on the other happening. For two independent events A and B, the joint probability, i.e. the probability of both happening, is equal to the product of the two marginal probabilities: P(A ∩ B) = P(A) · P(B).

The probability of A happening given that event B has happened is called the conditional probability and is given by

    P(A|B) = P(A ∩ B) / P(B)                                   (1)
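The rules above can be checked by exhaustive enumeration over a small sample space, using classical probability (favourable outcomes over total outcomes). A minimal sketch with two fair dice; the choice of events is illustrative:

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 equally likely outcomes of throwing two dice.
S = list(product(range(1, 7), repeat=2))

def P(event):
    """Classical probability: favourable outcomes over total outcomes."""
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] + s[1] == 7   # event A: the total is 7
B = lambda s: s[0] % 2 == 0      # event B: the first die is even

# Union rule: P(A or B) = P(A) + P(B) - P(A and B)
union = P(lambda s: A(s) or B(s))
assert union == P(A) + P(B) - P(lambda s: A(s) and B(s))

# Conditional probability: P(A|B) = P(A and B) / P(B)
cond = P(lambda s: A(s) and B(s)) / P(B)
print(P(A), P(B), cond)  # 1/6 1/2 1/6 — here P(A|B) = P(A), so A and B are independent
```

Exact `Fraction` arithmetic avoids floating-point noise, so the identities hold exactly rather than approximately.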

Conditional probabilities play a very important role in decision making. They ask how the information that B happened changes your estimate of the probability of A happening. If A and B are independent, P(A|B) = P(A): by definition, knowing that B happened does not change the probability of A happening.

Similarly, the probability of B happening given that A happens is

    P(B|A) = P(A ∩ B) / P(A)                                   (2)

Multiply both sides of (1) by P(B) and both sides of (2) by P(A), and rearrange to give

    P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A)

The joint probability is the product of the conditional probability and the marginal probability in each case. Equating the two right-hand-side expressions and dividing through by P(B) gives Bayes' Theorem:

    P(A|B) = P(B|A)P(A) / P(B)

This formula is widely used to update the probability of an event A in the light of new information, B. In this context P(A) is called the prior probability of A, P(B|A) is called the likelihood, and P(A|B) is called the posterior probability.

We can use the joint probability to redefine the marginal probability of an event A happening with reference to event B:

    P(A) = P(A ∩ B) + P(A ∩ B̄) = P(A|B)P(B) + P(A|B̄)P(B̄)

i.e. the probability of event A happening irrespective of whether event B happens or not.

5 Random Variables

We now consider how we apply probability to variables where we are uncertain what values they will take. These are called random variables. Usually, we denote random variables with capital letters, X, Y, Z, ..., and their realisations (particular values) with small letters x_i, y_j, z_k.

5.1 Discrete Random Variables

A discrete random variable X can take a number of distinct possible values, say x_1, x_2, ..., x_N, with probabilities p_1, p_2, ..., p_N. The observed values are called the realisations of the random variable. For instance, X, the total obtained from throwing two dice, is a discrete random variable: it can take the values 2 to 12. After you throw the dice, you observe the outcome, the realisation, a particular number x_i.

Associated with the random variable is a probability distribution, p_i = f(x_i), which gives the probability of obtaining each of the possible values the random variable can take. Formally, the probability distribution is f(x_i) = P(X = x_i), the probability that the random variable X takes the value x_i, e.g. the value 3.
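Bayes' Theorem, combined with the total-probability decomposition of P(B), can be illustrated with a standard diagnostic-test calculation. The numbers below are illustrative, not from the lecture:

```python
from fractions import Fraction

# Prior: P(A), the probability a person has a condition (illustrative: 1 in 100).
p_A = Fraction(1, 100)
# Likelihood: P(B|A), the probability of a positive test given the condition.
p_B_given_A = Fraction(95, 100)
# False-positive rate: P(B|not A).
p_B_given_notA = Fraction(5, 100)

# Total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' Theorem: posterior P(A|B) = P(B|A)P(A) / P(B)
posterior = p_B_given_A * p_A / p_B
print(posterior)  # 19/118, roughly 0.16
```

The posterior (about 0.16) is far below the likelihood (0.95): because the prior is small, most positive tests come from the large P(B|not A)·P(not A) term in P(B).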

The cumulative probability distribution gives the probability of getting a value less than or equal to x_j. That is,

    F(x_j) = Σ_{i=1}^{j} f(x_i) = P(X ≤ x_j)

5.2 Continuous Random Variables

Whereas a discrete random variable can only take specified values, continuous random variables (e.g. inflation) can take an infinite number of values. Corresponding to the probabilities f(x_i) for discrete random variables there is a probability density function, PDF, denoted f(x), for continuous random variables. Since there are an infinite number of points on the real line, the probability of any single point is zero, although the PDF will be defined for it. However, we can always calculate the probability of falling into a particular interval. Hence, if we want to calculate the probability that X lies between a and b, then

    P(a ≤ X ≤ b) = ∫_a^b f(x) dx

Similarly, the cumulative distribution function, CDF, F(x_j) = P(X ≤ x_j), gives the probability that the continuous random variable will take a value less than or equal to a specified value x_j:

    F(x_j) = P(X ≤ x_j) = ∫_{−∞}^{x_j} f(x) dx

5.3 Marginal, Joint and Conditional Distributions

Suppose we have two random variables X and Y with individual probability distributions f(x_i) and f(y_j). f(x_i) is also known as the marginal PDF of X: the probability that X assumes a given value, for all values of X, regardless of the values taken by Y.

The joint probability of X and Y, f(x_i, y_j), indicates the probability of both X taking a particular value x_i and Y taking a particular value y_j, and corresponds to P(A ∩ B) (intersection) in terms of events.

We can now define the marginal probability distribution for the random variable X with reference to the joint distribution:

    f(x_i) = Σ_j f(x_i, y_j) = f(x_i, y_1) + f(x_i, y_2) + ...

(or the equivalent integral for continuous random variables).

The conditional PDF of X gives the probability that X takes on the value x_i given that Y has assumed the value y_j:

    f(x_i | y_j) = f(x_i, y_j) / f(y_j)
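For the two-dice example, the probability distribution f(x_i) and the cumulative distribution F(x_j) can be tabulated by enumerating the 36 equally likely outcomes. A minimal sketch:

```python
from fractions import Fraction
from itertools import product

# f(x_i) = P(X = x_i), where X is the total of two fair dice (values 2 to 12).
f = {}
for d1, d2 in product(range(1, 7), repeat=2):
    total = d1 + d2
    f[total] = f.get(total, Fraction(0)) + Fraction(1, 36)

# F(x_j) = sum of f(x_i) over all x_i <= x_j.
def F(x_j):
    return sum((p for x, p in f.items() if x <= x_j), Fraction(0))

print(f[7])   # 1/6: six of the 36 outcomes sum to 7
print(F(12))  # 1: over the whole range the probabilities sum to one
```

Note how F accumulates the probabilities of the distinct values, exactly as in the summation formula for the cumulative distribution above.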

5.4 Statistical Independence

If two random variables X and Y are independent, then for all values of X and Y their joint probability is just the product of the individual (marginal) probability distributions:

    f(x_i, y_j) = f(x_i) f(y_j)
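The marginal, conditional, and independence definitions can all be checked on a small joint distribution table. The table below is an illustrative example (independent by construction), not from the lecture:

```python
from fractions import Fraction

# Joint distribution f(x, y) for X in {0, 1} and Y in {0, 1},
# built as a product of marginals, so X and Y are independent by construction.
px = {0: Fraction(1, 4), 1: Fraction(3, 4)}
py = {0: Fraction(1, 2), 1: Fraction(1, 2)}
joint = {(x, y): px[x] * py[y] for x in px for y in py}

# Marginal of X: f(x) = sum over y of f(x, y).
marginal_x = {x: sum(joint[(x, y)] for y in py) for x in px}

# Conditional PDF of X given Y = y: f(x|y) = f(x, y) / f(y).
def f_x_given_y(x, y):
    f_y = sum(joint[(xx, y)] for xx in px)
    return joint[(x, y)] / f_y

# Independence: f(x, y) = f(x) f(y) for every pair, hence f(x|y) = f(x).
assert all(joint[(x, y)] == marginal_x[x] * py[y] for x in px for y in py)
print(marginal_x[1], f_x_given_y(1, 0))  # 3/4 3/4 — conditioning on Y changes nothing
```

Summing the joint table over y recovers the marginal of X, and under independence the conditional f(x|y) coincides with that marginal for every y.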