CPSC340. Probability. Nando de Freitas September, 2012 University of British Columbia


Outline of the lecture This lecture is intended to revise probabilistic concepts that play an important role in the design of machine learning and data mining algorithms. You are expected to learn and master the following three topics: 1. The frequentist and axiomatic definitions of probability. 2. Conditioning. 3. Marginalization.

Probability as frequency Consider the following questions: 1. What is the probability that when I flip a coin it lands heads? 2. Why? 3. What is the probability that the Lions Gate Bridge will collapse before the term is over? Message: The frequentist view is very useful, but it seems that we also use domain knowledge to come up with probabilities. Moreover, probability can be subjective (different people assign different probabilities to the same event).
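The frequentist view can be sketched with a quick simulation (a hypothetical fair-coin experiment, not from the slides): the relative frequency of heads approaches 1/2 as the number of flips grows.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def estimate_heads_probability(n_flips):
    """Flip a simulated fair coin n_flips times; return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

print(estimate_heads_probability(100_000))  # close to 0.5
```

With more flips the estimate concentrates around 0.5, which is exactly the long-run-frequency reading of "the probability of heads".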

Probability as measure Imagine we are throwing darts at a wall of size 1×1, and that all darts are guaranteed to land within this wall. What is the probability that a dart will hit the shaded area? Message: Probability is a measure of the certainty of an event taking place; in the example above we are measuring the chance of hitting the shaded area.
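The dart experiment can be approximated numerically. Since the slide's figure is not reproduced here, assume the shaded region is the lower-left quarter of the wall, so the true probability is its area, 1/4:

```python
import random

random.seed(0)

def hit_probability(n_darts):
    """Throw n_darts uniformly at the 1x1 wall; return the fraction that
    land in the assumed shaded region [0, 0.5] x [0, 0.5]."""
    hits = sum(1 for _ in range(n_darts)
               if random.random() < 0.5 and random.random() < 0.5)
    return hits / n_darts

print(hit_probability(100_000))  # close to 0.25, the area of the region
```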

Probability Probability is the formal study of the laws of chance; it allows us to manage uncertainty. The sample space is the set of all outcomes. For example, for a die there are 6 outcomes: {1, 2, 3, 4, 5, 6}. Probability allows us to measure events, which are subsets of the sample space. For a die we may consider events such as "the outcome is even", {2, 4, 6}, or "the outcome is at least 5", {5, 6}. We assign probabilities to these events.
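The sample space and events for a die can be made concrete in a short sketch (the particular event names are illustrative choices, not from the slides):

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}

def prob(event, omega=sample_space):
    """Probability of an event (a subset of the sample space),
    assuming equally likely outcomes."""
    return Fraction(len(event & omega), len(omega))

even = {2, 4, 6}           # event: the outcome is even
at_least_five = {5, 6}     # event: the outcome is at least 5
print(prob(even))           # 1/2
print(prob(at_least_five))  # 1/3
```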

The axioms The following two laws are the key axioms of probability: probabilities are non-negative and the whole sample space has probability 1 (0 ≤ P(A) ≤ 1, P(Ω) = 1); and for disjoint events A and B, P(A or B) = P(A) + P(B). We can visualize all these sets with Venn diagrams.

Or and And operations Given two events, A and B, that are not disjoint, we have: P(A or B) = P(A) + P(B) − P(A and B)
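A quick check of this identity on two die events (the particular sets A and B below are illustrative assumptions):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event: outcome is even
B = {4, 5, 6}   # event: outcome is at least 4

def prob(e):
    return Fraction(len(e), len(omega))

lhs = prob(A | B)                          # P(A or B)
rhs = prob(A) + prob(B) - prob(A & B)      # inclusion-exclusion
print(lhs, rhs)  # 2/3 2/3
```

Subtracting P(A and B) corrects for double-counting the outcomes {4, 6} that lie in both events.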

Conditional probability Assuming (given that) B has been observed (i.e. there is no uncertainty about B), the following tells us the probability that A will take place: P(A given B) = P(A and B) / P(B) That is, in the frequentist interpretation, we divide the number of times both A and B occurred by the number of times B occurred. For short we write: P(A|B) = P(AB)/P(B), or P(AB) = P(A|B)P(B), where P(A|B) is the conditional probability, P(AB) is the joint, and P(B) is the marginal. If we have more events, we use the chain rule: P(ABC) = P(A|BC) P(B|C) P(C)
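The frequentist reading of P(A|B) as a ratio of counts can be illustrated by brute-force enumeration (the two-dice events here are hypothetical examples, not from the slides):

```python
from fractions import Fraction
from itertools import product

# Two fair die rolls; B = "first roll is even", A = "sum is at least 8".
omega = list(product(range(1, 7), repeat=2))
B = [w for w in omega if w[0] % 2 == 0]
A_and_B = [w for w in B if w[0] + w[1] >= 8]

p_B = Fraction(len(B), len(omega))
p_AB = Fraction(len(A_and_B), len(omega))

# P(A|B) = P(AB) / P(B): count outcomes in both A and B, divide by count in B.
p_A_given_B = p_AB / p_B
print(p_A_given_B)  # 1/2
```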

Conditional probability Can we write the joint probability, P(AB), in more than one way in terms of conditional and marginal probabilities?

Independence We know that: P(AB) = P(A|B)P(B) But what happens if A does not depend on B? That is, the value of B does not affect the chances of A taking place. How does the above expression simplify? How does the expression below simplify? P(ABCD) =
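For a concrete instance, two independent fair coin flips satisfy P(AB) = P(A)P(B) when A depends only on the first flip and B only on the second (this example is an illustrative assumption, not from the slides):

```python
from fractions import Fraction
from itertools import product

omega = list(product('HT', repeat=2))   # two coin flips
A = [w for w in omega if w[0] == 'H']   # event: first flip is heads
B = [w for w in omega if w[1] == 'H']   # event: second flip is heads

def prob(e):
    return Fraction(len(e), len(omega))

both = [w for w in A if w in B]
# Independence: the joint factorizes into the product of marginals.
assert prob(both) == prob(A) * prob(B)
print(prob(both))  # 1/4
```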

Conditional probability example Assume we have a dark box with 3 red balls and 1 blue ball; that is, we have the set {r, r, r, b}. What is the probability of drawing 2 red balls in the first 2 tries? P(B1 = r, B2 = r) =
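One way to work this example out is via the chain rule, P(B1 = r, B2 = r) = P(B2 = r | B1 = r) P(B1 = r) (this sketch gives away the answer, so try the exercise first):

```python
from fractions import Fraction

# Box {r, r, r, b}: 3 red balls out of 4 on the first draw;
# if the first was red, 2 red remain out of 3 balls.
p_b1_r = Fraction(3, 4)
p_b2_r_given_b1_r = Fraction(2, 3)

print(p_b2_r_given_b1_r * p_b1_r)  # 1/2
```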

Marginalization Proof sketch: summing the joint over all values of B recovers the marginal: P(A) = Σ_B P(AB) = Σ_B P(A|B) P(B).

Conditional probability example What is the probability that the 2nd ball drawn from the set {r, r, r, b} will be red? Using marginalization, P(B2 = r) =
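The marginalization sum over the first draw can be evaluated directly, P(B2 = r) = P(B2 = r | B1 = r) P(B1 = r) + P(B2 = r | B1 = b) P(B1 = b) (again, this reveals the exercise's answer):

```python
from fractions import Fraction

p = (Fraction(2, 3) * Fraction(3, 4)     # first draw red: 2 red left of 3
     + Fraction(3, 3) * Fraction(1, 4))  # first draw blue: all 3 left are red
print(p)  # 3/4
```

Note that P(B2 = r) = 3/4 = P(B1 = r): before observing anything, every draw is equally likely to be red.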

Matrix notation Assume that X can be 0 or 1. We use the math notation: X ∈ {0, 1}. Let P(X1 = 0) = 3/4 and P(X1 = 1) = 1/4. Assume too that P(X2 = 1 | X1 = 0) = 1/3 and P(X2 = 1 | X1 = 1) = 0. Then, P(X2 = 0 | X1 = 0) = ?, P(X2 = 0 | X1 = 1) = ? We can obtain an expression for P(X2) easily using matrix notation:
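With the numbers from this slide, P(X2) is a vector-matrix product. The blank conditional probabilities are filled in here (each row of the matrix must sum to 1, giving P(X2 = 0 | X1 = 0) = 2/3 and P(X2 = 0 | X1 = 1) = 1):

```python
import numpy as np

# Row vector of marginals and the stochastic (transition) matrix G,
# where G[i, j] = P(X2 = j | X1 = i).
pi1 = np.array([3/4, 1/4])
G = np.array([[2/3, 1/3],
              [1.0, 0.0]])

pi2 = pi1 @ G   # P(X2 = j) = sum_i P(X1 = i) P(X2 = j | X1 = i)
print(pi2)      # [0.75 0.25]
```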

Matrix notation We have: Σ_{X1} P(X1) P(X2 | X1) = P(X2) For short, we write this using vectors and a stochastic matrix G: π2 = π1 G. Imagine we kept on multiplying by G. Claim: for a very large number k, after k iterations the value of π stabilizes. That is, πk is an eigenvector of G with eigenvalue 1.
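The claim can be checked by power iteration with the matrix G from the previous slide; the starting distribution is an arbitrary assumption, yet π converges to the same fixed point:

```python
import numpy as np

G = np.array([[2/3, 1/3],
              [1.0, 0.0]])
pi = np.array([0.1, 0.9])   # arbitrary starting distribution

# Repeatedly multiplying by G drives pi to the stationary distribution.
for _ in range(100):
    pi = pi @ G

print(pi)  # [0.75 0.25]
# Stationarity: pi is a left eigenvector of G with eigenvalue 1.
assert np.allclose(pi @ G, pi)
```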

Google's website search algorithm The previous observation is key to understanding how Google's PageRank algorithm works.
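A minimal PageRank-style sketch, assuming a toy 3-page link graph and the classic 0.85 damping factor (illustrative only, not Google's actual implementation):

```python
import numpy as np

# links[i, j] = 1 if page i links to page j (a hypothetical tiny web).
links = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [0, 1, 0]], dtype=float)
n = links.shape[0]

# Row-stochastic transition matrix, blended with a uniform jump
# (damping factor 0.85, as in the classic formulation).
T = links / links.sum(axis=1, keepdims=True)
G = 0.85 * T + 0.15 / n

# Power iteration: the stationary distribution ranks the pages.
pi = np.ones(n) / n
for _ in range(100):
    pi = pi @ G

print(pi)  # ranks sum to 1; higher value means a more "important" page
```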