Similar documents
Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya

Exam 1 - Math Solutions

STAT 430/510 Probability Lecture 7: Random Variable and Expectation

Data Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber

Introduction to Statistical Inference Self-study

Math438 Actuarial Probability

Mutually Exclusive Events

Probability. Paul Schrimpf. January 23, UBC Economics 326. Probability. Paul Schrimpf. Definitions. Properties. Random variables.

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay

Continuous Random Variables

Introduction to Probability 2017/18 Supplementary Problems

Bayesian statistics, simulation and software

Dept. of Linguistics, Indiana University Fall 2015

STAT 302: Assignment 1

STAT 430/510: Lecture 10

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS

Introduction to Machine Learning

MA : Introductory Probability

STA 2023 EXAM-2 Practice Problems. Ven Mudunuru. From Chapters 4, 5, & Partly 6. With SOLUTIONS

Massachusetts Institute of Technology

Math 511 Exam #1. Show All Work No Calculators

P(T = 7) = P(T = 7 A = n)p(a = n) = P(B = 7 - n)p(a = n) = P(B =4)P(A = 3) = = 0.06

Homework 4 Solution, due July 23

Example: Suppose we toss a quarter and observe whether it falls heads or tails, recording the result as 1 for heads and 0 for tails.

Probability. Paul Schrimpf. January 23, Definitions 2. 2 Properties 3

Cheng Soon Ong & Christian Walder. Canberra February June 2018

3 Multiple Discrete Random Variables

Quantitative Methods for Decision Making

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes.

(c) Find the product moment correlation coefficient between s and t.

Chapter 4. Continuous Random Variables 4.1 PDF

Probability Review. Chao Lan

Cogs 14B: Introduction to Statistical Analysis

Chapter 18 Section 8.5 Fault Trees Analysis (FTA) Don't get caught out on a limb of your fault tree.

Stochastic Simulation Introduction Bo Friis Nielsen

ISyE 6739 Test 1 Solutions Summer 2015

CS 109 Review. CS 109 Review. Julia Daniel, 12/3/2018. Julia Daniel

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev

Discrete time Markov chains. Discrete Time Markov Chains, Definition and classification. Probability axioms and first results

SDS 321: Introduction to Probability and Statistics

Probability Theory Review Reading Assignments

Copyright c 2006 Jason Underdown Some rights reserved. choose notation. n distinct items divided into r distinct groups.

CME 106: Review Probability theory

Part (A): Review of Probability [Statistics I revision]

Notation: X = random variable; x = particular value; P(X = x) denotes probability that X equals the value x.

Probability Review I

Random variables (discrete)

Sampling Distributions

EE514A Information Theory I Fall 2013

M378K In-Class Assignment #1

A random variable is a quantity whose value is determined by the outcome of an experiment.

Lecture 16. Lectures 1-15 Review

Lecture 5. October 21, Department of Biostatistics Johns Hopkins Bloomberg School of Public Health Johns Hopkins University.

Notes 12 Autumn 2005

Stochastic Models of Manufacturing Systems

Test Problems for Probability Theory ,

GEOMETRIC -discrete A discrete random variable R counts number of times needed before an event occurs

A Gentle Introduction to Gradient Boosting. Cheng Li College of Computer and Information Science Northeastern University

Expectation of Random Variables

Gaussian random variables inr n

B4 Estimation and Inference

Generative Techniques: Bayes Rule and the Axioms of Probability

Analysis of Experimental Designs

Recap of Basic Probability Theory

CHAPTER 6. 1, if n =1, 2p(1 p), if n =2, n (1 p) n 1 n p + p n 1 (1 p), if n =3, 4, 5,... var(d) = 4var(R) =4np(1 p).

WYOMING COMMUNITY DEVELOPMENT AUTHORITY DISCLOSURE REPORT FOR THE 1994 INDENTURE SINGLE FAMILY HOUSING REVENUE BOND SERIES

Your pure maths needs to be far stronger for S4 than in any other Statistics module. You must be strong on general binomial expansion from C4.

Machine Learning Srihari. Probability Theory. Sargur N. Srihari

MAS108 Probability I

We will briefly look at the definition of a probability space, probability measures, conditional probability and independence of probability events.

Recap of Basic Probability Theory

Continuous Random Variables. What continuous random variables are and how to use them. I can give a definition of a continuous random variable.

Chapter 3: Random Variables 1

18.600: Lecture 7 Bayes formula and independence

Discrete Markov Processes. 1. Introduction

ABSTRACT EXPECTATION

CSCE 478/878 Lecture 6: Bayesian Learning and Graphical Models. Stephen Scott. Introduction. Outline. Bayes Theorem. Formulas

12. Special Transformations 1

SDS 321: Introduction to Probability and Statistics

CHAPTER - 3 Probability

Probability Theory and Applications

Rapid Introduction to Machine Learning/ Deep Learning

Bayesian Machine Learning

12 - The Tie Set Method

CSE 312 Final Review: Section AA

Section 4.2 Polynomial Functions of Higher Degree

4th IIA-Penn State Astrostatistics School July, 2013 Vainu Bappu Observatory, Kavalur

Lecture 7. Bayes formula and independence

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

18.600: Lecture 7 Bayes formula and independence

Math Introduction to Probability. Davar Khoshnevisan University of Utah

Computational Logic. Standardization of Interpretations. Damiano Zanardini

STATISTICS 1 REVISION NOTES

Statistical Model Checking as Feedback Control

Continuous Random Variables and Continuous Distributions

Topic 3: The Expectation of a Random Variable

LOCUS. Definition: The set of all points (and only those points) which satisfy the given geometrical condition(s) (or properties) is called a locus.

MATH 3MB3 FALL 2018 Univariate Stochastic 1

Lecture 2: Probability, conditional probability, and independence

Lecture 4. Selected material from: Ch. 6 Probability

Transcription:


Deductive Logic Probability Theory


[Diagram: a quantity of interest x is related to a sensor measurement z through a physics/sensor model f.]

P(X = x) denotes the probability that the random variable X takes the value x. For a fair die, P(X = x) = 1/6 for each face x. The shorthand P(x) is used for P(X = x). The cumulative distribution function is F(x) = P(X ≤ x).
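A minimal Python sketch of these definitions for a fair die (the pmf dictionary and the helper cdf are illustrative, not from the lecture):

# P(X = x) for a fair six-sided die, and the CDF F(x) = P(X <= x).
pmf = {x: 1 / 6 for x in range(1, 7)}

def cdf(x):
    return sum(p for value, p in pmf.items() if value <= x)

print(pmf[3])    # P(X = 3) = 1/6
print(cdf(3))    # F(3) = P(X <= 3) = 0.5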


The probabilities of all outcomes sum to one: Σ_a P(a) = 1. Product rule: P(a, b) = P(a) P(b | a) = P(b) P(a | b). The joint probability P(a, b) is the probability that a and b both occur, also written P(a ∧ b) or P(a ∩ b). (Venn diagram: events A and B inside the sample space S.)

Product rule: P(a, b) = P(a) P(b | a) = P(b) P(a | b).

Which of the following equals P(A, B, C)?
P(B | A, C) P(A) P(C | A)
P(A) P(B | A) P(C | B)
P(A | B, C) P(B) P(C | A)
P(A) P(B) P(C)

Applying the product rule repeatedly gives the chain rule:
P(A, B, C) = P(A | B, C) P(B, C) = P(A | B, C) P(B | C) P(C) = P(A | B, C) P(C | B) P(B) = P(C | B, A) P(B | A) P(A).
The same argument gives P(A, B, C) = P(B | A, C) P(C | A) P(A), so the first option is correct.
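The chain rule can also be checked numerically. The following is a minimal sketch, assuming a small random joint table over three binary variables (NumPy and the variable names are illustrative):

import numpy as np

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))        # unnormalised joint over (A, B, C)
p /= p.sum()                     # now a valid P(A, B, C)

p_a    = p.sum(axis=(1, 2))              # P(A)
p_ac   = p.sum(axis=1)                   # P(A, C)
p_c_a  = p_ac / p_a[:, None]             # P(C | A)
p_b_ac = p / p_ac[:, None, :]            # P(B | A, C)
rebuilt = p_b_ac * p_c_a[:, None, :] * p_a[:, None, None]
print(np.allclose(rebuilt, p))           # True: P(A,B,C) = P(B|A,C) P(C|A) P(A)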

Example: suppose P(H | V) = 0.95 and P(V) = 10⁻⁶. Then the joint probability is P(H, V) = P(H | V) P(V) = 0.95 × 10⁻⁶.

For two discrete variables the joint probabilities sum to one, Σ_i Σ_j P(a_i, b_j) = 1, and a marginal is obtained by summing out the other variable: P(a_i) = Σ_j P(a_i, b_j). Example joint distribution P(x, y):

P(x, y)   x = 0   x = 1   x = 2
y = 0     0.32    0.03    0.01
y = 1     0.06    0.24    0.02
y = 2     0.02    0.03    0.27

Using P(a_i) = Σ_j P(a_i, b_j) and the table above, what is P(Y = 0)? 0.32, 0.36, 0.40, or 0.89?

All nine entries of the table sum to 1.0. Marginals: P(X = 1) = Σ_j P(X = 1, y_j) = 0.03 + 0.24 + 0.03 = 0.30, and P(Y = 0) = Σ_i P(x_i, Y = 0) = 0.32 + 0.03 + 0.01 = 0.36.
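The marginalisation can be reproduced numerically. A minimal sketch using the table values (NumPy and the name joint are illustrative):

import numpy as np

# Joint table P(x, y); rows indexed by y = 0, 1, 2 and columns by x = 0, 1, 2.
joint = np.array([[0.32, 0.03, 0.01],
                  [0.06, 0.24, 0.02],
                  [0.02, 0.03, 0.27]])

print(joint.sum())        # 1.0  (all entries sum to one)
p_x = joint.sum(axis=0)   # marginal P(x): sum over y
p_y = joint.sum(axis=1)   # marginal P(y): sum over x
print(p_x[1])             # P(X = 1) = 0.30
print(p_y[0])             # P(Y = 0) = 0.36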

Since P(y, x) = P(y | x) P(x), the conditional is P(y | x) = P(y, x) / P(x). Using the table above, what is P(Y = 0 | X = 1)? 0.80, 0.19, 0.83, or 0.10?

The conditional distribution of Y given X = 1 is P(y | X = 1) = P(X = 1, y) / P(X = 1):
y = 0: 0.03 / 0.30 = 0.1
y = 1: 0.24 / 0.30 = 0.8
y = 2: 0.03 / 0.30 = 0.1
which sums to 1.0. So P(Y = 0 | X = 1) = 0.1.
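The same conditioning step as a minimal sketch (again assuming NumPy; names are illustrative):

import numpy as np

joint = np.array([[0.32, 0.03, 0.01],
                  [0.06, 0.24, 0.02],
                  [0.02, 0.03, 0.27]])          # P(x, y), rows y, columns x

# Condition on X = 1: take the column x = 1 and renormalise by P(X = 1).
p_y_given_x1 = joint[:, 1] / joint[:, 1].sum()
print(p_y_given_x1)        # [0.1 0.8 0.1]
print(p_y_given_x1.sum())  # 1.0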

Bayes' theorem: P(a | b) = P(b | a) P(a) / P(b). Here P(a | b) is the posterior over a, P(a) is the prior over a, and P(b | a), viewed as a function of a, is the likelihood L(a) = P(b | a). The normalising constant is P(b) = Σ_i P(a_i, b) = Σ_i P(b | a_i) P(a_i).

Example with three hypotheses f_1, f_2, f_3:
P(f_i | w) = P(w | f_i) P(f_i) / P(w) = P(w | f_i) P(f_i) / Σ_j P(w | f_j) P(f_j).

i   P(w | f_i)   P(f_i)        P(w | f_i) P(f_i)
1   0.2          10 × 10⁻⁴     2.0 × 10⁻⁴
2   0.7           3 × 10⁻⁴     2.1 × 10⁻⁴
3   0.1           5 × 10⁻⁴     0.5 × 10⁻⁴

The products sum to 4.6 × 10⁻⁴, so the posteriors are P(f_1 | w) = 2.0/4.6, P(f_2 | w) = 2.1/4.6, P(f_3 | w) = 0.5/4.6.
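A minimal sketch of this normalisation, using the values from the table above (NumPy and the variable names are illustrative):

import numpy as np

prior      = np.array([10e-4, 3e-4, 5e-4])   # P(f_i)
likelihood = np.array([0.2, 0.7, 0.1])       # P(w | f_i)
joint_fw   = likelihood * prior              # P(w | f_i) P(f_i)
posterior  = joint_fw / joint_fw.sum()       # P(f_i | w), normalised by P(w)
print(joint_fw.sum())    # P(w) = 4.6e-4
print(posterior)         # [2.0/4.6, 2.1/4.6, 0.5/4.6] ~ [0.435, 0.457, 0.109]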

Events A and B are independent if P(a, b) = P(a) P(b). Since P(a, b) = P(a | b) P(b) always holds, independence gives P(a) P(b) = P(a | b) P(b), i.e. P(a | b) = P(a): knowing B tells us nothing about A. This is written A ⊥ B.

Example: two fair coin flips with outcomes H_1/T_1 and H_2/T_2. Each joint outcome has probability 1/4, and each marginal outcome has probability 1/2. Since 1/4 = 1/2 × 1/2 for every cell of the table, the two flips are independent.
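A minimal check of this factorisation (assuming NumPy; names are illustrative):

import numpy as np

# Joint distribution of two fair coin flips (rows: H1/T1, columns: H2/T2).
joint = np.array([[0.25, 0.25],
                  [0.25, 0.25]])
p1 = joint.sum(axis=1)   # marginal of the first flip: [0.5, 0.5]
p2 = joint.sum(axis=0)   # marginal of the second flip: [0.5, 0.5]
# Independence: the joint equals the outer product of the marginals.
print(np.allclose(joint, np.outer(p1, p2)))   # True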

[Figure: two joint distribution tables P(x, y) over x = 0, 1, 2 and y = 0, 1, 2, shown alongside plots with axes x and y.]

For three variables Y, D, N with joint distribution P(y, d, n):
P(y | d) = P(y, d) / Σ_y P(y, d) = Σ_n P(y, d, n) / Σ_y Σ_n P(y, d, n).
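A minimal sketch of summing out n from a joint table P(y, d, n) (the random table and NumPy usage are illustrative):

import numpy as np

rng = np.random.default_rng(0)
joint = rng.random((3, 2, 4))        # unnormalised table over (y, d, n)
joint /= joint.sum()                 # valid P(y, d, n)

p_yd = joint.sum(axis=2)             # P(y, d): sum over n
p_y_given_d = p_yd / p_yd.sum(axis=0, keepdims=True)   # P(y | d)
print(p_y_given_d.sum(axis=0))       # each column sums to 1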

[Graph: two parent nodes B and F with arrows into a common child node N.]

The joint distribution factorises as P(B, F, N) = P(B) P(F) P(N | F, B): B and F are independent a priori, p(B, F) = p(B) p(F), and both are parents of N in the graph B → N ← F.

From P(B, F, N) = P(B) P(F) P(N | F, B), the posterior on B once N is observed is

P(B | N) = Σ_f P(B) P(F = f) P(N | F = f, B) / Σ_b Σ_f P(B = b) P(F = f) P(N | F = f, B = b).

With P(B) = 0.001, P(F) = 0.1 and

P(N | f, b)   b = 1   b = 0
f = 1         0.95    0.8
f = 0         0.5     0.05

this gives P(B | N) = 0.0043.
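A minimal sketch reproducing this posterior in plain Python (variable names are illustrative):

# Posterior P(B = 1 | N = 1) in the three-node network B -> N <- F.
p_b = 0.001                      # P(B = 1)
p_f = 0.1                        # P(F = 1)
p_n = {1: {1: 0.95, 0: 0.8},     # P(N = 1 | F = f, B = b), indexed as p_n[f][b]
       0: {1: 0.5,  0: 0.05}}

def joint_n1(b):
    # sum over f of P(B = b) P(F = f) P(N = 1 | f, b)
    pb = p_b if b == 1 else 1 - p_b
    return pb * sum((p_f if f == 1 else 1 - p_f) * p_n[f][b] for f in (0, 1))

posterior = joint_n1(1) / (joint_n1(0) + joint_n1(1))
print(round(posterior, 4))       # 0.0043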

A larger factorisation: P(A, B, C, D, E, F) = P(A) P(B) P(C | A, B) P(D | B) P(E | D) P(F | D). The corresponding graph has edges A → C, B → C, B → D, D → E and D → F.

For two variables, P(A, B) = P(A) P(B | A) = P(B) P(A | B). The two factorisations correspond to the two graphs A → B (with tables P(A) and P(B | A)) and B → A (with tables P(B) and P(A | B)); both represent exactly the same joint distribution.

In the network B → N ← F, observing N couples B and F: in general p(B, F | N) ≠ p(B | N) p(F | N), even though B and F are independent before N is observed.
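To see this numerically with the numbers from the example above (a minimal sketch; the enumeration over values is illustrative):

# Compare p(B, F | N = 1) with p(B | N = 1) p(F | N = 1) for the network above.
p_b, p_f = 0.001, 0.1
p_n = {(1, 1): 0.95, (1, 0): 0.8, (0, 1): 0.5, (0, 0): 0.05}   # P(N = 1 | f, b), keyed by (f, b)

joint = {}   # unnormalised P(b, f, N = 1)
for b in (0, 1):
    for f in (0, 1):
        joint[(b, f)] = (p_b if b else 1 - p_b) * (p_f if f else 1 - p_f) * p_n[(f, b)]

z = sum(joint.values())                        # P(N = 1)
post = {k: v / z for k, v in joint.items()}    # P(b, f | N = 1)
pb_n = sum(post[(1, f)] for f in (0, 1))       # P(B = 1 | N = 1)
pf_n = sum(post[(b, 1)] for b in (0, 1))       # P(F = 1 | N = 1)
print(post[(1, 1)], pb_n * pf_n)               # the two differ, so B and F are coupled given N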

Exercise: in the network over A, B, C, D, E, F above, which pairs of variables are independent, e.g. A and E, C and E, B and F, A and B, C and D, E and F?


The expected value (mean) of a discrete random variable X is E[X] = m(X) = X̄ = µ = Σ_i x_i P(x_i). For example, E[X] = 0.48 × 3 + 0.26 × 1 + 0.26 × 0 = 1.7. Expectation is linear: E[aX + bY] = a E[X] + b E[Y].
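A minimal numeric check of this expectation, using the values from the example (NumPy usage is illustrative):

import numpy as np

x = np.array([3, 1, 0])
p = np.array([0.48, 0.26, 0.26])
print(np.dot(x, p))   # E[X] = 1.7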

For a function f of X and an event A, the conditional expectation is E[f(X) | A] = Σ_i f(x_i) P(x_i | A). The n-th moment of X is E[X^n] = Σ_i x_i^n P(x_i), and the n-th central moment is E[(X − µ)^n] = Σ_i (x_i − µ)^n P(x_i).

The variance is the second central moment, var(X) = E[(X − µ)²] = Σ_i (x_i − µ)² P(x_i), and the standard deviation is σ(X) = √var(X). Example: if X = ±1 with probability 1/2 each, then µ = 0 and σ(X) = √((1 − 0)² × 1/2 + (−1 − 0)² × 1/2) = 1. If instead P(X = 1) = 0.9 and P(X = −1) = 0.1, then µ = 0.80 and σ(X) = √((1 − 0.80)² × 0.9 + (−1 − 0.80)² × 0.1) = 0.60.
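A minimal sketch reproducing the second example (NumPy usage is illustrative):

import numpy as np

x = np.array([1, -1])
p = np.array([0.9, 0.1])
mu = np.dot(x, p)                  # E[X] = 0.8
var = np.dot((x - mu) ** 2, p)     # var(X) = 0.36
print(mu, np.sqrt(var))            # 0.8 0.6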

Independence: A ⊥ B means P(a, b) = P(a) P(b).