Pattern Recognition III: Set Theory and Probability. Michal Haindl.

Pattern Recognition III
Michal Haindl
Faculty of Information Technology, KTI, Czech Technical University in Prague
Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Prague, Czech Republic
European Social Fund. Prague & EU: We invest in your future.
MI-ROZ 2011-2012/Z, January 16, 2012

Outline
1 Set Theory
2 Probability

Set Theory
A, B sets, e.g. $A = \{\zeta_1,\ldots,\zeta_n\}$ or $A = \{x : c \le x \le d\}$
$S$ space (universe), $A, B \subset S$

Set Operations
sum (union): $A + B$, $A \cup B$; e.g. $A = \{1,2,3\}$, $B = \{3,4,5\}$, $A \cup B = \{1,2,3,4,5\}$
product (intersection): $AB$, $A \cap B$; e.g. $A \cap B = \{3\}$
$A_1,\ldots,A_n$ are mutually exclusive (disjoint) iff $A_i A_j = \emptyset$ for all $i \ne j$

Set Operations 2
complement: $\bar A = S - A$; $\bar S = \emptyset$, $A \bar A = \emptyset$, $A + \bar A = S$
difference: $A - B$, $A \setminus B$; $A - B = A \bar B = A - AB$; e.g. $A \setminus B = \{1,2\}$
De Morgan laws: $\overline{A + B} = \bar A \, \bar B$, $\overline{AB} = \bar A + \bar B$
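
These operations map directly onto Python's built-in set type. A minimal sketch, using the example sets from the slide and an assumed finite universe $S = \{1,\ldots,6\}$ (chosen only so that complements exist), checks the difference identity and the De Morgan laws:

```python
# Illustration of the set operations above with Python's built-in set type.
# The universe S = {1,...,6} is an assumption chosen so complements are finite.
A = {1, 2, 3}
B = {3, 4, 5}
S = {1, 2, 3, 4, 5, 6}

union = A | B                    # A + B = {1, 2, 3, 4, 5}
intersection = A & B             # AB = {3}
difference = A - B               # A - B = {1, 2}
A_bar = S - A                    # complement of A with respect to S
B_bar = S - B

# A - B equals A intersected with the complement of B, and also A - AB
assert difference == A & B_bar == A - (A & B)
# De Morgan: complement of the union is the intersection of complements, and vice versa
assert S - (A | B) == A_bar & B_bar
assert S - (A & B) == A_bar | B_bar
print(union, intersection, difference)
```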

Probability
definitions:
Classical (a priori definition as a ratio of favourable to total number of alternatives)
Axiomatic (measure, A. Kolmogoroff, 1933)
Relative frequency (Richard von Mises, 1936)
Probability as a measure of belief (inductive reasoning)
A, B events, $S$ space (certain event), $\emptyset$ impossible event, $0 \le P(\cdot) \le 1$
A, B mutually exclusive events: $AB = \emptyset$, $A + \bar A = S$

Classical
$P(A) = \frac{N_A}{N}$, $N_A$ no. of favourable outcomes, $N$ total no. of outcomes
if $A_i$ $\forall i$ are disjoint then $P(\sum_i A_i) = \sum_i P(A_i)$
1654 Blaise Pascal; 1812 Pierre-Simon Laplace, Théorie analytique des probabilités

Axiomatic
1. $P(A)$ is positive: $P(A) \ge 0$
2. probability of the certain event equals 1: $P(S) = 1$
3. if A and B are mutually exclusive, then $P(A+B) = P(A) + P(B)$
   (otherwise $P(A+B) = P(A) + P(B) - P(AB)$)

Relative Frequency
$n_A$ no. of appearances of event A in $n$ trials
$P(A) = \lim_{n \to \infty} \frac{n_A}{n}$
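
The relative-frequency definition can be illustrated numerically. A small simulation sketch, where the fair die and the event A = {even face} are assumptions made purely for illustration, compares the classical value $N_A/N$ with $n_A/n$ for growing $n$:

```python
import random

# Classical probability of an even face on a fair die: N_A / N = 3/6
faces = [1, 2, 3, 4, 5, 6]
event_A = {2, 4, 6}
p_classical = len(event_A) / len(faces)

# Relative-frequency estimate n_A / n for growing numbers of trials n
random.seed(0)
for n in (10, 1_000, 100_000):
    n_A = sum(random.choice(faces) in event_A for _ in range(n))
    print(n, n_A / n)            # approaches p_classical = 0.5 as n grows

print("classical:", p_classical)
```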

Conditional Probability
given $P(B) > 0$: $P(A \mid B) = \frac{P(AB)}{P(B)}$

Total Probability
$A_i$ $\forall i$ mutually exclusive events, $\sum_{i=1}^{n} A_i = S$
$P(B) = \sum_{i=1}^{n} P(B A_i) = \sum_{i=1}^{n} P(B \mid A_i) P(A_i)$

Independent Events
def. $P(A, B) = P(A) P(B)$
then $P(A \mid B) = P(A)$, and $P(A_1,\ldots,A_n) = \prod_i P(A_i)$
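
A numeric sketch of the conditional-probability and total-probability formulas, using a hypothetical partition $A_1, A_2, A_3$ of $S$; the values of $P(A_i)$ and $P(B \mid A_i)$ are assumptions chosen only for illustration:

```python
# Hypothetical partition A_1, A_2, A_3 of S with assumed probabilities.
P_A = [0.5, 0.3, 0.2]            # P(A_i), mutually exclusive and exhaustive, sums to 1
P_B_given_A = [0.1, 0.4, 0.8]    # P(B | A_i)

# Total probability: P(B) = sum_i P(B | A_i) P(A_i)
P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))

# Conditional probability: P(A_1 | B) = P(A_1 B) / P(B) = P(B | A_1) P(A_1) / P(B)
P_A1_and_B = P_B_given_A[0] * P_A[0]
P_A1_given_B = P_A1_and_B / P_B

# Independence check: A_1 and B are independent iff P(A_1, B) = P(A_1) P(B)
independent = abs(P_A1_and_B - P_A[0] * P_B) < 1e-12
print(P_B, P_A1_given_B, independent)
```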

Bayes Theorem
$A_i$ $\forall i$ mutually exclusive events, $\sum_{i=1}^{n} A_i = S$
$P(A_i \mid B) = \frac{P(B \mid A_i) P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j) P(A_j)}$
Thomas Bayes: An Essay Towards Solving a Problem in the Doctrine of Chances, 1764

Random Variable
$X : \zeta \mapsto$ (real/complex) number, $\zeta$ experiment outcome
distribution function of the r.v. X: $F_X(x) = P(X \le x)$
$F(-\infty) = 0$, $F(+\infty) = 1$
nondecreasing function of $x$: $F(x_1) \le F(x_2)$ for $x_1 < x_2$
continuous from the right: $F(x^+) = F(x)$
density function: $F(x) = \int_{-\infty}^{x} f(t)\,dt$, $f(x) = \frac{dF(x)}{dx}$, nonnegativity $f(x) \ge 0$
expected value: $E\{X\} = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{-\infty}^{\infty} x\,dF(x)$
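
The distribution-function properties and the expected-value integral can be checked for a concrete random variable. A sketch assuming $X$ is exponential with rate $\lambda = 2$ (a choice made only for illustration; the Bayes formula above reduces to the same arithmetic as the previous sketch):

```python
import math

lam = 2.0                              # assumed rate parameter, illustration only

def F(x):                              # distribution function F_X(x) = P(X <= x)
    return 0.0 if x < 0 else 1.0 - math.exp(-lam * x)

def f(x):                              # density f(x) = dF(x)/dx >= 0
    return 0.0 if x < 0 else lam * math.exp(-lam * x)

# F(-inf) = 0, F(+inf) = 1, and F is nondecreasing
assert F(-100.0) == 0.0 and abs(F(100.0) - 1.0) < 1e-12
xs = [i * 0.01 for i in range(1000)]
assert all(F(a) <= F(b) for a, b in zip(xs, xs[1:]))

# Expected value E{X} = integral of x f(x) dx (crude Riemann sum); should be 1/lam
E_X = sum(x * f(x) * 0.01 for x in xs)
print(E_X, 1.0 / lam)
```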

Discrete Random Variable
$F_X(x) = \sum_i P(X = x_i)$, $i : x_i \le x$ (staircase form)
expected value $E\{X\} = \sum_i x_i P(X = x_i)$

Mixed Random Variable
$F_X(x)$ discontinuous, but not of a staircase form

Random Variable 2
moments $m_k = E\{X^k\}$
central moments $\mu_k = E\{(X - E\{X\})^k\}$
$\mu_2 = \sigma^2$ variance (dispersion), $\sigma = \sqrt{\mu_2}$ standard deviation
median $\tilde x$: $F(\tilde x) \le \frac12$, $F(\tilde x + 0) \ge \frac12$

Conditional Distribution
$F_X(x \mid z) = P(X \le x \mid z) = \frac{P(X \le x, Z = z)}{P(z)}$, $f(x \mid z) = \frac{dF(x \mid z)}{dx}$
total probability ($\sum_{i=1}^{n} P(z_i) = 1$): $F_X(x) = \sum_i F_X(x \mid z_i) P(z_i)$
conditional expected value $E\{X \mid z\} = \int_{-\infty}^{\infty} x f(x \mid z)\,dx$
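
For a discrete r.v. the staircase $F_X$, the moments and the median reduce to finite sums. A sketch with an assumed pmf on $\{1,2,3,4\}$ (the probabilities are illustrative, and the median is computed by the simplified rule "smallest $x$ with $F(x) \ge \frac12$"):

```python
# Assumed pmf of a discrete r.v. X on {1, 2, 3, 4}; the probabilities are illustrative.
pmf = {1: 0.1, 2: 0.4, 3: 0.3, 4: 0.2}

def F(x):                                   # staircase distribution function
    return sum(p for xi, p in pmf.items() if xi <= x)

E_X = sum(xi * p for xi, p in pmf.items())             # E{X} = sum_i x_i P(X = x_i)
m2 = sum(xi**2 * p for xi, p in pmf.items())           # moment m_2 = E{X^2}
var = sum((xi - E_X)**2 * p for xi, p in pmf.items())  # central moment mu_2 = sigma^2
sigma = var ** 0.5                                     # standard deviation

# Median: smallest x with F(x) >= 1/2 (simplified version of the slide's definition)
median = min(x for x in pmf if F(x) >= 0.5)
print(E_X, m2, var, sigma, median)
```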

Joint Distribution
$Y = \{Y_1,\ldots,Y_n\}$
$F_Y(y) = P(Y_1 \le y_1,\ldots,Y_n \le y_n)$
$f_Y(y) = \frac{\partial^n F_Y(y)}{\partial y_1 \cdots \partial y_n}$
$E\{Y\} = (E\{Y_1\},\ldots,E\{Y_n\})$
$\mathrm{cov}\{Y_i, Y_j\} = E\{(Y_i - E\{Y_i\})(Y_j - E\{Y_j\})\}$

Marginal Distribution
$F_{Y_i}(y_i) = F_Y(\infty,\ldots,\infty,y_i,\infty,\ldots,\infty) = \int_{-\infty}^{y_i} \int_{R^{n-1}} f(y_1,\ldots,y_{i-1},t,y_{i+1},\ldots,y_n)\,dy_1 \cdots dy_{i-1}\,dy_{i+1} \cdots dy_n\,dt$
$f_k(y_1,\ldots,y_k) = \int_{R^{n-k}} f(y_1,\ldots,y_n)\,dy_{k+1} \cdots dy_n$

Normal Distribution
Gaussian $N(\mu,\Sigma)$: $f(y) = (2\pi)^{-\frac{n}{2}} |\Sigma|^{-\frac12} \exp\left\{-\frac12 (y-\mu)^T \Sigma^{-1} (y-\mu)\right\}$
$E\{y\} = \mu$, $E\{(y-\mu)(y-\mu)^T\} = \Sigma$
$\Sigma$ regular, positive definite matrix
if $\Sigma = \mathrm{diag}\{\Sigma_{1,1},\ldots,\Sigma_{n,n}\}$ then $y_1,\ldots,y_n$ are independent
$\tilde y \subset y \Rightarrow \tilde y \sim N$ (any subvector is Gaussian); the conditional distribution is Gaussian; any linear combination of the $y_i$ is Gaussian
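
A sketch evaluating the multivariate Gaussian density and checking its moments with NumPy, for an assumed 2-dimensional $\mu$ and $\Sigma$ (the values are illustrative; $\Sigma$ is chosen symmetric and positive definite):

```python
import numpy as np

# Assumed 2-D Gaussian N(mu, Sigma); mu and Sigma are illustrative values only.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])          # symmetric, positive definite

def gaussian_density(y, mu, Sigma):
    """f(y) = (2*pi)^(-n/2) |Sigma|^(-1/2) exp(-0.5 (y-mu)^T Sigma^{-1} (y-mu))."""
    n = len(mu)
    d = y - mu
    norm = (2 * np.pi) ** (-n / 2) * np.linalg.det(Sigma) ** (-0.5)
    return norm * np.exp(-0.5 * d @ np.linalg.inv(Sigma) @ d)

print(gaussian_density(np.array([1.0, -2.0]), mu, Sigma))   # density at the mean

# Sample-based check of E{y} = mu and E{(y-mu)(y-mu)^T} = Sigma
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=200_000)
print(samples.mean(axis=0))              # approximately mu
print(np.cov(samples, rowvar=False))     # approximately Sigma

# Marginal of y_1: any subvector of a Gaussian is Gaussian, here N(mu_1, Sigma_11)
print(samples[:, 0].mean(), samples[:, 0].var())
```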