ECE 450 Lecture 1

"God doesn't play dice." - Albert Einstein
"As far as the laws of mathematics refer to reality, they are not certain; as far as they are certain, they do not refer to reality." - Albert Einstein

Lecture Overview
Announcements
Set theory review
Vocabulary: experiments, outcomes, trials, events, sample space
3 axioms of probability
Combinatorics
Probability: what is it? (4 approaches)
EE Application: Information Theory

Announcements
Regular Office Hours: ___, ___, JD 4414
Syllabus Highlights: Grading, HW Due Dates, Recorded Lectures and Tutorials
Course Web Page: www.csun.edu/~dvanalp (Follow links: Current Semester, ECE 450)

Set Theory
On your own time, review set complements, unions, intersections, subsets, set differences, and Venn diagrams from the text, pp. 13-19.
Recall: Sets A and B are mutually exclusive (m.e., or disjoint) iff A ∩ B = ∅ (the empty set).
De Morgan's Laws: (A ∪ B)' = A' ∩ B'; (A ∩ B)' = A' ∪ B'
Recall that a set with n elements has 2^n subsets.
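
A minimal MATLAB sketch (not part of the original slides) that checks De Morgan's laws numerically; the universal set S and the sets A, B are arbitrary example choices:

  % Check De Morgan's laws on small example sets (sketch; S, A, B chosen arbitrarily)
  S = 1:6;                        % universal set, e.g. the faces of a die
  A = [1 3];  B = [3 5];          % two example sets
  compl = @(X) setdiff(S, X);     % complement relative to S
  isequal(compl(union(A,B)), intersect(compl(A), compl(B)))   % (A u B)' = A' n B'  -> true
  isequal(compl(intersect(A,B)), union(compl(A), compl(B)))   % (A n B)' = A' u B'  -> true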

Vocabulary for Probability
An experiment is some action that has outcomes (ζ, zeta) belonging to a fixed set of possible outcomes called the sample space (or the universal set, or the probability space), S.
Each single performance of the experiment is called a trial.
Chance experiment = random experiment, denoted E.
Before performing the experiment, the actual outcome is unknown.

Examples of Experiments
Example 1: E1 = single toss of a die
S = {1, 2, 3, 4, 5, 6} (sample space); S is finite, countable.
Example 2: E2 = turning on a radio receiver at time t = 0 and measuring the voltage at a certain point in the circuit t seconds later; define the outcome ζ(t) = v(t), where t is fixed.
S = {v: -∞ < v < ∞} (sample space); uncountably infinite (ignoring measurement limits).

Examples of Experiments, continued
Example 3: E3 = count the number of photo-electrons (e's) emitted by a particular surface when a particular light beam falls on it for t seconds; define the outcomes ζ0: 0 e's counted, ζ1: 1 e counted, ζ2: 2 e's counted, ...
S = {ζ0, ζ1, ζ2, ...}; countably infinite.

More Probability Vocabulary
Any subset of the sample space is called an event. Thus, A is an event if A ⊆ S.
The elements of the event A are the individual outcomes, ζ, belonging to A.
An experiment with n possible outcomes has 2^n events associated with it.
Example 1, cont.: A = "an odd # appears" = {1, 3, 5}; B = "an even # appears" = {2, 4, 6} = A' (A-complement)

Examples of Events & More Vocabulary
Example 2, cont.: A = "voltage between 2 and 4, inclusive" = {v: 2 ≤ v ≤ 4}; B = "voltage greater than 3" = {v: v > 3}
Example 3, cont.: A = "fewer than 4 e's counted" = {ζ0, ζ1, ζ2, ζ3}; B = "a negative # of e's counted" = ∅ (the null set or empty set)
We say event A occurs whenever any outcome in A occurs.
Elementary events are those that consist of a single outcome; compound events consist of several outcomes.

Axioms of Probability
Axiomatic approach due to Kolmogorov (a Russian mathematician, early 1900's).
A probability is a # assigned to an event, A, according to three rules or axioms:
Axiom 1: Pr(A) ≥ 0 (no negative probabilities)
Axiom 2: Pr(S) = 1 (something has to happen)
Axiom 3: If A & B are m.e., then Pr(A ∪ B) = Pr(A) + Pr(B) (for 2 m.e. events, probabilities are additive)
We say event A occurs with probability Pr(A).

Corollaries to the Axioms
Corollary 1: Pr[A'] = 1 - Pr[A]
Proof: Pr(S) = Pr[A' ∪ A] = Pr(A') + Pr(A) (why? A' and A are m.e., so Axiom 3 applies)
1 = Pr(A') + Pr(A) (why? Axiom 2)
⇒ Pr(A') = 1 - Pr(A)
Example: Consider a 52-card deck. Pr(ace) = 4/52 = 1/13 (since there are 4 aces in the deck).
Pr(not getting an ace) = Pr(2, 3, ..., 10, J, Q, K) = 1 - 1/13 = 12/13 (by Cor. 1)
Note that the events {ace} and {2, ..., 10, J, Q, K} are complementary events.
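
A one-line MATLAB check of the ace example using Corollary 1 (a sketch, nothing beyond the arithmetic on the slide):

  p_ace = 4/52;                  % 4 aces in a 52-card deck
  p_not_ace = 1 - p_ace;         % Corollary 1: Pr(A') = 1 - Pr(A)
  fprintf('Pr(ace) = %.4f, Pr(not ace) = %.4f\n', p_ace, p_not_ace)   % 0.0769 and 0.9231 (= 1/13 and 12/13)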

Corollaries, continued
Corollary 2: 0 ≤ Pr(A) ≤ 1
Proof: Pr(A) ≥ 0 by Ax. 1; and Pr(A) = 1 - Pr(A') (Cor. 1) ≤ 1, since Pr(A') ≥ 0 (Ax. 1).
Corollary 3: Pr(∅) = 0
Proof: Pr(S) = Pr(S ∪ ∅) = Pr(S) + Pr(∅) (since S, ∅ are m.e.) ⇒ Pr(∅) = 0.

Corollaries, continued
Corollary 4: Pr(A ∪ B) = Pr(A) + Pr(B) - Pr(A ∩ B)
Proof: Pr(A ∪ B) = Pr(A ∪ (B ∩ A')) = Pr(A) + Pr(B ∩ A') (m.e.)  (1)
Venn Diagram: S, A, B (to be completed in class)

Corollaries, continued
Similarly: Pr(B) = Pr((A ∩ B) ∪ (A' ∩ B)) = Pr(A ∩ B) + Pr(B ∩ A') (m.e.)  (2)
Venn Diagram: S, A, B (to be completed in class)
Now subtract equation (1) from equation (2): Pr(B) - Pr(A ∪ B) = Pr(A ∩ B) - Pr(A), i.e., Pr(A ∪ B) = Pr(A) + Pr(B) - Pr(A ∩ B) (proving Cor. 4)

Example (verifying the corollary)
Experiment: Toss one die; find Pr(A ∪ B) for A, B below.
Let A = {1, 3}, B = {3, 5}. Note: A ∩ B = {3}.
Pr(A) = Pr({1} ∪ {3}) = Pr{1} + Pr{3} = 1/6 + 1/6 = 1/3; similarly, Pr(B) = 1/3.
Pr{1, 3, 5} = Pr(A ∪ B) = Pr(A) + Pr(B) - Pr(A ∩ B) = 1/3 + 1/3 - Pr{3} = 1/3 + 1/3 - 1/6 = 3/6 = ½ (agreeing with our intuition)
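
A quick Monte Carlo sketch of the same die example (assuming a fair die); the simulated relative frequencies should land near the exact values above:

  N = 1e6;                          % number of simulated tosses
  tosses = randi(6, N, 1);          % uniform integers 1..6
  inA = ismember(tosses, [1 3]);    % event A = {1, 3}
  inB = ismember(tosses, [3 5]);    % event B = {3, 5}
  lhs = mean(inA | inB);                            % r(A u B)
  rhs = mean(inA) + mean(inB) - mean(inA & inB);    % r(A) + r(B) - r(A n B)
  fprintf('r(A u B) = %.4f, r(A)+r(B)-r(AB) = %.4f (both near 1/2)\n', lhs, rhs)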

Combinatorics, Part 1: Combinations (Binomial Coefficients)
nCk = "n choose k" = n!/(k!(n-k)!)
= # of ways to choose k objects out of n available objects if the order of the objects doesn't matter
= combination of n objects, taken k at a time
= # of subsets of size k for a set with n elements
Example: # of possible 5-card poker hands: 52C5 = 52!/(5! 47!) = 2,598,960 (MATLAB: >> nchoosek(52,5))

Combinatorics Example: 5-card Poker
Example: Pr(3 Spades in a 5-card poker hand) = [13C3 × 39C2] / 52C5 ≈ .082
numerator = # of ways to choose 3 Spades and 2 non-Spades
denominator = # of possible 5-card poker hands
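
The same computation in MATLAB, using nchoosek as on the earlier slide (a sketch):

  p_3spades = nchoosek(13,3) * nchoosek(39,2) / nchoosek(52,5);   % choose 3 of 13 Spades, 2 of 39 non-Spades
  fprintf('Pr(3 Spades) = %.4f\n', p_3spades)                     % about 0.0815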

Combinatorics Example: 5-card Poker
Example: Pr(full house) = ??? (3 of one rank, 2 of another; e.g. KKK66)
# of ways to choose the first rank: 13
# of ways to choose the second rank: 12
# of ways to choose 3 of the first kind: 4C3 = 4
# of ways to choose 2 of the second kind: 4C2 = 6
Pr(full house) = (13 × 4 × 12 × 6) / 52C5 ≈ 1.44 × 10^-3
(ranks: the numerical values of the cards, as opposed to the suits)
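
A MATLAB sketch of the full-house probability, multiplying the four factors listed above:

  p_fullhouse = 13 * nchoosek(4,3) * 12 * nchoosek(4,2) / nchoosek(52,5);   % 13 * 4 * 12 * 6 / 2,598,960
  fprintf('Pr(full house) = %.2e\n', p_fullhouse)                           % about 1.44e-03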

Combinatorics, Part 2: Permutations or Arrangements
nPk = n!/(n-k)!
= permutation of n objects taken k at a time
= # of ways to arrange k out of n objects, assuming that the order matters
Example 1: # of possible license plates if they are formed from the 26 letters of the alphabet, are 5 letters in length, and no letter can be repeated:
26P5 = 26!/21! = 26 × 25 × 24 × 23 × 22 = 7,893,600
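
A small MATLAB sketch for permutations; the helper nP is an assumed name, computing n!/(n-k)! as a product so that no large factorials are formed:

  nP = @(n,k) prod(n-k+1:n);        % n!/(n-k)! = (n-k+1)*(n-k+2)*...*n
  plates = nP(26,5);                % 26*25*24*23*22
  fprintf('26P5 = %d\n', plates)    % 7,893,600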

Combinatorics Examples, continued
Example 2: # of distinct seating arrangements possible for a group of 6 students, all 6 in a row: 6P6 = 6! = 6 × 5 × 4 × 3 × 2 × 1 = 720
Example 3: # of distinct seating arrangements possible for 2 students in a row, chosen from a group of 6 students: 6P2 = 6!/4! = 6 × 5 = 30
Summary: use combinations when counting the number of ways to select objects if order doesn't matter, as in card games; use permutations when counting the number of ways to arrange objects, when order does matter.

Interpretations of Probability: A. Classical Concept
The classical concept assumes all outcomes are equally likely:
Pr(A) = (# of outcomes in A) / (# of possible outcomes in S)
Justified (for some problems) by the "Principle of Indifference" or "Maximum Ignorance": no reason to favor one outcome over another.
Usually applied to gambling problems: dice, cards, coins, ...
Example: Pr(bridge hand of 13 cards out of 52 has exactly one ace); solution follows.

Classical Probability: Example
Pr(bridge hand of 13 out of 52 cards has exactly 1 ace)
= (# of bridge hands with exactly 1 ace) / (# of possible bridge hands)
= [4C1 × 48C12] / 52C13 ≈ .439
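
The same classical-probability computation in MATLAB (a sketch):

  p_one_ace = nchoosek(4,1) * nchoosek(48,12) / nchoosek(52,13);   % 1 of the 4 aces, 12 of the 48 non-aces
  fprintf('Pr(exactly 1 ace) = %.4f\n', p_one_ace)                 % about 0.4388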

Interpretations of Probability: B. Relative Frequency Concept (von Mises)
Repeat an experiment N times; suppose (for example) that there are 4 possible outcomes, or elementary events, called A, B, C, and D.
Let N_A be the # of times event A occurs; similarly define N_B, N_C, and N_D. Clearly, N = N_A + N_B + N_C + N_D.
Define the relative frequency of event A as: r(A) = N_A / N
Relative frequency approach: Pr(A) = lim (N → ∞) r(A)
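
A MATLAB sketch of the relative-frequency idea for an assumed fair coin (Pr(heads) = 0.5); r(A) settles toward Pr(A) as N grows:

  for N = [100 10000 1000000]
      flips = rand(N,1) < 0.5;                     % 1 = heads, with probability 0.5
      fprintf('N = %7d:  r(heads) = %.4f\n', N, mean(flips));
  end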

Relative Frequency Concept, continued
Concept: the best predictor of future performance is past performance.
The relative frequency interpretation justifies Monte Carlo experiments (& thus computer simulations).
Typical application: actuarial predictions.
Example: Pr{a 40-yr.-old man dies within 1 yr.} = (# of 40-yr.-old men who died in calendar year x) / (# of 40-yr.-old men at the start of calendar year x)

Interpretations of Probability: C. Distribution Concept
Think of 1 unit of sand, representing the probability, to be distributed over the sample space S.
[Figure: 1 unit of sand spread over S]
Sand is piled highest over the most likely outcomes in S.

Interpretations of Probability: D. Measure of Likelihood View
Probability is a function whose domain is the collection of events (subsets of the sample space) and whose range is the set of real numbers between 0 and 1:
Impossible events → 0
Unlikely events → near 0
Very likely events → near 1
Certain events → 1

EE Application: Information Theory (Subset of Communication Theory)
Channels can only accommodate so much information. (There exists an information capacity and a maximum rate.)
How do we measure information? Some concepts:
Communication of information requires prior uncertainty (Ex: whistle the musical note F#)
Prior uncertainty about the outcome ⇒ surprise on occurrence of the event
e.g., ask: "Will I be alive in n years?"

Information Theory Concepts & Definition
n = 1: "yes" → little surprise or information
n = 10: "yes" → a little more information
n = 100: "yes" → very much surprise or information
Thus, less likely events yield greater surprise ⇒ more information.
Definition: The information in event A is given by I(A) = log(1/Pr(A)) = -log(Pr(A))

Information, continued
Units of measure for information in event A: I(A) = -Log[Pr(A)]
bits if the log is base 2
nats if the log is base e (natural log)
hartleys if the log is base 10 (common log)
Example 1: Binary Alphabet, S = {0, 1}
(Think of communicating a string of 1's and 0's, say ASCII, where 1's and 0's are equally likely.)
Symbol s = 0: Pr(s) = ½, I(s) = Log2(1/½) = Log2(2) = 1 bit
Symbol s = 1: Pr(s) = ½, I(s) = Log2(1/½) = Log2(2) = 1 bit
Average info. per symbol: 1 bit
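
A MATLAB sketch of the three units, for the p = ½ symbol of Example 1:

  p = 1/2;
  I_bits     = -log2(p);            % 1 bit
  I_nats     = -log(p);             % about 0.693 nats (base-e log)
  I_hartleys = -log10(p);           % about 0.301 hartleys (base-10 log)
  fprintf('%.3f bits = %.3f nats = %.3f hartleys\n', I_bits, I_nats, I_hartleys)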

Information, continued
Example 2: Binary Alphabet, S = {0, 1}
This time we'll still send a stream of 1's and 0's, but they are not equally likely; say Pr(0) = ¼, Pr(1) = ¾.
Symbol s = 0: Pr(s) = ¼, I(s) = Log2(4) = 2 bits
Symbol s = 1: Pr(s) = ¾, I(s) = Log2(4/3) ≈ .42 bits
Average info. per symbol: 1/4(2) + 3/4(.42) ≈ .815 bits
Recall: To convert logs from one base to another, log_b(x) = log_a(x) / log_a(b)
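
Example 2 worked in MATLAB, using the change-of-base rule to get base-2 logs from natural logs (a sketch):

  p0 = 1/4;  p1 = 3/4;
  I0 = log(1/p0) / log(2);          % 2 bits
  I1 = log(1/p1) / log(2);          % about 0.415 bits (rounded to .42 on the slide)
  avg_info = p0*I0 + p1*I1;         % about 0.811 bits/symbol (.815 on the slide, from the rounded .42)
  fprintf('I(0) = %.3f bits, I(1) = %.3f bits, average = %.3f bits\n', I0, I1, avg_info)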

Information & Entropy
Definition: The entropy of the source, S, is the average information per symbol, given by H(S) = Σ (s∈S) I(s) Pr(s), where I(s) is the symbol information and Pr(s) is the probability of the symbol.
Due to bandwidth constraints, a source with a large entropy is desirable.
For our examples:
Equally likely symbols ⇒ H(S) = 1 bit/symbol
Pr(0) = ¼, Pr(1) = ¾ ⇒ H(S) ≈ .815 bits/symbol
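
A MATLAB sketch of the entropy formula, applied to both example sources (assumes all probabilities are nonzero):

  H = @(p) -sum(p .* log2(p));      % H(S) = sum over s of Pr(s)*I(s), with I(s) = -log2(Pr(s))
  H([1/2 1/2])                      % 1 bit/symbol (equally likely symbols)
  H([1/4 3/4])                      % about 0.811 bits/symbol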

Review
Pr(A ∪ B) = Pr(A) + Pr(B) - Pr(A ∩ B) (general rule)
Pr(A') = 1 - Pr(A)
Combination of n things taken k at a time: nCk = n!/(k!(n-k)!)
Information in the event A: I(A) = -log(Pr(A))
Entropy in source S with symbols s: H(S) = Σ (s∈S) I(s) Pr(s)