Probability Theory


Introduction

The Fourier series and the Fourier transform are widely used techniques for modelling deterministic signals. In a typical communication system, however, the output of an information source (e.g. the binary output symbols from a computer) is random in nature. It is not possible to express such random signals by explicit time functions, but a random signal may exhibit certain statistical properties which can be modelled in terms of probabilities. In this way, the behaviour and the performance of a communication system in the presence of random signals can be analysed.

Probability theory deals with concepts, not with reality. It is formulated so as to relate to the real world, but not in an exact manner; results are deduced from certain axioms. A central concept in the theory of probability is the event. By associating probabilities with the various events of a clearly specified experiment, we can derive useful results about the experiment. Events are combined in various ways to form other events, and set theory is used to describe these operations.

Set, Probability, Relative Frequency

Set definitions:
1. Set - a collection of objects.
2. Elements - the objects of the set.
   2.1 Subset of a set A - a set whose elements are also elements of A.
3. We shall only consider sets whose elements are taken from a largest set S, which we call the space or universal set. Hence all sets will be subsets of S.
4. An element of a set may be any kind of object; it might itself be a collection of objects from other sets, hence the names class of sets and set of sets. E.g. for S = {e_1, e_2} there are 2^2 = 4 subsets, namely φ, {e_1}, {e_2} and {e_1, e_2}. The element of S is e_1, whereas {e_1} is a set whose element is e_1.
5. The null or empty set φ contains no elements. We do not wish to exclude the possibility of an empty set, because certain operations on sets may leave us with one.
6. In probability, we assign probabilities to the various subsets of the space S, not to its elements; these subsets are called events. On the other hand, we define functions (e.g.
random variables) only on elements.

Probability definitions:
1. Outcome - the result of an experiment (an element of the sample space S).
2. Sample space - the set of all possible outcomes of the random experiment.
3. Random experiment - an experiment whose outcome is not known in advance. E.g. if we throw a die and a 4 comes up, the outcome of the random experiment is 4. The outcome is the element 4, not the event {4}. At this trial of the experiment, the following events all occur: {4}, {4, 3} and {even}.
4. Event - a subset of the sample space S, i.e. a collection of outcomes of a random experiment. The subsets of S form a Borel field, so that A_1 + A_2 + ... and A_1 A_2 ... are also events.
5. Elementary event - an event with a single element or outcome. Element and outcome are interchangeable in probability theory.

In probability, we might not want to assign a probability to every collection of outcomes. For example, it might not be possible to assign probabilities to all events if S has infinitely many elements. Another reason is that we may want to derive the probabilities of other events from those of some events of S, not all of them.

The universal set S is the certain event, since it contains all possible outcomes of the experiment. The empty set φ is the impossible event, since a trial of an experiment must always have an outcome.

N.B.: An event of zero probability is not necessarily the impossible event, because it may still contain elements. Likewise, an event of unity probability is not necessarily the certain event, because it may not contain all elements.

1. Consider a sequence of m independent trials of a random experiment which has n possible outcomes; the occurrence of any one outcome is not in any way influenced by the occurrence of any other. Notice that we are considering repeated trials of one given random experiment. In the sequence of m trials, the event A occurs m_A times, the event B occurs m_B times, and so on. Thus

m_A + m_B + ... + m_n = m                          (1)
(m_A/m) + (m_B/m) + ... + (m_n/m) = 1.             (2)

The term m_A/m is defined as the relative frequency of occurrence of the event A.
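As a quick numerical illustration (not part of the original notes; it assumes a fair six-sided die), the sketch below counts how often the event A = {2, 4, 6} occurs in m trials. The ratio m_A/m settles near a fixed value as m grows:

```python
import random

random.seed(1)  # reproducible illustration

# Sample space of the die experiment; the event A = {even} is a subset of S.
S = [1, 2, 3, 4, 5, 6]
A = {2, 4, 6}

def relative_frequency(m, event=frozenset(A)):
    """Return m_A/m, the relative frequency of `event` over m trials."""
    m_A = sum(1 for _ in range(m) if random.choice(S) in event)
    return m_A / m

for m in (100, 10_000, 1_000_000):
    print(m, relative_frequency(m))
# The ratios approach P(A) = 3/6 = 0.5 as m increases.
```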

As m → ∞, the term m_A/m tends towards a fixed value. This value is called the probability of the event A and is denoted by P(A).

2. Consider now a sequence of m independent trials of a random experiment which has n possible outcomes; again, the occurrence of any one outcome is not in any way influenced by the occurrence of any other. In the sequence of m trials, the outcome a occurs m_a times, the outcome b occurs m_b times, and so on. Thus

m_a + m_b + ... + m_n = m
(m_a/m) + (m_b/m) + ... + (m_n/m) = 1.

As m → ∞, the term m_a/m again tends towards a fixed value. This value is called the probability of the outcome a and is denoted by P(a). It can be seen that

0 ≤ m_A/m ≤ 1                                      (3)
0 ≤ P(A) ≤ 1                                       (4)
P(S) = 1                                           (5)
P(φ) = 0                                           (6)

Consider the events A and B in m trials of a given experiment. Suppose that
1. the event A B' (A occurs but B does not) occurs m_1 times,
2. the event A' B (B occurs but A does not) occurs m_2 times,
3. the event AB (both occur) occurs m_3 times,
4. the event (A + B)' (neither occurs) occurs m_4 times.

The above four events are mutually exclusive, so that no more than one of them can occur at any one trial, and their union is the certain event, so that at least one of them must occur at every trial. Thus

m_1 + m_2 + m_3 + m_4 = m.                         (7)

Also

m_A = m_1 + m_3                                    (8)
m_B = m_2 + m_3                                    (9)
m_AB = m_3                                         (10)

m_{A+B} = m_1 + m_2 + m_3.                         (11)

Thus

m_{A+B}/m = (m_A/m) + (m_B/m) - (m_AB/m)           (12)

so that

P(A+B) = P(A) + P(B) - P(AB).                      (13)

P(AB) is called the joint probability of the event AB. If A and B are mutually exclusive, then

P(AB) = 0                                          (14)
P(A+B) = P(A) + P(B).                              (15)

Summary
1. P(A) ≥ 0
2. P(S) = 1
3. P(A+B) = P(A) + P(B) - P(AB)
4. P(AB) = 0 for mutually exclusive events A and B, so that P(A+B) = P(A) + P(B).

1, 2 and 4 are axioms; 3 can be deduced from these axioms, although we did the reverse here.

Conditional Probability

In many applications, we perform experiments that consist of sub-experiments. Suppose a random experiment E consists of two sub-experiments E_1 and E_2. We take the event A from experiment E_1 and the event B from experiment E_2, and consider the event AB as an event of experiment E. AB is the intersection of events from experiments E_1 and E_2, and P(AB) is called the joint probability of the event AB in the experiment E. Quite often, the occurrence of event B may depend on the occurrence of event A.

As before, consider the events A and B in m trials of a given experiment. Suppose that
1. the event A B' occurs m_1 times,

2. the event A' B occurs m_2 times,
3. the event AB occurs m_3 times,
4. the event (A + B)' occurs m_4 times.

The relative frequency of the event A, on the condition that B has occurred, is the number of times both A and B occur divided by the number of times B occurs. Since

m_B = m_2 + m_3                                    (16)
m_AB = m_3,                                        (17)

we therefore have

P(A/B) = m_AB/m_B = m_3/(m_2 + m_3)                (18)

which is the conditional probability of A given B. Similarly,

P(B/A) = m_BA/m_A = m_3/(m_1 + m_3)                (19)

the conditional probability of B given A. From the assumptions concerning A and B,

P(A) = m_A/m = (m_1 + m_3)/m
P(B) = m_B/m = (m_2 + m_3)/m
P(AB) = m_AB/m = m_3/m
P(A/B) = m_AB/m_B = m_3/(m_2 + m_3)
P(B/A) = m_BA/m_A = m_3/(m_1 + m_3)

It is also assumed that P(A) ≠ 0 and P(B) ≠ 0. Thus

P(B)P(A/B) = P(AB), or P(A/B) = P(AB)/P(B)         (20)

and

P(A)P(B/A) = P(AB), or

P(B/A) = P(AB)/P(A).                               (21)

If

P(AB) = P(A)P(B),                                  (22)

then the events A and B are statistically independent. From equations (20) and (22),

P(A/B) = P(A)                                      (23)

and from equations (21) and (22),

P(B/A) = P(B).                                     (24)

Eg. 1: Suppose A and B have no common elements (i.e. they are mutually exclusive). Under these conditions AB = φ, so

P(A/B) = P(AB)/P(B) = 0.

Eg. 2: Given Figure 1, find P[(A+B)/M]. Since AB = φ, we also have (AM)(BM) = φ, so

P[(A+B)M] = P[AM + BM]
          = P(AM) + P(BM) - P[(AM)(BM)]
          = P(AM) + P(BM)

and therefore

P[(A+B)/M] = P[(A+B)M]/P(M) = P(A/M) + P(B/M).

Total Probability

Consider n mutually exclusive events A_1, A_2, ..., A_n where

A_1 + A_2 + ... + A_n = S.                         (25)

From equation (25) (see Figure 2),

B = BS = B(A_1 + A_2 + ... + A_n) = BA_1 + BA_2 + ... + BA_n

so that

P(B) = P(BA_1 + BA_2 + ... + BA_n) = P(BA_1) + P(BA_2) + ... + P(BA_n),

since the events BA_i are mutually exclusive. But P(BA_i) = P(B/A_i)P(A_i), thus

P(B) = P(B/A_1)P(A_1) + ... + P(B/A_n)P(A_n) = Σ_{i=1}^{n} P(B/A_i)P(A_i).    (26)

Bayes' Theorem

It has been shown that for two events A and B,

P(AB) = P(A/B)P(B) = P(B/A)P(A).

Thus

P(A/B) = P(B/A)P(A)/P(B).                          (27)

To calculate P(A_i/B), where A_1, A_2, ..., A_n are n mutually exclusive events whose union is S, we simply substitute equation (26) into (27) and replace A by A_i. Thus

P(A_i/B) = P(B/A_i)P(A_i) / Σ_{j=1}^{n} P(B/A_j)P(A_j).    (28)

P(A_i) is the a priori probability and P(A_i/B) the a posteriori probability.

Summary
1. In probability theory, the outcomes (elements) of a clearly specified experiment form a set S called the space or certain event.
2. Events are subsets of S.
3. Events are mutually exclusive if they have no common elements.
4. The assignment of a probability to each event must satisfy the three axioms.
5. P(AB) = P(A)P(B) <=> statistically independent events.

6. P(A/B) = P(AB)/P(B) - conditional probability of A given B.
7. P(B/A) = P(AB)/P(A) - conditional probability of B given A.
8. See equation (28) for n mutually exclusive events A_1, A_2, ..., A_n with A_1 + A_2 + ... + A_n = S.

Combined Experiments

As we have already seen, one often deals with two or more (sub-)experiments, or with repeated trials of an experiment. To analyse such problems properly, we must conceive of a combined experiment whose outcomes are tuples of objects belonging to the sub-experiments.

Product of Spaces

Given n experiments E_1, E_2, ..., E_n, the combined experiment resulting from all of them is defined as

E = E_1 x E_2 x ... x E_n                          (29)

If S_i is the space of E_i for i = 1, 2, ..., n, the space S of E is the cartesian product of the spaces S_1, S_2, ..., S_n. We write this in the form

S = S_1 x S_2 x ... x S_n                          (30)

It is important to note that the elements of a cartesian product are ordered objects; for example, (1, 2) is different from (2, 1). The events of the combined experiment are sets of the form A_1 x A_2 x ... x A_n, where A_i belongs to S_i. Note that all events in E must form a Borel field, so that the sums and products of events are also events in E.

In many applications the experiments are independent. For independent events, it can be shown that the probability of the event A = A_1 x A_2 x ... x A_n in E is

P(A) = P_1(A_1) P_2(A_2) ... P_n(A_n)              (31)

where P_i(A_i) is the probability of the event A_i in E_i. In general, P(A) cannot be determined from the probabilities P_1(A_1), P_2(A_2), ..., P_n(A_n) of each experiment alone unless all the events are independent.

Sum of Spaces

Given n experiments E_1, E_2, ..., E_n, the combined experiment resulting from all of them is defined as

E = E_1 + E_2 + ... + E_n                          (32)

If S_i is the space of E_i for i = 1, 2, ..., n, the space S of E is

S = S_1 + S_2 + ... + S_n                          (33)

Let A_i be an event of the space S_i in E_i. The events of the combined experiment are then sets of the form

A = A_1 + A_2 + ... + A_n                          (34)

If S_i and S_j are mutually exclusive spaces for i ≠ j, we have

P(S) = P(S_1) + P(S_2) + ... + P(S_n)              (35)
P(A) = P(A_1) + P(A_2) + ... + P(A_n)              (36)

We need to find P(A_i) in terms of P_i(A_i) and P(S_i) of E_i. Since A_i ⊂ S_i, the probability of the event A_i within E_i is

P(A_i/S_i) = P_i(A_i)                              (37)

From conditional probability we have

P(A_i) = P(A_i/S_i)P(S_i)                          (38)

Therefore

P(A_i) = P_i(A_i)P(S_i)                            (39)

and

P(A) = P_1(A_1)P(S_1) + P_2(A_2)P(S_2) + ... + P_n(A_n)P(S_n)    (40)

It can be seen that P(A) can be determined from the probabilities P_1(A_1), P_2(A_2), ..., P_n(A_n) and P(S_1), P(S_2), ..., P(S_n) of the individual experiments alone, provided all the spaces are mutually exclusive.

References

[1] Taub, H. and Schilling, D. L., Principles of Communication Systems, 2/e, McGraw-Hill, 1987.
[2] Shanmugam, K. S., Digital and Analog Communication Systems, John Wiley & Sons, 1979.
[3] Papoulis, A., Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 1965.
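As a worked illustration of the total-probability and Bayes formulas, equations (26)-(28), the sketch below uses invented numbers for a binary source: A_1 = "send 0" and A_2 = "send 1" are mutually exclusive events with A_1 + A_2 = S, and B is the event that a 1 is received.

```python
# A priori probabilities P(A_i) of the mutually exclusive source events,
# and transition probabilities P(B/A_i) for B = "a 1 is received".
# All numbers are hypothetical, chosen only for illustration.
prior = {"send 0": 0.6, "send 1": 0.4}
likelihood = {"send 0": 0.1, "send 1": 0.9}   # P(B / A_i)

# Total probability, eq. (26): P(B) = sum_i P(B/A_i) P(A_i)
p_B = sum(likelihood[a] * prior[a] for a in prior)

# Bayes' theorem, eq. (28): P(A_i/B) = P(B/A_i) P(A_i) / P(B)
posterior = {a: likelihood[a] * prior[a] / p_B for a in prior}

print(p_B)        # the a priori chance of receiving a 1
print(posterior)  # a posteriori probabilities; they sum to 1
```

Observing B shifts the belief sharply towards "send 1", even though its prior was smaller, because P(B/A_2) is much larger than P(B/A_1).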

[Figure 1: Venn diagram of the events A, B and M within the sample space S.]
[Figure 2: the space S partitioned into the mutually exclusive events A_1, A_2, ..., A_n, together with an event B.]
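Finally, the combined-experiment formulas can be checked numerically. The example below is hypothetical (a coin toss E_1 and a die throw E_2, with made-up selection probabilities for the sum of spaces): it forms the product space of equation (30), verifies the independence product of equation (31) by counting equally likely pairs, and then evaluates equation (40) for mutually exclusive spaces.

```python
from itertools import product

# Product of spaces, eq. (30): S = S1 x S2 for two independent experiments.
S1, S2 = ["H", "T"], [1, 2, 3, 4, 5, 6]
S = list(product(S1, S2))          # ordered pairs: ("H", 1) != (1, "H")
assert len(S) == len(S1) * len(S2)

# Independent events A1 = {H} in E1 and A2 = {even} in E2; eq. (31) says
# P(A) = P1(A1) * P2(A2).
P1_A1 = 1 / 2                      # P1({H}) for a fair coin
P2_A2 = 3 / 6                      # P2({2, 4, 6}) for a fair die
A = [(c, d) for (c, d) in S if c == "H" and d % 2 == 0]
P_A = len(A) / len(S)              # pairs are equally likely by independence
print(P_A == P1_A1 * P2_A2)        # True

# Sum of spaces, eq. (40): experiment E1 is performed with P(S1) = 0.25 and
# E2 with P(S2) = 0.75 (mutually exclusive spaces); then
# P(A) = P1(A1) P(S1) + P2(A2) P(S2).
P_S1, P_S2 = 0.25, 0.75
P_A_sum = P1_A1 * P_S1 + P2_A2 * P_S2
print(P_A_sum)                     # 0.5, since P1(A1) = P2(A2) = 0.5
```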