
Probability and Random Variables/Processes for Wireless Communication
Professor Aditya K. Jagannatham
Department of Electrical Engineering
Indian Institute of Technology Kanpur

Module 2, Lecture 7: Bayes' Theorem and A Posteriori Probabilities

Hello, welcome to another module in this massive open online course on probability and random variables for wireless communications. In the previous module we looked at various concepts of probability. In this module, let us start looking at a new result which is very important in the context of communication: Bayes' theorem.

(Refer Slide Time: 0:37)

For Bayes' theorem, consider the sample space S and two events A0 and A1 in S such that A0 union A1 equals the entire sample space S, and A0 and A1 are mutually exclusive, that is, A0 intersection A1 equals the null event, phi. Such events A0 and A1 are known as mutually exclusive and exhaustive.

So these events A0, A1 are mutually exclusive and exhaustive: exhaustive meaning their union spans the entire sample space S, and mutually exclusive meaning their intersection is the null event.

(Refer Slide Time: 3:16)

Now let us consider another event B in the sample space S. You can see that B intersection A0 and B intersection A1 are also mutually exclusive, that is, their intersection is the null event phi. Further, together they span B; that is, (B intersection A0) union (B intersection A1) equals the event B.

(Refer Slide Time: 5:14)

Therefore, since B equals (B intersection A0) union (B intersection A1) and these two pieces are mutually exclusive, the probability of B equals the probability of B intersection A0 plus the probability of B intersection A1. Now, from the basic definition of conditional probability, which we covered in the conditional probability module, the probability of B intersection A0 is the probability of B given A0 times the probability of A0, and the probability of B intersection A1 is the probability of B given A1 times the probability of A1.
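The decomposition above can be checked numerically. A minimal sketch, assuming illustrative prior and conditional values (not numbers from the lecture):

```python
# Total probability of B via two mutually exclusive, exhaustive events A0, A1.
# All numbers below are illustrative assumptions.
p_A0, p_A1 = 0.6, 0.4        # priors: P(A0) + P(A1) = 1
p_B_given_A0 = 0.2           # conditional probability P(B | A0)
p_B_given_A1 = 0.7           # conditional probability P(B | A1)

# P(B) = P(B | A0) P(A0) + P(B | A1) P(A1)
p_B = p_B_given_A0 * p_A0 + p_B_given_A1 * p_A1
print(p_B)  # 0.12 + 0.28 = 0.4, up to floating-point rounding
```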

(Refer Slide Time: 6:43)

Therefore, substituting these in the expression above, we have the total probability: P(B) equals the probability of B given A0 times the probability of A0, plus the probability of B given A1 times the probability of A1.

(Refer Slide Time: 7:22)

Now, again from conditional probability, the probability of B intersection A0 is the probability of B given A0 times the probability of A0, and this is also equal to the probability of A0 given B times the probability of B. Interestingly, this implies that the probability of A0 given B equals the probability of B given A0 times the probability of A0, divided by the probability of B.

(Refer Slide Time: 8:32)

Now, substituting the expression for the probability of B from the previous page, the probability of A0 given B equals the probability of B given A0 times the probability of A0, divided by the probability of B given A0 times the probability of A0 plus the probability of B given A1 times the probability of A1.

(Refer Slide Time: 9:27)

Similarly, I can derive the expression for the probability of A1 given B, which is similar: the probability of B given A1 times the probability of A1, divided by the probability of B given A0 times the probability of A0 plus the probability of B given A1 times the probability of A1. You can also see that the probability of A0 given B plus the probability of A1 given B equals 1, because A0 and A1 are mutually exclusive and exhaustive events.
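The two-event expressions above can be sketched in a few lines of code; the priors and likelihoods here are made-up illustrative numbers:

```python
# A posteriori probabilities P(A0 | B) and P(A1 | B) via Bayes' theorem.
# Priors and likelihoods are illustrative assumptions.
priors = [0.6, 0.4]              # P(A0), P(A1)
likelihoods = [0.2, 0.7]         # P(B | A0), P(B | A1)

p_B = sum(l * p for l, p in zip(likelihoods, priors))       # total probability
posteriors = [l * p / p_B for l, p in zip(likelihoods, priors)]

print(posteriors)        # the two a posteriori probabilities
print(sum(posteriors))   # sums to 1, as derived above
```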

(Refer Slide Time: 10:40)

So these quantities that we have calculated, the probability of A0 given B and the probability of A1 given B, constitute Bayes' theorem, and they are very important in the context of communication: they are known as the a posteriori probabilities. We are going to introduce an example later which will clarify their application, but these a posteriori probabilities can now be computed using Bayes' theorem.

(Refer Slide Time: 11:45)

These expressions for the a posteriori probabilities are nothing but Bayes' theorem: Bayes' theorem gives an expression for the a posteriori probabilities. The quantities P(A0) and P(A1) are known as the prior probabilities, and the quantities P(B given A0) and P(B given A1) are called the likelihoods. All of these are important terminologies in the context of wireless communication. So if A0, A1 are mutually exclusive and exhaustive, Bayes' theorem gives us a very useful relation to calculate the a posteriori probabilities P(A0 given B) and P(A1 given B) in terms of the prior probabilities P(A0), P(A1) and the likelihoods P(B given A0), P(B given A1). This is a very important result, and we are going to demonstrate an application of it shortly in the context of the MAP principle, that is, the maximum a posteriori probability receiver. But before we do that, let us extend Bayes' theorem to a general case with N mutually exclusive and exhaustive events Ai.

(Refer Slide Time: 13:46)

So let us now state a general version of Bayes' theorem. Consider the sample space S, divided into N mutually exclusive and exhaustive events: A0, A1, up to AN-1, all belonging to S. These events are mutually exclusive, which implies Ai intersection Aj equals phi for any i not equal to j. That is, if I take any 2 of the N events, Ai and Aj, their intersection is the null event phi.

(Refer Slide Time: 16:19)

Further, the union of all these events, A0 union A1 union so on up to AN-1, equals the sample space S. This is the exhaustive property: exhaustive implies that the union of all these events is the entire sample space, and mutually exclusive implies that if you take any 2 events Ai and Aj, their intersection is the null event. Basically, in terms of set theory, we say the events A0, A1, up to AN-1 form a partition of the entire sample space S: their union spans the entire sample space S, and the different events or sets are disjoint, that is, any 2 sets have empty intersection. These are mutually exclusive and exhaustive events.
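For a finite sample space, the partition property can be checked directly. A minimal sketch using Python sets, with an assumed toy sample space and events:

```python
# Check that events A0, ..., A(N-1) partition a finite sample space S:
# pairwise disjoint (mutually exclusive) and union equal to S (exhaustive).
# The sample space and events below are illustrative assumptions.
S = {1, 2, 3, 4, 5, 6}
events = [{1, 2}, {3, 4}, {5, 6}]   # A0, A1, A2

mutually_exclusive = all(
    events[i].isdisjoint(events[j])
    for i in range(len(events))
    for j in range(i + 1, len(events))
)
exhaustive = set().union(*events) == S

print(mutually_exclusive and exhaustive)  # True: the events form a partition
```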

(Refer Slide Time: 17:26)

Now, Bayes' theorem gives the a posteriori probabilities, similar to what we have seen before. Remember how we derived the Bayes result: we first said that P of B intersection Ai equals P of B given Ai times P of Ai, which is equal to P of Ai given B times P of B. This implies that P of Ai given B equals P of B given Ai times P of Ai divided by P of B. Now, if I substitute the expression for P of B, I have: P of Ai given B equals P of B given Ai times P of Ai, divided by the summation over j equal to 0 to N-1 of P of B given Aj times P of Aj. And this is my result for the Bayes theorem. That is, Bayes' theorem states that the a posteriori probability of the event Ai given B is the probability of B given Ai times the probability of Ai, divided by the probability of B, which is the summation over j equal to 0 to N-1 of the probability of B given Aj times the probability of Aj.
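The general N-event form above translates directly into a short function; the function name and the example numbers are illustrative assumptions:

```python
# General Bayes' theorem:
# P(Ai | B) = P(B | Ai) P(Ai) / sum over j of P(B | Aj) P(Aj), j = 0, ..., N-1
def posteriors(priors, likelihoods):
    """Return the list [P(Ai | B)] given [P(Ai)] and [P(B | Ai)]."""
    joint = [l * p for l, p in zip(likelihoods, priors)]  # P(B intersection Ai)
    p_B = sum(joint)                                      # total probability
    return [j / p_B for j in joint]

# Illustrative N = 3 example (assumed numbers).
post = posteriors([0.5, 0.3, 0.2], [0.1, 0.6, 0.3])
print(post)  # each entry is P(Ai | B); the entries sum to 1
```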

(Refer Slide Time: 20:08)

And therefore, as we have also seen before, it is easy to see that the summation over i equal to 0 to N-1 of the probability of Ai given B is equal to 1. Further, these quantities, the probabilities of Ai given B, are the a posteriori probabilities, and the probabilities of Ai are the prior probabilities.

(Refer Slide Time: 21:19)

And the probabilities of B given Ai are the likelihoods. So this is the general expression for Bayes' theorem, that is, the expression for the a posteriori probabilities in terms of the prior probabilities and the likelihoods. As we said, this Bayes result is a very important principle in communications: it is used to construct what we call the maximum a posteriori, or MAP, receiver, which is something we are going to look at in our next module. So I would like to conclude this module here. Thank you very much.