ECE Lecture #9, Part 1
1 ECE Lecture #9, Part 1: Pairs of Random Variables, continued
Overview:
- Joint PDFs
- A Discrete Example (PDF for a Pair of Discrete RVs)
- Jointly Gaussian RVs
- Conditional Probability Density Functions (Again)
- Bayes' Rule for PDFs
- Mnemonic Table: Conditional Probabilities vs. Conditional PDFs
- Communications Example: Signal + Noise
2 Pairs of RVs: A Discrete Example (from Ross, A First Course in Probability)
Suppose that 15 percent of the families in a certain community have no children, 20% have 1 child, 35% have 2 children, and 30% have 3 children. Also suppose that each child is equally likely to be a boy or a girl, independent of family size.
Experiment: Choose a family at random from the community.
RVs: Let B be the # of boys in the family, and let G be the # of girls.
Find the joint pdf, f_BG(b, g).
Note: both RVs (B and G) are discrete, and each RV can take the values 0, 1, 2, or 3.
3 Repeating the givens:
- Fraction of families w/ 0 children: .15
- Fraction of families w/ 1 child: .20
- Fraction of families w/ 2 children: .35
- Fraction of families w/ 3 children: .30
Calculating some probabilities (assumes Pr(girl) = Pr(boy) = .5):
Pr(B = 0, G = 0) = Pr(no children) = .15
Pr(B = 0, G = 1) = Pr(1 girl and total of 1 child) = Pr(1 girl | 1 child) Pr(1 child) = .5 (.20) = .10
Pr(B = 1, G = 1) = Pr(1 boy, 1 girl | 2 children) Pr(2 children) = Pr(GB or BG | 2 children) (.35) = .5 (.35) = .175
Pr(B = 2, G = 1) = Pr(2 boys, 1 girl | 3 children) Pr(3 children) = Pr(BBG or BGB or GBB | 3 children) (.30) = (3/8) (.30) = .1125
etc.
4 Table of Joint Probabilities, P(B = i, G = j)
(Answers calculated on the previous page are marked with *.)

  i \ j |  j=0     j=1     j=2     j=3   | Row sum = Pr(B = i)
  i=0   |  .15*    .10*    .0875   .0375 |  .3750
  i=1   |  .10     .175*   .1125   0     |  .3875
  i=2   |  .0875   .1125*  0       0     |  .2000
  i=3   |  .0375   0       0       0     |  .0375
  Col. sum = Pr(G = j):
            .3750   .3875   .2000   .0375

Verification of the other probabilities (not marked) will be a homework problem.
The joint pdf for B, G would be 16 delta functions in the x-y (or b-g) plane, with areas given by the corresponding probability numbers.
The last row and the last column are the marginal probabilities, Pr(G = j) and Pr(B = i).
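The joint probabilities and marginals above can be checked programmatically. A minimal sketch (Python is used here for illustration, although the course's own plotting example is in MATLAB; exact fractions avoid rounding error):

```python
from fractions import Fraction
from math import comb

# Family-size distribution from the example: Pr(n children), n = 0..3
p_n = [Fraction(15, 100), Fraction(20, 100), Fraction(35, 100), Fraction(30, 100)]

def joint_pmf(b, g):
    """Pr(B = b, G = g) = Pr(n = b + g) * C(b + g, b) * (1/2)^(b + g),
    since each child is independently a boy or a girl with probability 1/2."""
    n = b + g
    if n > 3:
        return Fraction(0)
    return p_n[n] * comb(n, b) * Fraction(1, 2) ** n

# Marginals: sum the rows and columns of the 4 x 4 table
pr_B = [sum(joint_pmf(b, g) for g in range(4)) for b in range(4)]
pr_G = [sum(joint_pmf(b, g) for b in range(4)) for g in range(4)]
```

By the symmetry of the boy/girl assumption, B and G have the same marginal distribution, which the last line makes easy to confirm.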
5 Jointly Gaussian RVs
RVs X and Y are said to be jointly Gaussian (or jointly normal) if:

f_XY(x, y) = [1 / (2π σ_X σ_Y sqrt(1 - ρ²))] exp{ -[1 / (2(1 - ρ²))] [ ((x - m_X)/σ_X)² - 2ρ (x - m_X)(y - m_Y)/(σ_X σ_Y) + ((y - m_Y)/σ_Y)² ] }

Parameters: Mean of X = m_X, Mean of Y = m_Y, Standard Deviation of X = σ_X, Standard Deviation of Y = σ_Y, Correlation Coefficient (between X and Y) = ρ
6 Jointly Gaussian RVs
Claim (to be shown for homework): If we let ρ = 0 in the equation for f_XY(x, y) (repeated below), the joint pdf can be factored into the form f_XY(x, y) = f_X(x) f_Y(y):

f_XY(x, y) = [1 / (2π σ_X σ_Y sqrt(1 - ρ²))] exp{ -[1 / (2(1 - ρ²))] [ ((x - m_X)/σ_X)² - 2ρ (x - m_X)(y - m_Y)/(σ_X σ_Y) + ((y - m_Y)/σ_Y)² ] }

This will prove that: If two jointly normal RVs X and Y have correlation coefficient ρ = 0, then X and Y are independent.
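The factorization claim can also be checked numerically at a sample point. A small Python sketch (the test point is an arbitrary choice for illustration):

```python
import math

def gauss2d(x, y, mx=0.0, my=0.0, sx=1.0, sy=1.0, r=0.0):
    """Bivariate Gaussian pdf with means mx, my, std devs sx, sy, correlation r."""
    q = (((x - mx) / sx) ** 2
         - 2 * r * (x - mx) * (y - my) / (sx * sy)
         + ((y - my) / sy) ** 2)
    return math.exp(-q / (2 * (1 - r ** 2))) / (
        2 * math.pi * sx * sy * math.sqrt(1 - r ** 2))

def gauss1d(x, m=0.0, s=1.0):
    """One-dimensional Gaussian pdf."""
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# With r = 0, the joint pdf equals the product of the marginals at every point;
# with r != 0 it does not.
pt = (0.7, -1.3)
joint = gauss2d(*pt, r=0.0)
product = gauss1d(pt[0]) * gauss1d(pt[1])
```

With r = 0 the quadratic form reduces to ((x - m_X)/σ_X)² + ((y - m_Y)/σ_Y)², so the exponential splits into a product, which is exactly what the numerical check confirms.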
7 Jointly Gaussian RVs (aka Bivariate Gaussians)
[Two surface plots of the pdf vs. x and y]
Left: m_X = m_Y = 0; σ_X = σ_Y = 1; ρ = 0. Note the bell-shaped surface with circular cross-sections.
Right: m_X = m_Y = 0; σ_X = σ_Y = 1; ρ large. Note: the bell is squished when x and y are highly correlated; elliptical cross-sections.
8 MATLAB Code for the 2-Dimensional Gaussian PDF Function

% function pdf = Gauss_2d(mx, my, varx, vary, r)
% Generates surface plot of the 2-d Gaussian pdf
function pdf = Gauss_2d(mx, my, varx, vary, r)
stdx = sqrt(varx); stdy = sqrt(vary);
maxstd = max(stdx, stdy);
xend = mx + 4*maxstd; xstart = mx - 4*maxstd; xstep = (xend - xstart)/100;
yend = my + 4*maxstd; ystart = my - 4*maxstd; ystep = (yend - ystart)/100;
A = 1/(2*pi*stdx*stdy*sqrt(1 - r^2));              % normalizing constant
[x, y] = meshgrid(xstart:xstep:xend, ystart:ystep:yend);
exparg = -1/(2*(1 - r^2));
trm1 = ((x - mx).^2)./varx;
trm2 = 2*r*(x - mx).*(y - my)./(stdx*stdy);
trm3 = ((y - my).^2)./vary;
pdf = A * exp(exparg*(trm1 - trm2 + trm3));
surf(x, y, pdf)                                    % surface plot of pdf(x,y)

ECE 650 D. van Alphen
9 Conditional pdfs - Again
Recall: f(x | M) = (d/dx) F(x | M).
Special Case: If M = {a < X ≤ b}, then f(x | M) = f(x) / Pr(a < X ≤ b): same shape as the original pdf over the region (a, b).
Now: the event M will be defined in terms of a 2nd RV, Y.
Example: If M is the event {Y ≤ y}, then:
F(x | M) = F(x | Y ≤ y) = Pr(X ≤ x, Y ≤ y) / Pr(Y ≤ y) = F(x, y) / F_Y(y)
Similarly: If M is the event {y_1 < Y ≤ y_2}, then:
F(x | M) = F(x | y_1 < Y ≤ y_2) = Pr(X ≤ x, M) / Pr(M) = [F(x, y_2) - F(x, y_1)] / [F_Y(y_2) - F_Y(y_1)]
10 Conditional pdfs - Again
In the previous cases, the event M had non-zero probability, so Pr(M) made sense in the denominator.
Problem: How do we handle an event M with probability 0, as in M = {Y = y}, when Y is a continuous RV?
F(x | M) = F(x | Y = y) = Pr(X ≤ x, Y = y) / Pr(Y = y) = 0/0?
Instead, try: F(x | Y = y) = lim_{Δy → 0} F(x | y < Y ≤ y + Δy)
This leads to (subscripts on the LHS often omitted):
f(x | Y = y) = f(x, y) / f_Y(y)   and similarly   f(y | X = x) = f(x, y) / f_X(x)
See Cooper & McGillem, 3rd ed., p. 15 for the proof.
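For the jointly Gaussian case, the ratio f(x, y)/f_Y(y) can be checked numerically: it should itself be a Gaussian pdf in x, with mean m_X + ρ(σ_X/σ_Y)(y - m_Y) and standard deviation σ_X sqrt(1 - ρ²). That closed form is a standard result quoted here only as a check, not derived on these slides; the parameter values below are arbitrary illustration choices.

```python
import math

def gauss2d(x, y, mx, my, sx, sy, r):
    """Bivariate Gaussian pdf (the jointly Gaussian formula from these slides)."""
    q = (((x - mx) / sx) ** 2
         - 2 * r * (x - mx) * (y - my) / (sx * sy)
         + ((y - my) / sy) ** 2)
    return math.exp(-q / (2 * (1 - r ** 2))) / (
        2 * math.pi * sx * sy * math.sqrt(1 - r ** 2))

def gauss1d(x, m, s):
    """One-dimensional Gaussian pdf."""
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

mx, my, sx, sy, r = 0.0, 0.0, 1.0, 1.0, 0.5   # arbitrary test parameters
y, x = 1.0, 0.3                                # condition on Y = y, evaluate at x

# f(x | Y = y) computed directly from the definition f(x, y) / f_Y(y)
cond = gauss2d(x, y, mx, my, sx, sy, r) / gauss1d(y, my, sy)

# The same value from the quoted closed form for the Gaussian conditional pdf
cond_closed_form = gauss1d(x, mx + r * (sx / sy) * (y - my),
                           sx * math.sqrt(1 - r ** 2))
```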
11 Bayes' Rule for PDFs
Shortened notation: sometimes we write f(x | Y = y) as f(x | y), and f(y | X = x) as f(y | x).
So from the previous page we would write:
f(x | y) = f(x, y) / f(y)   and   f(y | x) = f(x, y) / f(x)
⇒ f(x, y) = f(x | y) f(y)   and   f(x, y) = f(y | x) f(x)
Equating the RHSs of the two equations above yields: f(x | y) f(y) = f(y | x) f(x), so:
f(x | y) = f(y | x) f(x) / f(y)
Alternate form: f(y | x) = f(x | y) f(y) / f(x)
12 Mnemonic: Table of Similar Equations

                    Conditional Probabilities            Conditional pdfs
  Definition (1):   Pr(A | B) = Pr(A, B) / Pr(B)         f(x | y) = f(x, y) / f(y)
  Definition (2):   Pr(B | A) = Pr(A, B) / Pr(A)         f(y | x) = f(x, y) / f(x)
  Total Prob. (1):  Pr(B) = Σ_i Pr(B | A_i) Pr(A_i)      f(y) = ∫ f(y | x) f(x) dx
  Total Prob. (2):  Pr(A) = Σ_i Pr(A | B_i) Pr(B_i)      f(x) = ∫ f(x | y) f(y) dy
  Bayes Rule (1):   Pr(A | B) = Pr(B | A) Pr(A) / Pr(B)  f(x | y) = f(y | x) f(x) / f(y)
  Bayes Rule (2):   Pr(B | A) = Pr(A | B) Pr(B) / Pr(A)  f(y | x) = f(x | y) f(y) / f(x)
13 Mixed Form of Bayes' Rule (mixing probabilities of events and pdfs)
Let A be an event and let X be a RV with pdf f(x). Then:
Pr(A | X = x) = f(x | A) Pr(A) / f(x)
and
f(x | A) = Pr(A | X = x) f(x) / Pr(A)
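As a worked illustration of the mixed form, consider a hypothetical detection setup (the event A, the two conditional densities, and Pr(A) below are assumptions chosen for the example, not from the slides):

```python
import math

def npdf(x, m, s=1.0):
    """One-dimensional Gaussian pdf with mean m and std dev s."""
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def posterior_A(x, p_a=0.5):
    """Pr(A | X = x) for the hypothetical setup:
    A = 'signal present', X | A ~ N(1, 1), X | A-complement ~ N(-1, 1)."""
    fx_given_a = npdf(x, 1.0)
    fx_given_ac = npdf(x, -1.0)
    fx = fx_given_a * p_a + fx_given_ac * (1.0 - p_a)  # f(x) by total probability
    return fx_given_a * p_a / fx                        # mixed Bayes rule
```

With equal priors and symmetric densities, an observation at x = 0 gives a posterior of exactly 1/2, while observations far to the right (or left) push the posterior toward 1 (or 0), as intuition suggests.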
14 Signal + Noise: An Example
[Block diagram: X (Tx) and N enter an adder; output Y = X + N]
Suppose that signal X is tx'd over an additive noise channel, so that the received signal is Y = X + N. Here Y is called the observable. The specific observable (say Y = y_1) is considered a good estimate for the desired value, X.
Goal: Find f(x | y). For Bayes' Rule, we need f(y | x). But since Y = X + N, and X is given (as implied by f(y | x)), the randomness (in Y) is all in N, and is modeled by f_N(n), so:
f(y | x) = f_N(n = y - x) = f_N(y - x)   (*)
15 Signal + Noise: Specific Example, MAP Receivers for Binary Communication
Consider communicating one of two messages (m_0 or m_1) over the additive white noise channel:
[Block diagram: m_i → Modulator (in transmitter) → x_i → Σ (noise n added) → y → Rcvr → m̂_i]
Modulator: maps messages or symbols to waveforms.
From communication theory, the maximum a posteriori probability (or MAP) decision rule for the receiver (which yields minimum Pr(error) in additive white noise for equally likely signals), based on the observable y_0, is:
Choose m̂ = m_1 iff Pr(m_1 | Y = y_0) > Pr(m_0 | Y = y_0)
16 Signal + Noise: Specific Example, MAP Receivers, continued
[Block diagram: m_i → Mod. → x_i → Σ (noise n added) → y → Rcvr → m̂_i]
The MAP receiver decision rule is based on the observable y:
Choose m̂ = m_1 iff Pr(m_1 | Y = y) > Pr(m_0 | Y = y)
⇔ f(y | m_1) Pr(m_1) / f(y) > f(y | m_0) Pr(m_0) / f(y)
⇔ f_N(y - x_1) Pr(m_1) > f_N(y - x_0) Pr(m_0)
Assume Pr(m_1) = Pr(m_0) = 1/2:
⇔ f_N(y - x_1) > f_N(y - x_0)
17 MAP Rcvr: Binary Case, Equally Likely Signals
So far: choose m̂ = m_1 iff f_N(y - x_1) > f_N(y - x_0).
Now suppose that the modulator mapping (from messages to signals) is:
For message m_0: tx signal x_0 = 5 volts
For message m_1: tx signal x_1 = -5 volts
And suppose that the noise is Gaussian, N(0, σ = sqrt(2)).
The MAP rule becomes: choose m̂ = m_1 iff f_N(y + 5) > f_N(y - 5), i.e.,
[1 / (σ sqrt(2π))] exp{ -(y + 5)² / (2σ²) } > [1 / (σ sqrt(2π))] exp{ -(y - 5)² / (2σ²) }
18 MAP Rcvr: Binary Case, Equally Likely Signals (5, -5) in AWGN
So far: choose m̂ = m_1 iff:
exp{ -(y + 5)² / (2σ²) } > exp{ -(y - 5)² / (2σ²) }
⇔ (y + 5)² < (y - 5)²
⇔ y² + 10y + 25 < y² - 10y + 25
⇔ 20y < 0
⇔ y < 0
Receiver implementation, with comparator:
choose m_0 if y > 0; choose m_1 if y < 0
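The resulting comparator rule can be sanity-checked by simulation. A Monte Carlo sketch (Python for illustration; the trial count and seed are arbitrary choices, not from the slides):

```python
import math
import random

random.seed(1)                        # arbitrary seed for repeatability
sigma = math.sqrt(2.0)                # noise std dev from the example
n_trials = 100_000
errors = 0
for _ in range(n_trials):
    bit = random.randint(0, 1)        # equally likely messages m0 (0), m1 (1)
    x = 5.0 if bit == 0 else -5.0     # modulator: m0 -> +5 V, m1 -> -5 V
    y = x + random.gauss(0.0, sigma)  # additive Gaussian noise channel
    m_hat = 1 if y < 0.0 else 0       # comparator: choose m1 iff y < 0
    errors += (m_hat != bit)
ber = errors / n_trials

# For comparison, the theoretical Pr(error) is Q(5/sigma) = 0.5*erfc(5/(sigma*sqrt(2)))
q = 0.5 * math.erfc(5.0 / (sigma * math.sqrt(2.0)))
```

With a 10-volt separation and σ = sqrt(2), Q(5/σ) is on the order of 2e-4, so the simulated error rate should come out very small.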