Statistical Optics, Spring Quarter. Lecture 2: Characteristic Functions, Transformation of RVs, Sums of RVs
1 Statistical Optics, Spring Quarter 2018. ECE244a - Spring
2 Characteristic Functions

The characteristic function is the Fourier transform of the pdf (note that Goodman and Papen use different notation):

$$C_x(\omega) = \langle e^{i\omega x} \rangle = \int_{-\infty}^{\infty} f_x(x)\, e^{i\omega x}\, dx = \sum_{n=0}^{\infty} \frac{(i\omega)^n}{n!} \langle x^n \rangle. \quad (1)$$

If the random variable is discrete with probability function $p_k(k)$, then the characteristic function is

$$C_k(\omega) = \sum_{k=-\infty}^{\infty} e^{i\omega k}\, p_k(k). \quad (2)$$
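As a quick numerical check (not part of the slides), the characteristic function in Eq. (1) can be evaluated by direct integration of the pdf. The example below assumes a unit-rate exponential pdf $f_x(x) = e^{-x}$ on $[0, \infty)$, whose characteristic function is known in closed form as $1/(1 - i\omega)$:

```python
import cmath

def cf_numeric(w, dx=1e-3, xmax=40.0):
    """Midpoint-rule approximation of C_x(w) = integral of f(x) e^{iwx} dx
    for the unit exponential pdf f(x) = e^{-x}, truncated at x = xmax."""
    total = 0j
    steps = int(xmax / dx)
    for k in range(steps):
        x = (k + 0.5) * dx
        total += cmath.exp((-1.0 + 1j * w) * x) * dx
    return total

w = 1.5
exact = 1.0 / (1.0 - 1j * w)        # closed-form CF of the unit exponential
print(abs(cf_numeric(w) - exact))   # small discretization/truncation error
```

The midpoint rule keeps the discretization error well below the integration step, so the numerical and closed-form values agree to several digits.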
3 Relationship between $C_x(\omega)$ and the moments of the pdf

The $n$th moment of the probability distribution, if it exists, may be determined by differentiation of the characteristic function:

$$\langle x^n \rangle = \frac{1}{i^n} \left. \frac{d^n}{d\omega^n} C_x(\omega) \right|_{\omega=0}. \quad (3)$$

The characteristic function is a weighted sum of the moments $\langle x^n \rangle$ of the distribution, so differentiating and setting $\omega = 0$ generates the moments.
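Eq. (3) can be sketched numerically: a finite-difference derivative of the closed-form characteristic function of the unit exponential, $C(\omega) = 1/(1 - i\omega)$ (an assumed example, not from the slides), recovers its first moment $\langle x \rangle = 1$:

```python
# First moment via <x> = (1/i) dC/dw at w = 0, per Eq. (3), using a
# central finite difference on C(w) = 1/(1 - iw) for the unit exponential.
def C(w):
    return 1.0 / (1.0 - 1j * w)

h = 1e-5
dC = (C(h) - C(-h)) / (2.0 * h)    # numerical derivative at w = 0
first_moment = (dC / 1j).real       # divide by i^1 and take the real part
print(first_moment)                 # close to 1.0, the exponential mean
```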
4 Example - Gaussian RV

The characteristic function of a gaussian is itself a gaussian, with the variance reciprocally scaled:

$$\frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{(u - \langle u \rangle)^2}{2\sigma^2}\right] \;\longleftrightarrow\; \exp\left[i\omega \langle u \rangle - \frac{\sigma^2 \omega^2}{2}\right]$$
5 Transformation of Random Variables

Consider the transformation $z = f(u)$ where $f$ is monotonic, so that $u = f^{-1}(z)$ exists. Given $p_u(u)$, what is $p_z(z)$?
6 Transformation of Random Variables (cont.)

Differential area must be preserved, or

$$p_u(u)\,du = p_z(z)\,dz$$

Substituting $u = f^{-1}(z)$,

$$p_z(z) = p_u\!\left[f^{-1}(z)\right] \left| \frac{du}{dz} \right|$$
7 Example

Let $z = e^u$, with $p_u(u) = e^{-u}$ over $[0, \infty)$ (the exponential distribution). Then $u = \ln z$, $dz = e^u\,du$, or

$$\frac{du}{dz} = e^{-u} = z^{-1}$$

The range of $z$ is $[1, \infty)$. Substituting,

$$p_z(z) = e^{-\ln z}\, z^{-1} = \frac{1}{z^2} \quad \text{for } 1 \le z < \infty$$
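This result can be checked by Monte Carlo simulation (a sketch using only Python's standard library): sampling $u$ from the unit exponential and transforming $z = e^u$ should reproduce the tail probability $P\{z > t\} = \int_t^{\infty} z^{-2}\,dz = 1/t$:

```python
import math
import random

# If u ~ Exp(1), then z = e^u has pdf 1/z^2 on [1, inf), so P{z > t} = 1/t.
random.seed(0)
N = 200_000
t = 3.0
count = sum(1 for _ in range(N) if math.exp(random.expovariate(1.0)) > t)
print(count / N)   # close to 1/t = 0.333...
```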
8 Sums of Random Variables

Let a new RV be given as the sum of two others, $z = u + v$. Given the joint distribution $p_{uv}(u, v)$, what is $p_z(z)$?

Start with the joint CDF $F_{uv}(u, v)$ and the line $z = u + v$. $F_z(z)$ is the probability of the region where $u + v \le z$, or

$$F_z(z) = \int_{-\infty}^{\infty} dv \int_{-\infty}^{z-v} p_{uv}(u, v)\, du$$

Differentiate both sides:

$$\frac{d}{dz} F_z(z) = \frac{d}{dz} \int_{-\infty}^{\infty} dv \int_{-\infty}^{z-v} p_{uv}(u, v)\, du$$

Use the result from calculus

$$\frac{d}{dz} \int_{-\infty}^{g(z)} p_u(u)\, du = p_u[g(z)]\, \frac{dg}{dz}$$

with $g(z) = z - v$. Then $dg/dz = 1$ and

$$\frac{d}{dz} \int_{-\infty}^{z-v} p_{uv}(u, v)\, du = p_{uv}(z - v, v)$$
9 Sums of RVs (cont. 2)

Therefore

$$p_z(z) = \int_{-\infty}^{\infty} p_{uv}(z - v, v)\, dv$$

If $u$ and $v$ are independent, then $p_{uv}(z - v, v) = p_u(z - v)\, p_v(v)$ and

$$p_z(z) = \int_{-\infty}^{\infty} p_u(z - v)\, p_v(v)\, dv$$

The pdf of the sum of independent RVs is the convolution of the separate pdfs. In terms of the characteristic function,

$$C_z(\omega) = C_u(\omega)\, C_v(\omega)$$
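The convolution rule holds for discrete RVs as well, with the pmf of the sum given by the discrete convolution of the individual pmfs. A small self-contained sketch (the two-dice example is my own, not from the slides):

```python
# pmf of a sum of independent discrete RVs = convolution of the pmfs.
def convolve(p, q):
    """Discrete convolution of two pmfs stored as {value: probability}."""
    out = {}
    for a, pa in p.items():
        for b, qb in q.items():
            out[a + b] = out.get(a + b, 0.0) + pa * qb
    return out

die = {k: 1.0 / 6.0 for k in range(1, 7)}
total = convolve(die, die)     # pmf of the total of two fair dice
print(total[7])                # 6/36, the most likely total
```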
10 Gaussian Distribution

A gaussian random variable has a gaussian probability distribution defined by

$$f_x(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - \langle x \rangle)^2 / 2\sigma^2}. \quad (4)$$

It is easy to verify that the expected value of $x$ is $\langle x \rangle$ and the variance is $\sigma^2$. A unique property of gaussian random variables is that any weighted superposition of multiple gaussian random variables, whether they are independent or dependent, is also a gaussian random variable.
11 Area Under the Gaussian Tail

The probability that a unit-variance gaussian random variable exceeds a value $z$, $P\{x > z\}$, can be expressed in terms of the complementary error function, denoted erfc:

$$\frac{1}{\sqrt{2\pi}} \int_z^{\infty} e^{-x^2/2}\, dx = \frac{1}{2}\, \mathrm{erfc}\!\left(\frac{z}{\sqrt{2}}\right), \quad (5)$$

where $\mathrm{erfc}(z) = 1 - \mathrm{erf}(z)$, with $\mathrm{erf}(z)$ being the error function defined as

$$\mathrm{erf}(z) = \frac{2}{\sqrt{\pi}} \int_0^z e^{-s^2}\, ds$$

For large arguments, the erfc function can be approximated by

$$\mathrm{erfc}(x) \approx \frac{1}{x\sqrt{\pi}}\, e^{-x^2}. \quad (6)$$
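Eq. (5) can be verified with the standard library's math.erfc (a sketch, not part of the slides), comparing the closed form against a Monte Carlo count of unit-variance gaussian samples:

```python
import math
import random

# P{x > z} for a unit-variance gaussian, per Eq. (5), vs. Monte Carlo.
random.seed(1)
z = 1.0
exact = 0.5 * math.erfc(z / math.sqrt(2.0))   # about 0.1587 for z = 1
N = 200_000
mc = sum(1 for _ in range(N) if random.gauss(0.0, 1.0) > z) / N
print(exact, mc)   # the two estimates agree closely
```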
12 Joint gaussian probability distribution

The two-dimensional joint gaussian probability distribution for two independent random variables with the same variance is given by

$$f_{xy}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-[(x - \langle x \rangle)^2 + (y - \langle y \rangle)^2]/2\sigma^2}. \quad (7)$$

If the two gaussian random variables each have zero mean and are correlated so that

$$\rho_{xy} \doteq \langle xy \rangle / \sigma^2, \quad (8)$$

then the joint probability distribution may be written as

$$f_{xy}(x, y) = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho_{xy}^2}} \exp\left(-\frac{x^2 - 2\rho_{xy} x y + y^2}{2\sigma^2 (1 - \rho_{xy}^2)}\right). \quad (9)$$

If $\rho_{xy} = 0$, then the probability distribution is separable and factors into $f_{xy}(x, y) = f_x(x)\, f_y(y)$. Therefore, uncorrelated gaussian random variables are independent.
13 Circularly Symmetric Gaussian Random Variables

A two-dimensional gaussian distribution is often used to model complex-baseband noise. The corresponding random variable is called a complex gaussian random variable. If the two noise components are independent and have equal variance, then the two-dimensional gaussian distribution is circularly symmetric, and the corresponding random variable is called a circularly-symmetric gaussian random variable.

It is possible to have a joint two-dimensional probability distribution that has marginal gaussian distributions but is not jointly gaussian. Knowing that each marginal distribution is gaussian is not sufficient to infer that the joint distribution is gaussian.
14 Multivariate Gaussian Distribution

The probability distribution for the multivariate gaussian distribution is given by

$$f_{\mathbf{x}}(\mathbf{x}) = \frac{1}{(2\pi)^{N/2} \sqrt{\det \mathbf{C}}}\, e^{-\frac{1}{2} (\mathbf{x} - \langle \mathbf{x} \rangle)^T \mathbf{C}^{-1} (\mathbf{x} - \langle \mathbf{x} \rangle)}, \quad (10)$$

where $\mathbf{C}$ is the real autocovariance matrix. The matrix is defined as the expectation of the outer product of the column vector with itself after removing the mean:

$$\mathbf{C} = \left\langle (\mathbf{x} - \langle \mathbf{x} \rangle)(\mathbf{x} - \langle \mathbf{x} \rangle)^T \right\rangle. \quad (11)$$

The on-diagonal matrix element $C_{ii}$ is the variance of the gaussian random variable $x_i$. The off-diagonal matrix element $C_{ij}$ is the covariance of the two gaussian random variables $x_i$ and $x_j$.
15 Example

As an example, let a set of $M$ uncorrelated gaussian random variables have an autocovariance matrix given by $\mathbf{C} = \sigma^2 \mathbf{I}_M$, where $\mathbf{I}_M$ is an $M$-by-$M$ identity matrix. Using the matrix identity $\det(\sigma^2 \mathbf{I}_M) = \sigma^{2M}$, the joint probability distribution is

$$f_{\mathbf{x}}(\mathbf{x}) = \frac{1}{(2\pi\sigma^2)^{M/2}}\, e^{-\|\mathbf{x} - \langle \mathbf{x} \rangle\|^2 / 2\sigma^2}. \quad (12)$$
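Because $\mathbf{C} = \sigma^2 \mathbf{I}_M$ removes all cross terms, Eq. (12) is just a product of one-dimensional gaussian pdfs. A short sketch confirming the factorization numerically (zero means assumed for brevity):

```python
import math

def joint_pdf(x, sigma):
    """Eq. (12): zero-mean multivariate gaussian with C = sigma^2 I."""
    M = len(x)
    norm = (2.0 * math.pi * sigma ** 2) ** (M / 2.0)
    return math.exp(-sum(v * v for v in x) / (2.0 * sigma ** 2)) / norm

def pdf_1d(v, sigma):
    """One-dimensional zero-mean gaussian pdf, as in Eq. (4)."""
    return math.exp(-v * v / (2.0 * sigma ** 2)) / math.sqrt(2.0 * math.pi * sigma ** 2)

x = [0.3, -1.2, 0.7]
p_joint = joint_pdf(x, 1.5)
p_product = pdf_1d(0.3, 1.5) * pdf_1d(-1.2, 1.5) * pdf_1d(0.7, 1.5)
print(p_joint, p_product)   # identical up to floating-point rounding
```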
16 Circularly-Symmetric Complex Gaussian Random Variables

A multivariate joint gaussian distribution can also be defined in which each component of the column vector is a complex gaussian random variable $z_i = \mathrm{Re}[z_i] + i\, \mathrm{Im}[z_i]$. The complex autocovariance matrix of this vector is defined as

$$\mathbf{W} \doteq \left\langle (\mathbf{z} - \langle \mathbf{z} \rangle)(\mathbf{z} - \langle \mathbf{z} \rangle)^\dagger \right\rangle. \quad (13)$$

If the complex gaussian random variables are jointly gaussian and jointly circularly symmetric, then the real autocovariance matrix $\mathbf{C}$ given in (11) can be expressed in terms of the complex autocovariance matrix $\mathbf{W}$ given in (13):

$$\mathbf{C} = \frac{1}{2} \begin{bmatrix} \mathrm{Re}\,\mathbf{W} & -\mathrm{Im}\,\mathbf{W} \\ \mathrm{Im}\,\mathbf{W} & \mathrm{Re}\,\mathbf{W} \end{bmatrix}. \quad (14)$$

(Asked as a problem.)
17 pdf for a Vector of Circularly-Symmetric Gaussian RVs

The joint probability distribution for a circularly-symmetric multivariate gaussian distribution, expressed in terms of the complex autocovariance matrix, is

$$f_{\mathbf{z}}(\mathbf{z}) = \frac{1}{\pi^N \det \mathbf{W}}\, e^{-(\mathbf{z} - \langle \mathbf{z} \rangle)^\dagger \mathbf{W}^{-1} (\mathbf{z} - \langle \mathbf{z} \rangle)}. \quad (15)$$

Using the properties of determinants, the leading term $(\pi^N \det \mathbf{W})^{-1}$ may be written as $\det(\pi \mathbf{W})^{-1}$. If $\mathbf{W} = 2\sigma^2 \mathbf{I}_M$, then following the same steps used to derive (12), we have

$$f_{\mathbf{z}}(\mathbf{z}) = \frac{1}{(2\pi\sigma^2)^M}\, e^{-\|\mathbf{z} - \langle \mathbf{z} \rangle\|^2 / 2\sigma^2}. \quad (16)$$
18 Example

A new set of decorrelated gaussian random variables may be defined by using a coordinate transformation matrix $\mathbf{T}$ that diagonalizes the autocovariance matrix $\mathbf{C}$. The uncorrelated gaussian random variables in the new coordinate system are then independent, but may have changed mean values and variances.

Consider a pair of gaussian random variables defined in (9) with $\rho_{xy} \neq 0$. The joint probability distribution $f_{xy}(x, y)$ for these correlated gaussian random variables is shown below.

[Figure: contours of the joint pdf in the $(x, y)$ plane, panels (a) and (b).]
19 Example (cont.)

To determine the transformation that produces decorrelated random variables, define a new set of random variables $x'$ and $y'$ such that

$$\begin{bmatrix} x' \\ y' \end{bmatrix} = \mathbf{A} \begin{bmatrix} x \\ y \end{bmatrix},$$

where $\mathbf{A}$ is the matrix that diagonalizes the autocovariance matrix $\mathbf{C}$, given as

$$\mathbf{C} = \begin{bmatrix} \sigma_x^2 & \rho_{xy} \sigma_x \sigma_y \\ \rho_{xy} \sigma_x \sigma_y & \sigma_y^2 \end{bmatrix}.$$

The matrix $\mathbf{A}$ is formed from the eigenvectors of $\mathbf{C}$. If $\sigma_x^2 = \sigma_y^2 = \sigma^2$, then $x' = \frac{1}{\sqrt{2}}(x + y)$ and $y' = \frac{1}{\sqrt{2}}(x - y)$. The eigenvalues of $\mathbf{C}$ are the variances of the decorrelated gaussian random variables and are given by $\sigma_{x'}^2 = \sigma^2 (1 + \rho_{xy})$ and $\sigma_{y'}^2 = \sigma^2 (1 - \rho_{xy})$.
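The rotation on this slide can be checked by simulation (a sketch with assumed parameters $\sigma = 1$ and $\rho_{xy} = 0.6$): correlated samples are generated with the standard construction $y = \rho x + \sqrt{1 - \rho^2}\, b$, and the rotated coordinates should show variances $\sigma^2 (1 \pm \rho_{xy})$ and no cross-correlation:

```python
import math
import random

random.seed(2)
rho, N = 0.6, 200_000
s = math.sqrt(1.0 - rho * rho)
xs, ys = [], []
for _ in range(N):
    a, b = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    x, y = a, rho * a + s * b             # correlated pair with <xy> = rho
    xs.append((x + y) / math.sqrt(2.0))   # x' coordinate
    ys.append((x - y) / math.sqrt(2.0))   # y' coordinate
var_x = sum(v * v for v in xs) / N
var_y = sum(v * v for v in ys) / N
cross = sum(u * v for u, v in zip(xs, ys)) / N
print(var_x, var_y, cross)   # near 1 + rho, 1 - rho, and 0
```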
20 Decorrelated Distribution

The decorrelated joint gaussian distribution is

$$f_{x'y'}(x', y') = \left[\frac{1}{\sqrt{2\pi\sigma^2 (1 + \rho_{xy})}}\, e^{-x'^2 / 2\sigma^2 (1 + \rho_{xy})}\right] \left[\frac{1}{\sqrt{2\pi\sigma^2 (1 - \rho_{xy})}}\, e^{-y'^2 / 2\sigma^2 (1 - \rho_{xy})}\right],$$

where the functional form is written to show that the probability distribution in the new coordinate system is separable as $f_{x'y'}(x', y') = f_{x'}(x')\, f_{y'}(y')$. Therefore $x'$ and $y'$ are independent gaussian random variables.
21 Correlated Gaussian Random Variables

Two jointly gaussian random variables, each with zero mean, have the joint pdf

$$p_{uv}(u, v) = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho^2}} \exp\left[-\frac{u^2 + v^2 - 2uv\rho}{2\sigma^2 (1 - \rho^2)}\right], \qquad \rho = \frac{\langle uv \rangle}{\sigma^2}$$

Uncorrelated jointly gaussian RVs are independent (one of the rare times that uncorrelatedness implies independence).

Sums of statistically independent gaussian RVs are also gaussian, with the total variance being the sum of the individual variances. Sums of dependent gaussian RVs are also gaussian, but the variance of the sum is not the sum of the individual variances. If $\sigma$ is the same for each distribution, the total variance varies from $2\sigma^2$ when the two are independent ($\rho = 0$) to $4\sigma^2$ when the two are completely correlated ($\rho = 1$).
22 Central Limit Theorem

Let the pdf of the sum of $n$ independent RVs be defined by

$$z = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \frac{u_i - \langle u_i \rangle}{\sigma_i},$$

where the normalization ensures that $z$ has unit variance and zero mean. As $n \to \infty$, the pdf of $z$ approaches a gaussian distribution regardless of the underlying pdfs, or

$$\lim_{n \to \infty} p_z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$$
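The central limit theorem can be illustrated with the standard library alone (a sketch assuming uniform underlying variables, which the theorem does not require): standardized sums of $n$ i.i.d. uniform(0, 1) samples should place about 68.3% of their mass within one standard deviation, as a gaussian does:

```python
import math
import random

random.seed(3)
n, N = 30, 100_000
mu, var = 0.5, 1.0 / 12.0      # mean and variance of one uniform(0, 1)

def z_sample():
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / math.sqrt(n * var)   # zero mean, unit variance

frac = sum(1 for _ in range(N) if abs(z_sample()) < 1.0) / N
print(frac)   # near 0.683, the gaussian mass within one sigma
```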
23 Central Limit Theorem (cont.)

The result is only true in the limit. Finite sums may not approach a gaussian, especially in the tails, and the contributions of the individual distributions must be small.
24 Random Phasors

Recall that a single component of the field is real:

$$U(z, t) = A(z, t) \cos[\omega t - kz + \phi(z, t)]$$

At a specific point in space $r$ and a specific time $t$, the amplitude of the field is $u = A \cos(\phi)$, where $A$ and $\phi$ are random variables. Let $u = \mathrm{Re}\{U\}$ where $U = A e^{j\phi}$ is a random complex phasor. Consider it as a vector in the complex plane.
25 Random Phasor Sums

Consider the sum of a large number of random phasors added to a constant phasor $A$, as shown below.

[Figure: two panels in the complex plane, axes Real Axis (in-phase) vs. Imaginary Axis (quadrature). Left: the sum of many independent random vectors. Right: the resulting joint gaussian pdf from many independent contributions.]
26 Random Phasors (cont. 1)

Write down the real and imaginary parts of the fluctuating part of the phasor:

$$r = \mathrm{Re}\{a\} = A \cos(\phi) + \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \alpha_k \cos\phi_k$$

$$i = \mathrm{Im}\{a\} = A \sin(\phi) + \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \alpha_k \sin\phi_k$$

Both $r$ and $i$ are sums of many independent contributions, and thus, as asserted by the central limit theorem, the pdf of each is gaussian; it can be shown that $r$ and $i$ are jointly gaussian.
27 Mean of the Random Phasor Sum

Now calculate the mean, variance, and correlation of the sum. The mean value of the distribution is given by

$$\langle r \rangle = \left\langle \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \alpha_k \cos\phi_k \right\rangle = \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \langle \alpha_k \rangle \langle \cos\phi_k \rangle \quad \text{(from independence)}$$

$$= \sqrt{N}\, \langle \alpha \rangle \langle \cos\phi \rangle \quad \text{(from iid)} \quad = 0 \quad (\phi \text{ uniform over } (-\pi, \pi))$$
28 Variance of the Random Phasor Sum

Start with the definition of variance, $\sigma_r^2 = \langle r^2 \rangle - \langle r \rangle^2$. Since $\langle r \rangle = 0$ from before, $\sigma_r^2 = \langle r^2 \rangle$, and

$$\langle r^2 \rangle = \frac{1}{N} \sum_{n=1}^{N} \sum_{k=1}^{N} \langle \alpha_n \alpha_k \rangle \langle \cos\phi_n \cos\phi_k \rangle = \frac{1}{N} \sum_{n=1}^{N} \sum_{k=1}^{N} \langle \alpha^2 \rangle \langle \cos\phi_n \cos\phi_k \rangle$$

(from independence and the same distribution). For $n = k$, $\langle \cos\phi_n \cos\phi_k \rangle = \frac{1}{2}$ (the mean of $\cos^2\phi$ is $\frac{1}{2}$ for $\phi$ uniform). For $n \neq k$, $\langle \cos\phi_n \cos\phi_k \rangle = 0$ (the terms add out of phase).
29 Variance and Correlation

The variance then becomes

$$\langle r^2 \rangle = \langle i^2 \rangle = \frac{\langle \alpha^2 \rangle}{2} = \sigma^2$$

The correlation is

$$\langle r i \rangle - \langle r \rangle \langle i \rangle = \frac{1}{N} \sum_{n=1}^{N} \sum_{k=1}^{N} \langle \alpha^2 \rangle \langle \cos\phi_n \sin\phi_k \rangle = 0$$
30 Joint pdf of the Random Phasor Sum

A sum of a large number of random phasors produces a jointly gaussian process:

$$f_{xy}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-[(x - \langle x \rangle)^2 + (y - \langle y \rangle)^2]/2\sigma^2}, \quad (17)$$

with $\sigma^2 = \langle \alpha^2 \rangle / 2$; the variances of the two components are the same. $x$ and $y$ are zero mean, jointly gaussian, and independent.
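The moments derived on the preceding slides can be checked by direct simulation (a sketch assuming unit amplitudes $\alpha_k = 1$, so that $\sigma^2 = \langle \alpha^2 \rangle / 2 = 1/2$):

```python
import math
import random

# r = (1/sqrt(N)) sum_k cos(phi_k) with phi_k uniform over (-pi, pi):
# the mean should be near 0 and the variance near <alpha^2>/2 = 1/2.
random.seed(4)
Nph, trials = 100, 20_000
rs = []
for _ in range(trials):
    total = sum(math.cos(random.uniform(-math.pi, math.pi)) for _ in range(Nph))
    rs.append(total / math.sqrt(Nph))
mean = sum(rs) / trials
var = sum((r - mean) ** 2 for r in rs) / trials
print(mean, var)   # near 0 and 0.5
```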
More informationThe Multivariate Gaussian Distribution
The Multivariate Gaussian Distribution Chuong B. Do October, 8 A vector-valued random variable X = T X X n is said to have a multivariate normal or Gaussian) distribution with mean µ R n and covariance
More informationGaussian, Markov and stationary processes
Gaussian, Markov and stationary processes Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan
More information3F1 Random Processes Examples Paper (for all 6 lectures)
3F Random Processes Examples Paper (for all 6 lectures). Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories
More informationDependence. MFM Practitioner Module: Risk & Asset Allocation. John Dodson. September 11, Dependence. John Dodson. Outline.
MFM Practitioner Module: Risk & Asset Allocation September 11, 2013 Before we define dependence, it is useful to define Random variables X and Y are independent iff For all x, y. In particular, F (X,Y
More informationSection 8.1. Vector Notation
Section 8.1 Vector Notation Definition 8.1 Random Vector A random vector is a column vector X = [ X 1 ]. X n Each Xi is a random variable. Definition 8.2 Vector Sample Value A sample value of a random
More information1 Joint and marginal distributions
DECEMBER 7, 204 LECTURE 2 JOINT (BIVARIATE) DISTRIBUTIONS, MARGINAL DISTRIBUTIONS, INDEPENDENCE So far we have considered one random variable at a time. However, in economics we are typically interested
More informationMultivariate Distributions
IEOR E4602: Quantitative Risk Management Spring 2016 c 2016 by Martin Haugh Multivariate Distributions We will study multivariate distributions in these notes, focusing 1 in particular on multivariate
More informationUCSD ECE153 Handout #34 Prof. Young-Han Kim Tuesday, May 27, Solutions to Homework Set #6 (Prepared by TA Fatemeh Arbabjolfaei)
UCSD ECE53 Handout #34 Prof Young-Han Kim Tuesday, May 7, 04 Solutions to Homework Set #6 (Prepared by TA Fatemeh Arbabjolfaei) Linear estimator Consider a channel with the observation Y XZ, where the
More informationOutline. Random Variables. Examples. Random Variable
Outline Random Variables M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Random variables. CDF and pdf. Joint random variables. Correlated, independent, orthogonal. Correlation,
More informationProbability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver
Stochastic Signals Overview Definitions Second order statistics Stationarity and ergodicity Random signal variability Power spectral density Linear systems with stationary inputs Random signal memory Correlation
More informationDependence. Practitioner Course: Portfolio Optimization. John Dodson. September 10, Dependence. John Dodson. Outline.
Practitioner Course: Portfolio Optimization September 10, 2008 Before we define dependence, it is useful to define Random variables X and Y are independent iff For all x, y. In particular, F (X,Y ) (x,
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More informationELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals
ELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals Jingxian Wu Department of Electrical Engineering University of Arkansas Outline Matched Filter Generalized Matched Filter Signal
More informationStatistical Methods in Particle Physics
Statistical Methods in Particle Physics. Probability Distributions Prof. Dr. Klaus Reygers (lectures) Dr. Sebastian Neubert (tutorials) Heidelberg University WS 07/8 Gaussian g(x; µ, )= p exp (x µ) https://en.wikipedia.org/wiki/normal_distribution
More informationExercises with solutions (Set D)
Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where
More informationx. Figure 1: Examples of univariate Gaussian pdfs N (x; µ, σ 2 ).
.8.6 µ =, σ = 1 µ = 1, σ = 1 / µ =, σ =.. 3 1 1 3 x Figure 1: Examples of univariate Gaussian pdfs N (x; µ, σ ). The Gaussian distribution Probably the most-important distribution in all of statistics
More information2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.
CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More information